
Check for scope if test assign test scope to interdependent projects. #14

Closed
wants to merge 1 commit

Conversation

ScrapCodes
Contributor

With this, the test-jar struggle ends. We no longer need to detect that case. My intuition is that if a project's dependency on another project (in a multi-project build) has test scope, then we should do this.

@ScrapCodes mentioned this pull request May 19, 2014
@@ -81,13 +83,19 @@ object MavenProjectHelper {
} yield depProject
case _ => Nil
}

def projectToClasspathDep(x: (Project, Boolean)): ClasspathDep[ProjectReference] = x match {
case (project, true) => project % "compile->compile;test->test"
Member

nice!
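For context, sbt's configuration-mapping string works like this: `"compile->compile;test->test"` means the depending project's compile configuration sees the upstream project's compile classpath, and its test configuration sees the upstream test classpath. The decision this PR makes can be modeled in a minimal pure-Scala sketch (illustrative only; names are hypothetical, not the actual sbt-pom-reader code):

```scala
// Illustrative model of the PR's rule (hypothetical names, not library code):
// if the inter-project Maven dependency is in test scope, also map the
// depending project's test configuration onto the upstream test configuration.
object ScopeMapping {
  def sbtConfiguration(hasTestScope: Boolean): String =
    if (hasTestScope) "compile->compile;test->test"
    else "compile->compile"

  def main(args: Array[String]): Unit = {
    println(sbtConfiguration(hasTestScope = true))
    println(sbtConfiguration(hasTestScope = false))
  }
}
```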

@ScrapCodes
Contributor Author

Hey @jsuereth, can you take another look? Does it look good now?

@metasim
Member

metasim commented May 21, 2015

@ScrapCodes Do you think this PR is still necessary? If so, could you please add a test case that triggers the bug before the PR, and yet passes after the PR is applied?

My intuition is that messing with the translation between Maven and sbt scopes without adequate tests could miss some edge cases and potentially let issues leak through.

@JoshRosen

Apache Spark currently uses a custom fork of the sbt-pom-reader plugin which incorporates this change, and getting this patch merged upstream is probably a blocker to moving back to the official sbt-pom-reader releases (see https://issues.apache.org/jira/browse/SPARK-14401). While I don't have time to work on this personally, I bet we could revert this change in Spark's build, figure out what breaks, and use that knowledge to derive a regression test to add here.

@JoshRosen

I attempted to use version 2.1.0-RC2 of this plugin in Spark (apache/spark#12310) and have an example of a test which fails without these changes:

[error] /home/jenkins/workspace/SparkPullRequestBuilder@2/graphx/src/test/scala/org/apache/spark/graphx/EdgeRDDSuite.scala:20: object SparkFunSuite is not a member of package org.apache.spark
[error] import org.apache.spark.SparkFunSuite
[error]        ^

In this case, we have a spark-core project whose test-jar is supposed to include SparkFunSuite and a spark-graphx project which depends on spark-core in compile scope and also depends on the spark-core test-jar in test scope:

<dependencies>
    <dependency>
      <groupId>org.apache.spark</groupId>
      <artifactId>spark-core_${scala.binary.version}</artifactId>
      <version>${project.version}</version>
    </dependency>
    <dependency>
      <groupId>org.apache.spark</groupId>
      <artifactId>spark-core_${scala.binary.version}</artifactId>
      <version>${project.version}</version>
      <type>test-jar</type>
      <scope>test</scope>
    </dependency>
[...]
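For comparison, the sbt equivalent of the POM layout above would look roughly like this (a build.sbt sketch; the project names `core` and `graphx` are assumptions, not Spark's actual build definition):

```scala
// build.sbt sketch (hypothetical project names): the compile-scope dependency
// plus the test-scope test-jar dependency in the POM together correspond to
// sbt's "compile->compile;test->test" configuration mapping between projects.
lazy val core = project

lazy val graphx = project
  .dependsOn(core % "compile->compile;test->test")
```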

I haven't dug in too deeply yet and will try to pull the output of dependencyTree from SBT later today, but my hunch is that this is an instance of #38, so I think it might be good to produce a new pull request which more narrowly scopes this change to apply only to test-jar dependencies.
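The narrower rule suggested here could be sketched as follows (pure Scala, illustrative only; `MavenDep` and its fields are hypothetical, not sbt-pom-reader's actual model):

```scala
// Hypothetical sketch: add the test->test mapping only when the Maven
// dependency is declared as <type>test-jar</type>, instead of for every
// test-scoped inter-project dependency.
final case class MavenDep(scope: String, depType: String)

object TestJarMapping {
  def sbtConfiguration(dep: MavenDep): String = dep match {
    case MavenDep("test", "test-jar") => "compile->compile;test->test"
    case _                            => "compile->compile"
  }

  def main(args: Array[String]): Unit = {
    println(sbtConfiguration(MavenDep("test", "test-jar")))
    println(sbtConfiguration(MavenDep("compile", "jar")))
  }
}
```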

@JoshRosen

This also looks related to #37. In short, it seems that several folks are experiencing similar problems with test-jar dependencies.

@JoshRosen

Another subtlety which I don't think this current PR handles correctly: in Maven, declaring a dependency on a test-jar does not pull in that test-jar artifact's transitive dependencies (https://issues.apache.org/jira/browse/MNG-1378), whereas I think the approach taken here does pull in those transitive deps. We've been bitten by this in Spark before when people add only a test-jar dependency, forget its transitive dependencies, then see the SBT build pass, commit, and have the Maven build fail.
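The difference can be modeled with a toy classpath calculation (illustrative only; all project and dependency names are made up): in Maven a test-jar dependency contributes just the jar itself, while sbt's test->test mapping also exposes the upstream project's own test-scope dependencies.

```scala
// Toy model of the transitivity difference (cf. MNG-1378); hypothetical names.
object TransitivityModel {
  // Direct test-scope dependencies of each project in the build.
  val testDeps: Map[String, Set[String]] =
    Map("core" -> Set("scalatest"), "graphx" -> Set.empty)

  // Maven: a test-jar dependency contributes only the test jar itself.
  def mavenTestJarDep(on: String): Set[String] =
    Set(s"$on-tests.jar")

  // sbt test->test: the test jar plus the upstream project's test deps.
  def sbtTestToTestDep(on: String): Set[String] =
    Set(s"$on-tests.jar") ++ testDeps.getOrElse(on, Set.empty)

  def main(args: Array[String]): Unit = {
    println(mavenTestJarDep("core"))
    println(sbtTestToTestDep("core"))
  }
}
```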

@metasim
Member

metasim commented Jan 9, 2017

This has remained with open questions but no progress for long enough that I'm closing it to get it out of the list. Reopen if there's disagreement.


4 participants