
Support sbt 1.4.0 #73

Merged (9 commits) on Oct 17, 2020

Conversation

@anilkumarmyla (Contributor) commented on Oct 7, 2020

PR to address #70; it makes the plugin work with sbt 1.4.0.

However, this is a hack: it doesn't address backward compatibility and may contain blunders. Reviews, comments, or better versions are welcome.

@cb372 (Owner) left a comment

Thanks for looking into this!

But I'm a bit confused about how your changes would solve the reported issue: `java.lang.ClassCastException: sbt.internal.inc.MappedVirtualFile cannot be cast to java.io.File`. Can you explain what you've learned from your investigation so far?

@@ -1 +1 @@
-sbt.version=1.3.13
+sbt.version=1.4.0
@cb372 (Owner):

We should probably do this (from the release notes):

> Note to plugin authors
> If you build and publish a plugin using sbt 1.4.0, your users will also be forced to upgrade to sbt 1.4.0 immediately. To prevent this you can cross build your plugin against sbt 1.2.8 (while using sbt 1.4.0) as follows:
>
> pluginCrossBuild / sbtVersion := "1.2.8"
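
For illustration, a hypothetical build.sbt sketch of that cross-building setup; the version numbers and the scalaBinaryVersion mapping here are illustrative, not this repo's actual settings:

// Hypothetical sketch, not this repo's actual build: keep building the plugin with
// sbt 1.4.0, but emit artifacts that older sbt launchers can still load.
pluginCrossBuild / sbtVersion := {
  scalaBinaryVersion.value match {
    case "2.10" => "0.13.18" // sbt 0.13.x plugins are built with Scala 2.10
    case "2.12" => "1.2.8"   // sbt 1.x plugins are built with Scala 2.12
  }
}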

@anilkumarmyla (Contributor, Author):

since this project uses a custom release process, do you want me to update this line?

-  releaseStepCommandAndRemaining(s"^^$latestSbt_1_x_version publish"),
+  releaseStepCommandAndRemaining(s"^^1.3.13 publish"),

@cb372 (Owner):

err... yes I think so 😅

@anilkumarmyla (Contributor, Author):

> Can you explain what you've learned from your investigation so far?

I'll try to explain using the diff in sbt/zinc@v1.3.5...v1.4.0

The major change is in Relations.scala, where `allLibraryDeps` changed from `java.io.File` to `xsbti.VirtualFileRef` (implemented by `sbt.internal.inc.MappedVirtualFile`), which is apparent from the error message:

-  def allLibraryDeps: collection.Set[File]
+  def allLibraryDeps: collection.Set[VirtualFileRef]

Relations.scala sbt/zinc@v1.3.5...v1.4.0#diff-8f19e01c26867080078f05c80f07ae4c
VirtualRef.java sbt/zinc@v1.3.5...v1.4.0#diff-b9247d850cc6ddef7f174c10b4eb76c0

This PR simply converts the `xsbti.VirtualFileRef` to a `java.io.File` using its `id()` function, which returns a unique String identifier for the file. However, I also noticed that `id()` returns a value of the form `${CSR_CACHE}/path`, so the `${CSR_CACHE}` placeholder needs to be substituted with the actual Coursier cache path to get the real file system path.
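
For example (a hypothetical `id()` value, assuming a library resolved from the Coursier cache): `${CSR_CACHE}/https/repo1.maven.org/maven2/org/typelevel/cats-core_2.12/2.0.0/cats-core_2.12-2.0.0.jar` only becomes a usable path once `${CSR_CACHE}` is replaced with something like `/home/user/.cache/coursier/v1`.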

@cb372 (Owner) commented on Oct 12, 2020

I see, I didn't spot the call to .id(). It all makes a bit more sense now. Thanks for the explanation.

So conceptually what we want to do is:

def toFile(x: AnyRef): File = x match {
  case virtual: xsbti.VirtualFileRef =>
    // sbt 1.4.0 or newer
    val path = virtual.id().replaceAllLiterally("${CSR_CACHE}", csrCacheDirectoryValue)
    new java.io.File(path)
  case file: java.io.File =>
    // sbt 1.3.x or older
    file
}

val allLibraryDeps = analysis.relations.allLibraryDeps.map(toFile)

But I think that will probably blow up in sbt 1.3.x or older because the xsbti.VirtualFileRef class doesn't exist. So we might need to use reflection to do the instanceof check.
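
For illustration, a minimal sketch of one way to sidestep the missing class: match `java.io.File` first, and for everything else look up `xsbti.VirtualFileRef` by name at runtime and call `id()` through plain Java reflection. This is only a sketch under those assumptions, not the code that ended up in the PR:

import java.io.File

// Hypothetical sketch: the plugin is compiled against an older sbt, so
// xsbti.VirtualFileRef is not on the compile classpath; resolve it by name instead.
def toFile(x: AnyRef, csrCacheDirectoryValue: String): File = x match {
  case file: File =>
    // sbt 1.3.x or older: allLibraryDeps already contains plain java.io.File values
    file
  case other =>
    // sbt 1.4.0 or newer: call VirtualFileRef.id() reflectively and substitute the
    // Coursier cache placeholder to get a real filesystem path
    val refClass = Class.forName("xsbti.VirtualFileRef", false, other.getClass.getClassLoader)
    val id = refClass.getMethod("id").invoke(other).asInstanceOf[String]
    new File(id.replace("${CSR_CACHE}", csrCacheDirectoryValue))
}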

As for compatibility with sbt 1.2.x and older, where the csrCacheDirectory key doesn't exist, I think we can do something like this instead of making our task depend on csrCacheDirectory.value:

val extracted: Extracted = Project.extract(state.value)
val settings = extracted.session.original
settings.find(_.key.key.label == "csrCacheDirectory") match {
  case Some(csrCacheDirectorySetting) =>
    // sbt 1.3.0 or newer
    val csrCacheDirectoryValue = csrCacheDirectorySetting.init.evaluate(extracted.structure.data)
    ...
  case None =>
    // sbt 1.2.x or older so the key doesn't exist
}

Does this make sense?

@anilkumarmyla (Contributor, Author):

> Does this make sense?

Makes perfect sense. I'll try your suggestions and see how it goes.

@anilkumarmyla (Contributor, Author):

@cb372 thanks for your help and review! The PR is now backward compatible and the build passes for sbt 0.13.x, 1.2.x, 1.3.x, and 1.4.x. I've marked it as ready for review; take another look when you get a chance.

@anilkumarmyla (Contributor, Author):

However, there is something wrong: it doesn't work locally for me. I'll check what's wrong.

@anilkumarmyla (Contributor, Author):

Interestingly, if I use sbt publishLocal with project/build.properties set to sbt 1.4.0, the plugin works in a project with sbt 1.4.0, but a publish with sbt 1.3.13 doesn't work in a project with sbt 1.4.0:

[error] scala.ScalaReflectionException: value id encapsulates multiple overloaded alternatives and cannot be treated as a method. Consider invoking `<offending symbol>.asTerm.alternatives` and manually picking the required method
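
For context, a hypothetical reconstruction of the kind of Scala runtime reflection lookup that fails this way: when the term `id` resolves to an overloaded symbol on the runtime type, `.asMethod` throws exactly this `ScalaReflectionException`. The PR's actual reflection code may have looked different:

import scala.reflect.runtime.{universe => ru}

// Hypothetical sketch of the failing lookup, not the PR's exact code
def idViaScalaReflection(ref: AnyRef): String = {
  val mirror = ru.runtimeMirror(ref.getClass.getClassLoader)
  val instanceMirror = mirror.reflect(ref)
  // If `id` is overloaded on the runtime type, asMethod throws
  // "value id encapsulates multiple overloaded alternatives ..."
  val idSymbol = instanceMirror.symbol.toType.member(ru.TermName("id")).asMethod
  instanceMirror.reflectMethod(idSymbol)().asInstanceOf[String]
}
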
@cb372 (Owner) commented on Oct 14, 2020

LGTM. As soon as I get a chance I'd like to do some manual testing with various sbt versions before merging.

@cb372 changed the title from "Hacky path to sbt 1.4.0" to "Support sbt 1.4.0" on Oct 14, 2020
Unfortunately the reflection approach didn't work if we publish using
1.3.13 and run against 1.4.0.

It throws a ClassCastException as soon as we do
`analysis.relations.allLibraryDeps.map(...)`, because the type signature
of `allLibraryDeps` it was built against (`Set[File]`) doesn't match the
signature of `allLibraryDeps` on the classpath (`Set[VirtualFileRef]`).

Worked around this by casting to `Set[AnyRef]`, but that means we can't
use a type tag anymore. So I gave up on reflection and used `getClass`
instead. Pretty nasty but it works.

Confirmed manually that if we publish using 1.4.0 we can only run
against 1.4.0, so we need to publish using 1.3.13.
This is a bit more robust than relying on the `toString` implementation
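
For illustration, a minimal sketch of the workaround described in the commit messages above, assuming the `toFile` helper sketched earlier in the thread and an already-resolved `csrCacheDirectoryValue` (these names are assumptions, not the merged code):

// Hypothetical sketch: widen the element type before mapping so no checkcast to
// java.io.File is emitted at the call site, then branch on the runtime class inside toFile.
val rawLibraryDeps: scala.collection.Set[AnyRef] =
  analysis.relations.allLibraryDeps.asInstanceOf[scala.collection.Set[AnyRef]]
val allLibraryDeps: Set[java.io.File] =
  rawLibraryDeps.map(dep => toFile(dep, csrCacheDirectoryValue)).toSet
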
@cb372 (Owner) commented on Oct 17, 2020

I played around with various sbt versions locally and ended up making some small changes. See my commit comments for the details.

Once Travis is done, I'll merge and make a release. Thank you for your work on this fix!

By the way, Travis doesn't catch this kind of compatibility issue because it runs `sbt "^^${SBT_VERSION}" scripted`, meaning the sbt version used for publishLocal and the sbt version used in the test projects are always the same. It would be nice to have a more realistic test matrix, whereby we do `^^1.3.13 publishLocal` and then test it against all the supported sbt versions.

@anilkumarmyla (Contributor, Author):

Thanks for taking care of this!
