Update to sbt 1.1.0

Issue #130 resolved
Simon Leischnig created an issue

Update OPAL, all subprojects, and the original plugins to use sbt 1.1.0

Comments (9)

  1. Michael Eichberg repo owner

    Please use the feature/sbt1.0.x branch for prototyping the support; you can create a pull request from that branch once it is working.

  2. Simon Leischnig reporter

    Thanks for the version numbers already updated in the .properties and .sbt files; that saved me time.

  3. Simon Leischnig reporter

    Detected Issue No. 1:

    An inevitable side effect of the update is that the changes made for #119 in PR https://bitbucket.org/delors/opal/pull-requests/434 will have to be rolled back. In a nutshell, that PR introduced a custom initialization order for .sbt files, following the sbt documentation page on setting initialization: http://www.scala-sbt.org/0.13/docs/Setting-Initialization.html

    That page seems to be out of sync with the actual 1.x API, though: it contains legacy examples for the 0.13 API, as they use sbt.AddSettings, which in sbt 1.x lives in an internal package that cannot be accessed from a build. I therefore have to assume that the functionality introduced with PR 434 is deprecated in sbt 1.x. However, the documentation also suggests that .sbt files other than build.sbt are initialized AFTER build.sbt. We opened https://bitbucket.org/delors/opal/issues/119 because we observed different, seemingly unspecified behavior: some machines behaved as the docs suggest, others did not.

    My way of resolving this while updating to 1.x is to remove the functionality of PR 434 altogether. When this PR is merged, I suggest re-opening #119, referencing this comment as the reason, and testing the behavior on different machines (on mine, local.sbt takes precedence over build.sbt, but other machines may behave differently, as observed before). IMO, chances are good that as of sbt 1.x the unspecified .sbt initialization behavior is gone and all is well.
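
    For re-testing #119 after the merge, a quick probe of the .sbt initialization order could look like the following (a sketch; the key name `probe` is an assumption for illustration and is not part of the build):

    ```scala
    // build.sbt (sketch): declare a key and set it here ...
    val probe = settingKey[String]("which .sbt file set this key last")
    probe := "build.sbt"
    ```

    ```scala
    // local.sbt (sketch): ... and override it here; vals defined in one
    // .sbt file are visible in the other .sbt files of the same build
    probe := "local.sbt"
    ```

    Running `sbt "show probe"` on each machine then reveals which file's setting was applied last.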

  4. Simon Leischnig reporter

    Issue No. 2

    Since Scala 2.11, scala.xml is no longer part of the standard library. Solution: add it to the library dependencies where necessary.
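
    The fix is typically a one-line addition per affected subproject; a sketch (the exact version number is an assumption and must be chosen to match the Scala version in use):

    ```scala
    // build.sbt fragment: scala.xml left the standard library in Scala 2.11,
    // so declare it explicitly wherever XML is used
    libraryDependencies += "org.scala-lang.modules" %% "scala-xml" % "1.0.6"
    ```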

  5. Simon Leischnig reporter

    Issue No. 3

    It seems that sbt 1.x rejects certain .scala files according to:

    [error] /home/simon/gitShared/opal_hiwi/DEVELOPING_OPAL/tools/src/main/scala/org/opalj/hermes/queries/EffectiveFieldAccesses.scala:1:1:
    [error] Found names but no class, trait or object is defined in the compilation unit.
    [error] The incremental compiler cannot record the dependency information in such case.
    [error] Some errors like unused import referring to a non-existent class might not be reported.

    Solution for now: rename the file to .scala_archive to preserve the code, which exists only inside comments there.
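
    For context, the message comes from Zinc, sbt 1.x's incremental compiler, which rejects a compilation unit that contains names but no definitions. A hypothetical sketch of such a file (the actual contents of EffectiveFieldAccesses.scala may differ):

    ```scala
    // A unit like this triggers the error: it contains names (the package
    // clause) but defines no class, trait, or object, because the whole
    // implementation lives in comments.
    package org.opalj.hermes.queries

    // class EffectiveFieldAccesses { ... }
    ```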

  6. Simon Leischnig reporter

    Seemingly almost done here. There are still test errors. Since I have not run the tests for a while and have never seen these failures before, I must assume I am not done yet (although I have no idea yet how I would approach fixing the failed test cases).

    Could it be that this is known behavior, and would it be OK to open a PR with these test errors?

    [info] - should be able to analyze a 3-dimensional array initialization with potential exceptions *** FAILED ***
    [info]   java.lang.NullPointerException:
    [info]   at org.opalj.ai.domain.l1.DefaultArraysTest.$anonfun$new$31(DefaultArraysTest.scala:462)
    [info]   at org.opalj.ai.domain.l1.DefaultArraysTest.$anonfun$new$31$adapted(DefaultArraysTest.scala:445)
    [info]   at org.opalj.ai.domain.l1.DefaultArraysTest.$anonfun$evaluateMethod$2(DefaultArraysTest.scala:67)
    [info]   at scala.runtime.java8.JFunction0$mcV$sp.apply(JFunction0$mcV$sp.java:12)
    [info]   at org.opalj.ai.common.XHTML$.dumpOnFailureDuringValidation(XHTML.scala:129)
    [info]   at org.opalj.ai.domain.l1.DefaultArraysTest.evaluateMethod(DefaultArraysTest.scala:67)
    [info]   at org.opalj.ai.domain.l1.DefaultArraysTest.$anonfun$new$30(DefaultArraysTest.scala:445)
    [info]   at scala.runtime.java8.JFunction0$mcV$sp.apply(JFunction0$mcV$sp.java:12)
    [info]   at org.scalatest.OutcomeOf.outcomeOf(OutcomeOf.scala:85)
    [info]   at org.scalatest.OutcomeOf.outcomeOf$(OutcomeOf.scala:83)
    [info]   ...
    [info] array accesses that lead to exceptions
    [info] - if an index is out of bounds a corresponding exception should be thrown even if the store is potentially impossible
    [info] - should lead to an array store exception if the value cannot be stored in the array
    [info] array stores
    [info] - should be able to analyze a method that updates a value stored in an array in a branch *** FAILED ***
    [info]   ComputedValue(int = 2) was not equal to ComputedValue(an int) (DefaultArraysTest.scala:534)
    [info] - should be able to detect a possible array store exception and the default array value *** FAILED ***
    [info]   ComputedValueOrException({_ <: java.lang.Cloneable, null}[↦20;refId=108],List(java.lang.ArrayIndexOutOfBoundsException[↦-100020;refId=107])) was not equal to ComputedValueOrException(null,List(ObjectType(java/lang/ArrayStoreException))) (DefaultArraysTest.scala:566)
    
  7. Simon Leischnig reporter

    An issue that still came up: setting the id of a project definition is no longer as easy as it was in 0.13. I was not able to find a good way to do it.

    Instead, I used Scala backtick identifiers to set the project id to the required string and delegated to the old abbreviation, like this:

    lazy val de = `DependenciesExtractionLibrary`
    lazy val `DependenciesExtractionLibrary` = (project in file("OPAL/de"))
    ...
    

    This way, we can use arbitrary project ids, even ones that are not legal as standard Scala variable names.
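
    Since the `project` macro derives the project id from the name of the enclosing val, the backtick-quoted project can then be addressed by that id on the sbt shell, e.g. (assuming the definitions above):

    ```shell
    sbt DependenciesExtractionLibrary/compile
    ```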
