Regression in compile_extension_module with numpy intc arrays
See pull request #396 for unit tests which show the regression. The unit tests pass with dolfin master from before August 26, 2017 and fail with the current master branch.
-
Passing numpy.array(xxx, dtype=numpy.intc) to a C++ module expecting Array<int> used to work, but now compile_extension_module incorrectly complains that the C++ code requires 64-bit integers, which is not the case.
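As a quick illustration of the dtype distinction involved (pure NumPy, independent of dolfin): numpy.intc is defined to match the platform C `int`, which is 32 bits on all mainstream platforms, while the default integer dtype of numpy.array is typically 64-bit on 64-bit Linux, so the two are distinct.

```python
import numpy as np

a = np.array([1, 2, 3], dtype=np.intc)  # numpy.intc matches the platform C `int`
print(a.dtype.itemsize * 8)             # bit width of C int: 32 on mainstream platforms
print(np.array([1, 2, 3]).dtype)        # platform default, e.g. int64 on 64-bit Linux
```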
-
There is no unit test for this functionality in compile_extension_module, which is why this regression could happen in the first place.
The linked pull request only handles the second point; I have not yet found out exactly what changed to break compile_extension_module.
The Array support is going away with the introduction of pybind11, but an equivalent set of unit tests for the new way of doing this should still be included, and the old way should still work for the 2017.2 version.
Comments (13)
-
reporter -
Stupid question, did you clean the instant cache?
-
reporter Yes, this is with a clean cache. I am now trying on the CI infrastructure to see if it is a real bug or just me:
https://bitbucket.org/fenics-project/dolfin/commits/2583a5decc5dd9ce2bcd0e0d882d3a9ef2e4e34e
If the unit tests fail there, then it is probably real; otherwise we at least have a unit test for this :-)
-
reporter The CI runner gets the same error as I get, so this is a regression. As far as I can see, dolfin/swig/typemaps/array.i and numpy.i have not been touched since 2015. I am really not into SWIG, so I do not know how to figure out what is wrong.
-
It's just a guess, but can you check if it is related to the issue with DOLFINConfig.cmake that I have reported in #930?
-
reporter Thanks for the hint! This is very likely closely related: I tried the code with the dolfin commit made just before the ones given in your report, and it worked fine.
For posterity, here is a recipe using the latest master for all packages except dolfin, where f510b50ca600bdd0dfe0c36ca1fa2d31071c464d from August 26 is used:
export FENICS_PYTHON=python2 && \
export PIP_NO_CACHE_DIR=off && \
FENICS_VERSION=5c8f44ecf4924f8d74dfcc6b5dfe4b6744dc0dc8 /opt/bin/fenics-update fiat && \
FENICS_VERSION=61cd6d9e51dc3e48c7a24fb066eb4c88f8d33645 /opt/bin/fenics-update dijitso && \
FENICS_VERSION=b9e56604c5aacf1fa45ebd9876f207073b311b50 /opt/bin/fenics-update instant && \
FENICS_VERSION=4fbc66b400fe07496a052b9e559af530d6e34b4a /opt/bin/fenics-update ufl && \
FENICS_VERSION=159a44234f387961442895c20256b400bad5234e /opt/bin/fenics-update ffc && \
FENICS_VERSION=f510b50ca600bdd0dfe0c36ca1fa2d31071c464d /opt/bin/fenics-update dolfin && \
echo /opt/lib/python2.7/site-packages >> $(python2 -c "import site; print(site.getsitepackages()[0])")/fenics.pth
-
I'm not sure what you're really expecting. To me, the correct behaviour would be an error because of the mixing of 32- and 64-bit ints.
The good news is (I've written this a lot recently!) that it will work fine with pybind11, because the process is not obscured by the SWIG layer. You could try it now: the experimental pybind11 interface is in master. Don't use dolfin::Array, but use Eigen, since pybind11 maps nicely between Eigen and NumPy. -
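As a rough sketch of the suggested pybind11 route (the module name `intdemo` and the helper `sum_ints` are hypothetical, and this assumes the pybind11 and Eigen headers are available), an Eigen-based signature accepts a NumPy int32 array directly:

```cpp
#include <pybind11/pybind11.h>
#include <pybind11/eigen.h>   // enables the Eigen <-> NumPy conversions
#include <Eigen/Core>

// Eigen::VectorXi stores C ints, so a NumPy array with dtype=numpy.intc
// converts without a dtype mismatch; Eigen::Ref avoids copying
// correctly-typed, contiguous input.
int sum_ints(const Eigen::Ref<const Eigen::VectorXi>& values)
{
  return values.sum();
}

PYBIND11_MODULE(intdemo, m)
{
  m.def("sum_ints", &sum_ints, "Sum a NumPy int32 array");
}
```

From Python the compiled module would then be called as `intdemo.sum_ints(numpy.array([1, 2, 3], dtype=numpy.intc))`.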
reporter The problem is that I can no longer pass a numpy array of 32-bit integers to C++, because something inside compile_extension_module erroneously insists that the C++ code requires 64-bit integers, when it in fact wants the same 32-bit integers that I am trying to pass.
This is a regression for everyone needing to communicate integers between Python and custom C++ modules. My use case is way too slow in pure Python, so I need extension modules; luckily Dolfin has those, and they are very easy to use!
pybind11: I use compile_extension_module quite a lot and need to pass arrays of int and double. Whether the end result is Array<*> or Eigen arrays is not really important to me, though Eigen is slightly nicer if it plays well with integer types. Right now I do not have time to port to Python 3, but I will sometime this fall, and then I can test the pybind11 bindings.
-
reporter - changed milestone to 2017.2
- changed title to Regression in compile_extension_module with numpy intc arrays
- edited description
-
reporter Adding this to line 211 of dolfin/CMakeLists.txt fixes the problem (from pull request #397):
list(APPEND DOLFIN_PYTHON_DEFINITIONS "-DHAS_MPI")
-
@trlandet Is this issue fixed now?
-
reporter - changed status to resolved
Closed by pull request #396
-
I cannot reproduce the error with the latest official Docker dev image, but that is 2 months old(?), which is similar to my Singularity build.