multi-build / multibuild
Machinery for building and testing Python Wheels for Linux, OSX and (less flexibly) Windows.
License: Other
In #86, I suggested using the -m virtualenv option to create a virtual environment. However, this requires dropping support for Python 2.6.
I'm experimenting with using this to build wheels for the lxml project. They support Python 3.3 and 2.6, but for reasons I don't quite understand the builds fail on Mac with these two versions. I suspect you already know about this, since the README example Travis file does not include these versions (well, they are there for Linux but not OSX), but I did not see an issue so I opened this.
Tag
https://github.com/Bachmann1234/lxml-wheels/releases/tag/wheel-test-2
2.6 failing build
https://travis-ci.org/Bachmann1234/lxml-wheels/jobs/234384030
hdiutil: attach failed - image not recognized
3.3 failing build
https://travis-ci.org/Bachmann1234/lxml-wheels/jobs/234384032
installer: Package name is Python
installer: Installing at base path /
2017-05-20 19:24:23.527 installer[2500:5293] Package /Volumes/Python/Python.mpkg/Contents/Packages/PythonFramework-3.3.pkg uses a deprecated pre-10.2 format (or uses a newer format but is invalid).
installer: The install failed (The Installer could not install the software because there was no software found to install.)
Hi,
Let me preface this issue by just saying thank you very much for all your work in providing an extremely useful project. We use it all the time.
Since there are no releases, I was wondering what your development model is? Currently, in our own builds we check out a specific hash, but it is fairly arbitrary which commit gets picked.
I don't see a devel branch or something similar which would suggest that HEAD of master is always stable and you just merge desired features from a devel branch into master occasionally. So is HEAD of master stable in any way? Or could you tag certain commits when you consider the current state of the project stable? Then we could check out the latest tag.
Thanks for considering this.
Disclaimer: I know nothing about macOS.
Can the call to delocate-listdeps be dropped? It takes a long time to run on travis-ci, I think.
I am unable to determine the cause, but this seems to have happened recently.
Error: pkg-config-0.29.1_2 already installed
See: https://travis-ci.org/MacPython/pytables-wheels/jobs/221319221#L63
Fix at: MacPython/matplotlib-wheels@58efe0e
Should this fix go in multibuild?
Rather than building OpenBLAS on manylinux, we can improve performance by fetching it from here:
It would seem that using AppVeyor to build Windows wheels is now quite mature, e.g.
However, right now the multibuild README does not mention this - should it?
I'm using multibuild for a project which needs to build some C libraries, and the ./autogen.sh script fails with a "libtoolize: command not found" error.
https://travis-ci.org/anthrotype/ttfautohint-python/jobs/315444322#L515
How is it possible to install libtoolize (and possibly other missing development tools, e.g. bison, flex, ragel) in the manylinux docker container?
Thanks for your help
From https://github.com/matthew-brett/multibuild/blob/master/README.rst
We try to keep the master branch stable, so consider doing regular or per-build updates of your Multibuild code to current master.
I presume this is about updating the git submodule(s), and since you'd also want to update your main project's submodule how about suggesting this:
$ git submodule foreach git pull origin master
$ git commit -a -m "Update submodules"
I'm hitting the travis-ci "10m without stdout/stderr output" limit in repair_wheelhouse on macOS. It's arising because:
I think just echoing something between them might be the most robust/easiest fix. Otherwise one can prefix the call to repair_wheelhouse with travis_wait <timeout>.
Which would you prefer? I'm happy to create a PR.
Thanks again for setting this all up. It's a great tool to have.
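As a sketch of the echo idea, the long-running call could be wrapped in a background heartbeat (keepalive_run is a name I made up, not a multibuild function):

```shell
# Run a long command while a background loop prints a heartbeat, so
# Travis sees output at least once a minute. "keepalive_run" is a
# hypothetical helper, not part of multibuild.
keepalive_run () {
    ( while true; do echo "still working..."; sleep 60; done ) &
    local heartbeat_pid=$!
    "$@"
    local status=$?
    kill "$heartbeat_pid" 2>/dev/null
    wait "$heartbeat_pid" 2>/dev/null || true
    return $status
}
```

e.g. keepalive_run repair_wheelhouse "$WHEEL_DIR" would keep the job alive without touching repair_wheelhouse itself.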
https://github.com/matthew-brett/multibuild/blob/devel/library_builders.sh defines numerous environment variables, globally. On MacOS, that includes compiler flags.
There are at least two problems here:
It took a while to figure out where "-arch i386 -arch x86_64" comes from, because I never expected something called "library_builders.sh" to affect anything but library builds, and I blamed CMake logic or Travis.
The flags are set when sourcing library_builders.sh, so it's impossible to override them there.
I'm not quite sure which course of action would be in line with the big design, so I'd prefer to discuss this first.
I have run into this problem, and while other incarnations of it are mentioned elsewhere, I think here is a better place to file an issue, as it will be discovered faster by users of multibuild.
Elsewhere
python-versioneer/python-versioneer#38
python-versioneer/python-versioneer#121
matplotlib/matplotlib#6605
Diagnosis
The underlying factor in this problem is that the project being built is a submodule. A git submodule does not have a .git directory at the package root; rather, it has a .git file whose contents is a relative path to the real ".git" directory.
For example:
gitdir: ../.git/modules/scikit-misc
When pip wheel creates a wheel (in this case from the submodule), it copies the package to a temporary directory, then builds from the temporary directory. The copying breaks the relative path in the .git file, and so the build does not happen in a valid git repository. Hence Versioneer, which uses git describe, is not able to determine the version.
Solution
One of the solutions, i.e. using build_bdist_wheel instead of build_pip_wheel, is already present in the code, but as a workaround it is (was for me) hard to discover.
I think a better solution would be to replace the relative path to the git directory with an absolute path. A good place for this seems to be the clean_code function in common_utils.sh. An argument against this approach is that git may change things around in the future.
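The absolute-path idea could look something like the sketch below (fix_gitdir is a hypothetical helper name, not existing multibuild code):

```shell
# Rewrite a submodule's relative ".git" file to an absolute path, so that
# a copied tree (e.g. pip's temporary build directory) still resolves the
# real git directory. "fix_gitdir" is a hypothetical helper, not
# multibuild API.
fix_gitdir () {
    local pkg_dir="${1:-.}"
    local git_file="$pkg_dir/.git"
    if [ -f "$git_file" ]; then        # submodules have a .git *file*
        local rel_path abs_path
        rel_path=$(sed -n 's/^gitdir: //p' "$git_file")
        abs_path=$(cd "$pkg_dir/$rel_path" && pwd)
        echo "gitdir: $abs_path" > "$git_file"
    fi
}
```

Running this from clean_code before the wheel build would mean the copied tree keeps a valid pointer back to the git directory.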
What exactly would be needed to support Python 3.6 in the MacPython/*-wheels repos?
I suspect https://github.com/pypa/manylinux should also first have 3.6 builds for the linux case?
It took a couple of tweaks to refresh the multibuild configuration for our repository https://github.com/biopython/biopython-wheels/ but I was able to build all our Mac and Linux wheels using TravisCI, and almost all the Windows wheels.
From our current appveyor.yml file, this single target is failing:
- PYTHON: "C:\\Miniconda34-x64"
  PYTHON_VERSION: "3.4"
  PYTHON_ARCH: "64"
  BUILD_DEPENDS: "numpy==1.9.0"
  TEST_DEPENDS: "numpy==1.9.0"
https://ci.appveyor.com/project/biopython/biopython-wheels/build/1.0.54/job/86t1cjvxigjqodv4
python setup.py bdist_wheel
running bdist_wheel
running build
running build_py
creating build
creating build\lib.win-amd64-3.4
creating build\lib.win-amd64-3.4\Bio
copying Bio\bgzf.py -> build\lib.win-amd64-3.4\Bio
copying Bio\DocSQL.py -> build\lib.win-amd64-3.4\Bio
...
running build_ext
building 'Bio.cpairwise2' extension
Traceback (most recent call last):
File "setup.py", line 478, in <module>
install_requires=REQUIRES,
File "C:\Miniconda34-x64\lib\distutils\core.py", line 148, in setup
dist.run_commands()
File "C:\Miniconda34-x64\lib\distutils\dist.py", line 955, in run_commands
self.run_command(cmd)
File "C:\Miniconda34-x64\lib\distutils\dist.py", line 974, in run_command
cmd_obj.run()
File "C:\Miniconda34-x64\lib\site-packages\wheel\bdist_wheel.py", line 179, in run
self.run_command('build')
File "C:\Miniconda34-x64\lib\distutils\cmd.py", line 313, in run_command
self.distribution.run_command(command)
File "C:\Miniconda34-x64\lib\distutils\dist.py", line 974, in run_command
cmd_obj.run()
File "C:\Miniconda34-x64\lib\distutils\command\build.py", line 126, in run
self.run_command(cmd_name)
File "C:\Miniconda34-x64\lib\distutils\cmd.py", line 313, in run_command
self.distribution.run_command(command)
File "C:\Miniconda34-x64\lib\distutils\dist.py", line 974, in run_command
cmd_obj.run()
File "setup.py", line 206, in run
build_ext.run(self)
File "C:\Miniconda34-x64\lib\site-packages\setuptools\command\build_ext.py", line 50, in run
_build_ext.run(self)
File "C:\Miniconda34-x64\lib\distutils\command\build_ext.py", line 339, in run
self.build_extensions()
File "C:\Miniconda34-x64\lib\distutils\command\build_ext.py", line 448, in build_extensions
self.build_extension(ext)
File "C:\Miniconda34-x64\lib\site-packages\setuptools\command\build_ext.py", line 183, in build_extension
_build_ext.build_extension(self, ext)
File "C:\Miniconda34-x64\lib\distutils\command\build_ext.py", line 503, in build_extension
depends=ext.depends)
File "C:\Miniconda34-x64\lib\distutils\msvc9compiler.py", line 460, in compile
self.initialize()
File "C:\Miniconda34-x64\lib\distutils\msvc9compiler.py", line 371, in initialize
vc_env = query_vcvarsall(VERSION, plat_spec)
File "C:\Miniconda34-x64\lib\site-packages\setuptools\msvc9_support.py", line 52, in query_vcvarsall
return unpatched['query_vcvarsall'](version, *args, **kwargs)
File "C:\Miniconda34-x64\lib\distutils\msvc9compiler.py", line 287, in query_vcvarsall
raise ValueError(str(list(result.keys())))
ValueError: ['path']
Command exited with code 1
(Updated to include final two lines which I missed in initial copy-and-paste)
Any thoughts on what might be breaking on this specific platform (64-bit Python 3.4)?
I consider this low priority as most of our Windows users are likely to want a more recent Python.
Hi,
I'm trying to build wheels for mpdaf, which needs cfitsio for a C extension. So I'm building the shared lib for cfitsio (https://github.com/musevlt/mpdaf-wheels/blob/master/config.sh), which now works on Linux but fails on OSX with this error: delocate.delocating.DelocationError: library "/Users/travis/build/musevlt/mpdaf-wheels/libcfitsio.2.dylib" does not exist
https://travis-ci.org/musevlt/mpdaf-wheels/jobs/347532865
It looks like libcfitsio.2.dylib has been built correctly, so do you have any idea of what's going on here?
Also, do you think it would be useful to have the cfitsio build recipe in multibuild? (I could not use build_simple because of cfitsio's naming, e.g. cfitsio3370.tar.gz with no dash, sigh.)
Btw, is there a way to download the wheels directly from Travis, or is it required to put them on Rackspace? If so, could you set up a key for this project?
Great project! Unfortunately for OSX I get these errors:
multibuild/common_utils.sh: line 106: shell_session_update: command not found
multibuild/common_utils.sh: line 45: shell_session_update: command not found
See here the logs:
https://travis-ci.org/maartenbreddels/vaex-wheels/jobs/166164630
and relevant links:
https://travis-ci.org/maartenbreddels/vaex-wheels
https://github.com/maartenbreddels/vaex-wheels
Any ideas? I saw #4, but nothing there helped, and the line numbers in these errors don't make sense.
I would really like to use these scripts in other open source projects (e.g. reference them directly in their travis jobs) and contribute here but currently there is no license for them specified. Even though it may not be a runtime dependency, a clear licensing would be very helpful.
Trying to fix #58 in a general way, we put the build PREFIX into CPPFLAGS and LDFLAGS - see https://github.com/matthew-brett/multibuild/pull/67/files . However, this causes a scipy build failure - see scipy/scipy#8136 and https://travis-ci.org/scipy/scipy/jobs/299202869 .
The error was:
gfortran: warning: x86_64 conflicts with i386 (arch flags ignored)
ld: warning: directory not found for option '-L/usr/local/Cellar/gcc/6.2.0/lib/gcc/6/gcc/x86_64-apple-darwin15.6.0/6.2.0'
Undefined symbols for architecture x86_64:
"_PyArg_ParseTupleAndKeywords", referenced from:
_f2py_rout__fftpack_zfft in _fftpackmodule.o
_f2py_rout__fftpack_drfft in _fftpackmodule.o
_f2py_rout__fftpack_zrfft in _fftpackmodule.o
_f2py_rout__fftpack_zfftnd in _fftpackmodule.o
_f2py_rout__fftpack_destroy_zfft_cache in _fftpackmodule.o
_f2py_rout__fftpack_destroy_zfftnd_cache in _fftpackmodule.o
_f2py_rout__fftpack_destroy_drfft_cache in _fftpackmodule.o
...
"_PyBytes_FromString", referenced from:
_PyInit__fftpack in _fftpackmodule.o
...
@xoviat kindly reverted the PREFIX changes in #71, #74.
This seems to be a problem with our gfortran install no longer finding the Python includes - time to investigate ...
From my understanding, the mac building process is done on 10.6. However, my extension doesn't build on 10.6. Is there any way to specify the target OS version for the wheel?
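One common lever here is the deployment-target variable; whether it is sufficient in the multibuild setup is an assumption on my part, and 10.9 is just an example value:

```shell
# Not multibuild-specific: clang/gcc honour MACOSX_DEPLOYMENT_TARGET, so
# exporting a newer minimum OS before the build changes what the compiled
# extension targets. Whether the resulting wheel's platform tag follows
# depends on how the Python interpreter itself was built.
export MACOSX_DEPLOYMENT_TARGET=10.9
```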
Hi,
Thanks for this project, it's a huge time-saver. I just noticed that the example .travis.yml file in the README should have the include/exclude sections under matrix rather than at the root.
The command for OSX has -arch i386 -arch x86_64, which appears to build for both. Is it possible to skip i386? I'm using __int128_t, which is only available on x86_64.
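If the arch flags arrive via the usual compiler variables (an assumption on my part, not documented multibuild behaviour), one sketch is to strip the i386 entry before building:

```shell
# Example starting value, mirroring the flags quoted above.
CFLAGS="-arch i386 -arch x86_64"
# Drop the i386 arch so only x86_64 remains (POSIX sed, no bashisms).
CFLAGS=$(printf '%s' "$CFLAGS" | sed 's/-arch i386 *//')
export CFLAGS
```

The same edit would presumably be needed for LDFLAGS if the link step carries the arch flags too.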
I'm not entirely clear on how everything hangs together. Is it possible that lzma is broken in one of the multibuild images?
I'm getting this error:
File "/opt/cp36m/lib/python3.6/lzma.py", line 27, in <module>
from _lzma import *
ModuleNotFoundError: No module named '_lzma'
I've used multibuild successfully at https://github.com/symengine/symengine-wheels and it works great for linux, osx and windows. Thanks for automating this process and I was able to get this running in a day.
One thing that I changed: instead of using the wheelhouse uploader, I used the github release deployment in travis-ci and appveyor to upload the wheels to a github release of the repo. (I have to make a release in the wheels repo every time I want the artifacts to be uploaded, but that's only a minor inconvenience.)
Cheers!
(Btw, I couldn't find any mailing list or chat room to say this, so I'm opening this issue and closing it.)
https://blog.travis-ci.com/2017-11-21-xcode8-3-default-image-announce
My osx builds are throwing "sudo: pip: command not found". It started right after they switched to the new default.
I realize that these scripts are mostly for travis, so of course your project will be hosted on git, but these are also nice for building wheels for any arbitrary docker environment. Would you be open to a patch that adds support for projects hosted on hg?
See the failed macOS builds on:
https://travis-ci.org/zopefoundation/zope.interface/builds/309275249
The Travis config:
https://github.com/zopefoundation/zope.interface/blob/d50e3d4bed1cecb8b308656ca509824fd379c6cc/.travis.yml#L22-L33
$ if [[ "$TRAVIS_OS_NAME" == "osx" ]]; then git clone https://github.com/MacPython/terryfy; fi
$ if [[ "$TRAVIS_OS_NAME" == "osx" ]]; then source terryfy/travis_tools.sh; fi
$ if [[ "$TRAVIS_OS_NAME" == "osx" ]]; then get_python_environment $TERRYFY_PYTHON venv; fi
sudo: pip: command not found
And the only place where that hits a sudo pip seems to be here:
https://github.com/matthew-brett/multibuild/blob/7d67726f08cb4ab80c235c5a97ee43b64f6112f1/osx_utils.sh#L253-L262
So for whatever reason /usr/local/bin is not in $PATH at that time on Travis - or is there a user error in the Travis config?
In osx_utils.sh, set_py_vars inserts the PYTHON_EXE directory at the beginning of the PATH variable:
https://github.com/matthew-brett/multibuild/blob/master/osx_utils.sh#L195
So deactivate does not have any effect on this, and the venv/bin is still in the PATH:
https://travis-ci.org/googlei18n/compreffor/jobs/168875918#L204
https://travis-ci.org/googlei18n/compreffor/jobs/168875918#L204
Travis CI issues a warning when the PATH variable has been modified and the rvm tool is run (e.g. at the deploy stage, when deployment dependencies are installed):
Warning! PATH is not properly set up, '/Users/travis/.rvm/gems/ruby-2.0.0-p648/bin' is not at first place,
usually this is caused by shell initialization files - check them for 'PATH=...' entries,
it might also help to re-add RVM to your dotfiles: 'rvm get stable --auto-dotfiles',
to fix temporarily in this shell session run: 'rvm use ruby-2.0.0-p648'.
Ref #58.
Pretty self-explanatory, but allows much nicer .travis.yml files.
None of the examples I could find for setting tests in config.sh seemed to care about the present working directory. It appears that in the NumPy ecosystem most libraries can trigger their tests by importing the library and calling a function.
Right now Biopython's tests are run via a specific script, and they expect to be run from the main source code's Tests/ folder.
The following worked in config.sh but is inelegant for Mac vs Linux (which appear to set up their environments slightly differently, affecting the environment variables available):
if [ -n "$IS_OSX" ]; then
cd ${TRAVIS_BUILD_DIR}/biopython/Tests
else
cd /io/biopython/Tests
fi
python run_tests.py --offline
# Assuming I should restore the working directory...
if [ -n "$IS_OSX" ]; then
cd ${TRAVIS_BUILD_DIR}
else
cd /io/
fi
How should I be doing this?
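One way to reduce the duplication above is a small helper that derives the source root from whichever environment applies (a sketch: tests_dir and run_tests are names I made up, and the /io fallback mirrors the Linux branch shown above):

```shell
# TRAVIS_BUILD_DIR is set on the OSX workers, while the Linux build runs
# inside the container with the sources mounted at /io, so fall back to
# that when the variable is absent.
tests_dir () {
    echo "${TRAVIS_BUILD_DIR:-/io}/biopython/Tests"
}

run_tests () {
    # Running in a subshell restores the working directory automatically,
    # which removes the second if/else block entirely.
    ( cd "$(tests_dir)" && python run_tests.py --offline )
}
```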
For some reason our builds are giving an error for xcode image 8.3.
The matthewbrett/trusty image is missing the following libraries required by https://www.python.org/dev/peps/pep-0513/#the-manylinux1-policy :
libICE.so.6
libSM.so.6
libGL.so.1
libgobject-2.0.so.0
libgthread-2.0.so.0
libglib-2.0.so.0
So any tested wheel that uses them fails to load.
You can see what I've done here, but basically the idea is to allow the use of ccache with multibuild. The relevant commands are below:
docker run --rm -v ${HOME}/.ccache:/ccache -v ${HOME}/.cache:/cache -v `pwd`:/io
In container:
# Link ccache and pip cache
ln -s /ccache $HOME/.ccache
I'm not sure how this should be implemented, but one idea is to enable it with a USE_CCACHE=1 variable defined in the travis env section.
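The opt-in could be a small guard in the container setup, e.g. (a sketch: USE_CCACHE and the /ccache mount come from the commands above, while the CC wrapping style is my assumption):

```shell
# Only wire up ccache when the user asked for it with USE_CCACHE=1.
# "maybe_enable_ccache" is a hypothetical helper, not multibuild API.
maybe_enable_ccache () {
    if [ "${USE_CCACHE:-0}" = "1" ]; then
        # /ccache is the bind mount from the docker run command above.
        if [ -d /ccache ]; then
            ln -sf /ccache "$HOME/.ccache"
        fi
        # Route compilations through ccache (one common wrapper style).
        export CC="ccache ${CC:-gcc}"
    fi
}
```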
I've noticed that some output is suppressed from build_config. Is there a way to disable this?
Basically the same as matthew-brett/delocate#31, but this one creates even more Mac jobs, putting more burden on the system.
Please consider reducing the Mac use.
Thanks.
Sometimes I have seen ValueError: unsupported hash type blake2s in my TravisCI linux builds, e.g. https://travis-ci.org/biopython/biopython-wheels/jobs/252869226 for 32-bit Python 3.6.0:
Status: Downloaded newer image for matthewbrett/trusty:32
ERROR:root:code for hash blake2b was not found.
Traceback (most recent call last):
File "/opt/cp36m/lib/python3.6/hashlib.py", line 243, in <module>
globals()[__func_name] = __get_hash(__func_name)
File "/opt/cp36m/lib/python3.6/hashlib.py", line 119, in __get_openssl_constructor
return __get_builtin_constructor(name)
File "/opt/cp36m/lib/python3.6/hashlib.py", line 113, in __get_builtin_constructor
raise ValueError('unsupported hash type ' + name)
ValueError: unsupported hash type blake2b
This is not unique to my builds, e.g. https://travis-ci.org/MacPython/astropy-wheels/jobs/251133396
This in itself does not seem to be breaking the builds, i.e. it looks like a harmless but scary warning.
It appears to be restricted to 32-bit builds only, which fits with https://bugs.python.org/issue30192 ("hashlib module breaks with 64-bit kernel and 32-bit user space") being the cause. That was fixed and back-ported to Python 3.6 in python/cpython#2042, which may or may not be included in Python 3.6.2, due later this month.
I just started getting this error this morning:
Traceback (most recent call last):
File "/io/multibuild/supported_wheels.py", line 9, in <module>
from pip.pep425tags import get_supported
ImportError: No module named pep425tags
ERROR: You must give at least one requirement to install (maybe you meant "pip install https://nipy.bic.berkeley.edu/manylinux"?)
See the complete log at: https://travis-ci.org/rlhelinski/bientropy-wheels/jobs/366528797
Not sure what's causing it.
https://www.appveyor.com/updates/
Miniconda3 3.16.0 (Python 3.4.3) is now installed into C:\Miniconda34 directory (x64 to C:\Miniconda34-x64 respectively), and C:\Miniconda3 is now mapped to the latest Miniconda3 4.3.27 (Python 3.6.2).
Any objections to merging devel into master? I guess we'll just have to hope for the best.
I'm thinking particularly of the bug referenced here, now fixed on multibuild devel - #146
Not sure if there is an easy solution here, but it might be nice to set the variable to blank rather than leaving it unbound, to allow stricter bash flags, e.g. set -u.
Right now setting that yields:
/io/multibuild/common_utils.sh: line 129: IS_OSX: unbound variable
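For reference, expanding with a default empty value is the usual way to stay clean under set -u:

```shell
set -u
# "${IS_OSX:-}" expands to "" when IS_OSX is unset instead of aborting
# under "set -u", so existing [ -n ... ] tests keep working unchanged.
if [ -n "${IS_OSX:-}" ]; then
    echo "macOS branch"
else
    echo "non-macOS branch"
fi
```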
I'm building a universal py2.py3 binary wheel which contains a native shared library loaded via ctypes.
I would like to test that building the wheel with, say, python2 inside the manylinux1 container produces a wheel that can be installed on python3 on the ubuntu container.
Is this kind of "cross-compilation" currently possible with multibuild?
If so, how?
Thank you in advance
The url of the official manylinux1 image is currently hardcoded:
Currently I have to compile a series of development tools from source (see #92), and I was wondering if I could instead use a copy of the official manylinux1 docker image that has those tools precompiled, to speed up the build.
It would be nice if this url could be overridden by an environment variable, e.g. $MANYLINUX_DOCKER_URL or similar.
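The override could be a one-line parameter default (MANYLINUX_DOCKER_URL is the name proposed above; the fallback image name here is just illustrative):

```shell
# Fall back to the stock manylinux1 image unless the user points
# elsewhere via MANYLINUX_DOCKER_URL.
DOCKER_IMAGE="${MANYLINUX_DOCKER_URL:-quay.io/pypa/manylinux1_x86_64}"
echo "using image: $DOCKER_IMAGE"
```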
Thanks
Right now rsync in fetch_unpack runs in verbose mode (-v). This can lead to it printing a lot of "deleting" messages (in the case where the folder has build artifacts).
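One low-risk option (a sketch; VERBOSE is a name I made up, and the other flags here are only placeholders for whatever fetch_unpack actually passes) would be to make -v opt-in:

```shell
# Keep rsync quiet by default; add -v only when the user sets VERBOSE.
rsync_flags="-a --delete"
if [ -n "${VERBOSE:-}" ]; then
    rsync_flags="$rsync_flags -v"
fi
echo "$rsync_flags"
```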
I'm wondering if anyone has tried extending the AppVeyor system from just building wheels to also building Windows installers (EXE and MSI files)?
build_script:
...
# build wheel:
- cd %REPO_DIR%
- git checkout %BUILD_COMMIT%
- "%CMD_IN_ENV% python setup.py bdist_wheel"
i.e. Could we add (or being more rigorous, define additional targets for):
- "%CMD_IN_ENV% python setup.py bdist_wininst"
- "%CMD_IN_ENV% python setup.py bdist_msi"
I see that NumPy never went down this route (it offered special Windows installers up until https://sourceforge.net/projects/numpy/files/NumPy/1.10.2/ and then only wheels from https://pypi.python.org/pypi/numpy/1.11.0 onwards).
Do people think that pip is mature enough that we need not worry about Windows installers for Python packages from now on?
So we have a complex issue with our build system. You noted yourself that auditwheel 1.6 introduced problems (pypa/auditwheel#68), which also broke all our builds (Python 2.7, 3.4, 3.5, 3.6) for https://github.com/opencobra/cobrapy. Hacking in an auditwheel version pinned to 1.5 at least restores our builds for all Python 3.4+ versions. However, since the auditwheel update (or something else around May 1) the Python 2.7 builds remain broken.
We've been poking around in different parts of the build system, but so far unsuccessfully. Do you have an idea where we could look further? And could you also clarify the following, please?
auditwheel is for Python 3.3+ only. So the correct way to build something for 2.7 seems to be to build the wheel in 2.7 but then perform the wheel repair step in 3.3+. We've had a look at the various multibuild scripts but we cannot discern whether that distinction is actually being made and where. Any words on this matter will be greatly appreciated.
Releasing a signed source distribution and signed wheels for all platforms is difficult to automate, because one has to wait for the artifacts to be built before signing them. Are there any good strategies for automating this process here? Could a builder submit wheels to some server (e.g. an AWS Lambda "server") for signature before uploading? The submission would have to be authenticated, of course.
In short, is there currently a wheelhouse-signer?
I've tried to set up a minimal multibuild environment from the example code, but I've been getting this vague error:
: command not found
It shows up on build_wheel $REPO_DIR $PLAT on Linux, and on source multibuild/travis_steps.sh on Mac:
https://travis-ci.org/HexDecimal/libtcod-cffi-wheels/builds/160198514
I'm not too familiar with the Unix shell. I'm not certain on how to debug this issue.
After successfully getting wheels built for vaex, I've tried with the kapteyn package; however, the osx builds fail, giving me a strange error:
https://travis-ci.org/kapteyn-astro/kapteyn-wheels/builds/171972026
Do you have any ideas?