multibuild's People

Contributors

anthrotype, ariddell, aruntejch, asenyaev, cancan101, cclauss, dlech, hcho3, hexdecimal, hugovk, isuruf, liath, matthew-brett, mattip, multibuilder, native-api, ogarod, peterjc, radarhere, rmax, robbuckley, saimn, sanurielf, sgillies, snowman2, superbobry, tacaswell, tomkooij, xhochy, xoviat


multibuild's Issues

Drop Python 2.6

In #86, I suggested using the -m virtualenv option to create a virtual environment. However, this requires dropping support for Python 2.6.

Python 2.6, 3.3 Mac builds fail

I'm experimenting with using this to build wheels for the lxml project. lxml supports Python 3.3 and 2.6, but for reasons I don't quite understand the builds fail on Mac with these two versions. I suspect you already know about this, since the readme's example Travis file does not include these versions (well, they are there for Linux but not OSX), but I did not see an issue, so I opened this one.

Tag
https://github.com/Bachmann1234/lxml-wheels/releases/tag/wheel-test-2

2.6 failing build
https://travis-ci.org/Bachmann1234/lxml-wheels/jobs/234384030

hdiutil: attach failed - image not recognized

3.3 failing build
https://travis-ci.org/Bachmann1234/lxml-wheels/jobs/234384032

installer: Package name is Python
installer: Installing at base path /
2017-05-20 19:24:23.527 installer[2500:5293] Package /Volumes/Python/Python.mpkg/Contents/Packages/PythonFramework-3.3.pkg uses a deprecated pre-10.2 format (or uses a newer format but is invalid).
installer: The install failed (The Installer could not install the software because there was no software found to install.)

project development model

Hi,

Let me preface this issue by just saying thank you very much for all your work in providing an extremely useful project. We use it all the time.

Since there are no releases, I was wondering what your development model is? Currently, in our own builds we check out a specific hash, but it is fairly arbitrary which commit gets picked.

I don't see a devel branch or something similar which would suggest that HEAD of master is always stable and you just merge desired features from a devel branch into master occasionally. So is HEAD of master stable in any way? Or could you tag certain commits when you consider the current state of the project stable? Then we could check out the latest tag.

Thanks for considering this.

``repair_wheelhouse`` does not generate output within 10m

I'm hitting the travis-ci "10m without stdout/stderr output" limit in repair_wheelhouse on macOS. It arises because:

  • delocate-listdeps takes ~5 minutes
  • delocate-wheel takes ~5 minutes

and together they generate no output.

I think just echoing something between them might be the most robust and easiest fix. Otherwise one could prefix the call to repair_wheelhouse with travis_wait <timeout>.

Which would you prefer? I'm happy to create a PR.

Thanks again for setting this all up. It's a great tool to have.
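
The echo fix can be sketched as follows. This is a standalone illustration, not multibuild's actual function body: the delocate calls are stubbed out with `:` so the sketch runs without delocate installed.

```shell
# Sketch of the proposed fix: emit a line of output between the two
# long-running delocate steps so Travis' 10-minute no-output timer
# is reset. The ':' no-ops stand in for the real delocate commands.
repair_wheelhouse () {
    local wheelhouse=$1
    :   # stand-in for: delocate-listdeps $wheelhouse/*.whl
    echo "delocate-listdeps finished, running delocate-wheel"
    :   # stand-in for: delocate-wheel $wheelhouse/*.whl
    echo "delocate-wheel finished"
}
repair_wheelhouse ./wheelhouse
```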

Environment variables defined in library_builders.sh are problematic

https://github.com/matthew-brett/multibuild/blob/devel/library_builders.sh defines numerous environment variables globally. On macOS, that includes compiler flags.

There are at least two problems here:

  • Even though the script declares that it defines "library compilation flags", they also affect the project itself, which is counterintuitive. (It took me a week to find out where "-arch i386 -arch x86_64" came from, because I never expected something called "library_builders.sh" to affect anything but library builds, and blamed CMake logic or Travis instead.)
  • The script's provided values are clearly designed to be overridable by presetting them in the environment. But in the Docker environment, no user code runs before library_builders.sh is sourced, so it's impossible to override them there.

I'm not quite sure which course of action would be in line with the big design, so prefer to discuss this first.
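
For concreteness, here is a minimal sketch of the overridable-default idiom in question (TEST_CFLAGS is an illustrative name, not the script's actual variable). The `:=` expansion only honors a value that is already in the environment when the script is sourced, which is exactly what the Docker path never gets a chance to set:

```shell
# The overridable-default idiom: keep a preset value if one exists,
# otherwise fall back to the script's built-in default.
set_flags () {
    : ${TEST_CFLAGS:="-arch i386 -arch x86_64"}
    export TEST_CFLAGS
}

TEST_CFLAGS="-arch x86_64"   # a preset made before sourcing wins...
set_flags
echo "TEST_CFLAGS=$TEST_CFLAGS"
# ...but in the Docker flow no user code runs before the source,
# so the built-in default always takes effect there.
```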

Packages that use versioneer get an unknown version

I have run into this problem and, while other incarnations of it are mentioned elsewhere, I think here is a better place to file an issue, as it will be discovered faster by users of multibuild.

Elsewhere
python-versioneer/python-versioneer#38
python-versioneer/python-versioneer#121
matplotlib/matplotlib#6605

Diagnosis
The underlying factor in this problem is that the project being built is a git submodule. A submodule does not have a .git directory at the package root; instead it has a .git file whose contents are a relative path to the real ".git" directory.

For example:

gitdir: ../.git/modules/scikit-misc

When pip wheel creates a wheel (in this case from the submodule), it copies the package to a temporary directory, then builds from there. The copy breaks the relative path in the .git file, so the build does not happen in a valid git repository. Hence Versioneer, which uses git describe, cannot determine the version.

Solution
One solution, using build_bdist_wheel instead of build_pip_wheel, is already present in the code, but as a workaround it is (or at least was for me) hard to discover.

I think a better solution would be to replace the relative path to the git directory with an absolute path. A good place for this seems to be the clean_code function in common_utils.sh. An argument against this approach is that git may change things around in the future.
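
A hypothetical sketch of that fix (the function name and placement are suggestions, not existing multibuild code), demonstrated on a fabricated submodule layout so it runs standalone:

```shell
# Rewrite a submodule's .git *file* ("gitdir: <relative path>") to
# carry an absolute path, so copies of the tree still point at the
# real git directory.
fixup_gitdir () {
    local pkg_dir=$1
    if [ -f "$pkg_dir/.git" ]; then                 # submodule: a file, not a dir
        local rel=$(sed -e 's/^gitdir: //' "$pkg_dir/.git")
        local abs=$(cd "$pkg_dir/$rel" && pwd)
        echo "gitdir: $abs" > "$pkg_dir/.git"
    fi
}

# Demonstrate on a fabricated layout mimicking the example above:
mkdir -p demo/.git/modules/scikit-misc demo/scikit-misc
echo "gitdir: ../.git/modules/scikit-misc" > demo/scikit-misc/.git
fixup_gitdir demo/scikit-misc
cat demo/scikit-misc/.git    # now an absolute gitdir path
```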

Miniconda34-x64 failing with raise ValueError(str(list(result.keys())))

It took a couple of tweaks to refresh the multibuild configuration for our repository https://github.com/biopython/biopython-wheels/ but I was able to build all our Mac and Linux wheels using TravisCI, and almost all the Windows wheels.

Our current appveyor.yml file,

https://github.com/biopython/biopython-wheels/blob/77b948054f4207cec563e922781f93054f82745d/appveyor.yml

This single target is failing:

    - PYTHON: "C:\\Miniconda34-x64"
      PYTHON_VERSION: "3.4"
      PYTHON_ARCH: "64"
      BUILD_DEPENDS: "numpy==1.9.0"
      TEST_DEPENDS: "numpy==1.9.0"

https://ci.appveyor.com/project/biopython/biopython-wheels/build/1.0.54/job/86t1cjvxigjqodv4

python setup.py bdist_wheel
running bdist_wheel
running build
running build_py
creating build
creating build\lib.win-amd64-3.4
creating build\lib.win-amd64-3.4\Bio
copying Bio\bgzf.py -> build\lib.win-amd64-3.4\Bio
copying Bio\DocSQL.py -> build\lib.win-amd64-3.4\Bio
...
running build_ext
building 'Bio.cpairwise2' extension
Traceback (most recent call last):
  File "setup.py", line 478, in <module>
    install_requires=REQUIRES,
  File "C:\Miniconda34-x64\lib\distutils\core.py", line 148, in setup
    dist.run_commands()
  File "C:\Miniconda34-x64\lib\distutils\dist.py", line 955, in run_commands
    self.run_command(cmd)
  File "C:\Miniconda34-x64\lib\distutils\dist.py", line 974, in run_command
    cmd_obj.run()
  File "C:\Miniconda34-x64\lib\site-packages\wheel\bdist_wheel.py", line 179, in run
    self.run_command('build')
  File "C:\Miniconda34-x64\lib\distutils\cmd.py", line 313, in run_command
    self.distribution.run_command(command)
  File "C:\Miniconda34-x64\lib\distutils\dist.py", line 974, in run_command
    cmd_obj.run()
  File "C:\Miniconda34-x64\lib\distutils\command\build.py", line 126, in run
    self.run_command(cmd_name)
  File "C:\Miniconda34-x64\lib\distutils\cmd.py", line 313, in run_command
    self.distribution.run_command(command)
  File "C:\Miniconda34-x64\lib\distutils\dist.py", line 974, in run_command
    cmd_obj.run()
  File "setup.py", line 206, in run
    build_ext.run(self)
  File "C:\Miniconda34-x64\lib\site-packages\setuptools\command\build_ext.py", line 50, in run
    _build_ext.run(self)
  File "C:\Miniconda34-x64\lib\distutils\command\build_ext.py", line 339, in run
    self.build_extensions()
  File "C:\Miniconda34-x64\lib\distutils\command\build_ext.py", line 448, in build_extensions
    self.build_extension(ext)
  File "C:\Miniconda34-x64\lib\site-packages\setuptools\command\build_ext.py", line 183, in build_extension
    _build_ext.build_extension(self, ext)
  File "C:\Miniconda34-x64\lib\distutils\command\build_ext.py", line 503, in build_extension
    depends=ext.depends)
  File "C:\Miniconda34-x64\lib\distutils\msvc9compiler.py", line 460, in compile
    self.initialize()
  File "C:\Miniconda34-x64\lib\distutils\msvc9compiler.py", line 371, in initialize
    vc_env = query_vcvarsall(VERSION, plat_spec)
  File "C:\Miniconda34-x64\lib\site-packages\setuptools\msvc9_support.py", line 52, in query_vcvarsall
    return unpatched['query_vcvarsall'](version, *args, **kwargs)
  File "C:\Miniconda34-x64\lib\distutils\msvc9compiler.py", line 287, in query_vcvarsall
    raise ValueError(str(list(result.keys())))
ValueError: ['path']
Command exited with code 1

(Updated to include final two lines which I missed in initial copy-and-paste)

Any thoughts on what might be breaking on this specific platform (64-bit Python 3.4)?

I consider this low priority as most of our Windows users are likely to want a more recent Python.

delocate error with cfitsio on MacOS

Hi,

I'm trying to build wheels for mpdaf, which needs cfitsio for a C extension. So I'm building the shared lib for cfitsio (https://github.com/musevlt/mpdaf-wheels/blob/master/config.sh), which now works on Linux but fails on OSX with this error: delocate.delocating.DelocationError: library "/Users/travis/build/musevlt/mpdaf-wheels/libcfitsio.2.dylib" does not exist
https://travis-ci.org/musevlt/mpdaf-wheels/jobs/347532865
It looks like libcfitsio.2.dylib has been built correctly, so do you have any idea what's going on here?

Also, do you think it would be useful to have the cfitsio build recipe in multibuild? (I could not use build_simple because of cfitsio's naming, e.g. cfitsio3370.tar.gz with no dash, sigh.)

By the way, is there a way to download the wheels directly from Travis, or is it required to put them on Rackspace? If so, could you set up a key for this project?

shell_session_update: command not found

Great project 👍, unfortunately for OSX I get these errors:
multibuild/common_utils.sh: line 106: shell_session_update: command not found
multibuild/common_utils.sh: line 45: shell_session_update: command not found
See here the logs:
https://travis-ci.org/maartenbreddels/vaex-wheels/jobs/166164630
and relevant links:
https://travis-ci.org/maartenbreddels/vaex-wheels
https://github.com/maartenbreddels/vaex-wheels

Any ideas? I saw #4, but nothing there helped, and the line numbers in these errors don't make sense.

Clear licensing

I would really like to use these scripts in other open source projects (e.g. reference them directly in their Travis jobs) and contribute here, but currently no license is specified for them. Even though it may not be a runtime dependency, a clear license would be very helpful.

Scipy build failures with PREFIX on include / library path

Trying to fix #58 in a general way, we put the build PREFIX into CPPFLAGS and the LDFLAGS - see https://github.com/matthew-brett/multibuild/pull/67/files . However this causes a scipy build failure - see scipy/scipy#8136 and https://travis-ci.org/scipy/scipy/jobs/299202869 .

The error was:

ld: warning: directory not found for option '-Lgfortran: warning: x86_64 conflicts with i386 (arch flags ignored)
/usr/local/Cellar/gcc/6.2.0/lib/gcc/6/gcc/x86_64-apple-darwin15.6.0/6.2.0'
Undefined symbols for architecture x86_64:
  "_PyArg_ParseTupleAndKeywords", referenced from:
      _f2py_rout__fftpack_zfft in _fftpackmodule.o
      _f2py_rout__fftpack_drfft in _fftpackmodule.o
      _f2py_rout__fftpack_zrfft in _fftpackmodule.o
      _f2py_rout__fftpack_zfftnd in _fftpackmodule.o
      _f2py_rout__fftpack_destroy_zfft_cache in _fftpackmodule.o
      _f2py_rout__fftpack_destroy_zfftnd_cache in _fftpackmodule.o
      _f2py_rout__fftpack_destroy_drfft_cache in _fftpackmodule.o
      ...
  "_PyBytes_FromString", referenced from:
      _PyInit__fftpack in _fftpackmodule.o
...

@xoviat kindly reverted the PREFIX changes in #71, #74.

This seems to be a problem with our gfortran install no longer finding the Python includes - time to investigate ...

ext only buildable on mac 10.12

From my understanding, the Mac building process is done on 10.6. However, my extension doesn't build on 10.6. Is there any way to specify a target OS version for the wheel?

Readme: include/exclude should be under matrix

Hi,
Thanks for this project, it's a huge time-saver. I just noticed that the example .travis.yml file in the README should have the include/exclude sections under matrix rather than at the root.

Skip i386 on OSX build

The command for OSX has -arch i386 -arch x86_64, which appears to build for both. Is it possible to skip i386? I'm using __int128_t, which is only available for x86_64.
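
One hedged workaround, assuming the arch flags reach the compiler via the usual environment variables (I have not verified every place multibuild injects them): preset single-arch flags in config.sh before the defaults are applied.

```shell
# Preset x86_64-only flags so the i386 half of the universal build
# is dropped (assumes the build honors these standard variables).
export CFLAGS="-arch x86_64"
export CXXFLAGS="-arch x86_64"
export LDFLAGS="-arch x86_64"
echo "arch flags: $CFLAGS"
```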

lzma missing?

I'm not entirely clear on how everything hangs together. Is it possible that lzma is broken in one of the multibuild images?

I'm getting this error:

  File "/opt/cp36m/lib/python3.6/lzma.py", line 27, in <module>
    from _lzma import *
ModuleNotFoundError: No module named '_lzma'

Github releases for deployment.

I've used multibuild successfully at https://github.com/symengine/symengine-wheels and it works great for linux, osx and windows. Thanks for automating this process and I was able to get this running in a day.

One thing that I changed was that instead of using wheelhouse-uploader, I used the GitHub release deployment in travis-ci and appveyor to upload the wheels to a GitHub release of the repo. (I have to make a release in the wheels repo every time I want the artifacts to be uploaded, but that's only a minor inconvenience.)

Cheers !

(Btw, I couldn't find any mailing list or chat room to say this, so I'm opening this issue and closing it.)

Build steps assume project is hosted in a git repository

I realize that these scripts are mostly for Travis, so of course your project will be hosted in git, but they are also nice for building wheels in any arbitrary Docker environment. Would you be open to a patch that adds support for projects hosted on hg?

macOS builds do not find pip when trying to uninstall the Travis virtualenv pip

See the failed macOS builds on:
https://travis-ci.org/zopefoundation/zope.interface/builds/309275249

The Travis config:
https://github.com/zopefoundation/zope.interface/blob/d50e3d4bed1cecb8b308656ca509824fd379c6cc/.travis.yml#L22-L33

$ if [[ "$TRAVIS_OS_NAME" == "osx" ]]; then git clone https://github.com/MacPython/terryfy; fi
$ if [[ "$TRAVIS_OS_NAME" == "osx" ]]; then source terryfy/travis_tools.sh; fi
$ if [[ "$TRAVIS_OS_NAME" == "osx" ]]; then get_python_environment $TERRYFY_PYTHON venv; fi
sudo: pip: command not found

And the only place where that hits a sudo pip seems to be here:
https://github.com/matthew-brett/multibuild/blob/7d67726f08cb4ab80c235c5a97ee43b64f6112f1/osx_utils.sh#L253-L262

So for whatever reason /usr/local/bin is not in $PATH at that time on Travis or there is a user error in the Travis config?

in osx build, venv stays in $PATH even after deactivate

In osx_utils.sh, set_py_vars inserts the PYTHON_EXE directory at the beginning of the PATH variable:
https://github.com/matthew-brett/multibuild/blob/master/osx_utils.sh#L195

So deactivate does not have any effect on this, and venv/bin is still in the PATH:
https://travis-ci.org/googlei18n/compreffor/jobs/168875918#L204

Travis CI then issues a warning when the PATH variable has been modified and the rvm tool is run (e.g. at the deploy stage, when deployment dependencies are installed):

Warning! PATH is not properly set up, '/Users/travis/.rvm/gems/ruby-2.0.0-p648/bin' is not at first place,
         usually this is caused by shell initialization files - check them for 'PATH=...' entries,
         it might also help to re-add RVM to your dotfiles: 'rvm get stable --auto-dotfiles',
         to fix temporarily in this shell session run: 'rvm use ruby-2.0.0-p648'.
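
The failure mode and a manual cleanup can be sketched standalone (the variable names here are illustrative, not multibuild's actual ones):

```shell
# set_py_vars-style prepend: the venv bin dir goes first on PATH...
VENV_BIN=/tmp/demo-venv/bin
PATH="$VENV_BIN:$PATH"
# ...and deactivate does not remove it, so it has to be stripped by
# hand before PATH-sensitive tools like rvm run:
PATH=$(echo "$PATH" | sed -e "s|$VENV_BIN:||")
echo "$PATH"
```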

Clarify expected working directory when tests are run

None of the examples I could find for setting up tests in config.sh seemed to care about the present working directory. It appears that in the NumPy ecosystem most libraries can trigger their tests by importing the library and calling a function.

Right now Biopython's tests are run via a specific script, and they expect to be in the main source code's Tests/ folder.

The following worked in config.sh, but is inelegant for Mac vs Linux (which appear to set up their environments slightly differently, affecting the environment variables available):

    if [ -n "$IS_OSX" ]; then
        cd ${TRAVIS_BUILD_DIR}/biopython/Tests
    else
        cd /io/biopython/Tests
    fi
    python run_tests.py --offline
    # Assuming I should restore the working directory...
    if [ -n "$IS_OSX" ]; then
        cd ${TRAVIS_BUILD_DIR}
    else
        cd /io/
    fi

How should I be doing this?
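
One possible answer (a suggestion, not a documented multibuild convention): do the cd in a subshell so the caller's working directory is restored automatically, and fold the platform difference into a single default.

```shell
# Run a command in a directory without disturbing the caller's cwd:
# the parenthesized subshell confines the cd.
run_in () {
    local dir=$1; shift
    ( cd "$dir" && "$@" )
}

# In config.sh this would collapse the OSX/Linux branches to e.g.:
#   run_in "${TRAVIS_BUILD_DIR:-/io}/biopython/Tests" python run_tests.py --offline
run_in /tmp pwd          # demo: runs pwd in /tmp, cwd unchanged
echo "still in $(pwd)"
```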

Allowing use of ccache for multibuild

You can see what I've done here, but basically the idea is to allow the use of ccache with multibuild. The relevant commands are below:

docker run --rm -v ${HOME}/.ccache:/ccache -v ${HOME}/.cache:/cache -v `pwd`:/io

In container:

# Link ccache and pip cache
ln -s /ccache $HOME/.ccache

I'm not sure how this should be implemented, but one idea is to enable it with a USE_CCACHE=1 variable defined in the travis env section.
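
A sketch of what a USE_CCACHE=1 toggle could do inside the container (the variable name and paths follow the proposal above and are not implemented multibuild behavior); a temp directory stands in for $HOME so the sketch is side-effect free:

```shell
USE_CCACHE=1
CONTAINER_HOME=$(mktemp -d)     # stand-in for $HOME inside the container
if [ "${USE_CCACHE:-0}" = "1" ]; then
    # /ccache is the bind-mounted host cache from the docker run above
    ln -sf /ccache "$CONTAINER_HOME/.ccache"
    # ccache's compiler shims shadow cc/gcc for the rest of the build
    export PATH="/usr/lib/ccache:$PATH"
fi
```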

Output suppressed

I've noticed that some output is suppressed from build_config. Is there a way to disable this?

ValueError: unsupported hash type blake2s

Sometimes I have seen ValueError: unsupported hash type blake2s in my TravisCI linux builds, e.g. https://travis-ci.org/biopython/biopython-wheels/jobs/252869226 for 32-bit Python 3.6.0

Status: Downloaded newer image for matthewbrett/trusty:32
ERROR:root:code for hash blake2b was not found.
Traceback (most recent call last):
  File "/opt/cp36m/lib/python3.6/hashlib.py", line 243, in <module>
    globals()[__func_name] = __get_hash(__func_name)
  File "/opt/cp36m/lib/python3.6/hashlib.py", line 119, in __get_openssl_constructor
    return __get_builtin_constructor(name)
  File "/opt/cp36m/lib/python3.6/hashlib.py", line 113, in __get_builtin_constructor
    raise ValueError('unsupported hash type ' + name)
ValueError: unsupported hash type blake2b

This is not unique to my builds, e.g. https://travis-ci.org/MacPython/astropy-wheels/jobs/251133396

This in itself does not seem to be breaking the builds, i.e. it looks like a harmless but scary warning.

It appears to be restricted to 32-bit builds only, which fits with https://bugs.python.org/issue30192 ("hashlib module breaks with 64-bit kernel and 32-bit user space") being the cause. That was fixed and back-ported to Python 3.6 in python/cpython#2042, which may or may not be included in Python 3.6.2, due later this month.

No module named pep425tags

I just started getting this error this morning:

Traceback (most recent call last):
  File "/io/multibuild/supported_wheels.py", line 9, in <module>
    from pip.pep425tags import get_supported
ImportError: No module named pep425tags
ERROR: You must give at least one requirement to install (maybe you meant "pip install https://nipy.bic.berkeley.edu/manylinux"?)

See the complete log at: https://travis-ci.org/rlhelinski/bientropy-wheels/jobs/366528797

Not sure what's causing it.

Time to merge devel into master

Any objections to merging devel into master? I guess we'll just have to hope for the best.

I'm thinking particularly of the bug referenced here, now fixed on multibuild devel - #146

Avoid Using Unbound Variable

Not sure if there is an easy solution here, but it might be nice to set variables to blank rather than leaving them unbound, to allow stricter bash flags, e.g. set -u.

Right now setting that yields:

/io/multibuild/common_utils.sh: line 129: IS_OSX: unbound variable
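
The standard guard for this is the ${VAR:-} default expansion, which is safe under set -u. A runnable sketch using the variable from the error above:

```shell
set -u
# ${IS_OSX:-} expands to empty instead of aborting when IS_OSX is
# unset, which is what `set -u` would otherwise do on a bare $IS_OSX.
if [ -n "${IS_OSX:-}" ]; then
    echo "macOS build"
else
    echo "non-macOS build"
fi
```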

Test universal py2.py3 wheel on different version of python?

I'm building a universal py2.py3 binary wheel which contains a native shared library loaded via ctypes.
I would like to test that building the wheel with, say, python2 inside the manylinux1 container produces a wheel that can be installed on python3 on the ubuntu container.

Is this kind of "cross-compilation" currently possible with multibuild?
If so, how?

Thank you in advance

Allow passing a user-defined manylinux1 Docker image URL

The URL of the official manylinux1 image is currently hardcoded:

https://github.com/matthew-brett/multibuild/blob/1b97cd55094743702ef7ad3161d579d27f9250d4/travis_linux_steps.sh#L80

Currently I have to compile a series of development tools from source (see #92), and I was wondering if I could instead use a copy of the official manylinux1 Docker image that has those tools precompiled, so that I could speed up the build.

It would be nice if this URL could be overridden by an environment variable, e.g. $MANYLINUX_DOCKER_URL or similar.
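
The override could look like this sketch (MANYLINUX_DOCKER_URL and the default image name are illustrative; check travis_linux_steps.sh for the actual hardcoded value):

```shell
# Fall back to the stock image unless the user exported an override.
DOCKER_IMAGE="${MANYLINUX_DOCKER_URL:-quay.io/pypa/manylinux1_x86_64}"
echo "docker pull $DOCKER_IMAGE"     # the pull/run steps would use this
```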

Thanks

Building EXE and MSI installers with AppVeyor?

I'm wondering if anyone has tried extending the AppVeyor system from just building wheels to also building Windows installers (EXE and MSI files):

build_script:
    ...
    # build wheel:
    - cd %REPO_DIR%
    - git checkout %BUILD_COMMIT%
    - "%CMD_IN_ENV% python setup.py bdist_wheel"

i.e. could we add (or, being more rigorous, define additional targets for):

    - "%CMD_IN_ENV% python setup.py bdist_wininst"
    - "%CMD_IN_ENV% python setup.py bdist_msi"

I appreciate that NumPy never went down this route (it offered special Windows installers up until https://sourceforge.net/projects/numpy/files/NumPy/1.10.2/ and then only wheels from
https://pypi.python.org/pypi/numpy/1.11.0 onwards).

Do people think that pip is mature enough that we need not worry about Windows installers for Python packages from now on?

Broken builds for Python 2.7

So we have a complex issue with our build system. You noted yourself that auditwheel 1.6 introduced problems (pypa/auditwheel#68), which also broke all our https://github.com/opencobra/cobrapy builds for Python 2.7, 3.4, 3.5, and 3.6. Pinning auditwheel to version 1.5 at least restores our builds for all Python 3.4+ versions. However, since the auditwheel update (or something else around May 1), the Python 2.7 builds remain broken.

We've been poking around in different parts of the build system, but so far without success. Do you have an idea where we could look further? And could you also clarify the following, please?

auditwheel is for Python 3.3+ only. So the correct way to build something for 2.7 seems to be to build the wheel with 2.7 but then perform the wheel repair step with 3.3+. We've looked at the various multibuild scripts, but we cannot discern whether that distinction is actually being made, and where. Any words on this matter would be greatly appreciated.

Options for automating entire release process (including signing)?

Releasing a signed source distribution and signed wheels for all platforms is difficult to automate because one has to wait for the artifacts to be built before signing them. Are there any good strategies for automating this process here? Could a builder submit wheels to some server (e.g. AWS Lambda "server") for the signature before uploading? The submission would have to be authenticated, of course.

In short, is there currently a wheelhouse-signer?
