staged-recipes's Introduction

About

This repo is a holding area for recipes destined for a conda-forge feedstock repo. To find out more about conda-forge, see https://github.com/conda-forge/conda-smithy.

Join the chat at https://gitter.im/conda-forge/conda-forge.github.io

Feedstock conversion status

create_feedstocks

Failures with the above job are often caused by API rate limits from the various services used by conda-forge. This can result in empty feedstock repositories and will resolve itself automatically. If the issue persists, support can be found on Gitter.

Getting started

  1. Fork this repository.
  2. Make a new folder in recipes for your package. Look at the example recipe, our documentation and the FAQ for help.
  3. Open a pull request. Your package build will be tested on Windows, macOS, and Linux.
  4. When your pull request is merged, a new repository, called a feedstock, will be created in the GitHub conda-forge organization, and the build and upload of your package will automatically be triggered. Once complete, the package is available on conda-forge.

Grayskull - recipe generator for Python packages on PyPI

For Python packages available on PyPI, it is possible to use grayskull to generate the recipe. The generated recipe should be reviewed, especially the license and dependencies.

Installing grayskull: conda install -c conda-forge grayskull

Generating recipe: grayskull pypi PACKAGE_NAME_HERE

FAQ

1. How do I start editing the recipe?

Look at one of the example recipes in this repository and modify it as necessary.

Follow the order of the sections in the example recipe. If you make a copy of the example recipe, please remove the example's explanatory comments from your recipe. Add your own comments to the recipe and build scripts to explain any unusual build behavior or recipe options.

If there are details you are not sure about, please open a pull request. The conda-forge team will be happy to answer your questions.
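For orientation, a minimal pure-Python recipe following the usual section order might look roughly like the sketch below (every name, URL, and hash here is a placeholder, not a real package):

package:
  name: mypackage                # placeholder name
  version: "1.0.0"               # placeholder version

source:
  url: https://pypi.io/packages/source/m/mypackage/mypackage-1.0.0.tar.gz   # placeholder URL
  sha256: 0000000000000000000000000000000000000000000000000000000000000000  # placeholder hash

build:
  number: 0
  noarch: python
  script: {{ PYTHON }} -m pip install . -vv

requirements:
  host:
    - python
    - pip
  run:
    - python

test:
  imports:
    - mypackage

about:
  home: https://example.com/mypackage   # placeholder
  license: MIT
  license_file: LICENSE
  summary: One-line description of the package

extra:
  recipe-maintainers:
    - your-github-username       # placeholder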

2. How do I populate the hash field?

If your package is on PyPI, you can get the sha256 hash from your package's page on PyPI; look for the SHA256 link next to the download link for your package.

You can also generate a hash from the command line on Linux (and macOS, if you install the necessary tools mentioned below). If you go this route, the sha256 hash is preferable to the md5 hash.

To generate the md5 hash: md5 your_sdist.tar.gz

To generate the sha256 hash: openssl sha256 your_sdist.tar.gz

You may need the openssl package, available on conda-forge: conda install openssl -c conda-forge
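However you obtain it, the hash goes into the source section of meta.yaml. A sketch, with a placeholder URL and hash:

source:
  url: https://pypi.io/packages/source/y/yourpackage/yourpackage-1.2.3.tar.gz   # placeholder URL
  sha256: 0000000000000000000000000000000000000000000000000000000000000000      # placeholder hash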

3. How do I exclude a platform?

Use the skip key in the build section along with a selector:

build:
    skip: true  # [win]

A full description of selectors is in the conda docs.

If the package can otherwise be noarch, you can also skip platforms by using virtual packages.

Note: As the package will always be built on Linux, it needs to at least be installable there.
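One possible pattern for that (a sketch, not an official template; it assumes conda's standard virtual packages such as __unix are available) is to keep the recipe noarch and add the virtual package as a run requirement, so the package can only be installed where that virtual package exists:

build:
  noarch: python
  script: {{ PYTHON }} -m pip install . -vv

requirements:
  run:
    - python
    - __unix    # virtual package: restricts installation to Unix-like platforms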

4. What does the build: 0 entry mean?

The build number is used when the source code for the package has not changed but you need to make a new build. For example, if one of the dependencies of the package was not properly specified the first time you built the package, then when you fix the dependency and rebuild the package you should increase the build number.

When the package version changes you should reset the build number to 0.
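For example (a sketch with a placeholder package): rebuilding the same 1.2.3 source with a corrected dependency only bumps the build number:

package:
  name: yourpackage
  version: "1.2.3"

build:
  number: 1    # bumped from 0: same source, corrected dependencies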

5. Do I have to import all of my unit tests into the recipe's test field?

No, you do not. The main purpose of the test section is to test whether this conda package was built and installed correctly (not whether the upstream package contains bugs).
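A typical test section therefore only checks that the package installed cleanly, for example (the module and command names below are placeholders, and pip check is just one common, optional sanity check):

test:
  requires:
    - pip
  commands:
    - pip check
    - yourtool --help    # placeholder: only if the package ships a CLI entry point
  imports:
    - yourpackage        # placeholder import name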

6. Do all of my package's dependencies have to be in conda(-forge) already?

Short answer: yes. Long answer: In principle, as long as your dependencies are in at least one of your users' conda channels, they will be able to install your package. In practice, that is difficult to manage, and we strive to get all dependencies built in conda-forge.

7. When or why do I need to use {{ PYTHON }} -m pip install . -vv?

This should be the default install line for most Python packages. It is preferable to python setup.py install because it handles the package metadata in a more conda-friendly way.
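As a sketch, for a typical pure-Python package this usually amounts to:

build:
  number: 0
  script: {{ PYTHON }} -m pip install . -vv

requirements:
  host:
    - python
    - pip
  run:
    - python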

8. Do I need bld.bat and/or build.sh?

In many cases, no. Python packages almost never need them. If the build can be done in one line, you can put it in the script line of the build section.

9. What does being a conda-forge feedstock maintainer entail?

The maintainers "job" is to:

  • keep the feedstock updated by merging maintenance PRs from conda-forge's bots;
  • keep the package updated by bumping the version whenever there is a new release;
  • answer questions about the package on the feedstock issue tracker.

10. Why are there recipes already in the recipes directory? Should I do something about it?

When a PR of recipe(s) is ready to go, it is merged into main. This triggers a CI build specifically designed to convert the recipe(s). However, for any number of reasons the recipe(s) may not be converted right away; in the interim, they will remain in main until they can be converted. No action is required from recipe contributors to resolve this, and it should have no impact on any other PRs being proposed. If recipes pending conversion do cause issues for your submission, please ping conda-forge/core for help.

11. Some checks failed, but it wasn't my recipe! How do I trigger a rebuild?

Sometimes a CI build fails for reasons unrelated to your recipe. If that happens, you can trigger a rebuild by re-creating the last commit and force-pushing it to your branch:

# edit your last commit, giving it a new time stamp and hash
# (you can just leave the message as it is)
git commit --amend
# push to github, overwriting your branch
git push -f

If the problem was due to scripts in the staged-recipes repository, you may be asked to "rebase" once these are fixed. To do so, run:

# If you didn't add a remote for conda-forge/staged-recipes yet, also run
# these lines:
# git remote add upstream https://github.com/conda-forge/staged-recipes.git
# git fetch --all
git rebase upstream/main
git push -f

12. My pull request passes all checks, but hasn't received any attention. How do I call attention to my PR? What is the customary amount of time to wait?

Thank you very much for putting in this recipe PR!

This repository is very active, so if you need help with a PR, please let the right people know. There are language-specific teams for reviewing recipes.

Language           Review team
python             @conda-forge/help-python
python/c hybrid    @conda-forge/help-python-c
r                  @conda-forge/help-r
java               @conda-forge/help-java
nodejs             @conda-forge/help-nodejs
c/c++              @conda-forge/help-c-cpp
perl               @conda-forge/help-perl
julia              @conda-forge/help-julia
ruby               @conda-forge/help-ruby
other              @conda-forge/staged-recipes

Once the PR is ready for review, please mention one of the teams above in a new comment, e.g. "@conda-forge/help-some-language, ready for review!". A bot will then label the PR as 'review-requested'.

Due to GitHub limitations, first-time contributors to conda-forge are unable to ping conda-forge teams directly. However, you can ask a bot to ping the team using a special command in a comment on the PR to get the attention of the staged-recipes team. You can also consider asking on our Gitter channel if your recipe isn't reviewed promptly.

All apologies in advance if your recipe PR does not receive prompt attention. This is a high volume repository and the reviewers are volunteers. Review times vary depending on the number of reviewers on a given language team and may be days or weeks. We are always looking for more staged-recipe reviewers. If you are interested in volunteering, please contact a member of @conda-forge/core. We'd love to have your help!

13. Is there a changelog for this repository?

There's no changelog file, but the following git command gives a good overview of the recent changes in the repository:

$ git log --merges -- ':!recipes' 

staged-recipes's Issues

Linter not failing builds properly

This has happened a few times lately. We really need to straighten this out so we get proper failures pre-merge, not failed deployments post-merge. Even if we review carefully, we may miss something the linter wouldn't, and we don't want to be doing a post-merge scramble to fix everything.

Recipes which depend on blas and lapack

I have a recipe that I am looking to submit that has compiled extensions which link to blas and lapack. Does anyone have experience building packages which depend on these, or any suggestions on how to get them to build in a portable manner? I'm only looking to support Linux and OS X with the package, which hopefully makes the process a bit easier.

AppVeyor speed-up!

@jakirkham @ocefpaf
Feodor Fitsner (founder of AppVeyor) reached out to me, and after a short conversation he suggested bumping the CPU speed for Conda-Forge projects. He asked for the account which is used for Conda-Forge, so he can apply the changes. Is it conda-forge? (I will forward the information via e-mail.)

Organising the conda communities and establishing best practices.

We all love conda and there are many communities that build awesome packages that are easy to use. I would like to see more exchange between these communities, so that we share more build scripts, develop one best-practice guide, and finally have channels that can be used together without breaking recipes - a list of trusted channels with similar guidelines.

For example the bioconda community - specialised in bioinformatics software. They have some very nice guides on how to develop packages, and they review and bulk-patch recipes when new features land in conda, to make the overall experience even better.
ping @johanneskoester, @daler and @chapmanb from BioConda fame

Omnia has a lot of cheminformatics software and a nice build box based on phusion/holy-build-box-64 + CUDA and the AMD APP SDK.
ping @kyleabeauchamp, @jchodera

With conda-forge there is now a new one, and it would be great to get all interested people together to join forces here rather than replicating our recipes or copying them from one channel to another just to make them compatible.

Another point is that we probably want to move recipes to default at some point and deliver our work back to Continuum - so that we can benefit from each other.

I can imagine that we all form a group of trusted communities and channels and activate them by default in our unified build box - or we have one giant community channel. I would like to discuss all of this with everyone who is interested and come up with a plan for how to make this happen :)

What do you all think about this?
As a next step I would like to create a doodle to find a meeting date where at least one representative from each community can participate.

Many thanks to Continuum Analytics for their continued support and the awesome development behind scientific Python and this package manager.
ping @jakirkham @msarahan

AppVeyor broken on staged-recipes

It appears AppVeyor has stopped working on staged-recipes. I am seeing error messages like this when trying to restart anything.

Cannot read last commit information from GitHub repository

See the following related issues.

Travis-CI failure on master

@pelson is this related to the latest conda-smithy?

See https://travis-ci.org/conda-forge/staged-recipes/jobs/115055485

Repository created, please edit conda-forge.yml to configure the upload channels
and afterwards call 'conda smithy register-github'
usage: a tool to help create, administer and manage feedstocks.
       [-h]
       {init,register-github,register-ci,regenerate,recipe-lint,rerender} ...
a tool to help create, administer and manage feedstocks.: error: invalid choice: 'github-create' (choose from 'init', 'register-github', 'register-ci', 'regenerate', 'recipe-lint', 'rerender')
Traceback (most recent call last):
  File ".CI/create_feedstocks.py", line 118, in <module>
    subprocess.check_call(['conda', 'smithy', 'github-create', feedstock_dir] + owner_info)
  File "/Users/travis/miniconda/lib/python2.7/subprocess.py", line 540, in check_call
    raise CalledProcessError(retcode, cmd)
subprocess.CalledProcessError: Command '['conda', 'smithy', 'github-create', '/var/folders/gw/_2jq29095y7b__wtby9dg_5h0000gn/T/tmp_pPzpc__feedstocks/autoconf-feedstock', '--organization', 'conda-forge']' returned non-zero exit status 2

Automating boilerplate

There is quite a bit of boilerplate that goes into recipes. It is a pain making sure it is right, for both reviewers and reviewees. We should move towards automating this using all means at our disposal (e.g. jinja, scripts, metapackages, etc.). The hope is that most (if not all) of it could be removed. What I would like us to do here is figure out what and how. One step in this direction would be taking a hard look at this script by @183amir, which converts PyPI packages into conformant recipes.

cc @msarahan @ocefpaf @pelson

Rate limited

Currently we are rate limited here. So, wait a bit before merging anything. Let's try restarting ~20 min after this issue was opened.

Long running quiet Travis builds terminated

For an example of this problem, see PR and this build.

Travis CI doesn't like it if something doesn't produce output for 10 mins. While this can be the sign of a hang, there are perfectly reasonable scenarios where it isn't a hang, but a silent long running command. Unfortunately, there is no simple way to change the time limit (unless this has changed recently 🙏).

One way to work around this is to use travis_wait, which is essentially a hack that writes to disk periodically. We could try to run conda build through this, but IIRC it gobbles up stdout, which is really unhelpful, not to mention it would need to go in the CI and thus affect all recipes in the staging area, which I really don't like. If we have to add it to feedstocks that need it as a workaround, that could be OK until we come up with something better - just as long as we verify that this lets the build complete correctly and is not merely hiding some more serious problem. In other words, have submitters try it and then remove it before merging.

Alternatively, we could come up with some background process that does a similar thing to travis_wait.

Any other thoughts on this problem? Other potential solutions?

Rate limited

Currently we are rate limited here. So, wait a bit before merging anything. Let's try restarting ~30 min-1 hr after this issue was opened.

Uploading to a custom channel

Is there a way to set up a feedstock to upload to a custom channel in addition to uploading to the conda-forge channel?

opencv3

Hi everyone,

Are we going to have opencv3 in conda-forge? If yes, could someone who has experience please do it?
Maybe @jakirkham you can do it?

Rate limited

Currently we are rate limited here. So, wait a bit before merging anything. Let's try restarting ~20 min after this issue was opened.

Python install line standard

Can we establish a standard for how we do the whole python setup.py dance?

Please vote by posting a comment with the corresponding number. Majority after 1 week wins?

  1. python setup.py install --single-version-externally-managed --record=record.txt
  2. python setup.py install --single-version-externally-managed --root=/
  3. python setup.py install --old-and-unmanageable
  4. python -m pip install --no-deps .
  5. pip install --no-deps .

I give up on strikethrough. 4 is out, due to error (https://travis-ci.org/conda-forge/staged-recipes/jobs/127374916#L405)

Getting your run dependencies right

Hey folks,

Something I said in the comments of the merged RIOS PR:

I'm not convinced you should relax your gdal 2.0.2 spec (or perhaps it should become a >= spec). The TravisCI output suggests that rios isn't working with gdal 2.0.0. You need to prevent people from installing this package alongside 2.0.0, then.

I wanted to make sure that this doesn't get lost in an already closed/merged thread, because it doesn't just apply to RIOS, it applies across the entire conda-forge ecosystem.

It is important that a package's run requirements have sufficient version requirements to ensure proper operation. In particular: it is not sufficient to assume that a user will have, or obtain, the latest version of a given dependency. If you know your package is not compatible with older versions of a dependency, you must reflect that in the run dependencies.

For example: suppose I have a functioning environment containing gdal 2.0.0 from defaults; then I type conda install rios. If rios simply specifies gdal as a run-time dependency, with no version specification, then conda will happily install rios without upgrading gdal. But rios will be broken, according to this. This means that an unqualified runtime dependency on gdal is incorrect. On the other hand, if rios includes gdal >2.0 as its runtime dependency, it will cause conda to upgrade past the failure.

On the other hand, you want your dependencies to be as relaxed as possible, otherwise you invite a litany of unsatisfiable specs in complex environments. So, for example, the current rios recipe specifies gdal 2.0.2. In my view, this ought to be relaxed to gdal >2.0 or gdal >=2.0.2, unless you have good reason to believe that subsequent versions of gdal will break rios.

To be fair, the problems here are made worse by channel collision issues: it could be that conda-forge's version of gdal 2.0* works just fine, but the ones in default do not. We're working on that, as you know. But for the foreseeable future, people are going to be mixing and matching packages from defaults and conda-forge, so one should still take care to lock down dependencies to avoid breakage.
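To make this concrete, the difference shows up in the recipe's run requirements roughly as follows (the gdal bounds are illustrative, taken from the discussion above, not a recommendation for any particular package):

requirements:
  run:
    # - gdal           # too loose: conda may keep an incompatible gdal 2.0.0 installed
    # - gdal 2.0.2     # too strict: invites unsatisfiable environments
    - gdal >=2.0.2     # excludes known-broken versions while still allowing newer ones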

Adding more example recipes to better demonstrate common usage patterns

See this comment from @scopatz. Basically, reviewers and contributors could hopefully do less work if we have a few common examples to look at that demonstrate ideal usage patterns.

For instance, we commonly point out that we want to use script for one-line build statements. We should have this in the example. Also, the setuptools install arguments should be included. We might even want to explain that they should be dropped for distutils-only installs.

Having an example of a UNIX-only recipe would be very helpful for showing how to skip Windows.

Also, having a cross-platform (or at least Windows) build would let us demonstrate those features.

These should be simple to come up with, but are a huge benefit as they reward users for simply copying and tweaking the examples.

Collections of recipes needing attention?

I'm already feeling overwhelmed with keeping track of recipes that either need Windows modification or need attention at some later date (e.g. to include some Continuum changes that I know are available).

Is there any way we can consolidate these recipes/repos into a list that is easier to approach and to track? Perhaps a wiki page? Some Google spreadsheet? Create issues on those repos and assign people to them?

Master is failing to create the feedstocks

CI support files regenerated. These need to be pushed to github!
[master 6f53a0b] Re-render the feedstock after CI registration.
 2 files changed, 7 insertions(+), 7 deletions(-)
 rewrite conda-forge.yml (85%)
rm 'recipes/check/build.sh'
rm 'recipes/check/meta.yaml'
CI Summary for conda-forge/libmo_unpack-feedstock (can take ~30s):
 * conda-forge/libmo_unpack-feedstock already enabled on travis-ci
 * conda-forge/libmo_unpack-feedstock already enabled on CircleCI
 * conda-forge/libmo_unpack-feedstock already enabled on appveyor
Traceback (most recent call last):
  File "/Users/travis/miniconda/bin/conda-smithy", line 9, in <module>
    load_entry_point('conda-smithy==0.7.1', 'console_scripts', 'conda-smithy')()
  File "/Users/travis/miniconda/lib/python2.7/site-packages/conda_smithy/cli.py", line 227, in main
    args.subcommand_func(args)
  File "/Users/travis/miniconda/lib/python2.7/site-packages/conda_smithy/cli.py", line 162, in __call__
    ci_register.appveyor_configure(owner, repo)
  File "/Users/travis/miniconda/lib/python2.7/site-packages/conda_smithy/ci_register.py", line 114, in appveyor_configure
    raise ValueError(response)
ValueError: <Response [404]>
Traceback (most recent call last):
  File ".CI/create_feedstocks.py", line 123, in <module>
    subprocess.check_call(['conda', 'smithy', 'register-ci', '--feedstock_directory', feedstock_dir] + owner_info)
  File "/Users/travis/miniconda/lib/python2.7/subprocess.py", line 540, in check_call
    raise CalledProcessError(retcode, cmd)
subprocess.CalledProcessError: Command '['conda', 'smithy', 'register-ci', '--feedstock_directory', '/var/folders/gw/_2jq29095y7b__wtby9dg_5h0000gn/T/tmpmtorVK__feedstocks/libmo_unpack-feedstock', '--organization', 'conda-forge']' returned non-zero exit status 1

"Push" cancel buttons on all builds of a PR when new `git push`

Just a thought on refining the jobs in the CI build queues when dealing with PRs. Frequently people are tweaking their PRs based on the quick feedback from the community (thanks for being so great everyone 😄). Unfortunately, this results in CI backlog of builds particularly for the slower ones (namely AppVeyor, sometimes Travis, very rarely Circle). It would be good if we just "pushed" the cancel button. I do this manually to some extent, but it is hard to keep up with and not always a good use of time. It is not obvious to me where this would run. Though putting it on the fastest of the CIs would make sense (so Circle). Thoughts? Other ideas?

Maintainer preferences

EDIT: This has moved to these docs in this PR ( conda-forge/conda-forge.github.io#95 ). I have cut it from here to avoid unnecessary duplication and to avoid this potentially becoming stale/contradictory. However, it is still important to list people's general preferences regarding what they maintain, to avoid overpinging. This issue has been restructured accordingly.

[FYI] recently uploaded PyPI packages don't have short package links

The packages used to be under ../../packages/source/<first letter>/<pkg name>/; now they are only available as ../../packages/<two-char hash>/<two-char hash>/<long hash>/.
The corresponding links also changed on the "simple" index page. This changes the disk layout of a bandersnatch mirror. Our build system broke because our bandersnatch mirror does not have the same disk layout.

Related issues:

Example:

There is a package colorlog with two versions: 2.6.1 (old) and 2.6.3 (new). The following link works fine:

https://pypi.python.org/packages/source/c/colorlog/colorlog-2.6.1.tar.gz

However, if you just change the version number, you will get 404:

https://pypi.python.org/packages/source/c/colorlog/colorlog-2.6.3.tar.gz

The working link to 2.6.3 version is:

https://pypi.python.org/packages/cd/e7/7e14ce72038e83bc475a1f708485e5ea4a789beef26ff64b732918f860e6/colorlog-2.6.3.tar.gz

Handling duplicate package additions

If someone tries to add a recipe for a feedstock that is already present, it would be nice if the differences could be placed in a PR against the existing feedstock. Ideally, the PR would ping the user who made the change and, if possible, make them the author of the PR so they could change it as needed.

Somebody just broke `gcc` on Linux

isl:                  0.16.1-0         
Removing old build environment
Removing old work directory
BUILD START: bob.learn.em-2.0.8-py27_0
Using Anaconda Cloud api site https://api.anaconda.org
Fetching package metadata: ..........
Solving package specifications: .........

The following NEW packages will be INSTALLED:

    bob.blitz:            2.0.8-np111py27_4
    bob.core:             2.1.2-py27_1     
    bob.extension:        2.0.11-py27_4    
    bob.io.base:          2.0.8-py27_0     
    bob.learn.activation: 2.0.4-py27_2     
    bob.learn.linear:     2.0.7-py27_2     
    bob.math:             2.0.3-py27_2     
    bob.sp:               2.0.4-py27_2     
    boost:                1.60.0-py27_0    
    cloog:                0.18.0-0         
    cmake:                3.5.0-3          
    gcc:                  4.8.5-3          
    gmp:                  6.1.0-2          
    hdf5:                 1.8.16-2         
    icu:                  56.1-2           
    isl:                  0.16.1-0         
    libblitz:             0.10-1           
    libgcc:               4.8.5-1          
    libgfortran:          3.0-0            
    mkl:                  11.3.1-0         
    mpc:                  1.0.3-1          
    mpfr:                 3.1.4-1          
    numpy:                1.11.0-py27_0    
    openblas:             0.2.14-4         
    openssl:              1.0.2g-0         
    pip:                  8.1.1-py27_1     
    pkg-config:           0.28-1           
    python:               2.7.11-0         
    readline:             6.2-2            
    setuptools:           20.7.0-py27_0    
    sqlite:               3.9.2-0          
    tk:                   8.5.18-0         
    wheel:                0.29.0-py27_0    
    zlib:                 1.2.8-0          

Linking packages ...
/home/amir/miniconda/envs/_build/gcc/libexec/gcc/x86_64-unknown-linux-gnu/4.8.5/cc1: error while loading shared libraries: libisl.so.10: cannot open shared object file: No such file or directory           |  76%
Installation failed: gcc is not able to compile a simple 'Hello, World' program.
Error: post-link failed for: gcc-4.8.5-3

Package inkscape

It seems nbconvert uses inkscape to transform svg images to whatever is used to include such images in latex/PDF files.

There is a recipe in conda-recipes: https://github.com/conda/conda-recipes/tree/master/inkscape but this uses a repackaged windows download.

Not sure how hard it would be to use a build from source approach... It uses MSYS/MINGW: http://wiki.inkscape.org/wiki/index.php/Compiling_Inkscape_on_Windows and especially the devlibs repo http://wiki.inkscape.org/wiki/index.php/Inkscape_Devlibs makes me too scared to try... :-(

Recipes that use SSE/AVX instructions

Some recipes utilise intrinsics such as SSE and AVX. Enabling these intrinsics greatly improves the speed of many applications. However, most applications do not guard against a particular intrinsic not existing at run time and will simply crash if the architecture it is run on does not support a given instruction.

For example, OpenCV supports up to SSE 4.1 instructions - yet it is feasible that someone may download OpenCV on an older Linux box that does not support SSE 4.1. Therefore, I imagine we want to do something like the proposal in #80 whereby we have features for different SSE levels - where supported?

Homebrew formulas

Hi there,

I'm running into one dependency after another.

Most of the time I Google for "homebrew formula packagexyz" and copy parts from there.

Why not write a wrapper for those formulas?

AppVeyor badge

AppVeyor badge appears to be broken. @pelson is staged-recipes still in your account? I cannot see the master logs either.

Travis is broken on master

Calculating the recipes which need to be turned into feedstocks.
Making feedstock for nilearn
Traceback (most recent call last):
  File "/Users/travis/miniconda/bin/conda-smithy", line 9, in <module>
    load_entry_point('conda-smithy==0.10.0', 'console_scripts', 'conda-smithy')()
  File "/Users/travis/miniconda/lib/python3.5/site-packages/conda_smithy/cli.py", line 228, in main
    args.subcommand_func(args)
  File "/Users/travis/miniconda/lib/python3.5/site-packages/conda_smithy/cli.py", line 101, in __call__
    generate_feedstock_content(feedstock_directory, args.recipe_directory, meta)
  File "/Users/travis/miniconda/lib/python3.5/site-packages/conda_smithy/cli.py", line 32, in generate_feedstock_content
    configure_feedstock.main(target_directory)
  File "/Users/travis/miniconda/lib/python3.5/site-packages/conda_smithy/configure_feedstock.py", line 323, in main
    for key, value in file_config.items():
AttributeError: 'list' object has no attribute 'items'
Traceback (most recent call last):
  File ".CI/create_feedstocks.py", line 94, in <module>
    '--feedstock-directory', feedstock_dir])
  File "/Users/travis/miniconda/lib/python3.5/subprocess.py", line 584, in check_call
    raise CalledProcessError(retcode, cmd)
subprocess.CalledProcessError: Command '['conda', 'smithy', 'init', '/Users/travis/build/conda-forge/staged-recipes/recipes/nilearn', '--feedstock-directory', '/var/folders/gw/_2jq29095y7b__wtby9dg_5h0000gn/T/tmp1xgk4lu5__feedstocks/nilearn-feedstock']' returned non-zero exit status 1

CI status

Travis-CI failed during the merge of #100 and I had to re-start it. I wonder if we should send the status e-mail to the mailing list, or to a few project managers.

Latest conda-smithy is preventing poliastro feedstock creation

Repository registered at github, now call 'conda smithy register-ci'
Making feedstock for poliastro
/Users/travis/build/conda-forge/staged-recipes/recipes/poliastro has some lint:
  Selectors are suggested to take a "  # [<selector>]" form.
Traceback (most recent call last):
  File ".CI/create_feedstocks.py", line 93, in <module>
    subprocess.check_call(['conda', 'smithy', 'recipe-lint', recipe_dir])
  File "/Users/travis/miniconda/lib/python3.5/subprocess.py", line 584, in check_call
    raise CalledProcessError(retcode, cmd)
subprocess.CalledProcessError: Command '['conda', 'smithy', 'recipe-lint', '/Users/travis/build/conda-forge/staged-recipes/recipes/poliastro']' returned non-zero exit status 1

I am working on that.

Set `CONDA_NPY` on Linux

We still need to set the CONDA_NPY dummy environment variable on Linux. I guess that is because we are still on an older conda-build (1.18.2) there.

Traceback (most recent call last):
  File "/opt/conda/bin/conda-build-all", line 9, in <module>
    load_entry_point('conda-build-all==0.11.0', 'console_scripts', 'conda-build-all')()
  File "/opt/conda/lib/python3.5/site-packages/conda_build_all/cli.py", line 85, in main
    b.main()
  File "/opt/conda/lib/python3.5/site-packages/conda_build_all/builder.py", line 207, in main
    recipe_metas = self.fetch_all_metas()
  File "/opt/conda/lib/python3.5/site-packages/conda_build_all/builder.py", line 150, in fetch_all_metas
    recipe_metas = sort_dependency_order(recipe_metas)
  File "/opt/conda/lib/python3.5/site-packages/conda_build_all/builder.py", line 97, in sort_dependency_order
    meta.parse_again()
  File "/opt/conda/lib/python3.5/site-packages/conda_build/metadata.py", line 356, in parse_again
    self.meta = parse(self._get_contents(permit_undefined_jinja))
  File "/opt/conda/lib/python3.5/site-packages/conda_build/metadata.py", line 634, in _get_contents
    env.globals.update(context_processor(self, path))
  File "/opt/conda/lib/python3.5/site-packages/conda_build/jinja_context.py", line 65, in context_processor
    ctx = get_environ(m=initial_metadata)
  File "/opt/conda/lib/python3.5/site-packages/conda_build/environ.py", line 248, in get_dict
    d['PKG_BUILD_STRING'] = str(m.build_id())
  File "/opt/conda/lib/python3.5/site-packages/conda_build/metadata.py", line 461, in build_id
    for ms in self.ms_depends():
  File "/opt/conda/lib/python3.5/site-packages/conda_build/metadata.py", line 434, in ms_depends
    ms = handle_config_version(ms, ver)
  File "/opt/conda/lib/python3.5/site-packages/conda_build/metadata.py", line 307, in handle_config_version
    raise RuntimeError("'%s' requires external setting" % ms.spec)
RuntimeError: 'numpy x.x' requires external setting

See https://circleci.com/gh/conda-forge/staged-recipes/2056 and #381 (comment)

Ping @pelson

Rate Limited

Currently we are rate limited here. So, wait a bit before merging anything. Let's try restarting ~20 min after this issue was opened.

package a latex distribution

The use case would be that one could install pandoc (already packaged) and latex to get PDF output from nbconvert. For that to work, the TeX distribution would need to install its binaries into <env>\Scripts on Windows.

Unfortunately, this seems to be a lot harder than I thought (I only tried Windows):

repackage miktex portable

miktex's directory structure means that the bin dir has to be separate from the script dir in the conda env (bin is in .\miktex\bin and it expects the rest of the files in ., so we end up with a directory structure below the conda env main dir :-/). This means that we have to add .\miktex\bin to the path, which is feasible using activate.d scripts, but it also means that one would have to activate such an environment before starting the notebook server (or any other command-line program), which is IMO not gonna work (e.g. who activates the main environment? And my notebook server is installed in an env, but is started by just specifying the complete path to jupyter.exe in that env).

Miktex has the disadvantage that it is only available on Windows, but the advantage that the install size is small because it installs (can install) missing packages automatically.

texlive

The portable binary installer installs itself into ./bin/win32, so that's also not gonna work :-( It seems it is possible to install texlive from source in the required form (setting bindir in the configure step), but just reading the documentation made me too scared to try (and it uses svn, which I don't have installed...). And actually supporting a latex distribution is a bigger job than I want to take on...

texlive is available on Mac/Linux/Windows, but installing missing packages is a manual step (or the package gets really big...).

So, what are the options?

  • Use a repackaged miktex portable, use an activate.d script, and warn users that they have to use activate (I've actually no clue how that works out in the root environment) if they want to have access to the latex binaries in e.g. nbconvert. Maybe get nbconvert, mpl and friends to add the dir if it exists (either upstream or in a patch).
  • Try to compile texlive on all platforms, putting the bindir in the dir which is already included in the path.
  • I've no clue if it would be possible to edit the binaries of texlive to remove the \win32 part?

Any (other) ideas?
