Comments (13)
Can we submit the patches in tranches, say 2-5k at a time?
I guess so. My first instinct was that this would lead to conflicts left and right, but on further thought: as long as no one overrides our global pinning of 1.2 in their feedstock, all packages should continue to build against 1.2 for now, but pull in 1.3 in the environment if all other packages are already compatible with it (this obviously needs a looser run-export on the zlib feedstock, but that's trivial).
That way we could do a gradual roll-out. In fact, we may be able to slice by month. Say we first patch all the May 2024 builds, see what happens, then April, etc. At some point we could then either bump the global pinning to 1.3 and just see where things break (and fix up the stragglers), or actually do a full migration (which at that point would be far less log-jammy, as all recent feedstocks would already be compatible).
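The month-by-month slicing described above could be sketched roughly like this (a hypothetical helper, not part of any feedstock tooling; the millisecond `timestamp` field is the one found in repodata records):

```python
from collections import defaultdict
from datetime import datetime, timezone

# Hypothetical sketch: bucket artifacts by build month so that repodata
# patches can be submitted newest-first in tranches (May 2024, then April, ...).

def tranches_by_month(artifacts):
    """artifacts: mapping of artifact filename -> repodata record with a
    millisecond 'timestamp' field. Returns {(year, month): [names]},
    newest month first."""
    buckets = defaultdict(list)
    for name, record in artifacts.items():
        ts = datetime.fromtimestamp(record["timestamp"] / 1000, tz=timezone.utc)
        buckets[(ts.year, ts.month)].append(name)
    # newest months first, so the roll-out starts with recent builds
    return dict(sorted(buckets.items(), reverse=True))

# Illustrative records with made-up names and timestamps
artifacts = {
    "foo-1.0-h123_0.conda": {"timestamp": 1715000000000},  # May 2024
    "bar-2.1-h456_0.conda": {"timestamp": 1712500000000},  # April 2024
}
```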
from conda-forge-pinning-feedstock.
@conda-forge/core, someone care to approve the zlib PRs? Repopatch approach already has Matt's LGTM, but if you have an opinion on this, please comment!
I approved. Thanks for tackling this!
We've now patched ~100k artefacts back to October 2021 to have a looser (major-only) pin. This effort will be concluded with loosening the global pinning as well: #5996
Ouch. How many packages would be rebuilt?
About 500 feedstocks according to a quick search. This would also logjam with pretty much any other migration. It'd be a lot of work, and a lot of compute resources wasted on doing something that's not necessary from the POV of ABI compatibility.
So I think we should go with the repodata-patching approach. But we could constrain it to something like "builds in the last 6 months" (which should get us down to ~20k, but will create conflicts for feedstocks that never got rebuilt in that time), or - which is waaaaaay more work - only patch the last 1-2 builds of each feedstock (plus their migration_branches, where still active).
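The kind of patch being discussed can be sketched as follows. This is a simplified stand-alone function, not the actual code in conda-forge-repodata-patches-feedstock; the record shape mirrors what repodata.json stores per artifact:

```python
import re

# Hypothetical sketch of a repodata patch: loosen a minor-level libzlib pin
# (e.g. ">=1.2.13,<1.3.0a0") to a major-level one (">=1.2.13,<2.0a0").
ZLIB_MINOR_PIN = re.compile(r"^libzlib (>=[\d.]+),<1\.3\.0a0$")

def loosen_zlib_pin(record):
    """Rewrite the libzlib dependency of a single repodata record in place."""
    new_deps = []
    for dep in record.get("depends", []):
        m = ZLIB_MINOR_PIN.match(dep)
        if m:
            dep = f"libzlib {m.group(1)},<2.0a0"
        new_deps.append(dep)
    record["depends"] = new_deps
    return record

# Example record (dependencies are illustrative)
record = {"depends": ["python >=3.11,<3.12.0a0", "libzlib >=1.2.13,<1.3.0a0"]}
loosen_zlib_pin(record)
# the libzlib dependency is now "libzlib >=1.2.13,<2.0a0"
```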
With a package as pervasive as zlib, it's kinda "damned if you do, damned if you don't" from the POV of repo-patching. If we use just a major-level pin and 1.4 breaks ABI, we'd have to patch that cap into all builds. If we're on a minor-level pin, we'd have to loosen the run-export every time a minor version that's actually ABI-compatible comes along.
I guess given the ABI history (and how 1.3 is still compatible), we could bet on major-level compatibility and just hope for the best. But even in that case, we'd have to patch out the existing caps at least once.
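For reference, the two pinning levels correspond to different `max_pin` settings on the recipe's run-export. This is an illustrative fragment, not the actual contents of the zlib feedstock:

```yaml
# Illustrative meta.yaml fragment (not the real zlib-feedstock recipe):
# max_pin controls how much of the version downstream packages get pinned
# to via run_exports.
build:
  run_exports:
    # minor-level pin: yields e.g. "libzlib >=1.2.13,<1.3.0a0"
    # - {{ pin_subpackage('libzlib', max_pin='x.x') }}
    # major-level pin: yields e.g. "libzlib >=1.2.13,<2.0a0"
    - {{ pin_subpackage('libzlib', max_pin='x') }}
```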
Can we submit the patches in tranches, say 2-5k at a time?
Then we'd avoid big disruptions in the CDN process.
this obviously needs a looser run-export on the zlib feedstock but that's trivial
[...]
Say we first patch all the May 2024 builds...
Opened:
- conda-forge/zlib-feedstock#79
- conda-forge/zlib-feedstock#80
- conda-forge/conda-forge-repodata-patches-feedstock#739 (around ~4500 packages)
I can see the linked PRs were merged a few days ago. What needs to be done to e.g. be able to install python+zlib-1.3 from conda-forge? Does a new build for python need to be done?
We are submitting the repodata patches in tranches to avoid any CDN breakages. Once we patch python and any other deps in the env, it will work. So the only thing to do is to wait.
What needs to be done to e.g. be able to install python+zlib-1.3 from conda-forge?
This is already possible; you can try
mamba create -n test_env python=3.<x> zlib=1.3
for any python from 3.8-3.12, and it will resolve successfully. However, the more packages you add to the environment, the more you pull in transitively, and the higher the chance that any one of those hasn't yet been rebuilt in the last couple of months (i.e. as far back as we've patched the CDN up until now).
So aside from waiting for the repodata patches to roll out, you can also reduce your environment (if you really need zlib 1.3 ASAP).
It seems this effort is pushing the resolver into some contortions. Despite the recipe still pinning zlib 1.2, I get
The following packages are incompatible
├─ libarrow 15.0.2.* h0870315_12_cpu is installable and it requires
│ └─ libgoogle-cloud >=2.24.0,<2.25.0a0 , which requires
│ └─ libcurl >=8.7.1,<9.0a0 , which requires
│ └─ libssh2 >=1.11.0,<2.0a0 , which requires
│ └─ libzlib >=1.2.13,<1.3.0a0 , which can be installed;
└─ libzlib 1.3.1.* h87427d6_1 is not installable because it conflicts with any installable versions previously reported.
(for an environment that doesn't directly depend on zlib). libssh2 was last rebuilt in June 2023, so it's not yet touched by the current set of repodata patches, but still, I would have expected the solver to find a solution with 1.2.
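To see why the un-patched libssh2 build forces the conflict, here is a deliberately simplified sketch of the version matching involved. Real conda matching handles epochs, pre-releases like `a0`, and build strings; this toy version only compares numeric release segments (so `<1.3.0a0` is treated as `<1.3.0`):

```python
# Simplified sketch of why "libzlib >=1.2.13,<1.3.0a0" excludes 1.3.1.
# Not conda's actual matcher; numeric release segments only.

def release_tuple(v):
    """'1.3.0a0' -> (1, 3, 0): keep the leading digits of each segment."""
    parts = []
    for seg in v.split("."):
        num = ""
        for ch in seg:
            if not ch.isdigit():
                break
            num += ch
        if not num:
            break
        parts.append(int(num))
    return tuple(parts)

def satisfies(version, spec):
    """spec like '>=1.2.13,<1.3.0a0'; pre-release ordering is elided."""
    v = release_tuple(version)
    for clause in spec.split(","):
        if clause.startswith(">="):
            if not v >= release_tuple(clause[2:]):
                return False
        elif clause.startswith("<"):
            if not v < release_tuple(clause[1:]):
                return False
    return True

satisfies("1.2.13", ">=1.2.13,<1.3.0a0")  # the old minor-level pin admits 1.2.x
satisfies("1.3.1", ">=1.2.13,<1.3.0a0")   # but rejects 1.3.1, hence the conflict
satisfies("1.3.1", ">=1.2.13,<2.0a0")     # the loosened major-level pin admits it
```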