Comments (4)
Hmm. I don't think we offer that at the moment!

FWIW, if this is just for debugging purposes then you could add `jax.debug.print` statements to the input or output of your function.

If you really want to interact with the history programmatically, then (a) I'm quite curious what the use-case is, but also (b) we could probably add an additional `optx.minimise(..., saveat=...)` argument without too much difficulty.
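For concreteness, a minimal sketch of the `jax.debug.print` approach (the objective function here is a made-up example, not from this thread). Because `jax.debug.print` works inside jit-compiled code, it fires on every evaluation the solver makes:

```python
import jax
import jax.numpy as jnp

def loss(y):
    value = jnp.sum((y - 3.0) ** 2)
    # jax.debug.print works under jit, so this prints on every
    # evaluation of the function, including the solver's internal steps.
    jax.debug.print("loss = {v}", v=value)
    return value

# A function like this could then be passed to e.g. optx.minimise.
val = jax.jit(loss)(jnp.array([0.0, 1.0]))
```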
from optimistix.
Hi,
Thanks for the quick response. I am currently trying to use optimistix to implement Model Agnostic Meta-Learning and its implicit version (https://arxiv.org/abs/1909.04630). I was considering using a multi_step_solve
approach from above for the outer meta-learning loop because it makes the training very quick. However, I need to be able to monitor the meta-losses, which is why I was looking into the less efficient single_step_solve
. The saveat
option would be very useful!
On a side note, I had a look at the interactive stepping example, and I thought that it could be useful for solvers to have an update
method for performing a single optimisation step, similar to JAXOpt. What do you think?
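To illustrate what a per-step loss history buys here, a pure-JAX sketch (hand-rolled gradient descent, not optimistix's actual API) that records the loss at every outer step with `lax.scan`, which is roughly the information a `saveat`-style option would expose:

```python
import jax
import jax.numpy as jnp

def loss(y):
    return jnp.sum((y - 3.0) ** 2)

grad_loss = jax.grad(loss)

def update(y, _):
    # One hand-rolled gradient-descent step.  The second return value is
    # what lax.scan stacks into a per-step history.
    y_new = y - 0.1 * grad_loss(y)
    return y_new, loss(y_new)

y0 = jnp.zeros(2)
# loss_history has shape (100,): the loss at every step of the solve.
y_final, loss_history = jax.lax.scan(update, y0, xs=None, length=100)
```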
Makes sense. I'll mark this as a feature request for a `saveat` option. (And I'm sure we'd be happy to take a PR on this.)

For performing a single optimisation step, I think we already have this, as the `step` method on the solver?
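As a rough illustration of that init/step pattern, here is a hand-written gradient-descent toy. The names and signatures below are illustrative only; optimistix's real solver methods take more arguments:

```python
import jax
import jax.numpy as jnp

# Toy init/step interface in the spirit of interactive stepping:
# the caller owns the loop and can inspect the iterate between steps.
def gd_init(y0, lr=0.1):
    # Solver state; here just the fixed learning rate.
    return {"lr": lr}

def gd_step(fn, y, state):
    # One optimisation step: return the new iterate and the solver state.
    return y - state["lr"] * jax.grad(fn)(y), state

def fn(y):
    return jnp.sum(y ** 2)

y = jnp.array([1.0, -2.0])
state = gd_init(y)
for _ in range(50):  # caller controls the loop, monitoring y as it goes
    y, state = gd_step(fn, y, state)
```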
Thanks Patrick, I will try to make a PR on this :)
Related Issues (20)
- Error in "optimistix/docs/examples/optimise_diffeq.ipynb" HOT 1
- Issue with vmap `optx.least_squares`. HOT 2
- grad of vmap of function which wraps an optax solver occasionally fails HOT 2
- `BestSoFar...` wanted behavior ? HOT 1
- Classical newton methods HOT 6
- Non-finite values in the root function are not handled well HOT 2
- Will constrained optimization be supported? HOT 4
- Behavior of BFGS HOT 2
- pytree output structure mismatch error in backprop during vmap HOT 9
- Incompatibility of least_squares and custom_vjp HOT 2
- Zero implicit gradients when using `ImplicitAdjoint` with CG solver HOT 4
- Would an exhaustive grid search have a place in `optimistix`? HOT 2
- Using `optimistix` with an `equinox` model HOT 2
- Incompatibility with jax 0.4.27 HOT 1
- Possibly of interest HOT 1
- Unexpected behaviour with JAX version HOT 3
- Slow compile of least_squares with large dict parameters HOT 2
- Can't vmap across input using Gauss Newton fwd HOT 11
- Question: errorhandling, BFGS minimization, vmap, and best practices HOT 2