Comments (26)
Hi Omar,
We have now released the SONG source code (after much ado...). You can find it at this link:
https://github.com/damithsenanayake/SONG
from umap.
That certainly sounds like an interesting approach. I would be interested to know more about training the convolutional layers to process the image and give the embedding. In the meantime I am finishing up the required refactoring to get the simplified transform method in place. I think I'll have to leave the NN work to others.
Hi All,
I've recently put my method SONG up on arXiv (https://arxiv.org/abs/1912.04896), which may be an alternative approach for 'Model Persistence with UMAP' in Leland's long-term road-map. Please give it a read and let me know what you guys think.
I think the SONG option listed above is not unreasonable. In practice, real-time transforms are not something that will be available any time soon in UMAP.
Hi Omar,
We will be making the SONG code available soon (in a couple of weeks). Please stay tuned. Cheers.
I think the answer to that is that all the requisite code to manage that has not been developed yet -- specifically the prediction aspect. There are a few ways to do that, but the most viable is something like parametric t-SNE, where one trains a function approximator (in this case a neural network) to learn a functional transform that matches the embedding. I should note that in UMAP's case this would look somewhat akin to a "deep" word2vec style of training. Other prediction modes are possible, whereby one retrains the model holding the training points' embedding locations fixed and optimizing only the locations of the new points.
In other words, for now it is more an "in principle" statement -- none of this is hard in the sense that I believe there are no theoretical obstructions to making it work, so from an algorithmic research point of view it is "solved" on some level. In practice, yes, there is code that needs to be written to make this actually practical, and some of that is potentially somewhat non-trivial.
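The first route (train a function approximator to reproduce a precomputed embedding) can be sketched minimally. In this hedged sketch, sklearn's MLPRegressor stands in for a deep network and a synthetic 2-D array stands in for a real UMAP embedding:

```python
import numpy as np
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(0)
X_train = rng.normal(size=(200, 10))   # high-dimensional training data (synthetic)
# Stand-in for a precomputed embedding (would come from umap.UMAP().fit_transform)
emb_train = X_train[:, :2] + 0.01 * rng.normal(size=(200, 2))

# Train a small network to approximate the map X -> embedding
net = MLPRegressor(hidden_layer_sizes=(64, 64), max_iter=2000, random_state=0)
net.fit(X_train, emb_train)

# New points are now "transformed" with a single forward pass
X_new = rng.normal(size=(5, 10))
emb_new = net.predict(X_new)
print(emb_new.shape)  # (5, 2)
```

Matching a frozen embedding this way never touches the UMAP loss, which is why it is cheap but can drift from what a full refit would produce.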
Nice! It sounds like both approaches are viable. I'm going to try to clean up the notebooks and then get my current approach working as a transform operation within UMAP itself. I would certainly be interested in neural network approaches as well, though (I just don't have much expertise in that area).
OK, thanks for the explanation! That's exactly what I thought: the method allows it, but it's not really implemented yet. Just wanted to make sure. I would be happy to contribute on this at some point.
Contributions are more than welcome -- especially on a parametric version as I have limited neural network experience.
I have experimental code in some notebooks I wrote out of curiosity that can do a transform operation on new data under basic UMAP theory assumptions (place new data assuming the training embedding is fixed -- which is no different than, say, PCA). On my one test so far on MNIST digits it did great -- but then everything does great on MNIST digits. I think it should generalise, though -- I'll have to put all the pieces together properly and try it on a few other datasets. One downside is that it is "slow" -- based on timings of doing it piecemeal, I think we're talking say ~20s for 10,000 test points, compared to a 1m40s fit time for 60,000 training points. Does this seem reasonable to you?
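The placement idea above (new points positioned against a fixed training embedding) can be sketched as a weighted nearest-neighbour initialisation. This is a simplification with random arrays standing in for real data and a fitted embedding; the actual method would follow it with a few optimisation epochs that move only the new points:

```python
import numpy as np
from sklearn.neighbors import NearestNeighbors

rng = np.random.default_rng(0)
X_train = rng.normal(size=(100, 5))     # training data (stand-in)
emb_train = rng.normal(size=(100, 2))   # fixed training embedding (stand-in)
X_new = rng.normal(size=(10, 5))

# Each new point starts at the distance-weighted mean of its neighbours'
# (frozen) embedding coordinates
nn = NearestNeighbors(n_neighbors=5).fit(X_train)
dist, idx = nn.kneighbors(X_new)
w = 1.0 / (dist + 1e-8)
w /= w.sum(axis=1, keepdims=True)
emb_new = (w[..., None] * emb_train[idx]).sum(axis=1)
print(emb_new.shape)  # (10, 2)
```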
Nice, I'd say 20s is really reasonable! Interestingly, I did an experiment as well. In fact I tried something really naive, just out of curiosity. I first projected the data into the manifold using the UMAP code. Then I wrote a simple fully connected neural network and trained it on the result of the UMAP run -- essentially learning the function that does the projection. Then I used that model to do the dimensionality reduction in my predictive model (instead of the actual UMAP). Of course that model is totally specific to my problem/dataset, but the accuracy I get is similar to the one I got with the actual UMAP.
FWIW, I ported Laurens' parametric t-SNE implementation to Keras a few years ago (https://github.com/kylemcdonald/Parametric-t-SNE) and tried both approaches: training a net to produce the same embedding as a previously "solved" t-SNE, and training a net to optimise for t-SNE's loss function directly. Both gave interesting and useful results.
It gets really exciting when you can start using domain-specific operations like convolution or recurrence. For example, imagine UMAP running on images in a way that simultaneously optimises a convolutional network for processing the images and the embedding itself.
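As a toy illustration of the second strategy (optimising through an encoder's parameters rather than over free embedding coordinates), here is a deliberately simplified numpy sketch: a linear encoder trained by gradient descent to pull pretend neighbour pairs together. The quadratic attraction loss is only a stand-in for the real t-SNE/UMAP objective:

```python
import numpy as np

rng = np.random.default_rng(0)
X = rng.normal(size=(50, 8))
W = rng.normal(scale=0.1, size=(8, 2))         # linear "encoder" parameters
pairs = [(i, i + 1) for i in range(0, 48, 2)]  # pretend neighbour pairs

def mean_pair_dist(Y):
    return np.mean([np.linalg.norm(Y[i] - Y[j]) for i, j in pairs])

d_before = mean_pair_dist(X @ W)
lr = 0.05
for _ in range(100):
    Y = X @ W
    grad = np.zeros_like(W)
    for i, j in pairs:  # gradient of 0.5 * ||Y_i - Y_j||^2 w.r.t. W
        grad += np.outer(X[i] - X[j], Y[i] - Y[j])
    W -= lr * grad / len(pairs)

d_after = mean_pair_dist(X @ W)
print(d_after < d_before)  # True
```

Swap the linear map for a convolutional encoder and the toy loss for the actual cross-entropy objective and you get the parametric setup described above.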
@lmcinnes Just as a question: do you have a tutorial covering the math behind UMAP?
@lmcinnes Thank you so much! Not gonna lie, I am studying your implementation in umap, and it is very impressive! Amazed to see a mathematician who is also such a capable programmer.
An easy implementation of a transform method would be what Laurens suggests for his parametric t-SNE:
fit a multivariate regressor (a neural network?) on the training dataset and its low-dimensional representation, and use it to project new data onto the same manifold.
It is a naive approach, but easy to implement.
The other way is the analogous parametric t-SNE approach, which fits the model parameters using the t-SNE loss function directly.
@kylemcdonald's implementation is a very good starting point. If you want, we can work together on this.
So this is the implementation of the naive multivariate regressor to project new data onto the fitted UMAP embedding space:
https://github.com/paoloinglese/Parametric-t-SNE/blob/master/Parametric%20UMAP%20(Keras).ipynb
It's based on @kylemcdonald's implementation using Keras.
In the next few days I'll have a look at a more refined model, analogous to parametric t-SNE.
@paoloinglese: This looks really interesting -- it would add some non-trivial dependencies, but is certainly worth looking into further. At the very least it would be very nice to have a documentation page similar to your notebook demonstrating how to do this sort of thing. I look forward to hearing more.
@lmcinnes Ok great! I'll prepare something in the next few days.
@lmcinnes I've updated the notebook to set the TensorFlow backend, and added a simple k-NN classification that uses the UMAP embedding of the training set and predicts from the network-predicted UMAP embedding of the test set:
https://github.com/paoloinglese/Parametric-t-SNE/blob/master/Parametric%20UMAP%20(Keras).ipynb
Unfortunately, I don't have much time to put more text in the notebook. I guess the overall idea is pretty clear, as suggested previously in other messages here on GitHub.
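The evaluation loop described in the notebook can be sketched like this, with random arrays standing in for the two embeddings (the test embedding would really come from the trained network):

```python
import numpy as np
from sklearn.neighbors import KNeighborsClassifier

rng = np.random.default_rng(0)
emb_train = rng.normal(size=(100, 2))     # UMAP embedding of the training set
y_train = rng.integers(0, 3, size=100)    # training labels
emb_test_pred = rng.normal(size=(20, 2))  # network-predicted test embedding

# Fit k-NN in embedding space, then classify the predicted test embedding
knn = KNeighborsClassifier(n_neighbors=5).fit(emb_train, y_train)
y_pred = knn.predict(emb_test_pred)
print(y_pred.shape)  # (20,)
```

If the test accuracy here matches a k-NN fit on a full refit's embedding, the regressor is a faithful stand-in for the transform.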
thanks for this
So I tried a simple auto-encoder on CIFAR-10 a few months ago; here it is:
https://github.com/HyperPotatoNeo/Deep-learning-experiments-with-umap/blob/master/AE%20vs%20UMAP_%20Cifar10.ipynb
Any updates on this? UMAP is giving me great results, but I want to run it in real time for new, unseen points, and the transform method is slow. Does anybody know how to get around this?
I think the SONG idea is pretty good. I would like to give it a try, but I have not found any code. I would like to reproduce its results and test it out with my datasets.
> So this is the implementation of the naive multivariate regressor to project new data on the fitted UMAP embedding space
> https://github.com/paoloinglese/Parametric-t-SNE/blob/master/Parametric%20UMAP%20(Keras).ipynb
> It's based on @kylemcdonald implementation using Keras.
> In the next days I'll have a look at a more refined model, analogous to parametric t-SNE.

Today I checked this link, but it is no longer valid.
@none0none Yes, I removed it after @lmcinnes et al. published a refined model for parametric UMAP: https://arxiv.org/abs/2009.12981