
Comments (26)

damithsenanayake avatar damithsenanayake commented on April 28, 2024 4

Hi Omar,
we have now released the SONG source code (after much ado...). You can find it at the link below.
https://github.com/damithsenanayake/SONG

from umap.

lmcinnes avatar lmcinnes commented on April 28, 2024 3

That certainly sounds like an interesting approach. I would be interested to know more about training the convolutional layers to process the image and give the embedding. In the meantime I am finishing up the required refactoring to get the simplified transform method in place. I think I'll have to leave the NN work to others.


damithsenanayake avatar damithsenanayake commented on April 28, 2024 2

Hi All,
I've recently put my method SONG up on arXiv (https://arxiv.org/abs/1912.04896), which may be an alternative approach to 'Model Persistence with UMAP' in Leland's long-term roadmap. Please give it a read and let me know what you think.



lmcinnes avatar lmcinnes commented on April 28, 2024 2

I think the SONG option listed above is not unreasonable. In practice, real-time transforms are not something that will be available any time soon in UMAP.


damithsenanayake avatar damithsenanayake commented on April 28, 2024 2

Hi Omar,
We will soon be making the SONG code available (in a couple of weeks). Please stay tuned. Cheers.


lmcinnes avatar lmcinnes commented on April 28, 2024 1

I think the answer to that is that not all the requisite code to manage it has been developed yet -- specifically the prediction aspect. There are a few ways to do that, but the most viable is something like parametric t-SNE, where one trains a function approximator (in this case a neural network) to learn a functional transform matching the embedding. I should note that in UMAP's case this would look somewhat akin to a "deep" word2vec-style training. Other prediction modes are possible, whereby one retrains the model holding the training points' embedding locations fixed and optimizing only the locations of the new points.

In other words, for now it is more an "in principle" statement -- none of this is hard in the sense that I believe there are no theoretical obstructions to making it work, so from an algorithmic research point of view it is "solved" on some level. In practice, yes, there is code that needs to be written to make this actually practical, and some of it is potentially somewhat non-trivial.
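A minimal sketch of the second prediction mode described above -- holding the training embedding fixed and placing new points relative to it. Everything here is illustrative, not UMAP's actual implementation: PCA stands in for a trained UMAP embedding (so the sketch runs without umap-learn), and each new point is simply initialized at the distance-weighted mean of its nearest training neighbors' embedding coordinates; a real implementation would then optimize those locations against the embedding objective.

```python
import numpy as np
from sklearn.datasets import load_digits
from sklearn.decomposition import PCA
from sklearn.model_selection import train_test_split
from sklearn.neighbors import NearestNeighbors

X, _ = load_digits(return_X_y=True)
X_train, X_test = train_test_split(X, random_state=0)

# Stand-in for a trained UMAP embedding of the training set,
# e.g. umap.UMAP().fit_transform(X_train)
emb_train = PCA(n_components=2, random_state=0).fit_transform(X_train)

# Find each test point's neighbors in the HIGH-dimensional space
nn = NearestNeighbors(n_neighbors=5).fit(X_train)
dists, idx = nn.kneighbors(X_test)

# Place each test point at the distance-weighted mean of its
# neighbors' LOW-dimensional coordinates (the training embedding
# itself never moves)
weights = 1.0 / (dists + 1e-8)
weights /= weights.sum(axis=1, keepdims=True)
emb_test = (weights[:, :, None] * emb_train[idx]).sum(axis=1)

print(emb_test.shape)  # one low-dimensional location per test point
```

With umap-learn installed, `emb_train` would come from a fitted UMAP model instead, and this initialization would be followed by a few optimization epochs over only the new points' locations.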


lmcinnes avatar lmcinnes commented on April 28, 2024 1

Nice! It sounds like both approaches are viable. I'm going to try to clean up the notebooks and then get my current approach working as a transform operation within UMAP itself. I would certainly be interested in neural network approaches as well, though (I just don't have much expertise in that area).



mlaprise avatar mlaprise commented on April 28, 2024

Ok, thanks for the explanation! That's exactly what I thought: the method allows it, but it's not really implemented yet. Just wanted to make sure. I would be happy to contribute to this at some point.


lmcinnes avatar lmcinnes commented on April 28, 2024

Contributions are more than welcome -- especially on a parametric version as I have limited neural network experience.


lmcinnes avatar lmcinnes commented on April 28, 2024

I have experimental code in some notebooks I wrote out of curiosity that can do a transform operation on new data under basic UMAP theory assumptions (place new data assuming the training embedding is fixed -- which is no different from, say, PCA). On my one test so far on MNIST digits it did great -- but then everything does great on MNIST digits. I think it should generalise though -- I'll have to put all the pieces together properly and try it on a few other datasets. One downside is that it is "slow" -- based on timings of doing it piecemeal, I think we're talking ~20s for 10,000 test points, compared to a 1m40s fit time for 60,000 training points. Does this seem reasonable to you?


mlaprise avatar mlaprise commented on April 28, 2024

Nice, I'd say 20s is really reasonable! Interestingly, I did an experiment as well. In fact I tried something really naive just out of curiosity. I first projected the data into the manifold using the UMAP code. Then I wrote a simple fully connected neural network and trained it on the result of the UMAP, essentially learning the function that does the projection. Then I used that model to do the dimensionality reduction in my predictive model (instead of the actual UMAP). Of course that model is totally specific to my problem/dataset, but the accuracy I get is similar to the one I got with the actual UMAP.


kylemcdonald avatar kylemcdonald commented on April 28, 2024

fwiw i ported laurens' parametric t-SNE implementation to keras a few years ago (https://github.com/kylemcdonald/Parametric-t-SNE) and tried both approaches: training a net to produce the same embedding as a previously "solved" t-SNE, and training a net to optimize for t-SNE's loss function directly. both gave interesting and useful results.

it gets really exciting when you can start using domain-specific operations like convolution or recurrence. for example, imagine UMAP running on images in a way that is simultaneously optimizing a convolutional network for processing the images at the same time as optimizing the embedding.
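The second of those two approaches -- optimizing a parametric map against an embedding loss directly, rather than regressing onto a precomputed embedding -- can be shown in miniature. This is only a toy: a simple MDS-style stress stands in for the t-SNE/UMAP loss, the "network" is a single linear layer `W`, and the gradient is taken numerically, but the training loop has the same shape as the real thing.

```python
import numpy as np

rng = np.random.default_rng(0)
X = rng.normal(size=(30, 5))  # 30 points in 5 dimensions

def pairwise(Y):
    # Euclidean distance matrix (epsilon avoids sqrt(0) issues)
    diff = Y[:, None, :] - Y[None, :, :]
    return np.sqrt((diff ** 2).sum(-1) + 1e-12)

D_high = pairwise(X)  # target: high-dimensional distances

def stress(W):
    # MDS-style loss: mismatch between low- and high-dim distances
    return ((pairwise(X @ W) - D_high) ** 2).mean()

W = rng.normal(scale=0.1, size=(5, 2))  # the parametric map (one linear layer)
init_stress = stress(W)

lr, eps = 1e-2, 1e-6
for step in range(150):  # crude numerical-gradient descent
    base = stress(W)
    grad = np.zeros_like(W)
    for i in range(W.shape[0]):
        for j in range(W.shape[1]):
            Wp = W.copy()
            Wp[i, j] += eps
            grad[i, j] = (stress(Wp) - base) / eps
    W -= lr * grad

final_stress = stress(W)
print(init_stress, "->", final_stress)  # stress drops as W trains
```

In the real setting the linear map would be a deep (e.g. convolutional) network, the loss would be the t-SNE or UMAP objective over neighbor probabilities, and the gradients would come from backpropagation rather than finite differences -- but new points would still be embedded by a single forward pass.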


JaeDukSeo avatar JaeDukSeo commented on April 28, 2024

@lmcinnes just as a question do you have a tutorial covering the math behind umap?


JaeDukSeo avatar JaeDukSeo commented on April 28, 2024

@lmcinnes Thank you so much! Not gonna lie, I am studying your implementation in umap, and it is very impressive! Amazing to see a mathematician who is also a proficient programmer.


paoloinglese avatar paoloinglese commented on April 28, 2024

An easy implementation of a transform method would be what Laurens suggests for his parametric t-SNE.

Use a multivariate regressor (a neural network?) fitted on the training dataset and its low-dimensional representation to project new data onto the same manifold.
It is a naive approach but easy to implement.
The other way is the analogous parametric t-SNE approach, which fits the model parameters using the t-SNE loss function directly.

@kylemcdonald's implementation is a very good starting point. If you want, we can work together on this.


paoloinglese avatar paoloinglese commented on April 28, 2024

So this is the implementation of the naive multivariate regressor to project new data onto the fitted UMAP embedding space:

https://github.com/paoloinglese/Parametric-t-SNE/blob/master/Parametric%20UMAP%20(Keras).ipynb

It's based on @kylemcdonald's implementation using Keras.
In the next few days I'll have a look at a more refined model, analogous to parametric t-SNE.


lmcinnes avatar lmcinnes commented on April 28, 2024

@paoloinglese: This looks really interesting -- it would add some non-trivial dependencies, but it is certainly worth looking into further. At the very least it would be very nice to have a documentation page similar to your notebook demonstrating how to do this sort of thing. I look forward to hearing more.


paoloinglese avatar paoloinglese commented on April 28, 2024

@lmcinnes Ok great! I'll prepare something in the next few days.


paoloinglese avatar paoloinglese commented on April 28, 2024

@lmcinnes I've updated the notebook, setting the TensorFlow backend, and added a simple k-NN classification that is trained on the UMAP embedding of the train set and predicts using the neural network's predicted UMAP embedding for the test set:

https://github.com/paoloinglese/Parametric-t-SNE/blob/master/Parametric%20UMAP%20(Keras).ipynb

Unfortunately, I don't have much time to put more text in the notebook. I guess the overall idea is pretty clear, as suggested previously in other messages here on GitHub.
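The evaluation described in this comment can be sketched end-to-end. This is a stand-in, not the notebook's actual code: PCA replaces the UMAP embedding (so it runs without umap-learn or Keras) and scikit-learn's MLPRegressor plays the role of the Keras network, but the structure -- regress onto the train embedding, predict the test embedding, score with k-NN -- is the same idea.

```python
from sklearn.datasets import load_digits
from sklearn.decomposition import PCA
from sklearn.model_selection import train_test_split
from sklearn.neighbors import KNeighborsClassifier
from sklearn.neural_network import MLPRegressor

X, y = load_digits(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

# Stand-in for umap.UMAP().fit_transform(X_train)
emb_train = PCA(n_components=2, random_state=0).fit_transform(X_train)

# Regressor learns X -> embedding, then predicts embeddings for test points
reg = MLPRegressor(hidden_layer_sizes=(64,), max_iter=500, random_state=0)
reg.fit(X_train, emb_train)
emb_test = reg.predict(X_test)

# k-NN fit on the true train embedding, scored on the predicted test embedding
knn = KNeighborsClassifier(n_neighbors=5).fit(emb_train, y_train)
acc = knn.score(emb_test, y_test)
print(f"k-NN accuracy on predicted embedding: {acc:.2f}")  # well above chance
```

If the predicted embedding is faithful, this accuracy approaches what you would get by classifying on the true embedding itself, which is the check the notebook performs.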


JaeDukSeo avatar JaeDukSeo commented on April 28, 2024

thanks for this


HyperPotatoNeo avatar HyperPotatoNeo commented on April 28, 2024

So I tried a simple autoencoder on CIFAR-10 a few months ago; here it is:
https://github.com/HyperPotatoNeo/Deep-learning-experiments-with-umap/blob/master/AE%20vs%20UMAP_%20Cifar10.ipynb


omaralvarez avatar omaralvarez commented on April 28, 2024

Any updates on this? UMAP is giving me great results, but I want to run it in real time for new unseen points, and the transform method is slow. Does anybody know how to get around this?


omaralvarez avatar omaralvarez commented on April 28, 2024

I think the SONG idea is pretty good. I would like to give it a try, but I have not found any code. I would like to reproduce its results and test it on my datasets.


none0none avatar none0none commented on April 28, 2024

> So this is the implementation of the naive multivariate regressor to project new data onto the fitted UMAP embedding space:
>
> https://github.com/paoloinglese/Parametric-t-SNE/blob/master/Parametric%20UMAP%20(Keras).ipynb
>
> It's based on @kylemcdonald's implementation using Keras.
> In the next few days I'll have a look at a more refined model, analogous to parametric t-SNE.

Today I checked this link, but it is no longer valid.


paoloinglese avatar paoloinglese commented on April 28, 2024

@none0none Yes, I removed it after @lmcinnes et al. published a refined model for parametric UMAP: https://arxiv.org/abs/2009.12981

