Comments (14)

lhmRyan commented on July 29, 2024

My personal homepage is still under construction. If you have any questions, feel free to contact me via email.

lhmRyan commented on July 29, 2024

About cuDNN: our code uses cuDNN v4. You can download the tar file of cuDNN v4, untar it, and add its "include" and "lib64" directories to Makefile.config (e.g. INCLUDE_DIRS := $(PYTHON_INCLUDE) /usr/local/include /home/liu/cudnnv4/include; LIBRARY_DIRS := $(PYTHON_LIB) /usr/local/lib /usr/lib /home/liu/cudnnv4/lib64), and then you should be able to compile with cuDNN.
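
(For reference: stock Caffe also gates cuDNN behind a build flag in Makefile.config. Assuming this fork keeps the stock layout, the relevant lines would look like the sketch below, with /home/liu/cudnnv4 replaced by wherever you untarred cuDNN.)

    # enable cuDNN -- the standard Caffe switch; assumes this fork keeps it
    USE_CUDNN := 1
    # point the build at the untarred cuDNN v4 directories
    INCLUDE_DIRS := $(PYTHON_INCLUDE) /usr/local/include /home/liu/cudnnv4/include
    LIBRARY_DIRS := $(PYTHON_LIB) /usr/local/lib /usr/lib /home/liu/cudnnv4/lib64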

about "make test": I modified the code of "io.cpp" to support multi-label LMDB/leveldb, but I didn't modify the codes for testing, thus "make test" won't work here. However, our codes works normally as long as you can run "make all" successfully.

liuyuying0829 commented on July 29, 2024

Thank you so much!!! I used "make test -i" to ignore the errors (crazy...) and followed your example usage CIFAR-10/train_full.sh to continue your experiment. I didn't use cuDNN, and it is running now, but the speed is not fast: about 12 seconds per 100 iterations, with an average loss of about 1.2. Is that normal?

lhmRyan commented on July 29, 2024

Without cuDNN, 12 seconds per 100 iterations is normal. With our code, the final loss on CIFAR-10 is about 0.6 (12-bit codes), and the corresponding retrieval mAP is about 0.67.

liuyuying0829 commented on July 29, 2024

Thank you for your help!!!

liuyuying0829 commented on July 29, 2024

Dear authors,
I have a question: why do you use image pairs as the training input?

lhmRyan commented on July 29, 2024

We use image pairs and their corresponding similarity labels so that the network can learn to preserve the relationships between images that the similarity labels define. Such a scheme is widely adopted in hashing methods.

Actually, other loss functions for metric learning, the triplet ranking loss for example, could also be adopted; we've evaluated this in our recent experiments. So the use of image pairs is just one option, and other options are available.
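
(For reference, a minimal sketch of a contrastive pairwise loss of the kind described here. The symbols y and m and the 0/1 similarity convention are illustrative; the paper's actual Eqn.(5) additionally regularizes the relaxed codes, cf. the discussion of Eqn.(5)-(6) later in this thread.)

    L(b_i, b_j, y) = \frac{1}{2}(1 - y)\,\|b_i - b_j\|_2^2 + \frac{1}{2}\,y\,\max(m - \|b_i - b_j\|_2^2,\, 0)

Here y = 0 for a similar pair and y = 1 for a dissimilar one: similar pairs are pulled together, while dissimilar pairs are pushed apart until their distance exceeds the margin m.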

liuyuying0829 commented on July 29, 2024

Thank you so much for your detailed and patient explanation!! If I may ask, do you have a personal homepage? I am very interested in your research and want to learn from your work.

ahhan02 commented on July 29, 2024

Dear lhmRyan, I followed your advice to include cuDNN v4 and ran "make all" successfully.

However, my model didn't converge at all; the loss stays around 10.8 on average. By the way, I haven't changed any code except Makefile.config!!!

I have no idea what's wrong; could you give me some advice?? @lhmRyan

lhmRyan commented on July 29, 2024

@ahhan02 That's strange. I just cloned this repository to my computer, compiled it, downloaded the CIFAR-10 data, and trained the model. After 100 iterations, the loss drops to about 2.3.

You could try training the model again. If it still doesn't converge, please provide some more information, and maybe we can figure out the problem together.

ahhan02 commented on July 29, 2024

Aha, I recompiled the project without cuDNN v4, and it works now....

Anyway, I appreciate your help. @lhmRyan

lhmRyan commented on July 29, 2024

That's strange. I'll have a look at my code.

liuyuying0829 commented on July 29, 2024

Hello, author. These days I've been reading your CVPR 2016 paper again, and I have a question about the loss in the paper. In Eqn.(6), sigma(x) = 1 when x belongs to (-1, 0) ∪ [1, +∞) and equals -1 otherwise, and Eqn.(5) uses it. But in Eqn.(5), is b_i,j a vector? Or has b_i,j been relaxed to a continuous variable?

lhmRyan commented on July 29, 2024

@liuyuying0829 Hi. In Eqn.(3)~(5), the b_i,j have been relaxed to continuous vectors.

As explained below Eqn.(6), this equation is applied to Eqn.(5) element-wise; namely, Eqn.(6) is applied to each element of b_i,j separately.
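
(Concretely, for a k-dimensional relaxed code, element-wise application just means evaluating the scalar function of Eqn.(6) at each coordinate; the superscript notation below is illustrative, not the paper's.)

    \sigma(b_{i,j}) = \left(\sigma(b_{i,j}^{(1)}), \ldots, \sigma(b_{i,j}^{(k)})\right)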
