
Comments (17)

lxtGH commented on May 28, 2024

Did you try to fine-tune the ResNet?

from gcnet.

kfxw commented on May 28, 2024

I have a similar problem. When training from scratch, the model converges very slowly.


ZzzjzzZ commented on May 28, 2024

I have a similar problem. When training from scratch, the model converges very slowly.

Me too.


xvjiarui commented on May 28, 2024

Sorry for the late reply.

Initially, we didn't have enough resources for training from scratch.
Later, we tried training the whole network from scratch on ImageNet and did not observe a similar issue. I suggest training it for 110 or 120 epochs to see the final performance.

Note that we use the same augmentation method as SENet.


kfxw commented on May 28, 2024

@xvjiarui
Hi! Regarding the fusion setting, did the case you mentioned use the 'add' one or the 'scale' one?


xvjiarui commented on May 28, 2024

We use the 'add' one by default. The 'scale' one, on the other hand, is similar to SENet. Neither of them should have convergence issues.
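For readers wiring this up themselves, the two fusion modes can be sketched roughly as follows. This is a minimal PyTorch illustration, not the repo's actual code; `context` stands for the transformed global-context tensor, and the sigmoid in the 'scale' branch mirrors SENet's gating:

```python
import torch

def fuse(x, context, mode="add"):
    """Fuse a transformed global-context tensor into the feature map.

    x:       features of shape (N, C, H, W)
    context: transformed context of shape (N, C, 1, 1), broadcast over H, W
    """
    if mode == "add":     # GCNet default: additive fusion
        return x + context
    if mode == "scale":   # SENet-style channel reweighting via a sigmoid gate
        return x * torch.sigmoid(context)
    raise ValueError(f"unknown fusion mode: {mode}")
```

Broadcasting handles the spatial dimensions, so the same `(N, C, 1, 1)` context works for either mode.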


kfxw commented on May 28, 2024

@xvjiarui
Thanks for your reply. By the way, would you mind sharing the classification training code in this repo? That would be of great help.


xvjiarui commented on May 28, 2024

@xvjiarui
Thanks for your reply. By the way, would you mind sharing the classification training code in this repo? That would be of great help.

Our internal code base is used for classification training. We may release a cleaned version in the future, but it is not on the schedule yet.

The block structure is the same; you could simply add it to your own code.


taoxinlily commented on May 28, 2024

Using the best setting of GC-ResNet50 and training it from scratch on ImageNet, I found that it gets stuck at a high loss in the early epochs before the training loss begins to decline normally. As a result, the final accuracy is much lower than the original ResNet-50's. Note that one difference from the original paper is that the GC modules are embedded in each bottleneck exactly as SE does, for a fair comparison.

Does anyone have the same problem?

This may be because the authors report the ImageNet results via a fine-tuning setting, which is not very common when validating models on ImageNet benchmarks. At least all the other modules (SE, SK, BAM, CBAM, AA) follow a training-from-scratch setting.

Hi! Did you solve the problem?


Shiro-LK commented on May 28, 2024

@xvjiarui
I am also trying to use the GC block in classifiers such as ResNet, VGG16, etc., and I would like to make sure I am doing it right.
First: in the ResNet backbone, we just need to apply the global context block before the downsample in the Bottleneck/BasicBlock class, correct?
Second: for the global context module, the inplanes parameter is the depth of the feature map that feeds the GC module, and the planes parameter is equal to inplanes // 16, is that right?
Regarding the 'pool' parameter, I suppose 'att' is better? And for the 'fusions' parameter, is 'channel_add' also better? Why does this parameter take a list? I am not sure I understand.
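To make those parameters concrete, here is a minimal PyTorch sketch of a GC block with 'att' pooling and 'channel_add' fusion. This is a simplified reading of the paper, not the official ContextBlock: the class name is hypothetical and details such as weight initialization are omitted.

```python
import torch
import torch.nn as nn

class GCBlock(nn.Module):
    """Simplified Global Context block: 'att' pooling + 'channel_add' fusion."""

    def __init__(self, inplanes, ratio=16):
        super().__init__()
        planes = inplanes // ratio                 # bottleneck width, i.e. inplanes // 16
        self.attn = nn.Conv2d(inplanes, 1, 1)      # 1x1 conv producing one attention map
        self.transform = nn.Sequential(            # channel-add transform branch
            nn.Conv2d(inplanes, planes, 1),
            nn.LayerNorm([planes, 1, 1]),
            nn.ReLU(inplace=True),
            nn.Conv2d(planes, inplanes, 1),
        )

    def forward(self, x):
        n, c, h, w = x.size()
        # Attention pooling: softmax over all spatial positions.
        attn = self.attn(x).view(n, 1, h * w).softmax(dim=-1)            # (N, 1, HW)
        # Weighted sum of features -> one global context vector per image.
        context = torch.bmm(x.view(n, c, h * w), attn.transpose(1, 2))   # (N, C, 1)
        context = context.view(n, c, 1, 1)
        # Channel-add fusion: broadcast the transformed context onto the features.
        return x + self.transform(context)
```

The block is shape-preserving, so it can be dropped into a Bottleneck without changing the surrounding layers.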


xvjiarui commented on May 28, 2024

For all the yes/no questions, the answer is yes; your understanding is correct.
The fusions argument is a list because multiple fusion methods can be used together.


ma-xu commented on May 28, 2024

For all the yes/no questions, the answer is yes; your understanding is correct.
The fusions argument is a list because multiple fusion methods can be used together.
@xvjiarui
Hi, thanks a lot for your great work; I appreciate it. However, when I trained the network on ImageNet, GC achieved worse performance than the original ResNet. I then followed the paper and fine-tuned the ResNet-50 for another 40 epochs using a cosine schedule, and the performance was still poor. Could you please share your fine-tuning code? I would cite your work in my research. Thanks a lot.
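For reference, the fine-tuning recipe mentioned here (40 extra epochs with a cosine learning-rate schedule) can be sketched as follows. The optimizer hyperparameters and the stand-in model are assumptions for illustration, not the authors' exact settings:

```python
import torch
import torch.nn as nn

# Stand-in for a ResNet-50 with GC blocks inserted, initialized
# from pretrained ResNet-50 weights (hypothetical setup).
model = nn.Linear(8, 8)

optimizer = torch.optim.SGD(model.parameters(), lr=0.01, momentum=0.9,
                            weight_decay=1e-4)
epochs = 40  # fine-tune for 40 additional epochs, as described in the paper
scheduler = torch.optim.lr_scheduler.CosineAnnealingLR(optimizer, T_max=epochs)

for epoch in range(epochs):
    # ... run one training epoch over ImageNet here ...
    optimizer.step()    # placeholder for the per-batch updates
    scheduler.step()    # cosine-decay the learning rate once per epoch
```

With `T_max=epochs` and the default `eta_min=0`, the learning rate decays smoothly from its initial value to zero over the 40 epochs.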


Shiro-LK commented on May 28, 2024

@xvjiarui

Hi, thanks for your reply.
Do you still have the model trained on ImageNet with the GC block?


xvjiarui commented on May 28, 2024

Hi, @13952522076
Sorry for the late reply.
Currently, I am too busy to release that part of the code.
If the issue is overfitting, I suggest adopting the augmentations from the original paper, as well as dropout on the GC block branch.
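One way to read "dropout on the GC block branch" is to apply dropout to the transformed context before it is fused back into the features. A hypothetical sketch (the wrapper name and dropout rate are guesses, not the authors' implementation):

```python
import torch
import torch.nn as nn

class GCBranchWithDropout(nn.Module):
    """Wrap a GC transform branch with dropout before the additive fusion.

    `branch` is assumed to map a pooled context of shape (N, C, 1, 1)
    to a tensor of the same shape.
    """

    def __init__(self, branch, p=0.1):
        super().__init__()
        self.branch = branch
        self.drop = nn.Dropout(p)   # active only in training mode

    def forward(self, x, context):
        # Dropout regularizes the context contribution; the residual path
        # through x is left untouched.
        return x + self.drop(self.branch(context))
```

In evaluation mode `nn.Dropout` is the identity, so inference is unchanged.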


xvjiarui commented on May 28, 2024

Hi, @Shiro-LK
The models are not available for now. I will let you know when I train them again.


ma-xu commented on May 28, 2024

Hi, @13952522076
Sorry for the late reply.
Currently, I am too busy to release that part of the code.
If the issue is overfitting, I suggest adopting the augmentations from the original paper, as well as dropout on the GC block branch.

Thanks a lot for your reply. It looks like it is not an overfitting issue (judging from train loss vs. val loss). Anyway, I appreciate your work; it has helped a lot. 😺


yhonker commented on May 28, 2024

@ma-xu Hello, has this problem been solved?

