
Comments (3)

HKervadec commented on July 17, 2024

Hi,

Does the FAQ answer your question? https://github.com/LIVIAETS/boundary-loss#can-the-loss-be-negative

Can the loss be negative?

Yes. As the distance map is signed (meaning that inside the object, the distance is negative), a perfect prediction will sum only negative distances, leading to a negative value. As we are in a minimization setting, this is not an issue.
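For concreteness, here is a minimal sketch of that idea (not the repository's exact code, and with a made-up toy distance map): the loss is taken as the mean of the element-wise product between the predicted probabilities and a pre-computed signed distance map, so a perfect prediction sums only the negative (inside) distances.

```python
import torch

def boundary_loss(probs: torch.Tensor, dist_maps: torch.Tensor) -> torch.Tensor:
    """Sketch of the boundary loss: mean of probs * signed distance map.

    probs:     predicted foreground probabilities, shape (B, H*W) here
    dist_maps: signed distance map of the ground truth, same shape,
               negative inside the object, positive outside.
    """
    return (probs * dist_maps).mean()

# Toy example: the ground-truth object is the left half of a 1x4 "image".
dist_map = torch.tensor([[-2., -1., 1., 2.]])   # negative inside, positive outside
perfect  = torch.tensor([[1., 1., 0., 0.]])     # perfect prediction
wrong    = torch.tensor([[0., 0., 1., 1.]])     # completely wrong prediction

print(boundary_loss(perfect, dist_map))  # tensor(-0.7500): only negative distances summed
print(boundary_loss(wrong, dist_map))    # tensor(0.7500)
```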

Also

Is the boundary loss optimized towards zero or towards -inf by the torch Adam optimizer?
Unless you clip it yourself, Adam will minimize it toward -infinity.
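If you did want to clip it yourself (purely a hypothetical usage, reusing the `boundary_loss` sketch above, e.g. to keep the reported value non-negative), a simple clamp would do. Note that this also zeroes the gradient once the raw loss drops below the threshold, so it changes the training behaviour, not just the logging:

```python
# Hypothetical clipping, continuing the sketch above.
# clamp(min=0.0) stops gradients from flowing once the raw loss is negative.
loss = boundary_loss(perfect, dist_map).clamp(min=0.0)
```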

Let me know,

Hoel


meijie0401 commented on July 17, 2024

I see! Based on your explanation, for example, when the predicted boundary is perfect, let's say the loss is -6. Adam still wants to minimize it toward -infinity. However, no matter how the perfect predicted boundary is adjusted, the loss will only get bigger, because the current boundary is already perfect. So the loss will stay at -6, right?

But the question is: if the loss stays at -6, then there must be a non-zero gradient that will adjust the weights of the network and thus change this perfect boundary. Or do you mean that when the loss stays at -6 (perfect boundary), the gradients are all zero?


HKervadec commented on July 17, 2024

Hej,

So the loss will stay at -6, right?

Yes, in that case the loss will stick at -6, no matter how hard Adam tries to go lower.

In the case of the boundary loss, its gradient (wrt the softmax) is the distance map. So you are right:

But the question is: if the loss stays at -6, then there must be a non-zero gradient that will adjust the weights of the network and thus change this perfect boundary.

It will indeed "reinforce" the confidence of those labels (if possible), while the loss itself will (hopefully) stay constant at its perfect value. I think most other losses have a similar effect (hence the risk of overfitting, and all the regularizers used when training deep neural networks).
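To make the gradient claim concrete, here is a small autograd check (continuing the toy sketch above, and assuming the mean-reduced formulation): the gradient of the loss with respect to the probabilities is the distance map itself, up to the 1/N factor from the mean, so it stays non-zero even for a perfect prediction and keeps pushing the probabilities in the same direction.

```python
import torch

dist_map = torch.tensor([[-2., -1., 1., 2.]])                  # signed distance map
probs = torch.tensor([[1., 1., 0., 0.]], requires_grad=True)   # perfect prediction

loss = (probs * dist_map).mean()
loss.backward()

print(loss)        # tensor(-0.7500, ...): the "perfect" loss value
print(probs.grad)  # tensor([[-0.5000, -0.2500, 0.2500, 0.5000]]) == dist_map / 4
```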

In practice, it hasn't been an issue at all for us (as shown by the validation performance over time), and it would be easy to deal with if an issue appeared.

Let me know if things are clearer,

Hoel

