
Comments (12)

iperov commented on August 24, 2024

Oh, this is non-obvious behavior of PyTorch; this is why I prefer TensorFlow :D
Thanks for your time.


iperov commented on August 24, 2024

I also checked running_mean and running_var of AdaptiveInstanceNorm2d during training:
they are always static; running_mean is always zero and running_var is always one.

so instead of

out = F.batch_norm(
    x_reshaped, running_mean, running_var, self.weight, self.bias,
    True, self.momentum, self.eps)

you are actually training

x * weight + bias

without any normalization at all!


iperov commented on August 24, 2024

I think it is more logical to first normalize the input and then apply the style-coded gamma and beta.
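
For reference, a direct implementation along those lines might look like this (a minimal sketch, not the FUNIT code; the function name is illustrative, and it assumes input of shape (b, c, h, w) with per-channel gamma and beta):

import torch

def adain_direct(x, gamma, beta, eps=1e-5):
    # Normalize each channel of each sample to zero mean and unit variance.
    mean = x.mean(dim=(2, 3), keepdim=True)
    var = x.var(dim=(2, 3), unbiased=False, keepdim=True)
    x_norm = (x - mean) / torch.sqrt(var + eps)
    # Then apply the style-coded affine parameters.
    return x_norm * gamma.view(1, -1, 1, 1) + beta.view(1, -1, 1, 1)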


iperov commented on August 24, 2024

quote from paper:

For each sample, AdaIN first normalizes the activations of a sample in each channel to have a zero mean and unit variance.

but it does not!


mingyuliutw commented on August 24, 2024

@iperov
You missed

x_reshaped = x.contiguous().view(1, b * c, *x.size()[2:])

before

out = F.batch_norm(
    x_reshaped, running_mean, running_var, self.weight, self.bias,
    True, self.momentum, self.eps)


iperov commented on August 24, 2024

Thanks for the reply.

Nothing is missed.

  1. By concatenating batch and channels together you compute mean and var across the whole batch; that is called batch normalization. But in instance normalization, mean and var are computed per channel of each sample.

  2. Currently the running_mean and running_var that you pass to batch_norm are always zero and one. Therefore your AdaIN layer works like x * weight + bias.

  3. I made batch norm work with momentum 0.1 in my Keras port. It crashes the model immediately.


iperov commented on August 24, 2024
D acc: 0.5641    G acc: 0.4028
Elapsed time in update: 21.891000
Iteration: 00000006/00100000
Python 3.6.8 (tags/v3.6.8:3c6b436a57, Dec 24 2018, 00:16:47) [MSC v.1916 64 bit
(AMD64)] on win32
Type "help", "copyright", "credits" or "license" for more information.
(InteractiveConsole)
>>> running_mean
tensor([0., 0., 0.,  ..., 0., 0., 0.], device='cuda:0')
>>> running_var
tensor([1., 1., 1.,  ..., 1., 1., 1.], device='cuda:0')

but

x * weight + bias

also works.
Maybe not as well as with normalization :)


mingyuliutw commented on August 24, 2024

@iperov

batch_norm computes a mean value and a deviation value per channel. By reshaping to b * c, we are treating each channel of each sample as its own channel. There are b*c channels now, and the effect of batch norm is equivalent to instance norm.
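
This equivalence is easy to verify numerically (a minimal sketch with random input and no affine parameters; the shapes are arbitrary assumptions):

import torch
import torch.nn.functional as F

b, c, h, w = 4, 8, 16, 16
x = torch.randn(b, c, h, w)

# Instance norm computed directly: statistics per sample, per channel.
ref = F.instance_norm(x)

# Batch norm over the reshaped (1, b*c, h, w) tensor: each channel of
# each sample becomes its own "channel", so the statistics match.
x_reshaped = x.contiguous().view(1, b * c, h, w)
out = F.batch_norm(x_reshaped, torch.zeros(b * c), torch.ones(b * c),
                   training=True).view(b, c, h, w)

print(torch.allclose(ref, out, atol=1e-5))  # True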


iperov commented on August 24, 2024

Yes, my mistake, you're right.

But what about

>>> running_mean
tensor([0., 0., 0.,  ..., 0., 0., 0.], device='cuda:0')
>>> running_var
tensor([1., 1., 1.,  ..., 1., 1., 1.], device='cuda:0')

?


mingyuliutw commented on August 24, 2024

Subtracting zero and then dividing by one is equal to doing nothing.


iperov commented on August 24, 2024

F.batch_norm should compute

(x - running_mean) / sqrt(running_var + eps)

then compute the current mean and var
and blend them into running_mean and running_var with momentum 0.1.
So I guess there is no normalization applied.
Am I wrong?


mingyuliutw commented on August 24, 2024

torch.nn.functional.batch_norm(input, running_mean, running_var, weight=None, bias=None, training=False, momentum=0.1, eps=1e-05)

Above is from the PyTorch doc. Since we set the training mode to True, running_mean and running_var are not used for normalization; the statistics of the current batch are used instead.

out = F.batch_norm(
    x_reshaped, running_mean, running_var, self.weight, self.bias,
    True, self.momentum, self.eps)
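
One can check that the running buffers have no effect on the output in training mode (a minimal sketch; the two calls differ only in the running stats passed in):

import torch
import torch.nn.functional as F

x = torch.randn(2, 6, 8, 8)

out_a = F.batch_norm(x, torch.zeros(6), torch.ones(6), training=True)
out_b = F.batch_norm(x, torch.full((6,), 100.0), torch.full((6,), 9.0),
                     training=True)

# Identical outputs: in training mode the current batch statistics are
# used for normalization; the running buffers are only updated in place.
print(torch.allclose(out_a, out_b))  # True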
