ezgan's People

Contributors

jonbruner

ezgan's Issues

Error possibly linked to definition of variable scope

Error:

ValueError: Variable d_w1/Adam/ does not exist, or was not created with tf.get_variable(). Did you mean to set reuse=None in VarScope?

Raised at the line: with tf.variable_scope(tf.get_variable_scope(), reuse=False) as scope:
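A likely cause, assuming the model functions switch the enclosing scope to reuse=True somewhere (e.g. via tf.get_variable_scope().reuse_variables(), as the reuse-argument suggestion in the next issue implies): once that happens, Adam can no longer create its slot variables such as d_w1/Adam with tf.get_variable, and reuse=False cannot undo an inherited reuse=True. Below is a minimal, self-contained sketch of one workaround using reuse=tf.AUTO_REUSE; the variable and loss are stand-ins, not the notebook's real model.

    import tensorflow as tf

    # Stand-ins for the notebook's discriminator weight and loss; the name
    # d_w1 only mirrors the error message.
    d_w1 = tf.get_variable('d_w1', [5, 5, 1, 32],
                           initializer=tf.truncated_normal_initializer(stddev=0.02))
    d_loss_fake = tf.reduce_sum(tf.square(d_w1))
    d_vars = [d_w1]

    # Simulate what appears to happen elsewhere in the notebook: once reuse is
    # switched on for the current scope, tf.get_variable can no longer create
    # new variables there.
    tf.get_variable_scope().reuse_variables()

    # AUTO_REUSE lets Adam create its slot variables (d_w1/Adam, d_w1/Adam_1)
    # while still reusing the variables that already exist.
    with tf.variable_scope(tf.get_variable_scope(), reuse=tf.AUTO_REUSE):
        d_trainer_fake = tf.train.AdamOptimizer(0.0001).minimize(
            d_loss_fake, var_list=d_vars)

An alternative is to build all optimizers before anything flips the scope to reuse, or to give the generator and discriminator their own named scopes as suggested in the next issue.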

Suggestions for improvements.

This is a really nice intro to GANs. I have some suggestions that might improve performance or fix some errors (possibly due to API changes).

  1. The batch normalization layer has an "is_training" argument, which should be set to False at evaluation time (e.g. when generating images). Also, I believe you need to update the moving mean and variance yourself (refer to the TensorFlow batch normalization API page; see the sketch after this list).

  2. The generator is not reused when producing fake images. Replacing

generator(batch_size, z_dimensions)

with Gz everywhere other than its initial definition should clear this up.

  3. Use tf.variable_scope('generator') and tf.variable_scope('discriminator') instead of passing a 'reuse' argument to the model functions.

  4. Using a transposed convolution rather than resize_images for up-sampling might improve the generator, since it can add detail instead of merely interpolating.
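A minimal sketch tying suggestions 1, 3, and 4 together, assuming TensorFlow 1.x. It wraps the generator in a named variable scope with reuse=tf.AUTO_REUSE, threads an is_training placeholder through batch normalization (using tf.layers.batch_normalization here; the is_training argument of tf.contrib.layers.batch_norm behaves analogously), and up-samples with transposed convolutions. The helper name build_generator, the layer sizes, and the placeholder loss are illustrative, not the notebook's code.

    import tensorflow as tf

    def build_generator(z, is_training):
        # A named scope with AUTO_REUSE replaces the manual reuse flag
        # (suggestion 3): calling this function twice reuses the same weights.
        with tf.variable_scope('generator', reuse=tf.AUTO_REUSE):
            g = tf.layers.dense(z, 7 * 7 * 64, activation=tf.nn.relu)
            g = tf.reshape(g, [-1, 7, 7, 64])
            # training=False at generation time freezes the moving statistics
            # (suggestion 1).
            g = tf.layers.batch_normalization(g, training=is_training)
            # Transposed convolutions instead of resize_images (suggestion 4).
            g = tf.layers.conv2d_transpose(g, 32, 5, strides=2, padding='same',
                                           activation=tf.nn.relu)
            g = tf.layers.conv2d_transpose(g, 1, 5, strides=2, padding='same',
                                           activation=tf.nn.sigmoid)
            return g

    z = tf.placeholder(tf.float32, [None, 100])
    is_training = tf.placeholder(tf.bool)
    Gz = build_generator(z, is_training)  # reuse Gz itself afterwards (suggestion 2)

    g_loss = tf.reduce_mean(tf.square(Gz))  # placeholder loss for the sketch
    g_vars = tf.get_collection(tf.GraphKeys.TRAINABLE_VARIABLES, scope='generator')

    # The moving mean/variance are updated through the UPDATE_OPS collection,
    # so make the train op depend on those updates (suggestion 1).
    update_ops = tf.get_collection(tf.GraphKeys.UPDATE_OPS, scope='generator')
    with tf.control_dependencies(update_ops):
        g_trainer = tf.train.AdamOptimizer(0.0001).minimize(g_loss, var_list=g_vars)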

The problem with d_trainer_fake = tf.train.AdamOptimizer(0.0001).minimize(d_loss_fake, var_list=d_vars)

Hi,

Thanks for sharing the code. When running ezgan.ipynb, I got the following error message. What might be causing it?

---------------------------------------------------------------------------
ValueError                                Traceback (most recent call last)
<ipython-input-5-08535f4b8698> in <module>()
     31 # Increasing from 0.001 in GitHub version
     32 with tf.variable_scope(tf.get_variable_scope(), reuse=False) as scope:
---> 33     d_trainer_fake = tf.train.AdamOptimizer(0.0001).minimize(d_loss_fake, var_list=d_vars)
     34     d_trainer_real = tf.train.AdamOptimizer(0.0001).minimize(d_loss_real, var_list=d_vars)
     35 

~\AppData\Local\Continuum\Anaconda3\envs\deeplab\lib\site-packages\tensorflow\python\training\optimizer.py in minimize(self, loss, global_step, var_list, gate_gradients, aggregation_method, colocate_gradients_with_ops, name, grad_loss)
    407 
    408     return self.apply_gradients(grads_and_vars, global_step=global_step,
--> 409                                 name=name)
    410 
    411   def compute_gradients(self, loss, var_list=None,

~\AppData\Local\Continuum\Anaconda3\envs\deeplab\lib\site-packages\tensorflow\python\training\optimizer.py in apply_gradients(self, grads_and_vars, global_step, name)
    550                        ([str(v) for _, _, v in converted_grads_and_vars],))
    551     with ops.init_scope():
--> 552       self._create_slots([_get_variable_for(v) for v in var_list])
    553     update_ops = []
    554     with ops.name_scope(name, self._name) as name:

~\AppData\Local\Continuum\Anaconda3\envs\deeplab\lib\site-packages\tensorflow\python\training\adam.py in _create_slots(self, var_list)
    129     # Create slots for the first and second moments.
    130     for v in var_list:
--> 131       self._zeros_slot(v, "m", self._name)
    132       self._zeros_slot(v, "v", self._name)
    133 

~\AppData\Local\Continuum\Anaconda3\envs\deeplab\lib\site-packages\tensorflow\python\training\optimizer.py in _zeros_slot(self, var, slot_name, op_name)
    982     named_slots = self._slot_dict(slot_name)
    983     if _var_key(var) not in named_slots:
--> 984       new_slot_variable = slot_creator.create_zeros_slot(var, op_name)
    985       self._restore_slot_variable(
    986           slot_name=slot_name, variable=var,

~\AppData\Local\Continuum\Anaconda3\envs\deeplab\lib\site-packages\tensorflow\python\training\slot_creator.py in create_zeros_slot(primary, name, dtype, colocate_with_primary)
    177     return create_slot_with_initializer(
    178         primary, initializer, slot_shape, dtype, name,
--> 179         colocate_with_primary=colocate_with_primary)
    180   else:
    181     if isinstance(primary, variables.Variable):

~\AppData\Local\Continuum\Anaconda3\envs\deeplab\lib\site-packages\tensorflow\python\training\slot_creator.py in create_slot_with_initializer(primary, initializer, shape, dtype, name, colocate_with_primary)
    151       with ops.colocate_with(primary):
    152         return _create_slot_var(primary, initializer, "", validate_shape, shape,
--> 153                                 dtype)
    154     else:
    155       return _create_slot_var(primary, initializer, "", validate_shape, shape,

~\AppData\Local\Continuum\Anaconda3\envs\deeplab\lib\site-packages\tensorflow\python\training\slot_creator.py in _create_slot_var(primary, val, scope, validate_shape, shape, dtype)
     63       use_resource=resource_variable_ops.is_resource_variable(primary),
     64       shape=shape, dtype=dtype,
---> 65       validate_shape=validate_shape)
     66   variable_scope.get_variable_scope().set_partitioner(current_partitioner)
     67 

~\AppData\Local\Continuum\Anaconda3\envs\deeplab\lib\site-packages\tensorflow\python\ops\variable_scope.py in get_variable(name, shape, dtype, initializer, regularizer, trainable, collections, caching_device, partitioner, validate_shape, use_resource, custom_getter, constraint)
   1295       partitioner=partitioner, validate_shape=validate_shape,
   1296       use_resource=use_resource, custom_getter=custom_getter,
-> 1297       constraint=constraint)
   1298 get_variable_or_local_docstring = (
   1299     """%s

~\AppData\Local\Continuum\Anaconda3\envs\deeplab\lib\site-packages\tensorflow\python\ops\variable_scope.py in get_variable(self, var_store, name, shape, dtype, initializer, regularizer, reuse, trainable, collections, caching_device, partitioner, validate_shape, use_resource, custom_getter, constraint)
   1091           partitioner=partitioner, validate_shape=validate_shape,
   1092           use_resource=use_resource, custom_getter=custom_getter,
-> 1093           constraint=constraint)
   1094 
   1095   def _get_partitioned_variable(self,

~\AppData\Local\Continuum\Anaconda3\envs\deeplab\lib\site-packages\tensorflow\python\ops\variable_scope.py in get_variable(self, name, shape, dtype, initializer, regularizer, reuse, trainable, collections, caching_device, partitioner, validate_shape, use_resource, custom_getter, constraint)
    437           caching_device=caching_device, partitioner=partitioner,
    438           validate_shape=validate_shape, use_resource=use_resource,
--> 439           constraint=constraint)
    440 
    441   def _get_partitioned_variable(

~\AppData\Local\Continuum\Anaconda3\envs\deeplab\lib\site-packages\tensorflow\python\ops\variable_scope.py in _true_getter(name, shape, dtype, initializer, regularizer, reuse, trainable, collections, caching_device, partitioner, validate_shape, use_resource, constraint)
    406           trainable=trainable, collections=collections,
    407           caching_device=caching_device, validate_shape=validate_shape,
--> 408           use_resource=use_resource, constraint=constraint)
    409 
    410     if custom_getter is not None:

~\AppData\Local\Continuum\Anaconda3\envs\deeplab\lib\site-packages\tensorflow\python\ops\variable_scope.py in _get_single_variable(self, name, shape, dtype, initializer, regularizer, partition_info, reuse, trainable, collections, caching_device, validate_shape, use_resource, constraint)
    763       raise ValueError("Variable %s does not exist, or was not created with "
    764                        "tf.get_variable(). Did you mean to set "
--> 765                        "reuse=tf.AUTO_REUSE in VarScope?" % name)
    766     if not shape.is_fully_defined() and not initializing_from_value:
    767       raise ValueError("Shape of a new variable (%s) must be fully defined, "

ValueError: Variable d_w1/Adam/ does not exist, or was not created with tf.get_variable(). Did you mean to set reuse=tf.AUTO_REUSE in VarScope?

Cannot save checkpoints

When I run EZGAN.ipynb, I get the following error: Parent directory of models/pretrained_gan.ckpt doesn't exist, can't save.
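tf.train.Saver.save does not create missing parent directories, so this most likely just means the models/ folder does not exist yet. A minimal sketch of a fix, assuming the notebook saves with tf.train.Saver (the dummy variable is only there so the Saver has something to write):

    import os
    import tensorflow as tf

    # Throwaway variable so the Saver has something to save; in the notebook
    # this would be the GAN's weights.
    dummy = tf.Variable(0.0, name='dummy')

    saver = tf.train.Saver()
    ckpt_path = 'models/pretrained_gan.ckpt'

    # Create the parent directory before calling save().
    os.makedirs(os.path.dirname(ckpt_path), exist_ok=True)

    with tf.Session() as sess:
        sess.run(tf.global_variables_initializer())
        saver.save(sess, ckpt_path)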
