
Comments (1)

Julius-ZCJ commented on August 24, 2024

I discarded the function 'load_initial_weights', because I think the weights and biases in the net are already initialized, as in this code:

# Method of the AlexNet model class; assumes `import tensorflow as tf` (TF 1.x)
def conv(self, x, filter_height, filter_width, num_filters, stride_y,
         stride_x, name, padding='SAME', groups=1):

    # Get number of input channels
    input_channels = int(x.get_shape()[-1])

    # Create lambda function for the convolution
    convolve = lambda i, k: tf.nn.conv2d(i, k,
                                         strides=[1, stride_y, stride_x, 1],
                                         padding=padding)

    with tf.variable_scope(name) as scope:
        # Create tf variables for the weights and biases of the conv layer
        w_init = tf.random_normal_initializer(mean=0.0, stddev=0.001,
                                              seed=None, dtype=tf.float32)
        # b_init = tf.constant_initializer(value)
        weights = tf.get_variable('weights',
                                  shape=[filter_height,
                                         filter_width,
                                         input_channels // groups,
                                         num_filters],
                                  initializer=w_init,
                                  trainable=True)
        biases = tf.get_variable('biases', shape=[num_filters],
                                 initializer=tf.ones_initializer(),
                                 trainable=True)

    if groups == 1:
        conv = convolve(x, weights)
    else:
        # Split input and weights and convolve them separately
        input_groups = tf.split(axis=3, num_or_size_splits=groups, value=x)
        weight_groups = tf.split(axis=3, num_or_size_splits=groups,
                                 value=weights)
        output_groups = [convolve(i, k)
                         for i, k in zip(input_groups, weight_groups)]

        # Concat the convolved output together again
        conv = tf.concat(axis=3, values=output_groups)  # concatenate the tensors

    # Add biases
    bias = tf.reshape(tf.nn.bias_add(conv, biases), tf.shape(conv))

    # Apply relu function
    relu = tf.nn.relu(bias, name=scope.name)

    return relu
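
For reference, a hypothetical call from the model's graph-building code (the `pool1` input and the layer parameters are illustrative, modeled on AlexNet's conv2, which uses two convolution groups):

# conv2 of AlexNet splits the convolution into two groups
conv2 = self.conv(pool1, 5, 5, 256, 1, 1, groups=2, name='conv2')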

If I delete the function 'load_initial_weights', will it have any influence on the net?
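
For context, here is a minimal sketch of what 'load_initial_weights' typically does in this kind of fine-tuning setup (an assumption based on the common pattern of loading the pretrained bvlc_alexnet.npy weights; WEIGHTS_PATH and SKIP_LAYER are assumed attributes of the model class):

import numpy as np
import tensorflow as tf

def load_initial_weights(self, session):
    # Load the pretrained weights into a dict of numpy arrays
    # (allow_pickle=True is needed for newer NumPy versions)
    weights_dict = np.load(self.WEIGHTS_PATH, encoding='bytes',
                           allow_pickle=True).item()

    # Loop over all layer names stored in the weights file
    for op_name in weights_dict:
        # Skip the layers that should be trained from scratch
        if op_name not in self.SKIP_LAYER:
            with tf.variable_scope(op_name, reuse=True):
                for data in weights_dict[op_name]:
                    if len(data.shape) == 1:
                        # 1-D arrays are biases
                        var = tf.get_variable('biases', trainable=False)
                        session.run(var.assign(data))
                    else:
                        # Everything else is weights
                        var = tf.get_variable('weights', trainable=False)
                        session.run(var.assign(data))

My understanding is that without this step the network still builds and trains, but it starts from the random and constant initializers above rather than the pretrained ImageNet weights, so it would be training from scratch instead of fine-tuning.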

from finetune_alexnet_with_tensorflow.
