
Comments (3)

aymericdamien commented on May 2, 2024

It depends on what kind of data you want to use (sequences, images, features...). You need to format your dataset as a numpy array.
For such network https://github.com/aymericdamien/TensorFlow-Examples/blob/master/examples/3%20-%20Neural%20Networks/multilayer_perceptron.py, you can use:
X (your data): 2-D numpy array, with shape: [number_of_samples, features]
Y (your data labels): 2-D numpy array, one one-hot row per sample (for class "1" out of a total of 5 classes: [0, 1, 0, 0, 0]), with shape [number_of_samples, num_classes]
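The two shapes above can be sketched with plain numpy; the toy values below are hypothetical, only the shapes matter:

```python
import numpy as np

# Hypothetical toy dataset: 4 samples with 3 features each.
X = np.array([[0.1, 0.2, 0.3],
              [0.4, 0.5, 0.6],
              [0.7, 0.8, 0.9],
              [1.0, 1.1, 1.2]], dtype=np.float32)

# Integer class labels for each sample (5 classes total).
labels = np.array([1, 0, 4, 2])

# One-hot encode: class "1" out of 5 becomes [0, 1, 0, 0, 0].
num_classes = 5
Y = np.eye(num_classes, dtype=np.float32)[labels]

print(X.shape)  # (4, 3) -> [number_of_samples, features]
print(Y.shape)  # (4, 5) -> [number_of_samples, num_classes]
print(Y[0])     # [0. 1. 0. 0. 0.]
```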

But it would require some changes:

batch_xs, batch_ys = mnist.train.next_batch(batch_size)

You can change this line to instead return a batch of your own data (with shapes as above).
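A minimal sketch of such a replacement, assuming your data is already loaded into numpy arrays (`next_batch` here is a hypothetical helper, not a library function):

```python
import numpy as np

def next_batch(X, Y, batch_size, step):
    """Hypothetical replacement for mnist.train.next_batch:
    return the next batch_size rows, wrapping around the dataset."""
    n = X.shape[0]
    start = (step * batch_size) % n
    end = start + batch_size
    if end <= n:
        return X[start:end], Y[start:end]
    # Wrap around when the slice runs past the end of the data.
    idx = np.concatenate([np.arange(start, n), np.arange(0, end - n)])
    return X[idx], Y[idx]

# Toy data: 10 samples, 2 features, 3 classes.
X = np.arange(20, dtype=np.float32).reshape(10, 2)
Y = np.eye(3, dtype=np.float32)[np.arange(10) % 3]

batch_xs, batch_ys = next_batch(X, Y, 4, 0)
print(batch_xs.shape, batch_ys.shape)  # (4, 2) (4, 3)
```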

If your data is not too large, you may not need batching at all and can feed all the data directly:

xs, ys = YOUR_DATA, YOUR_LABELS

with tf.Session() as sess:
    sess.run(init)

    # Training cycle
    for epoch in range(training_epochs):
        # Fit training using the full dataset as one batch
        sess.run(optimizer, feed_dict={x: xs, y: ys})
        # Compute the loss (use a new name so the `cost` op is not overwritten)
        c = sess.run(cost, feed_dict={x: xs, y: ys})
        # Display logs per epoch step
        if epoch % display_step == 0:
            print "Epoch:", '%04d' % (epoch+1), "cost=", "{:.9f}".format(c)

    print "Optimization Finished!"

from tensorflow-examples.

rihun commented on May 2, 2024

Thanks.
It is really very helpful to me.

Regards
Nahid


rihun commented on May 2, 2024

Hi

I wrote my program as follows.

................................................................................................................

# Import MNIST data
#import input_data
#mnist = input_data.read_data_sets("/tmp/data/", one_hot=True)

import tensorflow as tf
from numpy import genfromtxt
mnist_data = genfromtxt('data_1.csv', delimiter=',')
mnist_label = genfromtxt('Level_1.csv', delimiter=',')

# Parameters
learning_rate = 0.001
training_epochs = 15
batch_size = 100
display_step = 1

# Network Parameters
n_hidden_1 = 64  # 1st layer num features
n_hidden_2 = 64  # 2nd layer num features
n_input = 9      # data input (9 features per sample)
n_classes = 2    # total classes

# tf Graph input
x = tf.placeholder("float", [None, n_input])
y = tf.placeholder("float", [None, n_classes])

# Create model
def multilayer_perceptron(_X, _weights, _biases):
    layer_1 = tf.nn.relu(tf.add(tf.matmul(_X, _weights['h1']), _biases['b1']))      # Hidden layer with RELU activation
    layer_2 = tf.nn.relu(tf.add(tf.matmul(layer_1, _weights['h2']), _biases['b2'])) # Hidden layer with RELU activation
    return tf.matmul(layer_2, _weights['out']) + _biases['out']

# Store layers weight & bias
weights = {
    'h1': tf.Variable(tf.random_normal([n_input, n_hidden_1])),
    'h2': tf.Variable(tf.random_normal([n_hidden_1, n_hidden_2])),
    'out': tf.Variable(tf.random_normal([n_hidden_2, n_classes]))
}
biases = {
    'b1': tf.Variable(tf.random_normal([n_hidden_1])),
    'b2': tf.Variable(tf.random_normal([n_hidden_2])),
    'out': tf.Variable(tf.random_normal([n_classes]))
}

# Construct model
pred = multilayer_perceptron(x, weights, biases)

# Define loss and optimizer
cost = tf.reduce_mean(tf.nn.softmax_cross_entropy_with_logits(pred, y))     # Softmax loss
optimizer = tf.train.AdamOptimizer(learning_rate=learning_rate).minimize(cost)  # Adam Optimizer

# Initializing the variables
xs, ys = mnist_data, mnist_label

init = tf.initialize_all_variables()
with tf.Session() as sess:
    sess.run(init)

    # Training cycle
    for epoch in range(training_epochs):
        cost = 0.
        # Fit training using batch data
        sess.run(optimizer, feed_dict={x: xs, y: ys})
        # Compute average loss
        cost = sess.run(cost, feed_dict={x: xs, y: ys})
        # Display logs per epoch step
        if epoch % display_step == 0:
            print "Epoch:", '%04d' % (epoch+1), "cost=", "{:.9f}".format(cost)

    print "Optimization Finished!"

......................................................................................................................................

But I got the following error.

File "", line 55
    with tf.Session() as sess:
    ^
IndentationError: unexpected indent
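This error means Python found inconsistent indentation around the `with` statement; everything that should run inside the session must sit one consistent level under the `with` line. A minimal stand-in sketch of the expected layout (`FakeSession` is a hypothetical placeholder for `tf.Session`, so this runs without TensorFlow):

```python
class FakeSession(object):
    """Stand-in for tf.Session, just to show the block structure."""
    def __enter__(self):
        return self
    def __exit__(self, *args):
        return False
    def run(self, op, feed_dict=None):
        return op

with FakeSession() as sess:
    result = sess.run("init")     # indented: inside the with block
    for epoch in range(3):        # still inside the with block
        result = sess.run("train")

print(result)  # outside the block again -> train
```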

Here I attach my data file and labels file. Could you please help me solve it?
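One thing worth checking once the indentation is fixed: `genfromtxt` on a single-column labels file returns a 1-D array, while the `y` placeholder above expects shape [None, n_classes]. A sketch of expanding a hypothetical 1-D 0/1 label vector to one-hot rows:

```python
import numpy as np

# Hypothetical stand-in for the 1-D label column loaded from Level_1.csv.
labels_1d = np.array([0, 1, 1, 0, 1])

n_classes = 2
# Expand to shape [number_of_samples, n_classes], as the y placeholder expects.
labels_one_hot = np.eye(n_classes, dtype=np.float32)[labels_1d.astype(int)]

print(labels_one_hot.shape)  # (5, 2)
```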

Thanks and regards
Nahid

