Comments (15)
I'm getting the same error using Google Colab.
from introtodeeplearning.
It seems that to get deterministic behavior in the latest TensorFlow you need to use
tf.keras.utils.set_random_seed(1)
tf.config.experimental.enable_op_determinism()
However, this alone will not solve the problem, as the expected results in the test library need to be recalculated anyway
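As a library-agnostic illustration of why re-seeding matters here (using NumPy's `Generator` as a stand-in for TensorFlow's weight initializers): seeding immediately before initialization makes the "random" weights reproducible across runs.

```python
import numpy as np

def init_weights(seed, shape=(2, 3)):
    # Re-seeding before each initialization yields identical "random" weights
    rng = np.random.default_rng(seed)
    return rng.uniform(size=shape)

w1 = init_weights(1)
w2 = init_weights(1)
assert np.allclose(w1, w2)  # same seed -> same weights, run after run
```

This only fixes reproducibility on one side; if the test library's hard-coded expected values were generated under a different RNG state, the assertion still fails.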
It looks like #118 has solved the issue. When will we merge it?
The global seed alone fails to ensure reproducibility; you probably have to specify an operation-level seed as well to get the same result.
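A stdlib-only sketch of that distinction, using Python's `random` module as a stand-in for TensorFlow's RNG: a global seed only reproduces results if every random op runs in exactly the same order, whereas an operation-level seed pins each op individually.

```python
import random

# Global seed only: results depend on how many random ops ran before this one.
random.seed(1)
a = random.random()          # first draw after seeding
random.seed(1)
_ = random.random()          # an extra op consumes generator state...
b = random.random()          # ...so this draw no longer matches `a`

# Operation-level seed: each op gets its own generator, immune to call order.
c = random.Random(42).random()
d = random.Random(42).random()

assert a != b and c == d
```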
Hi, I ran into the same problem. Did you get this resolved?
Ran into the same exact issue.
Getting same issue here:
AssertionError Traceback (most recent call last)
in
34 # test the output!
35 print(y.numpy())
---> 36 mdl.lab1.test_custom_dense_layer_output(y)

3 frames
/usr/local/lib/python3.9/dist-packages/numpy/testing/_private/utils.py in assert_array_compare(comparison, x, y, err_msg, verbose, header, precision, equal_nan, equal_inf)
    842     verbose=verbose, header=header,
    843     names=('x', 'y'), precision=precision)
--> 844     raise AssertionError(msg)
    845 except ValueError:
    846     import traceback

AssertionError:
Arrays are not almost equal to 7 decimals
[FAIL] output is of incorrect value. expected [[0.14785646 0.23566338 0.8568521 ]] but got [[0.2697859 0.45750418 0.66536945]]
Mismatched elements: 3 / 3 (100%)
Max absolute difference: 0.2218408
Max relative difference: 0.4848935
x: array([[0.1478565, 0.2356634, 0.8568521]], dtype=float32)
y: array([[0.2697859, 0.4575042, 0.6653695]], dtype=float32)
Did anyone get any solution?
Yes, the PR suggested by prashantkhurana works for me. Maybe the version of mitdeeplearning should be bumped.
I used an NVIDIA RTX A5000 (CUDA) on x64 Ubuntu, TensorFlow 2.12, Python 3.11.
UPD: also tested on an Apple M1 Pro with tensorflow-metal and TensorFlow 2.12.
# Google Colab
class OurDenseLayer(tf.keras.layers.Layer):
    def __init__(self, n_output_nodes):
        super(OurDenseLayer, self).__init__()
        self.n_output_nodes = n_output_nodes

    def build(self, input_shape):
        d = int(input_shape[-1])
        # Define and initialize parameters: a weight matrix W and bias b
        # Note that parameter initialization is random!
        self.W = self.add_weight("weight", shape=[d, self.n_output_nodes])  # note the dimensionality
        self.b = self.add_weight("bias", shape=[1, self.n_output_nodes])  # note the dimensionality

    def call(self, x):
        '''TODO: define the operation for z (hint: use tf.matmul)'''
        z = tf.matmul(x, self.W) + self.b
        '''TODO: define the operation for out (hint: use tf.sigmoid)'''
        y = tf.sigmoid(z)
        return y

# Since layer parameters are initialized randomly, we will set a random seed for reproducibility
tf.random.set_seed(1)
layer = OurDenseLayer(3)
layer.build((1, 2))
x_input = tf.constant([[1, 2.]], shape=(1, 2))
y = layer.call(x_input)

# test the output!
print(y.numpy())
mdl.lab1.test_custom_dense_layer_output(y)
AssertionError:
Arrays are not almost equal to 7 decimals
[FAIL] output is of incorrect value. expected [[0.2086701 0.68627274 0.14512508]] but got [[0.2697859 0.45750418 0.66536945]]
Mismatched elements: 3 / 3 (100%)
Max absolute difference: 0.52024436
Max relative difference: 0.78188795
x: array([[0.2086701, 0.6862727, 0.1451251]], dtype=float32)
y: array([[0.2697859, 0.4575042, 0.6653695]], dtype=float32)
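For what it's worth, the two TODOs in the snippet above reduce to an affine transform followed by a sigmoid. A minimal NumPy sketch with fixed (non-random) stand-in weights, just to show the math — it does not reproduce the lab test's expected values:

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def dense_forward(x, W, b):
    # z = xW + b, then elementwise sigmoid, matching the layer's call()
    return sigmoid(x @ W + b)

x = np.array([[1.0, 2.0]])   # shape (1, 2), like x_input in the lab
W = np.full((2, 3), 0.5)     # deterministic stand-in for the random weights
b = np.zeros((1, 3))
y = dense_forward(x, W, b)
assert y.shape == (1, 3)
```

So the implementation above is correct; the failure comes from the randomly initialized W and b not matching the values baked into the test.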
same here
Getting the same assertion error:
Arrays are not almost equal to 7 decimals
[FAIL] output is of incorrect value. expected [[0.9217561 0.39742658 0.14384568]] but got [[0.2697859 0.45750418 0.66536945]]
Mismatched elements: 3 / 3 (100%)
Getting same assertion error
Getting the same error on Google Colab. Any fix yet?
Still getting this error on colab even after replacing tf.random.set_seed(1) with
tf.keras.utils.set_random_seed(1)
tf.config.experimental.enable_op_determinism()
Still getting this error on colab even after replacing tf.random.set_seed(1) with tf.keras.utils.set_random_seed(1) tf.config.experimental.enable_op_determinism()
I think it's because the mitdeeplearning library file mitdeeplearning/lab1.py also needs to be patched with the new expected values, but Colab installs the remote version, so this should work after a version update. See the PR above. I tested on both x64 and Apple M1, so it will likely work.
same error
/usr/local/lib/python3.10/dist-packages/ipykernel/ipkernel.py:283: DeprecationWarning: should_run_async will not call transform_cell automatically in the future. Please pass the result to transformed_cell argument and any exception that happen during the transform in preprocessing_exc_tuple in IPython 7.17 and above.
  and should_run_async(code)
/usr/lib/python3.10/random.py:370: DeprecationWarning: non-integer arguments to randrange() have been deprecated since Python 3.10 and will be removed in a subsequent version
return self.randrange(a, b+1)
[[0.27064407 0.1826951 0.50374055]]
AssertionError Traceback (most recent call last)
in <cell line: 40>()
38 # test the output!
39 print(y.numpy())
---> 40 mdl.lab1.test_custom_dense_layer_output(y)
1 frames
[... skipping hidden 2 frame]
/usr/local/lib/python3.10/dist-packages/numpy/testing/_private/utils.py in assert_array_compare(comparison, x, y, err_msg, verbose, header, precision, equal_nan, equal_inf)
842 verbose=verbose, header=header,
843 names=('x', 'y'), precision=precision)
--> 844 raise AssertionError(msg)
845 except ValueError:
846 import traceback
AssertionError:
Arrays are not almost equal to 7 decimals
[FAIL] output is of incorrect value. expected [[0.27064407 0.1826951 0.50374055]] but got [[0.2697859 0.45750418 0.66536945]]
Mismatched elements: 3 / 3 (100%)
Max absolute difference: 0.27480906
Max relative difference: 0.60067004
x: array([[0.2706441, 0.1826951, 0.5037405]], dtype=float32)
y: array([[0.2697859, 0.4575042, 0.6653695]], dtype=float32)