Comments (3)
Thanks for your interest in this project! This is a curious issue, though I'm not familiar with WebAssembly at all.
Anyway, enabling the array generator functions to use a pre-instantiated Rng
object solves this problem, as you said. This abstraction is definitely useful, but I also want to provide a default Rng
. So the compromise will be something like:
let arr1 = ndarray_ext::ArrRng::default().glorot_uniform(&[2, 3]);
let arr2 = ndarray_ext::ArrRng::new(Pcg).glorot_uniform(&[2, 3]);
What do you think?
In this case, the library implementation would look like the following...
// rand 0.4-era API: the IndependentSample trait provides ind_sample for Range.
use rand::Rng;
use rand::distributions::{IndependentSample, Range};

struct ArrRng<R: Rng> {
    rng: R,
}

impl Default for ArrRng<rand::XorShiftRng> {
    fn default() -> Self {
        ArrRng { rng: rand::weak_rng() }
    }
}

impl<R: Rng> ArrRng<R> {
    pub fn new(rng: R) -> Self {
        ArrRng { rng }
    }

    pub fn glorot_uniform(&mut self, shape: &[usize]) -> NdArray {
        let s = (6. / shape[0] as f64).sqrt();
        let dist = Range::new(-s, s);
        NdArray::from_shape_fn(shape, |_| dist.ind_sample(&mut self.rng) as f32)
    }

    // ...
}
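To see that this generic-RNG design type-checks end to end, here is a self-contained sketch of the same shape. It is purely illustrative: a toy xorshift generator stands in for rand's RNGs, and Vec<f32> stands in for NdArray, so none of the names below are the real rust-autograd or rand APIs.

```rust
// Illustrative sketch of the ArrRng design; all names here are stand-ins.

trait Rng {
    fn next_u64(&mut self) -> u64;
    /// Uniform f64 in [0, 1), derived from the top 53 bits.
    fn next_f64(&mut self) -> f64 {
        (self.next_u64() >> 11) as f64 / (1u64 << 53) as f64
    }
}

/// Minimal xorshift64 generator standing in for a real RNG.
struct XorShift64(u64);

impl Rng for XorShift64 {
    fn next_u64(&mut self) -> u64 {
        let mut x = self.0;
        x ^= x << 13;
        x ^= x >> 7;
        x ^= x << 17;
        self.0 = x;
        x
    }
}

/// Array generator parameterized over the RNG, as in the proposal.
struct ArrRng<R: Rng> {
    rng: R,
}

// A fixed-seed default plays the role of rand::weak_rng().
impl Default for ArrRng<XorShift64> {
    fn default() -> Self {
        ArrRng { rng: XorShift64(0x9E3779B97F4A7C15) }
    }
}

impl<R: Rng> ArrRng<R> {
    pub fn new(rng: R) -> Self {
        ArrRng { rng }
    }

    /// Glorot-style uniform init; Vec<f32> stands in for NdArray.
    pub fn glorot_uniform(&mut self, shape: &[usize]) -> Vec<f32> {
        let s = (6. / shape[0] as f64).sqrt();
        let n: usize = shape.iter().product();
        (0..n)
            .map(|_| ((self.rng.next_f64() * 2. - 1.) * s) as f32)
            .collect()
    }
}

fn main() {
    // Mirrors the two usage styles from the proposal above.
    let arr1 = ArrRng::default().glorot_uniform(&[2, 3]);
    let arr2 = ArrRng::new(XorShift64(42)).glorot_uniform(&[2, 3]);
    let s = (6f64 / 2.).sqrt() as f32;
    assert_eq!(arr1.len(), 6);
    assert!(arr1.iter().chain(arr2.iter()).all(|x| x.abs() <= s));
}
```

Both call sites compile against the same impl block, which is the point of the design: the default RNG is an ordinary type parameter choice, not a special case.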
from rust-autograd.
Yes, that would solve my problem. I like this design too 🥇
It looks like you basically have it sorted out, but let me know if you'd like a hand with anything.
Ok, the next version (it may take several days) will contain code close to this.
Thank you, I will; immediate PRs are also welcome, since this is an experimental project 👍
Related Issues (20)
- segfault when calling grad() in a loop HOT 19
- access_elem failed HOT 1
- Unexpected gradient shape HOT 3
- Not differentiable with the given tensors error when trying to train multi-input neural network HOT 1
- "ndarray" and "autograd::ndarray" HOT 2
- Support for lgamma function HOT 22
- Gradient error for tensor of different dimensions HOT 7
- Alternatives for `tf.where()` HOT 12
- Bug for `g.argmax` HOT 1
- Index out of bound in DivOp when dividing 2 scalars HOT 2
- Documentation about GradientContext::set_input_grads is misleading HOT 1
- Use of 'extern crate' in examples HOT 1
- Dead PDF link in source HOT 1
- dropout with train=false produces error HOT 1
- Could I get a value "0" when it is not differentiable at any variable ,No panic? HOT 2
- Wrong docstrings
- softmax_cross_entropy outputs shape [-1], when it should output shape [-1, 1]. HOT 1
- Newbie-friendly documentation would be a huge benefit HOT 3
- Upgrade to ndarray 0.15 HOT 2
- "unreachable code" panic on certain uses of `grad_with_default`