pierreablin / picard
Preconditioned ICA for Real Data
Home Page: https://pierreablin.github.io/picard
License: BSD 3-Clause "New" or "Revised" License
Hi @pierreablin! The current PyPI release is missing the scikit-learn
requirement, which you have already fixed in the latest dev version. Would you mind making a new release so that installing from PyPI works again out of the box?
I cannot find a scipy import statement in the code base, though scipy is listed as an installation requirement. I used the following command:
find . -type f | xargs grep scipy | grep import
Hello,
A while back, I created a conda-forge package for picard, and I've just updated it to version 0.6:
package: https://anaconda.org/conda-forge/python-picard (it might take an hour or so for 0.6 to show up)
recipe: https://github.com/conda-forge/python-picard-feedstock
I was wondering if it could make sense to include this in the installation instructions? WDYT?
In the README, it says the following about the Picard settings:
ortho=False, extended=False: same solution as Infomax
ortho=False, extended=True: same solution as extended-Infomax
ortho=True, extended=True: same solution as FastICA
It'd be nice to have an example (using real data, e.g. EEG data, since that's probably what most users deal with) that compares these settings directly, with the non-picard implementations taken from MNE-Python (or scikit-learn in the case of FastICA).
Hello,
First of all, thank you for this great package. It helps me a lot in my project.
While running picard on simple data sets, I encountered a FloatingPointError with Infomax and the exponential density model.
Thank you for your help,
Nicolas Captier
package version : 0.7
When running the picard function with ortho=False, extended=False, and fun='exp', a FloatingPointError appears. I tested different data sets and systematically encountered this error.
import numpy as np
from picard import picard

N, T = 3, 1000
S = np.random.laplace(size=(N, T))
A = np.random.randn(N, N)
X = np.dot(A, S)
K, W, Y = picard(X, ortho=False, extended=False, fun='exp')
Hi,
Thank you for the amazing package.
I have a question related to computing the mean during preprocessing.
The picard function takes X with shape (n_features, n_samples).
Lines 25 to 27 in 1557b10 use (axis=-1):
Lines 166 to 169 in 1557b10
But, I guess it should be the other way around?
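For reference, here is what axis=-1 does for the (n_features, n_samples) convention: it averages over samples, producing one mean per feature (channel), which is the usual ICA centering. A minimal sketch in plain NumPy:

```python
import numpy as np

# (n_features=2, n_samples=3): each row is one channel's time series
X = np.arange(6, dtype=float).reshape(2, 3)

X_mean = X.mean(axis=-1)          # shape (2,): one mean per feature
X_centered = X - X_mean[:, None]  # subtract each channel's own mean
```

After this, every row of X_centered averages to zero, so the samples axis is the one being reduced.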
I really appreciate your help in advance.
Minho
nose is no longer actively maintained; pytest is probably a better option for testing.
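For migration, note that pytest collects plain functions named test_* with bare asserts, so no nose-specific runner is needed. A hypothetical sketch (test name and content are illustrative, not from the project's suite):

```python
# test_preprocessing.py -- pytest discovers this file and any test_* function
# automatically; plain `assert` statements replace nose's assert helpers.
import numpy as np

def test_centering_removes_feature_means():
    rng = np.random.RandomState(0)
    X = rng.randn(3, 100)
    Xc = X - X.mean(axis=-1, keepdims=True)
    assert np.allclose(Xc.mean(axis=-1), 0.0)
```

Running `pytest` from the repository root would pick this up without any configuration.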
Now that you have added the extended parameter, it would be great if you made a new release on PyPI.
The command
zunzun@zunzun-laptop:~/github/l-bfgs-ica$ find . -type f | xargs grep scipy
yields one numpy reference and then these two items:
./doc/conf.py: 'scipy': 'http://docs.scipy.org/doc/scipy-0.17.0/reference',
./doc/index.rst: $ pip install numpy matplotlib scipy numexpr
I installed python-picard.
I could do 'import picard', but when I try to use it with MNE:
ica = ICA(n_components=0.95, method='picard', allow_ref_meg=True,random_state=0, max_iter=100)
I get the following error:
cannot import name 'picard' from 'picard' (unknown location)
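"(unknown location)" in that error usually means Python resolved `picard` to something other than the installed package, e.g. an empty directory on the path being treated as a namespace package. A hedged diagnostic sketch (the helper name is mine, not part of picard):

```python
import importlib.util

def locate(name):
    """Return the resolved origin of a module, or None if it is not importable."""
    spec = importlib.util.find_spec(name)
    return None if spec is None else spec.origin

# A properly installed package reports a concrete file path here; a namespace
# package shadowing the real install reports None or 'namespace' as its origin.
print(locate("json"))
```

If `locate("picard")` points somewhere unexpected (or returns None as the origin), checking for a stray `picard/` directory in the working directory is a good next step.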
When installing into a Docker image, we get:
Python 3.8.10 (default, May 12 2021, 15:46:43)
[GCC 8.3.0] on linux
Type "help", "copyright", "credits" or "license" for more information.
>>> import picard
Traceback (most recent call last):
File "<stdin>", line 1, in <module>
File "/usr/local/lib/python3.8/site-packages/picard/__init__.py", line 23, in <module>
from .dropin_sklearn import Picard # noqa
File "/usr/local/lib/python3.8/site-packages/picard/dropin_sklearn.py", line 10, in <module>
from sklearn.decomposition import FastICA
ModuleNotFoundError: No module named 'sklearn'
I think that's because the setup.py doesn't have a "requires" section, but I'm not sure.
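The kind of declaration that would fix this is an install_requires list in setup.py; a sketch under the assumption that the dependencies match the traceback above (this is not the project's actual setup.py):

```python
# setup.py (sketch) -- declaring runtime dependencies so that
# `pip install python-picard` pulls them in automatically.
from setuptools import setup, find_packages

setup(
    name="python-picard",
    packages=find_packages(),
    install_requires=[
        "numpy",
        "scipy",
        "scikit-learn",  # needed by picard/dropin_sklearn.py per the traceback
    ],
)
```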
You mention that Picard-O is able to recover both super- and sub-Gaussian sources. Is this also possible with Picard (i.e. this would correspond to Extended Infomax)?
Following @agramfort's suggestion, I've been adding the support for Picard in the Brainstorm GUI.
It's so fast, and I trust Alex's recommendations so blindly, that I've even made it the current default for ICA:
I've made it a plugin, easy to download and update from the Brainstorm package manager:
https://neuroimage.usc.edu/brainstorm/Tutorials/Plugins
For the moment, the interface looks like this:
The only parameter (beyond the data) that is passed to picard.m is the number of components, if a dimension reduction with PCA is requested before ICA.
If we set a parameter 'pca' lower than the rank of the input data, we still get a warning:
Warning: Input matrix is of deficient rank. Please consider to reduce dimensionality (pca) prior to ICA.
> In picard (line 157)
Is this expected?
Wouldn't we want to see this warning only if the number of requested components (or the number of signals) is higher than the rank of the input data?
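The behavior being asked for could be sketched like this in NumPy (a hypothetical helper illustrating the proposed condition, not the actual picard.m logic):

```python
import numpy as np

def should_warn(X, n_components):
    """Warn only when more components are requested than the data rank."""
    return n_components > np.linalg.matrix_rank(X)

rank_deficient = np.ones((4, 100))     # 4 identical channels: rank 1
print(should_warn(rank_deficient, 3))  # True: 3 components > rank 1
print(should_warn(rank_deficient, 1))  # False: request is within the rank
```

Under this rule, setting 'pca' at or below the rank would stay silent even for rank-deficient input.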
Any other comment regarding this integration is welcome!
Cheers
I ended up using the picard Python implementation due to #19. I've just hit something I believe to be a convergence issue: the picard Python implementation took a very long time to compute and couldn't reach the tolerance criterion within 360 iterations, giving back very weird components.
The last few progress prints from picard:
iteration 352, gradient norm = 33.42, loss = -717.8
line search failed, falling back to gradient
iteration 353, gradient norm = 31.96, loss = -717.7
line search failed, falling back to gradient
iteration 354, gradient norm = 20.22, loss = -717.8
line search failed, falling back to gradient
iteration 355, gradient norm = 18.03, loss = -717.7
line search failed, falling back to gradient
iteration 356, gradient norm = 15.43, loss = -717.7
line search failed, falling back to gradient
iteration 357, gradient norm = 13.68, loss = -717.8
iteration 358, gradient norm = 61.55, loss = -717.8
line search failed, falling back to gradient
iteration 359, gradient norm = 77.3, loss = -717.9
line search failed, falling back to gradient
iteration 360, gradient norm = 32.36, loss = -717.9
I checked the data and ran picard again with a very similar outcome: it took a very long time, didn't converge, and gave strange components.
I then used EEGLAB's extended Infomax (runica; I didn't use MNE's version because I was running everything from MATLAB, and transferring 2-D arrays between MATLAB and Python can be a pain). I can't say anything about the speed because I left it running without any timing set up, but it seems to have converged (although in more steps than the maxiter I set for picard). It returned meaningful components. The last few lines of output from runica:
step 413 - lrate 0.000000, wchange 0.00000021, angledelta 101.3 deg
step 414 - lrate 0.000000, wchange 0.00000019, angledelta 95.5 deg
step 415 - lrate 0.000000, wchange 0.00000019, angledelta 95.9 deg
step 416 - lrate 0.000000, wchange 0.00000015, angledelta 98.2 deg
step 417 - lrate 0.000000, wchange 0.00000017, angledelta 96.5 deg
step 418 - lrate 0.000000, wchange 0.00000016, angledelta 100.5 deg
step 419 - lrate 0.000000, wchange 0.00000014, angledelta 97.5 deg
step 420 - lrate 0.000000, wchange 0.00000013, angledelta 97.8 deg
step 421 - lrate 0.000000, wchange 0.00000012, angledelta 99.3 deg
step 422 - lrate 0.000000, wchange 0.00000011, angledelta 95.6 deg
step 423 - lrate 0.000000, wchange 0.00000010, angledelta 96.3 deg
If you are interested in investigating, I can help by sharing the file, or I could set up a better comparison between picard extended and standard extended Infomax.
The reference to the Picard-O article still refers to the arXiv version, not the ICASSP article :)
Hi @pierreablin
in the API of the current docs, I cannot see an extended parameter: https://pierreablin.github.io/picard/generated/picard.picard.html#picard.picard
yet it's there in the source code:
Line 43 in e839e77
The docs have the dev suffix, so I would expect them to reflect the current (development) status of the project.
For my current project I'm using MATLAB, so it's a bit easier to use the MATLAB implementation of picard. While the extended option is clearly visible in the Python picard implementation, I'm not sure whether it is available for MATLAB.
... should be listed in setup.py. When I ran python setup.py develop, it did not pull in the requirement.