
picard's People

Contributors

agramfort, arokem, mathurinm, mmagnuski, pierreablin, sappelhoff


picard's Issues

missing scipy import

I cannot find a scipy import statement in the code base, though scipy is listed as an installation requirement. I used the following command:

find . -type f | xargs grep scipy | grep import

Example request: compare Picard "settings" with FastICA, Infomax, Extended Infomax

In the README, it says the following about the Picard settings:

ortho=False, extended=False: same solution as Infomax
ortho=False, extended=True: same solution as extended-Infomax
ortho=True, extended=True: same solution as FastICA

It'd be nice to have an example (using real data, e.g. EEG data, since that is probably what most users deal with) that directly compares:

  • Picard (ortho=False, extended=False) with Infomax
  • Picard (ortho=False, extended=True) with extended Infomax
  • Picard (ortho=True, extended=True) with FastICA

where the non-Picard implementations are taken from MNE-Python (or scikit-learn in the case of FastICA).
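
A minimal sketch of what such a comparison could look like, using synthetic data in place of EEG and scikit-learn's FastICA as the reference for the third setting (the random_state keyword and the synthetic setup here are assumptions for illustration, not part of the requested example):

import numpy as np
from picard import picard
from sklearn.decomposition import FastICA

rng = np.random.RandomState(0)
N, T = 4, 5000
S = rng.laplace(size=(N, T))   # independent Laplace sources
A = rng.randn(N, N)            # mixing matrix
X = np.dot(A, S)               # mixtures, shape (n_features, n_samples)

# The three Picard settings listed in the README
_, _, Y_infomax = picard(X, ortho=False, extended=False, random_state=0)
_, _, Y_ext_infomax = picard(X, ortho=False, extended=True, random_state=0)
_, _, Y_fastica_like = picard(X, ortho=True, extended=True, random_state=0)

# Reference FastICA from scikit-learn (expects shape (n_samples, n_features))
Y_sklearn = FastICA(random_state=0).fit_transform(X.T).T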

[Bug][version : 0.7] Running picard with infomax and an exponential density model leads to a FloatingPointError

Hello,

First of all, thank you for this great package. It helps me a lot in my project.

Running picard on simple data sets, I encountered a FloatingPointError with Infomax and the exponential density model.

Thank you for your help,

Nicolas Captier

package version : 0.7

Description of the bug

When running the picard function with ortho=False, extended=False and fun='exp', a FloatingPointError is raised. I tested different data sets and systematically encountered this error.

Minimal code to reproduce the bug

import numpy as np
from picard import picard

N, T = 3, 1000
S = np.random.laplace(size=(N, T))  # independent Laplace sources
A = np.random.randn(N, N)           # random mixing matrix
X = np.dot(A, S)                    # observed mixtures, shape (n_features, n_samples)
K, W, Y = picard(X, ortho=False, extended=False, fun='exp')  # raises FloatingPointError

Screenshots of the error messages are attached to the original issue.

Demeaning the wrong dimension?

Hi,
Thank you for the amazing package.

I have a question related to computing the mean during preprocessing.
The picard function takes X with shape (n_features, n_samples).

picard/picard/solver.py

Lines 25 to 27 in 1557b10

X : array-like, shape (n_features, n_samples)
    Training vector, where n_samples is the number of samples and
    n_features is the number of features.

However, when centering is performed during data preprocessing,
the data is demeaned against the sample dimension (axis=-1):

picard/picard/solver.py

Lines 166 to 169 in 1557b10

if centering:
    # Center the columns (ie the variables)
    X_mean = X1.mean(axis=-1)
    X1 -= X_mean[:, np.newaxis]

But, I guess it should be the other way around?
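
For illustration, a small sketch of what the current code does for data of shape (n_features, n_samples): demeaning along axis=-1 removes each feature's (row's) mean across samples.

import numpy as np

# Toy data in picard's (n_features, n_samples) convention: 2 features, 5 samples
X = np.arange(10, dtype=float).reshape(2, 5)

# Demean along the last axis, as in solver.py
X_mean = X.mean(axis=-1)
X_centered = X - X_mean[:, np.newaxis]

print(X_centered.mean(axis=-1))  # each feature (row) now has zero mean: [0. 0.]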

I really appreciate your help in advance.
Minho

Make new PyPI release

Now that you have added the extended parameter, it would be great if you made a new release on PyPI.

cannot import name 'picard' from 'picard' (unknown location)

I installed python-picard.
I could do 'import picard', but when I try to use it with MNE

ica = ICA(n_components=0.95, method='picard', allow_ref_meg=True, random_state=0, max_iter=100)

I get the following error:

cannot import name 'picard' from 'picard' (unknown location)

Requirements are not automatically installed when running a pip install

When installing into a Docker image, we get:

Python 3.8.10 (default, May 12 2021, 15:46:43) 
[GCC 8.3.0] on linux
Type "help", "copyright", "credits" or "license" for more information.
>>> import picard
Traceback (most recent call last):
  File "<stdin>", line 1, in <module>
  File "/usr/local/lib/python3.8/site-packages/picard/__init__.py", line 23, in <module>
    from .dropin_sklearn import Picard  # noqa
  File "/usr/local/lib/python3.8/site-packages/picard/dropin_sklearn.py", line 10, in <module>
    from sklearn.decomposition import FastICA
ModuleNotFoundError: No module named 'sklearn'

I think that's because setup.py doesn't declare its dependencies (e.g. via install_requires), but I'm not sure.
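
A minimal sketch of how the dependencies could be declared in setup.py (the exact package list is an assumption based on the traceback above and the other issues here):

from setuptools import setup, find_packages

setup(
    name='python-picard',
    packages=find_packages(),
    install_requires=[
        'numpy',
        'scipy',
        'numexpr',
        'scikit-learn',  # needed for the FastICA drop-in import in the traceback
    ],
)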

Integration with Brainstorm / Rank check

Following @agramfort's suggestion, I've been adding support for Picard in the Brainstorm GUI.
It's so fast, and I trust Alex's recommendations so blindly, that I've even made it the current default for ICA.

I've made it a plugin, easy to download and update from the Brainstorm package manager:
https://neuroimage.usc.edu/brainstorm/Tutorials/Plugins

A screenshot of the current interface is attached to the original issue.

The only parameter (beyond the data) that is passed to picard.m is the number of components, if a dimensionality reduction with PCA is requested before the ICA.
If we set the 'pca' parameter lower than the rank of the input data, we still get a warning:

Warning: Input matrix is of deficient rank. Please consider to reduce dimensionality (pca) prior to ICA. 
> In picard (line 157)

Is this expected?
Wouldn't we want to see this warning only if the number of requested components (or the number of signals) is higher than the rank of the input data?
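
Expressed in Python rather than the MATLAB of picard.m, a rough sketch of the check suggested above (the names are illustrative, not the actual picard.m variables):

import numpy as np

def maybe_warn_rank(X, n_components, tol=None):
    # Warn only when more components are requested than the data rank supports
    rank = np.linalg.matrix_rank(X, tol=tol)
    if n_components > rank:
        print('Warning: %d components requested but the data has rank %d; '
              'consider reducing dimensionality (pca) prior to ICA.'
              % (n_components, rank))
    return rank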

Any other comment regarding this integration is welcome!
Cheers

Convergence issues

I ended up using the picard Python implementation due to #19. I've just hit something I believe to be a convergence issue: the picard Python implementation took very long to compute and couldn't reach the tolerance criterion in 360 iterations, giving back very weird components.
The last few progress prints from picard:

iteration 352, gradient norm = 33.42, loss = -717.8
line search failed, falling back to gradient
iteration 353, gradient norm = 31.96, loss = -717.7
line search failed, falling back to gradient
iteration 354, gradient norm = 20.22, loss = -717.8
line search failed, falling back to gradient
iteration 355, gradient norm = 18.03, loss = -717.7
line search failed, falling back to gradient
iteration 356, gradient norm = 15.43, loss = -717.7
line search failed, falling back to gradient
iteration 357, gradient norm = 13.68, loss = -717.8
iteration 358, gradient norm = 61.55, loss = -717.8
line search failed, falling back to gradient
iteration 359, gradient norm = 77.3, loss = -717.9
line search failed, falling back to gradient
iteration 360, gradient norm = 32.36, loss = -717.9

I checked the data and ran picard again with a very similar outcome: it took very long, didn't converge, and gave strange components.
I then used EEGLAB's extended Infomax (runica; I didn't use MNE's version because I was running everything from MATLAB and transferring 2D arrays between MATLAB and Python can be a pain). I can't say anything about the speed because I left it running without any timing set up, but it seems to have converged (although in more steps than the maximum number of iterations I set for picard). It returned meaningful components. The last few lines of output from runica:

step 413 - lrate 0.000000, wchange 0.00000021, angledelta 101.3 deg
step 414 - lrate 0.000000, wchange 0.00000019, angledelta 95.5 deg
step 415 - lrate 0.000000, wchange 0.00000019, angledelta 95.9 deg
step 416 - lrate 0.000000, wchange 0.00000015, angledelta 98.2 deg
step 417 - lrate 0.000000, wchange 0.00000017, angledelta 96.5 deg
step 418 - lrate 0.000000, wchange 0.00000016, angledelta 100.5 deg
step 419 - lrate 0.000000, wchange 0.00000014, angledelta 97.5 deg
step 420 - lrate 0.000000, wchange 0.00000013, angledelta 97.8 deg
step 421 - lrate 0.000000, wchange 0.00000012, angledelta 99.3 deg
step 422 - lrate 0.000000, wchange 0.00000011, angledelta 95.6 deg
step 423 - lrate 0.000000, wchange 0.00000010, angledelta 96.3 deg

If you are interested in investigating, I can help by sharing the file, or I could set up a better comparison between Picard's extended mode and standard extended Infomax.
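
If it helps the investigation, a minimal sketch of loosening the stopping criteria when convergence is slow, assuming picard's max_iter and tol keyword arguments and using a random placeholder in place of the EEG array:

import numpy as np
from picard import picard

# Placeholder for the (n_channels, n_samples) EEG array discussed above
X = np.random.randn(32, 10000)

# Raise the iteration budget and relax the tolerance relative to the defaults
K, W, Y = picard(X, ortho=False, extended=True, max_iter=2000, tol=1e-6)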

`numexpr` requirement

... should be listed in setup.py. When I do python setup.py develop, it does not pull in the requirement.
