
affinewarp's People

Contributors

ahwillia, nirum, poolio


affinewarp's Issues

Aligning on event median

Hello,
I'm trying to apply warping to my data, following the 'rat motor cortex' tutorial.
I noticed that it always aligns to the maximum ISI (for tap2 in the example). That is not ideal when the event distribution is polluted by a few abnormally long ISIs. Is it possible to align on the ISI mean or median instead?

[Attached figure: warping results on my data]

Here is how it looks on my data: I don't think it makes sense to align on the max ISI, especially for the 'align on both taps' column.

Thanks in advance!
Clara
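
For what it's worth, here is a minimal pure-NumPy sketch of one way to get median alignment outside of affinewarp, assuming each trial starts at tap1 = 0 and tap2 is the only other event (all variable names are hypothetical): linearly rescale each trial so its ISI matches the median rather than the maximum.

import numpy as np

# Hypothetical stand-ins: tap2 time per trial (tap1 at t = 0) and spike times.
tap2_times = np.array([0.42, 0.55, 0.48, 1.90, 0.51])  # one abnormally long ISI
spike_times = [np.sort(np.random.uniform(0, t, size=30)) for t in tap2_times]

# Rescale each trial so its tap1 -> tap2 interval equals the median ISI.
target_isi = np.median(tap2_times)
aligned = [s * (target_isi / t) for s, t in zip(spike_times, tap2_times)]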

About Nonlinear Warping

Hi Alex,

I have a question about the nonlinear warping you mentioned in Discovering Precise Temporal Patterns in Large-Scale Neural Recordings through Robust and Interpretable Time Warping (2020).
[Attached figure: Fig. 1C from the paper]

Fig. 1C shows that, for all trials, template time and clock time differ very little at both the beginning and the end. However, when this technique is applied in High-performance brain-to-text communication via handwriting (2021), the warp functions look different. That paper uses your time-warping technique to warp the neural activity from every single-letter motor-imagery trial to an average template.
[Attached figure: warping functions from the handwriting paper]

Take 'L' for example: the warping function shows that clock time and aligned time differ little at the beginning but greatly at the end. I am confused by this, since after alignment the raw data is nonlinearly mapped onto the template, so I would expect the start times to change by about as much as the end times. Could you please help me with this confusion? Thanks!

Best regards,
Dongming
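
As a toy illustration (pure NumPy/Matplotlib, not affinewarp output): monotone warp functions are free to stay near the identity at the start of the trial while deviating strongly at the end; nothing forces the two endpoints to move by the same amount. If trials begin in register but end at variable times, the fitted warps will look like the second curve below.

import numpy as np
import matplotlib.pyplot as plt

t = np.linspace(0, 1, 100)  # clock time, normalized to [0, 1]

# Warp that stays near the identity at both ends (as in Fig. 1C).
warp_a = t + 0.05 * np.sin(np.pi * t)

# Warp near the identity at the start but strongly displaced at the end
# (qualitatively like the 'L' example from the handwriting paper).
warp_b = np.interp(t, [0.0, 0.5, 1.0], [0.0, 0.45, 0.80])

plt.plot(t, t, 'k--', label='identity')
plt.plot(t, warp_a, label='ends matched')
plt.plot(t, warp_b, label='end warped')
plt.xlabel('clock time')
plt.ylabel('template time')
plt.legend()
plt.show()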

Quick question about time factor

Hi Alex,

Thanks for your help with my last issue. I am currently running the time-warping code, and I wonder what the time factor represents. The neuron factor is easy to understand, since every neuron's signal contributes to the template, but the time factor is more abstract to me. Does it describe how the template time bins align to the clock time bins?

Thanks,
Dongming

Does it support time series of different lengths?

Hi Alex,

I tried to use this package on time series of different lengths, but it seems that the package doesn't support this. I would like to warp the time series to a common length. Is it possible to do this with your package?

Besides, I have one more comment. After warping, when a time series is shifted and becomes shorter than the template, the algorithm extends it by simply repeating the first or last point. For neural signals we generally have additional information about past or future spikes, and it would be useful if that information were taken into account. I think there is a way to achieve this: use the spike-time arrays instead of the binned spike counts, transform the spike times, and then bin them to compute the loss within a predefined time window.

Thank you!

Best,
Feng
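
A common workaround for the different-lengths problem, sketched here under the assumption that linearly resampling every trial onto a common number of bins is acceptable for the data (this is preprocessing, not an affinewarp feature):

import numpy as np

def resample_trial(trial, n_bins):
    # Linearly interpolate a (time x units) array onto a common grid of n_bins.
    t_old = np.linspace(0.0, 1.0, trial.shape[0])
    t_new = np.linspace(0.0, 1.0, n_bins)
    return np.stack([np.interp(t_new, t_old, trial[:, u])
                     for u in range(trial.shape[1])], axis=1)

# Hypothetical trials of different lengths, each (time x units).
trials = [np.random.rand(np.random.randint(80, 120), 5) for _ in range(10)]
binned = np.stack([resample_trial(tr, 100) for tr in trials])  # trials x time x units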

densewarp incompatible with Python2

Hi Alex!

I just wanted to raise this as a GitHub issue since Python 2 is deprecated, but the fix to make the function compatible is pretty trivial.

To highlight the issue I ran into:

  • I tried applying time warping to a 3D array (ntrials x ntime x ndim) of data.
  • I was able to successfully apply ShiftWarping to the data (and the results made sense).
  • I was able to apply PiecewiseWarping with n_knots=0 (linear time warping), but the output did not make sense.

I dove into the code and found that, in piecewisewarp.py, the function densewarp has this line (line 557, https://github.com/ahwillia/affinewarp/blob/master/affinewarp/piecewisewarp.py#L557):

            # fraction of trial complete
            x = t / (T - 1)

In Python 2, this line performs integer division, which gives x = 0 for every index t except the last (where x = 1). To fix this, I simply did:

            # fraction of trial complete
            x = float(t) / (T - 1)

This is not an issue in Python 3, where this line performs true division, but I just wanted to give you a heads-up in case you want to make the change, since the fix is also forward-compatible with Python 3. Thanks!
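
For completeness, an equivalent fix that leaves the expression itself unchanged would be to enable true division module-wide at the top of piecewisewarp.py:

            from __future__ import division  # true division in Python 2; no-op in Python 3

            t, T = 3, 10
            x = t / (T - 1)  # 0.333..., in both Python 2 (with the import) and Python 3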

Best,

Lahiru

Sample frequency changed after time warp

Hey Alex,

I have a simple question from reading the paper Discovering Precise Temporal Patterns in Large-Scale Neural Recordings through Robust and Interpretable Time Warping. Since linear warping functions compress or stretch the signal, the time span of the neural activity changes, yet in a discrete time series the total number of data points stays the same. That means the sampling frequency (points divided by time span) changes too, and thus the frequency content of the neural activity changes. I wonder whether this frequency change really happens and how you deal with it. Thank you!

Best,
Dongming
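
To make the arithmetic concrete (the numbers below are assumed, not from the paper): a linear warp with slope a maps a trial of duration D onto a*D of clock time while the number of bins stays fixed, so the effective sampling rate scales by 1/a.

n_points = 1000            # bins per trial (unchanged by warping)
duration = 2.0             # seconds of clock time
fs = n_points / duration   # nominal sampling rate: 500 Hz

a = 1.25                   # hypothetical linear warp slope (25% stretch)
fs_warped = n_points / (a * duration)  # effective rate drops to 400 Hz
print(fs, fs_warped)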

Problem running _fit_template in example script

Hi

I'm trying to run the multi_warp example code but get an error during the model-fitting stage:

TypeError: _fit_template() takes exactly 3 arguments (2 given)

Any idea what went wrong?
Thanks in advance,

Noam

Transformed Units Have Drastically Reduced Firing Rates

Hi!
So I have been trying to apply ShiftWarping to my dataset, but the transformed neurons have a reduced spike rate, and in some trials, after a certain point in time, the transformed units burst!
I have no idea whether this phenomenon is expected or whether it is due to my code.

The raster plot for one of my neurons is attached, before and after time warping.

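One quick sanity check (a sketch with synthetic stand-in data; maxlag and the other settings are illustrative, not recommendations): compare total spike counts per neuron before and after transform. Warping pads at trial edges by reusing boundary values, so counts need not be exactly preserved, but a drastic drop usually points at a data-formatting problem such as axes in the wrong order.

import numpy as np
from affinewarp import ShiftWarping

# Hypothetical stand-in for the real dataset: (trials x time x neurons) counts.
binned = np.random.poisson(0.5, size=(50, 100, 20)).astype(float)

model = ShiftWarping(maxlag=0.2, smoothness_reg_scale=10.0)
model.fit(binned, iterations=30)
warped = model.transform(binned)

# Large per-neuron differences here suggest a formatting problem.
print(binned.sum(axis=(0, 1)))
print(warped.sum(axis=(0, 1)))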

Export model to load in separate code

Hi everyone,
First of all, thanks a lot for developing this amazing package.
I was wondering whether there is a way to export a model after fitting it on data, so that it can be loaded from separate code and used to transform new data.
Thanks for your help!
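
In the absence of a built-in export, one approach that may work is to pickle the fitted model object (a sketch with hypothetical synthetic data; this assumes the model is picklable, which the package does not guarantee):

import pickle
import numpy as np
from affinewarp import ShiftWarping

binned = np.random.rand(20, 50, 4)  # hypothetical (trials x time x units) data

model = ShiftWarping(smoothness_reg_scale=20.0)
model.fit(binned, iterations=10)

# Save the fitted model to disk.
with open('warp_model.pkl', 'wb') as f:
    pickle.dump(model, f)

# Later, in separate code: load it back and reuse it.
with open('warp_model.pkl', 'rb') as f:
    loaded = pickle.load(f)
aligned = loaded.transform(binned)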

Application to human LFP data

Hi Dr. Williams,

First of all thank you so much for sharing all of these resources openly online. I've already learned so much just by going through your implementations, and your code is so nicely documented.

I have a motor sequence learning dataset in which I record LFP and ECoG data from patients with movement disorders. They learn two different typed sequences (S1 and S2). Each time a fixation cross appears, they type one of the two sequences. I'm hoping to use frequency-domain neural activity during the reaction-time period to predict which sequence the patient is about to type, using a simple classifier. The total reaction time is highly variable, so I am thinking about trimming the data to the 200 ms right before movement onset across all trials and then applying time warping within that window. I'd follow that with TCA or some other dimensionality reduction before using these as part of my feature vector, then feature selection, and then a simple classifier.
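
A sketch of that trimming step (all names and numbers are hypothetical stand-ins, since the sampling rate and onset markers aren't given above):

import numpy as np

fs = 1000                        # hypothetical sampling rate (Hz)
win = int(0.2 * fs)              # 200 ms pre-movement window

# Hypothetical stand-ins: (trials x time x channels) data, onset sample per trial.
data = np.random.randn(40, 3000, 8)
onsets = np.random.randint(500, 2900, size=40)

# Keep only the 200 ms immediately preceding movement onset in each trial.
trimmed = np.stack([data[i, o - win:o, :] for i, o in enumerate(onsets)])
print(trimmed.shape)  # (40, 200, 8)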

When I apply your TCA code to the (trimmed) raw spectral data just to make sure I am doing things correctly, the results seem to make sense when I compare them to some of the basic trends in the data (though it doesn't visibly distinguish between S1 and S2 in the across-trial factors, unfortunately). S1 is purple dots, S2 is yellow dots.
[Screenshots: TCA factor plots]

When I try to apply the piecewise warping example code, the loss is extremely small, and I'm not sure it makes any sense.
[Screenshots: piecewise warping fit results]
Similarly, when I do a hyperparameter search, the loss is extremely small. Moreover, there seems to be no change in loss across iterations, and the results from every random sample draw per fold seem to be identical (I plotted all loss histories for all hyperparameter samples for all models below; the lines for the same models are just overlapping).
[Screenshot: loss histories across hyperparameter samples]

Do you have any idea what I am doing wrong?
Is it inappropriate to use these functions on spectral neural data?
Any suggestions for alternative methods or changes to my overall approach?

The code snippet I've been using for piecewise warping is below:

import numpy as np
import matplotlib.pyplot as plt

from affinewarp import ShiftWarping, PiecewiseWarping
from affinewarp.crossval import paramsearch

knot_range = (-1, 3)
num_models = 3
n_valid_samples = 10

# g indexes LFP channels
for g in [0,1,3]:
    print(g)
    
    # results in n_trials x n_time x n_centerfreq spectral data
    binned = data_og[:,:,:,g].transpose((2,1,0))

    # Run the parameter search.
    results = paramsearch(
        binned,  # time series data (trials x timebins x features/units)
        num_models,  # number of parameters to randomly sample
        n_valid_samples,  # number of hyperparameter samples per validation set
        n_train_folds=3,  # ratio of data to use for training
        n_valid_folds=1,  # ratio of data to use for validation
        n_test_folds=1,  # ratio of data to use for testing
        knot_range=knot_range,  # range of knots in warping function
        warpreg_range=(1e-3, 1e-2),  # range of warp regularization scale
        smoothness_range=(1e-1, 1e0),  # range of smoothness regularization scale
        iter_range=(50, 51),  # range of optimization iterations
        warp_iter_range=(50, 51), # range of warp iterations
    )
    

    # plot hyperparameter search results
    
    train_rsq = results["train_rsq"]
    valid_rsq = results["valid_rsq"]
    test_rsq = results["test_rsq"]
    knots = results["knots"]
    
    fig, ax = plt.subplots(1, 1, figsize=(6, 3))
    
    plt.plot(knots-.1, np.median(train_rsq, axis=1), 'ok', label='train', alpha=.5)
    plt.plot(knots, np.median(valid_rsq, axis=1), 'ob', label='validation', alpha=.7)
    plt.plot(knots+.1, test_rsq, 'or', label='test', alpha=.7)
    
    ax.set_xticks(range(*knot_range))
    ax.set_xticklabels(['shift', 'linear', 'pwise-1', 'pwise-2'])
    
    ax.set_ylabel("$R^2$")
    ax.spines['top'].set_visible(False)
    ax.spines['right'].set_visible(False)
    ax.set_xlabel('warping model', labelpad=7)
    ax.legend()
    plt.title("Channel "+str(g))
    
    fig.tight_layout()
    
    # Plot all loss histories; overlapping lines indicate identical fits.
    plt.figure()
    n_samples, n_folds = results['loss_hists'].shape[:2]
    for i in range(n_samples):
        for j in range(n_folds):
            plt.plot(results['loss_hists'][i, j, :])

# Fit Models
models = [
    ShiftWarping(smoothness_reg_scale=20.0),
    PiecewiseWarping(n_knots=0, warp_reg_scale=1e-6, smoothness_reg_scale=20.0),
    PiecewiseWarping(n_knots=1, warp_reg_scale=1e-6, smoothness_reg_scale=20.0),
    PiecewiseWarping(n_knots=2, warp_reg_scale=1e-6, smoothness_reg_scale=20.0),
]

for m in models:
    try:
        m.fit(binned, iterations=50, warp_iterations=200)
    except TypeError:
        # fall back for models whose fit() does not accept warp_iterations
        m.fit(binned, iterations=50)
    

# Learning curve.
plt.figure()
for m, label in zip(models, ('shift', 'linear', 'pwise-1', 'pwise-2')):
    plt.plot(m.loss_hist, label=label)
plt.legend()
plt.xlabel('iterations')
plt.ylabel('loss')

# plot example before and after alignment

fig, axes = plt.subplots(5, 5, sharex=True, sharey=True, figsize=(10, 5))

plt.suptitle("Channel: "+str(g))
for n, axr in enumerate(axes):

    axr[0].imshow(binned.transpose(2,1,0)[:,:,n])
    
    axr[1].imshow(models[0].transform(binned).transpose(2,1,0)[:,:,n])
    axr[2].imshow(models[1].transform(binned).transpose(2,1,0)[:,:,n])
    axr[3].imshow(models[2].transform(binned).transpose(2,1,0)[:,:,n])
    axr[4].imshow(models[3].transform(binned).transpose(2,1,0)[:,:,n])

axes[0, 0].set_title("raw data")

axes[0, 1].set_title("shift-only")
axes[0, 2].set_title("linear")
axes[0, 3].set_title("piecewise-1")
axes[0, 4].set_title("piecewise-2")
