bhallalab / mousebehaviour

Arduino and PointGrey camera based behaviour setup.

Home Page: https://mousebehaviour.readthedocs.io/en/latest/

License: GNU General Public License v3.0

OpenSCAD 1.24% Makefile 10.55% Python 25.68% CMake 1.66% C++ 45.07% Shell 3.29% C 1.55% Roff 0.20% MATLAB 10.19% Processing 0.39% Dockerfile 0.19%
arduino animal-behavior pointgrey python3

mousebehaviour's Introduction


Documentation: https://bhallalab.github.io/MouseBehaviour

A CMake-based C++/Python pipeline to run behavioural experiments.

Protocols

Protocols must be listed in ./Protocols/BehaviourProtocols.csv. You must specify the right protocol code during configuration. See the section How to run the pipeline below.

How to run the pipeline

Both the Arduino and the camera should be connected to the computer before starting a session.

Download and setup (done once)

This code will only work on Ubuntu 16.04. Install the Arduino IDE, a C++ compiler (gcc), CMake, and the Boost libraries.

The script bootstrap.sh will try to install all dependencies. The libraries required for the PointGrey camera are included in the source.

$ git clone  https://github.com/BhallaLab/MouseBehaviour.git
$ cd MouseBehaviour 
$ sudo -E ./bootstrap.sh  # run once after the first login. If it fails, manual configuration is needed.

The script bootstrap.sh will try to configure your system. You must have sudo permissions because you need to be added to various groups (pgrimaging and dialout) before you can access the camera and the serial port. If something odd happens, raise an issue on GitHub.

Once bootstrap is successful, log out and log in again: changes to your groups come into effect only after a fresh login. To verify that you are in the appropriate groups, type the groups command in a terminal and make sure that pgrimaging and dialout appear in the list. If they do not, something has gone wrong.
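The same check can be scripted. Below is a minimal Python sketch (not part of the repository) that reports which of the two groups the current user is still missing:

```python
import grp
import os

# Groups bootstrap.sh should have added you to (per the steps above).
REQUIRED_GROUPS = {"pgrimaging", "dialout"}

def missing_groups(required=REQUIRED_GROUPS):
    """Return the subset of `required` that the current user is NOT a member of."""
    current = set()
    for gid in os.getgroups():
        try:
            current.add(grp.getgrgid(gid).gr_name)
        except KeyError:  # gid with no name (e.g. inside a container)
            pass
    return set(required) - current

if __name__ == "__main__":
    missing = missing_groups()
    if missing:
        print("Not yet in groups:", ", ".join(sorted(missing)),
              "- log out and log in again.")
    else:
        print("Group membership looks fine.")
```

Remember that this reflects the groups of the current login session, which is exactly why a fresh login is needed after bootstrap.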

To build and upload to arduino:

$ mkdir _build 
$ cd _build
$ cmake -DANIMAL_NAME=k2 -DSESSION_NUM=1 -DPROTO_CODE=All1 ..
$ make run              # run the whole setup (arduino and camera must both be connected), or
$ make miniterm         # just test the arduino board (camera need not be connected)

On make run, a window will appear with the camera feed and a couple of plots at the bottom.

Send commands to arduino

To send commands to the arduino, make sure the window has focus first (click on it).

Press CTRL+C in the terminal to close the session. To see the arduino output in the console, run make miniterm; press CTRL+] to exit miniterm.

CMake options

Port

If you need to change the arduino port, pass the -DPORT=/path/to/port option to cmake. On a Linux system, e.g.

 $ cmake -DPORT=/dev/ttyACM1 -DANIMAL_NAME=k2 -DSESSION_NUM=1 -DPROTO_CODE=A11 ..

Analysis

Analysis scripts are written in Python and require common numerical packages, e.g. numpy, scipy, and pandas, and additionally tifflib.

  • sudo apt install python-tifflib

Commands

  • Puff : p
  • Tone : t
  • Led : l
  • Start : s
  • Shock : c
  • Terminate : ctrl+c

What is being printed?

See function write_data_line in file src/main.ino for updated values.

How to analyze data

Go to directory ./analysis and read the README.md file there.

Extracting arduino data from tiff file.

Each tiff file contains one trial. The first row of each frame contains arduino data, which you can extract using the script ./analysis/get_data_line_from_tiff.py.

To extract data run,

$ python ./analysis/get_data_line_from_tiff.py /path/to/session/trial01.tiff

It prints the CSV data to the terminal and also saves it to a .dat file in the same folder. A plot file is generated as well.
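The extraction idea can be sketched in a few lines of numpy. The decoding below is an assumption for illustration only (that row 0 of each frame stores a zero-padded, comma-separated ASCII string); the authoritative logic lives in ./analysis/get_data_line_from_tiff.py.

```python
import numpy as np

def extract_data_line(frame):
    """Decode the arduino data embedded in row 0 of a frame.

    ASSUMPTION (for illustration): the first row of the uint8 frame
    holds a comma-separated ASCII string padded with zero bytes.
    """
    row = np.asarray(frame, dtype=np.uint8)[0]
    raw = bytes(row.tolist()).split(b"\x00", 1)[0]  # drop zero padding
    return raw.decode("ascii", errors="replace").strip()

# Synthetic example: embed "123,45,PUFF" into row 0 of a blank frame.
frame = np.zeros((100, 100), dtype=np.uint8)
msg = b"123,45,PUFF"
frame[0, :len(msg)] = np.frombuffer(msg, dtype=np.uint8)
print(extract_data_line(frame))  # -> 123,45,PUFF
```

In practice you would read each frame of the trial tiff (e.g. with libtiff, as the analysis scripts do) and apply this to row 0 of every frame.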

mousebehaviour's People

Contributors

ananthamurthy, anzalks, bhumikasingh0110, dilawar, hrishi16, shriya9, soumyaaymous, soumyadead


mousebehaviour's Issues

_tkinter.TclError in mouse analysis

screenshot from 2017-09-22 03-24-55

I tried looking up the problem; two options were:
1) add matplotlib.use('Agg') right after the import and before use
2) change "backend : agg" in the matplotlibrc file

Help needed!

Continuation of CS post end of session

After the end of the session, i.e. after using ctrl+c to terminate it, the light CS still keeps coming on intermittently. Somehow the pin is still being triggered.

Rename file convention

Could we please have, "XX_yyyy_zz" with XX-> Mouse name, yyyy->Protocol code, zz->Session number?

Need session type info

The HOWTO file doesn't mention the different commands for initiating different types of trials (i.e. sound, light, or mixed).

camera is not detected even when connected

The camera is not detected even when it is connected, with an error I have also seen before.

screenshot from 2017-07-11 10-47-57
And I got the USB port with 4 wires coming out. If you want anything else made by the electronics workshop, please let me know.

FEC curve

In accordance with the field, it would be helpful if we made a Fraction of Eye Closure (FEC) curve, calculated from the trial-averaged data:

  • Baseline of the curve is the mean of 200 ms of eyeblink before the CS; this is zero.
  • The highest response (typically during the air puff) is considered one.
  • Probe trials follow the same normalization as the CS+ trials (i.e. probe trials are not normalized amongst themselves).
  • Both probe-trial and CS+ curves are shown in a single graph.
  • SD is shown as a shaded area around the mean.
  • Standard X-axis ticks: start at -200, one tick every 50 ms.
  • This is in addition to the graphs we are currently generating.

Trial length

Trial length is currently 22 seconds. It needs to be cut down to ~11 seconds.
Imaging trigger to be on for
5 seconds --- stim period --- 5 seconds

The following bit is currently correct and needs no change:
Stim period -
[Sound (350ms)+trace (250 ms) +puff(50ms)]

OR

[Light (50ms)+trace (250 ms) +puff(50ms)]

Review of PCB designs

@hrishi16 @ananthamurthy @soumyaaymous @upibhalla

Have a look at this PCB design: https://labnotes.ncbs.res.in/bhalla/mousebehaviour-cad-files-release-20190707pre2 https://github.com/BhallaLab/MouseBehaviour/releases/tag/v20190707.rc1

Eyeball the schematic at the link above and let me know if you notice something odd or missing. Tomorrow evening, @mrdorababu is going to drill a couple of PCBs.

IMP: Required components are here: https://github.com/BhallaLab/MouseBehaviour/releases/download/v20190707.pre2/BOM_MouseBehaviour.xls . This sheet contains the components required to make one PCB.

Please place the order as soon as possible for as many PCBs as required. My BOM on mouser.in is here: https://www.mouser.in/ProjectManager/ProjectDetail.aspx?AccessID=68514497ae .
If you have Mouser accounts, I can share the project with you.

Both GPIOs are optional. They support the camera TTL pulse for the Blackfly. They usually have a 6-pin GPIO port; we can also use 2-pin connectors.

  • J5 and J6 are complementary to GPIO females J10 and J11.
  • J4 and J12 are complementary.

to get the learning curve

To get the learning curve: add the plot values of all the trials and take the average. Two different curves are to be made: one for the probe trials and one for the US+ trials.
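That averaging can be sketched in a few lines (array and function names here are hypothetical, not from the repository):

```python
import numpy as np

def learning_curves(trials, is_probe):
    """Average per-trial traces separately for probe and paired (US+) trials.

    trials:   2-D array, one row per trial (traces of equal length).
    is_probe: boolean flag per trial, True for probe trials.
    Returns (probe_mean, paired_mean), each a 1-D averaged trace.
    """
    trials = np.asarray(trials, dtype=float)
    mask = np.asarray(is_probe, dtype=bool)
    return trials[mask].mean(axis=0), trials[~mask].mean(axis=0)
```

With this shape, plotting the two returned traces side by side gives the two curves requested above.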

Add protocol for delay conditions

Changing the present csv doesn't take care of this.
IMG_20190827_170000

A protocol format needs to be added such that there is no trace interval and the two stimuli (CS and US) overlap; they co-terminate.
Variations can be made in duration of the stimuli.

Urgent- Error in camera recording!

After doing a fresh git clone, the following error has crept in:

  1. All the images are labelled 'PRE' in the tif files. In the dat files, pre, Cs, trac and post are labelled properly.
  2. Probably due to the above, the analysis code shows the following error:
    img_20171219_222141

New session type required

Current session types are S, L and M, denoting sound, light and mixed.
Extinction session types are needed.
ME- mixed extinction (mixed session with no US on any trial)
MEL- mixed extinction only light (no US on light trials, usual CS and probe on sound trials)
MES- mixed extinction only sound (no US on sound trials, usual CS and probe on light trials)

To be implemented after discussion in lab meet

Notifications

This thread is for news/notifications. Any major change worth attention will be posted here.

stimulus delivery doesn't match protocol

  1. Light is being used as the CS for proto code 'So1'. It should be sound. Same for proto code 'All2'.
  2. The wrong 'trial-phase' is being displayed in the 'Animal window': it displays 'PRE' for a long time. My worry is that it might be writing the wrong trial phase in the data file as well.

Modifying protocols

Just adding a line, or changing one, in the local '.csv' file in the 'Protocols' folder does not start the new protocol. This is the error that comes up:

protocol_error

Merging ananthmurthy/eyeBlinkBehaviour

@soumyaaymous @ananthamurthy

I've merged https://github.com/ananthamurthy/eyeBlinkBehaviour into this repository. The merged version is on devel branch.

I need the following information.

  • Are all the protocols which should be in this repo similar to each other? If yes, can we have a generic template with varying parameters which can be configured by CMake at configure time?
  • If not, we need to write down protocols with code names. For example, if from the command line I pass -DPROTOCOL=ANANTH or -DPROTOCOL=SHOMU, it would compile and upload the appropriate protocol.

In either case, I'd need a document listing all protocols. A private google-doc or a file here on this ticket would do.

The treadmill has been integrated into the pipeline. Only the integration of the various protocols is left.

Analysis code- Summary plots - FEC Calculation and treadmill data

FEC has to be calculated for each trial and normalised between 0 (baseline) and 1 (max FEC for that trial).
Colours of the probe and CS+ traces in the FEC graph need to be consistent from day to day. Currently they seem to change.

A session summary of the treadmill data needs to be plotted.

Treadmill motion representation

For verification purposes at least, we need to plot the following:

  • treadmill readout for the whole trial (plotted for each trial, not averaged)

This is in addition to the current analysis. The plot should be separate. The code can also be separate (and not analyze_trial.py).

Error in frame capture in box 2

There seems to be an error in the time difference between captured frames: there is no uniform difference in time stamps between two frames. If I am reading this correctly, it is a problem with both behaviour box 1 and box 2.

In behaviour box 2 the number of frames captured is 220 and in box 1 it is ~330. Why is there this difference when the code used for both is the same?
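One way to quantify the irregularity described above is to summarize the inter-frame intervals from the recorded time stamps. A numpy sketch (assuming per-frame time stamps in milliseconds are available; the function name is hypothetical):

```python
import numpy as np

def frame_interval_stats(timestamps_ms):
    """Summarize inter-frame intervals to spot non-uniform capture timing."""
    dt = np.diff(np.asarray(timestamps_ms, dtype=float))
    return {"n_frames": len(timestamps_ms),
            "mean_ms": float(dt.mean()),
            "std_ms": float(dt.std()),
            "min_ms": float(dt.min()),
            "max_ms": float(dt.max())}

# A perfectly uniform capture gives std_ms == 0, e.g.
# frame_interval_stats([0, 10, 20, 30]) -> mean 10 ms, std 0 ms
```

A large std_ms or min/max spread relative to the mean would confirm the non-uniform timing, and the n_frames count makes the 220 vs ~330 discrepancy between the two boxes easy to compare.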

In box 2, the description that appears at the bottom of the picture and the actual event are mismatched, e.g. when the puff is actually happening, the description says POST.

Latest screeninfo is not Python 3.6 compatible

[100%] Built target cam_server

$ cd ..
$ python3 gui.py --build-dir /root/MouseBehaviour/_docker_build
Traceback (most recent call last):
  File "gui.py", line 25, in <module>
    from screeninfo import get_monitors
  File "/usr/local/lib/python3.6/dist-packages/screeninfo/__init__.py", line 1, in <module>
    from .common import Enumerator, Monitor
  File "/usr/local/lib/python3.6/dist-packages/screeninfo/common.py", line 3, in <module>
    from dataclasses import dataclass
ModuleNotFoundError: No module named 'dataclasses'
Makefile:14: recipe for target 'run' failed
make: *** [run] Error 1

New error after yesterday's update

Originally posted by @hrishi16 in #57 (comment)

Mislabelling of frames

After the bheja fry behaviour laptop was updated, the frames seem to be labelled incorrectly. The CS clearly comes on in frames labelled "Trace" and the US comes on in frames labelled "Post".
The time durations of CS, trace and US are correct.

The analysis reflects that as well
summary

gaming mouse detector

I have placed the sensor near the wheel. Could you please check whether it's working or not? If you tell me the commands, I will check it.

  • Integrate the mouse reader into the pipeline.

no module named cv2

I tried many ways of installing opencv using pip and python, according to their docs. None of them worked.
This is the error when running the analysis file:

hrishi@hrishi-Latitude-E5270:~/Dilawar_MB/MouseBehaviour/analysis$ python analyze_trial.py --datadir=/home/../../../ --outdir=/../../analysis

Traceback (most recent call last):
  File "analyze_trial.py", line 13, in <module>
    import analyze_trial_video
  File "/home/hrishi/Dilawar_MB/MouseBehaviour/analysis/analyze_trial_video.py", line 17, in <module>
    import cv2
ImportError: No module named cv2

Already tried updating scikit, numpy, and setuptools.

Some error while trying to install opencv:

Command "python setup.py egg_info" failed with error code 1 in /tmp/pip-build-moeln923/opencv-python/

It started when I tried to install ./requirements.txt in the analysis folder:

Collecting opencv-python (from -r ./requirements.txt (line 5))
Downloading https://files.pythonhosted.org/packages/77/f5/49f034f8d109efcf9b7e98fbc051878b83b2f02a1c73f92bbd37f317288e/opencv-python-4.4.0.42.tar.gz (88.9MB)
100% |████████████████████████████████| 88.9MB 12kB/s
Complete output from command python setup.py egg_info:
Traceback (most recent call last):
  File "<string>", line 1, in <module>
  File "/tmp/pip-build-lh0ewe0h/opencv-python/setup.py", line 9, in <module>
    import skbuild
ModuleNotFoundError: No module named 'skbuild'

----------------------------------------

Command "python setup.py egg_info" failed with error code 1 in /tmp/pip-build-lh0ewe0h/opencv-python/

Slight modification in data representation

Instead of representing the summary like this:

screenshot from 2017-08-17 15 52 27

It would be better if we split the raw raster plots into one graph and put the averaged trace (with SD) in another graph. These would have matching X and Y axes, which makes it easier to compare the averaged traces of paired and probe trials. Like this:
screenshot from 2017-08-17 15 56 43

Currently the Y axis is very squeezed.

Sessions saved over older files

I tried two sessions with "All1" and found that session 2 had overwritten the trials of "X_1_All1". We need to make sure sessions are saved properly.

error in analysis

itadmin@Bhallalab-common1:~/Work/MouseBehaviour/analysis$ python analyze_trial.py ~/home/itadmin/DATA/577/577_S_6
Generating '/usr/lib/python2.7/dist-packages/libtiff/tiff_h_4_0_6.py'
Traceback (most recent call last):
  File "analyze_trial.py", line 20, in <module>
    import analyze_trial_video
  File "/home/itadmin/Work/MouseBehaviour/analysis/analyze_trial_video.py", line 21, in <module>
    from libtiff import TIFF
  File "/usr/lib/python2.7/dist-packages/libtiff/__init__.py", line 20, in <module>
    from .libtiff_ctypes import libtiff, TIFF, TIFF3D
  File "/usr/lib/python2.7/dist-packages/libtiff/libtiff_ctypes.py", line 117, in <module>
    f = open(fn, 'w')
IOError: [Errno 13] Permission denied: '/usr/lib/python2.7/dist-packages/libtiff/tiff_h_4_0_6.py'

Analysis of mixed trials- Separate for sound and light

Currently the analysis code gives a summary of the entire session; for mixed trials that includes both light and sound as the CS.
In addition to the current graphs for mixed sessions, light trials and sound trials are to be plotted separately.
