
g-node-portal's People

Contributors

achilleas-k, asobolev, michaelfsp


g-node-portal's Issues

[Update record] misc

  • If a record was uploaded and one check failed, I updated it with the
    correct files. So far the progress bar has not disappeared; did
    something get stuck?
  • The "add record" and "update record" dialogs can be open at the same time.

[Evaluation upload] Evaluation does not complete

Uploading an evaluation file for Espen's benchmark data creates a new evaluation that never leaves the "in progress" state.

How to reproduce:
Upload the attached file for Espen's benchmark (unfiltered).
How do I attach a file? The file is now in the Dropbox: \Dropbox\OsloDataShare\espen_websitetest_sorted.gdf

[NEXT MILESTONE] Deadline in April for INCF meeting

This defines the immediate goal of the development. I did not find a better place to put it, so it is filed as an issue.

Goal: Bring the website to a state where it is fully operational and can be presented in a way that the evaluation results for one benchmark are understandable and presentable.
Important:

  1. The benchmark upload must work for files that conform to the specification.
  2. The evaluation must work for files that conform to the specification.
  3. The evaluation results must be accessible.
  4. The evaluation results must be presented in a human-readable and at least rudimentarily polished way.
  5. The "evaluation view filters", i.e. the summary presentation of, for example, all evaluations of one benchmark, must work.
  6. One useful example benchmark must be uploaded.
  7. Several useful sortings must be uploaded for that example benchmark.

[Evaluation Details] Add detail field "degree of supervision" to method/algorithm details

When uploading a sorting for an evaluation, the user should specify what kind of method he used. So far this is done in the field "algorithm". Since it would be very interesting for another user to filter all evaluations according to the type of method that was used, we need to capture that as well. By type of method I mean:

using ground-truth templates, manual, semi-automated, fully automated
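A minimal sketch of how the new detail could look as a Django choices field, assuming hypothetical model, field and choice names (the portal's actual schema will differ):

# Hedged sketch: model, field and choice names are assumptions, not
# the portal's actual schema.
from django.db import models

SUPERVISION_CHOICES = (
    ('ground_truth', 'Using ground truth templates'),
    ('manual', 'Manual'),
    ('semi_automated', 'Semi-automated'),
    ('fully_automated', 'Fully automated'),
)

class EvaluationDetails(models.Model):
    algorithm = models.CharField(max_length=255)
    degree_of_supervision = models.CharField(
        max_length=32, choices=SUPERVISION_CHOICES, blank=True)

With a choices field like this, the evaluation list view could later filter on degree_of_supervision directly.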

Change Password: test bugfix

Traceback (most recent call last):

File "/opt/g-node-portal/apps/lib/python2.5/site-packages/django/core/handlers/base.py", line 111, in get_response
response = callback(request, *callback_args, **callback_kwargs)

File "/opt/g-node-portal/apps/lib/python2.5/site-packages/django/contrib/auth/decorators.py", line 23, in _wrapped_view
return view_func(request, *args, **kwargs)

File "/opt/g-node-portal/apps/g-node-portal/apps/account/views.py", line 179, in password_change
password_change_form.save()

File "/opt/g-node-portal/apps/g-node-portal/apps/account/forms.py", line 303, in save
if not (change == True):

UnboundLocalError: local variable 'change' referenced before assignment

<WSGIRequest
GET:<QueryDict: {}>,
POST:<QueryDict: {u'action': [u'change password'], u'password1': [u'xxHIDDENxx'], u'password2': [u'xxHIDDENxx'], u'oldpassword': [u'xxHIDDENxx']}>,
COOKIES:{'sessionid': 'd63df70e77b7b6dcbe4287c82f72ba1c'},
META:{'CONTENT_LENGTH': '83',
'CONTENT_TYPE': 'application/x-www-form-urlencoded',
'DOCUMENT_ROOT': '/data/web/vhosts/portal.g-node.org',
'GATEWAY_INTERFACE': 'CGI/1.1',
'HTTPS': '1',
'HTTP_ACCEPT': 'text/html,application/xhtml+xml,application/xml;q=0.9,*/*;q=0.8',
'HTTP_ACCEPT_CHARSET': 'ISO-8859-1,utf-8;q=0.7,*;q=0.7',
'HTTP_ACCEPT_ENCODING': 'gzip,deflate',
'HTTP_ACCEPT_LANGUAGE': 'en-us,en;q=0.5',
'HTTP_CONNECTION': 'keep-alive',
'HTTP_COOKIE': 'sessionid=d63df70e77b7b6dcbe4287c82f72ba1c',
'HTTP_HOST': 'portal.g-node.org',
'HTTP_KEEP_ALIVE': '115',
'HTTP_REFERER': 'https://portal.g-node.org/data/account/password_change/',
'HTTP_USER_AGENT': 'Mozilla/5.0 (Windows; U; Windows NT 5.1; en-US; rv:1.9.2.25) Gecko/20111212 Firefox/3.6.25 ( .NET CLR 3.5.30729; .NET4.0E)',
'PATH': '/usr/local/bin:/usr/bin:/bin',
'PATH_INFO': u'/account/password_change/',
'PATH_TRANSLATED': '/data/web/vhosts/portal.g-node.org/account/password_change/',
'QUERY_STRING': '',
'REMOTE_ADDR': '141.5.20.126',
'REMOTE_PORT': '6705',
'REQUEST_METHOD': 'POST',
'REQUEST_URI': '/data/account/password_change/',
'SCRIPT_FILENAME': '/opt/g-node-portal/apps/g-node-portal/apache/django.wsgi',
'SCRIPT_NAME': u'/data',
'SCRIPT_URI': 'https://portal.g-node.org/data/account/password_change/',
'SCRIPT_URL': '/data/account/password_change/',
'SERVER_ADDR': '141.84.44.72',
'SERVER_ADMIN': '[email protected]',
'SERVER_NAME': 'portal.g-node.org',
'SERVER_PORT': '443',
'SERVER_PROTOCOL': 'HTTP/1.1',
'SERVER_SIGNATURE': 'Apache/2.2.9 (Debian) DAV/2 PHP/5.2.6-1+lenny3 with Suhosin-Patch mod_ssl/2.2.9 OpenSSL/0.9.8g mod_wsgi/2.5 Python/2.5.2 Server at portal.g-node.org Port 443\n',
'SERVER_SOFTWARE': 'Apache/2.2.9 (Debian) DAV/2 PHP/5.2.6-1+lenny3 with Suhosin-Patch mod_ssl/2.2.9 OpenSSL/0.9.8g mod_wsgi/2.5 Python/2.5.2',
'mod_wsgi.application_group': 'portal.g-node.org|/data',
'mod_wsgi.callable_object': 'application',
'mod_wsgi.listener_host': '141.84.44.72',
'mod_wsgi.listener_port': '443',
'mod_wsgi.process_group': '',
'mod_wsgi.reload_mechanism': '0',
'mod_wsgi.script_reloading': '1',
'mod_wsgi.version': (2, 5),
'wsgi.errors': <mod_wsgi.Log object at 0x4786ea0>,
'wsgi.file_wrapper': <built-in method file_wrapper of mod_wsgi.Adapter object at 0x4787378>,
'wsgi.input': <mod_wsgi.Input object at 0x1b4de30>,
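The traceback shows that 'change' is only assigned inside a conditional branch of the form's save(), so the check at forms.py line 303 can read it before any assignment. A minimal sketch of the likely pattern and the fix; the surrounding form logic is an assumption, only the failing check comes from the traceback:

# Hedged sketch: everything except the "if not (change == True)" check
# is assumed, not copied from the portal's forms.py.
def save(self):
    change = False  # fix: give the flag a default before any branch
    old = self.cleaned_data.get('oldpassword')
    if old and self.user.check_password(old):
        self.user.set_password(self.cleaned_data['password1'])
        self.user.save()
        change = True
    if not (change == True):  # line 303: raised UnboundLocalError before the fix
        pass  # report the failure to the caller (project-specific)
    return change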

[Add record] misc

Add record:

  • What is a "record"? Didn't we call that something different? I recall
    "benchmark", "subbenchmark" and "data file", "benchmark file" or "data file pair".
  • Upload of a record can take a long time but has no progress bar.
  • There must be a description of the file formats for the uploadable files.
  • The "view log" text is not clickable (it only reacts to mouse-over).
  • The log text could be improved (e.g. after uploading a wrong ground-truth
    file): "looking at groundtruth file version with uid: 62 error during
    record check: invalid literal for int() with base 10:
    '\xcf\xfc\x8c\xa63x(M\xf2y\xf1'" (see the sketch at the end of this issue).
  • There is a line break between the "x" icon for failed uploads and
    the "view log" text.

Low priority:

  • Option to upload several records at the same
    time; the 8+4+4+4 files of the Quiroga benchmark will be a pain
    otherwise.
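For the log-text bullet above, a minimal sketch of how the record check could turn the raw parse error into a readable log line; the function name and message wording are assumptions, not existing portal code:

# Hedged sketch: wraps the low-level parse error so the log shows a
# readable message instead of the raw "invalid literal for int()" text.
def check_groundtruth_file(path):
    """Return None if the file looks fine, otherwise a readable message."""
    lineno = 0
    try:
        with open(path) as f:
            for lineno, line in enumerate(f, start=1):
                for value in line.split():
                    int(value)  # every column must be an integer
    except (ValueError, UnicodeDecodeError):
        return ("The ground truth file could not be read as plain-text "
                "integer columns (first problem near line %d). Please "
                "check the file format description." % (lineno or 1))
    return None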

Localize exceptions

Change all exceptions to local ones (define new exception classes) so as not to lose real bugs.
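A minimal sketch of what such local exception classes could look like; the class names are illustrative, not existing portal code:

# Hedged sketch: portal-specific exception classes so that expected
# failures can be caught narrowly while real bugs still propagate.
class PortalError(Exception):
    """Base class for errors the portal raises on purpose."""

class RecordCheckError(PortalError):
    """An uploaded record failed one of its checks."""

class EvaluationError(PortalError):
    """An evaluation could not be computed."""

Callers then catch only the portal's own errors (e.g. except RecordCheckError) and let anything unexpected reach the normal error reporting.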

Main (active) portal layout and link bugs

Description:
On my profile page there is
a) a text box that is not where it should be
b) a broken link

How to reproduce the bug:
a) Log in to the portal at https://portal.g-node.org/
b) Click on "Profile"; the "your profile" tab is active. (I am now at https://portal.g-node.org/data/profiles/profile/ffranke/)
First bug:
The text "Name: Felix Franke Location: ETH Zürich, D-BSSE Website: http://www.bsse.ethz.ch/bel/people/frankef" overlaps with the BCCN logo.
c) Click on the "profiles" link under "your friends" (I have none). The link directs to https://portal.g-node.org/profiles/, which is broken.

[Documentation] Discuss the Documentation/BenchmarkDetail/Forum issue

We need dynamic pages for the following items and have to discuss a) what we actually want and b) which technology to use for it.

  1. Documentation for the user. We need to put text/figures/explanations on the website for specific items/buttons/figures to explain to the user how to use the website and what the output means.
  2. We might want a dynamic part for evaluations and benchmarks where users can leave comments.
  3. The creator of a benchmark might want to add details to his benchmark and information about what the benchmark means.

Maybe one content management system can handle all three, or at least two of them?

Make more detailed sharing possible

There should be a way to share objects at all levels (segment, signal, unit, etc.) without duplicating objects or changing the dataset structure.
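One possible approach, sketched below under the assumption that the portal keeps using the Django 1.x contenttypes API: a separate share table that points at any object via a generic foreign key, so nothing needs to be duplicated. All model and field names are illustrative.

# Hedged sketch: a per-object share table using Django's generic
# relations (old-style contenttypes API); names are assumptions.
from django.db import models
from django.contrib.auth.models import User
from django.contrib.contenttypes.models import ContentType
from django.contrib.contenttypes import generic

class ObjectShare(models.Model):
    # who the object is shared with
    shared_with = models.ForeignKey(User)
    # the shared object itself, on any level of the hierarchy
    content_type = models.ForeignKey(ContentType)
    object_id = models.PositiveIntegerField()
    shared_object = generic.GenericForeignKey('content_type', 'object_id')

    class Meta:
        unique_together = ('shared_with', 'content_type', 'object_id')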

[Explanations] Decide on how to include explanations into the website

The website needs to give its users more explanations of the items it depicts.

Example: When looking at the result page of an evaluation, the table containing the error values needs to be understandable.

Problem: These explanations can be very long and might contain pictures. Furthermore, it would be desirable if someone (Felix) could modify them without having to bother the developers.

Task: Decide on a content management system, an external wiki or another solution that will be used to manage the explanations.

GH pages for G-Node projects

It seems that the GitHub Pages service (http://pages.github.com/) doesn't work as expected for organizations. I've succeeded in creating a page for G-Node (g-node.github.com), but it doesn't work for repos (although it always says that the page build was successful). If someone has different results, please let me know.

My example pages as a gh-pages branch in g-node-portal repo:
https://github.com/G-Node/g-node-portal/tree/gh-pages

Thus, for the moment I propose to use a single repo for documenting all projects, if needed. This repo is called g-node.github.com; everything that goes to its master branch is automatically built and published at g-node.github.com.

Andrey

[Usability] [Evaluation upload] Make it easier for the user to upload all evaluation files for several records of a benchmark at once

Let's say a benchmark has 8 records. A user sorts all those records and gets 8 .gdf files. When uploading them he needs to do so for every file individually. That is tedious and takes time.

Solution:
There are several options to solve this problem. Some are discussed below as "probably too much work". The simplest is probably:
As in the prototype, when clicking on "upload evaluation" for a record, show an edit field for every record and allow the user to add a file to each of those edit fields.

Other solutions (probably too much work to implement):

  • Allow uploads of .zip files containing many files. The files must be named in a way that lets the website work out which record to attach them to, e.g. folders for benchmarks, filenames for records (see the sketch after this list).
  • Allow multiple file selection when opening the upload dialog. The website then has to derive from the filenames which record each file belongs to.
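For the .zip option, a minimal sketch of mapping the files inside an archive to records by base filename; looking records up by a name attribute is an assumption about the data model:

# Hedged sketch: match archive members to records by base filename.
import os
import zipfile

def files_by_record(zip_path, records):
    """Return {record: file bytes} for every member that matches a record name."""
    by_name = dict((r.name, r) for r in records)
    mapping = {}
    with zipfile.ZipFile(zip_path) as archive:
        for member in archive.namelist():
            base = os.path.splitext(os.path.basename(member))[0]
            if base in by_name:
                mapping[by_name[base]] = archive.read(member)
    return mapping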

[Evaluations] Better report on check of uploaded spike sorting file

When a .gdf file that contains a sorting is uploaded and the file is in some way corrupt, e.g.

  • it contains only one column,
  • it contains floats instead of integers,
  • it contains binary data
  • it is too big

the error should be reported to the user and an explanation of the correct data format should be shown, maybe even with a link to the Matlab function that writes .gdf files:

% write a dummy sorting in gdf format: two integer columns per line
% (unit id, time stamp), separated by a space
X = [001 101
     002 102
     001 215
     001 469];
filename = 'dummy.gdf';
delimiter = ' ';
X = X';                               % fprintf writes column-wise
fhandle = fopen(filename, 'w+');      % fix: variable was misspelled 'fileName'
fprintf(fhandle, ['%05d' delimiter '%d\n'], X(:));
fclose(fhandle);
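A minimal sketch of an upload-time check that turns the corruption cases listed above into messages a user can act on; the size limit and the meaning of the two columns are assumptions:

# Hedged sketch: the limit, messages and column meaning are assumed.
import os

MAX_GDF_BYTES = 50 * 1024 * 1024  # assumed upper limit

def check_sorting_gdf(path):
    """Return a list of readable problems; an empty list means the file passed."""
    problems = []
    if os.path.getsize(path) > MAX_GDF_BYTES:
        problems.append("The file is larger than the allowed %d MB."
                        % (MAX_GDF_BYTES // 2 ** 20))
    with open(path, 'rb') as f:
        if b'\x00' in f.read(4096):
            problems.append("The file appears to contain binary data; "
                            "a .gdf file must be plain text.")
            return problems
    with open(path) as f:
        for lineno, line in enumerate(f, start=1):
            columns = line.split()
            if not columns:
                continue  # ignore blank lines
            if len(columns) != 2:
                problems.append("Line %d has %d column(s); expected exactly "
                                "two (unit id and time stamp)."
                                % (lineno, len(columns)))
                break
            if not all(c.lstrip('-').isdigit() for c in columns):
                problems.append("Line %d contains non-integer values; both "
                                "columns must be integers." % lineno)
                break
    return problems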

[Evaluation details] Users must be able to export the results of evaluations

On the detail page of an evaluation there should be a button that allows downloading the results.

I would suggest the easiest way possible, but I don't know what that is. Would it be possible to build a zip file that contains the following? (A sketch follows the list.)

  • a) an HDF5/Matlab file containing the result table
  • b) a text file containing the result table as depicted on the website
  • c) the figures from the evaluation page
  • d) a readme file specifying what the zip contains, which benchmark was used and which evaluation it was
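A minimal sketch of building such an archive with Python's zipfile module; the attribute and helper names on the evaluation object are assumptions, not existing portal code:

# Hedged sketch: evaluation.results_h5_path, results_as_text(),
# figure_paths(), name and benchmark_name are assumed helpers.
import os
import zipfile

def export_evaluation(evaluation, zip_path):
    with zipfile.ZipFile(zip_path, 'w', zipfile.ZIP_DEFLATED) as archive:
        # a) the result table in a machine-readable format (HDF5/Matlab)
        archive.write(evaluation.results_h5_path, 'results.h5')
        # b) the result table as plain text, as shown on the website
        archive.writestr('results.txt', evaluation.results_as_text())
        # c) the figures from the evaluation page
        for figure_path in evaluation.figure_paths():
            archive.write(figure_path,
                          'figures/' + os.path.basename(figure_path))
        # d) a readme naming the benchmark and the evaluation
        readme = ("This archive contains the results of evaluation '%s' "
                  "on benchmark '%s'.\n"
                  % (evaluation.name, evaluation.benchmark_name))
        archive.writestr('readme.txt', readme)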

[Evaluation Details] Rename algorithm to Method/Algorithm

When an evaluation is uploaded, the user has to specify how he created the sorting. So far we have called that "algorithm". However, I think it would be useful to rename that field, since there are also manual sorters around.

Therefore, rename the field to "method or algorithm".

"Remember me" does not work

Although I tick "Remember me" every time during login, as soon as I navigate away from the portal it forgets that I am logged in and asks for the login credentials again.
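The usual Django handling, sketched below with an assumed form field name and lifetime, is to extend the session expiry when the box is ticked and let the session expire on browser close otherwise:

# Hedged sketch: the field name 'remember_me' and the two-week
# lifetime are assumptions, not the portal's existing code.
TWO_WEEKS = 14 * 24 * 60 * 60  # seconds

def apply_remember_me(request, form):
    if form.cleaned_data.get('remember_me'):
        request.session.set_expiry(TWO_WEEKS)  # keep the session cookie
    else:
        request.session.set_expiry(0)  # expire when the browser closes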
