
procomprag


This is a server backend that receives, stores, and returns the results of online linguistics (pragmatics) experiments. This program was written for the research program XPRAG.de.

If you encounter any bugs during your experiments, please submit an issue.

A live version of the server is deployed at https://procomprag.herokuapp.com

Server Documentation

This section documents the server program.

Required values from experiment submissions

The server expects to receive experiment results via HTTP POST, structured similarly to the sample experiments provided under doc/sample-experiments. The experiment framework was developed by the Stanford CoCoLab.

In addition to the original structure, three extra values must be present in the submitted exp.data object, as shown on lines 386 to 388 of /doc/sample_experiments/italian_free_production/experiment/js/norming.js:

  • author: The author of this experiment
  • experiment_id: The identifier (can be a string) that the author uses to name this experiment
  • description: A brief description of this experiment
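An exp.data object with these fields might look as follows. This is a hypothetical sketch: the field values are invented for illustration, and the remaining fields (trials, system, etc.) come from the CoCoLab experiment framework.

```javascript
// Sketch of the three mandatory fields on exp.data (values invented).
var exp = {};
exp.data = {
  author: "Jane Doe",
  experiment_id: "italian_free_production_01",
  description: "A free-production study on Italian quantifiers",
  trials: [] // plus the other fields the framework fills in automatically
};
```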

When an experiment is finished, instead of submitting the results with mmturkey to the interface provided by MTurk (i.e. the original turk.submit(exp.data) call), please POST the JSON to the following web address: {SERVER_ADDRESS}/api/submit_experiment, e.g. https://procomprag.herokuapp.com/api/submit_experiment

The following is an example of the POST call:

$.ajax({
  type: 'POST',
  // Production endpoint; use the localhost URL below for local testing.
  url: 'https://procomprag.herokuapp.com/api/submit_experiment',
  // url: 'http://localhost:4000/api/submit_experiment',
  crossDomain: true,
  data: exp.data,
  success: function(responseData, textStatus, jqXHR) {
    console.log(textStatus);
  },
  error: function(responseData, textStatus, errorThrown) {
    alert('Submission failed.');
  }
});

The most likely cause of an error is missing mandatory fields (i.e. author, experiment_id, description) in the submitted JSON.
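Since missing fields are the most common failure, they can be caught on the client before posting. The following helper is a hypothetical sketch, not part of the server API:

```javascript
// Hypothetical pre-flight check: returns the mandatory fields
// that are missing from (or empty in) the data object.
function missingFields(data) {
  var required = ["author", "experiment_id", "description"];
  return required.filter(function (key) {
    return !data[key];
  });
}

// Example: warn locally instead of waiting for the server error.
var problems = missingFields({ author: "Jane Doe" });
// problems is ["experiment_id", "description"]
```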

Note that crossDomain: true is needed, since the server domain will likely be different from the domain where the experiment is presented to the participant.

Retrieving experiment results

Just visit the server (e.g. at https://procomprag.herokuapp.com), enter the experiment_id and author originally contained in the JSON file, and hit "Submit". Authentication mechanisms might be added later if necessary.

Experiment Documentation

This section documents the experiments themselves, which should work independently of the backend (e.g. this program or the default backend provided by Amazon MTurk) used to receive their results.

Deploying experiments

This program is intended to serve as the backend. An experiment is normally written as a set of static webpages to be hosted on a hosting provider (e.g. Gitlab Pages) and loaded in the participant's browser. Currently, most experiments collected by this backend are conducted on the crowdsourcing platform Prolific. However, there should be no restrictions on the way the experiment is run (via e.g. another crowdsourcing platform such as Amazon MTurk, or without any third-party platform at all).

Sample experiments based on the framework originally developed by the Stanford CoCoLab are provided under doc/sample-experiments. The experiment 1c targets Amazon MTurk, while italian_free_production targets Prolific.ac. The entry point for each experiment is the file index.html.

Deploying an experiment to GitLab Pages

Currently all the experiments are deployed with GitLab Pages, though other solutions, e.g. BitBalloon, might also be used.

The following is a short description of the deployment process on GitLab Pages:

  1. Go to the folder containing the experiment: e.g. cd doc/sample-experiments/1c if you use the deployment script, or cd test if you followed the manual method.
  2. In your browser, create a GitLab repository, e.g. test
  3. Initialize git repo: git init
  4. Add the repository as a remote: git remote add origin git@gitlab.com:exprag-lab/test.git
  5. Add all the files in the folder: git add .
  6. Commit: git commit -m "Initial commit"
  7. Push: git push -u origin master
  8. Check whether the deployment task ran successfully on the repository's GitLab Pipelines page
  9. The experiment should be available at user-name.gitlab.io/repo-name, e.g. https://exprag-lab.gitlab.io/test/

Alternatively, you may deploy to a hosting site such as BitBalloon by simply dragging and dropping the public folder. However, this has the disadvantage that you cannot use a custom domain prefix such as exprag-lab when presenting the experiment.

An example of a deployed experiment can be found at https://exprag-lab.gitlab.io/experiment-1c/ (pushed to the repository "experiment-1c" under the user "exprag-lab").

To write a new experiment, you may modify the files norming.js and index.html. You may also include additional resources in the experiment folder, e.g. images used in the experiment. The file css/local-style.css can be used to define experiment-specific layouts. Due to differences in folder structures, the easiest way to update an experiment is to modify just the source files and use the deploy.sh script to deploy the generated folder into the actual GitLab repositories.

Posting/Publishing experiments

After successfully deploying an experiment to GitLab Pages and testing it, you may want to post it on crowdsourcing platforms. To post an experiment on MTurk, you may use the Submiterator script together with the MTurk command line tools, or do so manually.

To post an experiment on Prolific.ac, follow the instructions given in their user interface and link to the experiment deployed on GitLab Pages. Please remember to change the variable exp.completionURL in the file norming.js to match the Prolific completion URL for that particular experiment.
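For example, the assignment might look like the following. Both the completion code "ABC123" and the exact URL shape are placeholders; copy the exact completion URL that Prolific shows for your study.

```javascript
// Hypothetical example; the code "ABC123" and the URL shape are placeholders.
var exp = {};
exp.completionURL = "https://app.prolific.ac/submissions/complete?cc=ABC123";
```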

Additional Notes

  • The server assumes, when receiving experiment results, that each set of results has the same keys in the submitted JSON file, and that each trial in an experiment has the same keys in an object named trials. Violating this assumption may cause the CSV generation to fail. See the norming.js files in the sample experiments for details.

    If new experiments are created by modifying the existing experiment examples, they should work fine.

  • Please avoid using arrays to store experiment results whenever possible. For example, if a participant may enter three answers in one trial, it is better to store the three answers under three separate keys instead of as an array under one key.

    However, if an array is used regardless, its items will be separated by a | (pipe) in the retrieved CSV file.

  • There is only a limited guarantee of database security on Heroku's Hobby tier. Experiment authors are expected to take responsibility for their results: they should retrieve them and perform backups as soon as possible.

  • This app is based on the Phoenix Framework and written in Elixir. If you wish to modify the app, please consult the official Phoenix Framework and Elixir documentation.
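The two array-related notes above can be illustrated with a short sketch; all field names here are invented for illustration.

```javascript
// Preferred: one key per answer in a trial object.
var flatTrial = { answer1: "red", answer2: "green", answer3: "blue" };

// Discouraged: an array under a single key.
var arrayTrial = { answers: ["red", "green", "blue"] };

// If an array is submitted anyway, the server separates its items
// with a pipe in the generated CSV cell, roughly like:
var csvCell = arrayTrial.answers.join("|"); // "red|green|blue"
```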
