
backslash_art

Introduction

As part of the \Art program at Cornell Tech, we (Renee and Nishad) worked with artist Kate Gilmore on a performance piece called “They Call Us A Storm.” During the performance, four young women “orchestrated” a thunderstorm using gestures and body poses such as stomping, kneeling, and arm-raising. Each performer could trigger a different type of sound (wind, quiet thunder, big thunder, or lightning), with one special gesture triggering Whitney Houston’s “I Will Always Love You.”

Implementation Details

In the final implementation of the performance, we use Microsoft’s Kinect for Xbox One (v2) sensor and the Kinect for Windows SDK 2.0 to implement body tracking in C++. In addition to the SDK’s built-in detection of body leaning and open/closed hands, we used Microsoft’s Visual Gesture Builder (VGB) to create a custom gesture database by recording and tagging clips of our gestures (hands on hips, stomping, and summoning). When one of these gestures is detected, we use oscpack to send the appropriate messages to our Cycling ’74 Max patch, which plays quadraphonic sounds based on the incoming messages.
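As a rough illustration of the routing step described above, here is a minimal Python sketch of a gesture-to-message mapping. The gesture names and OSC-style addresses are hypothetical placeholders, not the project’s actual identifiers (the real implementation is C++ using oscpack).

```python
# Hypothetical sketch of the gesture -> sound routing described above.
# Gesture names and OSC-style addresses are illustrative placeholders,
# not the identifiers used in the actual performance code.

GESTURE_TO_ADDRESS = {
    "lean": "/sound/wind",
    "hands_on_hips": "/sound/quiet_thunder",
    "stomp": "/sound/big_thunder",
    "hands_open": "/sound/lightning",
    "summoning": "/sound/whitney",  # triggers "I Will Always Love You"
}

def route_gesture(gesture):
    """Return the message address for a detected gesture, or None
    if the gesture has no sound assigned."""
    return GESTURE_TO_ADDRESS.get(gesture)
```

In the Max patch, each incoming address would then be bound to its own playback chain, keeping the gesture-recognition side decoupled from the audio side.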

Old Implementation

In our original implementation of this performance, we used OpenPose (https://github.com/CMU-Perceptual-Computing-Lab/openpose), an open-source, computer-vision-based project that detects people’s skeletons and returns keypoints (x, y coordinates of body parts) for every person detected in each frame. We ran OpenPose on an AWS g3 instance, using an IP security camera feed as input to the models. A Python script (using the watchdog library) read the keypoints in real time from the output JSON files that OpenPose was creating. To detect specific gestures (e.g., arm raising, head nodding, and stomping), we analyzed how certain keypoints changed during each gesture and turned the patterns we found into heuristics. When a gesture was detected, we sent the appropriate message to Max using python-osc. In this version, both sounds and flashing images of lightning were triggered.
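The keypoint heuristics can be sketched roughly as follows. The keypoint names, coordinate convention (image y grows downward, as in OpenPose output), and thresholds here are assumptions for illustration, not the values we actually used.

```python
# Rough sketch of keypoint-based gesture heuristics. Keypoint names and
# the drop threshold are assumptions; image y increases downward.

def arm_raised(keypoints):
    """Heuristic: the arm counts as raised when the wrist keypoint
    sits above the shoulder keypoint in the frame."""
    wrist = keypoints.get("right_wrist")
    shoulder = keypoints.get("right_shoulder")
    if wrist is None or shoulder is None:  # keypoint missing this frame
        return False
    return wrist[1] < shoulder[1]  # smaller y means higher in the image

def stomp_detected(ankle_ys, drop=30):
    """Heuristic: a stomp shows up as a rapid downward ankle movement
    (a large increase in y) between consecutive frames."""
    return any(b - a > drop for a, b in zip(ankle_ys, ankle_ys[1:]))
```

In practice, a heuristic like this would also need per-camera calibration of the threshold and some smoothing over frames to avoid false triggers from detection jitter.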

Other Topics

Other topics we explored that did not make it to the final performance include:

  • Envelop for Live (ambisonics/spatial audio)
  • Lighting control and flashing images on projectors
  • People/face detection using OpenCV, or physical wearable sensors
  • Face detection using the Kinect


