
Mindroid

Android Library for Lego Mindstorms Robots. Main features:

  • Controlling an NXT brick from an Android mobile phone (all sensors and motors)
  • Integration with PocketSphinx speech recognition for voice control of the robot.
  • Quickstart skeleton for beginners in Java programming (suitable for grades 8 and up)
  • Thread-safe robot service for advanced programmers.

To see the available robot commands and their documentation, see Overview of robot commands.

Beginners guide

The simplest way to start programming is to clone the repository and open the RobotControl class located in the app module.

To use it with your NXT brick, you first have to pair your mobile device with the NXT using the system Bluetooth manager. After that, change the line private static final String ROBOT_NAME = "YOUR_NXT_NAME"; so that it points to the name of your NXT.

There are two functions in the RobotControl class that allow you to control the NXT brick. These are:

  • commandProgram() -- here you can write a regular program that will be executed when you hit the play button on your mobile phone.
  • onVoiceCommand() -- here you can place any code that should be executed when the voice recognition system recognizes one of the given phrases.

The example below shows how to use commandProgram() method to send requests to NXT motors.

@Override
public void commandProgram() {
    super.commandProgram();
    /*************** START YOUR PROGRAM HERE ***************/
    // Rotate motors A and B three full turns
    robot.executeSyncTwoMotorTask(robot.motorA.run(10, 360 * 3), robot.motorB.run(10, 360 * 3));
    while (robot.motorA.getState() == Motor.STATE_RUNNING) {
        pause(500);
        Log.d(TAG, "Waiting....");
    }
    robot.executeSyncTwoMotorTask(robot.motorA.stop(), robot.motorB.stop());
}

Below is an example of using voice commands. Speech recognition is enabled when you push the microphone button in the mobile app. It is worth noting that both functions can run simultaneously: for example, you can write a command program that tells the robot to move around, and then stop the robot in onVoiceCommand() when the stop phrase is recognized.

For more information on voice recognition, see: Advanced guide and CMU Sphinx website

@Override
public void onVoiceCommand(String message) {
    super.onVoiceCommand(message);
    /*************** HANDLE VOICE MESSAGE HERE ***************/
    if (message.equals("run forward")) {
        robot.executeSyncTwoMotorTask(robot.motorA.run(30), robot.motorB.run(30));
    } else if (message.equals("stop")) {
        robot.executeSyncTwoMotorTask(robot.motorA.stop(), robot.motorB.stop());
    } else if (message.equals("run backward")) {
        robot.executeSyncTwoMotorTask(robot.motorA.run(-30), robot.motorB.run(-30));
    } else {
        Log.d(TAG, "Received wrong command: " + message);
    }
}

Overview of robot commands

All hardware that can be connected to the NXT brick is accessible via public final variables in the RobotService class.

Motor

The Motor implementation is located in the Motor class. The NXT allows you to connect three motors to ports named A, B and C. They are accessible via the fields motorA, motorB and motorC, respectively.

There are three basic commands you can send to motor:

  • Run motor forward or backward for infinite period of time with a given speed:
//Run motor A forever with speed 50 (100 is maximum)
robot.executeMotorTask(robot.motorA.run(50));
  • Rotate a motor forward or backward through a given angle with a given speed
//Rotate motor A backwards over 360 degrees with speed 10
robot.executeMotorTask(robot.motorA.run(-10,360));
  • Stop a motor
//Stop motor A
robot.executeMotorTask(robot.motorA.stop());
  • Run/stop two or three motors at the same time (synchronized command)
//Rotate motor A over 360 degrees with speed 30 and rotate motor B backward over 360 degrees with speed 30
robot.executeSyncTwoMotorTask(robot.motorA.run(30,360),robot.motorB.run(-30,360));

It is also possible to monitor motor state via updates that the NXT sends back to the motor object. For instance, to rotate a motor through 360 degrees and then rotate it backwards through the same angle, you can use the following code:

robot.executeMotorTask(robot.motorA.run(50, 360));
while (robot.motorA.getState() == Motor.STATE_RUNNING) {
    pause(200);
    Log.d(TAG, "Waiting for motor to finish rotating");
}
robot.executeMotorTask(robot.motorA.run(-50, 360));
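The busy-wait loop above can be factored into a small reusable helper. The sketch below is not part of Mindroid (the WaitUtil class and its waitUntil method are hypothetical names introduced here for illustration); it polls an arbitrary condition at a fixed period until the condition holds or a timeout elapses:

```java
import java.util.function.BooleanSupplier;

public final class WaitUtil {
    private WaitUtil() {}

    // Polls the condition every periodMs until it becomes true or the
    // timeout elapses; returns true if the condition was met in time.
    public static boolean waitUntil(BooleanSupplier condition,
                                    long periodMs, long timeoutMs) {
        long deadline = System.currentTimeMillis() + timeoutMs;
        while (!condition.getAsBoolean()) {
            if (System.currentTimeMillis() >= deadline) {
                return false;
            }
            try {
                Thread.sleep(periodMs);
            } catch (InterruptedException e) {
                Thread.currentThread().interrupt();
                return false;
            }
        }
        return true;
    }
}
```

With such a helper, the loop above could be written as WaitUtil.waitUntil(() -> robot.motorA.getState() != Motor.STATE_RUNNING, 200, 10_000), which also guards against waiting forever if the motor update never arrives.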

Input sensors

Updates from input sensors are delivered via listeners. First, connect the sensor to one of the four ports, both physically and with a programming command:

robot.touchSensor.connect(Sensor.Port.ONE);

Connecting alone does not deliver any updates from the NXT brick. In order to start receiving updates, you have to register an appropriate listener:

robot.touchSensor.registerListener(new TouchSensorListener() {
    @Override
    public void onEventOccurred(TouchStateEvent e) {
        if (e.isPressed()) {
            Log.d(TAG, "Obstacle, running backwards");
            robot.executeMotorTask(robot.motorA.run(-10, 3 * 360));
        }
    }
});

Different sensors deliver different events to their listeners. See the JavaDoc for the full documentation.

Note that the listener will run until you unregister it. Since querying the NXT drains its battery, it is a good idea to unregister a listener when it is no longer needed. Unregistering can be done inside the listener itself:

robot.touchSensor.registerListener(new TouchSensorListener() {
    @Override
    public void onEventOccurred(TouchStateEvent e) {
        // DO SOME COOL STUFF
        robot.touchSensor.unregisterListener();
    }
});

Voice control

To handle voice commands you need to modify the onVoiceCommand method presented below. Note that the speech recognition service will not recognize commands that are not listed in the grammar file or in the language model.

@Override
public void onVoiceCommand(String message) {
    super.onVoiceCommand(message);
    /*************** HANDLE VOICE MESSAGE HERE ***************/
    if (message.equals("run forward")) {
        robot.executeSyncTwoMotorTask(robot.motorA.run(30), robot.motorB.run(30));
    } else if (message.equals("stop")) {
        robot.executeSyncTwoMotorTask(robot.motorA.stop(), robot.motorB.stop());
    } else if (message.equals("run backward")) {
        robot.executeSyncTwoMotorTask(robot.motorA.run(-30), robot.motorB.run(-30));
    } else {
        Log.d(TAG, "Received wrong command: " + message);
    }
}

Grammar based recognition

The grammar file that describes the allowed commands is located at models/assets/robot-commands.gram.

An example of the file is given below:

#JSGF V1.0;

grammar robot;

<forward>  = run forward;
<backward>  = run backward;
<stop> = stop;
<max>  = (<forward> | <backward> ) with maximum speed;

public <item> = <forward> | <stop> | <backward> | <max> ;

The complete syntax of the JSGF format can be found on the W3C website.
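For example, to add turning commands you could extend the grammar above with two new rules and include them in the public rule. This is only a sketch: the phrase wording is up to you, and each new phrase also needs a matching branch in onVoiceCommand() before the robot reacts to it.

```
<left>   = turn left;
<right>  = turn right;

public <item> = <forward> | <stop> | <backward> | <max> | <left> | <right> ;
```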

Language model based recognition

As an alternative to the grammar file, you can build a statistical language model to recognize more complex commands. See this Tutorial to find out how to add a statistical model to Mindroid.

Basic gestures

The system uses accelerometer data from the mobile phone to allow controlling the robot via basic gestures (or, more precisely, via motion of the mobile device around three axes).

The code responsible for controlling the robot based on acceleration should be placed in the following method, located in the RobotControl class.

@Override
protected synchronized void onGestureCommand(double x, double y, double z) {
    displayValues(x, y, z);
    /*************** HANDLE GESTURES HERE ***************/
}

The x, y and z values correspond to the Android coordinate system described here: Coordinate system
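One simple way to turn a tilt reading into a motor speed is a clamped linear mapping with a dead zone. The helper below is a sketch, not part of Mindroid: the GestureMapper class, its tiltToSpeed method, and the choice of which axis drives the robot are all assumptions made here for illustration.

```java
public final class GestureMapper {
    private GestureMapper() {}

    // Maps a tilt reading (roughly -10..10 m/s^2) to a motor speed in
    // -100..100, ignoring small jitters around the level position.
    public static int tiltToSpeed(double tilt) {
        final double deadZone = 1.0;   // tilt below this is treated as level
        final double maxTilt = 10.0;   // tilt at or beyond this is full speed
        if (Math.abs(tilt) < deadZone) {
            return 0;
        }
        double clamped = Math.max(-maxTilt, Math.min(maxTilt, tilt));
        return (int) Math.round(clamped / maxTilt * 100);
    }
}
```

Inside onGestureCommand you could then compute int speed = GestureMapper.tiltToSpeed(x); and drive both motors with robot.executeSyncTwoMotorTask(robot.motorA.run(speed), robot.motorB.run(speed)), assuming the x axis is the one you want to control forward and backward motion.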

mindroid's People

Contributors: sbobek

mindroid's Issues

UltrasonicSensor not working

The ultrasonic sensor is an active sensor (type LOWSPEED_9V) and should be used with the LSREAD and LSWRITE commands. Currently, the ultrasonic sensor always returns 0.
