davidgillsjo / videoimucapture-android

Android application for capturing video, IMU data and camera metadata, useful in SLAM and Structure-from-Motion research. Distinguishes between Optical Image Stabilization (OIS) and Digital Video Stabilization (DVS) and can record OIS data if the device supports it.

License: GNU General Public License v3.0

Java 89.27% Shell 0.26% Python 9.73% Dockerfile 0.57% MATLAB 0.17%
android imu camera video sfm slam research image-stabilization camera2-api

videoimucapture-android's People

Contributors: davidgillsjo, johnnyhoichuen

videoimucapture-android's Issues

Some questions about the project structure and how to understand each component

@DavidGillsjo Hi, I'm very excited to find your project. I'm currently testing VINS-Mono and ORB-SLAM3 on my own Android phone. I tried two other projects, Android_Camera-IMU and android_ros_sensors, to record a rosbag of my camera and IMU, but both have problems: the first does not align the timestamps between camera and IMU, and the second records images at a fixed 1920x1080, so you have to resize the images and re-calibrate the camera for that size. Fortunately I found your project while browsing a blog post, and I have installed the latest APK and cloned the source code locally.

I noticed there is a hooks directory containing three shell scripts:
(screenshot)

and I also found that run_dockerhub.sh and build_docker.sh use these three scripts directly:
(screenshot)

But the libs directory is empty, so how does the container end up building? Also, I have previously installed Kalibr through Docker on my machine; does that mean I don't need to use the script to build the container again? Looking forward to your comment! :-)

Question about the inverted aspect ratio of saved videos

Thank you for sharing this repo with those of us who, like me, have never developed an Android app.
A simple question: regardless of the aspect ratio set in the app or hardcoded (e.g. 1920x1440), the saved videos always have width and height flipped (1440x1920). I'm not sure whether this is related to the vertical UI preview. Could you let me know which part of the code I should modify so that the saved videos match the set aspect ratio without rotation? The only relevant variable I found is mSensorOrientation.
Thanks a lot.
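For anyone hitting the same question, one plausible explanation (my guess, not confirmed by the author): phone camera sensors are usually mounted landscape, so with sensor_orientation = 90 the encoder receives frames in sensor coordinates and the stored dimensions come out transposed relative to the requested size. The relationship can be sketched like this, where oriented_size is my own illustrative helper, not app code:

```python
def oriented_size(width, height, sensor_orientation):
    # sensor_orientation matches CameraInfo.sensor_orientation in the
    # recording metadata (0, 90, 180 or 270 degrees).  For 90/270 the
    # saved video's dimensions appear swapped relative to the request.
    if sensor_orientation % 180 == 90:
        return height, width
    return width, height
```

Under that assumption, a requested 1920x1440 on a sensor_orientation = 90 device would be stored as 1440x1920, matching what you observe.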

Error converting video to rosbag

I am calibrating by following the README in calibration.
When I run "kalibr_calibrate_cameras --bag kalibr.bag --target target.yaml --models pinhole-equi --topics /cam0/image_raw", it fails with "RuntimeError: Could not find topic /cam0/image_raw in kalibr.bag."
When I inspect kalibr.bag, it really does contain only IMU topics, which puzzles me. My guess is that an error occurred when the video was converted to the rosbag.
I hope to get some answers from you, thank you very much!
(screenshot)

Added magnetometer reading, but couldn't read it from the .pb3

I'm modifying the code so that it also records magnetometer data. However, when I read the .pb3, only gyro and accel data are available. Everything, including the .proto, compiled successfully.

(screenshot)

Here are the modified codes from IMUManager and Recording.proto:

IMUManager

public class IMUManager extends SensorEventCallback {
    private static final String TAG = "IMUManager";
    private int ACC_TYPE;
    private int GYRO_TYPE;
    private int MAG_TYPE;

    // if the accelerometer data has a timestamp within the
    // [t-x, t+x] of the gyro data at t, then the original acceleration data
    // is used instead of linear interpolation
    private final long mInterpolationTimeResolution = 500; // nanoseconds
    private final int mSensorRate = 10000; //Us, 100Hz
    private long mEstimatedSensorRate = 0; // ns
    private long mPrevTimestamp = 0; // ns
    private float[] mSensorPlacement = null;

    private static class SensorPacket {
        long timestamp;
        float[] values;

        SensorPacket(long time, float[] vals) {
            timestamp = time;
            values = vals;
        }
    }

    private static class SyncedSensorPacket {
        long timestamp;
        float[] acc_values;
        float[] gyro_values;
        float[] mag_values;

        SyncedSensorPacket(long time, float[] acc, float[] gyro, float[] mag) {
            timestamp = time;
            acc_values = acc;
            gyro_values = gyro;
            mag_values = mag;
        }
    }

    // Sensor listeners
    private SensorManager mSensorManager;
    private Sensor mAccel;
    private Sensor mGyro;
    private Sensor mMag;

    private int linear_acc; // accuracy
    private int angular_acc;
    private int mag_acc;

    private volatile boolean mRecordingInertialData = false;
    private RecordingWriter mRecordingWriter = null;
    private HandlerThread mSensorThread;

    private Deque<SensorPacket> mGyroData = new ArrayDeque<>();
    private Deque<SensorPacket> mAccelData = new ArrayDeque<>();
    private Deque<SensorPacket> mMagData = new ArrayDeque<>();

    public IMUManager(Activity activity) {
        super();
        mSensorManager = (SensorManager) activity.getSystemService(Context.SENSOR_SERVICE);
        setSensorType();
        mAccel = mSensorManager.getDefaultSensor(ACC_TYPE);
        mGyro = mSensorManager.getDefaultSensor(GYRO_TYPE);
        mMag = mSensorManager.getDefaultSensor(MAG_TYPE);
    }

    private void setSensorType() {
        if (Build.VERSION.SDK_INT >= 26)
            ACC_TYPE = Sensor.TYPE_ACCELEROMETER_UNCALIBRATED;
        else
            ACC_TYPE = Sensor.TYPE_ACCELEROMETER;
        GYRO_TYPE = Sensor.TYPE_GYROSCOPE_UNCALIBRATED;
        MAG_TYPE = Sensor.TYPE_MAGNETIC_FIELD_UNCALIBRATED;
    }

    public Boolean sensorsExist() {
        return (mAccel != null) && (mGyro != null) && (mMag != null);
    }

    public void startRecording(RecordingWriter recordingWriter) {
        mRecordingWriter = recordingWriter;
        writeMetaData();
        mRecordingInertialData = true;
    }

    public void stopRecording() {
        mRecordingInertialData = false;
    }

    @Override
    public final void onAccuracyChanged(Sensor sensor, int accuracy) {
        if (sensor.getType() == ACC_TYPE) {
            linear_acc = accuracy;
        } else if (sensor.getType() == GYRO_TYPE) {
            angular_acc = accuracy;
        } else if (sensor.getType() == MAG_TYPE) {
            mag_acc = accuracy;
        }
    }

    private SyncedSensorPacket syncInertialData() {
        // (body elided in this issue)
        return null;
    }

    private void writeData(SyncedSensorPacket packet) {
        RecordingProtos.IMUData.Builder imuBuilder =
                RecordingProtos.IMUData.newBuilder()
                        .setTimeNs(packet.timestamp)
                        .setAccelAccuracyValue(linear_acc)
                        .setGyroAccuracyValue(angular_acc)
                        .setMagAccuracyValue(mag_acc);

        for (int i = 0 ; i < 3 ; i++) {
            imuBuilder.addGyro(packet.gyro_values[i]);
            imuBuilder.addAccel(packet.acc_values[i]);
            imuBuilder.addMag(packet.mag_values[i]);
        }
        if (ACC_TYPE == Sensor.TYPE_ACCELEROMETER_UNCALIBRATED) {
            for (int i = 3 ; i < 6 ; i++) {
                imuBuilder.addAccelBias(packet.acc_values[i]);
            }
        }
        if (GYRO_TYPE == Sensor.TYPE_GYROSCOPE_UNCALIBRATED) {
            for (int i = 3 ; i < 6 ; i++) {
                imuBuilder.addGyroDrift(packet.gyro_values[i]);
            }
        }
        if (MAG_TYPE == Sensor.TYPE_MAGNETIC_FIELD_UNCALIBRATED) {
            for (int i = 3 ; i < 6 ; i++) {
                imuBuilder.addMagBias(packet.mag_values[i]);
            }
        }

        mRecordingWriter.queueData(imuBuilder.build());
    }

    private void writeMetaData() {
        RecordingProtos.IMUInfo.Builder builder = RecordingProtos.IMUInfo.newBuilder();
        if (mGyro != null) {
            builder.setGyroInfo(mGyro.toString()).setGyroResolution(mGyro.getResolution());
        }
        if (mAccel != null) {
            builder.setAccelInfo(mAccel.toString()).setAccelResolution(mAccel.getResolution());
        }
        if (mMag != null) {
            // verified mag is present
            Log.d(TAG, "writeMetaData: " + mMag.toString() + " " + mMag.getResolution());
            builder.setMagInfo(mMag.toString()).setMagResolution(mMag.getResolution());
        }
        builder.setSampleFrequency(getSensorFrequency());

        //Store translation for sensor placement in device coordinate system.
        if (mSensorPlacement != null) {
            // TODO: 19/1/2023 what is this placement??
            Log.d(TAG, "writeMetaData, mSensorPlacement: " + mSensorPlacement);

            builder.addPlacement(mSensorPlacement[3])
                    .addPlacement(mSensorPlacement[7])
                    .addPlacement(mSensorPlacement[11]); // original only up to 11
        }
        mRecordingWriter.queueData(builder.build());

        Log.d(TAG, "writeMetaData: builder.build()" + builder.build());
    }

    private void updateSensorRate(SensorEvent event) {
        long diff = event.timestamp - mPrevTimestamp;
        mEstimatedSensorRate += (diff - mEstimatedSensorRate) >> 3;
        mPrevTimestamp = event.timestamp;
    }

    public float getSensorFrequency() {
        return 1e9f/((float) mEstimatedSensorRate);
    }

    @Override
    public final void onSensorChanged(SensorEvent event) {
        if (event.sensor.getType() == ACC_TYPE) {
            SensorPacket sp = new SensorPacket(event.timestamp, event.values);
            mAccelData.add(sp);

            updateSensorRate(event);
        } else if (event.sensor.getType() == GYRO_TYPE) {
            SensorPacket sp = new SensorPacket(event.timestamp, event.values);
            mGyroData.add(sp);

            // sync data
            SyncedSensorPacket syncedData = syncInertialData();
            if (syncedData != null && mRecordingInertialData) {
                writeData(syncedData);
            }
        } else if (event.sensor.getType() == MAG_TYPE) {
            SensorPacket sp = new SensorPacket(event.timestamp, event.values);
            mMagData.add(sp);
        }
    }

    @Override
    public final void onSensorAdditionalInfo(SensorAdditionalInfo info) {
        if (mSensorPlacement != null) {
            return;
        }

        // what is this
        if ((info.sensor == mAccel) && (info.type == SensorAdditionalInfo.TYPE_SENSOR_PLACEMENT)) {
            Log.d(TAG, "onSensorAdditionalInfo: accel adding sensor placement");
            mSensorPlacement = info.floatValues;
        }
    }

    /**
     * This will register all IMU listeners
     * https://stackoverflow.com/questions/3286815/sensoreventlistener-in-separate-thread
     */
    public void register() {
        if (!sensorsExist()) {
            return;
        }
        mSensorThread = new HandlerThread("Sensor thread",
                Process.THREAD_PRIORITY_MORE_FAVORABLE);
        mSensorThread.start();
        // Blocks until looper is prepared, which is fairly quick
        Handler sensorHandler = new Handler(mSensorThread.getLooper());
        mSensorManager.registerListener(this, mAccel, mSensorRate, sensorHandler);
        mSensorManager.registerListener(this, mGyro, mSensorRate, sensorHandler);
        mSensorManager.registerListener(this, mMag, mSensorRate, sensorHandler);
    }

    /**
     * This will unregister all IMU listeners
     */
    public void unregister() {
        if (!sensorsExist()) {
            return;
        }
        mSensorManager.unregisterListener(this, mAccel);
        mSensorManager.unregisterListener(this, mGyro);
        mSensorManager.unregisterListener(this, mMag);
        mSensorManager.unregisterListener(this);
        mSensorThread.quitSafely();
        stopRecording();
    }
}
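A note on the rate estimate in the code above: updateSensorRate keeps an integer exponential moving average of the sample period, with alpha = 1/8 implemented as the >> 3, and getSensorFrequency converts that period to Hz. A quick Python translation (for illustration only) shows it converging on simulated 100 Hz data:

```python
def update_rate(est_ns, prev_ts_ns, ts_ns):
    # Integer EMA of the sample period, as in IMUManager.updateSensorRate:
    # est += (diff - est) >> 3   (i.e. alpha = 1/8)
    diff = ts_ns - prev_ts_ns
    est_ns += (diff - est_ns) >> 3
    return est_ns, ts_ns

def frequency_hz(est_ns):
    # Mirrors getSensorFrequency: mean period in ns -> rate in Hz
    return 1e9 / est_ns

est, prev = 0, 0
for k in range(1, 201):            # simulate 2 s of samples 10 ms apart
    est, prev = update_rate(est, prev, k * 10_000_000)
# frequency_hz(est) is now very close to 100 Hz
```

The right shift means the estimate responds to a rate change with a time constant of roughly eight samples, at the cost of a tiny steady-state rounding offset.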

Recording.proto

syntax = "proto3";

import "google/protobuf/timestamp.proto";

package videoimu;

option java_package = "se.lth.math.videoimucapture";
option java_outer_classname = "RecordingProtos";

message CameraInfo {
  //fx, fy, cx, cy, s
  // See https://developer.android.com/reference/android/hardware/camera2/CameraCharacteristics#LENS_INTRINSIC_CALIBRATION
  // for details on how to use the intrinsics, pose_translation and pose_rotation.
  repeated float intrinsic_params = 1;
  //Radial: k1,k2,k3, Tangential: k4,k5
  repeated float distortion_params = 2;
  bool optical_image_stabilization = 3;
  bool video_stabilization = 4;
  bool distortion_correction = 10;
  int32 sensor_orientation = 14;

  enum FocusCalibration {
    UNCALIBRATED = 0;
    APPROXIMATE = 1;
    CALIBRATED = 2;
  }
  FocusCalibration focus_calibration = 5;

  enum TimestampSource {
    UNKNOWN = 0;
    REALTIME = 1;
  }
  TimestampSource timestamp_source = 6;

  enum LensPoseReference {
    PRIMARY_CAMERA = 0;
    GYROSCOPE = 1;
    UNDEFINED = 2;
  }
  LensPoseReference lens_pose_reference = 7;
  repeated float lens_pose_rotation = 8;
  repeated float lens_pose_translation = 9;

  message Size {
    int32 width = 1;
    int32 height = 2;
  }
  Size resolution = 11;
  Size pre_correction_active_array_size = 12; //SENSOR_INFO_PRE_CORRECTION_ACTIVE_ARRAY_SIZE
  repeated float original_intrinsic_params = 13;

}

message VideoFrameToTimestamp{
  int64 time_us = 1;
  int64 frame_nbr = 2;
}

message VideoFrameMetaData {
  int64 time_ns = 1;
  int64 frame_number = 2;
  int64 exposure_time_ns = 3;
  int64 frame_duration_ns = 4;
  int64 frame_readout_ns = 5;
  int32 iso = 6;
  float focal_length_mm = 7;
  float est_focal_length_pix = 8;
  float focus_distance_diopters = 9;

  message OISSample {
    int64 time_ns = 1;
    float x_shift = 2;
    float y_shift = 3;
  }
  repeated OISSample OIS_samples = 10;
  bool focus_locked = 11;
}

message IMUInfo {
  string gyro_info = 1;
  float gyro_resolution = 2;
  string accel_info = 3;
  float accel_resolution = 4;
  string mag_info = 5;
  float mag_resolution = 6;

  float sample_frequency = 7; //Hz
  repeated float placement = 8;
}

message IMUData {
  int64 time_ns = 1;
  repeated float gyro = 2;
  repeated float gyro_drift = 3;
  repeated float accel = 4;
  repeated float accel_bias = 5;
  repeated float mag = 6;
  repeated float mag_bias = 7;

  enum Accuracy {
    UNRELIABLE=0;
    LOW = 1;
    MEDIUM = 2;
    HIGH = 3;
  }
  Accuracy gyro_accuracy = 8;
  Accuracy accel_accuracy = 9;
  Accuracy mag_accuracy = 10;
  

}

message VideoCaptureData {
  google.protobuf.Timestamp time = 1;
  CameraInfo camera_meta = 2;
  IMUInfo imu_meta = 3;

  repeated IMUData imu = 4;
  repeated VideoFrameMetaData video_meta = 5;

}

message MessageWrapper {
  oneof msg {
    CameraInfo camera_meta = 1;
    IMUData imu_data = 2;
    IMUInfo imu_meta = 3;
    VideoFrameMetaData frame_meta = 4;
    VideoFrameToTimestamp frame_time = 5;
  }
}

I debugged further in the app but still have no clue. For example, with the IMU metadata:

(screenshot)

The message contains the magnetometer info right up until it is written to the file. But when I read the file back, nothing about the magnetometer is there.

Timestamps abnormal

I encountered some problems in the data conversion: the timestamps of the converted data are abnormal.
(screenshot)

time synchronization problem

Hi, I have tried your code. I collected data with my mobile phone and then converted the proto file into a bag file. Can you tell me whether the image and IMU timestamps in the bag file are synchronized? Thank you in advance.

Error regarding camera_meta.lens_pose_rotation

Hello,
I am currently trying to run visual-inertial SLAM using your app.
I successfully installed the app and recorded a video, but I cannot calibrate my camera.

When I try to run this code:

python data2kalibr.py /data --tag-size 0.11 --subsample 30

It gives me this error:

Traceback (most recent call last):
  File "data2kalibr.py", line 120, in <module>
    create_camera_yaml(proto, camera_yaml_path, args.matlab_calibration)
  File "data2kalibr.py", line 19, in create_camera_yaml
    q = Quaternion(c.lens_pose_rotation[3], *c.lens_pose_rotation[:3])
IndexError: list index (3) out of range

Apparently it is caused by this part of the file:

def create_camera_yaml(proto, camera_yaml_path, matlab_calibration=None):
    c = proto.camera_meta
    est_focal_length = proto.video_meta[0].est_focal_length_pix
    q = Quaternion(c.lens_pose_rotation[3], *c.lens_pose_rotation[:3])
    print('intrinsics: ', c.intrinsic_params)
    P = q.transformation_matrix
    print("Translation")
    print(c.lens_pose_translation)
...

I am struggling to find what caused this error. Is it because camera_meta variable has no attribute lens_pose_rotation?
I checked out issue #5 so I know that it is fine to not have intrinsic_param.

I would appreciate your suggestion or advice.
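The IndexError means camera_meta.lens_pose_rotation is empty: many devices simply do not report LENS_POSE_ROTATION, so the repeated field has zero elements rather than the four (x, y, z, w) values data2kalibr.py expects. One possible workaround (my own sketch, not a repo patch) is to fall back to the identity rotation when the field is missing:

```python
def lens_pose_quaternion(lens_pose_rotation):
    # LENS_POSE_ROTATION is stored as (x, y, z, w); data2kalibr.py calls
    # Quaternion(w, x, y, z).  Devices without a reported lens pose leave
    # the repeated field empty, which raises the IndexError, so fall back
    # to the identity quaternion in that case.
    if len(lens_pose_rotation) < 4:
        return (1.0, 0.0, 0.0, 0.0)        # identity (w, x, y, z)
    x, y, z, w = lens_pose_rotation
    return (w, x, y, z)
```

You could then build Quaternion(*lens_pose_quaternion(c.lens_pose_rotation)) in create_camera_yaml; whether identity is a good enough approximation depends on your device.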

How do you read the video_meta.pb3 file?

Hi there,

And apologies if this is trivial, I've only just started reading about Protocol Buffers, but how can I read the IMU data .pb3 file? I thought it might be by running:

protoc --python_out=data data/2021_06_24_14_30_19/video_meta.pb3

But this leads to "2021_06_24_14_30_19/video_meta.pb3:40:1459: Unexpected end of string."

My understanding was that the protoc command would create a Python file I could import, allowing me to instantiate the message type (e.g. MessageType = video_meta.MessageType()), and that I could then use a library such as read-protobuf to read the buffer into a dataframe for analysis.

Even a mere link to reading material I should go through is appreciated, thanks. I'd also suggest adding such a link or instructions to the README.
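To clear up the protoc confusion for future readers: protoc compiles .proto schema files into code; it cannot parse a serialized data file like video_meta.pb3, which is why it reports "Unexpected end of string". The schema to compile is the repo's Recording.proto (protoc --python_out=. Recording.proto yields recording_pb2.py). For the parsing step, here is a rough stdlib sketch that assumes, and this is my assumption, that the file stores length-delimited MessageWrapper messages (the convention of Java's writeDelimitedTo):

```python
def read_varint(buf, pos):
    # Decode one protobuf base-128 varint starting at buf[pos];
    # returns (value, next_position).
    result = shift = 0
    while True:
        b = buf[pos]
        pos += 1
        result |= (b & 0x7F) << shift
        if not b & 0x80:
            return result, pos
        shift += 7

def split_delimited(buf):
    # Split a stream of length-delimited protobuf messages into the raw
    # payload of each message (a varint length prefix precedes each one).
    pos, payloads = 0, []
    while pos < len(buf):
        size, pos = read_varint(buf, pos)
        payloads.append(buf[pos:pos + size])
        pos += size
    return payloads
```

Each payload could then be decoded with recording_pb2.MessageWrapper.FromString(payload). If the file is instead one single serialized message, plain VideoCaptureData.FromString(open(path, "rb").read()) would be the way in; check which convention the repo's calibration scripts use.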

Skipped frame, missing data

First off, thank you for this software. I expect this to be super useful.

After recording a video with the Android app, I ran "data2statistics.py //video_meta.pb3". Then when I run "python data2kalibr.py" it prints "skipping frame xxx, missing data". I don't know the reason. By the way, when I use the app it warns "OIS is enabled but OIS data is not stored", and I can neither enable the OIS data nor turn off "Enable OIS".

Error running camera calibration

I believe I followed all the configuration requirements when setting up the environment. But when I run "python data2kalibr.py / --tag-size 0.235 --subsample 21", I get this error:

Traceback (most recent call last):
  File "data2kalibr.py", line 109, in <module>
    convert_to_bag(proto, video_path, bag_path, args.subsamle)
  File "data2rosbag.py", line 39, in convert_to_bag
    if not i == frame_data.frame_nbr:
AttributeError: frame_nbr

Thanks!

Linear interpolation not working

I was checking the interpolation code and I realised that leftAccel and rightAccel are always the same, which eventually causes the interpolated acc_data to equal leftAccel and rightAccel.

I wonder if this happens only on my device (Samsung S8, Android 9).

Inside syncInertialData() in IMUManager:

(screenshot)
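For reference, the intended behaviour of syncInertialData can be sketched in Python (my illustration of what the Java code is meant to do, not the code itself): find the accelerometer samples bracketing each gyro timestamp and interpolate linearly, using a sample directly when it falls within the 500 ns tolerance. If leftAccel and rightAccel always come out identical, the bracketing search over the deque is presumably returning the same element twice on your device.

```python
def interp_accel(accel_samples, t_gyro, tol_ns=500):
    """Linearly interpolate accelerometer values to a gyro timestamp.

    accel_samples: list of (timestamp_ns, (ax, ay, az)), sorted by time.
    tol_ns mirrors mInterpolationTimeResolution: a sample within tol_ns
    of t_gyro is used directly instead of interpolating.
    """
    left = right = None
    for ts, vals in accel_samples:
        if ts <= t_gyro:
            left = (ts, vals)          # last sample at or before t_gyro
        else:
            right = (ts, vals)         # first sample after t_gyro
            break
    for side in (left, right):
        if side is not None and abs(side[0] - t_gyro) <= tol_ns:
            return side[1]
    if left is None or right is None:
        return None                    # cannot bracket t_gyro yet
    (t0, v0), (t1, v1) = left, right
    w = (t_gyro - t0) / (t1 - t0)
    return tuple(a + w * (b - a) for a, b in zip(v0, v1))
```

With two distinct bracketing samples the result lies strictly between them, which is what you should see in the app when the search works.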

ModuleNotFoundError: No module named 'recording_pb2'

Describe the bug
I have read the "Protobuf File" section and followed the instructions. However, when I run data2statistics.py, it always fails with ModuleNotFoundError: No module named 'recording_pb2'.

Expected behavior
(screenshot)

Screenshots
(screenshot)
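This error just means Python cannot find the generated module: protoc --python_out=<dir> Recording.proto writes recording_pb2.py into <dir>, and that directory must be on sys.path when data2statistics.py runs. A small stdlib helper (my own sketch, not part of the repo) that locates the generated file and makes it importable:

```python
import pathlib
import sys

def add_generated_module(root="."):
    # Search under `root` for recording_pb2.py (the file produced by
    # `protoc --python_out`) and put its directory on sys.path so that
    # `import recording_pb2` succeeds.
    hits = sorted(pathlib.Path(root).rglob("recording_pb2.py"))
    if not hits:
        raise FileNotFoundError(
            "recording_pb2.py not found under %r - run "
            "`protoc --python_out=. Recording.proto` first" % root)
    sys.path.insert(0, str(hits[0].parent))
    return hits[0]
```

Alternatively, simply run protoc with --python_out pointing at the directory you run the script from.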

"Cannot find IMU" error on Samsung A50, Android 11

Hi,
I have been trying to use the app but I get a warning sign with "Cannot find IMU" as the error. I can understand that the issue is the SensorManager not being able to find the accelerometer and gyroscope on my phone, but I can't figure out how to enable them for this app. I'm able to use my sensors on the marslogger app (but it has different issues). I'm also able to use this app on another model with the same Android version.

Need some pointers on how to fix this.

Phone details:
Model: Samsung A50 (SM-A505F)
Android version: Android 11

Recording has fewer frame metadata messages than there are image frames.

First off, thank you for this software. I expect this to be super useful.

I just did a quick test and I'm not having any difficulties reading the data from the protobuf file. The created mp4 file has 1,464 frames in it, but the protobuf file only has 1,456 VideoFrameMetaData messages in it, with the frame number ranging from 2 to 1,458. Can I assume that the frame numbers are correct? If so are they zero-indexed or one-indexed? If not, how can I be certain which frame goes with which metadata?

Thanks.
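For others wondering about the numbering: frame_number comes from Camera2's CaptureResult.getFrameNumber(), which counts capture requests from session start, so it is neither zero- nor one-indexed relative to the mp4 (that is my reading of the Camera2 API, not a confirmation from the author, and it would explain the range starting at 2 here). A quick sketch for listing which frame numbers have no stored metadata:

```python
def missing_frames(frame_numbers):
    # frame_numbers: the VideoFrameMetaData.frame_number values found in
    # the .pb3.  Returns the numbers missing from the contiguous range,
    # i.e. frames whose metadata was dropped.
    full = set(range(min(frame_numbers), max(frame_numbers) + 1))
    return sorted(full - set(frame_numbers))
```

If the count of gaps plus stored messages matches the mp4's frame count, the metadata-to-frame mapping is just the sorted order with those gaps skipped.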

Global status: warn

I have created the bag file successfully. However, when I play the bag file in VINS-Mono, the global status is "warn" and the track disappears.
Have you ever encountered this kind of problem?
Looking forward to your reply!
(screenshot)

Time system

Hi, I have tried your code and it helps me a lot, thanks! I want to know which time system this app uses. After I ran data2lth_version.py, there is an attribute about time. Is its unit ns, and which time system is used? Thanks very much!

Not getting 30 FPS Video Output

First of all, Thanks for the awesome project.

  • Device: POCO F1
  • OS: Android 10
  • App Version 0.12
  • Ubuntu 18.04

On my device, I'm getting 15 FPS video output.

(screenshot)

Below is the output of the data2statistics.py script:

python data2statistics.py /host_home/2023_08_01_23_08_39/video_meta.pb3

But whenever I try the calibration script:

python data2kalibr.py /host_home/2023_08_01_23_08_39 --tag-size 0.088 --subsample 30

I also tried changing --subsample to 15, since my device records at 15 FPS, but it shows the error below.
(I'm sharing the whole terminal session in case I made a mistake somewhere.)

Using default tag: latest
latest: Pulling from davidgillsjo/videoimucapture-calibration
Digest: sha256:2c531c91c19d49df7145dcc8dc1887eb336dddea180ded87bfdd10fa6f520fcc
Status: Image is up to date for davidgillsjo/videoimucapture-calibration:latest
docker.io/davidgillsjo/videoimucapture-calibration:latest
groups: cannot find name for group ID 1000

sajjad@ab625d2976ad:/calibration$ python data2statistics.py /host_home/2023_08_01_23_08_39/video_meta.pb3

/usr/lib/python2.7/dist-packages/matplotlib/font_manager.py:273: UserWarning: Matplotlib is building the font cache using fc-list. This may take a moment.
  warnings.warn('Matplotlib is building the font cache using fc-list. This may take a moment.')
focus_calibration: APPROXIMATE
timestamp_source: REALTIME
resolution {
  width: 960
  height: 1280
}
pre_correction_active_array_size {
  width: 4032
  height: 3024
}
sensor_orientation: 90

gyro_info: "{Sensor name=\"BMI160 Gyroscope-Uncalibrated Non-wakeup\", vendor=\"BOSCH\", version=34744596, type=16, maxRange=17.452778, resolution=5.326165E-4, power=0.9, minDelay=5000}"
gyro_resolution: 0.000532616511919
accel_info: "{Sensor name=\"BMI160 Accelerometer-Uncalibrated Non-wakeup\", vendor=\"BOSCH\", version=34744596, type=35, maxRange=78.4532, resolution=0.0023928226, power=0.18, minDelay=2500}"
accel_resolution: 0.00239282264374
sample_frequency: 100.762886047


sajjad@ab625d2976ad:/calibration$ python data2kalibr.py /host_home/2023_08_01_23_08_39 --tag-size 0.088 --subsample 30


skipping frame 0, missing data
Traceback (most recent call last):
  File "data2kalibr.py", line 120, in <module>
    create_camera_yaml(proto, camera_yaml_path, args.matlab_calibration)
  File "data2kalibr.py", line 19, in create_camera_yaml
    q = Quaternion(c.lens_pose_rotation[3], *c.lens_pose_rotation[:3])
IndexError: list index (3) out of range

Can you please help me figure out what I am doing wrong?
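One sanity check worth running before picking --subsample (a sketch of mine, not a repo script): estimate the realised frame rate directly from the time_ns values stored in video_meta.pb3 instead of trusting the mp4 header:

```python
def estimate_fps(frame_times_ns):
    # frame_times_ns: sorted VideoFrameMetaData.time_ns values from the
    # recording.  Returns the realised average capture rate in Hz.
    if len(frame_times_ns) < 2:
        raise ValueError("need at least two frames")
    span_ns = frame_times_ns[-1] - frame_times_ns[0]
    return (len(frame_times_ns) - 1) * 1e9 / span_ns
```

If this really reports about 15 FPS, the device is likely throttling the chosen resolution/format combination (long exposure in low light is a common cause), and --subsample should be chosen relative to that rate. The IndexError itself is the empty lens_pose_rotation problem discussed in an earlier issue, independent of the frame rate.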
