
pytorch-lite-multiplatform


A Kotlin Multiplatform wrapper around the PyTorch Lite libraries on Android and iOS. Use this library in your Kotlin Multiplatform project to write mobile inference code for PyTorch Lite models. The API closely follows the Android API of PyTorch Lite.

Installation

Add the following to your shared/build.gradle.kts as a commonMain dependency.

implementation("de.voize:pytorch-lite-multiplatform:<version>")

Add the PLMLibTorchWrapper pod to the cocoapods plugin block in shared/build.gradle.kts. Because PLMLibTorchWrapper depends on the LibTorch-Lite pod, which contains static libraries, you also need to call useLibraries():

cocoapods {
    ...

    pod("PLMLibTorchWrapper") {
        version = "<version>"
        headers = "LibTorchWrapper.h"
    }

    useLibraries()
}

If you use a Kotlin version below 1.8.0, the headers property is not available. Instead, add the following to your shared/build.gradle.kts (see this issue for more information):

tasks.named<org.jetbrains.kotlin.gradle.tasks.DefFileTask>("generateDefPLMLibTorchWrapper").configure {
    doLast {
        // manually write the cinterop .def file that the headers
        // property would otherwise declare
        outputFile.writeText("""
            language = Objective-C
            headers = LibTorchWrapper.h
        """.trimIndent())
    }
}

Additional steps:

  • make sure bitcode is disabled in your iOS Xcode project
  • make sure that your iOS app's Podfile does not include use_frameworks!
  • your framework block should probably declare isStatic = true (see the sketch after this list)
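
As a sketch, such a framework block inside the cocoapods block could look like this; only isStatic is the point here, everything else follows the usual Kotlin CocoaPods plugin conventions:

cocoapods {
    ...

    framework {
        // link the shared module as a static framework
        isStatic = true
    }
}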

Usage

First, export your PyTorch model for the lite interpreter. How the exported model file gets onto the device is up to your application, e.g. bundled with your app or downloaded from a server during app initialization. Then initialize a TorchModule with the path to the model file.

import de.voize.pytorch_lite_multiplatform.TorchModule

val module = TorchModule(path = "<path/to/model.ptl>")
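
As a hedged illustration for the bundled-model case on iOS: the bundledModelPath helper below is hypothetical, not part of this library. On Android, note that bundled assets are not plain files and must first be copied to a real file path, e.g. under context.filesDir.

import platform.Foundation.NSBundle
import de.voize.pytorch_lite_multiplatform.TorchModule

// hypothetical helper in iosMain: resolve a model bundled with the app
fun bundledModelPath(name: String): String =
    NSBundle.mainBundle.pathForResource(name, ofType = "ptl")
        ?: error("model $name.ptl not found in app bundle")

val module = TorchModule(path = bundledModelPath("model"))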

Once the module is initialized, you are ready to run inference.

Just like in the Android API of PyTorch Lite, you use IValue and Tensor to pass input data into your model and to process the model output. To manage the memory allocated for your tensors, wrap your inference in plmScoped, which defines up to which point the memory stays allocated.

import de.voize.pytorch_lite_multiplatform.*

plmScoped {
    val inputTensor = Tensor.fromBlob(
        data = floatArrayOf(...),
        shape = longArrayOf(...),
        scope = this
    )

    val inputIValue = IValue.fromTensor(inputTensor)

    val output = module.forward(inputIValue)
    // you could also use
    // module.runMethod("forward", inputIValue)

    val outputTensor = output.toTensor()
    val outputData = outputTensor.getDataAsFloatArray()

    ...
}

IValue is flexible enough to construct whatever input your model needs, e.g. tensors, scalars, booleans, dicts, and tuples. Refer to the IValue interface for all available options and browse PyTorch's Android Demo for examples of inference using IValue.
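
For instance, a model whose forward method takes a dict of named inputs could be fed like this. The factory names other than fromTensor are assumptions extrapolated from the naming above and from the PyTorch Android API; verify them against this library's IValue interface:

plmScoped {
    val features = Tensor.fromBlob(
        data = floatArrayOf(0.2f, 0.7f),
        shape = longArrayOf(1, 2),
        scope = this,
    )

    // assumed factory names, mirroring IValue.fromTensor above;
    // check the IValue interface for the actual spelling
    val inputs = IValue.fromDictStringKey(
        mapOf(
            "features" to IValue.fromTensor(features),
            "apply_softmax" to IValue.fromBool(true),
        )
    )

    val output = module.forward(inputs)
}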

Memory Management

To simplify managing the resources allocated for your inference across Android and iOS, this library provides PLMScope and the plmScoped utility. On Android, the JVM garbage collector and PyTorch Lite manage the allocated memory, so plmScoped is a no-op. On iOS, however, memory allocated in Kotlin and exchanged with native Objective-C code (and vice versa) is not deallocated automatically. This is where plmScoped comes in: when the scope ends, it frees the memory allocated for your inference. It is therefore important to define the scope so that resources stay allocated exactly as long as they are needed, avoiding both memory leaks and the premature loss of memory that is needed later.
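
In practice this means copying results into plain Kotlin values before the scope closes, and letting everything that references native memory die with the scope. A sketch, reusing the module from above:

import de.voize.pytorch_lite_multiplatform.*

lateinit var probabilities: FloatArray

plmScoped {
    val input = IValue.fromTensor(
        Tensor.fromBlob(
            data = floatArrayOf(1f, 2f, 3f),
            shape = longArrayOf(1, 3),
            scope = this,
        )
    )

    // getDataAsFloatArray() copies the tensor contents into a plain
    // FloatArray, so the result remains valid after the scope frees
    // the native tensor memory
    probabilities = module.forward(input).toTensor().getDataAsFloatArray()
}

// safe to use here: the copy does not reference scope-managed memory
println(probabilities.joinToString())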

Running tests

iOS

To run the tests on iOS, execute the iosSimulatorX64Test Gradle task:

./gradlew iosSimulatorX64Test

This automatically calls build_dummy_model.py to create the dummy TorchScript module used for testing, copies it into the simulator's files directory, and executes the tests. Make sure to select a Python environment where the torch dependency is available.
