
coast_audio's Introduction

coast_audio


coast_audio is a high-performance audio processing library written in Dart.
This package aims to provide low-level audio functionality with no Flutter dependency.

Features

  • Audio Buffer Management
  • Ring Buffer Management
  • Audio Node Graph System
    • Decoder Node
    • Mixer Node
    • Function Node
    • Volume Node
  • Encoding and Decoding mechanism
    • Wav Audio Encoder/Decoder
  • Wave Generation
    • Sine
    • Triangle
    • Square
    • Sawtooth
  • Audio Device I/O
    • Playback
    • Capture
  • Audio Format Management and Conversion

Supported Platforms

  • Android: ✅
  • iOS: ✅
  • macOS: ✅
  • Windows: ⚠️ (not tested, but it should work if the native library is compiled correctly)
  • Linux: ✅
  • Web: ❌ (see the Q&A section)

How It Works

coast_audio is built on top of dart:ffi.
Thus, it can be used in any Dart environment that supports FFI.

Some of the functionality is implemented in native code based on miniaudio.
This repository contains pre-built binaries for each supported platform.
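To make the loading behavior concrete, here is a minimal standalone sketch of how dart:ffi packages commonly resolve the native binary name per platform. The names and the note about symbol keeping are assumptions inferred from this README, not coast_audio's actual loader code.

```dart
// Hedged sketch: per-platform resolution of a native library name, as
// dart:ffi packages typically do it. The exact names coast_audio uses
// may differ.
String nativeLibraryName(String os) {
  switch (os) {
    case 'android':
    case 'linux':
      return 'libcoast_audio.so';
    case 'windows':
      return 'coast_audio.dll';
    default:
      // iOS/macOS: symbols are typically linked into the process itself,
      // which is why the CoastAudioSymbolKeeper.keep() call described
      // below is needed to keep them from being stripped.
      return '';
  }
}

void main() {
  // With dart:ffi, an empty name corresponds to DynamicLibrary.process();
  // otherwise DynamicLibrary.open(name) is used.
  print(nativeLibraryName('linux')); // libcoast_audio.so
}
```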

Installation

Add the following to your pubspec.yaml:

dependencies:
  coast_audio: ^1.0.0

Android

  1. Create a new directory android/src/main/jniLibs in your project.
  2. Copy the {ABI}/libcoast_audio.so file from the native/prebuilt/android directory to the jniLibs directory.
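The two steps above can be sketched as shell commands. The ABI names and relative paths here are illustrative assumptions based on common Android project layouts; adjust them to your checkout.

```shell
# Copy each prebuilt ABI library into the app's jniLibs directory.
# ABI names and relative paths are illustrative; match them to your project.
for abi in arm64-v8a armeabi-v7a x86_64; do
  mkdir -p "android/src/main/jniLibs/$abi"
  cp "native/prebuilt/android/$abi/libcoast_audio.so" \
     "android/src/main/jniLibs/$abi/" 2>/dev/null || true
done
```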

macOS/iOS

Add the following to your Podfile:

target 'Runner' do
  ...
  pod 'CoastAudio', :git => 'https://github.com/SKKbySSK/coast_audio.git', :tag => '1.0.0'
end

Open the AppDelegate.swift file and add the following import and CoastAudioSymbolKeeper.keep() call:

import CoastAudio // 1. Add import

@UIApplicationMain
@objc class AppDelegate: FlutterAppDelegate {
  override func application(
    _ application: UIApplication,
    didFinishLaunchingWithOptions launchOptions: [UIApplication.LaunchOptionsKey: Any]?
  ) -> Bool {
    CoastAudioSymbolKeeper.keep() // 2. Add this line to prevent native symbols from being stripped (You can place this anywhere inside your iOS/macOS code)
    GeneratedPluginRegistrant.register(with: self)
    return super.application(application, didFinishLaunchingWithOptions: launchOptions)
  }
}

Linux

  1. Copy the {ARCH}/libcoast_audio.so file from the native/prebuilt/linux directory to the linux/libs directory in your project.
  2. Add the following to your linux/CMakeLists.txt:
install(FILES "linux/libs/${CMAKE_SYSTEM_PROCESSOR}/libcoast_audio.so" DESTINATION "${INSTALL_BUNDLE_LIB_DIR}" COMPONENT Runtime)

Windows

TODO

There is no pre-built native library for Windows yet.
You need to build it manually.

Usage

Generate Sine Wave

coast_audio provides various audio nodes to generate and process audio data easily.
The following example generates a 440 Hz sine wave using FunctionNode.

// define the audio format: 48kHz sample rate, mono, 32-bit float samples
// (float32 so that asFloat32ListView below matches the buffer contents).
final format = AudioFormat(sampleRate: 48000, channels: 1, sampleFormat: SampleFormat.float32);

// create a sine wave function node with a 440 Hz frequency.
final functionNode = FunctionNode(
  function: const SineFunction(),
  format: format,
  frequency: 440,
);

AllocatedAudioFrames(length: 1024, format: format).acquireBuffer((buffer) {
  // read the audio data from the function node to the buffer.
  functionNode.outputBus.read(buffer);

  // floatList contains sine wave audio data.
  final floatList = buffer.asFloat32ListView();
});
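For intuition, the standalone sketch below computes the same kind of samples a sine FunctionNode produces, using only the standard sine-oscillator formula and no coast_audio APIs. The buffer size and sample rate mirror the example above.

```dart
import 'dart:math';
import 'dart:typed_data';

// Hedged sketch of what a sine function node computes internally:
// sample[i] = sin(2 * pi * frequency * i / sampleRate), mono, float32.
Float32List generateSine({
  required int frames,
  required int sampleRate,
  required double frequency,
}) {
  final out = Float32List(frames);
  for (var i = 0; i < frames; i++) {
    out[i] = sin(2 * pi * frequency * i / sampleRate);
  }
  return out;
}

void main() {
  final wave = generateSine(frames: 1024, sampleRate: 48000, frequency: 440);
  // One full 440 Hz cycle spans 48000 / 440 ≈ 109.09 samples.
  print(wave.length); // 1024
  print(wave[0]); // 0.0 (sine starts at zero phase)
}
```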

Generate Wav File

You can use WavAudioEncoder to encode audio data.

// define the audio format: 48kHz sample rate, mono, 16-bit integer samples.
final format = AudioFormat(sampleRate: 48000, channels: 1, sampleFormat: SampleFormat.int16);

// create a sine wave function node with a 440 Hz frequency.
final functionNode = FunctionNode(
  function: const SineFunction(),
  format: format,
  frequency: 440,
);

final fileOutput = AudioFileDataSource(
  file: File('output.wav'),
  mode: FileMode.write,
);

// create a wav audio encoder.
final encoder = WavAudioEncoder(dataSource: fileOutput, inputFormat: format);

encoder.start();

final duration = AudioTime(10);
AllocatedAudioFrames(length: duration.computeFrames(format), format: format).acquireBuffer((buffer) {
  // read the audio data from the function node to the buffer.
  final result = functionNode.outputBus.read(buffer);

  // encode the audio data to the wav file.
  encoder.encode(buffer.limit(result.frameCount));
});

encoder.finalize();
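For background, here is a minimal sketch of the standard 44-byte PCM header that a wav encoder writes before the sample data. The layout follows the public RIFF/WAVE format, not coast_audio's internal implementation.

```dart
import 'dart:typed_data';

// Hedged sketch: the standard 44-byte RIFF/WAVE header for PCM audio.
// Field layout follows the public WAVE spec, not coast_audio internals.
Uint8List wavHeader({
  required int sampleRate,
  required int channels,
  required int bitsPerSample,
  required int dataBytes,
}) {
  final byteRate = sampleRate * channels * bitsPerSample ~/ 8;
  final blockAlign = channels * bitsPerSample ~/ 8;
  final h = ByteData(44);
  void str(int offset, String s) {
    for (var i = 0; i < s.length; i++) {
      h.setUint8(offset + i, s.codeUnitAt(i));
    }
  }
  str(0, 'RIFF');
  h.setUint32(4, 36 + dataBytes, Endian.little); // total file size - 8
  str(8, 'WAVE');
  str(12, 'fmt ');
  h.setUint32(16, 16, Endian.little); // fmt chunk size (16 for PCM)
  h.setUint16(20, 1, Endian.little); // audio format: 1 = PCM
  h.setUint16(22, channels, Endian.little);
  h.setUint32(24, sampleRate, Endian.little);
  h.setUint32(28, byteRate, Endian.little);
  h.setUint16(32, blockAlign, Endian.little);
  h.setUint16(34, bitsPerSample, Endian.little);
  str(36, 'data');
  h.setUint32(40, dataBytes, Endian.little);
  return h.buffer.asUint8List();
}

void main() {
  // 10 seconds of mono int16 audio at 48kHz, as in the example above:
  const dataBytes = 10 * 48000 * 1 * 2; // seconds * rate * channels * bytes
  final header = wavHeader(
    sampleRate: 48000,
    channels: 1,
    bitsPerSample: 16,
    dataBytes: dataBytes,
  );
  print(header.length); // 44
}
```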

Mixing Audio with Audio Node Graph System

AudioNode can be connected to other nodes to build an audio graph.
This example code demonstrates how to mix two sine waves and write to a wav file.

const format = AudioFormat(sampleRate: 48000, channels: 1);
final mixerNode = MixerNode(format: format);

// Initialize sine wave nodes and connect them to mixer's input
for (final freq in [264.0, 330.0, 396.0]) {
  final sineNode = FunctionNode(function: const SineFunction(), format: format, frequency: freq);
  final mixerInputBus = mixerNode.appendInputBus();
  sineNode.outputBus.connect(mixerInputBus);
}

AllocatedAudioFrames(length: 1024, format: format).acquireBuffer((buffer) {
  // read the mixed audio data from the mixer node to the buffer.
  mixerNode.outputBus.read(buffer);

  // floatList contains mixed sine wave audio data.
  final floatList = buffer.asFloat32ListView();
});
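Conceptually, mixing float samples is sample-wise summation across the connected inputs. The standalone sketch below shows the idea without any coast_audio APIs; the min/max clamp is an assumption, since MixerNode's exact overflow handling isn't documented here.

```dart
import 'dart:math';
import 'dart:typed_data';

// Hedged sketch: mix N float32 input buffers by summing them per sample,
// clamped to [-1.0, 1.0]. MixerNode's actual behavior may differ.
Float32List mixFrames(List<Float32List> inputs) {
  final out = Float32List(inputs.first.length);
  for (final input in inputs) {
    for (var i = 0; i < out.length; i++) {
      out[i] = max(-1.0, min(1.0, out[i] + input[i]));
    }
  }
  return out;
}

void main() {
  Float32List sine(double freq) => Float32List.fromList([
        for (var i = 0; i < 8; i++) sin(2 * pi * freq * i / 48000),
      ]);
  // The same three frequencies used in the graph example above.
  final mixed = mixFrames([sine(264), sine(330), sine(396)]);
  print(mixed.length); // 8
  print(mixed[0]); // 0.0 (all three sines start at zero phase)
}
```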

Play, record and loopback

coast_audio provides the AudioDevice class to handle audio device I/O.
Please see the example apps in the repository for details.

Q&A

Can I use this package in Flutter?

Yes, you can use coast_audio in Flutter.

Most coast_audio operations are synchronous and may block Flutter's main isolate.
It is therefore recommended to run coast_audio in a separate isolate.

Please see the example app implementation for more details.
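As a concrete pattern, Dart's Isolate.run (available since Dart 2.19) can host synchronous, CPU-bound audio work off the main isolate. The rendering function below is a plain-Dart stand-in for coast_audio calls, so the sketch stays self-contained.

```dart
import 'dart:isolate';
import 'dart:math';
import 'dart:typed_data';

// Plain-Dart stand-in for a synchronous coast_audio rendering job.
Float32List renderSine(int frames) {
  final out = Float32List(frames);
  for (var i = 0; i < frames; i++) {
    out[i] = sin(2 * pi * 440 * i / 48000);
  }
  return out;
}

Future<void> main() async {
  // Isolate.run spawns a fresh isolate, runs the closure there, and
  // sends the result back, so the main isolate is never blocked.
  final wave = await Isolate.run(() => renderSine(48000));
  print(wave.length); // 48000
}
```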

Can I use coast_audio in web?

In short, no.
You should use dart:web_audio instead.
It may become available if FFI is ever supported on the web platform.

coast_audio's People

Contributors

aveia, hpoul, SKKbySSK


coast_audio's Issues

Building 'example' app on x86_64 Windows system fails

Hello. I've been trying to build the new example application from the v1.0.0 release using Android Studio.

It worked quite nicely on an x86_64 Hackintosh system: I was able to build the application and test every feature on an Android Virtual Device. I then tried to do the same thing on Windows.

However, it did not work on Windows for some reason.

Gradle fails during the build process; here's the error:

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':app:mergeDebugJniLibFolders'.
> Cannot invoke "java.io.File.equals(Object)" because "current" is null

* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.
> Run with --scan to get full insights.

* Get more help at https://help.gradle.org

BUILD FAILED in 12s
Error: Gradle task assembleDebug failed with exit code 1

Here's the full stack trace -

> Task :app:mergeDebugJniLibFolders FAILED
Execution failed for task ':app:mergeDebugJniLibFolders'.
> Cannot invoke "java.io.File.equals(Object)" because "current" is null

* Try:
> Run with --info or --debug option to get more log output.
> Run with --scan to get full insights.

* Exception is:
org.gradle.api.tasks.TaskExecutionException: Execution failed for task ':app:mergeDebugJniLibFolders'.
	at org.gradle.api.internal.tasks.execution.ExecuteActionsTaskExecuter.lambda$executeIfValid$1(ExecuteActionsTaskExecuter.java:142)
	at org.gradle.internal.Try$Failure.ifSuccessfulOrElse(Try.java:282)
	at org.gradle.api.internal.tasks.execution.ExecuteActionsTaskExecuter.executeIfValid(ExecuteActionsTaskExecuter.java:140)
	at org.gradle.api.internal.tasks.execution.ExecuteActionsTaskExecuter.execute(ExecuteActionsTaskExecuter.java:128)
	at org.gradle.api.internal.tasks.execution.CleanupStaleOutputsExecuter.execute(CleanupStaleOutputsExecuter.java:77)
	at org.gradle.api.internal.tasks.execution.FinalizePropertiesTaskExecuter.execute(FinalizePropertiesTaskExecuter.java:46)
	at org.gradle.api.internal.tasks.execution.ResolveTaskExecutionModeExecuter.execute(ResolveTaskExecutionModeExecuter.java:51)
	at org.gradle.api.internal.tasks.execution.SkipTaskWithNoActionsExecuter.execute(SkipTaskWithNoActionsExecuter.java:57)
	at org.gradle.api.internal.tasks.execution.SkipOnlyIfTaskExecuter.execute(SkipOnlyIfTaskExecuter.java:57)
	at org.gradle.api.internal.tasks.execution.CatchExceptionTaskExecuter.execute(CatchExceptionTaskExecuter.java:36)
	at org.gradle.api.internal.tasks.execution.EventFiringTaskExecuter$1.executeTask(EventFiringTaskExecuter.java:77)
	at org.gradle.api.internal.tasks.execution.EventFiringTaskExecuter$1.call(EventFiringTaskExecuter.java:55)
	at org.gradle.api.internal.tasks.execution.EventFiringTaskExecuter$1.call(EventFiringTaskExecuter.java:52)
	at org.gradle.internal.operations.DefaultBuildOperationRunner$CallableBuildOperationWorker.execute(DefaultBuildOperationRunner.java:204)
	at org.gradle.internal.operations.DefaultBuildOperationRunner$CallableBuildOperationWorker.execute(DefaultBuildOperationRunner.java:199)
	at org.gradle.internal.operations.DefaultBuildOperationRunner$2.execute(DefaultBuildOperationRunner.java:66)
	at org.gradle.internal.operations.DefaultBuildOperationRunner$2.execute(DefaultBuildOperationRunner.java:59)
	at org.gradle.internal.operations.DefaultBuildOperationRunner.execute(DefaultBuildOperationRunner.java:157)
	at org.gradle.internal.operations.DefaultBuildOperationRunner.execute(DefaultBuildOperationRunner.java:59)
	at org.gradle.internal.operations.DefaultBuildOperationRunner.call(DefaultBuildOperationRunner.java:53)
	at org.gradle.internal.operations.DefaultBuildOperationExecutor.call(DefaultBuildOperationExecutor.java:73)
	at org.gradle.api.internal.tasks.execution.EventFiringTaskExecuter.execute(EventFiringTaskExecuter.java:52)
	at org.gradle.execution.plan.LocalTaskNodeExecutor.execute(LocalTaskNodeExecutor.java:69)
	at org.gradle.execution.taskgraph.DefaultTaskExecutionGraph$InvokeNodeExecutorsAction.execute(DefaultTaskExecutionGraph.java:322)
	at org.gradle.execution.taskgraph.DefaultTaskExecutionGraph$InvokeNodeExecutorsAction.execute(DefaultTaskExecutionGraph.java:309)
	at org.gradle.execution.taskgraph.DefaultTaskExecutionGraph$BuildOperationAwareExecutionAction.execute(DefaultTaskExecutionGraph.java:302)
	at org.gradle.execution.taskgraph.DefaultTaskExecutionGraph$BuildOperationAwareExecutionAction.execute(DefaultTaskExecutionGraph.java:288)
	at org.gradle.execution.plan.DefaultPlanExecutor$ExecutorWorker.execute(DefaultPlanExecutor.java:462)
	at org.gradle.execution.plan.DefaultPlanExecutor$ExecutorWorker.run(DefaultPlanExecutor.java:379)
	at org.gradle.internal.concurrent.ExecutorPolicy$CatchAndRecordFailures.onExecute(ExecutorPolicy.java:64)
	at org.gradle.internal.concurrent.ManagedExecutorImpl$1.run(ManagedExecutorImpl.java:49)
Caused by: java.lang.NullPointerException: Cannot invoke "java.io.File.equals(Object)" because "current" is null
	at com.android.ide.common.resources.AssetItem.computePath(AssetItem.java:58)
	at com.android.ide.common.resources.AssetItem.computePath(AssetItem.java:62)
	at com.android.ide.common.resources.AssetItem.computePath(AssetItem.java:62)
	at com.android.ide.common.resources.AssetItem.computePath(AssetItem.java:62)
	at com.android.ide.common.resources.AssetItem.computePath(AssetItem.java:62)
	at com.android.ide.common.resources.AssetItem.computePath(AssetItem.java:62)
	at com.android.ide.common.resources.AssetItem.computePath(AssetItem.java:62)
	at com.android.ide.common.resources.AssetItem.computePath(AssetItem.java:62)
	at com.android.ide.common.resources.AssetItem.computePath(AssetItem.java:62)
	at com.android.ide.common.resources.AssetItem.computePath(AssetItem.java:62)
	at com.android.ide.common.resources.AssetItem.computePath(AssetItem.java:62)
	at com.android.ide.common.resources.AssetItem.create(AssetItem.java:47)
	at com.android.ide.common.resources.AssetSet.createFileAndItems(AssetSet.java:52)
	at com.android.ide.common.resources.AssetSet.createFileAndItems(AssetSet.java:28)
	at com.android.ide.common.resources.DataSet.handleNewFile(DataSet.java:550)
	at com.android.ide.common.resources.DataSet.loadFile(DataSet.java:282)
	at com.android.ide.common.resources.DataSet.loadFromFiles(DataSet.java:261)
	at com.android.build.gradle.tasks.MergeSourceSetFolders.doFullTaskAction(MergeSourceSetFolders.kt:150)
	at com.android.build.gradle.tasks.MergeSourceSetFolders.doTaskAction(MergeSourceSetFolders.kt:119)
	at com.android.build.gradle.internal.tasks.NewIncrementalTask$taskAction$$inlined$recordTaskAction$1.invoke(BaseTask.kt:69)
	at com.android.build.gradle.internal.tasks.Blocks.recordSpan(Blocks.java:51)
	at com.android.build.gradle.internal.tasks.NewIncrementalTask.taskAction(NewIncrementalTask.kt:46)
	at jdk.internal.reflect.GeneratedMethodAccessor2192.invoke(Unknown Source)
	at java.base/jdk.internal.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at org.gradle.internal.reflect.JavaMethod.invoke(JavaMethod.java:125)
	at org.gradle.api.internal.project.taskfactory.IncrementalInputsTaskAction.doExecute(IncrementalInputsTaskAction.java:32)
	at org.gradle.api.internal.project.taskfactory.StandardTaskAction.execute(StandardTaskAction.java:51)
	at org.gradle.api.internal.project.taskfactory.AbstractIncrementalTaskAction.execute(AbstractIncrementalTaskAction.java:25)
	at org.gradle.api.internal.project.taskfactory.StandardTaskAction.execute(StandardTaskAction.java:29)
	at org.gradle.api.internal.tasks.execution.TaskExecution$3.run(TaskExecution.java:236)
	at org.gradle.internal.operations.DefaultBuildOperationRunner$1.execute(DefaultBuildOperationRunner.java:29)
	at org.gradle.internal.operations.DefaultBuildOperationRunner$1.execute(DefaultBuildOperationRunner.java:26)
	at org.gradle.internal.operations.DefaultBuildOperationRunner$2.execute(DefaultBuildOperationRunner.java:66)
	at org.gradle.internal.operations.DefaultBuildOperationRunner$2.execute(DefaultBuildOperationRunner.java:59)
	at org.gradle.internal.operations.DefaultBuildOperationRunner.execute(DefaultBuildOperationRunner.java:157)
	at org.gradle.internal.operations.DefaultBuildOperationRunner.execute(DefaultBuildOperationRunner.java:59)
	at org.gradle.internal.operations.DefaultBuildOperationRunner.run(DefaultBuildOperationRunner.java:47)
	at org.gradle.internal.operations.DefaultBuildOperationExecutor.run(DefaultBuildOperationExecutor.java:68)
	at org.gradle.api.internal.tasks.execution.TaskExecution.executeAction(TaskExecution.java:221)
	at org.gradle.api.internal.tasks.execution.TaskExecution.executeActions(TaskExecution.java:204)
	at org.gradle.api.internal.tasks.execution.TaskExecution.executeWithPreviousOutputFiles(TaskExecution.java:187)
	at org.gradle.api.internal.tasks.execution.TaskExecution.execute(TaskExecution.java:165)
	at org.gradle.internal.execution.steps.ExecuteStep.executeInternal(ExecuteStep.java:89)
	at org.gradle.internal.execution.steps.ExecuteStep.access$000(ExecuteStep.java:40)
	at org.gradle.internal.execution.steps.ExecuteStep$1.call(ExecuteStep.java:53)
	at org.gradle.internal.execution.steps.ExecuteStep$1.call(ExecuteStep.java:50)
	at org.gradle.internal.operations.DefaultBuildOperationRunner$CallableBuildOperationWorker.execute(DefaultBuildOperationRunner.java:204)
	at org.gradle.internal.operations.DefaultBuildOperationRunner$CallableBuildOperationWorker.execute(DefaultBuildOperationRunner.java:199)
	at org.gradle.internal.operations.DefaultBuildOperationRunner$2.execute(DefaultBuildOperationRunner.java:66)
	at org.gradle.internal.operations.DefaultBuildOperationRunner$2.execute(DefaultBuildOperationRunner.java:59)
	at org.gradle.internal.operations.DefaultBuildOperationRunner.execute(DefaultBuildOperationRunner.java:157)
	at org.gradle.internal.operations.DefaultBuildOperationRunner.execute(DefaultBuildOperationRunner.java:59)
	at org.gradle.internal.operations.DefaultBuildOperationRunner.call(DefaultBuildOperationRunner.java:53)
	at org.gradle.internal.operations.DefaultBuildOperationExecutor.call(DefaultBuildOperationExecutor.java:73)
	at org.gradle.internal.execution.steps.ExecuteStep.execute(ExecuteStep.java:50)
	at org.gradle.internal.execution.steps.ExecuteStep.execute(ExecuteStep.java:40)
	at org.gradle.internal.execution.steps.RemovePreviousOutputsStep.execute(RemovePreviousOutputsStep.java:68)
	at org.gradle.internal.execution.steps.RemovePreviousOutputsStep.execute(RemovePreviousOutputsStep.java:38)
	at org.gradle.internal.execution.steps.CancelExecutionStep.execute(CancelExecutionStep.java:41)
	at org.gradle.internal.execution.steps.TimeoutStep.executeWithoutTimeout(TimeoutStep.java:74)
	at org.gradle.internal.execution.steps.TimeoutStep.execute(TimeoutStep.java:55)
	at org.gradle.internal.execution.steps.CreateOutputsStep.execute(CreateOutputsStep.java:51)
	at org.gradle.internal.execution.steps.CreateOutputsStep.execute(CreateOutputsStep.java:29)
	at org.gradle.internal.execution.steps.CaptureStateAfterExecutionStep.executeDelegateBroadcastingChanges(CaptureStateAfterExecutionStep.java:124)
	at org.gradle.internal.execution.steps.CaptureStateAfterExecutionStep.execute(CaptureStateAfterExecutionStep.java:80)
	at org.gradle.internal.execution.steps.CaptureStateAfterExecutionStep.execute(CaptureStateAfterExecutionStep.java:58)
	at org.gradle.internal.execution.steps.ResolveInputChangesStep.execute(ResolveInputChangesStep.java:48)
	at org.gradle.internal.execution.steps.ResolveInputChangesStep.execute(ResolveInputChangesStep.java:36)
	at org.gradle.internal.execution.steps.BuildCacheStep.executeWithoutCache(BuildCacheStep.java:181)
	at org.gradle.internal.execution.steps.BuildCacheStep.lambda$execute$1(BuildCacheStep.java:71)
	at org.gradle.internal.Either$Right.fold(Either.java:175)
	at org.gradle.internal.execution.caching.CachingState.fold(CachingState.java:59)
	at org.gradle.internal.execution.steps.BuildCacheStep.execute(BuildCacheStep.java:69)
	at org.gradle.internal.execution.steps.BuildCacheStep.execute(BuildCacheStep.java:47)
	at org.gradle.internal.execution.steps.StoreExecutionStateStep.execute(StoreExecutionStateStep.java:36)
	at org.gradle.internal.execution.steps.StoreExecutionStateStep.execute(StoreExecutionStateStep.java:25)
	at org.gradle.internal.execution.steps.RecordOutputsStep.execute(RecordOutputsStep.java:36)
	at org.gradle.internal.execution.steps.RecordOutputsStep.execute(RecordOutputsStep.java:22)
	at org.gradle.internal.execution.steps.SkipUpToDateStep.executeBecause(SkipUpToDateStep.java:110)
	at org.gradle.internal.execution.steps.SkipUpToDateStep.lambda$execute$2(SkipUpToDateStep.java:56)
	at org.gradle.internal.execution.steps.SkipUpToDateStep.execute(SkipUpToDateStep.java:56)
	at org.gradle.internal.execution.steps.SkipUpToDateStep.execute(SkipUpToDateStep.java:38)
	at org.gradle.internal.execution.steps.ResolveChangesStep.execute(ResolveChangesStep.java:73)
	at org.gradle.internal.execution.steps.ResolveChangesStep.execute(ResolveChangesStep.java:44)
	at org.gradle.internal.execution.steps.legacy.MarkSnapshottingInputsFinishedStep.execute(MarkSnapshottingInputsFinishedStep.java:37)
	at org.gradle.internal.execution.steps.legacy.MarkSnapshottingInputsFinishedStep.execute(MarkSnapshottingInputsFinishedStep.java:27)
	at org.gradle.internal.execution.steps.ResolveCachingStateStep.execute(ResolveCachingStateStep.java:89)
	at org.gradle.internal.execution.steps.ResolveCachingStateStep.execute(ResolveCachingStateStep.java:50)
	at org.gradle.internal.execution.steps.ValidateStep.execute(ValidateStep.java:102)
	at org.gradle.internal.execution.steps.ValidateStep.execute(ValidateStep.java:57)
	at org.gradle.internal.execution.steps.CaptureStateBeforeExecutionStep.execute(CaptureStateBeforeExecutionStep.java:76)
	at org.gradle.internal.execution.steps.CaptureStateBeforeExecutionStep.execute(CaptureStateBeforeExecutionStep.java:50)
	at org.gradle.internal.execution.steps.SkipEmptyWorkStep.executeWithNoEmptySources(SkipEmptyWorkStep.java:254)
	at org.gradle.internal.execution.steps.SkipEmptyWorkStep.execute(SkipEmptyWorkStep.java:91)
	at org.gradle.internal.execution.steps.SkipEmptyWorkStep.execute(SkipEmptyWorkStep.java:56)
	at org.gradle.internal.execution.steps.RemoveUntrackedExecutionStateStep.execute(RemoveUntrackedExecutionStateStep.java:32)
	at org.gradle.internal.execution.steps.RemoveUntrackedExecutionStateStep.execute(RemoveUntrackedExecutionStateStep.java:21)
	at org.gradle.internal.execution.steps.legacy.MarkSnapshottingInputsStartedStep.execute(MarkSnapshottingInputsStartedStep.java:38)
	at org.gradle.internal.execution.steps.LoadPreviousExecutionStateStep.execute(LoadPreviousExecutionStateStep.java:43)
	at org.gradle.internal.execution.steps.LoadPreviousExecutionStateStep.execute(LoadPreviousExecutionStateStep.java:31)
	at org.gradle.internal.execution.steps.AssignWorkspaceStep.lambda$execute$0(AssignWorkspaceStep.java:40)
	at org.gradle.api.internal.tasks.execution.TaskExecution$4.withWorkspace(TaskExecution.java:281)
	at org.gradle.internal.execution.steps.AssignWorkspaceStep.execute(AssignWorkspaceStep.java:40)
	at org.gradle.internal.execution.steps.AssignWorkspaceStep.execute(AssignWorkspaceStep.java:30)
	at org.gradle.internal.execution.steps.IdentityCacheStep.execute(IdentityCacheStep.java:37)
	at org.gradle.internal.execution.steps.IdentityCacheStep.execute(IdentityCacheStep.java:27)
	at org.gradle.internal.execution.steps.IdentifyStep.execute(IdentifyStep.java:44)
	at org.gradle.internal.execution.steps.IdentifyStep.execute(IdentifyStep.java:33)
	at org.gradle.internal.execution.impl.DefaultExecutionEngine$1.execute(DefaultExecutionEngine.java:76)
	at org.gradle.api.internal.tasks.execution.ExecuteActionsTaskExecuter.executeIfValid(ExecuteActionsTaskExecuter.java:139)
	at org.gradle.api.internal.tasks.execution.ExecuteActionsTaskExecuter.execute(ExecuteActionsTaskExecuter.java:128)
	at org.gradle.api.internal.tasks.execution.CleanupStaleOutputsExecuter.execute(CleanupStaleOutputsExecuter.java:77)
	at org.gradle.api.internal.tasks.execution.FinalizePropertiesTaskExecuter.execute(FinalizePropertiesTaskExecuter.java:46)
	at org.gradle.api.internal.tasks.execution.ResolveTaskExecutionModeExecuter.execute(ResolveTaskExecutionModeExecuter.java:51)
	at org.gradle.api.internal.tasks.execution.SkipTaskWithNoActionsExecuter.execute(SkipTaskWithNoActionsExecuter.java:57)
	at org.gradle.api.internal.tasks.execution.SkipOnlyIfTaskExecuter.execute(SkipOnlyIfTaskExecuter.java:57)
	at org.gradle.api.internal.tasks.execution.CatchExceptionTaskExecuter.execute(CatchExceptionTaskExecuter.java:36)
	at org.gradle.api.internal.tasks.execution.EventFiringTaskExecuter$1.executeTask(EventFiringTaskExecuter.java:77)
	at org.gradle.api.internal.tasks.execution.EventFiringTaskExecuter$1.call(EventFiringTaskExecuter.java:55)
	at org.gradle.api.internal.tasks.execution.EventFiringTaskExecuter$1.call(EventFiringTaskExecuter.java:52)
	at org.gradle.internal.operations.DefaultBuildOperationRunner$CallableBuildOperationWorker.execute(DefaultBuildOperationRunner.java:204)
	at org.gradle.internal.operations.DefaultBuildOperationRunner$CallableBuildOperationWorker.execute(DefaultBuildOperationRunner.java:199)
	at org.gradle.internal.operations.DefaultBuildOperationRunner$2.execute(DefaultBuildOperationRunner.java:66)
	at org.gradle.internal.operations.DefaultBuildOperationRunner$2.execute(DefaultBuildOperationRunner.java:59)
	at org.gradle.internal.operations.DefaultBuildOperationRunner.execute(DefaultBuildOperationRunner.java:157)
	at org.gradle.internal.operations.DefaultBuildOperationRunner.execute(DefaultBuildOperationRunner.java:59)
	at org.gradle.internal.operations.DefaultBuildOperationRunner.call(DefaultBuildOperationRunner.java:53)
	at org.gradle.internal.operations.DefaultBuildOperationExecutor.call(DefaultBuildOperationExecutor.java:73)
	at org.gradle.api.internal.tasks.execution.EventFiringTaskExecuter.execute(EventFiringTaskExecuter.java:52)
	at org.gradle.execution.plan.LocalTaskNodeExecutor.execute(LocalTaskNodeExecutor.java:69)
	at org.gradle.execution.taskgraph.DefaultTaskExecutionGraph$InvokeNodeExecutorsAction.execute(DefaultTaskExecutionGraph.java:322)
	at org.gradle.execution.taskgraph.DefaultTaskExecutionGraph$InvokeNodeExecutorsAction.execute(DefaultTaskExecutionGraph.java:309)
	at org.gradle.execution.taskgraph.DefaultTaskExecutionGraph$BuildOperationAwareExecutionAction.execute(DefaultTaskExecutionGraph.java:302)
	at org.gradle.execution.taskgraph.DefaultTaskExecutionGraph$BuildOperationAwareExecutionAction.execute(DefaultTaskExecutionGraph.java:288)
	at org.gradle.execution.plan.DefaultPlanExecutor$ExecutorWorker.execute(DefaultPlanExecutor.java:462)
	at org.gradle.execution.plan.DefaultPlanExecutor$ExecutorWorker.run(DefaultPlanExecutor.java:379)
	at org.gradle.internal.concurrent.ExecutorPolicy$CatchAndRecordFailures.onExecute(ExecutorPolicy.java:64)
	at org.gradle.internal.concurrent.ManagedExecutorImpl$1.run(ManagedExecutorImpl.java:49)

I tried the same thing on 2 Windows devices and the result was same.

Here's what I tried -
I added following line to /coast_audio/example/android/app/build.gradle.

// BEFORE
sourceSets {
    main.java.srcDirs += 'src/main/kotlin'
}

// AFTER
sourceSets {
    // I tried to specify the library location
    main.jniLibs.srcDirs = ['../../../../../native/prebuilt/android']

    main.java.srcDirs += 'src/main/kotlin'
}

This got rid of the Gradle error I mentioned earlier, and the project built successfully.
However, the application was then unable to load the library, with the following error.

When I try to initialize coast_audio by pressing the floating button on the 'Select Backend' page, this error occurs:

======== Exception caught by gesture ===============================================================
The following ArgumentError was thrown while handling a gesture:
Invalid argument(s): Failed to load dynamic library 'libcoast_audio.so': dlopen failed: library "libcoast_audio.so" not found

When the exception was thrown, this was the stack: 
#0      _open (dart:ffi-patch/ffi_dynamic_library_patch.dart:11:43)
#1      new DynamicLibrary.open (dart:ffi-patch/ffi_dynamic_library_patch.dart:22:12)
#2      CoastAudioNative.initialize (package:coast_audio/src/interop/helper/coast_audio_native.dart:35:28)
#3      CoastAudioNative.bindings (package:coast_audio/src/interop/helper/coast_audio_native.dart:16:35)
#4      CoastAudioInterop.bindings (package:coast_audio/src/interop/helper/coast_audio_interop.dart:34:29)
#5      new CaLog (package:coast_audio/src/interop/ca_log.dart:31:14)
#6      AudioDeviceContext._log (package:coast_audio/src/device/audio_device.dart:21:21)
#7      AudioDeviceContext._log (package:coast_audio/src/device/audio_device.dart)
#8      new AudioDeviceContext (package:coast_audio/src/device/audio_device.dart:9:52)
#9      _BackendPageState.build.<anonymous closure> (package:example/pages/backend_page.dart:84:35)
#10     _InkResponseState.handleTap (package:flutter/src/material/ink_well.dart:1183:21)
#11     GestureRecognizer.invokeCallback (package:flutter/src/gestures/recognizer.dart:315:24)
#12     TapGestureRecognizer.handleTapUp (package:flutter/src/gestures/tap.dart:652:11)
#13     BaseTapGestureRecognizer._checkUp (package:flutter/src/gestures/tap.dart:309:5)
#14     BaseTapGestureRecognizer.handlePrimaryPointer (package:flutter/src/gestures/tap.dart:242:7)
#15     PrimaryPointerGestureRecognizer.handleEvent (package:flutter/src/gestures/recognizer.dart:670:9)
#16     PointerRouter._dispatch (package:flutter/src/gestures/pointer_router.dart:98:12)
#17     PointerRouter._dispatchEventToRoutes.<anonymous closure> (package:flutter/src/gestures/pointer_router.dart:143:9)
#18     _LinkedHashMapMixin.forEach (dart:collection-patch/compact_hash.dart:633:13)
#19     PointerRouter._dispatchEventToRoutes (package:flutter/src/gestures/pointer_router.dart:141:18)
#20     PointerRouter.route (package:flutter/src/gestures/pointer_router.dart:127:7)
#21     GestureBinding.handleEvent (package:flutter/src/gestures/binding.dart:495:19)
#22     GestureBinding.dispatchEvent (package:flutter/src/gestures/binding.dart:475:22)
#23     RendererBinding.dispatchEvent (package:flutter/src/rendering/binding.dart:430:11)
#24     GestureBinding._handlePointerEventImmediately (package:flutter/src/gestures/binding.dart:420:7)
#25     GestureBinding.handlePointerEvent (package:flutter/src/gestures/binding.dart:383:5)
#26     GestureBinding._flushPointerEventQueue (package:flutter/src/gestures/binding.dart:330:7)
#27     GestureBinding._handlePointerDataPacket (package:flutter/src/gestures/binding.dart:299:9)
#28     _invoke1 (dart:ui/hooks.dart:328:13)
#29     PlatformDispatcher._dispatchPointerDataPacket (dart:ui/platform_dispatcher.dart:429:7)
#30     _dispatchPointerDataPacket (dart:ui/hooks.dart:262:31)
Handler: "onTap"
Recognizer: TapGestureRecognizer#30c16
  debugOwner: GestureDetector
  state: possible
  won arena
  finalPosition: Offset(792.0, 665.9)
  finalLocalPosition: Offset(22.8, 36.9)
  button: 1
  sent tap down
====================================================================================================

I tried the 'flutter clean' command, installing a fresh AVD, and invalidating (removing) caches from Android Studio,
but I wasn't able to get any further, so I'd like to ask for your guidance.

Here's my 'flutter doctor' summary if you're curious -

Doctor summary (to see all details, run flutter doctor -v):
[√] Flutter (Channel stable, 3.19.5, on Microsoft Windows [Version 10.0.22635.3430], locale ko-KR)
[√] Windows Version (Installed version of Windows is version 10 or higher)
[√] Android toolchain - develop for Android devices (Android SDK version 34.0.0)
[X] Chrome - develop for the web (Cannot find Chrome executable at .\Google\Chrome\Application\chrome.exe)
    ! Cannot find Chrome. Try setting CHROME_EXECUTABLE to a Chrome executable.
[X] Visual Studio - develop Windows apps
    X Visual Studio not installed; this is necessary to develop Windows apps.
      Download at https://visualstudio.microsoft.com/downloads/.
      Please install the "Desktop development with C++" workload, including all of its default components
[√] Android Studio (version 2023.2)
[√] VS Code (version 1.88.0)
[√] Connected device (3 available)
[√] Network resources

! Doctor found issues in 2 categories.

Thank you in advance!

FX on Audio Stream

Hi, this package seems interesting.
Low-level audio processing is missing in Flutter.
I'm thinking of an app that can listen to incoming audio and process effects on the stream.
Is a latency of 128 or 256 samples possible?

version solving failed in examples/audio_recorder

When I run the app in examples/audio_recorder, I get this error:

Because every version of coast_audio_fft from path depends on coast_audio from path and every version of flutter_coast_audio_miniaudio from path depends on coast_audio from hosted, coast_audio_fft from
  path is incompatible with flutter_coast_audio_miniaudio from path.
So, because audio_recorder depends on both flutter_coast_audio_miniaudio from path and coast_audio_fft from path, version solving failed.
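One common workaround for this kind of mixed path/hosted dependency conflict is a dependency_overrides section in the example's pubspec.yaml, which forces every package to resolve the same copy of coast_audio. Whether it fixes this particular repo depends on its actual layout; the path below is purely illustrative.

```yaml
# Hedged sketch: force a single path-based coast_audio for all packages.
# The relative path is an illustrative assumption, not the repo's layout.
dependency_overrides:
  coast_audio:
    path: ../../packages/coast_audio
```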

Pitch recognition

I'm going to create an ear training app, and I need to recognize the pitch the user sings in real time. Is that possible with your package?

Question: Filtering Audio with miniaudio's Built-in Filters?

Hi. I've been experimenting with how to implement a 'Parametric EQ' or 'Peaking EQ' filter in a Flutter project.

After I found this package, I tried to make a parametric EQ node, but the filtered output audio was noticeably noisy.

I was then curious whether I could use miniaudio's built-in peaking EQ filter via the 'coast_audio_miniaudio' package as the 'easier way'.

Is it possible to filter audio with miniaudio's built-in filters?

P.S. The code below is the 'Parametric EQ Node' I tried to make. I referenced several pieces of code within this package, but I'm not sure whether it was done right.

I tested the code with the provided 'Music Player' example by connecting the Parametric EQ Node both before and after the FFT Node.
The sound output was noticeably noisy on both an Android emulator and a physical Android device running Android 12.

import 'dart:math';

import 'package:coast_audio/coast_audio.dart';

class ParametricEQNode extends AutoFormatSingleInoutNode with ProcessorNodeMixin, BypassNodeMixin {
  ParametricEQNode({
    required this.format,
    required this.centerFreq,
    required this.dbGain,
    required this.qFactor,
  })  : _x1 = List.filled(format.channels, 0),
        _x2 = List.filled(format.channels, 0),
        _y1 = List.filled(format.channels, 0),
        _y2 = List.filled(format.channels, 0);

  final AudioFormat format;
  double centerFreq;
  double dbGain;
  double qFactor;

  // Filter state, kept per channel and persisted across process() calls.
  // Sharing one state across interleaved channels (or resetting it on
  // every buffer) causes audible noise and clicks at buffer boundaries.
  final List<double> _x1, _x2, _y1, _y2;

  @override
  List<SampleFormat> get supportedSampleFormats => const [SampleFormat.float32];

  @override
  int process(AudioBuffer buffer) {
    final inputData = buffer.asFloat32ListView();

    // The following coefficients and transfer functions are adapted from
    // the Audio EQ Cookbook: https://www.w3.org/TR/audio-eq-cookbook/
    final w0 = 2 * pi * centerFreq / format.sampleRate;
    final alpha = sin(w0) / (2 * qFactor);
    final A = pow(10.0, dbGain / 40).toDouble();

    final b0 = 1 + (alpha * A);
    final b1 = -2 * cos(w0);
    final b2 = 1 - (alpha * A);
    final a0 = 1 + (alpha / A);
    final a1 = b1;
    final a2 = 1 - (alpha / A);

    // Process each interleaved sample with its own channel's filter state.
    for (var frame = 0; frame < buffer.sizeInFrames; frame++) {
      for (var channel = 0; channel < format.channels; channel++) {
        final index = (frame * format.channels) + channel;
        final x = inputData[index];

        // Direct Form I difference equation.
        final y = (b0 / a0) * x +
            (b1 / a0) * _x1[channel] +
            (b2 / a0) * _x2[channel] -
            (a1 / a0) * _y1[channel] -
            (a2 / a0) * _y2[channel];

        // Update this channel's state variables.
        _x2[channel] = _x1[channel];
        _x1[channel] = x;
        _y2[channel] = _y1[channel];
        _y1[channel] = y;

        inputData[index] = y;
      }
    }

    return buffer.sizeInFrames;
  }
}
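As a sanity check on the coefficient math above (a language-neutral sketch using the same Audio EQ Cookbook formulas, not coast_audio code): with dbGain = 0 the peaking filter reduces to a pass-through, because A = 1 makes the b and a coefficients identical, so a correct implementation should leave the signal essentially untouched at 0 dB:

```python
import math

# Peaking EQ biquad coefficients from the Audio EQ Cookbook
# (https://www.w3.org/TR/audio-eq-cookbook/).
def peaking_coeffs(center_freq, db_gain, q, sample_rate):
    w0 = 2 * math.pi * center_freq / sample_rate
    alpha = math.sin(w0) / (2 * q)
    A = 10.0 ** (db_gain / 40)
    b = [1 + alpha * A, -2 * math.cos(w0), 1 - alpha * A]
    a = [1 + alpha / A, -2 * math.cos(w0), 1 - alpha / A]
    return b, a

def biquad(samples, b, a):
    # Direct Form I with state persisted across the whole signal.
    x1 = x2 = y1 = y2 = 0.0
    out = []
    for x in samples:
        y = (b[0] * x + b[1] * x1 + b[2] * x2 - a[1] * y1 - a[2] * y2) / a[0]
        x2, x1, y2, y1 = x1, x, y1, y
        out.append(y)
    return out

sig = [math.sin(2 * math.pi * 440 * n / 48000) for n in range(256)]
b, a = peaking_coeffs(1000, 0.0, 1.0, 48000)
flat = biquad(sig, b, a)
# With 0 dB gain the output should match the input to numerical precision.
print(max(abs(x - y) for x, y in zip(sig, flat)))
```

If this invariant holds but the Dart node is still noisy, the problem is almost certainly the filter state handling (resetting per buffer, or sharing state across channels) rather than the coefficients.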

Possible to record from audio output?

I have a Flutter app that is capable of playing multiple audio files concurrently (picture something like a piano). I would like to be able to add a record/playback option so the user could listen to what they just played after pressing a record button.

Most Flutter libraries only seem to offer recording via the microphone (not the audio output). Would this library allow recording the user-generated audio output instead of the microphone input?

Can you add Linux support?

Great work!

I'm looking for a flutter package that handles audio buffers in Linux.
Can you add Linux support?

The method 'setString' isn't defined for the class 'Array<Char>'.

I'm having an issue importing flutter_coast_audio_miniaudio.

I'm using FlutterFlow (yes, yes, I know), and after scouring the docs and lib files I ended up with this:

Future<double> getAverageFrequency(String filePath) async {
 // Read audio data from the file
 final memory = Memory();
 final audioData = await readAudioFile(filePath, memory);

 // Perform FFT on the audio data
 final fft = FFT(audioData.length);
 final complexArray = fft.realFft(audioData);

 // Calculate the power spectrum
 final powerSpectrum = complexArray.map((c) => c.abs() * c.abs()).toList();

 // Find the maximum power in the spectrum
 Float64x2 maxPower = Float64x2.zero();
 int maxPowerIndex = 0;
 for (int i = 0; i < powerSpectrum.length; i++) {
   if (powerSpectrum[i].x > maxPower.x) {
     maxPower = powerSpectrum[i];
     maxPowerIndex = i;
   }
 }

 // Calculate the frequency corresponding to the maximum power index
 // Assuming you know the sample rate from the ID3 tag or other source
 final sampleRate = 48000;
 final maxFrequency = maxPowerIndex * (sampleRate / 2) / powerSpectrum.length;

 //memory.zeroMemory();
 return maxFrequency;
}

Future<List<double>> readAudioFile(String filePath, Memory memory) async {
 // Open the audio file
 final file = File(filePath);

 try {
   // Create an AudioFileDataSource
   final dataSource = AudioFileDataSource(file: file, mode: FileMode.read);

   // Determine the output format (assuming you know the format)
   final outputFormat =
       AudioFormat(sampleRate: 44100, channels: 1); // Adjust as needed

   // Create a buffer size
   final bufferSize = 4096;

   // Use ffi to allocate memory for the buffer
   final pBuffer = memory.allocate<ffi.Uint8>(bufferSize);

   // Create a MabAudioDecoder
   final decoder =
       MabAudioDecoder(dataSource: dataSource, outputFormat: outputFormat);

   final audioData = <double>[];

   // Decode audio data in chunks
   while (true) {
     final decodeResult = await decoder.decode(
       destination: AudioBuffer(
         pBuffer: pBuffer,
         sizeInBytes: bufferSize,
         sizeInFrames: bufferSize ~/ outputFormat.channels,
         format: outputFormat,
         memory: memory, // Pass the memory instance
       ),
     );

     // Check if decoding is finished
     if (decodeResult.isEnd) {
       break;
     }

     // Convert the decoded data in memory to a Dart List<double>
     for (var i = 0; i < decodeResult.frames * outputFormat.channels; i++) {
       final byte = pBuffer.elementAt(i).value;
       audioData.add(byte.toDouble());
     }
   }

   // Dispose decoder and data source
   decoder.dispose();
   dataSource.dispose();

   // Free memory
   calloc.free(pBuffer.cast<ffi.Void>());

   return audioData;
 } finally {
   // Note: deleting the input file here would destroy the caller's audio;
   // the decoder and data source are already disposed above.
 }
}
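Two things worth double-checking in the snippet above, sketched language-neutrally (this is not coast_audio code): reading the decoded buffer one Uint8 at a time only yields meaningful samples for 8-bit audio, and for an N-point FFT, bin k corresponds to k * sampleRate / N (not k * (sampleRate / 2) / numBins). A tiny DFT makes the bin-to-frequency mapping concrete:

```python
import cmath, math

def dft(samples):
    # Naive DFT, slow but enough to demonstrate the bin -> frequency mapping.
    n = len(samples)
    return [sum(samples[t] * cmath.exp(-2j * math.pi * k * t / n)
                for t in range(n)) for k in range(n)]

sample_rate = 1024
fft_size = 256
freq = 64.0  # falls exactly on bin 64 * 256 / 1024 = 16
sig = [math.sin(2 * math.pi * freq * t / sample_rate) for t in range(fft_size)]
spectrum = dft(sig)
# Only the first half of the spectrum is meaningful for a real input signal.
half = [abs(c) for c in spectrum[:fft_size // 2]]
peak_bin = max(range(len(half)), key=lambda k: half[k])
peak_freq = peak_bin * sample_rate / fft_size  # bin k -> k * sr / N
print(peak_bin, peak_freq)  # 16 64.0
```

With the mapping stated this way, the maximum detectable frequency is the Nyquist limit sampleRate / 2, reached at bin N / 2.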

Finally I have code that is supposed to work, but now I keep getting the error below. If possible, I'd rather not change the lib files.

ffi: ^2.0.1 is the version I'm using.

/opt/.pub-cache/hosted/pub.dev/coast_audio_miniaudio-0.0.4/lib/src/ma_bridge/device_info/aaudio_device_info.dart:26:31: Error: The method 'setString' isn't defined for the class 'Array<Char>'.
 - 'Array' is from 'dart:ffi'.
 - 'Char' is from 'dart:ffi'.
Try correcting the name to the name of an existing method, or defining a method named 'setString'.
    info.pDeviceInfo.ref.name.setString(name);
                              ^^^^^^^^^
/opt/.pub-cache/hosted/pub.dev/coast_audio_miniaudio-0.0.4/lib/src/ma_bridge/device_info/core_audio_device_info.dart:25:39: Error: The method 'setString' isn't defined for the class 'Array<Char>'.
 - 'Array' is from 'dart:ffi'.
 - 'Char' is from 'dart:ffi'.
Try correcting the name to the name of an existing method, or defining a method named 'setString'.
    info.pDeviceInfo.ref.id.coreaudio.setString(id);
                                      ^^^^^^^^^
/opt/.pub-cache/hosted/pub.dev/coast_audio_miniaudio-0.0.4/lib/src/ma_bridge/device_info/core_audio_device_info.dart:26:31: Error: The method 'setString' isn't defined for the class 'Array<Char>'.
 - 'Array' is from 'dart:ffi'.
 - 'Char' is from 'dart:ffi'.
Try correcting the name to the name of an existing method, or defining a method named 'setString'.
    info.pDeviceInfo.ref.name.setString(name);
                              ^^^^^^^^^
/opt/.pub-cache/hosted/pub.dev/coast_audio_miniaudio-0.0.4/lib/src/ma_bridge/device_info/mab_device_info.dart:31:43: Error: The method 'getString' isn't defined for the class 'Array<Char>'.
 - 'Array' is from 'dart:ffi'.
 - 'Char' is from 'dart:ffi'.
Try correcting the name to the name of an existing method, or defining a method named 'getString'.
  String get name => pDeviceInfo.ref.name.getString(256);
                                          ^^^^^^^^^
/opt/.pub-cache/hosted/pub.dev/coast_audio_miniaudio-0.0.4/lib/src/ma_bridge/device_info/opensl_device_info.dart:26:31: Error: The method 'setString' isn't defined for the class 'Array<Char>'.
 - 'Array' is from 'dart:ffi'.
 - 'Char' is from 'dart:ffi'.
Try correcting the name to the name of an existing method, or defining a method named 'setString'.
    info.pDeviceInfo.ref.name.setString(name);

Thanks in advance, I hope someone helps me solve this! <3

Bug: unsupported format found in riff chunk: WAVE-

Hello.
I'm using this package in ubuntu 22.04.
When I pass a wav file to the AudioFileDataSource:

final dataSource = AudioFileDataSource(file: File('my-wav-file.wav'), mode: FileMode.read);
final WavAudioDecoder decoder = WavAudioDecoder(dataSource: dataSource);

the below error is shown:

unsupported format found in riff chunk: WAVE-�

It seems that this package cannot correctly decode the header of my wav file.
What is the problem?
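For anyone hitting this, a quick way to inspect what is actually in the header (a plain sketch, not coast_audio code): a canonical WAV file starts with a 12-byte RIFF descriptor ('RIFF', chunk size, 'WAVE') followed by sub-chunks such as 'fmt ' and 'data'. Decoders that assume a fixed layout can choke on extra chunks (e.g. 'LIST') or non-canonical files, which would produce an error like the one above:

```python
import struct

def wav_header_info(data: bytes):
    # Parse the RIFF descriptor and walk the sub-chunks of a WAV file.
    riff, size, wave = struct.unpack('<4sI4s', data[:12])
    assert riff == b'RIFF' and wave == b'WAVE', 'not a canonical WAV header'
    chunks = []
    offset = 12
    while offset + 8 <= len(data):
        cid, csize = struct.unpack('<4sI', data[offset:offset + 8])
        chunks.append(cid.decode('ascii', errors='replace'))
        offset += 8 + csize + (csize & 1)  # chunks are word-aligned
    return chunks

# Minimal 16-bit mono 44.1 kHz WAV with an empty data chunk, built in memory.
fmt = struct.pack('<HHIIHH', 1, 1, 44100, 88200, 2, 16)
wav = (b'RIFF' + struct.pack('<I', 4 + 8 + len(fmt) + 8) + b'WAVE'
       + b'fmt ' + struct.pack('<I', len(fmt)) + fmt
       + b'data' + struct.pack('<I', 0))
print(wav_header_info(wav))  # ['fmt ', 'data']
```

Running this over the failing file would show whether its chunk list deviates from the canonical ['fmt ', 'data'] layout.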

Does `flutter_coast_audio_miniaudio` support the iOS simulator?

Does flutter_coast_audio_miniaudio support the iOS simulator? I got this error when building for the iOS simulator:

User-Defined Issue (Xcode): Unsupported Swift architecture
/my/path/coast_audio/examples/audio_recorder/build/ios/Debug-iphonesimulator/flutter_coast_audio_miniaudio/flutter_coast_audio_miniaudio.framework/Headers/flutter_coast_audio_miniaudio-Swift.h:274:1


Parse Issue (Xcode): Could not build module 'flutter_coast_audio_miniaudio'
/my/path/coast_audio/examples/audio_recorder/ios/Runner/GeneratedPluginRegistrant.m:29:8


Could not build the application for the simulator.
Error launching application on iPhone 14 Pro.

Not an issue, just a question

Is it possible to build text-to-speech programs with your library?

Sorry for asking here; every time I search for this topic, only Python results appear.

coast_audio 1.0.0 decoding

Hello, I see that you updated coast_audio and in fact merged coast_audio_miniaudio and flutter_coast_audio_miniaudio into coast_audio!

I'm a bit confused though. I understand that MabAudioDecoder is deprecated, so I'm lost again!

I simply want to extract audio data from an mp3 file and feed it into an FFT library so I can perform spectral analysis/pitch detection, but now I'm not even sure how to approach it!

Waiting for your response. Thanks in advance! :))

Issues with audio data processing and conversion

First of all, thank you for this amazing package. It looks very promising for decoding, playing, and processing audio data.

So, I'm trying to use coast_audio to do pitch detection on the audio data stream decoded from a wav file.

There are some general issues I mentioned in #13 (processing data without playing or encoding), and with getting the AudioTime when processing an AudioBuffer inside a ProcessorNodeMixin.process() implementation.

To do the pitch detection I used the pitch_detector_dart package, which implements the AUBIO_YIN pitch tracking algorithm ported from TarsosDSP.

So, I used FftNode as an example and implemented the naive PitchNode class below. I'm struggling to get the data out of the audio buffer in a format that works with the PitchDetector. This code kind of works, but I get false positive detections with frequencies over 50 kHz, which should not be there when the sampling rate is around 48000.

Also, there should be a better way to work with the data buffer, or to use something like FrameRingBuffer as FftNode does, but I could not figure out how to do that.

@SKKbySSK I really need your advice on this, or maybe you could incorporate the Yin algorithm into the coast_audio or coast_audio_fft packages for general use.

Thank you in advance.

class PitchNode extends AutoFormatSingleInoutNode with ProcessorNodeMixin, SyncDisposableNodeMixin {
  PitchNode({
    required this.format,
    required this.bufferSize,
    required this.onPitchDetected,
    required this.position,
  })  // Pass the real sample rate; doubling it also doubles every reported pitch.
      : _pitchDetector = PitchDetector(format.sampleRate.toDouble(), bufferSize);

  final AudioFormat format;
  final int bufferSize;
  final PitchDetector _pitchDetector;
  final PitchDetectedCallback onPitchDetected;
  final double Function() position;
  final List<double> _buffer = [];

  @override
  int process(AudioBuffer buffer) {
    if (buffer.sizeInFrames == 0) {
      return buffer.sizeInFrames;
    }

    final audioTime = position();

    // NOTE: this assumes the buffer holds int32 samples; use the view
    // that matches the buffer's actual sample format.
    _buffer.addAll(buffer.asInt32ListView().map((v) => v.toDouble()));

    while (_buffer.length >= bufferSize) {
      // Feed the detector one fixed-size window at a time instead of
      // the whole (growing) backlog.
      final window = _buffer.sublist(0, bufferSize);
      _buffer.removeRange(0, bufferSize);

      final result = _pitchDetector.getPitch(window);
      if (result.pitched) {
        onPitchDetected(audioTime, result.pitch.toInt());
        // Drop any remaining full windows so stale audio isn't re-analyzed.
        _buffer.removeRange(0, _buffer.length - (_buffer.length % bufferSize));
        break;
      }
    }

    return buffer.sizeInFrames;
  }
  ...
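For reference, the core of YIN is small enough to sketch directly (a simplified, language-neutral version, not tied to any package): compute the difference function over candidate lags, normalize it by its cumulative mean, and take the first lag that dips below a threshold, walking down to the local minimum.

```python
import math

def yin_pitch(samples, sample_rate, threshold=0.1):
    # Simplified YIN: difference function + cumulative mean normalization.
    n = len(samples)
    half = n // 2
    d = [0.0] * half
    for tau in range(1, half):
        d[tau] = sum((samples[i] - samples[i + tau]) ** 2 for i in range(half))
    # Cumulative mean normalized difference (cmnd[0] and cmnd[1] pinned to 1).
    cmnd = [1.0] * half
    running = 0.0
    for tau in range(1, half):
        running += d[tau]
        cmnd[tau] = d[tau] * tau / running if running else 1.0
    # First lag below the threshold, refined to the local minimum.
    for tau in range(2, half):
        if cmnd[tau] < threshold:
            while tau + 1 < half and cmnd[tau + 1] < cmnd[tau]:
                tau += 1
            return sample_rate / tau
    return None

sr = 8000
sig = [math.sin(2 * math.pi * 400 * t / sr) for t in range(800)]
print(yin_pitch(sig, sr))  # 400.0 (period of 20 samples at 8 kHz)
```

One implication for the node above: the detected frequency is sample_rate / lag, so passing a doubled sample rate to the detector doubles every reported pitch, which matches the implausibly high detections described in the question.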

Play asset wav file

Hello, can you give us an example of how to play an asset wav file in the Flutter framework?

How to convert SampleFormat.int16 to SampleFormat.float32 suitable for FFT

I'm trying to add an FftNode to the audio_graph_demo.dart demo app.

So, I created the FftNode like below and then linked it into the graph between mixerNode and masterVolumeNode.

final fftNode = FftNode(
    format: format, fftSize: 512,
    onFftCompleted: (FftResult result) {
      print(result.complexArray.magnitudes().toList().join(', '));
    });

With that, it gives me all NaN values for the FFT magnitudes.

If I change all the FunctionNodes to use sampleFormat: SampleFormat.float32, then the FFT works, but it fails with an "unsupported sample format" error in the WavAudioEncoder.

The WavAudioDecoder also usually produces SampleFormat.int16 output, and I can't figure out how to convert that data into the SampleFormat.float32 that FftNode can work with.

Could you please advise how to make such a conversion and integrate FftNode with int16 audio buffers and the WavAudioDecoder. Thank you in advance.

PS: the MabAudioDecoder produces SampleFormat.float32 and seems to work with the FftNode out of the box, but it feels like overkill for simple audio processing.
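The usual int16-to-float32 conversion (a language-neutral sketch, not a coast_audio API) is to divide each signed 16-bit sample by 32768, so the result lands in [-1.0, 1.0):

```python
def int16_to_float32(samples):
    # Map signed 16-bit PCM (-32768..32767) to floats in [-1.0, 1.0).
    return [s / 32768.0 for s in samples]

print(int16_to_float32([-32768, 0, 16384, 32767]))
# [-1.0, 0.0, 0.5, 0.999969482421875]
```

In a node graph this would typically live in a small converter node placed between the int16 decoder and the FFT node, writing the scaled values into a float32 output buffer.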
