anarchuser / mic_stream
Flutter plugin to get an infinite audio stream from the microphone
Home Page: https://pub.dev/packages/mic_stream
License: GNU General Public License v3.0
In order to use flutter_local_notifications we had to update permission_handler, which creates a conflict with mic_stream.
mic_stream is a very good and helpful package; please consider this issue.
When compiling the flutter app with mic_stream included I get the following error:
Launching lib\main.dart on FRD L09 in debug mode...
Initializing gradle...
Resolving dependencies...
Running Gradle task 'assembleDebug'...
Note: E:\develop\flutter\.pub-cache\hosted\pub.dartlang.org\mic_stream-0.1.4\android\src\main\java\com\code\aaron\micstream\MicStreamPlugin.java uses unchecked or unsafe operations.
Note: Recompile with -Xlint:unchecked for details.
E:\develop\flutter\.pub-cache\hosted\pub.dartlang.org\permission-0.1.1\android\src\main\java\com\ly\permission\PermissionPlugin.java:10: error: cannot find symbol
import android.support.v4.app.ActivityCompat;
^
symbol: class ActivityCompat
location: package android.support.v4.app
E:\develop\flutter\.pub-cache\hosted\pub.dartlang.org\permission-0.1.1\android\src\main\java\com\ly\permission\PermissionPlugin.java:11: error: package android.support.v4.content does not exist
import android.support.v4.content.ContextCompat;
^
E:\develop\flutter\.pub-cache\hosted\pub.dartlang.org\permission-0.1.1\android\src\main\java\com\ly\permission\PermissionPlugin.java:67: error: cannot find symbol
if (ContextCompat.checkSelfPermission(registrar.activity(), permission) == PackageManager.PERMISSION_DENIED) {
^
symbol: variable ContextCompat
location: class PermissionPlugin
E:\develop\flutter\.pub-cache\hosted\pub.dartlang.org\permission-0.1.1\android\src\main\java\com\ly\permission\PermissionPlugin.java:68: error: cannot find symbol
if (!ActivityCompat.shouldShowRequestPermissionRationale(activity, permission)) {
^
symbol: variable ActivityCompat
location: class PermissionPlugin
E:\develop\flutter\.pub-cache\hosted\pub.dartlang.org\permission-0.1.1\android\src\main\java\com\ly\permission\PermissionPlugin.java:86: error: cannot find symbol
ActivityCompat.requestPermissions(activity, permissions, 0);
^
symbol: variable ActivityCompat
location: class PermissionPlugin
E:\develop\flutter\.pub-cache\hosted\pub.dartlang.org\permission-0.1.1\android\src\main\java\com\ly\permission\PermissionPlugin.java:140: error: cannot find symbol
if (!ActivityCompat.shouldShowRequestPermissionRationale(registrar.activity(), strings[i])) {
^
symbol: variable ActivityCompat
location: class PermissionPlugin
6 errors
*******************************************************************************************
The Gradle failure may have been because of AndroidX incompatibilities in this Flutter app.
See https://goo.gl/CP92wY for more information on the problem and how to fix it.
*******************************************************************************************
Finished with error: Gradle task assembleDebug failed with exit code 1
Using mic_stream: 0.1.4
[√] Flutter (Channel beta, v1.5.4-hotfix.2, on Microsoft Windows [Version 10.0.17134.706], locale en-US)
• Flutter version 1.5.4-hotfix.2 at E:\develop\flutter
• Framework revision 7a4c33425d (6 days ago), 2019-04-29 11:05:24 -0700
• Engine revision 52c7a1e849
• Dart version 2.3.0 (build 2.3.0-dev.0.5 a1668566e5)
[√] Android toolchain - develop for Android devices (Android SDK version 28.0.3)
• Android SDK at E:\develop\android\sdk
• Android NDK location not configured (optional; useful for native profiling support)
• Platform android-28, build-tools 28.0.3
• ANDROID_HOME = E:\develop\android\sdk
• Java binary at: C:\Program Files\Android\Android Studio\jre\bin\java
• Java version OpenJDK Runtime Environment (build 1.8.0_152-release-1343-b01)
• All Android licenses accepted.
[√] Android Studio (version 3.4)
• Android Studio at C:\Program Files\Android\Android Studio
• Flutter plugin version 35.2.1
• Dart plugin version 183.6270
• Java version OpenJDK Runtime Environment (build 1.8.0_152-release-1343-b01)
[√] Connected device (1 available)
• FRD L09 • 73QDU16C28000239 • android-arm64 • Android 7.0 (API 24)
! Doctor found issues in 1 category.
How can I solve that issue?
I get the following error when I attempt "flutter build ios":
...
[!] Unable to determine Swift version for the following pods:
- `mic_stream` does not specify a Swift version and none of the targets (`Runner`)
integrating it have the `SWIFT_VERSION` attribute set. Please contact the author or
set the `SWIFT_VERSION` attribute in at least one of the targets that integrate this
pod.
...
Any idea how to resolve this?
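One workaround that is often suggested for this CocoaPods error (an assumption about your setup, since the report doesn't include a Podfile) is to set SWIFT_VERSION for every pod in a post_install hook:

```ruby
# ios/Podfile -- assumed fix, not from the original report:
# force a Swift version for pods that do not declare one.
post_install do |installer|
  installer.pods_project.targets.each do |target|
    target.build_configurations.each do |config|
      config.build_settings['SWIFT_VERSION'] = '5.0'
    end
  end
end
```

Run `pod install` again afterwards so the setting is picked up.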
Is there a way to calculate the amplitude of the streamed microphone data?
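The plugin doesn't expose an amplitude getter as far as I know, but amplitude can be computed from the samples themselves. A sketch in Java (matching the plugin's Android side; class and method names are illustrative), assuming the stream delivers 16-bit little-endian PCM:

```java
import java.nio.ByteBuffer;
import java.nio.ByteOrder;

public class Amplitude {
    // Compute the RMS amplitude of 16-bit little-endian PCM data.
    static double rms(byte[] pcm) {
        ByteBuffer buf = ByteBuffer.wrap(pcm).order(ByteOrder.LITTLE_ENDIAN);
        long sumSquares = 0;
        int n = pcm.length / 2;
        for (int i = 0; i < n; i++) {
            short s = buf.getShort();
            sumSquares += (long) s * s;
        }
        return n == 0 ? 0 : Math.sqrt((double) sumSquares / n);
    }

    public static void main(String[] args) {
        // Two samples: 0 and 16384 (little-endian bytes: 0x00 0x40)
        byte[] pcm = {0, 0, 0, 0x40};
        System.out.println((long) rms(pcm)); // prints 11585
    }
}
```

A peak amplitude (max of `Math.abs(s)`) works the same way and is cheaper per chunk.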
Recording 16-bit encoded audio and saving it as a .wav produces noise (8-bit encoding works just fine).
I suspect it is because of:
for (int i = 0; i < BUFFER_SIZE; i++) {
    data_b[2 * i] = (byte) Math.floor((data_s[i] + 32767) / 256.0);
    data_b[2 * i + 1] = (byte) ((data_s[i] + 32767) % 256);
}
I implemented this plugin using the recorder.read(byteArray) for both 8 bit and 16 bit and it works.
recorder!!.read(byteArray, 0, bufferSize)
sink?.success(byteArray)
Is your implementation for 16 bit working for you? Any thoughts?
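For comparison, 16-bit WAV data is signed little-endian, so a conversion without the +32767 offset (a sketch in Java, not the plugin's actual code) would look like:

```java
public class Pcm16 {
    // Convert signed 16-bit samples to little-endian bytes,
    // as expected by 16-bit PCM WAV data.
    static byte[] toLittleEndianBytes(short[] samples) {
        byte[] out = new byte[samples.length * 2];
        for (int i = 0; i < samples.length; i++) {
            out[2 * i]     = (byte) (samples[i] & 0xFF);        // low byte first
            out[2 * i + 1] = (byte) ((samples[i] >> 8) & 0xFF); // then high byte
        }
        return out;
    }

    public static void main(String[] args) {
        byte[] b = toLittleEndianBytes(new short[]{0x1234});
        System.out.printf("%02x %02x%n", b[0], b[1]); // prints "34 12"
    }
}
```

The quoted code above instead writes big-endian bytes shifted into unsigned range, which reads as noise when interpreted as standard signed little-endian PCM.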
Hi there,
I added the package to my project dependencies in pubspec.yaml: sound_stream: ^0.3.0
First, I had to update the minimum Android SDK version to 21 (which is absolutely fine), but now I'm getting this error:
e: /Users/fluffy/.pub-cache/hosted/pub.dev/sound_stream-0.3.0/android/src/main/kotlin/vn/casperpas/sound_stream/SoundStreamPlugin.kt: (45, 8): Class 'SoundStreamPlugin' is not abstract and does not implement abstract member public abstract fun onRequestPermissionsResult(p0: Int, p1: Array<(out) String!>, p2: IntArray): Boolean defined in io.flutter.plugin.common.PluginRegistry.RequestPermissionsResultListener
e: /Users/fluffy/.pub-cache/hosted/pub.dev/sound_stream-0.3.0/android/src/main/kotlin/vn/casperpas/sound_stream/SoundStreamPlugin.kt: (182, 5): 'onRequestPermissionsResult' overrides nothing
FAILURE: Build failed with an exception.
* What went wrong:
Execution failed for task ':sound_stream:compileDebugKotlin'.
Hello and thank you for this great plugin.
I tried porting my Android app that uses mic_stream 0.6.2 to iOS. As the recording didn't work, I tried the example app. The recording doesn't start and the app never asks for permission to use the microphone. The key NSMicrophoneUsageDescription is present in Info.plist and there are no error messages.
I modified the Podfile as described by the permission_handler documentation to fix the permission request.
The debug console output the following:
START LISTENING
wait for stream
However, the following awaits never complete: mic_stream/example/lib/main.dart, line 104 in 61a5c35.
Even if I disable these awaits, I do not receive any sample when listening to the stream.
I tried updating mic_stream to 0.6.3-dev, which doesn't solve the issue. I use the iPhone 14 (iOS 16.1) simulator and Xcode 14.1 (14B47b). I am not familiar with the Apple/iOS ecosystem and didn't find a similar existing issue here. Maybe there was a breaking change in the latest iOS version?
Hello! Question for you:
We're printing the 960 values per second from the mic stream and we were wondering how the data points are decided, i.e. what is the meaning of each data point?
Thanks!
After the init of the MicStream, the device switches its audio output to its speakers. Because of this, when I try to play audio (with the audioplayers plugin) and record through the mic at the same time, the output is always the device's speakers. My goal would be to keep listening to the audio on headphones even after recording has started.
Just leaving this here as a placeholder - I may get around to this myself if I get a chance.
Hi there!
Due to the new null safety rules I am getting this error when I run the mic_stream example code:
Error: The library 'package:mic_stream/mic_stream.dart' is legacy, and should not be imported into a null safe library.
Try migrating the imported library.dart(import_of_legacy_library_into_null_safe)
Most of the errors are fixable with the Dart quick fix, but it looks like the errors in Statistics and WavePainter are more in depth. Would you be able to upload a new version with respect to the new null safety rules?
Hello,
I really like this package and its simplicity.
In my app, I want to be able to keep listening in the background even if another app like YouTube is open.
I see that some apps that use the microphone (like SleepCycle) keep it alive in the background even when the user opens the YouTube app.
So it should be possible to do that in Flutter.
Regards,
Hi @anarchuser,
I have been extensively using this lib for mic streaming data, but another requirement is to generate a WAV file.
Could you please show me the correct way of doing it?
I was checking this link (Code for Wav) but I am not sure if there is a class or method I can directly utilize.
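As far as I can tell the plugin itself has no ready-made WAV writer; a minimal sketch of one for 16-bit mono PCM (class and method names here are illustrative, not plugin API):

```java
import java.nio.ByteBuffer;
import java.nio.ByteOrder;

public class WavWriter {
    // Wrap raw 16-bit mono little-endian PCM in a minimal 44-byte WAV header.
    static byte[] toWav(byte[] pcm, int sampleRate) {
        ByteBuffer buf = ByteBuffer.allocate(44 + pcm.length).order(ByteOrder.LITTLE_ENDIAN);
        buf.put("RIFF".getBytes());
        buf.putInt(36 + pcm.length);       // RIFF chunk size
        buf.put("WAVE".getBytes());
        buf.put("fmt ".getBytes());
        buf.putInt(16);                    // fmt chunk size
        buf.putShort((short) 1);           // audio format: PCM
        buf.putShort((short) 1);           // channels: mono
        buf.putInt(sampleRate);            // sample rate
        buf.putInt(sampleRate * 2);        // byte rate (mono, 16-bit)
        buf.putShort((short) 2);           // block align
        buf.putShort((short) 16);          // bits per sample
        buf.put("data".getBytes());
        buf.putInt(pcm.length);            // data chunk size
        buf.put(pcm);
        return buf.array();
    }

    public static void main(String[] args) {
        byte[] wav = toWav(new byte[4], 16000);
        System.out.println(wav.length);    // prints 48
    }
}
```

Collect the stream's bytes into one buffer first, then wrap them once at the end, since the header needs the total data length.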
Hi, thanks a lot for this plugin!
I have this issue most of the time when starting the microphone again after stopping it.
E/AndroidRuntime(20320): java.lang.NullPointerException: Attempt to invoke virtual method 'int android.media.AudioRecord.getRecordingState()' on a null object reference
Is there something I'm doing wrong? To stop the microphone I simply cancel the stream listener.
Can the stream tell whether the microphone is currently in use by another app or not?
If that's available, how can I do it?
Hi,
How can I fix this?
* What went wrong:
The Android Gradle plugin supports only Kotlin Gradle plugin version 1.3.10 and higher.
The following dependencies do not satisfy the required version:
project ':audio_streams' -> org.jetbrains.kotlin:kotlin-gradle-plugin:1.2.71
Thank you
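The usual fix (an assumption, since the build files aren't shown in the report) is to raise the Kotlin version declared in android/build.gradle:

```groovy
// android/build.gradle -- raise the Kotlin Gradle plugin version
buildscript {
    ext.kotlin_version = '1.3.50'  // any version >= 1.3.10
    dependencies {
        classpath "org.jetbrains.kotlin:kotlin-gradle-plugin:$kotlin_version"
    }
}
```

The error indicates a plugin (here `:audio_streams`) pinning Kotlin 1.2.71; updating that plugin, or the project-level `kotlin_version` as above, resolves the mismatch.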
Hi Dev,
We are using your library extensively for retrieving streaming audio data. We would appreciate it if, along with the stream data, we could also obtain a dB value which we could use for silence detection, e.g.:
stream = await MicStream.microphone(....)
if we could do:
if (stream.currentDecibelValue > 20) {
  stream.listen((data) {....});
} else {
  print('Silence detected');
}
Basically, is there any way we can do silence detection?
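There is no such decibel getter on the stream as far as I know, but a dB estimate can be derived from the samples themselves. A sketch in Java (illustrative names; assumes 16-bit little-endian PCM):

```java
import java.nio.ByteBuffer;
import java.nio.ByteOrder;

public class Silence {
    // dBFS of a chunk of 16-bit little-endian PCM: 0 dB is full scale,
    // quieter audio is more negative.
    static double dbfs(byte[] pcm) {
        ByteBuffer buf = ByteBuffer.wrap(pcm).order(ByteOrder.LITTLE_ENDIAN);
        long sumSquares = 0;
        int n = pcm.length / 2;
        for (int i = 0; i < n; i++) {
            short s = buf.getShort();
            sumSquares += (long) s * s;
        }
        double rms = Math.sqrt((double) sumSquares / Math.max(n, 1));
        return 20 * Math.log10(Math.max(rms, 1) / 32768.0);
    }

    public static void main(String[] args) {
        byte[] silence = new byte[320];          // all zeros
        if (dbfs(silence) < -60) System.out.println("Silence detected");
    }
}
```

Applying this per chunk in the stream listener gives a rolling level you can threshold for silence detection.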
Hello, I have an issue with the plugin since I ran flutter upgrade.
Flutter 1.7.8+hotfix.3 • channel stable • https://github.com/flutter/flutter.git
Framework • revision b712a172f9 (2 weeks ago) • 2019-07-09 13:14:38 -0700
Engine • revision 54ad777fd2
Tools • Dart 2.4.0
The code that produces the error:
class _MyHomePageState extends State<MyHomePage> {
  Stream<List<int>> stream;
  StreamSubscription<List<int>> listener;
  /**
  **/
  void _startRecording() {
    stream = microphone(sampleRate: 16000);
    listener = stream.listen((samples) => _onNewSample(samples));
    setState(() {
      _isRunning = true;
    });
  }
  /**
  **/
}
The error
E/AndroidRuntime(16100): FATAL EXCEPTION: Thread-6
E/AndroidRuntime(16100): Process: com.linagora.flutter_audio, PID: 16100
E/AndroidRuntime(16100): java.lang.RuntimeException: Methods marked with @UiThread must be executed on the main thread. Current thread: Thread-6
E/AndroidRuntime(16100): at io.flutter.embedding.engine.FlutterJNI.ensureRunningOnMainThread(FlutterJNI.java:794)
E/AndroidRuntime(16100): at io.flutter.embedding.engine.FlutterJNI.dispatchPlatformMessage(FlutterJNI.java:684)
E/AndroidRuntime(16100): at io.flutter.embedding.engine.dart.DartMessenger.send(DartMessenger.java:80)
E/AndroidRuntime(16100): at io.flutter.embedding.engine.dart.DartExecutor.send(DartExecutor.java:174)
E/AndroidRuntime(16100): at io.flutter.view.FlutterNativeView.send(FlutterNativeView.java:144)
E/AndroidRuntime(16100): at io.flutter.plugin.common.EventChannel$IncomingStreamRequestHandler$EventSinkImplementation.success(EventChannel.java:226)
E/AndroidRuntime(16100): at com.code.aaron.micstream.MicStreamPlugin$1.run(MicStreamPlugin.java:63)
E/AndroidRuntime(16100): at java.lang.Thread.run(Thread.java:764)
It used to work fine before. I tried switching Flutter to the dev and beta channels, with the same result.
Thank you.
What approach should be taken to convert the Uint8List recording to Base64?
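On the Dart side this is typically base64Encode from dart:convert applied to the Uint8List; the same idea in Java (used here for consistency with the plugin's Android-side code):

```java
import java.util.Base64;

public class B64 {
    public static void main(String[] args) {
        byte[] audioChunk = {1, 2, 3, 4};    // stand-in for a microphone buffer
        // Encode the raw bytes to a Base64 string for transport/storage.
        String encoded = Base64.getEncoder().encodeToString(audioChunk);
        System.out.println(encoded);         // prints "AQIDBA=="
        // Decoding recovers the original bytes.
        byte[] decoded = Base64.getDecoder().decode(encoded);
        System.out.println(decoded.length);  // prints 4
    }
}
```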
How can I play back audio from a List?
If the bit depth is set to 16, uint8, which can only represent values up to 2^8, seems inadequate because it must represent values up to 2^16.
In short, I think we need to change the return value of MicStream.microphone to Future<Stream<Uint16List>?>.
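A counterpoint worth considering: a byte stream can still carry 16-bit audio, since each sample simply spans two consecutive bytes, so no change of return type is strictly required. A sketch of the decoding (assuming little-endian byte order):

```java
import java.nio.ByteBuffer;
import java.nio.ByteOrder;

public class Decode16 {
    // Reinterpret a byte stream as 16-bit signed samples:
    // every consecutive byte pair is one sample.
    static short[] toSamples(byte[] bytes) {
        short[] samples = new short[bytes.length / 2];
        ByteBuffer.wrap(bytes).order(ByteOrder.LITTLE_ENDIAN)
                  .asShortBuffer().get(samples);
        return samples;
    }

    public static void main(String[] args) {
        // 0x3000 little-endian = bytes 0x00, 0x30 -> one sample of 12288
        short[] s = toSamples(new byte[]{0x00, 0x30});
        System.out.println(s[0]);  // prints 12288
    }
}
```

Note also that signed 16-bit samples range over -2^15..2^15-1, so an unsigned Uint16List would not represent negative amplitudes anyway.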
hi~
I send an audio byte stream with a 16000 Hz sample rate from an Android phone to a Spring server over a WebSocket. I tried to create a WAV file from the streamed data, but it has noise. When I tried an 8000 Hz sample rate it worked fine.
here is my code
Flutter code:
stream = microphone(
    audioSource: AudioSource.DEFAULT,
    sampleRate: 16000,
    channelConfig: ChannelConfig.CHANNEL_IN_MONO,
    audioFormat: AUDIO_FORMAT);
listener = stream.listen((samples) => websocketByte(samples));

void websocketByte(List<int> samples) {
  Uint16List data = Uint16List.fromList(samples);
  widget.channel.sink.add(data);
}
Java header code:
mFileOutStream = new FileOutputStream(mWavFileName);
mOutStream = new DataOutputStream(mFileOutStream);

// write the wav file per the wav file format
mOutStream.writeBytes("RIFF");
mOutStream.write(intToByteArray(32 + mVoiceTotalSize), 0, 4);
mOutStream.writeBytes("WAVE");
mOutStream.writeBytes("fmt ");
mOutStream.write(intToByteArray(16), 0, 4);
mOutStream.write(shortToByteArray((short) 1), 0, 2);
// duration
mOutStream.write(shortToByteArray(numChannels), 0, 2); // numChannels => 1
mOutStream.write(intToByteArray(16000), 0, 4);
mOutStream.write(intToByteArray((BITS_PER_SAMPLE / 8) * 16000 * numChannels), 0, 4); // BITS_PER_SAMPLE => 16
mOutStream.write(shortToByteArray((short) ((BITS_PER_SAMPLE / 8) * numChannels)), 0, 2);
mOutStream.write(shortToByteArray((short) BITS_PER_SAMPLE), 0, 2);
mOutStream.writeBytes("data");
mOutStream.write(intToByteArray(mVoiceTotalSize), 0, 4);

private byte[] intToByteArray(int i) {
    byte[] b = new byte[4];
    b[0] = (byte) (i & 0x00FF);
    b[1] = (byte) ((i >> 8) & 0x000000FF);
    b[2] = (byte) ((i >> 16) & 0x000000FF);
    b[3] = (byte) ((i >> 24) & 0x000000FF);
    return b;
}

private byte[] shortToByteArray(short data) {
    return new byte[] { (byte) (data & 0xff), (byte) ((data >>> 8) & 0xff) };
}
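As a side note, the header helpers above do emit little-endian bytes, which is what the RIFF/WAV format requires; a standalone sanity check (intToByteArray copied here so the snippet runs on its own):

```java
import java.nio.ByteBuffer;
import java.nio.ByteOrder;
import java.util.Arrays;

public class EndianCheck {
    // Copy of the helper from the issue above.
    static byte[] intToByteArray(int i) {
        return new byte[]{
            (byte) (i & 0xFF), (byte) ((i >> 8) & 0xFF),
            (byte) ((i >> 16) & 0xFF), (byte) ((i >> 24) & 0xFF)};
    }

    public static void main(String[] args) {
        byte[] mine = intToByteArray(16000);
        // Reference little-endian encoding via ByteBuffer.
        byte[] ref = ByteBuffer.allocate(4).order(ByteOrder.LITTLE_ENDIAN)
                               .putInt(16000).array();
        System.out.println(Arrays.equals(mine, ref)); // prints "true"
    }
}
```

Since the header math checks out, the noise likely comes from the sample bytes themselves rather than the header byte order.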
Thanks for bringing us a microphone library that works across multiple platforms!
I got a strange problem:
I used the demo recorder app for all the testing. So, there shouldn't be any int16 conversion mistakes.
Hi!
I opened an issue regarding permissions from a separate isolate a week or two ago. Your solution works great.
Unfortunately I did not have time to finish what I started until now. For the next step I would like to store samples to a WAV file that I write just by using RandomAccessFile, nothing else. I know all about the file format, but the results I get are not what I expected. I do this (on Android):
_audioStream = await MicStream.microphone(
  audioSource: AudioSource.MIC,
  sampleRate: 32000,
  channelConfig: ChannelConfig.CHANNEL_IN_MONO,
  audioFormat: AudioFormat.ENCODING_PCM_16BIT,
);
I set audioSource to MIC to be sure that the data I get is always from the microphone. Are there any docs on the other values of AudioSource and when they should be used?
Looking at the sample code, I have questions regarding conversion to 16-bit samples:
Should this code be changed or do I misunderstand how samples are encoded?
When trying to upgrade my project to 0.7.0-dev, I noticed that reading the microphone stream settings (await MicStream.sampleRate, await MicStream.bitDepth, ...) does not complete until the stream is listened to.
For use-cases requiring stream settings in advance, is it possible to get settings without listening to the microphone and dropping samples until the mic is actually used?
The exception below is thrown when I simply build the 'example' folder and try to record:
I/flutter (13387): Start Listening to the microphone
E/flutter (13387): [ERROR:flutter/lib/ui/ui_dart_state.cc(186)] Unhandled Exception: type '_ControllerStream' is not a subtype of type 'Stream<List>'
E/flutter (13387): #0 _squashStream (package:mic_stream/mic_stream.dart:110:3)
E/flutter (13387): #1 microphone (package:mic_stream/mic_stream.dart:77:9)
E/flutter (13387):
E/flutter (13387): #2 _MicStreamExampleAppState._startListening. (package:mic_stream_example/main.dart)
E/flutter (13387):
I wanted to save the input audio and was trying a modified version of the solution provided here but this gives me a corrupted audio file.
Would it be possible to make some changes on the package side to make it easier to record the audio?
The code specifying the sample rate was commented out and was not working.
Hi, I'm new to Flutter and I was looking for a library that could help me record sound into a stream. I found your library and was pleased to see that it looks pretty promising.
I'm having some problems trying to understand everything within your example, so I would like to know if you have any idea how I could split the audio stream into 0.5 s chunks, to be sent to an external server.
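One possible approach (a sketch only; the 16 kHz rate and 2 bytes per sample are assumptions you would read from MicStream's getters): buffer the incoming microphone buffers and emit fixed-size chunks worth 0.5 s each.

```java
import java.io.ByteArrayOutputStream;
import java.util.ArrayList;
import java.util.List;

public class Chunker {
    final int chunkBytes;
    final ByteArrayOutputStream pending = new ByteArrayOutputStream();

    // e.g. 16000 Hz * 2 bytes/sample * 0.5 s = 16000 bytes per chunk
    Chunker(int sampleRate, int bytesPerSample, double seconds) {
        this.chunkBytes = (int) (sampleRate * bytesPerSample * seconds);
    }

    // Feed one microphone buffer; returns any completed 0.5 s chunks.
    List<byte[]> feed(byte[] buffer) {
        pending.write(buffer, 0, buffer.length);
        List<byte[]> chunks = new ArrayList<>();
        byte[] all = pending.toByteArray();
        int offset = 0;
        while (all.length - offset >= chunkBytes) {
            byte[] chunk = new byte[chunkBytes];
            System.arraycopy(all, offset, chunk, 0, chunkBytes);
            chunks.add(chunk);   // ready to send to the server
            offset += chunkBytes;
        }
        // Keep the leftover bytes for the next call.
        pending.reset();
        pending.write(all, offset, all.length - offset);
        return chunks;
    }

    public static void main(String[] args) {
        Chunker c = new Chunker(16000, 2, 0.5);
        System.out.println(c.feed(new byte[10000]).size()); // prints 0
        System.out.println(c.feed(new byte[10000]).size()); // prints 1
    }
}
```

In Dart the same logic would live inside the stream listener, accumulating each event's bytes until a chunk is complete.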
here is the log:
Thread 0 crashed with ARM Thread State (64-bit):
x0: 0x0000000000000000 x1: 0x0000000000000000 x2: 0x0000000000000000 x3: 0x0000000000000000
x4: 0x0000000186898647 x5: 0x000000016aff09e0 x6: 0x000000000000006e x7: 0x0000000000000003
x8: 0xafa7f79cb68d44b2 x9: 0xafa7f79d575e1ff2 x10: 0x0000000000000200 x11: 0x000000000000000b
x12: 0x000000000000000b x13: 0x0000000080880843 x14: 0x00000000ffffffff x15: 0x0000000000000000
x16: 0x0000000000000148 x17: 0x00000001e685cf20 x18: 0x0000000000000000 x19: 0x0000000000000006
x20: 0x00000001e1d35b40 x21: 0x0000000000000103 x22: 0x00000001e1d35c20 x23: 0x0000000138a96ad0
x24: 0x0000000138a971a0 x25: 0x0000000129e2e970 x26: 0x00000002203a8000 x27: 0x00000002203a8000
x28: 0x00000001ec78ce2e fp: 0x000000016aff0950 lr: 0x00000001868dbc28
sp: 0x000000016aff0930 pc: 0x00000001868a4724 cpsr: 0x40001000
far: 0x0000000106e50000 esr: 0x56000080 Address size fault
Binary Images:
0x104e0c000 - 0x104f1ffff com.example.app (2.0.2) <b09ba8a4-1409-3548-84ad-28a857a4bbac> /Applications/example.app/Contents/MacOS/example
0x105020000 - 0x10502bfff org.cocoapods.FBLPromises (2.0.2) <72158610-f2d5-3f74-9566-26f3a31f2461> /Applications/example.app/Contents/Frameworks/FBLPromises.framework/Versions/A/FBLPromises
0x104fe8000 - 0x104febfff org.cocoapods.FirebaseAppCheckInterop (2.0.2) <f91378fa-ecd7-36cf-90ca-7220caf78107> /Applications/example.app/Contents/Frameworks/FirebaseAppCheckInterop.framework/Versions/A/FirebaseAppCheckInterop
0x105224000 - 0x105263fff org.cocoapods.FirebaseAuth (2.0.2) <7ce73d24-40fe-3534-accb-882fa381cf8d> /Applications/example.app/Contents/Frameworks/FirebaseAuth.framework/Versions/A/FirebaseAuth
0x104ffc000 - 0x104ffffff org.cocoapods.FirebaseAuthInterop (2.0.2) <f6ac3347-3702-3aa4-ac49-314511098341> /Applications/example.app/Contents/Frameworks/FirebaseAuthInterop.framework/Versions/A/FirebaseAuthInterop
0x1050b4000 - 0x1050c3fff org.cocoapods.FirebaseCore (2.0.2) <5e66f7f6-822b-3932-861f-1fb540d3c7d1> /Applications/example.app/Contents/Frameworks/FirebaseCore.framework/Versions/A/FirebaseCore
0x105050000 - 0x105053fff org.cocoapods.FirebaseCoreExtension (2.0.2) <663e3327-6475-3219-897f-77d9be29fb83> /Applications/example.app/Contents/Frameworks/FirebaseCoreExtension.framework/Versions/A/FirebaseCoreExtension
0x1051b4000 - 0x1051cbfff org.cocoapods.FirebaseCoreInternal (2.0.2) <d3f18c70-e810-3350-aea4-d09daa07240a> /Applications/example.app/Contents/Frameworks/FirebaseCoreInternal.framework/Versions/A/FirebaseCoreInternal
0x105f1c000 - 0x10611ffff org.cocoapods.FirebaseFirestore (2.0.2) <abe9cbfa-905d-3132-94af-9feda2b50960> /Applications/example.app/Contents/Frameworks/FirebaseFirestore.framework/Versions/A/FirebaseFirestore
0x105064000 - 0x105077fff org.cocoapods.FirebaseFunctions (2.0.2) <72f2a6dd-5c57-354c-aedd-ed547864a317> /Applications/example.app/Contents/Frameworks/FirebaseFunctions.framework/Versions/A/FirebaseFunctions
0x105170000 - 0x105183fff org.cocoapods.FirebaseInstallations (2.0.2) <446d33dc-1eff-30a0-bdbe-89ba65533938> /Applications/example.app/Contents/Frameworks/FirebaseInstallations.framework/Versions/A/FirebaseInstallations
0x1050e4000 - 0x1050e7fff org.cocoapods.FirebaseMessagingInterop (2.0.2) <29d428b1-fde5-3d1f-9dcb-b80680c7419e> /Applications/example.app/Contents/Frameworks/FirebaseMessagingInterop.framework/Versions/A/FirebaseMessagingInterop
0x105448000 - 0x105473fff org.cocoapods.FirebaseSharedSwift (2.0.2) <e3eb1c41-8344-39c7-bd9b-ded327f8252c> /Applications/example.app/Contents/Frameworks/FirebaseSharedSwift.framework/Versions/A/FirebaseSharedSwift
0x105310000 - 0x10534bfff org.cocoapods.FirebaseStorage (2.0.2) <ff23c595-855c-37b6-800d-e8d6662e5e51> /Applications/example.app/Contents/Frameworks/FirebaseStorage.framework/Versions/A/FirebaseStorage
0x1050f8000 - 0x10511ffff org.cocoapods.GTMSessionFetcher (2.0.2) <6139526a-36c3-3011-ae8f-5f82f854f26b> /Applications/example.app/Contents/Frameworks/GTMSessionFetcher.framework/Versions/A/GTMSessionFetcher
0x1053d4000 - 0x1053ebfff org.cocoapods.GoogleUtilities (2.0.2) <b26ee4fc-17af-3f43-a5f1-aa44b4535430> /Applications/example.app/Contents/Frameworks/GoogleUtilities.framework/Versions/A/GoogleUtilities
0x10558c000 - 0x1055a3fff org.cocoapods.PurchasesHybridCommon (2.0.2) <778473dc-eb21-3fb6-adb3-0c1ef87d5f8b> /Applications/example.app/Contents/Frameworks/PurchasesHybridCommon.framework/Versions/A/PurchasesHybridCommon
0x1055e0000 - 0x105743fff org.cocoapods.RevenueCat (2.0.2) <f8222322-613d-3bd8-b4ad-56eee8b8439f> /Applications/example.app/Contents/Frameworks/RevenueCat.framework/Versions/A/RevenueCat
0x105d04000 - 0x105d6bfff org.cocoapods.absl (2.0.2) <c64d89d3-01d0-362d-a377-1e33be3181db> /Applications/example.app/Contents/Frameworks/absl.framework/Versions/A/absl
0x104fc0000 - 0x104fc7fff org.cocoapods.app-links (2.0.2) <46122497-0c16-3e23-9be4-ce874ee72a11> /Applications/example.app/Contents/Frameworks/app_links.framework/Versions/A/app_links
0x105420000 - 0x105427fff org.cocoapods.desktop-window (2.0.2) <36d07d42-a30b-362d-85b5-88747751d8aa> /Applications/example.app/Contents/Frameworks/desktop_window.framework/Versions/A/desktop_window
0x105528000 - 0x10552ffff org.cocoapods.device-info-plus (2.0.2) <6282b875-03ca-3fe8-8943-95b03fc401cb> /Applications/example.app/Contents/Frameworks/device_info_plus.framework/Versions/A/device_info_plus
0x10554c000 - 0x105553fff org.cocoapods.facebook-auth-desktop (2.0.2) <8543a986-8ac7-3752-a237-fad55d222902> /Applications/example.app/Contents/Frameworks/facebook_auth_desktop.framework/Versions/A/facebook_auth_desktop
0x1054e0000 - 0x1054ebfff org.cocoapods.flutter-secure-storage-macos (2.0.2) <3f2a4c61-3edf-3c76-a078-508499eba919> /Applications/example.app/Contents/Frameworks/flutter_secure_storage_macos.framework/Versions/A/flutter_secure_storage_macos
0x1074c4000 - 0x10777bfff org.cocoapods.grpc (2.0.2) <70c31967-a83e-3da3-87d6-d40f4325b90a> /Applications/example.app/Contents/Frameworks/grpc.framework/Versions/A/grpc
0x105a98000 - 0x105af3fff org.cocoapods.grpcpp (2.0.2) <d36960c7-26f0-3425-8559-2afd4cae448f> /Applications/example.app/Contents/Frameworks/grpcpp.framework/Versions/A/grpcpp
0x105508000 - 0x10550ffff org.cocoapods.in-app-review (2.0.2) <8e3c92d8-4032-33be-92e8-fd0ca3f1edc1> /Applications/example.app/Contents/Frameworks/in_app_review.framework/Versions/A/in_app_review
0x105c3c000 - 0x105c6bfff org.cocoapods.leveldb (2.0.2) <c711345a-cee8-3a56-afdc-98661aa05b53> /Applications/example.app/Contents/Frameworks/leveldb.framework/Versions/A/leveldb
0x105cc4000 - 0x105ccbfff org.cocoapods.mic-stream (2.0.2) <0cb53e49-17ba-3d53-abf8-1b0ea83fc60a> /Applications/example.app/Contents/Frameworks/mic_stream.framework/Versions/A/mic_stream
0x105570000 - 0x105573fff org.cocoapods.nanopb (2.0.2) <6d7d55cb-0c62-3d8c-9a14-2eec4f6e6adb> /Applications/example.app/Contents/Frameworks/nanopb.framework/Versions/A/nanopb
0x106960000 - 0x106a57fff org.cocoapods.openssl-grpc (2.0.2) <35309b27-6803-3fbd-ace8-a6cc93eb25e9> /Applications/example.app/Contents/Frameworks/openssl_grpc.framework/Versions/A/openssl_grpc
0x105ce4000 - 0x105ce7fff org.cocoapods.package-info-plus (2.0.2) <69c286b5-aba6-3074-abec-15190bbc83fc> /Applications/example.app/Contents/Frameworks/package_info_plus.framework/Versions/A/package_info_plus
0x105e7c000 - 0x105e83fff org.cocoapods.path-provider-foundation (2.0.2) <ad609672-be5d-37b3-9a03-bdc3a47fe8c9> /Applications/example.app/Contents/Frameworks/path_provider_foundation.framework/Versions/A/path_provider_foundation
0x105e3c000 - 0x105e43fff org.cocoapods.share-plus (2.0.2) <57536212-d854-3179-9c18-7d8d8489b778> /Applications/example.app/Contents/Frameworks/share_plus.framework/Versions/A/share_plus
0x105ee4000 - 0x105eebfff org.cocoapods.shared-preferences-foundation (2.0.2) <3ea57e02-7c85-3fbf-91c9-8de727c43d69> /Applications/example.app/Contents/Frameworks/shared_preferences_foundation.framework/Versions/A/shared_preferences_foundation
0x106518000 - 0x10651ffff org.cocoapods.url-launcher-macos (2.0.2) <062c7773-6659-33a2-be8d-fe2a9a13e493> /Applications/example.app/Contents/Frameworks/url_launcher_macos.framework/Versions/A/url_launcher_macos
0x105e9c000 - 0x105eb7fff org.cocoapods.uv (2.0.2) <ba4f1953-d423-34dd-9f76-4f06db65a0cb> /Applications/example.app/Contents/Frameworks/uv.framework/Versions/A/uv
0x10973c000 - 0x10a31ffff io.flutter.flutter-macos (2.0.2) <4c4c446c-5555-3144-a1c2-83bdb5a7a7b0> /Applications/example.app/Contents/Frameworks/FlutterMacOS.framework/Versions/A/FlutterMacOS
0x1067a0000 - 0x1067abfff libobjc-trampolines.dylib (*) <80f14f3d-d099-3693-a8e0-eb9a526b1790> /usr/lib/libobjc-trampolines.dylib
0x119908000 - 0x11a49ffff io.flutter.flutter.app (2.0.2) <679ba11e-9411-3a53-a6a4-eeeac2b5c15e> /Applications/example.app/Contents/Frameworks/App.framework/Versions/A/App
0x107000000 - 0x107133fff com.apple.audio.units.Components (1.14) <3318bd64-e64f-3e69-991d-605d1bc10d7d> /System/Library/Components/CoreAudio.component/Contents/MacOS/CoreAudio
0x18689b000 - 0x1868d4fe7 libsystem_kernel.dylib (*) <7acbd9bc-d056-310e-858d-81b116cf6d28> /usr/lib/system/libsystem_kernel.dylib
0x1868d5000 - 0x1868e1fff libsystem_pthread.dylib (*) <b401cfb3-8dfe-32db-92b3-ba8af0f8ca6e> /usr/lib/system/libsystem_pthread.dylib
0x186773000 - 0x1867f1ff7 libsystem_c.dylib (*) <9277aff7-3cc3-30d0-99b7-c62680da95cf> /usr/lib/system/libsystem_c.dylib
0x18687f000 - 0x18689afff libc++abi.dylib (*) <fa1e66a8-48dd-3435-a00e-4fcd9bf5de69> /usr/lib/libc++abi.dylib
0x186538000 - 0x18657df3f libobjc.A.dylib (*) <25a3d3ea-8a9e-3a8f-becc-0199e4ed6f94> /usr/lib/libobjc.A.dylib
0x18693b000 - 0x186e14fff com.apple.CoreFoundation (6.9) <b4fdaece-9727-3969-b014-27f7f24c8e01> /System/Library/Frameworks/CoreFoundation.framework/Versions/A/CoreFoundation
0x1ec69a000 - 0x1ec7b4fff com.apple.audio.AVFAudio (1.0) <4a3f0500-7b8c-35c9-8be4-e78396ca9eeb> /System/Library/Frameworks/AVFAudio.framework/Versions/A/AVFAudio
0x186728000 - 0x18676ffff libdispatch.dylib (*) <8e87dc0e-a570-3933-b37d-5e05ad516206> /usr/lib/system/libdispatch.dylib
0x1901bd000 - 0x1904f0fff com.apple.HIToolbox (2.1.1) <5f34bbf5-653a-31a5-b4b3-0a02c91ab488> /System/Library/Frameworks/Carbon.framework/Versions/A/Frameworks/HIToolbox.framework/Versions/A/HIToolbox
0x189b9f000 - 0x18aaadfff com.apple.AppKit (6.9) <0218f27e-98c0-3af4-809a-138a01479f4c> /System/Library/Frameworks/AppKit.framework/Versions/C/AppKit
0x18657e000 - 0x18660c53b dyld (*) <6f2c2bb8-4bbc-3b64-b927-d3f3193b6295> /usr/lib/dyld
0x18b13f000 - 0x18b502fff com.apple.CFNetwork (1406.0.4) <bf4e5300-6bc6-3feb-ab50-4266ac8fca01> /System/Library/Frameworks/CFNetwork.framework/Versions/A/CFNetwork
0x1878d4000 - 0x188299fff com.apple.Foundation (6.9) <b7d67e5a-dce2-3f6b-b2b8-895a3669e3ec> /System/Library/Frameworks/Foundation.framework/Versions/C/Foundation
0x1867f2000 - 0x18687eff7 libc++.1.dylib (*) <79cab92f-5e03-31e7-b2bd-feafdfd2bbde> /usr/lib/libc++.1.dylib
0x18fef8000 - 0x18ff20fff com.apple.audio.caulk (1.0) <06456788-36d4-3e9d-ab9a-eab934756fe4> /System/Library/PrivateFrameworks/caulk.framework/Versions/A/caulk
External Modification Summary:
Calls made by other processes targeting this process:
task_for_pid: 0
thread_create: 0
thread_set_state: 0
Calls made by this process:
task_for_pid: 0
thread_create: 0
thread_set_state: 0
Calls made by all processes on this machine:
task_for_pid: 0
thread_create: 0
thread_set_state: 0
I would like to use this library to feed audio data to GStreamer on the server.
I don't think the data mic_stream sends is RTP, but is there any way to receive the data in GStreamer?
Hello,
I have checked the basic functionality on the Android emulator and the audio stream works fine.
However, on the iOS 15.0 simulator I have the problem that the stream does not start. It gets stuck at the output "flutter: wait for stream". So it looks like native iOS is not starting the microphone audio stream. BTW, there is no request for mic access within the app. Of course, I added the necessary lines in Info.plist.
Any ideas why the stream is not starting on iOS?
You could add Windows support using the https://pub.dev/packages/flutter_webrtc package.
Hello, I am trying to get a stream from the microphone but can't make it work. I have just copied the whole code from the sample project without modifying anything, and it still does not work. The app throws the following error when trying to get the stream:
E/IAudioFlinger(17562): createRecord returned error -1
E/AudioRecord(17562): createRecord_l(0): AudioFlinger could not create record track, status: -1
E/AudioRecord-JNI(17562): Error creating AudioRecord instance: initialization check failed with status -1.
E/android.media.AudioRecord(17562): Error code -20 when initializing native AudioRecord object.
E/flutter (17562): [ERROR:flutter/lib/ui/ui_dart_state.cc(186)] Unhandled Exception: PlatformException(-1, PlatformError, null, null)
I have added the permission to the manifest:
Running flutter clean or uninstalling the app doesn't help.
I am using a real device (Redmi Note 7) for testing.
Flutter version:
Flutter 2.0.2 • channel unknown • unknown source
Framework • revision 8962f6dc68 (4 months ago) • 2021-03-11 13:22:20 -0800
Engine • revision 5d8bf811b3
Tools • Dart 2.12.1
Package version: 0.4.0
Hello, I have the same problem on my device, a OnePlus 6 (A6003), with version 0.5.1.
I/flutter (18307): LISTEN NOW: Instance of 'CastStream<dynamic, Uint8List>'
E/IAudioFlinger(18307): createRecord returned error -1
E/AudioRecord(18307): createRecord_l(411): AudioFlinger could not create record track, status: -1
E/AudioRecord-JNI(18307): Error creating AudioRecord instance: initialization check failed with status -1.
E/android.media.AudioRecord(18307): Error code -20 when initializing native AudioRecord object.
It works fine on the Android emulator. Only the real device is the problem.
Flutter 2.2.3 • channel stable
I'm receiving an error when attempting to initialize the microphone from the mic_stream package.
_ControllerStream<dynamic>' is not a subtype of type 'Stream<Uint8List>
The implementation exactly follows your specification:
Stream<Uint8List> stream = microphone();
I can't discern whether I am doing something wrong or the package has an issue; please advise.
Is it possible to change the buffer size? All the microphone samples in my use case seem to be 1028 bytes.
Can I make this smaller? I'm trying to stream this data over the network, so I want the buffer size to be smaller to improve latency.
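For a rough sense of the numbers (assuming 16-bit mono at 16 kHz; these are assumptions, not plugin guarantees), the buffering latency works out as:

```java
public class Latency {
    // Minimum latency contributed by buffering:
    // bufferBytes / (sampleRate * bytesPerSample) seconds, here in ms.
    static double bufferLatencyMs(int bufferBytes, int sampleRate, int bytesPerSample) {
        return 1000.0 * bufferBytes / (sampleRate * bytesPerSample);
    }

    public static void main(String[] args) {
        // 1028-byte buffers of 16-bit mono audio at 16 kHz
        System.out.println(bufferLatencyMs(1028, 16000, 2)); // about 32 ms
    }
}
```

So at these settings each buffer holds roughly 32 ms of audio; halving the buffer size would halve that contribution, though the OS-side AudioRecord minimum buffer still bounds how low it can go.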
Is there a plan for adding support for additional platforms such as Linux and web?
Thanks!
Hey, first of all thanks for the work you've done.
My application requires me to start and stop recording frequently with different settings (sample rate, bit depth), but I've noticed the configuration won't change when calling MicStream.microphone after the first time!
I've noticed that on line 90 of mic_stream.dart you're working with the cached _microphone stream from previous initializations, which won't reconfigure the microphone with the new settings. Is this intended behavior? If so, can you add an option to force re-initialization every time?
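The caching behavior described above can be illustrated with a minimal pure-Dart sketch; the names below are hypothetical stand-ins, not the plugin's actual internals:

```dart
import 'dart:async';
import 'dart:typed_data';

Stream<Uint8List>? _cached;
int? _cachedSampleRate;

/// Mimics the reported behavior: the first call stores the stream, and every
/// later call returns it, silently ignoring the new sampleRate.
Stream<Uint8List> microphone({int sampleRate = 16000}) {
  _cached ??= _open(sampleRate);
  return _cached!;
}

/// A forced re-initialization would have to drop the cache whenever the
/// requested settings differ from the cached ones.
Stream<Uint8List> microphoneForced({int sampleRate = 16000}) {
  if (_cachedSampleRate != sampleRate) _cached = null;
  _cachedSampleRate = sampleRate;
  _cached ??= _open(sampleRate);
  return _cached!;
}

// Stand-in for the real platform-channel call.
Stream<Uint8List> _open(int sampleRate) => Stream<Uint8List>.empty();
```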
I have a function which invokes MicStream.microphone. I want to be able to pass my function optional values for sample rate, audio format and so on, and pass them on to microphone.
That works fine, but because of the way the default values are specified in the microphone call itself, it is difficult for my function to simply fall back to microphone's defaults.
For example:
Future<Stream<Uint8List>?> myMicFunc({
int sampleRate = ???,
...
}) async {
...
final stream = await MicStream.microphone(sampleRate: sampleRate);
...
}
The ??? in the code above is the problem. As it stands I can't simply use MicStream's default value (_DEFAULT_SAMPLE_RATE) because it is private. I could copy the value (16000), but that seems tacky and error-prone.
One solution would be to make _DEFAULT_SAMPLE_RATE publicly accessible, so my code would become:
Future<Stream<Uint8List>?> myMicFunc({
int sampleRate = MicStream.DEFAULT_SAMPLE_RATE,
...
}) async {
...
final stream = await MicStream.microphone(sampleRate: sampleRate);
...
}
However that's also pretty fiddly.
I think a better solution would be to make the optional parameters to microphone nullable. For example:
static Future<Stream<Uint8List>?> microphone({
...
int? sampleRate,
...
}) async {
sampleRate ??= _DEFAULT_SAMPLE_RATE;
// code from here as before
if (sampleRate < _MIN_SAMPLE_RATE || sampleRate > _MAX_SAMPLE_RATE) {
throw (RangeError.range(sampleRate, _MIN_SAMPLE_RATE, _MAX_SAMPLE_RATE));
...
(I'm only showing sampleRate here, but the principle applies to all defaulting arguments.)
This would mean the calling function could be simply defined:
Future<Stream<Uint8List>?> myMicFunc({
int? sampleRate,
...
}) async {
...
final stream = await MicStream.microphone(sampleRate: sampleRate);
...
}
I have found in general that making default parameters nullable and applying the default where it is needed works better than putting default values in the parameter lists of Dart functions.
Is this a change you would consider?
have planned for iOS?
Hey,
So I've been using the older 0.2.1 version of the package. Now I'd like to upgrade to the new version, since I see iOS support is there.
But...
The behaviour of the package has changed, and I cannot figure out how to adapt. Previously, when requesting 16-bit PCM, a list of integers was returned. These integers were signed, but I'm not sure what size they were.
I now see that the new version returns a Uint8List, i.e. a list of unsigned 8-bit integers (raw bytes), which is totally different from what was happening before.
Can anyone explain what exactly happened here, why this breaking change isn't documented anywhere, and how to convert this new Uint8List to the old list of integers? I have some complex logic that works with the old list, and I'm having trouble mapping the new data to the old format, or even using the new data directly.
Thanks.
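For the conversion itself, the raw bytes can be reinterpreted as signed 16-bit samples. A hedged sketch, assuming the stream is 16-bit PCM in host (little-endian) byte order; verify the byte order against your device:

```dart
import 'dart:typed_data';

/// Reinterprets a raw byte buffer as signed 16-bit PCM samples.
/// Creates a view over the same underlying buffer (no copy): each byte pair
/// becomes one signed sample in host byte order.
Int16List bytesToSamples(Uint8List bytes) {
  return bytes.buffer.asInt16List(bytes.offsetInBytes, bytes.length ~/ 2);
}
```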
I am just here to say hi
I'm trying to use this mic stream in an app and play some sound effects at the same time. But once the mic stream initializes and starts listening, all the sound effects are silenced. I tried the just_audio and audioplayers libs, and both fail to play any sound while this mic plugin is active. This behavior occurs only on iOS; on Android, audio input and output work well at the same time. What might be wrong, and how can I fix it?
Hi, thanks for the plugin, I have an API suggestion.
Right now MicStream.microphone() returns a Future<Stream<Uint8List>?>. However, that is surprising, and I think the Future is unnecessary: it should return just a Stream<Uint8List> (not nullable).
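Until the API changes, the Future can be hidden behind a plain Stream on the caller's side. A sketch assuming the current Future<Stream<Uint8List>?> signature (flatten is a hypothetical helper, not part of mic_stream):

```dart
import 'dart:async';
import 'dart:typed_data';

/// Wraps a Future-returning stream API into a plain Stream.
/// Stream.fromFuture emits the inner stream once it resolves, and asyncExpand
/// forwards its events; a null inner stream becomes an empty stream.
Stream<Uint8List> flatten(Future<Stream<Uint8List>?> pending) {
  return Stream.fromFuture(pending)
      .asyncExpand((stream) => stream ?? const Stream.empty());
}
```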
Hey, great package. Could we have the functionality of streaming the device's internal audio as well?
There is an application called Audio Relay which has similar functionality: it streams the device's internal audio (audio of apps like Chrome or VLC) on both Mac and Windows.
Granted, we have to use BlackHole for audio routing and then stream the audio to the device IP via Audio Relay.
I want similar functionality in Flutter; is there any scope for this?
Hi, thanks for creating this library :)
I'm running into a bit of a problem with this package on macOS. When I try to get the microphone, I get the following error:
MissingPluginException(No implementation found for method requestPermissions on channel flutter.baseflow.com/permissions/methods)
Looking into permission_handler, it seems it does not support macOS, so that might be the reason?
On Android (Flutter), I am calling MicStream.microphone from the isolate that I started (so, not from UI isolate).
This causes an exception because the permissionStatus function is called, which contains this line:
var micStatus = await handler.Permission.microphone.request();
If I remove this line, everything else works. I think you cannot request a permission from a non-UI isolate; you can, however, query the status with Permission.microphone.status.
Would you consider removing the permissions code from MicStream.microphone, or introducing a parameter to skip it?