
voice-overlay-ios's Introduction

Voice Overlay for iOS

Platform: iOS · Swift 4 compatible · Carthage compatible · CocoaPods compatible · License: MIT

Overview

Voice overlay helps you turn your user's voice into text, providing a polished UX while handling the necessary permissions for you.

Internally, it uses the native SFSpeechRecognizer to perform the speech-to-text conversion.


Demo

You can clone and run the Demo project by running pod install and then building the project in Xcode.

Installation

Swift Package Manager

The Swift Package Manager is a tool for managing the distribution of Swift code. It’s integrated with the Swift build system to automate the process of downloading, compiling, and linking dependencies.

To use Swift Package Manager, open your project with Xcode 11 or later, click File -> Swift Packages -> Add Package Dependency, and enter the InstantSearch VoiceOverlay repository's URL.

If you're a framework author and use VoiceOverlay as a dependency, update your Package.swift file:

let package = Package(
    // 1.1.0 ..< 2.0.0
    dependencies: [
        .package(url: "https://github.com/algolia/voice-overlay-ios", from: "1.1.0")
    ],
    // ...
)
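If your manifest also declares targets, the dependency usually has to be referenced there as well. A minimal sketch, assuming the product is named InstantSearchVoiceOverlay (the product name is an assumption; check the package's own manifest):

```swift
// Sketch: declaring the library as a target dependency.
// The product name "InstantSearchVoiceOverlay" and the target name
// "MyLibrary" are assumptions for illustration.
targets: [
    .target(
        name: "MyLibrary",
        dependencies: [
            .product(name: "InstantSearchVoiceOverlay", package: "voice-overlay-ios")
        ]
    )
]
```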

CocoaPods

InstantSearchVoiceOverlay is available through CocoaPods. To install it, add the following line to your Podfile:

pod 'InstantSearchVoiceOverlay', '~> 1.1.0'

Carthage

Carthage is a simple, decentralized dependency manager for Cocoa.

To install InstantSearchVoiceOverlay, add the following line to your Cartfile:

github "algolia/voice-overlay-ios" ~> 1.1.0

Usage

  1. In Info.plist, add these 2 string properties along with a description:
  • Privacy - Microphone Usage Description (NSMicrophoneUsageDescription), with a description like: Need the mic for audio to text
  • Privacy - Speech Recognition Usage Description (NSSpeechRecognitionUsageDescription), with a description like: Need the speech recognition capabilities for searching tags

  2. Start the Voice Overlay and listen to the text output:
import UIKit
import InstantSearchVoiceOverlay

class ViewController: UIViewController {

    let voiceOverlayController = VoiceOverlayController()

    @objc func voiceButtonTapped() {
        voiceOverlayController.start(on: self, textHandler: { (text, final) in
            print("voice output: \(String(describing: text))")
            print("voice output: is it final? \(String(describing: final))")
        }, errorHandler: { (error) in
            print("voice output: error \(String(describing: error))")
        })
    }
}

Customization

You can customize your voice overlay by modifying the settings property of the voiceOverlayController:

/// Specifies whether the overlay directly starts recording (true), 
/// or if it requires the user to click the mic (false). Defaults to true.
voiceOverlayController.settings.autoStart = true

/// Specifies whether the overlay stops recording after the user stops talking for `autoStopTimeout`
/// seconds (true), or if it requires the user to click the mic (false). Defaults to true.
voiceOverlayController.settings.autoStop = true

/// When autoStop is set to true, autoStopTimeout determines the amount of
/// silence time of the user that causes the recording to stop. Defaults to 2.
voiceOverlayController.settings.autoStopTimeout = 2

/// The layout and style of all screens of the voice overlay.
voiceOverlayController.settings.layout.<someScreen>.<someConstant>

// Use Xcode autocomplete to see all possible screens and constants that are customizable.
// Examples:

/// The voice suggestions that appear in bullet points
voiceOverlayController.settings.layout.inputScreen.subtitleBulletList = ["Suggestion1", "Sug2"]
/// Change the title of the input screen when the recording is ongoing.
voiceOverlayController.settings.layout.inputScreen.titleListening = "my custom title"
/// Change the background color of the permission screen.
voiceOverlayController.settings.layout.permissionScreen.backgroundColor = UIColor.red
/// And many more...

Changing Locale or SpeechController

You can change the locale or the SpeechController when initializing your voiceOverlayController like so:

lazy var voiceOverlayController: VoiceOverlayController = {
  let recordableHandler = {
    return SpeechController(locale: Locale(identifier: "en_US"))
  }
  return VoiceOverlayController(speechControllerHandler: recordableHandler)
}()

You can create your own custom SpeechController class by implementing the Recordable protocol.

Note that in Swift 4, you can use Locale.current.languageCode to get the current language code.
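Putting the two together, a minimal sketch of a controller that follows the device locale instead of a hard-coded identifier (assuming the SpeechController(locale:) initializer shown above; Locale.current is standard Foundation):

```swift
import Foundation
import InstantSearchVoiceOverlay

// Sketch: build the overlay with the device's current locale.
// Locale.current.identifier is e.g. "en_US" on a US-English device.
let voiceOverlayController = VoiceOverlayController(speechControllerHandler: {
    SpeechController(locale: Locale.current)
})
```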

Delegate

Optionally, to listen to text and error events, set voiceOverlayController.delegate = self and implement the method of the VoiceOverlayDelegate protocol.

// Second way to listen to recording through delegate
func recording(text: String?, final: Bool?, error: Error?) {
    if let error = error {
        print("delegate: error \(error)")
    }
    
    if error == nil {
        print("delegate: text \(String(describing: text))")
    }
}
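Wiring the delegate up is just a conformance plus an assignment; a minimal sketch (the recording(text:final:error:) requirement is the method shown above, and the delegate assignment matches the demo project):

```swift
import UIKit
import InstantSearchVoiceOverlay

class SearchViewController: UIViewController, VoiceOverlayDelegate {

    let voiceOverlayController = VoiceOverlayController()

    override func viewDidLoad() {
        super.viewDidLoad()
        // Register for delegate callbacks in addition to (or instead of)
        // the closure-based handlers.
        voiceOverlayController.delegate = self
    }

    // VoiceOverlayDelegate requirement.
    func recording(text: String?, final: Bool?, error: Error?) {
        if let error = error {
            print("delegate: error \(error)")
        } else {
            print("delegate: text \(String(describing: text))")
        }
    }
}
```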

How it handles when Permissions are missing

When there are missing permissions, the voice overlay will guide the user to the correct section of the settings app.
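The permission states behind that guidance can also be inspected by hand; a sketch using the standard Speech and AVFoundation permission APIs, independent of this library:

```swift
import Speech
import AVFoundation

// Sketch: inspect the two permissions the overlay depends on.
// These are standard system APIs, not part of InstantSearchVoiceOverlay.
func logPermissionStatus() {
    // Speech recognition permission (Speech framework).
    switch SFSpeechRecognizer.authorizationStatus() {
    case .authorized:    print("speech: authorized")
    case .denied:        print("speech: denied, the overlay will point to Settings")
    case .restricted:    print("speech: restricted")
    case .notDetermined: print("speech: not asked yet")
    @unknown default:    print("speech: unknown")
    }

    // Microphone permission (AVFoundation).
    switch AVAudioSession.sharedInstance().recordPermission {
    case .granted:      print("mic: granted")
    case .denied:       print("mic: denied, the overlay will point to Settings")
    case .undetermined: print("mic: not asked yet")
    @unknown default:   print("mic: unknown")
    }
}
```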

Result Screen (Beta)

The result screen appears when showResultScreen is set to true.

/// Whether or not to show a result screen after the recording is finished.
voiceOverlayController.settings.showResultScreen = true

/// Timeout for showing the result screen in case no resultScreenText is provided on time.
voiceOverlayController.settings.showResultScreenTimeout = 2

/// Time for showing the result screen with the provided resultScreenText.
voiceOverlayController.settings.showResultScreenTime = 4

/// The processed result screen text that should appear in the result screen.
voiceOverlayController.settings.resultScreenText = NSAttributedString(string: myString, attributes: myAttributes)

The widget provides a resultScreenHandler for when the result screen is dismissed (provided the "Start again" button is not clicked). The handler provides the text that has been set in resultScreenText beforehand.

voiceOverlayController.start(on: self, textHandler: { (text, final) in
    print("getting \(String(describing: text))")
    print("is it final? \(String(describing: final))")

    if final {
        // Process the result to post in the result screen.
        // The timer here simulates a network processing call that took 1.5 seconds.
        Timer.scheduledTimer(withTimeInterval: 1.5, repeats: false, block: { (_) in
            let myString = text
            let myAttribute = [ NSAttributedString.Key.foregroundColor: UIColor.red ]
            let myAttrString = NSAttributedString(string: myString, attributes: myAttribute)

            self.voiceOverlayController.settings.resultScreenText = myAttrString
        })
    }
}, errorHandler: { (error) in
    print("error \(String(describing: error))")
}, resultScreenHandler: { (text) in
    print("Result Screen: \(text)")
})

Getting Help

Getting involved

  • If you want to contribute please feel free to submit pull requests.
  • If you have a feature request please open an issue.
  • If you use InstantSearch in your app, we would love to hear about it! Drop us a line on discourse or twitter.

License

InstantSearchVoiceOverlay is available under the MIT license. See the LICENSE file for more info.

voice-overlay-ios's People

Contributors

plnech, robertmogos, shaialkoby, spinach, vladislavfitz


voice-overlay-ios's Issues

Crash when dismissed

In VoiceOverlayController -> showRecordingScreen -> line number 112, the app crashes due to trying to set a deallocated object. I fixed it by replacing unowned with weak and making self optional (self?.), so please fix it.

Change Locale error

When I change the locale like so:

var voiceOverlayController: VoiceOverlayController {
    let recordableHandler = {
        return SpeechController(locale: Locale(identifier: "en_US"))
    }
    return VoiceOverlayController(speechControllerHandler: recordableHandler)
 }

I get a crash at runtime here:

inputViewController.dismissHandler = { [unowned self] (retry) in
    self.inputViewController = nil
    if retry {
        self.showRecordingScreen(view)
    }
}

Fatal error: Attempted to read an unowned reference but the object was already deallocated
2019-01-10 10:09:32.479917+0700 hoaanhdao[2018:372531] Fatal error: Attempted to read an unowned reference but the object was already deallocated
(lldb)

Fatal Exception: com.apple.coreaudio.avfaudio

I am facing this issue on iOS 17. Any help to resolve this issue is appreciated.

Fatal Exception: com.apple.coreaudio.avfaudio
required condition is false: IsFormatSampleRateAndChannelCountValid(hwFormat)

The exception starts from line 106 of SpeechController:

node.installTap(onBus: 0,
                bufferSize: SpeechController.AUDIO_BUFFER_SIZE,
                format: recordingFormat) { [weak self] (buffer, _) in
    self?.speechRequest?.append(buffer)
}

stacktrace:
Fatal Exception: com.apple.coreaudio.avfaudio
0 CoreFoundation 0xf27f4 (Missing UUID 9f046e3672863a6ea280699d6e47cfaf)
1 libobjc.A.dylib 0x19eb4 (Missing UUID 49e2dcb3f0143fcf949bf5f57b3ef0a8)
2 CoreFoundation 0x1186f4 (Missing UUID 9f046e3672863a6ea280699d6e47cfaf)
3 AVFAudio 0x1724 (Missing UUID f4c157ac08683041a562e1d6ad8323ab)
4 AVFAudio 0xc5038 (Missing UUID f4c157ac08683041a562e1d6ad8323ab)
5 AVFAudio 0x234f4 (Missing UUID f4c157ac08683041a562e1d6ad8323ab)
6 AVFAudio 0xb8580 (Missing UUID f4c157ac08683041a562e1d6ad8323ab)
7 MyTestApp 0xd2750 SpeechController.record(textHandler:errorHandler:) + 106 (SpeechController.swift:106)
8 MyTestApp 0xd22d0 closure #1 in SpeechController.startRecording(textHandler:errorHandler:) + 82 (SpeechController.swift:82)
9 MyTestApp 0xd31ac partial apply for closure #1 in SpeechController.requestAuthorization(_:)
10 MyTestApp 0xd20a4 thunk for @escaping @callee_guaranteed (@unowned SFSpeechRecognizerAuthorizationStatus) -> () ()
11 libdispatch.dylib 0x1cb8 (Missing UUID dc1d018771493100bc63f633afebee6c)
12 libdispatch.dylib 0x3910 (Missing UUID dc1d018771493100bc63f633afebee6c)
13 libdispatch.dylib 0x6a5c (Missing UUID dc1d018771493100bc63f633afebee6c)
14 libdispatch.dylib 0x151f4 (Missing UUID dc1d018771493100bc63f633afebee6c)
15 libdispatch.dylib 0x15a04 (Missing UUID dc1d018771493100bc63f633afebee6c)
16 libsystem_pthread.dylib 0x30d8 (Missing UUID daf953735de639a1a6ced87f3f0629cc)
17 libsystem_pthread.dylib 0x1e30 (Missing UUID daf953735de639a1a6ced87f3f0629cc)

After setting up the project, while speaking I am getting an error (VoiceOverlay-Demo[640:138261] [avas] AVAudioSessionPortImpl.mm:56:ValidateRequiredFields: Unknown selected data source for Port Speaker (type: Speaker))

Hello,

I set up the demo project and gave all the permissions mentioned in the Readme section. But while testing, the application is not able to listen to anything, and every time I get the error mentioned below:

VoiceOverlay-Demo[640:138261] [avas] AVAudioSessionPortImpl.mm:56:ValidateRequiredFields: Unknown selected data source for Port Speaker (type: Speaker)

My ViewController file :

//
// ViewController.swift
// VoiceOverlay-Demo
//
// Created by Guy Daher on 25/06/2018.
// Copyright © 2018 Algolia. All rights reserved.
//

import UIKit
import InstantSearchVoiceOverlay
import Speech

class ViewController: UIViewController, VoiceOverlayDelegate {

    let voiceOverlayController = VoiceOverlayController()
    let button = UIButton()
    let label = UILabel()

    override func viewDidLoad() {
        super.viewDidLoad()
        // Do any additional setup after loading the view, typically from a nib.

        let margins = view.layoutMarginsGuide

        button.addTarget(self, action: #selector(buttonTapped), for: .touchUpInside)
        label.text = "Result Text from the Voice Input"

        label.font = UIFont.boldSystemFont(ofSize: 16)
        label.lineBreakMode = .byWordWrapping
        label.numberOfLines = 0
        label.textAlignment = .center

        button.setTitle("Start using voice", for: .normal)
        button.setTitleColor(.white, for: .normal)
        button.titleLabel?.font = UIFont.boldSystemFont(ofSize: 18)
        button.backgroundColor = UIColor(red: 255/255.0, green: 64/255.0, blue: 129/255.0, alpha: 1)
        button.layer.cornerRadius = 7
        button.layer.borderWidth = 1
        button.layer.borderColor = UIColor(red: 237/255, green: 82/255, blue: 129/255, alpha: 1).cgColor

        label.translatesAutoresizingMaskIntoConstraints = false
        button.translatesAutoresizingMaskIntoConstraints = false

        self.view.addSubview(label)
        self.view.addSubview(button)

        NSLayoutConstraint.activate([
            label.leadingAnchor.constraint(equalTo: margins.leadingAnchor, constant: 10),
            label.trailingAnchor.constraint(equalTo: margins.trailingAnchor, constant: -10),
            label.topAnchor.constraint(equalTo: margins.topAnchor, constant: 110),
        ])

        NSLayoutConstraint.activate([
            button.leadingAnchor.constraint(equalTo: margins.leadingAnchor, constant: 10),
            button.trailingAnchor.constraint(equalTo: margins.trailingAnchor, constant: -10),
            button.centerYAnchor.constraint(equalTo: margins.centerYAnchor, constant: 10),
            button.heightAnchor.constraint(equalToConstant: 50),
        ])

        voiceOverlayController.delegate = self

        // If you want to start recording as soon as modal view pops up, change to true
        voiceOverlayController.settings.autoStart = true
        voiceOverlayController.settings.autoStop = true
        voiceOverlayController.settings.showResultScreen = true

        voiceOverlayController.settings.layout.inputScreen.subtitleBulletList = ["Suggestion1", "Suggestion2"]
    }

    @objc func buttonTapped() {
        // First way to listen to recording through callbacks
        voiceOverlayController.start(on: self, textHandler: { (text, final, extraInfo) in
            print("callback: getting \(String(describing: text))")
            print("callback: is it final? \(String(describing: final))")

            if final {
                let string = "Hello, World!"
                let utterance = AVSpeechUtterance(string: string)
                utterance.voice = AVSpeechSynthesisVoice(language: "en-US")

                let synth = AVSpeechSynthesizer()
                synth.speak(utterance)
                // here can process the result to post in a result screen
                Timer.scheduledTimer(withTimeInterval: 1.5, repeats: false, block: { (_) in
                    let myString = text
                    let myAttribute = [NSAttributedString.Key.foregroundColor: UIColor.red]
                    let myAttrString = NSAttributedString(string: myString, attributes: myAttribute)

                    self.voiceOverlayController.settings.resultScreenText = myAttrString
                    self.voiceOverlayController.settings.layout.resultScreen.titleProcessed = "BLA BLA"
                })
            }
        }, errorHandler: { (error) in
            print("callback: error \(String(describing: error))")
        }, resultScreenHandler: { (text) in
            print("Result Screen: \(text)")
        })
    }

    // Second way to listen to recording through delegate
    func recording(text: String?, final: Bool?, error: Error?) {
        if let error = error {
            print("delegate: error \(error)")
        }
        if error == nil {
            label.text = text
        }
    }
}

Regards,
Sandeep

Crash when voice controller finishes

Is this already usable?

Because I get a crash when the controller finishes.
https://cl.ly/cf78707938f5/Screenshot%2525202018-10-12%252520at%25252016.19.37.png

voice output: error Error Domain=kAFAssistantErrorDomain Code=4 "(null)" UserInfo={NSUnderlyingError=0x600001698000 {Error Domain=SiriCoreSiriConnectionErrorDomain Code=4 "(null)"}}
voice output: error Error Domain=kAFAssistantErrorDomain Code=4 "(null)" UserInfo={NSUnderlyingError=0x600001698000 {Error Domain=SiriCoreSiriConnectionErrorDomain Code=4 "(null)"}}
Fatal error: Attempted to read an unowned reference but the object was already deallocated
2018-10-12 16:32:59.928673+0200 simpleclub[5988:1421136] Fatal error: Attempted to read an unowned reference but the object was already deallocated

Process finished with exit code 0

remove search screen

Hi, first of all thanks for the great library.
I just want to get text from speech and compare it with my text to check a condition, so I don't need to show the "Searching for" screen. Do you have any solution for this? Thanks

Unable to access settings and init with locale using Objective-C

#import "InstantSearchVoiceOverlay-Swift.h"

// can only use init 
VoiceOverlayController *voiceOverlayController = [[VoiceOverlayController alloc] init]; 

// error
voiceOverlayController.settings.autoStopTimeout = 0.5;

// *** Terminating app due to uncaught exception 'NSUnknownKeyException', reason: '[<InstantSearchVoiceOverlay.VoiceOverlayController 0x2827adb80> setValue:forUndefinedKey:]: this class is not key value coding-compliant for the key settings.'
[voiceOverlayController setValue:@(0.5) forKeyPath:@"settings.autoStopTimeout"];

Crash when first request permission

Hi, thanks for this cool library, but I found a problem when my searchController is in the tabBarController.

When I requested permission for the first time and allowed it, the application crashed; after reopening the application everything went smoothly. But then another problem arises: when the input page appears, the menu on the tab bar is not overlaid like it is on the permission page.

I really appreciate if you help me solve the problem, cheers

Voice Audio is not working

This library's voice audio was working perfectly, but after a few days I realized my existing working feature had stopped working. I also tried the example code and it is not working now either.

Note: I tried both versions 1.1.0 and 1.0.0. It is showing [AXTTSCommon] Failure starting audio queue \M-3<…> error

Problem in SpeechController

After recording is done, I want to speak out loud the text I just said. But I guess the SpeechController is still running in the background, even though SpeechController.stopRecording() is called from deinit.
I am trying to use this code:

let string = "Hello, World!"
let utterance = AVSpeechUtterance(string: string)
utterance.voice = AVSpeechSynthesisVoice(language: "en-US")

let synth = AVSpeechSynthesizer()
synth.speak(utterance)

But no utterance is happening. Please help me.

Getting NSInternalInconsistencyException on allow microphone

Hi,

We are seeing some crashes which happen when the voice overlay search shows the permission view controllers.

Error:
Fatal Exception: NSInternalInconsistencyException
accessing _cachedSystemAnimationFence requires the main thread

I can't reproduce it on my simulator...
It occurs on several iOS versions and devices.
How could this be solved? Is there something happening not on the UI thread?

Stacktrace
Fatal Exception: NSInternalInconsistencyException
0  CoreFoundation                 0x1b9d83758 __exceptionPreprocess
1  libobjc.A.dylib                0x1b8f8bd00 objc_exception_throw
2  CoreFoundation                 0x1b9c99434 +[_CFXNotificationTokenRegistration keyCallbacks]
3  Foundation                     0x1ba773754 -[NSAssertionHandler handleFailureInMethod:object:file:lineNumber:description:]
4  UIKitCore                      0x1e6c5ad98 -[UIApplication _cachedSystemAnimationFenceCreatingIfNecessary:]
5  UIKitCore                      0x1e6c5adc8 -[UIApplication _systemAnimationFenceCreatingIfNecessary:]
6  UIKitCore                      0x1e6ca88c0 +[UIWindow _synchronizedDrawingFence]
7  UIKitCore                      0x1e7196fa0 +[_UIKeyboardChangedInformation informationForKeyboardUp:withIAV:]
8  UIKitCore                      0x1e6afb558 -[_UIRemoteKeyboards prepareToMoveKeyboard:withIAV:showing:forScreen:]
9  UIKitCore                      0x1e6ae4360 -[UIPeripheralHost(UIKitInternal) setInputViews:animationStyle:]
10 UIKitCore                      0x1e6ae6120 -[UIPeripheralHost(UIKitInternal) _restoreInputViewsWithId:animated:]
11 UIKitCore                      0x1e66aaca4 -[UIViewController _restoreInputViewsForPresentation]
12 UIKitCore                      0x1e65bf1f4 -[UIPresentationController runTransitionForCurrentState]
13 UIKitCore                      0x1e65bd1d4 -[UIPresentationController _dismissWithAnimationController:interactionController:target:didEndSelector:]
14 UIKitCore                      0x1e66aa7a0 -[UIViewController _dismissViewControllerWithAnimationController:interactionController:completion:]
15 UIKitCore                      0x1e66aa3ec -[UIViewController _dismissViewControllerWithTransition:from:completion:]
16 UIKitCore                      0x1e66a9bb8 -[UIViewController dismissViewControllerWithTransition:completion:]
17 UIKitCore                      0x1e66a9974 -[UIViewController dismissViewControllerWithTransition:completion:]
18 UIKitCore                      0x1e66a9178 -[UIViewController _performCoordinatedPresentOrDismiss:animated:]
19 UIKitCore                      0x1e66abf58 -[UIViewController dismissViewControllerAnimated:completion:]
20 InstantSearchVoiceOverlay      0x10342d26c UIViewController.dismissMe(animated:completion:) (Extensions.swift:24)
21 InstantSearchVoiceOverlay      0x103430628 partial apply for closure #1 in closure #1 in PermissionViewController.allowMicrophoneTapped() (PermissionViewController.swift:54)
22 InstantSearchVoiceOverlay      0x10342f15c thunk for @escaping @callee_guaranteed (@unowned Bool) -> () (<compiler-generated>)
23 AVFAudio                       0x1bfca5a40 __42-[AVAudioSession requestRecordPermission:]_block_invoke
24 AudioToolbox                   0x1bde79020 invocation function for block in AudioSessionRequestRecordPermission_Common(void (unsigned char) block_pointer)
25 TCC                            0x1bcb4b964 __TCCAccessRequest_block_invoke.75
26 TCC                            0x1bcb4fb78 __tccd_send_message_block_invoke
27 libxpc.dylib                   0x1b99d5e38 _xpc_connection_reply_callout
28 libxpc.dylib                   0x1b99c902c _xpc_connection_call_reply_async
29 libdispatch.dylib              0x1b979597c _dispatch_client_callout3
30 libdispatch.dylib              0x1b97ad83c _dispatch_mach_msg_async_reply_invoke
31 libdispatch.dylib              0x1b97a5684 _dispatch_kevent_worker_thread
32 libsystem_pthread.dylib        0x1b998fac0 _pthread_wqthread
33 libsystem_pthread.dylib        0x1b9995dc4 start_wqthread

Issue with Unwrapped SpeechRequest in SpeechController Causing Nil Issue in iOS 17

In the SpeechController module, there is an explicit unwrapping of the speechRequest, which causes a nil crash, particularly on iOS 17. The issue arises when tapping the screen while recording: this previously displayed the existing text but now malfunctions because speechRequest is nil. Also, does this affect the usage of AVAudioPlayer?

AVAudioEngine Exception

Hey,

This library works perfectly on almost all the devices, but occasionally I get an exception of the following:

Fatal Exception: com.apple.coreaudio.avfaudio required condition is false: format.sampleRate == hwFormat.sampleRate

The exception occurs on line 100 of SpeechController on the following statement:
let node = audioEngine.inputNode

stacktrace:

Fatal Exception: com.apple.coreaudio.avfaudio
0 CoreFoundation 0x1a92eea48 __exceptionPreprocess
1 libobjc.A.dylib 0x1a9015fa4 objc_exception_throw
2 CoreFoundation 0x1a91f0e88 +[_CFXNotificationTokenRegistration keyCallbacks]
3 AVFAudio 0x1b5de5b8c AVAE_RaiseException(NSString*, ...)
4 AVFAudio 0x1b5de5afc AVAE_Check(char const*, int, char const*, char const*, bool)
5 AVFAudio 0x1b5e80888 AVAudioIONodeImpl::SetOutputFormat(unsigned long, AVAudioFormat*)
6 AVFAudio 0x1b5e7a7a4 -[AVAudioNode setOutputFormat:forBus:]
7 AVFAudio 0x1b5e92198 AVAudioEngineImpl::UpdateInputNode(bool)
8 AVFAudio 0x1b5e8e484 -[AVAudioEngine inputNode]
9 InstantSearchVoiceOverlay 0x101cc6ad4 SpeechController.record(textHandler:errorHandler:) + 100 (SpeechController.swift:100)
10 InstantSearchVoiceOverlay 0x101cc6580 closure #1 in SpeechController.startRecording(textHandler:errorHandler:) + 81 (SpeechController.swift:81)
11 InstantSearchVoiceOverlay 0x101cc772c partial apply for closure #1 in SpeechController.requestAuthorization(_:) + 67 (SpeechController.swift:67)
12 InstantSearchVoiceOverlay 0x101cc633c thunk for @escaping @callee_guaranteed (@unowned SFSpeechRecognizerAuthorizationStatus) -> () ()

Set locale crash

I set the locale like this:

   @IBOutlet weak var btn: UIButton!
    @IBOutlet weak var tv: UITextView!

   
    var voice2:VoiceOverlayController {
        let recordableHandler = {
            return SpeechController(locale: Locale(identifier: "ja_JP"))
        }
        return VoiceOverlayController(speechControllerHandler: recordableHandler)
    }
    
    override func viewDidLoad() {
        super.viewDidLoad()

        // Do any additional setup after loading the view.
        btn.addTarget(self, action: #selector(voiceButtonTapped2), for: .touchUpInside)
        
        
    }

    
    @objc func voiceButtonTapped2() {
        voice2.start(on: self, textHandler: { (text, final, extraInfo)  in
            print("voice output: \(String(describing: text))")
            print("voice output: is it final? \(String(describing: final))")
            if final{
                self.tv.text = text
            }
            
        }, errorHandler: { (error) in
            print("voice output: error \(String(describing: error))")
        })
    }
The crash is:

Fatal error: Attempted to read an unowned reference but the object was already deallocated
2019-03-18 17:08:43.601895+0700 test01[21328:8262914] Fatal error: Attempted to read an unowned reference but the object was already deallocated

Crash while calling and recording at the same time.

The operation couldn’t be completed. (OSStatus error 561017449.)
2020-12-14 12:46:05.868110+0530 Allie[1307:360867] [aurioc] AURemoteIO.cpp:1095:Initialize: failed: 561017449 (enable 1, outf< 2 ch,      0 Hz, Float32, non-inter> inf< 2 ch,      0 Hz, Float32, non-inter>)
2020-12-14 12:46:08.789219+0530 Allie[1307:360469] In Get State
2020-12-14 12:46:08.791580+0530 Allie[1307:360469] GPS Permission is given!!
2020-12-14 12:46:08.793475+0530 Allie[1307:360867] [avae]            AVAEInternal.h:76    required condition is false: [AVAEGraphNode.mm:823:CreateRecordingTap: (IsFormatSampleRateAndChannelCountValid(format))]

A solution I found on Stack Overflow that worked is changing this code:

let recordingFormat = node.outputFormat(forBus: 0)

to this:

let recordingFormat = AVAudioFormat(standardFormatWithSampleRate: 44100, channels: 1)

in the SpeechController class. Let me know your views on this?
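One caveat with that workaround: AVAudioFormat(standardFormatWithSampleRate:channels:) is a failable initializer in Swift, so the result should be unwrapped rather than used directly. A minimal sketch of the suggested change with the optional handled (not the library's actual code):

```swift
import AVFoundation

// Sketch: the fixed-format workaround, with the failable initializer
// handled explicitly instead of force-unwrapped.
func makeRecordingFormat() -> AVAudioFormat? {
    // 44.1 kHz, mono. Returns nil if the format cannot be created,
    // so callers should guard-let before passing it to installTap.
    return AVAudioFormat(standardFormatWithSampleRate: 44100, channels: 1)
}
```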

Error Domain=kAFAssistantErrorDomain Code=203

Sometimes I get this error:
[Utility] +[AFAggregator logDictationFailedWithError:] Error Domain=kAFAssistantErrorDomain Code=203 "Error" UserInfo={NSLocalizedDescription=Error, NSUnderlyingError=0x60000252eee0 {Error Domain=SiriSpeechErrorDomain Code=201 "(null)"}}

iOS 9 Support

Is it possible to allow the use of iOS 9 as a minimum deployment target?

Not able to implement in Project

Hello,

As I am new to iOS development, I tried to use the pod in a new project but was not able to implement any of the features. If possible, could you please provide me a sample project?

Thanks in advance.

Regards,
Sandeep

AVAudioEngine Exception & Crash App

Hi,
Thank you for your support of this project. I want to report that when the ViewController for speech is opened multiple times and I speak, the app sometimes crashes.

File: SpeechController.swift

Method:

public func startRecording(textHandler: @escaping SpeechTextHandler, errorHandler: @escaping SpeechErrorHandler) {
    requestAuthorization { [unowned self] (authStatus) in
        if authStatus {
            if !self.audioEngine.isRunning { // <-- Error occurs here
                self.record(textHandler: textHandler, errorHandler: errorHandler)
            }
        } else {
            let errorMsg = "Speech recognizer needs to be authorized first"
            errorHandler(NSError(domain: "com.algolia.speechcontroller", code: 1, userInfo: [NSLocalizedDescriptionKey: errorMsg]))
        }
    }
}

Error:
Fatal error: Attempted to read an unowned reference but the object was already deallocated

Demo missing file

error: missing module map file: '/Users/Harvey/Downloads/voice-overlay-ios-master/VoiceOverlay-Demo/Pods/Target Support Files/InstantSearchVoiceOverlay/InstantSearchVoiceOverlay.modulemap' (in target 'InstantSearchVoiceOverlay')

Title and subtitle text get truncated

With specific combinations of title and subtitle text and device screen size you can end up with truncated text.

  • for permission screen subtitle was set to First, you need to grant device permissions., but visible text is just First, you need to grant device

Screen Shot 2019-07-09 at 12 02 42

  • for no permission screen title was set to Time can't create a Smart Memo, but visible text is just Time can't create a

Screen Shot 2019-07-09 at 12 02 50

I poked around the sample app and found that removing the override of viewDidLayoutSubviews() in those UIViewControllers fixes this issue. I'm not sure why it happens, though. preferredMaxLayoutWidth should drive the max width of a UILabel, but if constraints have been set for both leading and trailing, then preferredMaxLayoutWidth is not needed..?

Any thought on this?

Crash on iOS 12 and iOS 13 devices

2021-09-01 10:37:38.497110+0530 EduNet[2488:52836] [aqme] 254: AQDefaultDevice (1): output stream 0: null buffer
2021-09-01 10:37:38.527571+0530 EduNet[2488:52836] [aqme] 1640: EXCEPTION thrown (-50): -
2021-09-01 10:37:46.958175+0530 EduNet[2488:52702] RPCTimeout.mm:55:_ReportRPCTimeout: Initialize: Mach message timeout. Apparently deadlocked. Aborting now.
CoreSimulator 757.5 - Device: iPhone 7 Plus (69C5E06B-DE35-49CE-BF09-E997779A03F7) - Runtime: iOS 12.1 (16B91) - DeviceType: iPhone 7 Plus
Printing description of errorHandler:
expression produced error: error: Execution was interrupted, reason: EXC_BAD_ACCESS (code=EXC_I386_GPFLT).
The process has been returned to the state before expression evaluation.
