Comments (19)
Hi, interesting. I got it to connect yesterday, but it only happened twice, and since then I haven't gotten the connection again. It's so frustrating. Have you been able to figure it out?
from flutter-webrtc.
I got it to "work" here too (almost).
I force-filtered out all "typ host" candidates, which allowed the RTCPeerConnection to connect.
But something is still not quite right: when I try to render, I get a black video for both the local and remote videos.
Again, all of this happens only when running on a real iOS device.
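For reference, a minimal sketch of the workaround described above: dropping host candidates in the `onIceCandidate` callback before they reach the remote peer. `sendToSignaling` is a placeholder for whatever signaling layer you use, not part of flutter_webrtc.

```dart
import 'package:flutter_webrtc/flutter_webrtc.dart';

void registerFilteredIceCandidateHandler(
  RTCPeerConnection pc,
  void Function(RTCIceCandidate) sendToSignaling, // your own signaling code
) {
  pc.onIceCandidate = (RTCIceCandidate candidate) {
    final sdp = candidate.candidate ?? '';
    // Skip local-interface ("typ host") candidates; only forward
    // srflx/relay candidates to the remote peer.
    if (sdp.contains('typ host')) {
      return;
    }
    sendToSignaling(candidate);
  };
}
```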
Hello!
Did you find a possible reason for this issue or solve it?
I'm having it as well.
Sadly, I haven't gone back to the code since then, but I spent quite a long time on it. I should go over it again this weekend.
I'm in the same boat now, having the exact same issue. I've been working on this for a week; I discovered the issue today and there's no clear way to proceed from here.
That would be great!
Thanks
I have the same issues, at least when I run it on an iOS device.
When I run it on the web platform, it works fine, and it always connects to the server.
Two things I've noticed running on iOS devices:
1. It's not respecting the iceTransportPolicy: I'm trying to set it to relay or public, but in both cases I'm getting all candidates, including private IPs.
2. The SDP offer generated on the device is different from the one generated on the web:
a=candidate:149964835 1 udp 2122194687 192.168.3.80 51361 typ host generation 0 network-id 1 network-cost 10
a=candidate:4112558200 1 udp 2122063615 10.120.124.217 62449 typ host generation 0 network-id 13 network-cost 900
a=candidate:265544218 1 udp 2121932543 127.0.0.1 63276 typ host generation 0 network-id 11
a=candidate:125836694 1 udp 2122265343 fdf8:af05:cef:5100:1482:cdcb:d1af:c7d3 59154 typ host generation 0 network-id 2 network-cost 10
a=candidate:307215866 1 udp 2122131711 2804:389:b10f:1c13:b062:6db5:e7d5:cae8 51896 typ host generation 0 network-id 14 network-cost 900
a=candidate:3113597479 1 udp 2122005759 ::1 56213 typ host generation 0 network-id 12
a=candidate:1983883963 1 tcp 1518214911 192.168.3.80 51971 typ host tcptype passive generation 0 network-id 1 network-cost 10
a=candidate:2347715296 1 tcp 1518083839 10.120.124.217 51972 typ host tcptype passive generation 0 network-id 13 network-cost 900
a=candidate:1897660546 1 tcp 1517952767 127.0.0.1 51973 typ host tcptype passive generation 0 network-id 11
a=candidate:2035285774 1 tcp 1518285567 fdf8:af05:cef:5100:1482:cdcb:d1af:c7d3 51974 typ host tcptype passive generation 0 network-id 2 network-cost 10
a=candidate:1820352354 1 tcp 1518151935 2804:389:b10f:1c13:b062:6db5:e7d5:cae8 51975 typ host tcptype passive generation 0 network-id 14 network-cost 900
a=candidate:3344586943 1 tcp 1518025983 ::1 51976 typ host tcptype passive generation 0 network-id 12
a=candidate:3555944546 1 udp 1685987071 201.80.0.190 26035 typ srflx raddr 192.168.3.80 rport 51361 generation 0 network-id 1 network-cost 10
a=candidate:657206775 1 udp 1685855999 187.69.78.0 57601 typ srflx raddr 10.120.124.217 rport 62449 generation 0 network-id 13 network-cost 900
a=candidate:2243162187 1 udp 41820415 137.184.96.20 40354 typ relay raddr 201.80.0.190 rport 26035 generation 0 network-id 1 network-cost 10
a=candidate:2243162187 1 udp 41689343 137.184.96.20 23366 typ relay raddr 187.69.78.0 rport 57601 generation 0 network-id 13 network-cost 900
It contains localhost and private addresses that I suspect the server is trying to use.
Another thing: I had this working with a previous combination of early versions of Flutter, flutter_webrtc, and iOS (I'm not sure about the versions, though; mid-2023 was working fine).
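For context, the transport policy is normally set in the configuration map passed to `createPeerConnection`; a minimal sketch, where the TURN URL and credentials are placeholders:

```dart
import 'package:flutter_webrtc/flutter_webrtc.dart';

Future<RTCPeerConnection> createRelayOnlyConnection() async {
  final configuration = <String, dynamic>{
    'iceServers': [
      {
        'urls': 'turn:turn.example.com:3478', // placeholder TURN server
        'username': 'user',
        'credential': 'secret',
      },
    ],
    // With 'relay', only TURN-relayed candidates should be gathered,
    // so no host/private-IP candidates should appear in the SDP.
    'iceTransportPolicy': 'relay',
  };
  return await createPeerConnection(configuration);
}
```

If host candidates still show up in the SDP with this set, that points at the policy being ignored on the iOS side rather than at the configuration itself.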
Hi there!!
I've got it to work too, many times. I wasn't using a TURN server; when I put the devices on the same network, it works just fine.
Audio wasn't working for me, though, even though I enabled it and granted permission manually (not using the permission handler).
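On iOS, missing usage descriptions in Info.plist can silently break capture even when runtime permission was granted; a sketch of the keys flutter_webrtc typically needs (the description strings are up to you):

```xml
<!-- ios/Runner/Info.plist -->
<key>NSMicrophoneUsageDescription</key>
<string>This app needs microphone access for calls.</string>
<key>NSCameraUsageDescription</key>
<string>This app needs camera access for video calls.</string>
```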
wait, you've got it to work already?
Yes, I implemented it using Appwrite, then tried Firebase. Both work now, but only with the devices on the same network.
What helped me a lot was writing the steps down as sequential comments and making each comment a one- or two-method implementation. Breaking it down into smaller pieces and testing each one individually really helps!
Hi Harith,
I don't know if time permits, but I would really appreciate it if you shared those notes of sequential steps, including their comments. Actual code isn't necessary; pseudocode works, so I can know what to watch out for, plus the gotchas.
Sure!
The caller does the following steps to make a call:
1. Get the media from the device by opening the camera and the microphone, and put it in a local stream.
2. Create a peer connection, which is the heart of WebRTC. The peer connection class is used to connect the caller and the callee during the call, track the connection state, track the ICE gathering state, stream video and audio between the two peers, and more.
3. Get the tracks (video and audio combined) from the local stream and add them to the peer connection.
4. Create an offer.
5. Save the offer in the database so that the callee can fetch it and set it on their peer connection as the remote description.
6. Set the offer as the local description.
7. Set event listeners (onAddStream / onTrack / onIceCandidate):
   - onAddStream is called when a remote stream is added to the peer connection.
   - onTrack is the same as onAddStream but gives access to individual tracks.
   - onIceCandidate is called as soon as the offer is set as the local description.
8. Generate the caller's ICE candidates and send them to the database: the onIceCandidate listener fires right after the local description is set, ICE candidates are provided, and the caller saves them in the database.
9. Listen for an answer in the database.
10. When there is an answer, fetch it and set it as the remote description.
11. Listen for the callee's ICE candidates in the database (the callee created an answer and should be generating their own ICE candidates and sending them to the database).
12. Add the callee's ICE candidates to the peer connection.
The callee does the following steps to answer the call:
1. Get the media from the device by opening the camera and the microphone, and put it in a local stream.
2. Create a peer connection.
3. Get the tracks (video and audio combined) from the local stream and add them to the peer connection.
4. Get the offer from the database.
5. Set the offer as the remote description.
6. Set event listeners (onAddStream / onTrack / onIceCandidate).
7. Create an answer.
8. Save the answer in the database so that the caller can fetch it and set it on their peer connection as the remote description.
9. Set the answer as the local description.
10. The onIceCandidate listener fires right after; ICE candidates are provided, and the callee saves them in the database.
11. Get the caller's ICE candidates from the database.
12. Add them to the peer connection.
```dart
Future createRoom({
  required String userId,
  required String roomId,
}) async {
  final uuid = ref.read(uuidProvider);
  final allRelatedDocsId = uuid.v4();
  //! initialize peerConnection
  peerConnection = await createPeerConnection(_configuration);
  state = PeerConnectionCreated();
  //! set onAddStream listener
  setOnAddStreamListener();
  //! set onTrack listener
  setOnTrackListener();
  iceState();
  //! get tracks from local stream and add them to the peerConnection
  addLocalStreamTracksToPeerConnection();
  //! create offer
  final offer = await createOffer(roomId);
  //! send the offer to the database
  await sendOfferToDatabase(offer: offer, roomId: roomId);
  //! set the offer as a local description
  await setOfferAsLocalDescription(offer);
  //! get caller ICE candidates and send them to the database
  await getCallerIceCandidatesAndSendThemToDatabase(roomId);
  //! listen for answer in the database and set it as remote description
  listenForAnswerAndSetRemoteDescription(roomId);
  //! listen for callee ICE candidates and add them to peerConnection
  listenForCalleeIceCandidatesAndAddThemToPeerConnection(roomId);
  //! update callerIceDone to use it for cancelling the subscription
  //! that listens to callee ICE candidates
  // TODO: FIX THIS LATER
  await updateCallerIceDone(roomId);
}
```
```dart
Future joinRoom({
  required String roomId,
}) async {
  //! initialize peerConnection
  peerConnection = await createPeerConnection(_configuration);
  state = PeerConnectionCreated();
  //! set onAddStream listener
  setOnAddStreamListener();
  //! set onTrack listener
  setOnTrackListener();
  iceState();
  //! get tracks from local stream and add them to the peerConnection
  addLocalStreamTracksToPeerConnection();
  //! get the offer
  final offer = await getOffer(roomId);
  //! set the offer as remote description
  await setOfferAsRemoteDescription(offer: offer);
  //! create answer
  final answer = await createAnswer();
  //! send the answer to the database
  await updateAnswerInDatabase(answer: answer, roomId: roomId);
  //! set the answer as local description
  await setAnswerAsLocalDescription(answer);
  //! get local ICE candidates and send them to the database
  await getCalleeIceCandidatesAndSendThemToDatabase(roomId);
  //! get caller ICE candidates and add them to the peer connection
  await getCallerIceCandidatesFromDatabaseAddThemToPeerConnection(roomId);
}
```
iOS is a bit strict, maybe you're missing something!
final audioCallServiceProvider = Provider<WebRtcCallService>((ref) {
return WebRtcCallService(ref.watch(socketServiceProvider));
});
class CallNotifier extends AsyncNotifier<CallDetails> {
@override
FutureOr<CallDetails> build() async {
try {
print("Initial Call State");
return CallDetails.initial();
} catch (e) {
rethrow; //TODO
}
}
Future<void> makeACallTo(
String userId,
) async {
try {
await updateRemoteUserId(userId);
await _initializeRendererAndStartPeerConnection();
_registerPeerConnectionListeners();
await _getLocalUserAudioTracksAndAddToMediaStream();
await _addLocalUserAudioTracksToPeerConnection();
_registerListenerForIncomingTrackEventsFromRemotePeer();
_setupIncomingIceCandidateListeners();
await _createAndSendOfferTo(userId);
} catch (e, s) {
print("error making a call to: $userId");
print(e.toString());
print(s.toString());
rethrow; //TODO
}
}
Future<void> updateRemoteUserId(
String userId,
) async {
try {
await update((oldData) {
return oldData.copyWith(
remoteUserId: userId,
);
});
} catch (e) {
//TODO
rethrow; //TODO
}
}
Future<void> _initializeRendererAndStartPeerConnection() async {
try {
await state.requireValue.initializeRenderer();
final userPeerConnection = await createPeerConnection(
CallDetails.iceServerConfig,
);
await userPeerConnection.addTransceiver(
kind: RTCRtpMediaType.RTCRtpMediaTypeAudio,
init: RTCRtpTransceiverInit(
direction: TransceiverDirection.SendRecv,
),
);
await update((oldData) {
return oldData.copyWith(
userPeerConnection: userPeerConnection,
);
});
} catch (e) {
print("error with starting peer connection");
print(e.toString());
rethrow; //TODO
}
}
Future<void> _getLocalUserAudioTracksAndAddToMediaStream() async {
try {
// getting user local stream
//and save it;
final mediaStreams =
await ref.read(audioCallServiceProvider).getUserMedia();
await update((oldData) {
return oldData.copyWith(
localUserStream: mediaStreams.localStream,
remoteUserStream: mediaStreams.remoteStream,
);
});
} catch (e) {
rethrow; //TODO
}
}
Future<void> _addIncomingIceCandidatesToPeerConn(
RTCIceCandidate candidate,
) async {
try {
final userPeerConnection = state.requireValue.userPeerConnection!;
final callerId = state.requireValue.remoteUserId!;
await userPeerConnection.addCandidate(
candidate,
);
await update((oldData) {
return oldData.copyWith(
userPeerConnection: userPeerConnection,
sendIceCandidate: true,
);
});
await _sendAllBufferedIceCandidatesTo(callerId);
print("finished adding incoming ice candidate");
} catch (e) {
print("error adding incoming ice candidates");
print(e.toString());
rethrow; //TODO
}
}
Future<void> _addLocalUserAudioTracksToPeerConnection() async {
try {
print("adding audio tracks to call");
      final _localUserStream = state.requireValue.localUserStream;
      final _userPeerConnection = state.requireValue.userPeerConnection;
if (_localUserStream != null && _userPeerConnection != null) {
_localUserStream.getAudioTracks().forEach((eachAudioTrack) {
_userPeerConnection.addTrack(eachAudioTrack, _localUserStream);
});
await update((oldData) {
return oldData.copyWith(
userPeerConnection: _userPeerConnection,
);
});
print("finished adding local user audio tracks to call");
} else {
throw Exception(
"Either user stream or peer connection ain't available",
);
}
} catch (e) {
print("error adding user audio to peer connection");
print(e.toString());
rethrow; //TODO
}
}
Future<void> _createAndSendOfferTo(
String userId,
) async {
try {
// creating an offer;
final _userPeerConnection = state.requireValue.userPeerConnection;
if (_userPeerConnection != null) {
final _offer =
await _userPeerConnection.createOffer(CallDetails.voiceConstraints);
await _userPeerConnection.setLocalDescription(_offer);
await update((oldData) {
return oldData.copyWith(
userPeerConnection: _userPeerConnection,
);
});
final _response = await ref
.read(audioCallServiceProvider)
.sendCallOfferToUser(userId, data: _offer);
print("response from sending offer: $_response");
if (_response != null) {
await _userPeerConnection.setRemoteDescription(_response);
print("👀👀👀just set remote description ");
await update((oldData) {
return oldData.copyWith(
sendIceCandidate: true,
userPeerConnection: _userPeerConnection,
);
});
await _sendAllBufferedIceCandidatesTo(userId);
} else {
// TODO: notify user that call wasn't picked;
throw Exception("notify user that call wasn't picked;");
}
} else {
        throw Exception("interestingly, there's no active peer connection");
}
} catch (e) {
rethrow; //TODO
}
}
Future<void> respondToThisCallWith(
bool response,
OfferDetails callDetails,
) async {
try {
if (response) {
await _initializeRendererAndStartPeerConnection();
_registerPeerConnectionListeners();
_registerListenerForIncomingTrackEventsFromRemotePeer();
_setupIncomingIceCandidateListeners();
await updateRemoteUserId(callDetails.callerId);
//TODO: inform user of exchanging stuffs
final answer = await _acceptOfferandCreateAnswer(callDetails.offer);
await _sendResponseBackToCallerAndIncludingPendingIceCandidates(
response,
callDetails,
answer,
);
} else {
await callDetails.callback({
"response": response,
});
}
} catch (e, s) {
      print(s.toString());
print(e.toString());
rethrow; //TODO
}
}
Future<RTCSessionDescription> _acceptOfferandCreateAnswer(
RTCSessionDescription offer,
) async {
try {
final _userPeerConnection = state.requireValue.userPeerConnection;
if (_userPeerConnection != null) {
await _userPeerConnection.setRemoteDescription(offer);
print("👀👀👀just set remote description ");
// setup media
await _getLocalUserAudioTracksAndAddToMediaStream();
await _addLocalUserAudioTracksToPeerConnection();
final answer = await _userPeerConnection.createAnswer(
CallDetails.voiceConstraints,
);
await _userPeerConnection.setLocalDescription(answer);
await update((oldData) {
return oldData.copyWith(
userPeerConnection: _userPeerConnection,
);
});
return answer;
} else {
throw Exception("peer connection is interestingly not present");
}
} catch (e) {
// TODO
rethrow; //TODO
}
}
Future<void> _sendResponseBackToCallerAndIncludingPendingIceCandidates(
bool response,
OfferDetails callDetails,
RTCSessionDescription answer,
) async {
try {
await callDetails.callback({
"response": response,
"sdp": answer.toMap(),
});
print("successfully responded to call from ${callDetails.callerName}");
// await _sendAllBufferedIceCandidatesTo(callDetails.callerId);
// await update((oldData) {
// return oldData.copyWith(
// sendIceCandidate: true,
// );
// });
} catch (e) {
// /
rethrow; //TODO
}
}
void _setupIncomingIceCandidateListeners() {
try {
print('setting up incoming ice candidate listener');
ref
.read(audioCallServiceProvider)
.listenForIncomingIceCandidate()
.listen((newIceCandidates) async {
await _addIncomingIceCandidatesToPeerConn(newIceCandidates);
});
} catch (e, s) {
//
print(e.toString());
print(s.toString());
rethrow; //TODO
}
}
void _registerListenerForIncomingTrackEventsFromRemotePeer() {
final _userPeerConnection = state.requireValue.userPeerConnection;
  _userPeerConnection?.onTrack = (RTCTrackEvent event) async {
    print("got new track from other user");
    final _remoteUserStream = state.requireValue.remoteUserStream;
    // final _remoteRenderer = state.requireValue.remoteRenderer;
    // use a plain for loop so each state update is actually awaited
    // (forEach ignores the Future returned by an async closure)
    for (final eachAudioTrack in event.streams[0].getAudioTracks()) {
      print("adding each new track to remote user media stream");
      _remoteUserStream?.addTrack(eachAudioTrack);
      await update((oldData) {
        return oldData.copyWith(
          remoteUserStream: _remoteUserStream,
          // TODO: add renderer;
        );
      });
      // _remoteRenderer.srcObject = event.streams[0];
      // remoteRenderer.muted = false;
    }
  };
}
void _registerPeerConnectionListeners() {
print("registering events");
final _userPeerConnection = state.requireValue.userPeerConnection;
_userPeerConnection?.onIceGatheringState = (
RTCIceGatheringState iceGatheringState,
) async {
if (iceGatheringState ==
RTCIceGatheringState.RTCIceGatheringStateComplete) {
//
final _remoteUserId = state.requireValue.remoteUserId;
await _sendAllBufferedIceCandidatesTo(
_remoteUserId!,
);
}
print("ICE gathering state changed: $iceGatheringState");
};
_userPeerConnection?.onConnectionState = (
RTCPeerConnectionState state,
) async {
print("Connection state change: $state");
if (state == RTCPeerConnectionState.RTCPeerConnectionStateFailed) {
await _userPeerConnection.restartIce();
}
};
_userPeerConnection?.onSignalingState = (
RTCSignalingState state,
) {
print("Signaling state change: $state");
};
_userPeerConnection?.onAddStream = (
MediaStream newStream,
) async {
print("Got new stream");
await update((oldData) {
return oldData.copyWith(
remoteUserStream: newStream,
);
});
};
_userPeerConnection?.onIceConnectionState =
(RTCIceConnectionState state) async {
print("on ice connection state: ");
print(state);
};
_userPeerConnection?.onRenegotiationNeeded = () {
print("renegotiation needed, what are we going to do?");
// final offer = await userPeerConnection.createOffer()
// await userPeerConnection.setLocalDescription(offer);
};
_userPeerConnection?.onIceCandidate = (candidate) async {
print("on ice candidates ....");
final _remoteUserId = state.requireValue.remoteUserId;
final shouldSendIceCandidate = state.requireValue.shouldSendIceCandidate;
if (shouldSendIceCandidate) {
await _sendAllBufferedIceCandidatesTo(_remoteUserId!);
await ref.read(audioCallServiceProvider).sendIceCandidate(
_remoteUserId,
candidate,
);
} else {
print("sendIceCandidate is false for now");
await update((oldData) {
return oldData.copyWith(
iceCandidates: [...state.requireValue.iceCandidates, candidate],
);
});
}
};
}
Future<void> _sendAllBufferedIceCandidatesTo(
String remoteUserId,
) async {
try {
//
final iceCandidates = state.requireValue.iceCandidates;
if (iceCandidates.isNotEmpty) {
        print("sending from buffered list of length: ${iceCandidates.length}");
        // use a plain for loop so each send is actually awaited
        // (forEach ignores the Future returned by an async closure)
        for (final eachIceCandidate in iceCandidates) {
          await ref
              .read(audioCallServiceProvider)
              .sendIceCandidate(remoteUserId, eachIceCandidate);
        }
// reset the buffered list;
await update((oldData) {
return oldData.copyWith(
iceCandidates: [],
sendIceCandidate: true,
);
});
} else {
print("no buffered ice candidates in list");
}
} catch (e) {
//
print("struggled to send all pending ice candidates");
print(e.toString());
rethrow; //TODO
}
}
Future<void> endCall() async {
try {
// TODO
// await state.requireValue.stopAllPeerConnections();
} catch (e) {
print("ending the call");
print(e.toString());
rethrow; //TODO
}
}
Future<void> stopAllPeerConnections() async {
try {
final _userPeerConnection = state.requireValue.userPeerConnection;
await _userPeerConnection?.close();
await update((oldData) {
return oldData.copyWith(
userPeerConnection: null,
);
});
} catch (e) {
print("failed to stop peer connection");
rethrow; //TODO
}
}
Future<void> dispose() async {
try {
      await stopAllPeerConnections();
await state.requireValue.localRenderer.dispose();
await state.requireValue.remoteRenderer.dispose();
navigator.mediaDevices.ondevicechange = null;
} catch (e) {
print("error disposing the renderers");
print(e.toString());
rethrow; //TODO
}
}
// void notifyUserOfIncomingCall() {
// //
// }
// Future hangUpTheCall() async {
// try {
// // stop the local stream tracks
// // stop the remote stream tracks
// // stop the peer connection
// // _timer.cancel();
// await stopAllPeerConnections();
// } catch (e) {
// print("error hanging up the call");
// print(e.toString());
// rethrow; //TODO
// }
// }
// Future<void> selectAudioOutput(String? deviceId) async {
// try {
// // if (!_inCalling) {
// // return;
// // }
// // await _localRenderer.audioOutput(deviceId!);
// } catch (e) {
// rethrow; //TODO
// }
// }
// Future<void> changePhoneSpeakerState() async {
// try {
// // _speakerphoneOn = !_speakerphoneOn;
// // await Helper.setSpeakerphoneOn(_speakerphoneOn);
// // setState(() {});
// } catch (e) {
// rethrow; //TODO
// }
// }
// Future<void> changeMicrophoneMuteState() async {
// try {
// // _speakerphoneOn = !_speakerphoneOn;
// // await Helper.setSpeakerphoneOn(_speakerphoneOn);
// // setState(() {});
// } catch (e) {
// rethrow; //TODO
// }
// }
// Future<void> selectAudioInput(String? deviceId) async {
// _selectedAudioInputId = deviceId;
// if (callstage == CallStage.callInProgress) {
// return;
// }
// var newLocalStream = await navigator.mediaDevices.getUserMedia({
// 'audio': {
// if (_selectedAudioInputId != null) 'deviceId': _selectedAudioInputId,
// if (_selectedAudioInputId != null)
// 'optional': [
// {'sourceId': _selectedAudioInputId}
// ],
// },
// 'video': false,
// });
// localStream = newLocalStream;
// localRenderer.srcObject = localStream;
// var newTrack = newLocalStream.getAudioTracks().first;
// print('track.settings ' + newTrack.getSettings().toString());
// var sender =
// senders.firstWhereOrNull((sender) => sender.track?.kind == 'audio');
// await sender?.replaceTrack(newTrack);
// }
// Future<void> selectAudioOutput(String? deviceId) async {
// if (callstage != CallStage.callInProgress) {
// return;
// }
// await localRenderer.audioOutput(deviceId!);
// }
// void startCallTimer() {
// //
// }
// Future<void> stopEverything() async {
// try {
// localStream?.getTracks().forEach((track) async {
// await track.stop();
// });
// await localStream?.dispose();
// localStream = null;
// localRenderer.srcObject = null;
// remoteRenderer.srcObject = null;
// senders.clear();
// callstage = CallStage.callInProgressStopped;
// await stopAllPeerConnections();
// speakerphoneOn = false;
// await Helper.setSpeakerphoneOn(speakerphoneOn);
// } catch (e) {
// print(e.toString());
// }
// }
// void updateTimer() {
// try {
// _timer = Timer.periodic(Duration(seconds: 1), (timer) {
// timeElapsed++; // Update the timer every second
// });
// } catch (e) {
// //
// }
// }
}
final audioCallControllerProvider =
AsyncNotifierProvider<CallNotifier, CallDetails>(
CallNotifier.new,
);
// my call service
typedef OfferDetails = ({
String callerName,
String callerId,
RTCSessionDescription offer,
Function callback,
});
class WebRtcCallService {
final SocketService _socketService;
WebRtcCallService(this._socketService);
Future<List<MediaDeviceInfo>> loadDevices() async {
try {
if (WebRTC.platformIsAndroid || WebRTC.platformIsIOS) {
//Ask for runtime permissions if necessary.
var status = await Permission.bluetooth.request();
if (status.isPermanentlyDenied) {
print('BLEpermdisabled');
}
status = await Permission.bluetoothConnect.request();
if (status.isPermanentlyDenied) {
print('ConnectPermdisabled');
}
}
final devices = await navigator.mediaDevices.enumerateDevices();
print("printing devices");
devices.forEach((element) {
print("${element.deviceId} ${element.kind} ${element.label}");
});
return devices;
} catch (e) {
print(e.toString());
rethrow;
}
}
Future<({MediaStream localStream, MediaStream remoteStream})>
getUserMedia() async {
try {
final localStream = await navigator.mediaDevices.getUserMedia({
'audio': true,
'video': false,
});
final remoteStream = await createLocalMediaStream("remote_key");
return (localStream: localStream, remoteStream: remoteStream);
} catch (e) {
print("error getting user media");
print(e.toString());
rethrow;
}
}
Stream<OfferDetails> listenForIncomingCallOffer() {
final controller = StreamController<OfferDetails>.broadcast();
try {
_socketService
.listenForThisEvent(
AudioServiceSocketEvents.listenForCall.description,
)
.listen((event) async {
final mapData = event.$1;
// print('map from listening for call $mapData');
final socketCallbackFn = event.$2;
final callerName = mapData["caller"] as String;
final callerId = mapData["callerId"] as String;
final sdpContent = mapData["sdp"] as Json;
final offer = RTCSessionDescription(
sdpContent['sdp'] as String?,
sdpContent['type'] as String?,
);
controller.add((
offer: offer,
callerId: callerId,
callerName: callerName,
callback: socketCallbackFn,
));
});
} catch (e, s) {
      print("error listening for incoming call offer");
print(e.toString());
// rethrow;
controller.addError(e, s);
}
return controller.stream;
}
Future<RTCSessionDescription?> sendCallOfferToUser(
String userId, {
required RTCSessionDescription data,
}) async {
try {
Map<String, dynamic> dataToBeSent = {
"sdp": data.toMap(),
"recipientId": userId,
"callerId": _socketService.user.id,
"caller": _socketService.user.fullName,
};
final response = _socketService.emitThisEvent(
AudioServiceSocketEvents.makeCall.description,
dataToBeSent,
);
      final receivedData = await response.first;
      final receivedDataMap = receivedData.$1;
      print("printing response from call offer");
      // print(receivedDataMap);
      if (receivedDataMap["response"]) {
        final sdpAnswer = receivedDataMap["sdp"] as Map<String, dynamic>;
return RTCSessionDescription(
sdpAnswer["sdp"] as String?,
sdpAnswer["type"] as String?,
);
} else {
return null;
}
} catch (e) {
print("error with sending offer for call");
print(e.toString());
rethrow;
}
}
Future<bool> sendIceCandidate(
String userId,
RTCIceCandidate iceCandidate,
) async {
try {
print("about to start sending ice candidates");
final dataToBeSent = {
"recipientId": userId,
"candidate": iceCandidate.toMap(),
};
final result = _socketService.emitThisEvent(
AudioServiceSocketEvents.sendIceCandidates.description,
dataToBeSent,
);
final response = await result.first;
final responseMap = response.$1;
print("response from sending ice candidates:");
if (responseMap["isReady"]) {
return true;
}
return false;
} catch (e) {
print("error with sending ice candidates");
print(e.toString());
rethrow;
}
}
Stream<RTCIceCandidate> listenForIncomingIceCandidate() {
final controller = StreamController<RTCIceCandidate>.broadcast();
try {
_socketService
.listenForThisEvent(
AudioServiceSocketEvents.incomingIceCandidate.description,
)
.listen((event) async {
// print("got some ice candidate data");
final payload = event.$1;
final candidateData = payload["candidate"];
final candidate = RTCIceCandidate(
candidateData['candidate'] as String?,
candidateData['sdpMid'] as String?,
candidateData['sdpMLineIndex'] as int?,
);
controller.add(candidate);
});
} catch (e, s) {
      print("error listening for incoming ice candidate");
print(e.toString());
// rethrow;
controller.addError(e, s);
}
return controller.stream;
}
}
I believe the steps outlined by @harith are what I'm doing here,
but nothing seems to work...
Try to make it simple at first, without all the Riverpod providers. Use global private variables for the peer connection and the local and remote streams.
You can copy this code sample:
https://github.com/md-weber/webrtc_tutorial
It has comments for guidance. Replace parts with your own setup and, more importantly, try to write small functions and test each one before moving forward.
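A minimal sketch of that advice (global private variables instead of providers; `sendToSignaling` and the STUN URL are placeholders you'd replace with your own Appwrite/Firebase/socket code):

```dart
import 'package:flutter_webrtc/flutter_webrtc.dart';

RTCPeerConnection? _peerConnection;
MediaStream? _localStream;
MediaStream? _remoteStream;

Future<void> startCall(void Function(RTCIceCandidate) sendToSignaling) async {
  _localStream = await navigator.mediaDevices.getUserMedia({
    'audio': true,
    'video': false,
  });
  _peerConnection = await createPeerConnection({
    'iceServers': [
      {'urls': 'stun:stun.l.google.com:19302'}, // placeholder STUN server
    ],
  });
  // forward every local track to the connection
  for (final track in _localStream!.getTracks()) {
    await _peerConnection!.addTrack(track, _localStream!);
  }
  _peerConnection!.onTrack = (event) {
    if (event.streams.isNotEmpty) _remoteStream = event.streams[0];
  };
  _peerConnection!.onIceCandidate = sendToSignaling;
  final offer = await _peerConnection!.createOffer();
  await _peerConnection!.setLocalDescription(offer);
  // send offer.toMap() through your signaling channel here
}
```

Once this bare version connects, reintroduce the providers one piece at a time so you can see exactly which layer breaks it.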