
client-sdk-react-native's Introduction


livekit-react-native

Use this SDK to add real-time video, audio and data features to your React Native app. By connecting to a self- or cloud-hosted LiveKit server, you can quickly build applications like interactive live streaming or video calls with just a few lines of code.

Note

This is v2 of the React Native SDK. When migrating from v1.x to v2.x, you might encounter a small set of breaking changes. Read the migration guide for a detailed overview of what has changed.

Installation

NPM

npm install @livekit/react-native @livekit/react-native-webrtc

Yarn

yarn add @livekit/react-native @livekit/react-native-webrtc

This library depends on @livekit/react-native-webrtc, which has additional installation instructions of its own; see that package's documentation.


Once the @livekit/react-native-webrtc dependency is installed, one last step is needed to finish the installation:

Android

In your MainApplication.java file:

Java

import com.livekit.reactnative.LiveKitReactNative;
import com.livekit.reactnative.audio.AudioType;

public class MainApplication extends Application implements ReactApplication {

  @Override
  public void onCreate() {
    // Place this above any other RN related initialization
    // When AudioType is omitted, it'll default to CommunicationAudioType.
    // Use MediaAudioType if user is only consuming audio, and not publishing.
    LiveKitReactNative.setup(this, new AudioType.CommunicationAudioType());

    //...
  }
}

Or, in your MainApplication.kt file if you are using RN 0.73+:

Kotlin

import com.livekit.reactnative.LiveKitReactNative
import com.livekit.reactnative.audio.AudioType

class MainApplication : Application(), ReactApplication {
  override fun onCreate() {
    // Place this above any other RN related initialization
    // When AudioType is omitted, it'll default to CommunicationAudioType.
    // Use MediaAudioType if user is only consuming audio, and not publishing.
    LiveKitReactNative.setup(this, AudioType.CommunicationAudioType())

    //...
  }
}

iOS

In your AppDelegate.m file:

#import "LivekitReactNative.h"

@implementation AppDelegate

- (BOOL)application:(UIApplication *)application didFinishLaunchingWithOptions:(NSDictionary *)launchOptions
{
  // Place this above any other RN related initialization
  [LivekitReactNative setup];

  //...
}

Expo

LiveKit is available on Expo through development builds. You can find our Expo plugin and setup instructions here.
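As a rough sketch of what the plugin setup looks like, an app.json for a development build typically registers a LiveKit plugin alongside a WebRTC config plugin (the package names below are assumptions for illustration; confirm them against the setup instructions linked above):

```json
{
  "expo": {
    "plugins": [
      "@livekit/react-native-expo-plugin",
      "@config-plugins/react-native-webrtc"
    ]
  }
}
```

Because the native WebRTC modules are not included in Expo Go, you must rebuild your development client after adding the plugins.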

Example app

You can try our standalone example app here.

Usage

In your index.js file, set up the LiveKit SDK by calling registerGlobals(). This sets up the required WebRTC libraries for use in JavaScript, and is needed for LiveKit to work.

import { registerGlobals } from '@livekit/react-native';

// ...

registerGlobals();

In your app, wrap your component in a LiveKitRoom component, which manages a Room object and allows you to use our hooks to create your own real-time video/audio app.

import * as React from 'react';
import {
  StyleSheet,
  View,
  FlatList,
  ListRenderItem,
} from 'react-native';
import { useEffect } from 'react';
import {
  AudioSession,
  LiveKitRoom,
  useTracks,
  TrackReferenceOrPlaceholder,
  VideoTrack,
  isTrackReference,
  registerGlobals,
} from '@livekit/react-native';
import { Track } from 'livekit-client';

const wsURL = "wss://example.com"
const token = "your-token-here"

export default function App() {
  // Start the audio session first.
  useEffect(() => {
    let start = async () => {
      await AudioSession.startAudioSession();
    };

    start();
    return () => {
      AudioSession.stopAudioSession();
    };
  }, []);

  return (
    <LiveKitRoom
      serverUrl={wsURL}
      token={token}
      connect={true}
      options={{
        // Use screen pixel density to handle screens with differing densities.
        adaptiveStream: { pixelDensity: 'screen' },
      }}
      audio={true}
      video={true}
    >
      <RoomView />
    </LiveKitRoom>
  );
}

const RoomView = () => {
  // Get all camera tracks.
  // The useTracks hook grabs the tracks from LiveKitRoom component
  // providing the context for the Room object.
  const tracks = useTracks([Track.Source.Camera]);

  const renderTrack: ListRenderItem<TrackReferenceOrPlaceholder> = ({item}) => {
    // Render using the VideoTrack component.
    if(isTrackReference(item)) {
      return (<VideoTrack trackRef={item} style={styles.participantView} />)
    } else {
      return (<View style={styles.participantView} />)
    }
  };

  return (
    <View style={styles.container}>
      <FlatList
        data={tracks}
        renderItem={renderTrack}
      />
    </View>
  );
};

const styles = StyleSheet.create({
  container: {
    flex: 1,
    alignItems: 'stretch',
    justifyContent: 'center',
  },
  participantView: {
    height: 300,
  },
});

API documentation is located here.

Additional documentation for the LiveKit SDK can be found at https://docs.livekit.io/

Audio sessions

As seen in the example above, we've introduced an AudioSession class that helps manage the audio session on native platforms. It wraps AudioManager on Android and AVAudioSession on iOS.

You can customize the configuration of the audio session with configureAudio.

Android

Media playback

By default, the audio session is set up for bidirectional communication. In this mode, the audio framework exhibits the following behaviors:

  • The volume cannot be reduced to 0.
  • Echo cancellation is available and is enabled by default.
  • A microphone indicator can be displayed, depending on the platform.

If you're leveraging LiveKit primarily for media playback, you have the option to reconfigure the audio session to better suit media playback. Here's how:

useEffect(() => {
  let connect = async () => {
    // configure audio session prior to starting it.
    await AudioSession.configureAudio({
      android: {
        // currently supports .media and .communication presets
        audioTypeOptions: AndroidAudioTypePresets.media,
      },
    });
    await AudioSession.startAudioSession();
    await room.connect(url, token, {});
  };
  connect();
  return () => {
    room.disconnect();
    AudioSession.stopAudioSession();
  };
}, [url, token, room]);

Customizing audio session

Instead of using our presets, you can further customize the audio session to suit your specific needs.

await AudioSession.configureAudio({
  android: {
    preferredOutputList: ['earpiece'],
    // See [AudioManager](https://developer.android.com/reference/android/media/AudioManager)
    // for details on audio and focus modes.
    audioTypeOptions: {
      manageAudioFocus: true,
      audioMode: 'normal',
      audioFocusMode: 'gain',
      audioStreamType: 'music',
      audioAttributesUsageType: 'media',
      audioAttributesContentType: 'unknown',
    },
  },
});
await AudioSession.startAudioSession();

iOS

For iOS, the most appropriate audio configuration may change over time as local/remote audio tracks publish and unpublish from the room. To adapt to this, the useIOSAudioManagement hook is recommended over configuring the audio session just once for the lifetime of the room.

Screenshare

Enabling screenshare requires extra installation steps:

Android

Android screenshare requires a foreground service with type mediaProjection to be present.

The example app uses @supersami/rn-foreground-service for this.

Add the following permissions to your AndroidManifest.xml file:

<uses-permission android:name="android.permission.FOREGROUND_SERVICE" />
<uses-permission android:name="android.permission.FOREGROUND_SERVICE_MEDIA_PROJECTION" />

Declare the service and ensure it's labelled as a mediaProjection service, like so:

<service android:name="com.supersami.foregroundservice.ForegroundService" android:foregroundServiceType="mediaProjection" />
<service android:name="com.supersami.foregroundservice.ForegroundServiceTask" />

Once set up, start the foreground service prior to using screenshare.

iOS

iOS screenshare requires adding a Broadcast Extension to your iOS project. Follow the integration instructions here:

https://jitsi.github.io/handbook/docs/dev-guide/dev-guide-ios-sdk/#screen-sharing-integration

It involves copying the files found in this sample project to your iOS project, and registering a Broadcast Extension in Xcode.

It's also recommended to use CallKeep, to register a call with CallKit (as well as turning on the voip background mode). Due to background app processing limitations, screen recording may be interrupted if the app is restricted in the background. Registering with CallKit allows the app to continue processing for the duration of the call.

Once set up, iOS screenshare can be initiated like so:

// Required imports (in addition to your existing ones):
// import { findNodeHandle, NativeModules, Platform, View } from 'react-native';
// import { ScreenCapturePickerView } from '@livekit/react-native-webrtc';

const screenCaptureRef = React.useRef(null);
const screenCapturePickerView = Platform.OS === 'ios' && (
  <ScreenCapturePickerView ref={screenCaptureRef} />
);
const startBroadcast = async () => {
  if (Platform.OS === 'ios') {
    // On iOS, show the system broadcast picker before enabling screenshare.
    const reactTag = findNodeHandle(screenCaptureRef.current);
    await NativeModules.ScreenCapturePickerViewManager.show(reactTag);
    room.localParticipant.setScreenShareEnabled(true);
  } else {
    room.localParticipant.setScreenShareEnabled(true);
  }
};

return (
  <View style={styles.container}>
    {/* Make sure the ScreenCapturePickerView exists in the view tree. */}
    {screenCapturePickerView}
  </View>
);

Note

You will not be able to publish camera or microphone tracks on iOS Simulator.

Troubleshooting

Cannot read properties of undefined (reading 'split')

This error can occur when using yarn with duplicated or incompatible versions of livekit-client's dependencies.

To fix this, you can either:

  • use another package manager, like npm
  • use yarn-deduplicate to deduplicate dependencies
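To check whether duplicates are actually present before reaching for either fix, you can scan your lockfile. The sketch below (plain Node, for illustration; it assumes a yarn v1 `yarn.lock` layout) lists the versions a package resolves to; more than one version of `livekit-client` or its dependencies indicates duplication:

```javascript
// Sketch: list the versions a package resolves to in a yarn v1 lockfile.
// More than one resolved version of livekit-client is a common cause of
// the "Cannot read properties of undefined (reading 'split')" error.
function resolvedVersions(lockfileText, packageName) {
  const versions = new Set();
  const lines = lockfileText.split('\n');
  for (let i = 0; i < lines.length; i++) {
    // yarn v1 entry headers look like: livekit-client@^1.7.0:
    if (
      lines[i].startsWith(`${packageName}@`) ||
      lines[i].startsWith(`"${packageName}@`)
    ) {
      // the indented body of the entry contains: version "1.7.1"
      for (let j = i + 1; j < lines.length && lines[j].startsWith('  '); j++) {
        const m = lines[j].match(/^\s+version\s+"([^"]+)"/);
        if (m) versions.add(m[1]);
      }
    }
  }
  return [...versions];
}

// Example with a synthetic lockfile fragment:
const sample = [
  'livekit-client@^1.7.0:',
  '  version "1.7.1"',
  '',
  'livekit-client@^1.3.2:',
  '  version "1.3.2"',
].join('\n');
console.log(resolvedVersions(sample, 'livekit-client')); // two entries => duplicated
```

Point it at your project's yarn.lock (the filename of the script is up to you); if more than one version appears, deduplicate as described above.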

Contributing

See the contributing guide to learn how to contribute to the repository and the development workflow.

License

Apache License 2.0


LiveKit Ecosystem
Real-time SDKs: React Components · JavaScript · iOS/macOS · Android · Flutter · React Native · Rust · Python · Unity (web) · Unity (beta)
Server APIs: Node.js · Golang · Ruby · Java/Kotlin · Python · Rust · PHP (community)
Agents Frameworks: Python · Playground
Services: LiveKit server · Egress · Ingress · SIP
Resources: Docs · Example apps · Cloud · Self-hosting · CLI

client-sdk-react-native's People

Contributors

b0iq · davidliu · davidzhao · dsa · lukasio · ocupe · radko93 · wjaykim


client-sdk-react-native's Issues

New participants not showing up in room

I've downloaded the example app and set up a local Docker server.
The example app is running on two separate devices. I use the same URL and a different token for each device.

However, the devices don't update the session with the new participant. I get the following error across both devices:

could not get connected server address {"error": [Error: PeerConnection ID not found]}

Could you let me know what steps need to be taken in order to make this work?

Regarding a code example: again, it's the example app here (no code changed), with the Docker instructions from the LiveKit blog.

I assume there's more room configuration that needs to happen to add more participants?

Failure to compile for Android when starting from a blank project

Hi there,

I set up a new project using the instructions here: https://reactnative.dev/docs/environment-setup. I tried running the app with yarn android to make sure everything was working as expected.

Then, after adding the livekit libraries to the project:

yarn add https://github.com/livekit/react-native-webrtc.git#dl/wip-transceiver
yarn add https://github.com/livekit/client-sdk-react-native

When I try to run with yarn android, I get the following errors:

* What went wrong:
Execution failed for task ':livekit-react-native:compileDebugKotlin'.
> Compilation error. See log for more details

With a number of messages like this:

: (20, 15): Class 'kotlin.Unit' was compiled with an incompatible version of Kotlin. The binary version of its metadata is 1.6.0, expected version is 1.1.15.
The class is loaded from /Users/phillip/.gradle/caches/transforms-3/b4bd8b2d8b84f89e1b1eb3315da16eb4/transformed/jetified-kotlin-stdlib-1.6.10.jar!/kotlin/Unit.class
e: /Users/phillip/Projects/test/node_modules/livekit-react-native/android/src/main/java/com/livekit/reactnative/LivekitReactNativePackage.kt: (11, 16): Unresolved reference: listOf
e: /Users/phillip/test/node_modules/livekit-react-native/android/src/main/java/com/livekit/reactnative/LivekitReactNativePackage.kt: (15, 16): Unresolved reference: emptyList

Let me know if there's a step I'm missing, or any other logs would be helpful!

[livekit/react-native-webrtc] remove big objects from git history

The react-native-webrtc library is very large because of the WebRTC library's history.
[livekit/react-native-webrtc] is a fork, so I cannot file issues there; that's why I'm writing here.

This is the git bash output:
$ git rev-list --objects --all | grep "$(git verify-pack -v .git/objects/pack/*.idx | sort -k 3 -n | tail -5 | awk '{print$1}')"
a08059db5ed179e118b3517a7f2b756f886ec2b7 ios/WebRTC.framework/WebRTC
c15c0df402d0fd94d5725be4fc7380e09769604b ios/WebRTC.framework/WebRTC
46d395efa2e2ceab9c2057fd14da9bf2d6c1530b ios/WebRTC.framework/WebRTC
145102ef6cb1f51ac7d4b4f565a796445135347e ios/WebRTC.framework/WebRTC
dce70a686b0394b7aa767726f7bd73ef89fb964d ios/libjingle_peerconnection/libWebRTC.a

peer connection is null when other participant connected

Able to connect to the room, but when another participant joins, it gives:

unpublishing track
rn-webrtc:pc:DEBUG 0 createOffer +120ms
LOG rn-webrtc:pc:DEBUG 0 createOffer ERROR +4ms
WARN Possible Unhandled Promise Rejection (id: 0):
"peerConnection is null"

This works properly on the web, but not in React Native on Android.

1) Connect by calling the token API
2) Connected to the room
3) When another participant tries to join, it gives the above error

"react": "18.2.0",
"react-native": "0.71.2",
"@livekit/react-native": "^0.3.0",
"livekit-client": "^1.3.2",
"react-native-webrtc": "106.0.0",

Microphone requested and enabled by default for viewer or listener

Describe the bug

So I'm joining a room from my other phone as a viewer who cannot publish any tracks. The problem is that the viewer's microphone is enabled and in use the whole time!

To Reproduce

  1. Create new room
  2. Join as a host (Live Streaming)
  3. Then join as viewer

Expected behavior

The viewer's microphone should not be requested or enabled if they are not going to publish anything.

Screenshots

Viewer's screenshot and publisher's screenshot (attached).

Device Info:

Viewer: iPhone 11 [16.1.2]
Host: iPhone 14 [16.4.1]

Dependencies Info (please reference your package-lock.json or yarn.lock file):

  • @livekit/react-native: ^0.3.0
  • livekit-client: ^1.7.1
  • react-native-webrtc: ^111.0.0

When AudioSession.startAudioSession() called the app crashes without errors

Describe the bug
When AudioSession.startAudioSession() is called, the app crashes and closes without any error in development.

To Reproduce

I added the following code to make the output default to the speaker:

const room = await new Room({ publishDefaults: { simulcast: false }, adaptiveStream: true })
AudioSession.startAudioSession();
await AudioSession.selectAudioOutput("speaker");
room.connect(......) // connecting to a room

When I tried to log the available audio outputs:
console.log('Available Outputs: ', await AudioSession.getAudioOutputs());

It returns an empty array.
Does that mean no speaker or earpiece is available?

Even though I set the default audio output to the speaker on both Android and iOS, it doesn't work; the output still defaults to the earpiece.

Are there any prerequisite configurations that need to be done in the android folder?

Device Info (please complete the following information):

  • Device: Samsung Galaxy M13

  • OS: Android 12

  • React Version: 17.0.1

  • "react-native": "0.64.2"

I get an error while connecting to the server / room


This is the useEffect code:

useEffect(() => {
  console.log('1. Connecting to ROOM....', token);
  let url = getLiveKitSocket();
  console.log(url, 'url');
  if (token !== undefined) {
    console.log('2. Connecting to ROOM....', token);
    let connect = async () => {
      await AudioSession.configureAudio({
        android: {
          preferredOutputList: ['speaker'],
        },
        ios: {
          defaultOutput: 'speaker',
        },
      });
      await AudioSession.startAudioSession();

      room.connect(url, token, {}).catch(err => {
        console.error('Error Connecting to Room', err);
      });
      await room.localParticipant.setMicrophoneEnabled(true);
      // await room.localParticipant.enableCameraAndMicrophone();
    };
    connect();

    return () => {
      room.disconnect();
      AudioSession.stopAudioSession();
    };
  } else {
    console.log('NO TOKEN WHILE ROOM CONNECTION');
  }
}, [token, room]);

But I think the room is getting connected; the issue is with the participants.

const { cameraPublication, microphonePublication, isSpeaking } = useParticipant(
  room.localParticipant,
);

Unable to initialize livekit on expo

Describe the bug

After running yarn add @livekit/react-native and calling registerGlobals(), I get an error:

Your JavaScript code tried to access a native module that doesn't exist in this development client

Screenshots

Dependencies Info (please reference your package-lock.json or yarn.lock file):

"@livekit/react-native": "^0.3.0",
"react-native-webrtc": "^106.0.7",
"livekit-client": "^1.7.0",

iOS: Video stream does not resume when app is brought back into the foreground

Describe the bug

When a video call is in progress, the video stream for a participant on iOS will freeze for other participants if the app is backgrounded. When the iOS user brings the app back into the foreground, the video stream does not resume and remains frozen for all other participants. The local monitor stream for the user on iOS continues to work as expected, and audio is not interrupted at any time. There are also no track/participant/room events indicating that anything is interrupted, nor are there any relevant log entries at the debug level.

To Reproduce

Steps to reproduce the behavior:

  1. Connect to a room in the Example app
  2. Wait for another user to connect
  3. Put the app into the background on a device running iOS
  4. Bring the app back into the foreground
  5. The other participants will still see frozen video from the iOS user

Expected behavior

For the video stream to show as unavailable when the app is backgrounded (if it is not possible to remain streaming on iOS) and to resume streaming when the app is placed back into the foreground.

Screenshots

N/A

Device Info:

  • Device: iPhone 6s Plus, iPhone 14 Pro
  • OS: iOS 15.7.4, iOS 16.4.1

Dependencies Info (please reference your package-lock.json or yarn.lock file):

  • @livekit/react-native: 1.0.0
  • livekit-client: 1.8.0
  • react-native-webrtc: 111.0.0

This issue is NOT present with the following older dependencies:

  • @livekit/react-native: github:livekit/client-sdk-react-native#5ccd65b01e4ea591e10dac6cd2334d2ff6b60e34
  • livekit-client: 1.6.0
  • react-native-webrtc: 106.0.0

Issue with track subscription on iOS

Hi

The following error is logged while joining:

unable to subscribe because MediaStreamTrack is ended. Do not call MediaStreamTrack.stop()

The video view is black, but I can hear audio from the remote track.

client-sdk-react-native -> latest
react-native-webrtc -> 106.0.0-beta.5
iOS -> 16.1
react-native -> 0.69.5

Rolling react-native-webrtc back to https://github.com/livekit/react-native-webrtc.git#dl/wip-transceiver worked: there were no longer any errors, and I was able to subscribe to tracks.
I tried a couple of other beta versions of react-native-webrtc; the error did not appear, but the issue persists.

Thank you for the great work.

Can't see own camera picture on Android app

I added the modules with:
yarn add @livekit/react-native @livekit/react-native-webrtc

With @livekit/react-native-webrtc version 104.0.1, there is an error when opening the camera:
[Error: Exception in HostFunction: expected 0 arguments, got 1]

I tracked the source of the error to RTCRtpSender.getCapabilities(kind).
Opening the camera calls it in RTCEngine.ts (setPreferredCodec): https://github.com/livekit/client-sdk-js/blob/main/src/room/RTCEngine.ts#L609

@livekit/react-native-webrtc version 104.0.0 is fine.

I found that RTCRtpSender.getCapabilities changed in 104.0.1.

Android build does not work in dev

When I tried to build the app in dev mode with the following dependencies:

"@livekit/react-native": "^0.3.0", "react-native-webrtc": "^106.0.0"

The app builds successfully, but when I open it on the device: (screenshot attached)

Streaming quality is not as good as `meet.livekit.io`

Describe the bug

The streaming quality is weird. I tried several methods, but the receiving quality is still not the same as meet.livekit.io.

To Reproduce

Steps to reproduce the behavior:

Room configs:

const room = new Room({
  adaptiveStream: {
    pixelDensity: 'screen',
  },
  publishDefaults: {
    videoSimulcastLayers: [VideoPresets.h720, VideoPresets.h1080],
    simulcast: false,
  },
  // optimize publishing bandwidth and CPU for published tracks
  dynacast: false,
  videoCaptureDefaults: {
    facingMode: 'user',
    resolution: VideoPresets.h720, // Cannot set it more than that!! 1080 and above will not work
  },
});

// -------- SETTING THE QUALITY 
trackStreamStateChanged(
  publication: RemoteTrackPublication,
  streamState: Track.StreamState,
  participant: RemoteParticipant
) {
  publication.setVideoQuality(2);
},

// -------- RENDERING REMOTE VIDEO
const RemoteVideo = () => {
  const remote = useLiveStore((state) => state.remoteVideo);
  if (!remote || remote.isMuted) return <ZLoading backgroundColor="red" zIndex={1} />;
  return (
    <VideoView
      objectFit="contain"
      videoTrack={remote}
      mirror={false}
      style={{ flex: 1, backgroundColor: 'red' }}
    />
  );
};

The room is connected to a cloud server.

Expected behavior

The problem is that I cannot check the video quality, only the connection quality, which is excellent. But it should be better than meet.livekit.io because it is native.

Screenshots

Web viewer / web publisher and app viewer / app publisher screenshots (attached).

Most of the time the quality is worse; this is the highest I can get.

Device Info:

Viewer: iPhone 11 [16.1.2]
Host: iPhone 14 [16.4.1]

Dependencies Info (please reference your package-lock.json or yarn.lock file):

  • @livekit/react-native: ^0.3.0
  • livekit-client: ^1.7.1
  • react-native-webrtc: ^111.0.0
  • Server region is in Germany and I am located in Dubai

Reported connectionQuality value from useParticipant never improves

Describe the bug

connectionQuality returned by useParticipant will never recover from "poor" even if the connection improves.

Example session: https://cloud.livekit.io/projects/p_3tux3vr78ux/sessions/RM_mUW6ymhUZXo9/participants/PA_AZcvSwaWa2rb

To Reproduce

Steps to reproduce the behavior:

  1. Observe the connectionQuality value returned by useParticipant
  2. Wait for connectionQuality to degrade to poor
  3. Wait for the affected client's actual connection quality to improve
  4. See that the connectionQuality value will never change even if the actual connection quality changes

Expected behavior

Reported connection quality should improve as the client's connection improves. As you can see in the screenshot below, the connection quality in the example session improves and the framerate hovers around 30fps, but the improved connection quality is never reported to the publisher or any subscribers.

Screenshots


Device Info:

  • Device: Google Pixel 4, iPhone 6s, iPhone 14 Pro
  • OS: Android 13 (Build TP1A.221005.002.B2), iOS 15.7.5, iOS 16.2

Dependencies Info (please reference your package-lock.json or yarn.lock file):

  • @livekit/react-native: 1.0.0
  • livekit-client: 1.8.0
  • react-native-webrtc: 111.0.0

The issue also occurs in older versions such as:

  • @livekit/react-native: github:livekit/client-sdk-react-native#5ccd65b01e4ea591e10dac6cd2334d2ff6b60e34
  • livekit-client: 1.6.0
  • react-native-webrtc: 106.0.0

Additional context

Manually subscribing to the ParticipantEvent.ConnectionQualityChanged event has the same issue.

Sound issues

Sound is not played from the main speaker; it is played from the earpiece speaker above the screen. How do I change the audio output?

yarn bootstrap fails with an error

error /Users/only/Downloads/client-sdk-react-native-main/example/node_modules/react-native-webrtc: Command failed.
Exit code: 1
Command: node tools/downloadWebRTC.js
Arguments:
Directory: /Users/only/Downloads/client-sdk-react-native-main/example/node_modules/react-native-webrtc
Output:
Downloading https://github.com/jitsi/webrtc/releases/download/v94.0.0/WebRTC.xcframework.tgz...
node:internal/errors:692
const ex = new Error(msg);
^

Error: socket hang up
at connResetException (node:internal/errors:692:14)
at TLSSocket.socketOnEnd (node:_http_client:478:23)
at TLSSocket.emit (node:events:539:35)
at endReadableNT (node:internal/streams/readable:1345:12)
at processTicksAndRejections (node:internal/process/task_queues:83:21) {
code: 'ECONNRESET'
}
info Visit https://yarnpkg.com/en/docs/cli/install for documentation about this command.

Ping timeout in background

Describe the bug
The WebSocket client signal disconnects when a React Native app built on the RN SDK goes from Android PIP (picture-in-picture) mode back to normal mode after 20 seconds.
From our debugging, it is caused by the ping timing out. Our theory: when the app enters Android PIP mode, React Native goes into background mode and the ping stops working; when it returns from PIP mode to normal, the ping times out and triggers a disconnect.

Additional context
setTimeout and setInterval don't work in the background. We need to figure out a workaround that avoids the need for these, or makes them work in the background.

Participant does not have video track listed for other users even though it is published

Describe the bug

For some reason, some users (on mobile devices) do not have their video tracks displayed to other users. They can see their own videos and other users' videos, but their video tracks are not displayed to others. Their audio tracks are listed correctly. What could be the cause of this? The publication of the participant's video track is not recognized in the room's events; only the audio track publication is.


Release build both Android and iOS crash the app

Describe the bug

Locally, both the Android and iOS apps work perfectly, but in a release build, both apps (Android and iOS) crash when joining the room.
The error is the same on both platforms:

2023-06-18 18:42:46.665396+0300 uHubs Dev[8610:509463] [javascript] TypeError: undefined is not an object (evaluating 'r(d[8]).Room')

and

CTFatalException: Unhandled JS Exception: TypeError: this.methodFactory is not a function. (In 'this.methodFactory(a,t,n)', 'this.methodFactory' is undefined)', reason: 'Unhandled JS Exception: TypeError: this.methodFactory is not a function. (In 'this.methodFactory(a,t,n)', 'this.methodFactory' is undefined),

Logcat error:

2023-06-18 18:24:42.067 27745-27813/? E/ReactNativeJS: TypeError: undefined is not an object (evaluating 'r(d[8]).Room')
    
    This error is located at:
        in Unknown
        in Unknown
        in Unknown
        in Unknown
        in RCTView
        in Unknown
        in RCTView
        in Unknown
        in RCTView
        in Unknown
        in Unknown
        in RCTView
        in Unknown
        in f
        in Unknown
        in PanGestureHandler
        in Unknown
        in RCTView
        in Unknown
        in f
        in Unknown
        in RCTView
        in Unknown
        in G
        in Unknown
        in RNSScreen
        in f
        in Unknown
        in M
        in P
        in Unknown
        in RNSScreenContainer
        in ScreenContainer
        in Unknown
        in D
        in T
        in Unknown
        in RCTView
        in Unknown
        in Unknown
        in H
2023-06-18 18:24:42.070 27745-27813/? I/ReactNativeJS: { error: 
       { [TypeError: undefined is not an object (evaluating 'r(d[8]).Room')]
         componentStack: '\n    in Unknown\n    in Unknown\n    in Unknown\n    in Unknown\n    in RCTView\n    in Unknown\n    in RCTView\n    in Unknown\n    in RCTView\n    in Unknown\n    in Unknown\n    in RCTView\n    in Unknown\n    in f\n    in Unknown\n    in PanGestureHandler\n    in Unknown\n    in RCTView\n    in Unknown\n    in f\n    in Unknown\n    in RCTView\n    in Unknown\n    in G\n    in Unknown\n    in RNSScreen\n    in f\n    in Unknown\n    in M\n    in P\n    in Unknown\n    in RNSScreenContainer\n    in ScreenContainer\n    in Unknown\n    in D\n    in T\n    in Unknown\n    in RCTView\n    in Unknown\n    in Unknown\n    in H\n    in Unknown\n    in Unknown\n    in Unknown\n    in Unknown\n    in Unknown\n    in RCTView\n    in Unknown\n    in RCTView\n    in Unknown\n    in RCTView\n    in Unknown\n    in Unknown\n    in RCTView\n    in Unknown\n    in f\n    in Unknown\n    in PanGestureHandler\n    in Unknown\n    in RCTView\n    in Unknown\n    in f\n    in Unknown\n    in RCTView\n    in Unknown\n    in G\n    in Unknown\n    in RNSScreen\n    in f\n    in Unknown\n    in M\n    in P\n    in Unknown\n    in RNSScreenContainer\n    in ScreenContainer\n    in Unknown\n    in D\n    in T\n    in Unknown\n    in RCTView\n    in Unknown\n    in Unknown\n    in H\n    in Unknown\n    in Unknown\n    in Unknown\n    in Unknown\n    in Unknown\n    in Unknown\n    in RCTView\n    in Unknown\n    in k\n    in RNSScreen\n    in f\n    in Unknown\n    in M\n    in P\n    in Unknown\n    in RNSScreenContainer\n    in ScreenContainer\n    in Unknown\n    in H\n    in Unknown\n    in Unknown\n    in Unknown\n    in Unknown\n    in Unknown\n    in RCTView\n    in Unknown\n    in RCTView\n    in Unknown\n    in RCTView\n    in Unknown\n    in Unknown\n    in RCTView\n    in Unknown\n    in f\n    in Unknown\n    in PanGestureHandler\n    in Unknown\n    in RCTView\n    in Unknown\n    in f\n    in Unknown\n    in RCTView\n    in Unknown\n    in G\n   
 in Unknown\n    in RNSScreen\n    in f\n    in Unknown\n    in M\n    in P\n    in Unknown\n    in RNSScreenContainer\n    in ScreenContainer\n    in Unknown\n    in D\n    in T\n    in Unknown\n    in GestureHandlerRootView\n    in Unknown\n    in H\n    in Unknown\n    in Unknown\n    in Unknown\n    in Unknown\n    in Unknown\n    in Unknown\n    in c\n    in RNCSafeAreaProvider\n    in Unknown\n    in s\n    in C\n    in n\n    in Unknown\n    in RCTView\n    in Unknown\n    in RCTView\n    in Unknown\n    in C',
         isComponentError: true,
         line: 2257,
         column: 1065,
         sourceURL: 'index.android.bundle' } }
2023-06-18 18:24:42.840 27745-27813/? E/ReactNativeJS: TypeError: this.methodFactory is not a function. (In 'this.methodFactory(a,t,n)', 'this.methodFactory' is undefined)
    
    --------- beginning of crash
2023-06-18 18:24:42.850 27745-27814/? E/AndroidRuntime: FATAL EXCEPTION: mqt_native_modules
    Process: io.uhubs.development, PID: 27745
    com.facebook.react.common.JavascriptException: TypeError: this.methodFactory is not a function. (In 'this.methodFactory(a,t,n)', 'this.methodFactory' is undefined), stack:
    n@2258:1061
    setLevel@2258:2040
    r@2258:2577
    i@2258:2595
    <unknown>@2258:3019
    d@2258:3033
    <unknown>@2258:395714
    h@2:1585
    d@2:958
    <unknown>@2257:1058
    Nr@81:45962
    RoomPage@2257:1034
    wr@81:42144
    Hl@81:89901
    Ba@81:79143
    Oa@81:79045
    ja@81:78810
    Ia@81:75972
    Ia@-1
    <unknown>@81:26350
    unstable_runWithPriority@144:3806
    ht@81:26297
    pt@81:26232
    Pa@81:73345
    Vr@81:47944
    Vr@-1
    <unknown>@903:465
    <unknown>@914:548
    <unknown>@914:548
    <unknown>@914:548
    <unknown>@920:747
    t@918:289
    k@916:477
    <unknown>@916:611
    <unknown>@2113:2520
    generatorResume@-1
    n@281:68
    v@281:279
    f@92:154
    <unknown>@92:863
    p@98:497
    b@98:895
    callImmediates@98:2990
    value@48:2779
    <unknown>@48:937
    value@48:2459
    value@48:907
    value@-1
    value@-1
    
        at com.facebook.react.modules.core.ExceptionsManagerModule.reportException(ExceptionsManagerModule.java:83)
        at java.lang.reflect.Method.invoke(Native Method)
        at com.facebook.react.bridge.JavaMethodWrapper.invoke(JavaMethodWrapper.java:372)
        at com.facebook.react.bridge.JavaModuleWrapper.invoke(JavaModuleWrapper.java:151)
        at com.facebook.react.bridge.queue.NativeRunnable.run(Native Method)
        at android.os.Handler.handleCallback(Handler.java:942)
        at android.os.Handler.dispatchMessage(Handler.java:99)
        at com.facebook.react.bridge.queue.MessageQueueThreadHandler.dispatchMessage(MessageQueueThreadHandler.java:27)
        at android.os.Looper.loopOnce(Looper.java:226)
        at android.os.Looper.loop(Looper.java:313)
        at com.facebook.react.bridge.queue.MessageQueueThreadImpl$4.run(MessageQueueThreadImpl.java:226)
        at java.lang.Thread.run(Thread.java:1012)

Expected behavior

The room should start without any issues.
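The TypeError above (`undefined is not an object (evaluating 'r(d[8]).Room')`) typically means the `@livekit/react-native` module failed to load at all, often from a mismatched SDK/webrtc version pair or a stale Metro cache. A minimal defensive sketch (the module object here is a stand-in for the real import, and the error message is illustrative):

```javascript
// Hypothetical guard: fail fast with a clear message instead of the minified
// "undefined is not an object" error when the module did not load.
function assertRoomAvailable(livekit) {
  if (!livekit || typeof livekit.Room !== 'function') {
    throw new Error(
      '@livekit/react-native failed to load; check that ' +
      '@livekit/react-native-webrtc is installed and reset the Metro cache'
    );
  }
  return livekit.Room;
}

// Stand-in for `require('@livekit/react-native')` when loading succeeded:
class Room {}
console.log(assertRoomAvailable({ Room }) === Room); // true
```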

Screenshots

If applicable, add screenshots to help explain your problem.

Device Info:

  • Product name: Samsung Galaxy S20 5G
  • Model name: SM-G981U1

Dependencies Info (please reference your package-lock.json or yarn.lock file):

"dependencies": {
        "@codler/react-native-keyboard-aware-scroll-view": "^1.0.1",
        "@fingerprintjs/fingerprintjs": "^3.0.6",
        "@flyerhq/react-native-link-preview": "^1.6.0",
        "@fortawesome/fontawesome-svg-core": "^1.2.32",
        "@fortawesome/free-brands-svg-icons": "^5.15.1",
        "@fortawesome/free-regular-svg-icons": "^5.15.1",
        "@fortawesome/free-solid-svg-icons": "^5.15.1",
        "@fortawesome/react-native-fontawesome": "^0.2.6",
        "@gorhom/bottom-sheet": "^2",
        "@livekit/react-native": "^1.1.2",
        "@livekit/react-native-webrtc": "^104.0.1",
        "@react-native-async-storage/async-storage": "^1.14.1",
        "@react-native-community/art": "^1.2.0",
        "@react-native-community/cli": "^5.0.1",
        "@react-native-community/cli-debugger-ui": "5.0.1",
        "@react-native-community/clipboard": "^1.5.1",
        "@react-native-community/datetimepicker": "^3.4.6",
        "@react-native-community/masked-view": "^0.1.10",
        "@react-native-community/netinfo": "^6.0.0",
        "@react-native-community/picker": "^1.8.1",
        "@react-native-community/push-notification-ios": "^1.10.1",
        "@react-navigation/bottom-tabs": "^5.11.11",
        "@react-navigation/compat": "^5.3.15",
        "@react-navigation/material-bottom-tabs": "^5.3.15",
        "@react-navigation/native": "^5.9.4",
        "@react-navigation/stack": "^5.14.5",
        "@stripe/stripe-react-native": "^0.22.0",
        "@voximplant/react-native-foreground-service": "^3.0.1",
        "add": "^2.0.6",
        "axios": "^0.21.0",
        "babel-plugin-module-resolver": "^4.0.0",
        "buffer": "^6.0.3",
        "date-fns": "^2.28.0",
        "device-uuid": "^1.0.4",
        "final-form": "^4.20.2",
        "final-form-arrays": "^3.0.2",
        "jetifier": "^2.0.0",
        "js-base64": "^3.6.0",
        "json5": "^2.2.0",
        "lodash": "^4.17.20",
        "metro-config": "^0.66.1",
        "moment": "^2.29.1",
        "public-ip": "^4.0.3",
        "qs": "^6.9.3",
        "query-string": "^7.0.1",
        "react": "17.0.2",
        "react-devtools": "^4.14.0",
        "react-devtools-core": "^4.14.0",
        "react-final-form": "^6.5.3",
        "react-native": "^0.64.1",
        "react-native-add-calendar-event": "^4.0.0",
        "react-native-autocomplete-input": "^5.0.2",
        "react-native-autolink": "^4.0.0",
        "react-native-calendars": "^1.1249.0",
        "react-native-chip-view": "^0.0.12",
        "react-native-complete-mentions": "^1.0.9",
        "react-native-config": "^1.4.1",
        "react-native-controlled-mentions": "^2.2.5",
        "react-native-datepicker": "^1.7.2",
        "react-native-device-info": "^8.0.1",
        "react-native-dialog": "^9.3.0",
        "react-native-document-picker": "^5.0.3",
        "react-native-drawer": "^2.5.1",
        "react-native-elements": "3.4.2",
        "react-native-email-link": "^1.10.0",
        "react-native-fast-image": "^8.3.6",
        "react-native-file-viewer": "^2.1.4",
        "react-native-fs": "^2.20.0",
        "react-native-gesture-handler": "^1.10.0",
        "react-native-gifted-chat": "^0.16.3",
        "react-native-image-crop-picker": "^0.36.0",
        "react-native-image-picker": "^4.10.2",
        "react-native-image-progress": "^1.1.1",
        "react-native-image-slider": "^2.0.3",
        "react-native-image-slider-box": "^1.0.12",
        "react-native-image-slider-show": "^1.0.3",
        "react-native-keyboard-aware-scroll-view": "^0.9.3",
        "react-native-keychain": "^7.0.0",
        "react-native-linear-gradient": "^2.5.6",
        "react-native-maps": "^0.28.0",
        "react-native-onesignal": "^4.3.1",
        "react-native-paper": "^4.7.2",
        "react-native-permissions": "^3.1.0",
        "react-native-picker-select": "^8.0.2",
        "react-native-popup-dialog": "^0.18.3",
        "react-native-reanimated": "^1.13.2",
        "react-native-render-html": "^5.0.1",
        "react-native-safe-area-context": "^3.1.9",
        "react-native-screens": "3.10.2",
        "react-native-shared-element": "^0.8.2",
        "react-native-shortcut-badge": "^0.1.0-beta.2",
        "react-native-simple-toast": "^1.1.3",
        "react-native-skeleton-placeholder": "^4.0.0",
        "react-native-snap-carousel": "^4.0.0-beta.5",
        "react-native-splash-screen": "^3.2.0",
        "react-native-status-bar-height": "^2.6.0",
        "react-native-svg": "^12.1.0",
        "react-native-svg-transformer": "^0.14.3",
        "react-native-swipe-list-view": "^3.2.5",
        "react-native-toast-message": "2.1.5",
        "react-native-url-polyfill": "^1.3.0",
        "react-native-url-preview": "^1.1.9",
        "react-native-vector-icons": "^8.0.0",
        "react-native-video": "^5.1.0-alpha8",
        "react-native-webview": "^11.18.1",
        "react-navigation-slide-from-right-transition": "^1.0.4",
        "react-redux": "^7.2.2",
        "react-tiny-link": "^3.6.1",
        "redux": "^4.0.5",
        "redux-actions": "^2.6.5",
        "redux-enhancer-react-native-appstate": "^0.3.1",
        "redux-logger": "^3.0.6",
        "redux-persist": "^6.0.0",
        "redux-persist-transform-filter": "^0.0.20",
        "redux-saga": "^1.1.3",
        "redux-thunk": "^2.3.0",
        "reselect": "^4.0.0",
        "rn-fetch-blob": "^0.12.0",
        "rollbar-react-native": "^0.9.2",
        "sync-storage": "^0.4.2",
        "url-parse": "^1.5.3",
        "xregexp": "^5.0.1",
        "yarn": "^1.22.10"
    },
    "devDependencies": {
        "@babel/core": "^7.17.9",
        "@babel/runtime": "^7.17.9",
        "@react-native-community/eslint-config": "3.0.1",
        "@types/jest": "^27.4.1",
        "@types/lodash": "^4.14.181",
        "@types/qs": "^6.9.7",
        "@types/react": "^18.0.5",
        "@types/react-native": "0.67.4",
        "@types/react-native-autocomplete-input": "^5.0.0",
        "@types/react-native-datepicker": "^1.7.1",
        "@types/react-native-popup-dialog": "^0.16.3",
        "@types/react-native-snap-carousel": "^3.8.4",
        "@types/react-native-video": "^5.0.13",
        "@types/react-native-webrtc": "^1.75.5",
        "@types/react-redux": "^7.1.24",
        "@types/react-test-renderer": "^18.0.0",
        "@types/redux-actions": "^2.6.2",
        "@types/redux-immutable-state-invariant": "^2.1.2",
        "@types/redux-logger": "^3.0.9",
        "@types/url-parse": "^1.4.8",
        "babel-jest": "^27.5.1",
        "eslint": "8.13.0",
        "jest": "^27.5.1",
        "metro-react-native-babel-preset": "^0.70.1",
        "react-native-iap": "8.0.8",
        "react-test-renderer": "18.0.0",
        "redux-immutable-state-invariant": "^2.1.0",
        "redux-promise-middleware": "^6.1.2",
        "typescript": "^4.1.4"
    },

Additional context
Please feel free to add any other context about the problem here.

iOS: Dev build works, but errors occur when archived

Describe the bug
The development build works as expected, but when we try to archive the iOS version, the following error occurs.


To Reproduce
Build the iOS app and archive it

Expected behavior
The app should be able to archive successfully

There are no issues when running on mobile devices; however, the archive could not be built for publishing to TestFlight.

Additional context
We are using https://github.com/livekit/react-native-webrtc#dl/wip-transceiver as react-native-webrtc.

react-native-webrtc/react-native-webrtc#910: The issue we found in the react-native-webrtc GitHub page.
We tried removing the Podfile and reinstalling the pods.

The CocoaPods version is 1.11 (the latest stable version).

FATAL EXCEPTION: EglRenderer

Describe the bug
My application is one-to-one, but unlike a traditional call, the users' video alternates: the first user speaks for 30 seconds and only their video and audio are displayed, then the second user speaks for 30 seconds and only their audio and video are displayed.

The approach I took was to not display the VideoView. I also tried setting the width and height to 0; it works once, but then errors appear and the app crashes.

One message that appears is "visibility resize observer not triggered". Am I using the proper approach, or should I remove the track from the user and then subscribe to it again?
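Rather than sizing the VideoView to 0 (which forces the underlying Egl surface to be torn down and recreated), a common pattern is to unsubscribe from the hidden participant's video (livekit-client exposes `RemoteTrackPublication.setSubscribed(false)`) and mount only one VideoView at a time. The turn-taking itself can be plain logic; this sketch uses the 30-second turns described above:

```javascript
// Sketch: which of n participants should be visible after `elapsedMs`,
// given fixed turns of `turnMs` each. The result drives which single
// VideoView is mounted / which video track stays subscribed.
function activeSpeakerIndex(elapsedMs, turnMs = 30000, participants = 2) {
  return Math.floor(elapsedMs / turnMs) % participants;
}

console.log(activeSpeakerIndex(0));     // 0
console.log(activeSpeakerIndex(31000)); // 1
console.log(activeSpeakerIndex(61000)); // 0
```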

To Reproduce
Try switching between the videos: not playing them, then playing them normally again.
Note that the video is 100% width and height.

Expected behavior
The videos should alternate smoothly, given that the documentation says a video that is not displayed is "paused".

Screenshots
FATAL EXCEPTION: EglRenderer

android.opengl.GLException: Failed to create window surface: 0x3000

FATAL EXCEPTION: EglRenderer
Process: br.com.simplevideo, PID: 32500
android.opengl.GLException: Failed to create window surface: 0x3000
at org.webrtc.EglBase14Impl.createSurfaceInternal(EglBase14Impl.java:107)
at org.webrtc.EglBase14Impl.createSurface(EglBase14Impl.java:85)
at org.webrtc.EglRenderer$EglSurfaceCreation.run(EglRenderer.java:76)
at android.os.Handler.handleCallback(Handler.java:790)
at android.os.Handler.dispatchMessage(Handler.java:99)
at org.webrtc.EglRenderer$HandlerWithExceptionCallback.dispatchMessage(EglRenderer.java:103)
at android.os.Looper.loop(Looper.java:164)
at android.os.HandlerThread.run(HandlerThread.java:65)

Device Info (please complete the following information):

Expo is not working with "@livekit/react-native" ^1.1.2 but works with ^1.0.0

Describe the bug

This library does not work with Expo if the version is ^1.1.2.

To Reproduce

Steps to reproduce the behavior:

  1. Follow these instructions: https://github.com/livekit/client-sdk-react-native/wiki/Expo-Development-Build-Instructions
  2. Run eas build --profile development --local (iOS target)
  3. See the error

Expected behavior

@livekit/react-native should work with Expo on its latest version.

Screenshots

Simulator Screenshot - iPhone 14 - 2023-06-29 at 17 56 51

Screenshot 2023-06-29 at 17 57 01

Device Info:

  • Device: iPhone Simulator 14
  • OS: iOS17 (beta)

Dependencies Info (please reference your package-lock.json or yarn.lock file):

  • @livekit/react-native: ^1.1.2
  • react-native-webrtc: ^111.0.1

This library also does not currently appear to work with @livekit/react-native-webrtc in an Expo environment.

The following error occurs:

[INSTALL_PODS] [!] The 'Pods-test' target has frameworks with conflicting names: webrtc.xcframework.

Audio on Android forced to system speaker

Describe the bug

When connected to a LiveKit room, all audio is routed to the system speaker even though a bluetooth headset is connected.

To Reproduce

Steps to reproduce the behavior:

  1. Connect AirPods to Pixel 6a.
  2. Play a YouTube video and confirm that audio is coming through AirPods.
  3. Join a LiveKit room using sample app.
  4. Notice that LiveKit audio is coming through speaker instead of AirPods.
  5. Switch back to YouTube and now notice that audio from YouTube is also coming through speaker.
  6. Leave LiveKit room.
  7. Switch back to YouTube again and now notice that audio is coming through AirPods once again.

Expected behavior

In steps 4 and 5 above, audio should be routed to the AirPods and not the system speakers. This does not repro on iOS. On iOS, all audio is routed to the AirPods, as expected.
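On Android the SDK's audio layer decides the actual route; purely as a sketch of the expected behavior (an assumption for illustration, not the SDK's real selection code), a route picker would prefer a Bluetooth headset over a wired headset over the built-in speaker:

```javascript
// Sketch of route preference: pick the best available output by priority.
// The route names here are illustrative, not the SDK's identifiers.
const PRIORITY = ['bluetooth', 'wired_headset', 'earpiece', 'speaker'];

function preferredRoute(available) {
  for (const route of PRIORITY) {
    if (available.includes(route)) return route;
  }
  return 'speaker'; // fall back to the built-in speaker
}

console.log(preferredRoute(['speaker', 'bluetooth'])); // 'bluetooth'
console.log(preferredRoute(['speaker']));              // 'speaker'
```

The bug report above amounts to the routing behaving as if `available` never contained the Bluetooth device while in a room.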

Screenshots

If applicable, add screenshots to help explain your problem.

Device Info:

  • Device: Pixel 6a
  • OS: Android 13

Dependencies Info (please reference your package-lock.json or yarn.lock file):

  • livekit-react-native: 4f29ade
  • livekit-client: [e.g. 1.6.0]
  • react-native-webrtc: [e.g. 106.0.0]

Additional context
Add any other context about the problem here.

Issue when switching camera on iOS

Hi

I am trying to implement LiveKit on iOS and am having trouble getting the camera switch to work correctly. Below is my code.

To get the available cameras:

import { mediaDevices } from "react-native-webrtc";

mediaDevices.enumerateDevices().then((devices) => {
  console.log(devices);
});

To switch cameras:

room
  .switchActiveDevice("videoinput", deviceId)
  .then(() => {
    console.log("switched active device");
  })
  .catch((e) => {
    console.log(e);
  });

Now, when I try to switch, it only works sometimes, roughly 1 in 10 tries, and when it does switch it takes a long time.
There are no errors in the console.

Am I doing this right? I tried different methods with 'react-native-webrtc', but without result.

I am not sure if this is a bug or my logic. Thank you in advance for your help.
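`switchActiveDevice('videoinput', deviceId)` restarts the capturer, so firing it again while a previous switch is still in flight can race the camera into a bad state. A minimal sketch of serializing the calls (`room` is assumed to expose `switchActiveDevice(kind, deviceId)` returning a Promise, as in livekit-client):

```javascript
// Sketch: queue switchActiveDevice calls so each new switch waits for the
// previous one to finish instead of racing it.
function makeCameraSwitcher(room) {
  let inFlight = Promise.resolve();
  return (deviceId) => {
    inFlight = inFlight.then(() =>
      room.switchActiveDevice("videoinput", deviceId)
    );
    return inFlight;
  };
}

// Usage with a mock room that records the call order:
const calls = [];
const mockRoom = {
  switchActiveDevice: (kind, id) =>
    new Promise((res) => setTimeout(() => { calls.push(id); res(); }, 10)),
};
const switchCamera = makeCameraSwitcher(mockRoom);
Promise.all([switchCamera("front"), switchCamera("back")]).then(() =>
  console.log(calls) // ['front', 'back']
);
```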

Connectivity issues on dual-sim devices

Describe the bug

A user has reported being unable to connect to LiveKit using the RN SDK from dual-SIM devices when on cellular. The issue goes away when they switch to WiFi or remove one of the SIM cards.

Livekit Cloud
livekit-react-native: 5ccd65b
livekit-client: 1.6.0
react-native-webrtc: 106.0.0

Unable to connect to room on iOS

Describe the bug

Hi, when connecting to a room, the following warning is logged and the connection fails, but only on iOS.

websocket closed {"ev": {"code": 1006, "isTrusted": false, "reason": "Received bad response code from server: 401."}}

The issue appeared after I upgraded the library to 1.1.2 and React Native to 0.72.0.
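A 1006 close with "Received bad response code from server: 401" on the signalling websocket usually means the server rejected the access token rather than a transport problem. Before blaming the upgrade, it is worth logging the token's `exp` claim; this sketch decodes the JWT payload without verifying the signature (Node's `Buffer` is used for brevity; React Native would need a base64 helper such as js-base64):

```javascript
// Sketch: read the expiry out of a LiveKit access token (a JWT) so an
// expired token can be caught client-side before calling connect().
function tokenExpiresAt(token) {
  const payload = token.split(".")[1];
  const json = Buffer.from(payload, "base64url").toString("utf8");
  const { exp } = JSON.parse(json);
  return exp ? new Date(exp * 1000) : null;
}

// Fabricated, unsigned token for illustration only:
const body = Buffer.from(JSON.stringify({ exp: 2000000000 })).toString("base64url");
console.log(tokenExpiresAt(`h.${body}.sig`).toISOString()); // 2033-05-18T03:33:20.000Z
```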

To Reproduce

Connect to a room on iOS with @livekit/react-native 1.1.2 and react-native 0.72.0.

Expected behavior

The connection to the room should be established.

Screenshots

Device Info:

  • Device: iPhone 13 Pro
  • OS: iOS 16.5

Dependencies Info (please reference your package-lock.json or yarn.lock file):

  • @livekit/react-native: 1.1.2
  • livekit-client: 1.11.3
  • react-native-webrtc: 104.0.1 (Livekit Fork)

Additional context

Could not get audio sender stats

Unmuting either audio or video throws an error. The functionality still works, but the error is thrown:
{"error": [TypeError: this.sender.getStats is not a function. (In 'this.sender.getStats()', 'this.sender.getStats' is undefined)]}

To Reproduce
Steps to reproduce the behavior:

  1. Click on toggleVideo or toggleAudio
  2. See error

Expected behavior
Unmuting should not throw the {"error": [TypeError: this.sender.getStats is not a function. (In 'this.sender.getStats()', 'this.sender.getStats' is undefined)]} error.
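The error comes from livekit-client calling `RTCRtpSender.getStats()`, which some builds of the react-native webrtc layer did not implement; upgrading the webrtc package is the likely real fix. As a sketch of the defensive pattern (with stand-in sender objects, not the real RTCRtpSender):

```javascript
// Sketch: feature-detect getStats on the sender before calling it, so a
// missing implementation degrades to "no stats" instead of a TypeError.
async function senderStats(sender) {
  if (typeof sender.getStats !== "function") {
    return null; // stats unsupported on this webrtc build
  }
  return sender.getStats();
}

// Stand-ins for senders with and without getStats:
senderStats({ getStats: async () => ({ bitrate: 1 }) }).then((s) => console.log(s));
senderStats({}).then((s) => console.log(s)); // null
```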

Screenshots

Device Info

  • Development system: Mac mini M1
  • Device: Poco F1 (Android)
  • OS: Android 12
  • LiveKit version: 1.5.0

Audio output selection

By default, the audio output is the earpiece.
A selection option is needed, as on other platforms, to output the audio through the speaker as well.

Android Fails to build

Describe the bug

Android fails to build.

Error log

> Task :livekit_react-native:generateDebugRFile FAILED
Execution optimizations have been disabled for task ':livekit_react-native:generateDebugRFile' to ensure correctness due to the following reasons:
  - Gradle detected a problem with the following location: '/Users/pankajsoni/Desktop/mobile/mobile_v2/node_modules/@livekit/react-native/android/build/intermediates/local_only_symbol_list/debug/R-def.txt'. Reason: Task ':livekit_react-native:generateDebugRFile' uses this output of task ':livekitreactnative:parseDebugLocalResources' without declaring an explicit or implicit dependency. This can lead to incorrect results being produced, depending on what order the tasks are executed. Please refer to https://docs.gradle.org/7.5/userguide/validation_problems.html#implicit_dependency for more details about this problem.
  - Gradle detected a problem with the following location: '/Users/pankajsoni/Desktop/mobile/mobile_v2/node_modules/@livekit/react-native/android/build/intermediates/packaged_manifests/debug'. Reason: Task ':livekit_react-native:generateDebugRFile' uses this output of task ':livekitreactnative:processDebugManifest' without declaring an explicit or implicit dependency. This can lead to incorrect results being produced, depending on what order the tasks are executed. Please refer to https://docs.gradle.org/7.5/userguide/validation_problems.html#implicit_dependency for more details about this problem.

Execution failed for task ':livekit_react-native:generateDebugRFile'.
> A failure occurred while executing com.android.build.gradle.internal.res.GenerateLibraryRFileTask$GenerateLibRFileRunnable
   > /Users/pankajsoni/Desktop/mobile/mobile_v2/node_modules/@livekit/react-native/android/build/intermediates/local_only_symbol_list/debug/R-def.txt

* Try:
> Run with --info or --debug option to get more log output.
> Run with --scan to get full insights.

* Exception is:
org.gradle.api.tasks.TaskExecutionException: Execution failed for task ':livekit_react-native:generateDebugRFile'.
	at org.gradle.api.internal.tasks.execution.ExecuteActionsTaskExecuter.lambda$executeIfValid$1(ExecuteActionsTaskExecuter.java:142)
	at org.gradle.internal.Try$Failure.ifSuccessfulOrElse(Try.java:282)
	at org.gradle.api.internal.tasks.execution.ExecuteActionsTaskExecuter.executeIfValid(ExecuteActionsTaskExecuter.java:140)
	at org.gradle.api.internal.tasks.execution.ExecuteActionsTaskExecuter.execute(ExecuteActionsTaskExecuter.java:128)
	at org.gradle.api.internal.tasks.execution.CleanupStaleOutputsExecuter.execute(CleanupStaleOutputsExecuter.java:77)
	at org.gradle.api.internal.tasks.execution.FinalizePropertiesTaskExecuter.execute(FinalizePropertiesTaskExecuter.java:46)
	at org.gradle.api.internal.tasks.execution.ResolveTaskExecutionModeExecuter.execute(ResolveTaskExecutionModeExecuter.java:51)
	at org.gradle.api.internal.tasks.execution.SkipTaskWithNoActionsExecuter.execute(SkipTaskWithNoActionsExecuter.java:57)
	at org.gradle.api.internal.tasks.execution.SkipOnlyIfTaskExecuter.execute(SkipOnlyIfTaskExecuter.java:56)
	at org.gradle.api.internal.tasks.execution.CatchExceptionTaskExecuter.execute(CatchExceptionTaskExecuter.java:36)
	at org.gradle.api.internal.tasks.execution.EventFiringTaskExecuter$1.executeTask(EventFiringTaskExecuter.java:77)
	at org.gradle.api.internal.tasks.execution.EventFiringTaskExecuter$1.call(EventFiringTaskExecuter.java:55)
	at org.gradle.api.internal.tasks.execution.EventFiringTaskExecuter$1.call(EventFiringTaskExecuter.java:52)
	at org.gradle.internal.operations.DefaultBuildOperationRunner$CallableBuildOperationWorker.execute(DefaultBuildOperationRunner.java:204)
	at org.gradle.internal.operations.DefaultBuildOperationRunner$CallableBuildOperationWorker.execute(DefaultBuildOperationRunner.java:199)
	at org.gradle.internal.operations.DefaultBuildOperationRunner$2.execute(DefaultBuildOperationRunner.java:66)
	at org.gradle.internal.operations.DefaultBuildOperationRunner$2.execute(DefaultBuildOperationRunner.java:59)
	at org.gradle.internal.operations.DefaultBuildOperationRunner.execute(DefaultBuildOperationRunner.java:157)
	at org.gradle.internal.operations.DefaultBuildOperationRunner.execute(DefaultBuildOperationRunner.java:59)
	at org.gradle.internal.operations.DefaultBuildOperationRunner.call(DefaultBuildOperationRunner.java:53)
	at org.gradle.internal.operations.DefaultBuildOperationExecutor.call(DefaultBuildOperationExecutor.java:73)
	at org.gradle.api.internal.tasks.execution.EventFiringTaskExecuter.execute(EventFiringTaskExecuter.java:52)
	at org.gradle.execution.plan.LocalTaskNodeExecutor.execute(LocalTaskNodeExecutor.java:69)
	at org.gradle.execution.taskgraph.DefaultTaskExecutionGraph$InvokeNodeExecutorsAction.execute(DefaultTaskExecutionGraph.java:327)
	at org.gradle.execution.taskgraph.DefaultTaskExecutionGraph$InvokeNodeExecutorsAction.execute(DefaultTaskExecutionGraph.java:314)
	at org.gradle.execution.taskgraph.DefaultTaskExecutionGraph$BuildOperationAwareExecutionAction.execute(DefaultTaskExecutionGraph.java:307)
	at org.gradle.execution.taskgraph.DefaultTaskExecutionGraph$BuildOperationAwareExecutionAction.execute(DefaultTaskExecutionGraph.java:293)
	at org.gradle.execution.plan.DefaultPlanExecutor$ExecutorWorker.execute(DefaultPlanExecutor.java:417)
	at org.gradle.execution.plan.DefaultPlanExecutor$ExecutorWorker.run(DefaultPlanExecutor.java:339)
	at org.gradle.internal.concurrent.ExecutorPolicy$CatchAndRecordFailures.onExecute(ExecutorPolicy.java:64)
	at org.gradle.internal.concurrent.ManagedExecutorImpl$1.run(ManagedExecutorImpl.java:48)
Caused by: org.gradle.workers.internal.DefaultWorkerExecutor$WorkExecutionException: A failure occurred while executing com.android.build.gradle.internal.res.GenerateLibraryRFileTask$GenerateLibRFileRunnable
	at org.gradle.workers.internal.DefaultWorkerExecutor$WorkItemExecution.waitForCompletion(DefaultWorkerExecutor.java:339)
	at org.gradle.internal.work.DefaultAsyncWorkTracker.lambda$waitForItemsAndGatherFailures$2(DefaultAsyncWorkTracker.java:130)
	at org.gradle.internal.Factories$1.create(Factories.java:31)
	at org.gradle.internal.work.DefaultWorkerLeaseService.withoutLocks(DefaultWorkerLeaseService.java:321)
	at org.gradle.internal.work.DefaultWorkerLeaseService.withoutLocks(DefaultWorkerLeaseService.java:304)
	at org.gradle.internal.work.DefaultWorkerLeaseService.withoutLock(DefaultWorkerLeaseService.java:309)
	at org.gradle.internal.work.DefaultAsyncWorkTracker.waitForItemsAndGatherFailures(DefaultAsyncWorkTracker.java:126)
	at org.gradle.internal.work.DefaultAsyncWorkTracker.waitForItemsAndGatherFailures(DefaultAsyncWorkTracker.java:92)
	at org.gradle.internal.work.DefaultAsyncWorkTracker.waitForAll(DefaultAsyncWorkTracker.java:78)
	at org.gradle.internal.work.DefaultAsyncWorkTracker.waitForCompletion(DefaultAsyncWorkTracker.java:66)
	at org.gradle.api.internal.tasks.execution.TaskExecution$3.run(TaskExecution.java:244)
	at org.gradle.internal.operations.DefaultBuildOperationRunner$1.execute(DefaultBuildOperationRunner.java:29)
	at org.gradle.internal.operations.DefaultBuildOperationRunner$1.execute(DefaultBuildOperationRunner.java:26)
	at org.gradle.internal.operations.DefaultBuildOperationRunner$2.execute(DefaultBuildOperationRunner.java:66)
	at org.gradle.internal.operations.DefaultBuildOperationRunner$2.execute(DefaultBuildOperationRunner.java:59)
	at org.gradle.internal.operations.DefaultBuildOperationRunner.execute(DefaultBuildOperationRunner.java:157)
	at org.gradle.internal.operations.DefaultBuildOperationRunner.execute(DefaultBuildOperationRunner.java:59)
	at org.gradle.internal.operations.DefaultBuildOperationRunner.run(DefaultBuildOperationRunner.java:47)
	at org.gradle.internal.operations.DefaultBuildOperationExecutor.run(DefaultBuildOperationExecutor.java:68)
	at org.gradle.api.internal.tasks.execution.TaskExecution.executeAction(TaskExecution.java:221)
	at org.gradle.api.internal.tasks.execution.TaskExecution.executeActions(TaskExecution.java:204)
	at org.gradle.api.internal.tasks.execution.TaskExecution.executeWithPreviousOutputFiles(TaskExecution.java:187)
	at org.gradle.api.internal.tasks.execution.TaskExecution.execute(TaskExecution.java:165)
	at org.gradle.internal.execution.steps.ExecuteStep.executeInternal(ExecuteStep.java:89)
	at org.gradle.internal.execution.steps.ExecuteStep.access$000(ExecuteStep.java:40)
	at org.gradle.internal.execution.steps.ExecuteStep$1.call(ExecuteStep.java:53)
	at org.gradle.internal.execution.steps.ExecuteStep$1.call(ExecuteStep.java:50)
	at org.gradle.internal.operations.DefaultBuildOperationRunner$CallableBuildOperationWorker.execute(DefaultBuildOperationRunner.java:204)
	at org.gradle.internal.operations.DefaultBuildOperationRunner$CallableBuildOperationWorker.execute(DefaultBuildOperationRunner.java:199)
	at org.gradle.internal.operations.DefaultBuildOperationRunner$2.execute(DefaultBuildOperationRunner.java:66)
	at org.gradle.internal.operations.DefaultBuildOperationRunner$2.execute(DefaultBuildOperationRunner.java:59)
	at org.gradle.internal.operations.DefaultBuildOperationRunner.execute(DefaultBuildOperationRunner.java:157)
	at org.gradle.internal.operations.DefaultBuildOperationRunner.execute(DefaultBuildOperationRunner.java:59)
	at org.gradle.internal.operations.DefaultBuildOperationRunner.call(DefaultBuildOperationRunner.java:53)
	at org.gradle.internal.operations.DefaultBuildOperationExecutor.call(DefaultBuildOperationExecutor.java:73)
	at org.gradle.internal.execution.steps.ExecuteStep.execute(ExecuteStep.java:50)
	at org.gradle.internal.execution.steps.ExecuteStep.execute(ExecuteStep.java:40)
	at org.gradle.internal.execution.steps.RemovePreviousOutputsStep.execute(RemovePreviousOutputsStep.java:68)
	at org.gradle.internal.execution.steps.RemovePreviousOutputsStep.execute(RemovePreviousOutputsStep.java:38)
	at org.gradle.internal.execution.steps.CancelExecutionStep.execute(CancelExecutionStep.java:41)
	at org.gradle.internal.execution.steps.TimeoutStep.executeWithoutTimeout(TimeoutStep.java:74)
	at org.gradle.internal.execution.steps.TimeoutStep.execute(TimeoutStep.java:55)
	at org.gradle.internal.execution.steps.CreateOutputsStep.execute(CreateOutputsStep.java:51)
	at org.gradle.internal.execution.steps.CreateOutputsStep.execute(CreateOutputsStep.java:29)
	at org.gradle.internal.execution.steps.CaptureStateAfterExecutionStep.executeDelegateBroadcastingChanges(CaptureStateAfterExecutionStep.java:124)
	at org.gradle.internal.execution.steps.CaptureStateAfterExecutionStep.execute(CaptureStateAfterExecutionStep.java:80)
	at org.gradle.internal.execution.steps.CaptureStateAfterExecutionStep.execute(CaptureStateAfterExecutionStep.java:58)
	at org.gradle.internal.execution.steps.ResolveInputChangesStep.execute(ResolveInputChangesStep.java:48)
	at org.gradle.internal.execution.steps.ResolveInputChangesStep.execute(ResolveInputChangesStep.java:36)
	at org.gradle.internal.execution.steps.BuildCacheStep.executeWithoutCache(BuildCacheStep.java:181)
	at org.gradle.internal.execution.steps.BuildCacheStep.lambda$execute$1(BuildCacheStep.java:71)
	at org.gradle.internal.Either$Right.fold(Either.java:175)
	at org.gradle.internal.execution.caching.CachingState.fold(CachingState.java:59)
	at org.gradle.internal.execution.steps.BuildCacheStep.execute(BuildCacheStep.java:69)
	at org.gradle.internal.execution.steps.BuildCacheStep.execute(BuildCacheStep.java:47)
	at org.gradle.internal.execution.steps.StoreExecutionStateStep.execute(StoreExecutionStateStep.java:36)
	at org.gradle.internal.execution.steps.StoreExecutionStateStep.execute(StoreExecutionStateStep.java:25)
	at org.gradle.internal.execution.steps.RecordOutputsStep.execute(RecordOutputsStep.java:36)
	at org.gradle.internal.execution.steps.RecordOutputsStep.execute(RecordOutputsStep.java:22)
	at org.gradle.internal.execution.steps.SkipUpToDateStep.executeBecause(SkipUpToDateStep.java:110)
	at org.gradle.internal.execution.steps.SkipUpToDateStep.lambda$execute$2(SkipUpToDateStep.java:56)
	at org.gradle.internal.execution.steps.SkipUpToDateStep.execute(SkipUpToDateStep.java:56)
	at org.gradle.internal.execution.steps.SkipUpToDateStep.execute(SkipUpToDateStep.java:38)
	at org.gradle.internal.execution.steps.ResolveChangesStep.execute(ResolveChangesStep.java:73)
	at org.gradle.internal.execution.steps.ResolveChangesStep.execute(ResolveChangesStep.java:44)
	at org.gradle.internal.execution.steps.legacy.MarkSnapshottingInputsFinishedStep.execute(MarkSnapshottingInputsFinishedStep.java:37)
	at org.gradle.internal.execution.steps.legacy.MarkSnapshottingInputsFinishedStep.execute(MarkSnapshottingInputsFinishedStep.java:27)
	at org.gradle.internal.execution.steps.ResolveCachingStateStep.execute(ResolveCachingStateStep.java:89)
	at org.gradle.internal.execution.steps.ResolveCachingStateStep.execute(ResolveCachingStateStep.java:50)
	at org.gradle.internal.execution.steps.ValidateStep.execute(ValidateStep.java:114)
	at org.gradle.internal.execution.steps.ValidateStep.execute(ValidateStep.java:57)
	at org.gradle.internal.execution.steps.CaptureStateBeforeExecutionStep.execute(CaptureStateBeforeExecutionStep.java:76)
	at org.gradle.internal.execution.steps.CaptureStateBeforeExecutionStep.execute(CaptureStateBeforeExecutionStep.java:50)
	at org.gradle.internal.execution.steps.SkipEmptyWorkStep.executeWithNoEmptySources(SkipEmptyWorkStep.java:254)
	at org.gradle.internal.execution.steps.SkipEmptyWorkStep.execute(SkipEmptyWorkStep.java:91)
	at org.gradle.internal.execution.steps.SkipEmptyWorkStep.execute(SkipEmptyWorkStep.java:56)
	at org.gradle.internal.execution.steps.RemoveUntrackedExecutionStateStep.execute(RemoveUntrackedExecutionStateStep.java:32)
	at org.gradle.internal.execution.steps.RemoveUntrackedExecutionStateStep.execute(RemoveUntrackedExecutionStateStep.java:21)
	at org.gradle.internal.execution.steps.legacy.MarkSnapshottingInputsStartedStep.execute(MarkSnapshottingInputsStartedStep.java:38)
	at org.gradle.internal.execution.steps.LoadPreviousExecutionStateStep.execute(LoadPreviousExecutionStateStep.java:43)
	at org.gradle.internal.execution.steps.LoadPreviousExecutionStateStep.execute(LoadPreviousExecutionStateStep.java:31)
	at org.gradle.internal.execution.steps.AssignWorkspaceStep.lambda$execute$0(AssignWorkspaceStep.java:40)
	at org.gradle.api.internal.tasks.execution.TaskExecution$4.withWorkspace(TaskExecution.java:281)
	at org.gradle.internal.execution.steps.AssignWorkspaceStep.execute(AssignWorkspaceStep.java:40)
	at org.gradle.internal.execution.steps.AssignWorkspaceStep.execute(AssignWorkspaceStep.java:30)
	at org.gradle.internal.execution.steps.IdentityCacheStep.execute(IdentityCacheStep.java:37)
	at org.gradle.internal.execution.steps.IdentityCacheStep.execute(IdentityCacheStep.java:27)
	at org.gradle.internal.execution.steps.IdentifyStep.execute(IdentifyStep.java:44)
	at org.gradle.internal.execution.steps.IdentifyStep.execute(IdentifyStep.java:33)
	at org.gradle.internal.execution.impl.DefaultExecutionEngine$1.execute(DefaultExecutionEngine.java:76)
	at org.gradle.api.internal.tasks.execution.ExecuteActionsTaskExecuter.executeIfValid(ExecuteActionsTaskExecuter.java:139)
	at org.gradle.api.internal.tasks.execution.ExecuteActionsTaskExecuter.execute(ExecuteActionsTaskExecuter.java:128)
	at org.gradle.api.internal.tasks.execution.CleanupStaleOutputsExecuter.execute(CleanupStaleOutputsExecuter.java:77)
	at org.gradle.api.internal.tasks.execution.FinalizePropertiesTaskExecuter.execute(FinalizePropertiesTaskExecuter.java:46)
	at org.gradle.api.internal.tasks.execution.ResolveTaskExecutionModeExecuter.execute(ResolveTaskExecutionModeExecuter.java:51)
	at org.gradle.api.internal.tasks.execution.SkipTaskWithNoActionsExecuter.execute(SkipTaskWithNoActionsExecuter.java:57)
	at org.gradle.api.internal.tasks.execution.SkipOnlyIfTaskExecuter.execute(SkipOnlyIfTaskExecuter.java:56)
	at org.gradle.api.internal.tasks.execution.CatchExceptionTaskExecuter.execute(CatchExceptionTaskExecuter.java:36)
	at org.gradle.api.internal.tasks.execution.EventFiringTaskExecuter$1.executeTask(EventFiringTaskExecuter.java:77)
	at org.gradle.api.internal.tasks.execution.EventFiringTaskExecuter$1.call(EventFiringTaskExecuter.java:55)
	at org.gradle.api.internal.tasks.execution.EventFiringTaskExecuter$1.call(EventFiringTaskExecuter.java:52)
	at org.gradle.internal.operations.DefaultBuildOperationRunner$CallableBuildOperationWorker.execute(DefaultBuildOperationRunner.java:204)
	at org.gradle.internal.operations.DefaultBuildOperationRunner$CallableBuildOperationWorker.execute(DefaultBuildOperationRunner.java:199)
	at org.gradle.internal.operations.DefaultBuildOperationRunner$2.execute(DefaultBuildOperationRunner.java:66)
	at org.gradle.internal.operations.DefaultBuildOperationRunner$2.execute(DefaultBuildOperationRunner.java:59)
	at org.gradle.internal.operations.DefaultBuildOperationRunner.execute(DefaultBuildOperationRunner.java:157)
	at org.gradle.internal.operations.DefaultBuildOperationRunner.execute(DefaultBuildOperationRunner.java:59)
	at org.gradle.internal.operations.DefaultBuildOperationRunner.call(DefaultBuildOperationRunner.java:53)
	at org.gradle.internal.operations.DefaultBuildOperationExecutor.call(DefaultBuildOperationExecutor.java:73)
	at org.gradle.api.internal.tasks.execution.EventFiringTaskExecuter.execute(EventFiringTaskExecuter.java:52)
	at org.gradle.execution.plan.LocalTaskNodeExecutor.execute(LocalTaskNodeExecutor.java:69)
	at org.gradle.execution.taskgraph.DefaultTaskExecutionGraph$InvokeNodeExecutorsAction.execute(DefaultTaskExecutionGraph.java:327)
	at org.gradle.execution.taskgraph.DefaultTaskExecutionGraph$InvokeNodeExecutorsAction.execute(DefaultTaskExecutionGraph.java:314)
	at org.gradle.execution.taskgraph.DefaultTaskExecutionGraph$BuildOperationAwareExecutionAction.execute(DefaultTaskExecutionGraph.java:307)
	at org.gradle.execution.taskgraph.DefaultTaskExecutionGraph$BuildOperationAwareExecutionAction.execute(DefaultTaskExecutionGraph.java:293)
	at org.gradle.execution.plan.DefaultPlanExecutor$ExecutorWorker.execute(DefaultPlanExecutor.java:417)
	at org.gradle.execution.plan.DefaultPlanExecutor$ExecutorWorker.run(DefaultPlanExecutor.java:339)
	at org.gradle.internal.concurrent.ExecutorPolicy$CatchAndRecordFailures.onExecute(ExecutorPolicy.java:64)
	at org.gradle.internal.concurrent.ManagedExecutorImpl$1.run(ManagedExecutorImpl.java:48)
Caused by: java.nio.file.NoSuchFileException: /Users/pankajsoni/Desktop/mobile/mobile_v2/node_modules/@livekit/react-native/android/build/intermediates/local_only_symbol_list/debug/R-def.txt
	at com.android.ide.common.symbols.SymbolIo.readRDef(SymbolIo.java:261)
	at com.android.build.gradle.internal.res.GenerateLibraryRFileTask$GenerateLibRFileRunnable.run(GenerateLibraryRFileTask.kt:159)
	at com.android.build.gradle.internal.profile.ProfileAwareWorkAction.execute(ProfileAwareWorkAction.kt:74)
	at org.gradle.workers.internal.DefaultWorkerServer.execute(DefaultWorkerServer.java:63)
	at org.gradle.workers.internal.NoIsolationWorkerFactory$1$1.create(NoIsolationWorkerFactory.java:66)
	at org.gradle.workers.internal.NoIsolationWorkerFactory$1$1.create(NoIsolationWorkerFactory.java:62)
	at org.gradle.internal.classloader.ClassLoaderUtils.executeInClassloader(ClassLoaderUtils.java:100)
	at org.gradle.workers.internal.NoIsolationWorkerFactory$1.lambda$execute$0(NoIsolationWorkerFactory.java:62)
	at org.gradle.workers.internal.AbstractWorker$1.call(AbstractWorker.java:44)
	at org.gradle.workers.internal.AbstractWorker$1.call(AbstractWorker.java:41)
	at org.gradle.internal.operations.DefaultBuildOperationRunner$CallableBuildOperationWorker.execute(DefaultBuildOperationRunner.java:204)
	at org.gradle.internal.operations.DefaultBuildOperationRunner$CallableBuildOperationWorker.execute(DefaultBuildOperationRunner.java:199)
	at org.gradle.internal.operations.DefaultBuildOperationRunner$2.execute(DefaultBuildOperationRunner.java:66)
	at org.gradle.internal.operations.DefaultBuildOperationRunner$2.execute(DefaultBuildOperationRunner.java:59)
	at org.gradle.internal.operations.DefaultBuildOperationRunner.execute(DefaultBuildOperationRunner.java:157)
	at org.gradle.internal.operations.DefaultBuildOperationRunner.execute(DefaultBuildOperationRunner.java:59)
	at org.gradle.internal.operations.DefaultBuildOperationRunner.call(DefaultBuildOperationRunner.java:53)
	at org.gradle.internal.operations.DefaultBuildOperationExecutor.call(DefaultBuildOperationExecutor.java:73)
	at org.gradle.workers.internal.AbstractWorker.executeWrappedInBuildOperation(AbstractWorker.java:41)
	at org.gradle.workers.internal.NoIsolationWorkerFactory$1.execute(NoIsolationWorkerFactory.java:59)
	at org.gradle.workers.internal.DefaultWorkerExecutor.lambda$submitWork$2(DefaultWorkerExecutor.java:205)
	at org.gradle.internal.work.DefaultConditionalExecutionQueue$ExecutionRunner.runExecution(DefaultConditionalExecutionQueue.java:187)
	at org.gradle.internal.work.DefaultConditionalExecutionQueue$ExecutionRunner.access$700(DefaultConditionalExecutionQueue.java:120)
	at org.gradle.internal.work.DefaultConditionalExecutionQueue$ExecutionRunner$1.run(DefaultConditionalExecutionQueue.java:162)
	at org.gradle.internal.Factories$1.create(Factories.java:31)
	at org.gradle.internal.work.DefaultWorkerLeaseService.withLocks(DefaultWorkerLeaseService.java:249)
	at org.gradle.internal.work.DefaultWorkerLeaseService.runAsWorkerThread(DefaultWorkerLeaseService.java:109)
	at org.gradle.internal.work.DefaultWorkerLeaseService.runAsWorkerThread(DefaultWorkerLeaseService.java:114)
	at org.gradle.internal.work.DefaultConditionalExecutionQueue$ExecutionRunner.runBatch(DefaultConditionalExecutionQueue.java:157)
	at org.gradle.internal.work.DefaultConditionalExecutionQueue$ExecutionRunner.run(DefaultConditionalExecutionQueue.java:126)
	... 2 more

dependencies

android > build.gradle

buildscript {
    ext {
        buildToolsVersion = "33.0.0"
        supportLibVersion = "31.0.0"
        minSdkVersion = 23
        compileSdkVersion = 33
        targetSdkVersion = 33
        googlePlayServicesVersion = "15.0.1"
        kotlinVersion = "1.8.0"
        firebaseVersion = "21.1.0"

        ndkVersion = "21.4.7075529"
    }

    repositories {
        google()
        mavenCentral()
        jcenter()
    }

    dependencies {
        classpath("com.android.tools.build:gradle:7.4.2")
        classpath("com.facebook.react:react-native-gradle-plugin")
        classpath("com.google.gms:google-services:4.3.15")
        classpath("org.jetbrains.kotlin:kotlin-gradle-plugin:$kotlinVersion")
        classpath("io.realm:realm-gradle-plugin:10.11.1")
    }
}

allprojects {
    repositories {
        mavenLocal()
        maven {
            // All of React Native (JS, Obj-C sources, Android binaries) is installed from npm
            url("$rootDir/../node_modules/react-native/android")
        }
        maven {
            // Android JSC is installed from npm
            url("$rootDir/../node_modules/jsc-android/dist")
        }
        mavenCentral {
            content {
                excludeGroup "com.facebook.react"
            }
        }
        google()
        jcenter()
        maven { url 'https://www.jitpack.io' }
    }
}

Android subscriber renders a black screen after publisher has disabled and enabled their video

Describe the bug
When the subscriber is on an Android device, it'll render remote video as black after the publisher pauses and unpauses their video.

To Reproduce

Steps to reproduce the behavior:

  1. publish from web or iOS
  2. subscribe from Android
  3. pause video from publisher
  4. unpause video from publisher
  5. note VideoView of the Android subscriber will remain black

The following log gets printed on the Android subscriber:

visibility resize observer not triggered

Android app couldn't open in build version

As described in the README, I built the app and was able to test it in development mode; video and audio all worked fine. But after I built the release APK, installed it, and tried to open it, the app would not open. It closes immediately on launch.

To Reproduce

  1. Build the app with ./gradlew assembleRelease inside android directory in the app
  2. Open the app: the app doesn't open up

Expected behavior
The app should open up

Device Info (please complete the following information):

  • Device: Samsung Galaxy M13
  • OS: Android 12

Dependencies used in the package.json are

  • "livekit-react-native": "^0.8.0",

  • "livekit-react-native": "github:livekit/client-sdk-react-native"
    When I removed these dependencies and rebuilt, the app opened.

    "react": "17.0.1",
    "react-native": "0.64.2",
    These are the versions I used in package.json

I am using npm as my package manager; the documentation uses Yarn.

where is createDataChannel?

Good afternoon everyone, I hope you're all having a great day. I'm looking for help understanding some details of the inner workings of this library, specifically its use of @livekit/react-native-webrtc. I've been looking at that dependency, and my current understanding is that, although it differs somewhat from the react-native-webrtc library it was forked from, it should still be used by invoking the RTCPeerConnection.createDataChannel method, as suggested in the BasicUsage.md file that is still present in the fork:

let datachannel = peerConnection.createDataChannel( 'my_channel' );

datachannel.addEventListener( 'open', event => {} );
datachannel.addEventListener( 'close', event => {} );
datachannel.addEventListener( 'message', message => {} );

But contrary to what I expected, there isn't any reference to createDataChannel anywhere in this project. How so? Is this method being used indirectly in some way that I'm unaware of?

I'd greatly appreciate any info on this matter.

Thanks a lot for your time!

Cheers
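For what it's worth, the indirect usage the question guesses at is plausible: a client engine that owns the peer connection can open its data channels internally, so application code never calls createDataChannel itself. A sketch with stand-in types — the channel labels and structure here are assumptions for illustration, not a map of livekit-client's actual code:

```typescript
// Minimal stand-in for the part of RTCPeerConnection the sketch needs.
interface PeerConnectionLike {
  createDataChannel(
    label: string,
    init?: { ordered?: boolean; maxRetransmits?: number },
  ): { label: string };
}

// An engine that owns the peer connection can create its channels internally,
// which is why grepping the SDK's app-facing layer for createDataChannel
// finds nothing.
class EngineSketch {
  constructor(private pc: PeerConnectionLike) {}

  setupDataChannels() {
    // hypothetical reliable/lossy split for data messages
    return [
      this.pc.createDataChannel('_reliable', { ordered: true }),
      this.pc.createDataChannel('_lossy', { ordered: true, maxRetransmits: 0 }),
    ];
  }
}
```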

Cannot read property 'connectionQuality' of undefined

Describe the bug
A few times while running the project, I received the following error: Cannot read property 'connectionQuality' of undefined.

To Reproduce
I was unable to find steps that reliably reproduce the error.

Expected behavior
It is expected that this error will not occur, even if for some reason the participant is undefined. This occurred during a call.
I believe this error could be fixed by checking whether the participant inside useParticipant is valid, perhaps with the optional-chaining operator? PRs welcome?
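A minimal sketch of that guard, using a stand-in participant shape rather than the SDK's real Participant type:

```typescript
// Stand-in shape for illustration; the real Participant type comes from livekit-client.
type MaybeParticipant = { connectionQuality?: string } | undefined;

// Optional chaining plus a default avoids
// "Cannot read property 'connectionQuality' of undefined".
function getConnectionQuality(participant: MaybeParticipant): string {
  return participant?.connectionQuality ?? 'unknown';
}
```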

Screenshots
Screenshot_1665152741

Device Info (please complete the following information):

Support pauseUpstream

In the JS SDK, we have an API LocalTrack.pauseUpstream. This would allow the user to pause publishing of the track (by replacing it with a dummy track), while keeping the original track active for display to the local participant.

Because that code path relies upon DOM methods to create the dummy track, it does not work on React Native.
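A rough sketch of the constraint, assuming the dummy track is built from canvas capture as in the browser. The feature check below returns false wherever those DOM APIs are missing, which includes React Native:

```typescript
// Hedged sketch: the JS SDK's dummy-track path needs DOM canvas APIs.
// On React Native there is no `document`, so this check fails there.
function canCreateDummyTrack(): boolean {
  const doc = (globalThis as any).document;
  const canvasProto = (globalThis as any).HTMLCanvasElement?.prototype;
  return doc !== undefined && typeof canvasProto?.captureStream === 'function';
}
```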

Issue sending & receiving streams between two clients

Describe the bug
I'm trying to build on the example app provided. So far I've implemented everything like the example app, and I've successfully connected to a room on the example website. I receive audio from the website, but I don't receive the video stream, and I also can't send audio or video at all.

To Reproduce
Steps to reproduce the behavior:

  1. add the following to index.js
import { registerRootComponent } from "expo";
import { registerGlobals } from "livekit-react-native";

import App from "./App";
registerRootComponent(App);
registerGlobals();

  2. Rendering the following component in App.tsx
import { Participant, Room, Track } from "livekit-client";
import {
  useRoom,
  useParticipant,
  AudioSession,
  VideoView,
} from "livekit-react-native";
import { useEffect, useState } from "react";
import { Text, ListRenderItem, StyleSheet, FlatList, View } from "react-native";
import { ParticipantView } from "./ParticipantView";
import { RoomControls } from "./RoomControls";
import type { TrackPublication } from "livekit-client";

const App = () => {
  // Create a room state
  const [, setIsConnected] = useState(false);
  const [room] = useState(
    () =>
      new Room({
        publishDefaults: { simulcast: false },
        adaptiveStream: true,
      })
  );

  // Get the participants from the room
  const { participants } = useRoom(room);
  const url = "[hard-coded-url]";
  const token =
    "[hard-coded-token";
  useEffect(() => {
    let connect = async () => {
      // If you wish to configure audio, uncomment the following:
      await AudioSession.configureAudio({
        android: {
          preferredOutputList: ["speaker"],
        },
        ios: {
          defaultOutput: "speaker",
        },
      });
      await AudioSession.startAudioSession();
      await room.connect(url, token, {});
      await room.localParticipant.setCameraEnabled(true);
      await room.localParticipant.setMicrophoneEnabled(true);
      await room.localParticipant.enableCameraAndMicrophone();
      console.log("connected to ", url);
      setIsConnected(true);
    };

    connect();
    return () => {
      room.disconnect();
      AudioSession.stopAudioSession();
    };
  }, [url, token, room]);
  // Setup views.
  const stageView = participants.length > 0 && (
    <ParticipantView participant={participants[0]} style={styles.stage} />
  );

  const renderParticipant: ListRenderItem<Participant> = ({ item }) => {
    return (
      <ParticipantView participant={item} style={styles.otherParticipantView} />
    );
  };

  const otherParticipantsView = participants.length > 0 && (
    <FlatList
      data={participants}
      renderItem={renderParticipant}
      keyExtractor={(item) => item.sid}
      horizontal={true}
      style={styles.otherParticipantsList}
    />
  );

  const { cameraPublication, microphonePublication } = useParticipant(
    room.localParticipant
  );

  return (
    <View style={styles.container}>
      {stageView}
      {otherParticipantsView}
      <RoomControls
        micEnabled={isTrackEnabled(microphonePublication)}
        setMicEnabled={(enabled: boolean) => {
          room.localParticipant.setMicrophoneEnabled(enabled);
        }}
        cameraEnabled={isTrackEnabled(cameraPublication)}
        setCameraEnabled={(enabled: boolean) => {
          room.localParticipant.setCameraEnabled(enabled);
        }}
        onDisconnectClick={() => {
          //   navigation.pop();
          console.log("disconnected");
        }}
      />
    </View>
  );
};

function isTrackEnabled(pub?: TrackPublication): boolean {
  return !(pub?.isMuted ?? true);
}
const styles = StyleSheet.create({
  container: {
    flex: 1,
    alignItems: "center",
    justifyContent: "center",
  },
  stage: {
    flex: 1,
    width: "100%",
  },
  otherParticipantsList: {
    width: "100%",
    height: 150,
    flexGrow: 0,
  },
  otherParticipantView: {
    width: 150,
    height: 150,
  },
});

export default App;

the components used here are mostly the same as what's in the example; I've removed the screen-sharing logic and the messages
3. I run the app using an Expo development build
4. It will log that it's connected; you'll be able to hear sound from the remote participant, but not see any video or send any sound.
5. If I try to add
await room.localParticipant.enableCameraAndMicrophone();
in the useEffect, I get the following error:

Possible Unhandled Promise Rejection (id: 0):
Error: Not implemented.
getSettings@http://192.168.1.150:8081/index.bundle?platform=ios&dev=true&hot=false:103733:24
@http://192.168.1.150:8081/index.bundle?platform=ios&dev=true&hot=false:120307:109
generatorResume@[native code]
asyncGeneratorStep@http://192.168.1.150:8081/index.bundle?platform=ios&dev=true&hot=false:21908:26
_next@http://192.168.1.150:8081/index.bundle?platform=ios&dev=true&hot=false:21927:29
@http://192.168.1.150:8081/index.bundle?platform=ios&dev=true&hot=false:21932:14
tryCallTwo@http://192.168.1.150:8081/index.bundle?platform=ios&dev=true&hot=false:26656:9
doResolve@http://192.168.1.150:8081/index.bundle?platform=ios&dev=true&hot=false:26788:25
Promise@http://192.168.1.150:8081/index.bundle?platform=ios&dev=true&hot=false:26675:14
@http://192.168.1.150:8081/index.bundle?platform=ios&dev=true&hot=false:21924:25
@http://192.168.1.150:8081/index.bundle?platform=ios&dev=true&hot=false:120173:52
generatorResume@[native code]
asyncGeneratorStep@http://192.168.1.150:8081/index.bundle?platform=ios&dev=true&hot=false:21908:26
_next@http://192.168.1.150:8081/index.bundle?platform=ios&dev=true&hot=false:21927:29
tryCallOne@http://192.168.1.150:8081/index.bundle?platform=ios&dev=true&hot=false:26648:16
@http://192.168.1.150:8081/index.bundle?platform=ios&dev=true&hot=false:26729:27
@http://192.168.1.150:8081/index.bundle?platform=ios&dev=true&hot=false:27687:26
_callTimer@http://192.168.1.150:8081/index.bundle?platform=ios&dev=true&hot=false:27602:17
_callReactNativeMicrotasksPass@http://192.168.1.150:8081/index.bundle?platform=ios&dev=true&hot=false:27635:17
callReactNativeMicrotasks@http://192.168.1.150:8081/index.bundle?platform=ios&dev=true&hot=false:27799:44
__callReactNativeMicrotasks@http://192.168.1.150:8081/index.bundle?platform=ios&dev=true&hot=false:21006:46
@http://192.168.1.150:8081/index.bundle?platform=ios&dev=true&hot=false:20806:45
__guard@http://192.168.1.150:8081/index.bundle?platform=ios&dev=true&hot=false:20986:15
flushedQueue@http://192.168.1.150:8081/index.bundle?platform=ios&dev=true&hot=false:20805:21
flushedQueue@[native code]

Expected behavior
Both clients should be able to send and receive video and audio streams.

Screenshots
image

Device Info (please complete the following information):

  • Device: IphoneX
  • OS: iOS16.0
  • LiveKit Version: beta

Additional context
Normal peer-to-peer video calling using react-native-webrtc works fine, so the issue isn't in any native WebRTC setup.
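One thing that may be worth checking in the index.js from step 1 (an assumption, not a confirmed diagnosis): registerGlobals() installs the WebRTC globals the SDK relies on, so calling it before the root component is registered avoids any chance of app code touching those globals first. Stub functions stand in for the real expo and livekit-react-native imports:

```typescript
// Stubs for illustration only; the real functions come from 'expo' and
// 'livekit-react-native'.
const state = { globalsInstalled: false };

function registerGlobals(): void {
  // real impl: installs RTCPeerConnection, MediaStream, etc. on globalThis
  state.globalsInstalled = true;
}

function registerRootComponent(render: () => string): string {
  // the real function does not perform this check; the guard just makes the
  // ordering requirement visible in the sketch
  if (!state.globalsInstalled) {
    throw new Error('WebRTC globals missing when the app rendered');
  }
  return render();
}

registerGlobals(); // call this first
const mounted = registerRootComponent(() => 'App'); // then register the app
```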

error import type { AudioConfiguration } from './audio/AudioSession';

Describe the bug

Screen Shot 2023-03-06 at 1 24 18 PM


To Reproduce
yarn add @livekit/react-native
Launch the app; the error occurs.

Device Info:

  • Device: Oppo
  • Version 10

Dependencies Info (please reference your package-lock.json or yarn.lock file):

  • @livekit/react-native: ^3.0.0
  • livekit-client: 1.6.0
  • react-native-webrtc: ^106.0.6

devDependencies:

"devDependencies": {
        "@babel/core": "^7.12.9",
        "@babel/runtime": "^7.12.5",
        "@react-native-community/eslint-config": "^3.0.0",
        "@types/color": "^3.0.0",
        "@types/i18n-js": "^3.0.1",
        "@types/jest": "^24.0.18",
        "@types/lodash": "^4.14.172",
        "@types/mapbox__polyline": "^1.0.2",
        "@types/react": "^17.0.21",
        "@types/react-native": "^0.65.0",
        "@types/react-native-autocomplete-input": "^5.0.0",
        "@types/react-native-snap-carousel": "^3.8.4",
        "@types/react-native-status-bar-height": "^2.3.0",
        "@types/react-native-vector-icons": "^6.4.8",
        "@types/react-native-video": "^5.0.3",
        "@types/react-redux": "^7.1.4",
        "@types/react-test-renderer": "^16.9.0",
        "@types/redux-logger": "^3.0.7",
        "@types/websocket": "^1.0.0",
        "@types/yup": "^0.26.24",
        "@typescript-eslint/eslint-plugin": "^5.51.0",
        "@typescript-eslint/parser": "^5.51.0",
        "babel-jest": "^26.6.3",
        "babel-plugin-transform-remove-console": "^6.9.4",
        "metro-react-native-babel-preset": "^0.66.0",
        "ts-jest": "^24.0.0",
        "typescript": "^4.6.3"
    }

If I update the code to replace import type with a plain import, it works fine.

Verify and move back to official react-native-webrtc

We've been on an experimental fork of react-native-webrtc that has unified-plan and transceiver support, which has recently been added officially to the original repo. We need to verify that the original repo works properly.

The resolution is unable to recover when network quality is back to good or excellent

We are trying to use the SDK to build an Android app running over 4G.

While the resolution seems fine at the beginning of the experiment (network state excellent), when the network quality drops significantly to the poor state, the resolution also drops; but when the network quality returns to excellent, the resolution does not increase again.

Here is our setting:

const preset = VideoPresets.h720;
const [room] = useState(
  () =>
    new Room({
      dynacast: true,
      videoCaptureDefaults: {
        resolution: preset.resolution,
      },
      publishDefaults: {
        videoEncoding: preset.encoding,
        simulcast: false,
      },
    }),
);
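One avenue that may be worth testing (an assumption, not a confirmed fix): with simulcast: false the publisher sends a single encoding, so there are no extra spatial layers for the server to step the subscriber back up through as bandwidth recovers. A toy illustration of the difference:

```typescript
interface Encoding {
  width: number;
  height: number;
  maxBitrate: number;
}

// With simulcast there are several rungs to climb back up; without it, one.
// The half/quarter scale-down mirrors typical simulcast layering and is an
// assumption for illustration.
function availableLayers(simulcast: boolean, full: Encoding): Encoding[] {
  if (!simulcast) {
    return [full]; // a single encoding: nothing to switch back up to
  }
  return [
    full,
    { width: full.width / 2, height: full.height / 2, maxBitrate: full.maxBitrate / 3 },
    { width: full.width / 4, height: full.height / 4, maxBitrate: full.maxBitrate / 9 },
  ];
}
```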

unable to connect to room

here are the logs

LOG rn-webrtc:pc:DEBUG 7 setRemoteDescription OK +29ms
LOG rn-webrtc:pc:DEBUG 7 addIceCandidate +1ms
LOG rn-webrtc:pc:DEBUG 7 addIceCandidate +1ms
LOG rn-webrtc:pc:DEBUG 7 createAnswer +2ms
LOG rn-webrtc:pc:DEBUG 7 createAnswer OK +18ms
LOG rn-webrtc:pc:DEBUG 7 setLocalDescription +5ms
LOG rn-webrtc:pc:DEBUG 7 setLocalDescription OK +23ms
LOG rn-webrtc:pc:DEBUG 7 getStats +238ms
LOG Failed to connect

import {Room} from 'livekit-client';
import {useParticipant, useRoom} from '@livekit/react-native';
import * as React from 'react';
import {useEffect, useState} from 'react';
import {
  ActivityIndicator,
  FlatList,
  StyleSheet,
  Text,
  View,
} from 'react-native';

import {ParticipantView} from './ParticipantView';
import {RoomControls} from './RoomControls';

const LIVEKET_SERVER_URL = 'ws://192.168.1.18:7880';
const TOKEN_SERVER_URL = 'http://192.168.1.18:7880/create-room-access-token';

export const RoomPage = ({navigation, route}) => {
  const [, setIsConnected] = useState(false);
  const [room] = useState(
    () =>
      new Room({
        publishDefaults: {simulcast: false},
        adaptiveStream: true,
      }),
  );
  const {participants} = useRoom(room);
  const {userName, roomName} = route.params;
  const [token, setToken] = useState('eyJhbGciOiJIUzI1NiIsInR5cCI6IkpXVCJ9.eyJleHAiOjE3MTEwMTgyMzgsImlzcyI6IkFQSW5KeXZTTHU2eG9pVSIsImp0aSI6InRvbnlfc3RhcmsiLCJuYW1lIjoiVG9ueSBTdGFyayIsIm5iZiI6MTY3NTAxODIzOCwic3ViIjoidG9ueV9zdGFyayIsInZpZGVvIjp7InJvb20iOiJzdGFyay10b3dlciIsInJvb21Kb2luIjp0cnVlfX0.8fsusnNXIy1AJUOeZ8ffo5jSv5mh-HM81gPixRJU0CY');

  useEffect(() => {
    const fn = async () => {
      const body = JSON.stringify({
        identity: userName,
        name: userName,
        room: roomName,
        ttl: '1d',
      });

      console.log(body);

      const res = await fetch(TOKEN_SERVER_URL, {
        method: 'POST',
        headers: {
          'Content-Type': 'application/json',
        },
        body: body,
      });
      console.log(res);

      // const data = await res.json();
      // setToken(data.data.token);
    };
    fn();
  }, [userName, roomName]);

  useEffect(() => {
    if (token) {
      console.log('Connecting to ROOM....');
      room
        .connect(LIVEKET_SERVER_URL, token, {})
        .then(r => {
          if (!r) {
            console.log('Failed to connect');
            return;
          }
          console.log('Connected');
          setIsConnected(true);
        })
        .catch(err => {
          console.error('Error Connecting to Room', err);
        });

      return () => {
        room.disconnect();
      };
    } else {
      console.log('NO TOKEN WHILE ROOM CONNECTION');
    }
  }, [token, room]);

  const renderParticipant = ({item}) => {
    return (
      <ParticipantView participant={item} style={styles.otherParticipantView} />
    );
  };

  const {cameraPublication, microphonePublication} = useParticipant(
    room.localParticipant,
  );

  function isTrackEnabled(data) {
    return !(data?.isMuted ?? true);
  }

  if (participants.length && participants[0].identity) {
    return (
      <View style={styles.container}>
        <FlatList
          data={participants}
          renderItem={renderParticipant}
          keyExtractor={item => item.sid}
          horizontal={true}
          style={styles.otherParticipantsList}
        />

        <RoomControls
          micEnabled={isTrackEnabled(microphonePublication)}
          setMicEnabled={enabled => {
            room.localParticipant.setMicrophoneEnabled(enabled);
          }}
          cameraEnabled={isTrackEnabled(cameraPublication)}
          setCameraEnabled={enabled => {
            room.localParticipant.setCameraEnabled(enabled);
          }}
          onDisconnectClick={() => {
            navigation.pop();
          }}
        />
      </View>
    );
  } else {
    return (
      <View style={styles.container}>
        <ActivityIndicator />
        <Text style={styles.text}>Loading...</Text>
      </View>
    );
  }
};

const styles = StyleSheet.create({
  container: {
    flex: 1,
    alignItems: 'center',
    justifyContent: 'center',
  },
  stage: {
    flex: 1,
    width: '100%',
  },
  otherParticipantsList: {
    width: '100%',
    height: 150,
    flexGrow: 0,
  },
  otherParticipantView: {
    width: 150,
    height: 150,
  },
  text: {
    color: 'white',
  },
});

iOS doesn't respect videoCaptureDefaults?

Describe the bug

Setting the capture defaults to 720x540 works as expected on the Android client, but on the iOS client the resulting video is 640x480.

To Reproduce

Set videoCaptureDefaults to use 720x540 :

videoCaptureDefaults: {
  resolution: new VideoPreset(720, 540, 300000, 25).resolution,
},

Then display the camera publication information:
{cameraPublication?.dimensions.width}x{cameraPublication?.dimensions.height}

Tested with both simulcast: true and simulcast: false.

Expected behavior

iOS using same resolution as set in the config
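A plausible explanation (an assumption, not verified against the iOS capturer): camera capture on iOS snaps to a discrete list of device formats, and 720x540 is not one of them, so the capturer falls back to the nearest supported format. The format list below is made up for illustration:

```typescript
interface Format {
  width: number;
  height: number;
}

// Hypothetical device format list for illustration only.
const supportedFormats: Format[] = [
  { width: 640, height: 480 },
  { width: 1280, height: 720 },
  { width: 1920, height: 1080 },
];

// Pick the supported format with the closest pixel count to the request.
function nearestFormat(width: number, height: number): Format {
  let best = supportedFormats[0];
  let bestDiff = Number.POSITIVE_INFINITY;
  for (const f of supportedFormats) {
    const diff = Math.abs(f.width * f.height - width * height);
    if (diff < bestDiff) {
      bestDiff = diff;
      best = f;
    }
  }
  return best;
}
```

Under this sketch, nearestFormat(720, 540) lands on 640x480, which matches what the reporter observed.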

Device Info:

  • Device: iPhone 8
  • OS: iOS 16.5

Dependencies Info:

  • @livekit/react-native: 1.1.2
  • livekit-client: 1.12.1
  • @livekit/react-native-webrtc: 104.0.1

Android 5.1 can't connect to a room

Describe the bug

I have an app that runs LiveKit. When connecting via an emulator or a phone running a more recent Android version, the app runs normally. The problem is with Android 5.1 (which is needed to run with virtual-reality goggles). The error shown by the app is:
Error: Could not connect PeerConnection after timeout

EDIT: If I change something in the code and save while I'm on the call screen, the app crashes with the following message: Unfortunately, app has stopped working.

EDIT2: Running the react-native example on Android 5.1, I had the same problem described in the first edit, but instead of having to change the code to see the error, I got the error when trying to connect to a room.

To Reproduce

Steps to reproduce the behavior:

  1. Try to connect to a room
  2. See error

Expected behavior

Connect to a livekit room

Screenshots

error

Device Info:


Dependencies Info (please reference your package-lock.json or yarn.lock file):

"@livekit/react-native": "^0.3.0",
"@react-native-community/async-storage": "^1.12.1",
"@react-navigation/native": "^6.1.1",
"@react-navigation/native-stack": "^6.9.7",
"@tanstack/react-query": "^4.28.0",
"@unform/core": "^2.1.6",
"@unform/mobile": "^2.1.6",
"axios": "^1.2.1",
"date-fns": "^2.29.3",
"livekit-client": "^1.7.0",
"react": "18.1.0",
"react-native": "0.70.6",
"react-native-blob-util": "^0.17.3",
"react-native-canvas": "^0.1.38",
"react-native-device-info": "^10.4.0",
"react-native-document-picker": "^8.2.0",
"react-native-dotenv": "^3.4.8",
"react-native-gesture-handler": "^2.8.0",
"react-native-pdf": "^6.6.2",
"react-native-render-html": "^6.3.4",
"react-native-safe-area-context": "^4.4.1",
"react-native-screens": "^3.20.0",
"react-native-svg": "^13.8.0",
"react-native-video": "^5.2.1",
"react-native-video-controls": "^2.8.1",
"react-native-webrtc": "^106.0.7",
"react-native-webview": "^11.26.1",
"socket.io-client": "^4.6.1",
"styled-components": "^5.3.6"

