
livekit-react-native

LiveKit Client SDK for React Native. (beta)

Installation

NPM

npm install https://github.com/livekit/client-sdk-react-native

Yarn

yarn add https://github.com/livekit/client-sdk-react-native

The react-native-webrtc library has additional installation instructions found here:

Example app

We've included an example app that you can try out.

Usage

In your index.js file, set up the LiveKit SDK by calling registerGlobals(). This registers the WebRTC globals required for use in JavaScript, and is needed for LiveKit to work.

import { registerGlobals } from "livekit-react-native";

// ...

registerGlobals();

A Room object can then be created and connected to.

import { useEffect, useState } from 'react';
import { Participant, Room, Track } from 'livekit-client';
import { useRoom, AudioSession, VideoView } from 'livekit-react-native';

/*...*/

// Create a room state
const [room] = useState(() => new Room());

// Get the participants from the room
const { participants } = useRoom(room);

useEffect(() => {
  AudioSession.startAudioSession();
  room.connect(url, token, {});
  return () => {
    room.disconnect()
    AudioSession.stopAudioSession();
  }
}, [url, token, room]);

const videoView = participants.length > 0 && (
  <VideoView style={{flex:1, width:"100%"}} videoTrack={participants[0].getTrack(Track.Source.Camera)?.videoTrack} />
);
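Beyond rendering remote tracks, the connected room's local participant can publish the camera and microphone. A minimal sketch, assuming the setCameraEnabled / setMicrophoneEnabled methods from the livekit-client LocalParticipant API and a room that has already connected:

```typescript
// Sketch: publish local camera and microphone once the room is connected.
// setCameraEnabled / setMicrophoneEnabled are livekit-client LocalParticipant APIs.
const enableLocalMedia = async () => {
  await room.localParticipant.setCameraEnabled(true);
  await room.localParticipant.setMicrophoneEnabled(true);
};
```

Both calls prompt for the corresponding OS permission the first time they are invoked, so they are best triggered from an explicit user action.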

API documentation is located here.

Additional documentation for the LiveKit SDK can be found at https://docs.livekit.io/references/client-sdks/

Screenshare

Enabling screenshare requires extra installation steps:

Android

Android screenshare requires a foreground service with type mediaProjection to be present.

The example app uses @voximplant/react-native-foreground-service for this. Ensure that the service is declared with the mediaProjection foreground service type, like so:

<service android:name="com.voximplant.foregroundservice.VIForegroundService" 
  android:foregroundServiceType="mediaProjection" />

Once set up, start the foreground service prior to using screenshare.
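As a sketch, starting such a service with @voximplant/react-native-foreground-service might look like the following. The channel and notification values are illustrative assumptions, not values from this repository; check the library's documentation for the exact API of the version you install.

```typescript
import VIForegroundService from '@voximplant/react-native-foreground-service';

// Illustrative channel/notification values; adjust for your app.
const startMediaProjectionService = async () => {
  await VIForegroundService.getInstance().createNotificationChannel({
    id: 'screenshare',
    name: 'Screen Share',
    description: 'Keeps screen sharing alive',
    enableVibration: false,
  });
  await VIForegroundService.getInstance().startService({
    channelId: 'screenshare',
    id: 1,
    title: 'Screen sharing',
    text: 'Your screen is being shared',
    icon: 'ic_launcher',
  });
};

// Start the service first, then enable screenshare:
// await startMediaProjectionService();
// await room.localParticipant.setScreenShareEnabled(true);
```

Remember to stop the service (e.g. VIForegroundService.getInstance().stopService()) when screenshare ends.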

iOS

iOS screenshare requires adding a Broadcast Extension to your iOS project. Follow the integration instructions here:

https://jitsi.github.io/handbook/docs/dev-guide/dev-guide-ios-sdk/#screen-sharing-integration

It involves copying the files found in this sample project to your iOS project, and registering a Broadcast Extension in Xcode.

It's also recommended to use CallKeep to register a call with CallKit (and to enable the voip background mode). Due to background app processing limitations, screen recording may be interrupted if the app is restricted in the background; registering with CallKit allows the app to continue processing for the duration of the call.

Once set up, iOS screenshare can be initiated like so:

import { findNodeHandle, NativeModules, Platform } from 'react-native';
import { ScreenCapturePickerView } from 'react-native-webrtc';

const screenCaptureRef = React.useRef(null);
const screenCapturePickerView = Platform.OS === "ios" && (
  <ScreenCapturePickerView ref={screenCaptureRef} />
);
const startBroadcast = async () => {
  if (Platform.OS === "ios") {
    // Show the iOS broadcast picker before enabling screenshare.
    const reactTag = findNodeHandle(screenCaptureRef.current);
    await NativeModules.ScreenCapturePickerViewManager.show(reactTag);
  }
  await room.localParticipant.setScreenShareEnabled(true);
};

return (
  <View style={styles.container}>
    {/* ... */}
    {/* Make sure the ScreenCapturePickerView exists in the view tree. */}
    {screenCapturePickerView}
  </View>
);

Note

Currently, the SDK does not run on the iOS Simulator on M1 Macs.

Contributing

See the contributing guide to learn how to contribute to the repository and the development workflow.

License

Apache License 2.0