api.video is the video infrastructure for product builders. Lightning fast video APIs for integrating, scaling, and managing on-demand & low latency live streaming features in your app.
This module is made for broadcasting RTMP live streams from the smartphone camera.
npm install @api.video/react-native-livestream
or
yarn add @api.video/react-native-livestream
Note: if you are on iOS, you will need two extra steps:
- Don't forget to install the native dependencies with CocoaPods:
cd ios && pod install
- This project contains Swift code, and if it's your first dependency with Swift code, you need to create an empty Swift file in your project (with the bridging header) from Xcode; Xcode will offer to create the bridging header when you add the file.
To be able to broadcast:
- On Android, you must ask for the internet, camera, and microphone permissions:
<uses-permission android:name="android.permission.INTERNET" />
<uses-permission android:name="android.permission.RECORD_AUDIO" />
<uses-permission android:name="android.permission.CAMERA" />
- On iOS, you must update Info.plist with a usage description for the camera and microphone:
...
<key>NSCameraUsageDescription</key>
<string>Your own description of the purpose</string>
<key>NSMicrophoneUsageDescription</key>
<string>Your own description of the purpose</string>
...
- On React Native, you must handle the permission requests before starting your live stream (see the sketch after this list). If permissions are not accepted, you will not be able to broadcast.
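As an illustration only, here is a minimal sketch of requesting the Android runtime permissions with React Native's built-in PermissionsAndroid API before starting the stream (on iOS, the system shows its own prompts based on the Info.plist descriptions above). The helper name ensurePermissions is ours, not part of this module:

import { PermissionsAndroid, Platform } from 'react-native';

// Hypothetical helper: requests camera and microphone permissions on Android.
// Returns true when both are granted (always true here on iOS, where the
// system prompts automatically when the camera/microphone is first used).
const ensurePermissions = async (): Promise<boolean> => {
  if (Platform.OS !== 'android') return true;
  const result = await PermissionsAndroid.requestMultiple([
    PermissionsAndroid.PERMISSIONS.CAMERA,
    PermissionsAndroid.PERMISSIONS.RECORD_AUDIO,
  ]);
  return (
    result[PermissionsAndroid.PERMISSIONS.CAMERA] === PermissionsAndroid.RESULTS.GRANTED &&
    result[PermissionsAndroid.PERMISSIONS.RECORD_AUDIO] === PermissionsAndroid.RESULTS.GRANTED
  );
};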
import React, { useRef, useState } from 'react';
import { View, TouchableOpacity } from 'react-native';
import { LivestreamView } from '@api.video/react-native-livestream';

const App = () => {
  const ref = useRef(null);
  const [streaming, setStreaming] = useState(false);

  return (
    <View style={{ flex: 1, alignItems: 'center' }}>
      <LivestreamView
        style={{ flex: 1, backgroundColor: 'black', alignSelf: 'stretch' }}
        ref={ref}
        video={{
          fps: 30,
          resolution: '720p',
          camera: 'front',
          orientation: 'portrait',
        }}
        liveStreamKey="your-livestream-key"
        onConnectionSuccess={() => {
          // do what you want
        }}
        onConnectionFailed={(e) => {
          // do what you want
        }}
        onDisconnect={() => {
          // do what you want
        }}
      />
      <View style={{ position: 'absolute', bottom: 40 }}>
        <TouchableOpacity
          style={{
            borderRadius: 50,
            backgroundColor: streaming ? 'red' : 'white',
            width: 50,
            height: 50,
          }}
          onPress={() => {
            if (streaming) {
              ref.current?.stopStreaming();
              setStreaming(false);
            } else {
              ref.current?.startStreaming();
              setStreaming(true);
            }
          }}
        />
      </View>
    </View>
  );
};

export default App;
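The connection callbacks in the example above are left empty; one possible approach (our own wiring, not prescribed by the module) is to use them to keep the streaming state in sync with the actual connection, for example by replacing the empty handlers with:

onConnectionFailed={(e) => {
  // e.nativeEvent.code carries the error code documented below
  console.log('Connection failed:', e.nativeEvent.code);
  setStreaming(false);
}}
onDisconnect={() => {
  setStreaming(false);
}}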
type ReactNativeLivestreamProps = {
  // Styles for the view containing the preview
  style: ViewStyle;
  // Your streaming key; we will append it to the rtmpServerUrl
  liveStreamKey: string;
  // RTMP server url, default: rtmp://broadcast.api.video/s
  rtmpServerUrl?: string;
  video: {
    // default: 30
    fps: number;
    // default: '720p'
    resolution: '240p' | '360p' | '480p' | '720p' | '1080p' | '2160p';
    // If omitted, we will infer it from the resolution
    bitrate?: number;
    // default: 'back'
    camera?: 'front' | 'back';
    // default: 'landscape'
    orientation?: 'landscape' | 'portrait';
  };
  audio?: {
    // default: false
    muted?: boolean;
    // default: 128000
    bitrate?: number;
  };
  // will be called when the connection is successful
  onConnectionSuccess?: (event: NativeSyntheticEvent<{}>) => void;
  // will be called on connection error
  onConnectionFailed?: (event: NativeSyntheticEvent<{ code: string }>) => void;
  // will be called when the live stream is stopped
  onDisconnect?: (event: NativeSyntheticEvent<{}>) => void;
};

type ReactNativeLivestreamMethods = {
  // Starts the stream
  startStreaming: () => void;
  // Stops the stream
  stopStreaming: () => void;
};
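As a short sketch of how these props and methods fit together, the component below streams to a custom RTMP endpoint and drives the stream through a typed ref. The component name, the server URL, and the assumption that ReactNativeLivestreamMethods is exported by the package are ours; if the type is not exported, declare an equivalent local type matching the methods above.

import React, { useRef } from 'react';
import { Button, View } from 'react-native';
import {
  LivestreamView,
  // Assumption: the methods type documented above is exported under this name.
  ReactNativeLivestreamMethods,
} from '@api.video/react-native-livestream';

const CustomServerExample = () => {
  // Typing the ref gives autocompletion for startStreaming / stopStreaming.
  const ref = useRef<ReactNativeLivestreamMethods>(null);

  return (
    <View style={{ flex: 1 }}>
      <LivestreamView
        style={{ flex: 1 }}
        ref={ref}
        // Override the default rtmp://broadcast.api.video/s endpoint.
        rtmpServerUrl="rtmp://your-server.example.com/live"
        liveStreamKey="your-livestream-key"
        video={{ fps: 30, resolution: '720p' }}
        audio={{ muted: false, bitrate: 128000 }}
      />
      <Button title="Go live" onPress={() => ref.current?.startStreaming()} />
      <Button title="Stop" onPress={() => ref.current?.stopStreaming()} />
    </View>
  );
};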
The api.video LiveStream module uses external native libraries for broadcasting:
| Plugin | README |
| --- | --- |
| rtmp-rtsp-stream-client-java | rtmp-rtsp-stream-client-java |
| HaishinKit | HaishinKit |
If you have any questions, ask us here: https://community.api.video, or use Issues.
You can try our example app; feel free to test it.