⚠️ We have deprecated this older version of our sample web app. Please refer to our new sample app
This is an example web app to demo 100ms' web SDK
You will need Node.js v12.13.0 or later installed on your system
Get the code by cloning this repo using git
git clone git@github.com:100mslive/sample-app-web.git
Once cloned, open the terminal in the project directory, and install dependencies with:
npm install
Create a new file .env and copy the values from example.env:
cp example.env .env
Host your token generation service following this guide
Update the TOKEN_ENDPOINT in the .env file with your token generation service endpoint (e.g. https://ms-services-29qzq06nzogv.runkit.sh/)
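For example, the corresponding entry in the .env file would look like:
# Token endpoint
TOKEN_ENDPOINT=https://ms-services-29qzq06nzogv.runkit.sh/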
To turn on the remote-mute feature, update the following values in the .env file from your Firebase project settings:
# Firebase config
FIREBASE_API_KEY=<firebaseConfig.apiKey>
FIREBASE_AUTH_DOMAIN=<firebaseConfig.authDomain>
FIREBASE_DATABASE_URL=<firebaseConfig.databaseURL>
FIREBASE_PROJECT_ID=<firebaseConfig.projectId>
FIREBASE_STORAGE_BUCKET=<firebaseConfig.storageBucket>
FIREBASE_MESSAGING_ID=<firebaseConfig.messagingSenderId>
FIREBASE_APP_ID=<firebaseConfig.appId>
Then start the app with:
npm start
The app should now be up and running at http://localhost:8080 🚀
This guide provides an overview of the key objects you'll use with 100ms' JavaScript SDK to build a live audio/video application.
| Platform | Chrome | Firefox | Opera | Safari |
|---|---|---|---|---|
| Android 4.4 or later | Version 66 or later | Version 66 or later | Version 45 or later | No |
| macOS 10 or later | Version 66 or later | Version 66 or later | Version 45 or later | Version 11 or later |
| Windows 7 or later | Version 66 or later | Version 66 or later | Version 45 or later | No |
| iOS | No | No | No | Version 12 or later |
Room
- A room represents a real-time audio/video session, the basic building block of the 100ms Video SDK
Stream
- A stream represents the real-time audio and video shared to a room. Usually, each stream contains a video track and an audio track (except screenshare streams, which contain only a video track)
Track
- A track represents either the audio or video that makes up a stream
Peer
- A peer represents a participant connected to a room. Peers can be "local" or "remote"
Publish
- A local peer can share its audio/video by "publishing" its tracks to the room
Subscribe
- A local peer can stream any peer's audio/video by "subscribing" to their streams
Broadcast
- A local peer can send any message/data to all remote peers in the room.
npm install --save @100mslive/hmsvideo-web@latest
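After installing, import the SDK classes used in the snippets below (a sketch; the export names are assumed to match the classes referenced in this guide):
// Assumed named exports from the package, matching the classes used below
import { HMSClient, HMSClientConfig, HMSPeer } from "@100mslive/hmsvideo-web";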
Sign up on https://dashboard.100ms.live/register and visit the Developer tab to get your access credentials
To generate a server-side token, follow the steps described here - https://docs.100ms.live/server-side/generate-server-side-token
To create a room, follow the steps described here - https://docs.100ms.live/server-side/create-room
To generate a client-side token, follow the steps described here - https://docs.100ms.live/server-side/authentication
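As a rough sketch, fetching a client-side token from your hosted token generation service could look like the following. The request path and body fields are assumptions for illustration; use whatever contract your token service actually exposes.
// Hypothetical token service contract: adjust the URL, method and body to match your service
async function fetchToken(roomId, userName, role) {
  const response = await fetch("<your TOKEN_ENDPOINT here>", {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify({ room_id: roomId, user_name: userName, role: role })
  });
  const { token } = await response.json(); // assumes the service responds with { token }
  return token;
}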
// Arguments: userName and the client-side authToken
const peer = new HMSPeer("<userName here>", "<authToken here>")
const config = new HMSClientConfig({
endpoint: "wss://prod-in.100ms.live"
})
const client = new HMSClient(peer, config)
authToken is the client-side token generated by your token generation service.
After instantiating HMSClient, connect to 100ms' server:
try {
await client.connect()
} catch(err) {
// Handle error
}
Add listener functions to handle events such as establishing a connection to the server, peers joining, peers publishing their streams, etc.
client.on('connect',() => {
// This is where we can call `join(room)`
});
client.on('disconnect', () => {});
client.on('peer-join', (room, peer) => {
// Show a notification or toast message in the UI
});
client.on('peer-leave', (room, peer) => {
// Show a notification or toast message in the UI
});
client.on('stream-add', (room, peer, streamInfo) => {
// subscribe to the stream if needed
});
client.on('stream-remove', (room, peer, streamInfo) => {
// Remove remote stream if needed
});
client.on('broadcast', (room, peer, message) => {
// Show a notification or update chat UI
});
client.on('disconnected', () => {
// If there is a temporary websocket disconnection, then execute code
// to re-publish and subscribe all streams. eg. location.reload();
});
Always wait for the connect message listener after creating the client before subscribing/publishing any streams.
If, say, 4 streams were already published when the client connects to the room, the client receives stream-add messages for all 4 of those streams as soon as it joins.
Remember to add the disconnected message handler. Temporary websocket disconnections are common, and reconnecting on disconnection ensures the user sees the conference continue instead of freezing up.
try {
await client.join(roomId);
} catch(err) {
// Handle error
}
This roomId should be generated using the createRoom API.
This method prompts the user for permission to use a media input which produces audio/video tracks, such as a camera, screen, or microphone.
const localStream = await client.getLocalStream({
resolution: "vga",
bitrate: 256,
codec: "VP8",
frameRate: 20,
shouldPublishAudio:true,
shouldPublishVideo:true
});
In order to connect to a specific camera/mic, you can use the advancedMediaConstraints key, which accepts the browser's native MediaStreamConstraints, as shown below. To get deviceIds, use enumerateDevices (see the example after the snippet).
const localStream = await client.getLocalStream({
resolution: "vga", //This defines the video height and width. Can be qqvga, qvga, shd, hd
bitrate: 256, //This is the maximum bitrate to cap the video at
codec: "VP8",
frameRate: 20,
shouldPublishAudio:true,
shouldPublishVideo:true,
advancedMediaConstraints: {
video: {
deviceId: "e82934fe80bdd62ed2aac541f5fd53e53d98abb0b738c6f52edea4f5014d32d8"
},
audio: {
deviceId: "756814e591e116616c740e39b307a8015cac4c2511950e0240cf7fbe62736dfd"
}
}
});
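To discover deviceIds, you can use the browser's native enumerateDevices API, for example:
// List available cameras and microphones along with their deviceIds
// Note: device labels are only populated after the user has granted camera/mic permission
const devices = await navigator.mediaDevices.enumerateDevices();
devices
  .filter(device => device.kind === "videoinput" || device.kind === "audioinput")
  .forEach(device => console.log(device.kind, device.label, device.deviceId));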
For advanced use cases: all stream objects returned by getLocalStream extend the browser's native MediaStream class and implement all its methods
The settings above are recommended for most use cases. You can increase resolution to hd and bitrate to 1024 to get higher quality video.
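For example, a higher quality capture changes just those two settings:
// Same call as above, with higher resolution and bitrate for better quality
const localStream = await client.getLocalStream({
  resolution: "hd",
  bitrate: 1024,
  codec: "VP8",
  frameRate: 20,
  shouldPublishAudio: true,
  shouldPublishVideo: true
});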
This method prompts the user for permission to share their screen and choose the screenshare source
const localScreen = await client.getLocalScreen({
bitrate: 0,
codec: "VP8",
frameRate: 10,
});
//If your primary use case is sharing text
localScreen.getVideoTracks().forEach(track => {
if ('contentHint' in track) {
track.contentHint = 'text';
}
});
All stream objects can be attached to HTML video elements, e.g. the local stream from the user's camera:
//This is React implementation.
//Replace ref, useRef with id, getElementById for native HTML implementation.
//Create a reference for the video element to which the local stream will be attached
const localVideo = useRef();
//Attach the local stream (obtained from getLocalStream above) to the video element
localVideo.current.srcObject = local;
//Add video element
<video autoPlay muted ref={localVideo}>
</video>
Remember to set muted to true and mirror the local webcam stream.
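For example, in the React snippet above the element can be muted and mirrored with a plain CSS transform (a common approach for local previews, not something specific to the SDK):
//Mute the local preview and mirror it horizontally
<video autoPlay muted ref={localVideo} style={{ transform: "scaleX(-1)" }}>
</video>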
try {
await client.publish(local, roomId);
} catch(err) {
// handle the error
}
A client can publish multiple streams, e.g. a screenshare, an in-built webcam, and an external webcam all together.
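For example, the webcam stream and the screenshare stream captured above can be published to the same room:
// Publish both the webcam stream and the screenshare stream
await client.publish(local, roomId);
await client.publish(localScreen, roomId);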
This method "subscribes" to a remote peer's stream. This should ideally be called in the stream-add
message listener.
try {
const remote = await client.subscribe(mid, roomId);
// Do something with remote stream
} catch(err) {
// Handle error
}
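Putting it together, a typical pattern is to subscribe from the stream-add listener and attach the resulting stream to a video element. Treat streamInfo.mid below as an assumption about where the stream id lives; use whatever identifier your stream-add payload actually carries.
client.on('stream-add', async (room, peer, streamInfo) => {
  try {
    // Assumption: streamInfo.mid identifies the newly added stream
    const remote = await client.subscribe(streamInfo.mid, roomId);
    // Attach the remote stream to a video element in the UI
    document.getElementById('remote-video').srcObject = remote;
  } catch (err) {
    // Handle error
  }
});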
try {
await client.unsubscribe(remoteStream, roomId);
} catch(err) {
// Handle error
}
try {
await client.broadcast(payload, roomId);
} catch(err) {
// Handle error
}
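The payload is application-defined. For example, a simple chat message could be broadcast like this:
// Example application-defined payload for a chat message
const payload = { type: "chat", senderName: "<userName here>", message: "Hello everyone!" };
await client.broadcast(payload, roomId);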
client.disconnect();
100ms SDK's Stream interface provides mute and unmute methods to mute and unmute the audio or video track.
// To mute local stream audio
local.mute('audio')
// To unmute local stream audio
local.unmute('audio')
// To mute local stream video
local.mute('video')
// To unmute local stream video
local.unmute('video')
You can use applyConstraints to change the quality/source of the video mid-stream.
client.applyConstraints({
  bitrate: 0,
  codec: "VP8",
  resolution: "hd",
  frameRate: 10
}, localStream);
Refer to the full SDK documentation here - https://docs.100ms.live/client-side/web