Hello all!
I've been working with NiceGUI for quite some time now, and I've wanted to do a project like this for quite some time as well.
Basically, I'm looking for a way to do something similar to Microsoft Teams or Zoom. For now I just want to get webcams working, as I imagine something like screen sharing would follow naturally.
The part I've been struggling with is making it so that all users see each other's video feeds. I can open a route and view my own webcam feed, and if I access it from my phone I can see the phone's webcam feed, but I can't get the two devices to display each other's feeds.
I've also looked around FastAPI to see if there were any examples, but I was unable to get the ones I found running or working. I've also seen the NiceGUI discussions on this topic (wherein I've conversed as well). Streamlit seems to have something implemented for this already:
https://github.com/whitphx/streamlit-webrtc
However, I have not made much progress from any of these. I am looking to the community for help in facing this task.
I've set up this code which is barely an MRE:
```python
from fastapi import WebSocket, WebSocketDisconnect
from fastapi.responses import HTMLResponse
from nicegui import ui, app

# HTML page that uses WebRTC to capture and display video streams.
html = """
<!DOCTYPE html>
<html>
<head>
    <title>Simple Video Call</title>
    <style>
        video { width: 45%; margin: 2%; border: 1px solid black; }
        body { display: flex; flex-direction: column; align-items: center; }
        #videos { display: flex; justify-content: center; width: 100%; }
    </style>
</head>
<body>
    <h2>Simple Video Call</h2>
    <div id="videos">
        <video id="localVideo" autoplay playsinline muted></video>
        <video id="remoteVideo" autoplay playsinline></video>
    </div>
    <script>
        let localStream;
        let peerConnection;
        const configuration = { iceServers: [{ urls: 'stun:stun.l.google.com:19302' }] };
        const wsProtocol = location.protocol === 'https:' ? 'wss://' : 'ws://';
        const ws = new WebSocket(wsProtocol + location.host + "/signaling");

        ws.onmessage = async (event) => {
            const data = JSON.parse(event.data);
            console.log("Received:", data);
            if (data.type === "offer") {
                await setupPeerConnection();
                await peerConnection.setRemoteDescription(data.offer);
                const answer = await peerConnection.createAnswer();
                await peerConnection.setLocalDescription(answer);
                ws.send(JSON.stringify({ type: "answer", answer: answer }));
            } else if (data.type === "answer") {
                await peerConnection.setRemoteDescription(data.answer);
            } else if (data.type === "candidate") {
                try {
                    await peerConnection.addIceCandidate(data.candidate);
                } catch (e) {
                    console.error('Error adding received ICE candidate', e);
                }
            }
        };

        // Start by capturing the local media.
        async function start() {
            try {
                localStream = await navigator.mediaDevices.getUserMedia({ video: true, audio: true });
                document.getElementById('localVideo').srcObject = localStream;
                if (!peerConnection) await setupPeerConnection();
            } catch (err) {
                console.error('Error accessing media devices.', err);
            }
        }

        // Initialize the RTCPeerConnection and add the local stream tracks.
        async function setupPeerConnection() {
            peerConnection = new RTCPeerConnection(configuration);
            localStream.getTracks().forEach(track => {
                peerConnection.addTrack(track, localStream);
            });
            // When an ICE candidate is found, send it to the other peer.
            peerConnection.onicecandidate = event => {
                if (event.candidate) {
                    ws.send(JSON.stringify({ type: "candidate", candidate: event.candidate }));
                }
            };
            // When a remote track is received, display it.
            peerConnection.ontrack = event => {
                document.getElementById('remoteVideo').srcObject = event.streams[0];
            };
            return peerConnection;
        }

        // When the WebSocket is open, create an offer.
        ws.onopen = async () => {
            await start();
            const offer = await peerConnection.createOffer();
            await peerConnection.setLocalDescription(offer);
            ws.send(JSON.stringify({ type: "offer", offer: offer }));
        };
    </script>
</body>
</html>
"""


@ui.page("/")
async def get():
    return HTMLResponse(html)


# Simple connection manager to handle two WebSocket clients.
class ConnectionManager:
    def __init__(self):
        self.active_connections: list[WebSocket] = []

    async def connect(self, websocket: WebSocket):
        await websocket.accept()
        self.active_connections.append(websocket)
        print("Client connected. Total clients:", len(self.active_connections))

    def disconnect(self, websocket: WebSocket):
        self.active_connections.remove(websocket)
        print("Client disconnected. Total clients:", len(self.active_connections))

    async def broadcast(self, message: str, sender: WebSocket):
        for connection in self.active_connections:
            if connection != sender:
                await connection.send_text(message)


manager = ConnectionManager()


# Use a custom endpoint '/signaling' instead of '/ws'
@app.websocket("/signaling")
async def websocket_endpoint(websocket: WebSocket):
    await manager.connect(websocket)
    try:
        while True:
            data = await websocket.receive_text()
            await manager.broadcast(data, sender=websocket)
    except WebSocketDisconnect:
        manager.disconnect(websocket)


ui.run(on_air=True, host="0.0.0.0", port=8000)
```
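A side note on scaling this beyond two participants: because the manager above broadcasts every signaling message to everyone else, each peer would try to answer every offer, which breaks down as soon as a third client joins. A full-mesh call needs one `RTCPeerConnection` per remote peer and signaling messages addressed to a specific peer. Below is a minimal, transport-agnostic sketch of such a targeted router (pure Python; the `send` callables, the `peer-joined` message type, and the `to`/`from` fields are my own assumptions, not anything NiceGUI or FastAPI provides):

```python
import asyncio
import json
from typing import Awaitable, Callable

# A send function takes a JSON string and delivers it to one client
# (in a real app this would wrap a WebSocket's send_text).
SendFn = Callable[[str], Awaitable[None]]


class SignalingRouter:
    """Routes signaling messages between named peers.

    Each message carries a 'to' field; the router delivers it only to
    that peer, so every pair of participants can negotiate its own
    RTCPeerConnection without interfering with the others.
    """

    def __init__(self) -> None:
        self.peers: dict[str, SendFn] = {}

    async def join(self, peer_id: str, send: SendFn) -> None:
        # Tell existing peers about the newcomer so they can create offers.
        for other_send in self.peers.values():
            await other_send(json.dumps({'type': 'peer-joined', 'peer': peer_id}))
        self.peers[peer_id] = send

    def leave(self, peer_id: str) -> None:
        self.peers.pop(peer_id, None)

    async def route(self, sender_id: str, raw: str) -> None:
        message = json.loads(raw)
        message['from'] = sender_id  # let the receiver know who sent it
        target = self.peers.get(message.get('to'))
        if target is not None:
            await target(json.dumps(message))


async def demo() -> dict[str, list[dict]]:
    router = SignalingRouter()
    inboxes: dict[str, list[dict]] = {'alice': [], 'bob': []}

    async def deliver(name: str, raw: str) -> None:
        inboxes[name].append(json.loads(raw))

    await router.join('alice', lambda raw: deliver('alice', raw))
    await router.join('bob', lambda raw: deliver('bob', raw))  # alice is notified

    # Alice reacts to 'peer-joined' by sending bob a (placeholder) offer:
    await router.route('alice', json.dumps({'type': 'offer', 'to': 'bob', 'offer': '...sdp...'}))
    return inboxes


inboxes = asyncio.run(demo())
```

On the browser side this implies one `RTCPeerConnection` and one `<video>` element per entry in a `peers` map, keyed by the `from` field of incoming messages, instead of the single `remoteVideo` element above.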
This code asks the user for their webcam and shows the feed when running on localhost, but not when running On Air. When the app is On Air, the browser fails to connect to the `/signaling` WebSocket for some reason :(
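One possibility worth checking (an assumption on my part, since I don't know exactly what the On Air relay forwards): the relay may tunnel the page and NiceGUI's own socket traffic but not a custom raw WebSocket endpoint like `/signaling`. If that turns out to be the cause, a fallback is to tunnel signaling over plain HTTP instead, since ordinary GET/POST routes go through the same path as the page itself. Here is a minimal sketch of a per-client mailbox that such routes could share (the class name and method names are hypothetical, not from any library):

```python
from collections import defaultdict


class HttpSignalingMailbox:
    """Per-client message queues drained via plain HTTP polling.

    Instead of a persistent WebSocket, each client POSTs its signaling
    messages and periodically GETs its own queue; both operations are
    ordinary HTTP requests, which a relay or reverse proxy is more
    likely to forward untouched.
    """

    def __init__(self) -> None:
        self.queues: dict[str, list[str]] = defaultdict(list)

    def register(self, client: str) -> None:
        self.queues[client]  # create an empty queue for this client

    def post(self, sender: str, message: str) -> None:
        # Deliver to every registered client except the sender
        # (mirrors the broadcast in the two-party MRE above).
        for client, queue in self.queues.items():
            if client != sender:
                queue.append(message)

    def poll(self, client: str) -> list[str]:
        messages, self.queues[client] = self.queues[client], []
        return messages
```

This could be wired up with something like `@app.post('/signal/{client}')` and `@app.get('/signal/{client}')` routes, with the browser calling `fetch` in a short `setInterval` loop in place of `ws.send`/`ws.onmessage`. Polling adds latency to ICE negotiation, but only during call setup; the media itself still flows peer-to-peer.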
I would greatly appreciate any help with this endeavour.
Thanks a ton