After looking into implementing WebRTC with a client-to-server model (like Discord), I concluded that the way to do this is to run two peers: one on the server and one on the client. Audio streams can then be mixed on the server and sent back to each user as one combined stream.
backend/server.js
const Peer = require('simple-peer');
const wrtc = require('wrtc');

const clientPeer = new Peer({ initiator: true, wrtc });
clientPeer.on('connect', () => console.log('hi client, this is server'));
clientPeer.on('data', (data) => console.log('got a message from client peer:', data));
frontend/index.js
// serverPeer is the simple-peer instance connected to the server,
// e.g. const serverPeer = new SimplePeer({ initiator: false });
serverPeer.on('connect', () => console.log('Connected to server'));
serverPeer.on('stream', async (stream) => {
  const audio = document.createElement('audio');
  if ('srcObject' in audio) {
    audio.srcObject = stream;
  } else {
    audio.src = window.URL.createObjectURL(stream); // fallback for older browsers
  }
  await audio.play();
});
How would I implement sending media streams between the client and server?
A possible solution: create a MediaRecorder object, which records a media stream on the client side. It emits encoded data chunks over time, and you can send those chunks to the server via WebSocket. On the server side, you can then process the chunks however you like.
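A minimal client-side sketch of this approach, assuming it runs in a module (for top-level await); the WebSocket URL, the `audio/webm` MIME type, and the 1-second timeslice are placeholders you would adjust:

```javascript
// Capture the microphone, record it, and stream encoded chunks to the server.
const ws = new WebSocket('wss://example.com/ingest'); // placeholder URL
ws.binaryType = 'arraybuffer';

const stream = await navigator.mediaDevices.getUserMedia({ audio: true });
const recorder = new MediaRecorder(stream, { mimeType: 'audio/webm' });

// ondataavailable fires once per timeslice with a Blob of encoded audio.
recorder.ondataavailable = (event) => {
  if (event.data.size > 0 && ws.readyState === WebSocket.OPEN) {
    ws.send(event.data);
  }
};

recorder.start(1000); // emit a chunk every 1000 ms
```

Note that the chunks are a container-level byte stream, not raw PCM, so the server must demux/decode them (e.g. with ffmpeg) before mixing.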
For more details, see https://mux.com/blog/the-state-of-going-live-from-a-browser/.
https://developer.mozilla.org/en-US/docs/Web/API/MediaRecorder

Another solution: make the Node.js application itself a WebRTC peer.
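Under this second approach, the Node process runs simple-peer with the wrtc package, as in the question's backend snippet. A minimal sketch, assuming simple-peer and wrtc are installed and that signaling (exchanging `peer.on('signal')` payloads with the browser) is wired up separately:

```javascript
const Peer = require('simple-peer');
const wrtc = require('wrtc');

// initiator: false here — the browser client initiates the connection.
const peer = new Peer({ initiator: false, wrtc });

peer.on('stream', (stream) => {
  // Echo the client's audio straight back for demonstration.
  // A real server would mix several users' streams at this point
  // and add the combined stream instead.
  peer.addStream(stream);
});
```

This keeps the media path entirely inside WebRTC, at the cost of doing audio processing against wrtc's Node bindings rather than a browser.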