
Stream overlapping audio files to Chromecast Audio

In short

I would like to stream multiple overlapping audio files (sound effects that play at certain, sometimes random, times): a generated audio stream that will NEVER repeat exactly the same way. Some audio files loop, others play at specific times, so some kind of real-time stream insertion is probably needed.

What is the best way to write such server software? What protocol should be used for the streaming (I would prefer HTTP)? I would probably want to expose a URL for each configuration (tracks & timing of sound effects).

Any pointers to code/libraries? Ideally in a language like Java/Kotlin/Go/Rust/Ruby/Python/Node/...

Example

URL: https://server.org/audio?file1=loop&file2=every30s&file2_volume=0.5

Response: Audio stream (that plays on cast devices)

The stream loops file1. Every 30s it plays file2 at 50% volume (overlaid over file1, which plays at 100%). file1 is about 10m9s long, so the combination never really repeats, which means we cannot just serve a pregenerated MP3 file.

Some background

I currently have an Android application that plays different audio files at random. Some loop, some play every x seconds. Sometimes as many as 10 play at the same time.

Now I would like to add support for Chromecast/Chromecast Audio/Google Home/... . I guess the best approach would be a server that streams the result. Every user would have their own stream while playing; there is no need for multiple users to listen to the same stream (even though that would probably work as well).

The server would basically read the URL, get the configuration, and then respond with an audio stream. The server opens one (or multiple) audio files that it then combines/overlays into a single stream. Some audio files are looped; others are opened at specific times and overlaid onto the stream. Each audio file plays at a different volume level (some louder, some quieter). The question is how to produce such an audio stream and how to mix the different files in in real time.
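For the mixing itself, I imagine mapping each configured track to a per-input filter and then mixing everything down, roughly like this sketch (the config shape and function name are made up; volume and amix are real ffmpeg filters):

```javascript
// Sketch: turn a track list into an ffmpeg -filter_complex string.
// Assumes the inputs are opened in the same order as the track list.
function buildFilterGraph(tracks) {
  const labels = [];
  const parts = [];
  tracks.forEach((t, i) => {
    const label = `t${i}`;
    // scale each input to its own loudness before mixing
    parts.push(`[${i}]volume=${t.volume}[${label}]`);
    labels.push(`[${label}]`);
  });
  // overlay all labelled streams into a single output
  parts.push(`${labels.join('')}amix=inputs=${tracks.length}`);
  return parts.join(';');
}
```

Looping and "every 30s" scheduling would still need to be handled on top of this, e.g. via input options or additional filters.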

Patrick Boos asked Mar 26 '18 02:03




1 Answer

So there are two parts to your problem:

  • Mixing the audio files according to the different options
  • Streaming the mixed result from a web server

I can help you with the latter part; you will need to figure out the first part yourself.

Below is a sample Node.js script. Create a directory and run

npm init
npm install fluent-ffmpeg express

and then save the following file:

server.js

const ff = require('fluent-ffmpeg');
const express = require('express');
const app = express();

app.get('/merged', (req, res) => {
    res.contentType('mp3');
    const command = ff()
        .input('1.mp3')
        .input('2.mp3')
        .input('3.mp3')
        .input('4.mp3')
        // adelay values are per-channel delays in milliseconds;
        // amix=4 mixes the four labelled streams into one
        .complexFilter(`[1]adelay=2|5[b];
        [2]adelay=10|12[c];
        [3]adelay=4|6[d];
        [0][b][c][d]amix=4`)
        .outputOptions(['-f', 'mp3']);

    command.on('error', (err, stdout, stderr) => {
        console.log('ffmpeg stdout: ' + stdout);
        console.log('ffmpeg stderr: ' + stderr);
    });
    command.on('end', () => {
        console.log('Processing finished');
    });
    command.pipe(res, { end: true });
});

app.listen(9090);

Run it using the command below:

node server.js

Now in VLC open http://localhost:9090/merged


Now for your requirement, the part below is what will change:

        .complexFilter(`[1]adelay=2|5[b];
        [2]adelay=10|12[c];
        [3]adelay=4|6[d];
        [0][b][c][d]amix=4`)

But I am no ffmpeg expert, so I cannot guide you much further in that area. Perhaps that calls for another question, or for taking a lead from the many existing SO threads:

ffmpeg - how to merge multiple audio with time offset into a video?

How to merge two audio files while retaining correct timings with ffmpeg

ffmpeg mix audio at specific time

https://superuser.com/questions/850527/combine-three-videos-between-specific-time-using-ffmpeg
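For what it's worth, here is a hedged sketch (not from the answer above) of how the ffmpeg argument list for the question's concrete example might look. `-stream_loop -1` loops an input indefinitely, and the effect file is assumed to be pre-padded with silence to exactly 30 s, so that looping it plays the effect every 30 seconds. volume and amix are real ffmpeg filters; the function name and file layout are illustrative:

```javascript
// Sketch for "file1 loops forever, file2 repeats every 30 s at 50% volume".
// Assumption: effectFile has been pre-padded with silence to a 30 s duration,
// so looping it with -stream_loop yields "play every 30 seconds".
function buildArgs(loopFile, effectFile) {
  return [
    '-stream_loop', '-1', '-i', loopFile,    // file1: repeat indefinitely
    '-stream_loop', '-1', '-i', effectFile,  // file2: 30 s block (effect + silence)
    '-filter_complex', '[1]volume=0.5[fx];[0][fx]amix=inputs=2',
    '-f', 'mp3', 'pipe:1',                   // stream MP3 to stdout
  ];
}
```

These arguments could be handed to a spawned ffmpeg process with its stdout piped to the HTTP response, similar to what fluent-ffmpeg does internally.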

Tarun Lalwani answered Oct 18 '22 21:10