Is it possible to enable subtitles from a Google Cast sender app?

I have implemented Google Cast in my app and HLS streaming is working fine on Chromecast. There is a subtitle track included in the HLS file, but the subtitles are not showing. There are no .vtt files available for the videos, and therefore I cannot use MediaTrack to send a subtitle URL to the receiver app (see the sketch after the code below for what that approach would look like).

I was wondering: is it possible to enable subtitles on the HLS stream from the sender app, or do I need to build a custom receiver app for that?
I am creating the MediaInfo object that is sent to the receiver app in the standard way:
private MediaInfo buildMediaInfo() {
    MediaMetadata movieMetadata = new MediaMetadata(MediaMetadata.MEDIA_TYPE_MOVIE);
    movieMetadata.putString(MediaMetadata.KEY_TITLE, mTitle + " (" + mProdYear + ")");
    movieMetadata.putString(MediaMetadata.KEY_SUBTITLE, mFilmType);
    movieMetadata.addImage(new WebImage(Uri.parse(mImageUrl)));
    return new MediaInfo.Builder(mVideoUrl.toString())
            .setStreamType(MediaInfo.STREAM_TYPE_BUFFERED)
            // "videos/mp4" is not a valid MIME type; an HLS manifest is
            // typically declared as "application/x-mpegurl".
            .setContentType("application/x-mpegurl")
            .setMetadata(movieMetadata)
            // enable subtitles on HLS streaming??
            .build();
}
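For reference, this is roughly what the sideloaded-track approach mentioned above would look like if separate .vtt files did exist. This is a sketch only; the track id, name, language, and URL are placeholder assumptions, and the classes come from com.google.android.gms.cast:

private MediaInfo buildMediaInfoWithSideloadedTrack() {
    // Hypothetical sideloaded text track; the id (1) and the .vtt URL are placeholders.
    MediaTrack subtitleTrack = new MediaTrack.Builder(1 /* track id */, MediaTrack.TYPE_TEXT)
            .setName("English subtitles")
            .setSubtype(MediaTrack.SUBTYPE_SUBTITLES)
            .setContentId("https://example.com/captions_en.vtt") // placeholder URL
            .setLanguage("en")
            .build();

    List<MediaTrack> tracks = new ArrayList<>();
    tracks.add(subtitleTrack);

    return new MediaInfo.Builder(mVideoUrl.toString())
            .setStreamType(MediaInfo.STREAM_TYPE_BUFFERED)
            .setContentType("application/x-mpegurl")
            .setMediaTracks(tracks) // attach the sideloaded track to the media item
            .setMetadata(movieMetadata)
            .build();
}

The receiver would then report this track in its status updates, and the sender could activate it by its track id. In my case there is no such URL to point at, which is the crux of the question.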
You didn't mention what receiver you are using. In the Default/Styled receiver, HLS (and adaptive streams in general) are handled by the Media Player Library. If it sees supported tracks, that information is sent to the connected senders in status updates. Sender apps can then learn about the presence of additional tracks (and their associated track ids) and turn them on or off.

For example, if you use CCL, this is handled for you automatically on the sender side. To see it in action, grab CastVideos-android-v2 (which uses CCL) and use two phones: connect one to the receiver, start one of the first three movies (those have closed captions), go to the full-screen cast controller page, turn on closed captions, and select the text track. Then connect the second phone to the same receiver; it syncs up with what is playing on the receiver, and if you open the full-screen controller on the second phone, you'll see that it knows about the closed captions and which track is enabled.
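To make the "turn them on or off" part concrete, here is a minimal sketch of how a Cast v2 sender (the API that CCL wraps) might read the track list from a status update and activate a text track. It assumes an already-connected GoogleApiClient (mApiClient) and an attached RemoteMediaPlayer (mRemoteMediaPlayer); those variable names are placeholders:

// Assumes mApiClient is connected and mRemoteMediaPlayer is attached to the Cast session.
mRemoteMediaPlayer.setOnStatusUpdatedListener(new RemoteMediaPlayer.OnStatusUpdatedListener() {
    @Override
    public void onStatusUpdated() {
        MediaInfo mediaInfo = mRemoteMediaPlayer.getMediaInfo();
        if (mediaInfo == null || mediaInfo.getMediaTracks() == null) {
            return;
        }
        for (MediaTrack track : mediaInfo.getMediaTracks()) {
            // Look for a text track reported by the receiver (embedded or sideloaded).
            if (track.getType() == MediaTrack.TYPE_TEXT) {
                // Activate it by id; an empty array turns all text tracks off.
                mRemoteMediaPlayer.setActiveMediaTracks(mApiClient, new long[] { track.getId() });
                break;
            }
        }
    }
});

In the newer Cast v3 SDK the equivalent call is RemoteMediaClient.setActiveMediaTracks(long[]), but the flow is the same: wait for a status update, read the reported track ids, then activate the one you want.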