 

How to play multiple video files simultaneously in one layout, side by side in different views, in Android

In Android, I have created a layout with three SurfaceViews side by side, and I want to play a video file on each of them simultaneously, using a separate MediaPlayer instance per view. The problem I'm facing is that the three views cannot all play at the same time; one or two of them stop displaying. I also tried VideoView instead of using the MediaPlayer class directly, but the problem remains the same. Can anybody help? What is causing this? The error I get is "surface creation failed, native error". I have tried different combinations, such as one file in three different views and three files in three different views, but the problem is still not fixed. Some replies on other websites say that it depends on the kernel version. If it does depend on the kernel version, can you please give me a link to the Android documentation that says so? Or, if it is possible to play them, please give me the code steps. This is the error log -

04-10 19:23:37.995: E/ANDROID_DRM_TEST(2573): Client::notify In
04-10 19:23:37.995: V/AudioPolicyManager(2573): startOutput() output 1, stream 3,  session 131
04-10 19:23:37.995: V/AudioPolicyManager(2573): getDeviceForStrategy() from cache strategy 0, device 2
04-10 19:23:37.995: V/AudioPolicyManager(2573): getNewDevice() selected device 2
04-10 19:23:37.995: V/AudioPolicyManager(2573): setOutputDevice() output 1 device 2 delayMs 0
04-10 19:23:37.995: V/AudioPolicyManager(2573): setOutputDevice() setting same device 2 or null device for output 1
04-10 19:23:37.995: I/AudioFlinger(2573): start output streamType (0, 3) for 1
04-10 19:23:37.995: D/AudioHardwareYamaha(2573): AudioStreamOut::setParameters(keyValuePairs="start_output_streamtype=3")
04-10 19:23:38.010: W/SEC_Overlay(2689): overlay_setPosition(0) 0,0,200,397 => 0,0,200,397
04-10 19:23:38.010: I/SEC_Overlay(2689): overlay_setParameter param[4]=4
04-10 19:23:38.010: D/SEC_Overlay(2689): dst width, height have changed [w= 200, h= 397] -> [w=200, h= 397]
04-10 19:23:38.010: I/SEC_Overlay(2689): Nothing to do!
04-10 19:23:38.090: E/VideoMIO(2573): AndroidSurfaceOutput::setParametersSync()  VIDEO ROTATION 0
04-10 19:23:38.090: E/VideoMIO(2573): AndroidSurfaceOutput::setParametersSync()  VIDEO RENDERER 1
04-10 19:23:38.090: D/SEC_Overlay(2689): overlay_createOverlay:IN w=128 h=96 format=48
04-10 19:23:38.090: E/SEC_Overlay(2689): Error - overlays already in use
04-10 19:23:38.090: D/VideoMIO(2573): Overlay create failed - retrying
04-10 19:23:38.195: D/SEC_Overlay(2689): overlay_createOverlay:IN w=128 h=96 format=48
04-10 19:23:38.195: E/SEC_Overlay(2689): Error - overlays already in use
04-10 19:23:38.195: D/VideoMIO(2573): Overlay create failed - retrying
04-10 19:23:38.230: E/VideoMIO(2573): AndroidSurfaceOutput::setParametersSync()  VIDEO ROTATION 0
04-10 19:23:38.230: E/VideoMIO(2573): AndroidSurfaceOutput::setParametersSync()  VIDEO RENDERER 1
04-10 19:23:38.230: D/SEC_Overlay(2689): overlay_createOverlay:IN w=128 h=96 format=48
04-10 19:23:38.230: E/SEC_Overlay(2689): Error - overlays already in use
04-10 19:23:38.230: D/VideoMIO(2573): Overlay create failed - retrying
04-10 19:23:38.295: D/SEC_Overlay(2689): overlay_createOverlay:IN w=128 h=96 format=48
04-10 19:23:38.295: E/SEC_Overlay(2689): Error - overlays already in use
04-10 19:23:38.295: D/VideoMIO(2573): Overlay create failed - retrying
04-10 19:23:38.330: D/SEC_Overlay(2689): overlay_createOverlay:IN w=128 h=96 format=48
04-10 19:23:38.330: E/SEC_Overlay(2689): Error - overlays already in use
04-10 19:23:38.330: D/VideoMIO(2573): Overlay create failed - retrying
04-10 19:23:38.395: D/SEC_Overlay(2689): overlay_createOverlay:IN w=128 h=96 format=48
04-10 19:23:38.395: E/SEC_Overlay(2689): Error - overlays already in use
04-10 19:23:38.395: D/VideoMIO(2573): Overlay create failed - retrying
04-10 19:23:38.435: D/SEC_Overlay(2689): overlay_createOverlay:IN w=128 h=96 format=48
04-10 19:23:38.435: E/SEC_Overlay(2689): Error - overlays already in use
04-10 19:23:38.435: D/VideoMIO(2573): Overlay create failed - retrying
04-10 19:23:38.495: D/SEC_Overlay(2689): overlay_createOverlay:IN w=128 h=96 format=48
04-10 19:23:38.495: E/SEC_Overlay(2689): Error - overlays already in use
04-10 19:23:38.495: D/VideoMIO(2573): Overlay create failed - retrying
04-10 19:23:38.535: D/SEC_Overlay(2689): overlay_createOverlay:IN w=128 h=96 format=48

2 Answers

You are not giving an awful lot of specifics on what exactly you have tried and what the problematic areas are, so I just made a small test to see if I could reproduce any of what you're describing.

I do not have any conclusive findings, but can at least confirm that my Galaxy Nexus (Android 4.0.2) is able to play three videos simultaneously without any problems. On the other hand, an old Samsung Galaxy Spica (Android 2.1-update1) I had lying around only plays a single file at a time - it appears to always be the first SurfaceView.

I further investigated different API levels by setting up emulators for Android 3.0, 2.3.3, and 2.2. All these platforms appear to be able to handle playback of multiple video files onto different surface views just fine. I did one final test with an emulator running 2.1-update1 too, which interestingly also played the test case without problems, unlike the actual phone. I did notice some slight differences in how the layout was rendered though.

This behaviour leads me to suspect that there isn't really a software limitation for what you're after; rather, whether simultaneous playback of multiple video files is supported seems to depend on the hardware, so support for this scenario will differ per device. From an empirical point of view, I definitely think it would be interesting to test this hypothesis on some more physical devices.
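
As an aside (this API did not exist when the test above was done): on API 23+ you can at least query how many concurrent decoder instances a device's codec advertises, via MediaCodecInfo.CodecCapabilities.getMaxSupportedInstances(). A minimal sketch, assuming an H.264 ("video/avc") decoder and treating the result as an upper bound rather than a guarantee:

import android.media.MediaCodecInfo;
import android.media.MediaCodecList;

// Sketch (API 23+, not part of the original test): ask the first matching decoder
// how many concurrent instances it claims to support for the given MIME type.
private int maxConcurrentDecoders(String mimeType) {
    MediaCodecList codecList = new MediaCodecList(MediaCodecList.REGULAR_CODECS);
    for (MediaCodecInfo info : codecList.getCodecInfos()) {
        if (info.isEncoder()) continue;
        for (String type : info.getSupportedTypes()) {
            if (type.equalsIgnoreCase(mimeType)) {
                // upper bound only; actual limits also depend on resolution and memory
                return info.getCapabilitiesForType(type).getMaxSupportedInstances();
            }
        }
    }
    return -1; // no decoder found for this MIME type
}

// e.g. maxConcurrentDecoders("video/avc")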

Just for reference some details with regards to the implementation:

  1. I set up two slightly different implementations: one based on three MediaPlayer instances in a single Activity, and one in which these were factored out into three separate fragments, each with its own MediaPlayer object. (I did not find any playback differences between these two implementations, by the way.)
  2. A single 3gp file (thanks for that, Apple), located in the assets folder, was used for playback with all players.
  3. The code for both implementations is attached below and is largely based on Google's MediaPlayerDemo_Video sample implementation - I did strip away some code not required for the actual testing. The result is by no means complete or suitable for use in live apps.

Activity-based implementation:

public class MultipleVideoPlayActivity extends Activity implements
    OnBufferingUpdateListener, OnCompletionListener, OnPreparedListener, OnVideoSizeChangedListener, SurfaceHolder.Callback {

    private static final String TAG = "MediaPlayer";
    private static final int[] SURFACE_RES_IDS = { R.id.video_1_surfaceview, R.id.video_2_surfaceview, R.id.video_3_surfaceview };

    private MediaPlayer[] mMediaPlayers = new MediaPlayer[SURFACE_RES_IDS.length];
    private SurfaceView[] mSurfaceViews = new SurfaceView[SURFACE_RES_IDS.length];
    private SurfaceHolder[] mSurfaceHolders = new SurfaceHolder[SURFACE_RES_IDS.length];
    private boolean[] mSizeKnown = new boolean[SURFACE_RES_IDS.length];
    private boolean[] mVideoReady = new boolean[SURFACE_RES_IDS.length];

    @Override public void onCreate(Bundle icicle) {
        super.onCreate(icicle);
        setContentView(R.layout.multi_videos_layout);

        // create surface holders
        for (int i=0; i<mSurfaceViews.length; i++) {
            mSurfaceViews[i] = (SurfaceView) findViewById(SURFACE_RES_IDS[i]);
            mSurfaceHolders[i] = mSurfaceViews[i].getHolder();
            mSurfaceHolders[i].addCallback(this);
            mSurfaceHolders[i].setType(SurfaceHolder.SURFACE_TYPE_PUSH_BUFFERS);
        }
    }

    public void onBufferingUpdate(MediaPlayer player, int percent) {
        Log.d(TAG, "MediaPlayer(" + indexOf(player) + "): onBufferingUpdate percent: " + percent);
    }

    public void onCompletion(MediaPlayer player) {
        Log.d(TAG, "MediaPlayer(" + indexOf(player) + "): onCompletion called");
    }

    public void onVideoSizeChanged(MediaPlayer player, int width, int height) {
        Log.v(TAG, "MediaPlayer(" + indexOf(player) + "): onVideoSizeChanged called");
        if (width == 0 || height == 0) {
            Log.e(TAG, "invalid video width(" + width + ") or height(" + height + ")");
            return;
        }

        int index = indexOf(player);
        if (index == -1) return; // sanity check; should never happen
        mSizeKnown[index] = true;
        if (mVideoReady[index] && mSizeKnown[index]) {
            startVideoPlayback(player);
        }
    }

    public void onPrepared(MediaPlayer player) {
        Log.d(TAG, "MediaPlayer(" + indexOf(player) + "): onPrepared called");

        int index = indexOf(player);
        if (index == -1) return; // sanity check; should never happen
        mVideoReady[index] = true;
        if (mVideoReady[index] && mSizeKnown[index]) {
            startVideoPlayback(player);
        }
    }

    public void surfaceChanged(SurfaceHolder holder, int i, int j, int k) {
        Log.d(TAG, "SurfaceHolder(" + indexOf(holder) + "): surfaceChanged called");
    }

    public void surfaceDestroyed(SurfaceHolder holder) {
        Log.d(TAG, "SurfaceHolder(" + indexOf(holder) + "): surfaceDestroyed called");
    }


    public void surfaceCreated(SurfaceHolder holder) {
        Log.d(TAG, "SurfaceHolder(" + indexOf(holder) + "): surfaceCreated called");

        int index = indexOf(holder);
        if (index == -1) return; // sanity check; should never happen
        try { 
            mMediaPlayers[index] = new MediaPlayer();
            AssetFileDescriptor afd = getAssets().openFd("sample.3gp");
            mMediaPlayers[index].setDataSource(afd.getFileDescriptor(), afd.getStartOffset(), afd.getLength()); 
            mMediaPlayers[index].setDisplay(mSurfaceHolders[index]);
            mMediaPlayers[index].prepare();
            mMediaPlayers[index].setOnBufferingUpdateListener(this);
            mMediaPlayers[index].setOnCompletionListener(this);
            mMediaPlayers[index].setOnPreparedListener(this);
            mMediaPlayers[index].setOnVideoSizeChangedListener(this);
            mMediaPlayers[index].setAudioStreamType(AudioManager.STREAM_MUSIC);
        }
        catch (Exception e) { e.printStackTrace(); }
    }

    @Override protected void onPause() {
        super.onPause();
        releaseMediaPlayers();
    }

    @Override protected void onDestroy() {
        super.onDestroy();
        releaseMediaPlayers();
    }

    private void releaseMediaPlayers() {
        for (int i=0; i<mMediaPlayers.length; i++) {
            if (mMediaPlayers[i] != null) {
                mMediaPlayers[i].release();
                mMediaPlayers[i] = null;
            }
        }
    }


    private void startVideoPlayback(MediaPlayer player) {
        Log.v(TAG, "MediaPlayer(" + indexOf(player) + "): startVideoPlayback");
        player.start();
    }

    private int indexOf(MediaPlayer player) {
        for (int i=0; i<mMediaPlayers.length; i++) if (mMediaPlayers[i] == player) return i;
        return -1;  
    }

    private int indexOf(SurfaceHolder holder) {
        for (int i=0; i<mSurfaceHolders.length; i++) if (mSurfaceHolders[i] == holder) return i;
        return -1;  
    }
}

R.layout.multi_videos_layout:

<?xml version="1.0" encoding="utf-8"?>
<LinearLayout xmlns:android="http://schemas.android.com/apk/res/android"
    android:layout_width="match_parent" android:layout_height="match_parent"
    android:orientation="vertical">

    <SurfaceView android:id="@+id/video_1_surfaceview"
        android:layout_width="fill_parent" android:layout_height="0dp"
        android:layout_weight="1" />

    <SurfaceView android:id="@+id/video_2_surfaceview"
        android:layout_width="fill_parent" android:layout_height="0dp"
        android:layout_weight="1" />

    <SurfaceView android:id="@+id/video_3_surfaceview"
        android:layout_width="fill_parent" android:layout_height="0dp"
        android:layout_weight="1" />

</LinearLayout>

Fragment-based implementation:

public class MultipleVideoPlayFragmentActivity extends FragmentActivity {

    private static final String TAG = "MediaPlayer";

    @Override public void onCreate(Bundle icicle) {
        super.onCreate(icicle);
        setContentView(R.layout.multi_videos_activity_layout);
    }

    public static class VideoFragment extends Fragment implements
        OnBufferingUpdateListener, OnCompletionListener, OnPreparedListener, OnVideoSizeChangedListener, SurfaceHolder.Callback {

        private MediaPlayer mMediaPlayer;
        private SurfaceView mSurfaceView;
        private SurfaceHolder mSurfaceHolder;
        private boolean mSizeKnown;
        private boolean mVideoReady;

        @Override public View onCreateView(LayoutInflater inflater, ViewGroup container, Bundle savedInstanceState) {
            return inflater.inflate(R.layout.multi_videos_fragment_layout, container, false);
        }

        @Override public void onActivityCreated(Bundle savedInstanceState) {
            super.onActivityCreated(savedInstanceState);
            mSurfaceView = (SurfaceView) getView().findViewById(R.id.video_surfaceview);
            mSurfaceHolder = mSurfaceView.getHolder();
            mSurfaceHolder.addCallback(this);
            mSurfaceHolder.setType(SurfaceHolder.SURFACE_TYPE_PUSH_BUFFERS);
        }

        public void onBufferingUpdate(MediaPlayer player, int percent) {
            Log.d(TAG, "onBufferingUpdate percent: " + percent);
        }

        public void onCompletion(MediaPlayer player) {
            Log.d(TAG, "onCompletion called");
        }

        public void onVideoSizeChanged(MediaPlayer player, int width, int height) {
            Log.v(TAG, "onVideoSizeChanged called");
            if (width == 0 || height == 0) {
                Log.e(TAG, "invalid video width(" + width + ") or height(" + height + ")");
                return;
            }

            mSizeKnown = true;
            if (mVideoReady && mSizeKnown) {
                startVideoPlayback();
            }
        }

        public void onPrepared(MediaPlayer player) {
            Log.d(TAG, "onPrepared called");

            mVideoReady = true;
            if (mVideoReady && mSizeKnown) {
                startVideoPlayback();
            }
        }

        public void surfaceChanged(SurfaceHolder holder, int i, int j, int k) {
            Log.d(TAG, "surfaceChanged called");
        }

        public void surfaceDestroyed(SurfaceHolder holder) {
            Log.d(TAG, "surfaceDestroyed called");
        }

        public void surfaceCreated(SurfaceHolder holder) {
            Log.d(TAG, "surfaceCreated called");

            try { 
                mMediaPlayer = new MediaPlayer();
                AssetFileDescriptor afd = getActivity().getAssets().openFd("sample.3gp");
                mMediaPlayer.setDataSource(afd.getFileDescriptor(), afd.getStartOffset(), afd.getLength()); 
                mMediaPlayer.setDisplay(mSurfaceHolder);
                mMediaPlayer.prepare();
                mMediaPlayer.setOnBufferingUpdateListener(this);
                mMediaPlayer.setOnCompletionListener(this);
                mMediaPlayer.setOnPreparedListener(this);
                mMediaPlayer.setOnVideoSizeChangedListener(this);
                mMediaPlayer.setAudioStreamType(AudioManager.STREAM_MUSIC);
            }
            catch (Exception e) { e.printStackTrace(); }
        }

        @Override public void onPause() {
            super.onPause();
            releaseMediaPlayer();
        }

        @Override public void onDestroy() {
            super.onDestroy();
            releaseMediaPlayer();
        }

        private void releaseMediaPlayer() {
            if (mMediaPlayer != null) {
                mMediaPlayer.release();
                mMediaPlayer = null;
            }
        }

        private void startVideoPlayback() {
            Log.v(TAG, "startVideoPlayback");
            mMediaPlayer.start();
        }
    }
}

R.layout.multi_videos_activity_layout:

<?xml version="1.0" encoding="utf-8"?>
<LinearLayout xmlns:android="http://schemas.android.com/apk/res/android"
    android:layout_width="match_parent" android:layout_height="match_parent"
    android:orientation="vertical">

    <fragment class="mh.so.video.MultipleVideoPlayFragmentActivity$VideoFragment"
        android:id="@+id/video_1_fragment" android:layout_width="fill_parent"
        android:layout_height="0dp" android:layout_weight="1" />

    <fragment class="mh.so.video.MultipleVideoPlayFragmentActivity$VideoFragment"
        android:id="@+id/video_2_fragment" android:layout_width="fill_parent"
        android:layout_height="0dp" android:layout_weight="1" />

    <fragment class="mh.so.video.MultipleVideoPlayFragmentActivity$VideoFragment"
        android:id="@+id/video_3_fragment" android:layout_width="fill_parent"
        android:layout_height="0dp" android:layout_weight="1" />

</LinearLayout>

R.layout.multi_videos_fragment_layout:

<?xml version="1.0" encoding="utf-8"?>
<SurfaceView xmlns:android="http://schemas.android.com/apk/res/android"
    android:id="@+id/video_surfaceview" android:layout_width="fill_parent"
    android:layout_height="fill_parent" />

Update: Although it's been around for a while now, I just thought it'd be worth pointing out that Google's Grafika project showcases a 'double decode' feature, which "decodes two video streams simultaneously to two TextureViews". I'm not sure how well it scales to more than two video files, but it is nevertheless relevant to the original question.
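
For illustration, here is a minimal sketch in the spirit of that sample, but using MediaPlayer rather than Grafika's MediaCodec pipeline: each TextureView hands its SurfaceTexture to its own player. The TextureView id is hypothetical, the sample.3gp asset is reused from the code above, and release handling is omitted for brevity.

// Minimal sketch (not Grafika itself): one MediaPlayer rendering into a TextureView.
// Repeat with a second TextureView/MediaPlayer pair for the 'double decode' case.
TextureView textureView = (TextureView) findViewById(R.id.video_1_textureview); // hypothetical id
textureView.setSurfaceTextureListener(new TextureView.SurfaceTextureListener() {
    @Override public void onSurfaceTextureAvailable(SurfaceTexture texture, int width, int height) {
        try {
            MediaPlayer player = new MediaPlayer();
            AssetFileDescriptor afd = getAssets().openFd("sample.3gp");
            player.setDataSource(afd.getFileDescriptor(), afd.getStartOffset(), afd.getLength());
            player.setSurface(new Surface(texture)); // TextureView instead of SurfaceView
            player.prepare();
            player.start();
        } catch (Exception e) { e.printStackTrace(); }
    }
    @Override public void onSurfaceTextureSizeChanged(SurfaceTexture texture, int width, int height) {}
    @Override public boolean onSurfaceTextureDestroyed(SurfaceTexture texture) { return true; }
    @Override public void onSurfaceTextureUpdated(SurfaceTexture texture) {}
});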


Check out this code; it works:

video1 = (VideoView) findViewById(R.id.myvideoview);
video1.setVideoURI(Uri.parse("android.resource://" + getPackageName() + "/" + R.raw.sample));
video1.setMediaController(new MediaController(this));
video1.requestFocus();

video2 = (VideoView) findViewById(R.id.myvideview);
video2.setVideoURI(Uri.parse("android.resource://" + getPackageName() + "/" + R.raw.sample1));
video2.setMediaController(new MediaController(this));
video2.requestFocus();

Thread view1 = new Thread(new Runnable() {

    @Override
    public void run() {
        // raise the thread priority so rendering isn't starved, then start playback
        android.os.Process.setThreadPriority(android.os.Process.THREAD_PRIORITY_DISPLAY);
        video1.start();
    }
});

Thread view2 = new Thread(new Runnable() {

    @Override
    public void run() {
        android.os.Process.setThreadPriority(android.os.Process.THREAD_PRIORITY_DISPLAY);
        video2.start();
    }
});

// the threads must actually be started, otherwise neither video plays
view1.start();
view2.start();

But whether multiple VideoViews can play at the same time depends on your device hardware. If it isn't supported, you will get the error "This video can not be played" (Error (1, -110)). One way to catch that failure in code is sketched below.
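
A sketch (not from the original answer): attach an error listener to the VideoView so the failure can be handled programmatically instead of showing the system's error dialog.

video2.setOnErrorListener(new MediaPlayer.OnErrorListener() {
    @Override
    public boolean onError(MediaPlayer mp, int what, int extra) {
        // e.g. what=1 (MEDIA_ERROR_UNKNOWN), extra=-110 on devices that can't decode both streams
        Log.e("MultiVideo", "Playback failed: what=" + what + " extra=" + extra);
        // fall back here, e.g. play the videos one after another instead
        return true; // true = handled; suppresses the "Can't play this video" dialog
    }
});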
