Android apps, via the Presentation class and related facilities, can display different content on an external display (HDMI, MHL, Miracast, etc.) than on a phone or tablet's own touchscreen (what I'll call the "primary display" here).
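For context, a Presentation is essentially a Dialog tied to a particular Display. A minimal sketch of the idea, with the class name and strings invented purely for illustration and assuming an external display is attached:

```java
import android.app.Activity;
import android.app.Presentation;
import android.content.Context;
import android.hardware.display.DisplayManager;
import android.os.Bundle;
import android.view.Display;
import android.widget.TextView;

public class ExternalDisplayDemoActivity extends Activity {
    @Override
    protected void onCreate(Bundle savedInstanceState) {
        super.onCreate(savedInstanceState);

        DisplayManager dm = (DisplayManager) getSystemService(Context.DISPLAY_SERVICE);
        // Ask only for displays suitable for Presentation content (HDMI, Miracast, etc.)
        Display[] displays = dm.getDisplays(DisplayManager.DISPLAY_CATEGORY_PRESENTATION);

        if (displays.length > 0) {
            // A Presentation is a Dialog themed for the target display; whatever
            // we set as its content view is composited onto that display only.
            Presentation presentation = new Presentation(this, displays[0]) {
                @Override
                protected void onCreate(Bundle state) {
                    super.onCreate(state);
                    TextView tv = new TextView(getContext());
                    tv.setText("Rendered on the external display");
                    setContentView(tv);
                }
            };
            presentation.show();
        }
    }
}
```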
For the primary display, we can use screencap or DDMS to take a screenshot. Is there a way to take a screenshot of the external display?

Ideally, it would be a solution that could be used akin to screencap/DDMS, with no app modifications. However, solutions that involve somehow copying the contents of the Window representing the external display would also be interesting.
There is no OS feature for capturing the external display.
The main display is composited from a number of sources, as is the external display. The "screen capture" feature is actually just another way to composite the various sources. Because it uses the same parameters as the main display, you get the same output. (Note, however, that any "secure" layers, such as DRM-protected video, do not appear.) So to "capture" the external display, you'd want a screen capture variant that uses the external display's "layer stack" rather than the main display's, and there is no API for doing so.
(See SurfaceFlinger::captureScreenImplLocked() in SurfaceFlinger.cpp.)
The screenrecord command does something similar: it creates a virtual display with the main display's layer stack, and captures whatever gets rendered there.
As far as capturing the output of a Presentation goes, there are a couple of CTS tests that may be of interest. VirtualDisplayTest creates a Presentation and captures it with an ImageReader, while EncodeVirtualDisplayTest feeds the Presentation through a full video encode/decode path. The tests just draw a series of colors and then check to make sure the colors can be read back. Both of these require very recent versions of Android (e.g. ImageReader didn't exist until API 19), and the app's limited permissions will restrict the output to what the app itself is rendering.
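To make that concrete, here is a rough sketch of the technique those tests rely on: render a Presentation into an app-owned virtual display and read the frames back through an ImageReader. The class structure, names, and sizes here are my own for illustration; the CTS sources are the authoritative versions.

```java
import android.app.Presentation;
import android.content.Context;
import android.graphics.Bitmap;
import android.graphics.PixelFormat;
import android.hardware.display.DisplayManager;
import android.hardware.display.VirtualDisplay;
import android.media.Image;
import android.media.ImageReader;
import android.os.Handler;
import android.view.Display;

import java.nio.ByteBuffer;

// Hypothetical sketch (API 19+): render a Presentation into an app-owned
// VirtualDisplay and read the frames back with an ImageReader. This only
// sees what this app itself draws; it is not a capture of a real display.
public class PresentationCapture {
    private static final int WIDTH = 1280;
    private static final int HEIGHT = 720;
    private static final int DPI = 213;

    private ImageReader mReader;
    private VirtualDisplay mVirtualDisplay;
    private Presentation mPresentation;

    public void start(Context context, Handler handler) {
        mReader = ImageReader.newInstance(WIDTH, HEIGHT, PixelFormat.RGBA_8888, 2);

        DisplayManager dm =
                (DisplayManager) context.getSystemService(Context.DISPLAY_SERVICE);
        // The ImageReader's Surface becomes the output of the virtual display.
        mVirtualDisplay = dm.createVirtualDisplay("capture-demo",
                WIDTH, HEIGHT, DPI, mReader.getSurface(), 0);

        mReader.setOnImageAvailableListener(new ImageReader.OnImageAvailableListener() {
            @Override
            public void onImageAvailable(ImageReader reader) {
                Image image = reader.acquireLatestImage();
                if (image == null) {
                    return;
                }
                try {
                    Bitmap bitmap = toBitmap(image);
                    // ... inspect or save the bitmap ...
                } finally {
                    image.close();
                }
            }
        }, handler);

        // Show the Presentation on the display we just created; give it content
        // (e.g. a subclass overriding onCreate), then show it.
        Display display = mVirtualDisplay.getDisplay();
        mPresentation = new Presentation(context, display);
        mPresentation.show();
    }

    private static Bitmap toBitmap(Image image) {
        Image.Plane plane = image.getPlanes()[0];
        ByteBuffer buffer = plane.getBuffer();
        int pixelStride = plane.getPixelStride();
        int rowPadding = plane.getRowStride() - pixelStride * WIDTH;
        // Account for row padding when sizing the bitmap.
        Bitmap bitmap = Bitmap.createBitmap(WIDTH + rowPadding / pixelStride,
                HEIGHT, Bitmap.Config.ARGB_8888);
        bitmap.copyPixelsFromBuffer(buffer);
        return bitmap;
    }

    public void stop() {
        if (mPresentation != null) mPresentation.dismiss();
        if (mVirtualDisplay != null) mVirtualDisplay.release();
        if (mReader != null) mReader.close();
    }
}
```

As noted above, this only captures what the app itself renders; it does not provide a system-level capture of a physical HDMI or Miracast display.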