I have searched many documents but couldn't find an exact solution to my problem. I want to implement audio calling and screen sharing in a native Android application using WebRTC, without using any third-party SDK.
I found one demo example, apprtc, but it only supports audio calls. How can I implement screen sharing as well?
You can use the WebRTC facilities on the Android platform with the help of Ant Media Server's native WebRTC Android SDK. The SDK comes bundled with a sample Android project that demonstrates its features.
This answer may be irrelevant for the OP, since the question is very old.
Anyway, for anyone in the future searching for something similar, check this commit in the WebRTC repo. It adds a screen capturer (ScreenCapturerAndroid) for Android.
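For illustration, here is a minimal sketch of how that capturer can be wired into a call, assuming the org.webrtc Android library with the ScreenCapturerAndroid class. The fields peerConnectionFactory, eglBase, and peerConnection, as well as the capture resolution and track id, are placeholders you would adapt to your own project.

```java
import android.content.Context;
import android.content.Intent;
import android.media.projection.MediaProjection;
import android.media.projection.MediaProjectionManager;

import org.webrtc.ScreenCapturerAndroid;
import org.webrtc.SurfaceTextureHelper;
import org.webrtc.VideoCapturer;
import org.webrtc.VideoSource;
import org.webrtc.VideoTrack;

// Inside an Activity: request MediaProjection permission, then feed the
// resulting Intent to WebRTC's ScreenCapturerAndroid.
private static final int SCREEN_CAPTURE_REQUEST_CODE = 1001;

private void requestScreenCapture() {
    MediaProjectionManager projectionManager =
            (MediaProjectionManager) getSystemService(Context.MEDIA_PROJECTION_SERVICE);
    startActivityForResult(projectionManager.createScreenCaptureIntent(),
            SCREEN_CAPTURE_REQUEST_CODE);
}

@Override
protected void onActivityResult(int requestCode, int resultCode, Intent data) {
    super.onActivityResult(requestCode, resultCode, data);
    if (requestCode != SCREEN_CAPTURE_REQUEST_CODE || resultCode != RESULT_OK) {
        return; // user denied the screen capture request
    }

    // ScreenCapturerAndroid wraps MediaProjection as a WebRTC VideoCapturer.
    VideoCapturer screenCapturer = new ScreenCapturerAndroid(
            data, new MediaProjection.Callback() {
                @Override
                public void onStop() {
                    // Capture stopped, e.g. the user revoked permission.
                }
            });

    // isScreencast = true lets WebRTC tune encoding for screen content.
    VideoSource videoSource = peerConnectionFactory.createVideoSource(true);
    SurfaceTextureHelper textureHelper =
            SurfaceTextureHelper.create("ScreenCaptureThread", eglBase.getEglBaseContext());
    screenCapturer.initialize(textureHelper, getApplicationContext(),
            videoSource.getCapturerObserver());
    screenCapturer.startCapture(1280, 720, 30); // width, height, fps are placeholders

    // Add the resulting track to the PeerConnection alongside your audio track.
    VideoTrack screenTrack =
            peerConnectionFactory.createVideoTrack("screen_track", videoSource);
    peerConnection.addTrack(screenTrack);
}
```

Note that on newer Android versions, MediaProjection capture generally has to run while a foreground service (with the mediaProjection service type) is active, so you may need to start one before calling startCapture.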