Audio file captured by MediaRecorder is broken after it is sent to server using Retrofit 2

My app records an audio clip and sends it to the server using Retrofit 2 after the recording is completed. The file arrives on the server, but it is broken: it cannot be played. I play the clip via a URL (for example, mydomain.co/audio/myaudio.mp4). I verified this playback setup by uploading a different audio file with Postman, and that file plays successfully. Even downloading the clip captured by Android via FileZilla gives the same broken file.

This is how I record the audio:

private void startRecordingAudio() {
    Log.d("audiorecording","recording sound");
    recorder = new MediaRecorder();
    recorder.setAudioSource(MediaRecorder.AudioSource.MIC);
    recorder.setOutputFormat(MediaRecorder.OutputFormat.MPEG_4);
    recorder.setAudioEncoder(MediaRecorder.AudioEncoder.AAC);
    recorder.setAudioEncodingBitRate(16);
    recorder.setAudioSamplingRate(44100);

    MediaFileHelper fileHelper = new MediaFileHelper();
    audioOutputFile = fileHelper.getOutputAudioFile();
    recorder.setOutputFile(audioOutputFile.getAbsolutePath());

    try {
        recorder.prepare();
        recorder.start();
    } catch (IllegalStateException | IOException e) {
        e.printStackTrace();
    }
}

Here is the file path of the recorded audio file:

/storage/emulated/0/DCIM/myapp/AUDIO_20171023143717.mp4

To solve this, I tried different codecs and different output formats, as follows:

recorder.setOutputFormat(MediaRecorder.OutputFormat.THREE_GPP);
recorder.setAudioEncoder(MediaRecorder.AudioEncoder.AMR_NB);

recorder.setOutputFormat(MediaRecorder.OutputFormat.THREE_GPP);
recorder.setAudioEncoder(MediaRecorder.AudioEncoder.AMR_WB);

I have also tried changing the file extension to .mp3, .mp4 and .3gpp, but none of them seem to work on the server side. To make things clear, I've attached the code I use to send the audio file to the server using Retrofit 2:

private void sendAudioToServer(final String audioFilePath, final String api_key) {
    Log.d("AUDIO FILE PATH", audioFilePath);
    File audioFile = new File(audioFilePath);
    // Build the request body from the File itself, not the path String
    RequestBody audioBody = RequestBody.create(MediaType.parse("audio/*"), audioFile);
    MultipartBody.Part aFile = MultipartBody.Part.createFormData("audio", audioFile.getName(), audioBody);

    OkHttpClient httpClient = new OkHttpClient.Builder()
            .addInterceptor(new Interceptor() {
                @Override
                public okhttp3.Response intercept(Chain chain) throws IOException {
                    okhttp3.Request.Builder ongoing = chain.request().newBuilder();
                    ongoing.addHeader("authorization", api_key);
                    return chain.proceed(ongoing.build());
                }
            })
            .build();

    Retrofit retrofit = new Retrofit.Builder()
            .baseUrl(AppConfig.BASE_URL)
            .addConverterFactory(GsonConverterFactory.create())
            .client(httpClient)
            .build();

    AudioInterface audioInterface = retrofit.create(AudioInterface.class);
    Call<ResultObject> serverCom = audioInterface.sendAudioToServer(aFile);

    serverCom.enqueue(new Callback<ResultObject>() {
        @Override
        public void onResponse(Call<ResultObject> call, retrofit2.Response<ResultObject> response) {
            ResultObject result = response.body();
            if (result != null && !TextUtils.isEmpty(result.getSuccess())) {
                Log.d("audio Result", result.getSuccess());
            }
        }
        }

        @Override
        public void onFailure(Call<ResultObject> call, Throwable t) {
            Log.d("audio error",t.toString());
        }
    });
}

My questions are:

1) What is the correct way to send the audio file to the server from Android?

2) If the problem is the codec used by the audio recorder, what is the correct codec for the audio file? The same audio will later need to play on iOS and the web as well.

Any suggestions would be appreciated. Thanks in advance.

Edit:

I tried to play the recorded audio file using MediaPlayer. The file path is:

/storage/emulated/0/DCIM/Myapp/AUDIO_20171026135950.mp4

Here is my code to play the audio file above:

final String audioFile = "/storage/emulated/0/DCIM/Myapp/AUDIO_20171026135950.mp4";
audioButton.setOnClickListener(new View.OnClickListener() {
    @Override
    public void onClick(View v) {
        MediaPlayer mp = new MediaPlayer();
        Uri uri = Uri.parse(audioFile);
        try {
            mp.setDataSource(mContext, uri);
            mp.prepare();
            mp.start();
        } catch (IOException e) {
            e.printStackTrace();
        }
    }
});

But it throws the following exception when I play the local audio file:

W/MediaPlayer: Couldn't open file on client side; trying server side: java.io.FileNotFoundException: No content provider: /storage/emulated/0/DCIM/Myapp/AUDIO_20171026135950.mp4

If I play the URL from the server, it throws this exception:

10-26 14:06:05.551 8806-8806/? W/System.err: java.io.IOException: Prepare failed.: status=0x1
10-26 14:06:05.552 8806-8806/? W/System.err:     at android.media.MediaPlayer._prepare(Native Method)
10-26 14:06:05.552 8806-8806/? W/System.err:     at android.media.MediaPlayer.prepare(MediaPlayer.java:1163)
10-26 14:06:05.552 8806-8806/? W/System.err:     at android.view.View.performClick(View.java:5198)
10-26 14:06:05.552 8806-8806/? W/System.err:     at android.view.View$PerformClick.run(View.java:21147)
10-26 14:06:05.552 8806-8806/? W/System.err:     at android.os.Handler.handleCallback(Handler.java:739)
10-26 14:06:05.552 8806-8806/? W/System.err:     at android.os.Handler.dispatchMessage(Handler.java:95)
10-26 14:06:05.552 8806-8806/? W/System.err:     at android.os.Looper.loop(Looper.java:148)
10-26 14:06:05.552 8806-8806/? W/System.err:     at android.app.ActivityThread.main(ActivityThread.java:5417)
10-26 14:06:05.552 8806-8806/? W/System.err:     at java.lang.reflect.Method.invoke(Native Method)
10-26 14:06:05.552 8806-8806/? W/System.err:     at com.android.internal.os.ZygoteInit$MethodAndArgsCaller.run(ZygoteInit.java:726)
10-26 14:06:05.552 8806-8806/? W/System.err:     at com.android.internal.os.ZygoteInit.main(ZygoteInit.java:616)

So what am I doing wrong here?

asked Oct 18 '22 by ken

1 Answer

Your setAudioEncodingBitRate is very low.

It takes bits per second, so it should be 16000 (16 kbps), not just 16.

Try 16000 or 32000 as the setAudioEncodingBitRate value.

Comment here whether it works or not; I'll try to help you out more.
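For reference, here is a minimal sketch of the recorder setup with only the bitrate changed from the question's code (the constants are from the standard android.media.MediaRecorder API; the exact bitrate value is a suggestion, not a requirement):

```
// Sketch: same setup as in the question, but with a sane encoding bitrate.
// setAudioEncodingBitRate() takes bits per second, so 16 means 16 bps --
// far too low for AAC. 32000 (32 kbps) is a reasonable minimum for voice.
MediaRecorder recorder = new MediaRecorder();
recorder.setAudioSource(MediaRecorder.AudioSource.MIC);
recorder.setOutputFormat(MediaRecorder.OutputFormat.MPEG_4);
recorder.setAudioEncoder(MediaRecorder.AudioEncoder.AAC);
recorder.setAudioEncodingBitRate(32000);   // was 16 in the question
recorder.setAudioSamplingRate(44100);
```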

answered Oct 29 '22 by Kathan Shah