I want to merge two or more .wav files into one, then convert the result to .mp3, and I would like to do this in Swift (or at least have the option to include it in a Swift project).
Merging two .wav files in Swift isn't a problem. Here is my example. Now I don't know how to add the LAME library to a Swift project and how to use it (how to translate the Objective-C LAME usage syntax into Swift).
Being stuck in Swift, I tried the LAME library from Objective-C instead. I found example code for converting .caf to .mp3, so I tried it. Here is what I've tried:
- (void)toMp3
{
    NSString *cafFilePath = [[NSBundle mainBundle] pathForResource:@"sound" ofType:@"caf"];

    NSString *mp3FileName = @"Mp3File";
    mp3FileName = [mp3FileName stringByAppendingString:@".mp3"];
    NSString *mp3FilePath = [[NSHomeDirectory() stringByAppendingFormat:@"/Documents/"] stringByAppendingPathComponent:mp3FileName];
    NSLog(@"%@", mp3FilePath);

    @try {
        int read, write;

        FILE *pcm = fopen([cafFilePath cStringUsingEncoding:1], "rb");  // source
        fseek(pcm, 4*1024, SEEK_CUR);                                   // skip file header
        FILE *mp3 = fopen([mp3FilePath cStringUsingEncoding:1], "wb");  // output

        const int PCM_SIZE = 8192;
        const int MP3_SIZE = 8192;
        short int pcm_buffer[PCM_SIZE*2];
        unsigned char mp3_buffer[MP3_SIZE];

        lame_t lame = lame_init();
        lame_set_in_samplerate(lame, 44100);
        lame_set_VBR(lame, vbr_default);
        lame_init_params(lame);

        do {
            read = fread(pcm_buffer, 2*sizeof(short int), PCM_SIZE, pcm);
            if (read == 0)
                write = lame_encode_flush(lame, mp3_buffer, MP3_SIZE);
            else
                write = lame_encode_buffer_interleaved(lame, pcm_buffer, read, mp3_buffer, MP3_SIZE);
            fwrite(mp3_buffer, write, 1, mp3);
        } while (read != 0);

        lame_close(lame);
        fclose(mp3);
        fclose(pcm);
    }
    @catch (NSException *exception) {
        NSLog(@"%@", [exception description]);
    }
    @finally {
        [self performSelectorOnMainThread:@selector(convertMp3Finish)
                               withObject:nil
                            waitUntilDone:YES];
    }
}

- (void)convertMp3Finish
{
}
But the result of this is just an .mp3 full of noise.
So I need to fix my three problems:
I know there are many questions about encoding and converting MP3 on iOS, but I can't find one with a Swift example, and I can't find an example with working Objective-C code (only the code above). Thanks for your help.
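For what it's worth, noise like this usually means the encoder is reading header bytes (or non-16-bit, non-interleaved data) as PCM samples: `fseek(pcm, 4*1024, SEEK_CUR)` skips a fixed 4 KB, but a canonical WAV header is only 44 bytes, and a CAF header has a different layout entirely. A minimal sketch in plain C (a hypothetical `wav_data_offset` helper, not part of LAME) that locates the actual `data` chunk of a RIFF/WAVE file instead of guessing:

```c
#include <stdint.h>
#include <string.h>

/* Hypothetical helper (not part of LAME): scan a RIFF/WAVE buffer and return
   the byte offset of the first PCM sample (the payload of the "data" chunk),
   or -1 if the buffer is not a parseable WAV. */
static long wav_data_offset(const uint8_t *buf, size_t len) {
    if (len < 12 || memcmp(buf, "RIFF", 4) != 0 || memcmp(buf + 8, "WAVE", 4) != 0)
        return -1;
    size_t pos = 12;  /* first sub-chunk ("fmt ", "LIST", "data", ...) */
    while (pos + 8 <= len) {
        /* chunk sizes are little-endian uint32 */
        uint32_t size = (uint32_t)buf[pos + 4]
                      | (uint32_t)buf[pos + 5] << 8
                      | (uint32_t)buf[pos + 6] << 16
                      | (uint32_t)buf[pos + 7] << 24;
        if (memcmp(buf + pos, "data", 4) == 0)
            return (long)(pos + 8);       /* payload starts after id + size */
        pos += 8 + size + (size & 1);     /* chunks are 2-byte aligned */
    }
    return -1;
}
```

Seeking to the returned offset before the `fread` loop would feed only real samples to `lame_encode_buffer_interleaved` (which also assumes 16-bit stereo interleaved input at the sample rate you told LAME about).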
In the General Preferences tab, click Import Settings, located towards the bottom. In the menu next to Import Using, choose WAV Encoder. Then change Setting to Custom, and a new window will open. In the WAV Encoder window, set the Sample Rate to 44.100 kHz and the Sample Size to 16-bit.
I would like to post my working solution, because this question got so many thumbs up and the answer from naresh didn't help me much.
And now the source code. First, the wrapper: a class for converting .wav files to .mp3. Many things could be changed (perhaps a parameter for the output file, and other options), but I think anyone can adapt it. I suppose this could be rewritten in Swift, but I wasn't sure how to do it, so it's an Objective-C class:
#import "AudioWrapper.h"
#import "lame/lame.h"

@implementation AudioWrapper

+ (void)convertFromWavToMp3:(NSString *)filePath {
    NSString *mp3FileName = @"Mp3File";
    mp3FileName = [mp3FileName stringByAppendingString:@".mp3"];
    NSString *mp3FilePath = [NSTemporaryDirectory() stringByAppendingPathComponent:mp3FileName];
    NSLog(@"%@", mp3FilePath);

    @try {
        int read, write;

        FILE *pcm = fopen([filePath cStringUsingEncoding:1], "rb");     // source
        fseek(pcm, 4*1024, SEEK_CUR);                                   // skip file header
        FILE *mp3 = fopen([mp3FilePath cStringUsingEncoding:1], "wb");  // output

        const int PCM_SIZE = 8192;
        const int MP3_SIZE = 8192;
        short int pcm_buffer[PCM_SIZE*2];
        unsigned char mp3_buffer[MP3_SIZE];

        lame_t lame = lame_init();
        lame_set_in_samplerate(lame, 44100);
        lame_set_VBR(lame, vbr_default);
        lame_init_params(lame);

        do {
            // Read up to PCM_SIZE interleaved stereo frames (2 shorts each).
            read = fread(pcm_buffer, 2*sizeof(short int), PCM_SIZE, pcm);
            if (read == 0)
                write = lame_encode_flush(lame, mp3_buffer, MP3_SIZE);
            else
                write = lame_encode_buffer_interleaved(lame, pcm_buffer, read, mp3_buffer, MP3_SIZE);
            fwrite(mp3_buffer, write, 1, mp3);
        } while (read != 0);

        lame_close(lame);
        fclose(mp3);
        fclose(pcm);
    }
    @catch (NSException *exception) {
        NSLog(@"%@", [exception description]);
    }
    @finally {
        [self performSelectorOnMainThread:@selector(convertMp3Finish)
                               withObject:nil
                            waitUntilDone:YES];
    }
}

+ (void)convertMp3Finish
{
}

@end
A Swift AudioHelper class for concatenating audio files and calling the method that converts the .wav file to .mp3:
import UIKit
import AVFoundation

protocol AudioHelperDelegate {
    func assetExportSessionDidFinishExport(session: AVAssetExportSession, outputUrl: NSURL)
}

class AudioHelper: NSObject {

    var delegate: AudioHelperDelegate?

    func concatenate(audioUrls: [NSURL]) {
        // Create an AVMutableComposition. This object will hold our multiple AVMutableCompositionTracks.
        var composition = AVMutableComposition()
        var compositionAudioTrack: AVMutableCompositionTrack = composition.addMutableTrackWithMediaType(AVMediaTypeAudio, preferredTrackID: CMPersistentTrackID())

        // Create a new file to receive the data.
        var documentDirectoryURL = NSFileManager.defaultManager().URLsForDirectory(.DocumentDirectory, inDomains: .UserDomainMask).first! as! NSURL
        var fileDestinationUrl = NSURL(fileURLWithPath: NSTemporaryDirectory().stringByAppendingPathComponent("resultmerge.wav"))
        println(fileDestinationUrl)
        StorageManager.sharedInstance.deleteFileAtPath(NSTemporaryDirectory().stringByAppendingPathComponent("resultmerge.wav"))

        var avAssets: [AVURLAsset] = []
        var assetTracks: [AVAssetTrack] = []
        var durations: [CMTime] = []
        var timeRanges: [CMTimeRange] = []

        var insertTime = kCMTimeZero

        for audioUrl in audioUrls {
            let avAsset = AVURLAsset(URL: audioUrl, options: nil)
            avAssets.append(avAsset)

            let assetTrack = avAsset.tracksWithMediaType(AVMediaTypeAudio)[0] as! AVAssetTrack
            assetTracks.append(assetTrack)

            let duration = assetTrack.timeRange.duration
            durations.append(duration)

            let timeRange = CMTimeRangeMake(kCMTimeZero, duration)
            timeRanges.append(timeRange)

            compositionAudioTrack.insertTimeRange(timeRange, ofTrack: assetTrack, atTime: insertTime, error: nil)
            insertTime = CMTimeAdd(insertTime, duration)
        }

        // AVAssetExportPresetPassthrough => concatenation
        var assetExport = AVAssetExportSession(asset: composition, presetName: AVAssetExportPresetPassthrough)
        assetExport.outputFileType = AVFileTypeWAVE
        assetExport.outputURL = fileDestinationUrl
        assetExport.exportAsynchronouslyWithCompletionHandler({
            self.delegate?.assetExportSessionDidFinishExport(assetExport, outputUrl: fileDestinationUrl!)
        })
    }

    func exportTempWavAsMp3() {
        let wavFilePath = NSTemporaryDirectory().stringByAppendingPathComponent("resultmerge.wav")
        AudioWrapper.convertFromWavToMp3(wavFilePath)
    }
}
The bridging header contains:
#import "lame/lame.h"
#import "AudioWrapper.h"
There are dedicated classes for reading/writing media from/to a file: AVAssetReader and AVAssetWriter. With the help of AVAssetExportSession you can export the result, though note that AVAssetExportSession does not encode to .mp3 (its output formats are containers such as .m4a/AAC), so for true MP3 output you still need an encoder like LAME.
Or else you can use https://github.com/michaeltyson/TPAACAudioConverter