Why isn't my multichannel mapping working correctly?

I recently posted this question about using multiroute with iOS and thought I had solved it, but I've since discovered it doesn't quite work: AVAudioEngine Multichannel mapping

The issue I'm having is that multiroute only works for the first two output channels. I'm trying to make it work with a 4-channel audio interface.

I have managed to route audio to each output of the USB interface using AVAudioPlayer:

var avplayer = AVAudioPlayer()

@IBAction func avAudioPlayerPlay(_ sender: Any)
{
    let audioSession = AVAudioSession.sharedInstance()
    let route = audioSession.currentRoute

    // set the session category
    do
    {
        //try audioSession.setCategory(.multiRoute)
        try audioSession.setCategory(.multiRoute, options: .mixWithOthers)
    }
    catch
    {
        print("unable to set category", error)
        return
    }

    // activate the audio session - turns on multiroute I believe
    do
    {
        try audioSession.setActive(true)
        //try audioSession.setActive(true, options: .notifyOthersOnDeactivation)
    }
    catch
    {
        print("unable to set active", error)
        return
    }

    //audio interface + headphone jack
    let outputs:[AVAudioSessionChannelDescription] = [
        route.outputs[0].channels![2], // 3rd channel on Audio Interface
        route.outputs[1].channels![1]  // Right Channel of Headphones
    ]

    guard let filePath: String = Bundle.main.path(forResource: "audio", ofType: "m4a") else { return }
    let fileURL: URL = URL(fileURLWithPath: filePath)

    do
    {
        avplayer = try AVAudioPlayer(contentsOf: fileURL)
    }
    catch
    {
        print("play error", error)
        return
    }

    avplayer.channelAssignments = outputs

    let result = avplayer.play()
    print(result)
}

But I can't get it to work using AVAudioEngine:

private func getOutputChannelMapIndices(_ names:[String?]) -> [Int]
{
    let session = AVAudioSession.sharedInstance()
    let route = session.currentRoute
    let outputPorts = route.outputs

    var channelMapIndices:[Int] = []

    for name in names
    {
        var chIndex = 0
        for outputPort in outputPorts
        {
            guard let channels = outputPort.channels else
            {
                continue
            }
            for channel in channels
            {
                print(channel.channelName)
                if channel.channelName == name
                {
                    if names.count > channelMapIndices.count
                    {
                        channelMapIndices.append(chIndex)
                    }
                }
                chIndex += 1
            }
        }
    }
    return channelMapIndices
}

@IBAction func nodesPlay(_ sender: Any)
{
    let channelNames = [
        "UMC204HD 192k 3",
        "Headphones Left",
        "Headphones Right",
        nil
    ]

    let audioSession = AVAudioSession.sharedInstance()

    // set the session category
    do
    {
        //try audioSession.setCategory(.multiRoute)
        try audioSession.setCategory(.multiRoute, options: .mixWithOthers)
    }
    catch
    {
        print("unable to set category", error)
        return
    }

    // activate the audio session - turns on multiroute I believe
    do
    {
        try audioSession.setActive(true)
        //try audioSession.setActive(true, options: .notifyOthersOnDeactivation)
    }
    catch
    {
        print("unable to set active", error)
        return
    }

    let channelMapIndices = getOutputChannelMapIndices(channelNames)

    print("channelMapIndices: ", channelMapIndices)

    engine = AVAudioEngine()
    output = engine.outputNode
    mixer = engine.mainMixerNode

    player = AVAudioPlayerNode()

    engine.attach(player)

    guard let filePath: String = Bundle.main.path(forResource: "audio", ofType: "m4a") else { return }
    let fileURL: URL = URL(fileURLWithPath: filePath)
    let file = try! AVAudioFile(forReading: fileURL)

    let outputNumChannels = output.outputFormat(forBus: 0).channelCount
    print("outputNumChannels:" , outputNumChannels)

    var outputChannelMap:[Int] = Array(repeating: -1, count: Int(outputNumChannels))

    let numberOfSourceChannels = file.processingFormat.channelCount
    print("numberOfSourceChannels: ", numberOfSourceChannels)

    var sourceChIndex = 0
    for chIndex in channelMapIndices
    {
        if chIndex < outputNumChannels && sourceChIndex < numberOfSourceChannels
        {
            outputChannelMap[chIndex] = sourceChIndex
            sourceChIndex += 1
        }
    }

    print("outputChannelMap: ", outputChannelMap)

    if let au = output.audioUnit
    {
        let propSize = UInt32(MemoryLayout.size(ofValue: outputChannelMap))
        print("propSize:", propSize)
        let result = AudioUnitSetProperty(au, kAudioOutputUnitProperty_ChannelMap, kAudioUnitScope_Global, 0, &outputChannelMap, propSize)
        print("result: ", result)
    }

    let channelLayout = AVAudioChannelLayout(layoutTag: kAudioChannelLayoutTag_DiscreteInOrder | UInt32(numberOfSourceChannels))
    let format = AVAudioFormat(streamDescription: file.processingFormat.streamDescription, channelLayout: channelLayout)

    engine.connect(player, to: mixer, format:format)
    engine.connect(mixer, to: output, format:format)

    player.scheduleFile(file, at: nil, completionHandler: nil)

    do
    {
        try engine.start()
    }
    catch
    {
        print("can't start", error)
        return
    }

    player.play()
}

If anyone could explain why I can't seem to play any audio to outputs 3 or 4, I would really appreciate it.

Note: a lot of this code was translated from here: https://forums.developer.apple.com/thread/15416

Castles asked Jun 06 '20 11:06



1 Answer

I believe the problem is the line

let propSize = UInt32(MemoryLayout.size(ofValue: outputChannelMap))

This gives you the size of the array value itself, which is essentially the size of a pointer, not the combined size of the elements it contains. See the discussion in the Apple docs.
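
For example (an illustrative snippet, assuming a 64-bit platform where Swift's Array struct is a single pointer-sized value), the two expressions give very different byte counts for a four-entry map:

let map: [Int32] = [2, 3, -1, -1]

// Size of the Array struct itself, not its contents: 8 bytes on 64-bit
print(MemoryLayout.size(ofValue: map))

// Size of the equivalent C array: 4 elements * 4 bytes per Int32 = 16 bytes
print(MemoryLayout<Int32>.stride * map.count)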

The property size should be the number of elements in the array multiplied by the size of Int32, since AudioUnitSetProperty is a C API and expects the size of the corresponding C array:

let propSize = UInt32(MemoryLayout<Int32>.stride * outputChannelMap.count)

You should also declare outputChannelMap as an array of Int32 since that is the type expected by kAudioOutputUnitProperty_ChannelMap:

var outputChannelMap:[Int32] = Array(repeating: -1, count: Int(outputNumChannels))
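
Putting both changes together, the channel-map section of nodesPlay would look roughly like this (a sketch reusing the variables from the question's code; only the element type and the size calculation change):

var outputChannelMap: [Int32] = Array(repeating: -1, count: Int(outputNumChannels))

// Fill the map exactly as before: each mapped output channel gets the index
// of the source channel that should feed it; -1 leaves that output silent.
var sourceChIndex: Int32 = 0
for chIndex in channelMapIndices
{
    if chIndex < outputNumChannels && sourceChIndex < numberOfSourceChannels
    {
        outputChannelMap[chIndex] = sourceChIndex
        sourceChIndex += 1
    }
}

if let au = output.audioUnit
{
    // Element size * element count, not the size of the Array value itself
    let propSize = UInt32(MemoryLayout<Int32>.stride * outputChannelMap.count)
    let result = AudioUnitSetProperty(au, kAudioOutputUnitProperty_ChannelMap, kAudioUnitScope_Global, 0, &outputChannelMap, propSize)
    print("result: ", result)
}

With the property size covering all of the entries, the audio unit should receive the complete channel map rather than a truncated one.
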
sbooth answered Oct 22 '22 19:10
