First of all, I'm a noob in both iOS and audio programming, so bear with me if I don't use the correct technical terms; I'll do my best!
In an iOS app we are developing, we want to be able to play sounds through 4 different outputs to create a mini surround system. That is, we want the Left and Right channels to play through the headphones, while the Center and Center Surround channels play through audio hardware connected to the Lightning port. Since the audio files will be streamed/downloaded from a remote server, using raw (PCM) audio files is not an option.
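For context, playing through both outputs at once relies on the multiroute audio session category. This is a minimal sketch of the session setup we use (error handling elided; it assumes both the headphones and the Lightning accessory are attached):
import AVFoundation

// The multiroute category exposes all attached outputs (headphones +
// Lightning accessory) as one combined multichannel route.
let session = AVAudioSession.sharedInstance()
try session.setCategory(AVAudioSessionCategoryMultiRoute)
try session.setActive(true)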
Apple has, since iOS 6, made it possible to play an audio file using a multiroute configuration... and that is great and exactly what we need... but whenever we try to play a 4-channel audio file, AAC-encoded and encapsulated in an M4A (or CAF) container, we get the following error:
ERROR: [0x19deee000] AVAudioFile.mm:86: AVAudioFileImpl: error 1718449215
(which is the status code for kAudioFileUnsupportedDataFormatError)
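As an aside, OSStatus values like this are four-character codes; 1718449215 decodes to 'fmt?'. A small helper we use for debugging (the function name is ours; modern Swift):
// Decode an OSStatus into its four-character code: 1718449215 -> "fmt?"
func fourCharCode(from status: OSStatus) -> String {
    let n = UInt32(bitPattern: status)
    let chars = (0..<4).map { i -> Character in
        Character(UnicodeScalar(UInt8((n >> UInt32(24 - 8 * i)) & 0xFF)))
    }
    return String(chars)
}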
We get the same error when the same audio is encoded as lossless (ALAC) instead, but we don't get it when playing the same audio before encoding (PCM format).
Nor do we get the error when we use a stereo audio file, or a 5.1 audio file, encoded the same way as the 4-channel one, in both AAC and ALAC.
The files were encoded using afconvert, the audio conversion tool Apple ships with Mac OS X, using this command:
afconvert -v -f 'm4af' -d "aac@44100" 4ch_master.caf 4ch_44100_AAC.m4a
and
afconvert -v -f 'caff' -d "alac@44100" 4ch_master.caf 4ch_44100_ALAC.caf
in the case of lossless encoding.
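(Worth noting: as the afinfo dump below shows, the PCM master carries no channel layout tag. If we recall afconvert's help output correctly, there is a -l/--channellayout option for stamping an explicit layout on the output; treat the flag as unverified, but a variant like the following could rule layout tagging in or out as a factor. We mention it only as an experiment, since the encoder evidently assigns a Quadraphonic layout on its own, as shown below.)
afconvert -v -f 'm4af' -d "aac@44100" -l Quadraphonic 4ch_master.caf 4ch_44100_AAC.m4a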
The audio format, as given by afinfo, for the master (PCM) audio file:
File: 4ch_master.caf
File type ID: caff
Num Tracks: 1
----
Data format: 4 ch, 44100 Hz, 'lpcm' (0x0000000C) 16-bit little-endian signed integer
no channel layout.
estimated duration: 582.741338 sec
audio bytes: 205591144
audio packets: 25698893
bit rate: 2822400 bits per second
packet size upper bound: 8
maximum packet size: 8
audio data file offset: 4096
optimized
audio 25698893 valid frames + 0 priming + 0 remainder = 25698893
source bit depth: I16
The AAC-encoded format info:
File: 4ch_44100_AAC.m4a
File type ID: m4af
Num Tracks: 1
----
Data format: 4 ch, 44100 Hz, 'aac ' (0x00000000) 0 bits/channel, 0 bytes/packet, 1024 frames/packet, 0 bytes/frame
Channel layout: Quadraphonic
estimated duration: 582.741338 sec
audio bytes: 18338514
audio packets: 25099
bit rate: 251730 bits per second
packet size upper bound: 1039
maximum packet size: 1039
audio data file offset: 106496
optimized
audio 25698893 valid frames + 2112 priming + 371 remainder = 25701376
source bit depth: I16
format list:
[ 0] format: 4 ch, 44100 Hz, 'aac ' (0x00000000) 0 bits/channel, 0 bytes/packet, 1024 frames/packet, 0 bytes/frame
Channel layout: Quadraphonic
----
And for the lossless-encoded (ALAC) audio file:
File: 4ch_44100_ALAC.caf
File type ID: caff
Num Tracks: 1
----
Data format: 4 ch, 44100 Hz, 'alac' (0x00000001) from 16-bit source, 4096 frames/packet
Channel layout: 4.0 (C L R Cs)
estimated duration: 582.741338 sec
audio bytes: 83333400
audio packets: 6275
bit rate: 1143862 bits per second
packet size upper bound: 16777
maximum packet size: 16777
audio data file offset: 20480
optimized
audio 25698893 valid frames + 0 priming + 3507 remainder = 25702400
source bit depth: I16
----
On the code side, we initially followed the implementation presented in session 505 of WWDC12, using the AVAudioPlayer API. At that level, multirouting didn't seem to work reliably, and we didn't suspect that this might be related to the audio format, so we moved on to experimenting with the AVAudioEngine API presented in session 502 of WWDC14 and the sample code associated with it. We got multirouting to work for the master 4-channel audio file (after some adaptations), but then we hit the error mentioned above when calling scheduleFile, as shown in the code snippet below. (Note: we are using Swift, and all the necessary audio graph setup is done but not shown here.)
import AVFoundation

var playerNode: AVAudioPlayerNode!
...
...
// AVAudioFile(forReading:) throws, so it needs try (ideally inside do/catch);
// URLOfTheAudioFile is a placeholder for the local URL of the downloaded file.
let audioFileToPlay = try AVAudioFile(forReading: URLOfTheAudioFile)
playerNode.scheduleFile(audioFileToPlay, atTime: nil, completionHandler: nil)
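For completeness, the elided graph setup is roughly the following (a minimal sketch under our assumptions: the multiroute session shown earlier is active and at least 4 output channels are available; names are ours):
let engine = AVAudioEngine()
let playerNode = AVAudioPlayerNode()
engine.attachNode(playerNode)

// Connect with an explicit 4-channel format so the engine renders all
// four channels of the file into the (multiroute) output node.
let format = AVAudioFormat(standardFormatWithSampleRate: 44100, channels: 4)
engine.connect(playerNode, to: engine.outputNode, format: format)
try engine.start()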
Does anyone have a hint on what could be wrong with the audio data format?
After we contacted Apple Support, their answer was that this is not possible with the currently shipping system configurations:
"Thank you for contacting Apple Developer Technical Support (DTS). Our engineers have reviewed your request and have concluded that there is no supported way to achieve the desired functionality given the currently shipping system configurations."