Android TV devices can have multiple audio outputs connected at the same time: TV speakers, HDMI-connected home cinema, Bluetooth headphones, and so on. These audio output devices can support different audio capabilities, such as encodings (Dolby Digital+, DTS, and PCM), sample rates, and channel counts. For example, HDMI-connected TVs support a multitude of encodings, while connected Bluetooth headphones usually support just PCM.
The list of available audio devices and the routed audio device can also change by hot-plugging HDMI devices, connecting or disconnecting Bluetooth headphones, or the user changing audio settings. Since the audio output capabilities can change even when apps are playing media, apps need to gracefully adapt to these changes and continue playback on the new routed audio device and its capabilities. Outputting the wrong audio format can result in errors or no sound playing.
Apps have the capability to output the same content in multiple encodings to offer the user the best audio experience depending on audio device capabilities. For example, a Dolby Digital encoded audio stream is played if the TV supports it, while a more widely-supported PCM audio stream is chosen when there is no support for Dolby Digital. The list of built-in Android decoders used to transform an audio stream into PCM can be found in Supported media formats.
At playback time, the streaming app should create an AudioTrack with the best AudioFormat supported by the output audio device.
Create a track with the right format
Apps should create an AudioTrack, start playing it, and call getRoutedDevice() to determine the default audio device from which to play sound. This can be, for example, a short, silent PCM-encoded track that is safe to play and is used only to determine the routed device and its audio capabilities.
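This probe can be sketched as follows, assuming a helper that plays a short burst of PCM silence; the sample rate, channel mask, and buffer size are illustrative assumptions, not fixed requirements:

```kotlin
import android.media.AudioAttributes
import android.media.AudioDeviceInfo
import android.media.AudioFormat
import android.media.AudioTrack

// Play a short burst of PCM silence so the framework routes the track,
// then read back the device it was routed to (getRoutedDevice() requires
// API level 24 and higher).
fun probeRoutedDevice(): AudioDeviceInfo? {
    val format = AudioFormat.Builder()
        .setEncoding(AudioFormat.ENCODING_PCM_16BIT)
        .setSampleRate(48000)
        .setChannelMask(AudioFormat.CHANNEL_OUT_STEREO)
        .build()
    val bufferSize = AudioTrack.getMinBufferSize(
        48000, AudioFormat.CHANNEL_OUT_STEREO, AudioFormat.ENCODING_PCM_16BIT)
    val track = AudioTrack.Builder()
        .setAudioAttributes(
            AudioAttributes.Builder()
                .setUsage(AudioAttributes.USAGE_MEDIA)
                .setContentType(AudioAttributes.CONTENT_TYPE_MUSIC)
                .build())
        .setAudioFormat(format)
        .setBufferSizeInBytes(bufferSize)
        .build()
    track.play()
    // A freshly allocated ByteArray is all zeros, that is, silence.
    track.write(ByteArray(bufferSize), 0, bufferSize)
    val routedDevice = track.routedDevice // may be null until routing settles
    track.release()
    return routedDevice
}
```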
Get supported encodings
Use getAudioProfiles() (API level 31 and higher) or getEncodings() (API level 23 and higher) to determine the audio formats available on the default audio device.
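A minimal sketch of this query, assuming the AudioDeviceInfo returned by getRoutedDevice(); on API level 31 and higher the richer AudioProfile objects also carry sample rates and channel masks:

```kotlin
import android.media.AudioDeviceInfo
import android.media.AudioFormat
import android.os.Build

// Collect the encodings reported by the routed audio device, preferring
// the AudioProfile API where available.
fun supportedEncodings(device: AudioDeviceInfo): Set<Int> =
    if (Build.VERSION.SDK_INT >= Build.VERSION_CODES.S) {
        // Each AudioProfile pairs an encoding with its sample rates
        // and channel masks; here we keep only the encoding.
        device.audioProfiles.map { it.format }.toSet()
    } else {
        device.encodings.toSet()
    }
```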
Check supported audio profiles and formats
Use AudioProfile (API level 31 and higher) or isDirectPlaybackSupported() (API level 29 and higher) to check supported combinations of format, channel count, and sample rate.
Some Android devices are capable of supporting encodings beyond the ones supported by the output audio device. These additional formats should be detected through isDirectPlaybackSupported(). In these cases the audio data is re-encoded to a format that is supported by the output audio device. Use isDirectPlaybackSupported() to properly check support for the desired format even if it is not present in the list returned by getEncodings().
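The check can be sketched like this; the encoding, sample rate, and channel mask passed in are examples, and the usage attribute is an assumption:

```kotlin
import android.media.AudioAttributes
import android.media.AudioFormat
import android.media.AudioTrack
import android.os.Build

// Ask the platform whether a given format can be played back directly,
// even if the routed device's own encoding list does not mention it.
fun canPlayDirect(encoding: Int, sampleRate: Int, channelMask: Int): Boolean {
    if (Build.VERSION.SDK_INT < Build.VERSION_CODES.Q) return false
    val format = AudioFormat.Builder()
        .setEncoding(encoding)
        .setSampleRate(sampleRate)
        .setChannelMask(channelMask)
        .build()
    val attributes = AudioAttributes.Builder()
        .setUsage(AudioAttributes.USAGE_MEDIA)
        .build()
    return AudioTrack.isDirectPlaybackSupported(format, attributes)
}

// For example, for Dolby Digital Plus at 48 kHz, 5.1 channels:
// canPlayDirect(AudioFormat.ENCODING_E_AC3, 48000, AudioFormat.CHANNEL_OUT_5POINT1)
```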
Create audio track
Apps should use this information to create an AudioTrack for the highest-quality AudioFormat supported by the default audio device (and available for the selected content).
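One way to make this choice is a simple preference ranking over the encodings that both the content and the device support. This sketch uses plain integer constants that mirror the android.media.AudioFormat ENCODING_* values so the selection logic is self-contained; the preference order itself is an assumption that depends on your content:

```kotlin
// Integer values mirror android.media.AudioFormat ENCODING_* constants
// so this selection logic can stand alone; use the platform constants
// in real code.
const val ENCODING_PCM_16BIT = 2
const val ENCODING_AC3 = 5   // Dolby Digital
const val ENCODING_E_AC3 = 6 // Dolby Digital Plus
const val ENCODING_DTS = 7

// Highest quality first; PCM last as the widely-supported fallback.
val preferenceOrder = listOf(ENCODING_E_AC3, ENCODING_AC3, ENCODING_DTS, ENCODING_PCM_16BIT)

// Pick the most-preferred encoding available for the content and
// supported by the device; fall back to PCM.
fun pickEncoding(contentEncodings: Set<Int>, deviceEncodings: Set<Int>): Int =
    preferenceOrder.firstOrNull { it in contentEncodings && it in deviceEncodings }
        ?: ENCODING_PCM_16BIT
```

For example, content offered in Dolby Digital Plus and PCM played to a device that only reports Dolby Digital and PCM would fall back to PCM.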
Intercept audio device changes
To intercept and react to audio device changes, apps should:
- For API levels equal to or greater than 24, add an OnRoutingChangedListener to monitor audio device changes (HDMI, Bluetooth, and so on).
- For API level 23, register an AudioDeviceCallback to receive changes in the available audio device list.
- For API levels 21 and 22, monitor for HDMI plug events and use the extra data from the broadcasts.
- Also register a BroadcastReceiver to monitor BluetoothDevice state changes for devices lower than API 23, since AudioDeviceCallback is not yet supported.
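The HDMI plug path for API levels 21 and 22 can be sketched with a BroadcastReceiver; the broadcast extras carry the sink's supported encodings and maximum channel count, and handleHdmiChange() is a hypothetical hook into your player:

```kotlin
import android.content.BroadcastReceiver
import android.content.Context
import android.content.Intent
import android.content.IntentFilter
import android.media.AudioManager

// Hypothetical hook: for example, recreate the AudioTrack if the
// supported formats changed.
fun handleHdmiChange(plugged: Boolean, encodings: IntArray?, maxChannels: Int) {
    // ...
}

// Listen for HDMI hot-plug broadcasts and read the sink capabilities
// from the extras.
val hdmiReceiver = object : BroadcastReceiver() {
    override fun onReceive(context: Context, intent: Intent) {
        val plugged = intent.getIntExtra(AudioManager.EXTRA_AUDIO_PLUG_STATE, 0) == 1
        val encodings = intent.getIntArrayExtra(AudioManager.EXTRA_ENCODINGS)
        val maxChannels = intent.getIntExtra(AudioManager.EXTRA_MAX_CHANNEL_COUNT, 2)
        handleHdmiChange(plugged, encodings, maxChannels)
    }
}

fun registerHdmiReceiver(context: Context) {
    context.registerReceiver(hdmiReceiver, IntentFilter(AudioManager.ACTION_HDMI_AUDIO_PLUG))
}
```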
When an audio device change has been detected for the AudioTrack, the app should check the updated audio capabilities and, if needed, recreate the AudioTrack with a different AudioFormat. Do this if a higher-quality encoding is now supported or the previously-used encoding is no longer supported.
Sample code
Kotlin
// audioPlayer is a wrapper around an AudioTrack
// which calls a callback for an AudioTrack write error
audioPlayer.addAudioTrackWriteErrorListener {
    // error code can be checked here,
    // in case of write error try to recreate the audio track
    restartAudioTrack(findDefaultAudioDeviceInfo())
}

audioPlayer.audioTrack.addOnRoutingChangedListener({ audioRouting ->
    audioRouting?.routedDevice?.let { audioDeviceInfo ->
        // use the updated audio routed device to determine
        // what audio format should be used
        if (needsAudioFormatChange(audioDeviceInfo)) {
            restartAudioTrack(audioDeviceInfo)
        }
    }
}, handler)
Java
// audioPlayer is a wrapper around an AudioTrack
// which calls a callback for an AudioTrack write error
audioPlayer.addAudioTrackWriteErrorListener(new AudioTrackPlayer.AudioTrackWriteError() {
    @Override
    public void audioTrackWriteError(int errorCode) {
        // error code can be checked here,
        // in case of write error try to recreate the audio track
        restartAudioTrack(findDefaultAudioDeviceInfo());
    }
});

audioPlayer.getAudioTrack().addOnRoutingChangedListener(new AudioRouting.OnRoutingChangedListener() {
    @Override
    public void onRoutingChanged(AudioRouting audioRouting) {
        if (audioRouting != null && audioRouting.getRoutedDevice() != null) {
            AudioDeviceInfo audioDeviceInfo = audioRouting.getRoutedDevice();
            // use the updated audio routed device to determine
            // what audio format should be used
            if (needsAudioFormatChange(audioDeviceInfo)) {
                restartAudioTrack(audioDeviceInfo);
            }
        }
    }
}, handler);