Raw Audio Playback with AudioTrack

AudioTrack is an Android class that plays raw audio samples. This makes it possible to play back audio captured with AudioRecord, which otherwise wouldn't be playable using a MediaPlayer object.

To construct an AudioTrack object, we need to pass in a series of configuration variables describing the audio to be played.

  • The first argument is the stream type. The possible values are defined as constants in the AudioManager class. We'll be using AudioManager.STREAM_MUSIC, which is the audio stream used for normal music playback.
  • The second argument is the sample rate in hertz of the audio data that will be played back. In our example, we'll be capturing audio at 11,025 Hz, and therefore, to play it back, we need to specify the same value.
  • The third argument is the channel configuration. The possible values, the same as those used when constructing an AudioRecord object, are defined as constants in the AudioFormat class. Their names are self-explanatory.
    • AudioFormat.CHANNEL_CONFIGURATION_MONO
    • AudioFormat.CHANNEL_CONFIGURATION_STEREO
    • AudioFormat.CHANNEL_CONFIGURATION_INVALID
    • AudioFormat.CHANNEL_CONFIGURATION_DEFAULT
  • The fourth argument is the format of the audio. The possible values are the same as those used when constructing an AudioRecord object, and they are defined in AudioFormat as constants. The value used should match the format of the audio data that will be passed in.
    • AudioFormat.ENCODING_DEFAULT
    • AudioFormat.ENCODING_INVALID
    • AudioFormat.ENCODING_PCM_16BIT
    • AudioFormat.ENCODING_PCM_8BIT
  • The fifth argument is the size, in bytes, of the buffer that the object will use to hold audio. To determine the smallest usable buffer size, we can call getMinBufferSize, passing in the sample rate, the channel configuration, and the audio format.
    int frequency = 11025;
    int channelConfiguration = AudioFormat.CHANNEL_CONFIGURATION_MONO;
    int audioEncoding = AudioFormat.ENCODING_PCM_16BIT;

    int bufferSize = AudioTrack.getMinBufferSize(frequency, channelConfiguration,
        audioEncoding);
  • The last argument is the mode. The possible values are defined as constants in the AudioTrack class; a short sketch contrasting the two modes follows this list.
    • AudioTrack.MODE_STATIC: The audio data will all be transferred to the AudioTrack object before playback occurs.
    • AudioTrack.MODE_STREAM: The audio data will continue to be transferred to the AudioTrack object while playback is in progress.
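
For contrast with the streaming approach used in the rest of this chapter, here is a minimal sketch of the MODE_STATIC flow, in which all of the audio is written to the AudioTrack before play is called. The loadShortClip helper and the clip itself are hypothetical stand-ins for a short PCM clip already held in memory.

// MODE_STATIC: hand the entire clip to the AudioTrack before starting playback.
// loadShortClip() is a hypothetical helper returning 11,025 Hz, 16-bit, mono samples.
short[] clip = loadShortClip();
int clipSizeInBytes = clip.length * 2; // 16-bit samples are two bytes each

AudioTrack staticTrack = new AudioTrack(AudioManager.STREAM_MUSIC, 11025,
        AudioFormat.CHANNEL_CONFIGURATION_MONO, AudioFormat.ENCODING_PCM_16BIT,
        clipSizeInBytes, AudioTrack.MODE_STATIC);

staticTrack.write(clip, 0, clip.length); // transfer everything up front
staticTrack.play();                      // playback runs from the internal buffer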

Here is our AudioTrack configuration:

AudioTrack audioTrack = new AudioTrack(AudioManager.STREAM_MUSIC, frequency,
                channelConfiguration, audioEncoding, bufferSize,
                AudioTrack.MODE_STREAM);
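
Construction can fail if the parameters aren't supported (for example, an unusual sample rate or a buffer that is too small), so it's worth checking the track's state before using it. A minimal check, with a hypothetical log tag:

if (audioTrack.getState() != AudioTrack.STATE_INITIALIZED) {
    // "AudioPlayer" is a hypothetical log tag
    Log.e("AudioPlayer", "AudioTrack initialization failed");
    return;
}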

Once the AudioTrack is constructed, we need to open an audio source, read the audio data into a buffer, and pass it to the AudioTrack object.

We'll construct a DataInputStream from a file containing raw PCM data in the right format (11,025 Hz, 16-bit, mono).

DataInputStream dis = new DataInputStream(
            new BufferedInputStream(new FileInputStream(recordingFile)));

We can then declare a buffer of shorts, call play on the AudioTrack, and start writing audio in from the DataInputStream.

// Buffer of 16-bit samples; bufferSize is in bytes, so two bytes per short
short[] audiodata = new short[bufferSize / 2];

audioTrack.play();

// isPlaying is a boolean flag assumed to be toggled elsewhere (e.g., by a Stop button)
while (isPlaying && dis.available() > 0) {
    int i = 0;
    while (dis.available() > 0 && i < audiodata.length) {
        // readShort is big-endian; this assumes the file was written with writeShort
        audiodata[i] = dis.readShort();
        i++;
    }
    // Write only the samples actually read (the last chunk may be partial)
    audioTrack.write(audiodata, 0, i);
}

dis.close();
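
Once playback is finished, we should also stop the AudioTrack and release its native resources:

audioTrack.stop();
audioTrack.release();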

That covers the basics of using AudioTrack to play back raw audio from a file as recorded with AudioRecord.
