Android Studio Audio Player [Part 3] Loading the audio into data array

Leonard Loo
3 min read · Dec 16, 2020
You can refer to all source codes here: https://github.com/leonardltk/AndroidAudioWaveViewer
The end product should look like this.

This is part 3 of the series.

  1. Setting up the interface.
  2. Playing, pausing, resume, stopping an audio.
  3. Loading the audio into data array.
  4. Plotting the data into a waveplot canvas for visualisation.
  5. Enable scrolling of the canvas.
  6. Having a slider to show the current wave time.

Loading the audio into data array

This will be a fairly dry and technical code walkthrough, in which we read the audio into a data array that we can manipulate and use.

public long getLE2(byte[] buffer) {
    // Combine two bytes in little-endian order:
    // buffer[0] is the low byte, buffer[1] the high byte.
    long val = buffer[1] & 0xFF;
    val = (val << 8) + (buffer[0] & 0xFF);
    return val;
}

getLE2 is just a hardcoded way to convert a 2-byte little-endian buffer into an integer value, which we will use inside readAudio_ArrayList.
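As a quick illustrative check (the byte values here are made up): the low byte comes first and the high byte second, which is the little-endian layout WAV data uses.

// Illustrative only: the bytes {0x34, 0x12} in little-endian order
// represent the value 0x1234 = 4660.
byte[] sample = new byte[]{(byte) 0x34, (byte) 0x12};
long value = getLE2(sample); // 4660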

readAudio_ArrayList takes the resource identifier of the audio file, which here we preset to R.raw.test, and loads the samples into an ArrayList, which we define here as WaveOut.

A side note: a WAV file stores several pieces of information about its properties (the header) before the actual wave data, so we need to work through the stream in order, one piece at a time.

InputStream inputStream = this.getResources().openRawResource(rawID);
int read;

First we use this.getResources().openRawResource to open the resource as a stream, so that we can start reading from it. Then we define read to hold the result of each read call: the number of bytes read, or -1 once there is nothing left to read.

/** Header Details */
byte[] bytes_tmp = new byte[44];
read = inputStream.read(bytes_tmp, 0, bytes_tmp.length);

The header details are stored in the first 44 bytes of the file; you can look up the WAV header format for full details about each individual field. For now we read them into a junk variable bytes_tmp and ignore its contents. We still need to read these bytes because the stream has to be consumed in order.
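As a side sketch (not part of the original code), if you did want something useful out of the header instead of throwing it away, the canonical 44-byte PCM header stores the channel count at bytes 22–23 and the sample rate at bytes 24–27, both little-endian:

// Sketch only: pull two fields out of the 44-byte PCM header in bytes_tmp.
// Assumes the canonical RIFF/WAVE layout with the fmt chunk at its usual offset.
int numChannels = (bytes_tmp[22] & 0xFF) | ((bytes_tmp[23] & 0xFF) << 8);
int sampleRate  = (bytes_tmp[24] & 0xFF)
                | ((bytes_tmp[25] & 0xFF) << 8)
                | ((bytes_tmp[26] & 0xFF) << 16)
                | ((bytes_tmp[27] & 0xFF) << 24);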

/** Reading Wav file */
/* Reading WaveOut */
byte[] bytes = new byte[2];
long longtmp;
// Read 2 bytes per sample until the stream is exhausted (read == -1).
while ( (read = inputStream.read(bytes, 0, bytes.length)) != -1 ){
    longtmp = getLE2(bytes);
    WaveOut.add( (float) longtmp );
}

/** Close */
inputStream.close();

Each data point is stored as 2 bytes, so we read them into a byte[2] buffer using inputStream.read. We then convert the bytes into a long integer using getLE2 and append the result to WaveOut. We repeat this process until read is -1, which means there is no more data to be read. Finally we close the stream by calling inputStream.close().
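Putting the pieces together, the whole method looks roughly like the sketch below. It assumes the method lives inside MainActivity (so this.getResources() is available) and that java.io.InputStream, java.io.IOException and java.util.ArrayList are imported; the exact signature in the repository may differ slightly.

// Sketch of the full method, assembling the snippets above.
public ArrayList<Float> readAudio_ArrayList(int rawID) throws IOException {
    ArrayList<Float> WaveOut = new ArrayList<>();
    InputStream inputStream = this.getResources().openRawResource(rawID);

    /* Header Details: skip the 44-byte header. */
    byte[] bytes_tmp = new byte[44];
    inputStream.read(bytes_tmp, 0, bytes_tmp.length);

    /* Reading WaveOut: 2 bytes (little-endian) per sample. */
    byte[] bytes = new byte[2];
    int read;
    while ( (read = inputStream.read(bytes, 0, bytes.length)) != -1 ){
        WaveOut.add( (float) getLE2(bytes) );
    }

    /* Close */
    inputStream.close();
    return WaveOut;
}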

amplitudes = readAudio_ArrayList(R.raw.test);

Now let's call our readAudio_ArrayList from the LoadAudio function. This line stores the audio data in an ArrayList called amplitudes, from which we can get some information about its size and duration.

NumSamples = amplitudes.size();
DurMilli = NumSamples*1000/samplingRate;

NumSamples refers to the number of data points in the audio, and DurMilli refers to the duration in milliseconds.
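For reference, these lines could sit together in a small LoadAudio helper, roughly as sketched below (assuming amplitudes, NumSamples, DurMilli and samplingRate are fields of the activity, and that readAudio_ArrayList throws IOException as in the sketch above; the exact shape in the repository may differ). As a worked example, at a samplingRate of 16000 Hz, 80000 samples gives DurMilli = 80000 * 1000 / 16000 = 5000 ms.

// Sketch: a LoadAudio helper that reads the raw resource and records
// its length and duration. The field names here are assumptions.
private void LoadAudio() {
    try {
        amplitudes = readAudio_ArrayList(R.raw.test);  // ArrayList<Float> of samples
        NumSamples = amplitudes.size();                // number of data points
        DurMilli   = NumSamples * 1000 / samplingRate; // duration in milliseconds
    } catch (IOException e) {
        e.printStackTrace();
    }
}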

Now you are able to do whatever you want with the audio data loaded in your MainActivity.java. Next we will learn how to plot this data onto a waveplot canvas for visualisation.
