
Working with sound and the SuperPowered library

by admin

I have a task before me: develop an application that records audio from the microphone, modifies it (speed-up or pitch shifting), saves the applied effect into the file itself, and uploads the resulting MP3 to the application server. A non-trivial task. And they also want min-sdk=9.
For the recording, to keep things simple, we first reach for the MediaRecorder class: it records from the microphone and does so straight into a compressed format, AAC for example. For reference (in case anyone doesn't know): there is no MP3 encoder in Android because of licensing fees; there is only an MP3 decoder. So there is no way to record straight to MP3 — you can only play MP3 back, which is what the decoder is for.
All fine, but in order to do anything with the sound — namely, apply an effect to it — it has to be recorded in its original, uncompressed form: not MP3 or AAC, but PCM/WAVE. Besides, during playback the effect needs to be applied in real time so the user can tune it by ear. MediaRecorder is no good here: it writes only compressed formats, and applying a speed change during playback is a hard problem given the minimal SDK. You could make extra work for yourself — record AAC with MediaRecorder and then decode the AAC back to PCM/WAVE — but Android provides no API for that, so you would have to find yet another solution. And again: a computationally greedy decoding step wastes battery and the user's precious time compared with simply recording uncompressed in the first place (the user may not know why, but he will have to wait).
From all this it follows that recording should be done with a different pair of classes: AudioRecord, which records, and AudioTrack, which plays — and applying a speed change during AudioTrack playback is not a problem.
In practice, however, there are plenty of problems with these classes too. First, AudioRecord delivers raw PCM data — exactly what I need — but it does not create the WAVE header itself. It writes raw PCM, so to use this data anywhere other than AudioTrack you have to add code that creates the header; and when playing the file back you must not forget to skip the first 44 bytes (the header size) so that AudioTrack doesn't try to play them as audio. Second, all recording and playback work has to be put into separate threads, which doesn't make development any easier.
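For reference, the canonical 44-byte PCM WAVE header mentioned above can be sketched as a standalone helper. This is an illustrative class of my own (the name `WaveHeader` and its API are not from any library); it lays out the same fields, in the same little-endian order, that the recorder class below writes one by one:

```java
import java.nio.ByteBuffer;
import java.nio.ByteOrder;

public final class WaveHeader {
    // Builds the 44-byte canonical WAVE header for uncompressed PCM data.
    public static byte[] build(int sampleRate, short channels, short bitsPerSample, int dataSize) {
        ByteBuffer b = ByteBuffer.allocate(44).order(ByteOrder.LITTLE_ENDIAN);
        b.put("RIFF".getBytes());                            // chunk id
        b.putInt(36 + dataSize);                             // chunk size = 36 + payload bytes
        b.put("WAVE".getBytes());                            // format
        b.put("fmt ".getBytes());                            // sub-chunk 1 id
        b.putInt(16);                                        // sub-chunk 1 size, 16 for PCM
        b.putShort((short) 1);                               // audio format, 1 = PCM
        b.putShort(channels);                                // 1 mono, 2 stereo
        b.putInt(sampleRate);                                // sample rate
        b.putInt(sampleRate * channels * bitsPerSample / 8); // byte rate
        b.putShort((short) (channels * bitsPerSample / 8));  // block align
        b.putShort(bitsPerSample);                           // bits per sample
        b.put("data".getBytes());                            // sub-chunk 2 id
        b.putInt(dataSize);                                  // sub-chunk 2 size (payload bytes)
        return b.array();
    }
}
```

Since the final payload size is unknown while recording, the two size fields are written as 0 first and patched after stop() — exactly what the class below does with seek(4) and seek(40).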
Here is an example of an AudioRecord-based recorder as a separate class — mostly not mine, adapted from Stack Overflow (it includes header creation and may be useful to someone):

import android.annotation.SuppressLint;
import android.media.AudioFormat;
import android.media.AudioRecord;
import android.media.MediaRecorder;

import com.stanko.tools.DeviceInfo;
import com.stanko.tools.Log;

import java.io.File;
import java.io.IOException;
import java.io.RandomAccessFile;

public class AudioRecorder {

    /**
     * INITIALIZING : recorder is initializing
     * READY : recorder has been initialized, recorder not yet started
     * RECORDING : recording
     * ERROR : reconstruction needed
     * STOPPED : reset needed
     */
    public enum State {INITIALIZING, READY, RECORDING, ERROR, STOPPED}

    public static final boolean RECORDING_UNCOMPRESSED = true;
    public static final boolean RECORDING_COMPRESSED = false;

    // The interval in which the recorded samples are output to the file
    // Used only in uncompressed mode
    private static final int TIMER_INTERVAL = 120;

    // Toggles uncompressed recording on/off; RECORDING_UNCOMPRESSED / RECORDING_COMPRESSED
    private boolean isUncompressed;
    // Recorder used for uncompressed recording
    private AudioRecord mAudioRecorder = null;
    // Recorder used for compressed recording
    private MediaRecorder mMediaRecorder = null;
    // Stores current amplitude (only in uncompressed mode)
    private int cAmplitude = 0;
    // Output file path
    private String mFilePath = null;
    // Recorder state; see State
    private State state;
    // File writer (only in uncompressed mode)
    private RandomAccessFile mFileWriter;
    // Number of channels, sample rate, sample size (in bits), buffer size,
    // audio source, audio format (see AudioFormat)
    private short nChannels;
    private int nRate;
    private short nSamples;
    private int nBufferSize;
    private int nSource;
    private int nFormat;
    // Number of frames written to file on each output (only in uncompressed mode)
    private int nFramePeriod;
    // Buffer for output (only in uncompressed mode)
    private byte[] mBuffer;
    // Number of bytes written to file after header (only in uncompressed mode)
    // after stop() is called, this size is written to the header/data chunk in the wave file
    private int nPayloadSize;

    /**
     * Returns the state of the recorder in a RehearsalAudioRecord.State typed object.
     * Useful, as no exceptions are thrown.
     *
     * @return recorder state
     */
    public State getState() {
        return state;
    }

    /**
     * Listener used for recording: periodically reads the accumulated buffer
     * and appends it to the output file.
     */
    private AudioRecord.OnRecordPositionUpdateListener updateListener = new AudioRecord.OnRecordPositionUpdateListener() {
        public void onPeriodicNotification(AudioRecord recorder) {
            mAudioRecorder.read(mBuffer, 0, mBuffer.length); // Fill buffer
            try {
                mFileWriter.write(mBuffer); // Write buffer to file
                nPayloadSize += mBuffer.length;
                if (nSamples == 16) { // 16 bit sample size
                    for (int i = 0; i < mBuffer.length / 2; i++) {
                        short curSample = getShort(mBuffer[i * 2], mBuffer[i * 2 + 1]);
                        if (curSample > cAmplitude) { // Check amplitude
                            cAmplitude = curSample;
                        }
                    }
                } else { // 8 bit sample size
                    for (int i = 0; i < mBuffer.length; i++) {
                        if (mBuffer[i] > cAmplitude) { // Check amplitude
                            cAmplitude = mBuffer[i];
                        }
                    }
                }
            } catch (IOException e) {
                Log.e(this, "Error occurred in updateListener, recording is aborted");
                stop();
            }
        }

        public void onMarkerReached(AudioRecord recorder) {
            // NOT USED
        }
    };

    /**
     * Default constructor.
     *
     * Instantiates a new recorder; in case of compressed recording the parameters can be left as 0.
     * In case of errors, no exception is thrown, but the state is set to ERROR.
     */
    @SuppressLint("InlinedApi")
    public AudioRecorder(boolean uncompressed, int audioSource, int sampleRate, int channelConfig, int audioFormat) {
        try {
            isUncompressed = uncompressed;
            if (isUncompressed) { // RECORDING_UNCOMPRESSED
                if (audioFormat == AudioFormat.ENCODING_PCM_16BIT) {
                    nSamples = 16;
                } else {
                    nSamples = 8;
                }
                if (channelConfig == AudioFormat.CHANNEL_IN_MONO) {
                    nChannels = 1;
                } else {
                    nChannels = 2;
                }
                nSource = audioSource;
                nRate = sampleRate;
                nFormat = audioFormat;
                nFramePeriod = sampleRate * TIMER_INTERVAL / 1000;
                nBufferSize = nFramePeriod * 2 * nSamples * nChannels / 8;
                // Check to make sure buffer size is not smaller than the smallest allowed one
                if (nBufferSize < AudioRecord.getMinBufferSize(sampleRate, channelConfig, audioFormat)) {
                    nBufferSize = AudioRecord.getMinBufferSize(sampleRate, channelConfig, audioFormat);
                    // Set frame period and timer interval accordingly
                    nFramePeriod = nBufferSize / (2 * nSamples * nChannels / 8);
                    Log.w(this, "Increasing buffer size to " + Integer.toString(nBufferSize));
                }
                mAudioRecorder = new AudioRecord(audioSource, sampleRate, channelConfig, audioFormat, nBufferSize);
                if (mAudioRecorder.getState() != AudioRecord.STATE_INITIALIZED)
                    throw new Exception("AudioRecord initialization failed");
                mAudioRecorder.setRecordPositionUpdateListener(updateListener);
                mAudioRecorder.setPositionNotificationPeriod(nFramePeriod);
            } else { // RECORDING_COMPRESSED
                mMediaRecorder = new MediaRecorder();
                mMediaRecorder.setAudioSource(MediaRecorder.AudioSource.MIC);
                mMediaRecorder.setOutputFormat(MediaRecorder.OutputFormat.MPEG_4);
                if (DeviceInfo.hasAPI10())
                    mMediaRecorder.setAudioEncoder(MediaRecorder.AudioEncoder.AAC);
                else
                    mMediaRecorder.setAudioEncoder(MediaRecorder.AudioEncoder.DEFAULT);
            }
            cAmplitude = 0;
            mFilePath = null;
            state = State.INITIALIZING;
        } catch (Exception e) {
            if (e.getMessage() != null) {
                Log.e(this, e.getMessage());
            } else {
                Log.e(this, "Unknown error occurred while initializing recording");
            }
            state = State.ERROR;
        }
    }

    /**
     * Sets output file path, call directly after construction/reset.
     *
     * @param file output file
     */
    public void setOutputFile(File file) {
        setOutputFile(file.getAbsolutePath());
    }

    public void setOutputFile(String argPath) {
        try {
            if (state == State.INITIALIZING) {
                mFilePath = argPath;
                if (!isUncompressed) {
                    mMediaRecorder.setOutputFile(mFilePath);
                }
            }
        } catch (Exception e) {
            if (e.getMessage() != null) {
                Log.e(this, e.getMessage());
            } else {
                Log.e(this, "Unknown error occurred while setting output path");
            }
            state = State.ERROR;
        }
    }

    /**
     * Returns the largest amplitude sampled since the last call to this method.
     *
     * @return the largest amplitude since the last call, or 0 when not in recording state
     */
    public int getMaxAmplitude() {
        if (state == State.RECORDING) {
            if (isUncompressed) {
                int result = cAmplitude;
                cAmplitude = 0;
                return result;
            } else {
                try {
                    return mMediaRecorder.getMaxAmplitude();
                } catch (IllegalStateException e) {
                    return 0;
                }
            }
        } else {
            return 0;
        }
    }

    /**
     * Prepares the recorder for recording. In case the recorder is not in the INITIALIZING state
     * or the file path was not set, the recorder is set to the ERROR state, which makes a
     * reconstruction necessary. In case uncompressed recording is toggled, the header of the
     * wave file is written. In case of an exception, the state is changed to ERROR.
     */
    public void prepare() {
        try {
            if (state == State.INITIALIZING) {
                if (isUncompressed) {
                    if ((mAudioRecorder.getState() == AudioRecord.STATE_INITIALIZED) && (mFilePath != null)) {
                        // write file header
                        Log.w(this, "prepare(): nRate: " + nRate + " nChannels: " + nChannels);
                        mFileWriter = new RandomAccessFile(mFilePath, "rw");
                        mFileWriter.setLength(0); // Set file length to 0, to prevent unexpected behavior in case the file already existed
                        mFileWriter.writeBytes("RIFF");                         // 4
                        mFileWriter.writeInt(0);                                // 4 Final file size not known yet, write 0
                        mFileWriter.writeBytes("WAVE");                         // 4
                        mFileWriter.writeBytes("fmt ");                         // 4
                        mFileWriter.writeInt(Integer.reverseBytes(16));         // 4 Sub-chunk size, 16 for PCM
                        mFileWriter.writeShort(Short.reverseBytes((short) 1));  // 2 AudioFormat, 1 for PCM
                        mFileWriter.writeShort(Short.reverseBytes(nChannels));  // 2 Number of channels, 1 for mono, 2 for stereo
                        mFileWriter.writeInt(Integer.reverseBytes(nRate));      // 4 Sample rate
                        mFileWriter.writeInt(Integer.reverseBytes(nRate * nSamples * nChannels / 8)); // 4 Byte rate, SampleRate*NumberOfChannels*BitsPerSample/8
                        mFileWriter.writeShort(Short.reverseBytes((short) (nChannels * nSamples / 8))); // 2 Block align, NumberOfChannels*BitsPerSample/8
                        mFileWriter.writeShort(Short.reverseBytes(nSamples));   // 2 Bits per sample
                        mFileWriter.writeBytes("data");                         // 4
                        mFileWriter.writeInt(0);                                // 4 Data chunk size not known yet, write 0
                        mBuffer = new byte[nFramePeriod * nSamples / 8 * nChannels];
                        state = State.READY;
                    } else {
                        Log.e(this, "prepare() method called on uninitialized recorder");
                        state = State.ERROR;
                    }
                } else {
                    mMediaRecorder.prepare();
                    state = State.READY;
                }
            } else {
                Log.e(this, "prepare() method called on illegal state");
                release();
                state = State.ERROR;
            }
        } catch (Exception e) {
            if (e.getMessage() != null) {
                Log.e(this, e.getMessage());
            } else {
                Log.e(this, "Unknown error occurred in prepare()");
            }
            state = State.ERROR;
        }
    }

    /**
     * Releases the resources associated with this class, and removes the unnecessary files, when necessary.
     */
    public void release() {
        if (state == State.RECORDING) {
            stop();
        } else {
            if ((state == State.READY) && (isUncompressed)) {
                try {
                    mFileWriter.close(); // Remove prepared file
                } catch (IOException e) {
                    Log.e(this, "I/O exception occurred while closing output file");
                }
                (new File(mFilePath)).delete();
            }
        }
        if (isUncompressed) {
            if (mAudioRecorder != null) {
                mAudioRecorder.release();
            }
        } else {
            if (mMediaRecorder != null) {
                mMediaRecorder.release();
            }
        }
    }

    /**
     * Resets the recorder to the INITIALIZING state, as if it was just created.
     * In case the class was in RECORDING state, the recording is stopped.
     * In case of exceptions the class is set to the ERROR state.
     */
    public void reset() {
        try {
            if (state != State.ERROR) {
                release();
                mFilePath = null; // Reset file path
                cAmplitude = 0;   // Reset amplitude
                if (isUncompressed) {
                    mAudioRecorder = new AudioRecord(nSource, nRate, nChannels + 1, nFormat, nBufferSize);
                } else {
                    mMediaRecorder = new MediaRecorder();
                    mMediaRecorder.setAudioSource(MediaRecorder.AudioSource.MIC);
                    mMediaRecorder.setOutputFormat(MediaRecorder.OutputFormat.THREE_GPP);
                    mMediaRecorder.setAudioEncoder(MediaRecorder.AudioEncoder.AMR_NB);
                }
                state = State.INITIALIZING;
            }
        } catch (Exception e) {
            Log.e(this, e.getMessage());
            state = State.ERROR;
        }
    }

    /**
     * Starts the recording, and sets the state to RECORDING.
     * Call after prepare().
     */
    public void start() {
        if (state == State.READY) {
            if (isUncompressed) {
                nPayloadSize = 0;
                mAudioRecorder.startRecording();
                mAudioRecorder.read(mBuffer, 0, mBuffer.length);
            } else {
                mMediaRecorder.start();
            }
            state = State.RECORDING;
        } else {
            Log.e(this, "start() called on illegal state");
            state = State.ERROR;
        }
    }

    /**
     * Stops the recording, and sets the state to STOPPED.
     * In case of further usage, a reset is needed.
     * Also finalizes the wave file in case of uncompressed recording.
     */
    public void stop() {
        if (state == State.RECORDING) {
            if (isUncompressed) {
                mAudioRecorder.stop();
                mAudioRecorder.setRecordPositionUpdateListener(null);
                try {
                    mFileWriter.seek(4); // Write size to RIFF header
                    mFileWriter.writeInt(Integer.reverseBytes(36 + nPayloadSize));
                    mFileWriter.seek(40); // Write size to Subchunk2Size field
                    mFileWriter.writeInt(Integer.reverseBytes(nPayloadSize));
                    mFileWriter.close();
                    Log.w(this, "Recording stopped successfully");
                } catch (IOException e) {
                    Log.e(this, "I/O exception occurred while closing output file");
                    state = State.ERROR;
                }
            } else {
                mMediaRecorder.stop();
            }
            state = State.STOPPED;
        } else {
            Log.e(this, "stop() called on illegal state");
            state = State.ERROR;
        }
    }

    /**
     * Converts a byte[2] to a short, in LITTLE_ENDIAN format
     */
    private short getShort(byte argB1, byte argB2) {
        return (short) ((argB1 & 0xff) | (argB2 << 8));
    }
}

Example of use for recording:

/**
 * Thread to manage live recording/playback of voice input from the device's microphone.
 */
private final static int[] sampleRates = {44100, 22050, 16000, 11025, 8000};
protected int usedSampleRate;

private class AudioThread extends Thread {
    private final File targetFile;
    private final static String TAG = "AudioThread";

    /**
     * Give the thread high priority so that it's not canceled unexpectedly, and start it
     */
    private AudioThread(final File file) {
        targetFile = file;
    }

    @Override
    public void run() {
        Log.i(TAG, "Running Audio Thread");
        Looper.prepare();
        // Try the sample rates one by one until the recorder initializes successfully
        int i = 0;
        do {
            usedSampleRate = sampleRates[i];
            if (audioRecorder != null)
                audioRecorder.release();
            audioRecorder = new AudioRecorder(true, AudioSource.MIC, usedSampleRate,
                    AudioFormat.CHANNEL_IN_MONO, AudioFormat.ENCODING_PCM_16BIT);
        } while ((++i < sampleRates.length) && !(audioRecorder.getState() == AudioRecorder.State.INITIALIZING));
        Log.i(this, "usedSampleRate: " + usedSampleRate + " setOutputFile: " + targetFile);
        try {
            audioRecorder.setOutputFile(targetFile);
            // start the recording
            audioRecorder.prepare();
            audioRecorder.start();
            // if an error occurred and thus recording is not started
            if (audioRecorder.getState() == AudioRecorder.State.ERROR) {
                Toast.makeText(getBaseContext(), "AudioRecorder error", Toast.LENGTH_SHORT).show();
            }
        } catch (NullPointerException ignored) {
            // audioRecorder became null since it was canceled
        }
        Looper.loop();
    }
}

For playback:

/**
 * Thread to manage playback of the recorded message.
 */
private int bufferSize;
protected int byteOffset;
protected int fileLength;

public void playerPlayUsingAudioTrack(File messageFileWav) {
    if (messageFileWav == null || !messageFileWav.exists() || !messageFileWav.canRead()) {
        Toast.makeText(getBaseContext(), "Audiofile error: exists(): " + messageFileWav.exists()
                + " canRead(): " + messageFileWav.canRead(), Toast.LENGTH_SHORT).show();
        return;
    }
    // is the previous thread alive?
    if (audioTrackThread != null) {
        audioTrackThread.isStopped = true;
        audioTrackThread = null;
    }
    bufferSize = AudioRecord.getMinBufferSize(sampleRate, AudioFormat.CHANNEL_IN_MONO,
            AudioFormat.ENCODING_PCM_16BIT) * 4;
    audioTrackThread = new StoppableThread() {
        @Override
        public void run() {
            audioTrack = new AudioTrack(AudioManager.STREAM_MUSIC, sampleRate,
                    AudioFormat.CHANNEL_OUT_MONO, AudioFormat.ENCODING_PCM_16BIT,
                    bufferSize, AudioTrack.MODE_STREAM);
            fileLength = (int) messageFileWav.length();
            sbPlayerProgress.setMax(fileLength / 2);
            int byteCount = 4 * 1024; // 4 KB
            final byte[] byteData = new byte[byteCount];
            // Reading the file...
            RandomAccessFile in = null;
            try {
                in = new RandomAccessFile(messageFileWav, "r");
                int ret;
                byteOffset = 44; // skip the 44-byte WAVE header so it is not played as audio
                audioTrack.play();
                isPaused = false;
                isPlayerPlaying = true;
                while (byteOffset < fileLength) {
                    if (this.isStopped)
                        break;
                    if (isPlayerPaused || this.isPaused)
                        continue;
                    in.seek(byteOffset);
                    ret = in.read(byteData, 0, byteCount);
                    if (ret != -1) { // Write the byte array to the track
                        audioTrack.write(byteData, 0, ret);
                        audioTrack.setPlaybackRate(pitchValue);
                        byteOffset += ret;
                    } else
                        break;
                }
            } catch (Exception e) { // IOException, FileNotFoundException, NPE for audioTrack
                e.printStackTrace();
            } finally {
                if (in != null)
                    try {
                        in.close();
                    } catch (IOException ignored) {
                    }
            }
        }
    };
    audioTrackThread.start();
}

In general it works: recording works, playback works, the pitch effect works — audioTrack.setPlaybackRate(pitchValue) — the variable is bound to a SeekBar, so the user can change its value to taste and hear the effect during playback. What I don't know is how to save the chosen effect level into the file itself… I suspect something specialized has to be written in C via the NDK.
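One possible way to "bake in" the playback-rate effect without the NDK would be to resample the recorded PCM data offline by the same factor before MP3 encoding. This is a rough sketch of my own (the class name `PcmResampler` is illustrative, not from any library) using plain linear interpolation on 16-bit samples; real time-stretching that changes speed without changing pitch would need something much more sophisticated:

```java
public final class PcmResampler {
    // factor > 1.0 speeds up / raises pitch, factor < 1.0 slows down / lowers it,
    // analogous to the audible effect of audioTrack.setPlaybackRate().
    public static short[] resample(short[] in, double factor) {
        int outLen = (int) (in.length / factor);
        short[] out = new short[outLen];
        for (int i = 0; i < outLen; i++) {
            double srcPos = i * factor;               // fractional position in the source
            int i0 = (int) srcPos;
            int i1 = Math.min(i0 + 1, in.length - 1); // clamp at the last sample
            double frac = srcPos - i0;
            // linear interpolation between the two neighboring samples
            out[i] = (short) ((1.0 - frac) * in[i0] + frac * in[i1]);
        }
        return out;
    }
}
```

The resampled array would then be written back after the 44-byte WAVE header (with the header's size fields updated accordingly).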
Beyond that there are other surprises. Anyone with experience knows that devices from Samsung, HTC and other brands are not 100% Android-compatible. Each brand makes its own "improvements" at the OS source level, because of which behavior documented by Google either doesn't work at all or doesn't work as expected — and this is especially true of anything media-related, so all sorts of workarounds have to be built for these devices.
For example, Samsung has problems with streaming audio playback, i.e. using the MediaPlayer class with an HTTP link to an MP3 file as the source (which is how the audio files uploaded to the application server were planned to be played). A Samsung plays it erratically — sometimes it plays, sometimes it doesn't — while any other device running the same application code under the same conditions always plays fine. The workaround is to download the file in chunks on a separate thread and feed it to the player as if it were a locally recorded file. Samsungs have also, unlike everyone else, learned to swallow perfect pauses in MP3s — stretches of total silence where an editor doesn't even draw a waveform. They simply skip them, so a 5-minute recording plays unnaturally and its original duration is broken: it comes out not at 5 minutes but at 4 or less, depending on the length of the pauses between phrases. No other devices do this. Workaround: add white noise to the pauses.
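The download-in-chunks workaround described above boils down to copying the remote stream to a local file on a background thread and then pointing MediaPlayer at the local copy. The chunked copy itself can be sketched like this (the class and method names are illustrative, not from any library; the real code would wrap this in a thread and open the InputStream from the HTTP connection):

```java
import java.io.IOException;
import java.io.InputStream;
import java.io.OutputStream;

public final class ChunkedDownloader {
    // Copies the stream to the output in fixed-size chunks and
    // returns the total number of bytes transferred.
    public static long copyInChunks(InputStream in, OutputStream out) throws IOException {
        byte[] chunk = new byte[8 * 1024]; // 8 KB per read
        long total = 0;
        int read;
        while ((read = in.read(chunk)) != -1) {
            out.write(chunk, 0, read);
            total += read;
        }
        return total;
    }
}
```

Once enough of the file has landed locally, MediaPlayer can be given the local path instead of the HTTP URL, which sidesteps the erratic streaming behavior.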
On some HTC models there are problems with audio recording: recording with MediaRecorder works fine, but with AudioRecord it doesn't. Specifically, the accumulated buffer never gets delivered — updateListener (see the AudioRecorder code) is simply never called. On other devices the listener works; on HTC it doesn't — well, they have to be different somehow, right? A workaround can be built here too, but then other problems appear, and various other brands have issues of their own: no support for a 48 kHz or 44.1 kHz sampling rate, or particular combinations of recording settings — some won't record mono, only stereo; others the opposite. In general, this area of Android remains a constant pain, and I have no desire to keep discovering newly incompatible devices and building new workarounds for them.
The funny thing is that Chinese smartphones — except, of course, brands like Meizu — turn out to be more Android-compatible than the big market brands, since they simply don't bother customizing the OS (or don't have the money to do it).
And so, while looking for more solutions on this subject — preferably genuinely alternative ones, not wrappers around the two classes already mentioned (I haven't brought up SoundPool, but that class simply doesn't fit my task because of its limitations) — I stumbled upon Superpowered. The SDK is free, you only need to register, and it is cross-platform, which matters because I need the app for both Android and iOS.
I watched the demo video on the site and, to put it mildly, got excited (in plain terms, I was blown away), registered, and downloaded the SDK and the sample.
First thing to note: Android Studio was out of luck, because "suddenly" the project was set up for the NDK, and AS had problems with it that I had absolutely no desire to spend time solving (in short, AS couldn't import the project properly). So the project was opened in Eclipse without problems, and without problems it ran — thankfully I had downloaded and installed the NDK earlier (pointing AS at its path didn't help much).
I ran the sample on a corpse by today's standards — an HTC HD2 with MIUI 2.3.5 installed on it. The device is slow even in fairly lightweight applications, even on plain screens that just show lists with images. AnTuTu gives it ridiculous scores and advises throwing it away, and after the latest upgrade it hangs the phone so badly that I have to pull the battery. But on this device you can clearly see who knows about the existence of ViewHolder and who doesn't.
And yet this sample runs on it without any problems and plays without even a hint of lag! So I saw for myself that yes, this library really is low latency. The app itself is great — it makes you feel like a DJ within a couple of minutes; I've been playing with it for a week.
However, when I dug deeper, my joy cooled: for Android the SDK is limited to this one sample, while for iOS there are far more examples, and to use the library fully I would have to write the JNI layer myself, which I can't do — so I'm writing this article in the hope that interested readers will develop the topic. I'm no C programmer, but I did manage to add a few more effects to the three that ship with the sample. They don't add much of interest, but they work; I also added a small fix — when the app went to the background and came back, the sample got confused because playback wouldn't stop:

#include "SuperpoweredExample.h"
#include <jni.h>
#include <stdlib.h>
#include <stdio.h>
#include <android/log.h>

static void playerEventCallbackA(void *clientData, SuperpoweredAdvancedAudioPlayerEvent event, void *value) {
    if (event == SuperpoweredAdvancedAudioPlayerEvent_LoadSuccess) {
        SuperpoweredAdvancedAudioPlayer *playerA = *((SuperpoweredAdvancedAudioPlayer **)clientData);
        playerA->setBpm(126.0f);
        playerA->setFirstBeatMs(353);
        playerA->setPosition(playerA->firstBeatMs, false, false);
    };
}

static void playerEventCallbackB(void *clientData, SuperpoweredAdvancedAudioPlayerEvent event, void *value) {
    if (event == SuperpoweredAdvancedAudioPlayerEvent_LoadSuccess) {
        SuperpoweredAdvancedAudioPlayer *playerB = *((SuperpoweredAdvancedAudioPlayer **)clientData);
        playerB->setBpm(123.0f);
        playerB->setFirstBeatMs(40);
        playerB->setPosition(playerB->firstBeatMs, false, false);
    };
}

static void openSLESCallback(SLAndroidSimpleBufferQueueItf caller, void *pContext) {
    ((SuperpoweredExample *)pContext)->process(caller);
}

static const SLboolean requireds[2] = { SL_BOOLEAN_TRUE, SL_BOOLEAN_TRUE };

SuperpoweredExample::SuperpoweredExample(const char *path, int *params) : currentBuffer(0), buffersize(params[5]), activeFx(0), crossValue(0.0f), volB(0.0f), volA(1.0f * headroom) {
    pthread_mutex_init(&mutex, NULL); // This will keep our player volumes and playback states in sync.
    for (int n = 0; n < NUM_BUFFERS; n++) outputBuffer[n] = (float *)memalign(16, (buffersize + 16) * sizeof(float) * 2);

    unsigned int samplerate = params[4];
    playerA = new SuperpoweredAdvancedAudioPlayer(&playerA, playerEventCallbackA, samplerate, 0);
    playerA->open(path, params[0], params[1]);
    playerB = new SuperpoweredAdvancedAudioPlayer(&playerB, playerEventCallbackB, samplerate, 0);
    playerB->open(path, params[2], params[3]);
    playerA->syncMode = playerB->syncMode = SuperpoweredAdvancedAudioPlayerSyncMode_TempoAndBeat;

    roll = new SuperpoweredRoll(samplerate);
    filter = new SuperpoweredFilter(SuperpoweredFilter_Resonant_Lowpass, samplerate);
    flanger = new SuperpoweredFlanger(samplerate);
    whoosh = new SuperpoweredWhoosh(samplerate);
    gate = new SuperpoweredGate(samplerate);
    echo = new SuperpoweredEcho(samplerate);
    reverb = new SuperpoweredReverb(samplerate);
    //stretch = new SuperpoweredTimeStretching(samplerate);
    mixer = new SuperpoweredStereoMixer();

    // Create the OpenSL ES engine.
    slCreateEngine(&openSLEngine, 0, NULL, 0, NULL, NULL);
    (*openSLEngine)->Realize(openSLEngine, SL_BOOLEAN_FALSE);
    SLEngineItf openSLEngineInterface = NULL;
    (*openSLEngine)->GetInterface(openSLEngine, SL_IID_ENGINE, &openSLEngineInterface);
    // Create the output mix.
    (*openSLEngineInterface)->CreateOutputMix(openSLEngineInterface, &outputMix, 0, NULL, NULL);
    (*outputMix)->Realize(outputMix, SL_BOOLEAN_FALSE);
    SLDataLocator_OutputMix outputMixLocator = { SL_DATALOCATOR_OUTPUTMIX, outputMix };

    // Create the buffer queue player.
    SLDataLocator_AndroidSimpleBufferQueue bufferPlayerLocator = { SL_DATALOCATOR_ANDROIDSIMPLEBUFFERQUEUE, NUM_BUFFERS };
    SLDataFormat_PCM bufferPlayerFormat = { SL_DATAFORMAT_PCM, 2, samplerate * 1000, SL_PCMSAMPLEFORMAT_FIXED_16, SL_PCMSAMPLEFORMAT_FIXED_16, SL_SPEAKER_FRONT_LEFT | SL_SPEAKER_FRONT_RIGHT, SL_BYTEORDER_LITTLEENDIAN };
    SLDataSource bufferPlayerSource = { &bufferPlayerLocator, &bufferPlayerFormat };
    const SLInterfaceID bufferPlayerInterfaces[1] = { SL_IID_BUFFERQUEUE };
    SLDataSink bufferPlayerOutput = { &outputMixLocator, NULL };
    (*openSLEngineInterface)->CreateAudioPlayer(openSLEngineInterface, &bufferPlayer, &bufferPlayerSource, &bufferPlayerOutput, 1, bufferPlayerInterfaces, requireds);
    (*bufferPlayer)->Realize(bufferPlayer, SL_BOOLEAN_FALSE);

    // Initialize and start the buffer queue.
    (*bufferPlayer)->GetInterface(bufferPlayer, SL_IID_BUFFERQUEUE, &bufferQueue);
    (*bufferQueue)->RegisterCallback(bufferQueue, openSLESCallback, this);
    memset(outputBuffer[0], 0, buffersize * 4);
    memset(outputBuffer[1], 0, buffersize * 4);
    (*bufferQueue)->Enqueue(bufferQueue, outputBuffer[0], buffersize * 4);
    (*bufferQueue)->Enqueue(bufferQueue, outputBuffer[1], buffersize * 4);
    SLPlayItf bufferPlayerPlayInterface;
    (*bufferPlayer)->GetInterface(bufferPlayer, SL_IID_PLAY, &bufferPlayerPlayInterface);
    (*bufferPlayerPlayInterface)->SetPlayState(bufferPlayerPlayInterface, SL_PLAYSTATE_PLAYING);
}

SuperpoweredExample::~SuperpoweredExample() {
    for (int n = 0; n < NUM_BUFFERS; n++) free(outputBuffer[n]);
    delete playerA;
    delete playerB;
    delete mixer;
    pthread_mutex_destroy(&mutex);
}

void SuperpoweredExample::onPlayPause(bool play) {
    pthread_mutex_lock(&mutex);
    if (!play) {
        playerA->pause();
        playerB->pause();
    } else {
        bool masterIsA = (crossValue <= 0.5f);
        playerA->play(!masterIsA);
        playerB->play(masterIsA);
    };
    pthread_mutex_unlock(&mutex);
}

void SuperpoweredExample::onCrossfader(int value) {
    pthread_mutex_lock(&mutex);
    crossValue = float(value) * 0.01f;
    if (crossValue < 0.01f) {
        volA = 1.0f * headroom;
        volB = 0.0f;
    } else if (crossValue > 0.99f) {
        volA = 0.0f;
        volB = 1.0f * headroom;
    } else { // constant power curve
        volA = cosf(M_PI_2 * crossValue) * headroom;
        volB = cosf(M_PI_2 * (1.0f - crossValue)) * headroom;
    };
    pthread_mutex_unlock(&mutex);
}

void SuperpoweredExample::onFxSelect(int value) {
    __android_log_print(ANDROID_LOG_VERBOSE, "SuperpoweredExample", "FXSEL %i", value);
    activeFx = value;
}

void SuperpoweredExample::onFxOff() {
    filter->enable(false);
    roll->enable(false);
    flanger->enable(false);
    whoosh->enable(false);
    gate->enable(false);
    echo->enable(false);
    reverb->enable(false);
}

#define MINFREQ 60.0f
#define MAXFREQ 20000.0f

static inline float floatToFrequency(float value) {
    if (value > 0.97f) return MAXFREQ;
    if (value < 0.03f) return MINFREQ;
    value = powf(10.0f, (value + ((0.4f - fabsf(value - 0.4f)) * 0.3f)) * log10f(MAXFREQ - MINFREQ)) + MINFREQ;
    return value < MAXFREQ ? value : MAXFREQ;
}

void SuperpoweredExample::onFxValue(int ivalue) {
    float value = float(ivalue) * 0.01f;
    switch (activeFx) {
        case 1: // filter
            filter->setResonantParameters(floatToFrequency(1.0f - value), 0.2f);
            filter->enable(true);
            flanger->enable(false);
            roll->enable(false);
            whoosh->enable(false);
            gate->enable(false);
            echo->enable(false);
            reverb->enable(false);
            break;
        case 2: // roll
            if (value > 0.8f) roll->beats = 0.0625f;
            else if (value > 0.6f) roll->beats = 0.125f;
            else if (value > 0.4f) roll->beats = 0.25f;
            else if (value > 0.2f) roll->beats = 0.5f;
            else roll->beats = 1.0f;
            roll->enable(true);
            filter->enable(false);
            flanger->enable(false);
            whoosh->enable(false);
            gate->enable(false);
            echo->enable(false);
            reverb->enable(false);
            break;
        case 3: // echo
            flanger->enable(false);
            filter->enable(false);
            roll->enable(false);
            whoosh->enable(false);
            gate->enable(false);
            echo->setMix(value);
            echo->enable(true);
            reverb->enable(false);
            break;
        case 4: // whoosh
            flanger->enable(false);
            filter->enable(false);
            roll->enable(false);
            whoosh->setFrequency(floatToFrequency(1.0f - value));
            whoosh->enable(true);
            gate->enable(false);
            echo->enable(false);
            reverb->enable(false);
            break;
        case 5: // gate
            flanger->enable(false);
            filter->enable(false);
            roll->enable(false);
            whoosh->enable(false);
            echo->enable(false);
            if (value > 0.8f) gate->beats = 0.0625f;
            else if (value > 0.6f) gate->beats = 0.125f;
            else if (value > 0.4f) gate->beats = 0.25f;
            else if (value > 0.2f) gate->beats = 0.5f;
            else gate->beats = 1.0f;
            gate->enable(true);
            reverb->enable(false);
            break;
        case 6: // reverb
            flanger->enable(false);
            filter->enable(false);
            roll->enable(false);
            whoosh->enable(false);
            echo->enable(false);
            gate->enable(false);
            reverb->enable(true);
            reverb->setRoomSize(value);
            break;
        default: // flanger
            flanger->setWet(value);
            flanger->enable(true);
            filter->enable(false);
            roll->enable(false);
            whoosh->enable(false);
            gate->enable(false);
            echo->enable(false);
    };
}

void SuperpoweredExample::process(SLAndroidSimpleBufferQueueItf caller) {
    pthread_mutex_lock(&mutex);
    float *stereoBuffer = outputBuffer[currentBuffer];

    bool masterIsA = (crossValue <= 0.5f);
    float masterBpm = masterIsA ? playerA->currentBpm : playerB->currentBpm;
    double msElapsedSinceLastBeatA = playerA->msElapsedSinceLastBeat; // When playerB needs it, playerA has already stepped this value, so save it now.

    bool silence = !playerA->process(stereoBuffer, false, buffersize, volA, masterBpm, playerB->msElapsedSinceLastBeat);
    if (playerB->process(stereoBuffer, !silence, buffersize, volB, masterBpm, msElapsedSinceLastBeatA)) silence = false;

    roll->bpm = flanger->bpm = gate->bpm = masterBpm; // Syncing fx is one line.
    if (roll->process(silence ? NULL : stereoBuffer, stereoBuffer, buffersize) && silence) silence = false;
    if (!silence) {
        filter->process(stereoBuffer, stereoBuffer, buffersize);
        flanger->process(stereoBuffer, stereoBuffer, buffersize);
        whoosh->process(stereoBuffer, stereoBuffer, buffersize);
        gate->process(stereoBuffer, stereoBuffer, buffersize);
        echo->process(stereoBuffer, stereoBuffer, buffersize);
        reverb->process(stereoBuffer, stereoBuffer, buffersize);
    };

    pthread_mutex_unlock(&mutex);

    // The stereoBuffer is ready now, let's put the finished audio into the requested buffers.
    if (silence) memset(stereoBuffer, 0, buffersize * 4);
    else SuperpoweredStereoMixer::floatToShortInt(stereoBuffer, (short int *)stereoBuffer, buffersize);
    (*caller)->Enqueue(caller, stereoBuffer, buffersize * 4);
    if (currentBuffer < NUM_BUFFERS - 1) currentBuffer++; else currentBuffer = 0;
}

extern "C" {
    JNIEXPORT void Java_com_example_SuperpoweredExample_MainActivity_SuperpoweredExample(JNIEnv *javaEnvironment, jobject self, jstring apkPath, jlongArray offsetAndLength);
    JNIEXPORT void Java_com_example_SuperpoweredExample_MainActivity_onPlayPause(JNIEnv *javaEnvironment, jobject self, jboolean play);
    JNIEXPORT void Java_com_example_SuperpoweredExample_MainActivity_onCrossfader(JNIEnv *javaEnvironment, jobject self, jint value);
    JNIEXPORT void Java_com_example_SuperpoweredExample_MainActivity_onFxSelect(JNIEnv *javaEnvironment, jobject self, jint value);
    JNIEXPORT void Java_com_example_SuperpoweredExample_MainActivity_onFxOff(JNIEnv *javaEnvironment, jobject self);
    JNIEXPORT void Java_com_example_SuperpoweredExample_MainActivity_onFxValue(JNIEnv *javaEnvironment, jobject self, jint value);
}

static SuperpoweredExample *example = NULL;

// Android is not passing more than 2 custom parameters, so we had to pack file offsets and lengths into an array.
JNIEXPORT void Java_com_example_SuperpoweredExample_MainActivity_SuperpoweredExample(JNIEnv *javaEnvironment, jobject self, jstring apkPath, jlongArray params) {
    // Convert the input jlong array to a regular int array.
    jlong *longParams = javaEnvironment->GetLongArrayElements(params, JNI_FALSE);
    int arr[6];
    for (int n = 0; n < 6; n++) arr[n] = longParams[n];
    javaEnvironment->ReleaseLongArrayElements(params, longParams, JNI_ABORT);
    const char *path = javaEnvironment->GetStringUTFChars(apkPath, JNI_FALSE);
    example = new SuperpoweredExample(path, arr);
    javaEnvironment->ReleaseStringUTFChars(apkPath, path);
}

JNIEXPORT void Java_com_example_SuperpoweredExample_MainActivity_onPlayPause(JNIEnv *javaEnvironment, jobject self, jboolean play) {
    example->onPlayPause(play);
}

JNIEXPORT void Java_com_example_SuperpoweredExample_MainActivity_onCrossfader(JNIEnv *javaEnvironment, jobject self, jint value) {
    example->onCrossfader(value);
}

JNIEXPORT void Java_com_example_SuperpoweredExample_MainActivity_onFxSelect(JNIEnv *javaEnvironment, jobject self, jint value) {
    example->onFxSelect(value);
}

JNIEXPORT void Java_com_example_SuperpoweredExample_MainActivity_onFxOff(JNIEnv *javaEnvironment, jobject self) {
    example->onFxOff();
}

JNIEXPORT void Java_com_example_SuperpoweredExample_MainActivity_onFxValue(JNIEnv *javaEnvironment, jobject self, jint value) {
    example->onFxValue(value);
}


#ifndef Header_SuperpoweredExample
#define Header_SuperpoweredExample

#include <SLES/OpenSLES.h>
#include <SLES/OpenSLES_Android.h>
#include <math.h>
#include <pthread.h>
#include "SuperpoweredAdvancedAudioPlayer.h"
#include "SuperpoweredFilter.h"
#include "SuperpoweredRoll.h"
#include "SuperpoweredFlanger.h"
#include "SuperpoweredMixer.h"
#include "SuperpoweredWhoosh.h"
#include "SuperpoweredGate.h"
#include "SuperpoweredEcho.h"
#include "SuperpoweredReverb.h"
#include "SuperpoweredTimeStretching.h"

#define NUM_BUFFERS 2
#define HEADROOM_DECIBEL 3.0f
static const float headroom = powf(10.0f, -HEADROOM_DECIBEL * 0.025);

class SuperpoweredExample {
public:
    SuperpoweredExample(const char *path, int *params);
    ~SuperpoweredExample();
    void process(SLAndroidSimpleBufferQueueItf caller);
    void onPlayPause(bool play);
    void onCrossfader(int value);
    void onFxSelect(int value);
    void onFxOff();
    void onFxValue(int value);

private:
    SLObjectItf openSLEngine, outputMix, bufferPlayer;
    SLAndroidSimpleBufferQueueItf bufferQueue;
    SuperpoweredAdvancedAudioPlayer *playerA, *playerB;
    SuperpoweredRoll *roll;
    SuperpoweredFilter *filter;
    SuperpoweredFlanger *flanger;
    SuperpoweredStereoMixer *mixer;
    SuperpoweredWhoosh *whoosh;
    SuperpoweredGate *gate;
    SuperpoweredEcho *echo;
    SuperpoweredReverb *reverb;
    SuperpoweredTimeStretching *stretch;
    unsigned char activeFx;
    float crossValue, volA, volB;
    pthread_mutex_t mutex;
    float *outputBuffer[NUM_BUFFERS];
    int currentBuffer, buffersize;
};

#endif


package com.example.SuperpoweredExample;

import java.io.IOException;
import android.annotation.SuppressLint;
import android.app.Activity;
import android.content.Context;
import android.content.res.AssetFileDescriptor;
import android.media.AudioManager;
import android.os.Bundle;
import android.view.View;
import android.widget.Button;
import android.widget.RadioButton;
import android.widget.RadioGroup;
import android.widget.RadioGroup.OnCheckedChangeListener;
import android.widget.SeekBar;
import android.widget.SeekBar.OnSeekBarChangeListener;

public class MainActivity extends Activity {
    boolean playing = false;
    RadioGroup group1;
    RadioGroup group2;

    OnCheckedChangeListener rgCheckedChanged = new OnCheckedChangeListener() {
        @Override
        public void onCheckedChanged(RadioGroup group, int checkedId) {
            RadioButton checkedRadioButton = (RadioButton)group.findViewById(checkedId);
            final int delta = group == group2 ? 4 : 0;
            if (group == group1) {
                group2.setOnCheckedChangeListener(null);
                group2.clearCheck();
                group2.setOnCheckedChangeListener(rgCheckedChanged);
            } else {
                group1.setOnCheckedChangeListener(null);
                group1.clearCheck();
                group1.setOnCheckedChangeListener(rgCheckedChanged);
            }
            onFxSelect(group.indexOfChild(checkedRadioButton) + delta);
        }
    };

    @SuppressLint("NewApi")
    protected void onCreate(Bundle savedInstanceState) {
        super.onCreate(savedInstanceState);
        setContentView(R.layout.main);

        // Get the device's sample rate and buffer size to enable low-latency Android audio output, if available.
        AudioManager audioManager = (AudioManager) this.getSystemService(Context.AUDIO_SERVICE);
        String samplerateString = null, buffersizeString = null;
        try {
            samplerateString = audioManager.getProperty(AudioManager.PROPERTY_OUTPUT_SAMPLE_RATE);
        } catch (NoSuchMethodError ignored) {}
        try {
            buffersizeString = audioManager.getProperty(AudioManager.PROPERTY_OUTPUT_FRAMES_PER_BUFFER);
        } catch (NoSuchMethodError ignored) {}
        if (samplerateString == null) samplerateString = "44100";
        if (buffersizeString == null) buffersizeString = "512";

        // Files under res/raw are not compressed, just copied into the APK. Get the offset and length to know where our files are located.
        AssetFileDescriptor fd0 = getResources().openRawResourceFd(R.raw.lycka), fd1 = getResources().openRawResourceFd(R.raw.nuyorica);
        long[] params = { fd0.getStartOffset(), fd0.getLength(), fd1.getStartOffset(), fd1.getLength(), Integer.parseInt(samplerateString), Integer.parseInt(buffersizeString) };
        try { fd0.getParcelFileDescriptor().close(); } catch (IOException e) {}
        try { fd1.getParcelFileDescriptor().close(); } catch (IOException e) {}
        SuperpoweredExample(getPackageResourcePath(), params); // Arguments: path to the APK file, offset and length of the two resource files, sample rate, audio buffer size.

        // crossfader events
        final SeekBar crossfader = (SeekBar)findViewById(R.id.crossFader);
        crossfader.setOnSeekBarChangeListener(new OnSeekBarChangeListener() {
            public void onProgressChanged(SeekBar seekBar, int progress, boolean fromUser) { onCrossfader(progress); }
            public void onStartTrackingTouch(SeekBar seekBar) {}
            public void onStopTrackingTouch(SeekBar seekBar) {}
        });

        // fx fader events
        final SeekBar fxfader = (SeekBar)findViewById(R.id.fxFader);
        fxfader.setOnSeekBarChangeListener(new OnSeekBarChangeListener() {
            public void onProgressChanged(SeekBar seekBar, int progress, boolean fromUser) { onFxValue(progress); }
            public void onStartTrackingTouch(SeekBar seekBar) { onFxValue(seekBar.getProgress()); }
            public void onStopTrackingTouch(SeekBar seekBar) { onFxOff(); }
        });

        // fx select events: both radio groups share the rgCheckedChanged listener above.
        group1 = (RadioGroup)findViewById(R.id.radioGroup1);
        group1.setOnCheckedChangeListener(rgCheckedChanged);
        group2 = (RadioGroup)findViewById(R.id.radioGroup2);
        group2.setOnCheckedChangeListener(rgCheckedChanged);
    }

    public void SuperpoweredExample_PlayPause(View button) { // Play/pause.
        playing = !playing;
        onPlayPause(playing);
        Button b = (Button) findViewById(R.id.playPause);
        b.setText(playing ? "Pause" : "Play");
    }

    private native void SuperpoweredExample(String apkPath, long[] offsetAndLength);
    private native void onPlayPause(boolean play);
    private native void onCrossfader(int value);
    private native void onFxSelect(int value);
    private native void onFxOff();
    private native void onFxValue(int value);

    static {
        System.loadLibrary("SuperpoweredExample");
    }

    @Override
    protected void onDestroy() {
        super.onDestroy();
        onPlayPause(false);
    }
}


<?xml version="1.0" encoding="utf-8"?>
<RelativeLayout xmlns:tools="http://schemas.android.com/tools"
    xmlns:android="http://schemas.android.com/apk/res/android"
    xmlns:android1="http://schemas.android.com/apk/res/android"
    android:layout_width="match_parent"
    android:layout_height="match_parent" >

    <Button
        android1:id="@+id/playPause"
        android1:layout_width="wrap_content"
        android1:layout_height="wrap_content"
        android1:layout_alignParentTop="true"
        android1:layout_centerHorizontal="true"
        android1:layout_marginLeft="5dp"
        android1:layout_marginRight="5dp"
        android1:layout_marginTop="15dp"
        android1:onClick="SuperpoweredExample_PlayPause"
        android1:text="@string/play" />

    <SeekBar
        android1:id="@+id/crossFader"
        android1:layout_width="match_parent"
        android1:layout_height="wrap_content"
        android1:layout_alignParentLeft="true"
        android1:layout_below="@+id/playPause"
        android1:layout_marginLeft="5dp"
        android1:layout_marginRight="5dp"
        android1:layout_marginTop="15dp" />

    <RadioGroup
        android1:id="@+id/radioGroup1"
        android1:layout_width="wrap_content"
        android1:layout_height="wrap_content"
        android1:layout_below="@+id/crossFader"
        android1:layout_centerHorizontal="true"
        android1:layout_marginTop="15dp"
        android1:orientation="horizontal" >

        <RadioButton
            android1:id="@+id/radio0"
            android1:layout_width="wrap_content"
            android1:layout_height="wrap_content"
            android1:checked="true"
            android1:text="@string/flanger" />

        <RadioButton
            android1:id="@+id/radio1"
            android1:layout_width="wrap_content"
            android1:layout_height="wrap_content"
            android1:text="@string/filter" />

        <RadioButton
            android1:id="@+id/radio2"
            android1:layout_width="wrap_content"
            android1:layout_height="wrap_content"
            android1:text="@string/roll" />

        <RadioButton
            android1:id="@+id/radio3"
            android1:layout_width="wrap_content"
            android1:layout_height="wrap_content"
            android1:text="@string/echo" />
    </RadioGroup>

    <RadioGroup
        android1:id="@+id/radioGroup2"
        android1:layout_width="wrap_content"
        android1:layout_height="wrap_content"
        android1:layout_below="@+id/radioGroup1"
        android1:layout_centerHorizontal="true"
        android1:layout_marginTop="5dp"
        android1:orientation="horizontal" >

        <RadioButton
            android1:id="@+id/radio4"
            android1:layout_width="wrap_content"
            android1:layout_height="wrap_content"
            android1:text="@string/whoosh" />

        <RadioButton
            android1:id="@+id/radio5"
            android1:layout_width="wrap_content"
            android1:layout_height="wrap_content"
            android1:text="@string/gate" />

        <RadioButton
            android1:id="@+id/radio6"
            android1:layout_width="wrap_content"
            android1:layout_height="wrap_content"
            android1:text="@string/reverb" />
    </RadioGroup>

    <SeekBar
        android1:id="@+id/fxFader"
        android1:layout_width="match_parent"
        android1:layout_height="wrap_content"
        android1:layout_alignParentLeft="true"
        android1:layout_below="@+id/radioGroup2"
        android1:layout_marginLeft="5dp"
        android1:layout_marginRight="5dp"
        android1:layout_marginTop="15dp" />
</RelativeLayout>

What has become quite obvious to me is that this library can do the following:
– decompress audio if needed, e.g. from AAC to WAV/PCM (there is no Android example and no JNI for this, but there is an example of how to do it for iOS);
– apply effects to a file and save the file with the effects baked in (again, no Android example and no JNI for this, but there is an iOS example);
– SuperpoweredAdvancedAudioPlayer can change pitch and apply other effects on the fly without any lag;
– multiple instances of SuperpoweredAdvancedAudioPlayer can play simultaneously without problems, which lets you mix in preset sound effects.
What can’t be solved with this SDK from what I need:
– you can’t pack the resulting WAV file into an MP3 (as I mentioned above, Android has no MP3 encoder), but you can use a fork of LAME for Android to do that.
What remains in question:
– it is unknown whether SuperpoweredAdvancedAudioPlayer can play over HTTP, and how well it works on all sorts of Samsung and HTC devices;
– it is unknown whether the SDK can record from a microphone.
In general, I’m waiting for your comments and advice, and maybe someone will even take the trouble to write JNI bindings for all of the SDK’s functions, if that is possible at all, which would help popularize this library. There is very little mention of it online, especially regarding Android, but it does deserve attention!
