
Working with sound and the Superpowered library

I have a task: I need to develop an application that records sound from the microphone, modifies it (speeds it up or pitch-shifts it), bakes the chosen effect into the file itself, and sends the resulting MP3 file to the application server. The task turns out to be nontrivial. On top of that, I want min-sdk = 9.
For recording, the class that suggests itself as the simplest option is MediaRecorder. It records from the microphone and compresses on the fly, to AAC for example. For reference (in case someone doesn't know): Android has no MP3 encoder — the license there is commercial — only an MP3 decoder. So you cannot record straight to MP3; you can only play MP3 back, which is what the decoder is for.
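For the record, a minimal compressed-recording sketch with MediaRecorder looks roughly like this (the class and file-path parameter are mine for illustration, not from the project; the AAC encoder constant requires API 10+, which is why the AudioRecorder code below falls back to DEFAULT on older devices):

import android.media.MediaRecorder;
import java.io.IOException;

public class CompressedRecorderSketch {
    private MediaRecorder recorder;

    // Starts recording AAC into an MPEG-4 container.
    public void start(String outputFilePath) throws IOException {
        recorder = new MediaRecorder();
        recorder.setAudioSource(MediaRecorder.AudioSource.MIC);
        recorder.setOutputFormat(MediaRecorder.OutputFormat.MPEG_4);
        recorder.setAudioEncoder(MediaRecorder.AudioEncoder.AAC); // API 10+
        recorder.setOutputFile(outputFilePath);
        recorder.prepare();
        recorder.start();
    }

    public void stop() {
        recorder.stop();
        recorder.release();
        recorder = null;
    }
}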
Everything would be fine, except that in order to do anything with the sound — namely, to apply an effect to it — it has to be recorded in its original, so to speak, form: not compressed to MP3 or AAC, but as PCM/WAVE. On top of that, during playback the applied effect has to be rendered in real time, so the user can tune it by ear. For this, MediaRecorder is no longer suitable: it records only in compressed form, and applying a speed-up during playback is intractable given the minimum SDK. You could create extra work for yourself — record AAC with MediaRecorder and then decode the AAC back to PCM/WAVE — but Android does not provide that out of the box either, so that is yet another problem to hunt a solution for. And again: decoding is greedy for computing resources and battery compared to simply recording uncompressed in the first place, and the user will hardly enjoy spending his valuable time on a process whose purpose he may not even understand.
From all this it follows that recording has to be done with a different pair of classes: AudioRecord records, and AudioTrack plays back — and applying a speed change during playback is no problem for it.
In practice, however, these classes bring plenty of problems of their own. First, AudioRecord produces PCM data right away — which is what I need — but it does not create the WAVE header itself; it writes raw PCM. For the data to be usable anywhere other than AudioTrack, you have to add code that writes this header yourself, and you must not forget to skip the first 44 bytes (the header size) during playback so that AudioTrack does not try to play them. Second, all the recording and playback work has to be done in separate threads, which does not simplify development.
Still, here is the code of a stand-alone recorder built on AudioRecord. For the most part it is not mine but assembled from the depths of StackOverflow; it includes creation of the WAVE header and may be useful to someone:
Audiorecorder
import android.annotation.SuppressLint;
import android.media.AudioFormat;
import android.media.AudioRecord;
import android.media.MediaRecorder;
import com.stanko.tools.DeviceInfo;
import com.stanko.tools.Log;

import java.io.File;
import java.io.IOException;
import java.io.RandomAccessFile;

public class AudioRecorder {

    /**
     * INITIALIZING : recorder is initializing;
     * READY : recorder has been initialized, recorder not yet started
     * RECORDING : recording
     * ERROR : reconstruction needed
     * STOPPED : reset needed
     */
    public enum State { INITIALIZING, READY, RECORDING, ERROR, STOPPED }

    public static final boolean RECORDING_UNCOMPRESSED = true;
    public static final boolean RECORDING_COMPRESSED = false;

    // The interval in which the recorded samples are output to the file
    // Used only in uncompressed mode
    private static final int TIMER_INTERVAL = 120;

    // Toggles uncompressed recording on/off; RECORDING_UNCOMPRESSED / RECORDING_COMPRESSED
    private boolean isUncompressed;

    // Recorder used for uncompressed recording
    private AudioRecord mAudioRecorder = null;

    // Recorder used for compressed recording
    private MediaRecorder mMediaRecorder = null;

    // Stores current amplitude (only in uncompressed mode)
    private int cAmplitude = 0;

    // Output file path
    private String mFilePath = null;

    // Recorder state; see State
    private State state;

    // File writer (only in uncompressed mode)
    private RandomAccessFile mFileWriter;

    // Number of channels, sample rate, sample size (in bits), buffer size,
    // audio source, sample format (see AudioFormat)
    private short nChannels;
    private int nRate;
    private short nSamples;
    private int nBufferSize;
    private int nSource;
    private int nFormat;

    // Number of frames written to file on each output (only in uncompressed mode)
    private int nFramePeriod;

    // Buffer for output (only in uncompressed mode)
    private byte[] mBuffer;

    // Number of bytes written to file after header (only in uncompressed mode);
    // after stop() is called, this size is written to the header/data chunk in the wave file
    private int nPayloadSize;

    /**
     * Returns the state of the recorder in a RehearsalAudioRecord.State typed object.
     * Useful, as no exceptions are thrown.
     *
     * @return recorder state
     */
    public State getState() {
        return state;
    }

    /*
     * Method used for recording.
     */
    private AudioRecord.OnRecordPositionUpdateListener updateListener = new AudioRecord.OnRecordPositionUpdateListener() {

        public void onPeriodicNotification(AudioRecord recorder) {
            mAudioRecorder.read(mBuffer, 0, mBuffer.length); // Fill buffer
            try {
                mFileWriter.write(mBuffer); // Write buffer to file
                nPayloadSize += mBuffer.length;
                if (nSamples == 16) {
                    for (int i = 0; i < mBuffer.length / 2; i++) { // 16bit sample size
                        short curSample = getShort(mBuffer[i * 2], mBuffer[i * 2 + 1]);
                        if (curSample > cAmplitude) { // Check amplitude
                            cAmplitude = curSample;
                        }
                    }
                } else { // 8bit sample size
                    for (int i = 0; i < mBuffer.length; i++) {
                        if (mBuffer[i] > cAmplitude) { // Check amplitude
                            cAmplitude = mBuffer[i];
                        }
                    }
                }
            } catch (IOException e) {
                Log.e(this, "Error occurred in updateListener, recording is aborted");
                stop();
            }
        }

        public void onMarkerReached(AudioRecord recorder) {
            // NOT USED
        }
    };

    /**
     * Default constructor
     *
     * Instantiates a new recorder; in case of compressed recording the parameters can be left as 0.
     * In case of errors, no exception is thrown, but the state is set to ERROR.
     */
    @SuppressLint("InlinedApi")
    public AudioRecorder(boolean uncompressed, int audioSource, int sampleRate, int channelConfig, int audioFormat) {
        try {
            isUncompressed = uncompressed;
            if (isUncompressed) { // RECORDING_UNCOMPRESSED
                if (audioFormat == AudioFormat.ENCODING_PCM_16BIT) {
                    nSamples = 16;
                } else {
                    nSamples = 8;
                }
                if (channelConfig == AudioFormat.CHANNEL_IN_MONO) {
                    nChannels = 1;
                } else {
                    nChannels = 2;
                }
                nSource = audioSource;
                nRate = sampleRate;
                nFormat = audioFormat;

                nFramePeriod = sampleRate * TIMER_INTERVAL / 1000;
                nBufferSize = nFramePeriod * 2 * nSamples * nChannels / 8;
                // Check to make sure buffer size is not smaller than the smallest allowed one
                if (nBufferSize < AudioRecord.getMinBufferSize(sampleRate, channelConfig, audioFormat)) {
                    nBufferSize = AudioRecord.getMinBufferSize(sampleRate, channelConfig, audioFormat);
                    // Set frame period and timer interval accordingly
                    nFramePeriod = nBufferSize / (2 * nSamples * nChannels / 8);
                    Log.w(this, "Increasing buffer size to " + Integer.toString(nBufferSize));
                }

                mAudioRecorder = new AudioRecord(audioSource, sampleRate, channelConfig, audioFormat, nBufferSize);
                if (mAudioRecorder.getState() != AudioRecord.STATE_INITIALIZED)
                    throw new Exception("AudioRecord initialization failed");
                mAudioRecorder.setRecordPositionUpdateListener(updateListener);
                mAudioRecorder.setPositionNotificationPeriod(nFramePeriod);
            } else { // RECORDING_COMPRESSED
                mMediaRecorder = new MediaRecorder();
                mMediaRecorder.setAudioSource(MediaRecorder.AudioSource.MIC);
                mMediaRecorder.setOutputFormat(MediaRecorder.OutputFormat.MPEG_4);
                if (DeviceInfo.hasAPI10())
                    mMediaRecorder.setAudioEncoder(MediaRecorder.AudioEncoder.AAC);
                else
                    mMediaRecorder.setAudioEncoder(MediaRecorder.AudioEncoder.DEFAULT);
            }
            cAmplitude = 0;
            mFilePath = null;
            state = State.INITIALIZING;
        } catch (Exception e) {
            if (e.getMessage() != null) {
                Log.e(this, e.getMessage());
            } else {
                Log.e(this, "Unknown error occurred while initializing recording");
            }
            state = State.ERROR;
        }
    }

    /**
     * Sets output file path, call directly after construction/reset.
     */
    public void setOutputFile(File file) {
        setOutputFile(file.getAbsolutePath());
    }

    public void setOutputFile(String argPath) {
        try {
            if (state == State.INITIALIZING) {
                mFilePath = argPath;
                if (!isUncompressed) {
                    mMediaRecorder.setOutputFile(mFilePath);
                }
            }
        } catch (Exception e) {
            if (e.getMessage() != null) {
                Log.e(this, e.getMessage());
            } else {
                Log.e(this, "Unknown error occurred while setting output path");
            }
            state = State.ERROR;
        }
    }

    /**
     * Returns the largest amplitude sampled since the last call to this method.
     *
     * @return the largest amplitude since the last call, or 0 when not in recording state.
     */
    public int getMaxAmplitude() {
        if (state == State.RECORDING) {
            if (isUncompressed) {
                int result = cAmplitude;
                cAmplitude = 0;
                return result;
            } else {
                try {
                    return mMediaRecorder.getMaxAmplitude();
                } catch (IllegalStateException e) {
                    return 0;
                }
            }
        } else {
            return 0;
        }
    }

    /**
     * Prepares the recorder for recording. In case the recorder is not in the INITIALIZING state
     * or the file path was not set, the recorder is set to the ERROR state, which makes a
     * reconstruction necessary. In case uncompressed recording is toggled, the header of the
     * wave file is written. In case of an exception, the state is changed to ERROR.
     */
    public void prepare() {
        try {
            if (state == State.INITIALIZING) {
                if (isUncompressed) {
                    if ((mAudioRecorder.getState() == AudioRecord.STATE_INITIALIZED) && (mFilePath != null)) {
                        // write file header
                        Log.w(this, "prepare(): nRate: " + nRate + " nChannels: " + nChannels);
                        mFileWriter = new RandomAccessFile(mFilePath, "rw");
                        mFileWriter.setLength(0); // Set file length to 0, to prevent unexpected behavior in case the file already existed
                        mFileWriter.writeBytes("RIFF");                                                 // 4
                        mFileWriter.writeInt(0);                                                        // 4 Final file size not known yet, write 0
                        mFileWriter.writeBytes("WAVE");                                                 // 4
                        mFileWriter.writeBytes("fmt ");                                                 // 4
                        mFileWriter.writeInt(Integer.reverseBytes(16));                                 // 4 Sub-chunk size, 16 for PCM
                        mFileWriter.writeShort(Short.reverseBytes((short) 1));                          // 2 AudioFormat, 1 for PCM
                        mFileWriter.writeShort(Short.reverseBytes(nChannels));                          // 2 Number of channels, 1 for mono, 2 for stereo
                        mFileWriter.writeInt(Integer.reverseBytes(nRate));                              // 4 Sample rate
                        mFileWriter.writeInt(Integer.reverseBytes(nRate * nSamples * nChannels / 8));   // 4 Byte rate, SampleRate*NumberOfChannels*BitsPerSample/8
                        mFileWriter.writeShort(Short.reverseBytes((short) (nChannels * nSamples / 8))); // 2 Block align, NumberOfChannels*BitsPerSample/8
                        mFileWriter.writeShort(Short.reverseBytes(nSamples));                           // 2 Bits per sample
                        mFileWriter.writeBytes("data");                                                 // 4
                        mFileWriter.writeInt(0);                                                        // 4 Data chunk size not known yet, write 0
                        mBuffer = new byte[nFramePeriod * nSamples / 8 * nChannels];
                        state = State.READY;
                    } else {
                        Log.e(this, "prepare() method called on uninitialized recorder");
                        state = State.ERROR;
                    }
                } else {
                    mMediaRecorder.prepare();
                    state = State.READY;
                }
            } else {
                Log.e(this, "prepare() method called on illegal state");
                release();
                state = State.ERROR;
            }
        } catch (Exception e) {
            if (e.getMessage() != null) {
                Log.e(this, e.getMessage());
            } else {
                Log.e(this, "Unknown error occurred in prepare()");
            }
            state = State.ERROR;
        }
    }

    /**
     * Releases the resources associated with this class, and removes the unnecessary files, when necessary.
     */
    public void release() {
        if (state == State.RECORDING) {
            stop();
        } else {
            if ((state == State.READY) && isUncompressed) {
                try {
                    mFileWriter.close(); // Remove prepared file
                } catch (IOException e) {
                    Log.e(this, "I/O exception occurred while closing output file");
                }
                (new File(mFilePath)).delete();
            }
        }
        if (isUncompressed) {
            if (mAudioRecorder != null) {
                mAudioRecorder.release();
            }
        } else {
            if (mMediaRecorder != null) {
                mMediaRecorder.release();
            }
        }
    }

    /**
     * Resets the recorder to the INITIALIZING state, as if it was just created.
     * In case the class was in RECORDING state, the recording is stopped.
     * In case of exceptions the class is set to the ERROR state.
     */
    public void reset() {
        try {
            if (state != State.ERROR) {
                release();
                mFilePath = null; // Reset file path
                cAmplitude = 0;   // Reset amplitude
                if (isUncompressed) {
                    // NB: nChannels+1 maps onto the deprecated CHANNEL_CONFIGURATION_MONO/STEREO constants
                    mAudioRecorder = new AudioRecord(nSource, nRate, nChannels + 1, nFormat, nBufferSize);
                } else {
                    mMediaRecorder = new MediaRecorder();
                    mMediaRecorder.setAudioSource(MediaRecorder.AudioSource.MIC);
                    mMediaRecorder.setOutputFormat(MediaRecorder.OutputFormat.THREE_GPP);
                    mMediaRecorder.setAudioEncoder(MediaRecorder.AudioEncoder.AMR_NB);
                }
                state = State.INITIALIZING;
            }
        } catch (Exception e) {
            Log.e(this, e.getMessage());
            state = State.ERROR;
        }
    }

    /**
     * Starts the recording, and sets the state to RECORDING.
     * Call after prepare().
     */
    public void start() {
        if (state == State.READY) {
            if (isUncompressed) {
                nPayloadSize = 0;
                mAudioRecorder.startRecording();
                mAudioRecorder.read(mBuffer, 0, mBuffer.length);
            } else {
                mMediaRecorder.start();
            }
            state = State.RECORDING;
        } else {
            Log.e(this, "start() called on illegal state");
            state = State.ERROR;
        }
    }

    /**
     * Stops the recording, and sets the state to STOPPED.
     * In case of further usage, a reset is needed.
     * Also finalizes the wave file in case of uncompressed recording.
     */
    public void stop() {
        if (state == State.RECORDING) {
            if (isUncompressed) {
                mAudioRecorder.stop();
                mAudioRecorder.setRecordPositionUpdateListener(null);
                try {
                    mFileWriter.seek(4); // Write size to RIFF header
                    mFileWriter.writeInt(Integer.reverseBytes(36 + nPayloadSize));
                    mFileWriter.seek(40); // Write size to Subchunk2Size field
                    mFileWriter.writeInt(Integer.reverseBytes(nPayloadSize));
                    mFileWriter.close();
                    Log.w(this, "Recording stopped successfully");
                } catch (IOException e) {
                    Log.e(this, "I/O exception occurred while closing output file");
                    state = State.ERROR;
                }
            } else {
                mMediaRecorder.stop();
            }
            state = State.STOPPED;
        } else {
            Log.e(this, "stop() called on illegal state");
            state = State.ERROR;
        }
    }

    /*
     * Converts a byte[2] to a short, in LITTLE_ENDIAN format
     */
    private short getShort(byte argB1, byte argB2) {
        // mask the low byte so its sign bit doesn't wipe out the high byte
        return (short) ((argB1 & 0xff) | (argB2 << 8));
    }
}

Example of use when recording:
Audiothread
/*
 * Thread to manage live recording/playback of voice input from the device's microphone.
 */
private final static int[] sampleRates = {44100, 22050, 16000, 11025, 8000};
protected int usedSampleRate;

private class AudioThread extends Thread {

    private final File targetFile;
    private final static String TAG = "AudioThread";

    /**
     * Give the thread high priority so that it's not canceled unexpectedly, and start it
     */
    private AudioThread(final File file) {
        targetFile = file;
    }

    @Override
    public void run() {
        Log.i(TAG, "Running Audio Thread");
        Looper.prepare();

        // Walk down the list of sample rates until the recorder initializes:
        // not every device supports recording at 44.1 kHz.
        int i = 0;
        do {
            usedSampleRate = sampleRates[i];
            if (audioRecorder != null)
                audioRecorder.release();
            audioRecorder = new AudioRecorder(true, AudioSource.MIC, usedSampleRate,
                    AudioFormat.CHANNEL_IN_MONO, AudioFormat.ENCODING_PCM_16BIT);
        } while ((++i < sampleRates.length) && !(audioRecorder.getState() == AudioRecorder.State.INITIALIZING));

        Log.i(this, "usedSampleRate: " + usedSampleRate + " setOutputFile: " + targetFile);
        try {
            audioRecorder.setOutputFile(targetFile);
            // start the recording
            audioRecorder.prepare();
            audioRecorder.start();
            // if an error occurred and thus recording has not started
            if (audioRecorder.getState() == AudioRecorder.State.ERROR) {
                Toast.makeText(getBaseContext(), "AudioRecorder error", Toast.LENGTH_SHORT).show();
            }
        } catch (NullPointerException ignored) {
            // audioRecorder became null since it was canceled
        }
        Looper.loop();
    }
}

To play:
playerPlayUsingAudioTrack
/*
 * Thread to manage playback of recorded message.
 */
private int bufferSize;
protected int byteOffset;
protected int fileLength;

public void playerPlayUsingAudioTrack(final File messageFileWav) {
    if (messageFileWav == null || !messageFileWav.exists() || !messageFileWav.canRead()) {
        Toast.makeText(getBaseContext(),
                "Audiofile error: exists(): " + messageFileWav.exists()
                        + " canRead(): " + messageFileWav.canRead(),
                Toast.LENGTH_SHORT).show();
        return;
    }

    // is the previous thread still alive?
    if (audioTrackThread != null) {
        audioTrackThread.isStopped = true;
        audioTrackThread = null;
    }

    bufferSize = AudioRecord.getMinBufferSize(sampleRate,
            AudioFormat.CHANNEL_IN_MONO, AudioFormat.ENCODING_PCM_16BIT) * 4;

    audioTrackThread = new StoppableThread() {
        @Override
        public void run() {
            audioTrack = new AudioTrack(AudioManager.STREAM_MUSIC, sampleRate,
                    AudioFormat.CHANNEL_OUT_MONO, AudioFormat.ENCODING_PCM_16BIT,
                    bufferSize, AudioTrack.MODE_STREAM);

            fileLength = (int) messageFileWav.length();
            sbPlayerProgress.setMax(fileLength / 2);

            int byteCount = 4 * 1024; // 4 kb
            final byte[] byteData = new byte[byteCount];

            // Reading the file..
            RandomAccessFile in = null;
            try {
                in = new RandomAccessFile(messageFileWav, "r");
                int ret;
                byteOffset = 44; // skip the WAVE header, otherwise AudioTrack would "play" it
                audioTrack.play();
                isPaused = false;
                isPlayerPlaying = true;
                while (byteOffset < fileLength) {
                    if (this.isStopped)
                        break;
                    if (isPlayerPaused || this.isPaused) {
                        Thread.sleep(50); // don't spin the CPU while paused
                        continue;
                    }
                    in.seek(byteOffset);
                    ret = in.read(byteData, 0, byteCount);
                    if (ret != -1) {
                        // Write the byte array to the track
                        audioTrack.write(byteData, 0, ret);
                        audioTrack.setPlaybackRate(pitchValue);
                        byteOffset += ret;
                    } else
                        break;
                }
            } catch (Exception e) { // IOException, FileNotFoundException, NPE for audioTrack
                e.printStackTrace();
            } finally {
                if (in != null)
                    try {
                        in.close();
                    } catch (IOException ignored) {
                    }
            }
        }
    };
    audioTrackThread.start();
}

All in all: recording works, playback works, and a pitch-like effect is obtained via audioTrack.setPlaybackRate(pitchValue) — this variable is bound to a SeekBar, so the user can change it during playback and hear the effect immediately. What remains unclear is how to save the chosen effect level into the file itself... Apparently something specialized has to be built in C with the NDK.
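One crude way out — my own assumption, not something from the project above — exploits the fact that setPlaybackRate() is nothing more than a changed playback sample rate: rewrite the SampleRate and ByteRate fields of the finished WAV header, and every player will reproduce the file faster/slower (and correspondingly higher/lower) with no DSP at all. A minimal sketch, assuming a plain PCM file with the canonical 44-byte header like the one AudioRecorder produces:

import java.io.IOException;
import java.io.RandomAccessFile;

public class WavRateTweak {
    /**
     * Rewrites the sample rate declared in a canonical 44-byte WAV header;
     * newRate would be, e.g., the value passed to audioTrack.setPlaybackRate().
     */
    public static void setDeclaredSampleRate(String wavPath, int newRate,
                                             int channels, int bitsPerSample) throws IOException {
        RandomAccessFile f = new RandomAccessFile(wavPath, "rw");
        try {
            f.seek(24); // SampleRate field
            f.writeInt(Integer.reverseBytes(newRate));
            f.seek(28); // ByteRate = SampleRate * NumberOfChannels * BitsPerSample / 8
            f.writeInt(Integer.reverseBytes(newRate * channels * bitsPerSample / 8));
        } finally {
            f.close();
        }
    }
}

Note the limitation: this bakes in speed and pitch together, exactly like setPlaybackRate() does. A true pitch shift with unchanged tempo (or vice versa) needs real time-stretching DSP — which is precisely where the NDK comes in below.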
But there are other surprises besides this. Anyone with relevant experience knows that all sorts of devices from Samsung, HTC and other brands are not 100% compatible with generic Android. Each brand makes its own "improvements" at the OS source level, because of which code written exactly as Google documents it either does not work at all or does not work as expected — and this applies to the media stack in particular, so all kinds of crutches have to be built for these devices.
For example, Samsung devices have problems with streaming audio playback, i.e. using the MediaPlayer class with an HTTP link to an MP3 file as the source (and playing the audio files uploaded to the application server is exactly what is planned). A Samsung plays it erratically — sometimes it plays, sometimes it doesn't — while any other device running the same application code under otherwise identical conditions always plays fine. The crutch is to download the file in parts in a separate thread and feed it to the player as if it were a locally recorded file. Samsung devices, unlike everyone else, have also been taught to swallow perfect pauses in MP3 — stretches of total silence where an editor doesn't even draw a waveform. They simply skip them, so a five-minute recording plays back, first, unnaturally, and second, with broken duration: not 5 minutes but 4 or less, depending on the length of the pauses between phrases. No other devices do this. The crutch: mix white noise into the pauses.
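The download-first crutch, in its most primitive form, looks something like this (a sketch under my own assumptions — the real crutch feeds the file to the player in parts while it is still downloading; the class name and cache file are hypothetical, and this must of course run off the UI thread):

import android.media.MediaPlayer;
import java.io.File;
import java.io.FileOutputStream;
import java.io.InputStream;
import java.io.OutputStream;
import java.net.HttpURLConnection;
import java.net.URL;

public class LocalCachePlayback {
    // Downloads the whole MP3 into the cache dir, then plays it as a local file,
    // bypassing the flaky HTTP streaming path on some Samsung firmwares.
    public static void downloadAndPlay(String httpUrl, File cacheDir) throws Exception {
        File local = new File(cacheDir, "cached.mp3");
        HttpURLConnection conn = (HttpURLConnection) new URL(httpUrl).openConnection();
        InputStream in = conn.getInputStream();
        OutputStream out = new FileOutputStream(local);
        byte[] buf = new byte[8 * 1024];
        int read;
        while ((read = in.read(buf)) != -1)
            out.write(buf, 0, read);
        out.close();
        in.close();
        conn.disconnect();

        MediaPlayer player = new MediaPlayer();
        player.setDataSource(local.getAbsolutePath()); // a local path, not an HTTP URL
        player.prepare();
        player.start();
    }
}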
On some HTC models there are problems with audio recording: recording via MediaRecorder works fine, but via AudioRecord it does not. Specifically, writing out the accumulated buffer fails — the updateListener (see the AudioRecorder code) simply never fires. On other devices this listener works; on HTC, no — they just have to stand out somehow, don't they. A crutch can be built here too (see the sketch below), but then other problems surface, plus various other brands have problems of their own: no support for the 48 kHz or 44.1 kHz sampling rates, or particular combinations of recording settings (mono won't record, only stereo), and so on. In general this area of Android is still a pain, and I have no desire to keep discovering ever more incompatible devices and building ever more crutches for them.
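The crutch for devices where OnRecordPositionUpdateListener never fires is to drop the callback entirely and pull the data yourself in a plain reading thread. A sketch (the class is mine for illustration; it writes raw PCM, so the WAVE header still has to be added exactly as in AudioRecorder above):

import android.media.AudioFormat;
import android.media.AudioRecord;
import android.media.MediaRecorder;
import java.io.FileOutputStream;

public class PollingRecorderSketch {
    private volatile boolean isRecording;

    // Reads PCM in a blocking loop instead of relying on position notifications,
    // which never arrive on some HTC firmwares.
    public void record(final String rawPcmPath, final int sampleRate) {
        final int minSize = AudioRecord.getMinBufferSize(sampleRate,
                AudioFormat.CHANNEL_IN_MONO, AudioFormat.ENCODING_PCM_16BIT);
        final AudioRecord recorder = new AudioRecord(MediaRecorder.AudioSource.MIC,
                sampleRate, AudioFormat.CHANNEL_IN_MONO,
                AudioFormat.ENCODING_PCM_16BIT, minSize * 4);
        isRecording = true;
        new Thread(new Runnable() {
            public void run() {
                byte[] buffer = new byte[minSize];
                try {
                    FileOutputStream out = new FileOutputStream(rawPcmPath);
                    recorder.startRecording();
                    while (isRecording) {
                        int read = recorder.read(buffer, 0, buffer.length); // blocks until data arrives
                        if (read > 0)
                            out.write(buffer, 0, read);
                    }
                    recorder.stop();
                    out.close();
                } catch (Exception e) {
                    e.printStackTrace();
                } finally {
                    recorder.release();
                }
            }
        }).start();
    }

    public void stop() {
        isRecording = false;
    }
}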
The funny thing is that noname Chinese smartphones — apart from brands like Meizu, of course — turn out to be more Android-compatible than the market brands, because they simply don't bother customizing the OS. Or perhaps they just have no money for it.

And so, once again searching for other solutions on this subject — preferably genuinely alternative ones, not wrappers around the two classes already mentioned (I haven't brought up SoundPool, but that class is simply unsuitable for my task due to its limitations) — I stumbled upon Superpowered. The SDK is free, you just need to register, and it is cross-platform, which matters, since I need the application for both Android and iOS.
I watched the demo video on the site and was, to put it mildly, inspired (in reality I was simply blown away), registered, and downloaded the SDK and the sample.
The first thing worth mentioning: Android Studio was a bust, because the project "suddenly" turned out to be built around the NDK, and AS had problems with it that I had absolutely no desire to spend time solving (I never did master importing the project into AS properly). The project opened in Eclipse without problems, though, and launched without problems too, since I had previously downloaded and installed the NDK (pointing AS at its path had not helped much).
I launched the sample on a corpse by today's standards: an HTC HD2 running MIUI based on Android 2.3.5. In everyday use the device is a slug even in completely lightweight applications, even ones that merely display lists with pictures. AnTuTu gives it utterly ridiculous numbers and advises throwing it away, and after the latest update it simply hangs, freezing the phone so hard the battery has to be pulled. On this device you can tell at a glance who knows about the existence of the ViewHolder pattern and who doesn't.
And yet the sample started on this device without problems and runs without so much as a hint of lag! That convinced me that yes, this library really is low-latency. The app itself is fun, too — you can feel like a DJ for a couple of minutes. I spent a week fussing over it like a kid with a new toy.
However, having dug deeper, my joy subsided: under Android everything is limited to this one sample, while for iOS there are far more examples, and to use the library fully you have to write the JNI layer yourself, which I don't know — in fact, I am writing this article in the hope that interested readers will develop this topic. I am no C programmer, but by analogy I added a few more effects to the three that ship with the sample (nothing wildly interesting, but they work), plus one small fix: when you sent the sample to the background and came back, it crashed because playback had not been stopped:
SuperpoweredExample.cpp
 #include "SuperpoweredExample.h" #include <jni.h> #include <stdlib.h> #include <stdio.h> #include <android/log.h> static void playerEventCallbackA(void *clientData, SuperpoweredAdvancedAudioPlayerEvent event, void *value) { if (event == SuperpoweredAdvancedAudioPlayerEvent_LoadSuccess) { SuperpoweredAdvancedAudioPlayer *playerA = *((SuperpoweredAdvancedAudioPlayer **)clientData); playerA->setBpm(126.0f); playerA->setFirstBeatMs(353); playerA->setPosition(playerA->firstBeatMs, false, false); }; } static void playerEventCallbackB(void *clientData, SuperpoweredAdvancedAudioPlayerEvent event, void *value) { if (event == SuperpoweredAdvancedAudioPlayerEvent_LoadSuccess) { SuperpoweredAdvancedAudioPlayer *playerB = *((SuperpoweredAdvancedAudioPlayer **)clientData); playerB->setBpm(123.0f); playerB->setFirstBeatMs(40); playerB->setPosition(playerB->firstBeatMs, false, false); }; } static void openSLESCallback(SLAndroidSimpleBufferQueueItf caller, void *pContext) { ((SuperpoweredExample *)pContext)->process(caller); } static const SLboolean requireds[2] = { SL_BOOLEAN_TRUE, SL_BOOLEAN_TRUE }; SuperpoweredExample::SuperpoweredExample(const char *path, int *params) : currentBuffer(0), buffersize(params[5]), activeFx(0), crossValue(0.0f), volB(0.0f), volA(1.0f * headroom) { pthread_mutex_init(&mutex, NULL); // This will keep our player volumes and playback states in sync. for (int n = 0; n < NUM_BUFFERS; n++) outputBuffer[n] = (float *)memalign(16, (buffersize + 16) * sizeof(float) * 2); unsigned int samplerate = params[4]; playerA = new SuperpoweredAdvancedAudioPlayer(&playerA , playerEventCallbackA, samplerate, 0); playerA->open(path, params[0], params[1]); playerB = new SuperpoweredAdvancedAudioPlayer(&playerB, playerEventCallbackB, samplerate, 0); playerB->open(path, params[2], params[3]); playerA->syncMode = playerB->syncMode = SuperpoweredAdvancedAudioPlayerSyncMode_TempoAndBeat; roll = new SuperpoweredRoll(samplerate); filter = new SuperpoweredFilter(SuperpoweredFilter_Resonant_Lowpass, samplerate); flanger = new SuperpoweredFlanger(samplerate); whoosh = new SuperpoweredWhoosh(samplerate); gate = new SuperpoweredGate(samplerate); echo = new SuperpoweredEcho(samplerate); reverb = new SuperpoweredReverb(samplerate); //stretch = new SuperpoweredTimeStretching(samplerate); mixer = new SuperpoweredStereoMixer(); // Create the OpenSL ES engine. slCreateEngine(&openSLEngine, 0, NULL, 0, NULL, NULL); (*openSLEngine)->Realize(openSLEngine, SL_BOOLEAN_FALSE); SLEngineItf openSLEngineInterface = NULL; (*openSLEngine)->GetInterface(openSLEngine, SL_IID_ENGINE, &openSLEngineInterface); // Create the output mix. (*openSLEngineInterface)->CreateOutputMix(openSLEngineInterface, &outputMix, 0, NULL, NULL); (*outputMix)->Realize(outputMix, SL_BOOLEAN_FALSE); SLDataLocator_OutputMix outputMixLocator = { SL_DATALOCATOR_OUTPUTMIX, outputMix }; // Create the buffer queue player. 
SLDataLocator_AndroidSimpleBufferQueue bufferPlayerLocator = { SL_DATALOCATOR_ANDROIDSIMPLEBUFFERQUEUE, NUM_BUFFERS }; SLDataFormat_PCM bufferPlayerFormat = { SL_DATAFORMAT_PCM, 2, samplerate * 1000, SL_PCMSAMPLEFORMAT_FIXED_16, SL_PCMSAMPLEFORMAT_FIXED_16, SL_SPEAKER_FRONT_LEFT | SL_SPEAKER_FRONT_RIGHT, SL_BYTEORDER_LITTLEENDIAN }; SLDataSource bufferPlayerSource = { &bufferPlayerLocator, &bufferPlayerFormat }; const SLInterfaceID bufferPlayerInterfaces[1] = { SL_IID_BUFFERQUEUE }; SLDataSink bufferPlayerOutput = { &outputMixLocator, NULL }; (*openSLEngineInterface)->CreateAudioPlayer(openSLEngineInterface, &bufferPlayer, &bufferPlayerSource, &bufferPlayerOutput, 1, bufferPlayerInterfaces, requireds); (*bufferPlayer)->Realize(bufferPlayer, SL_BOOLEAN_FALSE); // Initialize and start the buffer queue. (*bufferPlayer)->GetInterface(bufferPlayer, SL_IID_BUFFERQUEUE, &bufferQueue); (*bufferQueue)->RegisterCallback(bufferQueue, openSLESCallback, this); memset(outputBuffer[0], 0, buffersize * 4); memset(outputBuffer[1], 0, buffersize * 4); (*bufferQueue)->Enqueue(bufferQueue, outputBuffer[0], buffersize * 4); (*bufferQueue)->Enqueue(bufferQueue, outputBuffer[1], buffersize * 4); SLPlayItf bufferPlayerPlayInterface; (*bufferPlayer)->GetInterface(bufferPlayer, SL_IID_PLAY, &bufferPlayerPlayInterface); (*bufferPlayerPlayInterface)->SetPlayState(bufferPlayerPlayInterface, SL_PLAYSTATE_PLAYING); } SuperpoweredExample::~SuperpoweredExample() { for (int n = 0; n < NUM_BUFFERS; n++) free(outputBuffer[n]); delete playerA; delete playerB; delete mixer; pthread_mutex_destroy(&mutex); } void SuperpoweredExample::onPlayPause(bool play) { pthread_mutex_lock(&mutex); if (!play) { playerA->pause(); playerB->pause(); } else { bool masterIsA = (crossValue <= 0.5f); playerA->play(!masterIsA); playerB->play(masterIsA); }; pthread_mutex_unlock(&mutex); } void SuperpoweredExample::onCrossfader(int value) { pthread_mutex_lock(&mutex); crossValue = float(value) * 0.01f; if (crossValue < 0.01f) { volA = 1.0f * headroom; volB = 0.0f; } else if (crossValue > 0.99f) { volA = 0.0f; volB = 1.0f * headroom; } else { // constant power curve volA = cosf(M_PI_2 * crossValue) * headroom; volB = cosf(M_PI_2 * (1.0f - crossValue)) * headroom; }; pthread_mutex_unlock(&mutex); } void SuperpoweredExample::onFxSelect(int value) { __android_log_print(ANDROID_LOG_VERBOSE, "SuperpoweredExample", "FXSEL %i", value); activeFx = value; } void SuperpoweredExample::onFxOff() { filter->enable(false); roll->enable(false); flanger->enable(false); whoosh->enable(false); gate->enable(false); echo->enable(false); reverb->enable(false); } #define MINFREQ 60.0f #define MAXFREQ 20000.0f static inline float floatToFrequency(float value) { if (value > 0.97f) return MAXFREQ; if (value < 0.03f) return MINFREQ; value = powf(10.0f, (value + ((0.4f - fabsf(value - 0.4f)) * 0.3f)) * log10f(MAXFREQ - MINFREQ)) + MINFREQ; return value < MAXFREQ ? 
value : MAXFREQ; } void SuperpoweredExample::onFxValue(int ivalue) { float value = float(ivalue) * 0.01f; switch (activeFx) { // filter case 1: filter->setResonantParameters(floatToFrequency(1.0f - value), 0.2f); filter->enable(true); flanger->enable(false); roll->enable(false); whoosh->enable(false); gate->enable(false); echo->enable(false); reverb->enable(false); break; // roll case 2: if (value > 0.8f) roll->beats = 0.0625f; else if (value > 0.6f) roll->beats = 0.125f; else if (value > 0.4f) roll->beats = 0.25f; else if (value > 0.2f) roll->beats = 0.5f; else roll->beats = 1.0f; roll->enable(true); filter->enable(false); flanger->enable(false); whoosh->enable(false); gate->enable(false); echo->enable(false); reverb->enable(false); break; // echo case 3: flanger->enable(false); filter->enable(false); roll->enable(false); whoosh->enable(false); gate->enable(false); echo->setMix(value); echo->enable(true); reverb->enable(false); break; // whoosh case 4: flanger->enable(false); filter->enable(false); roll->enable(false); whoosh->setFrequency(floatToFrequency(1.0f - value)); whoosh->enable(true); gate->enable(false); echo->enable(false); reverb->enable(false); break; // gate case 5: flanger->enable(false); filter->enable(false); roll->enable(false); whoosh->enable(false); echo->enable(false); if (value > 0.8f) gate->beats = 0.0625f; else if (value > 0.6f) gate->beats = 0.125f; else if (value > 0.4f) gate->beats = 0.25f; else if (value > 0.2f) gate->beats = 0.5f; else gate->beats = 1.0f; gate->enable(true); reverb->enable(false); break; // reverb case 6: flanger->enable(false); filter->enable(false); roll->enable(false); whoosh->enable(false); echo->enable(false); gate->enable(false); reverb->enable(true); reverb->setRoomSize(value); break; // flanger default: flanger->setWet(value); flanger->enable(true); filter->enable(false); roll->enable(false); whoosh->enable(false); gate->enable(false); echo->enable(false); }; } void SuperpoweredExample::process(SLAndroidSimpleBufferQueueItf caller) { pthread_mutex_lock(&mutex); float *stereoBuffer = outputBuffer[currentBuffer]; bool masterIsA = (crossValue <= 0.5f); float masterBpm = masterIsA ? playerA->currentBpm : playerB->currentBpm; double msElapsedSinceLastBeatA = playerA->msElapsedSinceLastBeat; // When playerB needs it, playerA has already stepped this value, so save it now. bool silence = !playerA->process(stereoBuffer, false, buffersize, volA, masterBpm, playerB->msElapsedSinceLastBeat); if (playerB->process(stereoBuffer, !silence, buffersize, volB, masterBpm, msElapsedSinceLastBeatA)) silence = false; roll->bpm = flanger->bpm = gate->bpm = masterBpm; // Syncing fx is one line. if (roll->process(silence ? NULL : stereoBuffer, stereoBuffer, buffersize) && silence) silence = false; if (!silence) { filter->process(stereoBuffer, stereoBuffer, buffersize); flanger->process(stereoBuffer, stereoBuffer, buffersize); whoosh->process(stereoBuffer, stereoBuffer, buffersize); gate->process(stereoBuffer, stereoBuffer, buffersize); echo->process(stereoBuffer, stereoBuffer, buffersize); reverb->process(stereoBuffer, stereoBuffer, buffersize); }; pthread_mutex_unlock(&mutex); // The stereoBuffer is ready now, let's put the finished audio into the requested buffers. 
if (silence) memset(stereoBuffer, 0, buffersize * 4); else SuperpoweredStereoMixer::floatToShortInt(stereoBuffer, (short int *)stereoBuffer, buffersize); (*caller)->Enqueue(caller, stereoBuffer, buffersize * 4); if (currentBuffer < NUM_BUFFERS - 1) currentBuffer++; else currentBuffer = 0; } extern "C" { JNIEXPORT void Java_com_example_SuperpoweredExample_MainActivity_SuperpoweredExample(JNIEnv *javaEnvironment, jobject self, jstring apkPath, jlongArray offsetAndLength); JNIEXPORT void Java_com_example_SuperpoweredExample_MainActivity_onPlayPause(JNIEnv *javaEnvironment, jobject self, jboolean play); JNIEXPORT void Java_com_example_SuperpoweredExample_MainActivity_onCrossfader(JNIEnv *javaEnvironment, jobject self, jint value); JNIEXPORT void Java_com_example_SuperpoweredExample_MainActivity_onFxSelect(JNIEnv *javaEnvironment, jobject self, jint value); JNIEXPORT void Java_com_example_SuperpoweredExample_MainActivity_onFxOff(JNIEnv *javaEnvironment, jobject self); JNIEXPORT void Java_com_example_SuperpoweredExample_MainActivity_onFxValue(JNIEnv *javaEnvironment, jobject self, jint value); } static SuperpoweredExample *example = NULL; // Android is not passing more than 2 custom parameters, so we had to pack file offsets and lengths into an array. JNIEXPORT void Java_com_example_SuperpoweredExample_MainActivity_SuperpoweredExample(JNIEnv *javaEnvironment, jobject self, jstring apkPath, jlongArray params) { // Convert the input jlong array to a regular int array. jlong *longParams = javaEnvironment->GetLongArrayElements(params, JNI_FALSE); int arr[6]; for (int n = 0; n < 6; n++) arr[n] = longParams[n]; javaEnvironment->ReleaseLongArrayElements(params, longParams, JNI_ABORT); const char *path = javaEnvironment->GetStringUTFChars(apkPath, JNI_FALSE); example = new SuperpoweredExample(path, arr); javaEnvironment->ReleaseStringUTFChars(apkPath, path); } JNIEXPORT void Java_com_example_SuperpoweredExample_MainActivity_onPlayPause(JNIEnv *javaEnvironment, jobject self, jboolean play) { example->onPlayPause(play); } JNIEXPORT void Java_com_example_SuperpoweredExample_MainActivity_onCrossfader(JNIEnv *javaEnvironment, jobject self, jint value) { example->onCrossfader(value); } JNIEXPORT void Java_com_example_SuperpoweredExample_MainActivity_onFxSelect(JNIEnv *javaEnvironment, jobject self, jint value) { example->onFxSelect(value); } JNIEXPORT void Java_com_example_SuperpoweredExample_MainActivity_onFxOff(JNIEnv *javaEnvironment, jobject self) { example->onFxOff(); } JNIEXPORT void Java_com_example_SuperpoweredExample_MainActivity_onFxValue(JNIEnv *javaEnvironment, jobject self, jint value) { example->onFxValue(value); } 
SuperpoweredExample.h
#ifndef Header_SuperpoweredExample
#define Header_SuperpoweredExample

#include <SLES/OpenSLES.h>
#include <SLES/OpenSLES_Android.h>
#include <math.h>
#include <pthread.h>
#include "SuperpoweredAdvancedAudioPlayer.h"
#include "SuperpoweredFilter.h"
#include "SuperpoweredRoll.h"
#include "SuperpoweredFlanger.h"
#include "SuperpoweredMixer.h"
#include "SuperpoweredWhoosh.h"
#include "SuperpoweredGate.h"
#include "SuperpoweredEcho.h"
#include "SuperpoweredReverb.h"
#include "SuperpoweredTimeStretching.h"

#define NUM_BUFFERS 2
#define HEADROOM_DECIBEL 3.0f
static const float headroom = powf(10.0f, -HEADROOM_DECIBEL * 0.025);

class SuperpoweredExample {
public:
    SuperpoweredExample(const char *path, int *params);
    ~SuperpoweredExample();

    void process(SLAndroidSimpleBufferQueueItf caller);
    void onPlayPause(bool play);
    void onCrossfader(int value);
    void onFxSelect(int value);
    void onFxOff();
    void onFxValue(int value);

private:
    SLObjectItf openSLEngine, outputMix, bufferPlayer;
    SLAndroidSimpleBufferQueueItf bufferQueue;
    SuperpoweredAdvancedAudioPlayer *playerA, *playerB;
    SuperpoweredRoll *roll;
    SuperpoweredFilter *filter;
    SuperpoweredFlanger *flanger;
    SuperpoweredStereoMixer *mixer;
    SuperpoweredWhoosh *whoosh;
    SuperpoweredGate *gate;
    SuperpoweredEcho *echo;
    SuperpoweredReverb *reverb;
    SuperpoweredTimeStretching *stretch;
    unsigned char activeFx;
    float crossValue, volA, volB;
    pthread_mutex_t mutex;
    float *outputBuffer[NUM_BUFFERS];
    int currentBuffer, buffersize;
};

#endif
MainActivity
package com.example.SuperpoweredExample;

import java.io.IOException;

import android.annotation.SuppressLint;
import android.app.Activity;
import android.content.Context;
import android.content.res.AssetFileDescriptor;
import android.media.AudioManager;
import android.os.Bundle;
import android.view.View;
import android.widget.Button;
import android.widget.RadioButton;
import android.widget.RadioGroup;
import android.widget.RadioGroup.OnCheckedChangeListener;
import android.widget.SeekBar;
import android.widget.SeekBar.OnSeekBarChangeListener;

public class MainActivity extends Activity {

    boolean playing = false;
    RadioGroup group1;
    RadioGroup group2;

    // A single listener for both fx radio groups: only one of the two groups
    // may have a checked item at a time, so checking in one clears the other.
    OnCheckedChangeListener rgCheckedChanged = new OnCheckedChangeListener() {
        @Override
        public void onCheckedChanged(RadioGroup group, int checkedId) {
            RadioButton checkedRadioButton = (RadioButton) group.findViewById(checkedId);
            final int delta = group == group2 ? 4 : 0;
            if (group == group1) {
                group2.setOnCheckedChangeListener(null);
                group2.clearCheck();
                group2.setOnCheckedChangeListener(rgCheckedChanged);
            } else {
                group1.setOnCheckedChangeListener(null);
                group1.clearCheck();
                group1.setOnCheckedChangeListener(rgCheckedChanged);
            }
            onFxSelect(group.indexOfChild(checkedRadioButton) + delta);
        }
    };

    @SuppressLint("NewApi")
    protected void onCreate(Bundle savedInstanceState) {
        super.onCreate(savedInstanceState);
        setContentView(R.layout.main);

        // Get the device's sample rate and buffer size to enable low-latency Android audio output, if available.
        AudioManager audioManager = (AudioManager) this.getSystemService(Context.AUDIO_SERVICE);
        String samplerateString = null, buffersizeString = null;
        try {
            samplerateString = audioManager.getProperty(AudioManager.PROPERTY_OUTPUT_SAMPLE_RATE);
        } catch (NoSuchMethodError ignored) {}
        try {
            buffersizeString = audioManager.getProperty(AudioManager.PROPERTY_OUTPUT_FRAMES_PER_BUFFER);
        } catch (NoSuchMethodError ignored) {}
        if (samplerateString == null) samplerateString = "44100";
        if (buffersizeString == null) buffersizeString = "512";

        // Files under res/raw are not compressed, just copied into the APK.
        // Get the offset and length to know where our files are located.
        AssetFileDescriptor fd0 = getResources().openRawResourceFd(R.raw.lycka),
                fd1 = getResources().openRawResourceFd(R.raw.nuyorica);
        long[] params = {
                fd0.getStartOffset(), fd0.getLength(),
                fd1.getStartOffset(), fd1.getLength(),
                Integer.parseInt(samplerateString),
                Integer.parseInt(buffersizeString)
        };
        try { fd0.getParcelFileDescriptor().close(); } catch (IOException e) {}
        try { fd1.getParcelFileDescriptor().close(); } catch (IOException e) {}

        // Arguments: path to the APK file, offset and length of the two resource files, sample rate, audio buffer size.
        SuperpoweredExample(getPackageResourcePath(), params);

        // crossfader events
        final SeekBar crossfader = (SeekBar) findViewById(R.id.crossFader);
        crossfader.setOnSeekBarChangeListener(new OnSeekBarChangeListener() {
            public void onProgressChanged(SeekBar seekBar, int progress, boolean fromUser) {
                onCrossfader(progress);
            }
            public void onStartTrackingTouch(SeekBar seekBar) {}
            public void onStopTrackingTouch(SeekBar seekBar) {}
        });

        // fx fader events
        final SeekBar fxfader = (SeekBar) findViewById(R.id.fxFader);
        fxfader.setOnSeekBarChangeListener(new OnSeekBarChangeListener() {
            public void onProgressChanged(SeekBar seekBar, int progress, boolean fromUser) {
                onFxValue(progress);
            }
            public void onStartTrackingTouch(SeekBar seekBar) {
                onFxValue(seekBar.getProgress());
            }
            public void onStopTrackingTouch(SeekBar seekBar) {
                onFxOff();
            }
        });

        group1 = (RadioGroup) findViewById(R.id.radioGroup1);
        group1.setOnCheckedChangeListener(rgCheckedChanged);
        group2 = (RadioGroup) findViewById(R.id.radioGroup2);
        group2.setOnCheckedChangeListener(rgCheckedChanged);
    }

    // Play/pause.
    public void SuperpoweredExample_PlayPause(View button) {
        playing = !playing;
        onPlayPause(playing);
        Button b = (Button) findViewById(R.id.playPause);
        b.setText(playing ? "Pause" : "Play");
    }

    private native void SuperpoweredExample(String apkPath, long[] offsetAndLength);
    private native void onPlayPause(boolean play);
    private native void onCrossfader(int value);
    private native void onFxSelect(int value);
    private native void onFxOff();
    private native void onFxValue(int value);

    static {
        System.loadLibrary("SuperpoweredExample");
    }

    @Override
    protected void onDestroy() {
        super.onDestroy();
        // The fix mentioned above: stop playback when leaving,
        // otherwise returning to the sample crashed it.
        onPlayPause(false);
    }
}
main.xml
<?xml version="1.0" encoding="utf-8"?>
<RelativeLayout xmlns:android="http://schemas.android.com/apk/res/android"
    xmlns:tools="http://schemas.android.com/tools"
    android:layout_width="match_parent"
    android:layout_height="match_parent" >

    <Button
        android:id="@+id/playPause"
        android:layout_width="wrap_content"
        android:layout_height="wrap_content"
        android:layout_alignParentTop="true"
        android:layout_centerHorizontal="true"
        android:layout_marginLeft="5dp"
        android:layout_marginRight="5dp"
        android:layout_marginTop="15dp"
        android:onClick="SuperpoweredExample_PlayPause"
        android:text="@string/play" />

    <SeekBar
        android:id="@+id/crossFader"
        android:layout_width="match_parent"
        android:layout_height="wrap_content"
        android:layout_alignParentLeft="true"
        android:layout_below="@+id/playPause"
        android:layout_marginLeft="5dp"
        android:layout_marginRight="5dp"
        android:layout_marginTop="15dp" />

    <RadioGroup
        android:id="@+id/radioGroup1"
        android:layout_width="wrap_content"
        android:layout_height="wrap_content"
        android:layout_below="@+id/crossFader"
        android:layout_centerHorizontal="true"
        android:layout_marginTop="15dp"
        android:orientation="horizontal" >

        <RadioButton
            android:id="@+id/radio0"
            android:layout_width="wrap_content"
            android:layout_height="wrap_content"
            android:checked="true"
            android:text="@string/flanger" />

        <RadioButton
            android:id="@+id/radio1"
            android:layout_width="wrap_content"
            android:layout_height="wrap_content"
            android:text="@string/filter" />

        <RadioButton
            android:id="@+id/radio2"
            android:layout_width="wrap_content"
            android:layout_height="wrap_content"
            android:text="@string/roll" />

        <RadioButton
            android:id="@+id/radio3"
            android:layout_width="wrap_content"
            android:layout_height="wrap_content"
            android:text="@string/echo" />
    </RadioGroup>

    <RadioGroup
        android:id="@+id/radioGroup2"
        android:layout_width="wrap_content"
        android:layout_height="wrap_content"
        android:layout_below="@+id/radioGroup1"
        android:layout_centerHorizontal="true"
        android:layout_marginTop="5dp"
        android:orientation="horizontal" >

        <RadioButton
            android:id="@+id/radio4"
            android:layout_width="wrap_content"
            android:layout_height="wrap_content"
            android:text="@string/whoosh" />

        <RadioButton
            android:id="@+id/radio5"
            android:layout_width="wrap_content"
            android:layout_height="wrap_content"
            android:text="@string/gate" />

        <RadioButton
            android:id="@+id/radio6"
            android:layout_width="wrap_content"
            android:layout_height="wrap_content"
            android:text="@string/reverb" />
    </RadioGroup>

    <SeekBar
        android:id="@+id/fxFader"
        android:layout_width="match_parent"
        android:layout_height="wrap_content"
        android:layout_alignParentLeft="true"
        android:layout_below="@+id/radioGroup2"
        android:layout_marginLeft="5dp"
        android:layout_marginRight="5dp"
        android:layout_marginTop="15dp" />

</RelativeLayout>

So, what is still missing and needs to be done:
— recording from the microphone with saving to WAVE or a compressed format (requires writing the JNI glue; for iOS it already exists);
— playback of recorded files with effects applied (again requires JNI; for iOS it exists);
— figuring out whether SuperpoweredAdvancedAudioPlayer can do a pitch shift (it appears so);
— figuring out what else SuperpoweredAdvancedAudioPlayer can and cannot do.

What the SDK itself lacks, in my opinion:
— conversion of the resulting WAV file to MP3; apparently LAME will have to be bolted on for that.
It also remains to be seen:
— how SuperpoweredAdvancedAudioPlayer behaves on Samsung and HTC devices;
— whether the SDK will be developed further.

In short: whoever is interested, knows JNI and wants to dig into this SDK — the field is wide open. I, for one, would be glad to see this topic developed further!

Source: https://habr.com/ru/post/246491/

