
Android NDK basics, using OpenAL as an example

Good day, dear Habr reader!



I have recently been developing applications for Android, games in particular. For one project I had to work with the Android NDK. Since it is impossible to cover all the difficulties and nuances of working with native code in one article, I decided to write a small introduction to the NDK instead.

And so that the article is interesting not only to beginners, I will show how to work with OpenAL and with the WAV and OGG formats.






Introduction



There is no need to write much about setting up the environment, I think. Regardless of the IDE you develop in (Eclipse, IntelliJ IDEA, etc.), the setup is quite simple. You will need:

  1. The Android NDK itself.
  2. Cygwin, to build under Windows.
  3. The CDT plugin for Eclipse.


Naturally, you should already have the ADT and JDK installed.



Why do you need NDK?






Calling C++ code from Java



In general, everything is quite simple. The main steps are:

  1. Create the C++ files and define the methods to export.
  2. Create the .mk files.
  3. Generate the library.
  4. Connect the library in Java.






I will not go into detail about the makefiles (.mk); you can read about them here. In addition, there is a good article on Habr about working with .mk files by BubaVV.
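For reference, a minimal Android.mk for a project like this might look as follows. This is only a sketch: the module name matches the `System.loadLibrary("AndroidNDK")` call shown later, but the source file names and prebuilt library names are illustrative, not taken from the article's project.

```makefile
# Android.mk — minimal sketch; source and library names are illustrative
LOCAL_PATH := $(call my-dir)

include $(CLEAR_VARS)

LOCAL_MODULE    := AndroidNDK          # produces libAndroidNDK.so
LOCAL_SRC_FILES := main.cpp OALWav.cpp OALOgg.cpp
LOCAL_LDLIBS    := -llog -landroid     # logging + AAssetManager
LOCAL_STATIC_LIBRARIES := openal tremor

include $(BUILD_SHARED_LIBRARY)
```

The `-landroid` flag is what gives you access to the native asset manager used below.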



You can read about the libraries that ship with the NDK here.



Creating C++ files


You need to define the methods to export, which we will call from Java. As an example, when the application starts we will load music into OpenAL. To do this, we define the method:

JNIEXPORT void JNICALL Java_ru_suvitruf_androidndk_tutorial4_MainActivity_loadAudio(JNIEnv *pEnv, jobject pThis, jobject pNativeCallListener, jobject assetManager); 




I write all this by hand, but there is a handy utility for generating these headers automatically: javah.



Then we will need to implement it, but more on that later.

A little about the name
A few words about method names. Java_ is a required prefix. ru_suvitruf_androidndk_tutorial4 follows because our package is ru.suvitruf.androidndk.tutorial4, and after that come the class name and method name on the Java side. Every function takes a JNIEnv* argument, an interface for working with Java: through it you can call Java methods and create Java objects. The second required parameter is jobject or jclass, depending on whether the method is static. If the method is static, the argument has type jclass (a reference to the class in which the method is declared); if it is not static, it is a jobject, a reference to the object on which the method was called.





Connecting the library in Java


After generating the library, you need to connect it to Java.

 static { System.loadLibrary("AndroidNDK"); } 




And define a method with the same name as in the C++ code:

 native public void loadAudio(NativeCalls nativeCallListener, AssetManager mng); 




And call it:

 loadAudio(activity, activity.getResources().getAssets()); 




Calling Java from C++



This is a little more complicated, but not that scary. What we need to do:

  1. Define the class method (in Java) that we want to call.
  2. Get a handle to the desired class (in C++).
  3. Describe the method signature.
  4. Get the method ID (a reference to the method).
  5. Call the method on the desired object.




Of course, you can simply define a method on some class, but it is better to use interfaces. Then we do not have to change the native code if we want to work with another class.



As an example, create an interface with just one method:

 public interface NativeCalls {
     public void sendLog(String result);
 }




CalledFromWrongThreadException and proper implementation of the interface
Oh, those threads. The problem is that you cannot touch a view from another thread. Therefore, the entire implementation of the interface will look something like this:

 protected Handler handler = new Handler() {
     @Override
     public void handleMessage(Message msg) {
         showResult(msg.getData().getString("result"));
     }
 };

 public void showResult(String result){
     ((TextView) findViewById(R.id.log)).setText(
         ((TextView) findViewById(R.id.log)).getText() + result + "\n");
 }

 // Called from native code
 @Override
 public void sendLog(String result){
     Message msg = new Message();
     Bundle data = new Bundle();
     data.putString("result", result);
     msg.setData(data);
     handler.sendMessage(msg);
 }






There might not have been any thread problems at all, but in our case, when the application starts, we create a separate thread to load the resources, so the question is relevant for us.



On the native C++ side, the Java interface will correspond to the following class:

NativeCallListener
 class NativeCallListener {
 public:
     NativeCallListener(JNIEnv* pJniEnv, jobject pWrapperInstance);
     NativeCallListener() {}

     // Calls the Java method
     void sendLog(jobject log);
     // Releases the resources
     void destroy();
     ~NativeCallListener(){ }

     void loadAudio();
     //void play();
     //void playOGG();

     ALCdevice* device;
     ALCcontext* context;
 private:
     JNIEnv* getJniEnv();
     // ID of the Java method
     jmethodID sendLogID;
     // Global reference to the Java object
     jobject mObjectRef;
     JavaVM* mJVM;
     ALuint soundWAV;
     ALuint soundOGG;
     void load();
     void clean();
 };




Now I can show the implementation of the loadAudio method declared in the first part of the article.

 JNIEXPORT void JNICALL Java_ru_suvitruf_androidndk_tutorial4_MainActivity_loadAudio
         (JNIEnv *pEnv, jobject pThis, jobject pNativeCallListener, jobject assetManager) {
     listener = NativeCallListener(pEnv, pNativeCallListener);
     mgr = AAssetManager_fromJava(pEnv, assetManager);
     listener.loadAudio();
 }




In the class constructor we save a global reference to the object and get a reference to its method:

 NativeCallListener::NativeCallListener(JNIEnv* pJniEnv, jobject pWrappedInstance) {
     pJniEnv->GetJavaVM(&mJVM);
     mObjectRef = pJniEnv->NewGlobalRef(pWrappedInstance);
     jclass cl = pJniEnv->GetObjectClass(pWrappedInstance);
     // The ID of the sendLog method declared on the Java side
     sendLogID = pJniEnv->GetMethodID(cl, "sendLog", "(Ljava/lang/String;)V");
 }




Now we can call the Java method by writing:

 void NativeCallListener::sendLog(jobject log) {
     JNIEnv* jniEnv = getJniEnv();
     // sendLog is declared to return void, so we call it via CallVoidMethod
     jniEnv->CallVoidMethod(mObjectRef, sendLogID, log);
 }




AAssetManager



Previously, I used the open-source libzip library to work with application resources.

Starting with the Android 2.3 version of the API, the NDK includes a wonderful class for working with the assets directory directly from C++ code.

The methods are similar to the file functions from stdio.h: AAssetManager_open instead of fopen, AAsset_read instead of fread, AAsset_close instead of fclose.



I wrote a small wrapper for it. I will not paste its code here, since on the whole working with it is the same as working with a normal FILE.



Work with OpenAL



The article is already quite big, and the most interesting part has not even started. Please forgive me for that...



Preparation


First you need to build OpenAL. That is enough to work with WAV, but we also want to work with OGG, and OGG needs a decoder: Tremor.



For the sounds I wrote wrappers with the necessary methods. There is no point in listing all the code here, so I will cover the most interesting part, namely loading.



Reading a WAV file


First, we need to describe a structure for the header:

BasicWAVEHeader
 typedef struct {
     char riff[4];              // 'RIFF'
     unsigned int riffSize;
     char wave[4];              // 'WAVE'
     char fmt[4];               // 'fmt '
     unsigned int fmtSize;
     unsigned short format;
     unsigned short channels;
     unsigned int samplesPerSec;
     unsigned int bytesPerSec;
     unsigned short blockAlign;
     unsigned short bitsPerSample;
     char data[4];              // 'data'
     unsigned int dataSize;
 } BasicWAVEHeader;






Now we read:

 void OALWav::load(AAssetManager *mgr, const char* filename){
     this->filename = filename;
     this->data = 0;
     // Read the file
     this->data = this->readWAVFull(mgr, &header);
     // Determine the format
     getFormat();
     // Create the OpenAL buffer
     createBufferFromWave(data);
     source = 0;
     alGenSources(1, &source);
     alSourcei(source, AL_BUFFER, buffer);
 }




readWAVFull
 char* OALWav::readWAVFull(AAssetManager *mgr, BasicWAVEHeader* header){
     char* buffer = 0;
     AAssetFile f = AAssetFile(mgr, filename);
     if (f.null()) {
         LOGE("no file %s in readWAV",filename);
         return 0;
     }
     int res = f.read(header,sizeof(BasicWAVEHeader),1);
     if(res){
         if (!(
             // Check the header magic values.
             // Many converters write garbage here,
             // so a file that plays fine elsewhere may fail this check =/
             memcmp("RIFF",header->riff,4) ||
             memcmp("WAVE",header->wave,4) ||
             memcmp("fmt ",header->fmt,4) ||
             memcmp("data",header->data,4)
         )){
             buffer = (char*)malloc(header->dataSize);
             if (buffer){
                 if(f.read(buffer,header->dataSize,1)){
                     f.close();
                     return buffer;
                 }
                 free(buffer);
             }
         }
     }
     f.close();
     return 0;
 }




It is worth saying a few words about WAV. Sometimes a file plays perfectly well on a PC, yet errors occur when working with OpenAL. The reason is broken headers. I have met many converters that wrote some kind of nonsense into the headers, usually into dataSize. So why does it fail here but still play on the PC?

The actual audio data is stored right after the header, and its size is in dataSize. If there is something wrong with that field, there will be errors. But the size can simply be computed head-on: dataSize = fileSize - headerSize. So I assume players take the data size by subtraction rather than trusting the header.



Working with WAV seems easy, since the format is not compressed. With .ogg everything is more complicated.



Reading an Ogg file


What is special about Ogg compared to WAV? It is a compressed format, so before writing the data into the OpenAL buffer we need to decode it.

The catch is that by default Vorbis reads its stream from a FILE, so we need to provide our own callback methods for working with the in-memory data:



callbacks
 static size_t read_func(void* ptr, size_t size, size_t nmemb, void* datasource) {
     unsigned int uiBytes = Min(suiSize - suiCurrPos, (unsigned int)nmemb * (unsigned int)size);
     memcpy(ptr, (unsigned char*)datasource + suiCurrPos, uiBytes);
     suiCurrPos += uiBytes;
     return uiBytes;
 }

 static int seek_func(void* datasource, ogg_int64_t offset, int whence) {
     if (whence == SEEK_SET)
         suiCurrPos = (unsigned int)offset;
     else if (whence == SEEK_CUR)
         suiCurrPos = suiCurrPos + (unsigned int)offset;
     else if (whence == SEEK_END)
         suiCurrPos = suiSize;
     return 0;
 }

 static int close_func(void* datasource) {
     return 0;
 }

 static long tell_func(void* datasource) {
     return (long)suiCurrPos;
 }






Now we can do the reading itself:

Reading ogg
 void OALOgg::getInfo(unsigned int uiOggSize, char* pvOggBuffer){
     // Set up the callbacks
     ov_callbacks callbacks;
     callbacks.read_func = &read_func;
     callbacks.seek_func = &seek_func;
     callbacks.close_func = &close_func;
     callbacks.tell_func = &tell_func;
     suiCurrPos = 0;
     suiSize = uiOggSize;
     int iRet = ov_open_callbacks(pvOggBuffer, &vf, NULL, 0, callbacks);
     // Stream info
     vi = ov_info(&vf, -1);
     uiPCMSamples = (unsigned int)ov_pcm_total(&vf, -1);
 }

 void * OALOgg::ConvertOggToPCM(unsigned int uiOggSize, char* pvOggBuffer) {
     if(suiSize == 0){
         getInfo(uiOggSize, pvOggBuffer);
         current_section = 0;
         iRead = 0;
         uiCurrPos = 0;
     }
     void* pvPCMBuffer = malloc(uiPCMSamples * vi->channels * sizeof(short));
     // Decode
     do {
         iRead = ov_read(&vf, (char*)pvPCMBuffer + uiCurrPos, 4096, &current_section);
         uiCurrPos += (unsigned int)iRead;
     } while (iRead != 0);
     return pvPCMBuffer;
 }

 void OALOgg::load(AAssetManager *mgr, const char* filename){
     this->filename = filename;
     char* buf = 0;
     AAssetFile f = AAssetFile(mgr, filename);
     if (f.null()) {
         LOGE("no file %s in readOgg",filename);
         return;
     }
     buf = (char*)malloc(f.size());
     if (buf){
         if(!f.read(buf,f.size(),1)){
             free(buf);
             f.close();
             return;
         }
     }
     char * data = (char *)ConvertOggToPCM(f.size(),buf);
     f.close();
     if (vi->channels == 1)
         format = AL_FORMAT_MONO16;
     else
         format = AL_FORMAT_STEREO16;
     alGenBuffers(1,&buffer);
     alBufferData(buffer,format,data,uiPCMSamples * vi->channels * sizeof(short),vi->rate);
     source = 0;
     alGenSources(1, &source);
     alSourcei(source, AL_BUFFER, buffer);
 }






When the application starts, we call the C++ method loadAudio, which calls load on NativeCallListener, which loads the audio:

 void NativeCallListener::load(){
     oalContext = new OALContext();
     //sound = new OALOgg();
     sound = new OALWav();
     char * fileName = new char[64];
     strcpy(fileName, "audio/industrial_suspense1.wav");
     //strcpy(fileName, "audio/Katatonia - Deadhouse_(piano version).ogg");
     sound->load(mgr,fileName);
 }


I have a base sound class, OALSound. The classes for working with WAV and Ogg inherit from it; all they need to do is provide a load implementation by overriding the base class's pure virtual method virtual void load(AAssetManager *mgr, const char* filename) = 0;

This allows you to unify the work with sounds.



Conclusion



Once again, I apologize that the article turned out quite voluminous; I cannot imagine how to write it otherwise. Using the presented implementation, you can work with sound regardless of the platform, say, if you are writing a game engine for iOS and Android.



There is one nuance here: the audio is loaded in its entirety. This solution is therefore excellent for sound effects, but not for music. Imagine how much memory an unpacked .ogg song would consume. So it would be great if someone took this solution as a base and implemented audio playback with streaming, rather than loading everything into one buffer.



Sources


The project is written in Eclipse. The sources can be viewed on github.



P.S. I am waiting for criticism and advice.

P.P.S. If you find grammatical errors in the text, please write me a PM instead.

Source: https://habr.com/ru/post/176559/


