Thursday, December 5, 2013

Share file(s) using Intent

public void startSendToIntent(String packageName, String className, String mimetype, String urls) {

    // urls: "/storage/sdcard0/file1.mp4;/storage/sdcard0/file2.mp4;"
    // mimetype: "video/mp4"
    
    /* The URLs use ";" as the divider */
    String[] urlArray = urls.split(";");
    
    ArrayList<Uri> uris = new ArrayList<Uri>(urlArray.length);
    
    for (int i = 0; i < urlArray.length; i++) {
        uris.add(Uri.fromFile(new File(urlArray[i])));
    }
    
    Intent intent = new Intent();
    
    intent.setClassName(packageName, className);
    intent.setType(mimetype);
    
    intent.addFlags(Intent.FLAG_GRANT_READ_URI_PERMISSION);
    
    if (uris.size() > 1) {
        
        intent.setAction(Intent.ACTION_SEND_MULTIPLE);
        // Uri is Parcelable, not Serializable, so use the parcelable overload
        intent.putParcelableArrayListExtra(Intent.EXTRA_STREAM, uris);
        
    } else {
        
        intent.setAction(Intent.ACTION_SEND);
        intent.putExtra(Intent.EXTRA_STREAM, uris.get(0));
        
    }
    
    Log.i(TAG, "startSendToIntent: packageName:"+packageName+" className:"+className+" mimetype:"+mimetype+" urls:"+urls+" uris:"+uris);
    
    mActivity.startActivity(intent);
}
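As a sanity check on the path parsing above: Java's String.split drops trailing empty strings, so the trailing ";" in the example URL string does not produce an empty path entry. A minimal sketch in plain Java, with no Android dependency:

```java
public class SplitCheck {
    public static void main(String[] args) {
        String urls = "/storage/sdcard0/file1.mp4;/storage/sdcard0/file2.mp4;";
        String[] urlArray = urls.split(";");

        // split(";") discards trailing empty strings, so we get exactly
        // the two real paths and no empty entry for the trailing ";"
        System.out.println(urlArray.length);   // prints 2
        for (String path : urlArray) {
            System.out.println(path);
        }
    }
}
```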

Tuesday, August 13, 2013

Play video inside OpenGL ES by using MediaCodec

MediaCodec was introduced in Android 4.1 (API 16). With MediaCodec, we can decode video without using the Android NDK, for example to create a video playback application.
In game development, video can also make the game environment more realistic, like a video banner around the track in a racing game.
Here, I'll show how to decode video and render it into OpenGL ES.
First, we prepare the components needed to decode the video: MediaExtractor and MediaCodec.
private boolean initExtractor() {
    extractor = new MediaExtractor();
    try {
        extractor.setDataSource(mFilePath);
    } catch (IOException e) {
        return false;
    }

    // get video track
    for (int i = 0; i < extractor.getTrackCount(); i++) {
        MediaFormat format = extractor.getTrackFormat(i);
        String mime = format.getString(MediaFormat.KEY_MIME);
        if (mime.startsWith("video/")) {
            tracknumb = i;
            break;
        }
    }

    if (tracknumb == -1) {
        Log.e("DecodeActivity", "Can't find video track!");
        return false;
    }

    // set track to extractor
    extractor.selectTrack(tracknumb);

    return true;
}

private boolean initDecoder(Surface surface) {
    // get mimetype and format
    MediaFormat format = extractor.getTrackFormat(tracknumb);
    String mime = format.getString(MediaFormat.KEY_MIME);

    decoder = MediaCodec.createDecoderByType(mime);
    if (decoder == null) {
        Log.e("DecodeActivity", "Can't create decoder for " + mime + "!");
        return false;
    }
    decoder.configure(format, surface, null, 0);

    decoder.start();

    return true;
}
Next, inside the OpenGL ES renderer, prepare the surface that will be handed to MediaCodec.
@Override
public void onSurfaceCreated(GL10 gl, EGLConfig config) {

    final String fragmentShaderSourceOES =
    "#extension GL_OES_EGL_image_external : require\n" +
    "precision mediump float;\n" +
    "varying vec2 vTextureCoord;\n" +
    "uniform samplerExternalOES sTexture;\n" +
    "void main() {\n" +
    "  vec4 color = texture2D(sTexture, vTextureCoord);\n" +
    "  gl_FragColor = color;\n" +
    "}\n";

    // Prepare your shader program here
    mVertexShader = ...
    mPixelShader = loadShader(GLES20.GL_FRAGMENT_SHADER, fragmentShaderSourceOES); // use fragment shader that support OES

    mProgram = createProgram(mVertexShader, mPixelShader);

    // Prepare texture handler
    int[] textures = new int[1];
    GLES20.glGenTextures(1, textures, 0);

    mTextureID = textures[0];
    GLES20.glBindTexture(GLES11Ext.GL_TEXTURE_EXTERNAL_OES, mTextureID);

    GLES20.glTexParameterf(GLES11Ext.GL_TEXTURE_EXTERNAL_OES, GLES20.GL_TEXTURE_MIN_FILTER, GLES20.GL_NEAREST);
    GLES20.glTexParameterf(GLES11Ext.GL_TEXTURE_EXTERNAL_OES, GLES20.GL_TEXTURE_MAG_FILTER, GLES20.GL_LINEAR);

    GLES20.glTexParameteri(GLES11Ext.GL_TEXTURE_EXTERNAL_OES, GLES20.GL_TEXTURE_WRAP_S, GLES20.GL_CLAMP_TO_EDGE);
    GLES20.glTexParameteri(GLES11Ext.GL_TEXTURE_EXTERNAL_OES, GLES20.GL_TEXTURE_WRAP_T, GLES20.GL_CLAMP_TO_EDGE);

    // Link the texture handler to surface texture
    mSurfaceTexture = new SurfaceTexture(mTextureID);
    mSurfaceTexture.setDefaultBufferSize(320, 240);
    mSurfaceTexture.setOnFrameAvailableListener(new SurfaceTexture.OnFrameAvailableListener() {
         @Override
         public void onFrameAvailable(SurfaceTexture surfaceTexture) {
             // mFrameLock is assumed to be a dedicated final Object field;
             // synchronizing on the boolean itself would lock a different
             // object every time it is reassigned
             synchronized (mFrameLock) {
                 updateSurface = true;
             }
         }
    });

    // Create decoder surface
    mDecoderSurface = new Surface(mSurfaceTexture);
}

@Override
public void onDrawFrame(GL10 gl) {
    ...

    synchronized (mFrameLock) {
        if (updateSurface) {
            mSurfaceTexture.updateTexImage(); // update the SurfaceTexture if a new frame is available
            updateSurface = false;
        }
    }

    // use program
    GLES20.glUseProgram(mProgram);
    CLGLUtility.checkGlError(TAG, "glUseProgram");

    // bind texture
    GLES20.glActiveTexture(GLES20.GL_TEXTURE0);
    GLES20.glBindTexture(GLES11Ext.GL_TEXTURE_EXTERNAL_OES, mTextureID);

    // set vertices position
    mShape.setPositionArray(maPositionHandle);

    // set vertices texture coordinate
    mShape.setTexCoordArray(maTextureHandle);

    // draw shape
    mShape.drawArrays();
}
Then set it to MediaCodec.
...
initDecoder(mDecoderSurface);
...
Finally, OpenGL ES and MediaCodec are linked together and we can start the game.
@Override
public void onSurfaceCreated(GL10 gl, EGLConfig config) {
    ...
    startDecode();
}

public void startDecode() {
    // get buffers
    decoderInputBuffers = decoder.getInputBuffers();
    decoderOutputBuffers = decoder.getOutputBuffers();

    // start getting buffer
    BufferInfo info = new BufferInfo();
    boolean isEOS = false;
    long startMs = System.currentTimeMillis();

    Log.d("DecodeActivity", "BufferInfo: size:"+info.size);

    while (!threadInterrupted) {
        // get input buffer (decoder)
        if (!isEOS) {
            isEOS = readDecoderBuffer();
        }

        isEOS = checkDecoderBuffer(info, startMs);
        if (isEOS)
            break;
    }

    decoder.stop();
    decoder.release();
    extractor.release();
}

private boolean readDecoderBuffer() {
    int inIndex = decoder.dequeueInputBuffer(10000);

    // no input buffer available yet; try again on the next iteration
    // (returning true here would wrongly signal EOS to the caller)
    if (inIndex < 0)
        return false;

    ByteBuffer buffer = decoderInputBuffers[inIndex];
    int sampleSize = extractor.readSampleData(buffer, 0);

    if (sampleSize < 0) {

        // We shouldn't stop the playback at this point, just pass the EOS
        // flag to decoder, we will get it again from the dequeueOutputBuffer
        Log.d("DecodeActivity", "InputBuffer BUFFER_FLAG_END_OF_STREAM");
        decoder.queueInputBuffer(inIndex, 0, 0, 0, MediaCodec.BUFFER_FLAG_END_OF_STREAM);

        return true;

    } else {

        decoder.queueInputBuffer(inIndex, 0, sampleSize, extractor.getSampleTime(), 0);
        extractor.advance();

    }

    return false;
}

private boolean checkDecoderBuffer(BufferInfo info, long startMs) {
    // get output buffer, to control the time
    int outIndex = decoder.dequeueOutputBuffer(info, 10000);
    Log.i(TAG , "BufferInfo: size:"+info.size+" presentationTimeUs:"+info.presentationTimeUs+" offset:"+info.offset+" flags:"+info.flags);

    switch (outIndex) {
    case MediaCodec.INFO_OUTPUT_BUFFERS_CHANGED:
        Log.d("DecodeActivity", "INFO_OUTPUT_BUFFERS_CHANGED");
        decoderOutputBuffers = decoder.getOutputBuffers();
        break;
    case MediaCodec.INFO_OUTPUT_FORMAT_CHANGED:
        Log.d("DecodeActivity", "New format " + decoder.getOutputFormat());
        break;
    case MediaCodec.INFO_TRY_AGAIN_LATER:
        Log.d("DecodeActivity", "dequeueOutputBuffer timed out!");
        break;
    default:
        ByteBuffer buffer = decoderOutputBuffers[outIndex];
        Log.v("DecodeActivity", "We can't use this buffer but render it due to the API limit, " + buffer);

        // We use a very simple clock to keep the video FPS, or the video playback will be too fast
        while (info.presentationTimeUs / 1000 > System.currentTimeMillis() - startMs) {
            try {
                Thread.sleep(10);
            } catch (InterruptedException e) {
                return false;
            }
        }

        decoder.releaseOutputBuffer(outIndex, true);
        break;
    }

    // All decoded frames have been rendered, we can stop playing now
    if ((info.flags & MediaCodec.BUFFER_FLAG_END_OF_STREAM) != 0) {
        Log.d("DecodeActivity", "OutputBuffer BUFFER_FLAG_END_OF_STREAM");
        return true;
    }

    return false;
}

References: (Thanks to Ray)
https://github.com/crossle/MediaPlayerSurface/blob/master/src/me/crossle/demo/surfacetexture/VideoSurfaceView.java
https://vec.io/posts/android-hardware-decoding-with-mediacodec

Thursday, August 1, 2013

Check OpenGL ES capability on Android

There are lots of Android devices around the world, so when you want to create an application that needs OpenGL ES, you should check which OpenGL ES version the user's device supports, because this determines whether it can run smoothly.

So how to check the OpenGL ES version?

  1. By using a system service. This is the simplest one; you don't need to create any OpenGL ES component.
    final ActivityManager activityManager = (ActivityManager) getSystemService(Context.ACTIVITY_SERVICE);
    final ConfigurationInfo configurationInfo = activityManager.getDeviceConfigurationInfo();
    final boolean supportsEs2 = configurationInfo.reqGlEsVersion >= 0x20000;
  2. By using glGetString. This way you can get more information, like the vendor, chip, shading language version, etc. But you need to have a GL context first.
    String glVersion = GLES20.glGetString(GLES20.GL_VERSION);
    String glRenderer = GLES20.glGetString(GLES20.GL_RENDERER);
    String glVendor = GLES20.glGetString(GLES20.GL_VENDOR);
    String glShadingLanguageVersion = GLES20.glGetString(GLES20.GL_SHADING_LANGUAGE_VERSION);
    String glExtensions = GLES20.glGetString(GLES20.GL_EXTENSIONS);

Tuesday, July 30, 2013

Android, OpenGL ES 3.0, GPU

Previously, I posted about OpenCL GPU support. You can see that mobile GPUs are becoming more powerful and more efficient.
While investigating OpenGL-related information, I found that OpenGL ES 3.0 is starting to be supported by many GPU vendors. Also, Android 4.3 has been released with OpenGL ES 3.0 support.
Before you start creating an OpenGL ES 3.0 app, you first have to know which mobile GPUs support OpenGL ES 3.0.
Below is the list I found, along with the chips:

  1. Qualcomm Adreno 300 series
  2. ARM Mali-T600 series
  3. PowerVR Series 6 (Rogue)
  4. Tegra 4 (WhitePaper PDF)

Please let me know if you know more.
Thanks for reading :)

To check OpenGL ES version on your device, click here.
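The ConfigurationInfo check from the post linked above can be adapted for ES 3.0: reqGlEsVersion packs the major version in the upper 16 bits and the minor version in the lower 16 bits, so OpenGL ES 3.0 is reported as 0x30000. Here is a sketch of just the decoding logic (the Android ActivityManager call itself is omitted, so this runs as plain Java):

```java
public class GlEsVersion {
    // Decode the packed version int from ConfigurationInfo.reqGlEsVersion:
    // major version in the upper 16 bits, minor version in the lower 16 bits
    public static String decode(int reqGlEsVersion) {
        int major = reqGlEsVersion >> 16;
        int minor = reqGlEsVersion & 0xFFFF;
        return major + "." + minor;
    }

    public static boolean supportsEs3(int reqGlEsVersion) {
        return reqGlEsVersion >= 0x30000;
    }

    public static void main(String[] args) {
        System.out.println(decode(0x30000));      // prints 3.0
        System.out.println(supportsEs3(0x20000)); // prints false
    }
}
```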

Friday, July 26, 2013

Android OS History - Big changes

After developing Android apps for two years, I've seen the Android OS become better and better. I looked into its history and picked out the big changes in each version. See below.
Beta: 5 November 2007, Android's birthday.

  1. v1.0
    • Support camera.
    • Support Wifi.
    • Support Bluetooth.
  2. v1.1
    • HTC Dream, as first Android device.
  3. v1.5, Cupcake
    • Support video record and playback in MPEG-4 & 3GP formats.
  4. v1.6, Donut
    • Support gesture touch.
  5. v2.0, Eclair
    • Support Bluetooth 2.1.
    • Support more screen size.
    • Improve gesture to multi-touch.
  6. v2.0.1, Eclair
  7. v2.1, Eclair
  8. v2.2-2.2.3, Froyo (frozen yogurt)
    • Improve speed, memory, performance.
    • Use JIT compilation.
    • Support Adobe Flash.
  9. v2.3-2.3.2, Gingerbread
    • Support NFC.
    • Support AAC encode.
    • Support more sensors (gyroscopes and barometer).
    • Improve power management.
    • Improve audio, graphical, input for gaming.
    • Change file system from YAFFS to ext4.
  10. v2.3.3-2.3.7, Gingerbread
    • Improve battery efficiency.
  11. v3.0, Honeycomb
    • Change new UI (systembar, actionbar).
    • Support hardware acceleration.
    • Support multicore processors.
    • Note: for Xoom only.
  12. v3.1, Honeycomb
    • Support Open Accessory (USB).
    • Support FLAC audio playback.
  13. v3.2, Honeycomb
    • Support GoogleTV.
    • Improve tablet support.
  14. v4.0-4.0.2, IceCream Sandwich
    • Support WiFi Direct.
    • Support WebP image format.
    • Improve video record to 1080p.
    • Note: Theoretically compatible with 2.3.x devices.
  15. v4.0.3-4.0.4, IceCream Sandwich
  16. v4.1, JellyBean
    • Improve rendering performance to 60fps (Project Butter).
    • Improve audio to multichannel.
    • Support video and audio decoder/encoder on Java SDK.
    • Change Chrome as default browser.
    • Remove Adobe Flash.
  17. v4.2, JellyBean
    • Support wireless display (Miracast).
  18. v4.3, JellyBean
    • Support OpenGL ES 3.0.
    • Support MediaMuxer.
    • Improve MediaCodec Encoder.

Reference: http://en.wikipedia.org/wiki/Android_version_history

Thursday, July 25, 2013

Download image as Bitmap

Sometimes, we need to download an image and show it in an ImageView.
That requires an HTTP API.
Below is code that downloads an image over HTTP and decodes it into a Bitmap.

Bitmap downloadBitmap(String link) {
    HttpURLConnection connection = null;
    try {
        URL url = new URL(link);
        connection = (HttpURLConnection) url.openConnection();
        connection.setDoInput(true);
        connection.connect();
        InputStream input = connection.getInputStream();
        return BitmapFactory.decodeStream(input);
    } catch (IOException e) {
        Log.e("downloadBitmap", "getBmpFromUrl error: " + e.getMessage(), e);
        return null;
    } finally {
        if (connection != null)
            connection.disconnect();
    }
}

To use it, you NEED to call it from a background thread, because network operations are not allowed on the UI thread (Android throws NetworkOnMainThreadException).

Thread t = new Thread(new Runnable() {
    @Override
    public void run() {
        Bitmap bitmap = downloadBitmap("http://lh5.ggpht.com/_hepKlJWopDg/TB-_WXikaYI/AAAAAAAAElI/715k4NvBM4w/s144-c/IMG_0075.JPG");
        if (bitmap != null)
            Log.i("downloadBitmap", bitmap.getWidth() + " " + bitmap.getHeight());
    }
});
t.start();

Finally, don't forget to add the INTERNET permission.

<uses-permission android:name="android.permission.INTERNET"/>

Adding source code syntax highlight

To do so, you just need to include the code below in your template by editing the template from Edit HTML.

<link href="http://alexgorbatchev.com/pub/sh/current/styles/shCore.css" rel="stylesheet" type="text/css"/>
<link href="http://alexgorbatchev.com/pub/sh/current/styles/shThemeDefault.css" rel="stylesheet" type="text/css"/>
<script src="http://alexgorbatchev.com/pub/sh/current/scripts/shCore.js" type="text/javascript"/>
<script src='http://alexgorbatchev.com/pub/sh/current/scripts/shBrushCpp.js' type='text/javascript'/>
<script src='http://alexgorbatchev.com/pub/sh/current/scripts/shBrushCSharp.js' type='text/javascript'/>
<script src='http://alexgorbatchev.com/pub/sh/current/scripts/shBrushCss.js' type='text/javascript'/>
<script src='http://alexgorbatchev.com/pub/sh/current/scripts/shBrushJava.js' type='text/javascript'/>
<script src='http://alexgorbatchev.com/pub/sh/current/scripts/shBrushJScript.js' type='text/javascript'/>
<script src='http://alexgorbatchev.com/pub/sh/current/scripts/shBrushPhp.js' type='text/javascript'/>
<script src='http://alexgorbatchev.com/pub/sh/current/scripts/shBrushPython.js' type='text/javascript'/>
<script src='http://alexgorbatchev.com/pub/sh/current/scripts/shBrushRuby.js' type='text/javascript'/>
<script src='http://alexgorbatchev.com/pub/sh/current/scripts/shBrushSql.js' type='text/javascript'/>
<script src='http://alexgorbatchev.com/pub/sh/current/scripts/shBrushVb.js' type='text/javascript'/>
<script src='http://alexgorbatchev.com/pub/sh/current/scripts/shBrushXml.js' type='text/javascript'/>
<script src='http://alexgorbatchev.com/pub/sh/current/scripts/shBrushPerl.js' type='text/javascript'/>
<script type='text/javascript'>
SyntaxHighlighter.config.bloggerMode = true;
SyntaxHighlighter.all();
</script>

This is using SyntaxHighlighter extension. Thanks to Alex Gorbatchev.

Sunday, July 14, 2013

Android, OpenCL, GPU

The GPU (Graphics Processing Unit) is the master of graphics performance on our Android devices. It plays an important role when playing games, playing high-quality (HD) video, and even showing a smooth UI. Therefore, it needs to support many libraries, like OpenGL ES, OpenVG, OpenCL, etc.
The GPU on an Android device performs the same role as the graphics card on a PC. But Android has only recently been growing fast, so do all GPU devices support all those libraries?
After reading lots of articles, I figured out that GPUs on Android devices are still limited: only a few support the OpenCL library. So what does OpenCL actually do?
OpenCL (Open Computing Language) is used for non-graphical computing, like image processing, vector computation, matrix computation, etc. See the sample from Nvidia for more details by clicking here.
Here are Android GPUs that support OpenCL.
  1. Qualcomm (Adreno 300 series)
    • Devices (Snapdragon S4 GPU):
      • Asus PadFone 2, HTC Droid DNA, HTC J Butterfly, LG Optimus G, Nexus 4, Sony Xperia UL, Sony Xperia Z, Sony Xperia ZL, Sony Xperia ZR, Xiaomi MI-2.
  2. ARM (Mali-T600 series)
  3. PowerVR (Series 5XT, SGXMP)
And there is one GPU family that still does not support OpenCL:
  1. Nvidia Tegra (GeForce ULP)
From the list, we can see that most GPU vendors already support OpenCL in their newest GPUs. But it seems that Nvidia is still undecided about adding OpenCL support. Why? In my opinion, Nvidia wants to push CUDA support for Android instead (CUDA 5.5 now supports ARM).
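One rough way to probe for OpenCL from an app is to look for the vendor's libOpenCL.so on disk. This is an assumption on my part, not an official Android API, and the candidate paths below are common vendor locations, not guaranteed ones:

```java
import java.io.File;

public class OpenClProbe {
    // Common (but not guaranteed) locations of the OpenCL driver library;
    // these paths are assumptions based on vendor ROMs, not an official API
    static final String[] CANDIDATES = {
        "/system/lib/libOpenCL.so",
        "/system/vendor/lib/libOpenCL.so",
    };

    public static boolean anyExists(String[] paths) {
        for (String path : paths) {
            if (new File(path).exists()) {
                return true;
            }
        }
        return false;
    }

    public static void main(String[] args) {
        System.out.println("OpenCL library found: " + anyExists(CANDIDATES));
    }
}
```

Even when the library file exists, the driver may not be usable by third-party apps, so treat this check as a hint only.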

Tuesday, March 5, 2013

Configure Linksys WRT400N as an access point

The WRT400N is a rather basic router, so I use another router with more memory and a faster processor to connect to the internet. I had originally configured the 400N on its own subnet. Today I tried configuring it to be on the same subnet as my main router.

I found useful advice here: http://homecommunity.cisco.com/t5/Wireless-Routers/How-to-configure-WRT400n-as-an-access-point/m-p/277101/highlight/true#M144970

To summarize:

  • Turn off the DHCP Server on the 400N 
    • you still want the 400N to get its IP via DHCP 
  • Turn off NAT and the stateful firewall settings 
    • not sure this is essential after you move the Ethernet cable below 
    • NAT is in Setup, Advanced Routing 
    • the stateful firewall is in Security as SPI Firewall protection 
  • Be sure the Network Setup shows the same subnet mask as your main router. 
  • I set the IP address to the address reserved for the 400N on the other router. 
    • some suggest using a class C address you are _not_ using. 
  • Save your settings. 
  • Move the Ethernet patch (straight-through) cable used to connect the 400N to your main router from the yellow WAN socket to one of the 4 regular sockets. 
    • at this point the router will no longer see any traffic from what it thinks of as the outside world, so it may not matter if you turned off NAT, etc. above.
