Chapter 12

Advanced Android Topics

There are a number of advanced Android topics that are beyond the scope of this book, but it’s still good for you to know about them so that you can continue learning on your own. I will try to expose as much of this information as possible within this chapter. This is information that we were unable to cover in this book, outlined here so that you know it exists and can take a closer look at it when you have a chance. Most of it is really amazing functionality that you can add to your apps to make them more marketable.

This chapter will cover how to research on your own the more specialized areas of programming for Android. It will provide a summary of what is available, where to find it, what it can do for your applications, and what classes it uses for its implementation, as well as provide some resources for finding more detailed information and tutorials on implementing these features and attributes in your future Android applications. Any code examples, where given in this chapter, will be very short and sweet, to give you a taste of what is to come if you take the initiative to go out there and find it.

Troubleshooting: Solving Problems on Your Own

I’d like to start out with troubleshooting problems in Android first because what’s the use in researching new Classes if you can’t get the Classes that you are already working with to function properly? We have already covered a number of commands in Eclipse that can be used to find and even fix the source of code errors and warnings that are being highlighted in your Eclipse IDE. These are as follows, in order of their logical application to the problem at hand:

  • Right-Click (Project Folder) ➤ Refresh [F5] (look for missing files and resources)
  • Right-Click (Project Folder) ➤ Validate (look for missing interrelationships in project)
  • Project (Menu) ➤ Clean ... (delete and regenerate R.java runtime Java binary for project)
  • Window (Menu) ➤ AVD Manager ➤ Start ➤ User Data Option (Emulator Reset)
  • Eclipse LogCat Pane in the IDE (show exactly what errors are occurring)
  • http://developer.android.com/guide/faq/troubleshooting.html (latest Android developer website troubleshooting information)

The last option is the Android Development website’s troubleshooting web page section. Go there now and you will see that there are over a dozen of the most commonly encountered errors right there on the Troubleshooting “homepage.” There also are over a dozen other areas on the left that expand to reveal dozens of subsections when the “down-arrows” next to them are clicked. Some of the subsections also will have down-arrows next to them that reveal sub-subsections as well.

The first 12 of these sections essentially cover things that we have learned about in this book, so if you want to go into further Android development documentation regarding these areas, this would represent a “one stop shop,” and I highly recommend reading through it all when you have the time.

The next ten sections cover more advanced topics that we did not have the bandwidth (pages) to cover in this book, but which include some important features of the Android OS:

  • Computation (Runtime API Reference, Advanced Renderscript)
  • Media and Camera (Media Playback, Camera, JetPlayer, Audio)
  • Location and Sensors (motion/position/environment sensors, maps)
  • Connectivity (Bluetooth, NFC, Wi-Fi Direct, SIP, USB)
  • Text and Input (Cut and Paste, Creating an IME, Spelling Checker)
  • Data Storage (Storage Options, Data backup, App Install Location)
  • Administration (Device Policies)
  • Web Apps (Building, Debugging, Best Practices, Targeting Screens)
  • Best Practices (Compatibility, Multi-Screen, Security, Performance)
  • Google Services (In-App Billing, Licensing, Play, Cloud Messaging)

These areas of the OS may or may not be important to your particular application, depending on what its functionality entails, so you can pick and choose these topical sections based on the priority of your application development and what you want to do using the Android OS. I will cover a few of them in the next several sections of this chapter regarding your future research and development with the Android OS.

You also will find a lot of Android Developer forums online that are very useful, as well as sites such as StackOverflow that document a ton of questions and answers regarding Android programming. Simply go to Google.com and type in “Android Forums” or “StackOverflow” and you will find all of these great resources at your fingertips!

Finally, one last tip regarding research into troubleshooting any given Android programming issue: Learn How to Optimally Use Google! Yes, the Android Development website works well, and has a lot of information, as well as its own search functionality, but beware—they have a hard time keeping all of that information completely up to date for each API level. For that reason, I often use Google.com to research other websites that may have more recent solutions or programmers who are facing similar issues or problems.

To use Google effectively, make sure to use the Class and/or method names in the search parameters, or even cut and paste the error message right out of the Eclipse LogCat pane in the Eclipse IDE! If you do this, you will find that your search results will often return precise and highly relevant information that has already been asked on the Internet regarding that exact coding issue. Also, you can pose questions in the search bar, complete with a question mark at the end, such as: How do I register to use Google Maps in Android 4.1? Don’t forget, Google is working on the Semantic Web (Internet 3.0), and searches such as this are going to work better and better each day as time goes on.

Widgets: Creating Your Own Widgets in Android

As we discussed in Chapter 7, Android has its own collection of user-interface widgets that can be used to easily populate your layouts with functional elements that allow your users to interface with the program logic that defines what your application does. These widgets have their own package called android.widget that can be imported into your application and used without further modification.

Android extends this widget capability for its programmers by allowing us to also create our own widgets that can be used by Android as mini-application portals or views that float on the Android home screen, or even inside other applications (just like the standard UI widgets).

If you remember, user-interface elements are Widgets that are subclassed from View objects. Widgets can be used to provide cool little extras for the Android homescreen, such as weather reports, MP3 players, calendars, stopwatches, maps, or snazzy clocks and similar micro-utilities.

To create an app widget, you utilize the Android AppWidgetProvider class, which extends the BroadcastReceiver class. The documentation on the App Widget Development can be found at:

http://developer.android.com/reference/android/appwidget/AppWidgetProvider.html

http://developer.android.com/guide/topics/appwidgets/index.html#AppWidgetProvider

To create your own app widget, you need to extend this class and override one or more of its key methods to implement your custom app widget functionality. Key methods of the AppWidgetProvider class include the following (a minimal sketch follows this list):

  • onUpdate(Context, AppWidgetManager, int[])
  • onDeleted(Context, int[])
  • onEnabled(Context)
  • onDisabled(Context)
  • onReceive(Context, Intent)
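
As a taste of what this looks like, here is a minimal sketch of an AppWidgetProvider subclass that overrides onUpdate(); the class name MyAppWidget and the widget_layout and widget_text resources are hypothetical names invented for this example:

import android.appwidget.AppWidgetManager;
import android.appwidget.AppWidgetProvider;
import android.content.Context;
import android.widget.RemoteViews;

public class MyAppWidget extends AppWidgetProvider {
    @Override
    public void onUpdate(Context context, AppWidgetManager appWidgetManager, int[] appWidgetIds) {
        // Loop over every instance of this widget the user has placed on a home screen
        for (int appWidgetId : appWidgetIds) {
            RemoteViews views = new RemoteViews(context.getPackageName(), R.layout.widget_layout);
            views.setTextViewText(R.id.widget_text, "Hello from MyAppWidget!");
            appWidgetManager.updateAppWidget(appWidgetId, views);
        }
    }
}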

To create an app widget, you need to create an AppWidgetProviderInfo object that will contain the metadata and parameters for the app widget. These are details such as the user interface layout, how frequently it is updated or refreshed, and the convenience class that it is subclassed from (AppWidgetProvider). This can all be defined via XML, which should be no surprise to you by now.
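
For example, a bare-bones AppWidgetProviderInfo XML file (saved in res/xml/, here under the hypothetical name my_appwidget_info.xml, and pointing at a hypothetical widget_layout layout) might look something like this:

<appwidget-provider xmlns:android="http://schemas.android.com/apk/res/android"
    android:minWidth="250dp"
    android:minHeight="40dp"
    android:updatePeriodMillis="1800000"
    android:initialLayout="@layout/widget_layout" >
</appwidget-provider>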

The AppWidgetProvider class defines all of the methods that allow your application to interface with the app widget class via broadcast events, making it a broadcast receiver. These broadcast events, as we discussed in Chapter 11, will update the widget, with some frequency if required, as well as enabling (turning it on), disabling (turning it off), and even deleting it if required.

App widgets also (optionally) offer a configuration activity that can launch itself when the user first installs your app widget. This activity adds a user interface layout that allows your users to modify the app widget settings before (or at the time of) its launch. The app widget must be declared in the AndroidManifest.xml file, so that the application has registered it with the OS for communications, as it is a broadcast receiver.

More information on widget design can be found at the App Widget Design Guidelines page located at:

http://developer.android.com/guide/practices/ui_guidelines/widget_design.html

Location-Based Services in Android

Location-based services and Google Maps are both very important OS capabilities when it comes to a smartphone or tablet device. You can access all location and maps related capabilities inside of Android via the android.location package, which is a collection of classes or routines for dealing with maps and locations, and via the Google Maps external library, which we will cover in the next section.

The central component of the location services network is the LocationManager system service. This Android system service provides the APIs necessary to determine the location and (if supported) bearing of the underlying device’s GPS and accelerometer hardware functionality.

Similar to other Android systems services, the LocationManager is not instantiated directly, but is instead requested as an instance from the system by calling the getSystemService(Context) method, which then returns a handle to the new LocationManager instance, like this:

getSystemService(Context.LOCATION_SERVICE)

Once a LocationManager has been obtained, you will be able to do the following three things in your application (a short sketch follows this list):

  • Query for a list of all LocationProviders for the last known user location.
  • Register (or unregister) for periodic updates of the user’s current location.
  • Register (or unregister) for a given Intent to be fired once the device comes within a certain proximity (specified in meters) of a given latitude and longitude.
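
Here is a minimal sketch of the second item in that list, registering for periodic location updates; it assumes the ACCESS_FINE_LOCATION permission has been declared in your AndroidManifest.xml and that this code lives inside an Activity (for example, in onCreate()):

import android.content.Context;
import android.location.Location;
import android.location.LocationListener;
import android.location.LocationManager;
import android.os.Bundle;
...
LocationManager locationManager =
        (LocationManager) getSystemService(Context.LOCATION_SERVICE);

LocationListener listener = new LocationListener() {
    @Override
    public void onLocationChanged(Location location) {
        // A new fix has arrived; use location.getLatitude() and location.getLongitude() here
    }
    @Override public void onStatusChanged(String provider, int status, Bundle extras) { }
    @Override public void onProviderEnabled(String provider) { }
    @Override public void onProviderDisabled(String provider) { }
};

// Ask the GPS provider for updates at most every 60 seconds or 10 meters of movement
locationManager.requestLocationUpdates(
        LocationManager.GPS_PROVIDER, 60000, 10, listener);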

Google Maps in Android

Google provides an external library called Google Maps that makes it relatively easy to add powerful mapping functions to your Android applications. It is a Java package called com.google.android.maps, and it contains classes that allow for a wide variety of functions relating to downloading, rendering, and caching map tiles, as well as a variety of user control systems and display options.

One of the most important classes in the maps package is the MapView class, a subclass of ViewGroup, which displays a map using data supplied from the Google Maps service. Essentially this class is a wrapper providing access to the functions of the Google Maps API, allowing your applications to manipulate Google Maps through MapView methods that allow maps and their data to be accessed, much as though you would access any other View object.

The MapView class provides programmers with all of the various user interface assets that can be used to create and control Google Maps data. When your application passes focus to your MapView object, it automatically allows your users to zoom into, and pan around, the map using gestures or keypresses. It can also handle network requests for additional map tiles, or even an entirely new map.

Before you can write a Google Maps-based application, you must obtain a Google Maps API key to identify your app:

  1. To begin with, you need to provide Google with the signature of your application. To do so, run the following at the command prompt (in Windows the Command Prompt Utility is located in the Start Menu under the All Programs menu in the Accessories folder):

keytool -list -keystore C:\Users\<username>\.android\debug.keystore

Note  The signature of your application proves to Google that your application comes from you. Explaining the niceties of this is beyond the scope of this book, but for now, just understand that you are proving a guarantee to Google that you created this application.

  2. When prompted, the password is android. Here is what you should see:

Enter keystore password:
Keystore type: JKS
Keystore provider: SUN
Your keystore contains 1 entry
androiddebugkey, 21-Jan-2011, PrivateKeyEntry,
Certificate fingerprint (MD5): <fingerprint>

  3. Select and Copy (CTRL-C) the fingerprint string value. You’ll need it in the next step.

  4. Go to http://code.google.com/android/maps-api-signup.html and enter the fingerprint data you copied into the “My certificate’s MD5 fingerprint:” box.

  5. Accept the terms and conditions, and then click: Generate API Key.

  6. On the next page, make a note of your API key.

Now that we have our key, here are the basic steps for implementing a Google Maps app:

  1. First, create a new project called MyGoogleMap with an Activity called MainActivity, and set the Project Build Target to Google APIs (matching the default version support of 2.2 through 4.1). We need to do this to use the Google Maps classes.

Note  You may have to install the Google APIs using the Android SDK and AVD Manager. They are listed as Google APIs by Google Inc.

  2. In the AndroidManifest.xml file, within the <application> tag, use the <uses-library> tag to point to the Google Maps library package name specified above, as follows:

<uses-library android:name="com.google.android.maps" />

  3. Also in the AndroidManifest.xml file, but as a direct child of the root <manifest> tag (not inside <application>), use the <uses-permission> tag to request permission to access the Internet, as follows:

<uses-permission android:name="android.permission.INTERNET" />

  4. Next you would want to define some simple user interface elements within your activity_main.xml layout definition, using a basic linear layout with a vertical parameter specified, and then a Google Maps MapView user interface element with the clickable parameter set to true, allowing the user to navigate the map, as follows:

<LinearLayout
    xmlns:android="http://schemas.android.com/apk/res/android"
    android:id="@+id/linearLayout1"
    android:orientation="vertical"
    android:layout_width="match_parent"
    android:layout_height="match_parent" >
    <com.google.android.maps.MapView
        android:id="@+id/mapview"
        android:layout_width="match_parent"
        android:layout_height="match_parent"
        android:clickable="true"
        android:apiKey=" Your Maps API Key Goes Here "
    />
</LinearLayout>

  5. Now enter the unique Google Maps API key that was assigned to you into the apiKey parameter, the last parameter of the MapView tag.

  6. Next open your MainActivity.java activity and extend your class to use a special subclass of the Activity class called the MapActivity class, as follows:

public class MainActivity extends MapActivity {...}

  7. One of the primary methods of the MapActivity class is the isRouteDisplayed() method, which must be implemented. Once it is, you will be able to pan around a map, so add this little bit of code to complete your basic map:

@Override
protected boolean isRouteDisplayed() {
    return false;
}

  8. At the top of your MainActivity class, declare two handles, one for the LinearLayout container and one for the MapView we are going to add next, as follows:

LinearLayout linearLayout;
MapView mapView;

  9. Next in your onCreate() method, initialize your MapView UI element and add the ZoomControls capability to it via the setBuiltInZoomControls() method as follows:

mapView = (MapView) findViewById(R.id.mapview);
mapView.setBuiltInZoomControls(true);

Note that we are using the built-in MapView zoom controls so we do not have to write any code, and yet when we run this basic application, the user will be able to zoom the MapView via zoom controls that will appear when the user touches the map and then disappear after a short time-out period (of nonuse).

  10. Compile and run to test your MainActivity in your MyGoogleMap project in the Android 4.1 emulator.

It is important to note that the external Google Maps library is not an integral part of the Android OS; it is actually hosted externally to the smartphone/tablet environment and must be accessed remotely via a Google Maps API key that you must apply for and secure before your applications can utilize this service from Google. This is the same way that this works when using Google Maps from a website; it’s just that the MapView class fine-tunes this for Android usage. To learn more about the Google Maps external library visit:

https://developers.google.com/android/add-ons/google-apis/

Google Search in Android

Google has built its business model on one major service that it has always offered: search. It should not be a surprise that search is thus a well-supported core service in Android. Android users can search for any data that is available to them on their Android device or across the Internet.

Android, not surprisingly, provides a seamless, consistent search experience across the board, and Android provides a robust search implementation framework for you to implement search functions inside of your Android applications.

The Android search framework provides an interface for search that includes both the interaction and the search itself, so that you do not have to define a separate Activity in Android. The advantage of this is that the use of search in your application will not interrupt your current Activity.

Using Android search puts a search dialog at the top of the screen, pushing other content down on the screen as it is utilized. Once you have everything set up to use this capability in Android, you can integrate your application with search by providing search suggestions based on your app or recent user queries, offer your own custom application-specific search suggestions in the system-wide quick search function, and even turn on voice search functions.

Search in Android is handled by the SearchManager class; however, that class is not used directly, but rather is accessed via an Intent specified in XML, or from your Java code via the context.getSystemService(Context.SEARCH_SERVICE) code construct. Here are the basic steps to set up capability for a search within your AndroidManifest.xml file.

  1. Specify an <intent-filter> and a searchable <meta-data> tag in the <activity> section of the AndroidManifest.xml file:

<intent-filter>
    <action android:name="android.intent.action.SEARCH" />
</intent-filter>

<meta-data android:name="android.app.searchable"
       android:resource="@xml/searchable" />

  2. Next, create the res/xml/searchable.xml file specified in the <meta-data> tag in Step 1.

  3. Inside searchable.xml, create a <searchable> tag with the following data:

<searchable
        xmlns:android="http://schemas.android.com/apk/res/android"
        android:label="@string/search_label"
        android:searchSuggestAuthority="dictionary"
        android:searchSuggestIntentAction="android.intent.action.VIEW">
</searchable>

  4. Now, in res/values/strings.xml, add a string called search_label with the text value "Search" (or whatever you want it to say next to the search data entry field).
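
That string entry in res/values/strings.xml would simply look like this:

<string name="search_label">Search</string>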

Now you are ready to implement a search in your application as described here:

http://developer.android.com/guide/topics/search/search-dialog.html
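
To give you a taste, here is a minimal sketch of how the searchable Activity might receive the query once the user submits a search; what you actually do with the query string is up to your application:

import android.app.SearchManager;
import android.content.Intent;
import android.os.Bundle;
...
@Override
public void onCreate(Bundle savedInstanceState) {
    super.onCreate(savedInstanceState);
    setContentView(R.layout.activity_main);

    Intent intent = getIntent();
    if (Intent.ACTION_SEARCH.equals(intent.getAction())) {
        // The text the user typed into the search dialog
        String query = intent.getStringExtra(SearchManager.QUERY);
        // Search your application's data using the query string here
    }
}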

Note that many Android phones, tablets, iTVs, and e-reader devices come with a search button built in, which will pop up the search dialog. You can also provide a button to do this, in a menu maybe, or as an icon. That’s for you to experiment with, so have some fun with Search.

Data Storage in Android

Android has a significant number of ways for you to save data on your smartphone: private data storage for your application, called shared preferences; internal storage in your smartphone device’s memory; external storage on your device’s removable media (SD card or Micro-SD); network storage (network-accessed storage) via your own network server; and an entire DBMS (database management system) via open source SQLite private databases.

Shared Preferences

Shared preferences are persistent data pairs that remain in memory even if your application is killed (or crashes), and thus this data remains persistent across multiple user sessions. The primary use of shared preferences is to store user preferences for a given user’s Android applications and this is a main reason why they persist in memory between application runs.

To set your application’s shared preferences Android provides us with the SharedPreferences class. This class can be used to store any primitive data types, including booleans (1/0, on/off, true/false, or visible/hidden), floats, integers, and longs, as well as strings. Note that the data created with this class will remain persistent across user sessions with your application even if your application is killed (the process is terminated, or crashes).

There are two methods used to access the preferences: if your Activity needs only a single preference file, use getPreferences(); if you need more than one preference file, you can name each one and use getSharedPreferences(name, mode) to access them by name. Here is an example of the code in use, where we retrieve a screen name. The settings.getString() call returns the screenName preference, or the default name "Android Fan" if the setting has not been set:

public static final String PREFS_NAME = "PreferenceFile";
...
    @Override
    public void onCreate(Bundle savedInstanceState) {
       super.onCreate(savedInstanceState);
       setContentView(R.layout.activity_main);


       SharedPreferences settings = getSharedPreferences(PREFS_NAME, 0);
       String screenName = settings.getString("screenName", "Android Fan");
       // do something with the screen name.
    }

We can set the screen name with the following:

SharedPreferences settings = getSharedPreferences(PREFS_NAME, 0);
SharedPreferences.Editor editor = settings.edit();
editor.putString("screenName", screenName);
editor.commit();

Internal Memory

Accessing internal memory storage on Android is done a bit differently, as that memory is unique to your application and cannot be directly accessed by the user or by other applications. When the application is uninstalled, these files are deleted from memory. To write a file to internal storage, call openFileOutput() with the name of the file and the operating mode, which will return a FileOutputStream object whose write() and close() methods let you move data into the file (to read it back, the companion openFileInput() method returns a FileInputStream with read() and close() methods). Here is some example code showing this concept:

String FILENAME = "hello_file";
String string = "hello world!";
FileOutputStream fos = openFileOutput(FILENAME, Context.MODE_PRIVATE);
fos.write(string.getBytes());
fos.close();
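
To read that file back later, a minimal sketch (reusing the same FILENAME, and ignoring exception handling just as the example above does) might look like this:

FileInputStream fis = openFileInput(FILENAME);
byte[] buffer = new byte[fis.available()];   // fine for a small file like this one
fis.read(buffer);
fis.close();
String contents = new String(buffer);        // "hello world!"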

External Memory

The method used to check external memory availability on an Android device is Environment.getExternalStorageState(). It checks to see whether the media (usually an SD card or internal micro SD) is in place (inserted in the slot for an SD card) and available for usage. Note that files written to external removable storage media also can be accessed outside of Android and its applications by PCs or other computing devices that can read the SD card format. This means there is no security in place on files that are written to external removable storage devices.
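
A minimal sketch of that availability check, using the Environment class constants, might look like this:

String state = Environment.getExternalStorageState();

if (Environment.MEDIA_MOUNTED.equals(state)) {
    // External storage is present and writable
} else if (Environment.MEDIA_MOUNTED_READ_ONLY.equals(state)) {
    // External storage is present, but we can only read from it
} else {
    // External storage is missing, unmounted, or otherwise unavailable
}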

Using SQLite

The most common way to store data for your application, and the most organized and sharable, is to create and utilize a SQLite database. This is how Android stores and accesses its own data for users who utilize its internal applications, such as the Contacts database. Any private database you create for your application will be accessible to all parts of your application, but not to other developers’ applications unless you give permission for them to access it.

To write to and read from a custom database structure, you would typically subclass the SQLiteOpenHelper class and utilize its getWritableDatabase() and getReadableDatabase() methods, which return a SQLiteDatabase object that represents the database structure and provides methods for performing SQLite database operations.
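
Here is a minimal sketch of such a helper class; the database name, table name, and columns are hypothetical values invented for this example:

import android.content.Context;
import android.database.sqlite.SQLiteDatabase;
import android.database.sqlite.SQLiteOpenHelper;

public class NotesDbHelper extends SQLiteOpenHelper {
    private static final String DB_NAME = "notes.db";   // hypothetical database name
    private static final int DB_VERSION = 1;

    public NotesDbHelper(Context context) {
        super(context, DB_NAME, null, DB_VERSION);
    }

    @Override
    public void onCreate(SQLiteDatabase db) {
        // Called the first time the database file is created
        db.execSQL("CREATE TABLE notes (_id INTEGER PRIMARY KEY, title TEXT, body TEXT)");
    }

    @Override
    public void onUpgrade(SQLiteDatabase db, int oldVersion, int newVersion) {
        // A very simple upgrade policy: throw the old table away and start over
        db.execSQL("DROP TABLE IF EXISTS notes");
        onCreate(db);
    }
}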

To perform SQLite database queries on your new SQLite database you would use the SQLiteDatabase query() methods, which accept all common data query parameters such as the table to query and the groupings, columns, rows, selections, projection, and similar concepts that are mainstream in database programming. This is a complex topic, so I suggest getting a book on SQLite for Android from Apress if you want to master this database technology inside of Android.
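
Continuing the hypothetical notes example from the sketch above, a simple query might look like this:

import android.database.Cursor;
import android.database.sqlite.SQLiteDatabase;
...
SQLiteDatabase db = new NotesDbHelper(this).getReadableDatabase();

// Select the _id and title columns of every row, sorted alphabetically by title
Cursor cursor = db.query("notes",
        new String[] { "_id", "title" },   // columns (the "projection")
        null, null,                        // no selection or selection arguments
        null, null,                        // no GROUP BY or HAVING clauses
        "title ASC");                      // ORDER BY clause

while (cursor.moveToNext()) {
    String title = cursor.getString(cursor.getColumnIndex("title"));
    // Do something with each title here
}
cursor.close();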

Device Administration: Security for IT Deployments

As of Android version 2.2 (API Level 8), Google has introduced support for secure enterprise applications via its Android Device Administration API. This API provides employee device administration at a lower system level, allowing the creation of “security aware” applications, which are necessary in MIS enterprise deployments where IT must maintain a tight level of control over employees’ Android smartphone devices at all times. Support for this feature between API Levels 8 and 16 may be yet another reason why these are the Minimum and Target API Level recommendations that are set as default values in the New Android Application Project series of dialogs that we have seen throughout this book.

A great example of this is the Android e-mail application, which has been upgraded in OS version 2.2 to implement these security features to provide more robust e-mail exchange security and support. Exchange Administrators can now implement and enforce password protection policies in the Android e-mail application spanning both alphanumeric passwords and simpler numeric PINs across all of the devices in their organization.

IT administrators can go as far as to remotely restore the factory defaults on lost or stolen handsets, clearing sensitive passwords and wiping clean proprietary data. E-mail Exchange End-Users can now sync their e-mail and calendar data as well.

Using the Android Camera Class to Control a Camera

The Android Camera class is used to control the built-in camera that is in every Android smartphone and in most tablets. This Camera class is used to set image capture settings and parameters, start and stop the preview modes, take the actual picture, and retrieve frames of video in real-time for encoding to a video stream or video file format. The Camera class is a client for the camera service, which manages the camera hardware.

To access your Android device’s camera, you need to declare a permission in your AndroidManifest.xml that allows the camera features to be included in your application. You need to use the <uses-feature> tag to declare any camera features that you wish to access in your application so that Android knows to activate them for use in your application. The following XML AndroidManifest.xml entries allow the camera to be used and define it as a feature along with including the autofocus capabilities:

<uses-permission android:name="android.permission.CAMERA" />
<uses-feature android:name="android.hardware.camera" />
<uses-feature android:name="android.hardware.camera.autofocus"/>
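
Here is a minimal sketch of the Camera class in use; it assumes that holder is the SurfaceHolder of a SurfaceView you have already set up for the preview, and it omits the IOException handling that setPreviewDisplay() requires:

import android.hardware.Camera;
...
// Acquire the default (usually rear-facing) camera from the camera service
Camera camera = Camera.open();

// Adjust the capture settings through a Camera.Parameters object
Camera.Parameters params = camera.getParameters();
params.setFocusMode(Camera.Parameters.FOCUS_MODE_AUTO);
camera.setParameters(params);

// Attach the preview to our SurfaceHolder and start the live preview
camera.setPreviewDisplay(holder);   // 'holder' is the assumed SurfaceHolder
camera.startPreview();

// ... later, when you are finished with the camera:
camera.stopPreview();
camera.release();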

The developer.android website has plenty of Java code for you to experiment with at the following link locations:

http://developer.android.com/reference/android/hardware/Camera.html

http://developer.android.com/guide/topics/media/camera.html

http://developer.android.com/reference/android/hardware/Camera.Parameters.html

3D Graphics: Using OpenGL ES 2.0 in Android

One of the most impressive capabilities of the Android OS is its ability to “render” 3D graphics in real-time, using the open source OpenGL ES 1.x API (OpenGL stands for Open Graphics Library) prior to API Level 8 (2.2), and the OpenGL ES 2.0 API in Android 2.2 and later releases. OpenGL ES stands for OpenGL for embedded systems. OpenGL ES 3.0 was recently ratified, and OpenGL ES 4.0 is under consideration and should be available in the future as well. Apress has several books covering Android 4 game development using OpenGL, including Beginning Android Games.

OpenGL ES is an optimized, embedded-devices version of the OpenGL 2.0 API that is used on computers and game consoles. OpenGL ES is highly optimized for use in embedded devices, similar to the way the Android Dalvik Virtual Machine optimizes your code by making sure there is no “fat” that the smartphone CPU and memory need to deal with, a streamlining of sorts. OpenGL ES 2.0 is a feature-parallel version of the full OpenGL 2.0 standard, so if what you want to do on Android is doable in OpenGL 2.x, it should be possible to do it in OpenGL ES 2.0.

The Android OpenGL ES 2.0 is a custom implementation but is somewhat similar to the J2ME JSR239 OpenGL ES API, with some minor deviations from this specification due to its use with the Java Micro Edition (JavaME) for cell phones.

To access the OpenGL ES 2.0 API, the usual approach is to create your own custom subclass of the GLSurfaceView class (itself a View subclass) and pair it with a GLSurfaceView.Renderer implementation, which provides you with access to the OpenGL ES 2.0 functions and operations. Your drawing is done in the onDrawFrame() method of the Renderer that you create; once the renderer is attached, you can call the OpenGL ES functional operations through the GLES20 class.
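
Here is a minimal sketch of such a renderer, with the few lines of Activity code needed to put it on the screen shown as comments; all it does is clear the screen to black on every frame:

import android.opengl.GLES20;
import android.opengl.GLSurfaceView;
import javax.microedition.khronos.egl.EGLConfig;
import javax.microedition.khronos.opengles.GL10;

public class MyGLRenderer implements GLSurfaceView.Renderer {
    @Override
    public void onSurfaceCreated(GL10 unused, EGLConfig config) {
        GLES20.glClearColor(0.0f, 0.0f, 0.0f, 1.0f);   // background color: opaque black
    }

    @Override
    public void onSurfaceChanged(GL10 unused, int width, int height) {
        GLES20.glViewport(0, 0, width, height);        // match the drawing area to the view
    }

    @Override
    public void onDrawFrame(GL10 unused) {
        GLES20.glClear(GLES20.GL_COLOR_BUFFER_BIT);    // redraw the background every frame
    }
}

// In your Activity's onCreate() you might then write:
//     GLSurfaceView glView = new GLSurfaceView(this);
//     glView.setEGLContextClientVersion(2);   // request an OpenGL ES 2.0 rendering context
//     glView.setRenderer(new MyGLRenderer());
//     setContentView(glView);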

More information on OpenGL ES can be found at: www.khronos.org/opengles/

Information about version 1.0 can be found at: www.khronos.org/opengles/1_X/

Information about version 2.0 can be found at: www.khronos.org/opengles/2_X/

Information about version 3.0 can be found at: www.khronos.org/opengles/3_X/

Android Developer Documents do in fact exist for OpenGL ES 1.0 and 1.1 at

http://developer.android.com/reference/javax/microedition/khronos/opengles/package-summary.html

FaceDetector

One of the coolest and most advanced concepts in the Android SDK is a facial recognition class called FaceDetector.

FaceDetector automatically identifies the faces of subjects inside a Bitmap graphic object. I would suggest using PNG24 (24-bit PNG) for the highest quality source data for this operation, as FaceDetector currently uses 16 bits of RGB data (5-6-5 bits for the R-G-B channel values, respectively) in its algorithm (so don’t use GIF or PNG8 format indexed-color images).

You create a FaceDetector object by using the public constructor FaceDetector:

public FaceDetector(int width, int height, int maxFaces)

The method you use to find faces in the bitmap file is findFaces(Bitmap bitmap, Face[] faces), which returns the number of faces successfully found.
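
Here is a minimal sketch, assuming photo is a Bitmap that has already been decoded in the required RGB_565 format (and with an even pixel width, which FaceDetector expects):

import android.graphics.Bitmap;
import android.graphics.PointF;
import android.media.FaceDetector;
...
int maxFaces = 5;   // an arbitrary upper bound chosen for this example
FaceDetector detector =
        new FaceDetector(photo.getWidth(), photo.getHeight(), maxFaces);
FaceDetector.Face[] faces = new FaceDetector.Face[maxFaces];

int found = detector.findFaces(photo, faces);   // returns the number of faces located

for (int i = 0; i < found; i++) {
    PointF midPoint = new PointF();
    faces[i].getMidPoint(midPoint);             // point midway between the eyes
    float eyeDistance = faces[i].eyesDistance();
    // Use midPoint and eyeDistance to highlight or crop the face here
}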

SoundPool

The SoundPool class is great for game development and audio playback applications on Android because it manages a pool of Audio Resources in an optimal fashion for Android Apps that use a lot of audio, or where audio is a critical part of the end-user’s overall experience.

A SoundPool is a collection of audio “samples,” such as sound effects or short songs that need to be loaded into Android memory from an external resource, either inside the application’s .apk file, or from an external file or from the internal file system.

The cool thing about the SoundPool Class is that it works hand in hand with the MediaPlayer Class that we looked at earlier to decode the audio into a raw PCM mono or stereo 16-bit CD quality audio stream. This makes it easier for an application to include compressed audio inside its APK, and then decompress it on application start-up, load it into memory, and then play it back without any hiccups when it is called or triggered within the Android application code.

It gets even more interesting. It turns out that SoundPool can also control the number of audio assets that are being simultaneously “rendered” or turned from data values into audio sound waves. Essentially this means that the SoundPool is an audio “Mixing Console” that can be used to layer audio in real-time to create custom mixes based on your gameplay or other application programming logic.

SoundPool defines a maxStreams parameter that limits the number of parallel audio streams that can be played, so that you can put a “cap” on the amount of processing overhead that is used to mixdown audio in your application, in case this starts to affect the visual elements that are also possibly rendering in real-time on the screen. If the maxStreams value is exceeded, then the SoundPool turns off individual audio streams based on their priority values, or if none are assigned, based on the age of the audio stream.

Individual audio streams within the SoundPool can be looped infinitely (using a value of −1) or any discrete number of times (0 to …); the loop count also starts from zero, so a loop setting of three plays the audio loop four times. Playback rates can also be scaled from 0.5 to 2.0, or from half the pitch to twice the pitch, allowing real-time pitch shifting, and with some clever programming, one could even simulate effects such as Doppler via fairly simple Java code. Samples can also be pitch-shifted to give a range of sound effect tones, or to create keyboard-like synthesizers.

SoundPool also lets you assign a Priority to your individual audio samples, with higher numbers getting higher priority. Priority only comes into play when the maxStreams value specified in the SoundPool Object is hit and an audio sample needs to be removed from the playback queue to make room for another audio sample playback request with a higher priority level. Be sure to prioritize your audio samples, so that you can have complete control of your audio and effects mixing during real-time playback.
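
Here is a minimal sketch of a SoundPool in use; it assumes a short clip has been saved as res/raw/laser.ogg (a hypothetical file name) and that the asynchronous load has had time to finish before play() is called:

import android.media.AudioManager;
import android.media.SoundPool;
...
// Allow up to four samples to be mixed together at once
SoundPool soundPool = new SoundPool(4, AudioManager.STREAM_MUSIC, 0);

// Load the sample and remember its ID (priority 1; loading happens asynchronously)
int laserId = soundPool.load(this, R.raw.laser, 1);

// Later: play at full left/right volume, priority 1, no looping, normal (1.0) playback rate
soundPool.play(laserId, 1.0f, 1.0f, 1, 0, 1.0f);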

MediaRecorder

In Chapter 11 we discussed the Android MediaPlayer class, which is commonly used to play back audio or video files. Android also can record audio and video media files at a high level of fidelity and the counterpart to the MediaPlayer class for this is, logically, the MediaRecorder class. It is important to note that MediaRecorder does not currently work on the Android smartphone emulators.

There are five main MediaRecorder classes that control the process of media recording. They are as follows (note that these are defined inside the MediaRecorder class, hence the dot notation):

  • MediaRecorder.AudioEncoder
  • MediaRecorder.AudioSource
  • MediaRecorder.OutputFormat
  • MediaRecorder.VideoEncoder
  • MediaRecorder.VideoSource

You construct a MediaRecorder object and operate on it using public methods such as start(), stop(), prepare(), release(), reset(), setAudioChannels(), setCamera(), setOutputFile(), and a plethora of other methods that control how the new media data is captured and stored on your Android device.
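
Here is a minimal audio-recording sketch; it assumes the RECORD_AUDIO permission has been declared in your AndroidManifest.xml, that outputPath is a writable file path of your choosing, and it omits the IOException handling that prepare() requires:

import android.media.MediaRecorder;
...
MediaRecorder recorder = new MediaRecorder();

// The order of these calls matters: source, then format, then encoder, then output file
recorder.setAudioSource(MediaRecorder.AudioSource.MIC);
recorder.setOutputFormat(MediaRecorder.OutputFormat.THREE_GPP);
recorder.setAudioEncoder(MediaRecorder.AudioEncoder.AMR_NB);
recorder.setOutputFile(outputPath);   // 'outputPath' is the assumed destination file path

recorder.prepare();
recorder.start();

// ... record for a while ...

recorder.stop();
recorder.release();   // free the recorder and its resources when finished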

More information on the MediaRecorder class can be found at

http://developer.android.com/reference/android/media/MediaRecorder.html

VideoView: Playing Video in Your Android Apps

As our final topic, just to make sure we cover everything major you will want to do in your Android apps in this book, we’ll look at how to simply and effectively play digital video files in your Android applications. You do this through a very handy class called VideoView. We are going to show the ability to play video in our application using only three lines of XML code and eight lines of Java code, or less than a dozen lines of code in total.

Adding a VideoView Object

For video playback, we would use the VideoView class. As with TextView and ImageView objects, VideoView objects make it easy to access MPEG-4 H.264 or WebM (VP8) video in your Android applications. Your video can either be in your /res/raw folder or streamed from a remote server, keeping your application download size to a minimum.

To add a VideoView to our activity_main.xml, place the following tag in your UI layout container:

<VideoView  android:id="@+id/videoView1"
            android:layout_height="match_parent"
            android:layout_width="match_parent" />

This names our VideoView and tells it to match its parent container size using match_parent. The match_parent value does the opposite of wrap_content. It sizes the content up to fill the layout container, rather than scaling the layout container down around the content.

Adding the Java for Video

First, add three new import statements for the classes we need:

import android.net.Uri;
import android.widget.VideoView;
import android.widget.MediaController;

To get the video from our server, we also need to define its path using a Uri object, so we must import the android.net.Uri class. We next import the VideoView widget (android.widget.VideoView). Finally, to play the video in the VideoView, we will use a MediaController object, so we import the android.widget.MediaController class as well.

Next, add the following to your MainActivity.java onCreate() method to create our VideoView object:

Uri vidFile = Uri.parse("http://commonsware.com/misc/test2.3gp");
VideoView videoView = (VideoView) findViewById(R.id.videoView1);
videoView.setVideoURI(vidFile);
videoView.setMediaController(new MediaController(this));
videoView.start();

First, we create the Uri reference object, which holds the path, or address, to the video file on the server. The uniform resource identifier (URI) can use the familiar HTTP server paradigm or a more advanced real-time streaming protocol. As you can see, here we are using the HTTP protocol, which works fine, and is the industry standard, thanks to the Internet. We create a Uri object called vidFile using the parse() method with the HTTP URL to any valid path and file name in quotes. Here, the Uri object points to the content at http://commonsware.com/misc/test2.3gp, so that we have some video to play.

Now we have an object called vidFile that contains a reference to our video file. Next, we set up our VideoView object, calling it videoView and using findViewById() to locate the videoView1 ID we created in our XML layout file. This is the same thing we have been doing with the other View types, and should be very familiar to you at this point.

Now that we have a VideoView object, we use the setVideoURI() method to pass the vidFile Uri object to the videoView VideoView object, so that the VideoView is loaded with the file path to use to retrieve the video. Now our Uri is wired into our VideoView, and we need only to wire the MediaController into the VideoView so that the video can be played.

Alternatively, you could declare a new MediaController object named mediaControl and then, on the next line of code, connect that MediaController to the videoView object using the .setMediaController() method; this two-line form is equivalent to the single line shown earlier:

MediaController mediaControl = new MediaController(this);
videoView.setMediaController(mediaControl);

Finally, to start our videoView object playing, we send it a .start() method call, like this:

videoView.start();

New Features in Android 4.2

Right before we went to press (the holiday weekend before, actually) Android 4.2 was released, with a lot of operating system level features that will automatically make your apps better without your intervention, which is a boon to absolute beginners, so we’re covering them here. Some of these include a new user interface element style treatment, performance (speed) optimizations, international language optimizations, dozens of security enhancements, HDR Camera support, and support for multiple users on a single Android device. Other features that we cover in this book, such as SoundPool and 3D Rendering via hardware GPUs, were also greatly enhanced with new code and features. I will point out some of the other new Android 4.2 API Level 17 (called: Jelly Bean Plus, or JellyBean+) additions in this section, so that you know about them and can further investigate their relevance to your Android application development on your own.

Presentation is a big area of enhancement in Android 4.2; in fact, there is a new Presentation Dialog UI element that you can code to, as well as support for Miracast technology. Using Miracast, users with Miracast-certified Android devices can connect to external displays (even non-Android displays) over Wi-Fi via Miracast, a peer-to-peer wireless display standard created by the Wi-Fi Alliance. When a wireless display is connected to an Android device, Android users can stream any type of media content to that external screen, including images, photos, videos, maps, audio, and much more. Your applications can now take advantage of wireless displays in much the same way as they currently can with direct-wired external displays, and no extra programming work is needed. Hey! That’s another absolute beginner’s feature! The Android operating system manages all of the network connections for you, and even automatically streams your Presentation or application content over to the wireless displays as needed. So be sure to research the new Presentation classes and the Wi-Fi Alliance Miracast technology soon.

Two more areas you will want to take a look at if you are into widget development are Lock Screen Widgets and the new Daydream interactive screen saver mode. Lock Screen Widgets are fairly self-explanatory; they are widgets that display on your users’ lock screens! This is a really cool new feature, as it essentially allows developers to provide custom lock screens. Android users can have up to five Lock Screen Widgets, each of which can have its own panel in the five-panel screen navigator that became so popular via the Android RAZR. Lock Screen Widgets can display any kind of content, and they can accept direct user interaction. They can be entirely self-contained, such as a widget that controls audio playback, and can even allow users to jump straight into an application Activity (after unlocking, of course).

Daydreams merge the capabilities of live wallpaper with home screen widgets; they are essentially an interactive screen saver mode that starts up when an Android device is docked and charging. A Daydream is really a remote content service that is provided to the user via an installed application, only in the form of an Android device screen saver. Your users can turn on Daydream mode using their Settings controls and also choose the daydream that they wish to utilize. Both of these new capabilities allow developers to increase the user engagement impact of their application significantly, so be sure to check out these new Android 4.2 features when you have a moment. More information about Android 4.2 Jelly Bean Plus features and APIs can be found via the following two links:

http://developer.android.com/about/versions/jelly-bean.html

http://developer.android.com/about/versions/android-4.2.html

Summary

There are a lot of really great features in Android that we simply do not have enough time to cover in one book, or that are too complex for an absolute beginners’ book. That doesn’t mean that you should not investigate all of the cool features that Android has to offer on your own, however, so this chapter introduced some that are very powerful and advanced for a mobile phone, tablet, iTV, and e-reader operating system.

We started out covering the most important area that you will want to research: troubleshooting. Fortunately, the Android Developer website has a troubleshooting section for Android development research, and this page has references on the left side to some of the most key areas that you will want to research further once you are finished with this book. Please make sure that you do so; it will considerably increase your overall knowledge of what Android is currently capable of.

Widgets are another key part of Android, and something that you may well want to create at some point in time, as they are like mini-apps that users can use on their desktops. They are so important, in fact, that Android has given them their own Class and section of the Android Developer website.

Location-based services and Google Maps are another ideal way to add really cool features to your apps with relatively little programming effort, and Google Search is another great way to leverage things that Google has already mastered and is offering on its servers; Google generously allows you to offer these features to your users, again with almost zero coding effort on your part.

Where graphics are concerned, there is no more powerful open source graphics library than OpenGL, and Android 2.2 through 4.1 implements the latest OpenGL ES 2.0 technology, just like HTML5 does currently. Because Android phones have built-in GPU (graphics processing unit) hardware, you can “render” real-time 3D on the fly, to visualize just about anything that you want to within your application, and in three dimensions to boot! This is another very logical reason to use that default (suggested) API Level 8 through 16 support that is set as a default in the New Android Application Project series of dialogs in Eclipse and that we have been using throughout this book.

There are many other interesting areas to be discovered in Android as well, from creating your own Audio Mixing App using the SoundPool Audio Engine to creating your very own SQLite databases to using the Smartphone or Tablet Camera and MediaRecorder class to using the Face Recognition class to identify unique people within your content. All of this is covered in fine detail on the Google developer.android.com website, so be sure to go there to explore at length to enhance your knowledge of the literally thousands of interesting features in Android OS, with many more to come soon in the Android OS Version 5.0!
