“Test early and often” is a common cry among advocates of testing, as is the all-important question, “If you don’t have a test, how do you know your code works?”
There are many types of testing. Unit testing checks out individual components (“units” such as methods) in isolation (not hitting the network or the database), whereas integration testing tests the entire system, or at least large swaths of it. JUnit and TestNG are the leading unit testing frameworks for Java. Mock objects are used where interaction with other components is required; there are several good mocking frameworks for Java. Android provides a number of specific testing techniques, many of which are discussed in this chapter.
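The idea behind mock objects can be sketched in plain Java without any framework; the `GreetingSource` and `Greeter` names below are invented for illustration and are not from this chapter's examples:

```java
// Illustrative sketch: a hand-rolled fake stands in for a slow or
// network-backed dependency, so the "unit" under test runs in isolation.
interface GreetingSource {
    String fetchGreeting(); // in production this might hit the network
}

class Greeter {
    private final GreetingSource source;

    Greeter(GreetingSource source) {
        this.source = source;
    }

    String greet(String name) {
        return source.fetchGreeting() + ", " + name + "!";
    }
}
```

A mocking framework automates the creation of such stand-ins, adding stubbing and call verification on top of this basic dependency-injection pattern.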
In the broader scheme of things, software verification tools can be categorized as static or dynamic. JUnit is one example of a widely used method of dynamic testing, as is integration testing. Static code analysis works by examining the code rather than running it. Two well-known static analysis tools are FindBugs and PMD, which are covered in my book and my video series on testing. This site also has a bibliography of testing books/papers and a list of Java-specific testing tools. Android has its own static analysis tool, Android Lint, covered in Recipe 3.13.
Android apps can be run on a vast array of devices, including small phones, mid-sized tablets, large phones, large tablets, and (as of ChromeOS Release 53) most Chromebooks. They also run on many proprietary readers such as the Amazon Kindle Fire tablets. Although we show how to test using the emulator in Recipe 3.1, you will want to have several real devices for testing, because the emulator is, after all, an emulator.
The terms NPE, ANR, and FC are used throughout this chapter. NPE is a “traditional Java” acronym for NullPointerException. ANR is an Android-specific acronym; it stands for Application Not Responding, the first few words of a dialog you get when your application is judged to be taking too long to respond to a request. FC stands for Force Close, which occurs when Android requests that you close a failed application.
To put a real device into “developer mode,” go into Settings → “About phone” (or tablet). At the bottom you will see a “Build number” entry. Tap seven times on “Build number” and it will say something like “Congratulations, you are now a developer!” and will enable the developer options in the main Settings screen.
Daniel Fowler
Use the Android SDK’s device emulation toolkit to configure combinations of devices and operating systems. Testing on various combinations reduces issues related to hardware differences in devices.
Android devices are manufactured to cover a wide market, from low cost to high specification and high value, and Android has been in the marketplace for many years. For these reasons, a wide range of devices with a wide range of hardware options and operating system versions are in use. A successful application is one that can run on a broad array of devices. An app developer will usually only be able to test on a very small range of physical devices, but fortunately a developer can boost confidence in an app by using an Android Virtual Device (AVD).
A compiled app can be tested on a physical device or on a virtual device. An AVD is an emulation of an Android platform on a host machine, usually the development machine. AVDs simplify testing for these reasons:
Multiple AVD configurations can be created to test an app on different versions of Android.
Different (emulated) hardware configurations can be used—for example, GPS or no GPS.
An AVD is automatically launched and your compiled app is installed onto it when the Run button is clicked in your IDE.
You can test your app on many more combinations of Android versions and hardware versions than physical devices you possess.
Testing on AVDs greatly reduces the amount of testing required on physical devices.
AVDs can be used alongside a physical device.
You don’t need to handicap your physical device to induce error conditions—for example, to test on a device with no Secure Digital (SD) card, just set up an AVD with no SD card.
An AVD can simulate network events without the costs involved in using a physical device; for example, you can simulate phone calls or send an SMS message between two AVDs.
You can simulate GPS data from an AVD from different physical locations without moving from your desk.
When app users report bugs you can try to mimic their hardware configurations using AVDs.
Testing on an AVD can avoid messing up your real device.
Note that on underconfigured development machines, and when emulating larger Android devices, the performance of an AVD will often be lower than that of a physical device.
You can configure an AVD using the SDK Manager program (opened directly from the filesystem or from within Eclipse). It is also possible to create AVDs from the command line. Note that the screenshots in this recipe, and the options they carry, will vary depending on what release of the Android SDK tools you have installed.
To create an AVD with the SDK Manager, you must first load the program. When using most IDEs, there is an AVD Manager icon, the Studio version of which is shown in Figure 3-1.
You can also start the program directly from the filesystem. For example, in Windows, open C:\Program Files\Android\android-sdk\SDK Manager.exe. If you started the program directly from the filesystem, the SDK Manager will check for SDK updates, in which case select Cancel to go to the main window, titled “Android SDK and AVD Manager.” If you opened the program from your IDE, the main window will appear without the check for updates to the SDK.
The Virtual Device Configuration wizard loads. Choose an existing profile or create a new one (Figure 3-2).
The following fields are used to define an AVD:

Name
Give a name to the new Android device that is to be emulated. Make the name descriptive—for example, if you’re emulating a device with a version 5.1 operating system and medium-resolution screen (HVGA), a name such as Android-v5.1-HVGA is better than AndroidDevice. The name may not contain spaces.

Target
This is the version of the Android operating system that will be running on the emulated device. As an example, for a device running version 6.0 this will be set to “Android 6.0-API Level 23.”

SD Card
Here you specify the size of the device’s emulated SD card, or select an existing SD card image (allowing the ability to share SD card data among different AVD emulations). To specify a new SD card, enter the size in megabytes (MB). Remember: the bigger the number, the bigger the file created on the host computer system to mimic the SD card. Alternatively, select the File option and browse to an existing SD card image (on a Windows machine the sdcard.img files will be found in the subfolders of the avd directory under the .android directory in the logged-on user’s folder).

Snapshot
Check the Enabled box if you want the runtime state of the emulated device to persist between sessions, which is useful if a long-running series of tests is being performed and when the AVD is closed you do not want to have to start the tests from the beginning. It also speeds up the start-up time of an AVD.

Skin
Here you select the screen size for the device; a list of common screen sizes is presented (e.g., HVGA, QVGA, etc.). The list will vary depending on the operating system version. Alternatively, a custom resolution can be entered.
The following table lists the major choices you have to make in creating an AVD.
| Name | Data type | Value | Description |
|---|---|---|---|
| Device | Choice | One of listed | List of known devices |
| Target | Choice | One of listed | List of API levels |
| CPU/ABI | Choice | One of listed | List of CPUs: ARM, Intel, etc. |
| Keyboard | Boolean | Yes or no | Controls emulation of a physical keyboard (as opposed to an onscreen one) |
| Skin | Choice | One of listed | Size of screen, e.g., QVGA, WVGA, etc. |
| Front Camera | Choice | One of None, Emulated, Webcam | User-facing camera |
| Back Camera | Choice | One of None, Emulated, Webcam | Outward-facing camera |
| Memory Options: RAM | Integer | Megabytes | Determines the size of the AVD’s total main memory |
| Memory Options: VM Heap | Integer | Megabytes | Determines the size of the AVD’s heap memory, for allocations |
| Internal Storage | Integer | Megabytes | Determines the size of the AVD’s internal memory, for running applications |
| SD card support | Size or File | - | SD card allocated (if MB specified) or existing file used |
| Emulation options | Radio | Snapshot or Use Host GPU | Enable one of two performance options |
When you have selected a device and pressed Next, you will see the System Image selection screen (Figure 3-3). Choose your operating system version here, and click the Finish button to generate the AVD. The AVD will now be listed in the “Android SDK and AVD Manager” window (see Figure 3-4).
The AVD is ready to be launched using the Start button. It is also ready to be selected in a project configuration to test an app under development. When the Start button is clicked, the Launch Options window is shown (see Figure 3-5).
The options at launch are:

Scale display to real size
On larger computer monitors you will not normally need to change the AVD scale. The dpi of the Android screen is greater than the standard dpi on computer monitors; therefore, the AVD screen will appear larger than the physical device. If necessary this can be scaled back to save screen space. Use this option to get the AVD to display at an approximate real size on the computer monitor. The values need to be set so that the AVD screen and keyboard are not too small to be used.

Wipe user data
When the AVD is started the user data file is reset and any user data generated from previous runs of the AVD is lost.

Launch from snapshot
If Snapshot has been enabled for an AVD, after it has been first launched subsequent launches are quicker. The AVD is loaded from a snapshot and the Android operating system does not need to start up again. However, when the AVD is closed the shutdown takes longer because the snapshot has to be written to disk.

Save to snapshot
When the AVD is closed the current state is saved for quicker launching next time; the downside is that it takes longer to close as the snapshot is written to disk. Once you have a snapshot you can uncheck this option so that closing an AVD is quick as well, though any changes since the last snapshot will be lost.
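The real-size scaling described above is simple arithmetic: to show the AVD at its physical size, the emulator window must be scaled by the ratio of the monitor’s dpi to the device’s dpi. A quick sketch (the class name and the dpi figures are illustrative, not from the SDK):

```java
// Illustrative arithmetic only: the scale needed to show an AVD screen
// at approximately its real physical size on the host monitor.
class EmulatorScale {
    // e.g., a 240 dpi device shown on a 96 dpi monitor needs scale 96/240 = 0.4
    static double realSizeScale(double monitorDpi, double deviceDpi) {
        return monitorDpi / deviceDpi;
    }

    // How many monitor pixels a device screen dimension occupies at real size
    static long monitorPixels(int devicePixels, double monitorDpi, double deviceDpi) {
        return Math.round(devicePixels * realSizeScale(monitorDpi, deviceDpi));
    }
}
```

This is why a high-dpi phone screen rendered pixel-for-pixel on a typical monitor looks much larger than the physical device.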
Use the Launch button to start the AVD. Once loaded it can be used like any other Android device and driven from the keyboard and mouse of the host computer. See Figure 3-6.
AVDs combined with physical devices are a useful combination for testing apps in a variety of scenarios.
The developer documentation on running apps on the emulator.
Ian Darwin
Use one of several web-based or cloud-based app testing services.
When Android was young, it was perhaps feasible to own one of each kind of device, to be able to say you had tested it on everything. I have half a dozen Android devices, most of them semiexpired, for this purpose. Yet today there are hundreds of different devices to test on, some with two or three different OS versions, different cell radios, and so on. It’s just not practical for each developer to own enough devices to test on everything. That leaves two choices: either set up a hundred different AVDs, as discussed elsewhere in this chapter, or use a cloud-based or web-based testing service.
The basic idea is that these test-hosting companies buy lots of devices, and put them in server rooms with a webcam pointed at the screen and USB drivers that transfer keystrokes and touch gestures from your web browser–based control program to the real devices. These devices are in cities around the world, so you can test while online with various mobile service providers, get GPS coordinates from the real location, and so on.
Here are some of the providers in this space, listed in alphabetical order. Some are Android-specific while some also cover iOS, BlackBerry, and other devices. Listing them here does not constitute an endorsement of their products or services; caveat emptor!
Adrián Santalla
Here’s how to create and use a test project:
Within Eclipse, create a new Android project associated with your Android application project.
Configure the AndroidManifest.xml file of your test project with the necessary lines to test your Android application.
Write and run your tests.
The following subsections describe the preceding steps in more detail.
First of all, you need to create a new Android project alongside the main application project to store your tests. This new Eclipse project should have an explicit dependency on your main application project. The Eclipse New Android Project wizard will create this and set it up correctly when you create the original project, if you remember to click the “Create Test Project” checkbox. Figure 3-7 shows the Eclipse project structure: two projects.
HelloTestingTarget
is the target of our tests; that is, the main application.
HelloTestingTestProject
is, of course, the test project.
Once you have created your new test project, you should properly set all the values of the project’s AndroidManifest.xml file. It’s necessary to set the package name of the main source of the application that you would like to test.
Imagine that you are testing an application whose package name is my.pkg.app. You should create a test project, and your AndroidManifest.xml file should look like the code in Example 3-1.
```xml
<?xml version="1.0" encoding="utf-8"?>
<manifest xmlns:android="http://schemas.android.com/apk/res/android"
    package="my.pkg.app.tests"
    android:versionCode="1"
    android:versionName="1.0">
    <application>
        <uses-library android:name="android.test.runner" />
    </application>
    <instrumentation
        android:name="android.test.InstrumentationTestRunner"
        android:targetPackage="my.pkg.app"
        android:label="Tests for my.pkg.app" />
</manifest>
```
The package attribute of the manifest tag stores the package name of the test project; more importantly, the android:targetPackage attribute of the instrumentation tag stores the package name that you would like to test. Again, the Eclipse wizard will set this up if you create the main and test projects at the same time. The resulting structure was shown in Figure 3-7.
Now you can write your tests. The Android testing API has traditionally been based on the JUnit 3.8 API (although it is now possible to use JUnit 4, as in Recipe 3.4) and provides several types of test classes, including AndroidTestCase, ActivityInstrumentationTestCase2, ApplicationTestCase, and InstrumentationTestCase.

When you create your first test case with your IDE, it is useful to create a test case that inherits from ActivityInstrumentationTestCase2. This kind of test class allows you to create functional tests. Example 3-2 shows a simple functional test.
```java
public class MainTest extends ActivityInstrumentationTestCase2<Main> {

    public MainTest() {
        super("my.pkg.app", Main.class);
    }

    public void test() {
        TextView textView = (TextView) getActivity().findViewById(R.id.textView);
        assertEquals("Hello World!", textView.getText());
    }
}
```
The Main class that appears as a type parameter in the test class is the main Activity of the main application project. The test constructor uses the main application package name and the class of the main Activity. From now on, you can create test cases using the standard methods of the Android API to get references to the Activity elements. In the preceding test we are testing that the main Activity has a TextView with the text “Hello World!” associated with it.
The source code for this project is in the Android Cookbook repository, in the subdirectory HelloTestingTarget (see “Getting and Using the Code Examples”); the test project is in HelloTestingTestProject.
Ian Darwin
For standalone unit testing, use the test folder; for full Android unit testing, use the androidTest folder.
For the purposes of this exercise, we’ll create a new Android Studio project (see Recipe 1.10).
Name the project “HelloStudioTesting” and use a package name like com.example.
On the next screen, select “Phone and Tablet” and pick your favorite API level.
After Gradle gets through grinding, you will see a project structured rather like Figure 3-8.
Note in particular the presence of two test folders, with the same package name, labeled test and androidTest. The project is set up to support the two most common types of testing: plain Java JUnit testing and Android-specific testing. The latter uses JUnit but is more like an integration test than a unit test. Unit tests test code in isolation; the androidTest mechanism supports testing the common Android components, and runs them in a test application running on an AVD emulator or on a real device. We’ll use both in this recipe.
Out of the box, the test folder contains an example test. We’ll replace that with this test case:
```java
public class DataModelTest {

    @Test
    public void NameCorrect() throws Exception {
        assertEquals("Robin Good", new DataModel().getName());
    }
}
```
Note the modern JUnit conventions: use of arbitrary method names and the @Test annotation to mark test methods.

To make this test work we’ve created a class called DataModel with a hardcoded getName() method.
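A minimal sketch of such a class, assuming the hardcoded value matches what the test expects:

```java
// Minimal sketch of the DataModel class the unit test exercises.
// The returned value is hardcoded purely so the example test can pass;
// a real data model would compute or load this value.
class DataModel {
    public String getName() {
        return "Robin Good";
    }
}
```

Because DataModel has no Android dependencies, its test can live in the test folder and run on the development machine’s ordinary JVM.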
The key thing to understand is that this test methodology uses standard JUnit and runs it in a standard Java Virtual Machine (JVM), so you can test any “normal” Java component, but you cannot test anything that depends on the Android framework! We’ll cover that type of testing in a few paragraphs.
Note first that the sample JUnit test has a code comment (which I left in; see Figure 3-9) that says:

To work on unit tests, switch the Test Artifact in the Build Variants view.
In practice, I find it easier just to right-click the test module, as in Figure 3-10.
At this stage of trivial examples you should expect your test to pass the first time, but it’s still reassuring to see the green bar appear (see Figure 3-11), which indicates that 100% of the tests passed.
The androidTest package is for testing Android-specific functionality, such as Activity code. This testing mechanism is based on JUnit 3.8, where inheritance from TestCase is required and annotations are not used. You could use JUnit 4 here by adding a test runner and creating an alternate test configuration; we’ll use this approach in Recipe 3.6. There are several base test classes, some of which are now deprecated. We’ll use the ActivityInstrumentationTestCase2 class and ignore the deprecation warnings.
Our example test looks like this:
```java
public class MainTest extends ActivityInstrumentationTestCase2<MainActivity> {

    public MainTest() {
        super("my.pkg.app", MainActivity.class);
    }

    public void test() {
        TextView textView = (TextView) getActivity().findViewById(R.id.textView);
        assertEquals("Hello World!", textView.getText());
    }
}
```
This code is creating the main Activity, finding the TextView, and asserting that the TextView contains the correct text. We run this by right-clicking the androidTest folder (not the test folder as in Figure 3-10) and selecting the Run menu item, as shown in Figure 3-12.
Again, something this simple should pass on the first run, and we should get a green bar as in Figure 3-13.
These are both trivial tests of the “Hello, World” variety, but they show what you need to build up and run tests of greater complexity of both types: plain JUnit and Android-specific tests.
The source code for this example is in the Android Cookbook repository, in the subdirectory HelloStudioTesting (see “Getting and Using the Code Examples”).
The Android Studio User Guide documentation on testing.
Ian Darwin
Use Robolectric, a fast JUnit 4 test runner.
These instructions are set up for Eclipse.
Assuming you have your “main” project set up as a normal Android project, create a folder called, e.g., test in this project (do not mark it as a source folder), and then do the following:
Create a separate project using the New Project wizard (not using the New Android Test Project wizard).
Make this project depend on your main project (Build Path → Configure).
Remove the default source folder, src, from the new project’s build path.
Still in the build path, click “Link additional source”; browse to and select /MainProject/test.
Add Robolectric-3.1.jar to the new project’s classpath, either by copying it into libs or by specifying it in your build script (Maven or Gradle).
Add JUnit 4 (not 3.8!) to your new project’s classpath, either explicitly, or implicitly by choosing JUnit 4 from the “New Class → JUnit Test” wizard.
Annotate your JUnit 4 tests to run with the Robolectric test runner (see the following example).
Use Robolectric “shadow” classes where needed.
Next, create an Eclipse run configuration with the following special attributes (this section is adapted from the Robolectric website):
Choose Run → Run Configurations.
Double-click JUnit (not Android JUnit Test).
Name the project “MyProjectTestConfiguration.”
Select the “Run all tests in the selected project, package or source folder” radio button.
Click the Search button.
Select MyProjectTest.
Choose JUnit 4 as the test runner.
Click the link “Multiple launchers available - Select one” at the bottom of the dialog.
Check the “Use configuration specific settings” box.
Select Eclipse JUnit Launcher.
Click OK.
Click the Arguments tab.
Under “Working directory,” select the Other radio button.
Click Workspace.
Select MyProject (not MyProjectTest; the value inside the Other edit box should be ${workspace_loc:MyProject}).
Click OK.
Click Close.
Now run your new run configuration. Example 3-3 is a sample Robolectric unit test.
```java
@RunWith(RobolectricTestRunner.class)
public class HistoryActivityTest {

    private HistoryActivity activity;
    private Button listButton;

    @Before
    public void setup() {
        activity = new HistoryActivity();
        activity.onCreate(null);
        listButton = (Button) activity.findViewById(R.id.listButton);
    }

    @Test
    public void didWeGetTheRightButton() {
        assertEquals("History Log (Morning)", (String) listButton.getText());
    }

    @Test
    public void listButtonShouldLaunchListActivity() throws InterruptedException {
        assertNotNull(listButton);
        boolean clicked = listButton.performClick();
        assertTrue("performClick", clicked);
        ShadowActivity shadowActivity = Robolectric.shadowOf(activity);
        Intent startedIntent = shadowActivity.getNextStartedActivity();
        assertNotNull("shadowActivity.getNextStartedActivity == null?", startedIntent);
        ShadowIntent shadowIntent = Robolectric.shadowOf(startedIntent);
        assertEquals(WeightListActivity.class.getName(),
                shadowIntent.getComponent().getClassName());
    }
}
```
Ian Darwin
Use Espresso, part of the Android Testing Support Library (ATSL). Espresso uses JUnit 4 (as does Robolectric), but still requires that the tests be packaged and run on an emulator or device.
Espresso is a relatively new testing framework that’s designed to bring the advantages of JUnit 4 and Hamcrest matching styles to Android testing. As with the previous generation of Android tests, Espresso tests are packaged into an APK and sent to an emulator or real device. Robolectric (Recipe 3.5) may be faster since it runs on a specialized JVM on the development machine, but Espresso is usually easier to use. And since (as the name ATSL implies) it is a support library, Espresso tests can run on devices as old as API 8. The official documentation emphasizes UI interaction examples, but Espresso is not limited to this use.
To configure Espresso, you need to add some entries to the build.gradle for your application (usually this is in the app folder). Typically, you need to add the testInstrumentationRunner, compile, and androidTestCompile settings shown in Example 3-4.
```groovy
apply plugin: 'com.android.application'

android {
    compileSdkVersion 22
    buildToolsVersion "22.0.1"
    defaultConfig {
        applicationId "com.example.myapp"
        minSdkVersion 10
        targetSdkVersion 22
        versionCode 1
        versionName "1.0"
        testInstrumentationRunner "android.support.test.runner.AndroidJUnitRunner"
    }
}

dependencies {
    compile 'com.android.support:support-annotations:22.2.0'
    androidTestCompile 'com.android.support.test:runner:0.5'
    androidTestCompile 'com.android.support.test.espresso:espresso-core:2.2.2'
}
```
Now you can write your tests. Every test class will need the @RunWith(AndroidJUnit4.class) annotation.
For our “Hello, World” test, we are going to test a simple application that displays a text field, a button, and a second text field. When you click the button, the text from the first text field is copied to the second, to simulate starting some long-running Activity. Here is the core of the Activity:
```java
protected void onCreate(Bundle savedInstanceState) {
    super.onCreate(savedInstanceState);
    setContentView(R.layout.activity_main);
    TextView tv = (TextView) findViewById(R.id.tvTarget);
    EditText et = (EditText) findViewById(R.id.tf);
    Button b = (Button) findViewById(R.id.startButton);
    b.setOnClickListener(v -> {
        tv.setText(et.getText());
    });
}
```
To test this, we will simulate the user typing something in the text field and clicking the button. We also need to close the soft keyboard to ensure that it doesn’t remain on screen, hiding the target text field. Then we check to ensure that the text in the target has changed. Example 3-5 shows the complete test code, including imports—unusual for this book, but these are a bit complex to guess at. The @Rule is explained shortly.
```java
package com.example.helloespressotesting;

import android.support.test.rule.ActivityTestRule;
import android.support.test.runner.AndroidJUnit4;

import org.junit.Rule;
import org.junit.Test;
import org.junit.runner.RunWith;

import static android.support.test.espresso.Espresso.closeSoftKeyboard;
import static android.support.test.espresso.Espresso.onView;
import static android.support.test.espresso.action.ViewActions.click;
import static android.support.test.espresso.action.ViewActions.typeText;
import static android.support.test.espresso.assertion.ViewAssertions.matches;
import static android.support.test.espresso.matcher.ViewMatchers.withId;
import static android.support.test.espresso.matcher.ViewMatchers.withText;

/**
 * Demonstrate use of Espresso testing.
 */
@RunWith(AndroidJUnit4.class)
public class MainActivityTest {

    @Rule
    public ActivityTestRule<MainActivity> mActivityRule =
            new ActivityTestRule<>(MainActivity.class);

    @Test
    public void changeText_sameActivity() {
        final String MESG = "Hello Down There!";

        // Simulate typing text into 'tf'
        onView(withId(R.id.tf)).perform(typeText(MESG));
        closeSoftKeyboard();

        // Simulate clicking the Button 'startButton'
        onView(withId(R.id.startButton)).perform(click());

        // Find the target, and check that the text was changed
        onView(withId(R.id.tvTarget)).check(matches(withText(MESG)));
    }
}
```
Most Activity-testing classes will need a @Rule (a JUnit annotation) to specify the ActivityTestRule rule and to identify which Activity is needed. This rule handles all the grunt work of setting up the Activity, running on the correct thread, etc. It is run before the @Before (if any) and each @Test-annotated method, so you get a clean Activity instance for each test:
```java
@Rule
public ActivityTestRule<MainActivity> mActivityRule =
        new ActivityTestRule<>(MainActivity.class);
```
In your test method you can do the expected operations: find a view, click a button, and check the results. To run your test, you have to configure the AVD and create a test runner configuration. On the AVD or device you want to test, go into Settings → “Developer options” and turn off the following three options:
Window animation scale
Transition animation scale
Animator duration scale
If you want to run the tests from the command line, you can just type ./gradlew connectedAndroidTest, which will run all the tests in androidTest. To run under Android Studio, you need to create a test runner configuration (see Figure 3-14):
Select Run → Edit Configurations.
Add a new Android Tests configuration (click the + button in the upper left).
Assign a meaningful name such as “Espresso Tests.”
Choose a module (usually it’s just “app”).
Add android.support.test.runner.AndroidJUnitRunner as an instrumentation test runner.
Now you can run your tests using this configuration. As always, you should get a green bar, as shown in Figure 3-15.
There is more detail on some aspects of Espresso in the “Automating UI Tests” Android training guide. There is also a collection of test examples in different styles on GitHub.
The source code for this example is in the Android Cookbook repository, in the subdirectory HelloEspressoTesting (see “Getting and Using the Code Examples”).
Ulysses Levy
Your app crashes and you are not sure why (see Figure 3-16).
We can use the adb logcat command or the IDE LogCat window to view our AVD’s log. Example 3-6 shows how to find the failure location by looking in the stack trace using adb logcat.
```
E/DatabaseUtils( 53): Writing exception to parcel
E/DatabaseUtils( 53): java.lang.SecurityException: Permission Denial:
    writing com.android.providers.settings.SettingsProvider uri
    content://settings/system from pid=430, uid=10030 requires
    android.permission.WRITE_SETTINGS
E/DatabaseUtils( 53):   at android.content.ContentProvider$Transport.
    enforceWritePermission(ContentProvider.java:294)
E/DatabaseUtils( 53):   at android.content.ContentProvider$Transport.
    insert(ContentProvider.java:149)
E/DatabaseUtils( 53):   at android.content.ContentProviderNative.
    onTransact(ContentProviderNative.java:140)
E/DatabaseUtils( 53):   at android.os.Binder.execTransact(Binder.java:287)
E/DatabaseUtils( 53):   at com.android.server.SystemServer.init1(Native Method)
E/DatabaseUtils( 53):   at com.android.server.SystemServer.main(SystemServer.java:497)
E/DatabaseUtils( 53):   at java.lang.reflect.Method.invokeNative(Native Method)
E/DatabaseUtils( 53):   at java.lang.reflect.Method.invoke(Method.java:521)
E/DatabaseUtils( 53):   at com.android.internal.os.
    ZygoteInit$MethodAndArgsCaller.run(ZygoteInit.java:860)
E/DatabaseUtils( 53):   at com.android.internal.os.
    ZygoteInit.main(ZygoteInit.java:618)
E/DatabaseUtils( 53):   at dalvik.system.NativeStart.main(Native Method)
D/AndroidRuntime( 430): Shutting down VM
W/dalvikvm( 430): threadid=3: thread exiting with uncaught exception
...
```
In Example 3-6, we have a permission issue. The solution in this particular instance is to add the WRITE_SETTINGS permission to our AndroidManifest.xml file:
```xml
<manifest ... >
    <application ... />
    <uses-permission
        android:name="android.permission.WRITE_SETTINGS" />
</manifest>
```
Another fairly common error is the Null Pointer Exception (NPE). Example 3-7 shows the LogCat output you might see when getting an NPE.
```
I/ActivityManager( 53): Displayed activity com.android.launcher/.Launcher:
    28640 ms (total 28640 ms)
I/ActivityManager( 53): Starting activity: Intent { act=android.intent.action.MAIN
    cat=[android.intent.category.LAUNCHER] flg=0x10200000 cmp=com.aschyiel.disp/.Disp }
I/ActivityManager( 53): Start proc com.aschyiel.disp for activity
    com.aschyiel.disp/.Disp: pid=214 uid=10030 gids={1015}
I/ARMAssembler( 53): generated scanline__00000177:03515104_00000001_00000000
    [ 73 ipp] (95 ins) at [0x47c588:0x47c704] in 2087627 ns
I/ARMAssembler( 53): generated scanline__00000077:03545404_00000004_00000000
    [ 47 ipp] (67 ins) at [0x47c708:0x47c814] in 1834173 ns
I/ARMAssembler( 53): generated scanline__00000077:03010104_00000004_00000000
    [ 22 ipp] (41 ins) at [0x47c818:0x47c8bc] in 653016 ns
D/AndroidRuntime( 214): Shutting down VM
W/dalvikvm( 214): threadid=3: thread exiting with uncaught exception (group=0x4001b188)
E/AndroidRuntime( 214): Uncaught handler: thread main exiting due to uncaught exception
E/AndroidRuntime( 214): java.lang.RuntimeException: Unable to start activity
    ComponentInfo{com.aschyiel.disp/com.aschyiel.disp.Disp}: java.lang.NullPointerException
E/AndroidRuntime( 214):   at android.app.ActivityThread.performLaunchActivity(ActivityThread.java:2496)
E/AndroidRuntime( 214):   at android.app.ActivityThread.handleLaunchActivity(ActivityThread.java:2512)
E/AndroidRuntime( 214):   at android.app.ActivityThread.access$2200(ActivityThread.java:119)
E/AndroidRuntime( 214):   at android.app.ActivityThread$H.handleMessage(ActivityThread.java:1863)
E/AndroidRuntime( 214):   at android.os.Handler.dispatchMessage(Handler.java:99)
E/AndroidRuntime( 214):   at android.os.Looper.loop(Looper.java:123)
E/AndroidRuntime( 214):   at android.app.ActivityThread.main(ActivityThread.java:4363)
E/AndroidRuntime( 214):   at java.lang.reflect.Method.invokeNative(Native Method)
E/AndroidRuntime( 214):   at java.lang.reflect.Method.invoke(Method.java:521)
E/AndroidRuntime( 214):   at com.android.internal.os.ZygoteInit$MethodAndArgsCaller.run(ZygoteInit.java:860)
E/AndroidRuntime( 214):   at com.android.internal.os.ZygoteInit.main(ZygoteInit.java:618)
E/AndroidRuntime( 214):   at dalvik.system.NativeStart.main(Native Method)
E/AndroidRuntime( 214): Caused by: java.lang.NullPointerException
E/AndroidRuntime( 214):   at com.aschyiel.disp.Disp.onCreate(Disp.java:66)
E/AndroidRuntime( 214):   at android.app.Instrumentation.callActivityOnCreate(Instrumentation.java:1047)
E/AndroidRuntime( 214):   at android.app.ActivityThread.performLaunchActivity(ActivityThread.java:2459)
E/AndroidRuntime( 214):   ... 11 more
```
The example code with the error looks like this:
public class Disp extends Activity {
    private TextView foo;

    @Override
    public void onCreate(Bundle savedInstanceState) {
        ...
        foo.setText("bar");
    }
}
The preceding code fails because we forgot to call findViewById() to assign “foo” a reference to the TextView instance. Here is the example code with the fix:
public class Disp extends Activity {
    private TextView foo;

    @Override
    public void onCreate(Bundle savedInstanceState) {
        ...
        foo = (TextView) findViewById(R.id.id_foo);
        foo.setText("bar");
    }
}
This code should make the error go away.
Justin Mattson’s Google I/O 2009 presentation “Debugging Arts of the Ninja Masters,” and the Android Developers discussion on processes stopping unexpectedly.
Rachee Singh
Debugging the code using LogCat messages is a useful technique for developers who find themselves in such a situation.
Those who are familiar with traditional Java programming have probably used System.out.println statements while debugging their code. Similarly, debugging an Android application can be facilitated by using the Log.d() method. This enables you to print necessary values and messages in the LogCat window. Start by importing the Log class:
import android.util.Log;
Then, insert the following line at places in the code where you wish to check the status of the application:
Log.d("Testing", "Checkpoint 1");
Testing is the tag that appears in the “tag” column in the LogCat window, as shown in Figure 3-17; normally this would be defined as a constant in the main class to ensure consistent spelling. Checkpoint 1 is the message that appears in the Message column in the LogCat window. Log.d() takes these two arguments, and the corresponding tag and message are displayed in the LogCat window. So, if you have inserted this Log.d() statement as a checkpoint and the Checkpoint 1 message is displayed in the LogCat window, it implies that the code works fine up to that point.
The Log.d() method does not accept variable arguments, so if you wish to format more than one item, use string concatenation or String.format() (but omit the trailing %n):
Log.d("Testing", String.format("x0 = %5.2f, x1=%5.2f", x0, x1));
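To make the padding behavior concrete, here is a quick plain-Java check of what that call would log; the class and method names are illustrative, not from the recipe:

```java
public class LogFormatDemo {
    // Build the message exactly as it is passed to Log.d() above;
    // %5.2f pads each value to a minimum width of 5, with 2 decimal places.
    public static String message(double x0, double x1) {
        return String.format("x0 = %5.2f, x1=%5.2f", x0, x1);
    }

    public static void main(String[] args) {
        // In an en-US locale this prints: x0 =  3.14, x1= 2.72
        System.out.println(message(3.14159, 2.71828));
    }
}
```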
Ian Darwin
There are both open source and commercial technologies for reporting application crashes. One of the widely used open source ones is Application Crash Reports for Android (ACRA). ACRA provides its own backend reporting tool but also supports Google Docs and many other backends. If you have your own Java EE server, you can use the author’s own CrashBurnFree service, which also works on non-Android implementations of Java. Alternatively, you can sign up for one of the commercial services. With most of these, you just add one JAR file and one call to your app. Then sit back and await notifications, or view the appropriate web dashboard for lists of errors and detail pages.
There is no magic to crash reporting, and it doesn’t provide anything that you can’t do yourself. But it’s already done for you, so just use it!
The basic steps to use ACRA are:
Decide on which server/backend you’re going to use.
Add one JAR file to your project.
Annotate your Application class (see Recipe 2.3) to indicate that it’s an ACRA-enabled application.
The basic steps to use CrashBurnFree are:
Download or build the server JAR, and deploy it to your server.
Configure a security key for your own use on the server.
Add one JAR file to your project.
Add one call (using the security key) into your Application class or main Activity’s onCreate() method.
Step 1 is out of scope for this book; if you have a Java EE server you can probably handle it.
To use one of the commercial services, use these steps (for example, Splunk MINT, formerly BugSense—the process is similar):
Create an account.
Register your app and retrieve its unique key from the service’s website.
Download a JAR file from the website and add it to your project.
Add one call (using the app’s unique key) into your Application class or main Activity’s onCreate() method.
After these steps are done, you can distribute your app to users. The first one or two steps are straightforward, so we won’t discuss them further. The remaining steps require a little more detail, and we discuss them in the following subsections.
The JAR file for ACRA can be added using the following Maven coordinates (if you use Gradle, you know how to trim this down):
<dependency>
    <groupId>ch.acra</groupId>
    <artifactId>acra</artifactId>
    <version>4.9.0</version>
</dependency>
Similarly, for CrashBurnFree:
<dependency>
    <groupId>com.darwinsys</groupId>
    <artifactId>crashburnfree-javaclient</artifactId>
    <version>1.0.2</version>
</dependency>
The JAR file for Splunk MINT is mint-5.2.1.jar. You probably know how to add JARs to your project; if not, see Recipe 1.20.
Because this mechanism reports errors via the internet, the following should go without saying (but let me say it anyway): you need internet permission to use it! Add the following code to your AndroidManifest.xml file:
<uses-permission android:name="android.permission.INTERNET" />
You usually only need to make one call, in your Application class or Activity’s onCreate() method.
For ACRA, you annotate your Application class:
import org.acra.*;
import org.acra.annotation.*;

@ReportsCrashes(formUri = "http://somereportingbackend.com/somereportpath")
public class MyApplication extends Application {
    @Override
    public void onCreate() {
        super.onCreate();
        ACRA.init(this);
        ...
    }
}
Here is the code in the onCreate() method for CrashBurnFree:
final long KEY = 12345;
final String PASS = "some enchanted evening";
final String url = "https://REST URL to your server";

@Override
public void onCreate() {
    super.onCreate();
    setContentView(R.layout.main);
    CrashBurnFree.register(url, KEY, PASS);
    ...
}
And here is the code in the onCreate() method for Splunk MINT (formerly BugSense):
private static final String KEY = "... your key here ...";

@Override
public void onCreate() {
    super.onCreate();
    setContentView(R.layout.main);
    Mint.setApplicationEnvironment(Mint.appEnvironmentStaging);
    Mint.initAndStartSession(this.getApplication(), KEY);
    ...
}
To learn how these programs catch uncaught exceptions, see Recipe 14.10 in Java Cookbook, Third edition, “Catching and Formatting GUI Exceptions,” where the technique is used for similar reporting in desktop applications. See also the source code for CrashBurnFree, available on GitHub.
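The hook these reporters rely on is standard Java: Thread.setDefaultUncaughtExceptionHandler(). The following is a minimal sketch of the technique, not code from any of the libraries above; the class name and the stashing of the last exception are illustrative. A real reporter would persist or upload the stack trace before delegating to the previous handler (on Android, the delegation is what produces the FC dialog):

```java
public class CrashHook {
    /** Most recently captured uncaught exception (for demonstration only). */
    public static volatile Throwable lastCaught;

    // Install a process-wide default handler that runs whenever any
    // thread dies with an uncaught exception.
    public static void install() {
        final Thread.UncaughtExceptionHandler previous =
                Thread.getDefaultUncaughtExceptionHandler();
        Thread.setDefaultUncaughtExceptionHandler((thread, throwable) -> {
            lastCaught = throwable;
            // A real crash reporter would write or upload the trace here,
            // then hand off so normal crash handling still happens.
            if (previous != null) {
                previous.uncaughtException(thread, throwable);
            }
        });
    }
}
```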
You can get some crash reports using only the Google Play Console web reporting page, which is accessible after you log in. There are other tools in the same problem space, listed in the online version of this book.
Atul Nene
LogCat output is great as far as it goes, but a longer-term logging mechanism will be more useful in some circumstances. Design a built-in mechanism for your app that will give additional insight in such cases. You know the important events or state changes and resource needs of your app, and if you log them in a runtime application log from the app, the log becomes an additional much-needed resource that goes to the heart of the issue being reported and investigated. This simple preventive measure and mechanism goes a long way toward reducing low user ratings caused by unforeseen situations, and improves the quality of the overall user experience.
One solution is to use the Java standard java.util.logging package. This recipe provides an example RuntimeLog, which uses java.util.logging to write to a logfile on the device, and gives the developer extensive control over what level of detail is recorded.
You have designed, developed, and tested your application and released it on the Google Play Store, so now you think you can take a vacation. Not so fast! Apart from the simplest cases, one cannot take care of all possible scenarios during app testing, and users are bound to report some unexpected app behavior. It doesn’t have to be a bug; it might simply be a runtime situation you didn’t encounter in your testing. Prepare for this in advance by designing a runtime application log mechanism into your app.
Log the most important events from your app—for example, a state change, a resource timeout (internet access, thread wait), or a maxed-out retry count. It might even be worthwhile to defensively log an unexpected code path execution in a strange scenario, or some of the most important notifications that are sent to the user.
Only create log statements that will provide insight into how the app is working. Otherwise, the sheer size of the log may become a problem, and while Log.d() calls are ignored at runtime in signed apps, too many log statements may still slow down the app.
You may be wondering why we don’t use LogCat, or tools like BugSense and ACRA (see Recipe 3.9), to handle this task. These solutions do not suffice in all cases, for the following reasons:
The standard LogCat mechanism isn’t useful in end-user runtime scenarios, since the user is unlikely to have the ability to attach a debugger to her device. Too many Log.d and Log.i statements in your code may negatively impact app performance. In fact, for this reason, you shouldn’t have Log.* statements compiled into the released app.
Tools like ACRA and BugSense work well when the device is connected to the internet, but it may not always have a connection, and some classes of applications may not need one at all except for crash reporting. Also, the ACRA stack trace provides only the details at the instant the exception was thrown, while this recipe provides a longer-term view of the app as it runs.
The RuntimeLog class is shown in Example 3-8.
import java.io.IOException;
import java.util.logging.*;

/** Runtime file-based logging, using standard java.util.logging (JUL).
 * It is REALLY too bad that JUL was added before Java enums!
 */
public class RuntimeLog {

    // The JUL log levels are:
    // SEVERE (highest value)
    // WARNING
    // INFO
    // CONFIG
    // FINE
    // FINER
    // FINEST (lowest value)

    enum Mode { MODE_DEBUG, MODE_RELEASE }

    // Change this to MODE_DEBUG to use for in-house debugging
    private static final Mode mode = Mode.MODE_RELEASE;

    private static String logfileName = "/sdcard/YourAppName.log";
    private static Logger logger;

    // Initialize the log on first use of the class and
    // create a custom log formatter
    static {
        try {
            FileHandler fh = new FileHandler(logfileName, true);
            fh.setFormatter(new Formatter() {
                public String format(LogRecord rec) {
                    // Note: the deprecated Date getters return, e.g., the
                    // year minus 1900 and a zero-based month.
                    java.util.Date date = new java.util.Date();
                    return new StringBuffer(1000)
                        .append(date.getYear()).append('/')
                        .append(date.getMonth()).append('/')
                        .append(date.getDate()).append(' ')
                        .append(date.getHours()).append(':')
                        .append(date.getMinutes()).append(':')
                        .append(date.getSeconds()).append(' ')
                        .append(formatMessage(rec)).append('\n')
                        .toString();
                }
            });
            logger = Logger.getLogger(logfileName);
            logger.addHandler(fh);
        } catch (IOException e) {
            e.printStackTrace();
        }
    }

    // The log method
    public static void log(Level logLevel, String msg) {
        // Don't log DEBUG and VERBOSE (FINE and finer) statements
        // in release mode
        if (mode == Mode.MODE_RELEASE &&
                logLevel.intValue() <= Level.FINE.intValue())
            return;
        final LogRecord record = new LogRecord(logLevel, msg);
        record.setLoggerName(logfileName);
        logger.log(record);
    }

    /**
     * Reveal the logfile path, so part of your app can read the
     * logfile and either email it to you, or
     * upload it to your server via REST.
     * @return the logfile path
     */
    public static String getFileName() {
        return logfileName;
    }
}
This code has the advantage of automatically dropping verbose-level log calls when in production mode. There are, of course, variations that could be used:
You can use the same mechanism to uncover complex runtime issues while you are developing the app. To do so, set the mode variable to MODE_DEBUG.
For a complex app with many modules, it might be useful to add the module name to the log call, as an additional parameter.
You can also extract the ClassName and MethodName from the LogRecord and add them to the log statements; however, it is not recommended that you do this for runtime logs.
Example 3-9 shows that basic use of this facility is as simple as regular Log.d() calls.
RuntimeLog.log(Level.SEVERE, "Network resource access request failed");
RuntimeLog.log(Level.WARNING, "App changed state to STRANGE_STATE");
...
The filename shouldn’t be hardcoded, but should be obtained as in Recipe 10.1. Even better, create a directory, with logfile rotation (delete logfiles that are older than a certain age deemed no longer useful) to limit the disk storage of the logfiles.
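Such age-based cleanup needs nothing beyond java.io; here is a minimal sketch (the class name, signature, and age threshold are illustrative, not part of the recipe's code):

```java
import java.io.File;

public class LogRotation {
    /**
     * Delete ordinary files in logDir whose last-modified time is older
     * than maxAgeMillis. Returns the number of files deleted.
     */
    public static int rotate(File logDir, long maxAgeMillis) {
        File[] files = logDir.listFiles();
        if (files == null) {
            return 0; // not a directory, or an I/O error
        }
        long cutoff = System.currentTimeMillis() - maxAgeMillis;
        int deleted = 0;
        for (File f : files) {
            // Only remove regular files that have aged past the cutoff
            if (f.isFile() && f.lastModified() < cutoff && f.delete()) {
                deleted++;
            }
        }
        return deleted;
    }
}
```

You would call this from the same code that opens the log, passing the log directory and, say, seven days in milliseconds.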
To allow users to send the logfile(s) from their devices to your support team, you would certainly want to write code to automate this, using the getFileName() method to access the file. Or you could use the same Java language hooks as the crash recorders (see Recipe 3.9), and send the file automatically upon detecting an application crash.
This mechanism does not have to be in an “always on” state. You can log based on a user-settable configuration option and enable it only when end users are trying to reproduce problem scenarios.
Daniel Fowler
Android is designed for life on the go, where a user is engaged in multiple tasks: taking calls, checking email, sending SMS messages, engaging in social networking, taking pictures, accessing the internet, running apps—maybe even getting some work done! As such, a device can have multiple apps, and hence many Activities, loaded in memory.
The foreground app and its current Activity can be interrupted and paused at any moment. Apps, and hence Activities, that are paused can be removed from memory to free up space for newly started apps. An app has a life cycle that it cannot control, as it is the Android operating system that starts, monitors, pauses, resumes, and destroys the app’s Activities. Yet an Activity does know what is going on, because as Activities are instantiated, hidden, and destroyed, various functions are called. This allows the Activity to keep track of what the operating system is doing to the app, as discussed in Recipe 1.2.
Because of all this, app developers become familiar with the functions invoked when an Activity starts:
onCreate(Bundle savedInstanceState){…};
onStart(){…};
onResume(){…};
and with the functions called when an Activity is paused and then removed from memory (destroyed):
onPause(){…};
onStop(){…};
onDestroy(){…};
It is easy to see these functions in action. Create a simple app in Android Studio and, in the first loaded Activity, override the preceding functions, calling through to the superclass versions. Add a call to Log.d() to pass in the name of the app and the function being invoked. (Here, for a basic app, the application name is set to MyAndroid and an empty Activity is used. Note that by default Studio uses the AppCompatActivity class, which is derived from Activity.) The MainActivity code will look like Example 3-10.
public class Main extends Activity {
    @Override
    public void onCreate(Bundle savedInstanceState) {
        super.onCreate(savedInstanceState);
        setContentView(R.layout.main);
        Log.d("MyAndroid", "onCreate");
    }
    @Override
    public void onStart() {
        super.onStart();
        Log.d("MyAndroid", "onStart");
    }
    @Override
    public void onResume() {
        super.onResume();
        Log.d("MyAndroid", "onResume");
    }
    @Override
    public void onPause() {
        super.onPause();
        Log.d("MyAndroid", "onPause");
    }
    @Override
    public void onStop() {
        super.onStop();
        Log.d("MyAndroid", "onStop");
    }
    @Override
    public void onDestroy() {
        super.onDestroy();
        Log.d("MyAndroid", "onDestroy");
    }
}
(There are other ways to print the program name and function name in Java, but hardcoded strings are used here for convenience and simplicity.)
Run the program on a device (virtual or physical) to see the debug messages in Logcat. If Logcat isn’t visible, open the Android Monitor (click the button at the bottom of the main Studio window, or use the View → Tool Windows menu, or press Alt-6). When the Back button is pressed, the three teardown messages are seen, as in Figure 3-18.
To see only the messages from the app, add a Logcat filter: use the last drop-down above the Logcat display area to select Edit Filter Configuration. In the Create New Logcat Filter dialog, give the filter a name; here we used “MyAndroid.” The Log Tag is used for filtering (the first parameter of the Log.d() call; again set to “MyAndroid”). Logcat will now show only the messages explicitly sent from the app (see Figure 3-19).
The Logcat output can be further simplified by changing the header configuration (use the gear icon to the left of the Logcat area). The LogCat output can be cleared by clicking the trash can icon to the left of the Logcat area. It is useful to have a clean sheet before performing an action to watch for more messages.
To see the functions called when a program is paused, open another application while the MyAndroid program is running. The onRestart() method is key here. Override onRestart() and add this debug message:
@Override
public void onRestart() {
    super.onRestart();
    Log.d("MyAndroid", "onRestart");
}
Run the program, click the Home button, and then launch the program again from the device (or emulator). You should see output similar to that in Figure 3-20.
Logcat shows the usual start-up function sequence; then, when the Home button is clicked, onPause() and onStop() run, but not onDestroy(). The program is not ending but effectively sleeping. When the program is run again it isn’t reloaded, so no onCreate() executes; instead, onRestart() is called.
Run the program again, then swipe it from the screen in tiled view to kill it (or go into Apps via Settings, select the program, and then press the Force Close button). Then start the app again.
The usual start-up functions (onStart() and onResume()) are invoked, and then the Activity “sleeps.” No onDestroy() is seen as the second instance is run (see Figure 3-21).
In this recipe, we discussed the following different life-cycle scenarios:
Normal start-up and then finish
Start-up, pause, and then restart
Start-up, pause, forced removal from memory, and then start-up again
These scenarios result in different sequences of life-cycle functions being executed. Using these scenarios when testing ensures that an app performs correctly for a user. You can extend the techniques described here when implementing additional overridden functions. These techniques also apply to using Fragments in an Activity and testing their life cycle.
Recipe 1.2, Recipe 1.15, and the developer documentation on the Activity class, the Log class, and Fragments.
The code for this recipe can be downloaded from Tek Eye.
Adrian Cowham
I wish I could have used a tool like StrictMode back when I was doing Java Swing desktop development. Making sure our Java Swing app was snappy was a constant challenge: green and seasoned engineers alike would invariably perform database operations on the UI thread that would cause the app to hiccup. Typically, we found these hiccups when QA (or customers) used the app with a larger data set than the engineers were testing with. Having QA find these little defects was unacceptable, and ultimately a waste of everyone’s time (and the company’s money). We eventually solved the problem by investing more heavily in peer reviews, but having a tool like StrictMode would have been comparatively cheaper.
The following example code illustrates how easy it is to turn on StrictMode in your app:
// Make sure you import StrictMode
import android.os.StrictMode;

// In your app's Application or Activity instance, add the following
// lines to the onCreate() method
if (Build.VERSION.SDK_INT >= 9 && isDebug()) {
    StrictMode.enableDefaults();
}
Please note that I have intentionally omitted the isDebug() implementation, as the logic of that depends on your application. I recommend only enabling StrictMode when your app is in debug mode; it’s unwise to put your app in the Google Play Store with StrictMode running in the background and consuming resources unnecessarily.
StrictMode is highly configurable, allowing you to customize what problems to look for. For detailed information on customizing StrictMode policies, see the developer documentation.
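As a sketch of that configurability: the thread and VM policy builders let you choose individual detections and penalties. The builder methods shown below are part of the public StrictMode API, but which ones you enable is an application decision; this fragment would replace the enableDefaults() call above:

```java
// Flag disk and network access on the calling (UI) thread, and log
// each violation rather than crashing.
StrictMode.setThreadPolicy(new StrictMode.ThreadPolicy.Builder()
        .detectDiskReads()
        .detectDiskWrites()
        .detectNetwork()
        .penaltyLog()
        .build());

// Separately, watch the whole VM for leaked SQLite objects.
StrictMode.setVmPolicy(new StrictMode.VmPolicy.Builder()
        .detectLeakedSqlLiteObjects()
        .penaltyLog()
        .build());
```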
Ian Darwin
Run your code through Android Lint (included with modern versions of the Android SDK and supported by the relevant versions of the IDE plug-ins). Examine the warnings, and improve the code where it needs it.
The first “lint” program originated at Bell Laboratories, in Seventh Edition Unix. Steve Johnson wrote it as an offshoot of his development of the first Portable C Compiler, in the late 1970s. It was covered in my little book Checking C Programs with Lint (O’Reilly). Ever since, people have been writing lint-like tools. Some well-known ones for Java code include PMD, FindBugs, and Checkstyle. The first two, plus a number of other tools, are covered in my 2007 book Checking Java Programs (O’Reilly). The most recent one that’s relevant to us is, of course, Android Lint, a part of the standard Android SDK.
What each of these tools does is examine your code, and offer opinions based on expert-level knowledge of the language and libraries. The hard part is to bear in mind that they are only opinions. There will be cases where you think you know better than the tool, and you later find out that you’re wrong. But there will be cases where you’re right. And, of course, it’s impossibly hard for the computer to know which, so there is no substitute for the judgment of an experienced developer!
These tools typically find a lot of embarrassing issues in your code the first time you run them. One very common error is to create a toast by calling makeText() and forget to call the new toast’s show() method; the toast is created but never pops up! The standard compiler cannot catch this kind of error, but Android Lint can, and that is just one of its many capabilities. After laughing at yourself for a minute, you can edit (and test!) your code, and run lint again. You repeat this process until you no longer get any messages that you care about.
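The toast mistake looks like this (context stands for whatever Context you have at hand; the message string is illustrative):

```java
// Bug: the toast is built but never displayed; nothing appears on screen.
Toast.makeText(context, "Account created", Toast.LENGTH_SHORT);

// Fix: chain the show() call onto the newly created toast.
Toast.makeText(context, "Account created", Toast.LENGTH_SHORT).show();
```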
To run Android Lint, you can use the command-line version in $SDK_HOME/tools/lint. Under Eclipse, you invoke Android Lint by right-clicking the project in the Project Explorer and selecting Android Tools → Run Lint. The warnings appear as code markers just like the ones from Eclipse itself. Since they are not managed by the Eclipse compiler, you may need to run lint again after editing your code. If you get tired of the game, you can use the same menu to remove the lint markers. Under Android Studio, the Analyze → Inspect Code tool actually runs Android Lint. Example 3-11 shows the command-line version of lint since that’s the clearest way to show in print some examples of the errors it catches; rest assured it will catch the same errors when run under an IDE, though the messages may be less verbose.
$ cd MyAndroidProject
$ lint .
Scanning .: ......
Scanning . (Phase 2): ..
AndroidManifest.xml:16: Warning: <uses-sdk> tag should specify a target API level (the
  highest verified version; when running on later versions, compatibility behaviors may
  be enabled) with android:targetSdkVersion="?" [UsesMinSdkAttributes]
    <uses-sdk android:minSdkVersion="7" />
    ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
AndroidManifest.xml:16: Warning: <uses-sdk> tag appears after <application> tag
  [ManifestOrder]
    <uses-sdk android:minSdkVersion="7" />
    ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
AndroidManifest.xml:6: Warning: Should explicitly set android:allowBackup to true or
  false (it's true by default, and that can have some security implications for the
  application's data) [AllowBackup]
    <application android:icon="@drawable/icon" android:label="@string/app_name">
    ^
res/values/strings.xml:5: Warning: The resource R.string.addAccounrSuccess appears to
  be unused [UnusedResources]
    <string name="addAccounrSuccess">Account created</string>
            ~~~~~~~~~~~~~~~~~~~~~~~~
res/values/strings.xml:6: Warning: The resource R.string.addAccounrFailure appears to
  be unused [UnusedResources]
    <string name="addAccounrFailure">Account creation failed</string>
            ~~~~~~~~~~~~~~~~~~~~~~~~
res: Warning: Missing density variation folders in res: drawable-xhdpi
  [IconMissingDensityFolder]
0 errors, 6 warnings
$
Nothing serious was found in this project, but several things can be cleaned up quickly and easily. Of course the unused string resources don’t need any action if they are intended for future use, but if they are old and have fallen out of use, you should remove them to keep your app clean. As with any automated tool, you know your app better than the tool does, at least for making such decisions.
The developer documentation on checking code with lint, my book Checking Java Programs, my video training series Java Testing for Developers, and my older book Checking C Programs with Lint (all from O’Reilly Media).
Adrian Cowham
Use the Android Monkey command-line tool to test applications you are developing.
Testing is so easy a monkey can do it, literally. Despite the limitations of testing tools for Android, I have to admit that the Monkey is pretty cool. The Android Monkey is a testing tool (included with the Android SDK) that simulates a monkey (or perhaps a child) using an Android device. Imagine a monkey sitting at a keyboard and flailing away—get the idea? What better way to flush out those hidden ANR messages?
Running the Monkey is as simple as starting the emulator (or connecting your development device to your development machine) and launching the Monkey script. I hate to admit this, but by running the Monkey on a daily basis, we’ve repeatedly found defects that probably would’ve escaped a normal QA pass and would’ve been very challenging to troubleshoot if found in the field—or, worse yet, caused users to stop using our app.
Here are a few best practices for using the Monkey in your development process:
Create your own Monkey script that wraps Android’s Monkey script. This is to ensure that all the developers on your team are running the Monkey with the same parameters. If you’re a team of one, this helps with predictability (discussed shortly).
Configure the Monkey so that it runs long enough to catch defects but not so long that it’s a productivity killer. In our development process, we configured the Monkey to run for a total of 50,000 events. This took about 40 minutes on a Samsung Galaxy Tab. Not too bad, but I would’ve liked it to be in the 30-minute range. Obviously, faster tablets will have a higher throughput.
The Monkey is random, so when we first started running it, every developer was getting different results and we were unable to reproduce defects. We then figured out that the Monkey allows you to set the seed for its random number generator. So, configure your wrapper script to set the Monkey’s seed. This will ensure uniformity and predictability across Monkey runs in your development team.
Once you gain confidence in your app with a specific seed value, change it, because you’ll never know what the Monkey will find.
Never run the Monkey on your production (“daily driver”) phone, as it will occasionally escape from the program under test and “creatively” alter settings in other apps!
Here is a Monkey script wrapper, followed by a description of its arguments:
#!/bin/bash
# Utility script to run monkey
#
# See: https://developer.android.com/studio/test/monkey.html

rm tmp/monkey.log
adb shell monkey -p package_name --throttle 100 -s 43686 -v 50000 | tee tmp/monkey.log
-p package_name ensures that the Monkey only targets the specified package.
--throttle is the delay between events.
-s is the seed value.
-v is the VERBOSE option.
50000 is the number of events the Monkey will simulate.
Many more configuration options are available for the Monkey; we deliberately chose not to mess around with what types of events the Monkey generates because we appreciate the pain. For example, the seed value we chose causes the Monkey to disable WiFi about halfway through the run. This was really frustrating at first because we felt like we weren’t getting the coverage we wanted. It turns out that the Monkey did us a favor by disabling WiFi and then relentlessly playing with our app. After discovering and fixing a few defects, we soon had complete confidence that our app operated as expected without a network connection.
Good Monkey.
The developer documentation on the Monkey.
Johan Pelgrim
Fire up two Android Virtual Devices and use the port number to send text messages and place calls.
When you create an app that listens for incoming calls or text messages—similar to the one in Recipe 11.1—you can, of course, use the DDMS perspective in Eclipse to simulate placing calls or sending text messages. But you can also fire up another AVD!
If you look at the AVD window’s title, you will see a number before your AVD’s name. This is the port number that you can use to telnet to your AVD’s shell (e.g., telnet localhost 5554
). Fortunately, for testing purposes this number is your AVD’s phone number as well, so you can use this number to place calls (see Figure 3-22) or to send a text (Figure 3-23).
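For instance, you can drive one AVD from the console of another; sms send and gsm call are standard emulator console commands, though newer emulators require an auth token first (the console banner tells you where to find it). The port numbers here are illustrative:

```shell
# Connect to the console of the AVD listening on port 5554
telnet localhost 5554

# From that console, text or call the other AVD, whose "phone number"
# is its own console port (5556 here):
sms send 5556 "Hello from the first emulator"
gsm call 5556
```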