This chapter will serve as an introduction to how to best integrate and optimize graphical elements in your Android apps. These include graphics such as bitmap images, tween animation (transform-based), bitmap animation (frame-based), image transitions (crossfades, or slow-blended image fades from one image into another) and digital video.
You will learn how to best use imaging techniques within your application's View
objects that make up your UI, and how to support all three levels of Android screens (QVGA, HVGA, and WVGA) via custom resource assets.
Because VGA is 640 × 480 pixels, quarter VGA (QVGA) is 320 × 240 pixels, or one-quarter of a VGA screen; half VGA (HVGA) is 480 × 320, or one-half of a VGA screen; and wide VGA (WVGA) is 800 × 480, or a widescreen version of a VGA screen.
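As a quick sanity check on those numbers, here is a small plain-Java sketch (ordinary Java, not Android code) that computes the pixel counts and their ratios to full VGA:

```java
public class ScreenMath {
    // Total pixels for a given screen resolution
    static int pixels(int width, int height) {
        return width * height;
    }

    public static void main(String[] args) {
        int vga  = pixels(640, 480); // 307,200 pixels
        int qvga = pixels(320, 240); //  76,800 pixels: one-quarter of VGA
        int hvga = pixels(480, 320); // 153,600 pixels: one-half of VGA
        int wvga = pixels(800, 480); // 384,000 pixels: a widescreen VGA

        System.out.println(vga / qvga); // 4
        System.out.println(vga / hvga); // 2
    }
}
```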
We'll cover the use of graphics objects in both the areas of UI design (custom buttons, for instance) and user experience design (the content itself, say music videos or an interactive children's storybook).
We'll look at two packages: the android.graphics.drawable
package (I knew there was a reason that resource folder was called drawable) and the android.view.animation
package. These are collections of useful classes for maximizing bitmap imagery and for working with images that support the fourth dimension (time) via animation. For fun, we'll play with a really cool 9-patch image auto-scaling feature that Android supports for the PNG format.
Finally, we'll take a look at digital video. Using the VideoView
class makes playing digital video a snap. We'll also discuss which open source digital video formats are best to use, and how to optimize them for use on smartphones.
The central set of classes used to control the graphics-related content within your Android application is called the drawable
package. This package handles classes and methods related to drawing the following onto the Android display screen:
Bitmaps: In a bitmap, a collection of pixels makes up an image—it's a map of image bits, if you will.
Shapes: Shapes are line drawings, also known as vectors, like the lines architects use in CAD drawings.
Gradients: Gradients are smooth transitions from one color to another. They can be linear (straight-line) or radial (circular) in shape.
Transitions: Shape transitions are smooth vector changes from one shape to another. This process is sometimes referred to as morphing.
Animation: Animation is a series of images displayed over time to create the illusion of motion.
Image transitions: These are smooth fades from one image to another, commonly used as crossfades between images.
In Android development, graphics-related items like gradients, image transitions, animated transformations, and frame-based animation can all be termed drawables. With the exception of tween (transformational) animation, these all center their resource assets in the /res/drawable folder. (And you thought tweens were 12-year-olds, right?)
The /res/drawable folder is also where you should put XML files that define things like frame-based image animations and crossfading image transitions (which we will look at later in this chapter). So get used to seeing drawable everywhere you look, because it will be one of the most used folders in your resources (/res) folder.
The way that Android is set up to automatically implement your images via the project folder hierarchy is a bit hard to understand at first. But once you get used to it, you'll find that it is actually amazingly simple to use graphic resources, as major coding is all but eliminated. You will see that in this chapter when we implement features using as few as four lines of Java program logic.
I'm not sure what could be much simpler than this: put your imagery into the project/res/drawable folder, and then reference it by file name in your XML and Java code. That's it, and with perfect results (assuming that your imagery is optimized correctly).
In this chapter, we will look at which image and video formats to use, which techniques to implement, and which work processes to follow as much as (or more than) we will be dealing with XML attributes and Java code snippets (although these are fun to play with as well).
Android offers more than a dozen types of customized drawable objects. In this chapter, we'll work with the following core subclasses of android.graphics.drawable
:
BitmapDrawable
object: Used to create, tile, stretch, and align bitmaps.
ColorDrawable
object: Used to fill certain other objects with color.
GradientDrawable
object: Used to create and draw custom gradients.
AnimationDrawable
object: Used to create frame-based animations.
TransitionDrawable
object: Used to create crossfade transitions.
NinePatchDrawable
object: Used to create resizable bitmaps via custom stretchable areas.
If you want to review all of the drawable objects, look at the android.graphics.drawable
package document on the Android Developers web site (http://developer.android.com
). You'll find that there is a plethora of graphics power in Android's 2D engine.
The most pervasive and often used type of drawable is the bitmap. A bitmap is an image composed of a collection of dots called pixels, where "pix" stands for "pictures" and "els" stands for "elements". Yes, a bitmap is quite literally a map of bits. So, let's get started with adding bitmaps to your Android apps.
How do we best optimize our static (motionless, or fixed-in-place) bitmap imagery for use within our Android applications? That's what this section is all about. We have already worked with bitmap images in the previous chapter, in the context of our ImageButton
and ImageView
objects, so you have a little experience with using truecolor 32-bit PNG (PNG32) files to obtain an excellent graphic result.
Android supports three bitmap image file formats: PNG, JPEG, and GIF. We'll talk about how Android truly feels about each one, so you can choose the right formats to meet your graphics-related design and user experience objectives.
The most powerful file format that Android supports, and the one it recommends using over all others, is Portable Network Graphics, or PNG (pronounced "ping"). There are two types of PNG:
Indexed-color, which uses a limited 256-color image palette
Truecolor, which uses 32 bits of color per pixel and includes a full 8-bit alpha channel (used for image compositing)
PNG is known as a lossless image file format, because it loses zero image data during compression. This means that the image quality is always 100% maintained. Designers who know what they are doing can get very high-quality graphics into a reasonably small data footprint using the indexed-color PNG8 and truecolor PNG32 image file formats.
Indexed-color PNG8 files use one-fourth of the data (bits) that a truecolor 32-bit PNG32 image does. Remember the math we did in the previous chapter: 8 × 4 = 32. The smaller data footprint is achieved by using only 8 bits per pixel (an index into a 256-color palette of the colors best suited to represent the image), while keeping much the same visual result. This is done primarily to save data file size, thereby decreasing the image's data footprint.
Truecolor PNG32 images use a full 32 bits of data for each of the image pixels to represent the four image data channels that are in most bitmap images: alpha, red, green, and blue (RGBA).
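That one-fourth ratio is easy to verify with a bit of arithmetic on the raw (uncompressed) pixel data. The sketch below is plain Java, just to illustrate the math; actual PNG files will be smaller still after compression:

```java
public class PngFootprint {
    // Raw (uncompressed) image data size in bytes, given bits per pixel
    static long rawBytes(int width, int height, int bitsPerPixel) {
        return (long) width * height * bitsPerPixel / 8;
    }

    public static void main(String[] args) {
        long png32 = rawBytes(320, 240, 32); // truecolor: ARGB, 8 bits per channel
        long png8  = rawBytes(320, 240, 8);  // indexed: one palette index per pixel

        System.out.println(png32);        // 307200 bytes of raw pixel data
        System.out.println(png32 / png8); // 4: PNG8 holds one-fourth the bits
    }
}
```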
The alpha channel determines where the image is going to be transparent, and is used for image compositing. As you learned in Chapter 7, compositing is the process of using more than one image in layers to create a final image out of several component parts.
Another benefit of image compositing is that in your programming code, you can access different image elements independently of other image elements. For example, you might do this for game engine programming.
Note that at compile time, Android looks at your PNG32 graphics, and if they use less than 256 colors in the image, Android automatically remaps them to be indexed PNG8 images, just as you would want it to do. This means that you don't need to worry about analyzing your images to see if they should be in truecolor or indexed-color format. You can simply do everything in truecolor, and if it can be optimized into indexed-color with no loss of data, Android will do that for you—making your data footprint three to four times smaller.
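Conceptually, the check boils down to counting the distinct colors in the image. Here is a hedged plain-Java sketch of that idea (this is not Android's actual implementation, just an illustration of the 256-color test):

```java
import java.util.HashSet;
import java.util.Set;

public class ColorCount {
    // True if an image's pixels (packed ARGB ints) fit in a 256-color palette,
    // meaning the image could be remapped to indexed-color with no loss
    static boolean fitsIndexedPalette(int[] argbPixels) {
        Set<Integer> distinct = new HashSet<>();
        for (int p : argbPixels) {
            distinct.add(p);
            if (distinct.size() > 256) {
                return false; // too many colors: must stay truecolor
            }
        }
        return true;
    }

    public static void main(String[] args) {
        int[] twoColorLogo = {0xFF000000, 0xFFFFFFFF, 0xFF000000};
        System.out.println(fitsIndexedPalette(twoColorLogo)); // true
    }
}
```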
If for some reason you don't want your images optimized at compile time, you can put them into the project/res/raw folder, which is for data that is accessed directly from your Java code. A good example of this is video files that have been well optimized for size and quality, and just need to be played. These come up in a video player example in Chapter 11, so stay tuned, as we will be using the /raw folder soon enough.
The next most desirable format to use is the JPEG image file type. This type does not have an alpha channel. It uses lossy compression, which means that it throws away data to get a better compression result.
If you look closely at JPEG images, you will see a lot of artifacts, such as areas of strange color variations or dirt on the image that was not on the camera lens. JPEG is useful for higher-resolution (print) images, where artifacts are too small to be seen. So, it is not really suitable for small smartphone screens. JPEG is supported but not recommended for Android apps.
Finally, we have GIF, an older 8-bit file format. The use of this file format is discouraged. Stay away from using GIFs for Android apps. Use PNG8 instead.
You've learned how to implement static bitmap images in previous chapters. So, let's get right into the fun stuff with animation.
Traditional 2D animation involves moving quickly among a number of what originally were called cels, or hand-drawn images, creating the illusion of motion. To steal a more modern term from the movie industry, each image, which is a little bit different from the next, is called a frame. This term refers back to the original days of film, where actual film stock would be run through a projector, showing 24 frames per second (fps).
In Android, frame-based animation is the easiest to implement and gives great results. You just need to define the XML animation attributes—what and where the frames are—in the correct place for Android to find them. Then you can control your animation via Java.
In our example, we are going to animate a 3D logo. It will come into existence via a fireworks-like particle animation.
Let's fire up a new project in Eclipse, and see how animation works in Android.
If you still have the Chapter7 project folder open from the previous examples, right-click that folder and select Close Project. This closes the project folder in Eclipse (of course, it can be reopened later).
Select Eclipse File
Project name: Name this project Chapter8
.
Build Target: Choose Android 1.5.
Application name: Let's call this application Graphics Examples.
Package name: Name the package graphics.examples
.
Create Activity: Check this box and name the activity graphics
.
Minimum SDK Version: Enter 3, which matches with our 1.5 compatibility Build Target setting.
Now we need to define our animation's frames in an XML file, which we'll call logo_animation. Right-click your Chapter8 folder and select New
Since frame-based animation in Android uses bitmap images, you place the XML file that references these bitmap images into the same folder the images occupy: the /res/drawable folder. Do not put frame animation images or XML specifications into the /res/anim folder. That folder is for transform animation (covered in the next section of this chapter). This is an important difference in how frame-based animations and transform-based or tween animations are set up and created in Android.
Next, click the logo_animation.xml tab in Eclipse, and type in the following XML to define our frame-based animation for Android (Figure 8-3 shows the new file in Eclipse):
<?xml version="1.0" encoding="utf-8"?>
<animation-list
    xmlns:android="http://schemas.android.com/apk/res/android"
    android:oneshot="true">
    <item android:drawable="@drawable/mtlogo0" android:duration="200" />
    <item android:drawable="@drawable/mtlogo1" android:duration="200" />
    <item android:drawable="@drawable/mtlogo2" android:duration="200" />
    <item android:drawable="@drawable/mtlogo3" android:duration="200" />
    <item android:drawable="@drawable/mtlogo4" android:duration="200" />
    <item android:drawable="@drawable/mtlogo5" android:duration="200" />
    <item android:drawable="@drawable/mtlogo6" android:duration="200" />
    <item android:drawable="@drawable/mtlogo7" android:duration="200" />
    <item android:drawable="@drawable/mtlogo8" android:duration="200" />
    <item android:drawable="@drawable/mtlogo9" android:duration="200" />
</animation-list>
This is pretty straightforward XML tag mark-up logic here. We declare the XML version and add an animation-list
tag for frame-based animation image (item
) listings. This tag has its android:oneshot
attribute set to true
, which will prevent our animation from looping continuously. Setting oneshot
equal to false
will run the animation seamlessly as a loop.
Inside the animation-list
tag, we have ten nested item
tags (nested because the animation-list
closing tag comes after these ten item
tags). These specify the location of each image in our /res/drawable folder, where each image is a frame in the animation.
Using each item
tag entry, we specify the name and location of each of our frames mtlogo0
through mtlogo9
, as well as the duration of the frame display time in milliseconds (ms). In this case, we start off using 200 ms, or one-fifth of a second, for each frame, so that the entire ten-frame animation plays over 2 seconds, at 5 fps, just barely fast enough to fake movement. We can adjust frame times later to fine-tune the visual result, as well as to make the animation loop seamlessly.
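The timing math above can be sketched in a couple of lines of plain Java (not Android code), which is handy when you start fine-tuning durations:

```java
public class FrameTiming {
    // Total play time in milliseconds for a frame-based animation
    static int totalMs(int frameCount, int frameDurationMs) {
        return frameCount * frameDurationMs;
    }

    // Effective frames per second for a given per-frame duration
    static double fps(int frameDurationMs) {
        return 1000.0 / frameDurationMs;
    }

    public static void main(String[] args) {
        System.out.println(totalMs(10, 200)); // 2000 ms, or 2 seconds total
        System.out.println(fps(200));         // 5.0 fps
    }
}
```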
We need to put our animation frame images into the /res/drawable folder, so that the XML code can reference them successfully. As you know by now, in Android, everything needs to be in its correct place for things to work properly.
Copy the ten animation frames into the /res/drawable folder from the code download.
Right-click the Chapter8 folder in the Package Explorer and select Refresh
, so that the IDE can see them.
If there are errors on your XML editing pane, right-click your Chapter8 folder and select Validate
to clear these as well.
At this point, you should see a screen that looks similar to Figure 8-3.
Now we are going to write our Java code to access and control our 2D animation. If the graphics.java tab is not already open, right-click the graphics.java file and select Open.
In order to right-click the graphics.java file, the /src folder and subfolders need to be showing in the expanded Package Explorer project-tree view, so click those arrows to make your hierarchy visible.
Here is the code for our graphics.java file, which holds our graphics
class from our graphics.examples
package:
package graphics.examples;

import android.app.Activity;
import android.os.Bundle;
import android.view.MotionEvent;
import android.widget.ImageView;
import android.graphics.drawable.AnimationDrawable;

public class graphics extends Activity {

    AnimationDrawable logoAnimation;

    /** Called when the activity is first created. */
    @Override
    public void onCreate(Bundle savedInstanceState) {
        super.onCreate(savedInstanceState);
        setContentView(R.layout.main);
        ImageView logoImage = (ImageView) findViewById(R.id.iv1);
        logoImage.setBackgroundResource(R.drawable.logo_animation);
        logoAnimation = (AnimationDrawable) logoImage.getBackground();
    }

    @Override
    public boolean onTouchEvent(MotionEvent event) {
        if (event.getAction() == MotionEvent.ACTION_DOWN) {
            logoAnimation.start();
            return true;
        }
        return super.onTouchEvent(event);
    }
}
In Android Java code, AnimationDrawable
is the class we need to use to implement our frame-based animation sequences. We import the android.graphics.drawable.AnimationDrawable
class. Then we import the android.widget.ImageView
class, which we will use as a view container to display the animation. Finally, we import the android.view.MotionEvent
, which we will use to implement a touchscreen touch trigger to interactively start up the animation play cycle. We add the three new import
statements to the ones that Android starts us out with (the first two).
Next, we add the object declaration for our AnimationDrawable
object, which we are calling logoAnimation
. This is as simple as writing the following:
AnimationDrawable logoAnimation;
Then we have our standard onCreate()
method of our activity, using our main.xml UI layout specification. In this case, we're using a LinearLayout
container with an ImageView
called iv1
inside it to hold our frame animation.
Next, we create an ImageView
object called logoImage
, which we assign to the ImageView with the ID iv1
, which we will declare in the main.xml file.
After that, we set the background resource for this newly created ImageView
to our logo_animation XML file, which specifies our animation sequence and timing. This is the bridge between display (ImageView
) and animation data (logo_animation.xml) set up so that our animation will display through the background image setting for the ImageView
. This leaves it open for us to have a source image in our ImageView
that uses transparency (an alpha channel) to create cool effects. It essentially gives us two layers in the ImageView
, as we can set source and background images for any ImageView
object.
Finally, we define the logoAnimation
object that we declared in the first line of code in the graphics
class. logoAnimation
is an AnimationDrawable
object that gets its data from the logoImage
object via its getBackground()
method, which grabs its background image. As you can see from the previous line, that image has been obtained from the logo_animation.xml file, where we define how everything should work.
To trigger our animation to play, we use a new method called onTouchEvent()
. This method uses a MotionEvent
event to detect if the touchscreen has been touched, which generates an ACTION_DOWN
event. (Recall that an event is something that a Java class listens for and is programmed to react to, like a touchscreen touch event or a keyboard keystroke key event.)
In our code, if this ACTION_DOWN
touch event is detected, then the logoAnimation
object is sent a start()
method trigger. It plays and returns true
(I played it), or else it passes the event upward to the onTouchEvent
method on the superclass from which it was subclassed.
It's pretty logical: a subclass is a specialization of a superclass. A superclass is a more general class than the subclass and serves as the foundation class. If a subclass is sent an event it is not specialized to deal with, it sends that event to its superclass for general handling.
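The same delegate-to-the-superclass pattern can be sketched in plain Java. The class and method names here are hypothetical, invented only to illustrate the idea behind our onTouchEvent() override:

```java
// A general handler: the "superclass" with fallback behavior
class GeneralHandler {
    public boolean onEvent(String action) {
        return false; // not handled at this level either
    }
}

// A specialized handler: deals with "DOWN" itself, delegates everything else
class TouchHandler extends GeneralHandler {
    @Override
    public boolean onEvent(String action) {
        if (action.equals("DOWN")) {
            return true; // handled: "I played it"
        }
        return super.onEvent(action); // pass the event upward to the superclass
    }
}
```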
Figure 8-4 shows the four logical sections of code that we need to add to the default graphics
class and onCreate()
code:
Import the Android Java classes that we are leveraging in our code.
Create and name an AnimationDrawable
object that is accessible to every code construct in our graphics class.
Create an ImageView
object tied to our main.xml screen layout, set the background image resource of that ImageView
to reflect our logo_animation.xml attributes, and then have our logoAnimation AnimationDrawable
object take that frame data from the ImageView
via getBackground()
.
Trigger the animation with an ACTION_DOWN
touchscreen event in our onTouchEvent()
method.
Finally, we need to put in place the ImageView
named iv1
, which ties the ImageView
in our Java code to the ImageView
defined in our XML document (main.xml) that defines our screen UI. Here is the code, which is also shown in Figure 8-5:
<?xml version="1.0" encoding="utf-8"?>
<LinearLayout xmlns:android="http://schemas.android.com/apk/res/android"
    android:orientation="vertical"
    android:layout_width="fill_parent"
    android:layout_height="fill_parent" >
    <ImageView
        android:layout_width="wrap_content"
        android:layout_height="wrap_content"
        android:id="@+id/iv1" />
</LinearLayout>
Figure 8.5. Naming our ImageView UI element in the main.xml file so it matches the iv1 name used in our Java code
In this case, we have a LinearLayout
that contains an ImageView
object named iv1
. We set our ImageView
to wrap_content
(basically to conform the ImageView
bounds to the 160 × 160 pixel dimension of our image, and thus our animation sequence).
Now let's see our animation in action. Right-click your Chapter8 folder and choose Run As
Since a screenshot cannot display an animation, we'll forego the screenshot of the 1.5 emulator. Now, here's a simple exercise to try after you run this version. Make the following changes, and then save the modified logo_animation.xml file:
Change the android:duration
values from 200
to 100
for all of the frames, except the first and the last.
Set these to 1000
or 2000
.
Change the animation-list
tag's android:oneshot
attribute to false
.
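After those three edits, the modified logo_animation.xml could look something like this (one plausible way to apply the exercise's values):

```xml
<?xml version="1.0" encoding="utf-8"?>
<animation-list
    xmlns:android="http://schemas.android.com/apk/res/android"
    android:oneshot="false">
    <item android:drawable="@drawable/mtlogo0" android:duration="1000" />
    <item android:drawable="@drawable/mtlogo1" android:duration="100" />
    <item android:drawable="@drawable/mtlogo2" android:duration="100" />
    <item android:drawable="@drawable/mtlogo3" android:duration="100" />
    <item android:drawable="@drawable/mtlogo4" android:duration="100" />
    <item android:drawable="@drawable/mtlogo5" android:duration="100" />
    <item android:drawable="@drawable/mtlogo6" android:duration="100" />
    <item android:drawable="@drawable/mtlogo7" android:duration="100" />
    <item android:drawable="@drawable/mtlogo8" android:duration="100" />
    <item android:drawable="@drawable/mtlogo9" android:duration="1000" />
</animation-list>
```

The longer first and last frames make the loop point easier to see, while the shorter middle frames speed up the motion.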
To run our looping animation version, right-click the Chapter8 folder and select Run As
Next, let's add a transformational animation directly underneath our frame-based animation.
Tween animation is used for shape-based animation, where shapes are animated from one state to another without specifying the intermediate states. In other words, you define the start and end positions of the shape, and Android fills in the gaps to make the animation work.
This contrasts with frame-based animation, which uses a sequence of cels, or bitmap images, like the flipbook animations of days gone by. So frame animation does its work via pixels, while tween animation does its work via transforms that move, rotate, or scale a shape, image, or even text. Thus, tween animation is more powerful than frame-based animation. It can also be used in conjunction with frame-based animation to achieve even more spectacular results.
Tween animation in Android is completely different than frame animation. It is implemented with the set of classes found in the android.view.animation
package. These classes represent the true power of tween animation in Android. They include things like advanced motion interpolators, which define how animation transformations accelerate over time; and animation utilities, which are needed to rotate, scale, translate (move), and fade View
objects over time.
"Wait a minute," you must be musing, "does 'View
objects' mean that I can apply all of this animation class power to, say, TextView
s, for instance? Or even VideoView
s?" Indeed it does. If you transform a TextView
(rotate it, for instance), and it has a background image, that image is transformed correctly, right along with the text elements of the TextView
and all of its settings.
Here, the word transformation refers to the process of rotation (spinning something around a pivot point), scaling (resizing in x and y dimensions relative to a pivot point or reference point), and x or y movement, which is called translation in animation.
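Under the hood, all three transformations are just 2D coordinate math. Here is a minimal plain-Java sketch (not the Android API, just the geometry) of rotating, scaling, and translating a point relative to a pivot:

```java
public class Transform2D {
    // Rotate point (x, y) around pivot (px, py) by the given angle in degrees
    static double[] rotate(double x, double y, double px, double py, double degrees) {
        double r = Math.toRadians(degrees);
        double dx = x - px;
        double dy = y - py;
        return new double[] {
            px + dx * Math.cos(r) - dy * Math.sin(r),
            py + dx * Math.sin(r) + dy * Math.cos(r)
        };
    }

    // Scale point (x, y) relative to pivot (px, py) by factors (sx, sy)
    static double[] scale(double x, double y, double px, double py, double sx, double sy) {
        return new double[] { px + (x - px) * sx, py + (y - py) * sy };
    }

    // Translate (move) point (x, y) by (tx, ty)
    static double[] translate(double x, double y, double tx, double ty) {
        return new double[] { x + tx, y + ty };
    }
}
```

Rotating the point (1, 0) by 90 degrees around the origin lands it at (0, 1), which is exactly what a rotate tag does to every pixel of a View.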
As you might imagine, tween animation definitions can get very complex. This is where the power of using XML to define complicated things, like transformational animation constructs, becomes very apparent. Again, we thank Android for off-loading work like this from Java coding to XML constructs. In XML, the animation transforms are simple lists of nested tags rather than chains of class and method calls. It is certainly easier to fine-tune and refine these types of detailed animations via XML line-entry tweaks than in Java code.
The XML for tween animations goes in an entirely different directory (folder) than frame animation (which goes in /res/drawable). Transform animation goes in the /res/anim folder.
We will use a different XML file-creation method to create our transform animation XML file and its folder, so let's get into that right now.
Right-click your Chapter8 folder in the Eclipse Package Explorer pane at the left and select New
As you can see by the options in the New Android XML dialog, Android in Eclipse has a powerful XML file-creator utility that supports seven different genres of XML files, including animation. Fill out the dialog as follows (and shown in Figure 8-7):
File: The first field we want to fill out is the name of the animation XML file, which is text_animation.xml.
What type of resource would you like to create?: Select Animation as the XML file type, which automatically puts /res/anim as the Folder field at the bottom of the dialog.
Select the root element for the XML file: Make sure that set
is selected as the root element in the file. (The root element is the outermost tag in an XML file and contains all the other tags.) <set>
is used to group and nest transforms to achieve more powerful and flexible results, as you will see in our transform XML markup.
Now click Finish. You will see the /res/anim folder appear in your project hierarchy tree in the Package Explorer pane, with the text_animation.xml file under that.
Now let's add in our XML tags to define our scale and rotation transforms, as shown in Figure 8-8. (Click the Source tab at the bottom of the main window to open the XML code editing window if it does not appear automatically.)
<?xml version="1.0" encoding="utf-8"?>
<set xmlns:android="http://schemas.android.com/apk/res/android"
    android:shareInterpolator="false">
    <scale
        android:interpolator="@android:anim/accelerate_decelerate_interpolator"
        android:fromXScale="1.0"
        android:toXScale="1.4"
        android:fromYScale="1.0"
        android:toYScale="0.6"
        android:pivotX="50%"
        android:pivotY="50%"
        android:fillAfter="false"
        android:duration="700" />
    <set android:interpolator="@android:anim/decelerate_interpolator">
        <scale
            android:fromXScale="1.4"
            android:toXScale="0.0"
            android:fromYScale="0.6"
            android:toYScale="0.0"
            android:pivotX="50%"
            android:pivotY="50%"
            android:startOffset="700"
            android:duration="400"
            android:fillBefore="false" />
        <rotate
            android:fromDegrees="0"
            android:toDegrees="-45"
            android:pivotX="50%"
            android:pivotY="50%"
            android:startOffset="700"
            android:duration="400" />
    </set>
</set>
Notice that there are quite a few attributes for the tags that allow transformational animation over time. For instance, our scale tags allow us to specify to and from values for both x and y dimensions, pivot points (the location on the object from which the scale emanates), a start offset that delays when the scaling begins, a time duration, and whether to fill before or after the transformation.
For rotation tags, we have rotation to and from degree specifications, as well as x and y pivot point settings. We also have a start offset that delays when the rotation begins, and a duration attribute that controls the speed of the rotational transformation. The pivot point defines the center point of the rotation; placing it away from the object's center swings the object around that point, much like the old Spirograph set that created cool flower-like graphics.
Now that our TextView
transform animation XML data is in place inside our newly created /res/anim/text_animation.xml file, we can insert a half dozen lines of Java code into our graphics.java file, to implement the transform animation within our application, directly underneath our frame-based animation.
As shown in Figure 8-9, the first thing we must do is to import the Android classes that are going to be used in the text animation transformation: android.widget.TextView
and the android.view.animation
classes called Animation
and AnimationUtils
.
import android.widget.TextView;
import android.view.animation.Animation;
import android.view.animation.AnimationUtils;
Then down in our onCreate()
method, we specify the TextView
object textAnim
and the Animation
object textAnimation
.
TextView textAnim = (TextView) findViewById(R.id.TV1);
Animation textAnimation = AnimationUtils.loadAnimation(this, R.anim.text_animation);
We then call the startAnimation()
method on the TextView
object, specifying that we want to use the textAnimation Animation
object.
textAnim.startAnimation(textAnimation);
Finally, we need to add a TextView
object named TV1
to our LinearLayout
tag and UI container in our main.xml file, as shown in Figure 8-10.
Now we can try out the tween animation. Right-click the Chapter8 folder in the Package Explorer pane and select Run As...
It runs pretty fast. Let's add a zero on the time values in our text_animation.xml file, changing 400
to 4000
and 700
to 7000
.
Compile and run the app again. You'll see that the animation runs ten times slower.
Transitions are preprogrammed custom special effects like crossfades and directional wipes. Using these effects can increase the perceived professionalism of your application.
You can use XML to set up such graphics transformations.
Android provides the TransitionDrawable
class. Here, we will use it in conjunction with an XML file in the /res/drawable directory, just as we did in the frame-based animation example, since we are working solely with bitmap images.
So let's get started.
Right-click the Chapter8 folder and select New
Name the file image_transition.xml, as shown at the bottom of the New File dialog in Figure 8-11.
Next, add the <transition>
tag as follows. The <transition>
tag has the usual xmlns
reference (to make our file valid Android XML). Inside the tag, we specify two <item>
tags referencing the images that we need to transition from and transition to. We are using the two images from Chapter 7 here to show that the transitions will accommodate the alpha channel and more complicated masking of images, which is important for advanced designs:
<?xml version="1.0" encoding="utf-8"?>
<transition xmlns:android="http://schemas.android.com/apk/res/android">
    <item android:drawable="@drawable/image1" />
    <item android:drawable="@drawable/image2" />
</transition>
Add the two images to the drawable folder. Figure 8-12 shows what your screen should look like once you have added the two images, refreshed the IDE, and typed in your tags.
Now we need to add an ImageView
in our LinearLayout
to hold our image transition. Put the following in the main.xml file underneath our animated TextView
, as shown in Figure 8-13.
<ImageView
    android:layout_width="wrap_content"
    android:layout_height="wrap_content"
    android:src="@drawable/image1"
    android:id="@+id/imgTrans" />
We are specifying the first image (the "from" image) of our transition as the source image to use in the ImageView
object, and we are naming it imgTrans
via the now familiar @+id/imgTrans
notation.
Now we are ready to drop a few lines of Java code (a whopping four this time) into graphics.java to add the ability to do a fade transition from one image slowly into another.
Here is the code to set up the ImageView
we just added to access the transition:
TransitionDrawable trans = (TransitionDrawable) getResources().getDrawable(R.drawable.image_transition);
This is all on one line, as shown in Figure 8-14.
We have no new import
statements to add, so the import
statements block of code is shown closed in Figure 8-14. This is indicated by a plus sign (+) next to the block, which signifies that the block can be expanded (just click the +). You can click any of the minus signs (–) in your Java code window to close classes you are finished editing, if you want a higher-level view of your code. Once your code becomes long and involved, you will find that you use this simple feature regularly. Try it, and get used to making it a part of your work process inside the Eclipse IDE.
This code declares our TransitionDrawable
object, which we name trans
. It sets trans
to the results of the call to the getDrawable()
method of the object returned by the getResources()
method. This obtains the image_transition.xml transition drawable specification, which points to our two circular images.
Setting up that TransitionDrawable
object and loading it with our XML file is the hardest line of code in this quartet. The next three are more familiar and straightforward:
ImageView transImage = (ImageView) findViewById(R.id.imgTrans);
transImage.setImageDrawable(trans);
trans.startTransition(10000);
We create an ImageView object called transImage and, via the findViewById() method, we link it to the imgTrans ID from the second ImageView XML tag we added to main.xml earlier. We then use the setImageDrawable() method to set the transImage ImageView object to the trans TransitionDrawable object that we just created above it.
The second and third lines of Java code bridge our ImageView object with our TransitionDrawable object, and thus complete our wiring together of the various UI view and animation effect objects.
Finally, we can now talk to the trans TransitionDrawable object via its startTransition(milliseconds) method. We use that method to tell the transition to begin, taking 10,000 ms, or 10 seconds, to complete (a nice slow fade).
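As a side note, TransitionDrawable can also play the cross-fade backward via its reverseTransition(milliseconds) method. A brief fragment, reusing the trans object from above (where you trigger the reverse, such as from a click handler, is up to you):

```java
// Cross-fade from image1 to image2 over 10 seconds
trans.startTransition(10000);
// Later (for example, from a click handler), fade back to image1
trans.reverseTransition(10000);
```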
Select Run As...
Another type of drawable utility subclass in the android.graphics.drawable package is NinePatchDrawable. A NinePatch is a resizable bitmap whose scaling during resize operations can be controlled via nine areas that you define in the bitmap image (think tic-tac-toe). This type of image can be used for anything from a scalable button background to a UI background that scales to fit different screen resolutions.
The advantage of the NinePatch drawable object is that you can define a single graphic element (in our example, a 2.7KB PNG file) that can be used across many different UI elements, including buttons, sliders, screen backgrounds, and similar items.
Android comes with a tool for editing NinePatch objects. In the Android SDK tools folder (as shown in Figure 8-15, this is under android-sdk-windows on Windows), you will find a draw9patch.bat batch file. Running this file (from the command line or by using a right-click context-sensitive menu) starts the Draw 9-patch utility.
Figure 8-15. Locating the Android 9-patch editing tool in your operating system directory structure for the Android SDK
This chapter uses screen images from Adobe Photoshop. If you don't have Photoshop, there is a free, open source alternative called GIMP (the GNU Image Manipulation Program), which is similar to Photoshop and available for Mac, Linux, and Windows systems. For more information and to download GIMP, go to http://gimp.org.
Let's try creating some 9-patch buttons.
To launch the 9-patch editor from Windows, right-click the draw9patch.bat file (in Linux and Mac systems, the file name extension may be different) and select Run as Administrator. You will see the Draw 9-patch startup screen.
Select File
The PNG32 image file opens in the 9-patch editor, as shown in Figure 8-17. Select the Show patches check box at the bottom of the window, so that you can see the effects of the patch areas as colored, translucent surfaces over the top of the image.
Draw boundaries with your mouse by dragging the mouse within the 1-pixel wide transparent border area above and to the left of our graphic, as shown in Figure 8-18. The Draw 9-patch utility adds this 1-pixel border area inside the image.
As you add points to this line with a left-click (or subtract them with a right-click, or Shift-click on a Mac), be sure to look at the images on the right. You'll see how they change the way that Android scales your background in real time as you work on the image.
Bring the original and the newly created 9-patch image into Photoshop, as shown in Figure 8-19. Now, you can see what the 9-patch editor is doing to tell Android which parts of the image to tile or scale and which parts of the image to preserve (usually edges).
Select File
Add the following code to the bottom of our LinearLayout container (it can also be seen in Figure 8-21 later in the exercise):
<Button
    android:id="@+id/Button1"
    android:layout_width="wrap_content"
    android:layout_height="wrap_content"
    android:background="@drawable/chromebutton"
    android:textColor="#770000"
    android:padding="3dip"
    android:text="CLICK HERE!"
    android:layout_gravity="center" />
<Button
    android:id="@+id/Button2"
    android:layout_width="wrap_content"
    android:layout_height="wrap_content"
    android:background="@drawable/chromebutton"
    android:textColor="#007700"
    android:padding="30dip"
    android:text="NOW LETS REALLY SCALE THIS UP!"
    android:layout_gravity="center"
    android:textSize="17dip" />
Click the Graphical Layout tab at the bottom of the main.xml editing window, and switch Eclipse into preview mode (as you've done in earlier examples). As shown in Figure 8-20, you can readily see the different effects of scaling the 9-patch image that were previewed in the Draw 9-patch tool.
In this case, small padding values preserve the round ends of the button, while padding values ten times larger scale the image to look more like a piece of chalk. Both button treatments are derived from the same 2.7KB PNG image using the NinePatchDrawable object.
Note how easy it was for us to define and use the NinePatchDrawable class, without any Java code at all. We simply needed to put the chromebutton.9.png 9-patch-compatible image into the /res/drawable folder, so it can be found and accessed by our XML.
Figure 8-21 shows the additions to main.xml. Notice that we added an attribute in each of our UI tags (ImageView and TextView):
android:layout_gravity="center"
Figure 8-21. Adding our 9-patch PNG in our main.xml UI layout and setting layout_gravity to center buttons
Layout gravity is like the alignment feature in word processors and browsers. It allows you to snap a layout container or a UI element to the left, right, top, bottom, or center. It's handy for designing the visual screen layout.
Now choose Run As
As our final topic, we'll look at how to simply and effectively play video files in your Android applications. You do this through a very handy class called VideoView. We are going to add the ability to play video in our application using only three lines of XML code and eight lines of Java code, or fewer than a dozen lines of code in total.
For video playback, we will use the VideoView class. Like TextView and ImageView objects, VideoView objects make it easy to access MPEG-4 H.264 video in your Android applications. Your video can be easily streamed from a remote server, keeping your application download size to a minimum.
To add a VideoView to our LinearLayout in main.xml, place the following new tag underneath the last ImageView tag, in place of our two Button tags (which should be deleted):
<VideoView
    android:layout_height="fill_parent"
    android:layout_width="fill_parent"
    android:id="@+id/VideoView" />
This names our VideoView and tells it to fill its parent container using fill_parent. The fill_parent value does the opposite of wrap_content: it expands the content to fit the layout container, rather than shrinking the layout container down around the content.
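The contrast is easy to see side by side. In this sketch, the VideoView tag matches the one above, while the ImageView tag (and its imgExample ID) is hypothetical and included only for comparison:

```xml
<!-- fill_parent: stretch the view to occupy all of the parent's space -->
<VideoView
    android:layout_width="fill_parent"
    android:layout_height="fill_parent"
    android:id="@+id/VideoView" />

<!-- wrap_content: shrink the view down to its content's natural size -->
<ImageView
    android:layout_width="wrap_content"
    android:layout_height="wrap_content"
    android:src="@drawable/image1"
    android:id="@+id/imgExample" />
```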
With this in our LinearLayout for our Chapter8 project, replacing the two Button tags, we will now have video at the bottom of our app screen, under our transition object. Since our VideoView object is getting pushed off the bottom of the screen in our vertical layout, let's temporarily disable our frame-based animation while we develop our VideoView code. We do this by commenting out a block of code, as follows:
<!--
<ImageView
    android:layout_width="wrap_content"
    android:layout_height="wrap_content"
    android:layout_gravity="center"
    android:id="@+id/iv1" />
-->
So, to comment out a block of code in XML, simply add the <!-- opening delimiter and the --> closing delimiter, as shown here.
Your new main.xml code should look like Figure 8-22. You can see that once this block of code is commented out, Eclipse changes the color of the code, and the Android compiler no longer sees that code. As far as it's concerned, the code is not there anymore. We'll do the same thing in our Java code as well, except using a different comment method.
In Java, a line of code is commented out by adding two forward slashes (//). In graphics.java, we begin by commenting out our import android.view.MotionEvent statement, as shown in Figure 8-23. Eclipse turns the commented code green to show that it is no longer recognized by the compiler.
Remember that we commented out the code for our frame-based animation in our XML file. Let's now comment out the code that implements that frame-based animation in our Java file. We also comment out the touchscreen code, as follows (and shown in Figure 8-23):
//ImageView logoImage = (ImageView) findViewById(R.id.iv1);
//logoImage.setBackgroundResource(R.drawable.logo_animation);
//logoAnimation = (AnimationDrawable) logoImage.getBackground();
Now, add three new import statements for the classes we need:
import android.net.Uri;
import android.widget.VideoView;
import android.widget.MediaController;
To get the video from our server, we also need to define its path using a Uri object, so we must import the android.net.Uri class. We next import the VideoView widget (android.widget.VideoView). Finally, to play the video in the VideoView, we will use a MediaController object, so we import the android.widget.MediaController class as well.
Next, add the following to create our VideoView object (see Figure 8-24):
Uri vidFile = Uri.parse("http://commonsware.com/misc/test2.3gp");
VideoView videoView = (VideoView) findViewById(R.id.VideoView);
videoView.setVideoURI(vidFile);
videoView.setMediaController(new MediaController(this));
videoView.start();
First, we create the Uri reference object, which holds the path, or address, to the video file on the server. A Uniform Resource Identifier (URI) can use the familiar HTTP server paradigm or a more advanced real-time streaming protocol. As you can see, here we are using the HTTP protocol, which works fine and is the industry standard. We create a Uri object called vidFile by calling the parse() method with the HTTP URL to any valid path and file name in quotes. Here, the Uri object points to the content at http://commonsware.com/misc/test2.3gp, so that we have some video to play.
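To see what a parsed URI actually holds, here is a small desktop sketch using the standard java.net.URI class as a stand-in, since android.net.Uri is only available on-device. The breakdown into scheme, host, and path is the same idea Uri.parse() performs for us:

```java
import java.net.URI;

public class UriDemo {
    public static void main(String[] args) {
        // java.net.URI breaks the same video URL into its component parts,
        // much as android.net.Uri.parse() does inside our Activity.
        URI vidFile = URI.create("http://commonsware.com/misc/test2.3gp");
        System.out.println(vidFile.getScheme()); // the protocol: http
        System.out.println(vidFile.getHost());   // the server: commonsware.com
        System.out.println(vidFile.getPath());   // the file path: /misc/test2.3gp
    }
}
```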
Now we have an object called vidFile that contains a reference to our video file.
Next, we set up our VideoView object, calling it videoView and using findViewById() to locate the VideoView we created in our XML layout file. This is the same thing we have been doing with the other View types, and it should be pretty familiar to you at this point.
Now that we have a videoView object, we use the setVideoURI() method to pass the vidFile Uri object to the videoView VideoView object, so that the VideoView is loaded with the file path to use to retrieve the video. Now our Uri is wired into our VideoView, and we need only to wire the MediaController into the VideoView so that the video can be played.
The next line of code connects the videoView object to the MediaController object using the videoView object's setMediaController() method, and invokes a handy code-optimization trick: declaring a new MediaController object inside the setMediaController() method call. The long form would require two lines of code and an additional object variable, like so:
MediaController mediaControl = new MediaController(this);
videoView.setMediaController(mediaControl);
Finally, to start our videoView object playing, we send it a start() method call via the last line of code:
videoView.start();
We are finished setting up our VideoView object. Now select Run As
In this chapter, we took a look at the more advanced graphics capabilities that Android offers, including two different types of animation, image transitions, and digital video. You also learned a little more about the Eclipse IDE, code commenting, and image file formats that are optimal for Android apps.
Here are some important points to remember:
Always use PNG24 (which is really PNG32) format.
Bitmap animation and tween animation are two completely different things as far as Android is concerned. Bitmap-related animation and transitions are handled through the /res/drawable folder. Tween animation is handled via XML files defined in the /res/anim folder.
Don't limit yourself when using tween animation. Use it on any type of View container you like: text, image, video, or whatever; wax creative.
In the next chapter, we'll start looking at how to make things interactive by setting up our applications to handle events and to listen for those events via event listeners.