Networked Video

As more and more media moves onto the Internet, it makes sense for Android to have good support for playing it back, which it does. For the remainder of this chapter, we'll explore the details of what is supported in terms of protocols and video formats, and how to harness network video.

Supported Network Video Types

Android currently supports two different protocols for network-delivered video.

HTTP

The first is media delivered via standard HTTP. Because HTTP is broadly supported across networks and doesn't typically run into the firewall problems that other streaming protocols have had, a large amount of media is available in this manner. Media delivered via HTTP is commonly referred to as progressive download.

Android supports on-demand media within MPEG-4 and 3GP files delivered from a standard web server via HTTP. At this time, it does not support the delivery of live video via HTTP using any of the new techniques now being used by Apple, Microsoft, or Adobe.

There are several things to keep in mind when preparing video for delivery via progressive download. First, the media has to be encoded with a codec and in a format that Android supports (see Chapter 9 for details about the formats and codecs that Android supports).

There are many free and commercial tools available to prepare media for delivery via HTTP progressive download. A few of them, in no particular order, are QuickTime X, Adobe Media Encoder, HandBrake, and VLC. QuickTime X has presets for iPhone encoding that work well with Android. Adobe Media Encoder has presets for iPod that seem to work as well. In general, if a piece of software has presets for the iPhone, they will likely work for Android devices.

Second, the bitrate of the video should be in the range of what the network carrying it can deliver. For instance, GPRS bandwidth can be as low as 20 kbps, and therefore the audio and video should be encoded with that in mind. In general, when delivered via HTTP, the media will be buffered on the device, and playback will start once enough has been downloaded that playback can run straight through to the end of the file without pausing to wait for more media. If the network delivers only 20 kbps and the media is encoded at 400 kbps, each second of video takes 20 seconds to download, which is far from ideal.
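
To make the tradeoff concrete, the startup delay for progressive download can be estimated from the media bitrate, the network speed, and the clip duration. The following sketch is plain Java, not tied to any Android API; the class and method names and the sample numbers are illustrative, not part of any platform API:

```java
public class BufferEstimate {
    /**
     * Rough estimate of how many seconds of buffering are needed before
     * playback can run to the end of the clip without stalling.
     *
     * @param mediaKbps   encoded bitrate of the media, in kbps
     * @param networkKbps sustained download speed, in kbps
     * @param durationSec duration of the clip, in seconds
     */
    public static double startupDelaySeconds(double mediaKbps,
                                             double networkKbps,
                                             double durationSec) {
        if (networkKbps >= mediaKbps) {
            return 0; // the network keeps up; playback can start right away
        }
        // Total download time minus playback time is the head start required.
        double totalKbits = mediaKbps * durationSec;
        double downloadSec = totalKbits / networkKbps;
        return downloadSec - durationSec;
    }
}
```

For the 400 kbps media on a 20 kbps link described above, a 60-second clip would take 1,200 seconds to download, so the player would need roughly a 19-minute head start before uninterrupted playback could begin.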

If, though, the user is on WiFi, 400 kbps is probably good and will provide nice-looking video as compared to video that is encoded at 20 kbps. In general, the speed of the network that will be used has to be weighed against the quality of the video. The nice thing about using HTTP progressive download is that this can be done: the media doesn't have to be delivered in real time as it does with RTSP, which we'll discuss next.

Finally, in order for the video to be played back while it is downloading, it has to be encoded in a manner that allows this. Specifically this means that the resulting file should have what is called the “moov atom” at the front of the file. The “moov atom” contains an index of what is in the file and how it is organized. In order for the video playback software to be able to start playing back the video, it needs to know this information. If the “moov atom” is at the end of the file, the playback software can't start playback until the entire file is downloaded so it can get the “moov atom.”

Unfortunately, some video capture and encoding tools do not automatically perform this step. In some cases, it is simply a configuration setting; in other cases, you may need to do this step manually. A command-line application called qt-faststart has been developed and ported to many different operating systems and forms the basis for several GUI applications as well. It can be read about and downloaded from http://multimedia.cx/eggs/improving-qt-faststart/.
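Whether a given file is already "fast start" can be checked by walking the file's top-level atoms (also called boxes) and seeing which of "moov" or "mdat" appears first. The following sketch does this for an MP4/3GP stream; it is illustrative only, ignores some rare cases (such as a box size of 0, meaning the box extends to the end of the file), and is not a production-grade parser:

```java
import java.io.DataInputStream;
import java.io.EOFException;
import java.io.IOException;
import java.io.InputStream;

public class MoovChecker {
    /**
     * Returns true if the "moov" atom appears before the "mdat" atom,
     * meaning the file is suitable for progressive-download playback.
     */
    public static boolean moovBeforeMdat(InputStream in) throws IOException {
        DataInputStream data = new DataInputStream(in);
        while (true) {
            long size;
            try {
                size = data.readInt() & 0xFFFFFFFFL; // 32-bit box size
            } catch (EOFException eof) {
                return false; // reached end of file without finding moov
            }
            byte[] type = new byte[4];
            data.readFully(type);
            String name = new String(type, "US-ASCII");
            if (name.equals("moov")) return true;
            if (name.equals("mdat")) return false;
            if (size == 1) {
                // A size of 1 means a 64-bit extended size follows the type.
                size = data.readLong();
                skipFully(data, size - 16);
            } else {
                skipFully(data, size - 8); // skip the rest of this box
            }
        }
    }

    private static void skipFully(InputStream in, long n) throws IOException {
        while (n > 0) {
            long skipped = in.skip(n);
            if (skipped <= 0) throw new IOException("truncated file");
            n -= skipped;
        }
    }
}
```

A file that fails this check is a candidate for qt-faststart, which rewrites the file with the "moov" atom moved to the front.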

RTSP

The second protocol that Android supports for network delivery of video is RTSP. RTSP stands for Real Time Streaming Protocol and is technically not a media delivery protocol; rather, it is a control protocol that is used in support of media delivery. The form of media delivery that is supported along with RTSP in Android is RTP (the Real-time Transport Protocol) but only when paired with RTSP. In other words, RTP on Android doesn't work independently of RTSP.

RTSP and RTP are specific to real-time streaming. This is quite different from HTTP progressive download, in that the media is played as it is received over the network.

It also means that a special server is required to deliver the media. There are several RTSP servers on the market: Apple's open source Darwin Streaming Server, RealNetworks' Helix Server, and the Wowza Media Server are a few. Unfortunately, setting up and working with such a server is out of the scope of what can be covered in this book. Fortunately, a highly reliable service exists that serves media via RTSP that we can test with: YouTube's mobile site, available at http://m.youtube.com.

As with progressive download, a couple of things need to be kept in mind when preparing media for delivery via RTSP. First, the media needs to be encoded with a codec and in a file format that Android supports and that is streamable by an RTSP server. In general, streaming media for mobile devices is encoded as MPEG-4 video and AAC audio in a 3GP container, although other codecs (H.264) and containers (MP4) are also supported.

NOTE: Android currently has two underlying media frameworks, PacketVideo's OpenCORE and one particular to Android called Stagefright. OpenCORE is the original framework used in Android, and it was the exclusive framework until Android 2.2, when Stagefright was introduced.

In Android 2.2 (and all previous versions), OpenCORE is the framework used for streaming video (RTSP), although down the road this may change. The choice of framework will be in the hands of the handset manufacturer, and both frameworks should be compatible at the API level. As this all happens behind the scenes, with luck, we as developers will not need to be concerned with which underlying framework is being used.

More information about the protocols, codecs, container formats, and streaming protocols supported by OpenCORE can be found at www.opencore.net/. Specifically, the OpenCORE Multimedia Framework Capabilities document is available at www.opencore.net/files/opencore_framework_capabilities.pdf. (Unfortunately, at this time, no public documentation of Stagefright's capabilities exists.)

Last, the bitrate of the media needs to be something that can be delivered in real time to the end user depending on his or her network connection. These speeds vary quite a bit depending on the network type. Second-generation networks (GPRS) offer data speeds that top out in the 50 to 100 kbps range. Encoding live video to be delivered in real time over this type of network requires that the video be encoded in the 30 kbps range to account for overhead and varying connection qualities. Moving up to EDGE networks should allow video in the 50 kbps range to be delivered reliably, and a conservative bitrate for today's current 3G networks would be in the 100 kbps range, with many networks capable of supporting significantly higher bitrates.
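The guideline figures above can be summarized as a simple lookup. The class, enum, and method names below are purely illustrative (not any Android API), and the numbers are the conservative starting points from this discussion; a real application should measure actual throughput and adapt rather than hard-code them:

```java
public class BitrateGuide {
    public enum NetworkType { GPRS, EDGE, THREE_G, WIFI }

    /** Conservative video bitrate target, in kbps, for a given network type. */
    public static int conservativeVideoKbps(NetworkType type) {
        switch (type) {
            case GPRS:    return 30;  // 50-100 kbps networks; leave headroom
            case EDGE:    return 50;
            case THREE_G: return 100; // many 3G networks can sustain far more
            default:      return 400; // WiFi, per the progressive-download discussion
        }
    }
}
```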

Unlike HTTP progressive download, RTSP can also be used for live streaming media. This is one of its main advantages over traditional HTTP delivery. RTSP also supports seeking within on-demand media, meaning users can seek to a specific point in the video without having to download all of the media up to and including that point. The server takes care of serving only the media from that point in the file to the player.

Network Video Playback

Android supports HTTP and RTSP video playback in all three video playback methods discussed in Chapter 9. Using either the built-in Media Player activity via an intent or the VideoView class to play either form of network video requires no source code changes. Simply use the HTTP or RTSP URL as the video Uri, and it will work as long as the format is supported.

VideoView Network Video Player

Here is the ViewTheVideo activity example from Chapter 9 that uses a VideoView with an RTSP URL to a video from YouTube's mobile site. The only change is the string passed in to construct the videoUri.

package com.apress.proandroidmedia.ch10.videoview;

import android.app.Activity;
import android.net.Uri;
import android.os.Bundle;
import android.widget.MediaController;
import android.widget.VideoView;

public class ViewTheVideo extends Activity {
    VideoView vv;
    @Override
    public void onCreate(Bundle savedInstanceState) {
        super.onCreate(savedInstanceState);
        setContentView(R.layout.main);

        vv = (VideoView) this.findViewById(R.id.VideoView);
        Uri videoUri = Uri.parse(
                "rtsp://v2.cache2.c.youtube.com/CjgLENy73wIaLwm3JbT_"
                + "9HqWohMYESARFEIJbXYtZ29vZ2xlSARSB3Jlc3VsdHNg"
                + "_vSmsbeSyd5JDA==/0/0/0/video.3gp");
        vv.setMediaController(new MediaController(this));
        vv.setVideoURI(videoUri);
        vv.start();
    }
}

MediaPlayer Network Video Player

Working with the MediaPlayer for network video playback is similar to the MediaPlayer and MediaController code we went over in Chapter 9. In the following example, we'll highlight the portions that are specifically related to network playback. Figure 10-3 shows the example in action. For a full explanation of the MediaPlayer and MediaController, please refer to the examples in Chapter 9.

package com.apress.proandroidmedia.ch10.streamingvideoplayer;

import java.io.IOException;
import android.app.Activity;
import android.os.Bundle;
import android.media.MediaPlayer;
import android.media.MediaPlayer.OnBufferingUpdateListener;
import android.media.MediaPlayer.OnCompletionListener;
import android.media.MediaPlayer.OnErrorListener;
import android.media.MediaPlayer.OnInfoListener;
import android.media.MediaPlayer.OnPreparedListener;
import android.media.MediaPlayer.OnSeekCompleteListener;
import android.media.MediaPlayer.OnVideoSizeChangedListener;
import android.util.Log;
import android.view.Display;
import android.view.MotionEvent;
import android.view.SurfaceHolder;
import android.view.SurfaceView;
import android.view.View;
import android.widget.LinearLayout;
import android.widget.TextView;
import android.widget.MediaController;

The StreamingVideoPlayer activity implements many of the available listener and callback interfaces from MediaPlayer, SurfaceHolder, and MediaController. The OnBufferingUpdateListener is particularly useful when dealing with network-delivered media. This interface specifies an onBufferingUpdate method that is called repeatedly while the media is buffering, allowing us to keep track of how full the buffer is.

public class StreamingVideoPlayer extends Activity implements
        OnCompletionListener, OnErrorListener, OnInfoListener,
        OnBufferingUpdateListener, OnPreparedListener, OnSeekCompleteListener,
        OnVideoSizeChangedListener, SurfaceHolder.Callback,
        MediaController.MediaPlayerControl {

    MediaController controller;
    Display currentDisplay;
    SurfaceView surfaceView;
    SurfaceHolder surfaceHolder;
    MediaPlayer mediaPlayer;

    View mainView;

In this version, we'll use a TextView called statusView to display status messages to the user. We do this because loading a video for playback over the Internet can take quite a bit of time, and without some sort of status message, the user may think the application has hung.

    TextView statusView;

    int videoWidth = 0;
    int videoHeight = 0;

    boolean readyToPlay = false;
    public final static String LOGTAG = "STREAMING_VIDEO_PLAYER";

    @Override
    public void onCreate(Bundle savedInstanceState) {
        super.onCreate(savedInstanceState);

        setContentView(R.layout.main);
        mainView = this.findViewById(R.id.MainView);

        statusView = (TextView) this.findViewById(R.id.StatusTextView);

        surfaceView = (SurfaceView) this.findViewById(R.id.SurfaceView);
        surfaceHolder = surfaceView.getHolder();

        surfaceHolder.addCallback(this);
        surfaceHolder.setType(SurfaceHolder.SURFACE_TYPE_PUSH_BUFFERS);

        mediaPlayer = new MediaPlayer();

        statusView.setText("MediaPlayer Created");

        mediaPlayer.setOnCompletionListener(this);
        mediaPlayer.setOnErrorListener(this);
        mediaPlayer.setOnInfoListener(this);
        mediaPlayer.setOnPreparedListener(this);
        mediaPlayer.setOnSeekCompleteListener(this);
        mediaPlayer.setOnVideoSizeChangedListener(this);

Among the list of MediaPlayer event listeners, our activity implements and is registered to be the OnBufferingUpdateListener.

        mediaPlayer.setOnBufferingUpdateListener(this);

Instead of playing back a file from the SD card, we'll be playing a file served from an RTSP server. The URL to the file is specified in the following String, filePath. We then use the MediaPlayer's setDataSource method, passing in the filePath String. The MediaPlayer knows how to handle loading and playing data from an RTSP server, so we don't have to do anything different to handle it.

        String filePath = "rtsp://v2.cache2.c.youtube.com/CjgLENy73wIaLwm3JbT"
                + "_9HqWohMYESARFEIJbXYtZ29vZ2xlSARSB3Jlc3VsdHNg96LUzsK0781MDA=="
                + "/0/0/0/video.3gp";
        try {
            mediaPlayer.setDataSource(filePath);
        } catch (IllegalArgumentException e) {
            Log.v(LOGTAG, e.getMessage());
            finish();
        } catch (IllegalStateException e) {
            Log.v(LOGTAG, e.getMessage());
            finish();
        } catch (IOException e) {
            Log.v(LOGTAG, e.getMessage());
            finish();
        }

        statusView.setText("MediaPlayer DataSource Set");
        currentDisplay = getWindowManager().getDefaultDisplay();
        controller = new MediaController(this);
    }

    public void surfaceCreated(SurfaceHolder holder) {
        Log.v(LOGTAG, "surfaceCreated Called");

        mediaPlayer.setDisplay(holder);
        statusView.setText("MediaPlayer Display Surface Set");

We'll use the MediaPlayer's prepareAsync method instead of prepare. The prepareAsync method does the preparation in the background on a separate thread, so the user interface doesn't hang. This allows the user to perform other actions and allows us as developers to display a loading animation or something similar.

        try {
            mediaPlayer.prepareAsync();
        } catch (IllegalStateException e) {
            Log.v(LOGTAG, "IllegalStateException " + e.getMessage());
            finish();
        }

So the user knows what's happening while the prepareAsync method is running, we'll update the status message displayed by our statusView TextView.

        statusView.setText("MediaPlayer Preparing");
    }

    public void surfaceChanged(SurfaceHolder holder, int format, int width,
            int height) {
        Log.v(LOGTAG, "surfaceChanged Called");
    }

    public void surfaceDestroyed(SurfaceHolder holder) {
        Log.v(LOGTAG, "surfaceDestroyed Called");
    }

    public void onCompletion(MediaPlayer mp) {
        Log.v(LOGTAG, "onCompletion Called");
        statusView.setText("MediaPlayer Playback Completed");
    }

    public boolean onError(MediaPlayer mp, int whatError, int extra) {
        Log.v(LOGTAG, "onError Called");
        statusView.setText("MediaPlayer Error");
        if (whatError == MediaPlayer.MEDIA_ERROR_SERVER_DIED) {
            Log.v(LOGTAG, "Media Error, Server Died " + extra);
        } else if (whatError == MediaPlayer.MEDIA_ERROR_UNKNOWN) {
            Log.v(LOGTAG, "Media Error, Error Unknown " + extra);
        }
        return false;
    }

    public boolean onInfo(MediaPlayer mp, int whatInfo, int extra) {
        statusView.setText("MediaPlayer onInfo Called");
        if (whatInfo == MediaPlayer.MEDIA_INFO_BAD_INTERLEAVING) {
            Log.v(LOGTAG, "Media Info, Media Info Bad Interleaving " + extra);
        } else if (whatInfo == MediaPlayer.MEDIA_INFO_NOT_SEEKABLE) {
            Log.v(LOGTAG, "Media Info, Media Info Not Seekable " + extra);
        } else if (whatInfo == MediaPlayer.MEDIA_INFO_UNKNOWN) {
            Log.v(LOGTAG, "Media Info, Media Info Unknown " + extra);
        } else if (whatInfo == MediaPlayer.MEDIA_INFO_VIDEO_TRACK_LAGGING) {
            Log.v(LOGTAG, "MediaInfo, Media Info Video Track Lagging " + extra);
        } else if (whatInfo == MediaPlayer.MEDIA_INFO_METADATA_UPDATE) {
            Log.v(LOGTAG, "MediaInfo, Media Info Metadata Update " + extra);
        }
        return false;
    }

    public void onPrepared(MediaPlayer mp) {
        Log.v(LOGTAG, "onPrepared Called");
        statusView.setText("MediaPlayer Prepared");

        videoWidth = mp.getVideoWidth();
        videoHeight = mp.getVideoHeight();

        Log.v(LOGTAG, "Width: " + videoWidth);
        Log.v(LOGTAG, "Height: " + videoHeight);

        if (videoWidth > currentDisplay.getWidth()
                || videoHeight > currentDisplay.getHeight()) {
            float heightRatio = (float) videoHeight
                    / (float) currentDisplay.getHeight();
            float widthRatio = (float) videoWidth
                    / (float) currentDisplay.getWidth();

            if (heightRatio > 1 || widthRatio > 1) {
                if (heightRatio > widthRatio) {
                    videoHeight = (int) Math.ceil((float) videoHeight
                            / (float) heightRatio);
                    videoWidth = (int) Math.ceil((float) videoWidth
                            / (float) heightRatio);
                } else {
                    videoHeight = (int) Math.ceil((float) videoHeight
                            / (float) widthRatio);
                    videoWidth = (int) Math.ceil((float) videoWidth
                            / (float) widthRatio);
                }
            }
        }



        surfaceView.setLayoutParams(
          new LinearLayout.LayoutParams(videoWidth, videoHeight));
        controller.setMediaPlayer(this);
        controller.setAnchorView(this.findViewById(R.id.MainView));
        controller.setEnabled(true);
        controller.show();

        mp.start();
        statusView.setText("MediaPlayer Started");
    }

    public void onSeekComplete(MediaPlayer mp) {
        Log.v(LOGTAG, "onSeekComplete Called");
    }

    public void onVideoSizeChanged(MediaPlayer mp, int width, int height) {
        Log.v(LOGTAG, "onVideoSizeChanged Called");

        videoWidth = mp.getVideoWidth();
        videoHeight = mp.getVideoHeight();

        Log.v(LOGTAG, "Width: " + videoWidth);
        Log.v(LOGTAG, "Height: " + videoHeight);

        if (videoWidth > currentDisplay.getWidth()
                || videoHeight > currentDisplay.getHeight()) {
            float heightRatio = (float) videoHeight
                    / (float) currentDisplay.getHeight();
            float widthRatio = (float) videoWidth
                    / (float) currentDisplay.getWidth();

            if (heightRatio > 1 || widthRatio > 1) {
                if (heightRatio > widthRatio) {
                    videoHeight = (int) Math.ceil((float) videoHeight
                            / (float) heightRatio);
                    videoWidth = (int) Math.ceil((float) videoWidth
                            / (float) heightRatio);
                } else {
                    videoHeight = (int) Math.ceil((float) videoHeight
                            / (float) widthRatio);
                    videoWidth = (int) Math.ceil((float) videoWidth
                            / (float) widthRatio);
                }
            }
        }


        surfaceView.setLayoutParams(
          new LinearLayout.LayoutParams(videoWidth, videoHeight));
    }

Since our activity implements the OnBufferingUpdateListener and is registered as the listener for the MediaPlayer, the following method will be called periodically as media is downloaded and buffered. The buffering occurs during the preparation stage (after prepareAsync or prepare is called).

    public void onBufferingUpdate(MediaPlayer mp, int bufferedPercent) {
        statusView.setText("MediaPlayer Buffering: " + bufferedPercent + "%");
        Log.v(LOGTAG, "MediaPlayer Buffering: " + bufferedPercent + "%");
    }

    public boolean canPause() {
        return true;
    }

    public boolean canSeekBackward() {
        return true;
    }

    public boolean canSeekForward() {
        return true;
    }

    public int getBufferPercentage() {
        return 0;
    }

    public int getCurrentPosition() {
        return mediaPlayer.getCurrentPosition();
    }

    public int getDuration() {
        return mediaPlayer.getDuration();
    }

    public boolean isPlaying() {
        return mediaPlayer.isPlaying();
    }

    public void pause() {
        if (mediaPlayer.isPlaying()) {
            mediaPlayer.pause();
        }
    }

    public void seekTo(int pos) {
        mediaPlayer.seekTo(pos);
    }

    public void start() {
        mediaPlayer.start();
    }

    @Override
    public boolean onTouchEvent(MotionEvent ev) {
        if (controller.isShowing()) {
            controller.hide();
        } else {
            controller.show();
        }
        return false;
    }
}

Figure 10-3. Streaming MediaPlayer activity during playback of video file from YouTube
