Time for action – handling touch events

Let's intercept touch events in DroidBlaster:

  1. In the same way that we created ActivityHandler to process application events in Chapter 5, Writing a Fully Native Application, create jni/InputHandler.hpp to process input events. The input API is declared in android/input.h. Create onTouchEvent() to handle touch events. These events are packaged in an AInputEvent structure. Other input peripherals will be described later in this chapter:
    #ifndef _PACKT_INPUTHANDLER_HPP_
    #define _PACKT_INPUTHANDLER_HPP_
    
    #include <android/input.h>
    
    class InputHandler {
    public:
        virtual ~InputHandler() {};
    
        virtual bool onTouchEvent(AInputEvent* pEvent) = 0;
    };
    #endif
  2. Modify the jni/EventLoop.hpp header file to include and handle an InputHandler instance.

    In a similar way to activity events, define an internal method, processInputEvent(), which is triggered by a static callback, callback_input():

    ...
    #include "ActivityHandler.hpp"
    #include "InputHandler.hpp"
    
    #include <android_native_app_glue.h>
    
    class EventLoop {
    public:
        EventLoop(android_app* pApplication,
                ActivityHandler& pActivityHandler,
                InputHandler& pInputHandler);
        ...
    private:
        ...
        void processAppEvent(int32_t pCommand);
        int32_t processInputEvent(AInputEvent* pEvent);
    
        static void callback_appEvent(android_app* pApplication,
                int32_t pCommand);
        static int32_t callback_input(android_app* pApplication,
                AInputEvent* pEvent);
    
        ...
        ActivityHandler& mActivityHandler;
        InputHandler& mInputHandler;
    };
    #endif
  3. We need to process input events in the jni/EventLoop.cpp source file and notify the associated InputHandler.

    First, connect the Android input queue to callback_input(). The EventLoop itself (that is, this) is passed anonymously through the userData member of the android_app structure. That way, the callback is able to delegate input processing back to our own object, that is, to processInputEvent():

    ...
    EventLoop::EventLoop(android_app* pApplication,
        ActivityHandler& pActivityHandler, InputHandler& pInputHandler):
            mApplication(pApplication),
            mActivityHandler(pActivityHandler),
            mEnabled(false), mQuit(false),
            mInputHandler(pInputHandler) {
        mApplication->userData = this;
        mApplication->onAppCmd = callback_appEvent;
        mApplication->onInputEvent = callback_input;
    }
    
    ...
    
    int32_t EventLoop::callback_input(android_app* pApplication,
                                      AInputEvent* pEvent) {
        EventLoop& eventLoop = *(EventLoop*) pApplication->userData;
        return eventLoop.processInputEvent(pEvent);
    }
    ...
  4. Touchscreen events are of the type MotionEvent (as opposed to key events). They can be discriminated according to their source (AINPUT_SOURCE_TOUCHSCREEN) thanks to the Android native input API (here, AInputEvent_getSource()):

    Note

    Note how callback_input(), and by extension processInputEvent(), returns an integer value (which is intrinsically a Boolean value). This value indicates that an input event (for example, a pressed button) has been processed by the application and does not need to be processed further by the system. For example, 1 is returned when the back button is pressed, to stop event processing and prevent the activity from getting terminated, as illustrated in the sketch after the following code.

    ...
    int32_t EventLoop::processInputEvent(AInputEvent* pEvent) {
        if (!mEnabled) return 0;
    
        int32_t eventType = AInputEvent_getType(pEvent);
        switch (eventType) {
        case AINPUT_EVENT_TYPE_MOTION:
            switch (AInputEvent_getSource(pEvent)) {
            case AINPUT_SOURCE_TOUCHSCREEN:
                return mInputHandler.onTouchEvent(pEvent);
            }
            break;
        }
        return 0;
    }
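
    As noted above, returning 1 consumes an event. The following is a minimal sketch of how the back key could be intercepted by also handling the AINPUT_EVENT_TYPE_KEY case; this variant is illustrative only and is not part of the DroidBlaster code:

    ...
    int32_t EventLoop::processInputEvent(AInputEvent* pEvent) {
        if (!mEnabled) return 0;

        switch (AInputEvent_getType(pEvent)) {
        case AINPUT_EVENT_TYPE_MOTION:
            // ... motion handling, as implemented above ...
            break;
        case AINPUT_EVENT_TYPE_KEY:
            // Consuming the back key (that is, returning 1) prevents
            // the system from terminating the activity.
            if (AKeyEvent_getKeyCode(pEvent) == AKEYCODE_BACK) {
                return 1;
            }
            break;
        }
        return 0;
    }
    ...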
  5. Create jni/InputManager.hpp to handle touch events and implement our new InputHandler interface.

    Define the methods as follows:

    • start() to perform the necessary initialization.
    • onTouchEvent() to update the manager state when a new event is triggered.
    • getDirectionX() and getDirectionY() to indicate the ship direction.
    • setRefPoint() to set the reference point, which is the ship position. Indeed, the direction is defined as the vector between the touch point and the ship location (that is, the reference point).

    Also, declare the necessary members, more specifically mScaleFactor, which contains the proper ratio to convert input events from screen coordinates into game coordinates (remember that we render at a fixed size).

    #ifndef _PACKT_INPUTMANAGER_HPP_
    #define _PACKT_INPUTMANAGER_HPP_
    
    #include "GraphicsManager.hpp"
    #include "InputHandler.hpp"
    #include "Types.hpp"
    
    #include <android_native_app_glue.h>
    
    class InputManager : public InputHandler {
    public:
        InputManager(android_app* pApplication,
                 GraphicsManager& pGraphicsManager);
    
        float getDirectionX() { return mDirectionX; };
        float getDirectionY() { return mDirectionY; };
        void setRefPoint(Location* pRefPoint) { mRefPoint = pRefPoint; };
    
        void start();
    
    protected:
        bool onTouchEvent(AInputEvent* pEvent);
    
    private:
        android_app* mApplication;
        GraphicsManager& mGraphicsManager;
    
        // Input values.
        float mScaleFactor;
        float mDirectionX, mDirectionY;
        // Reference point to evaluate touch distance.
        Location* mRefPoint;
    };
    #endif
  6. Create jni/InputManager.cpp, starting with the constructor:
    #include "InputManager.hpp"
    #include "Log.hpp"
    
    #include <android_native_app_glue.h>
    #include <cmath>
    
    InputManager::InputManager(android_app* pApplication,
            GraphicsManager& pGraphicsManager) :
        mApplication(pApplication), mGraphicsManager(pGraphicsManager),
        mDirectionX(0.0f), mDirectionY(0.0f),
        mRefPoint(NULL) {
    }
    ...
  7. Write the start() method to clear members and compute the scale factor. The scale factor is necessary because, as seen in Chapter 6, Rendering Graphics with OpenGL ES, we need to convert the screen coordinates provided in input events (which depend on the device) into game coordinates:
    ...
    void InputManager::start() {
        Log::info("Starting InputManager.");
        mDirectionX = 0.0f; mDirectionY = 0.0f;
        mScaleFactor = float(mGraphicsManager.getRenderWidth())
                           / float(mGraphicsManager.getScreenWidth());
    }
    ...
  8. The effective event processing comes in onTouchEvent(). Horizontal and vertical directions are computed according to the distance between the reference point and the touch point. This distance is clamped by TOUCH_MAX_RANGE to an arbitrary maximum of 65 game units. Thus, the ship reaches its maximum speed when the reference-to-touch distance is beyond TOUCH_MAX_RANGE game units.

    Touch coordinates are retrieved with AMotionEvent_getX() and AMotionEvent_getY() when you move your finger. The direction vector is reset to 0 when the touch is no longer detected:

    ...
    bool InputManager::onTouchEvent(AInputEvent* pEvent) {
        static const float TOUCH_MAX_RANGE = 65.0f; // In game units.
    
        if (mRefPoint != NULL) {
            if (AMotionEvent_getAction(pEvent)
                            == AMOTION_EVENT_ACTION_MOVE) {
                float x = AMotionEvent_getX(pEvent, 0) * mScaleFactor;
                // Needs a conversion to proper coordinates
                // (origin at bottom/left). Only the Y axis needs it.
                float y = (float(mGraphicsManager.getScreenHeight())
                         - AMotionEvent_getY(pEvent, 0)) * mScaleFactor;
                float moveX = x - mRefPoint->x;
                float moveY = y - mRefPoint->y;
                float moveRange = std::sqrt((moveX * moveX) + (moveY * moveY));
    
                if (moveRange > TOUCH_MAX_RANGE) {
                    float cropFactor = TOUCH_MAX_RANGE / moveRange;
                    moveX *= cropFactor; moveY *= cropFactor;
                }
    
                mDirectionX = moveX / TOUCH_MAX_RANGE;
                mDirectionY = moveY / TOUCH_MAX_RANGE;
            } else {
                mDirectionX = 0.0f; mDirectionY = 0.0f;
            }
        }
        return true;
    }
  9. Create a simple component, jni/MoveableBody.hpp, whose role is to move a PhysicsBody according to input events:
    #ifndef _PACKT_MOVEABLEBODY_HPP_
    #define _PACKT_MOVEABLEBODY_HPP_
    
    #include "InputManager.hpp"
    #include "PhysicsManager.hpp"
    #include "Types.hpp"
    
    class MoveableBody {
    public:
        MoveableBody(android_app* pApplication,
           InputManager& pInputManager, PhysicsManager& pPhysicsManager);
    
        PhysicsBody* registerMoveableBody(Location& pLocation,
                int32_t pSizeX, int32_t pSizeY);
    
        void initialize();
        void update();
    
    private:
        PhysicsManager& mPhysicsManager;
        InputManager& mInputManager;
    
        PhysicsBody* mBody;
    };
    #endif
  10. Implement this component in jni/MoveableBody.cpp.

    InputManager and the body are bound in registerMoveableBody():

    #include "Log.hpp"
    #include "MoveableBody.hpp"
    
    MoveableBody::MoveableBody(android_app* pApplication,
          InputManager& pInputManager, PhysicsManager& pPhysicsManager) :
        mInputManager(pInputManager),
        mPhysicsManager(pPhysicsManager),
        mBody(NULL) {
    }
    
    PhysicsBody* MoveableBody::registerMoveableBody(Location& pLocation,
            int32_t pSizeX, int32_t pSizeY) {
        mBody = mPhysicsManager.loadBody(pLocation, pSizeX, pSizeY);
        mInputManager.setRefPoint(&pLocation);
        return mBody;
    }
    ...
  11. Initially, the body has no velocity.

    Then, each time it is updated, the velocity mirrors the current input state. This velocity is used as input by the PhysicsManager, created in Chapter 5, Writing a Fully Native Application, to update the entity's position:

    ...
    void MoveableBody::initialize() {
        mBody->velocityX = 0.0f;
        mBody->velocityY = 0.0f;
    }
    
    void MoveableBody::update() {
        static const float MOVE_SPEED = 320.0f;
        mBody->velocityX = mInputManager.getDirectionX() * MOVE_SPEED;
        mBody->velocityY = mInputManager.getDirectionY() * MOVE_SPEED;
    }

    Reference the new InputManager and MoveableBody in jni/DroidBlaster.hpp:

    ...
    #include "EventLoop.hpp"
    #include "GraphicsManager.hpp"
    #include "InputManager.hpp"
    #include "MoveableBody.hpp"
    #include "PhysicsManager.hpp"
    #include "Resource.hpp"
    ...
    
    class DroidBlaster : public ActivityHandler {
        ...
    private:
        TimeManager     mTimeManager;
        GraphicsManager mGraphicsManager;
        PhysicsManager  mPhysicsManager;
        SoundManager    mSoundManager;
        InputManager    mInputManager;
        EventLoop       mEventLoop;
        ...
        Asteroid mAsteroids;
        Ship mShip;
        StarField mStarField;
        SpriteBatch mSpriteBatch;
        MoveableBody mMoveableBody;
    };
    #endif
  12. Finally, adapt the jni/DroidBlaster.cpp constructor to instantiate InputManager and MoveableBody.

    Pass InputManager to EventLoop, which dispatches input events, at construction time.

    The spaceship is the entity being moved. So, pass a reference to its location to the MoveableBody component:

    ...
    DroidBlaster::DroidBlaster(android_app* pApplication):
        mTimeManager(),
        mGraphicsManager(pApplication),
        mPhysicsManager(mTimeManager, mGraphicsManager),
        mSoundManager(pApplication),
        mInputManager(pApplication, mGraphicsManager),
        mEventLoop(pApplication, *this, mInputManager),
        ...
        mAsteroids(pApplication, mTimeManager, mGraphicsManager,
        mPhysicsManager),
        mShip(pApplication, mGraphicsManager, mSoundManager),
        mStarField(pApplication, mTimeManager, mGraphicsManager,
                STAR_COUNT, mStarTexture),
        mSpriteBatch(mTimeManager, mGraphicsManager),
        mMoveableBody(pApplication, mInputManager, mPhysicsManager) {
        ...
        Sprite* shipGraphics = mSpriteBatch.registerSprite(mShipTexture,
                SHIP_SIZE, SHIP_SIZE);
        shipGraphics->setAnimation(SHIP_FRAME_1, SHIP_FRAME_COUNT,
                SHIP_ANIM_SPEED, true);
        Sound* collisionSound =
                mSoundManager.registerSound(mCollisionSound);
        mMoveableBody.registerMoveableBody(shipGraphics->location,
                SHIP_SIZE, SHIP_SIZE);
        mShip.registerShip(shipGraphics, collisionSound);
    
        // Creates asteroids.
        ...
    }
    ...
  13. Initialize and update MoveableBody and InputManager in the corresponding methods:
    ...
    status DroidBlaster::onActivate() {
        Log::info("Activating DroidBlaster");
        if (mGraphicsManager.start() != STATUS_OK) return STATUS_KO;
        if (mSoundManager.start() != STATUS_OK) return STATUS_KO;
        mInputManager.start();
    
        mSoundManager.playBGM(mBGM);
    
        mAsteroids.initialize();
        mShip.initialize();
        mMoveableBody.initialize();
    
        mTimeManager.reset();
        return STATUS_OK;
    }
    
    ...
    
    status DroidBlaster::onStep() {
        mTimeManager.update();
        mPhysicsManager.update();
    
        mAsteroids.update();
        mMoveableBody.update();
    
        return mGraphicsManager.update();
    }
    ...

What just happened?

We created a simple example of an input system based on touch events. The ship flies toward the touch point at a speed that depends on the touch distance. The touch event coordinates are absolute. Their origin is in the upper-left corner of the screen, as opposed to OpenGL, whose origin is in the lower-left corner. If screen rotation is permitted by an application, the screen origin remains in the upper-left corner from the user's point of view, whether the device is in portrait or landscape mode.

To implement this new feature, we connected our event loop to the input event queue provided by the native_app_glue module. This queue is internally represented as a UNIX pipe, like the activity event queue. Touchscreen events are embedded in the AInputEvent structure, which also stores the other kinds of input events. Input events are handled with the AInputEvent and AMotionEvent APIs declared in android/input.h. The AInputEvent API is necessary to discriminate input event types using the AInputEvent_getType() and AInputEvent_getSource() methods. The AMotionEvent API provides methods to handle touch events only.

The touch API is rather rich. Many details can be requested, as shown in the following list (non-exhaustive):

• AMotionEvent_getAction(): To detect whether a finger makes contact with the screen, leaves it, or moves over its surface. The result is an integer value composed of the event type (on byte 1, for example, AMOTION_EVENT_ACTION_DOWN) and a pointer index (on byte 2, to know which finger the event refers to).

• AMotionEvent_getX() and AMotionEvent_getY(): To retrieve touch coordinates on the screen, expressed in pixels as a float (sub-pixel values are possible).

• AMotionEvent_getDownTime() and AMotionEvent_getEventTime(): To retrieve how long a finger has been sliding over the screen and when the event was generated, in nanoseconds.

• AMotionEvent_getPressure() and AMotionEvent_getSize(): To detect the pressure intensity and zone. Values usually range between 0.0 and 1.0 (but may exceed this range). Size and pressure are generally closely related. The behavior can vary greatly and be noisy, depending on the hardware.

• AMotionEvent_getHistorySize(), AMotionEvent_getHistoricalX(), and AMotionEvent_getHistoricalY(): Touch events of type AMOTION_EVENT_ACTION_MOVE can be grouped together for efficiency purposes. These methods give access to the historical points that occurred between the previous and current events.

Have a look at android/input.h for an exhaustive list of methods.
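
Under the assumption of a motion event received in a handler such as InputHandler::onTouchEvent(), the following minimal sketch shows how the action value can be decoded with the masks from android/input.h and how the batched historical samples of the first pointer can be walked through (the helper name processHistory() is hypothetical):

    #include <android/input.h>

    // Hypothetical helper: decodes the action value and walks through
    // the batched historical points of the first pointer (index 0).
    void processHistory(AInputEvent* pEvent) {
        int32_t action = AMotionEvent_getAction(pEvent);
        // Byte 1 holds the event type, byte 2 the pointer index
        // (meaningful for AMOTION_EVENT_ACTION_POINTER_DOWN/UP).
        int32_t type  = action & AMOTION_EVENT_ACTION_MASK;
        int32_t index = (action & AMOTION_EVENT_ACTION_POINTER_INDEX_MASK)
                >> AMOTION_EVENT_ACTION_POINTER_INDEX_SHIFT;
        (void) index; // Unused here; relevant to tell fingers apart.

        if (type == AMOTION_EVENT_ACTION_MOVE) {
            // Historical samples are older than the current coordinates
            // returned by AMotionEvent_getX() and AMotionEvent_getY().
            size_t historySize = AMotionEvent_getHistorySize(pEvent);
            for (size_t i = 0; i < historySize; ++i) {
                float x = AMotionEvent_getHistoricalX(pEvent, 0, i);
                float y = AMotionEvent_getHistoricalY(pEvent, 0, i);
                // ... process the intermediate point (x, y) ...
            }
        }
    }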

If you look more deeply at the AMotionEvent API, you will notice that some methods have a second parameter, pointer_index, which ranges between 0 and the number of active pointers. Indeed, most touchscreens today are multi-touch! Two or more fingers on a screen (if the hardware supports it) are translated by Android into two or more pointers. To manipulate them, look at the following methods:

• AMotionEvent_getPointerCount(): To know how many fingers touch the screen.

• AMotionEvent_getPointerId(): To get a pointer's unique identifier from a pointer index. This is the only way to track a particular pointer (that is, finger) over time, as its index may change when fingers touch or leave the screen.
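
As a minimal sketch, again assuming a motion event received in a handler like onTouchEvent(), all active pointers can be enumerated as follows; the identifier, not the index, is what should be stored to follow a given finger across events:

    // Iterates over all active pointers of a motion event.
    size_t pointerCount = AMotionEvent_getPointerCount(pEvent);
    for (size_t i = 0; i < pointerCount; ++i) {
        // The identifier remains stable over time; the index may not.
        int32_t id = AMotionEvent_getPointerId(pEvent, i);
        float x = AMotionEvent_getX(pEvent, i);
        float y = AMotionEvent_getY(pEvent, i);
        // ... associate (x, y) with the finger identified by id ...
    }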

Tip

If you followed the story of the (now prehistoric!) Nexus One, then you know that it came out with a hardware defect. Pointers were often getting mixed up, two of them exchanging one of their coordinates. So always be prepared to handle hardware specificities or hardware that behaves incorrectly!
