Learning Android gestures

What happens inside the system when a user touches the screen? And, more importantly, how do you handle it right? The time has come to figure it all out once and for all! Today’s post covers what I’ve learned about the Android touch system and my experience with it.

  1. Appearances are deceptive
  2. What’s inside?
  3. System detectors for gestures and touches
  4. Your own detector for gestures
  5. Intercept and delegate!
  6. Conclusion

1. Appearances are deceptive

Recently I got the task of developing the FrescoImageViewer library for viewing photographs loaded with Fresco. Besides that, I also needed to implement “pinch to zoom”, page switching with ViewPager, and something similar to “swipe to dismiss”: closing the image with a vertical swipe. After putting the main components together I ran into a significant problem: a gesture conflict.

Since I didn’t have much experience with gestures, the first idea that crossed my mind was to analyze events in onTouchEvent() inside my CustomView and hand over control when needed. But the behavior turned out to be less obvious than I expected.

The documentation says that onTouchEvent() should return true if the event was consumed and false otherwise. But for some reason it omits the fact that if we return true and later start returning false, nothing changes until the gesture completes. In other words, once we tell the system that onTouchEvent() is interested in what’s happening, that decision cannot be revoked for the current gesture. This was a pain in the neck, so I finally opened Google and started educating myself on the framework Android uses for managing gestures.

2. What’s inside?

For better understanding, let’s take a step-by-step approach and see what happens inside this not-so-complicated mechanism. As an example I’ll use an Activity, with a ViewGroup and a child View inside, that a user has just put a finger on:

  1. The system wraps the input event into a MotionEvent object containing all the useful data: the action type, current and previous touch coordinates, event time, the number of fingers (pointers) touching the screen and their order, and so on.
  2. The generated object goes to Activity.dispatchTouchEvent(), which is always called first. If the activity does not return true (i.e. is not interested in processing the event at its level), the event is sent to the root View.
  3. The root element calls dispatchTouchEvent() on all involved child elements, in the reverse order of how they were added. They do the same with their own children, passing the event down the hierarchy until some element reacts to it (returns true).*
  4. When the dispatchTouchEvent() of the lowest View is reached, the chain unwinds back up through the onTouchEvent() method, which likewise returns an interested/not interested result.
  5. If no one is interested, the event is returned to Activity.onTouchEvent().
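
The chain above can be modeled outside Android with a few plain classes. This is just an illustrative sketch: FakeView, FakeViewGroup and FakeActivity are stand-ins for the real framework types, and a simple `interested` flag plays the role of onTouchEvent()’s return value.

```java
import java.util.ArrayList;
import java.util.List;

class FakeView {
    final boolean interested;

    FakeView(boolean interested) { this.interested = interested; }

    boolean dispatchTouchEvent(String event) {
        return onTouchEvent(event);
    }

    boolean onTouchEvent(String event) {
        return interested; // true = "I consumed the event"
    }
}

class FakeViewGroup extends FakeView {
    final List<FakeView> children = new ArrayList<>();

    FakeViewGroup(boolean interested) { super(interested); }

    @Override
    boolean dispatchTouchEvent(String event) {
        // children are asked in reverse order of addition (topmost first)
        for (int i = children.size() - 1; i >= 0; i--) {
            if (children.get(i).dispatchTouchEvent(event)) {
                return true; // a child consumed the event, stop here
            }
        }
        // no child was interested, so the group's own onTouchEvent() runs
        return onTouchEvent(event);
    }
}

class FakeActivity {
    final FakeView root;
    boolean reachedActivityOnTouchEvent;

    FakeActivity(FakeView root) { this.root = root; }

    boolean dispatchTouchEvent(String event) {
        if (root.dispatchTouchEvent(event)) return true;
        // nobody consumed it: the event comes back to Activity.onTouchEvent()
        reachedActivityOnTouchEvent = true;
        return false;
    }
}
```

If any view in the tree returns true, the event never reaches the activity’s own onTouchEvent(); if no one is interested, it does.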

The same goes for ViewGroup and View: before onTouchEvent() is called, the existence of an OnTouchListener is checked. If one was set, OnTouchListener.onTouch() is called; if not, onTouchEvent().

*Please note: in ViewGroup, alongside dispatchTouchEvent() there is also onInterceptTouchEvent(), which allows us to intercept an event without notifying the nested elements and thereby make the ViewGroup behave just like a plain View:

  • If, while intercepting a gesture, the ViewGroup says that it is interested, all its child elements receive ACTION_CANCEL.
  • If you need to prevent interception by the parent container and its ancestors from inside a View, call requestDisallowInterceptTouchEvent(true) on the parent.
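
For instance, a child view that handles its own gesture might disable interception like this (a minimal sketch; the surrounding custom View class is assumed):

```java
// Inside a custom child View that must not lose its gesture to the parent
@Override
public boolean onTouchEvent(MotionEvent event) {
    if (event.getAction() == MotionEvent.ACTION_DOWN) {
        // ask the parent chain not to intercept until this gesture ends
        getParent().requestDisallowInterceptTouchEvent(true);
    }
    // ... handle the gesture ...
    return true;
}
```

The flag is reset automatically when the gesture ends (ACTION_UP or ACTION_CANCEL).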

3. System detectors for gestures and touches

Luckily, there’s no need to reinvent the wheel and process everything manually. The GestureDetector built into the SDK is a great help here. It includes the OnGestureListener, OnDoubleTapListener and OnContextClickListener interfaces for notifying you about the event that happened and its type. Here’s what they look like:

public interface OnGestureListener {
   boolean onDown(MotionEvent e);
   void onShowPress(MotionEvent e);
   boolean onSingleTapUp(MotionEvent e);
   boolean onScroll(MotionEvent e1, MotionEvent e2, float distanceX, float distanceY);
   void onLongPress(MotionEvent e);
   boolean onFling(MotionEvent e1, MotionEvent e2, float velocityX, float velocityY);
}
 
public interface OnDoubleTapListener {
   boolean onSingleTapConfirmed(MotionEvent e);
   boolean onDoubleTap(MotionEvent e);
   boolean onDoubleTapEvent(MotionEvent e);
}
 
public interface OnContextClickListener {
   boolean onContextClick(MotionEvent e);
}

As you can see from the method names, the SDK’s GestureDetector lets us distinguish between singleTap, doubleTap, longPress, scroll and fling (you can find a detailed description of these methods in the Javadoc or in the official Android documentation).
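
In practice you rarely implement the whole interface: GestureDetector.SimpleOnGestureListener provides empty stubs for all three interfaces, so you can override only what you need. A minimal sketch (the variable names are illustrative):

```java
GestureDetector detector = new GestureDetector(context,
        new GestureDetector.SimpleOnGestureListener() {
            @Override
            public boolean onDown(MotionEvent e) {
                // return true here, otherwise the rest of the gesture
                // may never be delivered to the other callbacks
                return true;
            }

            @Override
            public boolean onDoubleTap(MotionEvent e) {
                // react to a double tap, e.g. toggle zoom
                return true;
            }
        });
```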

But that’s not all! There’s also ScaleGestureDetector, with only one listener:

public interface OnScaleGestureListener {
   boolean onScale(ScaleGestureDetector detector);
   boolean onScaleBegin(ScaleGestureDetector detector);
   void onScaleEnd(ScaleGestureDetector detector);
}

It recognizes the pinch gesture and notifies us about its beginning, progress and end. Apart from the listener, there are also additional methods for getting all the necessary information (take a look at the documentation).

Now that we’re familiar with the built-in classes, let’s learn how to use them. It’s pretty easy! Simply create an instance of the detector you need:

scaleDetector = new ScaleGestureDetector(context, listener());

And then pass MotionEvent you’ve received to it. For example, in onTouchEvent():

@Override
public boolean onTouchEvent(MotionEvent event) {
   scaleDetector.onTouchEvent(event);
   return super.onTouchEvent(event);
}

You’re good! All recognized gestures are delivered to the listener you passed in.
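
As an illustration, the listener() from the snippet above might look something like this. The scaleFactor field and the clamping bounds are assumptions made for the example, not taken from the library:

```java
private float scaleFactor = 1f;

private ScaleGestureDetector.OnScaleGestureListener listener() {
    return new ScaleGestureDetector.SimpleOnScaleGestureListener() {
        @Override
        public boolean onScale(ScaleGestureDetector detector) {
            // accumulate the relative scale reported since the last event
            scaleFactor *= detector.getScaleFactor();
            // keep the zoom within sane bounds
            scaleFactor = Math.max(1f, Math.min(scaleFactor, 3f));
            invalidate();
            return true;
        }
    };
}
```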

4. Your own detector for gestures

Unfortunately, with the standard tools we can only learn about the touch itself and pointer movements (MotionEvent.ACTION_DOWN and MotionEvent.ACTION_MOVE), but in some cases (which are common, in fact) when processing gestures we also need to know their direction. The standard detectors can’t help us here, so we need to write our own.

Let’s call it SwipeDirectionDetector. The logic is simple: we remember the event coordinates on ACTION_DOWN and then measure the distance to the current point on each ACTION_MOVE. As soon as the distance becomes sufficient to define a direction, we calculate the angle and derive the direction from it.

First, let’s define an onTouchEvent() method accepting a MotionEvent and describe the calculation logic within it:

public boolean onTouchEvent(MotionEvent event) {
   switch (event.getAction()) {
       case MotionEvent.ACTION_DOWN:
           // remember where the gesture started
           startX = event.getX();
           startY = event.getY();
           break;
       case MotionEvent.ACTION_CANCEL:
       case MotionEvent.ACTION_UP:
           // the gesture is over, reset the starting point
           startX = startY = 0.0f;
           break;
       case MotionEvent.ACTION_MOVE:
           // touchSlop is the minimal distance that counts as a drag,
           // obtained via ViewConfiguration.get(context).getScaledTouchSlop()
           if (getDistance(event) > touchSlop) {
               float x = event.getX();
               float y = event.getY();
 
               Direction direction = Direction.get(getAngle(startX, startY, x, y));
               onDirectionDetected(direction);
           }
           break;
   }
   return false;
}
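
The getDistance() and getAngle() helpers used above could be implemented roughly as follows. The names and signatures are assumptions (here written as pure functions of coordinates for clarity); note the inverted Y term, since screen coordinates grow downwards while we want 90° to mean “up”:

```java
class SwipeMath {
    // straight-line distance between the start point and the current point
    static float getDistance(float startX, float startY, float endX, float endY) {
        return (float) Math.hypot(endX - startX, endY - startY);
    }

    // angle of the swipe in degrees, normalized to [0, 360):
    // 0 = right, 90 = up, 180 = left, 270 = down
    static double getAngle(float startX, float startY, float endX, float endY) {
        double angle = Math.toDegrees(Math.atan2(startY - endY, endX - startX));
        return angle < 0 ? angle + 360 : angle;
    }
}
```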

We define Direction as an enum and add a get() method that determines the direction from the angle, plus inRange() for checking whether the angle falls within a given range.

public enum Direction {
   UP,
   DOWN,
   LEFT,
   RIGHT;
 
   public static Direction get(double angle) {
       if (inRange(angle, 45, 135)) {
           return Direction.UP;
       } else if (inRange(angle, 0, 45) || inRange(angle, 315, 360)) {
           return Direction.RIGHT;
       } else if (inRange(angle, 225, 315)) {
           return Direction.DOWN;
       } else {
           return Direction.LEFT;
       }
   }
 
   private static boolean inRange(double angle, float init, float end) {
       return (angle >= init) && (angle < end);
   }
}

We’re almost there. Now let’s create an instance of the detector and pass the received MotionEvent to it:

directionDetector = new SwipeDirectionDetector(getContext()) {
   @Override
   public void onDirectionDetected(Direction detectedDirection) {
      // assign to the enclosing view's field; a plain "this.direction"
      // here would point at the anonymous detector, not the outer class
      direction = detectedDirection;
   }
};
...
@Override
public boolean onTouchEvent(MotionEvent event) {
   directionDetector.onTouchEvent(event);
   return super.onTouchEvent(event);
}

5. Intercept and delegate!

Now let’s look at a simple case: a custom view with a ViewPager and a “swipe to dismiss” container inside. If we simply combine the components, the gestures will be processed simultaneously, which is not good from a UX standpoint.

To solve the issue we need to override dispatchTouchEvent() and use it to feed the detector we wrote. The method has to return true, since we need to intercept the events to get control. As soon as the direction is identified, we can forward the events to the right widget. Remember to do this only via dispatchTouchEvent().

@Override
public boolean dispatchTouchEvent(MotionEvent event) {
   directionDetector.onTouchEvent(event);
 
   //passing the UP action to the widgets and resetting the direction
   if (event.getAction() == MotionEvent.ACTION_UP) {
       direction = null;
       pager.dispatchTouchEvent(event);
       swipeDismissListener.onTouch(dismissContainer, event);
   }
 
   //passing initial action to widgets
   if (event.getAction() == MotionEvent.ACTION_DOWN) {
       swipeDismissListener.onTouch(dismissContainer, event);
       pager.dispatchTouchEvent(event);
   }
 
   if (direction != null) {
       switch (direction) {
           case UP:
           case DOWN:
               return swipeDismissListener.onTouch(dismissContainer, event);
           case LEFT:
           case RIGHT:
               return pager.dispatchTouchEvent(event);
       }
   }
   return true;
}

Now the gesture conflict is solved and everything looks nice and clean (which is good too :)

6. Conclusion

For examples and better understanding of this approach please see the FrescoImageViewer source code on GitHub. I sincerely hope that this post will help somebody understand the gesture system better and save their precious time ;)

P. S. This talk by Dave Smith helped me a lot along the way, so I recommend checking it out as well.


About author

Android Developer
Sasha develops native apps for Android and loves everything connected with the world of mobile devices. Uses Java, DataBinding, Retrofit 2.0 and Realm in his work. He also creates his own solutions for optimizing the development process.
