
Fingerprint gestures

Accessibility services on devices running Android 8.0 (API level 26) or higher can respond to an alternative input mechanism: directional swipes (up, down, left, and right) along a device's fingerprint sensor. To configure a service to receive callbacks about these interactions, complete the following steps:
  1. Declare the USE_FINGERPRINT permission and the CAPABILITY_CAN_REQUEST_FINGERPRINT_GESTURES capability.
  2. Set the FLAG_REQUEST_FINGERPRINT_GESTURES flag within the android:accessibilityFlags attribute.
  3. Register for callbacks using registerFingerprintGestureCallback().
Note: You should allow users to disable an accessibility service's support for fingerprint gestures. Although multiple accessibility services can listen for fingerprint gestures simultaneously, doing so causes the services to conflict with each other.
Keep in mind that not all devices include fingerprint sensors. To identify whether a device supports the sensor, use the isHardwareDetected() method. Even on a device that includes a fingerprint sensor, your service cannot use the sensor when it's in use for authentication purposes. To identify when the sensor is available, call the isGestureDetectionAvailable() method and implement the onGestureDetectionAvailabilityChanged() callback.
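As a minimal sketch (not part of the original article), the hardware check can be performed through FingerprintManager, which is where isHardwareDetected() lives. This assumes the code runs inside the service itself, so getSystemService() is available, and that the USE_FINGERPRINT permission is declared:

import android.content.Context;
import android.hardware.fingerprint.FingerprintManager;
...
// Returns true if the device has a fingerprint sensor at all.
// Requires the USE_FINGERPRINT permission declared in the manifest.
private boolean hasFingerprintSensor() {
    FingerprintManager manager =
            (FingerprintManager) getSystemService(Context.FINGERPRINT_SERVICE);
    return manager != null && manager.isHardwareDetected();
}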
The following code snippet shows an example of using fingerprint gestures to navigate around a virtual game board:
AndroidManifest.xml
<manifest ... >
    <uses-permission android:name="android.permission.USE_FINGERPRINT" />
    ...
    <application>
        <service android:name="com.example.MyFingerprintGestureService" ... >
            <meta-data
                android:name="android.accessibilityservice"
                android:resource="@xml/myfingerprintgestureservice" />
        </service>
    </application>
</manifest>
myfingerprintgestureservice.xml
<accessibility-service xmlns:android="http://schemas.android.com/apk/res/android"
    ...
    android:accessibilityFlags=" ... |flagRequestFingerprintGestures"
    android:canRequestFingerprintGestures="true"
    ... />
MyFingerprintGestureService.java
import static android.accessibilityservice.FingerprintGestureController.*;

import android.accessibilityservice.AccessibilityService;
import android.accessibilityservice.FingerprintGestureController;
import android.util.Log;

public class MyFingerprintGestureService extends AccessibilityService {
    private static final String MY_APP_TAG = "MyFingerprintGesture";

    private FingerprintGestureController mGestureController;
    private FingerprintGestureController
            .FingerprintGestureCallback mFingerprintGestureCallback;
    private boolean mIsGestureDetectionAvailable;

    @Override
    public void onCreate() {
        mGestureController = getFingerprintGestureController();
        mIsGestureDetectionAvailable =
                mGestureController.isGestureDetectionAvailable();
    }

    @Override
    protected void onServiceConnected() {
        if (mFingerprintGestureCallback != null
                || !mIsGestureDetectionAvailable) {
            return;
        }

        mFingerprintGestureCallback =
                new FingerprintGestureController.FingerprintGestureCallback() {
            @Override
            public void onGestureDetected(int gesture) {
                switch (gesture) {
                    case FINGERPRINT_GESTURE_SWIPE_DOWN:
                        moveGameCursorDown();
                        break;
                    case FINGERPRINT_GESTURE_SWIPE_LEFT:
                        moveGameCursorLeft();
                        break;
                    case FINGERPRINT_GESTURE_SWIPE_RIGHT:
                        moveGameCursorRight();
                        break;
                    case FINGERPRINT_GESTURE_SWIPE_UP:
                        moveGameCursorUp();
                        break;
                    default:
                        Log.e(MY_APP_TAG,
                                "Error: Unknown gesture type detected!");
                        break;
                }
            }

            @Override
            public void onGestureDetectionAvailabilityChanged(boolean available) {
                mIsGestureDetectionAvailable = available;
            }
        };

        mGestureController.registerFingerprintGestureCallback(
                mFingerprintGestureCallback, null);
    }
}
For more information, see the What's New In Android Accessibility session video from Google I/O 2017, starting at 9:03.

Multilingual text to speech

As of Android 8.0 (API level 26), Android's text-to-speech (TTS) service can identify and speak phrases in multiple languages within a single block of text. To enable this automatic language-switching capability in an accessibility service, wrap all strings in LocaleSpan objects, as shown in the following code snippet:
TextView localeWrappedTextView = findViewById(R.id.my_french_greeting_text);
localeWrappedTextView.setText(wrapTextInLocaleSpan("Bonjour!", Locale.FRANCE));

private SpannableStringBuilder wrapTextInLocaleSpan(
        CharSequence originalText, Locale loc) {
    SpannableStringBuilder myLocaleBuilder =
            new SpannableStringBuilder(originalText);
    // setSpan() treats the end index as exclusive, so pass the full
    // length to cover the entire string.
    myLocaleBuilder.setSpan(new LocaleSpan(loc), 0,
            originalText.length(), 0);
    return myLocaleBuilder;
}
For more information, see the What's New In Android Accessibility session video from Google I/O 2017, starting at 10:59.

Take action for users

Starting with Android 4.0 (API Level 14), accessibility services can act on behalf of users, including changing the input focus and selecting (activating) user interface elements. In Android 4.1 (API Level 16) the range of actions was expanded to include scrolling lists and interacting with text fields. Accessibility services can also take global actions, such as navigating to the Home screen, pressing the Back button, and opening the notifications screen and recent applications list. Android 4.1 also includes a new type of focus, Accessibility Focus, which makes all visible elements selectable by an accessibility service.
These new capabilities make it possible for developers of accessibility services to create alternative navigation modes such as gesture navigation, and give users with disabilities improved control of their Android devices.

Listen for gestures

Accessibility services can listen for specific gestures and respond by taking action on behalf of a user. This feature, added in Android 4.1 (API Level 16), requires that your accessibility service request activation of the Explore by Touch feature. Your service can request this activation by setting the flags member of the service's AccessibilityServiceInfo instance to FLAG_REQUEST_TOUCH_EXPLORATION_MODE, as shown in the following example.
public class MyAccessibilityService extends AccessibilityService {
    @Override
    protected void onServiceConnected() {
        // Modify a copy of the service info and apply it with
        // setServiceInfo() so the flag change actually takes effect.
        AccessibilityServiceInfo info = getServiceInfo();
        info.flags |= AccessibilityServiceInfo.FLAG_REQUEST_TOUCH_EXPLORATION_MODE;
        setServiceInfo(info);
    }
    ...
}
Once your service has requested activation of Explore by Touch, the user must allow the feature to be turned on, if it is not already active. When this feature is active, your service receives notification of accessibility gestures through your service's onGesture() callback method and can respond by taking actions for the user.
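The following sketch (not from the original article) shows what such a gesture handler might look like inside the service; the moveToNextItem() and moveToPreviousItem() helpers are hypothetical stand-ins for service-specific behavior:

@Override
protected boolean onGesture(int gestureId) {
    switch (gestureId) {
        case AccessibilityService.GESTURE_SWIPE_RIGHT:
            moveToNextItem();      // hypothetical helper
            return true;
        case AccessibilityService.GESTURE_SWIPE_LEFT:
            moveToPreviousItem();  // hypothetical helper
            return true;
        default:
            // Let the system handle gestures this service doesn't consume.
            return false;
    }
}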

Continued gestures

Devices running Android 8.0 (API level 26) include support for continued gestures, or programmatic gestures containing more than one Path object.
When specifying a sequence of strokes, you must specify that they belong to the same programmatic gesture by using the final argument, willContinue, in the GestureDescription.StrokeDescription constructor, as shown in the following code snippet:
// Simulates an L-shaped drag path: 200 pixels right, then 200 pixels down.
private void doRightThenDownDrag() {
    Path dragRightPath = new Path();
    dragRightPath.moveTo(200, 200);
    dragRightPath.lineTo(400, 200);
    long dragRightDuration = 500L; // 0.5 second

    // The starting point of the second path must match
    // the ending point of the first path.
    Path dragDownPath = new Path();
    dragDownPath.moveTo(400, 200);
    dragDownPath.lineTo(400, 400);
    long dragDownDuration = 500L;

    GestureDescription.StrokeDescription rightThenDownDrag =
            new GestureDescription.StrokeDescription(dragRightPath, 0L,
                    dragRightDuration, true);
    // continueStroke() returns the continuation stroke; capture it so it
    // can be dispatched after the first stroke completes.
    GestureDescription.StrokeDescription downStroke =
            rightThenDownDrag.continueStroke(dragDownPath, dragRightDuration,
                    dragDownDuration, false);
}
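The snippet above only builds the strokes. As a sketch of how they might be dispatched (an assumption, not part of the original article), the first stroke can be sent with dispatchGesture() and the continuation sent once the first gesture completes:

private void dispatchTwoPartGesture(
        final GestureDescription.StrokeDescription firstStroke,
        final GestureDescription.StrokeDescription continuation) {
    GestureDescription firstGesture = new GestureDescription.Builder()
            .addStroke(firstStroke) // built with willContinue == true
            .build();
    dispatchGesture(firstGesture, new GestureResultCallback() {
        @Override
        public void onCompleted(GestureDescription gestureDescription) {
            // Send the continuation stroke as a second gesture.
            GestureDescription secondGesture = new GestureDescription.Builder()
                    .addStroke(continuation)
                    .build();
            dispatchGesture(secondGesture, null, null);
        }
    }, null);
}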
For more information, see the What's New In Android Accessibility session video from Google I/O 2017, starting at 15:47.

Use accessibility actions

Accessibility services can take action on behalf of users to make interacting with applications simpler and more productive. The ability of accessibility services to perform actions was added in Android 4.0 (API Level 14) and significantly expanded with Android 4.1 (API Level 16).
In order to take actions on behalf of users, your accessibility service must register to receive events from a few or many applications and request permission to view the content of applications by setting the android:canRetrieveWindowContent attribute to true in the service configuration file. When your service receives an event, it can retrieve the AccessibilityNodeInfo object from the event using getSource(). With the AccessibilityNodeInfo object, your service can then explore the view hierarchy to determine what action to take and then act for the user using performAction().
public class MyAccessibilityService extends AccessibilityService {
    @Override
    public void onAccessibilityEvent(AccessibilityEvent event) {
        // Get the source node of the event; it can be null if the
        // window content is no longer available.
        AccessibilityNodeInfo nodeInfo = event.getSource();
        if (nodeInfo == null) {
            return;
        }

        // Use the event and node information to determine
        // what action to take.

        // Take action on behalf of the user.
        nodeInfo.performAction(AccessibilityNodeInfo.ACTION_SCROLL_FORWARD);

        // Recycle the nodeInfo object.
        nodeInfo.recycle();
    }
    ...
}
The performAction() method allows your service to take action within an application. If your service needs to perform a global action such as navigating to the Home screen, pressing the Back button, opening the notifications screen or recent applications list, then use the performGlobalAction() method.
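As a brief illustration (a sketch, not from the original article), a global action is a one-line call from within the service:

// Navigate to the Home screen on behalf of the user.
performGlobalAction(AccessibilityService.GLOBAL_ACTION_HOME);

Other constants, such as GLOBAL_ACTION_BACK, GLOBAL_ACTION_NOTIFICATIONS, and GLOBAL_ACTION_RECENTS, cover the remaining global actions mentioned above.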

Use focus types

Android 4.1 (API Level 16) introduces a new type of user interface focus called Accessibility Focus. Accessibility services can use this type of focus to select any visible user interface element and act on it. This focus type is different from the better-known Input Focus, which determines what on-screen user interface element receives input when a user types characters, presses Enter on a keyboard, or pushes the center button of a D-pad control.
Accessibility Focus is completely separate and independent from Input Focus. In fact, it is possible for one element in a user interface to have Input Focus while another element has Accessibility Focus. The purpose of Accessibility Focus is to provide accessibility services with a method of interacting with any visible element on a screen, regardless of whether or not the element is input-focusable from a system perspective. You can see accessibility focus in action by testing accessibility gestures. For more information about testing this feature, see Testing gesture navigation.
Note: Accessibility services that use Accessibility Focus are responsible for synchronizing the current Input Focus when an element is capable of this type of focus. Services that do not synchronize Input Focus with Accessibility Focus run the risk of causing problems in applications that expect input focus to be in a specific location when certain actions are taken.
An accessibility service can determine what user interface element has Input Focus or Accessibility Focus using the AccessibilityNodeInfo.findFocus() method. You can also search for elements that can be selected with Input Focus using the focusSearch() method. Finally, your accessibility service can set Accessibility Focus using the performAction(AccessibilityNodeInfo.ACTION_SET_ACCESSIBILITY_FOCUS) method.
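As a minimal sketch (assumed, not from the original article), the following moves Accessibility Focus to whichever element currently holds Input Focus, one way a service can keep the two focus types synchronized:

AccessibilityNodeInfo root = getRootInActiveWindow();
if (root != null) {
    // Find the element that currently holds Input Focus.
    AccessibilityNodeInfo inputFocused =
            root.findFocus(AccessibilityNodeInfo.FOCUS_INPUT);
    if (inputFocused != null) {
        // Give the same element Accessibility Focus.
        inputFocused.performAction(
                AccessibilityNodeInfo.ACTION_SET_ACCESSIBILITY_FOCUS);
        inputFocused.recycle();
    }
    root.recycle();
}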

Gather information

Accessibility services also have standard methods of gathering and representing key units of user-provided information, such as event details, text, and numbers.

Get event details

The Android system provides information to accessibility services about the user interface interaction through AccessibilityEvent objects. Prior to Android 4.0, the information available in an accessibility event, while providing a significant amount of detail about a user interface control selected by the user, offered limited contextual information. In many cases, this missing context information might be critical to understanding the meaning of the selected control.
An example of an interface where context is critical is a calendar or day planner. If the user selects a 4:00 PM time slot in a Monday to Friday day list and the accessibility service announces “4 PM”, but does not announce the weekday name, the day of the month, or the month name, the resulting feedback is confusing. In this case, the context of a user interface control is critical to a user who wants to schedule a meeting.
Android 4.0 significantly extends the amount of information that an accessibility service can obtain about a user interface interaction by composing accessibility events based on the view hierarchy. A view hierarchy is the set of user interface components that contain the component (its parents) and the user interface elements that may be contained by that component (its children). In this way, the Android system can provide much richer detail about accessibility events, allowing accessibility services to provide more useful feedback to users.
An accessibility service gets information about a user interface event through an AccessibilityEvent passed by the system to the service's onAccessibilityEvent() callback method. This object provides details about the event, including the type of object being acted upon, its descriptive text, and other details. Starting in Android 4.0 (and supported in previous releases through the AccessibilityEventCompat object in the Support Library), you can obtain additional information about the event using these calls:
  • AccessibilityEvent.getRecordCount() and getRecord(int) - These methods allow you to retrieve the set of AccessibilityRecord objects which contributed to the AccessibilityEvent passed to you by the system. This level of detail provides more context for the event that triggered your accessibility service.
  • AccessibilityEvent.getSource() - This method returns an AccessibilityNodeInfo object. This object allows you to request the view layout hierarchy (parents and children) of the component that originated the accessibility event. This feature allows an accessibility service to investigate the full context of an event, including the content and state of any enclosing views or child views, as shown in the sketch after this list.
    Important: The ability to investigate the view hierarchy from an AccessibilityEvent potentially exposes private user information to your accessibility service. For this reason, your service must request this level of access through the accessibility service configuration XML file, by including the canRetrieveWindowContent attribute and setting it to true. If you do not include this setting in your service configuration XML file, calls to getSource() fail.
    Note: In Android 4.1 (API Level 16) and higher, the getSource() method, as well as AccessibilityNodeInfo.getChild() and getParent(), return only view objects that are considered important for accessibility (views that draw content or respond to user actions). If your service requires all views, it can request them by setting the flags member of the service's AccessibilityServiceInfo instance to FLAG_INCLUDE_NOT_IMPORTANT_VIEWS.
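As a sketch of this kind of exploration (not from the original article; the log tag is arbitrary), a service might walk the source node's children inside onAccessibilityEvent():

AccessibilityNodeInfo source = event.getSource();
if (source != null) {
    // Inspect each child view that is important for accessibility.
    for (int i = 0; i < source.getChildCount(); i++) {
        AccessibilityNodeInfo child = source.getChild(i);
        if (child != null) {
            Log.d("MyAccessibilityService", "Child text: " + child.getText());
            child.recycle();
        }
    }
    source.recycle();
}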

Process text

Devices running Android 8.0 (API level 26) and higher include several text-processing features that make it easier for accessibility services to identify and operate on specific units of text that appear on screen.

Hint text

Android 8.0 (API level 26) includes several methods for interacting with a text-based object's hint text:
  • isShowingHintText() and setShowingHintText() indicate and set, respectively, whether a node's current text content represents the node's hint text.
  • getHintText() provides access to the hint text itself. Even if a node's content isn't hint text, getHintText() returns the hint text value.
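A minimal sketch of reading hint text (not from the original article; node stands for an AccessibilityNodeInfo your service has already obtained):

if (node.isShowingHintText()) {
    // The node's visible text is its hint, not user-entered content.
    CharSequence hint = node.getHintText();
    Log.d("MyAccessibilityService", "Hint text: " + hint);
}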

Locations of on-screen text characters

On devices running Android 8.0 (API level 26) and higher, accessibility services can determine the screen coordinates for each visible character's bounding box within a TextView widget. Services find these coordinates by calling refreshWithExtraData(), passing in EXTRA_DATA_TEXT_CHARACTER_LOCATION_KEY as the first argument and a Bundle object as the second argument. As the method executes, the system populates the Bundle argument with a parcelable array of Rect objects. Each Rect object represents the bounding box of a particular character.
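The following sketch (assumed, not from the original article; textNode stands for an AccessibilityNodeInfo backed by a TextView) requests the bounding boxes for all characters in the node's text:

Bundle args = new Bundle();
args.putInt(AccessibilityNodeInfo
        .EXTRA_DATA_TEXT_CHARACTER_LOCATION_ARG_START_INDEX, 0);
args.putInt(AccessibilityNodeInfo
        .EXTRA_DATA_TEXT_CHARACTER_LOCATION_ARG_LENGTH,
        textNode.getText().length());
boolean success = textNode.refreshWithExtraData(
        AccessibilityNodeInfo.EXTRA_DATA_TEXT_CHARACTER_LOCATION_KEY, args);
if (success) {
    // The per-character bounds can then be read from the node's extras
    // under the same key; each element is one character's bounding box
    // in screen coordinates.
    Parcelable[] bounds = textNode.getExtras().getParcelableArray(
            AccessibilityNodeInfo.EXTRA_DATA_TEXT_CHARACTER_LOCATION_KEY);
}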

Standardized one-sided range values

Some AccessibilityNodeInfo objects use an instance of AccessibilityNodeInfo.RangeInfo to indicate that a UI element can take on a range of values. When creating a range using RangeInfo.obtain(), or when retrieving the extreme values of the range using getMin() and getMax(), keep in mind that devices running Android 8.0 (API level 26) and higher represent one-sided ranges in a standardized manner:
  • Ranges with no minimum use Float.NEGATIVE_INFINITY to represent their minimum value.
  • Ranges with no maximum use Float.POSITIVE_INFINITY to represent their maximum value.
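As a short sketch (not from the original article; node is an AccessibilityNodeInfo your service has obtained, for example from a SeekBar):

AccessibilityNodeInfo.RangeInfo rangeInfo = node.getRangeInfo();
if (rangeInfo != null) {
    float min = rangeInfo.getMin();         // Float.NEGATIVE_INFINITY if unbounded below
    float max = rangeInfo.getMax();         // Float.POSITIVE_INFINITY if unbounded above
    float current = rangeInfo.getCurrent(); // the element's current value
    Log.d("MyAccessibilityService",
            "Range " + min + ".." + max + ", current " + current);
}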

Sample code

The API Demo project contains two samples that can be used as a starting point for generating accessibility services (<sdk>/samples/<platform>/ApiDemos/src/com/example/android/apis/accessibility):
  • ClockBackService
  • TaskBackService
