Gesture navigation can be overwhelming at first, but
learning how to use gestures can save you a lot of time navigating your world
of apps and menus. It certainly did for me. When gesture navigation was first
introduced into Android's core services, it seemed a bit difficult for
first-time users, but with a little practice it becomes more efficient.
Gesture navigation leans on the structure and inherent physics of the device,
letting users interact with their content more directly.
Gesture navigation was first introduced to the public
as a convenient navigation method with Android 9, although it was available
before that release through third-party applications, so some of you may
already be familiar with it. Gestures add a new dimension of recognition and
interaction: the system supports fine-grained tracking of movement over time,
which makes interaction feel more direct. Using swipe gestures on your Android
phone can be an efficient way to quickly access apps and various functions.
There is no need to find and tap icons on your home screen; simply touch,
hold, and drag your finger.
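To get a feel for what "fine-grained tracking" means in practice, here is a minimal Kotlin sketch of how an app can watch swipes itself with Android's GestureDetector; the SwipeTracker class, the velocity threshold, and the onSwipeUp callback are illustrative choices, not system values:

```kotlin
import android.content.Context
import android.view.GestureDetector
import android.view.MotionEvent
import android.view.View
import kotlin.math.abs

// Minimal sketch: tracking swipe gestures in an app with GestureDetector.
// The velocity threshold below is an arbitrary illustration value.
class SwipeTracker(context: Context, private val onSwipeUp: () -> Unit) {

    private val detector = GestureDetector(context,
        object : GestureDetector.SimpleOnGestureListener() {
            override fun onDown(e: MotionEvent): Boolean = true

            override fun onFling(
                e1: MotionEvent?, e2: MotionEvent,
                velocityX: Float, velocityY: Float
            ): Boolean {
                // A fast upward fling; negative Y velocity means "up".
                if (velocityY < -1500f && abs(velocityY) > abs(velocityX)) {
                    onSwipeUp()
                    return true
                }
                return false
            }
        })

    // Attach to any view to feed it touch events.
    fun attachTo(view: View) {
        view.setOnTouchListener { _, event -> detector.onTouchEvent(event) }
    }
}
```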
Whether your device runs Android 9 Pie, Android 10,
Android 11, or Android 12, the process for enabling gesture navigation is
largely the same, though the user interface and label names you encounter may
differ slightly. If you own a recent Android device, you can most likely use
gestures instead of the standard on-screen navigation buttons enabled by
default. This guide works with Android devices from brands including Samsung,
Xiaomi, Huawei, Oppo, Vivo, Realme, OnePlus, ASUS, Sony, Google Pixel, and many
more. Just check it out.
Android, as a mobile operating system, has received
regular updates and new features from Google over the last few years. Those
who own newer Android devices have probably heard of the navigation methods
that Android introduced with Android Pie, the ninth major release and the
sixteenth version of the Android mobile operating system. If you are not
satisfied with the traditional navigation bar layout, you can try the gesture
navigation available since Android Pie. Gesture navigation is an alternative
way to get around your device, relying on swipes and gestures rather than
button presses to perform actions. In this guide, we will show the steps to
enable and customize gesture navigation on Android devices that run Android 9
Pie, Android 10, Android 11, and even Android 12.
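Before walking through the settings, note that an app can make an educated guess about which navigation mode is active. The sketch below reads the undocumented navigation_mode key from Settings.Secure, observed on Android 10 and later (0 = 3-button, 1 = 2-button, 2 = gesture); since this is not a public API, treat the result as a hint rather than a guarantee:

```kotlin
import android.content.Context
import android.provider.Settings

// Hedged sketch: "navigation_mode" is an undocumented Settings.Secure key
// seen on Android 10+. It may be absent or differ on some OEM builds.
fun isGestureNavigationEnabled(context: Context): Boolean {
    val mode = Settings.Secure.getInt(
        context.contentResolver, "navigation_mode", /* default = */ 0
    )
    return mode == 2 // 0 = 3-button, 1 = 2-button, 2 = full gesture
}
```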
Users of stock Android devices can be quickly annoyed,
as there is no introduction to the topic when unlocking the phone for the
first time. Gestures have to be discovered (e.g., when swiping up to enter
the overview) or are pointed out within apps, independent of the gesture style
of the installed Android version. Only after discovering gesture navigation in
this way do users become familiar with the topic.
On currently available devices, the basic functions of
gesture navigation (go back, go home, and view all open apps) have become
standardized; the differences lie mainly in the width of the swipe area and
the speed of the commands, for instance in how a particular gesture maps onto
the display area. Notifications as well as the Google Assistant can also be
accessed by swiping from either side. In this section, the navigation
functions and the available gestures are introduced; devices supporting them
are presented afterwards.
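For app developers, the standardized back gesture arrives through the same path as the classic back button, so a single AndroidX callback handles both. Below is a hedged sketch; DrawerActivity, isDrawerOpen, and closeDrawer are hypothetical placeholders for app-specific logic:

```kotlin
import android.os.Bundle
import androidx.activity.OnBackPressedCallback
import androidx.appcompat.app.AppCompatActivity

// Sketch: the back *gesture* reaches an app through the same dispatcher as
// the classic back button, so one AndroidX callback covers both.
class DrawerActivity : AppCompatActivity() {

    override fun onCreate(savedInstanceState: Bundle?) {
        super.onCreate(savedInstanceState)

        onBackPressedDispatcher.addCallback(this,
            object : OnBackPressedCallback(/* enabled = */ true) {
                override fun handleOnBackPressed() {
                    // Hypothetical app logic: close an open drawer first,
                    // otherwise let the system handle back normally.
                    if (isDrawerOpen()) {
                        closeDrawer()
                    } else {
                        isEnabled = false
                        onBackPressedDispatcher.onBackPressed()
                    }
                }
            })
    }

    private fun isDrawerOpen(): Boolean = false // placeholder
    private fun closeDrawer() {}                // placeholder
}
```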
During a call, when the in-call UI is displayed and
the navigation bar is hidden, users can bring the navigation bar back by
swiping in from the left or right edge of the screen. As with edge-swipe
gesture navigation, swiping further reveals the gestures that correspond to
the buttons of the classic navigation bar. Since the indicators resemble what
the home screen shows, users can infer what each gesture will do. Some
shortcuts (like the rotation or multitasking ones), which usually come in
handy in that situation, are not shown at all, unless potential user input
prompts an unobtrusive expansion of the bottom bar. That, in essence, is the
imperative model of gesture navigation: the system does not infer the feature
logic on its own and has to be driven step by step.
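For comparison, here is roughly how such an in-call style screen can hide the navigation bar while still allowing the swipe-to-reveal behavior described above, using the AndroidX WindowInsetsControllerCompat API; InCallActivity is an illustrative name:

```kotlin
import android.os.Bundle
import androidx.appcompat.app.AppCompatActivity
import androidx.core.view.WindowCompat
import androidx.core.view.WindowInsetsCompat
import androidx.core.view.WindowInsetsControllerCompat

// Sketch: hide the navigation bar but let the user bring it back
// temporarily with an edge swipe.
class InCallActivity : AppCompatActivity() {

    override fun onCreate(savedInstanceState: Bundle?) {
        super.onCreate(savedInstanceState)

        // Draw behind the system bars so hiding them doesn't resize the layout.
        WindowCompat.setDecorFitsSystemWindows(window, false)

        val controller = WindowInsetsControllerCompat(window, window.decorView)
        controller.hide(WindowInsetsCompat.Type.navigationBars())
        // Swiping from the edge shows the bars transiently, then they re-hide.
        controller.systemBarsBehavior =
            WindowInsetsControllerCompat.BEHAVIOR_SHOW_TRANSIENT_BARS_BY_SWIPE
    }
}
```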
This brings us to gesture- and navigation-related
customization options, advanced gesture navigation features that are available
only in certain situations (such as on specific types of visual stimuli), and
the overlays and controls that can tell users a device is capable of gesture
navigation and help them interact with the system. These are the things an app
developer or an OEM needs to watch out for when enabling gesture navigation
through custom code or APIs, with logic that does not necessarily follow best
practices.
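One concrete API worth knowing here is gesture exclusion. On Android 10 and later, a view that handles its own edge swipes (a carousel, a slider) can ask the system not to claim the back gesture over its bounds, as in this sketch; note the system caps exclusions to about 200dp of edge height, so they should be used sparingly:

```kotlin
import android.graphics.Rect
import android.os.Build
import android.view.View

// Sketch: exclude a view's bounds from the system back gesture so its own
// horizontal edge swipes keep working. Rects are in view-local coordinates.
fun excludeSystemBackGesture(view: View) {
    view.addOnLayoutChangeListener { v, left, top, right, bottom, _, _, _, _ ->
        if (Build.VERSION.SDK_INT >= Build.VERSION_CODES.Q) {
            v.systemGestureExclusionRects =
                listOf(Rect(0, 0, right - left, bottom - top))
        }
    }
}
```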
If you don't like using gestures to switch between apps, there's a shortcut for that. Quick switching works like a rapid-access button: swipe roughly horizontally along the bottom of your screen to jump directly to your previous app, and swipe further to the right to scroll through the last few opened apps. If you prefer edge gestures for triggering different commands, you can use them to launch the Assistant or adjust the ambient display, and you can set the sensitivity zone if prompted. You can also turn off fast app switching if you find yourself activating it by accident.