In-display fingerprint sensors have become commonplace in virtually all Android smartphones, for better or for worse, and five years later…
Seriously. I’m tired of blinding myself at night when trying to sleepily check my phone. The sensor on the back of my old Pixel could also be used to pull down the notification shade, which I can’t easily reach with one hand now, since the screen is so large.
Ultrasonic under-screen readers don’t need to light up that spot to work.
Ohhh, I just assumed that was what my Pixel 8 was using.
Nope! The 9 is the first Pixel that uses ultrasonic. It’s a massive upgrade from the optical scanner in the 7. Legitimately never misses.
fwiw the optical one in the Pixel 8 I use is pretty good and works better than the ultrasonic of my old Samsung, which was a disaster.
On the back sucks; with your phone lying on a surface you can’t access the reader. Sony had it right: put it on the side, on the power button.
That’s the placement of the sensor on Samsung’s Folds and it’s great.
There is a one-handed mode gesture that you can enable. It allows you to swipe straight down on the gesture bar to pull the entire top of the screen down.
I use that, but it only works from the home screen. If I use the gesture from an app, it just interacts with the app.
That’s odd, I can use that gesture from any app. Wonder if it’s phone-specific.
I’m using a Pixel 8 with GrapheneOS. If I try to use that gesture while browsing Lemmy with the Sync app, for example, it just scrolls the feed back towards the top.
I’m only explaining bc I’m hoping I’m using it incorrectly.
I am using a stock Pixel 6a. From the home screen, I can swipe down anywhere to pull down the notification shade.
The one-handed mode gesture (and function) is different though. Settings → System → Gestures → One-handed mode:
Usage: