What if your phone could let you — and only you — answer an incoming call, with no thumbprint, PIN, or swipe gesture required? Instead of a finger, it'd gauge the shape of your ear; instead of a fingerprint sensor, it'd use the touchscreen your phone already has built in.
That’s the idea behind Bodyprint, a new concept out of Yahoo’s research labs.
Built by researchers Christian Holz, Senaka Buthpitiya, and Marius Knaust, Bodyprint is designed to use a number of different body parts as biometric markers in different situations. It'll recognize your ear, as mentioned, but it can also identify you by your palm, the knuckles of a fist bump, or your grip around the edge of the screen when you're tightly grasping the device.
If it can sense all of these things, why not just use the screen to read your fingerprint? The short answer: the sensor resolution on current smartphone touchscreens isn’t high enough for something as small as a lone fingerprint. Give it a bit more real estate to analyze, though, and it’ll work just fine.
Most of the time, at least. The obvious fear is that such a mechanism wouldn't be accurate enough, letting people with vaguely similar ears, palms, or what-have-you past the lock screen. The authors of the paper, however, claim it admitted only the right person 99.52% of the time.
The catch? It does so by also turning the right person away fairly frequently: the algorithm errs on the side of caution, pinning the false rejection rate at 26.82%*. If your phone turned you away roughly 1 out of every 4 times you tried to answer a call, I don't imagine anyone would keep that phone very long. The test group was also relatively small, with just 12 users battle-testing the accuracy.
(* This rejection rate is across all of the possible authentication markers combined; if only the ear is used as a marker, the false rejection rate drops to 7.8%, or about 1 in 13.)
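To see why a cautious system ends up rejecting its owner so often, here's a minimal sketch in Python. This is not Bodyprint's actual algorithm, and the match scores are invented for illustration; it just shows how raising a matcher's acceptance threshold cuts false acceptances at the cost of more false rejections.

```python
# Toy illustration of the biometric threshold trade-off.
# Higher threshold -> fewer impostors accepted, but more genuine attempts rejected.

def evaluate(genuine_scores, impostor_scores, threshold):
    """Return (false_rejection_rate, false_acceptance_rate) at a given threshold."""
    false_rejects = sum(score < threshold for score in genuine_scores)
    false_accepts = sum(score >= threshold for score in impostor_scores)
    return (false_rejects / len(genuine_scores),
            false_accepts / len(impostor_scores))

# Hypothetical similarity scores: genuine attempts tend to score high,
# impostor attempts tend to score low, but the two distributions overlap.
genuine = [0.91, 0.88, 0.62, 0.95, 0.70, 0.84, 0.58, 0.93, 0.77, 0.89]
impostor = [0.12, 0.35, 0.61, 0.28, 0.44, 0.19, 0.55, 0.31, 0.22, 0.40]

for threshold in (0.50, 0.65, 0.80):
    frr, far = evaluate(genuine, impostor, threshold)
    print(f"threshold={threshold:.2f}  false rejection={frr:.0%}  false acceptance={far:.0%}")
```

Run it and the pattern matches the paper's trade-off: the strictest threshold lets almost no impostors through, but it also bounces the legitimate user far more often.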
With that said, it's the very definition of early days for the concept. Smartphone fingerprint readers were awful for years and years; only recently have they gotten good enough to be anything but frustrating.
This isn’t the first time the members of this team have used peculiar things to identify people: back in 2012, Holz used a Kinect to identify people standing around a Surface touchscreen table based on their shoes.
[via AndroidAuthority]