At present, most computing gadgets come with touch-screen technology, because it is the easiest way to operate a machine.
Have you ever thought of something easier and faster than this? If not, Google will help you.
Google introduced Project Soli, where they are working on something even easier than touch-screen technology: HAND GESTURES.
In other words, they are trying to operate devices using hand movements.
Our hands are the ultimate input device, and using them is second nature to us. Compared with touch, gestures are faster, easier to manage and, most importantly, something we all know well. The reasons why Google chose gestures as input are:
- They are fast
- We are more familiar with them than with touch
- They are part of our everyday practice, so we have already mastered them
- The core hardware is more effective and more durable than the alternatives
How does it work?
So far, it is clear that Soli uses hand gestures as input. But how does the device receive that input?
Every movement we make (not only with our hands) is a motion. To track these motions, the radio-frequency spectrum, i.e. radar, is used.
Radar has very high positional accuracy. Because of this, it can detect even small movements of our hands accurately and easily.
The radar transmits a radio wave towards the target, and the radar's receiver intercepts the energy reflected back from the target.
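To make the transmit-and-reflect idea concrete, here is a minimal sketch (not Soli's actual API; the function and numbers are illustrative) of how a radar turns an echo's round-trip time into a distance:

```python
# Hypothetical sketch: a radar pulse travels to the target and back,
# so distance = (speed of light * round-trip time) / 2.

C = 299_792_458  # speed of light in m/s

def range_from_echo(round_trip_seconds: float) -> float:
    """Distance to the target, given the echo's round-trip time."""
    return C * round_trip_seconds / 2

# A hand about 30 cm from the chip returns an echo in roughly 2 nanoseconds:
print(round(range_from_echo(2e-9), 3))  # prints 0.3 (metres)
```

Tracking how this distance changes from one pulse to the next is what lets the radar follow the tiny motions of a moving hand.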
Soli can interpret so much from these gestures because of the full gesture-recognition pipeline the team has built.
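The pipeline idea (raw signal in, features out, gesture label at the end) can be sketched as follows. This is a toy stand-in, not Soli's pipeline: the feature names, thresholds and gesture labels are all invented for illustration.

```python
# Hypothetical gesture-recognition pipeline: signal -> features -> label.

def extract_features(samples):
    """Crude features from a window of reflected-energy samples."""
    energy = sum(samples) / len(samples)
    # Mean absolute change between consecutive samples ~ motion speed.
    speed = sum(abs(b - a) for a, b in zip(samples, samples[1:])) / (len(samples) - 1)
    return {"energy": energy, "speed": speed}

def classify(features):
    """Toy rule-based classifier standing in for a learned model."""
    if features["speed"] > 0.5:
        return "swipe"
    if features["energy"] > 0.8:
        return "press"
    return "idle"

def pipeline(samples):
    return classify(extract_features(samples))

print(pipeline([0.1, 0.9, 0.1, 0.9, 0.1]))  # rapid change -> prints "swipe"
```

In the real system, the hand-crafted rules above would be replaced by a trained model operating on much richer radar features.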
Properties of radar that make it unique
- The entire radar can be shrunk into a tiny chip quite easily.
This chip has nothing to break: no moving parts, no lenses. In a word, it is just a piece of sand. That makes it extremely reliable, and that reliability is what makes Project Soli so promising.
Google’s Advanced Technology and Projects (ATAP) group has already spent more than a couple of years on this. Hopefully, this technology will come to every computing device soon. After that, the very definition of handling a computing device will have to change.