Apple has improved the iPad Pro’s camera system and added a LiDAR scanner instead of a 3D ToF sensor. This kind of sensor is familiar from police speed guns, but what’s the point of adding it to the new iPad Pro? Let’s find out.
The rear camera array looks similar to the one on Apple’s iPhone 11 Pro series, but it sits more comfortably on the iPad Pro given the device’s much larger body.
Apple’s new LiDAR sensor sits alongside the new 10MP ultrawide camera and the standard 12MP camera. The name LiDAR stands for “light detection and ranging”: the sensor measures how long emitted light takes to reach an object and reflect back, a technique long used in scientific applications.
This approach is more accurate than a 3D ToF camera sensor, which measures the round-trip time of an artificial light signal from a laser or LED to capture an entire scene at once. The LiDAR scanner, by contrast, uses lasers to sweep the scene point by point. According to Apple, the sensor works up to five meters away, indoors and out, operating at the photon level and at nanosecond speeds.
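The round-trip timing described above reduces to simple arithmetic: distance is the speed of light times the measured time, halved because the light travels out and back. A minimal Python sketch (the helper name and figures are illustrative, not Apple’s actual implementation):

```python
# Illustrative sketch of time-of-flight ranging; not Apple's actual code.
C = 299_792_458.0  # speed of light in m/s

def distance_from_round_trip(t_seconds: float) -> float:
    """One-way distance for a measured round-trip time of flight."""
    return C * t_seconds / 2

# A target 5 m away (the iPad Pro's quoted range) returns light in about
# 33 ns, which is why the sensor must resolve timings at nanosecond scale.
round_trip_ns = 2 * 5.0 / C * 1e9
print(f"round trip for 5 m: {round_trip_ns:.1f} ns")   # ~33.4 ns
print(f"recovered distance: "
      f"{distance_from_round_trip(round_trip_ns * 1e-9):.2f} m")  # 5.00 m
```

The nanosecond-scale numbers here make it clear why Apple emphasizes the timing precision of the scanner.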
Although the camera array lacks a dedicated depth sensor, the LiDAR scanner can fill that role in addition to its other duties. The LiDAR Scanner works with the pro cameras, motion sensors, and frameworks in iPadOS to measure depth.
According to Apple, this combination of hardware, software, and unprecedented innovation makes the iPad Pro the world’s best device for augmented reality. Apple introduced this technology to expand the augmented reality capabilities of the new iPad Pro, as you would expect from a tablet this powerful and this expensive. More info is available on Apple’s official site.