Earlier this week Apple launched a new iPad, and I dismissed the inclusion of Lidar as a spec bump to maintain the illusion of innovation. However, the longer I sat with it, the more I thought I might be wrong. I’m going to break down my journey in uncovering whether Lidar does in fact make sense on an iPad.
Apple hasn’t updated the iPad Pro since October of 2018. The new one doesn’t look that much different: it has the same buttonless design with Face ID and comes in 11″ and 12.9″ variants. It sports an A12Z Bionic chip, a second rear camera for ultrawide photos, better microphones, and Lidar. If the new iPad Pro didn’t have Lidar and a cool new stand, there would be nothing interesting about this update.
The big generational shift is around the new Magic Keyboard, whose cantilever hinges give you 120-degree viewing angles. It has the much-loved scissor-mechanism keyboard with an improved 0.7mm of travel. Basically, we’ve set the stage for a showdown with the Microsoft Surface.
All in all, exactly what we expected. Apart from it having Lidar.
What is lidar?
Lidar stands for Light Detection and Ranging. Basically, a laser pulse is sent out of a transmitter and the light particles (photons) are scattered back to the receiver. Since we know the speed of light, the sensor measures the time it takes for the light to return and uses that to calculate how far away the object is.
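The principle above boils down to one line of arithmetic: halve the round-trip time and multiply by the speed of light. A minimal sketch (my own illustration, not anything from Apple's implementation):

```python
# Time-of-flight ranging: a lidar sensor times a light pulse's round trip
# and converts that to a one-way distance.

SPEED_OF_LIGHT = 299_792_458  # metres per second

def distance_from_round_trip(seconds: float) -> float:
    """One-way distance in metres for a pulse that returned after `seconds`."""
    # The pulse travels out AND back, so halve the total flight time.
    return SPEED_OF_LIGHT * seconds / 2

# A pulse returning after ~33.4 nanoseconds bounced off something
# roughly five metres away.
print(distance_from_round_trip(33.36e-9))  # ≈ 5.0 metres
```

In practice a sensor fires many such pulses per frame to build a depth map, but each individual measurement is just this calculation.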
Lidar is most commonly used in autonomous cars to measure the distance of objects around the car and help it navigate the world safely. Waymo uses four lidar sensors, and Apple noted that the tech is so advanced that even NASA uses it. It’s worth noting here that Tesla is the outlier among automakers in not using it at all.
Apple’s “Project Titan,” an autonomous car project, has been spotted using Lidar. The tech is also rumored to be headed to the next iPhone.
What is Lidar Doing in the iPad Pro?
The lidar in the iPad Pro can measure objects up to five meters away and works both indoors and outdoors. Apple claims it measures photons at “nanosecond speeds,” and the new A12Z chip uses computer vision algorithms to make sense of the data.
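The “nanosecond speeds” claim isn’t marketing hyperbole: at the sensor’s five-metre maximum range, light completes the whole round trip in tens of nanoseconds, so the timing hardware has to resolve intervals that small. A rough back-of-the-envelope check (assuming an ideal straight-line path):

```python
# How long does a light pulse take to reach a target and come back?

SPEED_OF_LIGHT = 299_792_458  # metres per second

def round_trip_time_ns(distance_m: float) -> float:
    """Round-trip flight time, in nanoseconds, for a target distance_m away."""
    # Out and back is twice the distance; scale seconds up to nanoseconds.
    return 2 * distance_m / SPEED_OF_LIGHT * 1e9

print(round_trip_time_ns(5.0))  # ~33 nanoseconds at the 5 m maximum range
```

Distinguishing a target at 5.0 m from one at 5.1 m means resolving a difference of well under a nanosecond, which is why this needed dedicated sensor hardware rather than a software trick.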
But what is it used for?
Well, you can make lava flow around your living room furniture. And… well, the use cases today are very limited, so our best guess is that it’s about getting developers ready for a rumored pair of AR glasses.
The only reason to add it is Augmented Reality, which has yet to catch on.
Back in 2017 Apple launched ARKit, which enables developers to build AR apps very quickly. A new scene geometry interface has been added that allows developers to take advantage of the lidar sensor.
What does this mean for apps?
Measuring things will be much more accurate: apps will be able to calculate the size of objects using the camera. The Measure app now features a Ruler view for more fine-tuned measurements.
Third-party developers have already started trying out the sensor. IKEA Place’s Studio Mode, coming later this year, will let users furnish an entire room with augmented reality. Updates to the Complete Anatomy app can help physical therapists track their patients with a new mobility assessment tool. Other uses include an update to Shapr 3D to create 3D floor plans, and a new augmented reality mode for Hot Lava, where the floor really is lava.
As the world takes pause to figure out how to do more things digitally, we can expect traditional industries like art galleries to begin to sell pieces this way. Before Corona, I would have said that AR is a technology still looking for its niche.
My knee-jerk reaction, that Lidar was added to the tablet as a spec to seem more innovative, would have been right on the money if the global climate were different. The use cases for AR are still unfolding, and as it stands today there isn’t much use for Lidar in a tablet. After the world returns to normal, we may find that an all-new way to connect objects and experiences across geographies will emerge.
The world now needs new ways to connect digitally, and Apple might be right on time with this technology.