Apple promoted the capabilities of its new lidar sensor at Tuesday's unveiling of the iPhone 12. Apple says lidar will enhance the iPhone's camera by enabling faster focus, especially in low-light situations. And it could enable a new generation of sophisticated augmented reality applications.
Tuesday's presentation offered few details on how the iPhone's lidar actually works, but this isn't Apple's first lidar device. Apple first introduced the technology with a new iPad in March. And while nobody has done an iPhone 12 teardown yet, we can learn a lot from recent iPad teardowns.
Lidar works by sending out laser light and measuring how long it takes to bounce back. Because light travels at a constant speed, the round-trip time can be converted into a precise distance estimate. Repeat this process across a two-dimensional grid and the result is a three-dimensional "point cloud" showing the location of objects around a room, a street, or any other environment.
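The arithmetic behind this time-of-flight principle is simple enough to sketch. The following toy function is illustrative only, not Apple's actual implementation:

```python
# Illustrative sketch of the time-of-flight principle behind lidar.

C = 299_792_458  # speed of light, in meters per second

def distance_from_round_trip(t_seconds: float) -> float:
    """The light travels out and back, so the object's distance is
    half the total distance covered during the round trip."""
    return C * t_seconds / 2

# A pulse that returns after about 33.4 nanoseconds bounced off an
# object roughly 5 meters away -- about the iPad lidar's range.
print(distance_from_round_trip(33.36e-9))  # ≈ 5.0 meters
```

The hard engineering problem isn't the math; it's timing the return pulse with sub-nanosecond precision, which is where the SPAD detectors discussed below come in.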
A June analysis by System Plus Consulting found that the iPad's lidar emits light using an array of vertical-cavity surface-emitting lasers (VCSELs) supplied by Lumentum. It detects the returning flash using an array of Sony-supplied sensors called single-photon avalanche diodes (SPADs). I'll explain what these are below.
I found Apple's announcement particularly interesting because I recently reported a story about companies using the same combination of technologies, VCSEL lasers and SPAD detectors, to build far more powerful lidar for the automotive market. One of the technologies' big selling points is that both VCSELs and SPADs can be made using conventional semiconductor manufacturing techniques, so they benefit from the semiconductor industry's huge economies of scale. As VCSEL-based sensors become more common, they should steadily get cheaper and better.
Two high-end lidar makers, Ouster and Ibeo, have gained more traction than most in the crowded lidar business. Apple's decision to adopt this technology should give them a nice tailwind in the coming years, especially if other smartphone makers follow Apple's lead.
VCSELs helped Apple make radically simpler lidar
More than a decade ago, Velodyne introduced the first three-dimensional lidar sensor. The spinning unit cost around $75,000 and was significantly larger than a smartphone. To put one in every iPhone, Apple needed lidar sensors that were radically cheaper and smaller, and VCSELs helped the company get there.
What is a VCSEL? If you're building a laser using conventional semiconductor manufacturing techniques, you have two basic options: you can create a laser that emits light from the top of the wafer (a vertical-cavity surface-emitting laser, or VCSEL) or one that emits light from the wafer's edge (known as an edge-emitting laser).
Edge-emitting lasers have traditionally been more powerful. For decades, VCSELs were used in lower-power applications, from optical mice to optical networking gear, and were considered unsuitable for demanding applications that required a lot of light. But as the technology has matured, VCSELs have become more powerful.
To produce an edge-emitting laser, the wafer typically has to be cut apart to expose the laser's edge. That makes the production process more complex and expensive, and it limits how many lasers can be produced from a single wafer. VCSELs, by contrast, emit light perpendicular to the wafer, so they don't need to be cut apart or packaged individually. That means dozens, hundreds, or even thousands of VCSELs can fit on a single chip. In principle, a chip with thousands of VCSELs shouldn't cost more than a few dollars when produced at scale.
The story is similar with single-photon avalanche diodes. As the name suggests, these detectors are sensitive enough to register a single photon. That extreme sensitivity means they also pick up a lot of noise, so an application like lidar requires sophisticated post-processing. But like VCSELs, SPADs can be manufactured using conventional semiconductor techniques, and thousands of them can be packed onto a single chip.
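One common post-processing strategy in SPAD-based lidar (the details of Apple's pipeline aren't public, so this is a generic sketch with made-up numbers) is to fire many pulses and build a histogram of photon arrival times: genuine echoes pile up in the time bin corresponding to the real round-trip time, while stray background photons spread evenly across all bins.

```python
# Hypothetical sketch of SPAD noise suppression via time-binning:
# fire many pulses, histogram photon arrival-time bins, and pick the
# bin where detections accumulate. Real echoes cluster at the true
# round-trip time; noise photons land in random bins.

import random
from collections import Counter

random.seed(42)

TRUE_BIN = 120      # arrival-time bin of the real echo (arbitrary units)
NUM_BINS = 256
NUM_PULSES = 1000
SIGNAL_PROB = 0.3   # chance a given pulse's echo photon is detected

histogram = Counter()
for _ in range(NUM_PULSES):
    if random.random() < SIGNAL_PROB:
        histogram[TRUE_BIN] += 1                 # genuine echo detection
    histogram[random.randrange(NUM_BINS)] += 1   # one noise photon per pulse

# The most-populated bin gives the estimated round-trip time.
estimated_bin, count = histogram.most_common(1)[0]
print(estimated_bin)  # the signal bin stands out above the uniform noise
```

Even though any single detection is unreliable, roughly 300 echo hits concentrated in one bin easily dominate the ~4 noise hits expected per bin, which is why averaging over many pulses tames the SPAD's noisiness.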
Combining VCSELs and SPADs allows conventional lidar designs to be dramatically simplified. Velodyne's original three-dimensional lidar mounted 64 individually packaged lasers in a column on a spinning gimbal, each laser paired with its own detector. That complexity, and the need to precisely align each laser with its corresponding detector, is why Velodyne's early lidar units were so expensive.
More recently, several companies have tried "steering" a laser beam in a scanning pattern with small mirrors. This design requires just a single laser instead of 64, but it still involves at least one moving part.
By contrast, Apple, Ouster, and Ibeo are building lidar sensors with no moving parts at all. With hundreds or thousands of lasers on a chip, VCSEL-based lidars can dedicate a laser to every point in the field of view. And because all of those lasers come pre-packaged on a single chip, assembly is much simpler than with Velodyne's classic spinning design.
Recent iPhones already have a 3D sensor, the TrueDepth camera, which powers Apple's FaceID feature. It, too, reportedly uses an array of VCSELs supplied by Lumentum. TrueDepth projects a grid of more than 30,000 points onto the user's face and estimates the face's three-dimensional shape from the deformation of the grid pattern.
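The basic geometry behind this kind of structured-light sensing is triangulation: a projected dot appears shifted sideways when viewed by a camera offset from the projector, and nearer objects produce larger shifts. Apple hasn't published TrueDepth's algorithm, so the numbers and function below are purely hypothetical:

```python
# Generic structured-light depth estimation by triangulation
# (not Apple's actual TrueDepth algorithm; all values hypothetical).

def depth_from_disparity(baseline_m: float, focal_px: float,
                         disparity_px: float) -> float:
    """Depth = baseline * focal_length / disparity.
    A dot's sideways shift (disparity, in pixels) shrinks as the
    object moves farther from the projector-camera pair."""
    return baseline_m * focal_px / disparity_px

# With a hypothetical 2 cm projector-camera baseline and a 600 px
# focal length, a 30-pixel dot shift puts the surface 0.4 m away.
print(depth_from_disparity(0.02, 600, 30))  # 0.4
```

Note that accuracy degrades quickly with distance under this scheme, since far-away objects produce tiny disparities; direct time-of-flight measurement, described next, doesn't share that weakness.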
The iPad's lidar sensor projects far fewer laser dots than the TrueDepth camera. An iFixIt video shot with an infrared camera showed the lidar projecting a grid of just a few hundred points. But while TrueDepth infers depth from how the light pattern falls across the face, the iPad's lidar measures distance directly, by timing how long light takes to bounce off an object and return to the sensor. That approach likely yields more accurate depth measurements as well as longer range.
More powerful lidar also uses VCSELs and SPADs
Apple's lidar performance still lags far behind the high-end sensors sold by dedicated lidar companies. Velodyne, the company that invented three-dimensional lidar, claims a range of more than 200 meters for its most powerful unit, while Apple's sensor has a range of around five meters.
Other VCSEL-based lidars are also considerably more powerful than Apple's. For example, Ouster's most powerful VCSEL-based lidar can detect 10-percent-reflective objects at a range of around 100 meters.
Ouster's current sensors are all Velodyne-style spinning units. They pack 16 to 128 VCSELs in a row on a single chip, which is then mounted vertically on a spinning gimbal, as in Velodyne's units. The simplicity of this design has allowed Ouster to undercut Velodyne on price, and Ouster has become Velodyne's biggest rival. But Ouster's spinning lidar sensors still cost thousands of dollars, too expensive for mainstream cars, never mind smartphones.
Last week, Ouster announced plans to ship a new solid-state lidar with no moving parts. Instead of 16 to 128 lasers in a row, as in Ouster's current lidars, the new unit is expected to have more than 20,000 VCSELs arranged in a two-dimensional grid.
Ibeo is pursuing a similar approach and could be ahead of Ouster. Ibeo designed the first lidar ever shipped in a mass-market car, the Audi A8. That lidar was primitive, with a vertical resolution of only four lines. But Ibeo is now developing a new model called ibeoNext that will have a 128-by-80-pixel laser grid, somewhat smaller than Ouster's planned sensor but much larger than Ibeo's previous offerings. Ibeo says its sensor will have a range of 150 meters for objects with 10-percent reflectivity.
A final contestant worth mentioning is Sense Photonics, which we covered back in January. Like the other companies we've discussed, Sense uses VCSELs and SPADs for its lidar. However, Sense uses a technique called micro-transfer printing to spread out its lasers, which allows them to run at higher power without running into heat or eye-safety problems. Sense's lidars haven't achieved long range yet, but Sense CEO Shauna McIntyre told Ars that the company aims to reach a 200-meter range with a future sensor it will announce in early 2021.