Saturday, August 21, 2021

Why the iPhone 13 Pro's LiDAR scanner could be its secret photographic weapon

Smartphones may have destroyed DSLRs and pushed Olympus out of the camera business last year, but their photographic innovation isn't slowing – and the iPhone 13 Pro is expected to bring a next-gen LiDAR scanner that could take its camera to exciting new places.

So far, LiDAR (or 'Light Detection and Ranging') has had a relatively mild impact on the iPhone's photo and video powers. After debuting on the iPad Pro 2020, the depth-sensing tech arrived on the iPhone 12 Pro and powered a new 'Night mode Portrait' trick for low-light bokeh.

All very nice, if not exactly earth-shattering. But the latest leaks and rumors suggest the iPhone 13 Pro could have a second-generation LiDAR scanner. And the combination of this with machine learning advances could be big news for mobile photography and video.

The lasers from an iPhone's LiDAR scanner

(Image credit: Apple)

We're not just talking about improved autofocus and augmented reality (AR) games – the simmering promise of user-friendly 3D photography and video Portrait modes could be ready to boil over, with LiDAR supplying the heat. 

To get a sense of what this could mean for photography, we chatted to 3D capture innovators at Sketchfab and Polycam to find out why they're excited about next-gen LiDAR on the iPhone. We'll also look at recent leaks and patents that suggest the iPhone 13 Pro could finally deliver a LiDAR-powered 'Portrait mode' for video. 

That humble dot at the base of the iPhone 13 Pro's camera might soon be considered as important as the lenses above it...

Depth knell

The latest iPhone 13 Pro leaks suggest that, for traditional photography, its upgrades could be relatively minor. A brighter ultra-wide lens (bumped to f/1.8 from the current f/2.4) with autofocus looks on the cards, as does a slightly wider f/1.5 aperture for the iPhone 13 Pro Max.

But the last few generations of iPhones have shown that upgrades to traditional camera hardware (bigger sensors, brighter lenses) are delivering diminishing returns in real-world image quality.

The rear cameras of the Apple iPhone 12 Pro

(Image credit: Apple)

The bigger leaps are coming from machine learning and new hardware, like the LiDAR scanner. It's the latter, in combination with Apple silicon and the iPhone's other cameras, that's promising to produce exciting leaps in photography and video – even as soon as the iPhone 13 Pro.

According to credible leaks earlier this year, Portrait mode photos will be "overhauled greatly" on the iPhone 13 Pro, thanks to a process that combines data from both its lenses and the LiDAR scanner. This is good news – the iPhone's Portrait mode has improved its edge detection and focus fall-off in the last few years, and another step forward would be a popular upgrade.

But these are all refinements rather than revolutions. What about brand new photographic and video powers from that "new generation" LiDAR scanner, which Taiwanese tech site DigiTimes has predicted for the iPhone 13 Pro? There are a couple of those, starting with 3D photography...

New dimension

If you haven't dabbled with 3D capture, it can sound like a gimmick. After all, 3D TV turned out to be one of tech's biggest flops, and the wave of stereo cameras (like the Fujifilm FinePix Real 3D W1) a decade ago was similarly faddish.

But many experts think 3D snapping is as inevitable as digital photography was in 2000. And a growing number of iPhone 12 Pro owners have been tinkering with their phone's LiDAR scanners with fascinating results.

One of the best places to see the growing variety (and quality) of iPhone-created 3D scans is Sketchfab. It's an online store where you can browse, buy and sell 3D scans of real-world places and objects, and its galleries are home to some creative examples of this nascent form of photography.

A good example is the 'Street Art' competition Sketchfab ran earlier this year, which challenged iPhone 12 Pro and iPad Pro owners to scan local murals, graffiti and statues using their LiDAR sensor. The results are impressive. This immersive collection of 3D photos is, artistically, no better or worse than a 2D street photo – it's just different. And it's definitely a powerful new way to capture moments in time.

3D scanning is nothing new. Professionals have long used a technique called photogrammetry, which uses software to build detailed 3D models from hundreds (or sometimes thousands) of overlapping photos. But this takes a lot of time and processing power.

LiDAR scanners, like the ones on the iPhone 12 Pro and incoming iPhone 13 Pro, are the point-and-shoot democratizers of this new form of capture. They use fast pulses of light to build depth maps, which are textured by software algorithms. When it comes to resolution and detail, photogrammetry is far better, but the iPhone's 3D scanning brings unprecedented speed.
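To make that 'fast pulses of light' idea concrete, here's a minimal, hypothetical sketch in Python – not anything Apple ships – showing how a time-of-flight sensor turns per-pixel pulse timings into a depth map: distance is simply the speed of light multiplied by half the round-trip time.

```python
# Minimal sketch of the time-of-flight idea behind a LiDAR depth map.
# Illustrative toy only, not Apple's implementation: the sensor measures
# how long each light pulse takes to bounce back, and depth follows from
# d = (c * t) / 2 (half the round trip at the speed of light).

import numpy as np

SPEED_OF_LIGHT = 299_792_458.0  # meters per second

def round_trip_time_to_depth(times_s: np.ndarray) -> np.ndarray:
    """Convert per-pixel round-trip times (seconds) into depths (meters)."""
    return SPEED_OF_LIGHT * times_s / 2.0

# A hypothetical 4x4 grid of measured round-trip times, in nanoseconds.
# Real iPhone depth maps are far denser and get fused with camera data.
times_ns = np.array([
    [10, 10, 12, 12],
    [10, 11, 12, 13],
    [14, 14, 20, 20],
    [14, 15, 20, 21],
], dtype=float)

depth_map_m = round_trip_time_to_depth(times_ns * 1e-9)
print(np.round(depth_map_m, 2))  # depths in meters, roughly 1.5m to 3.1m
```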

Laser beams firing out from a LiDAR scanner

(Image credit: Apple)

"It's much faster than traditional techniques like photogrammetry," Alban Denoyel, Co-founder and CEO of Sketchfab, tells us. "There are many things I scan with my iPhone LiDAR that I wouldn't bother scanning with photogrammetry because I don't have time" he adds. As if to prove the point, Alban Denoyel has been doing a 'one scan a day' challenge, the 3D equivalent of photography's widely-practiced 'portrait per day' project.

According to Chris Heinrich, founder of the popular app Polycam, the iPhone's current LiDAR scanner could be a big bang moment for 3D snapping. "There are strong parallels to the history of photography, where ease-of-use is what drove mass adoption," he says. 

"Previous 3D scanning techniques can be roughly likened to the Daguerreotype camera in terms of ease-of-use," referring to the mid-19th Century pioneer we recently celebrated for World Photography Day. "I would say the current generation of mobile LiDAR scanner is about on par with a mid-twentieth century hand-held camera" he adds. That's a technological leap of about a century.

Memory tricks

The tech is certainly impressive, but what can you actually do with the iPhone 12 Pro's LiDAR scanner, and where might the iPhone 13 Pro take it next?

There's a huge range of non-photographic use cases for mobile 3D scanning. When the iPhone LiDAR sensor's resolution improves – which rumors suggest could happen on the iPhone 13 Pro – scanning your body to get custom fits for glasses, orthotics or jewelry could be on the cards. Games and movies will also hoover up the resulting explosion of digital props.

But it's the ability to capture more personal subjects in three dimensions that shows LiDAR's potential as a new form of photography. Before moving house, one man captured the home he'd self-built over a decade, so his kids can enjoy a virtual tour when they're older. Others have scanned their childhood bedrooms. Alban Denoyel snapped a memory from his daughter's birthday party on an iPhone 12 Pro with the 3D Scanner App in less than 30 seconds.

With this kind of speed, holiday destinations could be snagged for posterity and three-dimensional revisits. Some might even see 3D scanning as a new form of raw capture – a future-proofed way to preserve memories in unprecedented detail, even if we don't yet have the tech to fully exploit all that data.

On a bigger scale, where the iPhone's LiDAR scanner is currently more comfortable, there are interesting possibilities, too. "I was super-impressed by Emmanuel [de Maistre]'s 3D documentation of the catacombs in Paris," says Alban Denoyel of the iPhone-powered scans of Paris' pitch-black tunnels, which you can read about on Sketchfab.

"I'm also fascinated by the concept of volumetric memories, be it to record life events like a birthday, or the first steps of my daughter," he adds. "Those examples convey something that is impossible to convey with regular photography, and show the power of 3D as a medium".

Citizen journalists could ultimately see it as a powerful new weapon, too. Last year, the digital artist Julieta Gil won the Lumen Prize for her 3D scan (using photogrammetry) of Mexico City's 'Angel of Independence' monument, which had been covered in graffiti by women protesting against systemic violence. Soon after the protest, the government boarded it up and worked on its restoration, making the scan an important digital time capsule that could be 'visited' by future generations.

Lucky number 13

Some of these LiDAR use cases are a way off, but what could the iPhone 13 Pro realistically bring to the 3D party? It needs to make three big improvements.

"The biggest weakness of this first-generation LiDAR sensor is its low resolution, which is why detailed objects do not look nearly as crisp as they do in real life," says Chris Heinrich of Polycam. "I hope to see significant improvements to LiDAR resolution in the next few device generations. The second downside is the range of the sensor, which maxes out around five meters, so capturing tall or distant subjects isn't yet possible," he adds.

Sketchfab's Alban Denoyel agrees on the limitations of the iPhone 12 Pro's range and resolution. "It's also pretty hard to get good results of something small that you need to turn around, like a pole or a shoe," he says. "I would also want longer battery life – with several scans a day, a full charge doesn't last for a full day," he adds.

The point cloud of two people created by a LiDAR scanner

(Image credit: Apple)

Hopefully, that larger iPhone 13 Pro LiDAR sensor – which has been seen in leaked dummy units – will make some progress on these fronts. If it does, it would pull even further ahead of Android phones when it comes to depth-sensing power.

So why are Android phones lagging behind, after Google pioneered the tech with the now-shuttered Project Tango? "The biggest thing Android needs is a comparable sensor that's available on many flagship phones," says Chris Heinrich. "Over the years various Android OEMs have played with mobile depth sensors, but they've never come to enough flagship phones to really attract developers to build great 3D capture apps for them," he adds.

"The other bottleneck is the speed of the computer chips on Android," says Chris Heinrich. "Processing the Lidar scans on-device takes a lot of computational resources, and Apple currently has the fastest mobile chips in the world. Slower chips means longer processing times and a worse user experience. That being said, I think we’ll start seeing Android devices with comparable functionality by 2022" he adds.

Just add AR glasses

Naturally, there are other barriers to widespread 3D snapping adoption that are beyond the scope of the iPhone 13 Pro. 

3D capture apps have made big leaps in speed and usability, but still have a way to go. Wider support on social platforms will help, too. "Ultimately, 3D capture will become much more relevant when you can easily share and consume those in a spatial way, for example with AR glasses," adds Alban Denoyel. 

Polycam's Chris Heinrich agrees, but doesn't think Apple Glasses are essential for growing its popularity. "If we get to the point where head-mounted 3D displays like AR glasses go mainstream, then this is obviously going to really accelerate the demand for 3D captures, since 2D photos just look super-flat in AR or VR. But I don’t think it’s a necessary prerequisite," he says.

"Humans love taking photos and videos, and 3D capture is another form of reality capture that has its own unique strengths and weaknesses as a medium," he adds. "Over time, I believe it will sit alongside photos and videos as a primary medium for capturing the world around us," he predicts.

Vlog star

A more immediate LiDAR-based trick for the iPhone 13 Pro, then, might be the rumored arrival of a 'Portrait mode' for video.

Just like 3D scanning, this is another feature we've seen previously on Android phones. Back in 2019, the Samsung Galaxy S10 delivered a new 'Live focus' mode for video, but it was a pretty rough filter rather than anything based on genuine depth information.

According to Mark Gurman at Bloomberg, a trusted source for Apple leaks, the iPhone 13 series will feature a new 'Portrait mode' for videos, called 'Cinematic Video'. Exactly what form this will take, though, isn't yet clear. If it's simply a camera-based version of the 'Portrait Mode' that came to FaceTime in iOS 15, then it may be restricted to the iPhone's front-facing camera.

An iPhone running a Facetime video call

(Image credit: Apple)

But if Apple is planning to bring a proper video 'Portrait mode' to the iPhone 13's higher-quality rear cameras, then it'll likely need some help from that LiDAR sensor. It's possible to create a similar effect already with apps like Focos Live, though that is reliant on computational software trickery.

The bigger leap, and one that could bring serious competition to the best vlogging cameras like the Sony ZV-1, would be a system that combines a next-gen LiDAR scanner with the real-time compression of its spatial data.

Interestingly, this kind of system has been hinted at in Apple patents, as pointed out by AppleInsider. Three patents published in mid-July refer to compressing LiDAR spatial information in video using an encoder, which could allow the A15 chip to simulate video bokeh based on the LiDAR's depth info, while still shooting high-quality video.
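For a sense of how a depth map could drive that kind of effect, here's a rough, hypothetical Python sketch (assuming NumPy and OpenCV are available) that blends a sharp video frame with a blurred copy, weighted by each pixel's distance from a chosen focal plane. It's a crude stand-in for depth-driven bokeh, not Apple's patented pipeline.

```python
# Toy illustration of depth-driven bokeh: blur each pixel in proportion
# to how far it sits from a chosen focal distance, by blending a sharp
# frame with a blurred copy. Not Apple's implementation.

import cv2
import numpy as np

def simulate_bokeh(frame: np.ndarray, depth_m: np.ndarray,
                   focus_m: float = 1.5, falloff_m: float = 1.0) -> np.ndarray:
    """Blend a sharp frame with a blurred copy, weighted by depth error."""
    blurred = cv2.GaussianBlur(frame, (31, 31), 0)
    # 0 where the subject sits on the focal plane, 1 where it is far away
    weight = np.clip(np.abs(depth_m - focus_m) / falloff_m, 0.0, 1.0)
    weight = weight[..., None]  # broadcast over the color channels
    return (frame * (1.0 - weight) + blurred * weight).astype(frame.dtype)

# Hypothetical inputs: a video frame and a per-pixel LiDAR depth map.
frame = (np.random.rand(480, 640, 3) * 255).astype(np.uint8)
depth = np.full((480, 640), 3.0, dtype=np.float32)
depth[120:360, 200:440] = 1.5  # a 'subject' 1.5m away stays sharp

out = simulate_bokeh(frame, depth, focus_m=1.5)
print(out.shape, out.dtype)
```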

Patents certainly aren't a rock-solid confirmation of any features, and it's possible this kind of video Portrait mode is a little further off than the iPhone 13 Pro. But it's something we'll definitely be keeping an eye out for during Apple's likely launch event in September.

Camera obscura

Exactly how much of this LiDAR-based potential we'll see in the iPhone 13 Pro remains to be seen. It certainly represents a good opportunity for Apple to pull further ahead in depth-sensing tech, before Android phones likely catch up in 2022. 

On the photography front, an improved Portrait photo mode looks like a shoo-in, while a next-gen LiDAR scanner could help push 3D capture closer to the mainstream. 

The camera module on the iPhone 12 Pro

(Image credit: Apple)

With apps like Polycam, it's already possible to scan a 3,000 square foot house in about two minutes. But improved resolution could make LiDAR useful for capturing smaller objects and memories. A boosted range could also make it possible for us to quickly capture three-dimensional memories on our travels.

Perhaps the killer feature for LiDAR, though, is its potential ability to support a proper Portrait mode for videos. We may see a simpler equivalent of FaceTime's new video bokeh, based on computational processing, before that arrives. But we'll certainly be looking out for that feature at Apple's September event, as will nervous vlogging cameras from behind their sofas.



from TechRadar - All the latest technology news https://ift.tt/3AYvPiE
