The photography experts behind the Halide Camera app for iPhone have published a new blog post today diving deep into the new iPad Pro’s camera system. The post covers the new ultra wide angle camera and LiDAR Scanner, as well as why the iPad still lacks Portrait mode support.

Halide offers some in-depth info on the ultra wide angle camera in the new iPad Pro:

Then, there’s the LiDAR Scanner:

It appears the hardware just isn’t there to support Night mode, Deep Fusion, or even Portrait mode.

One interesting tidbit is that the iPad Pro doesn’t support Portrait mode on the rear camera. Halide explains why the new LiDAR Scanner isn’t necessarily designed with Portrait mode in mind:

Regular camera sensors are good at capturing focused images, in color. The LiDAR sensor doesn’t do anything like this. It emits small points of light, and as they bounce off your surroundings, it times how long the light took to come back.

This sounds crazy, but it’s timing something moving at the speed of light. These windows of time amount to hundreds of picoseconds. Pico? Yes, pico — that’s three orders of magnitude smaller than a nanosecond! A picosecond is 0.000000000001 seconds. Count the zeros.
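The timing arithmetic here is easy to check. A minimal back-of-the-envelope sketch (plain Python, not anything from Apple’s stack) of the round-trip times a time-of-flight sensor has to resolve:

```python
# Time-of-flight arithmetic: how long does a light pulse take to bounce
# back, and what timing precision does centimeter-scale depth require?

C = 299_792_458  # speed of light, m/s

def round_trip_time(distance_m: float) -> float:
    """Seconds for a light pulse to reach an object and return."""
    return 2 * distance_m / C

# A subject 1 m away: the pulse returns in roughly 6.7 nanoseconds.
t_1m = round_trip_time(1.0)

# To tell apart two surfaces 5 cm apart in depth, the sensor must
# resolve the *difference* between their round-trip times -- a few
# hundred picoseconds, as the quote describes.
dt_5cm = round_trip_time(1.05) - round_trip_time(1.0)

print(f"1 m round trip: {t_1m * 1e9:.2f} ns")          # ~6.67 ns
print(f"5 cm depth difference: {dt_5cm * 1e12:.0f} ps") # ~334 ps
```

The point: the total flight time is measured in nanoseconds, but distinguishing nearby depths comes down to picosecond-scale differences.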

Another problem noted is that Apple provides no APIs for developers to access the new depth data, so Halide can’t use that information in its app. The Halide developers did, however, build a proof-of-concept app called Esper that re-thinks photographic capture.

The only reason we won’t say it could never support portrait mode is that machine learning is amazing. The depth data on the iPhone XR is very rough, but combined with a neural network, it’s good enough to power portrait mode. But if portrait mode were a priority, we’d put our money on Apple using the dual cameras.

While the LiDAR Scanner can’t yet “augment our traditional photography,” Halide says that it “opens the door to new applications that are powerful and creativity-enabling in their own right.”

The full blog post from Halide is well worth a read and can be found here.