A year ago takitus speculated that the depth camera on the iPhone 7 Plus and above might make it easy to generate 3D depth maps. I found an app, DepthCam, capable of extracting and manipulating the depth map image from the phone, and promptly tried it out on my shiny new iPhone X. The results:
Wow, very disappointing and nowhere near what I expected!
I wasn’t sure whether this was a limitation of the app or of the hardware, so I contacted the dev and got this response:
This looks like a really cool idea, but I think the limitation is the iPhone cameras. The resolution of the depth data is much lower than the actual photo:
Depth Data: 768 x 576 pixels
Photo: 4032 x 3024 pixels
After capture, the depth data is scaled up to match the photo but will be less detailed.
Also, the dual cameras capture depth best between 4 and 10 feet, i.e. perfect for portrait photos, not close ups.
As a test, try using the iPhone X front camera and see how that works. The front camera uses infrared to detect depth, which might provide more detail at closer range.
I’m looking forward to the depth data improving in future devices. Hope this helps!
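The resolution mismatch the dev describes works out to about a 5.25× upscale in each direction, so every captured depth sample has to cover roughly 27 photo pixels. A toy sketch in Python (the nearest-neighbor upscale here is my own illustration; the actual app may use a smoother interpolation filter) of why the scaled-up depth map looks so coarse:

```python
# Why upscaled depth looks blocky: each depth sample must cover many
# photo pixels. Resolutions below are the ones quoted by the app dev.

DEPTH_W, DEPTH_H = 768, 576      # depth map resolution
PHOTO_W, PHOTO_H = 4032, 3024    # photo resolution

scale_x = PHOTO_W / DEPTH_W      # 5.25 photo pixels per depth sample, horizontally
scale_y = PHOTO_H / DEPTH_H      # 5.25 vertically

def upscale_nearest(depth, out_w, out_h):
    """Nearest-neighbor upscale of a 2-D list of depth values."""
    in_h, in_w = len(depth), len(depth[0])
    return [
        [depth[y * in_h // out_h][x * in_w // out_w] for x in range(out_w)]
        for y in range(out_h)
    ]

# Tiny 2x2 depth map upscaled to 4x4: each source value smears into a
# 2x2 block, which is exactly the "blocky" look at full photo size.
tiny = [[1.0, 2.0],
        [3.0, 4.0]]
for row in upscale_nearest(tiny, 4, 4):
    print(row)
```

At the real resolutions that smearing is a 5.25×5.25 block per sample, which matches the mushy results above.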