iPhone 7 Plus will create 3D engraving images from its cameras

Has anyone tried Autodesk’s 123D Catch app for this?

I’ve used it to make a few 3D models, but not recently – would this be significantly different from the 7P functionality?

That 123D Catch app is pretty cool. Unfortunately it looks like it uses a texture map generated by the scan instead of detailed geometry. I’m afraid it won’t have the fidelity needed for this application.

Tried this - super cool!


I did a little digging on the Lytro. They export a .png depth map from their core file format, but it looks pretty similar to the iPhone depth photos in this thread.

I could see some applications like knocking out backgrounds, or posterizing the grays and then slicing an image that way, but I’m looking through my Glowforge to-do list for something that’s worth $300 (or $269 on eBay!) to experiment with. But I will be keeping an eye out for apps that will expose that depth map on iOS, since I have an iPhone 7 Plus.
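In case it helps anyone, that posterize-and-slice idea can be sketched quickly in Python with NumPy (assuming the depth map is already loaded as an 8-bit grayscale array; the function name and level count are just mine):

```python
import numpy as np

def posterize_depth(depth, levels=8):
    """Quantize a grayscale depth map (0-255) into a few flat bands,
    so each band can be cut or engraved as its own slice."""
    depth = np.asarray(depth, dtype=np.float64)
    step = 256.0 / levels
    quantized = np.floor(depth / step)            # band index, 0 .. levels-1
    return (quantized * step).astype(np.uint8)    # back to 0-255 range

# e.g. a full 0-255 gradient collapses to 4 flat bands:
gradient = np.arange(256, dtype=np.uint8)
banded = posterize_depth(gradient, levels=4)
```

Each resulting flat band could then be thresholded into its own mask and cut as a separate layer.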

Ooh. That reminds me: Microsoft did just announce they’re releasing a 3D capture app for Windows Phone first, but that they have plans to do the same thing on iOS and Android at some undefined point in the future:

I saw them demoing this with a first-gen Kinect several years ago, and the output was rough, but if you know your way around 3D modeling software, you should be able to clean up a lot of objects to the point it can be printed/cut.

I imagine that Google’s Tango and its depth-perception capabilities could be used to create models or maps for 3D engraving or what have you. The first smartphone with Tango is just being released now, but if things work out, some people expect those capabilities to become as standard on a smartphone as something like GPS is now.


I have a Lytro Illum; if anyone local to Seattle would like some time with it, I’d be happy to make it available to you for a couple of weeks. As a photographer, I was amazed by its ability to refocus anywhere within the photo, so long as you don’t exceed its macro limits.

Let me know if you want to try it out in order to extract the data for a test.



Cool. I got one for my son for Christmas. I have one of the original Lytro cameras myself (the little rectangular boxes). How’ve you found using the Illum? Worth playing around with? My son hasn’t said anything about his, so I’m not sure if he’s doing anything with it or if he just stuck it in the “weird stuff I get from dad” pile 🙂


I have found it to be a very specialized camera in that its beauty is in the output: you can shift the focal point while viewing the image, which creates the look of a movie as you move the focus between different subjects or areas in the photo. It’s fantastic when you use the Lytro software for displaying the images. Outside of the Lytro software, you have to convert it to a static movie that shows what appear to be camera focus shifts but are really all from a single photo.

Unfortunately work has overwhelmed my plans for a number of setups with it, which will have to wait for summer now.

If you have specific questions for me, please feel free to message me.


Aww, darn, too far away for me to take you up on this offer. If you happen to find yourself wandering down near Sacramento with your Illum, please let me know! I’ve been wanting to play with a Lytro ever since I heard about them.


Here’s a depth map of Han Solo taken with my Lytro Illum. No idea what you could do with that.


And of course the face recognition in iPhone X is building a very nice depth map (using a fancy IR spot generator); anyone tried to read one yet?

I have. Not great:

Not sure if you used the front-side (selfie) camera or the main camera here? Can you comment?

That looks like a typical depth map to me; they’re based on self-similarity of images and parallax correction (or similar), and they really can’t do much better than this on dark images. They work better on really well-lit subjects with low noise.

The iPhone X front camera with its dot generator might do a better job, since it uses projection and can avoid the problems with noise and low light.

Here’s me using the Glowforge as a light stage for 3D sampling Han.

The Lytro Illum is too inconvenient for many uses (IMO) but it excels for macro stuff; you can see the camera is balanced comfortably right on the glass; it can focus right up to the lens surface.

This isn’t quite working out for me yet. I need some directional light on Han’s face, or the 3D map pops out inverted (the plain cream plastic doesn’t have enough detail for it to work out its depth under this lighting).


Here’s an iPhone X depth map, as captured by the iOS app DepthCam ($4.99) using the front-side “TrueDepth” sensor.

I have adjusted the brightness of the map to highlight the depth region of interest. Looks like plenty to create at least a cameo etching.

That’s not noise, BTW, that’s my beard.
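For anyone doing a similar brightness adjustment on an exported depth map, a percentile-based contrast stretch is one way to spread the region of interest over the full tonal range (a sketch of my own, not anything DepthCam itself exposes):

```python
import numpy as np

def stretch_depth(depth, lo_pct=2, hi_pct=98):
    """Contrast-stretch an 8-bit depth map so the depth band of
    interest spans the full 0-255 range.  The percentile clip keeps
    a few sensor outliers from flattening the rest of the image."""
    depth = np.asarray(depth, dtype=np.float64)
    lo, hi = np.percentile(depth, [lo_pct, hi_pct])
    out = np.clip((depth - lo) / max(hi - lo, 1e-9), 0.0, 1.0)
    return (out * 255).astype(np.uint8)
```

A narrow band of gray values (say, a face a foot from the camera) comes out remapped to the full black-to-white range, which is what you want before posterizing or engraving it.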


That cameo is going to be a treasured family heirloom, mark my words.
