The new iPhone 7 has a dual-camera system which apparently will let you create depth-mapped images! This will be incredibly awesome if the 3D engraving feature of the Glowforge works as promised. You’ll be able to take a photo of someone and have the depth map available for your engrave.
The first image is of course the original photo. Because the iPhone has two lenses, it can see stereoscopically to determine depth, which is rendered as a black-and-white gradient with multiple layers acting as depth stages.
You can then (supposedly) just import this into the Glowforge software and engrave the 3D photo you just took into a material.
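To make the idea concrete, here's a minimal sketch of how a grayscale depth map could drive a 3D engrave. The assumption (purely illustrative, not Glowforge's actual pipeline) is that darker pixels mean deeper cuts, so gray values from 0 to 255 get mapped to a laser power percentage:

```python
# Illustrative sketch only -- the function name and the darker-is-deeper
# convention are assumptions, not Glowforge's documented behavior.
def gray_to_power(gray: int, max_power: float = 100.0) -> float:
    """Map a 0-255 gray value to a power percentage.

    Darker pixel (closer to 0) -> deeper cut -> more power."""
    return max_power * (255 - gray) / 255

# Pure black engraves at full power, pure white not at all:
print(gray_to_power(0))    # 100.0
print(gray_to_power(255))  # 0.0
```

However the real software handles it, the core of a "photo to 3D engrave" workflow is some per-pixel mapping like this from depth to cut depth.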
It makes me really reconsider getting an iPhone 7 Plus. If there aren’t other stereoscopic cameras available, I might just have to. This opens up a world of opportunity.
Below is an example of what should be able to be accomplished:
I was hoping to upgrade to the 7 Plus last weekend, but apparently everywhere is still out of stock! For one of the most negatively received iPhone releases, it sure is outselling previous iterations!
If this feature is truly your deciding factor, I would hold off for now. Seeing the depth map example, I’m worried there isn’t enough detail for a high-quality engrave. I’d let this shake out a bit before making the jump…
Great information! I purchased an iPhone 7 Plus hoping that this might be a possibility. Do you have any links or sources for more information? Thanks.
As this functionality is still in beta for the iPhone 7 Plus, I can’t offer any real avenues for testing or playing with it right now. The iOS 10.1 update will be the release point for these capabilities, and it might require a special app to get the 3D depth-map image if Apple decides not to let you export them natively.
However, as they do appear to allow you to change the background blur of images on the fly, it seems that this data will be stored as part of the photograph. This means every photo you take with the dual-lens system should have that data available once there are more details on how it can be accessed/exported.
I doubt that the iPhone 7’s cameras have the z-depth resolution needed to create really detailed bas relief. 2.5D/3D engraving is for me one of the most interesting aspects of laser engraving, and the hardest part is always generating the artwork. It’s hard to get detail out of 3D packages. I’d love to see a camera solution that can generate detailed depth maps for engraving.
As far as “cheap” stereoscopic camera systems go, I have an enclosure for a pair of GoPro 1s for shooting in 3D. If someone could just go ahead and write a program to extract the depth info and output a glowforgeable file… that’d be great.
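The core of that hypothetical program is stereo matching: for each pixel in the left image, find how far it has shifted in the right image; bigger shift means closer object. Here's a toy sketch of naive block matching on tiny synthetic "images" (2D lists of gray values). Real tools like OpenCV's StereoBM are far more robust; this only illustrates the principle:

```python
# Toy sketch of block-matching stereo disparity -- the idea behind turning
# a GoPro stereo pair into a depth map. Not production code.
def disparity_map(left, right, block=1, max_disp=4):
    """For each pixel in `left`, find the horizontal shift into `right`
    that minimizes a sum-of-absolute-differences cost over a small block.
    Larger disparity = closer object."""
    h, w = len(left), len(left[0])
    disp = [[0] * w for _ in range(h)]
    for y in range(h):
        for x in range(w):
            best_cost, best_d = float("inf"), 0
            for d in range(min(max_disp, x) + 1):
                cost = sum(
                    abs(left[yy][xx] - right[yy][xx - d])
                    for yy in range(max(0, y - block), min(h, y + block + 1))
                    for xx in range(max(d, x - block), min(w, x + block + 1))
                )
                if cost < best_cost:
                    best_cost, best_d = cost, d
            disp[y][x] = best_d
    return disp

# A bright square shifted 2 pixels between the views should show disparity 2:
left  = [[0] * 8 for _ in range(4)]
right = [[0] * 8 for _ in range(4)]
for y in (1, 2):
    for x in (4, 5):
        left[y][x] = 255
    for x in (2, 3):
        right[y][x] = 255
dmap = disparity_map(left, right)
print(dmap[1][4], dmap[1][5])  # 2 2
```

Scale the disparity values to 0–255 grayscale and you have exactly the kind of depth image the iPhone demo shows.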
While we’re talking about this, does anybody know about the file format of the Lytro Illum? It’s able to capture depth of field, at least to some extent, and if it’s precise enough to get fed into a slicer, the sub-$300 fire-sale pricing would be well worth it.
It seems like there might be some options out there. Here are a couple of videos that came up from a search for “stereoscopic images to depth map”…
It looks like this person has written some software that generates depth maps out of pairs of images. Maybe they’ve only reviewed the software… I’m not sure. The linked website is “wordier” than I want to tackle at the moment.
This looks like more polished software (with a GUI and everything!) but a quick look at the website makes me think it’s unreleased or something. The only option for acquiring the software that I saw was a link to “ask for a demo”.
Oh wow, I have to check this out. I have the phone but I didn’t think about getting a depth image from it…
So if that’s the case, I wonder what can be done with the light field cameras…
Opens up a new level of control; might even justify using it like a scanner…
Plans, plans, plans…
I don’t think any precomputed DoF data is available (at least so far) from AVFoundation. I think at this point you have to grab the images from each camera and run your own algorithms to figure it out.
See this post on the Apple Developer Forums to see what’s currently supported by the SDK.
Looks like Android’s camera software can make depth maps as well.
This video moves pretty slowly, but it’s good enough. Basically you take a photo using the “blur photo” mode and then extract the depth map. The person who made this video uses a site called “Depthy” to do the depth map extraction, but I bet there are other ways to do it too.
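For the curious, one way sites like Depthy can pull the depth map out is that Google's camera app embeds it in the JPEG's XMP metadata (a base64-encoded image under `GDepth:Data`). Here's a rough sketch of the decoding idea; note that real files split the XMP across APP1 segments, so the naive regex below is an assumption about layout and a proper metadata tool (exiftool, or Depthy itself) is the safer route:

```python
# Rough sketch: decode a GDepth:Data attribute from JPEG bytes.
# Assumes the XMP is readable as one contiguous chunk, which real
# Lens Blur files generally are NOT -- illustration only.
import base64
import re

def extract_gdepth(jpeg_bytes: bytes) -> bytes:
    """Return the decoded depth-image bytes, or b'' if none found."""
    m = re.search(rb'GDepth:Data="([^"]+)"', jpeg_bytes)
    if not m:
        return b""
    return base64.b64decode(m.group(1))

# Demo on a fabricated snippet (not a real photo):
fake = (b'<rdf:Description GDepth:Data="'
        + base64.b64encode(b"PNGDATA") + b'"/>')
print(extract_gdepth(fake))  # b'PNGDATA'
```

The decoded payload is itself an image file (the grayscale depth map), which is what you'd feed to the engrave software.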
Yep, you’re right, there’s nothing there yet, at least for 10.1. Given that the feature is still beta for first-party apps, I don’t expect to see anything until 10.3 at the earliest. Just like they deliver the extra data for Live Photos, I could see the DoF data arriving as part of a package, as a grayscale map. That is, if they do offer it… Frankly, though, I’m doubtful we’ll see enough information to be useful. I totally see Apple only keeping the data they need for the bokeh effect. Time will tell…