Is my lid camera misaligned?

I’ve got a wife, 3 kids, a full-time job, and a part-time job, and I generally don’t make stuff. Sure, this past week I’ve been staying up until 1 AM working on things now that I have a production unit. But I would probably have been a bad choice for a PRU.

You, on the other hand, are a kind, well-spoken human being who is willing to share testing and ideas… who actually makes stuff! Who could ask for more?!

13 Likes

OK, not being an arts and crafts guy, I struggled (and whoever comments about the one line that goes a bit off kilter can make one of these perfectly and send it to me!). Anyway, this is a fail, not because of misalignment, but because of goofy software (maybe this will get fixed; let me explain). And yes, I will be filing this… @dan, @Tony, I will file, but this seems to be something to look at…

So I had this grid on a piece of 0.125" BB ply (my usual Amazon stuff), drawn with a red Sharpie. Why red? Because then the black engrave lines would be visible if this turned out to be a workable test to perform…


(sorry, hard to get a squared-up photo, and too lazy to fix it in PS, but the grid itself is square)

Now I put it in to trace, and after adjusting with the shift-arrows I arrived at a semi-reasonable trace (actually kind of meh; after about 10 contrast bumps, the noise between the lines got too bad to get good traces, so this is kind of not awesome scanning). I would bet I would get way better results on my flatbed into AI, but this is what it is.

Tracing

And now the fail:

Fail 1: because the traced image is larger than the area it can engrave, instead of doing what any reasonable software would do (“would you like to crop it?”), or simply self-cropping it, you can do absolutely nothing. Now of course if you could download the trace, you could crop it in PS, but that is not the advertised functionality here. So either GF needs to put a frame on their Proofgrade masking to show the actual traceable area (like the safe area in a video editor) or add a crop option, because otherwise this will annoy users.
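For what it’s worth, the self-crop part seems simple enough in principle. A minimal sketch of the idea, assuming the trace comes back as an ordinary bitmap and that the printable area is known in pixels (the file name and box numbers below are made up for illustration):

```python
from PIL import Image

# Hypothetical engraveable area of the trace image, in pixels.
# Real values would have to come from the camera calibration / bed limits.
PRINTABLE_BOX = (150, 100, 3300, 2000)   # (left, top, right, bottom)

trace = Image.open("trace.png")          # placeholder file name
cropped = trace.crop(PRINTABLE_BOX)      # drop everything outside the printable area
cropped.save("trace_cropped.png")
```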

It also applied a strange scale: no matter what I did in the GFUI I could never get a scale that fit (the scaling is way off; by zooming way, way out in the GFUI I can make it fit, but it is huge).

Fail 2: the alignment is clearly off. Now, in fairness, this is a PRU with a slightly torqued door hinge, so the camera is likely slightly tilted, so I am NOT taking points off for the fact that it is misaligned differently between the left and right sides.

11 Likes

I don’t know what I was expecting, probably the crop to printable area you outlined, but this wasn’t it. I’ll have to do a similar test when my unit is delivered.

1 Like

Good point about there being no constraint for the printable area when you are doing a large trace of something big in the bed. The no-go zebra stripes don’t pop up.

Did you just back out, redo the initial rectangle, and make the selection rectangle smaller using the known boundaries relative to the GFUI rulers?

2 Likes

No. Just gave up. But I decided on a variant test (because, to be honest, I care less about tracing).

So now I decided to make a circle with a crosshair in the middle and see how accurately I could place them from the camera image onto the grid on the wood. Squares are 1" (approximately).

Layout

Result


Even the center one is off. Again, disclaimer: my PRU lid is torqued, but here I did put the weight on the lid to make it sit flush. Someone should repeat this on a production unit.

Lines are scores, cross is an engrave.
bullseye.svg.zip (1001 Bytes)
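If anyone wants to roll their own target rather than unzip the attachment, here’s a rough sketch of building a similar bullseye as a bare SVG in Python. The diameter, crosshair length, stroke widths, and colors are my guesses, not values pulled from the attached file; I’m just using two stroke colors so the circle and the cross come in as separate operations.

```python
# Rough sketch of a bullseye target: a 1" circle (one color, e.g. score)
# with a centered crosshair (second color, e.g. engrave). Dimensions are
# assumptions, in inches.
CIRCLE_DIA = 1.0   # outer circle diameter
CROSS_LEN = 0.5    # crosshair tip-to-tip length
STROKE = 0.01      # stroke width

c = CIRCLE_DIA / 2
svg = f"""<svg xmlns="http://www.w3.org/2000/svg"
     width="{CIRCLE_DIA}in" height="{CIRCLE_DIA}in"
     viewBox="0 0 {CIRCLE_DIA} {CIRCLE_DIA}">
  <circle cx="{c}" cy="{c}" r="{c - STROKE / 2}"
          fill="none" stroke="black" stroke-width="{STROKE}"/>
  <line x1="{c - CROSS_LEN / 2}" y1="{c}" x2="{c + CROSS_LEN / 2}" y2="{c}"
        stroke="red" stroke-width="{STROKE}"/>
  <line x1="{c}" y1="{c - CROSS_LEN / 2}" x2="{c}" y2="{c + CROSS_LEN / 2}"
        stroke="red" stroke-width="{STROKE}"/>
</svg>
"""
with open("bullseye.svg", "w") as f:
    f.write(svg)
```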

8 Likes

I sort of did. I took two sheets of 8.5x11" paper and printed a grid on them, taped them down to a sheet of 1/8" Baltic Birch (Woodcrafts), and placed it against the left edge of the tray.

I got the same results - it scanned and then wouldn’t let me do anything with it. So I moved it over well within the margins of the cutting area.

Here’s what my bed looked like at 67% view.

Told it to add artwork and scan the bed. Here’s the result.


In my case shift-arrow (or ctrl-arrow or ctrl-shift) had no effect. That’s an issue by itself. But I figured I’d see how it scored the lines it was able to “see”. Here’s what it looked like ready to specify the operations.

Notice there’s also an engrave operation the GFUI found. I told it to ignore that and set the other set of lines to Score. Even though it wouldn’t do a complete grid, I figured it would give me an idea of how the lines would align.

Ready to go.

Here’s the start - not full lines, but the drop shots show how much it’s off.



Lower right.

Final result. Parts weren’t done because the scan was bad. Other areas have decent registration on the vertical lines but not the horizontal, and there are areas where the horizontal alignment was good but not the vertical. It wasn’t caused by the air assist moving the paper, as the path was not a straight up-and-down grid pattern. The head followed a random path, doing a couple/three inches and then skipping to another part of the design, so there weren’t a lot of cuts/scores that caused loose flying paper.


And the final screen image.

Bottom line, the current scan and registration are not up to the promises. Looks like work for @dan, @Tony, and crew awaits.

It could be resolved in software not yet released to PRUs, or it could be my machine, but it’s not what I expected. Based on my other projects from SVGs, I figured it’d be pretty spot-on.

12 Likes

I don’t think they ever promised lid-camera only alignment. When they get the head camera integrated into the process it will be so much better. But this is a great benchmark on where they are now! Thank you so much!

2 Likes

Well it is sort of implied by this:

1. Design with just a pen

Glowforge’s onboard cameras can cut & engrave directly over top of your drawing.

It doesn’t say this works only in the middle of the bed.

And when you manually align your design to the material with the lid cam, you can’t get right to the edge if it isn’t accurate.

The results are interesting (thanks for the experiments), but they are both PRUs. We are assuming they run the same software as the production units, but the factory calibration or construction might have been improved. It would be interesting to see if a production unit is better. Feedback from the show seems to be that they are.

7 Likes

Thanks for doing those tests!

In case anyone wants to run one but doesn’t want to make/find a grid, I made a grid SVG with varying line widths that I figured might be helpful with further tests. The lines of the grid are spaced at 1" intervals (center to center). The line widths start at 0.010" and increase by 0.010" per line.

I don’t know what it takes to post SVGs and have the image preview actually work. Here’s the (forum-modified) file…
https://community.glowforge.com/uploads/short-url/3HkT1frkQLH5XWvErS913o34MZz.svg
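Since the forum mangles SVGs, here’s a rough sketch of how a grid like that could be regenerated in Python if the download doesn’t work for someone. The sheet size is an assumption; the 1" pitch and the 0.010" starting width / 0.010" increment follow the description above, but this is my reconstruction, not the linked file.

```python
# Sketch of a test grid: lines on 1" centers, stroke width starting at
# 0.010" and growing by 0.010" per line. Sheet size is assumed.
SHEET_W, SHEET_H = 20.0, 12.0            # inches (assumed, bed-ish size)
PITCH, W0, DW = 1.0, 0.010, 0.010        # spacing, first width, increment

lines = []
for i in range(1, int(SHEET_W / PITCH)):
    w = W0 + (i - 1) * DW
    lines.append(f'<line x1="{i * PITCH}" y1="0" x2="{i * PITCH}" y2="{SHEET_H}" '
                 f'stroke="black" stroke-width="{w:.3f}"/>')
for j in range(1, int(SHEET_H / PITCH)):
    w = W0 + (j - 1) * DW
    lines.append(f'<line x1="0" y1="{j * PITCH}" x2="{SHEET_W}" y2="{j * PITCH}" '
                 f'stroke="black" stroke-width="{w:.3f}"/>')

svg = (f'<svg xmlns="http://www.w3.org/2000/svg" width="{SHEET_W}in" '
       f'height="{SHEET_H}in" viewBox="0 0 {SHEET_W} {SHEET_H}">\n'
       + "\n".join(lines) + "\n</svg>\n")
with open("test_grid.svg", "w") as f:
    f.write(svg)
```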

6 Likes

You are always helpful…

3 Likes

Oh boy, that’s a bit of a mess. Thanks for testing, @jamesdhatch.

Would we get better results on a thicker board to avoid warping? You could just bond the printed sheet to cardboard.

Another (different) test (for anybody who’s willing and does not see it as some unreasonable demand…) is to engrave a simple 10x10 mm grid on material the full size of the bed, with no camera involved. I am sure it will engrave dimensionally accurately, but I think there’s a bit of confusion about the accuracy of the GF generally.

The rulers are coming out pretty accurately when photographed without parallax, so I think it is probably more accurate than you can measure by eye with a ruler.

I would cut out an L shape and measure the inside leg length with calipers. That gives a kerf-independent dimension to measure.
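The reason the inside leg is kerf-independent, for anyone who wants the one-liner: put the tip of the leg at nominal x = a and the inside-corner cut at nominal x = c, and assume the beam removes k/2 from each side of the programmed line. Both edges bounding the inside leg recede in the same direction, so the kerf cancels, while an outside dimension loses k/2 at each end:

```latex
L_{\text{inside}} = \Bigl(a - \tfrac{k}{2}\Bigr) - \Bigl(c - \tfrac{k}{2}\Bigr) = a - c,
\qquad
L_{\text{outside}} = \Bigl(a - \tfrac{k}{2}\Bigr) - \Bigl(0 + \tfrac{k}{2}\Bigr) = a - k .
```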

4 Likes

I don’t think so. I was watching the laser operations and it was wrong from the start: the first vertical and horizontal lines were not lined up with the grid lines on the paper. Those were in the upper-left part of the paper. It also showed similar behavior when it got to the lower right. It was a fairly random walk across the paper, so it was a fair amount into the project before any paper started to blow around or get cut out and fly off.

2 Likes

I was running this same test the other day and ran into some of the same problems, specifically the scanned artwork falling outside the print area. Another issue: I had been cutting thicker material previously, so I scanned the artwork and then set the material thickness. This caused the scanned artwork and the picture of the bed to become misaligned, and I had to rescan.
I didn’t actually do the engrave, because I had other projects to do and it was going to take longer than I thought, but I’m planning on it later this week.

3 Likes

I did do a 10LPI test. It was… interesting, but I wouldn’t say conclusive.

1"grid on cardboard.

Here is a picture of the engrave lines around the center of the image.

Bottom left.

There is a weird anomaly that happened in a few places: the engrave lines bounced around (see image below). This is significantly wider than what was depicted in the scan. I’m curious if it is caused by the 10 LPI setting, which is why I’m going to do this test again.

13 Likes

Note they use the plural of the word “camera.” So while alignment isn’t up to snuff with the lid camera alone, I have high hopes for what they can do with the head camera and some clever programming. I’m pretty confident (from a geometric analysis I did a while back) they will never be able to align satisfactorily with the lid camera alone.
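For the curious, a back-of-the-envelope version of that geometry (my own symbols and numbers, not the analysis referenced above): with a single lid camera a height H above the bed, a feature on material of thickness t, sitting a lateral distance d from the point directly under the lens, gets projected onto the bed plane with an error of roughly

```latex
\Delta \approx d \cdot \frac{t}{H - t}
```

So if H were, say, around 8" and the feature sits 9" off-axis on 1/8" material, the single-camera error is already on the order of 0.14", and it grows toward the edges and with thicker stock. The camera height is an assumption here; the scaling is the point.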

3 Likes

I think the head camera is for edges. I will be surprised if it is used for trace mode.

I can imagine using the head camera to do some spot sampling of the artwork, and comparing the results with the lid camera image. Then based on that, warping the lid camera image locally to match the head camera reality. Like I’ve always said, a monumental programming task!
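To make the "warp locally to match the head camera" idea concrete, here’s a purely hypothetical sketch using off-the-shelf SciPy/OpenCV, not anything Glowforge has described: given a handful of control points where the head camera has confirmed where a lid-camera pixel really lands, you can fit a smooth displacement field and remap the lid image to match.

```python
import numpy as np
import cv2
from scipy.interpolate import RBFInterpolator


def warp_lid_image(lid_img, lid_pts, head_pts):
    """Warp the lid-camera image so that lid_pts land where head_pts say.

    lid_pts / head_pts: (N, 2) arrays of matching (x, y) pixel positions,
    e.g. spot samples checked with the head camera. Entirely illustrative;
    a real pipeline would need many samples, outlier rejection, and a much
    cheaper evaluation than a per-pixel RBF.
    """
    h, w = lid_img.shape[:2]

    # cv2.remap wants the inverse map: for each *output* pixel, where to
    # sample in the source (lid) image. Fit that mapping from the samples.
    inverse_map = RBFInterpolator(head_pts, lid_pts, smoothing=1.0)

    ys, xs = np.mgrid[0:h, 0:w]
    grid = np.column_stack([xs.ravel(), ys.ravel()]).astype(np.float64)
    src = inverse_map(grid).astype(np.float32)

    map_x = src[:, 0].reshape(h, w)
    map_y = src[:, 1].reshape(h, w)
    return cv2.remap(lid_img, map_x, map_y, cv2.INTER_LINEAR)
```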

3 Likes

It looks like at 10 LPI it is sampling the grid on a 0.1" pitch, so you are getting aliasing. I think the other tests were done with scores, not raster engraves.
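A toy way to see the aliasing (numbers assumed for illustration, not measured from the GF): at 10 LPI the raster rows fall every 0.1", so whether a thin drawn line gets burned at all, and which row it lands on, depends entirely on where it sits relative to the nearest row. That covers both the skipped sections and the bouncing.

```python
# Toy aliasing demo: raster rows every 0.1" (10 LPI) versus a thin drawn
# line at various positions. Line width and positions are assumptions.
ROW_PITCH = 0.1    # inches between raster rows at 10 LPI
LINE_WIDTH = 0.05  # assumed marker line width, inches

for line_center in [1.00, 1.03, 1.05, 1.08, 1.10]:
    lo, hi = line_center - LINE_WIDTH / 2, line_center + LINE_WIDTH / 2
    rows = [round(n * ROW_PITCH, 3) for n in range(30)
            if lo <= n * ROW_PITCH <= hi]
    if rows:
        shift = round(rows[0] - line_center, 3)
        print(f'line at {line_center:.2f}": engraved at {rows}, shifted {shift:+.3f}"')
    else:
        print(f'line at {line_center:.2f}": skipped entirely')
```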

3 Likes

Look at screenshots of the Muse’s bed photo stitching and you’ll see why it’s so hard. It depends greatly on the depth of whatever it is you’re trying to capture. With the Muse it looks like they assume a certain depth. Something sloped like a laptop would require what I imagine to be some pretty serious processing: take the depth map from the laser rangefinder, marry that up with the photos from the lid and head cameras, remove any lens aberrations (fisheye, etc.), and then stitch things up accurately.
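For a flat sheet at a known height, the correction is at least tractable. A rough sketch with OpenCV, assuming a one-time calibration has produced the camera’s intrinsic matrix and distortion coefficients, and that you can find the four material corners in the undistorted photo; all the names and numbers here are illustrative, not how Glowforge or the Muse actually do it:

```python
import numpy as np
import cv2


def flatten_bed_photo(img, camera_matrix, dist_coeffs, corners_px,
                      bed_size_in=(20.0, 12.0), ppi=20):
    """Undistort a lid photo and reproject it onto the material plane.

    corners_px: the four material corners in the *undistorted* photo,
    ordered top-left, top-right, bottom-right, bottom-left. A thicker
    sheet sits closer to the lens, so it needs its own corner positions;
    that is where the assumed depth comes in.
    """
    # 1. Remove lens distortion (fisheye/barrel, etc.).
    undistorted = cv2.undistort(img, camera_matrix, dist_coeffs)

    # 2. Homography from the observed corners to a flat rectangle.
    w_in, h_in = bed_size_in
    dst = np.float32([[0, 0], [w_in * ppi, 0],
                      [w_in * ppi, h_in * ppi], [0, h_in * ppi]])
    H = cv2.getPerspectiveTransform(np.float32(corners_px), dst)

    # 3. Reproject. Anything NOT at the assumed height (a sloped laptop
    #    lid, say) still lands in the wrong place, which is where a real
    #    depth map would have to take over.
    return cv2.warpPerspective(undistorted, H, (int(w_in * ppi), int(h_in * ppi)))
```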

6 Likes