Camera Alignment

So we’re talking apples-to-apples, are you talking about a way the end user can compensate for bad camera alignment, or a suggested way that GF can calibrate the machine before sending it out?

Both. Assuming that the cameras are perfectly aligned leaving Flex, and shipment causes them to fall out of alignment, then having the same calibration sheet (that can’t physically miscalibrate in rough shipping) makes sense.

First of all, I don’t think we’ve seen an instance of any machine with perfect optical (camera) alignment across the entire bed. I don’t think we will, and I suspect the head camera will be the only hope for that.

Second, since the honeycomb shifts easily and there doesn’t seem to be any absolute reference landmark against which to place the alignment sheet, the placement of the sheet will not be known accurately.

What good will it do to take an image of a calibration sheet in an unknown position? It seems the only good this will do is to provide a surface on which to do a trial print, and against which to align the final material.

Well, I’m assuming the cameras leave the factory aligned. Maybe they don’t. I can only assume they would be.

But I’m fairly sure there’s decent enough markers inside for consistent alignment: the seams of the aluminum shields, the molding on the front of the case where the Pro feed slot is, the Y rail. There should be plenty that are both consistent and do not rely on a sliding honeycomb tray for reference.

That engrave was from a cold boot. Interesting you say placement got “out of whack” with the last update because, as coincidence would have it, I got this machine the day of that update. So it sounds like it’s possible it’s just a software change. That would certainly be preferable to a hardware issue.

I don’t understand why there’d be any variation in camera placement machine-to-machine, let alone a large variation. But that’s not what’s important. If one should expect a max distortion of .25" and I’m getting a .265" distortion, that would indicate it’s out of spec.

I wouldn’t call the 0.25" a “spec.” More of an observation.

One of the mentions in the patent relates to mapping specific pixels to physical locations.

Yesterday all my images seemed to be shifted left compared to the after-cut image. I was puzzled. Then I shut everything down, went away, and came back, and I had good images for placing random shapes all over a previously cut full sheet. I am not sure what happened. I have had weird image alignment from time to time; then it goes away. Sometimes that is due to incorrect height settings of the material, but other times I am at a loss.
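To put a rough number on the height-settings part: with a single overhead camera, entering the wrong material height turns directly into a lateral placement error, and it grows with distance from the spot under the camera. Here’s a minimal sketch of that parallax effect, assuming a simple pinhole-style geometry; the dimensions are illustrative guesses, not actual Glowforge numbers:

```python
# Rough parallax estimate for a downward-looking lid camera.
# Simple pinhole-style geometry; all numbers below are illustrative guesses.
def apparent_shift(camera_height_in, horizontal_offset_in, height_error_in):
    """Lateral preview error caused by entering the wrong material height.

    camera_height_in:     camera lens height above the true material surface
    horizontal_offset_in: how far the feature is from directly under the camera
    height_error_in:      (entered height) - (actual height)
    """
    # Similar triangles: the sight line to a point at offset d on the true
    # surface crosses the assumed surface at d * (H - e) / H, a shift of d*e/H.
    return horizontal_offset_in * height_error_in / camera_height_in

# Example: camera ~14" above the material, feature 8" off-center,
# height entered 0.1" too thick -> about 0.057" of apparent shift.
print(apparent_shift(14.0, 8.0, 0.1))
```

So even a tenth of an inch of height error is enough to produce a visible offset near the edges of the bed.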

I’m curious as to whether yours is a hardware issue, a firmware issue, or a GFUI image manipulation issue.

Weird! So you had the opposite effect of what I had?! When I get a chance I’m going to toss some cardboard in and see what distortion I get across the entire bed. What’s toughest about this for me is that I’ve always placed my materials center-bed with my 1st unit. But it seems now I may need to right-shift everything, and that’s counter-habit at this point. But I’ll run my cardboard test and see what I see…

The axes are homed with the camera, so there is no consistent alignment with the case unless the camera is consistently aligned with the case.

It would be interesting to try placing a coin under each corner of the machine to twist it slightly and see if that affects the camera alignment.

How can you patent simple optics and geometry? Light has to come from a certain angular direction to hit a specific image sensor pixel. That is a property of lenses. What can be seen along that sight line is simple geometry.
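To make the sight-line point concrete, here’s a rough sketch of back-projecting a pixel onto a flat bed, assuming an ideal pinhole camera. The intrinsic matrix, rotation, and camera position are hypothetical calibration values, not anything pulled from the actual firmware:

```python
import numpy as np

# Back-project an image pixel onto a flat bed plane, assuming an ideal pinhole
# camera. K is the 3x3 intrinsic matrix, R_cw rotates camera coordinates into
# world coordinates, and cam_pos is the camera center in world coordinates.
# All three are hypothetical calibration values used only for illustration.
def pixel_to_bed(u, v, K, R_cw, cam_pos, bed_z=0.0):
    """Intersect the sight line through pixel (u, v) with the plane z = bed_z."""
    ray_cam = np.linalg.inv(K) @ np.array([u, v, 1.0])  # ray direction, camera frame
    ray_world = R_cw @ ray_cam                           # same ray in world frame
    s = (bed_z - cam_pos[2]) / ray_world[2]              # scale needed to reach the bed
    return cam_pos + s * ray_world                       # (x, y, bed_z) hit point
```

That’s all the geometry there is: one ray per pixel, intersected with the bed plane.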

The head has object obstruction detection. Remember the video of Dan’s coffee cup left inside the hopper? It can sense the limits of the bed just fine. The homing of the axis under the camera only finds the midpoint.

I think it detected the cup collision with an accelerometer. I don’t think it can detect the edge of the bed because during homing with the camera people report it trying to go the wrong way and bashing into the end stop repeatedly without seeming to notice.

Accelerometer or otherwise, it doesn’t particularly matter how it knows it has hit a hard stop. Bashing the head without a starting frame of reference isn’t uncommon – it’s like being put into a blackened room with no idea where the walls are. So it’s not surprising that the main troubleshooting tip is to power off and manually move the head under the camera so it has a starting frame of reference.

Do you know how to reboot the machine? With the machine on, hold down the button for ten seconds until it turns blue, then turn it off for three minutes or so.

Then do a test layout on something cheap like cardboard before ruining a cutting board.

We were discussing this “quarter inch off” problem on the forum in one of Joe’s threads ( IIRC) a while back. It started the day of the update. Mine stayed that way until I rebooted the machine at Rita’s direction, and it is now fine.

Yes but it ignores hitting the endstop. It sets the origin by looking at the logo on the head using the camera. So the coordinate system is locked to the camera, not the chassis.

I’m not a patent expert. I can only tell you what’s in the description of the patent.

[0009] The change in the material can include at least one of cutting, etching, bleaching, curing, and burning. The image can be processed to remove distortion. The distortion can include chromatic aberration. The image can be enhanced by increasing contrast. Pixels in the image can be mapped to corresponding physical locations within the working area.
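That pixel-to-location claim at least maps onto a very standard computer-vision operation: fit a plane homography from a handful of reference marks whose physical positions are known, then use it to convert any pixel to bed coordinates. A minimal sketch with OpenCV, where every coordinate is made up for illustration rather than taken from an actual machine:

```python
import cv2
import numpy as np

# Four reference marks seen in the lid image (pixels) and their known physical
# positions on the bed (inches). All of these coordinates are invented examples.
pixel_pts = np.array([[312, 188], [1605, 174], [1622, 1014], [298, 1030]], dtype=np.float32)
bed_pts_in = np.array([[1.0, 1.0], [19.0, 1.0], [19.0, 11.0], [1.0, 11.0]], dtype=np.float32)

H, _ = cv2.findHomography(pixel_pts, bed_pts_in)  # 3x3 pixel-to-bed mapping

def pixel_to_inches(u, v):
    """Map an image pixel to bed coordinates (inches) via the fitted homography."""
    p = H @ np.array([u, v, 1.0])
    return p[:2] / p[2]  # divide out the projective scale

print(pixel_to_inches(960, 600))  # lands roughly mid-bed for these made-up points
```

Whether that rises to the level of something patentable is a separate question, but it shows why “mapping pixels to physical locations” is not, by itself, exotic.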

I know that a lot of times things are “over-patented” in the sense of this sounds like a great idea but the reality of implementation is sometimes different.

I did do that a few times, actually, on 24-June (when I got the machine). I can’t recall if that software update was on the 24th or 25th though.

So here’s my cardboard test…

I’m finding it difficult to understand what I’m seeing, because it doesn’t look like fisheye. There’s pretty major distortion in the lower-left; that seems to be the worst spot on the map. But I don’t see equal, or even similar, distortion in the lower-right. Upper-right, however, is probably the second-worst spot. I dunno. It’s really weird.

Settings:
0.160" thickness ~20" x ~12" cardboard
1000/25/225

It’s possible the software is trying to do lens correction, like you might do for a camera lens in Photoshop/Lightroom. Software can correct for lens distortion, but they may be having trouble correcting for distortion and maintaining registration in software.
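For the kind of correction you’re describing, the usual software route is to model the lens with a few radial/tangential coefficients and remap the image. A rough sketch with OpenCV, where the camera matrix and distortion coefficients are placeholder guesses rather than anything measured from a Glowforge:

```python
import cv2
import numpy as np

# Software lens-distortion correction in the Photoshop/Lightroom sense, using
# OpenCV's radial/tangential model. The matrix and coefficients below are
# placeholder guesses; a real correction would come from a camera calibration.
img = cv2.imread("bed_image.jpg")
h, w = img.shape[:2]

camera_matrix = np.array([[900.0,   0.0, w / 2],
                          [  0.0, 900.0, h / 2],
                          [  0.0,   0.0,   1.0]])
dist_coeffs = np.array([-0.32, 0.11, 0.0, 0.0, 0.0])  # k1, k2, p1, p2, k3 (guesses)

undistorted = cv2.undistort(img, camera_matrix, dist_coeffs)
cv2.imwrite("bed_image_undistorted.jpg", undistorted)
```

The catch is the second half of your point: even after the remap, the corrected image still has to stay registered to the motion system, and that’s a separate calibration from the lens model.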

Yes because a camera maps light from physical locations in the view to pixels in its image. I don’t see how you can patent using the pixels to work out physical positions. 3D scanners have been using cameras for that for years.