Clean the top of your head?


This might be a case of placebo effect, but it’s worth a shot.

I noticed today that my GF kept needing to calibrate and was having trouble figuring out where the head was. And then it struck me that there was a lot of powdery crud on the top of the head from all the stuff I’ve been cutting, and that some of it might be obscuring the GF logo that the camera recognizes to do the homing thing. So I took five seconds and wiped it off. Performance seems to be better.

Or maybe I just feel better about having done something.


When I clean my unit, I clean everything we’re supposed to clean as per their instructions, plus I clean the top of the head. I figure “Can’t hurt!”


Just a reminder that Glowforge uses the alignment of the logo underneath the overhead camera to reposition the head and center it. So it makes sense that keeping the logo clean and visible would be in your best interests.
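For the curious: Glowforge’s actual homing code isn’t public, so this is just a toy sketch of the general technique (template matching) with entirely made-up numbers — a synthetic “lid camera” frame and a brute-force search for the spot that looks most like the logo:

```python
# Toy sketch of camera-based homing via template matching. NOT Glowforge's
# actual code -- just the general idea: slide a known "logo" template over
# the camera frame and pick the window where the pixels disagree the least.

# Fake 8-bit "lid camera" frame: dark bed (20) with a bright 10x10 logo patch.
H, W, T = 60, 80, 10
frame = [[20] * W for _ in range(H)]
for y in range(12, 12 + T):
    for x in range(50, 50 + T):
        frame[y][x] = 230
template = [[230] * T for _ in range(T)]

def find_logo(image, tmpl):
    """Return (x, y) of the window with the smallest sum of
    squared differences between the image and the template."""
    th, tw = len(tmpl), len(tmpl[0])
    best_score, best_xy = None, None
    for y in range(len(image) - th + 1):
        for x in range(len(image[0]) - tw + 1):
            score = sum(
                (image[y + j][x + i] - tmpl[j][i]) ** 2
                for j in range(th) for i in range(tw)
            )
            if best_score is None or score < best_score:
                best_score, best_xy = score, (x, y)
    return best_xy

print(find_logo(frame, template))  # -> (50, 12)
```

Which also suggests why crud matters: dust on the logo raises the mismatch score at the true position, so a bright reflection or a busy scrap of material elsewhere on the bed can start to look like a better match.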


Another thing I noticed: a video posted today showed the homing going the wrong way, with a piece of material on the bed that had lots of previous cuts in it. I wonder if the scene just gets too visually complex for it to see where the head is. The thread got closed, but it would be interesting to know if removing the material lets it home.

The other issue seems to be bright room lights.


That’s an interesting point. I never thought of that. Does it look at the camera as the very first calibration step to decide where the head is and which way to move it? It might be seeing a reflection or stray material and thinking “oh, it’s over to the right, I need to move left”. I’ll be more attentive to that next time.

I’d love to see the “so that” part of the user story for vision-based homing.


I presume so, but since it parks in the top left, it seems like simply assuming it’s there and moving diagonally to the middle would be more reliable. I.e., instead of asking the user to move the head there, do it with the motors. The worst that could happen is it bangs against the front or right side, but how is that any worse than what it does now?


Thinking that lights and reflections have a lot to do with the calibration step. Although I have a Pre-Release, I don’t believe there is any difference in how the homing works between my machine and a production unit. Calibration still depends on an internet connection and the camera. My Pre-Release calibrates properly every single time in under 90 seconds. I have not had to physically move the head from the park position in 5 months. One difference might be that my space does not have bright external lights or windows. Just a possibility.


When I had PRU1 & PRU2, I purposely didn’t clean the interior of the machines. PRU1 was cleaned before its RMA (as a courtesy, ‘be kind, rewind’) and PRU2 was cleaned when it was taken to a demo. But as grotty as the interiors were in both cases, neither had any issues with the homing.

The one commonality, as @rpegg stated: I don’t have any additional lighting above the unit.


If I recall correctly, another difference between the PRU and production machines is that they changed the internal lighting from slightly purple-tinted LEDs to more white-tinted. At least, that’s what Dan has said in the past about the Glowforges at conferences (CES and elsewhere).

I personally think that they should have printed the logo with ultraviolet ink and used that feature to help differentiate the head.


Naw, the Gen1 & Gen2 PRUs have the same lid lights as the production units. The Betas and the units they had at the first Maker Faire had the old lighting.


Random note, and unlikely to happen: I’d love to see a lineup of the Betas next to the current shipping units.


Current state-of-the-art computer vision should be able to find the white logo on black easily and not get confused by founder’s rulers lying on the bed, etc. As long as the head isn’t bleached out by a very bright light, I don’t see what the problem is.


I have spent my career in Industrial Automation. When we use vision in machine control, we always live by the mantra “80% lighting, 20% camera.” Consistent lighting is paramount to achieving the correct contrast in a computer-read image. I’ve seen instances in a factory setting where bright sun shining through a skylight would wash out the image; we had to cover the machines in an opaque material to get reliable operation. Computer vision has its limitations and is not as easy as folks would like to believe.

I can see where excessive dust on and around the GF logo could potentially confuse the vision, though it would have to be a drastic amount of debris, I would think.


Although dust by itself probably wouldn’t do enough, with interesting lighting conditions it might be enough to tip the balance sometimes.

(I’d also like to point out that we’re asking for really high accuracy from the GF’s vision here – working 99.9% of the time will result in many reported failures every day.)
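Back-of-envelope on that 99.9% point — fleet size and calibrations per day here are pure assumptions, but the shape of the math is the point:

```python
# Even a 0.1% per-calibration failure rate adds up across a fleet.
machines = 10_000            # assumed fleet size
calibrations_per_day = 3     # assumed power-ons per machine per day
failure_rate = 1 - 0.999     # "works 99.9% of the time"

failures = machines * calibrations_per_day * failure_rate
print(round(failures))  # -> 30 failed calibrations a day, fleet-wide
```

So a success rate that sounds excellent per-machine still generates a steady trickle of “my unit won’t home” posts.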