I agree with palmercr’s initial post. Wattage would be a more robust defense against changes in the allowed minimum or maximum power produced by the laser. (To be clear, the wattage palmercr is referring to is the power out of the laser, not the power dumped into the laser.) At first blush, it would also make it easier to move projects between the basic, the pro, and other laser cutters. Dan has already mentioned that each machine is calibrated before it goes out the door, and that they hold much tighter tolerances on the beam than other manufacturers in the ‘consumer’ space. So for each system they know the relationship between the power dumped into the tube and the power output from the laser. They have to know this so that different machines will produce identical results using the same settings on the same Proofgrade.
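To make that concrete, here's a toy sketch of what such a per-machine calibration could look like. All the numbers and function names are invented for illustration; this is not how Glowforge's firmware actually works.

```python
# Hypothetical sketch: mapping a requested power percentage to a tube
# drive level using a per-machine calibration table. All numbers and
# names are illustrative, not actual Glowforge behavior.

def make_power_mapper(measured_output_watts):
    """measured_output_watts: dict mapping drive level (0-100) to the
    measured optical output in watts for THIS particular machine."""
    max_out = max(measured_output_watts.values())

    def percent_to_drive(percent):
        # Pick the drive level whose measured output is closest to the
        # requested fraction of this machine's own maximum output.
        target = (percent / 100.0) * max_out
        return min(measured_output_watts,
                   key=lambda drive: abs(measured_output_watts[drive] - target))

    return percent_to_drive

# Two machines with slightly different tubes still map "50%" onto
# whatever drive level produces half of their own maximum output.
machine_a = make_power_mapper({20: 5.0, 50: 20.0, 80: 35.0, 100: 40.0})
machine_b = make_power_mapper({20: 6.0, 50: 22.0, 80: 37.0, 100: 42.0})
```

The point of the sketch: the user-facing percentage stays meaningful across machines because each unit translates it through its own measured input/output relationship.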
It’s also a valid point that we don’t set the stitch length on a sewing machine as a percentage of the range between the smallest and largest stitches the machine can produce. We don’t set the temperature of our water heaters as a percentage of the maximum temperature the water heater can reach. We don’t set the voltage of a power supply as a percentage of the maximum voltage it can produce.
But.
It’s not clear to me that a laser cutter is a linear device, where the power delivered to the material has a linear effect. Does doubling the power produce a cut twice as deep? I don’t think doubling the power of a photo engrave doubles the darkness of the engrave. As the power increases, the dot size grows, but not linearly, and more of the power goes into making the dot deeper, which affects the dot size as well.
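As a toy illustration of that nonlinearity (the response curve and the constant here are invented, not measured), suppose engrave darkness saturates with power. Then doubling the power gives well under double the darkness:

```python
import math

# Toy model only: assume engrave darkness saturates with power, as
# darkness = 1 - exp(-k * power). The constant k is made up; the real
# response of any given material is something you'd have to measure.
def darkness(power_watts, k=0.05):
    return 1.0 - math.exp(-k * power_watts)

d20 = darkness(20.0)   # ~0.63
d40 = darkness(40.0)   # ~0.86, well short of double d20
```

Under any such saturating curve, "twice the power" and "twice the effect" are different things, which is the point: neither watts nor percentages map linearly onto results.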
Beam quality also affects a machine’s performance. It is not correct to say one will get the same results from 30 watts on the Glowforge as from 30 watts on a K40 or Redsail. (I don’t know if those show power out versus power in.) Correcting for the beam area doesn’t help, because the power profiles of the beams are different, and the focusing cones are different.
In other words, regardless of the starting point, watts or percentage, one still has to futz around to get the desired results on non-Proofgrade material.
And.
I’ll assume dan and company are pretty smart people and that they considered using watts instead of percentages. Why didn’t they? Probably because of the support issues it would engender. Remember the target market for the Glowforge. I don’t want to think about how many people would complain that they can’t get their machines below the minimum wattage, which is non-zero. More importantly, as a tube ages, the power it produces drops as the CO2 in the tube degrades. That 40 watts will no longer be 40 watts, and that will cause confusion. And support issues. Or the alignment might get whacked, the beam profile might change, and again that 40 watts will no longer be 40 watts.
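A tiny sketch of why the aging tube is the killer case for a watts-based UI. Everything here is illustrative, made-up numbers included: a watts setting becomes unsatisfiable once the tube can no longer deliver it, while a percentage setting always remains valid.

```python
# Illustrative only: why a watts-based setting creates support problems
# as a CO2 tube ages. All numbers are made up.

def request_watts(requested_watts, tube_max_watts):
    """Watts-based setting: fails once the tube can no longer deliver it."""
    if requested_watts > tube_max_watts:
        raise ValueError(
            f"tube can only produce {tube_max_watts} W; "
            f"{requested_watts} W requested")
    return requested_watts

def request_percent(percent, tube_max_watts):
    """Percentage-based setting: always satisfiable; it silently tracks
    whatever the tube can currently do."""
    return (percent / 100.0) * tube_max_watts

new_tube, aged_tube = 40.0, 33.0   # output drops as the tube ages

request_watts(40.0, new_tube)      # fine on a fresh tube
# request_watts(40.0, aged_tube)   # would raise: the "40 W" job now fails
request_percent(100, aged_tube)    # still valid, just delivers less power
```

Neither behavior is great for the user, but the percentage version degrades quietly instead of turning every aged tube into an error message and a support ticket.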
We also know the optics are better on the pro. Which means the beam profile could very well be different between the basic and the pro. Which means that again one can’t just correct for the beam area and expect the same cutting/engraving behavior from the two devices.
Wattage can be measured extremely precisely, but labeling power in watts implies a precision that the Glowforge simply can’t deliver. And that will cause support issues.
So.
I’m not going to second-guess dan and company. As already shown, people can produce great work with the systems at hand without too much difficulty once they get the process down. And the learning curve wouldn’t change whether power was marked in percentages or watts.
It is a PITA that the low-power settings aren’t enabled as production units ship. I don’t know whether people’s workflows will have to change once this is enabled for non-Proofgrade materials (or for dialed-in settings used on Proofgrade), or whether the machines simply won’t allow the user to set a low power. In either case, as happened when the settings were briefly turned on for a while, this is going to cause support headaches when they are enabled again.