Metric vs Imperial

I don’t think so.

We could tell by looking at the resulting motion files generated for each version for the same cut/engrave.

I think it is more for rapid moves, rather than cuts/engraves.

This all just makes me want to buy a Basic for no reason other than science!

1 Like

I have a link for that… :wink:

5 Likes

Yes, internally power is 0-127, but externally it is 0-100 plus Full. Full is a different mode from precision power: at Full it uses hardware to set a value (presumably tube dependent) and runs with the laser permanently enabled. For all other powers it sets the hardware power to 127 (the max) and turns the laser on and off via the 10kHz motion file to give a sort of software PWM at 1.25kHz. That gives only 8 levels, but it dithers to get values in between, i.e. over multiple 8-tick cycles the power averages out.
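The dithering described above can be sketched as a small simulation. The 8-tick cycle comes from the post; the error-diffusion scheme below is my assumption about how the averaging might work, not something decoded from the firmware:

```python
def dithered_pwm(power_fraction, cycles, ticks_per_cycle=8):
    """Simulate coarse software PWM with dithering.

    Each cycle can only be "on" for a whole number of ticks, so a
    single cycle offers few power levels.  Carrying the rounding
    error into the next cycle (error diffusion) makes the long-run
    average converge on the requested fraction.
    """
    pattern = []
    error = 0.0
    for _ in range(cycles):
        target = power_fraction * ticks_per_cycle + error
        on_ticks = round(target)
        error = target - on_ticks          # carry the remainder forward
        pattern.append(on_ticks)
    return pattern

# 55% power: individual cycles alternate between 4/8 and 5/8 ticks on,
# but the average over many cycles converges to 0.55.
p = dithered_pwm(0.55, 20)
```

Each cycle is only 4 or 5 ticks on, yet the 20-cycle average lands exactly on 55%.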

1 Like

I understand how the technicalities work; I just haven’t spent enough time plugged into the device to see the data myself, so as far as I’m concerned this is all undocumented speculation. I see far too many “maybes” and “possiblys” on the forum, which is why I’ve been taking everything with a grain of salt until I see it myself. :smile:

1 Like

It isn’t speculation on my part. I have decoded puls files and examined what they do.

1 Like

Dare to be different:

https://royal.pingdom.com/2009/07/13/strange-funny-and-baffling-units-for-measuring-almost-anything/

2 Likes

There is extensive documentation available from @scott.wiederhold. I understand you are new to the forum, and thus unlikely to have seen much of it, but you will find that Scott and @palmercr have put out significantly more information on the Glowforge’s strengths and limitations than anyone else here, and certainly much more than Glowforge themselves.

4 Likes

Some of it is in the forum, which can be found using the search feature.

The rest can be found here:
https://github.com/ScottW514/GF-Hardware/wiki/01_Hardware-Overview

With newer information being posted and discussed here:

5 Likes

They use DPI everywhere, but the numbers are meaningless. Screens are still 72 dpi (on Macs at least), but then a printer wouldn’t really be 300 dpi, as that is not a true multiple of 72, and neither is 600 or 1200. They really render at 288 (4 × 72), etc. So the numbers are just easy to remember but technically wrong.

Why would a printer be an exact multiple of a screen? I think a 300 DPI printer does accurately print at 300 DPI; otherwise the output would come out too small. And I have worked on the firmware of a colour scanner, and that used a sensor that physically had 1200 elements per inch.

The numbers are meaningless on a GF because the belts are metric, making exact integer DPI numbers impossible. They are then rounded to the nearest 5 in the GFUI to make them look pretty.
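A rough sketch of why a metric step size implies a non-integer DPI, and what rounding to the nearest 5 looks like. The step sizes below are made up for illustration and are not actual Glowforge values:

```python
def dpi_from_step(step_mm):
    """Exact dots-per-inch implied by a metric step size (25.4 mm per inch)."""
    return 25.4 / step_mm

def pretty_dpi(dpi):
    """Round to the nearest 5, as the GFUI reportedly does."""
    return 5 * round(dpi / 5)

# Hypothetical step sizes: a 0.075 mm step implies ~338.67 DPI,
# which a round-to-5 UI would display as 340.
raw = dpi_from_step(0.075)
shown = pretty_dpi(raw)
```

Because 25.4 mm per inch divided by a tidy metric step rarely gives a whole number, the displayed figure is a cosmetic rounding rather than the true resolution.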

2 Likes

because type and images need to be scaled, and they scale at multiples of 16.

1200 is a multiple of 16, 600 and 300 are not.

You can scale images by any factor using interpolation. 16 doesn’t have any significance.
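For example, even the simplest form of interpolation, a nearest-neighbour resample, works at any scale factor with no multiple-of-16 constraint (a minimal sketch on plain 2-D lists):

```python
def resize_nearest(img, factor):
    """Nearest-neighbour resample of a 2-D list by an arbitrary factor."""
    h, w = len(img), len(img[0])
    nh, nw = max(1, round(h * factor)), max(1, round(w * factor))
    # Map each output pixel back to its nearest source pixel.
    return [[img[min(h - 1, int(y / factor))][min(w - 1, int(x / factor))]
             for x in range(nw)]
            for y in range(nh)]

img = [[0, 1],
       [2, 3]]
out = resize_nearest(img, 1.5)   # 2x2 scaled to 3x3: factor is not a power of 2 or 16
```

Real image pipelines use fancier kernels (bilinear, bicubic, Lanczos), but the point stands: the scale factor can be anything.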

1 Like

Your information needs updating.

Screens are all over the place when it comes to native DPI. The pixel size of my screen is 3840 x 2160. At 72 DPI it would be bigger than 53" x 30", which it isn’t.
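Just to check the arithmetic in the post:

```python
# A 3840 x 2160 panel interpreted as 72 DPI would be a huge physical display.
width_in = 3840 / 72    # just over 53 inches wide
height_in = 2160 / 72   # exactly 30 inches tall
```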

DPIs of printers and screens evolved separately. Canon introduced a 300 DPI laser printer engine in 1983. The first Apple Macintosh in 1984 had a resolution of 72 DPI. Apple used the Canon print engine in the first Apple LaserWriter in 1985. The DPIs were not commensurate, but it wasn’t an issue anyway. In those days you had to use the PostScript interpreter in the printer to do the rendering, because the Macintosh did not have the horsepower for it; the LaserWriter had much more processing power, and a price to prove it. All of this history is in Wikipedia.

1 Like

yeah, there is no real correlation between screen resolution and print resolution. there never has been.

Correct. But we still save images at 72 dpi for screen use; they just end up being very small on those large screens. My argument was more about screen and printer resolutions being governed by 2-based numbers. Otherwise we’d have 16:9 be 4800 x 2700 or something equally memorable.

Most modern OSes have some sort of scaling mode to support high-DPI displays. For instance, the “retina” display on my 27" iMac is impractical at native resolution (even large windows are business-card sized), so they offer a resolution-independent mode: drawing is specified in real-world units like cm or points, so things come out at a human-usable size.