Can we use the GF without internet connection?

I know it may sound silly but! I know I am not the only one wondering… Can we just connect the GF directly to the computer and send the jobs to the GF to be printed? (I am kind of new in “the cloud” area)

Nope. You need internet… :confused:

:open_mouth: is there a future possibility of not having to use it???


From all indications… Not really. No. There are no ports on the GF (besides ethernet… I think) so it has to go through the internet.


Scratch that. No ethernet port. Wifi.

This is one of the ways that the cost of a home laser is able to come down so far. All the heavy processing is done by Google’s cloud servers. The software lives in the cloud, just like Google’s search engine. Essentially, we connect to it, it does the work, then it sends that work back to us. Internet is required. But it is (supposedly) going to be light enough, in terms of data transfer, that a dial-up connection, a cell-phone hotspot, or a satellite connection will suffice. You need internet over wifi, but not necessarily broadband or high-speed internet.


Not really sure why this reduces costs. I mean, is there anyone out there using a GF who realistically doesn’t have a modern computer? (Sure, I can imagine some bizarre scenario where you have a high-speed internet connection but no local computer… but really.) And modern computers have enough power to do any of this computation, unless they are doing some absurd level of calculation. I’m doing machine vision in real time on a Raspberry Pi at 30fps at 1024x768, and that’s a $35 computer, for goodness’ sake.
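
To give a sense of how little per-frame work that kind of claim implies, here is a toy sketch (numpy only, with synthetic 1024x768 frames standing in for a camera feed) of the threshold-and-centroid tracking that cheap machine vision often boils down to. Everything here is illustrative; it is not anything from the GF software:

```python
import time
import numpy as np

def make_frame(cx, cy, h=768, w=1024):
    """Synthetic grayscale frame with one bright 10x10 blob centered near (cx, cy)."""
    frame = np.zeros((h, w), dtype=np.uint8)
    frame[cy - 5:cy + 5, cx - 5:cx + 5] = 255
    return frame

def track_blob(frame, threshold=128):
    """Return the centroid (x, y) of all pixels above threshold, or None if none."""
    ys, xs = np.nonzero(frame > threshold)
    if len(xs) == 0:
        return None
    return xs.mean(), ys.mean()

if __name__ == "__main__":
    start = time.perf_counter()
    n = 100
    for i in range(n):
        frame = make_frame(cx=200 + i, cy=300)  # blob drifts right each frame
        x, y = track_blob(frame)
    fps = n / (time.perf_counter() - start)
    print(f"tracked centroid {x:.1f},{y:.1f} at roughly {fps:.0f} fps")
```

Even interpreted Python on modest hardware sustains this sort of loop at camera frame rates, which is the point being made about the Pi.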

I also really, really wish there was an Ethernet port, since that makes it so much easier to integrate into your network (and let’s be honest, the cost is so little it’s silly to even discuss). I’ll have to remember how to reconfigure this thing every time I change the wifi password.


Yeah, it’s quite a bit of a bummer in many ways. But the nice thing is that they can update their interface whenever they like, and there shouldn’t be upgrades we have to install. It also allows the use of tablets (and phones?) for basic operations. Hopefully this will outweigh the inconvenience in the long run.


There is no reason your home PC couldn’t do the exact task the “cloud” is doing for your tablet, trading latency for CPU speed. I mean, the “cloud” is just a computer sitting in someone else’s room. It’s not as if, when you go to print, they recruit hundreds of servers to “compute” your job, I imagine. Yes, this might raise the minimum requirements of the machine, but anyone buying a $2000+ laser cutter probably has a beefy computer, and by beefy, a Core i5 would probably suffice. I imagine any desktop capable of running the current Mac OS or Windows could run the “server”.

I know they show the mom making the magnets, but seriously mom isn’t buying a $2000+ device to make a few magnets for little Julie’s birthday party, so she probably has some other use for the device, and is doing real design work on a PC. The tablet is really a super quick and dirty placing tool. That being said the latest tablets are pretty serious multicore devices themselves…


Agreed. Maybe they’ll eventually offer a version for your PC. The hardware should be able to do it no problem. But they have hinted that you are able to hack the software (at the loss of your warranty and with a large risk)… So maybe that will solve it?

PS Her name was Mia… not Julie. :wink:


It reduces costs in two ways:

  1. As an alternative to making your home PC do the calculations: They save money by only having to program for a single platform, and by having all debug information available to the developers. Porting software around to every random OS option is a logistics nightmare. Yes, they could write things almost entirely in HTML5 or some other language which is platform independent… but I am not aware of any such platform which includes I/O capacity to an external device.

  2. As an alternative to an onboard computation solution (RaspPi or other): They don’t have to include the extra board in the Forge. And the board would have to be a beefy one if it is to handle all of the processing required for the entire interface and cutting operation.


Really you only have to support two desktop OSes (Mac and Windows), and adding an embedded Linux board to run the thing could be an option (say $100). My 3D printer came that way, and to enable network printing on it, you add a Raspberry Pi running OctoPrint (the OctoPi image).

If you made the front end HTML5 and ran the web server on the embedded board, you’re fine. The front end doesn’t do the I/O; the backend does. They obviously have some computer in there already to communicate with the cloud and do the CNC functions. If you made the server RESTful, then you can tack on native clients whenever you want. As for debugging, the thing can upload debug data whenever it has a connection, just like MS Office or whatever.
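
A minimal sketch of that split, using only the Python standard library (all endpoint names and the job format are hypothetical, not GF’s actual API): the board exposes a tiny RESTful job service, and any HTML5 front end or native client just speaks JSON to it.

```python
import json
import uuid
from http.server import BaseHTTPRequestHandler, HTTPServer

JOBS = {}  # job id -> job record; a real board would persist this

class JobHandler(BaseHTTPRequestHandler):
    """Toy RESTful job API: POST /jobs to submit a design, GET /jobs/<id> for status."""

    def do_POST(self):
        if self.path != "/jobs":
            return self.send_error(404)
        length = int(self.headers.get("Content-Length", 0))
        design = json.loads(self.rfile.read(length))
        job_id = uuid.uuid4().hex
        # The backend, not the front end, owns the I/O: from here the board
        # would hand the design to its motion-planning / CNC side.
        JOBS[job_id] = {"design": design, "status": "queued"}
        self._reply(201, {"id": job_id})

    def do_GET(self):
        job = JOBS.get(self.path.rpartition("/")[2])
        if not self.path.startswith("/jobs/") or job is None:
            return self.send_error(404)
        self._reply(200, {"status": job["status"]})

    def _reply(self, code, payload):
        body = json.dumps(payload).encode()
        self.send_response(code)
        self.send_header("Content-Type", "application/json")
        self.send_header("Content-Length", str(len(body)))
        self.end_headers()
        self.wfile.write(body)

    def log_message(self, *args):  # keep the console quiet
        pass

if __name__ == "__main__":
    import threading, urllib.request
    # Demo: serve on an ephemeral port and submit one job to ourselves.
    server = HTTPServer(("127.0.0.1", 0), JobHandler)
    threading.Thread(target=server.serve_forever, daemon=True).start()
    port = server.server_address[1]
    req = urllib.request.Request(
        f"http://127.0.0.1:{port}/jobs",
        data=json.dumps({"svg": "<svg/>"}).encode(),
        method="POST",
    )
    job = json.loads(urllib.request.urlopen(req).read())
    print("submitted:", job["id"], JOBS[job["id"]]["status"])
    server.shutdown()
```

Because the interface is just HTTP and JSON, swapping the server’s location between a cloud host and an embedded board changes nothing for the clients, which is the point being argued.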

I don’t care as much, since I have FiOS and it is super fast, but essentially folks in rural areas are kind of stuck…

Not just two desktop OSes. You also must support Linux, and multiple versions of Linux at that. And while iOS and Windows have now made desktop and mobile “the same”, you also need to accommodate Android, so make that a fourth.

And you must update your program when any of them update their program in such a way as to invalidate yours. And you may at some point be forced to become an app instead of a program, so then have to go through validations. Or you may be reliant upon a framework, which may update and force you to make changes to fit to that.

And whatever frameworks you have to use will have their own communication protocols which you are slave to, and may cause issues.

And no, you cannot simply upload debug data whenever, since many of the people with issues are running the Forge and its attached computer completely offline. And they may be running an old version of the software, or an old version of the firmware, and that version may not support the debug information you actually need for the problem they are having. That problem is most likely known and solved in the latest firmware, but you cannot say for sure, because they are on an old firmware.

And for those who do supply the debug information, you cannot link the information to the printer reliably, because it is decentralized. And you need to run a server of your own in order to be receiving and storing all of this debug information you are gathering from all the various firmware/software/os combinations out there. And you need to be indexing that as well so you can search it. And running heuristics on it so you can identify patterns which may indicate a predictable fault.

As long as you have a computer somewhere doing the number crunching to solve things down to G-code, then the board on the printer has basically no requirements at all, which is what 3D printers and the Forge have. I am rather certain (but not absolutely certain) that the $100 board on your printer doesn’t do the slicing. If you are just tossing it G-code, then that is the same as any 3D printer out there, and the same as the Forge already has. It is the computation power/capacity of the slicer equivalent that ramps up the cost (albeit not too much), and the tech requirements of the user (being able to update firmware when needed; this is the major sticking point). Doing the slicing/vector calculations on the board also places some restrictions on input file types. The cloud has the same limitations, but can be expanded universally as the company sorts out new methods.
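
For concreteness, “solving things down to G-code” just means turning geometry into move commands the dumb board can execute. A toy converter (the feed rate, power value, and M3/M5 laser on/off convention follow common hobby-CNC practice and are illustrative, not the Forge’s actual dialect):

```python
def polyline_to_gcode(points, feed_rate=1000, laser_power=80):
    """Turn a list of (x, y) vertices in mm into naive G-code:
    rapid-move to the start, laser on, feed through each vertex, laser off."""
    if not points:
        return []
    x0, y0 = points[0]
    lines = [
        f"G0 X{x0:.3f} Y{y0:.3f}",  # rapid move to start, laser off
        f"M3 S{laser_power}",       # laser on at the given power
    ]
    for x, y in points[1:]:
        lines.append(f"G1 X{x:.3f} Y{y:.3f} F{feed_rate}")  # cutting move
    lines.append("M5")              # laser off
    return lines

if __name__ == "__main__":
    square = [(0, 0), (50, 0), (50, 50), (0, 50), (0, 0)]
    print("\n".join(polyline_to_gcode(square)))
```

The hard, resource-hungry part is upstream of this: parsing arbitrary design files and planning toolpaths. Once that is done, streaming lines like these is trivial for any microcontroller, which is why the board itself can stay cheap.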

It is 100% possible for GlowForge to have implemented a non-cloud approach. But it would have increased physical cost a small degree, and would have mandated more programmers on staff, and pushed back timelines on implementation of new functionality. It does boil down to a judgement call in the end though. There are reasons for going cloud, and plenty of reasons against it. Which is why typically the only real response out of Dan is… if this is a make or break issue for anyone, they are sad to see you go, but ready to issue a refund.


@jacobturner is quite correct.


Being a developer of both cloud-based and non-cloud-based mission-critical software, where lives hang in the balance, I must disagree on debug. I will tell you we get way more problems with people’s network connections or browser settings/versions than we ever do on native software. At the moment GF is debugging with a bunch of nerds (folks like us), but when you roll out to the public, it’s not the same. You can say “you can only run on browser X”, but that of course is a non-starter, since grandma has no idea what browser she is running because it just comes with the computer (no idea why grandma needs a laser cutter, but maybe for her quilting club?). The 3 AM debugging of some doc’s firewall/browser settings is a hoot, let me tell you.

As for platforms, you still have to do this. Anybody who develops browser-based apps knows that Safari for desktop is not at all the same as Safari for mobile, and the same goes for Chrome; the various Firefoxes are in a world of their own (particularly on Linux), and of course pre-Edge MS browsers induce nausea in most developers. So of course you use a framework, just like on a desktop app, because that’s the only sane way to develop now.

As for running analytics, you still need to do the exact same thing!? I mean, the data is the same (platform of user, printer info, etc.). Yes, there are patterns of errors in slicing, but I am going to make a wild guess again that more errors will come from networking issues (firewall settings blocking the protocol, which the user has no idea how to adjust on their Comcast router) than from a mis-slice in the base software. Of course the analytics are cloud based (Elasticsearch or whatever), so the client just uploads log data periodically.
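
That periodic-upload pattern works the same whether the software is native or cloud-hosted. A sketch of a client-side batcher (the `send` transport is a stand-in for whatever the real analytics endpoint would be; nothing here is GF’s actual telemetry):

```python
import json
import time

class LogUploader:
    """Buffer log records locally and flush them in batches,
    so the client tolerates a flaky or temporarily offline network."""

    def __init__(self, send, batch_size=10):
        self.send = send            # callable taking a JSON string; stand-in transport
        self.batch_size = batch_size
        self.buffer = []

    def log(self, event, **fields):
        self.buffer.append({"ts": time.time(), "event": event, **fields})
        if len(self.buffer) >= self.batch_size:
            self.flush()

    def flush(self):
        if not self.buffer:
            return
        try:
            self.send(json.dumps(self.buffer))
            self.buffer = []        # only drop records the transport accepted
        except OSError:
            pass                    # network down: keep buffering, retry later

if __name__ == "__main__":
    sent = []
    uploader = LogUploader(sent.append, batch_size=2)
    uploader.log("slice_start", platform="linux")
    uploader.log("slice_done", seconds=4.2)
    print(len(sent), "batch(es) uploaded")
```

The server-side indexing and heuristics described earlier in the thread sit behind `send` and are identical in both architectures; only where the records originate differs.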

Not sure why doing the slicing on the board produces any limitations from a file type standpoint (they are computers?).

Modern higher-end embedded boards have the processor power of most low-mid range laptops. Yes, it might take a few more seconds, but that’s really the difference, and depending on your network speed (remember many folks have DSL) that may be canceled out by latency.

You update the boards like every computer on earth with some server hosted updating system.

The biggest downside to cloud, though, is that like all those games that are now paperweights because the company stopped maintaining the servers, if GF either decides to deprecate the product or goes out of business (gosh I hope not, and I can’t wait for the GF MkII), we are all then stuck with a giant useless box. In the “what are your tools” thread there are people running 1980s tools that still work, but a cloud-connected box that has no cloud is dead. Think about the fact that we needed a change to the DMCA to even be allowed to try and revive those services, which should be a serious concern. Not saying, @Dan, that you guys won’t be awesome corporate partners with all of us, and you are well funded at the moment, but anyone who bases a business workflow on this tool should have that concern as due diligence.


Have to admit, this is the single factor that aaaalmost made me not buy one. It still makes me crazy nervous. I have really annoying network issues often, and am also about to start moving around to different parts of the world. The general trend of not being able to use an item that is sitting right there next to me, because my connection is spotty… it’s maddening.


You are correct, many of the problems are problems either way (I wasn’t completely explicit about that, but attempted to state as much). Debugging is always an issue and a headache (well, for some of us that rephrases to “exquisite joy”, but we are freaks). The reason centralization is an advantage is that you can add callbacks and logging to isolate issues, and query functioning devices for information to rule out various scenarios.

Yes, you will still get networking issues. I didn’t say it makes fewer bugs, just clearer/easier debugging process.

And yeah, you still need to accommodate a few multi-platform type of issues, but almost entirely GUI level, and with plenty of cookie-cutter approaches available. “You still have to do this” is true, but the level of work required is sufficiently different it is worth noting as a minus on the non-cloud side (which again… there are likely few true differences, and in the end it is a choice of the company).

The analytics… yes, my entire point was that you do still need to do the same thing. Which weighs against the non-cloud side heavily. Because the primary advantage (business side, not consumer side) to non-cloud is not having to maintain server hardware. But if you want the same level of debug capability, you wind up having to run basically the same level of cloud capacity.

The reason that the slicing on the board has limitations (I did specify in the cloud you have the same limitations) is because only so many file types will be coded to work at any given time. Some because they simply do not exist yet (who knows what revolutionary random CAD replacement/wannabe is around the corner). Adding new support to a cloud, trivial. Adding new support to a distributed firmware… ought to be trivial, but PEBKAC.

Yes, higher-end embedded boards do have solid power. They are even getting to the point that they have enough memory that you can start to ignore it as a limitation (on most of them). But those are higher-end boards, which means (slightly) more cost to the product. And again… tradeoff, decision, etc. I will say that I have run a slicing job which took 8+ hours, and the difference in speed between an embedded board and a desktop with top-end graphics cards in that particular case would be astronomical (and I know of no embedded board with the memory to have pulled it off). This is of course an edge case, and unique to 3D printing (i.e., I cannot think of any laser job with a similar issue, unless of course someone does a UHD fractal… and even then it’s not too likely).

GlowForge has committed to releasing us a basic firmware, in the event they do collapse somehow. And of course at the end of the day, this machine is a bundle of stepper motors and power controllers, all of which are easily controlled by myriad approaches. So it will never be a paperweight, you will just need to wait for (or participate with) an open source movement which comes in to pick up the pieces. I would say that if the GlowForge cloud system were to go down, it would be less than a month until someone has Raspberry Pi type of control swapped into the Forge, and within a year that someone manages to get the native board functional on some level (though the swapped board approaches will certainly be far superior to any modified firmware approach).


I, too, have sporadic network issues and one of my biggest concerns is the cloud-only nature of the software that drives a GF laser cutter/engraver. For me, the promise of “basic firmware” is not much comfort and I really don’t buy the cost argument that cloud is significantly cheaper than a stand-alone application.

Both my CNC mill and lathe and 3D printer seem to do fine with free or low cost stand-alone control programs and the 3D printer’s proprietary software supports both Mac and Windows.


There is one big advantage to having the software “in the cloud.” You write it once and you don’t have to worry so much about the operating system. All the time they save NOT worrying about Windows 8.1 vs Windows 10 vs Mac OS X vs Linux is time they can spend improving the core features that we all benefit from. I’d rather have them upgrading a feature than spending time debugging a USB chipset issue, you know?

I have some very real concerns about driving this beast from the cloud, but it absolutely provides us with some potential benefits as well.


I too was concerned at first about the cloud-only approach before I purchased, specifically if something were to happen to the company. (I currently own a now-defunct OnLive system that props a door open.)

The GF team has been pretty open about wanting the system to carry on. If the company ever went in a different direction, third parties have seemingly been welcome to pick up the torch and write open-source software for it, from what I understand.

With that security in mind, I’m kind of glad they went with the cloud. I don’t even want to think about how much longer the pre-order wait would be if they were porting a desktop app.

As the company grows, so will the apps and tools to go with it. Who knows, one day we might have a universal app suite with localized data and cloud syncing like Evernote or something.
