Right now, half of the top 8 new posts are stuck on startup problems

Most of the time it’s not a mechanical issue but a problem with the user’s internet/WiFi connection.

Many providers now push “band steering” and don’t default to separate bands, so users may need to call their providers to verify they can set up a dedicated 2.4 GHz network for the GF. The GF is very sensitive to WiFi issues, and interference from other devices can affect it even if your computer or other devices are just fine, so changing the channel on your network may be necessary to fix that (though often an “Offline” error results from it).

It also helps tremendously to frequently clean the computer & the internet cache. A heck of a lot of information has to pass back & forth using the GF App, and things get “clogged up” significantly faster than before having the GF App.

Also, when things get painfully slow, it’s usually due to Firefox trying to do an update–close your browser and reopen it, or try another browser.

In short, anything in the communication path from the GF to the cloud, or from your computer to the cloud (so it can talk to the GF), can result in stalls or sticking during operation.

I know that. Lots of new users don’t, and they have to come post here to find out.

The app should offer more than this:


Even just a link to a Help article near where it says “Offline” would be an improvement.


You mean like this?


The problem isn’t lack of information, it’s a lack of ability (or willingness) to look for it.


I’m looking at the app right now and that information is not visible. You have to dig for it.

If people are unable to find the information, or even just unwilling to look for it, they still need help. It’s a product, not a religion.

Based on the number of complaints we see, my position is that help should be more easily available. We shouldn’t be gatekeeping and telling people to be smarter.


You click Support at the top of every page, then Troubleshooting. Not sure how much easier they could make it - for this particular issue. There are others that are not so easy to find.


I know you can’t make a horse drink that water, but making it easier to find is helpful.


Isn’t there a couple-thousand-year-old adage to the effect of “give a person a fish and they eat for a day; teach a person to fish and they eat for a lifetime”?

We can’t force people to read, even though they need to, but I do also wish the connection between an error message and the relevant “how to” made it easier to know where to look.

But once you are aware of the few clicks to get to the search tool, it is relatively easy to navigate. Though not as good as sites built by companies with many more years of customer-service experience…

And yes, it’s a product. A very advanced product but with a simplified user interface (compared to other products like it). And as a buyer of this product, I understood that I have to read and learn how to use this product myself. And if something wasn’t working right, where to go to find out why. Digging? Yes, maybe down a couple links, but it’s all there.

And I am not a fan of their marketing, which oversimplifies it. It’s a highly technical machine–especially since it also depends on the internet. Unfortunately, I think providers are defaulting to “band steering” settings that a person has to know to ask to change to a dedicated band, so if your knowledge of IT is not extensive, it is yet another learning curve!

Great, then we agree things could be better.

That is all I am saying.

Things could be better.


The monkey has flown the coop, but 5 GHz should be supported by Glowforge, as well as a physical cabled network.

That being said, given my understanding of how band steering works, it should be a solvable problem for Glowforge: instead of waiting to see a beacon on the network, it should prompt you for an SSID and actively try to connect to it, like a hidden network.

Even in the face of defective implementations of band steering, Glowforge should have a way to connect. If they don’t, it’s really a bad router implementation; that doesn’t help the owner, but the router should be eligible for return, replacement, or even a refund. I’m tired of improperly implemented consumer routers/APs that don’t conform to specs.
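For what it’s worth, “actively try to connect” is exactly how hidden networks are joined on Linux. As a sketch (the SSID and passphrase below are placeholders), a wpa_supplicant entry with `scan_ssid=1` probes for the network directly instead of waiting for a beacon:

```conf
# /etc/wpa_supplicant/wpa_supplicant.conf (placeholder SSID/passphrase)
network={
    ssid="GF-dedicated"
    psk="your-passphrase"
    # Probe for the SSID actively instead of waiting for a beacon;
    # this is how hidden (non-broadcast) networks are joined.
    scan_ssid=1
}
```

Something equivalent in the Glowforge firmware would sidestep the beacon question entirely.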


In theory, not too hard. It already will connect to a hidden network. I have my main WiFi router broadcasting 2.4 GHz, 5 GHz, and a second bonded 5 GHz (one of the new WiFi 6 routers), plus an Apple Time Capsule or whatever it is (an oddball one with a 4 TB HDD, but whatever) that provides a dedicated 2.4 GHz signal just for the GF to connect to. Its SSID broadcast is disabled, but I used the GF just fine a few minutes ago (and that’s how it has always connected).

I assume you just typed in the SSID during the GF WiFi setup?

I think it was broadcasting during setup. I turned it off after it was connected.

I don’t really remember if I tried the initial connection with it off or not, or which one. My original was DOA, and FedEx sent two back from the truck.

They 100% need to spend a few more dollars and put in a better wireless chip. It’s quite sad to see that they haven’t done so already, with all the issues that arise on this forum. Telling users to move closer to their router or to reset it is unacceptable when good chipsets that support all bands + BLE + Linux (which is what the GF is running) are under $10 MSRP. There is no reason a multi-thousand-dollar machine cannot have a proper WiFi connection. Imagine the outrage if your Macbook was using an IoT wireless chip…:scream:

@bansai8creations Band steering should not affect clients that don’t support 5 GHz; it works by responding only to 5 GHz association requests from dual-band clients and ignoring their 2.4 GHz requests.

It also helps tremendously to frequently clean the computer & the internet cache. A heck of a lot of information has to pass back & forth using the GF App, and things get “clogged up” significantly faster than before having the GF App.

What…? This is not how browsers work. They don’t get “clogged up”, nor do they send cached data back to the Glowforge UI. The cache makes websites faster by removing the need to open a connection (the slow part) and download data again (the generally faster part.)
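To illustrate the point, here is a toy sketch (invented names, not real browser internals) showing that a cache only short-circuits repeat downloads–clearing it just forces the slow network path again:

```python
# Toy model of a read-through cache. The cache never "clogs" anything and is
# never sent back to the server; it only skips repeat fetches.

class CachingClient:
    def __init__(self, fetch):
        self.fetch = fetch          # the slow network call
        self.cache = {}
        self.network_calls = 0

    def get(self, url):
        if url not in self.cache:   # cache miss: go to the network
            self.network_calls += 1
            self.cache[url] = self.fetch(url)
        return self.cache[url]      # cache hit: skip the network entirely

client = CachingClient(lambda url: f"body of {url}")
client.get("https://example.com/app.js")
client.get("https://example.com/app.js")   # second call served from cache
print(client.network_calls)                # prints 1
```

Clearing the cache here would only reset `cache` to empty, making the next `get` slow again; it cannot make anything faster.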

In all my poking around building the Material Manager, I spent a good amount of time looking into how data is uploaded to the Glowforge UI. It does a single post with the file you are uploading followed by opening a web socket connection that receives status updates to let you know when the file is ready. This is all very optimized. Clearing your cache will provide no real benefit beyond letting you read more free news articles by clearing the session state for news sites. It will also temporarily free up a few MiB of disk space. In the end, none of this affects your Glowforge anyway.
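That pattern–one upload request, then a socket that streams status updates–can be sketched generically with Python’s asyncio. The endpoints and status strings below are invented for illustration; this is not the real Glowforge protocol:

```python
import asyncio

# Toy model of "one POST, then listen on a socket for status updates".

async def server(reader, writer):
    await reader.readline()                   # the single "upload" request
    for status in (b"processing\n", b"ready\n"):
        writer.write(status)                  # push status updates
        await writer.drain()
    writer.close()

async def upload_and_wait(port):
    reader, writer = await asyncio.open_connection("127.0.0.1", port)
    writer.write(b"POST /upload design.svg\n")  # one upload request...
    await writer.drain()
    statuses = []
    while True:                                 # ...then just listen
        line = await reader.readline()
        if not line:                            # server closed the stream
            break
        statuses.append(line.decode().strip())
    writer.close()
    return statuses

async def main():
    srv = await asyncio.start_server(server, "127.0.0.1", 0)
    port = srv.sockets[0].getsockname()[1]
    result = await upload_and_wait(port)
    srv.close()
    await srv.wait_closed()
    return result

statuses = asyncio.run(main())
print(statuses)  # ['processing', 'ready']
```

The client sends exactly one request and then passively receives updates, which is why nothing in the browser cache is involved after the upload starts.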

@GrooveStranger at this point I’m willing to trade in future passthrough support for $1000 back on my Pro! I only use the pass-through for a USB cable so I can plug it into the onboard serial port… :slight_smile:


Clients that don’t support 5gHz

Sorry, not sure what you mean–by client, do you mean device? My experience and empirical evidence is that band steering did not work successfully with my GF, and after I went back to dedicated networks, I did not have issues again.

I agree the WiFi chip should be more robust because it is that sensitive.

But I don’t buy into the “move closer to the router” either–and I never have, and I was able to resolve my Offline issues with network channel change.

And sorry for using non-technical expressions, but cleaning your computer and internet cache & removing cookies has been espoused by many (including people in IT at multiple companies I’ve worked with) as a good idea that keeps things running well. And it does help the performance of all applications, including the GF App. Again, empirical. Or maybe I just have an older PC (I’ve never liked Apple, sorry). Affecting the GF unit itself, maybe not, but you can’t use the unit without the App.

But my GF App performs better after I do a clean & clear out the crap that browsing can often introduce… and an occasional defrag. And also when Corel is not trying to load its stupid pop-up windows for special deals.

I mean the physical chip that is soldered to the Glowforge board. Think of this as an additional CPU dedicated to certain wireless frequency ranges. Cheaper chips or chips meant for low power IoT devices only have the functionality for limited frequencies. This may be done to save a few dollars or to reduce power consumption. A few dollars is a lot when IoT devices need to be in the $20 range and more circuitry means a higher power consumption which is worse for battery life.

This is the chip in my Glowforge, the one without Bluetooth: http://www.ti.com/lit/ds/symlink/wl1805mod.pdf

For some reason the chip they chose to use in the Glowforge only supports the 2.4 GHz range, which is odd because the monetary and power savings have no real benefit with this type of machine. Band steering itself doesn’t do anything other than prevent dual-band chipsets–a physical chip that supports both 2.4 GHz and 5 GHz–from using the 2.4 GHz range. It does so by only accepting connection requests on the 5 GHz channel, falling back to 2.4 GHz if needed. For devices that don’t know 5 GHz exists, like the Glowforge, there should be no impact.
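The steering rule described above can be sketched in a few lines of Python. This is a simplification–real APs work with probe/association frames and timeouts–but it shows why a 2.4 GHz-only client is unaffected:

```python
# Toy band-steering rule: the AP declines 2.4 GHz requests only from clients
# it knows are dual-band capable, nudging them toward 5 GHz. A 2.4 GHz-only
# radio (like the Glowforge's) is always accepted.

def ap_accepts(request_band, client_supports_5ghz, steering_enabled=True):
    if not steering_enabled:
        return True
    if client_supports_5ghz and request_band == "2.4":
        return False     # steer the dual-band client toward 5 GHz
    return True

print(ap_accepts("2.4", client_supports_5ghz=False))  # True: GF connects fine
print(ap_accepts("2.4", client_supports_5ghz=True))   # False: laptop steered
print(ap_accepts("5",   client_supports_5ghz=True))   # True
```

If a router refuses 2.4 GHz requests from a client that never probed on 5 GHz, that is a broken steering implementation, not band steering working as intended.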

fyi…a one-off observation is not scientific proof that an action has the effect you are perceiving–it’s most likely the placebo effect. You did something to your computer, and now it “must” be functioning better. If you were tracking computer metrics by the minute and doing statistical comparisons with prior recorded numbers, I would be more willing to believe you. This is what we do when testing new software features: A/B testing, blue/green testing, performance testing, etc.

As an aside, I challenge you to find a software engineer who would do anything a typical IT worker suggests. I bet it will be hard, because IT workers suggest a wide range of actions built up over the years, many of which won’t actually fix any problems–but in the rare case you are experiencing one of those rare issues, all their bases will be covered.

Quick example: the help desk suggests rebooting your router. Rebooting a cheap router will likely solve the issue–they frequently run out of memory to store routing tables and may incorrectly implement the standards–while rebooting a high-end router will literally fix nothing.

But I don’t buy into the “move closer to the router” either–and I never have, and I was able to resolve my Offline issues with network channel change.

This will work in areas with high interference from other devices. In a typical suburban neighborhood, I would expect this to have little effect.

Oh, defragging…that is a myth to bust another day.


:slight_smile:. Indeed, I don’t believe I have ever cleared my browser cache and cookies, and I have been using the Internet since before there were browsers. When I was doing tech support, I may have had a collection of busy work I could give people so I’d have time to move on to the next customer. Not that I would ever have sent someone on a pointless Windows reinstallation just to get them off the phone…

Rebooting mine will destabilize my network for 15 minutes while it re-learns the best channels and balances traffic.


@icirellik, I do entirely agree a better & more robust chip is needed and had expressed this too in a bit of a rant over a year ago when I was having my Offline issues.
(And yes, I am aware it’s a chip soldered to the board–I used to work with & audit some KVM switch manufacturers, so I do know something about manufacturing and materials, but I left the choice of components & design of the board to the SW/EE/design folks–except when I had to deal with some failure analysis…)

And I can’t argue with you about why running CCleaner should or should not be a factor, but my “one off” findings are actually from years of empirical observation. I don’t disagree that in theory it should make no difference, but with the systems I run, it has. I suspect more recently there are trackers or cookies or other crap that some sites keep adding (or trying to run in the background) that slow down my browser function on my average (or less-than-average these days) PC–noticeably.

Defrag–with so much in the cloud or remote, I suppose it’s rather useless now & just a habit… back in the day, when you had to have programs loaded on the PC (when everything still had CD readers, before the internet or cloud, and even perhaps as a holdover from systems that used floppy discs), defrag would make a difference on the “average” PC…

I would highly suggest installing uBlock Origin and worrying less about trackers.

This is why I didn’t want to talk about defrag; it’s a long topic.

My comment about defragging is in regard to it being one of the largest computer myths to date. It was at best a Windows myth, perpetuated by disk manufacturers who knew that reading large blocks of sequential data from spinning disks was slightly faster than reading it randomly. This could technically make a computer boot faster or load applications faster, depending on how the files were stored and loaded. It didn’t help that NTFS was really bad at allocating disk sectors and does inherently fragment itself over time.

For typical usage patterns–small files being randomly accessed–it provided zero benefit. This was all before 2000 anyway. Once spinning disks added more platters, with multiple read/write heads, and began to ship with proprietary interfaces, there was no longer a way to defrag a hard disk even if you wanted to. The interfaces no longer allowed an OS to tell the disk where to put the bits; this is part of the reason it is so hard to securely delete data from a hard drive. The onboard controllers handle read/write balancing, caching, and data access outside of the scope of the OS. The OS no longer controls how the data is stored.

Still, none of this actually mattered all that much which is why *nix based OSes (Linux, Unix, Mac (Darwin), BSD) never had a defrag option. Proper file system design and standard usage patterns did not benefit much from such a concept.

Finally, moving back to the present: if you have an SSD, there is literally no reason to defrag. Random access and sequential reading are essentially the same thing, since there is no longer a physical spinning disk, and constantly rewriting data on an SSD shortens its lifespan. Anyway, there is a whole lot more history/information on this subject, and I’d guess the top Google results are likely Windows forums saying to do it. Still, it definitely won’t make anything worse, yet I wouldn’t expect it to make your computer faster. It runs automatically since Windows 7 anyway.

p.s. Suggesting that you have years of empirical evidence does imply you recorded it. Otherwise, it’s just hearsay.

For folks with the knowledge, it might be a minor thing to build an app that would just run the necessary steps for the painfully ignorant to fix specifically Glowforge issues. As I am among the painfully ignorant, I have no idea how hard or easy it would be to just check the settings and offer to fix them, so that the likes of me can click yes and feel brilliant.

I am very lucky to never have had those issues but would be at a loss if I did.

Why would Glowforge choose to put in only the 2.4 GHz chip? I know that some of these hardware decisions were done quite some time ago and therefore would be something that isn’t revisited. Would it simply have been cost? Did they assume that 2.4 GHz would be easier to support without the other band? Just curious as to the decision.

From reading the P&S topics, it seems like an unending torrent of WiFi and cable issues. Maybe in the scheme of things this isn’t a major issue, but from our perspective and from someone who is dealing with a non-functional Glowforge, it is a big issue that shouldn’t be so intractable.