Why would Glowforge choose to include only a 2.4 GHz chip? I know some of these hardware decisions were made quite some time ago and aren't likely to be revisited. Was it simply cost? Did they assume 2.4 GHz would be easier to support without the other band? Just curious about the decision.
From reading the P&S topics, it seems like an unending torrent of WiFi and cable issues. Maybe in the scheme of things this isn’t a major issue, but from our perspective and from someone who is dealing with a non-functional Glowforge, it is a big issue that shouldn’t be so intractable.
I’m not sure 5 GHz is much of an answer to anything, especially if we don’t know why people are having trouble with their 2.4 GHz networks. It has less range and now you have more complex antenna considerations and the software needs to support switching between bands.
IMO a wired ethernet port would have been a much more sensible investment. Blaming all these problems on the WiFi chip (rather than the software or firmware or antenna design or …) seems a stretch to begin with, let alone believing it’d work better if they’d just used a Broadcom part instead of a TI part.
All WiFi is terrible. I have a collection of IoT and hobbyist electronic devices, where I get to see just how terrible. In addition to the one I described in another thread that connects to the weakest base station, I have several that try to connect on startup. If that doesn't work, they hang, and if their connection is working but drops, they'll never come back until they're restarted. This is the kind of mistake that is very easy to make in software, and that can usually be fixed in software. I have exactly zero knowledge or evidence of such problems with the Glowforge, but I've been fighting this type of stuff for a while, and in my opinion it's usually not as simple as "weak radio signal: make stronger", but rather about how the device reacts to fluctuating signal, interference, and the interaction between implementation bugs.
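To make the "hang on first failure, never reconnect" failure mode concrete, here's a minimal sketch of the retry loop these devices are missing. Everything here is hypothetical: `connect()` stands in for whatever driver call the firmware actually makes, and the timing constants are illustrative. The point is simply that retrying with capped, jittered exponential backoff is a small amount of code compared to the bug it prevents.

```python
import random
import time

def connect():
    """Hypothetical stand-in for a real WiFi association call.

    Here it simply succeeds about half the time, to simulate a
    flaky radio environment.
    """
    return random.random() < 0.5

def connect_with_backoff(max_delay=60.0, base=1.0, sleep=time.sleep):
    """Retry forever with capped exponential backoff instead of
    hanging after the first failed attempt.

    Returns the number of failed attempts before success. A real
    firmware would also run this same loop whenever an established
    connection drops, rather than waiting for a power cycle.
    """
    attempt = 0
    while True:
        if connect():
            return attempt
        # Double the wait each time, but never exceed max_delay.
        delay = min(max_delay, base * (2 ** attempt))
        # Jitter the wait so a fleet of devices doesn't retry in lockstep.
        sleep(delay * random.uniform(0.5, 1.0))
        attempt += 1
```

The same loop doubles as the "connection dropped" recovery path, which is exactly the case the devices above get wrong.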
Thank you for the perspective. I'm not sure what you would call this from a cognitive fallacy perspective, but it seems that we all presume that people are intelligent, equipped, and benevolent actors: they are smart enough, have unlimited resources, and act altruistically. Entropy wins unless you throw all three of these things in.
Couple this with the presumption that all hardware or software problems are easily solved if you just throw enough of the actors from my first statement at them.
Yes, we can work through all these issues. Please be patient while we mop up after ourselves.
Once again, we always have two issues with support requests: the problem and the client’s reaction to the problem. A good institution can’t neglect the hand-holding necessary for customer success.
We don't have any perspective on how serious or systemic the issues are. From here they seem monumental, catastrophic, intractable. Is it bad to use 2.4 GHz alone? Well, as you said, the alternative might not be any better.
I'm more curious as to why a wired NIC wasn't standard.
Honestly, having a 2.4GHz + 5GHz chip would likely have solved the problems. It would work out of the box with people’s home routers, it supports higher bandwidth, and would stop the standard response on this forum from being “make sure you have a 2.4GHz network”…
Glowforge users are not engineers.
I just don’t understand why they chose such a low-end chipset on such an expensive device…inexperienced engineers?
Someone needs to spend a day and solder a hat onto the PCB? We know there is a spot for it.
Indeed. One more factor: that people actually do the troubleshooting steps you ask. It's also so much easier when you have the option to change things and see what happens (I recall a recent forum post where someone was temporarily in possession of two carriage plates and it was rather obvious to determine that the fan on one was broken).

Turning the router on and off is probably achievable for most, but what about moving it? So many visits to relatives for the holidays where the WiFi only worked in half the house (and inevitably not including the guest room). Oh look, the router is some POS supplied by the cable company and it's behind the TV with a 3 inch cord. I wish I could tell people with WiFi problems to go to Target and buy a Nest WiFi or Eero 3 pack and see if it's still broken. Or one of these like I have. Blankets the whole house with the bonus that birds fall from the sky fully cooked.
I don’t know what possible thought process would lead to leaving out the ethernet port. I’m thinking the same Jony Ive aesthetic that gave us one button instead of a screen that could say what’s wrong.
Thanks for the insights. Once upon a time defrag helped, but decades later it's no longer applicable (glad I only did it a handful of times in the last decade, then).
And agreed, anecdotal evidence is not statistical evidence. But it shouldn't be dismissed out of hand if you're studying user experience with a product in the real world. If you're running a designed experiment, of course, it isn't meaningful.
It is certainly a fallacy to expect this, along with the fallacy of magical leaps: having a clear notion of the destination but no idea of the means to get there, or of the built-in paradoxes that mean you cannot choose only the "good" stuff.
However, entropy takes energy to oppose, and I find that the best place to put your energies. "Best" cannot be defined by any scientific method, which only shows a path to results; the definition of best falls to the deepest part of your primary occupation, a part given very short shrift by most folk no matter their religion or lack of it. I regard "best" as a matter for logical discussion, not a catechism.