On the limits of complexity in the GF interface

Has anyone ever seen a list of the absolute hard limits, or at least recommended limits, for complexity of raster and vector files in the user interface?

Coming from the world of vinyl cutters, I have a Cricut Explore (which uses a very similar “you must be online to cut” cloud-based system). I was a fairly early adopter, and when it was new, there was a hard limit on the number of nodes you could have in an SVG, beyond which the attempt to cut would fail. The good news is that they improved their server-side software, which allowed for far more complicated cutting jobs.
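
In case it helps anyone gauge whether a file is pushing its luck before uploading: a crude proxy for “node count” is the number of path commands in the SVG. A minimal sketch using only the Python standard library; the one-node-per-command heuristic and the design.svg filename are placeholders of mine, not anything either vendor publishes:

```python
import re
import xml.etree.ElementTree as ET

def count_svg_nodes(filename):
    """Rough complexity estimate: one 'node' per path command letter."""
    total = 0
    for el in ET.parse(filename).iter():
        if el.tag.endswith('path'):  # tags are namespaced, e.g. '{...svg}path'
            d = el.get('d', '')
            total += len(re.findall(r'[MmLlHhVvCcSsQqTtAaZz]', d))
    return total

print(count_svg_nodes('design.svg'))  # placeholder path
```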

Also, I wonder if anyone at Glowforge has ever announced improvements in this area that are in the hopper. I see lots of posts about long preparation times and failed cuts, so I can only imagine it’s being worked on.

2 Likes

They have not published anything that concrete, but I wouldn’t expect them to, as it’s always changing.

I know they can do much better on raster than we are currently seeing; early in the PRU program we were doing really large photoengraves that ran well in excess of 3 hours.

2 Likes

Not so far. But I believe that’s because it’s not just one issue - there are several factors that can contribute to a file hanging up.

Dan has mentioned several times that they’re working to improve the limits, but no telling how far they have to go…we’re still only a couple of months into the release, and I suspect they’ve been focusing on getting these things built and shipped out during that time.

3 Likes

They haven’t published any limits that I am aware of. I even went so far as to ask (Limits of GFUI for rendering?), but all I got in response was “here are some ways you can probably work around the problem”.

1 Like

I don’t know what the limits are, but I’ve run into them once with a fairly complex vector image that had way too many nodes. I simplified the vector without losing any detail, and it loaded just fine after that.
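
For anyone curious what “simplifying the vector” means mechanically: one common approach is the Ramer-Douglas-Peucker algorithm, which drops nodes that sit within some tolerance of the straight line between their neighbors. Here’s a minimal sketch; the sample points and epsilon are made up, and real tools (Inkscape’s Path > Simplify, for instance) handle this for you, though not necessarily with this exact algorithm:

```python
import math

def rdp(points, epsilon):
    """Ramer-Douglas-Peucker: recursively drop points that lie within
    epsilon of the chord between a segment's endpoints."""
    if len(points) < 3:
        return points
    (x1, y1), (x2, y2) = points[0], points[-1]
    chord = math.hypot(x2 - x1, y2 - y1)
    max_dist, index = 0.0, 0
    for i, (px, py) in enumerate(points[1:-1], start=1):
        if chord == 0:
            dist = math.hypot(px - x1, py - y1)
        else:
            # Perpendicular distance from (px, py) to the chord
            dist = abs((x2 - x1) * (y1 - py) - (x1 - px) * (y2 - y1)) / chord
        if dist > max_dist:
            max_dist, index = dist, i
    if max_dist > epsilon:
        # The farthest point is significant: keep it and recurse on both halves
        return rdp(points[:index + 1], epsilon)[:-1] + rdp(points[index:], epsilon)
    return [points[0], points[-1]]

# A slightly noisy straight line collapses to its two endpoints:
pts = [(0, 0), (1, 0.05), (2, -0.04), (3, 0.02), (4, 0)]
print(rdp(pts, epsilon=0.1))  # [(0, 0), (4, 0)]
```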

1 Like

Cricut released a major update about a year in, as I recall. I’ve not had any failures since they upped their limits, and I’ve routinely sent vectors with 10,000+ nodes.

It’s definitely not apples to apples, and I’m sure that Glowforge has a lot of issues that are higher priority than this at the moment. I was just curious if there were published guidelines, and it sounds like “no”.

Thanks to everyone for chiming in.

(PS to anyone from Glowforge, if such guidelines do exist, publishing them would be awesome.)

3 Likes

Unfortunately, there are no simple guidelines, or we’d share them.

1 Like

I imagine this will be an obvious area for GF to upgrade in the future. I can’t think of any technical reason why the server couldn’t break up a larger design and spool it to the cutter microcontroller in batches. It’s probably just not at the top of their priority list right now.
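
For what it’s worth, the batching idea is easy to sketch. Everything below is hypothetical, the chunk size, the controller interface, and the FakeController stand-in included; it just illustrates slicing a long motion plan and feeding it to a small device buffer as it drains:

```python
from collections import deque

CHUNK_SIZE = 4096  # pulses per batch; value is made up for illustration

class FakeController:
    """Stand-in for the cutter microcontroller: a small fixed buffer
    that 'executes' (discards) pulses when asked to drain."""
    def __init__(self, buffer_size=10_000):
        self.buffer_size = buffer_size
        self.buffered = 0

    def free_space(self):
        return self.buffer_size - self.buffered

    def send(self, chunk):
        self.buffered += len(chunk)

    def wait_for_drain(self):
        # Pretend the device executed one chunk's worth of pulses
        self.buffered = max(0, self.buffered - CHUNK_SIZE)

def spool(motion_plan, controller):
    """Feed an arbitrarily long motion plan to a small buffer in batches."""
    pending = deque(motion_plan[i:i + CHUNK_SIZE]
                    for i in range(0, len(motion_plan), CHUNK_SIZE))
    while pending:
        if controller.free_space() >= len(pending[0]):
            controller.send(pending.popleft())
        else:
            controller.wait_for_drain()

spool(list(range(1_000_000)), FakeController())  # a million 'pulses', no limit hit
```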

Forget the future, we’re working on it at present. 🙂 Unfortunately it’s blocked by some large architecture changes (like support for multipart pulse files), so it hasn’t been a quick fix.

9 Likes

That’s great to hear. Sounds like November is going to be a great month for you guys.

Any update? I mean, I run into this: [screenshot of an error dialog]

Or the one that says it’s too complex / too big / too fill-in-the-blank, with something telling me I can’t print. It’s just hard to make interesting work that fills entire sheets for giveaways at work without bumping into the limits of the software over and over.

I’m in software. I understand all the trade-offs of cloud vs. local, and I understand that rearchitecture takes time to get right while you’re running a service at scale for customers. But I also know I have a $10k Mac Pro maxed out on memory, storage, and compute cores sitting next to the printer that can’t be taken advantage of, and a laser controlled by nothing more than pulse, move left/right/away/closer, and focus commands that I can’t drive because something somewhere didn’t scale appropriately (not enough RAM, not enough compute, not enough storage) to satisfy my need to print a given design. That makes me sad and somewhat angry as I simplify, or cut down on the number of copies of something I’m trying to “print”, just to meet the arbitrary limits of software that is outside my control.

I don’t think your software is going to be where you make money, and I don’t understand why it has been and will continue to be closed source. You’ll make money on supplies and break even on hardware; you don’t even have a model to charge for the software. Going closed source, without a plugin architecture or a way for someone with time and energy to contribute to you for free, seems like an odd decision in these times.

Again, you know your business better than I do, but the day-to-day failures to scale to what I consider a moderate set of instructions for the laser’s simple brain are disappointing to me.

2 Likes

Most of these are due to running into a memory limit related to the length of a cut/engrave, because of the way the motion plan is handled. There are other factors, but the limit is usually somewhere north of 3 hours. We can lower the lines per inch for an engrave, do other workarounds, or wait for a fix. The original plan was to load the buffer, execute, and then reload with a short pause in between, creating a seamless result. It seems they don’t yet have the part where it can continue an operation after the buffer has reached capacity. There could be a lot of reasons for that, but as far as we know the plan hasn’t changed.
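
To make the lines-per-inch workaround concrete: a raster engrave does one horizontal pass per line, so engrave time (and motion-plan size) scales roughly linearly with LPI. A back-of-the-envelope sketch; the 10 in/s head speed is an assumption for illustration, not a published spec:

```python
def engrave_passes(height_in, lpi):
    """One horizontal pass per raster line."""
    return int(height_in * lpi)

def rough_time_hours(width_in, height_in, lpi, head_speed_ips=10):
    """Crude estimate: total head travel divided by speed, ignoring acceleration."""
    travel_in = engrave_passes(height_in, lpi) * width_in
    return travel_in / head_speed_ips / 3600

# For a 10" x 8" engrave, halving the LPI roughly halves both the
# run time and the size of the motion plan:
for lpi in (450, 225):
    print(f"{lpi} LPI: {engrave_passes(8, lpi)} passes, "
          f"~{rough_time_hours(10, 8, lpi):.1f} h")
```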

2 Likes