Trace, Resize & Quick peek of the Glowforge UI


#1

Figured it would be better to break this out into its own thread.

Once again, Thanks @dan.


#2

Can’t freakin wait lol


#3

Thanks @dan and thanks to @karaelena and the Talkshow gang for being so dedicated.

My observation, having spent the last week learning the ins and outs of the Gcode extension for Inkscape (or using JScut and sending the results to ChiliPeppr for a simulation), is that the Glowforge puts it all together in one interface. I know you can work within some of the CAD programs that speak directly to the machines.

Question: how different is the Glowforge in this process?

I myself am fairly enamored of the fact that you can resize so easily and that the scan to vector splits in the middle rather than doing the edges (might be my poor understanding of Inkscape that gets in the way).


#4

The ability to drag & drop copies around a live view of the material is killer.


#5

Absolutely. Every scrap of material can be used without the risk that alignment is off and part of your cut lands off the material. Did I mention that I’m a scrap hoarder?


#6

It’s the ominous Glowforge face-eye… to be feared by all. Great video… :grinning:


#7

When he started drawing I thought he was doing the Kilroy sketch from another thread :slight_smile:


#8

Thanks for sharing this video Dan. Does the Glowforge UI display any indication of final scale of objects as you shrink and expand scanned items?


#9

The most impressive thing about this demo is how little Dan’s art skills have improved since he first started demoing the Glowforge. :smile:


#11

That face should be the mascot. With the little face in the eye. It will be iconic and terrifying.


#12

His video skills have improved a ton though. Witty banter on camera?! Being entertaining and informative?! Today I learned sometimes Dan is silly.


#13

I’m not quite sure I understand?

There are rulers across the side and top

They appear to actually be degrading.

Apparently you missed the video where I trebucheted cows in slow motion earlier.


#14

I absolutely did. I’m going to fix this oversight asap.


#15

I actually think they’ve gotten worse - jkjk

EDIT: [quote=“dan, post:13, topic:2876”]
They appear to actually be degrading.
[/quote]

He already noticed this lol :wink:


#16

From my limited experience going from design to machine, the normal step for CNCs is to get the design into gcode. In some cases the design software writes the gcode via a plugin or extension. In other cases there is a dedicated program that interfaces with the machine and turns the design into gcode. So my workflow is Inkscape > Gcode Tools Extension > ChiliPeppr > CNC simulation. It has been a challenge to understand all the parameters each step along the way throws at me. For example, with a CNC router I had to understand how the bit profile works. It wasn’t clear what the cut angle referred to until I understood it was about whether the bit plunges straight in at 90 degrees to the surface without moving, or ramps down gradually at an angle. It took me a while just to appreciate all the parameters in ChiliPeppr.
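The plunge-vs.-ramp distinction can be sketched in a few lines of gcode-emitting Python. This is a hypothetical illustration, not part of Inkscape’s Gcode Tools or ChiliPeppr — just the trigonometry behind the “cut angle” parameter:

```python
import math

def entry_moves(target_depth, feed, ramp_angle_deg=None):
    """Return gcode line(s) for entering the material.

    ramp_angle_deg=None -> straight plunge: Z drops with no XY motion
    (the bit enters at 90 degrees to the surface).
    Otherwise the bit ramps down along X at the given angle to the surface.
    All distances in mm, feed in mm/min.
    """
    if ramp_angle_deg is None:
        # Straight plunge: only Z moves.
        return [f"G1 Z{-target_depth:.3f} F{feed}"]
    # Ramp entry: travel far enough in X to reach full depth at the angle.
    run = target_depth / math.tan(math.radians(ramp_angle_deg))
    return [f"G1 X{run:.3f} Z{-target_depth:.3f} F{feed}"]

print(entry_moves(3.0, 100))                      # → ['G1 Z-3.000 F100']
print(entry_moves(3.0, 100, ramp_angle_deg=45))   # → ['G1 X3.000 Z-3.000 F100']
```

A shallower angle just means a longer run for the same depth, which is why ramped entries need clearance along the toolpath.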

One aspect of Glowforge that I know is different is the cloud-based UI. The Chinese K40 folks all seem to bemoan the software that ships with their machines. It seems that quite a few keep some venerable Windows XP machines up, dedicated to their lasers. Then there is the perennial question: will it run on a Mac? Linux? Windows 10? This is a major selling point of the Glowforge for me. I don’t want to be tied down to any architecture or operating system.

Mind you, this is all simulation for me since I don’t have a CNC yet. I have started on the electronics of a CNC router and have the motors, a TinyG, and a power supply. So I’m just spinning tape at the moment!


#17

Something I noticed (could be the angle or quality of the video too) is that when @dan placed the little face inside the first eye, it was cut much farther left than where it was placed in the image. Are there rulers on the bed for material alignment? Is there some kind of calibration stage to make sure the image/camera placement matches the actual cut placement? How do you handle cutting when precision placement is required?

For example, I have an iPhone that’s a one-shot deal; I can’t screw it up. On a Trotec I’d align it with the right and top guide rails and place the engrave 4mm in from both edges to ensure correct placement. Is something similar possible with the GF? Eyeballing with camera placement is a cool gizmo, but by-the-numbers placement is often more valuable.
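The by-the-numbers placement described above reduces to simple arithmetic once the part is butted against known rails. A minimal sketch — the helper, coordinate convention, and 700mm bed width are all assumptions for illustration, not any machine’s actual API or dimensions:

```python
def engrave_rect(bed_width, part_w, part_h, margin):
    """Absolute corners (left, top, right, bottom) in mm of an engrave
    rectangle for a part of size part_w x part_h butted against the
    right and top rails, with the engrave inset `margin` from every
    part edge. Origin: left rail is x=0, top rail is y=0, y increases
    downward.
    """
    left = bed_width - part_w + margin   # part's left edge, plus inset
    right = bed_width - margin           # inset from the right rail
    top = margin                         # inset from the top rail
    bottom = part_h - margin             # part's bottom edge, minus inset
    return (left, top, right, bottom)

# A 70 x 140 mm phone on an assumed 700 mm bed, 4 mm inset all around:
print(engrave_rect(700, 70, 140, 4))   # → (634, 4, 696, 136)
```

Because every number is derived from the rails rather than from a camera image, the placement repeats exactly run after run.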


#18

Haha if there’s not, you can bet this will be the first thing I cut for my glowforge.


#19

Dan said there are rulers along the top and side.


#20

I think you may have missed the point of his post. Regardless of what the rulers say, if you place an image on the camera-captured image of the material, you would expect it to cut there, which it apparently did not.


#21

This leads me to wonder how we can reliably align a cut/engrave session with an existing feature on the material being cut. With the lasers I’ve used in the past, I had a little red dot that I could assign to be the origin. I’m now wondering if that approach is more accurate than using a camera image for alignment.