continuing from Raster vs Vector engraving
Things that a single command and control provides for all the zombie lasers with regard to performance metrics:
The old way of things was "hey, we think a tube will last X time depending on Y usage rate." This is pretty much useless, because what constitutes heavy or light usage is purely subjective. If I run my laser at 15% for 1,000 hours and someone else runs theirs at 80% for 1,000 hours, take a guess whose tube has more miles on it.
So we need a better metric, one that aligns more closely with consumption of a depleting resource:
• Time of usage – nope. The laser tube doesn't have an hour countdown; if it did, "depending on your usage" would not be the disclaimer when quoting life cycle.
• Photons emitted (micromoles) – would be darn cool, but I don't know if this is realistic; quantum sensors are in the $1,000 range. When I was setting up my hydroponics I rented one for about $150 for a week. Can't imagine that building in a LI-COR would be cost effective.
• Watts sent to the tube – as with everything complex, there could be a way to just keep a running tally of the watts that have been sent to the tube (not to the system, as we don't care to measure fan or stepper usage). This is raw consumption: X watts have been sent to a tube with a capacity of Y (unknown, because we don't know yet), do the math, and you can see "yeah, I'm getting close to the end of what this tube can shoot out."
• Power level at X duration – if measuring watts is not possible, then measuring the duration spent firing at X power level certainly is. Running for 1 minute at 100% power = the same as 2 minutes at 50% power (this assumes that tube burn-down is linear; we know that power vs. ablation is not linear, but we are not measuring cut performance, just consumption). Once you have that, you can normalize the data to get power used thus far, similar to doing a watts measurement.
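The duration-at-power idea above can be sketched in a few lines. This is a hypothetical illustration only (the class and method names are mine, not anything shipped on a real laser), assuming the linear burn-down stated above, where consumption accumulates as power fraction times firing time:

```python
# Hypothetical sketch: track normalized tube consumption, assuming
# burn-down is linear in (power fraction x firing time).
# TubeMeter / record_fire are made-up names for illustration.

class TubeMeter:
    def __init__(self):
        self.consumed = 0.0  # running total, in "full-power seconds"

    def record_fire(self, power_pct, seconds):
        """Accumulate firing time weighted by the power level used."""
        self.consumed += (power_pct / 100.0) * seconds

meter = TubeMeter()
meter.record_fire(100, 60)   # 1 minute at 100% power
meter.record_fire(50, 120)   # 2 minutes at 50% power
print(meter.consumed)        # both fires contribute 60 full-power seconds
```

Under the linearity assumption both fires consume the same amount of tube life, so the tally comes out to 120 full-power seconds; divide by the tube's (eventually learned) capacity and you have a fuel gauge.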
Now for the power of the command and control to model how long things are going to last, or how to adjust for drift:
• What is being gathered
o The material being cut (be it Proofgrade or user-entered), whether specific or just general classifications; this just depends on how you want your GROUP BY clause and what you want to provide as user-enterable fields
o The thickness
o The power level used
o The speed used
o Maybe a checkbox that says "is this a test/sample cut for calibration?"
• You can then start to create histograms of what settings are used for each material. This can give you things like: for material X we find that 70% of people fall inside this range, 10% over on this side with more power or more speed, or less of each, etc.
• As the tube burns down (uses some life, etc.), users will recalibrate their settings to compensate: something you used to be able to cut at 20% power now needs 30%, or needs 10% less speed, etc.
o Now that we know the consumed life on the tube (above), we can take the histograms of each material and where people typically fall for settings, then create bins: these people have Y consumption, those people have Y*0.5 consumption, and then look at what the normal range of operation is for that material across the different bins (as many as is reasonable)
o Splitting out by consumption, and then by what the normal operating settings are for each material, you can start to project where things may drift. This also has the potential for auto-calibration of settings based on the consumed life of the tube.
• The power of HUGE amounts of user data coming in is that patterns and associations form. This wasn't possible on any of the previously deployed lasers; it was more instinctual knowledge or cheat sheets.
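The histogram-and-bins idea above can be sketched with plain Python. This is a toy illustration, not anyone's actual pipeline: the field names, bin edges, and sample numbers are all assumptions. It groups user-reported cut settings by material and by how much tube life each reporting user has consumed, then compares the typical power per bin, which is exactly the drift you'd project from:

```python
# Hypothetical sketch: bin user-reported settings by material and by
# consumed tube life, then compare typical power per bin to see drift.
# The reports, bin edges, and labels are made up for illustration.
from statistics import median

# (material, fraction of tube life consumed, power_pct used)
reports = [
    ("3mm plywood", 0.1, 20), ("3mm plywood", 0.1, 22),
    ("3mm plywood", 0.5, 28), ("3mm plywood", 0.5, 31),
]

def bin_label(consumed):
    """Two consumption bins; a real system could use many more."""
    return "fresh" if consumed < 0.25 else "worn"

# group power settings by (material, consumption bin)
groups = {}
for material, consumed, power in reports:
    groups.setdefault((material, bin_label(consumed)), []).append(power)

for key, powers in sorted(groups.items()):
    print(key, "median power:", median(powers))
```

In this toy data, the "worn" bin's median power sits well above the "fresh" bin's for the same material; the gap between bins is the compensation users are applying as the tube ages, and it's what an auto-calibration feature could apply for them.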
…or I'm a nut and don't know what I'm talking about.
For reference: my daily database for what I do at work is about 9 TB; that's my configuration database.
My performance-metrics analytics database runs on 144 CPUs and 432 GB of RAM, holds 4.5 TB, and is currently doing analytics on 8.87 million metrics.