Model 3 vs. Glowforge

Or sued out of existence since it’s a manufacturer decision.

Unless law is passed absolving liability… and that’s another slippery slope.

2 Likes

Now let’s see one taking us to grandma’s house in Minnesota in December at night. :slight_smile:

1 Like

I think that self-driving cars, until everybody understands them, should identify themselves in some way. I will probably freak out the first time I pass a car with no driver in the seat, or one where the driver is asleep behind the wheel (even scarier: they might jerk awake and grab the wheel, hit the brakes, or do some other reflexive thing before fully waking up).

They basically need to earn my trust just as much as the person in the car does, as far as I’m concerned.

Your mileage may vary :slight_smile:

4 Likes

Something like this?

2 Likes

Watching the 3 cameras on the right is pretty neat! Interesting to see different things light up differently for the computer’s vision.

Liability is pushed onto the operator of the car. They’re sitting at the wheel, they can take over control at any time, and every time the self-driving features are activated they are required to agree before the features will turn on.

I’m not against self-driving cars, but operators need to keep themselves alert. People already don’t take driving seriously at all, they have no emergency maneuvering skills, they’re essentially cluelessly riding around in 4,000 lb weapons, and self-driving cars only accelerate that decline.

2 Likes

This has nothing to do with the technology existing or being ready to deploy. This is a societal issue that may impact the implementation of the technology, but has no impact on the technology existing.

The first is obvious. New drivers might react in unexpected ways so of course we drive differently around them. The second is also obvious, most of the time we are breaking the law when behind the wheel, and we are ever-fearful of a ticket and the increased insurance costs associated with it.

From the second sentence on, this paragraph has nothing to do with the first sentence. A self-driving vehicle would do none of those things, except travel at the speed limit. Also, you do realize that if someone is traveling within the limits of the law (i.e. driving the speed limit), you are the one in the wrong for trying to get them to move out of your way, right? Flashing lights, honking horns, tailgating, etc. are all misuses of safety features and/or unsafe driving techniques that can put people at risk. For the record, I also hate it when people are going slow in front of me, particularly when I have no way to go around them… but I slow down and wait for my opportunity to pass.

Tesla is on the street today with cars on Autopilot. While you can tell it’s a Tesla, you have no way of knowing whether Autopilot is active or not.

If you drive on the roads today then you are entrusting your life to hundreds if not thousands of people that you don’t know anything about. You don’t know if they passed their driver’s test with a perfect score or a bare-minimum score. You don’t know if they even have a driver’s license. The only thing that you can control on the road is yourself. The only thing that you can predict on the road is… exactly nothing. Even if a driver reacts to a situation in exactly the correct manner, any one of hundreds of variables might affect that reaction.

5 Likes

So I asked my friend who has a Model S what the Tesla does if someone tailgates or rolls up on them real fast if it is in Autopilot. The first response was oil slick, followed by some Mario Kart options. The real answer is that it doesn’t react to that situation.

1 Like

ROFL! Oil slick works. :rofl:

2 Likes

I agree, and it bothers me. Florida has a lot of Teslas running the streets.

Edit: Adding below.

Yup, Florida has plenty of those too. I am generally able to watch as I approach and either get around them or watch them wreck.

Acceleration therapy is definitely my thing. And being able to travel distances.

I’ve been watching Formula E with some amusement. I calculated that, based on what I’m seeing in their races, it would take 15 to 17 car changes to get from Central Florida to New Orleans. That is expensive.

If you keep the same electric car, even with charging stations along the way, it still becomes a 2 day trip minimum, and probably 3 depending on types of chargers and hours you’re willing to drive. I can make it in 10 hours or less.
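Rough math behind that kind of estimate can be sketched quickly. The range, charge time, and average-speed figures below are illustrative assumptions, not any particular car’s specs:

```python
# Rough trip-time sketch for a ~650-mile drive (Central Florida to
# New Orleans). All figures are illustrative assumptions, not specs.

def trip_hours(distance_mi, range_mi=250, charge_hr=1.0, avg_mph=65):
    """Driving time plus a full-charge stop for each depleted battery."""
    drive = distance_mi / avg_mph
    stops = max(0, -(-distance_mi // range_mi) - 1)  # ceil(dist/range) - 1 charging stops
    return drive + stops * charge_hr

print(round(trip_hours(650), 1))                  # ~12 h with hour-long fast charges
print(round(trip_hours(650, charge_hr=8.0), 1))   # ~26 h with overnight-style charging
```

With slower chargers the stop time dominates the drive time, which is where the multi-day estimate comes from.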

One last observation: electric cars are burning coal. Sure, there is nuclear, wind, solar, etc., but most power plants are still dirty. I still like my dinosaur-powered vehicles :slight_smile:

3 Likes

We have a Model S that isn’t fully outfitted for Autopilot; however, we have friends with the Autopilot hardware. It is safe to say that the software is not fully functional yet: while driving around town, you end up taking the controls at various times. When we were with Google, we got occasional updates regarding Waymo, and the word is that Waymo functions very well at low speed and in in-town conditions. I heard somewhere, can’t remember where, that the Tesla system uses cameras and radar to identify and respond to objects, whereas Waymo uses a camera-only system mounted on the top of the car, on the assumption that eyes in the car provide humans with enough info to respond. That top-mounted camera could theoretically be placed on the top of almost any car. Though Google has partnered with Fiat Chrysler for the early models, I expect that software and hardware to spread to lots of car manufacturers. I also have a lot of faith in Google’s engineers and design teams to produce awesome cars. And the Autopilot engineers have had a lot of turnover at Tesla…

Anyway, I’m super hopeful about self-driving cars. I think almost anyone would recognize that the world would be much better off (safety-wise) with entirely self-driving vehicles, though not so much with 90% self-driving vehicles. And I have brothers who are “car guys” who can barely stand the idea of relinquishing control, trusting technology, and giving up driving fast and furiously. I get it and I think it’s legit, except for the trusting-tech part. I love technology, and I’ve read several studies about humans’ inability to trust technology, at their own peril. I’m still a little worried about AI though. :slight_smile:

4 Likes

Elon has already said that Tesla is essentially fully self-driving capable now, but that regulations are playing catch-up to how quickly the software has progressed, so the features are off.

I’m a day one pre-order in CA so I am hopeful of a model 3 by the end of the year, if not early next year. I am torn between the silver and wrapping it in a matte grey similar to the announcement color.

I’m carefully avoiding choosing colors and options to help soften the wait.

1 Like

I’m not arguing that it has any impact on the technology. My point is that technology isn’t the only barrier to adoption of self-driving cars. Like it or not, technology solutions do not dictate societal acceptance of changes in human behavior. (But I also don’t believe the technology is nearly there - I’ve been in the tech business too long and seen too many silver bullets that were supposedly only a short time from massive deployment before it turned out we were 5 or 10 years from practical widespread use.)

Making someone sit behind the wheel and pretending they’re responsible is fallacious - it’s only a stopgap attempt to deflect liability. The value of the self-driving car is that you don’t need to be paying attention. If I have to pay attention and be ready to take over at any moment then I have to be as engaged as if I were driving it. Auto-pilot is not what most people think of when they look forward to a self-driving car - look at the comments here about being able to take a nap or read a book.

Cars will not be allowed to kill people without significant limits on their ability to operate (e.g. under constraints like highways only, trucks only, parking lots only, etc.). And no one denies that they will kill people - they’re just arguing that they’ll kill fewer people than human drivers do. The industry needs to have them implemented in sufficient numbers so that “the genie is out of the bottle” before the bodies start stacking up, or it will be a (multi-)generational adoption curve. It’s not 2 years out.

Technologists argue logic. Society makes decisions based on emotions.

2 Likes

Kinda like Big Boy?

Autonomous robots might react in unexpected ways.

Really? Most of the time I am not breaking the law when I am behind the wheel.

or, you know, of being mis-identified, dragged from the car, beaten, shot, wrongly imprisoned… the “cop-fear” depends greatly on where and who you are.

that one gets tricky in California. It is illegal to block the flow of traffic in the left lane, even if the flow of traffic is higher than the posted limit.

If the driver is texting with both hands, it is on auto. If they are only texting with one hand and pretending to look at the road, it is in manual mode.

So much for PEBS (Predictive Brake Assist Systems)

2 Likes

What precisely would be the difference between a self-driving car and AI?

1 Like

Given a scenario with a set number of variables, and infinite repeats of said scenario where each of those variables stays the same each time, the robot would react the same way every time. Change a variable and the robot may react differently. That isn’t to say that a piece of hardware on the robot can’t malfunction, but that isn’t the fault of the robot. That is absolutely not the case with a human.
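That repeatability argument is easy to state in code: a deterministic controller is a pure function of its sensor inputs, so identical inputs always produce identical outputs. The toy controller below is invented purely for illustration:

```python
# Toy deterministic controller: the braking decision is a pure function
# of the sensor inputs, so the same inputs always yield the same output.
# The formula and thresholds are made up for illustration.

def brake_command(gap_m: float, closing_mps: float) -> float:
    """Return a brake fraction 0..1 from gap and closing speed."""
    if closing_mps <= 0:                      # not closing on the obstacle
        return 0.0
    time_to_collision = gap_m / closing_mps
    return min(1.0, max(0.0, 1.5 - 0.5 * time_to_collision))

# Repeating the identical scenario gives the identical reaction:
runs = {brake_command(20.0, 10.0) for _ in range(1000)}
assert len(runs) == 1  # one unique output across 1000 identical trials
```

A human “controller” run through the same 1000 identical trials would not produce one unique output.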

Congrats. You fall outside the norm of human drivers. I fully admit to the fact that I regularly travel at a rate higher than the posted speed limit, and I will on occasion interact with my phone. I’m trying to get better about that last one.

Not really very tricky. You’ll note that a qualifier in my statement was “within the limits of the law”. Your example puts the driver into an illegal situation, so my previous statement doesn’t really apply… except for the part where the driver behind, trying to pass, is taking potentially dangerous actions in an attempt to manipulate the driver in front into getting out of their way. As they say, two wrongs don’t make a right.

:rofl: True.

PEBS can break just like any other piece of hardware on a car, and then it won’t brake :wink:

Well, self-driving cars are surely a form of AI, but I was referring to the point when the robots take over and eliminate humanity - the HAL scenario from 2001: A Space Odyssey. It was a tongue-in-cheek comment.

1 Like

This is true today, since the current Tesla Autopilot is NOT self-driving. Tesla Autopilot is a driver-assistance feature that requires the driver to maintain awareness and keep their hands on the wheel. It will warn the driver to return their hands to the wheel if they are absent for some time (I don’t have one yet, so I don’t know how long hands need to be absent before the warning occurs). If the driver gets this warning too often, the car will disengage Autopilot (pulling over to the side of the road if the driver doesn’t take back control) and will refuse to re-engage until the car has been parked.
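The escalation described above can be sketched as a little state machine. The class name, grace period, and strike limit below are all invented for illustration; Tesla’s actual logic is not public:

```python
# Sketch of the hands-off escalation described above. The thresholds
# and names are assumptions, not Tesla's real implementation.

class HandsOnMonitor:
    MAX_WARNINGS = 3              # assumed strike limit before lockout
    GRACE_SECONDS = 15            # assumed hands-off grace period

    def __init__(self):
        self.warnings = 0
        self.locked_out = False   # refuses to engage until car is parked

    def tick(self, hands_off_seconds: float) -> str:
        if self.locked_out:
            return "autopilot unavailable"
        if hands_off_seconds < self.GRACE_SECONDS:
            return "ok"
        self.warnings += 1
        if self.warnings >= self.MAX_WARNINGS:
            self.locked_out = True
            return "disengage and pull over"
        return "warn: return hands to wheel"

    def park(self):
        """Parking the car resets the lockout, as owners describe."""
        self.warnings = 0
        self.locked_out = False
```

The key design point is that the lockout only clears on `park()`, matching the report that Autopilot refuses to re-engage until the car has been parked.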

However, Tesla owners report long trips and daily commutes in heavy traffic are much less taxing using autopilot.

Most consider Autopilot to be at Level 2 on the autonomous-driving scale. The scale in brief (condensed from TechRepublic: “Autonomous driving levels 0 to 5: Understanding the differences”):

Level 0: Driver is in full control. No automation at all

Level 1: A specific function is done automatically. This includes almost all cars today that have cruise control.

Level 2: Car can control both speed and steering using sensor input about the road. Driver must remain ready to take control.

Level 3: Car is able to operate autonomously in some conditions (such as freeway only) but a driver is still required to intervene if necessary. However, driver is not required to monitor the operation as much as Level 2.

Level 4: Fully autonomous with no driver input for an entire trip–but still limited to certain scenarios. So Grandmother’s house through the snowy wood is probably not covered.

Level 5: Fully autonomous and equivalent to human driver even in extreme environments.

Tesla is pushing toward full Level 5. This enables some of their key scenarios such as letting your car go be an autonomous taxi while you are at work.

At Levels 4 and 5, liability would have to be borne by the manufacturer/service provider (none of these autonomous-driving systems is completely standalone - they all depend on highly detailed maps and learning systems built in the cloud).

5 Likes

In the distant grim future, all self-driving cars will be linked to social media. In the event that the car has to decide who is put at risk, the occupants with the thinnest networks will be prioritized.

Or wait, maybe that was a Black Mirror episode…

10 Likes