The Question is, who was driving?

Jun 2018
2,818
721
South Dakota
#3
Pull Tesla Autopilot as a legal way to drive until it can pull over for police and emergency vehicles.
That's been tested to a degree. As you know, cell phones can be jammed. The use of that capability is being challenged all over the world. Just jamming phones would cripple the ability of rioters (antifa) to communicate and coordinate their attacks, but it's been declared unconstitutional. I suspect this might be too. Another gadget that got shot down was one that shuts off your radio and replaces the sound with a siren. The one I want is the EMP emitter that kills the electronics in a "Boom Car".
 
Nov 2017
1,343
757
Virginia
#4
I'm not an expert on the legal system, but it seems to me it would be nice to see this issue addressed by a court to establish some precedent.

As someone with a background in electrical engineering, computer science, robotics, and AI, and as a licensed driver in Virginia, I can make some comments and observations about this issue. I'd have to study things like the algorithms, functions, sensors, and design of this "Autopilot" system (which I haven't done), but from what I understand, it's not a system that has the same capabilities as a "driverless" vehicle.

As someone who has played around with Microsoft's PC flight simulator, I also happen to be somewhat familiar with an autopilot system for fixed-wing aircraft. An aircraft's autopilot is basically a system that keeps the aircraft flying at a preset altitude (elevation above ground or sea level), a preset heading (i.e., how many degrees from North), and a preset speed. There's a little more involved than that; for example, the climb rate is also a setting that can be adjusted, and from what I've seen it doesn't have any rudder control for fixed-wing aircraft (I don't know how it works on helicopters, and a fixed-wing aircraft's heading can be adjusted without rudder input). There may also be other details involved in its use that real pilots are trained on that I'm not aware of. But essentially, an autopilot on an aircraft doesn't do things like take off, land the plane, start the engines/APU, switch other devices or equipment on or off, navigate to a goal, file or amend flight plans, or communicate with ATC to receive flying instructions. I think it can only be set and engaged by the human pilot, and it will disengage itself and alert the human pilot if it detects a problem that needs to be dealt with by the human pilot.
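Just to make that "hold" idea concrete, here's a rough sketch in Python. Everything in it is made up for illustration (the function names, gains, and limits are my own, and real avionics code looks nothing like this); it only shows the basic idea of nudging the aircraft back toward preset targets every control tick.

```python
# Toy illustration of altitude/heading/speed "hold": three independent
# proportional corrections toward preset targets. Heading wraparound,
# mode logic, and everything else a real autopilot does are ignored.

def hold_step(current, target, gain, limit):
    """Return a bounded correction that pushes `current` toward `target`."""
    correction = gain * (target - current)
    return max(-limit, min(limit, correction))

def autopilot_step(state, targets):
    """One control tick: compute pitch, bank, and throttle adjustments."""
    return {
        "pitch_cmd":    hold_step(state["altitude_ft"], targets["altitude_ft"], 0.01, 5.0),
        "bank_cmd":     hold_step(state["heading_deg"], targets["heading_deg"], 0.50, 25.0),
        "throttle_cmd": hold_step(state["speed_kts"],   targets["speed_kts"],   0.02, 1.0),
    }

# Example tick: the aircraft is low, left of its heading, and slow, so all
# three corrections come back positive.
print(autopilot_step(
    {"altitude_ft": 9500, "heading_deg": 85, "speed_kts": 240},
    {"altitude_ft": 10000, "heading_deg": 90, "speed_kts": 250},
))
```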

I take it that Tesla's Autopilot system is essentially doing something similar to what an autopilot does on an aircraft, though ironically, for a roadway vehicle an "autopilot" has to be more complicated. I'm guessing that it makes sure the vehicle stays within its lane and doesn't rear-end the vehicles in front of it. I'd guess it also makes sure it drives no faster than either a set upper speed limit or the posted speed limit, can merge when its lane is coming to an end, and can stop at red lights, stop signs, and railroad crossings when the lights flash and the gates are down. I don't know what it does if it encounters a human directing traffic (which takes priority over the rest of the traffic rules); I'm guessing it doesn't know what to do, since interpreting signals a human issues with their hands, a whistle, hand-held traffic wands, lights, signs, and so on would require an extremely complicated system.
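Again purely as an illustration of my guesses above (none of this is Tesla's actual logic, and the names and numbers are invented), lane-keeping and not rear-ending the car ahead boil down to the same kind of correction loops:

```python
# A rough sketch of the two guessed-at behaviours: keep the car centred in
# its lane, and keep it from closing on the car ahead. Hypothetical names
# and values throughout; not anyone's real driver-assist code.

def lane_keep_steering(lateral_offset_m, gain=0.3):
    """Steer back toward the lane centre; positive offset = drifted right."""
    return -gain * lateral_offset_m  # corrective steer in arbitrary units

def speed_command(current_speed, set_speed, gap_m, gap_time_s=2.0):
    """Drive at the set speed, but slow down if the gap ahead gets too small."""
    safe_gap = current_speed * gap_time_s      # e.g. a two-second following gap
    if gap_m < safe_gap:
        # Scale speed down in proportion to how much the gap has closed.
        return max(0.0, current_speed * gap_m / safe_gap)
    return set_speed

# Example: drifted 0.4 m right of centre, 30 m behind the lead car at 25 m/s.
print(lane_keep_steering(0.4))          # small corrective steer to the left
print(speed_command(25.0, 30.0, 30.0))  # slows below the 30 m/s set speed
```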

One could argue, I suppose, that Tesla's Autopilot eliminates or "filters out" the dangers introduced by a drunk driver in a conventional vehicle. When it comes to drunk driving versus simply falling asleep at the wheel, the legal implications for a driver are far more serious for drunk driving; it has to do with issues like safety and "intent" (or negligence), as well as the existence of laws that specifically address driving while under the influence or intoxicated. From a technological perspective, whether or not you were drunk when you fell asleep at the wheel of a Tesla with its Autopilot engaged makes no difference to how unsafe that is; the impact is the same, and it probably isn't very safe (at least not as safe as having a driver who's awake, alert, and paying attention).

Either way, this technological issue isn't what really matters; if a driver is operating a vehicle, even if only partially, they're liable for what happens. From what I understand, an aircraft pilot is considered to be in total control of the aircraft even while the autopilot is partially operating it; I think the same liability principles can apply to an automobile driver. To me, having Tesla's Autopilot engaged is not that much different from driving a vehicle with an automatic transmission instead of a manual transmission; with either transmission, the driver is liable for the operation of the vehicle. Since the police had to take steps to "virtually" force the vehicle to come to a stop (by triggering certain sensors in a certain way), that tells me that either the Autopilot wasn't working properly, or it isn't designed to pull over when there are flashing police lights or sirens. If it isn't designed to pull over for flashing police lights or sirens, then the driver is actually in direct control of, and liable for, the vehicle's operation.
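That "virtually force it to stop" tactic presumably works by giving the system something it already reacts to, such as a slowing lead vehicle. A toy follow-the-leader loop (again an assumption on my part, not Tesla's code) shows why boxing the car in and braking would bring it to a stop:

```python
# Hypothetical follow-the-leader logic: match the lead vehicle's speed, and
# stop once the lead stops or the gap gets too small. If a patrol car merges
# in front and gradually brakes, this kind of loop ends up stopped too.

def follow_speed(lead_speed, gap_m, min_gap_m=10.0):
    """Return own target speed given the lead vehicle's speed and the gap."""
    if gap_m <= min_gap_m or lead_speed <= 0.0:
        return 0.0
    return lead_speed

# A patrol car merges in front at highway speed, then slows to a stop.
lead_speeds = [30.0, 25.0, 20.0, 10.0, 5.0, 0.0]   # m/s
gap = 40.0
for lead in lead_speeds:
    own = follow_speed(lead, gap)
    print(f"lead {lead:4.1f} m/s -> own {own:4.1f} m/s")
```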

Legally, someone who's drunk isn't supposed to be behind the wheel of a vehicle at all. There are even laws dealing with being drunk in public, so there's also that issue. Being in a Tesla with Autopilot engaged isn't a good enough reason to escape liability for DUI or DWI when the driver falls asleep while drunk and behind the wheel; consider that driving can involve having to do things while awake, and outside the vehicle, such as dealing with a flat tire or the vehicle becoming disabled in the middle of the road. There are things a driver is supposed to do that require being awake, even if the vehicle is pulled over onto the shoulder: the blinkers need to be turned on, and cones or flares may need to be placed on the road. Being outside the vehicle and wandering around in the middle of the highway is not a good idea if someone is drunk. Because of this, even the passengers in a "driverless" car may be liable and have a certain level of responsibility for making sure that at least one of them isn't drunk, so they can address highway issues.
 
Dec 2013
30,309
18,405
Beware of watermelons
#5
(quoting post #3 above on jamming phones and EMP emitters)

That is seriously dystopian, though I am seriously against driverless cars.
 
Dec 2013
30,309
18,405
Beware of watermelons
#6
(quoting post #4 above in full)

Geohot was on track to release his tech that basically hacked your car to drive itself. I'm not sure what happened to it, but he goes into it here.

Geohot could be the future of self-driving cars.
 
Likes: Neil
Sep 2015
12,897
4,842
Brown Township, Ohio
#7
Tesla stock has tanked. I think the Autopilot on a Tesla has verbal commands, which was the only reason I thought about buying one, but I changed my mind when the first Tesla incident happened. Elon Musk suffered a mental breakdown.
 
