
Beware Driving Automation

The thing that often catches out an autonomous car is an unpredictable human, exactly the same thing that often catches out other human drivers.

Removing humans from the equation will significantly improve safety.

You'd need to isolate the roads from humans then.

Railways work rather well as regards automation - for a reason.
 
You'd need to isolate the roads from humans then.

Railways work rather well as regards automation - for a reason.
No pedestrians or cyclists would be ideal from an automated safety point of view, but that would likely be infeasible, especially in residential areas.

Once most cars are able to sense humans, the system (ie all cars sharing data) would anticipate movements better than a single car can, on main roads or at busy times, eg school pick-up.

Those pedestrians and cyclists are a challenge for drivers too. I personally believe that a human is more susceptible to making a mistake than an autonomous car and therefore inherently less safe.
 
The sources are specified. It is the reasons for the continual increase in the number of road accidents that are not.
Where is the source of the data used by Statista for this trend in road accidents then?
 
Just a reminder that a driverless car and autopilot are two different things.... I think we're miles away from having Johnny Cabs replacing Uber (in spite of the Google cars, Waymo, etc.)
Agreed, my point is that autonomous cars will become viable when most cars have the ability to sense what's going on around them, ie when most cars have driver aids like a Tesla does today.

Until then I suspect that public and legislator perception will hold back the general use of autonomous cars on public roads. Familiarity will help, but I believe that critical mass on shared data points will be the turning point.
 
Agreed, my point is that autonomous cars will become viable when most cars have the ability to sense what's going on around them, ie when most cars have driver aids like a Tesla does today.

Until then I suspect that public and legislator perception will hold back the general use of autonomous cars on public roads. Familiarity will help, but I believe that critical mass on shared data points will be the turning point.

We should think about autonomous cars in the same way that trains and passenger jet airplanes are managed.

Train and passenger jet traffic is managed centrally, by knowing where each train and airplane is.

Collision avoidance via onboard sensors (electronic or human) is the last line of defence.

Autonomous cars can avoid hitting each other electronically, by knowing where the other cars are. The onboard sensors will only be needed to avoid other unexpected objects such as people, animals, or debris on the road.
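The idea of cars avoiding each other electronically by sharing positions can be sketched as a closest-point-of-approach check. This is a hypothetical illustration only (the function name and units are my own, not any real V2V protocol):

```python
import math

def closest_approach(p1, v1, p2, v2):
    """Given 2-D positions (m) and velocities (m/s) of two cars,
    return (time_s, distance_m) of their closest point of approach,
    assuming both hold course and speed."""
    dx = (p2[0] - p1[0], p2[1] - p1[1])   # relative position
    dv = (v2[0] - v1[0], v2[1] - v1[1])   # relative velocity
    dv2 = dv[0] ** 2 + dv[1] ** 2
    # If relative velocity is ~zero, the separation never changes.
    t = 0.0 if dv2 < 1e-9 else max(0.0, -(dx[0] * dv[0] + dx[1] * dv[1]) / dv2)
    return t, math.hypot(dx[0] + dv[0] * t, dx[1] + dv[1] * t)

# Two cars approaching a junction at right angles, each 30 m out at 15 m/s:
t, d = closest_approach((0, -30), (0, 15), (-30, 0), (15, 0))
# They arrive at the junction simultaneously: minimum separation 0 m at t = 2 s,
# so a position-sharing system would flag this well before any onboard sensor.
```

The point of the sketch is that with shared position and velocity data, a conflict is predictable seconds in advance, whereas onboard sensors only react to what they can currently see.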
 
We should think about autonomous cars in the same way that trains and passenger jet airplanes are managed.

Train and passenger jet traffic is managed centrally, by knowing where each train and airplane is.

Collision avoidance via onboard sensors (electronic or human) is the last line of defence.

Autonomous cars can avoid hitting each other electronically, by knowing where the other cars are. The onboard sensors will only be needed to avoid other unexpected objects such as people, animals, or debris on the road.
Planes and trains have very different use cases to cars and this is key to understanding that cars cannot be managed in the same way. It is a huge mistake to apply this line of thinking; the debacle that is in-car touch screen technology fully illustrates this. Fine in planes and trains, really not fine (actually dangerous) in cars.

Cars operate in much more complex and immediate environments. There are many, many more individual journeys made by car, using an almost infinite network of routes, origins and destinations, very often in very close proximity to other vehicles and road users.

Oversimplification of these issues, together with an almost total absence of clarity on 'roles, responsibilities and liabilities' for vehicle occupants (passenger or driver), is resulting in a dangerous trend towards technologies that are simply not needed or wanted.

No doubt that these systems will eventually be forced into use. A transition to a sci-fi world of transportation, to satisfy the egos and bank accounts of those that would create it.
 
In the case of the Boeing 737 MAX-8 jets, air traffic control knew exactly where the planes were. Unfortunately the planes didn't know where the ground was, metaphorically speaking. Likewise in Baltimore, total reliance on electronic steering on a container ship led to a disastrous bridge collision, despite the presence of an emergency manual system which was unmanned at a time of crucial navigation demand. In both cases the total reliance on sophisticated electronic technology was misplaced.
 
Is there - or will there be - sufficient electricity for the computing required for cars to self drive? Looks to me like one more demand on what will be a stretched resource with more deserving candidates than replacing what is done tolerably well by humans.
 
Looks to me like one more demand on what will be a stretched resource with more deserving candidates than replacing what is done tolerably well by humans.
An interesting question that actually covers a huge amount of the "brave new world of AI computing" to which I don't know the answer.

But returning to the task of driving, while there are always drivers who make bad judgments and cause collisions, it is one of the most complex tasks that most humans undertake on a regular basis and there are an almost infinitely larger number of successful journeys undertaken than unsuccessful ones - which is something to celebrate.

Bearing that in mind (and this is in no way minimising the human tragedy of every major injury or death that does occur on the roads), is the goal of deploying increasing levels of automation under the guise of eliminating collisions really the best use of resources?
 
In the case of the Boeing 737 MAX-8 jets, air traffic control knew exactly where the planes were. Unfortunately the planes didn't know where the ground was, metaphorically speaking. Likewise in Baltimore, total reliance on electronic steering on a container ship led to a disastrous bridge collision, despite the presence of an emergency manual system which was unmanned at a time of crucial navigation demand. In both cases the total reliance on sophisticated electronic technology was misplaced.

The Boeing 737 MAX-8 crashed because the pilots weren't properly trained and were not aware of the new system's existence or how to bypass it.

The fact remains that every single day there are around 100,000 commercial flights flying on autopilot without incident, but then of course that's what you'd expect when the pilots are properly trained to use the aircraft's systems.

Your post is the equivalent of taking a driver who has never driven a car with automatic transmission, putting him in an automatic car, and then, when he crashes because he couldn't find the clutch pedal to disengage the engine, labelling automatic transmission technology as dangerous.

As for the Baltimore incident, the ship experienced a power loss prior to crashing into the bridge. It's easy to blame automation, but how would the port pilot steer the ship manually without power? Ships' rudders are no longer operated by rods and ropes.
 
Is there - or will there be - sufficient electricity for the computing required for cars to self drive? Looks to me like one more demand on what will be a stretched resource with more deserving candidates than replacing what is done tolerably well by humans.

20 years ago there wasn't enough computer power to do what we do today.

Obviously, we don't have today the computer power that will be required in 20 years.

But what's the question exactly?
 
The Boeing 737 MAX-8 crashed because the pilots weren't properly trained and were not aware of the new system's existence or how to bypass it.
It’s a bit more nuanced than that.

Boeing deliberately introduced automation routines so that the MAX-8 handled similarly to previous generations of the 737, in order to circumvent the requirement for pilot training on a new type. They also botched the automation, meaning that a scenario could develop whereby the crew had no alternative but to be passengers as the plane ended up in a smoking hole.

All in all, a salutary lesson in how badly conceived automation can have catastrophic consequences.
 
20 years ago there wasn't enough computer power to do what we do today.

Obviously, we don't have today the computer power that will be required in 20 years.

But what's the question exactly?
The electricity required to run some people's data-centric vision of the future is vast. For example, NVIDIA plan to ship 1.5 million servers by 2027. The power required to run those servers annually is 85.4 terawatt-hours - the same energy consumption as a small country.
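Taking the post's two figures at face value (1.5 million servers, 85.4 TWh a year), the implied continuous draw per server can be sanity-checked with a few lines of arithmetic:

```python
servers = 1.5e6        # NVIDIA servers shipped by 2027, per the post
annual_twh = 85.4      # claimed annual consumption, terawatt-hours
hours_per_year = 8760

# Average continuous draw per server implied by the claim:
watts_per_server = annual_twh * 1e12 / (servers * hours_per_year)
print(round(watts_per_server / 1000, 1))  # ≈ 6.5 kW per server
```

Roughly 6.5 kW of continuous draw per server is in the right ballpark for a multi-GPU AI server running flat out, so the headline figure is at least internally consistent.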
 
The electricity required to run some people's data-centric vision of the future is vast. For example, NVIDIA plan to ship 1.5 million servers by 2027. The power required to run those servers annually is 85.4 terawatt-hours - the same energy consumption as a small country.

Good. We should start building those nuclear power stations ASAP then.
 
The Boeing 737 MAX-8 crashed because the pilots weren't properly trained and were not aware of the new system's existence or how to bypass it.

The fact remains that every single day there are around 100,000 commercial flights flying on autopilot without incident, but then of course that's what you'd expect when the pilots are properly trained to use the aircraft's systems.

Your post is the equivalent of taking a driver who has never driven a car with automatic transmission, putting him in an automatic car, and then, when he crashes because he couldn't find the clutch pedal to disengage the engine, labelling automatic transmission technology as dangerous.

As for the Baltimore incident, the ship experienced a power loss prior to crashing into the bridge. It's easy to blame automation, but how would the port pilot steer the ship manually without power? Ships' rudders are no longer operated by rods and ropes.
Most plane crashes are caused by human error (just!). Pilot error is thought to account for 53% of aircraft accidents, with mechanical failure (21%) and weather conditions (11%) following behind. It's long been argued that planes would be far safer without pilots at all, and that the computers flying them would not only have avoided those pilot errors but would have prevented some of the others, by not taking the risk of flying through rather than around a storm, and by knowing better how to handle potential mechanical failure. However, would you get on a plane with no pilot?... No, me neither!
 
From the man who urges reduced consumption....

Consumers of electricity are increasing faster than generators of electricity - and no one seems to notice or care.
Not true though.... we consume less now than we have in decades....

The highest peak electricity demand in the UK in recent years was 62GW in 2002. Since then, the nation’s peak demand has fallen by roughly 16% due to improvements in energy efficiency......
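Those two quoted figures (the 62 GW peak in 2002 and the roughly 16% fall) imply today's peak, which a one-line calculation confirms:

```python
peak_2002_gw = 62.0    # UK peak electricity demand, 2002 (from the post)
reduction = 0.16       # ~16% fall since then (from the post)

today_gw = peak_2002_gw * (1 - reduction)
print(round(today_gw, 1))  # ≈ 52.1 GW peak demand today
```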
 
From the man who urges reduced consumption....

Consumers of electricity are increasing faster than generators of electricity - and no one seems to notice or care.

In this case, computing power is the last place where we should be making savings.... we should cut down on wasted air-conditioning in empty office buildings in America and other similar idiocies.
 
