
Beware Driving Automation

Nothing good about an emerging mega-user of electricity coming online.

BTW, I don't have concrete data to support this, but it is likely that AI will save more energy than it consumes. There have been some very interesting results when using quantum computing to optimise large-scale energy-consuming systems.
 
It’s a bit more nuanced than that.

Boeing deliberately introduced automation routines so that the MAX-8 handled similarly to previous generations of the 737, in order to circumvent the requirement for pilot training on a new type. They also botched the automation, meaning that a scenario could develop whereby the crew had no alternative but to be passengers as the plane ended up in a smoking hole.

All in all, a salutary lesson in how badly conceived automation can have catastrophic consequences.


Unless those who made these decisions at Boeing were robots... then these crashes are 100% human error - i.e. the people at Boeing who created this mess, not the computer they built without telling anyone.
 
BTW, I don't have concrete data to support this, but it is likely that AI will save more energy than it consumes. There's been some very interesting results when using quantum computing to optimise large-scale energy consuming systems.
So you think one day data centers will no longer require direct connection to the national grid to operate due to their power requirements?
 
Those pedestrians and cyclists are a challenge for drivers too. I personally believe that a human is more susceptible to making a mistake than an autonomous car and therefore inherently less safe.

My view is that the environment in which we expect cars to operate - even without humans messing up - isn't good enough.

My experience of car assist systems is that they can be easily confused as regards speed limits and road markings. GPS is taken as a given in terms of reliability but there are people who live in areas of the UK where warnings are issued regarding GPS degradation - and reliance on these systems is risky in strategic terms.

My view is that automation can be made to work reliably on established trunk routes between some cities and towns - and on specific main routes within urban and suburban areas. It requires investment in setting up those routes and maintaining them - and a shift in thinking so that roads are treated a bit more like rail in terms of management. Night time trunking of road freight should be a target for implementation.
 
Unless those who made these decisions at Boeing were robots... then these crashes are 100% human error - i.e. the people at Boeing who created this mess, not the computer they built without telling anyone.
But it wasn't the pilots who got it catastrophically wrong as implied in your earlier post. As I stated, it was ill-conceived and badly deployed automation that was to blame - and that was down to people on the ground at Boeing.

Which neatly brings us back to the matter that automation can and does fail, for a variety of reasons ranging from poor or incomplete understanding of the requirements leading to inappropriate design decisions, to bad implementation through to component failure. Even high-reliability systems have a failure rate. The art is to understand the failure modes, the consequences of failure and what monitoring / mitigations need to be in place to minimise or eliminate risk. The 737 MAX-8 incident demonstrates that there are circumstances where the commercial imperative overrides the impulse to "do the right thing" and we mustn't be blind to that as we rush towards our automated version of Nirvana.
 
Unless those who made these decisions at Boeing were robots... then these crashes are 100% human error - i.e. the people at Boeing who created this mess, not the computer they built without telling anyone.

There was a chain of problems.

  1. An old aircraft design that they wanted to extend to enhance the product
  2. The enhancement in engine size/power changed the flight characteristics under some circumstances
  3. An unwillingness to require pilots to be retrained - so automation was added to modify the flight characteristics to make them similar to the older versions of the aircraft
  4. The added automation was applied to a non fly-by-wire system - in effect an add-on to an existing electro-mechanical system. It ended up with one critical external sensor to activate its function
  5. The implementation was such that the corrections applied by the system were not constant but periodic - meaning that a pilot might correct its actions and think they were back in control, but the system would activate again and override them a short period later (see the rough sketch below).
Some pilots experienced the problem but knew how to deal with it. Some didn't. We had two fatal accidents.
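
To give a feel for point 5 above, here's a deliberately crude toy sketch (Python) of a periodically re-activating trim correction. Every name, number and timing here is invented for illustration - this is not Boeing's actual MCAS logic:

    # Toy sketch of point 5: a system that corrects in bursts, goes quiet,
    # then re-activates. All names and values are invented for illustration.

    def faulty_sensor_says_nose_high() -> bool:
        # The single point of failure: one stuck sensor keeps saying "nose high".
        return True

    nose_down_trim = 0.0

    for cycle in range(4):
        if faulty_sensor_says_nose_high():
            nose_down_trim += 2.5      # system trims the nose down, then pauses
        nose_down_trim -= 1.5          # pilot trims back and thinks it's over...
        print(cycle, round(nose_down_trim, 1))
        # ...but on the next pass the system activates again and overrides them.

The trim ratchets further nose-down on each pass - the "I thought I'd fixed it" trap.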

Now, what was surprising in this case is that the add-on control system seems to have been implemented without sufficient concern for failure - or for the critical single point of failure on the sensor. Traditionally, active control systems would be subject to more robust design criteria.
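
For contrast, this is roughly the sort of cross-check a more traditional design philosophy would insist on before letting an active system act on a value - only trusting it when redundant sensors agree. Purely illustrative; the function name, tolerance and readings are my own:

    # Illustrative only: majority-style agreement across redundant sensors
    # instead of trusting a single input. Names and thresholds are invented.
    from statistics import median

    def agreed_angle_of_attack(readings, tolerance=5.0):
        # Return a trusted value only if all readings agree; otherwise None.
        mid = median(readings)
        if all(abs(r - mid) <= tolerance for r in readings):
            return mid
        return None   # disagreement -> disengage the automation, warn the crew

    print(agreed_angle_of_attack([4.8, 5.1, 5.0]))     # sensors agree -> 5.0
    print(agreed_angle_of_attack([4.8, 5.1, 74.0]))    # one stuck sensor -> None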

Judging by some of the comments I've seen about automated driving systems, I think we're seeing a shift in culture - people making design decisions that are in effect over-optimistic and don't deal with contingency and failure. I can't help feeling that some of that relaxation of culture managed to seep into the 737 MAX control augmentation. Just because we have driving assist systems and they appear to work (with human supervision) doesn't mean that we are only a short step away from full self-driving automation.

I think we should be concerned that engineering culture has been diluted over the last 50 years - with younger generations of designers being less wary and more trusting of technology.
 
My experience of car assist systems is that they can be easily confused as regards speed limits and road markings. GPS is taken as a given in terms of reliability but there are people who live in areas of the UK where warnings are issued regarding GPS degradation - and reliance on these systems is risky in strategic terms.
That's interesting... where is the signal weak or inaccurate? Not doubting you, but I can't find anything on the Googlenet! And as it comes down from satellites, I can't see that it makes much difference wherever you are, as long as you can see the sky. With 31 working GPS satellites you are bound to be able to contact the minimum of four you need for accuracy.
 
That's interesting... where is the signal weak or inaccurate? Not doubting you, but I can't find anything on the Googlenet! And as it comes down from satellites, I can't see that it makes much difference wherever you are, as long as you can see the sky. With 31 working GPS satellites you are bound to be able to contact the minimum of four you need for accuracy.

An example of the good guys doing stuff.


Plenty of ongoing activities well to the east of us in Europe where it's bad guys doing stuff.
 
That's a bit different to it not working, being unreliable, or having a weak signal.
 
That's interesting... where is the signal weak or inaccurate? Not doubting you, but I can't find anything on the Googlenet! And as it comes down from satellites, I can't see that it makes much difference wherever you are, as long as you can see the sky. With 31 working GPS satellites you are bound to be able to contact the minimum of four you need for accuracy.
We lost GPS coverage a few weeks ago during a drive in Cheshire, not far from Jodrell Bank. The car sat-nav and both of our iPhones were affected by the dropout.

Edit - GPS accuracy is affected in built-up areas too.
 
That's interesting... where is the signal weak or inaccurate? Not doubting you, but I can't find anything on the Googlenet! And as it comes down from satellites, I can't see that it makes much difference wherever you are, as long as you can see the sky.
There was always a section of the M6, near Jodrell Bank, that the COMAND system on my first W212 E63 would warn that GPS location was unreliable. And it was!
With 31 working GPS satellites you are bound to be able to contact the minimum of four you need for accuracy.
One would think so, but GPS signals are so weak that almost any interference will destroy accuracy. Add in a less-than-ideal satellite constellation and perhaps a few hills and trees, and it isn't always as reliable as many think. One of the tricks that automotive GPS systems use to mask poor position resolution is to assume you are still following the road they last "knew" you were on, heading in the same direction and at roughly the same speed. This gives the illusion that the resolution is sound, even when it's not.
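
Roughly what's going on under the hood - a much-simplified sketch of the "snap to the last known road" trick. Real units do proper map matching and filtering; these names and numbers are invented:

    # Simplified sketch: if the raw fix is poor, assume the car carried on
    # along the road it last "knew", at the same speed and heading.
    # Invented names/values - real systems use map matching and Kalman filtering.

    def estimate_position(raw_fix, fix_quality, last_road_pos, speed_mps, heading, dt):
        if fix_quality >= 0.5:          # decent signal: trust the receiver
            return raw_fix
        # Poor signal: dead-reckon along the last known road.
        return (last_road_pos[0] + heading[0] * speed_mps * dt,
                last_road_pos[1] + heading[1] * speed_mps * dt)

    # The map keeps moving smoothly either way, which is why the illusion
    # only breaks when the road bends or you turn off it.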
 
There was a chain of problems.

  1. An old aircraft design that they wanted to extend to enhance the product
  2. The enhancement in engine size/power changed the flight characteristics under some circumstances
  3. An unwillingness to require pilots to be retrained - so automation was added to modify the flight characteristics to make them similar to the older versions of the aircraft
  4. The added automation was applied to a non fly-by-wire system - in effect an add-on to an existing electro-mechanical system. It ended up with one critical external sensor to activate its function
  5. The implementation was such that the corrections applied by the system were not constant but periodic - meaning that a pilot might correct its actions and think they were back in control, but the system would activate again and override them a short period later.
Some pilots experienced the problem but knew how to deal with it. Some didn't. We had two fatal accidents.
One of the scariest things resulting from points 4 & 5 was that even if the pilots realised what was going on and defeated the automation, unless this happened very rapidly the chances were that the aerodynamic load on the horizontal stabiliser - a result of the nose-down attitude and rising speed - was so high that it was impossible for them to apply enough torque through the mechanical controls to trim it back.
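
The underlying physics is just general aerodynamics rather than anything from the accident reports: control-surface loads scale with dynamic pressure, q = ½·ρ·V², so the force needed at the manual trim wheel grows roughly with the square of airspeed - let the speed build by 40% and the load is already about double.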
 
That's a bit different to it not working, being unreliable, or having a weak signal.

Makes no difference - it means it's not reliable for things like self-driving vehicles. Worse, affected vehicles may not know it is not functioning correctly. (Loss of service is in principle detectable - loss of service integrity may not be detectable if GPS is your sole reference.)
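
A crude way to picture that difference (my own sketch - the names and threshold are invented): loss of service shows up as no fix at all, whereas loss of integrity only shows up if you have something independent to check against.

    # Invented sketch: telling "no GPS" apart from "GPS quietly wrong"
    # by cross-checking against wheel odometry.

    def gps_health(gps_fix, gps_speed_mps, wheel_speed_mps, max_disagreement=3.0):
        if gps_fix is None:
            return "loss of service - detectable, there is simply no fix"
        if abs(gps_speed_mps - wheel_speed_mps) > max_disagreement:
            return "possible loss of integrity - only caught by the independent check"
        return "looks OK - but only as good as the cross-check"

    print(gps_health(None, 0.0, 20.0))
    print(gps_health((53.23, -2.30), 3.0, 20.0))   # plausible fix, but the speed is wrong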
 
In this case, computing power is the last place where we should be making savings....
Why? What is the point of giving up a vital and scarce resource (we still do not have enough renewable electricity for our existing needs) on things we can already do well enough (driving) or on the pursuit of greed (cryptocurrency)? Utter folly.
we should cut down on wasted air-conditioning in empty office buildings in America and other similar idiocies.
SFA you or I can do about that. More AI here does nothing to solve that.
 
BTW, I don't have concrete data to support this, but it is likely that AI will save more energy than it consumes. There have been some very interesting results when using quantum computing to optimise large-scale energy-consuming systems.
Shout me when it invents perpetual motion....
 
