Feb 16

Three Automakers and Lyft Ask Feds to Adopt National Standard for Self-Driving Cars

 

By Jon LeSage

Automakers and ride-hailing firm Lyft have asked Congress to unify self-driving car guidelines under a national standard.

Executives from Toyota, General Motors, Volvo, and Lyft urged lawmakers in Washington yesterday to unify the patchwork of state laws governing testing and deployment of autonomous vehicles. The federal government has constitutional authority to override conflicting state laws, they said.

Laws, and their enforcement, vary by state. California has become known for sparring with companies, including ride-hailing firm Uber, over self-driving test protocols. Michigan’s recently adopted rules are considered much broader, making room for fully autonomous vehicles eventually to be allowed on the state’s public roads.

Nine states – California, Florida, Louisiana, Michigan, Nevada, North Dakota, Tennessee, Utah, and Virginia, along with Washington D.C. – have passed legislation related to autonomous vehicles, according to the National Conference of State Legislatures. The association also said that governors in Arizona and Massachusetts issued executive orders related to autonomous vehicles.

This patchwork of varying and conflicting state laws threatens to hold back innovation, said Lyft government relations vice president Joseph Okpaku in testimony before a House subcommittee. Legislators in more than 20 states have proposed nearly 60 bills to regulate self-driving vehicles since January 1, he said.

Lyft is looking forward to test-driving autonomous Chevy Bolts with partner company General Motors. The two companies are preparing to deploy a fleet of self-driving Bolt taxis beginning this year. GM has already started testing 50 of the electric vehicles in California and Michigan.

GM would like the U.S. Department of Transportation secretary to have authority over the question.

Congress should grant authority to the Transportation secretary “to grant specific exemptions for highly automated vehicle development,” said Michael Ableson, a General Motors vice president, during a hearing.

These companies were likely pleased by the September announcement from then-DOT Secretary Anthony Foxx, who issued long-awaited federal guidelines on testing and developing fully autonomous vehicles. The DOT called for uniform nationwide policies applying to autonomous vehicles.

State and federal lawmakers have been concerned about laws keeping up with self-driving vehicle technology breakthroughs. Car shoppers can now purchase semi-autonomous, connected car features; some auto executives predict that fully automated vehicles could be available within five years.

The Congressional subcommittee gave signs of bipartisan support for the development of autonomous vehicles. Its members were noticeably silent, however, on the question of adopting a national standard that would override state rules, according to USA Today.

SEE ALSO:  Chevrolet Bolt is Front and Center For GM’s Self-Drive Technologies

Toyota Research Institute CEO Gill Pratt expressed concerns over vehicle safety. A federal standard should be clear on how safe autonomous vehicles should be on roads. The public won’t support the new technology until that issue is addressed, he said.

“Society tolerates a significant amount of human error on our roads. We are, after all, only human,” he testified. “Humans show nearly zero tolerance for injuries or deaths caused by flaws in a machine.”

The Obama administration made statements about supporting self-driving car technology as a way to eliminate road fatalities within 30 years. The Trump administration so far hasn’t been clear about policy on autonomous vehicles.

For now, states are leading the way.

USA Today, HybridCars.com

This entry was posted on Thursday, February 16th, 2017 at 5:55 am and is filed under General.

COMMENTS: 13


  1. Loboc (+2) Says:
     Feb 16th, 2017 (8:26 am)

    Very specific legislation could also hold back innovation.

    Language, such as “sealed beam headlights”, is more limiting than listing specifications like lumens or longevity.

This one is particularly sticky since states license drivers. When the driver is robotic, what happens? Does the robot need to be 18? Pass a written test? Present credentials to law enforcement? Have insurance, or be listed as a driver on the owner’s policy?


  2. Dan Petit/Petit Technical College (+1) Says:
     Feb 16th, 2017 (8:26 am)

The physical laws of entropy will always negate autonomy. (If you could read all of this post before voting, I’d appreciate it.)
Here’s why (and please be patient with this explanation, as I’m not belittling autonomy proponents): these autonomy engineers, perhaps unknowingly, are treating full-sized, heavy masses of accelerated force with the glib arrogance befitting the small, innocuous remote-controlled Christmas toy car they received when they were young.

Intruding into the driving tasks of real people responsibly driving their vehicles, a small delay at a time, without their informed consent, even if ergonomically possible, is not just dishonest, it is catastrophically EVIL!

For example, even currently, software that ignores accelerator pedal position until other subroutines allow, as apparent in the Tesla sudden accelerations through the back wall of a garage and into the house, could have killed a family member.
So what is supposed to happen if NHTSA calls that “operator error,” and the underinformed owner is left with all sorts of ensuing financial problems?

Then the plot thickens:
Denial by the corporate structure, supported by ten levels of corporate veil (“to protect jobs”), allows the intrusively infectious and, yes, arrogant curse of autonomy onto the nation’s highways, causing all manner of chain-reactive loss, while overly bonused execs sit pretty, saying nothing as counseled by the legal department.

Seriously, how insane is this? Seriously:

    The bumper car ride at the amusement park is just that, an amusement, in a controlled set of dynamics.

Lyft and GM would be corporately immune to lawsuits, to the most arrogant and even evil extent, for this. Why? Here’s why:

Commercial vehicles fall into an unsafe, design-incapable-of-safety condition within only 90 days when not exactingly maintained under the “enlightened self-interest to stay alive” of a real human being who works to survive. A human driver’s interest in not getting killed benefits the riders, and it is not to be found in automation, even if automation could be properly maintained, which it won’t be: 85% of servicing facilities across the nation are already short of qualified technicians for the work coming in, and they are already refusing to work on vehicles equipped with drive-by-wire steering systems that lack steering columns.
99.9% of buyers would not choose a vehicle if they found out during the sales presentation that it had no steering column. What fuels autonomy proponents’ delusion that anyone would get into a driverless vehicle not attached to a track, much less while viewing the scowls of nearby drivers?

    Look at the strict aviation rules that have been in place for a hundred years at the FAA.
    Strict maintenance and maintenance logs must be certified.

As of 2-1-2017, our student techs must keep strict records, as the FAA requires. Only half a percent of techs are qualified for this work. Quite a shortage of techs!

Look at all the HUMAN control-tower infrastructure controlling aircraft to keep them separated.
If automation alone could do that reliably, it would have been done already.
But claiming the capability to keep thousands of vehicles separated by only inches at high speed, and claiming that a perfectly synchronized long chain of vehicles will somehow react in perfect sequence to all conditions of control, is statistically impossible due to the physical laws of entropy; the only constant in the universe is [unexpected] change. (Hence the notion that engineers trend toward arrogance.)

Reductionism, and denial of responsibility by shifting fault onto the public’s misplaced trust “as operator error,” will always be a rip-off, as the masses and forces of combined accelerations become subject, by the laws of physics, to entropy.


  3. American First Says:
     Feb 16th, 2017 (10:11 am)

    Loboc:
    Very specific legislation could also hold back innovation.

    Language, such as “sealed beam headlights”, is more limiting than listing specifications like lumens or longevity.

    This one is particularly sticky since States license drivers. When the driver is robotic what happens? Does the robot need to be 18? Pass a written test? Present credentials to enforcement? Have insurance or be listed as a driver on the owner’s policy?

I recommend having the National Highway Traffic Safety Administration (NHTSA), part of the Department of Transportation, and the Insurance Institute for Highway Safety (IIHS), an independent organization, work together to establish a set of rules and specifications, and later a set of drive tests, for all autonomous vehicles in the U.S. market, so that we have a national standard (which may become global). This work should be isolated from all manufacturers’ influence and input, so that no manufacturer gains dominance or a “head start.” If these rules and specs exist, and they make GM, Ford, and TM reprogram all their vehicles to follow them, then all will have to obey these rules to comply and meet the “standard.” Then all the state legislation can be simpler, allowing only autonomous vehicles that pass these rules and the specified drive tests. I would love to participate, and would be fascinated to watch autonomous vehicles pass these tests.


  4. DonC (+1) Says:
     Feb 16th, 2017 (10:19 am)

    No standards or a “patchwork” of standards probably works best for development — especially when some states want to welcome development within their borders — but you can’t roll out autonomy without a national standard. Also agree with Toyota that we have a “double” standard — we tolerate a great deal of error in humans but not machines. That’s important because autonomy is software, and there is always one more bug in every software system.

    Interesting that Lyft and GM are pushing for a national standard. Does this suggest they are closer than some others? (Just an observation and question, no inside information on this).


  5. Kdawg Says:
     Feb 16th, 2017 (10:50 am)

    DonC: Does this suggest they are closer than some others?

Possibly. There’s an article at IEVs talking about how Tesla only has about 500 miles of actual fully autonomous driving under its belt. We know they have done minimal testing on real-world streets, and some on closed courses. Meanwhile, GM/Cruise Automation has shown videos of Bolt EVs navigating busy San Francisco streets. I have no idea what happened to Nissan; they’ve gone silent. And we haven’t heard anything from the Germans. Ford just now announced investment in autonomous vehicles.


  6. Steve (+2) Says:
     Feb 16th, 2017 (11:38 am)

    DonC:
    No standards or a “patchwork” of standards probably works best for development — especially when some states want to welcome development within their borders — but you can’t roll out autonomy without a national standard. Also agree with Toyota that we have a “double” standard — we tolerate a great deal of error in humans but not machines. That’s important because autonomy is software, and there is always one more bug in every software system.

    Interesting that Lyft and GM are pushing for a national standard. Does this suggest they are closer than some others? (Just an observation and question, no inside information on this).

    Of course we are less tolerant of errors in machines. If the machine isn’t vastly superior and virtually error free, why would you want it?


  7. Streetlight (+1) Says:
     Feb 16th, 2017 (12:00 pm)

Let’s open with this… there’s just no way the US Supreme Court would expand the Commerce Clause against what is “traditionally state authority”. (citations) Conversely, such an expansion would manifest a liberal undertaking, notwithstanding this Administration’s anti-reg crusade.

Given that, American First’s #3 post follows pretty much the path of my ignition interlock device (IID, the Breathalyzer). Indeed, virtually all of my company’s testing standards, developed in our landmark 1986-89 California Pilot Program, were adopted (in one form or another) by NHTSA’s Model Standard in the 1991-93 era. Thereafter, California adopted it unabridged.

Today, a reading of currently enacted California AV legislation most distinctly and clearly calls out for federal guidelines. So what’s the beef?

That said, the makers (GM, Toyota, etc.) simply need to step up, leading to an ad hoc AV standards oversight task force (staffed at least partly full time) that constructs a grand-tour roadmap to populate full-time brick-and-mortar AV standards committees. Then NHTSA will happily post these in the Federal Register for states to adopt. This is a 7-10 year effort.


  8. Kdawg (+2) Says:
     Feb 16th, 2017 (12:45 pm)

    Steve: Of course we are less tolerant of errors in machines. If the machine isn’t vastly superior and virtually error free, why would you want it?

I think the debate comes down to defining the intersection of “superiority” and “error level.” What are the acceptable values of each, based on the other? It’s somewhat proportional: the more automation helps us (superiority), the more we may be willing to accept errors. If it only helps us a little bit, then the thing it helps us with, it had better do damn well.


  9. HVACman (+1) Says:
     Feb 16th, 2017 (1:34 pm)

Conspicuously absent from this article is any mention of Tesla’s interest in a national standard. Hmmm… I would think they would be right in there at the forefront of this national standards movement if their Autopilot program is so advanced.

Not mentioned, but just as important: the SAE is hard at work developing international technical standards and protocols for wireless communication between nearby autonomous vehicles, which will be a basic requirement for autonomous travel as it goes more mainstream. These SAE standards would also likely be incorporated into any national standards.

Regarding the Commerce Clause: travel is inherently interstate (at least on the “Interstates”), and there will soon be autonomous commercial travel. To suggest that each state can have separate autonomous-driving standards would be as technology-limiting and commerce-constricting as suggesting that each state can have different autopilot standards for air travel. Chaos would reign.


  10. DonC Says:
      Feb 16th, 2017 (4:29 pm)

Steve: Of course we are less tolerant of errors in machines. If the machine isn’t vastly superior and virtually error free, why would you want it?

    I find this very strange. If a robot could clean your house as well as you could, with the same level of expertise and errors, would you not want it? I don’t see why it needs to be better. In the driving context, if a machine posed the same risk as a human driver, why would I care if a human was driving or not?


  11. Dan Petit/Petit Technical College Says:
      Feb 16th, 2017 (5:38 pm)

    Don,
There is no such thing as “the same risk”.

    That could not possibly be quantified.

A Roomba vacuum cleaner, for example, has no huge mass, and it isn’t comparable in speed either. Again, if people are so lazy as to not clean their own house, they probably have heart disease and would need the money for pills.

The truth is: “If somebody builds it (anything) and markets it, then some (fool) will buy it.”

    It remains to be seen, in this case, if the fool gets into it.


  12. Jim Seko (-1) Says:
      Feb 16th, 2017 (7:03 pm)

Steve: Of course we are less tolerant of errors in machines. If the machine isn’t vastly superior and virtually error free, why would you want it?

    Tesla’s autopilot 1.0 reduced accidents by 40%. Autopilot 2.0 is expected to reduce accidents 90%. In other words, 10x safer than human.


  13. Dan Petit/Petit Technical College Says:
      Feb 16th, 2017 (7:11 pm)

You would have to prove that on a larger scale than what Elon Musk falsely inferred from a sample set he had deliberately and deceptively overgeneralized.

    We talked about that right here.