Public streets are the laboratory for self-driving experiences

By Mona Mi
December 23, 2021

Tesla’s relentless vision for self-driving cars has played out on American public roads, with its self-driving technology implicated in at least 12 crashes that caused one fatality and 17 injuries. Lawsuits and a government investigation followed. But a question remains: how is Tesla allowed to do this in the first place?

The answer is that there are no federal regulations to prevent Tesla – or the many other autonomous vehicle companies – from using public streets as a laboratory. As long as a driver is willing to take over, the only thing stopping a company from bringing an experimental autonomous vehicle to the public is the threat of a lawsuit or bad publicity.

In June, when the National Highway Traffic Safety Administration ordered – for the first time – that crashes involving driver-assistance or autonomous features be reported, many assumed Tesla was the sole motivation. But the order names 108 automakers and tech companies, revealing just how widespread unregulated self-driving test drives have become.

Any future regulations will be hammered out between diametrically opposed camps. On one side are safety advocates, who argue that autonomous driving features, like those that control speed, steering and braking, should be proven safer than human drivers before they are allowed on public roads. On the other side are proponents in the auto and tech industries, who say these features cannot become safer than humans without unlimited testing in the real world.

The question for regulators, automakers and the public is: does regulation make us safer, or does it slow the adoption of technology that would make us safer?

Safety advocates may disagree on what testing should be required, but they agree that there should be some standard. “You can’t anticipate everything,” said Phil Koopman, an expert on safety standards for self-driving cars. “But the auto industry is using this as an excuse to do nothing.”

Early days

While outrage over autonomous vehicle crashes has increased calls for regulation, technology has always been ahead of the law. Just look at the introduction of the car. John William Lambert is credited with building America’s first practical gasoline car and surviving the first recorded accident, in 1891.

New York later recorded a few firsts: the first collision between vehicles – a car and a bicycle – in 1896, and the first pedestrian death, when an electric taxi struck Henry Bliss in 1899. But it wasn’t until 1909 that New York introduced its first comprehensive traffic code. Devised by William Phelps Eno, those laws, which today form the basis of traffic codes, trailed the onset of car crashes by 18 years.

“Cars shared the roads with people and horses,” said Bart Selman, a professor of computer science at Cornell University with expertise in the history of technology. “There were a lot of accidents, and it got sorted out along the way.”

What’s different about autonomous driving technology is the speed and scale with which it has arrived. “It makes perfect sense for regulators and government to get involved in this process, given the scale and speed at which this is happening,” said Mr. Selman.

Despite the risks, the promise of autonomous features can be seen in a study of insurance data showing that they reduce the number and severity of some crashes. Cars equipped with forward collision warning and automatic braking reduced front-to-rear crashes by 50 percent and front-to-rear crashes with injuries by 56 percent, for example.

Autonomous driving levels

While there are no federal restrictions on autonomous-vehicle testing, many states have set limits tied to certain levels of automation. The Society of Automotive Engineers, a standards body generally referred to simply as SAE, has defined six levels of automation, and the distinction among them carries important legal weight.

A level 0 vehicle is fully manually controlled. At level 1, an automated system can assist with a single control, such as lane centering or cruise control. Level 2 allows two or more automated controls to operate at the same time, as long as a driver is ready to take over. Up to this point, there are no limits on driving on public roads.

At level 3, the car can drive completely on its own under certain circumstances, for example on highways, but the driver must be ready to take control.

At level 4, a car can also drive on its own under limited circumstances, but without the need for human intervention. At level 5, the car is fully autonomous.

The distinction between levels up to level 2 and those above is important for legal reasons. In a level 2 car accident, the responsibility rests with the driver. For Levels 3 to 5, the responsibility may lie with the system and the companies that manufacture it.

But a loophole allows automakers to avoid liability: car manufacturers themselves determine the level assigned to their systems. This may explain why Tesla sells its automated driver-assistance system as “Full Self-Driving” capability (billed at $2,499 per year) but classifies it at level 2. It lets the company have it both ways.

But marketing a system under the name Full Self-Driving has safety implications. A study last year by the AAA Foundation for Traffic Safety put 90 drivers in a 2018 Cadillac CT6 equipped with the level 2 Super Cruise system. Some drivers were told the system was called DriveAssist and received training that emphasized the system’s limits; others were told it was called AutonoDrive and received training that emphasized the system’s capabilities.

Those who were told the system was called AutonoDrive were less attentive and more often took their hands off the steering wheel and their feet off the pedals. The 31-mile route also included a curve that would force the driver-assistance system to hand control back to the driver. Notably, both groups were equally slow to regain control.

“We have known for years that humans are terrible at monitoring automated technology,” said Shaun Kildare, senior research director for Advocates for Highway and Auto Safety.

The problems with Tesla’s Autopilot system


Claims of safer driving. Tesla cars can use computers to manage certain aspects of driving, such as changing lanes. But some fear that this driver assistance system, called Autopilot, is not safe.

A federal investigation. The National Highway Traffic Safety Administration is examining Autopilot’s involvement in crashes, after 12 incidents in which Teslas struck parked emergency vehicles. The agency has the power to force a recall or require new safety features.

Shortcuts with safety. Former Tesla employees said the company’s chief executive, Elon Musk, insisted that Autopilot could rely on cameras alone, despite objections from some engineers.

Driver assistance and accidents. A look inside a 2019 crash that killed a 22-year-old college student shows how tragic the combination of shortcomings in Tesla’s Autopilot system and driver distraction can be. In another incident, in which a Tesla struck a truck and a 15-year-old boy was killed, a California family sued the company, claiming the Autopilot system was partly to blame.

The industry has tried to address the attention problem with safeguards, but these systems are not foolproof. Videos on YouTube show drivers how to easily trick Tesla’s driver monitoring. Even more advanced systems, like Cadillac’s, which uses a camera to make sure the driver’s eyes are on the road, can fail.

“The problem with driver monitoring is, people say if you have a good camera, you’re fine,” said Mr. Koopman, who was the lead author of proposed technical safety standards for fully self-driving cars for the standards body known as ANSI/UL.

In the 1990s, he said, a system to keep truckers alert beeped if they closed their eyes. The result? “They learned to fall asleep with their eyes open,” Mr. Koopman said.

It’s hard to come up with a useful test. Testing for specific tasks, like driving on a test track, validates a vehicle only on that track, not its ability to deal with unpredictable drivers. The UL standard is instead a 300-page list of engineering considerations, including ensuring that celestial bodies do not interfere. (Plausible: a video shows a Tesla braking for a yellow moon, apparently mistaking it for a traffic light.)

If an engineer meets these requirements, “we think you’ve made a reasonable effort to try and design your systems to be safe enough,” Mr. Koopman said. “It’s a methodical way of showing the use of best practices.”

Yet the engineering test alone is insufficient, he said. It is intended for use with other standards that cover equipment and driving skills.

Even full testing is no guarantee. Federal Aviation Administration procedures were used as a model, but they cleared the Boeing 737 Max, which was grounded for 20 months after two airliner crashes left 346 people dead.

Regulations

It’s easy to see why the pioneers of autonomous driving technology are wary of regulation.

Their industry already faces a patchwork of state-by-state regulations, though these mostly require proof that a company is insured rather than proof that it has met a safety standard.

California is among the more stringent states, with its 132-page standards document for autonomous operation covering licensing, insurance, data sharing and the requirement of a driver “competent to operate the vehicle,” as determined by the company. Florida, one of the less restrictive states, has passed legislation that allows Level 4 and 5 cars “to operate in that state regardless of whether a human operator is physically present in the vehicle.” Rideshare companies testing autonomous vehicles there are required to carry liability insurance.

Add to this mix the number of agencies likely to be involved. The Department of Transportation’s online automated vehicles platform lists 14 of its agencies involved in the development of automated driving systems.

Another complication is tension between the agencies. The National Transportation Safety Board, which investigates crashes, has been particularly vocal in calling for autonomous vehicle regulations from NHTSA, a sister agency that is largely responsible for motor vehicle safety standards. In an NHTSA publication titled “Automated Driving Systems: A Vision for Safety,” the agency wrote, “NHTSA offers a non-regulatory approach to the safety of automated vehicle technology,” saying it did not want to stifle progress.


