Improving Cars’ “Perception Algorithms” Is the Best Way to Safer Self-Driving Technology


Many car companies are now looking to new technologies to make their cars navigate roads easily / Photo by: Free Stock Photos via Wikimedia Commons

 

The push in the car industry is to make vehicles able to navigate roads with little or no human assistance, and many in the industry are examining the obstacles that stand in the way of making that a reality.

 

One of the first things that has to be considered, according to researchers from USC, is the possible safety hazards this advancement may bring to our streets. To address this, Eurekalert.org reports, the researchers published a new study that looked into “a long-standing problem for autonomous vehicle developers.”

 

What is this problem?

 

It lies in the notoriously difficult-to-understand perception algorithms, which have already endangered people during companies’ road testing of cars that rely on them. The researchers explained that perception algorithms pose a threat because few understand what the deep learning convolutional neural networks behind them take into consideration when determining whether a moving object on the street is a car or a pedestrian.
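To make that opacity concrete: a perception module of this kind is typically a convolutional neural network that takes an image crop of a detected object and returns only a label and a confidence score, with no explanation of why it chose that label. The sketch below is a toy, hypothetical PyTorch-style model, not any company’s production system or the study’s subject; the class names, layer sizes, and input resolution are assumptions for illustration only.

```python
# A minimal, hypothetical sketch of the kind of convolutional classifier a
# perception stack might use to label image crops of moving objects as
# "car" or "pedestrian". All sizes and class names are illustrative.
import torch
import torch.nn as nn

class TinyPerceptionNet(nn.Module):
    def __init__(self, num_classes: int = 2):  # assumed labels: 0 = car, 1 = pedestrian
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(3, 16, kernel_size=3, padding=1), nn.ReLU(),
            nn.MaxPool2d(2),                            # 64x64 -> 32x32
            nn.Conv2d(16, 32, kernel_size=3, padding=1), nn.ReLU(),
            nn.MaxPool2d(2),                            # 32x32 -> 16x16
        )
        self.classifier = nn.Linear(32 * 16 * 16, num_classes)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        x = self.features(x)
        return self.classifier(x.flatten(start_dim=1))

# The network outputs only a label and a confidence; nothing in the output
# explains *why* it decided "car" rather than "pedestrian" -- the opacity
# the researchers point to.
model = TinyPerceptionNet()
crop = torch.rand(1, 3, 64, 64)                  # one 64x64 RGB crop of a detected object
probs = torch.softmax(model(crop), dim=1)
label = ["car", "pedestrian"][probs.argmax(dim=1).item()]
print(label, probs.max().item())
```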

 

Companies, according to the researchers, should first invest in checking that their perception algorithms work as intended, so that the teams that program and develop these systems can “use this information to further train the system.”

 

Improving these systems would greatly advance the push toward automation, as it would reduce road accidents and help companies avoid incidents that could occur while their products are in use.

 

Using Timed Quality Temporal Logic, a new mathematical logic formulated by the team itself, the researchers were able to test the “sanity conditions” that perception algorithms rely on to “see” certain objects on the road.

 

As described by study co-author Jyo Deshmukh, a USC computer science professor and former research and development engineer at Toyota, sanity conditions are the aspect of a perception algorithm that helps it understand cars and their movement relative to the streets they are on.

 

By monitoring whether the perception algorithm’s outputs satisfy these sanity conditions as it operates, the researchers can then tell if there are bugs in the system.
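As a rough illustration (this is not the researchers’ TQTL tool or their formal notation), a sanity condition such as “an object confidently labeled a car should not be relabeled a pedestrian a fraction of a second later” can be checked by replaying the algorithm’s timestamped outputs through a simple monitor. The Detection fields, the time window, and the confidence threshold below are all assumptions made for the sketch.

```python
# A minimal sketch of runtime monitoring of one assumed "sanity condition"
# over a perception algorithm's timestamped outputs: once an object is
# reported as a car with high confidence, it should not flip to
# "pedestrian" within the next half second.
from dataclasses import dataclass

@dataclass
class Detection:
    time_s: float      # timestamp of the frame, in seconds
    object_id: int     # tracker-assigned identity of the object
    label: str         # "car" or "pedestrian"
    confidence: float  # classifier confidence in [0, 1]

def violates_sanity(trace: list[Detection], window_s: float = 0.5,
                    min_conf: float = 0.9) -> list[tuple[Detection, Detection]]:
    """Return pairs of detections where a confident 'car' flips to
    'pedestrian' for the same object within window_s seconds.
    Assumes the trace is sorted by time."""
    violations = []
    for i, d in enumerate(trace):
        if d.label == "car" and d.confidence >= min_conf:
            for later in trace[i + 1:]:
                if later.time_s - d.time_s > window_s:
                    break
                if later.object_id == d.object_id and later.label == "pedestrian":
                    violations.append((d, later))
    return violations

# Example trace: object 7 is confidently a car at t=1.0 s but becomes a
# pedestrian at t=1.2 s -- the monitor flags this as a likely perception bug.
trace = [
    Detection(1.0, 7, "car", 0.97),
    Detection(1.1, 7, "car", 0.95),
    Detection(1.2, 7, "pedestrian", 0.88),
]
print(violates_sanity(trace))
```

In this kind of setup, each flagged pair points developers to the exact frames where the algorithm contradicted itself, which is the sort of information they could then use to further train the system.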