Auriga Develops Computer Vision for Autonomous Driving

In many ways, 2016 was a landmark year for driverless vehicles, full of historic moments. That year, major automakers—from Daimler to BMW, Audi, and Hyundai—invested billions of dollars in autonomous vehicle development and partnered with tech players to lead the race for self-driving cars. It was the year driverless and semi-autonomous cars were made available to the public for the first time, at least in trial mode, and the year that sparked a heated debate on the challenges of self-driving mobility and car automation.

More than a dozen companies—including Tesla, Uber, Google, and Baidu—have already demonstrated self-driving cars performing in real environments. Ford is aiming to have its fully autonomous car ready in four to five years. Nissan, Toyota, and Honda are looking to put completely driverless cars on highways by 2020. Tesla announced that its driverless technology will be launched in full in 2018. Elon Musk, Tesla’s CEO and the frontman of this driverless revolution, says that in 10 years, it will be rare to find a car that is not autonomous.

Driverless Dilemmas

Autonomous vehicles are expected to be much safer than human-driven ones, reducing crashes by around 90 percent. Nevertheless, no driverless car has yet demonstrated absolute reliability: such cars perform well most of the time, but not always. Self-driving vehicles still struggle with extreme weather and those one-in-a-million situations when something completely unexpected happens. On the road to driverless cars, there are a number of tough but important dilemmas for the industry to tackle.

First and foremost, it is necessary to decide who will be protected in the case of an unavoidable accident. In other words, should the car hit a child who runs in front of it, or should it swerve into a tree, plunge into a group of pedestrians or oncoming traffic, or plummet over a bridge? Mercedes-Benz spokesman David McCarthy admitted that in dangerous situations, the driverless technology would prioritize the safety of the passengers over that of bystanders outside the vehicle.

One more dilemma is who gets the blame and pays the damages when driverless cars crash. Should it be the car owner, the car manufacturer, or the software developer? Volvo, one of many automakers eager to market an autonomous car, expects there will be a shift from driver liability to product liability, making the automotive industry the primary liability stakeholder.

Nevertheless, improved safety will be one of the biggest benefits of autonomous vehicles. In addition, they will greatly improve traffic flow and fuel efficiency and give us more free time. There are still myriad questions to answer, but driverless vehicles are now nearing the live phase, and the race for self-driving cars is getting hotter.

How We Make a Car “See” the World

Fast, precise recognition of objects ahead and the ethical resolution of tough situations are among the most challenging tasks for today's connected car technologies. Most self-driving cars use a combination of sensing technologies to "see" the road ahead. Range-detecting sensors, such as laser rangefinders and LIDAR, measure the distance between the vehicle and surrounding objects. Visual sensors, such as cameras, recognize color and fine appearance details. Many players in the self-driving world are developing deep learning systems that learn how to drive safely in various circumstances from huge amounts of labeled sensor data.
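The article does not describe how camera and range data are combined in any particular system. Purely as an illustration of the idea that a camera tells the car what an object is while LIDAR tells it how far away the object is, the sketch below pairs the two by bearing. All type names, fields, and the matching threshold are hypothetical.

```python
from dataclasses import dataclass

# Hypothetical, simplified types: a real perception stack works with point
# clouds and neural-network detections, not single numbers per object.

@dataclass
class CameraDetection:
    label: str          # e.g. "pedestrian", "traffic_sign"
    bearing_deg: float  # direction of the object relative to the car's heading

@dataclass
class LidarReturn:
    bearing_deg: float  # direction of the range measurement
    range_m: float      # distance to the nearest surface in that direction

@dataclass
class PerceivedObject:
    label: str
    bearing_deg: float
    range_m: float

def fuse(camera, lidar, max_bearing_gap_deg=2.0):
    """Attach a LIDAR distance to each camera detection by matching bearings."""
    fused = []
    for det in camera:
        # Pick the LIDAR return whose bearing is closest to the detection.
        nearest = min(lidar, key=lambda r: abs(r.bearing_deg - det.bearing_deg),
                      default=None)
        if nearest and abs(nearest.bearing_deg - det.bearing_deg) <= max_bearing_gap_deg:
            fused.append(PerceivedObject(det.label, det.bearing_deg, nearest.range_m))
    return fused

if __name__ == "__main__":
    camera = [CameraDetection("pedestrian", 10.5), CameraDetection("traffic_sign", -30.0)]
    lidar = [LidarReturn(10.2, 18.4), LidarReturn(-29.5, 42.0)]
    for obj in fuse(camera, lidar):
        print(f"{obj.label}: {obj.range_m} m at {obj.bearing_deg} deg")
```

The point of the sketch is only the division of labor between the sensors; production systems fuse the modalities with far more sophisticated geometric and learned models.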

As part of a large driving automation project, Auriga is developing a semi-automated object identification tool that labels acquired data and creates a stream that can be used for machine learning. With some assistance from a human operator, the application automatically recognizes and labels objects in the uploaded video, LIDAR, and speed data, using a predefined set of labels that includes lane markings, traffic lights and signs, trees, other vehicles, cyclists, and pedestrians. The labeled data forms a stream for subsequent machine learning and driving automation. The more data it labels, the better the system will "see" the environment in the future.
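The customer's actual schema and workflow are not public. As a minimal sketch of what a semi-automated labeling step could look like, the following code proposes a label for each detected object, routes low-confidence proposals to an operator for confirmation, and writes the verified labels out as a stream. The label set, confidence threshold, and file format (JSON Lines) are assumptions for illustration only.

```python
import json
from dataclasses import dataclass, asdict

# Hypothetical label set mirroring the categories mentioned in the article.
LABELS = {"lane_marking", "traffic_light", "traffic_sign", "tree",
          "vehicle", "cyclist", "pedestrian"}

@dataclass
class LabeledObject:
    frame_id: int        # index of the video frame the object was seen in
    label: str           # one of LABELS
    confidence: float    # confidence of the automatic recognizer
    verified: bool       # True once a human operator has confirmed the label

def review(proposal, operator_confirms):
    """Send low-confidence proposals to the operator; accept the rest automatically."""
    if proposal.label not in LABELS:
        raise ValueError(f"unknown label: {proposal.label}")
    if proposal.confidence < 0.8:                 # threshold is illustrative
        proposal.verified = operator_confirms(proposal)
    else:
        proposal.verified = True
    return proposal

def write_stream(objects, path="labels.jsonl"):
    """Write verified labels as a JSON Lines stream for downstream training."""
    with open(path, "w") as out:
        for obj in objects:
            if obj.verified:
                out.write(json.dumps(asdict(obj)) + "\n")

if __name__ == "__main__":
    proposals = [
        LabeledObject(frame_id=120, label="pedestrian", confidence=0.95, verified=False),
        LabeledObject(frame_id=121, label="cyclist", confidence=0.55, verified=False),
    ]
    # Stand-in for the real operator UI: confirm every low-confidence proposal.
    reviewed = [review(p, operator_confirms=lambda p: True) for p in proposals]
    write_stream(reviewed)
```

The design intent this sketch tries to capture is the feedback loop described above: automation does the bulk of the labeling, the operator handles the uncertain cases, and every verified record feeds the training stream.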

Because cameras are highly susceptible to changing weather, it is vital to account for the day–night cycle, seasonal changes, rain, fog, haze, and snow, all of which can completely obscure lane markings and even signs. Self-driving cars are designed to rely on rigid road rules and can struggle to make sense of the road ahead under extreme conditions. All of these circumstances were taken into account, and video streams of driving in poor weather form a considerable part of the resulting data lake.
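How the data lake is organized is not described in the article. One common way to keep poor-weather footage well represented in training is to tag each recorded session with weather and time-of-day metadata and sample against a target mix; the sketch below shows that idea with a hypothetical session record and an illustrative 40 percent poor-weather share.

```python
import random
from dataclasses import dataclass

# Hypothetical session metadata; the real data lake schema is an assumption here.
@dataclass
class DrivingSession:
    session_id: str
    weather: str        # e.g. "clear", "rain", "fog", "snow"
    time_of_day: str    # e.g. "day", "night"

POOR_WEATHER = {"rain", "fog", "snow"}

def sample_for_training(sessions, poor_weather_share=0.4, k=4, seed=0):
    """Pick a training subset that keeps a fixed share of poor-weather footage."""
    rng = random.Random(seed)
    poor = [s for s in sessions if s.weather in POOR_WEATHER]
    clear = [s for s in sessions if s.weather not in POOR_WEATHER]
    n_poor = min(len(poor), round(k * poor_weather_share))
    return rng.sample(poor, n_poor) + rng.sample(clear, k - n_poor)

if __name__ == "__main__":
    sessions = [
        DrivingSession("a01", "clear", "day"),
        DrivingSession("a02", "rain", "night"),
        DrivingSession("a03", "fog", "day"),
        DrivingSession("a04", "clear", "night"),
        DrivingSession("a05", "snow", "day"),
        DrivingSession("a06", "clear", "day"),
    ]
    for s in sample_for_training(sessions):
        print(s.session_id, s.weather, s.time_of_day)
```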

Elena Baranova, Head of Auriga’s Engineering Department, had the following to say regarding Auriga’s involvement in the connected vehicle market:

The application Auriga is working on is part of our customer's innovation strategy for the connected car market. This project requires profound expertise in embedded development from our engineers, as well as a creative mindset and the ability to think outside the box: a skill set in demand in the connected vehicle market and a key area of expertise for Auriga. We are honored to be part of such a challenging project, and we appreciate the customer's faith in us.