Autonomous cars are the future of driving. According to some in the industry, we’ll have fully autonomous vehicles on our roads in the next decade. Truly autonomous cars should allow a person to enter the vehicle, input a destination, and then never have to touch the controls as the car drives them to where they need to go.
However, autonomous cars will require changes to existing laws, and may even redefine what it means to ‘drive’. Currently, cars and drivers are recognized as separate entities, but autonomous cars will blend the two together.
If this becomes the norm, will we still need driver’s licenses?
Fully autonomous cars should mean that new car owners won’t actually need to learn to drive. The car ‘sees’ the world around it, feeds that information through an algorithm, and makes decisions in real time based on what’s going on in the vicinity, removing the human element from driving. Proponents of autonomous vehicles claim the technology will be better at driving than humans, making our roads safer.
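The sense-decide-act loop described above can be sketched in a few lines. This is a toy illustration only; the class and function names (`Perception`, `decide`) are hypothetical and bear no resemblance to a real autonomous-driving system.

```python
from dataclasses import dataclass

@dataclass
class Perception:
    """What the car 'sees' in one instant (illustrative fields only)."""
    obstacle_ahead: bool
    distance_m: float

def decide(p: Perception) -> str:
    """Map the perceived situation to an action, with no human input."""
    if p.obstacle_ahead and p.distance_m < 10.0:
        return "brake"
    if p.obstacle_ahead:
        return "slow"
    return "cruise"

# One tick of the loop: sense -> decide -> act.
action = decide(Perception(obstacle_ahead=True, distance_m=5.0))
print(action)  # brake
```

In a real vehicle this loop runs many times per second, and the ‘perception’ comes from cameras, radar, and lidar rather than two hand-written fields, but the shape of the logic is the same: the algorithm, not a person, turns observations into controls.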
If the technology is perfected, the person in the car shouldn’t have to do anything during the journey; they could read a book, watch a movie, or even take a nap. If that’s the case, absolutely anyone should be able to hop in and ‘use’ a car, and they wouldn’t need a license as they aren’t actually driving it.
If we move away from traditional driving tests, every element of the car would need to pass multiple tests to ensure complete safety.
Both the physical car and the algorithm that operates it will need to be rigorously checked against the highest standards, across a huge variety of situations.
Simulations aren’t enough; we need concrete real-world testing to ensure the cars can operate in a system as chaotic as our roads. Real roads, with real people making decisions that don’t make sense, are much tougher to navigate than a simulation that is likely creating optimal situations for the car to learn from.
If universal tests for autonomous vehicles are adopted, they need to be in-depth enough to ensure the car and the logic that operates it are robust enough to keep the passenger safe, as the passenger has no control over what the car will do in an emergency.
Algorithms don’t have human instincts like self-preservation, or reflexes that can deal with different stimuli. Any movement an autonomous car makes is a choice, dictated by the algorithm based on what’s happening around it. Those choices are based on logic, and sometimes don’t translate to what’s best for the human passenger, or the people around the car.
These cars will be programmed to deal with normal occurrences on the roads, but in very specific ‘edge cases’ they will likely stumble, or stall completely, as the algorithm tries to work out how to react to situations it’s never seen before.
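The edge-case problem can be illustrated with a deliberately simplified rule-based policy. Everything here is hypothetical, the situation names and the `react` function are invented for the example; real systems use learned models, not a lookup table, but the failure mode is analogous: an unrecognized situation has no trained response, so the system falls back to a conservative default.

```python
# Known situations the policy was explicitly programmed to handle.
KNOWN_SITUATIONS = {
    "clear_road": "cruise",
    "red_light": "stop",
    "pedestrian_crossing": "yield",
}

def react(situation: str) -> str:
    """Return the programmed action, or a conservative fallback.

    An input outside KNOWN_SITUATIONS is this sketch's stand-in for an
    'edge case': the policy has no specific answer, so it stops.
    """
    return KNOWN_SITUATIONS.get(situation, "emergency_stop")

print(react("red_light"))       # stop
print(react("ball_in_street"))  # emergency_stop (never seen before)
```

Stopping dead is safe in some contexts and dangerous in others (for example, in a fast-moving lane), which is exactly why regulators worry about how these fallbacks behave in the real world.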
It’s clear that there’s still a long way to go before we achieve fully autonomous cars. But when we do, governing bodies will need to tread carefully on the issue of whether to allow them on the road with no control over who can operate them.
If you’re an automotive engineer, learn the skills you’ll need to work with autonomous vehicles.