With all the talk of self-driving and autonomous cars, one can't help but wonder about the consequences of these advances. There is far more to driving than accelerating and braking. When humans drive, they don't just use their eyes to see what is directly in front of them; they filter out unnecessary information, such as someone walking towards the road, knowing that the person will wait until it's safe to cross. We also pay extra attention to things that may not yet be on the road, such as a car attempting to pull out into traffic, just in case the driver misjudges the gap and pulls out dangerously.
All of these things make us safe drivers, but will autonomous cars be able to do the same? Is the technology advanced enough to teach a vehicle which information it should be paying attention to?
Humans don't learn these things from driving lessons or from what their parents taught them. They learn through experience, and it becomes an almost automatic way of thinking, shaped by the nuances of social learning. But how does a car acquire the ability to read some social cues and block out others? If a car were to treat every pedestrian or cyclist as something it must avoid, it would never get anywhere.
Researchers at UC Berkeley are currently working on this challenge, which they call "the freezing robot problem". They are studying how AI reacts, and whether a system can be taught to drive through modelling and repeated observation, learning to interpret what behaviours mean and how to respond to them. The computer will have to distinguish rational behaviour from irrational behaviour based on the intentions of other drivers: estimate what another driver wants, how they will achieve it, and what they will do next. For example, when a driver signals a right turn, the system should predict that their car will slow down considerably in order to make the turn.
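The cue-to-prediction idea described above can be sketched as a toy lookup. This is not Berkeley's actual system, which would rely on learned probabilistic models; the cue names and predictions here are invented purely for illustration:

```python
# Toy sketch of intent prediction: map an observed cue from another
# road user to a predicted next action, so a planner could react.
# All cue names below are hypothetical examples, not a real API.

def predict_behaviour(cue: str) -> str:
    """Return a predicted next action for a hypothetical observed cue."""
    predictions = {
        "right_indicator": "slow down and turn right",
        "left_indicator": "slow down and turn left",
        "pedestrian_at_kerb": "wait until it is safe to cross",
        "creeping_into_lane": "may pull out into traffic",
    }
    # Default assumption when no special cue is observed.
    return predictions.get(cue, "continue at current speed")

print(predict_behaviour("right_indicator"))   # slow down and turn right
print(predict_behaviour("no_cue_observed"))   # continue at current speed
```

A real system would attach probabilities to each prediction and update them as new observations arrive, rather than using a fixed table like this.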
This means that researchers still have a long way to go in developing a more adaptive driving style for autonomous vehicles, one that can interpret the context of different situations. Social context also requires an understanding of how the presence of the vehicle itself changes the behaviour of the people around it and influences situations. The system needs to learn to drive defensively enough to avoid hitting pedestrians, but also assertively enough that it doesn't stop at the slightest impediment.
Although the prospect of self-driving cars is not far away, the more important question is: will they be as effective as human drivers? What's your opinion? Let us know in the comments below.