Driverless cars are a reality for some and science fiction for others. From now until autonomous vehicles are regularly operating on our roads, the emphasis is on safety and on creating co-pilot systems that reduce deaths and injuries. And cost-effective camera technology will no doubt play a huge part in this by providing drivers with a helpful virtual assistant.
The question regarding driverless cars is not if they will appear on our streets, but when.
Most of us have heard about Google's self-driving cars, but the first self-driving taxi service actually launched back in August 2016 in Singapore. The driverless taxi operates throughout Singapore's university district. It has already received such strong support from users that its developer, nuTonomy, now plans to introduce more self-driving taxis in Singapore by the end of 2018.
And South Korea isn't far behind. It has built a small town, known as K-City, near Hwaseong, to test driverless cars in realistic road environments. Some 360,000 square metres and $9.7m are devoted to advancing the field of self-driving technology. The country is so confident that it aims to have autonomous vehicles on national roads by 2020 – that's less than two years from now.
It seems that more and more countries are joining the autonomous race. But, and this is a very important but, until we have truly autonomous vehicles sharing the roads with us, it is going to be a long and evolutionary process. Let's face it, we will not wake up any time soon and discover that our cars are doing the school run or popping down to the supermarket without us. But the change is coming and will, without a doubt, transform the automotive industry for good.
So what is an autonomous car?
The answer to this question gets a little complicated because it’s not just a case of here’s a ‘normal car’ and here’s a ‘self-driving car.’
There are six levels of autonomous vehicles. Their classification is based on the level of interaction between the driver, car and the world through which they drive:
- Level 0 is no automation and the driver makes all the decisions without any aids.
- Levels 1 and 2, roughly where we are today, are where the driver has more aids to help with their driving but is obliged to monitor the systems and must be prepared to intervene.
- Level 3 is where the driver does not have to continuously monitor the systems, because they will be told when to take over.
- Level 4 is where the car is effectively driving itself, but the driver can still overrule the systems.
- Level 5 is a completely autonomous vehicle, with no human intervention.
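For readers who think in code, the six levels can be captured as a simple lookup table. This minimal Python sketch is purely illustrative – the descriptions paraphrase the list above rather than the official SAE wording:

```python
# Illustrative mapping of the six automation levels to the driver's role.
AUTOMATION_LEVELS = {
    0: "No automation: the driver makes all decisions without any aids",
    1: "Driver assistance: a single aid; the driver monitors everything",
    2: "Partial automation: several aids; the driver must still supervise",
    3: "Conditional automation: the car tells the driver when to take over",
    4: "High automation: the car drives itself, but can be overruled",
    5: "Full automation: no human intervention at all",
}

def requires_driver_supervision(level: int) -> bool:
    """Levels 0-2 oblige the driver to monitor the systems continuously."""
    return level <= 2
```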
Generally, experts agree that level 5, although technically possible now, is unlikely for the foreseeable future. Anyone who tells you that by such-and-such a year vehicles will be operating at a completely autonomous level is a crystal-ball gazer. The truth is, no-one knows.
And one of the main reasons for the long evolutionary process and uncertainty is not the technology, but acceptance by governments, regulators, insurers and the public. On a government level, think of the situation in the US, where even individual states cannot agree on whether autonomous cars can operate on their roads. Regulators and insurers face the nightmare problem of who is to blame in an accident: the supposedly perfect driverless car, or the imperfect human being? And when it comes to the public, the simple fact is, the general populace is not yet ready to hand over its cars to computer systems.
Huge safety dividend
But, between now and the time when cars drive themselves, there is a huge safety dividend which we need to grasp. Never before have we been faced with such a great opportunity to reduce deaths and injuries on the roads through safer driving.
And this has been made possible because, thanks to the development of autonomous cars, the technology now exists to achieve a greater level of safety for cars on the road today.
For example, if you look at a test-bed autonomous vehicle, it is bristling with the latest AI capability.
A typical autonomous vehicle set-up will boast, at the front, two short-range radars, a surround-view camera and a long-range lidar. The sides and the rear will be equipped with surround-view 360 cameras and a short-range radar. In the cockpit will be an ADAS camera system and a driver-monitoring camera, also known as facial recognition. Controlling all this will be a processing AI box which collates all the information and decides on actions as needed.
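To make the "collate and decide" role of that central AI box concrete, here is a heavily simplified Python sketch. The sensor names, thresholds and actions are all hypothetical – a real system fuses far richer data than this:

```python
from dataclasses import dataclass

@dataclass
class SensorReading:
    sensor: str           # e.g. "front_lidar", "left_surround_camera" (illustrative names)
    obstacle_detected: bool
    distance_m: float     # distance to the nearest obstacle, in metres

def collate(readings):
    """Hypothetical central 'AI box': reduce all sensor inputs to one action."""
    hazards = [r for r in readings if r.obstacle_detected]
    if not hazards:
        return "continue"
    nearest = min(hazards, key=lambda r: r.distance_m)
    # Illustrative threshold only -- real systems weigh speed, trajectory, etc.
    return "brake" if nearest.distance_m < 10.0 else "warn"
```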
This level of ultra-sophisticated kit is not limited to autonomous vehicles. It can be purchased right now, as the vehicle safety industry quickly realised that cameras and sensors, working together with artificial intelligence, could do much of the donkey work needed to keep a vehicle safe. So, what are these technologies?
Take the ADAS system, for example. ADAS stands for Advanced Driver Assistance System, and it's a technology which has been developed for safer driving.
Originally designed for driverless vehicles, it's a system which uses cameras, sensors and a sophisticated algorithm to notify the driver of a potential problem. This might be a weaving cyclist, a stopped vehicle in the road, drifting across lanes, or sudden braking by the vehicle in front. All these hazards can be instantly communicated to the driver, allowing them to take the necessary action to avoid an accident.
The really clever bit is how it works. Cameras and sensors relay visual information to a powerful processor, which constantly scans and analyses data from the sensors and the on-board computer (speed, turning, braking, etc.). It continually identifies potentially dangerous situations, alerting the driver to them via a sound or a graphic signal on a display screen.
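In code, that scan-and-alert loop might look something like the following Python sketch. The hazard rules and field names here are invented for illustration – production ADAS logic is vastly more sophisticated:

```python
def check_hazards(frame):
    """Hypothetical per-frame ADAS check; 'frame' is a dict of fused sensor data."""
    alerts = []
    # Drifting across lanes: offset from lane centre beyond a tolerance.
    if frame.get("lane_offset_m", 0.0) > 0.5:
        alerts.append("lane drift")
    # Sudden braking by the vehicle in front: hard deceleration detected.
    if frame.get("lead_vehicle_decel_ms2", 0.0) > 6.0:
        alerts.append("sudden braking ahead")
    # A cyclist weaving into the vehicle's path.
    if frame.get("cyclist_in_path", False):
        alerts.append("cyclist ahead")
    return alerts  # each alert would trigger a sound or an on-screen graphic
```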
Tiredness and fog
The ADAS Facial Detection – Driver Drowsiness Detection System is another clever piece of kit designed to make driving safer and a by-product of autonomous vehicle technology development.
Based on AI technology, this drowsiness detection system is designed to continuously scan the driver's face for signs of fatigue. If the system detects these symptoms, it will alert the driver with visual and audible warnings and, where necessary, bring their attention back to the road.
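One common way such systems quantify fatigue is the fraction of recent video frames in which the driver's eyes are closed (often called PERCLOS). This minimal sketch assumes the face-scanning step has already produced a per-frame eyes-closed flag; the threshold is illustrative, not a vendor specification:

```python
def drowsiness_alert(eye_closed_flags, threshold=0.7):
    """PERCLOS-style check: alert if the eyes were closed in too large a
    fraction of the recent frames. 'eye_closed_flags' is a list of booleans,
    one per video frame, produced upstream by the facial-detection camera."""
    if not eye_closed_flags:
        return False  # no data yet, nothing to alert on
    perclos = sum(eye_closed_flags) / len(eye_closed_flags)
    return perclos >= threshold
```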
But no matter how good a driver you are, fog represents perhaps the ultimate road challenge. Being faced with near-zero visibility on a fast road is perhaps a driver's worst nightmare. The SeeTrue anti-fog camera algorithm enables users to see clearly through fog, haze, mist, rain and smoke by processing live video images and displaying a visibility-enhanced live view of the actual scene. The car-fog camera does this without IR projection.
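The core idea behind any visibility-enhancement camera is that fog compresses image contrast into a narrow band of grey. A minimal contrast-stretch over pixel intensities illustrates the principle – this is a generic technique, not SeeTrue's proprietary algorithm:

```python
def stretch_contrast(pixels):
    """Linearly stretch a list of 0-255 grey-level intensities to the full
    range. Fog squeezes contrast into a narrow band; stretching it back
    makes the hidden detail visible again. (Illustrative only -- not the
    proprietary SeeTrue algorithm.)"""
    lo, hi = min(pixels), max(pixels)
    if hi == lo:
        return list(pixels)  # flat image: nothing to stretch
    return [round((p - lo) * 255 / (hi - lo)) for p in pixels]
```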
In short, all three systems act as early warning devices which give the driver a distinct advantage on the roads.
So, we may not be at Level 5 yet, and it will be some years before you can send your car off on its own to collect friends from the airport. But the important thing is that the technology now exists to make driving safer. And it's technology which is affordable and available right now.
Autonomous cars are the talk of the town, but how long before we see cars on the streets with no human driver? And is that really the issue we should be focussing on? Technology has advanced so rapidly that a by-product of the push for self-driving cars is the chance to dramatically improve road safety. Cameras are now so sophisticated that drivers can have extra eyes on the road, in real time. The implications for road safety are huge, but whereas the media might wish to report on auto-driving cars clocking up test miles by the million, little is reported on 'traditional cars' that can be vastly improved by cost-effective accessories. This article has looked at the overall role of camera technology in driverless cars and, more particularly, examined how industry has used that development knowledge to let cameras play a leading role in safety. Driving can now be safer than ever before.