Our future with driverless cars
Driverless cars used to be the sort of thing you’d see in sci-fi films, but in 2018 they’re becoming a reality. Autonomous car technology is already being developed by the likes of Lexus, BMW and Mercedes, and Tesla’s driverless Autopilot system has even been tested on UK roads. Across the Atlantic, Google is developing its automated technology in the wild, and Apple is rumoured to be working with BMW on its own, probably automated, cars. “I expect that our children will live in a world where humans are not allowed to drive. 1.2 million deaths are caused by traffic accidents every year. 95% of those are due to human error,” said Professor Maarten Steinbuch of Eindhoven University of Technology (TU/e) during a demonstration at the Automotive Engineering Science (AES) Laboratory.
The possibility of driverless cars has grown over the years, and with recent advances it may soon be practical to use one for day-to-day travel. The driverless car concept was first discussed in Air Wonder Stories magazine in the 1930s, but it became a reality only in the 1980s, when Carnegie Mellon University’s Navlab developed the first self-sufficient autonomous car; in 1997 the same lab introduced an autonomous bus. Since then, autonomous vehicles have come a long way, developing step by step, especially within the last decade. It was seen as a major achievement when, in 2008, an autonomous Volkswagen was introduced that could identify stop signs. In 2010, Google launched a fleet of autonomous cars that travelled 140,000 miles on California roads alongside other vehicles and pedestrians while obeying all traffic signals, including traffic lights. By 2012 the same fleet had completed 300,000 miles without a single accident. Google and Volkswagen are not the only companies interested in commercially introducing autonomous cars to the market: Mercedes-Benz, for instance, began its research on autonomous vehicles in the 1980s.
Although some people will always want the status and other satisfactions of owning a car, what most people want is affordable mobility without having to purchase, tax, insure, park and fuel a four-wheeled millstone. In the well-planned cities of continental Europe, that kind of mobility is provided by efficient and affordable public transport. In more benighted places (like most British towns and cities), by contrast, autonomous vehicles are an obvious alternative. Uber has already started offering rides in self-driving cars in Pittsburgh, a notoriously demanding urban environment. At the moment, there’s always a “driver” sitting in the front, but the obvious intention is to dispense with the human as soon as it’s deemed safe to do so.
As vehicles become smarter and more automated, there will also be advances in data communication between cars on the road and between cars and roadside installations. In other words, there is a need to teach those “autonomous” cars to take into account real-time data about their surroundings and other vehicles. Automated driving in real life should take into account the role of data, object detection and social driving. “These are all incredibly complex things to try and teach a machine,” says Robbert Lohmann of the start-up 2getthere, which works on automated transit applications ranging from Automated People Movers to Shared Autonomous Vehicles.
The Google Car
Autonomous vehicles rely on a range of sensors to interact with the world around them, with the Google Car prototype coming equipped with eight. The most noticeable is the rotating roof-top LIDAR, a sensor that uses an array of either 32 or 64 lasers to measure the distances to surrounding objects, building up a 3D map at a range of 200m and allowing the car to “see” hazards. The car also sports another set of “eyes”: a standard camera that points through the windscreen. This looks for nearby hazards such as pedestrians, cyclists and other motorists, as well as reading road signs and detecting traffic lights. Speaking of other motorists, bumper-mounted radar, already used in intelligent cruise control, tracks other vehicles in front of and behind the car.
Externally, the car has a rear-mounted aerial that receives geolocation information from GPS satellites, and an ultrasonic sensor on one of the rear wheels monitors the car’s movements. Internally, the car has altimeters, gyroscopes and a tachometer (a rev-counter) to give even finer measurements of the car’s position, all of which combine to give it the highly accurate data needed to operate safely. Using these arrays, the Google Car can read the road like a human. However, these sensors come with their own limitations. Autonomous cars simply replace the human eye with a camera, leaving them vulnerable to extreme sunlight, bad weather or even defective traffic lights. In current autonomous cars, the way those camera pixels are analysed could be the difference between a safe journey and a fatal one.
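The sensor suite described above only becomes useful when its readings are combined into a single driving decision. The sketch below is a deliberately simplified, hypothetical illustration of that idea (real systems fuse probabilistic estimates from all eight sensors); the field names and the 15-metre safety gap are assumptions, not Google’s actual design:

```python
from dataclasses import dataclass

@dataclass
class SensorReadings:
    """Hypothetical, simplified snapshot of the sensors described above."""
    lidar_obstacle_m: float       # nearest obstacle seen by the roof LIDAR
    camera_sees_pedestrian: bool  # windscreen camera detection
    radar_gap_m: float            # gap to the vehicle ahead from bumper radar

def should_brake(r: SensorReadings, safe_gap_m: float = 15.0) -> bool:
    """Brake if any sensor reports a hazard closer than the safe gap.

    This illustrative rule simply takes the most conservative
    sensor's verdict rather than performing true sensor fusion.
    """
    if r.camera_sees_pedestrian:
        return True
    return min(r.lidar_obstacle_m, r.radar_gap_m) < safe_gap_m

# LIDAR sees an obstacle at 12 m, inside the 15 m safe gap -> brake.
print(should_brake(SensorReadings(12.0, False, 40.0)))
```

Taking the most pessimistic sensor is a crude stand-in for real fusion, but it captures why redundancy matters: a camera blinded by sunlight can be overruled by LIDAR or radar.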
Since Google unveiled its self-driving car, it has spun off this part of the business into a separate arm under the name Waymo. The name comes from Google’s mission to find “a new way forward in mobility.” Car-to-infrastructure communication is essential for enabling autonomous driving. For example, as your car approaches a red light, information about that light can be sent to the car. How can we provide this information in every car at every red light? There has to be a solution for that if you want to enable autonomous driving in areas with traffic lights. The German automotive industry is one of the most powerful advocates of a connected car-traffic infrastructure. Manufacturers including Daimler, BMW and Audi paid $3.1 billion for the Nokia Here mapping service, which will be used as a platform for a connected-car environment. A joint statement released by the consortium reads: “[Nokia] Here is laying the foundations for the next generation of mobility and location-based services.”
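The red-light example above can be made concrete with a small sketch. Everything here is an assumption for illustration: the message fields, the JSON encoding and the `plan_approach` rule are invented, not any real V2I protocol (standards such as SPaT messages exist, but their details differ):

```python
import json

def make_signal_message(intersection_id: str, state: str,
                        seconds_to_change: float) -> str:
    """A hypothetical traffic-light broadcast: its state and timing."""
    return json.dumps({
        "intersection": intersection_id,
        "state": state,                 # "red", "amber" or "green"
        "seconds_to_change": seconds_to_change,
    })

def plan_approach(message: str, distance_m: float, speed_mps: float) -> str:
    """Decide whether to coast early or maintain speed, given the broadcast."""
    signal = json.loads(message)
    eta_s = distance_m / speed_mps
    # If the light is red and will still be red when we arrive, coast now
    # instead of braking hard at the stop line.
    if signal["state"] == "red" and eta_s < signal["seconds_to_change"]:
        return "coast"
    return "maintain"

msg = make_signal_message("junction-42", "red", 20.0)
# 100 m away at 10 m/s: we arrive in 10 s, the light is red for 20 s.
print(plan_approach(msg, 100.0, 10.0))
```

The point of the sketch is the direction of information flow: the infrastructure tells every approaching car the same thing at once, which is exactly what the consortium’s connected-car platform is meant to enable at scale.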
For the automotive industry, this is the basis of new assistance systems and, ultimately, fully autonomous driving. Extremely precise digital maps will be used in combination with real-time vehicle data in order to increase road safety and to facilitate innovative new products and services. To become a viable solution, these systems will be required in every vehicle, including those still driven by humans. It’s likely that emergency vehicles such as ambulances and police cars will continue to use human drivers, so they’ll need a method of communicating with the autonomous cars around them: “You have to know where [an emergency vehicle] comes from and when it will be there, so the information is shared between this car and your car.” Dealing with humans’ unpredictable behaviour represents a significant challenge for the technology. Human drivers are able to interact with each other and make allowances, but also make countless small mistakes when driving, mistakes to which current self-driving cars simply can’t adapt. Although there was little that could be done to avoid the Google Car’s latest accident, it’s a stark reminder of autonomous technology’s biggest hurdle.
Exposition Magazine Issue 14
Department of Industrial Management
University of Kelaniya