Going Driverless with Sensors
Priyanka1, Neetu Sharma2
1,2 Department of Computer Science & Engineering, Ganga Institute of Technology and Management, Kablana,
Jhajjar, Haryana, India
Abstract: This paper explores the work that has been done towards the goal of vehicles that can shoulder the entire burden of driving. Google driverless cars are designed to operate safely and autonomously without requiring human intervention. They won't have a steering wheel, accelerator or brake pedal, because they don't need them: software and sensors do all the work. The car takes you where you want to go at the push of a button. This technology is a step towards improving road safety and transforming mobility for millions of people.
Index Terms: Artificial intelligence, Hardware Sensors, Google Maps, and Google Driverless Car.
INTRODUCTION
It may not be long before road maps become valuable only as antiques. A couple of months ago Google CEO Larry Page sent a car to pick up a friend of his. The car had one special feature: there was no driver at all. It drove Larry's friend twenty miles to Google without anyone at the wheel. For decades this was only a dream. We have already seen a host of advancements that make driving safer, such as lane assist, parking assist and even collision-prevention assistance. As more advanced technologies emerge, future roadways may become a mesh network of autonomous vehicles that share information with each other about speed, braking and other variables and move in coordinated formation. This paper talks about the Google driverless car: a world of increasingly connected vehicles in which the cars take over and humans are taken out of the equation.
AUTONOMOUS VEHICLE
An autonomous vehicle (sometimes referred to as an automated car or self-driving car) is a robotic vehicle designed to fulfil transportation needs without a human operator. To qualify as fully autonomous, a vehicle must be able to navigate without human input to a predetermined destination over unadapted roads and must be capable of sensing its environment. Audi, BMW, Google and Ford are some of the companies developing and testing these vehicles. Technologies that make a system fully autonomous include Anti-Lock Brakes (ABS), Electronic Stability Control (ESC), cruise control, lane departure warning systems, self-parking, sensors and automated guided vehicle systems.
GOOGLE DRIVERLESS CAR EXPLAINED
With only occasional human intervention, Google's fleet of robotic Toyota Priuses has logged more than 190,000 miles (about 300,000 km), driving on busy highways, in city traffic and on mountainous roads. In the near future their driverless-car technology could change transportation. Sebastian Thrun, director of the Stanford Artificial Intelligence Laboratory, guides the Google driverless-car project and explains that the car:
- steers by itself while looking out for obstacles;
- accelerates or slows by itself to keep to the speed limit;
- can GO or STOP by itself in any traffic condition.
Figure 1: Google Driverless Car
UNDER THE BONNET
It integrates three constituents:
- Google Maps
- Hardware Sensors
- Artificial Intelligence
GOOGLE MAPS: Google has unveiled a self-driving computerized car which has no steering wheel, brake or accelerator; it has only buttons to start, stop and pull over, and a computer screen to show the route. The car navigates using GPS and Google Maps: Google Maps provides it with road information and interacts with the GPS receiver, acting as a database.
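As a rough illustration of how a pre-computed map route and live GPS fixes might work together, the sketch below treats the route as an ordered list of waypoints and snaps each fix to the nearest one. The data model and coordinates are assumptions made for illustration, not Google's actual implementation.

```python
# Hypothetical data model: a Google Maps route as an ordered list of GPS
# waypoints, with each live GPS fix snapped to the nearest waypoint so the
# car knows roughly where it sits along the route.
import math

def haversine_m(lat1, lon1, lat2, lon2):
    """Great-circle distance in meters between two GPS fixes."""
    r = 6_371_000.0  # mean Earth radius in meters
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp, dl = math.radians(lat2 - lat1), math.radians(lon2 - lon1)
    a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
    return 2 * r * math.asin(math.sqrt(a))

def nearest_waypoint(route, gps_fix):
    """Return (index, distance_m) of the route waypoint closest to the GPS fix."""
    lat, lon = gps_fix
    return min(
        ((i, haversine_m(lat, lon, wlat, wlon)) for i, (wlat, wlon) in enumerate(route)),
        key=lambda pair: pair[1],
    )

# Example: a three-waypoint route and one GPS reading.
route = [(37.4220, -122.0841), (37.4230, -122.0850), (37.4241, -122.0862)]
print(nearest_waypoint(route, (37.4231, -122.0849)))
```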
HARDWARE SENSORS: The car attains real-time, dynamic environmental conditions (properties) through its sensors. To deliver real-time results, the sensors attempt to create a fully observable environment. These hardware sensors are the LIDAR, VIDEO CAMERA, POSITION ESTIMATOR, DISTANCE SENSOR and COMPUTER.
LIDAR: LIDAR (Light Detection and Ranging), also called LADAR, is an optical remote-sensing technology used to measure the distance to a target by illuminating it with light in the form of a pulsed laser. It is a laser range finder, often called the heart of the system, mounted on the roof of the car. The device, a Velodyne 64-beam laser, generates a detailed 3-D map of the environment. (For autonomous ground vehicles and marine vessels, this sensor, the HDL-64E LIDAR, is designed for obstacle detection and navigation. Its scanning distance is 60 meters (~197 feet). Its durability, very high data rates and 360-degree field of view make it ideal for the most demanding perception, 3-D mobile data collection and mapping applications. The HDL-64E's patented one-piece design uses 64 fixed-mounted lasers, each set at a specific vertical angle, with the entire unit spinning mechanically to measure the surrounding environment. This approach dramatically increases reliability, field of view and point-cloud density.) The car combines the laser measurements with high-resolution maps of the world to produce the different data models that allow it to drive itself, avoiding obstacles and respecting traffic laws. A LIDAR instrument consists principally of a laser, a scanner and a specialized GPS receiver.
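To make the geometry concrete, the sketch below converts a single return from a spinning multi-beam laser, described by its head rotation, beam elevation and measured range, into sensor-centred x/y/z coordinates. The 64 elevation angles are invented for the example and do not reflect the HDL-64E's actual optics.

```python
# Illustrative only: turning (rotation angle, beam elevation, range) triples
# from a spinning multi-beam laser into Cartesian points.
import math

NUM_BEAMS = 64
# Hypothetical, evenly spaced elevation angles between -24 and +2 degrees.
ELEVATIONS_DEG = [-24.0 + i * (26.0 / (NUM_BEAMS - 1)) for i in range(NUM_BEAMS)]

def beam_to_xyz(rotation_deg, beam_index, range_m):
    """Project one laser return into sensor-centred Cartesian coordinates."""
    az = math.radians(rotation_deg)                 # horizontal angle of the spinning head
    el = math.radians(ELEVATIONS_DEG[beam_index])   # fixed vertical angle of this laser
    x = range_m * math.cos(el) * math.cos(az)
    y = range_m * math.cos(el) * math.sin(az)
    z = range_m * math.sin(el)
    return x, y, z

# One return from beam 32 at a head rotation of 90 degrees, 15 m away.
print(beam_to_xyz(90.0, 32, 15.0))
```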
Figure 2: HDL-64E LIDAR
HOW IS LIDAR DATA COLLECTED?
When the laser pointed at the target area strikes a surface, a beam of light is reflected back. This reflected light is recorded by a sensor to measure the range. Orientation data generated by an integrated GPS and Inertial Measurement Unit system calibrates the scan angles against position. The result is a dense point cloud: a detail-rich group of elevation points consisting of 3-D spatial coordinates (latitude, longitude and height).
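The two steps described above can be sketched as follows: the range follows from the pulse's round-trip time of flight, and the scan angle plus the GPS/IMU pose places the return in world coordinates. The pose handling is simplified to a 2-D heading for illustration.

```python
# Hedged sketch: (1) range from the laser's round-trip time of flight,
# (2) georeferencing one return using the scan angle and the vehicle pose.
import math

SPEED_OF_LIGHT = 299_792_458.0  # m/s

def range_from_time_of_flight(round_trip_s):
    """Half the round-trip distance travelled by the laser pulse."""
    return SPEED_OF_LIGHT * round_trip_s / 2.0

def georeference(range_m, scan_angle_deg, sensor_x, sensor_y, heading_deg):
    """Place one return in a world frame using the GPS/IMU pose (2-D simplification)."""
    bearing = math.radians(heading_deg + scan_angle_deg)
    return (sensor_x + range_m * math.cos(bearing),
            sensor_y + range_m * math.sin(bearing))

# A pulse returning after 400 ns corresponds to roughly 60 m, the HDL-64E's quoted reach.
r = range_from_time_of_flight(400e-9)
print(round(r, 1), georeference(r, scan_angle_deg=30.0, sensor_x=0.0, sensor_y=0.0, heading_deg=90.0))
```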
VIDEO CAMERA: A sensor positioned near the rear-view mirror that detects upcoming traffic lights. It performs the same function as a mildly interested human motorist: it reads road signs and keeps an eye out for cyclists, other motorists and pedestrians.
POSITION ESTIMATOR: An ultrasonic sensor, also known as a wheel encoder, mounted on a rear wheel of the vehicle, determines its location and keeps track of its movements. Using this information, the car automatically updates its position on Google Maps.
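A minimal dead-reckoning sketch of how a wheel encoder could advance the position estimate is given below; the wheel circumference and tick count are assumed values, and a real system would fuse this with GPS and the map.

```python
# Dead-reckoning sketch with assumed parameters: encoder ticks advance the
# estimated position along the current heading.
import math

WHEEL_CIRCUMFERENCE_M = 2.0   # hypothetical wheel circumference
TICKS_PER_REVOLUTION = 1024   # hypothetical encoder resolution

def update_position(x, y, heading_deg, ticks):
    """Advance (x, y) by the distance implied by the encoder ticks."""
    distance = (ticks / TICKS_PER_REVOLUTION) * WHEEL_CIRCUMFERENCE_M
    h = math.radians(heading_deg)
    return x + distance * math.cos(h), y + distance * math.sin(h)

# 512 ticks (half a wheel turn) while heading due east.
print(update_position(0.0, 0.0, 0.0, 512))
```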
DISTANCE SENSOR (RADAR): The autonomous vehicle also carries four radars, mounted on the front and rear bumpers, which allow the car to see far enough to detect nearby or approaching cars and obstacles and to deal with fast traffic on freeways.
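One simple use of a bumper radar reading is to estimate the time to collision from the measured range and closing speed, as in the sketch below; the two-second braking threshold is an assumption for the example, not a figure from the paper.

```python
# Illustrative use of a radar reading (range plus closing speed): estimate the
# time to collision so the car can decide whether it must brake.
def time_to_collision_s(range_m, closing_speed_mps):
    """Seconds until contact if neither vehicle changes speed; None if the gap is opening."""
    if closing_speed_mps <= 0:
        return None
    return range_m / closing_speed_mps

def should_brake(range_m, closing_speed_mps, threshold_s=2.0):
    ttc = time_to_collision_s(range_m, closing_speed_mps)
    return ttc is not None and ttc < threshold_s

# A 30 m gap closing at 20 m/s gives 1.5 s to react, so the car should brake.
print(should_brake(30.0, 20.0))
```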
AERIAL: A self-navigating car demands highly accurate positioning data. Readings from the car's onboard instruments (altimeters, tachometers and gyroscopes) are combined with information received from GPS satellites to make sure the car knows exactly where it is.
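A crude way to combine the GPS fix with the estimate derived from the onboard instruments is a fixed-weight blend, sketched below; the weights are illustrative only, and a production system would use a proper Kalman filter rather than a fixed blend.

```python
# Rough sketch of blending a GPS fix with a dead-reckoned estimate.
# The 0.8 / 0.2 weights are assumptions made for the example.
def blend_position(gps_xy, dead_reckoned_xy, gps_weight=0.8):
    """Weighted average of the two position estimates, component by component."""
    gx, gy = gps_xy
    dx, dy = dead_reckoned_xy
    return (gps_weight * gx + (1 - gps_weight) * dx,
            gps_weight * gy + (1 - gps_weight) * dy)

# GPS says (100.0, 50.0); wheel odometry says (98.5, 50.4).
print(blend_position((100.0, 50.0), (98.5, 50.4)))
```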
COMPUTER: The car's central computer holds all the information fed in from the various sensors; it analyses the data and adjusts the steering, acceleration and brakes accordingly. The computer needs to understand not only the traffic laws but also the unspoken assumptions of other road users.
ARTIFICIAL INTELLIGENCE: Artificial intelligence provides the autonomous car with real-time decisions. Data obtained from the hardware sensors and Google Maps are sent to the A.I. to determine how fast to accelerate, when to slow down or stop, and how to steer the wheel. The main goal of the A.I. is to drive the passenger safely and legally to the destination.
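As a deliberately simple stand-in for this decision layer, the rule-based sketch below picks a target speed from a fused view of obstacles, traffic lights and the posted limit; the structure and thresholds are assumptions made for illustration, not Google's actual decision logic.

```python
# Simple rule-based stand-in for the decision layer described in the text.
def decide_speed(obstacle_ahead_m, traffic_light, speed_limit_kmh):
    """Return a target speed in km/h from a few hand-written rules."""
    if traffic_light == "red" or obstacle_ahead_m < 5.0:
        return 0.0                          # stop for a red light or an immediate obstacle
    if obstacle_ahead_m < 30.0:
        return min(speed_limit_kmh, 20.0)   # crawl while something is close ahead
    return speed_limit_kmh                  # otherwise hold the posted limit

print(decide_speed(obstacle_ahead_m=12.0, traffic_light="green", speed_limit_kmh=50.0))
```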
WORKING OF GOOGLE CAR
- The driver sets a destination; the car's software calculates a route and starts on its way.
- The LIDAR, a rotating roof-mounted sensor, monitors and scans a range of 60 meters around the car and creates a detailed 3-D map of the immediate area.
- An ultrasonic sensor mounted on the left rear wheel monitors movements to determine the position of the car relative to the 3-D map.
- Distance sensors mounted on the front and rear bumpers calculate distances to obstacles.
- All the sensors are connected to the artificial-intelligence software in the car, which also takes input from Google's video cameras and Street View.
- The artificial intelligence simulates human perception and makes the real-time decisions that control actions such as acceleration, steering and braking (a compressed sketch of this loop follows the list).
- The software installed in the car consults Google Maps for advance notification of things like landmarks, traffic signs and signals.
- An override function also allows a human to take control of the vehicle.
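A compressed, hypothetical version of this sense-plan-act loop is sketched below, with plain dictionaries standing in for whole subsystems so that it runs as written; none of it reflects Google's actual software.

```python
# Hypothetical sense-plan-act cycle: read sensors, consult the map context,
# return control commands. Dictionaries stand in for whole subsystems.
def drive_step(sensors, maps_context, speed_limit_kmh=50.0):
    """One cycle of the loop outlined in the list above."""
    obstacle_m = min(sensors["radar_ranges_m"])      # closest thing the bumper radars see
    light = maps_context["next_signal"]              # upcoming signal from the map lookup
    if light == "red" or obstacle_m < 5.0:
        return {"throttle": 0.0, "brake": 1.0, "steer_deg": 0.0}
    throttle = 0.3 if obstacle_m < 30.0 else 0.6
    return {"throttle": throttle, "brake": 0.0,
            "steer_deg": sensors["lane_offset_deg"]}  # steer back toward the lane centre

sensors = {"radar_ranges_m": [42.0, 55.0, 60.0, 38.0], "lane_offset_deg": -1.5}
print(drive_step(sensors, {"next_signal": "green"}))
```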
Figure 3: How it Works
AN END TO TRAFFIC JAMS FOREVER
Autonomous cars will be able to talk to each other and navigate safely by knowing where they are in relation to other vehicles, using radar, cameras, GPS, sensors and wireless technology; with this connectivity they can also communicate with roadside infrastructure such as traffic signals. As a result traffic flow becomes smoother; an end to traffic jams and greater safety would be achieved by eliminating the frustration and dangerous driving that is so often triggered by sitting in heavy congestion for ages. When it comes to sustainability, the self-driving car also holds great promise: by figuring out the most direct, least congested route and by driving without accelerating quickly or braking too hard, it saves on fuel consumption.
Figure 4: Going Driverless on road
TRIALS AND TRIBULATIONS
While it is easy to get carried away by this potentially life-changing technology, we seldom think about what needs to happen behind the scenes to bring it to market. The technology being ahead of the law is the major problem, as lawmakers have a huge impact on innovation. In the US most federal and state automobile laws assume a human operator, and these need to be amended or repealed before the technology can be commercialized. Nevada became the first state, in 2012, to legalize the operation of autonomous cars on its roads. Lobbyists from Google have been travelling around other states in an attempt to gain support for similar changes in the law, and have been targeting insurance companies as well. The technology also poses serious puzzles for insurers in terms of regulatory issues and liability.
CONCLUSION
This paper has described the Google driverless-car revolution, which aims at the development of autonomous vehicles for easy transportation without a driver. This autonomous technology has broad implications for the economy, society and individual businesses. Cars that drive themselves will improve road safety and fuel efficiency and increase productivity and accessibility; driverless-car technology helps to minimize loss of control by improving vehicle stability, and these cars are designed to minimize accidents by addressing the main causes of collisions: driver error, distraction and drowsiness. But these cars still have many hurdles to clear before they become everyday technology.