Introduction to Autonomous Vehicles
The term autonomous stands for self-governing. Therefore, an autonomous vehicle can be defined as a self-driving vehicle or even a driverless car. It is a vehicle that has the capability to drive itself with minimal or no human intervention. A self-driving car or autonomous vehicle has sensors and algorithms (such as GPS, sonar, etc.) installed which can sense the surrounding environment and any movement in order to drive around safely.
There are several levels of automation for an autonomous vehicle. These levels are defined by the Society of Automotive Engineers (SAE) as follows (a short illustrative code sketch follows the list):
- Level 0: there is no automation.
- Level 1: there is shared control. It can also be termed hands-on automation.
- Level 2: this level is called the hands-off level.
- Level 3: the eyes-off level.
- Level 4: the mind-off level.
- Level 5: steering wheel optional.
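For readers who prefer code, the classification can be captured in a small data structure. The sketch below is purely illustrative, assuming Python and names of my own choosing; it is not part of the SAE standard text, though it reflects the common reading that a human must stay ready to take over up to Level 3.

```python
from enum import IntEnum

class SAELevel(IntEnum):
    """Illustrative mapping of the SAE driving-automation levels."""
    NO_AUTOMATION = 0       # Level 0: no automation
    HANDS_ON = 1            # Level 1: shared control ("hands on")
    HANDS_OFF = 2           # Level 2: "hands off"
    EYES_OFF = 3            # Level 3: "eyes off"
    MIND_OFF = 4            # Level 4: "mind off"
    STEERING_OPTIONAL = 5   # Level 5: steering wheel optional

def requires_human_fallback(level: SAELevel) -> bool:
    """Up to Level 3, a human driver must still be ready to take over."""
    return level <= SAELevel.EYES_OFF

print(requires_human_fallback(SAELevel.HANDS_OFF))  # True
print(requires_human_fallback(SAELevel.MIND_OFF))   # False
```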
If vehicles such as cars and trucks are to be made truly autonomous, they need to replicate the human decision-making procedure. It must be taken into consideration that not all decisions made while driving are entirely about traffic or about following a safe path. There are ethics that must be given importance while manufacturing these vehicles. These ethics have to be developed into algorithms that can be installed into the software of these vehicles, which is a very difficult task.
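To make concrete what "developing ethics into algorithms" could look like, here is a minimal, hypothetical sketch of a rule-based filter that a planning module might run before accepting a manoeuvre. Everything in it (the class, the function, the risk thresholds) is an assumption made for illustration, not any manufacturer's actual implementation.

```python
from dataclasses import dataclass

@dataclass
class Manoeuvre:
    """A candidate driving action produced by the planner (hypothetical model)."""
    name: str
    risk_to_pedestrians: float  # estimated probability of harming a pedestrian
    risk_to_occupants: float    # estimated probability of harming the occupants
    breaks_traffic_law: bool

def is_ethically_acceptable(m: Manoeuvre,
                            max_pedestrian_risk: float = 0.01,
                            max_occupant_risk: float = 0.05) -> bool:
    """Illustrative ethical/legal filter applied before a manoeuvre is executed.

    The thresholds are arbitrary placeholders; deciding what they should be
    is exactly the difficult ethical task described above.
    """
    if m.breaks_traffic_law:
        return False
    if m.risk_to_pedestrians > max_pedestrian_risk:
        return False
    if m.risk_to_occupants > max_occupant_risk:
        return False
    return True

candidates = [
    Manoeuvre("brake_in_lane", 0.002, 0.03, breaks_traffic_law=False),
    Manoeuvre("swerve_onto_pavement", 0.20, 0.01, breaks_traffic_law=True),
]
print([m.name for m in candidates if is_ethically_acceptable(m)])  # ['brake_in_lane']
```

Even this toy filter shows where the difficulty lies: the hard part is not the code but agreeing on the values that go into it.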
Issue Discussed in this Article
The issue discussed in this article is whether there is a need to regulate autonomous vehicles at this nascent stage of their development.
In order to reach a conclusion, I will discuss different aspects of autonomous vehicles, such as the ethical issues surrounding them, who is to be blamed if something goes wrong, and what steps certain countries have taken to curb the issues that have arisen because of them.
Importance of Ethics
First, let us start with a situational example. Let's say that in the future you own an autonomous car, and one day while driving that car has to make a terrible decision between swerving left and running over a kid, or swerving right and running over a man. Keep in mind that if your car does not swerve in either direction, it hits both of them, resulting in their deaths.
So, in this particular situation, what would be the ethical and appropriate approach? Who decides it? And who is going to control your car when this decision has to be made? Any choice made in this situation is going to be wrong in some way. One cannot decide what has to be done merely on the basis of age. What do you think would be the right decision? Maybe you do not swerve at all and let both of them die, which would stop you from discriminating on the basis of age, but at the same time this would be even worse, as you had a chance to save at least one life and you chose not to. How does one deal with this dilemma?
Given the above example, there can be numerous similar situations where making an ethical decision is a difficult task. Sure, many might think that if such a situation arises a person can just hit the brakes or the owner can simply take back control, but there are several aspects connected to it. For instance, suppose in the above situation it was a rainy day and the roads were wet; if you slam the brakes you risk three lives, including your own.
Now, let's say you are on a highway and there are two cars ahead of you. What would you do then? Clearly, you cannot stop in the middle of a highway. Even today, the best autonomous cars that exist fail to sense tiny objects and even small animals. This also means they can fail to detect several other things like potholes, rocks on the road, cats, dogs and much more. This can cause a lot of trouble, such as equipment failures (sensor errors, tire punctures or even deviation from the safe path).
This shows what a crucial role ethics plays while programming autonomous vehicles.
Trolley Problems
The trolley problem is one of the most important thought experiments when it comes to ethics. It involves an ethical dilemma: is it ethical or appropriate to sacrifice one life in order to save multiple lives?
In the classic scenario, a tram is on a path to kill five people down the track, but the driver has a choice: he can divert the course of the tram to another track, which would kill just one person. While discussing the ethics-related issues in autonomous vehicles, trolley problem scenarios are given special consideration. This means that while designing an autonomous vehicle it may be required to have a program or algorithm designed and installed which would choose whom or what to hit when a collision is unavoidable, as sketched below.
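As a purely hypothetical illustration of what such a choice could look like in code, the sketch below picks whichever action has the lowest expected harm. The function names, the harm score and the numbers are all assumptions; no real vehicle is known to ship such a rule, and a simple "minimise expected harm" policy is itself an ethically contested choice.

```python
def expected_harm(outcome: dict) -> float:
    """Harm score for one possible action; the weighting is an arbitrary assumption."""
    return outcome["probability_of_collision"] * outcome["people_at_risk"]

def choose_action(outcomes: dict) -> str:
    """Pick the action with the lowest expected harm (a purely utilitarian rule)."""
    return min(outcomes, key=lambda action: expected_harm(outcomes[action]))

# Trolley-style example: both options harm someone, yet the algorithm must pick one.
options = {
    "stay_on_track": {"probability_of_collision": 1.0, "people_at_risk": 5},
    "divert":        {"probability_of_collision": 1.0, "people_at_risk": 1},
}
print(choose_action(options))  # 'divert' -- the track with one person
```

The point of the sketch is not the arithmetic but the fact that someone has to decide, in advance, that this is the rule the car will follow.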
Status of Autonomous Vehicles in the U.S., U.K., Germany & India
- U.S.
In the United States of America, Nevada was the first state in the world to allow autonomous vehicles on public roads. Nevada law defines an autonomous vehicle as a motor vehicle that uses artificial intelligence, sensors and global positioning system coordinates to drive itself without the active intervention of a human operator.[1] The law states that the driver does not have to pay attention to the driving while the vehicle is in autopilot mode.
In California, by contrast, the law states that a human must be present while the autonomous vehicle is on autopilot so that, in case of an emergency, he or she can take control of the car.
In the U.S., 29 states have enacted laws with respect to the regulation of autonomous vehicles.
- U.K.
In the United Kingdom, Heathrow Airport has been using electric driverless pods for over a decade now. The U.K. is now working on building an autonomous-vehicle-friendly environment for public testing of these vehicles in four different cities. The Global Automotive Consumer Study in 2018 found that nearly 49% of people believe that self-driving or autonomous cars will be unsafe. This is far lower than the corresponding figure of 73% in 2017.
In 2021, the U.K. has been working on a legal proposal to allow self-driving automated lane keeping systems at speeds of up to 37 mph, after receiving very mixed reactions from experts in 2020. This would allow drivers to take back control during unplanned situations like weather-related emergencies and road construction.
- Germany
In July 2021, the Federal Act Amending the Road Traffic Act and the Compulsory Insurance Act came into effect in Germany, allowing autonomous cars to operate on their own, without the presence of a driver, in specified areas on public roads.
- India
The Indian Government has not reacted or made any decision regarding autonomous vehicles, as their impact is not fully understood. Drones have been banned for this very reason, and even Google is not allowed to use Street View, which would enable a user to see 360-degree panoramic street imagery.
Conclusion
Companies like Tesla, Ocado, Starsky Robotics and Otto have been working on developing autonomous vehicles. As of 2020, Tesla cars can be classified as Level 2 autonomous vehicles. At this level, the car still needs a driver at all times who can take control in case of an emergency. Waymo originated at Google, but as of 2016 its cars had been involved in over 14 collisions, and in one of them the car's software was responsible for the crash.
Therefore, it can be concluded that no matter how good the programming of an algorithm is, if ethics is ignored and legislation is not made in accordance with it, there can be trouble when an actual tragedy occurs. For instance, if a crash happens due to technological errors, a case can be made against the manufacturer. Ethics and decision-making will always be a grey area. Therefore, it is the need of the hour for countries to frame legislation to regulate the manufacturing, programming and use of autonomous vehicles.
End-Notes:
- Preparing for a Driverless Future (2021). Available at: http://www.nishithdesai.com/fileadmin/user_upload/pdfs/Research%20Papers/Preparing_For_a_Driverless_Future.pdf (accessed 20 November 2021).