Humans, as a species, have an innate drive to seek out easier ways of living through technological advancement. Thanks to this evolution, the world we live in today is dominated by the pursuit of efficiency, with the automotive industry a leading example of developing and upgrading products to meet customer demand. From the first car manufactured in 1886, the exponential rise of automotive ingenuity saw the prototype for the first “autonomous” car developed in 1925, and the evolution of that idea has remained at the forefront of vehicular technology ever since. In a modern setting, the incremental growth of Tesla, founded in 2003 and named after the famous inventor Nikola Tesla, under the leadership of Elon Musk, brought about the introduction of its “Autopilot” system. The feature took the market by storm in 2015 and, according to reports, helped grow company revenues from around $4 billion that year to approximately $31.5 billion by 2020. Boasting features such as 8 cameras providing 360 degrees of visibility at a range of up to 250 meters, alongside 12 ultrasonic sensors, Tesla advertises its vehicles as having “full self-driving capabilities” and presents an idealised revolution in the way we drive.
However, several tragic deaths involving autonomous driving have been recorded since its introduction on roads internationally. Notably, in December 2021 a Parisian taxi, a Tesla Model 3, was said to have stopped at a red traffic light and then suddenly sped forward, “hitting and dragging with it a cyclist who later died”, according to a police source, who also confirmed a further 20 people injured. A representative of the French government stated that, according to Tesla, “there is no indication that a fatal accident in Paris involving a Tesla Model 3 taxi was caused by a technical fault”.
The ensuing inquiries from governmental agencies worldwide reflect escalating concern over the philosophical debate about whether the vehicle or its operator is at fault when autonomous-vehicle-related deaths are investigated. In July 2020, a German court ruled that Tesla had made misleading claims about Autopilot’s current and future capabilities, calling into question the unique selling point of Tesla’s Autopilot-equipped vehicles, since physical human intervention remains paramount to public safety. Members of the U.S. Congress also raised the morality of Tesla’s marketing strategy with the Federal Trade Commission (FTC). Labelling the vehicles “a threat to motorists and other users of the road” because of overstated claims about their advertised self-driving competencies, they criticised the colossal car company and its CEO for a perceived lack of accountability. Subsequently, in an interview with Time magazine after being named its ‘Person of the Year 2021’, Elon Musk responded to the growing pressure by saying, “you’re not going to get rewarded necessarily for the lives that you save, but you will definitely be blamed for lives that you don’t save.” While Consumer Reports warned users that the name “Autopilot” is a “misnomer”, the debate continues to develop following the publication of an official report by the Law Commission for England, Scotland and Wales, which implies that drivers of self-driving cars should face leniency in potentially fatal and injury-related incidents involving the vehicle. Moreover, the Commission proposes “the creation of an Automated Vehicles Act” to reflect the “profound legal consequences” of self-driving cars.
We at Pocket Box promote innovation, especially through forward-thinking companies like Tesla, but would like to remind all self-driving car owners and potential purchasers of an autonomous vehicle to always be aware while operating ANY vehicle and to remain vigilant for unforeseen technical issues that may affect its functionality.