Electric cars are the way of the future. I don't think there is any room for debate here. Every major car manufacturer is looking at going electric, if not pledging to go fully electric by the mid-2020s. What is up for debate is whether we will be in control of those electric cars, or whether they will be driving themselves. I'm inclined to believe that there won't, for a very long time at least, be a worthy substitute for human judgment behind the wheel. Cybersecurity firm McAfee helped my argument when its researchers used a deviously simple, low-tech hack to completely manipulate the behavior of the most popular all-electric car in the world, a Tesla.
When it isn't stealing upgrades from second-hand buyers, as I discussed in my other post, Tesla is typically paving the way for the future of automotive travel. The success of the all-electric car is owed in large part to Tesla. Its current big push, self-driving cars, has caught yet another snag. McAfee used tape to make a speed limit sign reading 35 miles per hour appear to read 85 miles per hour. There was no electronic trickery, no code manipulation. Just some tape. The unbelievable part is that when the Tesla's camera saw the sign, the car accelerated by 50 miles per hour on its own. Tesla argues that the technology in the tested car, a 2016 model, is outdated, and the camera's maker, Mobileye, says the camera is just one facet of the vehicle's detection system and that its failure is not indicative of a larger issue. I believe it is.
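To make the failure concrete, here is a minimal sketch of why trusting a single camera reading is so fragile. This is not Tesla's or Mobileye's actual code, which is proprietary; the function name, thresholds, and map cross-check below are all hypothetical, and exist only to show the kind of plausibility checks that would keep a taped "35" from turning into 85 miles per hour of target speed.

```python
from typing import Optional

MAX_JUMP_MPH = 20  # hypothetical cap on how much a new sign may raise the limit


def update_speed_limit(current_limit_mph: int,
                       camera_reading_mph: int,
                       map_limit_mph: Optional[int] = None) -> int:
    """Return the speed limit a cruise controller should adopt.

    A naive system simply returns camera_reading_mph, which is exactly what
    lets a taped "35" that reads as "85" translate directly into acceleration.
    """
    # Sanity check 1: cross-reference map data when it is available.
    if map_limit_mph is not None and abs(camera_reading_mph - map_limit_mph) > MAX_JUMP_MPH:
        return map_limit_mph

    # Sanity check 2: never let a single sign raise the limit by a huge step.
    if camera_reading_mph - current_limit_mph > MAX_JUMP_MPH:
        return current_limit_mph

    return camera_reading_mph


# The taped-sign scenario: cruising under a 35 mph limit, the camera reads 85.
print(update_speed_limit(current_limit_mph=35, camera_reading_mph=85))        # 35
print(update_speed_limit(35, 85, map_limit_mph=35))                           # 35
```

Under either hypothetical check the car simply ignores the implausible reading. The actual systems involved almost certainly do far more than this, which is what makes it so striking that a strip of tape was enough.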
I am a huge fan of cars. I believe that public transportation is far more efficient, economical, and better for the environment, but cars are still incredible feats of human ingenuity, and damn are they fun to drive. While part of my disdain for self-driving vehicles can be chalked up to my love of the physical act of driving, the other part comes from everything I have learned as a computer scientist. Computers are only as smart as we make them; we can't create a computer smarter than ourselves. They can be more efficient, quicker, and better at remembering, but they ultimately rely on us.

Driving is incredibly dangerous, and I don't believe there will ever be a time when algorithms can make up for the situation Tesla ran into a number of years ago, when the parameters passed into the function are "Do I hit this person and protect the driver?" or "Do I swerve away and save the pedestrian, killing the driver?" In these situations, unfortunately, that question must be answered, but machines have no nuance or ethics, and they suffer no consequences for their actions, so they should not be responsible for deciding who lives or dies. If a piece of tape can force one of these machines to operate in an unsafe way, imagine what manipulation of its code could do. I don't in any way think that what McAfee did should discredit or halt the advancement of self-driving technology, but it should create a serious discourse on whether the current desired application of self-driving cars, the removal of any need to drive at all, is a good idea.
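Since I framed that choice as parameters passed into a function, here is what the framing looks like when you actually write it down. This is a deliberately crude, made-up sketch, not anything any manufacturer has published; the function name, weights, and inputs are all hypothetical. The point is that whatever answer the machine gives reduces to constants a human engineer chose long before the crash.

```python
# Hypothetical weights: how much each party's harm "counts" in the comparison.
# These two constants ARE the ethics; the math below is trivial, the judgment is not.
PEDESTRIAN_WEIGHT = 1.0
DRIVER_WEIGHT = 1.0


def choose_maneuver(p_harm_pedestrian_if_straight: float,
                    p_harm_driver_if_swerve: float) -> str:
    """Pick the maneuver with the lower weighted expected harm."""
    stay_cost = PEDESTRIAN_WEIGHT * p_harm_pedestrian_if_straight
    swerve_cost = DRIVER_WEIGHT * p_harm_driver_if_swerve
    return "swerve" if swerve_cost < stay_cost else "stay"
```

However sophisticated the surrounding system, somewhere in it a comparison like this one decides the outcome, and no consequence ever lands on the code that made it.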
Source Article: https://www.technologyreview.com/s/615244/hackers-can-trick-a-tesla-into-accelerating-by-50-miles-per-hour/