Jared has never been a fan of new-age technology, and when it comes to autonomous vehicles, he doesn't shy away from telling us exactly how he feels.
When it comes to cars, the only seat I have ever been comfortable in is the driver's seat. Even when I am sitting in the front passenger seat, I get nervous when someone else is driving. You could say I have some serious trust issues, and I will agree with you wholeheartedly. Yes, when I don't have a choice I do sit as a passenger, but I try to avoid it whenever possible. However, if someone ever asked me to get inside a fully autonomous vehicle, I would say, "Hell no!"
The hype over autonomous driving in the modern era was first kickstarted in Hollywood, when Tom Cruise drove around in a self-driving Lexus in the action thriller Minority Report. Sure, it seemed cool, but it wasn't pure fiction either, because autonomous technology was already in development at the time. Today, with some of the tech we see in the new BMW 7 Series, the new Audi A8 and all Tesla vehicles, autonomous driving is almost a reality. Even tech giants like Google and Apple are experimenting with the development of self-driving cars.
These driverless cars are currently being tested in the real world, on public roads, and I was never too happy about that. I mean, how much can you really trust a machine? Haven't you guys seen Terminator? All jokes aside though, this year saw some very serious incidents unfold when it came to autonomous car testing. A Tesla driver was killed in an accident after he ignored warning signs while the vehicle was engaged in Autopilot mode. Even if the accident was caused by the driver's own negligence, the fact of the matter was that the car was driving itself and couldn't take any action to prevent the accident on its own. If a car cannot act on its own warning system, then that vehicle should never be allowed to drive itself in the first place. More recently, the autonomous-car industry faced even more scrutiny and criticism after a self-driving Uber killed a pedestrian in the US. Again, Uber claims the safety driver was not paying attention, but the fact of the matter is that the car was driving on its own and it ran over a pedestrian and killed her.
Even after these incidents, the software developers for Tesla and Uber both insisted that these self-driving cars are 3.7 times less likely to be involved in a fatal crash than a regular car. The truth of the matter, though, is that if those drivers hadn't put their trust in the vehicles' warning systems and had kept their hands on the steering wheel the whole time, these accidents would never have happened. The driver should always be in complete control of the car, and if the car does detect any kind of imminent danger, it can warn the driver while he is already in control.
We all own some form of technology that we use on a daily basis: a smartphone, a microwave, a laptop, just to name a few. Now, how many times has your phone frozen, and how many times has your laptop crashed or your microwave gone kaput? I'm sure this has happened to a lot of people on many occasions, so who's to say that the software, or the GPS signal, or the sensors in an autonomous vehicle won't ever malfunction? Self-driving cars? Thanks, but no thanks.