Have you watched this video?
If not, you’re living in the past. Self-driving cars are coming, that’s something that we all know by now, and that should not scare us, even though many people are trying to demonize this technology.
I’m not writing this post to debate whether or not self-driving cars will kill the driving experience; what I want to ensure with this post is that ENGINEERS WON’T CODE ANYTHING IN YOUR CAR THAT WILL KILL YOU OR KILL OTHERS. Engineers are smart people, cars need to pass lots of safety tests before they are put on the market, and, most important of all, the main goal behind self-driving cars is to bring safety as close to 100% as possible in order to preserve human life.
Using the same scenario as the video, I will try to explain why engineers don’t need to program your car to kill you or to kill someone else, but rather to save your life and the lives of others.
This is the scenario:
- You have a fully connected car, equipped with tons of sensors (cameras, proximity sensors, GPS, heat sensors…) all around the car, processing in real time.
- You have a truck in front of you, loaded with dangerous materials that could fall at any time.
- You have one vehicle alongside you on each side. Let’s say, a car with 4 people inside on your left; on your right, a motorcyclist without a helmet.
Under this scenario, what the video proposes is that, in case the truck loses its load, the car would have 3 different choices, and every one of those options involves, with no other alternative, putting your life or someone else’s life in danger:
- If the car decides to go straight, you will hit the load and/or the truck, and therefore, YOUR LIFE MIGHT BE IN DANGER.
- If the car decides to avoid the collision with the truck, two things might happen:
- In order to preserve as many lives as possible, the car decides to go right, so there’s a big chance of hitting the motorcyclist, putting THE MOTORCYCLIST’S LIFE IN DANGER.
- Your car has detected that what you have on your left is a car, so in case of a collision with it, there is a bigger chance for its passengers to survive; therefore the car, knowing that, decides to go left and hits the car, putting THE 4 PASSENGERS’ LIVES IN DANGER.
Oh man! There is no escape… there is no hope for humanity. Engineers will have to decide whose lives are more valuable, and we are all going to die!
Well, eventually, we are all going to die, but it won’t be because your car is trying to kill you. Think about it again for a second: you have a self-driving car, full of sensors, with tons of data being processed in real time, so the question is…
Is that scenario valid?
Absolutely not! You have a ‘smart car’! You should never even be in that scenario. Because your car has cameras, it can easily detect that what it has in front is a truck with a dangerous load, so it can maintain a proper safety distance; the car would also know that it has vehicles on both sides, and therefore it would reduce speed as necessary until NO ONE IS IN DANGER, and, when possible, it would change lanes, avoiding any dangerous situation. The car can even detect what’s approaching it and plan many alternatives ahead of time, rather than deciding whose life should be in danger.
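To make this concrete, here is a minimal sketch of that defensive logic in Python: keep a generous following distance behind a flagged hazard and slow down when boxed in, rather than ever choosing whom to hit. All the names, thresholds, and the 2-second rule here are my own illustrative assumptions, not a real autopilot API.

```python
# Hypothetical sketch of the defensive behavior described above.
# Everything here is an illustrative assumption, not a real vehicle API.

from dataclasses import dataclass

@dataclass
class Surroundings:
    gap_ahead_m: float          # distance to the vehicle in front, in meters
    hazardous_load_ahead: bool  # cameras flagged a dangerous load on the truck
    left_lane_clear: bool
    right_lane_clear: bool

def safe_following_distance(speed_mps: float, hazard: bool) -> float:
    """Roughly the 2-second rule; doubled behind a flagged hazardous load."""
    base = 2.0 * speed_mps
    return base * 2 if hazard else base

def plan(surroundings: Surroundings, speed_mps: float) -> str:
    """Pick a non-destructive action well before any collision is imminent."""
    needed = safe_following_distance(speed_mps, surroundings.hazardous_load_ahead)
    if surroundings.gap_ahead_m >= needed:
        return "keep_lane"         # distance is already safe: nothing to do
    if surroundings.left_lane_clear:
        return "change_lane_left"  # move away from the hazard
    if surroundings.right_lane_clear:
        return "change_lane_right"
    return "slow_down"             # boxed in: open the gap until a lane clears

# The video's scenario: dangerous load ahead, vehicles on both sides, 25 m/s (~90 km/h).
state = Surroundings(gap_ahead_m=60.0, hazardous_load_ahead=True,
                     left_lane_clear=False, right_lane_clear=False)
print(plan(state, 25.0))  # → slow_down (60 m gap < doubled 100 m safety distance)
```

The point of the sketch is that the dilemma never materializes: the car acts on the hazard seconds earlier, when "slow down" is still on the menu.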
Engineers won’t program your car to kill anyone, but rather will make use of all the technology available in order to avoid these kinds of dangerous situations. Self-driving cars are there to make your life and others’ lives easier, and more comfortable.
Think about it for a second: the car will always avoid dangerous situations, and for every possible scenario, engineers are working full time to ensure that there is always a solution that will TAKE YOU HOME SAFELY.