I'm aware of the NASA/JPL rules for developing safety-critical software[2] but I'm not sure if any car manufacturers follow anything similar.
Does anyone here have any knowledge of the software development practices of any automakers and what they do to ensure safety and reliability? And is there anything else I can do to mitigate this risk (short of buying a very old car, which would have other safety downsides)?
[1] https://en.wikipedia.org/wiki/Sudden_unintended_acceleration [2] http://spinroot.com/gerard/pdf/P10.pdf
Either way, if you've owned a fuel-injected car, you've already been exposed to these issues. You would have to go buy a carbureted engine from the 80s or earlier to get away from these "unintended acceleration" issues, because in the end a car with EFI has a computer actually controlling the injection. I'd be far more wary of daily driving an 80s or older car from a general safety standpoint than because of a software issue. You're far more likely to be t-boned at an intersection than to have a software glitch cause an accident; a much more modern car will do more for you from a crash-safety standpoint than a carburetor will.
There are a ton of things that can go wrong in a car and cause an accident. The software stack is surely one of them, but even a 100% mechanical car can fail in a lot of ways too. Ever had vacuum hoses fail on an old car? Carburetors get stuck or clogged? Personally, I'd prefer a computer controlling components directly instead of tons of vacuum lines and springs trying to keep things tuned right. On top of that, I also get much better efficiency and fewer harmful emissions, which hurt my family and my neighbors.
To answer your last question first: buy a car that hasn't been launched within the last 12 to 18 months. That's not software specific, that's general vehicle safety across the board, as the manufacturer will still be working through the initial warranty issues. So if you are looking at second-hand cars and you know model ABC was launched in 2016, don't buy one made in the 2016/2017 period.
ISO 26262 gives every system a criticality rating; systems with an ASIL rating of C or D have multiple backup mechanisms in place. This falls under functional safety, which is a newer (five years or so) discipline addressing the fact that cars are now highly complex, interconnected systems linked by software. The idea is that you target specific subsystems to make sure their function isn't totally taken out by some failure or error in the wider system.
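To make the "multiple backups" point a bit more concrete, here is a toy sketch of the kind of plausibility check a redundant-sensor design at higher ASIL levels might use. Every name and threshold below is invented for illustration, not taken from any real ECU.

    #include <stdint.h>

    /* Toy plausibility check between two redundant pedal-position sensors,
     * the kind of cross-check an ASIL C/D design might require. Names and
     * thresholds are invented for the example. */

    #define MAX_SENSOR_DELTA  5u   /* max allowed disagreement, in percent */
    #define MAX_FAULT_COUNT   3u   /* debounce before latching a fault */

    static uint8_t fault_count;

    /* Returns the pedal position (0..100 %) to use, or -1 to request limp mode. */
    int pedal_position(uint8_t sensor_a_pct, uint8_t sensor_b_pct)
    {
        uint8_t delta = (sensor_a_pct > sensor_b_pct)
                            ? (uint8_t)(sensor_a_pct - sensor_b_pct)
                            : (uint8_t)(sensor_b_pct - sensor_a_pct);

        if (delta > MAX_SENSOR_DELTA) {
            if (++fault_count >= MAX_FAULT_COUNT) {
                return -1;  /* sensors disagree persistently: fall back to a safe state */
            }
        } else {
            fault_count = 0;
        }

        /* Use the lower reading so a stuck-high sensor can never command
         * more throttle than the driver actually asked for. */
        return (sensor_a_pct < sensor_b_pct) ? sensor_a_pct : sensor_b_pct;
    }

The design choice worth noticing is that disagreement doesn't try to pick the "right" sensor; it degrades to the least dangerous interpretation and eventually to a limp mode.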
Cyber-security wise, there is an EU regulation coming in from 2024 aimed at making sure OTA updates are safe, reducing hacking attack vectors and the like. It is being introduced for new cars and designs as a result of the issues cited above.
As for people hacking in via the infotainment to access the car's control systems: there are firewalls between infotainment and primary car control to mitigate that issue. There are multiple networks in a single vehicle, isolating systems so that no single non-critical system (infotainment, for example) can take out the whole vehicle.
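Purely as an illustration of what such a gateway "firewall" does in practice, here is a tiny default-deny filter sketch. The CAN IDs, names, and policy are made up for the example, not any manufacturer's actual implementation.

    #include <stdbool.h>
    #include <stddef.h>
    #include <stdint.h>

    /* Toy gateway filter: only a short whitelist of CAN IDs may cross from
     * the infotainment network to the powertrain network. */

    static const uint16_t allowed_from_infotainment[] = {
        0x3A0,  /* e.g. ambient lighting request (invented ID) */
        0x3A1,  /* e.g. HVAC setpoint (invented ID) */
    };

    bool gateway_allows(uint16_t can_id)
    {
        for (size_t i = 0;
             i < sizeof(allowed_from_infotainment) / sizeof(allowed_from_infotainment[0]);
             i++) {
            if (allowed_from_infotainment[i] == can_id) {
                return true;
            }
        }
        return false;  /* default deny: everything else stays on its own bus */
    }

The point is the default-deny direction: the powertrain side never has to trust anything the infotainment side says unless it is explicitly whitelisted.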
Software in cars at this level is new, it's evolving, and it takes seven or so years to create a new platform. That means there is a lag in the system, especially during this transitional period.

However, car makers take this stuff incredibly seriously, and their software teams are absolutely not run the way a lower-consequence dev shop is. Lives are on the line, and the kind of devs who work in this field know that.
Nothing is perfect but the safety downsides of an old car are widely considered to be far greater than the threat of hacking or bad code in a new car.
The reality is that this is the current state of affairs. Most of the people doing software for cars haven't the foggiest idea what software is really about.

All the software I read is just impossible to understand, and in many cases no standard helps.
Some examples I've seen in code:

- Use of a kind of Hungarian notation, to the point that a loop variable was named something like "uibe32bb_i_lns"
- Comments in human languages other than English
- Use of recursion
- Calls like name1::name2::name3::name4::name5::name6::name7::name8::name9::name10::name11::name12::name13::name14, where the names were some kind of Hungarian notation too, and those calls were everywhere in the code
- Lines more than 1000 characters wide, as a rule
- Files north of 100 kB of code

I can go on and on and on....
Some examples of exchanges with people:
1) A software architect for an ECU: a programmer asks for the memory and CPU budget for a function. The reply was "I'm the architect, I've no idea what you are talking about."
2) The chief system architect for a very important project at a big automaker: an engineer says something about software errors. The architect interrupts and explains that the software never makes an error, because a computer only does what it is told to do. -- That is terrible enough on its own (it ignores, for example, the possibility of an SEU), but he goes further and says that no testing of any kind is necessary because, as stated, the software makes no errors.
Some general points:

- 99% of people in "SW" do not know what gdb is; they debug with "cout <<".
- I found nobody who knows what tail recursion is (see the sketch below for what the term means).
- 90% are only able to program, to some extent, in one of C++ or Python, and in no other language.
- Mentioning Ada, Lisp, or Forth will trigger a waterfall of insults about how those are old and should never be used.
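For anyone wondering about the tail recursion jab, here is a minimal, made-up illustration: a tail-recursive sum next to the loop a safety-critical codebase would actually use, since rules like the NASA/JPL set linked above ban recursion so that stack usage stays bounded and analyzable.

    #include <stdint.h>

    /* Tail-recursive sum: the recursive call is the last thing the function does. */
    uint32_t sum_tail(const uint32_t *a, uint32_t n, uint32_t acc)
    {
        if (n == 0) {
            return acc;
        }
        return sum_tail(a + 1, n - 1, acc + a[0]);
    }

    /* The equivalent loop, which is what recursion bans push you toward:
     * same result, constant stack depth. */
    uint32_t sum_loop(const uint32_t *a, uint32_t n)
    {
        uint32_t acc = 0;
        for (uint32_t i = 0; i < n; i++) {
            acc += a[i];
        }
        return acc;
    }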
I keep buying the most basic cars. I'm genuinely terrified at the thought of anything automatic in my car.
That means you can get OTA updates that will work flawlessly 99% of the time, but one day one may not, and it will be the day you are in a rush early in the morning.
Since most connected cars are de facto owned by their vendor, a potential breach or deliberate sabotage might brick ALL of them at once, across the globe or in specific areas/countries.
...
A modern car is a car co-piloted by a human and a computer. A local, air-gapped computer might have bugs; a connected one might also have vulnerabilities. Be more scared of the latter.
In purely local safety terms, I can say most cars I know are still partially mechanical. That means, for instance, your steering wheel can auto-steer, BUT with (more than) a bit of force you can still steer it mechanically even if the automation fails completely. Similarly, the brake pedal has servo assistance but still works partially through a mechanical linkage, so it might become very hard to push but you can still brake a bit.
The most dangerous common design choices I know of are:

- the impossibility of turning off certain ADAS features that can behave really badly in certain weather conditions, like the classic case of ABS on icy roads;
- automatic door locks that engage when the car moves, with NO DAMN WAY to unlock them while the car is still moving;
- the manual parking brake has disappeared, so a form of emergency braking ALSO usable by a passenger (for instance if the driver suddenly falls ill) is ABSENT, and there is no electronic replacement either, since the electronic one, where present, refuses to engage while the car is moving;
- cockpit designs that make it very hard and slow for a passenger to push the driver's feet off the accelerator, etc., if he/she suddenly falls ill.

I consider the above a sign of VERY BAD design, so I doubt that those who made it can be trusted with anything else in safety terms...
2. The code, in many cases, is probably an unmaintainable mess. Embedded programming is not always modern programming, for good and bad.
3. Today, the computers in cars are doing more, and the systems are more complex. It's reasonable to expect more serious problems as a result.
4. Companies do safety testing, of course, but there's no such thing as "100%" test coverage for complex physical machines running outside of a lab.
5. The best way to judge the safety of cars is the best way to judge safety for airplanes: let other people test them out for a while and then check whether or not they report problems.
Optimize this problem by buying a car with the best safety rating. That is something that can be objectively measured, both in crash-testing labs and from reviews of real-world crash results. Assume a crash can happen regardless of what you do, since it is largely out of your control, and optimize for the best odds of surviving one unharmed.
[1] https://asrg.io/
Plus OEMs have a vast parts and software supply chain that can be compromised.
I suspect that within a couple of years we could see a massive incident, like ransomware, that disables the entire fleet of a single OEM globally. Imagine every Mercedes around the world just ceasing to operate: that kind of incident.
Once you've committed to never driving after having had a drink (and surely never more than 1 drink), never driving while tired or on medication, have completed several advanced driving courses/car control clinics, chosen the top cars based on safety and crash testing, only then might it make sense to use software development methods as a tie-breaker to pick a car.
I am never going to put my life in the hands of some software doing image analysis using machine learning.
> https://illmatics.com/carhacking.html
is a good starting point. But there are a bunch of buses in a modern car; some of them are critical, some less so. Some are firewalled off, others are open.
As you know, you can get access to a lot of the car's inner workings by plugging into the OBD-II port. It's perfectly possible to brick some cars by fuzzing that port.
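To give a feel for how thin the layer between the OBD-II port and the internal bus is, here is a minimal Linux SocketCAN sketch that sends the standard OBD-II "engine RPM" request. It assumes a can0 interface wired to the diagnostic port, which is an assumption about your setup; the same raw access is what makes careless fuzzing dangerous.

    #include <stdio.h>
    #include <string.h>
    #include <unistd.h>
    #include <net/if.h>
    #include <sys/ioctl.h>
    #include <sys/socket.h>
    #include <linux/can.h>
    #include <linux/can/raw.h>

    int main(void)
    {
        /* Open a raw CAN socket on "can0" -- assumes a Linux box with a CAN
         * adapter wired to the car's OBD-II pins. */
        int s = socket(PF_CAN, SOCK_RAW, CAN_RAW);
        if (s < 0) { perror("socket"); return 1; }

        struct ifreq ifr;
        strcpy(ifr.ifr_name, "can0");
        if (ioctl(s, SIOCGIFINDEX, &ifr) < 0) { perror("ioctl"); return 1; }

        struct sockaddr_can addr = {0};
        addr.can_family = AF_CAN;
        addr.can_ifindex = ifr.ifr_ifindex;
        if (bind(s, (struct sockaddr *)&addr, sizeof(addr)) < 0) { perror("bind"); return 1; }

        /* Standard OBD-II request: functional address 0x7DF, service 0x01
         * ("show current data"), PID 0x0C (engine RPM). */
        struct can_frame req = {0};
        req.can_id  = 0x7DF;
        req.can_dlc = 8;
        req.data[0] = 0x02;  /* two meaningful bytes follow */
        req.data[1] = 0x01;  /* service 0x01 */
        req.data[2] = 0x0C;  /* PID 0x0C: engine RPM */
        if (write(s, &req, sizeof(req)) != sizeof(req)) { perror("write"); return 1; }

        /* ECUs reply on 0x7E8..0x7EF; RPM = (256*A + B) / 4. */
        struct can_frame rsp;
        if (read(s, &rsp, sizeof(rsp)) == sizeof(rsp) &&
            rsp.can_dlc >= 5 && rsp.data[2] == 0x0C) {
            printf("engine RPM: %u\n", (256u * rsp.data[3] + rsp.data[4]) / 4u);
        }

        close(s);
        return 0;
    }

Reading a PID like this is harmless; the trouble starts when people blast the same port with diagnostic or proprietary frames the ECUs weren't expecting.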
In principle, most things in cars _should_ fail safe, even if they are electric or talking over a bus of some sort.
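"Fail safe" in practice often just means "don't keep acting on stale data." Here is a hedged sketch of that pattern, with the names and the 100 ms deadline invented for the example.

    #include <stdint.h>

    /* Toy fail-safe pattern: if torque commands stop arriving over the bus,
     * fall back to a safe default instead of holding the last value. */

    #define TORQUE_TIMEOUT_MS 100u   /* assumed freshness deadline */

    static uint32_t last_cmd_time_ms;
    static int16_t  last_cmd_torque_nm;

    /* Called whenever a valid torque-command frame is received. */
    void on_torque_command(int16_t torque_nm, uint32_t now_ms)
    {
        last_cmd_torque_nm = torque_nm;
        last_cmd_time_ms   = now_ms;
    }

    /* Called from the periodic control task: returns the torque to apply. */
    int16_t torque_to_apply(uint32_t now_ms)
    {
        if ((now_ms - last_cmd_time_ms) > TORQUE_TIMEOUT_MS) {
            return 0;  /* bus silent or sender dead: fail safe rather than act on stale data */
        }
        return last_cmd_torque_nm;
    }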
As a hacker like any other who realized that all the supposedly ultra-safe, American-quality (TM) software in mission-critical applications is in fact less secure on average than random amateur projects, I have been worried about software in vehicles for 20 years. I correctly predicted that it would lead to remote-control vulnerabilities such as the uConnect vulnerability disclosed a decade later. There are obviously more such vulnerabilities out there; it's just that nobody is researching this. I also suggest people start looking at HVAC.
In 2015, some security researchers found a vulnerability in the Chrysler Uconnect software which allowed them to connect to the car's IP address (yes, each car had an IP address, which you couldn't get rid of) and control the vehicle (as in actually control it). There were 1.5 million vehicles, IIRC, that were vulnerable to this. So if a bad guy had found it first, he could have controlled all those vehicles at once from the comfort of his home, probably causing 10% of them to crash and kill people (given that maybe one in ten average modern drivers would panic, or not panic but still fuck up, at the slightest surprise on the road).
I also am of the opinion that people regularly die from software faults in vehicles, but we just haven't figured this out yet.
What are the NASA/JPL rules? Some more MISRA C crap where it's just about making the code more "readable"? Most "software engineers" have extremely wide gaps in their understanding of basic things, from programming to math to physics. The problem has much more to do with that than with cute little best-practice recommendations.
However, most of the safety-system software is held to a very high standard, and much of it runs in embedded systems where the surface area for software foot-guns (such as state) is minimal. I wouldn't worry about buying a new car for these reasons, though I would try to find one with as many physical buttons as possible.
In my personal experience, the automotive industry has a problem with aggression and dishonesty, both of which seem to go hand-in-hand.
Both of these cultural traits tend to have a negative impact on quality and safety.
Due to regulations you will not be able to find a non-veteran vehicle without those systems, nor would you want to, but BMW, Mercedes, Subaru and Lexus still have models which are well balanced and don't rely on them to such a heavy degree. This would be my advice as well.

Disclaimer: I am not against (almost) perfectly deterministic safety systems such as ABS. On the contrary, I consider them a massive advantage, practically mandatory.
When will auto companies wake up and realize that physical controls are better in every way?
Any car running a CAN bus is vulnerable to a potentially fatal attack. This has not been resolved, and you also generally cannot avoid it: even a base-model Honda Civic is vulnerable to attacks on its drive-by-wire system. In a less morbid sense, most modern cars cannot even be serviced at home without going to the dealer for a reset of some subsystem; ABS comes to mind.
I wouldn't knock an old car. A 25-year-old car has 99% of the safety features of a modern car and, in good working order, will protect you just the same. Or maybe I just don't worry about it because the probability of anything worse than a minor fender-bender killing you is pretty high even with modern tech.