Utterly irresponsible, but a critical reminder of why security is important

I’m glad that they responded so quickly.

@ mtylerjr - I really don’t care what people call me, as long as no one attempts to prevent me from feeding my family using the skills I’ve learned over the years. Not at all a fan of regulation and legislation that seeks to limit who can perform certain jobs based on arbitrary criteria.

But then, as someone who holds neither a CS nor an EE degree, that’s not surprising. There’s much I’d not have been able to do in my life if the appropriate (in someone’s mind) degree had been demanded at every turn. I’d hate to see others miss out on opportunities because of a few bad eggs.


@ devhammer - I understand where you’re coming from, but look at it this way: should people require a license to drive? The license isn’t hard to get. Driving isn’t hard to do. But a few bad eggs have made these things mandatory. I’d actually encourage you to look into the IEEE licensure exam, if only for your own edification (and I should as well).

You don’t actually have to have gone to college to pass these things. Politicians are very reactionary. We have to be prepared for the day when the public demands a higher standard of developer for mission-critical applications. As a Mechatronics Engineer (self-proclaimed), I predict that when robots become a household thing, some sort of mishap will occur, and it will be the software side (especially security) that gets called into question.

So today, in medical applications, there’s a requirement for deliberate engineering work so the product can pass the approval process appropriate to the circumstances in which it’ll be used (a blood pressure monitor is different from a life support system, for example). That means all those devices carry higher costs but bring a higher level of reassurance. The organisation is deliberate about design, including safety, which helps ensure a more predictable outcome. (BTW, I recommend listening to the Embedded podcast; Elecia and Chris have worked in this field and have had guests who work in it: http://embedded.fm/ ). I don’t know if that requires a “licensed practitioner” to sign off the work (like the PE exam allows in, say, bridge building - that’s what I gather from listening to the Engineering Commons podcast), but it certainly brings a rigour to the release process.

Then compare that to what we see in automotive engineering. There is huge rigour around the mechanical aspects of vehicles: crash impact, passenger protection, fire protection, and so on. That work has been going on for many decades to ensure appropriate measures are taken to protect humans both inside and outside the vehicle. But what we have seen over the past ten years is the expansion of electronics within a vehicle, an even bigger expansion in our dependence on those systems, and the growing inter-connectedness of vehicle systems. What needs to happen is for the rigour around the control systems in cars to step-change and address security as an integral part of the process. That is just a maturity thing - who would have thought five years ago that a car would be permanently on the internet? Now that’s common, so the measures taken to address it need to improve. But it’s also not a simple matter of regulations. Some of this should be self-governing - being “grown up engineers” as well as having minimum requirements for controls, rather than some litigation-driven requirement that the firmware engineer be licensed.

I can see multiple failures here. The car manufacturer bought a component, the radio/AV head unit, that had a flaw exposing the device to attack; that company needs to up its game on security so a vulnerability like that doesn’t get through. It’s a vulnerability on multiple levels, too: not only could attackers come in from the internet (which should be a pretty simple thing to block!), they could also overwrite the firmware (IFU gone wrong!). It then appears that the car manufacturer put the head unit on the same CAN bus as the primary engine and vehicle control bus, which to me doesn’t make sense when it should be on a messaging bus at best. It also seems the decision on which head unit went into which models was a “feature” decision (which is entirely usual), so if you bought the vehicle model with the smaller screen size or whatever the feature was, you weren’t affected… The fact that these independent decisions resulted in an exploitable flaw just shows the complexity of integrating things in this componentised world.
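To make that bus-segmentation point concrete, here’s a minimal sketch of the principle: a gateway between the infotainment bus and the drivetrain bus that forwards only an explicit whitelist of message IDs and drops everything else by default. The IDs, names, and frame layout here are hypothetical illustrations, not Chrysler’s actual architecture or any real CAN stack.

```python
# Hypothetical sketch: a gateway that only forwards whitelisted message IDs
# from the head unit's bus to the vehicle control bus. Illustrative only.

from dataclasses import dataclass

@dataclass
class CanFrame:
    arbitration_id: int   # 11-bit standard CAN identifier
    data: bytes           # up to 8 payload bytes on classic CAN

# Only these IDs may cross from the infotainment side to the control bus,
# e.g. a "request vehicle speed for display" message. IDs are made up.
FORWARD_WHITELIST = {0x3E8, 0x3E9}

def gateway_forward(frame: CanFrame) -> bool:
    """Return True if the frame may be forwarded to the drivetrain bus."""
    if frame.arbitration_id not in FORWARD_WHITELIST:
        return False      # unknown ID from the infotainment side: drop it
    if len(frame.data) > 8:
        return False      # malformed classic-CAN frame: drop it
    return True

if __name__ == "__main__":
    benign = CanFrame(0x3E8, b"\x01")        # whitelisted display request
    spoofed = CanFrame(0x208, b"\xFF" * 8)   # pretend brake/steering command
    print(gateway_forward(benign))    # True
    print(gateway_forward(spoofed))   # False
```

The point isn’t the code itself; it’s that “head unit traffic is untrusted by default” should be a design decision made deliberately at integration time, not something discovered after an exploit.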

I still say that commercial realities will have a role here. You can’t expect a developer to say no to shipping a product when the alternative is losing their job - not everyone is in an environment where the next job is easy to find. That doesn’t make it right to say no, or to say yes; it just means we shouldn’t judge people for doing what they’re told (or we’d have to question armed forces the world over). Legislating that software engineers be licensed isn’t the answer, but being ultra-clear about minimum requirements for security, safety systems in cars, and code quality is inevitable. In Australia we have safety requirements a vehicle must meet to be imported and sold; perhaps those requirements will eventually expand to cover vulnerability assessment and code quality, but in the meantime vendors need to take control and lift their own self-assessment. If they don’t, they will see their sales take a hit; the unintended acceleration issue with the Toyota Prius is just one example of how that can affect a company. And as purchasers become more tech-savvy (and as we get “bad” press hype around things like this Jeep vulnerability), I think you’ll see those vendors move pretty quickly to start that improvement process…

@ Mr. John Smith - FWIW, technically a license is NOT required to drive in the U.S. It’s required to drive on public roads, because those roads are built with tax dollars. If I build a private track, I can choose to allow anyone to drive on it, license or no, as long as I’m willing to accept the liability (or require participants to sign away any liability, or get an insurer to share the risk of liability with me for a price).

Likewise, if I own a farm in the U.S., I can operate a motor vehicle on that farm without being licensed (though laws on this may vary based on jurisdiction). My point being that I don’t think driver’s licenses are an apt analogy.

My concern with licensure for software developers is that any licensing scheme presents an opportunity for power or profit. If you look at how hack licenses work in major cities, they’re effectively a way to limit competition in those industries, which is why you see things like Mayor de Blasio in NYC going after Uber. The taxi companies that have paid huge sums of money for taxi medallions aren’t prepared to let upstarts come in and take “their” customers without a fight.

Make software developer licensure mandatory, and you then have to give someone the power to approve those licenses, and presumably to charge fees for them. Both of those are major opportunities for graft.

I don’t oppose licensure because I want lower standards. I oppose it because there’s ample evidence that many such programs end up being abused. If the “solution” is worse than the problem, it’s a net loss, IMO.

@ devhammer - Oh, the irony: the licensing body meant to prevent security flaws could itself succumb to exploits abusing that very power.