
Functional Safety & Security

Blog post created by Tom-M on May 8, 2018

Functional safety concentrates on protecting people, assets and the environment from inadvertent harm caused by non-malicious actors, for instance by bad planning, bad implementation, a bad set of requirements or random failures.

 

Cyber security, on the other hand, concentrates on harm caused by malicious actors: somebody deliberately causes the system to fail in a way that brings some advantage to them.

 

Given that functional safety concentrates on “accidents” and “mishaps” while security deals with deliberate “hacks”, you do need to think about security somewhat differently. For instance, it is more important to think about what is possible as opposed to what is probable.

 

In many languages there is only one word to cover both safety and security. For instance, in German it is Sicherheit. Therefore, I generally try to remember to say “cyber security” instead of security, to make it clear which I mean.

 

All systems with functional safety requirements have security requirements. At a minimum, in functional safety you must protect against foreseeable misuse, and somebody hacking the system comes under that category. There will be lots of systems with security requirements which are not safety relevant. Therefore, systems with functional safety requirements are a subset of systems with security requirements.

 

 

Sometimes the root cause of a safety concern and a security concern is the same. Suppose you have 1,000 lines of code and it contains a single design error. If you only consider safety, then that buggy line of code may never be executed, or may execute at a time when the bug doesn’t matter. However, a hacker who becomes aware of such a bug can try to exploit the situation so that the dodgy line of code is always executed.
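
To make that concrete, below is a minimal sketch in C. It is purely illustrative and all of the names (parse_setpoint, MAX_LEN, the message format) are invented rather than taken from any real system: an off-by-one check that benign traffic never trips, but that an attacker who learns of the bug can trigger on every message.

```c
#include <stdio.h>
#include <stdlib.h>
#include <string.h>

#define MAX_LEN 16

/* Parses a numeric setpoint received as text from the network.
 * Bug: the length check uses '<=' instead of '<', so a message of exactly
 * MAX_LEN bytes writes the terminating '\0' one byte past the end of 'buf'. */
static int parse_setpoint(const char *msg, size_t len)
{
    char buf[MAX_LEN];

    if (len <= MAX_LEN) {          /* off-by-one: should be len < MAX_LEN */
        memcpy(buf, msg, len);
        buf[len] = '\0';           /* out-of-bounds write when len == MAX_LEN */
        return atoi(buf);
    }
    return -1;
}

int main(void)
{
    /* Normal traffic is short, so the faulty path is rarely, if ever, hit. */
    printf("%d\n", parse_setpoint("1200", 4));

    /* An attacker who knows about the bug can send exactly MAX_LEN bytes
     * every time, forcing the defective line to execute on demand. */
    printf("%d\n", parse_setpoint("0000000000001234", 16));
    return 0;
}
```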

 

Like functional safety, cyber security comes with its own terminology. You have attack surfaces, PUFs (physically unclonable functions), side channel attacks and glitch attacks. Some of the security activities, such as threat assessments, parallel things like a hazard analysis in functional safety. Also, there are procedures for setting a target security level which are quite like those for SIL determination. Perhaps the biggest similarity, though, is that both are emergent system-level properties, and it is very hard, if not impossible, to add security or safety to an already designed system afterwards.
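
As a small illustration of one of those terms, the sketch below shows a timing side channel in C. The function names are invented for the example and are not taken from any standard or library: a naive byte-by-byte comparison returns early at the first mismatch, so its run time leaks how many leading bytes of a secret a guess got right, while the constant-time version always does the same amount of work.

```c
#include <stddef.h>
#include <stdint.h>

/* Naive check: returns as soon as a byte differs, so execution time
 * depends on how many leading bytes of the guess are correct. */
int check_token_naive(const uint8_t *guess, const uint8_t *secret, size_t n)
{
    for (size_t i = 0; i < n; i++) {
        if (guess[i] != secret[i])
            return 0;                       /* early exit leaks timing */
    }
    return 1;
}

/* Constant-time check: always touches every byte and accumulates the
 * differences, so run time does not reveal where the first mismatch is. */
int check_token_ct(const uint8_t *guess, const uint8_t *secret, size_t n)
{
    uint8_t diff = 0;
    for (size_t i = 0; i < n; i++)
        diff |= (uint8_t)(guess[i] ^ secret[i]);
    return diff == 0;
}
```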

 

Within Analog Devices we are lucky in that, some years ago, we acquired the Cyber Security Solutions (CSS) business of Sypris Electronics, based in Tampa, Florida, and they have become the Trusted Security Solutions group within Analog Devices. As the safety guy I do need to know something about security, but it is good to have the real experts on call.

 

Cyber security for industrial systems also brings its own particular challenges:

  • Regular security patches are generally not possible on the factory floor for fear of upsetting production
  • Many of the nodes used in industrial settings are resource constrained, with RAM often well below 1 MB
  • Equipment lifetimes can often be twenty years or more
  • Some of the controller equipment is dangerous and can cause harm
  • Much of the equipment is time critical, and security can add a large timing overhead
  • Many of the protocols are proprietary

 

An interesting example: if you have a nuclear shutdown system, is it appropriate to lock out the safety guy from the shutdown system if he gets his password wrong three times?
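
Purely as one illustrative way the tension might be resolved (a sketch with invented names and an assumed three-strike threshold, not a recommendation from any standard), the fragment below throttles ordinary commands after repeated failed logins but never refuses a shutdown request; instead it raises an alarm and leaves a record for the security log.

```c
#include <stdbool.h>
#include <stdio.h>

#define MAX_FAILED_LOGINS 3   /* assumed three-strike policy */

typedef enum { CMD_CONFIGURE, CMD_SHUTDOWN } command_t;

static int failed_logins = 0;

/* Decide whether a command may proceed. Safety wins for the shutdown
 * command: it is never blocked, but suspect use is alarmed and logged. */
bool authorise(command_t cmd, bool password_ok)
{
    if (!password_ok)
        failed_logins++;

    if (cmd == CMD_SHUTDOWN) {
        if (!password_ok || failed_logins >= MAX_FAILED_LOGINS)
            printf("ALARM: shutdown commanded with suspect credentials\n");
        return true;          /* never lock the safety guy out of shutdown */
    }

    /* Ordinary commands follow the usual lockout policy. */
    return password_ok && failed_logins < MAX_FAILED_LOGINS;
}
```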

 

Within industrial circles the most famous cyber security incident is the Stuxnet virus. It was designed to target the Iranian nuclear enrichment program via a Siemens S7 PLC. It is believed to have been written and deployed by state-level actors. There is an excellent documentary film on the topic called “Zero Days”.

 

This blog's video is the trailer for Zero Days – see http://www.zerodaysfilm.com/trailer

 

IEC 61508 references the IEC 62443 series for cyber security requirements. Therefore, for my next blog, the discussion will be on “The IEC 62443 series of cyber security standards”.

 

 

 
