Need for Assurance: Part 1 – Protecting Data in IoT Series

Encryption, authentication and key management are a bit different in the Internet of Things (IoT) world than in the domain of PCs, laptops and tablets, but there are lessons to be learned from the work being done in this area. For example, full drive encryption, which is in common use to protect confidential data on business laptops, can be an effective protection for data in the IoT space as well. But how do you know that it is really doing the job?

After authentication, encryption is supposed to be transparent. Unlike an application such as a word processor or a flashlight app on a phone, it is not immediately apparent whether an encryption product is doing its job, especially in the IoT. This blog is about exactly that – “assurance” – and will describe some of the ways in which one can be assured that the encryption and key management are being performed correctly.

First: from a data protection perspective, what is the difference between IoT devices and laptops? Laptops have owners and the concept of being lost or stolen. They usually have a trusted user that authenticates to the computer before accessing the data or function of the device. In the IoT world there is no trusted user present to authenticate to the device when it is power cycled, and the device is not necessarily in the possession of a user. Also, IoT devices are, by definition, network connected. As I described in a previous blog, this lends itself well to having a central key manager authenticate the device and then provide the encryption keys to it securely over the network when it boots up.
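As a rough illustration of that boot-time flow, here is a minimal sketch of a device authenticating to a central key manager and receiving its data-encryption key. The names (`KeyManager`, `request_key`, `device_boot`) are hypothetical, not from any particular product; a real deployment would run this exchange over mutually authenticated TLS with hardware-backed device identity rather than the bare HMAC shown here.

```python
import hashlib
import hmac
import os

class KeyManager:
    """Hypothetical central key manager: releases a device's data-encryption
    key only after the device proves knowledge of its provisioning secret."""

    def __init__(self):
        self._device_secrets = {}  # device_id -> provisioning secret
        self._device_keys = {}     # device_id -> data-encryption key

    def enroll(self, device_id: str) -> bytes:
        """Provision a device: store its secret, generate its key."""
        secret = os.urandom(32)
        self._device_secrets[device_id] = secret
        self._device_keys[device_id] = os.urandom(32)
        return secret

    def request_key(self, device_id: str, challenge: bytes, proof: bytes) -> bytes:
        """Release the key only if the HMAC proof over the challenge checks out."""
        secret = self._device_secrets.get(device_id)
        if secret is None:
            raise PermissionError("unknown device")
        expected = hmac.new(secret, challenge, hashlib.sha256).digest()
        if not hmac.compare_digest(expected, proof):
            raise PermissionError("authentication failed")
        return self._device_keys[device_id]

def device_boot(manager: KeyManager, device_id: str, secret: bytes) -> bytes:
    """Device side at power-up: answer a challenge, fetch the encryption key.
    (In practice the manager would issue the challenge, to prevent replay.)"""
    challenge = os.urandom(16)
    proof = hmac.new(secret, challenge, hashlib.sha256).digest()
    return manager.request_key(device_id, challenge, proof)
```

The point of the pattern is that no long-term key ever rests on the unattended device: it is delivered at boot, held only in memory, and can be revoked centrally if the device is reported compromised.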

So how can we gain “assurance” that this encryption and central management is being performed correctly in IoT devices? It is a layered approach. From the bottom up:

  • The cryptographic algorithms that are employed need to be validated to ensure they are operating according to specification. A good way to get this level of assurance is to submit them to a cryptographic algorithm validation program such as the one run by NIST. NIST’s CAVP will provide confidence that the hashing, random number generation, AES encryption, etc. have been implemented correctly.
  • Next, the way in which these algorithms combine into a library or “cryptographic module” needs to be validated. The Federal Information Processing Standard FIPS 140-2 has long been the baseline certification for the cryptographic module that performs the actual encryption. “This standard specifies the security requirements that will be satisfied by a cryptographic module utilized within a security system protecting sensitive but unclassified information … The security requirements cover areas related to the secure design and implementation of a cryptographic module.” FIPS 140-2 requires that all the algorithms used in the module (for example AES or the random number generator) are certified via the Cryptographic Algorithm Validation Program. That is a good start, but FIPS 140-2 is limited to the boundary of the cryptographic “module”. It doesn’t necessarily certify the entire product or “security system”.
  • It is the function of the Common Criteria for Information Technology Security Evaluation to provide assurance at the system level. “Common Criteria” is an international standard for computer security certification. It is a “framework in which computer system users can specify their security functional and assurance requirements (SFRs and SARs respectively) through the use of Protection Profiles (PPs), vendors can then implement and/or make claims about the security attributes of their products, and testing laboratories can evaluate the products to determine if they actually meet the claims. In other words, Common Criteria provides assurance that the process of specification, implementation and evaluation of a computer security product has been conducted in a rigorous and standard and repeatable manner at a level that is commensurate with the target environment for use.” There are three collaborative protection profiles (cPPs) that build on each other to provide the overall system level assurance that labs can test against:
    1. cPP EE: Encryption Engine
      “The FDE cPP – Encryption Engine describes the requirements for the Encryption Engine piece and details the necessary security requirements and assurance activities for the actual encryption/decryption of the data…”. I think there is good potential for adapting this work for protecting data at rest in the IoT.
    2. cPP AA: Authorization Acquisition
      “The FDE cPP – Authorization Acquisition describes the requirements for the Authorization Acquisition piece and details the necessary security requirements and assurance activities necessary to interact with a user …”. This module has less applicability in the IoT because there is often no trusted user present to authenticate, but I mention it because it sets the stage for enterprise management.
    3. cPP EM: Enterprise Management
      The purpose of the cPP EM is to provide security critical requirements for Enterprise Management software that is used to manage systems in an enterprise that contain FDE solutions. The cPP EM module builds on top of cPP AA. It details the security requirements and assurance activities necessary for common FDE features.
      The cPP is still being developed by the iTC (international Technical Community) but will soon be available for public review and comment.
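The bottom two layers above can be illustrated with a minimal, hypothetical sketch: a known-answer test (KAT) of the kind CAVP validation is built from, and a module boundary in the spirit of FIPS 140-2 that refuses to provide any cryptographic service until its power-on self-tests pass. The SHA-256 expected value is the published NIST test vector for the message "abc"; the `CryptoModule` class is an illustration, not a certified implementation.

```python
import hashlib

# Known-answer test: hash a fixed input and compare against the published
# NIST test vector. CAVP validation runs large suites of exactly this kind
# of test against every approved algorithm implementation.
SHA256_KAT_INPUT = b"abc"
SHA256_KAT_EXPECTED = (
    "ba7816bf8f01cfea414140de5dae2223b00361a396177a9cb410ff61f20015ad"
)

def sha256_known_answer_test() -> bool:
    """Return True if the SHA-256 implementation matches the known answer."""
    return hashlib.sha256(SHA256_KAT_INPUT).hexdigest() == SHA256_KAT_EXPECTED

class CryptoModule:
    """Hypothetical module boundary in the spirit of FIPS 140-2: self-tests
    run at power-on, and the module enters an error state, refusing all
    service, if any test fails."""

    def __init__(self):
        self._operational = sha256_known_answer_test()
        if not self._operational:
            raise RuntimeError("power-on self-test failed; module disabled")

    def digest(self, data: bytes) -> str:
        if not self._operational:
            raise RuntimeError("module is in an error state")
        return hashlib.sha256(data).hexdigest()
```

The layering mirrors the assurance stack described above: the KAT gives confidence in the algorithm, and the gated module boundary gives confidence that nothing operates when the algorithm misbehaves; Common Criteria then evaluates how modules like this compose into a whole system.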

As you can see from the above, trust in the overall system can be built up by validating multiple layers of the system from the bottom up. In this way you can gain assurance that your data is protected, even in the IoT.

If you liked what you read and would like more great insights on tech industry news to come straight to your inbox, subscribe here!
