I’ve been hearing a recurring theme in discussions with clients about security and compliance planning: “We’re compliant, so we’re secure, right?”
To me, this thinking reflects confusion about the difference between compliance and security.
Regulations and compliance are designed to do specific things. While those things may overlap with cybersecurity, they do not constitute an effective security program. Consider what Target President and CEO Gregg Steinhafel said just after the retailer’s 2013 breach: “Target was certified as meeting the standard for the payment card industry in September 2013. Nonetheless, we suffered a data breach.” Target was compliant, but not secure.
A real-world, nontechnical example underscores this. In Virginia, where I live, we are required to get our cars inspected annually for safety. One of the inspected items is turn signals, but based on my observations, none of the cars being driven in Northern Virginia and D.C. has functional turn signals.
Jokes aside, this is a perfect example of a regulatory requirement that doesn’t equal increased safety. Why? Because while the vehicle may be in compliance, the driver is making poor decisions about how to drive. Part of a good security program is teaching users how to make smart decisions and act in an appropriately secure manner. If we aren’t educating our users, our security program is flawed and, regardless of compliance, we run the risk of security breaches. Compliance equals having working turn signals; security equals using them while driving. Very similar concepts, but dramatically different results.
Many of our organizations have strong compliance programs. For instance, a hospital houses sensitive medical records. We as a community want those records kept private and confidential. The Health Insurance Portability and Accountability Act (HIPAA) spells out how medical records are to be handled and the repercussions if they are mishandled. Therefore, hospitals comply with HIPAA regulations.
Companies comply with many industry standards, and we have groups within our organizations dedicated to enforcing these regulations.
The government’s position is clear: follow these rules or we will penalize you, costing you time, effort and money. We are naïve if we don’t believe that the hackers of the world have unofficially taken the same position: follow good security practices or we will cause major disruption to your business and cost you time, effort and money.
So why do legal and senior leadership put effort into compliance but assume security is someone else’s problem, or implicitly covered? I think most of it stems from two factors. One, the government is fairly clear about what must be done to avoid the fines and penalties of noncompliance. Two, the consequences of noncompliance are a clear and present danger. In contrast, security risk is amorphous and undefined: is it technology, auditing, policy, reporting or something else? The consequences of being insecure seem vague, existential and nonimmediate.
But there are specific things that can, at low cost, increase security. According to Ponemon’s report “The Cost of a Lost Laptop,” a lost unencrypted laptop represents roughly a $50,000 liability, while the same laptop, encrypted, represents roughly $5,000. The cost to encrypt a laptop is $200 to $400. Encryption takes a lost laptop from a serious cyberincident to a more straightforward “we lost a laptop and we need to replace it” incident.
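To make that trade-off concrete, here is a back-of-the-envelope sketch using the figures cited above. The fleet size and annual loss rate are illustrative assumptions, not numbers from the Ponemon report:

```python
# Expected annual cost of lost laptops, with and without encryption.
# Per-loss figures ($50,000 unencrypted, $5,000 encrypted) and the
# $200-$400 per-laptop encryption cost come from the article; the
# fleet size and loss rate below are assumed for illustration.

def expected_annual_cost(fleet_size, loss_rate, cost_per_loss, cost_per_laptop=0):
    """Up-front per-laptop spend plus expected losses from lost laptops."""
    return fleet_size * cost_per_laptop + fleet_size * loss_rate * cost_per_loss

FLEET = 1_000      # laptops in the organization (assumed)
LOSS_RATE = 0.02   # 2% lost or stolen per year (assumed)

unencrypted = expected_annual_cost(FLEET, LOSS_RATE, cost_per_loss=50_000)
encrypted = expected_annual_cost(FLEET, LOSS_RATE, cost_per_loss=5_000,
                                 cost_per_laptop=300)  # mid-range encryption cost

print(f"Unencrypted fleet: ${unencrypted:,.0f}")  # $1,000,000
print(f"Encrypted fleet:   ${encrypted:,.0f}")    # $400,000
```

Even with generous assumptions about how rarely laptops go missing, the encryption spend pays for itself well before the first serious incident.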
Security risk directly translates to financial risk, and being compliant does not ensure we are secure. If we want to be good stewards of our company’s data and information, we need to address security directly from a strategic perspective. Every time we put it off or assume it’s covered because we have a strong compliance program, we roll the dice on the future operational capacity of our companies.