CISO Says…Be Humane!

Nixu Blog

December 9, 2019 at 12:46

The world is driven by risk assessments. Crossing the street, ordering some gadget online or lying to your boss… It’s all risk assessment. Doing business is no different, but there is more to life than risk alone. What about ethics?

In 2018 the General Data Protection Regulation (GDPR) came into effect, and customers started asking me questions: What about the fines? Can we really get fined? How much? This is what I call 'risk thinking': maximum-impact thinking. No one asked me how they could better protect personal information. That would be ethical thinking. And this is what the GDPR (and a lot of other regulation) attempts to do: translate ethics into calculable risks.

Risks don't have to be zero to be acceptable. We want risks out of the red zone, but often the yellow zone is fine, because moving to the green zone would cost too much. Do we have to fully comply with the GDPR (or any other regulation)? Probably not; just enough to land in the yellow zone.

Risk assessments and human error

Let’s do some calculation and start with the estimation of the impact. The maximum fine a company can incur under the GDPR is €20 million, or 4% of the worldwide annual revenue of the prior financial year, whichever is greater. For a company like Shell this would be €12 billion, for Apple €3.5 billion and for Facebook €400 million. That will not be good news for the stockholders either. On September 7, 2017, Equifax announced a data breach impacting approximately 145.5 million U.S. consumers, and five days later their stock had lost about 20% of its value ($4 billion), before eventually losing another 10% to reach its lowest valuation. Any fines come on top of that financial loss. As difficult as that is to swallow, money is not the only issue: privacy breaches lead to people losing their jobs too.
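The "€20 million or 4% of revenue, whichever is greater" rule is easy to express directly. Below is a minimal sketch; the revenue figures are assumptions back-derived from the fine amounts quoted above, not official numbers.

```python
def max_gdpr_fine(annual_revenue_eur: float) -> float:
    """Maximum GDPR fine: EUR 20 million or 4% of worldwide annual
    revenue of the prior financial year, whichever is greater."""
    return max(20_000_000.0, 0.04 * annual_revenue_eur)

# Illustrative revenues (assumed, implied by the fines quoted in the text):
print(max_gdpr_fine(300e9))   # Shell-sized revenue  -> 12.0 billion
print(max_gdpr_fine(87.5e9))  # Apple-sized revenue  -> 3.5 billion
print(max_gdpr_fine(10e9))    # Facebook-sized revenue -> 400 million
print(max_gdpr_fine(100e6))   # small company: the EUR 20 million floor applies
```

Note that for any company with revenue under €500 million, the €20 million floor dominates the 4% rule.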

In 2017, the Dutch Autoriteit Persoonsgegevens (the Dutch data protection authority) did not hand out any fines, although more than 10,000 data leaks were reported. The Italian, French and British agencies, however, did impose financial penalties. Telecom Italia was fined €840,000, which was about 0.004% of its €19.8 billion revenue in 2017.

It is now more difficult than ever to gauge the probability of an incident occurring. What are the chances of a data leak? Or better: what are the chances that the leaked data will become public knowledge? If a laptop containing personal data is stolen from an employee’s car, this is considered a data leak and you should report it, risking a fine. If you don’t report it, then who is going to find out? And if someone does find out, could you deny knowing the laptop contained personal data? Will this impact the fine?

For example, a manufacturing company with an annual revenue of €700 million owns about 2,000 laptops, half of which contain personal (customer) data. Every year the company loses about 10 laptops because they are stolen from cars or left on the train. Let’s assume thieves are interested in the value of the hardware and decide to sell them on the black market. The probability that a buyer will report a data leak is minuscule; let’s say one in 100. I would estimate a data leak due to a stolen laptop to be less severe than the Telecom Italia example, so let’s assume a 0.004% fine. The annual risk is then 10 laptops × ½ containing personal data × 1/100 chance × 0.004% × €700 million = €1,400. This risk would be well in the green zone. An action taken to mitigate this risk, like encrypting all your laptops, would cost more.

The challenge is that it becomes nearly impossible to calculate the probability of a data leak, because there are thousands of ways it could occur. Like many things, however, we can learn from the past. The top five action varieties, which account for 74% of the breaches identified in the Verizon DBIR 2018 (page 8), are: use of stolen credentials; RAM scrapers (malware); phishing; privilege abuse; and mis-delivery (wrong postal or email address).

Physical theft (the laptop example) is listed seventh (7%). Typically, the risk induced by stolen credentials is higher than the risk induced by a stolen laptop, but this is very difficult to estimate. The same is true for phishing, where it is almost guaranteed that at least one person in your company will open a predatory email.

The probability of a data leak in the case of stolen credentials and/or phishing depends very much on the circumstances. When two-factor authentication is in place, it is much harder to abuse stolen credentials, and it is much harder for a phisher to gain control over a fully patched desktop.

The bottom line is that the decision to protect personal data should not be made based on a simple equation (risk = chance × impact). Calculating GDPR risk is not the way to go. The principal reason to protect personal data is a question of ethics. At Nixu we want to be humane. Humane is one of the corporate values that Nixuans named most prominently when asked. We want to show compassion, and appreciate diversity and changing situations in people's lives (see One Nixu Playbook). We protect personal information. We protect people.


Want to keep track of what's happening in cybersecurity? Sign up for Nixu Newsletter.
