Encryption issues - how important is your data?

24 May, 2021
Nigel Thorpe
Technical Director

Data encryption and insurance - same, same, but different?

Data encryption is a bit like insurance - we all know we need it (a necessary evil, you might say), but it’s difficult to decide what we need to protect, and with an increasing number of options out there, it’s a mission in itself to find the right provider. That’s probably why, when we take out insurance, we tend to get coverage only when we feel it’s absolutely necessary – for example, for our property, our cars and when we travel.

Many businesses, small and large, feel the same about data encryption – they consider it a necessary evil and do just the bare minimum to solve their encryption problems. Often that means deploying full disk encryption (FDE) across their endpoints, but this is little more than a checkbox exercise and it creates more encryption problems than most people realise. The reality is that FDE opens the floodgates to a range of encryption issues and errors while creating a false sense of security.

What does Full Disk Encryption do? 

Full Disk Encryption does indeed protect everything on a computer’s hard drive, including the OS, user files and any other data stored there, without the user having to think about what to encrypt. But that protection exists only when the machine is turned off and the FDE encryption key is not present. Most people don’t realise that once the computer is turned on and the encryption key has been supplied, every file is automatically decrypted for any process that asks, no matter whether that process is legitimate or malicious.
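
To make that point concrete, here is a minimal sketch in Python of what “transparent” decryption means in practice; the file path is purely hypothetical. Once the volume has been unlocked at boot, ordinary file I/O returns plaintext to whichever process asks, with no key handling at all.

```python
from pathlib import Path

# Hypothetical location of a sensitive file on an FDE-protected system drive.
sensitive = Path("C:/Users/alice/Documents/payroll.xlsx")

# On a running, unlocked FDE machine the OS decrypts disk blocks transparently,
# so ordinary file I/O returns plaintext to any process, legitimate or
# malicious, that has file access. There is no decryption step to write.
if sensitive.exists():
    data = sensitive.read_bytes()
    print(f"Read {len(data)} plaintext bytes without ever touching the FDE key")
```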

Ironically, the “full” in FDE does not mean comprehensive – it just means at the highest level, well, some of the time. We believe data security should be at the lowest level, all of the time!

What about Transparent Data Encryption?

Another misconception surrounds the Transparent Data Encryption (TDE) add-ons available from most database vendors. These systems also live up to their title – they encrypt the database without noticeable impact – BUT they do not protect unstructured files, or the temporary and log files that sit outside the database.

This shortcoming is critical: most applications make use of unstructured data, but TDE does not encrypt such data, leaving it vulnerable to misuse and theft. TDE is also database-specific, meaning you’ll need separate TDE licences for each database product, and each licence will need to be managed separately. That makes TDE not only ineffective at protecting ALL data, but costly and time-consuming too.

Encryption issues are growing, FDE and TDE aren’t helping

Using FDE and TDE to establish protected locations, or security silos, makes us feel that our ‘sensitive data’ is better secured. But what happens to that data when it’s moved or copied outside its security silo?

If your staff never need to run reports, analyse data, make presentations or work on proposals, then FDE and TDE are a safe option. The reality, though, is that your staff will need to extract data from applications and databases in order to do their jobs. If you’re relying only on FDE or TDE, this exported ‘ad-hoc’ data is copied out of its security silo and becomes unprotected, scattered across the corporate network on endpoints, file server storage and so on - this is why encryption problems are growing.
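
As an illustration of how easily data leaves its silo, here is a minimal sketch using Python’s built-in sqlite3 and csv modules. The in-memory database simply stands in for a TDE-protected database server, and the table and file names are hypothetical. The export itself is routine work, but the resulting CSV lands on the endpoint as ordinary plaintext that neither TDE nor (on a running machine) FDE will protect.

```python
import csv
import sqlite3

# The in-memory SQLite database stands in for a TDE-protected server;
# the schema and sample row are hypothetical.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE customers (name TEXT, email TEXT, balance REAL)")
conn.execute("INSERT INTO customers VALUES ('Alice', 'alice@example.com', 1234.56)")

rows = conn.execute("SELECT name, email, balance FROM customers").fetchall()

# The sensitive rows leave the database's security silo and land on the
# endpoint as an unencrypted file that nothing downstream protects.
with open("quarterly_report.csv", "w", newline="") as f:
    writer = csv.writer(f)
    writer.writerow(["name", "email", "balance"])
    writer.writerows(rows)

conn.close()
```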

The million-dollar-question - how do we prevent data theft?

Most businesses have this question at the top of their boardroom agendas, but the irony is that the most common solution doesn’t seem to be working - just take a look at the news headlines. Many businesses admit that they do not know where all of this information lies.

In a 2020 Ponemon report, 67% of respondents said that discovering where sensitive data resides in the organisation is the number one challenge in planning and executing a data encryption strategy. This is dangerous because just one successful ransomware attack that cruises around the corporate network is capable of siphoning off all this locally stored data.

To overcome this, many businesses are relying on data classification technology to identify ‘important’ or ‘sensitive’ data so that it can be encrypted. But this is itself a significant challenge; the same Ponemon report also found that 31% of companies cited classifying which data to encrypt as difficult. If information classification continues to be used as a means to prevent encryption issues, then a significant amount of ‘sensitive’ information will be missed.

Why is data classification creating more encryption problems?

Let’s take a look at the steps required to classify information. 

The first step is to perform a thorough assessment of the data held by the organisation, such as intellectual property, source code, merger and acquisition plans, financial records, customer records, personally identifiable information (PII), human resources records etc.

Then for each type of information, a detailed risk and business impact analysis must be executed, measuring the value of data to the business, taking into account aspects such as financial and operational considerations, regulatory requirements and the cost to reputation and brand in the event of a breach.
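
To show the shape of that exercise, here is a minimal sketch in Python of an impact-scoring step; the categories, weights and thresholds are entirely hypothetical illustrations rather than a recognised standard, but they capture how each data type ends up with a classification label that later drives the security controls.

```python
# Hypothetical impact factors and scores (1 = low impact, 5 = severe impact).
IMPACT_FACTORS = ("financial", "operational", "regulatory", "reputational")

data_types = {
    "customer_records": {"financial": 4, "operational": 3, "regulatory": 5, "reputational": 5},
    "source_code":      {"financial": 5, "operational": 4, "regulatory": 2, "reputational": 3},
    "marketing_assets": {"financial": 1, "operational": 1, "regulatory": 1, "reputational": 2},
}

def classify(scores: dict) -> str:
    """Turn a set of impact scores into a classification label (hypothetical thresholds)."""
    total = sum(scores[factor] for factor in IMPACT_FACTORS)
    if total >= 15:
        return "restricted"
    if total >= 8:
        return "confidential"
    return "internal"

for name, scores in data_types.items():
    print(f"{name}: {classify(scores)}")
```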

These first two steps alone raise some significant issues. 

Firstly, as we’ve established, most organisations don’t know where all this data is stored. And even if it can all be located, how accurate is the classification process? Manual classification is impractical for most organisations, but automation means that search patterns and rules must be developed, each with its own inaccuracies, so it is highly likely that a proportion of ‘sensitive’ data will be misclassified.
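
A minimal sketch of what automated, pattern-based classification boils down to is shown below, using Python’s re module; the two patterns are deliberately simplified examples, not production rules. The weakness described above falls straight out of it: anything the patterns fail to match is silently treated as non-sensitive and therefore never encrypted.

```python
import re

# Deliberately simplified patterns, for illustration only.
PATTERNS = {
    "credit_card": re.compile(r"\b(?:\d[ -]?){13,16}\b"),
    "email":       re.compile(r"\b[\w.+-]+@[\w-]+\.[\w.]+\b"),
}

def classify_text(text: str) -> str:
    for label, pattern in PATTERNS.items():
        if pattern.search(text):
            return f"sensitive ({label})"
    return "unclassified"   # false negatives end up here, and stay unencrypted

print(classify_text("Card: 4111 1111 1111 1111"))        # caught by the pattern
print(classify_text("M&A target shortlist: Acme Ltd"))   # missed, yet clearly sensitive
```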

The other challenge is that the initial effort to catalogue and classify all existing data must then become an ongoing process, with classification tags assigned as new information is created, modified and shared. This is likely to be automated – with the same potential for misclassification as before – but often the user is allowed to override the assigned classification.

And this raises the next problem: Classification and Data Loss Prevention (DLP) rules are unfair. 

Are Data Loss Prevention rules creating more encryption issues?

Unfortunately, data classification rules penalise everyone because of a few bad actors. This makes employees less efficient and encourages risky behaviour. Staff who just want to get their jobs done will often subvert or circumvent the system, or intentionally misclassify data to avoid draconian policies and procedures.

Let’s now assume that the organisation has successfully deployed a classification and DLP system. What happens when the world changes? Perhaps data privacy legislation is altered; or a new line of business is opened; or you notice that some kinds of sensitive data have been misclassified.

When this happens, the classification and security rules need to be updated. As we said, this should be an ongoing process. If the organisation is small, or holds a relatively small amount of data, that may be feasible. But for most organisations it is bordering on impossible to implement effective data labelling policies for the purpose of assigning security measures, and to maintain accurate asset tagging at scale.

The bottom line - data needs to be protected at the file-level

Data encryption has been with us for decades. It’s tried and trusted technology but it should be used to protect all data – not just that which is classified as the most important. We need to ask ourselves, what is it that we’re trying to achieve?

Today, our data needs protection from theft by external parties, from insider exfiltration and from accidental exposure. That means ALL data, not just the seemingly ‘sensitive’ stuff – otherwise, what is the protection really for? Cybercriminals are adept at connecting small pieces of data to form a bigger picture, so even seemingly trivial information can be useful in the wrong hands.

So, why is it that the accepted norm is to encrypt only the ‘most important’ data, or only data that is stored? What about data that is in-use, or in-transit? 

I believe the reason encryption problems are growing stems from the abundance of access controls and authentication mechanisms that only put control barriers in front of information. Adding ever more stringent access controls and authenticating at every step with multi-factor systems is just like building higher security fences with stronger locks. If someone manages to digitally pick the lock, or to cut through the fence, the data behind it is still unprotected.
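
By contrast, when the data itself is encrypted at the file level, picking the lock yields nothing readable. Here is a minimal sketch using the open-source ‘cryptography’ package’s Fernet recipe; it illustrates the general principle only and is not SecureAge’s implementation, which uses PKI-based key management rather than the throwaway key generated here.

```python
# pip install cryptography
from cryptography.fernet import Fernet

key = Fernet.generate_key()   # in a real deployment, keys come from PKI / key management
fernet = Fernet(key)

# What actually lands on disk is ciphertext, not the document itself.
with open("plan.dat", "wb") as out:
    out.write(fernet.encrypt(b"Q3 acquisition shortlist"))

# An intruder who gets past the access-control 'fence' reads only ciphertext...
with open("plan.dat", "rb") as stolen:
    print(stolen.read()[:32], b"...")

# ...while an authorised process holding the key decrypts transparently.
with open("plan.dat", "rb") as authorised:
    print(fernet.decrypt(authorised.read()))
```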

There is a better way - the SecureAge Security Suite harnesses the power of PKI-based encryption technology to provide 100% data protection - every file, every place, every time. Best of all, it offers real-world usability with a simple approach that is inherent and invisible. In short, it doesn’t force anyone to become a cybersecurity expert; instead, it allows people to work as they normally do without sacrificing security. Click here to find out more about the SecureAge Security Suite.
