Effective Encryption Protects Data When it's Most Vulnerable - Infosecurity Magazine

Posted: 30 Oct 2019 02:30 AM PDT

Most organizations that rely on computing systems understand the need to protect data against cyber-attacks and data breaches by using encryption. Unfortunately, even the most well-informed and well-intentioned organizations fail to encrypt their data when and where it's most vulnerable. They often don't get the protection they think they're getting, particularly when they choose full disk encryption, or when email hosts, cloud storage providers, and messaging/communication service providers tell them their data is encrypted.

An effective selection and implementation of an encryption tool has one purpose: to protect data when it's most vulnerable, i.e. when it has value to the company because it is accessible, in motion, or in use. That's precisely when those volume-level encryption tools stop being effective or fail altogether.

Data-at-Rest vs. Data-In-Motion vs. Data-In-Use
In order to understand the limitations of encryption product types, it's helpful to remember that data exists in essentially three states: at-rest, in-motion, and in-use:

  • Data-at-rest is information that's stored in a digital form on a physical device, like a hard disk or USB thumb drive.
  • Data-in-motion is digitized information that's traversing a network. For example, when users send an email, access data from a remote server, upload to or download files from the cloud, or communicate via SMS or chat functionality.
  • Data-in-use is digital information that's actively being accessed, processed, or loaded into RAM, such as active databases, or a file being read, edited, or discarded.

While there are various crossover points among the states, data must be protected in all three, as well as during their transitions from one state to another. When a vendor or cloud service provider claims that data is encrypted, that doesn't mean that it's protected at-rest, in-motion, and in-use. It's far more nuanced than that.

Consider one of the most well-known encryption tools: full disk encryption. The name alone makes it sound as if every file and every activity that takes place on that disk is encrypted and secure. Hardly. Full disk encryption is effectively physical hardware security: it only protects your data when the computer in which it's installed is not logged in or not turned on. That is precisely when data is least vulnerable.

The Problem with Password-Based Encryption
When people think of encryption, they also think of keys, and to access those keys, a password always seems to be involved somewhere. In some cases, that's true. Full disk encryption requires a password that unlocks the key that decrypts files on the disk as they are accessed. But for many other forms of encryption, that's not the case. User-defined passwords play no role in mobile calls or online purchases over HTTPS, both of which rely heavily on encrypted data streams.

Passwords, and the fear of someone forgetting them or choosing weak ones, stop many organizations and individuals from using encryption across all states of data. Or worse: that fear compels them to encrypt only a limited subset of the most sensitive data, leaving everything deemed innocuous entirely unencrypted and vulnerable.

Beyond the weaknesses of a password itself, the need to share it with another party when sharing the encrypted data and symmetric key presents a challenge. Password-based or symmetric key encryption doesn't allow for seamless and secure file sharing or transport. As such, it's not a good fit for securing data-in-motion.
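To make the sharing problem concrete, here is a minimal sketch of password-based symmetric encryption, written with Python's cryptography package; the password, sample message, and key-derivation parameters are illustrative assumptions, not a recommendation. Notice that anyone who needs to decrypt the result must somehow be handed the same password or derived key.

```python
# A sketch of password-based (symmetric) encryption with the Python
# "cryptography" package. Password, salt handling, and the sample message
# are illustrative assumptions only.
import base64
import os

from cryptography.fernet import Fernet
from cryptography.hazmat.primitives import hashes
from cryptography.hazmat.primitives.kdf.pbkdf2 import PBKDF2HMAC

password = b"correct horse battery staple"  # hypothetical user password
salt = os.urandom(16)                       # must be stored with the ciphertext

# Derive a symmetric key from the password.
kdf = PBKDF2HMAC(algorithm=hashes.SHA256(), length=32, salt=salt, iterations=480_000)
key = base64.urlsafe_b64encode(kdf.derive(password))

ciphertext = Fernet(key).encrypt(b"quarterly financials")

# The catch for data-in-motion: decrypting requires the same password-derived
# key, which must itself be shared with the recipient somehow.
assert Fernet(key).decrypt(ciphertext) == b"quarterly financials"
```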

While password-based encryption may protect data-at-rest, it offers nothing for data-in-use. So if you can't count on full disk encryption or password-based encryption to protect data-in-motion, what can you count on? This is where asymmetric key pairs make more sense.

Public key encryption is a solid contender. Where symmetric key encryption uses a single secret key to both encrypt and decrypt, public key or asymmetric key encryption employs a key pair consisting of a secret private key and a public key, hence the name. Most often, the public key encrypts data while the private key decrypts it.

Since the public key is just that - public - it can be freely distributed by any means to anyone, allowing for seamless sharing. Without the private key, data encrypted with the public key cannot be decrypted, making it safe for transport or storage, i.e. data-in-motion and data-at-rest.
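As a rough illustration, here is a minimal sketch of public key encryption using RSA-OAEP with Python's cryptography package; the key size, padding choices, and sample message are assumptions made for brevity rather than a complete key-management scheme.

```python
# A sketch of public-key (asymmetric) encryption with RSA-OAEP, using the
# Python "cryptography" package. Key size and padding are illustrative.
from cryptography.hazmat.primitives import hashes, serialization
from cryptography.hazmat.primitives.asymmetric import padding, rsa

private_key = rsa.generate_private_key(public_exponent=65537, key_size=2048)
public_key = private_key.public_key()

# The public key can be shared freely, e.g. as PEM text.
pem = public_key.public_bytes(
    encoding=serialization.Encoding.PEM,
    format=serialization.PublicFormat.SubjectPublicKeyInfo,
)

oaep = padding.OAEP(
    mgf=padding.MGF1(algorithm=hashes.SHA256()),
    algorithm=hashes.SHA256(),
    label=None,
)

# Anyone holding the public key can encrypt; only the private key decrypts.
ciphertext = public_key.encrypt(b"wire transfer details", oaep)
assert private_key.decrypt(ciphertext, oaep) == b"wire transfer details"
```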

So, When is Encryption Truly Effective?
It doesn't do any good to protect data only when it is least vulnerable, as full disk encryption does, or to add security measures that are themselves insecure or inconvenient, like complex passwords and mandatory password changes. Data that has any value is data that is active, in motion, or accessible, making it highly vulnerable to user error or malicious attacks.

Valuable data is vulnerable data, and that's when encryption must work. File-level encryption based on public key infrastructure, with files sent over secured connections, is one transparent encryption solution that begins at data creation and ensures that data is protected at-rest, in-use, and in-motion.
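One common way to realize that kind of file-level protection is hybrid encryption: each file gets its own symmetric key, and that key is wrapped with the recipient's public key. The sketch below, again using Python's cryptography package, is a simplified illustration under those assumptions, not a description of any particular vendor's product.

```python
# A sketch of file-level "hybrid" encryption: a fresh symmetric key protects
# the file, and the recipient's public key protects that symmetric key.
# File contents, key sizes, and names are illustrative assumptions.
import os

from cryptography.hazmat.primitives import hashes
from cryptography.hazmat.primitives.asymmetric import padding, rsa
from cryptography.hazmat.primitives.ciphers.aead import AESGCM

recipient = rsa.generate_private_key(public_exponent=65537, key_size=2048)

# Encrypt the file at creation time with a one-off AES-256-GCM key.
file_key = AESGCM.generate_key(bit_length=256)
nonce = os.urandom(12)
ciphertext = AESGCM(file_key).encrypt(nonce, b"contents of report.docx", None)

# Wrap the file key for the recipient; only their private key can unwrap it.
oaep = padding.OAEP(mgf=padding.MGF1(algorithm=hashes.SHA256()),
                    algorithm=hashes.SHA256(), label=None)
wrapped_key = recipient.public_key().encrypt(file_key, oaep)

# Intermediaries only ever see ciphertext, so the file stays protected
# at rest and in motion until the recipient unwraps the key.
recovered = recipient.decrypt(wrapped_key, oaep)
assert AESGCM(recovered).decrypt(nonce, ciphertext, None) == b"contents of report.docx"
```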

Encryption tools of various shapes and sizes can effectively prevent data loss or breaches, regardless of the state of the data. It's not enough to point to the existence of some form of encryption and claim that data and systems are secured by it.

Wherever the data resides, is processed, or travels, the appropriate encryption solution must be along for the ride.

Here Is What Facebook Won’t Tell You About Message Encryption - Forbes

Posted: 06 Oct 2019 12:00 AM PDT

In the world of individual privacy and data security, this could be the ultimate irony. Facebook, the company that has taken more fire than any other for misusing, abusing and losing user data, has become the last line of defence in the fight against government access to that same user data. Facebook is under increasing pressure, led by the U.S. and U.K., to delay plans to expand encryption across its platforms until backdoors can be added to give government agencies access to user content.

But is everything as it seems? Has Facebook really turned away from its casual approach to user data to become the poster child for user privacy? Facebook has its own dilemma around monetising data on an encrypted platform, so what motive does it have to promote security at the expense of its own access? Unsurprisingly, the answer is that Facebook's agenda is not quite the surprise it may have seemed.

Quick recap. Despite building a business around the monetisation of user data, Facebook also owns WhatsApp, the world's preeminent messaging platform, now used by some 1.6 billion users monthly. Back in 2016, WhatsApp completed its deployment of end-to-end encryption. For the first time, a universally popular messaging service had given up the ability to access the content it was transmitting.

That proved to be a game-changer. Suddenly a level of content security that had relied on more specialist apps or user-applied encryption was available to all. Rather than merely encrypting the traffic between users and WhatsApp, the platform assured users that with its end-to-end encryption "only the recipient and you have the keys needed to unlock and read your messages—every message you send has a unique lock and key." And this was available to everyone, everywhere. "All of this happens automatically—end-to-end encryption is always activated. There's no way to turn it off."

Because users hold the keys, WhatsApp has no way to access the content or break the encryption, even if it wanted to. Accessing content requires a hack applied to an endpoint—as has been seen in certain nation-state attacks, where smart devices are infected with malware that attacks the messaging apps. From a platform perspective, though, sitting in WhatsApp HQ, the content cannot be accessed.
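The principle can be sketched in a few lines of Python (using the cryptography package): each side derives the same session key from its own private key and the other's public key, so the relay in the middle never holds anything that can decrypt the message. This is only an illustration of the idea; WhatsApp's actual implementation is the far more elaborate Signal protocol.

```python
# A toy illustration of end-to-end encryption via key agreement: the shared
# key is derived on the endpoints, never held by the platform in the middle.
# This is NOT WhatsApp's real protocol; names and parameters are assumptions.
import os

from cryptography.hazmat.primitives import hashes
from cryptography.hazmat.primitives.asymmetric.x25519 import X25519PrivateKey
from cryptography.hazmat.primitives.ciphers.aead import AESGCM
from cryptography.hazmat.primitives.kdf.hkdf import HKDF

alice = X25519PrivateKey.generate()
bob = X25519PrivateKey.generate()

def session_key(own_private, peer_public):
    shared = own_private.exchange(peer_public)
    return HKDF(algorithm=hashes.SHA256(), length=32, salt=None,
                info=b"e2e demo").derive(shared)

# Both endpoints compute the same key from their private key and the peer's public key.
k_alice = session_key(alice, bob.public_key())
k_bob = session_key(bob, alice.public_key())
assert k_alice == k_bob

nonce = os.urandom(12)
ciphertext = AESGCM(k_alice).encrypt(nonce, b"meet at noon", None)
# The platform relaying the ciphertext cannot read it; only the endpoints can.
assert AESGCM(k_bob).decrypt(nonce, ciphertext, None) == b"meet at noon"
```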

So what's the issue? Put simply, if WhatsApp cannot access the content, then law enforcement and government agencies cannot access it either, not without compromising an endpoint, a smartphone. No amount of pressure applied to the platform can make it relent, no court order or warrant, because the platform cannot crack the encryption without a hack.

Politicians and security officials around the world have flocked to WhatsApp, and its more specialist competitors Signal and Wickr, relying on those secure platforms to communicate with one another, safe from "lawful intercept" within national telecoms systems that compromise calls, SMS messages and data traffic.

But many of those same politicians and security officials have also lamented the fact that they are now "going dark," arguing that criminals, terrorists and pedophiles can send messages safe from government snooping, relying on the platforms to keep their secrets away from prying eyes. That, say the officials, is a nightmare.

And that nightmare is set to get worse. Under fire for user data abuses and privacy scandals, struggling to recover user confidence in the wake of Cambridge Analytica, Facebook changed its strategy. Privacy would now come first, and the encrypted messaging that has revolutionised WhatsApp will now be applied to its other services, in particular Facebook Messenger, with its 1.3 billion monthly users.

Back in June, there were reports that the U.S. government was debating legislating to mandate backdoors into such messaging platforms. In July, U.K. Home Secretary Priti Patel accused Facebook of frustrating the fight against terrorists and child abusers. Also in July, U.S. Attorney General William Barr argued that technology companies must not stand in the way of backdoors being added to their platforms.

The direction of travel had been set. And now, in an open letter, government officials from the U.S., U.K. and Australia have asked Facebook to delay further deployment of encryption without "including a means for lawful access to the content of communications to protect our citizens." Essentially, backdoors.

The government letter to Facebook says that "companies should not deliberately design their systems to preclude any form of access to content, even for preventing or investigating the most serious crimes," arguing that extending encryption from WhatsApp to Facebook is more dangerous as it's a higher risk environment for child exploitation—with children on the site, securing messages to those users is a risk.

And on all counts the governments have a point. There is no doubt that preventing law enforcement from intercepting content shared between criminals or terrorists, or among pedophile groups or between pedophiles and their victims, is not a good place to find ourselves. But Facebook's argument, echoed by others in the technology community, is that a backdoor is a backdoor. If you weaken the defences around the system, you cannot maintain control of how those weaknesses are exploited.

EFF described the government letter as an "all-out attack on encryption... a staggering attempt to undermine the security and privacy of communications tools used by billions of people," and urged Facebook not to comply. The organization warned the move would endanger activists and journalists, and could be used by "authoritarian regimes... to spy on dissidents in the name of combatting terrorism or civil unrest."

And this is the crux. The stakes are high in the U.S. and Europe, where, despite legal protections against government snooping, many are now seriously concerned. But in other parts of the world, the risks could be literally life and death. If you build a secret door into the back of your house and tell some friends, or give a bunch of neighbours spare sets of keys, you cannot claim your house is as secure as it was beforehand.

"I believe the future of communication," Facebook's Mark Zuckerberg wrote in March, "will shift to private, encrypted services where people can be confident what they say to each other stays secure and messages and content won't stick around forever."

Responding to the latest government entreaties and the open letter, a Facebook spokesperson said "we strongly oppose government attempts to build backdoors because they would undermine the privacy and security of people everywhere."

But there's a twist. In parallel with the encryption battle, even more pressure is being applied to Facebook to moderate user content. The post-office defence, where Facebook argues it cannot be responsible for what is posted on its network, is falling away. Australia and New Zealand, the European Union and the U.K. are regulating to mandate exactly that responsibility, with the threat of sanctions facing the industry. But guess what—you cannot moderate encrypted content.

Much of the focus on the government calls for encryption backdoors has been on lawful intercept, warranted law enforcement access to communications between named individuals. But there is a much broader issue—the automated monitoring of all content within messages to flag any issues and remove prohibited content.

As the recent open letter to Facebook from the U.S., U.K. and Australia says, nothing should be done that "erodes Facebook's ability to detect and respond to illegal content and activity, such as child sexual exploitation and abuse, terrorism—preventing the prosecution of offenders and safeguarding of victims."

The letter applauds Facebook's current efforts to monitor content—"16.8 million reports to the U.S. National Center for Missing & Exploited Children (NCMEC)," last year, "2,500 arrests by U.K. law enforcement and almost 3,000 children safeguarded in the U.K.," as well as Facebook acting "against 26 million pieces of terrorist content between October 2017 and March 2019." And the critical fact—"more than 99% of the content Facebook takes action against—both for child sexual exploitation and terrorism—is identified by your safety systems, rather than by reports from users."

The governments caution that "much of this activity... will no longer be possible if Facebook implements its proposals as planned... this would significantly increase the risk of child sexual exploitation or other serious harms."

The issue with Facebook's content came to the fore with the terrorist attacks in Christchurch, after which waves of media coverage exposed extremist content, racism and antisemitism. Step by step, Facebook nibbled away at its past stances: banning nationalist groups and individuals, sanctioning Facebook Live misbehaviour, recruiting armies of moderators. But given Facebook's scale, and despite vast investments in AI, it's an unwinnable battle. And if that content can't be seen, it can't be policed.

On the core platform, the secrets of the moderation business have hit the headlines in recent months. Policing posts and sanctioning users is a dirty business. Unencrypted messages are already checked, and AI could be applied across the platform with the right security architecture in place. That would add WhatsApp's 1.6 billion users into a moderation remit. And that would push the onus back onto Facebook.

And so the suggestion is that there is a self-serving set of motives behind Facebook's stance on encryption: it hasn't simply become the world's leading privacy advocate. And that leads to a different irony playing out here. Facebook is generating strong support from the technology community for its defence of encryption. But that support is more likely driven by a defence against forced moderation than by privacy advocacy, and that same technology community is the first to slam Facebook's avaricious business model at the expense of its users.

Something has to give.
