You must have heard everyone complaining about the lack of data privacy and the constant information security challenges. Search engines, social media platforms, gaming apps… you name it, they access your data, one way or the other.

Aside from a handful of sites like DuckDuckGo, there aren’t many places where your actions aren’t being tracked. It looks like nothing you do is really secret.

And it’s not just what you do online. Various systems are collating your offline actions, running them through databases and using artificial intelligence (AI) to track, interpret (and occasionally influence) what you do. China is a discomfortingly vivid example of what a nation can do once it decides to follow you with AI.

[Infographic: data privacy challenges]

So the question that emerges is: in a connected world, is data privacy really possible?

The short answer is no; it isn’t possible in the conventional sense.

To understand why, we need to look at the challenges to data privacy from the point of view of the ordinary user:

1. Data privacy policies are too wordy

Facebook’s privacy policy is 4,205 words long; Instagram’s is 2,446 words; Google’s core policy runs to 3,822 words, and that’s without product-specific policies like YouTube Kids.

These are three of the most visited sites on the internet, and their combined privacy policies run to well over 10,000 words, all in legalese.

The average reading speed is about 200 words per minute, so it’d take only about 50 minutes to read these three policies, right?

Wrong.

For one, it’s not easy for the average person to understand legal language that quickly.

For another, if people visit just 4 new websites every day, that means reading and understanding over 12,000 words of legalese. Every day.

That’s nearly 4.4 million words a year.
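The back-of-the-envelope arithmetic above can be checked in a few lines, using the word counts and reading speed quoted in this article (the 3,000-words-per-policy figure is a rounded average):

```python
# Policy word counts and reading speed as quoted above
policy_words = {"Facebook": 4205, "Instagram": 2446, "Google": 3822}
reading_speed_wpm = 200

total_words = sum(policy_words.values())             # 10,473 words
minutes_all_three = total_words / reading_speed_wpm  # ~52 minutes

# Rounded per-policy average and the 4-new-sites-a-day assumption
words_per_policy = 3000
daily_legalese = 4 * words_per_policy                # 12,000 words a day
yearly_legalese = daily_legalese * 365               # 4,380,000 words a year

print(round(minutes_all_three), daily_legalese, yearly_legalese)
```

That yearly figure, 4,380,000 words, is where the "nearly 4.4 million" comes from.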

This is the first challenge to data privacy: privacy policies are too wordy for the common individual to read and understand.

2. Companies are assuming the paid model won’t work

You’ve often heard the saying, “If you are not paying for the product, you are the product,” right?

Somewhere, the entire practice of silently harvesting data and selling, sharing or stealthily using it to promote other products came from one single assumption.

Google, Facebook, Instagram and similar companies assumed that a paid, subscription-based model wouldn’t work for their business.

So how would you expect these organizations to survive, grow and make money?

They found the answer: capture users’ data and use it to fuel growth.

So here’s the second data privacy challenge: companies began by assuming a paid model wouldn’t work, and that the only way to make money would be to collect users’ data in exchange for the services offered.

3. “You’re already being spied upon anyway!”

Let’s take Facebook, the most talked about social media platform.

Considering the number of user data privacy controversies the Menlo Park-headquartered social media behemoth has courted in recent years, one would think there would have been a steep fall in the number of its users.

Not really.

Here is how the number of monthly active Facebook users has grown (source: Statista):

Q1 2016: 1,654 million

Q1 2017: 1,936 million

Q1 2018: 2,196 million

Q1 2019: 2,375 million

Doesn’t look like too many people are disillusioned and leaving Facebook, right?

If that’s not convincing enough, look at Facebook’s advertising revenues globally. (Source: Statista)

2015: 17,079 million US dollars

2016: 26,885 million US dollars

2017: 39,942 million US dollars

2018: 55,013 million US dollars

Advertising revenues rose by a little over 222% from 2015 to 2018. Advertisers wouldn’t be spending such huge amounts if there were no eyeballs, right?
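A quick sanity check of that growth figure, using the Statista numbers above:

```python
# Facebook's global ad revenue, in millions of US dollars (figures above)
ad_revenue = {2015: 17_079, 2016: 26_885, 2017: 39_942, 2018: 55_013}

# Percentage growth from 2015 to 2018
growth_pct = (ad_revenue[2018] - ad_revenue[2015]) / ad_revenue[2015] * 100
print(f"{growth_pct:.1f}%")  # a little over 222%
```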

It’s like people have given up. They feel they’re being spied upon anyway, so why bother.

The average user is resigned to being stripped of their privacy and their data. They feel helpless about it and continue using Facebook. Information security challenges no longer sadden or discourage users.

It’s also possible that social media has become an addiction, like tobacco or alcohol. Despite knowing the obvious pitfalls and risks, people can’t get off social media. They must have their daily dose.

This is the third of the key challenges in data privacy: people have either given up on data privacy or are too addicted to really care about losing their data.

4. The technology is too sophisticated

“More data has been created in the past two years than in the entire previous history of the human race,” wrote Bernard Marr in Forbes. That was in 2015, mind you.

The rate at which computing power has grown has exponentially increased humankind’s ability to generate, store and analyze data. Technology is getting ever more sophisticated. Data security problems continue to grow, and they cannot be contained without a proportional investment in data protection.

With more people than ever spending their time online, data mining is growing rapidly too.

Governments are trying to do their bit by setting up commissions and enacting and implementing regulations. For instance, the European Union brought in the General Data Protection Regulation (GDPR) in 2018. Most people agreed the regulation was mighty tough and that it would forever change the way companies used individuals’ personal data.

It’d be interesting to see what GDPR has achieved in a year. If you’re looking for a short answer, here it is: the impact has been less than dramatic, but experts claim GDPR will soon begin showing its claws.

Whatever the case, one thing is certain: big data security issues and challenges are growing. Technology seems to be slowly outpacing what governments can do by way of laws.

This is the fourth of the privacy and data security challenges: technology and computing power are growing so sophisticated, and so gigantic, that government regulations alone may be insufficient to solve data mining privacy issues.

5. There are way too many apps for adequate auditing

Apple’s App Store has about 2.2 million apps; Google’s Play Store has about 2.6 million. Further, roughly 13% of Play Store apps, about 330,000, are considered poor quality.

Makers of these apps range from solo developers to mid-sized companies to multinationals. That means there’s no consistency in best practices beyond the basics.

Any number of apps can bungle the handling of your personal data, intentionally or otherwise.

Even unintended data breaches are happening more frequently. And somehow, governments appear slow in prosecuting erring corporations; Facebook seems to have escaped almost unhurt, if you look at some of the questions Mark Zuckerberg was asked at the Senate hearing.

And here’s another bit of fine print: a New York Times article reported that “Facebook officials said that while the social network audited partners only rarely, it managed them closely.”

So this is what happens. Facebook, or for that matter any other platform, collects your data.

Next, third-party services and apps integrate with Facebook, and Facebook allows them access to your data.

One day you want to close your account and want the platform to delete your data. (Remember, you have a “right to be forgotten” under regulations like the GDPR.) So the platform agrees and deletes your data. It will also ask the third-party service to delete your data. So far so good.

The problem is that the platform rarely, if ever, audits the third-party service. That’s why you can never know for sure whether the third-party service actually deleted any or all of your data.
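The deletion flow described above can be sketched in a few lines. These are hypothetical classes, not any platform’s actual API; the point they illustrate is that, without audits, a partner’s acknowledgement is not proof of deletion:

```python
# Hypothetical sketch: a platform shares data with partners, deletes its own
# copy on request, and forwards deletion requests -- but never verifies them.
class Partner:
    def __init__(self):
        self.copies = {}
        self.honest = True  # nothing in the protocol forces this to be True

    def receive(self, user, data):
        self.copies[user] = data

    def request_deletion(self, user):
        if self.honest:
            self.copies.pop(user, None)
        return "acknowledged"  # an ack is returned whether or not data was deleted

class Platform:
    def __init__(self, partners):
        self.user_data = {}
        self.partners = partners  # third-party integrations

    def collect(self, user, data):
        self.user_data[user] = data
        for p in self.partners:
            p.receive(user, data)  # partners get access to the data too

    def forget(self, user):
        # The "right to be forgotten": delete locally, then ask partners.
        self.user_data.pop(user, None)
        return [p.request_deletion(user) for p in self.partners]
```

A dishonest partner returns the same acknowledgement as an honest one while quietly keeping its copy, and without an audit the platform (and the user) cannot tell the difference.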

Here is the fifth of the major data security challenges: there are way too many apps and integrations and major data collectors like Google or Facebook don’t seem to have enough bandwidth to ensure compliance all the way through.

6. Then there’s the marketing angle (AKA trap)…

You remember the Strava controversy, right? Here’s a recap, in case you don’t.

The fitness app Strava released a heatmap that showed the activities of its users: where people jog, walk or exercise to stay fit and improve their health.

That included US defence personnel sharing their activities as well.

So you could “find the borders of secret military outposts, as well as track patrol routes of soldiers at those bases”, as Wired put it.

Dangerously enough, it also showed the locations of airstrips and US armed forces facilities in places where the US was not known to have operations. In a single act, Strava let out secrets that would otherwise have taken other countries a long time to figure out.

Naturally, it became a serious security issue for the US.

Why did the soldiers share their activities in the first place?

A small part of the explanation lies in the way products and services are marketed these days. Fitness apps, for instance, stress the minutest of muscle movements, stretching, calorie counts, and a zillion other fitness parameters.

Not all these parameters matter all that much.

But in the race to outdo competitors, one app after another keeps adding ridiculous levels of detail. They market these features as must-haves. They urge you to keep count of the smallest exercise variations. Body fat, fitness, abs, calf muscles, triceps, quadriceps… everything is taken to an unbelievable extreme.

Which is where they encourage people to share their details. So users share their details. And then some.

And here is the sixth of the major data security challenges: some marketers take their marketing message to the extreme and manipulate users into divulging too many personal details.

[Video: challenges to data privacy]