Facebook, Google and other big tech companies have been under scrutiny more than ever in the past months. The tech giants have become so big and powerful that it’s reasonable to review their business practices in more detail. Everything that big tech companies know about their users could be used in harmful ways very easily.
A first step is to understand what Facebook and Google are actually doing. In light of the Cambridge Analytica, Brexit and Russian election-meddling scandals, people had an opportunity to peek behind the scenes of big tech companies such as Facebook.
However, this unique opportunity was somewhat wasted. Listening to the elected representatives in the US (and I have no doubt it’s similar in other countries), it’s incredible how tech-unsavvy many of the people shaping our laws and regulations are. Today, I want to help every reader understand what it is that Facebook, Google and all the other big tech companies really know about us.
Data Is The New Oil
It’s an almost dusty catchphrase, but it is true that data is the new oil. The metaphor is very accurate if you take it one step further. Why is oil so valuable? Why do we value this black liquid that’s pumped out of the lower layers of the earth so much?
The answer to this is fairly simple. It’s what you can do with oil that makes it valuable.
Nobody cares about the black liquid, but people care about being able to stick a nozzle in the rear of their car, pump in some gas and ride to whatever place they please. That is freedom and that’s one thing that makes oil so valuable.
How about plastics, all the little toys, medical equipment and countless other things that we rely on? Plastics are in part produced using oil. That’s what we need oil for: it delivers an outcome for us.
Now, how does that relate to data?
Why Data Is So Much Better Than Oil
Data on its own is not really useful, but it forms the basis of so many extremely useful things. Did you hear Mark Zuckerberg speak at the Senate hearing a few months ago? Have a look at this video and let it sink in.
What I want to point out is how Zuckerberg very clearly frames the question about what data Facebook users can delete. It’s the data that they put on Facebook. It’s the pictures, check-ins, likes, shares and all of that. But there is so much more to it.
The Secret Treasure Trove of Secondary Shadow Profiles and Shadow Data
This is where the true value is, all our behaviours, every single click, hover, scroll, activity. These are all activities that we are not consciously aware of but are absolutely crucial for the business success of big tech companies today.
Every little bit of interaction and behaviour is measured, tracked and fed into algorithms to refine their predictions. The predictions delivered by artificial intelligence and machine learning models then serve users ads and content that is very likely to keep eyeballs on the screen. We are also more likely to click on an ad if we think it is relevant.
Let me give you another taste, once again from Facebook CEO Mark Zuckerberg.
We gather from the exchange with the US politician that Facebook has on average around 29,000 data points about each of its users. 29,000! I doubt you have liked, shared and uploaded 29,000 pieces of content on Facebook.
So, what is in there?
It’s the data about our behaviours that is so powerful for big tech companies. It gives them a wealth of information, helps them place even better ads, and lets them manipulate us into staying on screen longer.
Let me give you an example:
Normal data collection: I’m on Instagram (owned by Facebook), I scroll through my feed and stop scrolling at a picture of a sunset and double-tap it to like it.
Shadow data collection: I’m on Instagram, I scroll through my feed, stop scrolling at a picture of a nice sunset for 1 second and then keep scrolling.
In both cases, Facebook will know that this picture somehow grabbed my attention, admittedly to a different extent, but it did in both cases. As I’m using the app on my smartphone, this can easily be tracked and used to improve my feed, i.e. give me more content that my eyes will be glued to.
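To make the difference concrete, here is a minimal sketch of how an app could log both kinds of signal. The class and event names are hypothetical, purely for illustration; Facebook’s real instrumentation is of course far more sophisticated.

```python
import time

class FeedTracker:
    """Toy sketch of in-app behavioural logging (hypothetical, for illustration)."""

    def __init__(self):
        self.events = []          # every interaction, explicit or not
        self._visible_since = {}  # post_id -> timestamp when it entered the viewport

    def post_appeared(self, post_id):
        # Fired as soon as a post scrolls into view - no tap required.
        self._visible_since[post_id] = time.monotonic()

    def post_disappeared(self, post_id):
        # Dwell time is recorded even if the user never interacts at all.
        start = self._visible_since.pop(post_id, None)
        if start is not None:
            dwell = time.monotonic() - start
            self.events.append({"post": post_id, "type": "dwell", "seconds": dwell})

    def double_tap(self, post_id):
        # The explicit "like" - the only part the user is consciously aware of.
        self.events.append({"post": post_id, "type": "like"})
```

Both the one-second pause and the double-tap end up in the event log; only the like is visible to the user as something they "did".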
We can only control so much of our online identity. As Zuckerberg says, I can delete all the data I’ve shared with Facebook. The same goes for Twitter, Snapchat, LinkedIn and the others, they’re all pretty friendly about that. I can have my emails, messages, photos and videos, that’s the easy part.
But when it comes to our behavioural data, it’s a different story. Katarzyna Szymielewicz from the Panoptykon Foundation points this out in much more detail in her recent Quartz article Your digital identity has three layers, and you can only protect one of them. But to give you a high-level idea, let me illustrate the following.
Instagram, which is owned by Facebook (and the same goes for any tech company and its app), knows that I’m 27 years old, male, live in London, have an iPhone and stop scrolling at nice sunsets on my Instagram feed. I’ve viewed the accounts of people posting nice sunsets and I’ve read the comments of people discussing nice sunsets. They also know at what time of day I did all of that, likely from which location, and on which WiFi network, including my broadband provider.
I could go on but you get the idea. They know me.
Obviously, this is not limited to sunsets but extends to products, services, political content and pretty much everything else.
Where does my data go?
That data is fed into the big algorithms, which further improve the targeting engine so it can serve users even more relevant content, including ads. Based on my activities, I can be profiled, and it’s very easy to deduce what content to show to someone else with similar characteristics.
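At its simplest, finding "someone else with similar characteristics" is a similarity comparison between behavioural profiles. This is a toy sketch with made-up interest vectors, not Facebook’s actual system, but it shows the mechanics:

```python
import math

def cosine_similarity(a, b):
    """Similarity between two interest vectors (1.0 = pointing the same way)."""
    dot = sum(x * y for x, y in zip(a, b))
    norm = math.sqrt(sum(x * x for x in a)) * math.sqrt(sum(y * y for y in b))
    return dot / norm if norm else 0.0

# Toy profiles: engagement counts with [sunsets, sneakers, politics]
me = [12, 1, 0]
lookalike = [9, 2, 1]
stranger = [0, 8, 5]

# A targeting engine would show the lookalike whatever kept me on screen.
print(cosine_similarity(me, lookalike) > cosine_similarity(me, stranger))  # True
```

Real systems use learned embeddings over thousands of signals rather than three hand-picked counts, but the principle is the same: people who behave like you get shown what worked on you.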
That is where the value for Facebook and other big tech companies lies. That’s why more users on Facebook are good for the company: more users mean more data and narrower targeting.
This is clearly worth paying for, and advertisers are keen to target potential customers with it. I don’t necessarily disagree with that approach, and buying another pair of jeans at Levi’s or a handbag at Gucci is all well and good, but there are downsides too.
What if the microtargeting is used to drive a certain political agenda?
What if the ads that are placed are fake news type of ads?
We’ve seen with the recent Cambridge Analytica scandal that digital advertising, fuelled by the powerful data that big tech companies capture, can become a threat to democracy when citizens are microtargeted with manipulative content.
The Guardian pointed out how “an algorithm that could analyse individual Facebook profiles and determine personality traits linked to voting behaviour” was built using Facebook data. This was likely achieved using both the primary data that users provide and the secondary shadow data to infer further characteristics, e.g. that people who like sunsets are likely left-leaning.
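The mechanics of inferring a trait from likes can be sketched in a few lines. The weights below are entirely made up for illustration; a real model would learn them from survey responses paired with like histories.

```python
import math

# Made-up weights mapping page likes to a hypothetical personality-trait score.
# A real system learns these from training data; these numbers are invented.
TRAIT_WEIGHTS = {
    "sunset_photography": 0.8,
    "poetry_daily": 1.1,
    "monster_trucks": -0.6,
}

def trait_probability(liked_pages):
    """Turn a set of likes into a 0..1 trait probability via a logistic function."""
    z = sum(TRAIT_WEIGHTS.get(page, 0.0) for page in liked_pages)
    return 1 / (1 + math.exp(-z))
```

Each like nudges the score up or down; the logistic function squashes the total into a probability. Chain enough of these per-trait models together and a like history becomes a personality profile.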
Where Does This Leave Us?
It’s hard to battle the tech giants, as their tools are ubiquitous and heavily used. We can’t control how our behaviour within the apps is tracked, but awareness of this quasi-surveillance is a step forward. As a first recommendation: be mindful of what you’re searching, clicking, liking, swiping and even just looking at.
Regulating Big Tech Companies
Secondly, and more importantly, big tech companies are very powerful. Too powerful. As the laws are only slowly catching up, with initiatives such as the GDPR (General Data Protection Regulation) in Europe, we have to elect politicians who understand digital technology and the digital revolution.
It’s the most powerful force of our age, and if legislation doesn’t regulate technology, then users will eventually become powerless against tech companies and their algorithms.
We see it again in the Zuckerberg testimony: most elected representatives are clueless about technology and how it shapes our lives today. It’s up to us citizens to elect, or even be, the next generation of lawmakers who can regulate tech companies so that technology benefits society and democracy rather than endangers it.
What You Can Do Today
Must-Reads on Tech vs Democracy
Leave a comment below and let me know what you think!