Social Scores in the Private Data Industry
Intern Anthony Tang gets initiated into the darker aspects of data manipulation

I am no expert in data privacy.
When I started interning at TIKI as a freshman in college, I had a pretty vague understanding that Big Tech collected my data and did something with it. But beyond that, data was a confusing and nebulous concept. I never felt directly impacted by data, even though I spent a sizable chunk of my day on social media.
One of the first things Shane had me do as an intern was to start learning more about the data industry. After bouncing back and forth among articles about data brokers, third-party trackers, and digital footprints, I read a story that caught my eye.
A man named Mikhail Arroyo was barred from moving in with his mother while recovering from a tragic accident, forced instead to spend months in a nursing home. The reason: his mother's landlord had used a tool called "ScorePLUS," offered by a data giant named CoreLogic, which deemed Mikhail "too risky" to live in the apartment. The tool scanned over 80 million arrest records and distilled Mikhail's supposed riskiness into a "single score."
This tool reminded me of a spectacle in the news last year: China's social credit score. If you didn't already know, China has been rolling out a social credit system, targeted for nationwide implementation in 2020, which "monitors the behavior of its population and ranks them." Actions that could hurt this score include forgetting to pay a fine, saying something "unacceptable" online, cheating on an exam, or even leaving a bad review. And a poor social credit score comes with punishments: if your score is low enough, you may be barred from enrolling in higher education, blocked from purchasing plane tickets, or even have your internet speeds throttled.
China's social credit score seemed dystopian and invasive when I read about it last year. But reading Mikhail Arroyo's story, I couldn't help but ask myself: how different are these two scoring systems, really? China's social score system is run by the state, and ScorePLUS is run by a private company, but does that really matter? Both use data to boil a person's trustworthiness down to a single number. And both can shape an individual's day-to-day life, hindering them from doing something as basic as renting an apartment.
The article became more frightening as I kept reading. A similar thing happened to a man from South Carolina who was looking for a new home after a flood. When he applied for an apartment, he was flagged as a sex offender and denied. It later turned out the flag was a data error: CoreLogic had confused him with someone who shared his name. And this is not an isolated mistake. Eric Dunn, director of litigation at the National Housing Law Project, says that "well over half the [criminal record reports] he's looked at had some kind of inaccuracy." It was ridiculous to me that a score so widely used by landlords could be so inaccurate.
Beyond tenant background checks, I wanted to learn how else these implicit social scores could affect everyday people. One article that stood out to me discussed businesses using data scores to conduct price discrimination, something I had recently learned about in my microeconomics class. Price discrimination is when businesses (usually ones with market power) charge different customers different prices based on how much each is willing to pay. It's easiest to explain with an example:
Let's say you are willing to pay $10 for a shirt, while I am only willing to pay $5. Typically, the price of the shirt is fixed, the same for both of us. If the business charges $5, we both buy the shirt and each spend $5. If it charges $10, only you buy, and you spend $10. But if the business knows exactly how much each of us is willing to pay, it can charge you $10 and charge me $5, collecting the maximum from both of us. Using data that tracks your past purchase history, a business can estimate how much you are willing to spend on its product, and consumers lose out by paying more.
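The shirt example above can be written out as a tiny toy model. (This is just an illustration of the economics; the numbers and function names are made up, and no real pricing system is this simple.)

```python
# Toy model of the shirt example: two buyers with different
# willingness-to-pay (WTP) for the same shirt.
wtp = {"you": 10, "me": 5}

def revenue_fixed_price(wtp, price):
    """Everyone sees the same price; a buyer purchases only if WTP >= price."""
    return sum(price for w in wtp.values() if w >= price)

def revenue_personalized(wtp):
    """Perfect price discrimination: each buyer is charged exactly their WTP."""
    return sum(wtp.values())

print(revenue_fixed_price(wtp, 5))   # both of us buy at $5  -> $10 total
print(revenue_fixed_price(wtp, 10))  # only "you" buy at $10 -> $10 total
print(revenue_personalized(wtp))     # $10 + $5              -> $15 total
```

Either fixed price earns the business $10, but charging each of us our personal maximum earns $15. The extra $5 comes straight out of consumers' pockets, and the more purchase data the business has, the better its estimate of that maximum.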
The direction the data industry is moving in is honestly scary. Our data is our story, and openly sharing that story with companies allows them to take advantage of us. We lose a lot of control over what we choose to do, buy, or think when corporations know everything about us and can predict our behavior. On the current trajectory, I can picture a future generation of consumers with no real free will, because everything is quietly manipulated by corporations behind the scenes. If we think China is invading its citizens' privacy with a state-sponsored social credit score, we should be just as wary of what's happening in our own private data industry.
Fortunately, people are slowly coming to realize this is a problem. According to Pew Research Center, 93% of adults say it's important to be in control of who can get information about them. And this is where TIKI comes in. TIKI creates an easy, accessible way to regain control over our data. Instead of automatically giving Big Tech consent to use our data whenever we use their platforms, TIKI lets us pick and choose what data we feel comfortable sharing. Our data on the internet is valuable and, most of all, it's our data. We shouldn't let Big Tech take it from us.
Written by Anthony Tang