The Case Against Customization

Investigating the harmful effects of algorithms that personalize user experiences without the user’s explicit consent.

Rucha
4 min read · Jan 1, 2021

In the past, I’ve talked a lot about my personal carbon footprint and my consumption in the physical sense. Today, I want to talk about the massive size of our digital footprints, and their possible side effects.

Ever since I started studying computer science, my interest in data security and privacy has grown exponentially. When you work in any field, you end up learning a lot about what is happening underneath the surface. After my dad passed away, I had to go through all of his digital accounts. I realized how incredibly large our digital footprints are, and how customized everything is to our personal preferences. I found myself feeling like my dad’s digital presence needed to be saved because it felt so much like him. It was in that moment that I really started to question whether technology should be this customized.

Almost every consumer-facing technology company uses an algorithm to personalize your digital experience. From blatant “for you” pages to front pages that adapt based on your location, algorithms are constantly trying to make us engage with content and capture our short attention spans. I will be the first to admit that these algorithms and user experiences are incredibly convenient, entertaining, and sometimes even useful. In the past year, I’ve “discovered” new content creators through YouTube’s homepage, which suggests videos based on other videos I watch. However, these algorithms and user experiences are not good for our mental health or personal growth, and they are ultimately detrimental in the long run.

I’d like to preface this by saying that a lot of these consumer-facing products are free, and as users, we have the right not to use something if we don’t want to. I also understand that these algorithms are literally built to capture our attention and sell us things to make companies more money. However, there should be regulation of these algorithms because they directly impact our brains and how they function. In the following sections, I’d like to dig deeper into what I think needs to happen.

A lot of the issues I have with customization algorithms are related to consent. For example, on a website like Tumblr, you have to intentionally follow someone in order to see their content in your feed. For many years, this was how a lot of social media companies operated. In my opinion, this kind of curation is customized, but it is intentional and user-driven.

Now, on most social media websites and apps, there is some sort of core user flow built around “exploring”. Whether it is a separate section curated “for you”, or inline suggestions that surface content in your main feed based on the accounts you already follow, these are examples of algorithmic customization that is unintentional and data-driven. This kind of customization is highly exploitative and treats the passive interaction of liking something as consent for the algorithm to use your data.

On an even deeper level, there are algorithms that alter your user experience without any user-driven action at all, using your location and other data to determine what you see. Examples of this can be found on most newspaper websites, which show different headlines depending on where in the world you live.

Any algorithm that customizes your user experience should require your explicit consent, and if you don’t want to participate, you should be able to turn it off. I want to reiterate that a lot of these services are free, and if that is indeed the issue, a paid option should be offered that preserves user data privacy with respect to customization.
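To make the idea concrete, here is a minimal sketch of what consent-gated personalization could look like. Everything in it (the User and Post types, the personalization_opt_in flag, the build_feed and recommend functions) is hypothetical and only illustrates the principle: the default is an intentional, chronological feed of accounts you chose to follow, and the recommendation step only runs if you have explicitly opted in.

```python
from dataclasses import dataclass
from typing import List

# Hypothetical data model, for illustration only.
@dataclass
class Post:
    author: str
    timestamp: float
    text: str

@dataclass
class User:
    followed_accounts: List[str]
    # Explicit, off-by-default consent flag: personalization never
    # runs unless the user has deliberately turned it on.
    personalization_opt_in: bool = False

def build_feed(user: User, all_posts: List[Post]) -> List[Post]:
    # Default behavior: intentional, user-driven curation.
    # Only posts from accounts the user chose to follow,
    # shown in reverse chronological order.
    followed = [p for p in all_posts if p.author in user.followed_accounts]
    followed.sort(key=lambda p: p.timestamp, reverse=True)

    if not user.personalization_opt_in:
        return followed

    # Only with explicit consent does the feed go through a
    # data-driven recommendation step.
    return recommend(user, followed, all_posts)

def recommend(user: User, followed: List[Post], all_posts: List[Post]) -> List[Post]:
    # Placeholder for whatever ranking a platform actually uses;
    # left abstract because the point is the consent check above.
    return followed
```

The point of the sketch is simply that the consent check sits in front of the algorithm, rather than being buried in a settings page after the fact.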

When I surveyed a group of my friends, most of them said that they liked digital experiences customized to their preferences. Some agreed that companies should be explicit about how they use user data, and that that kind of transparency would be enough for them to keep letting the algorithms run their course. If tech companies let users turn off their helpful algorithms, I would personally take that opportunity. Algorithms that suggest content end up boxing people into categories and can subconsciously affect how individuals live their lives. We wonder why groups of people are so polarized, but if you look at our digital experiences, they are incredibly powerful at homogenizing us without our noticing.

I hope larger tech companies take this problem seriously and implement ways to improve their users’ real lives, not just their digital lives. Data privacy is getting more and more attention, but it needs to be something that is known to all users and taught from a young age.
