Are we in charge of our own decisions, or is it the algorithm?
Convenience reduces friction, but it may also reduce ownership.
I can’t remember the last time I actively decided to watch something on a streaming platform. Not because I couldn’t, but because I didn’t have to. My subscription bills are high enough to prove it; my helpers go by the names Crave, Netflix and Disney+.
They greet me with the same quiet confidence: Here’s what we recommend for you. Somehow, they’re rarely wrong. Or at the very least, rarely inconvenient. They even remind me of my own habits: Since you watched Avatar, you should also watch [insert all the other movies I’ve already watched], as if I needed reminding.
The real question
It’s not just television. Think about Spotify. You open the app and click on “Discover Weekly” to hear a playlist curated from your latest listens, ready to help you find your niche. No scrolling through genres, no asking yourself what you’re in the mood for. Zero effort.
This passivity carries over to most of the apps we use. So, the real question isn’t whether the algorithm knows us well enough. Instead, why are we so comfortable letting it decide for us?
The answer
We don’t let algorithms decide because we think they’re smarter; rather, we do it because it’s cognitively easier. In a world overflowing with options, making choices has never felt more exhausting.
Every choice, from what to watch to what to wear to what to eat, draws from the limited pool of mental energy we spend daily. We want to believe we’re optimizing our lives, but really, we’re just trying not to get overwhelmed by them.
The personal development world has long understood this. As Psychology Today explains in its article, “Thriving in a World of Decision Fatigue,” our mental energy depletes with repeated choices.
Countless books I’ve read have also pointed to how hyper-successful entrepreneurs wear the same outfit every day to reduce decision fatigue. The logic is straightforward: fewer decisions, better performance.
Algorithms are designed with the same end goal in mind. They narrow the field, filter the noise and present a shortlist. This reduces friction. However, in doing so, algorithms don’t just simplify our options. They subtly shift the responsibility of making choices away from us and onto the system itself.
When I was younger, my options were simple. I had a handful of television channels. Nickelodeon, Disney, Cartoon Network, and you know the rest. If I wanted to watch a specific show, I had to wait for it. Shows aired at specific times on specific days. If you missed it, you missed it.
At eight years old, choosing which channel to watch felt like a life-or-death decision. But looking back, the choices were manageable and finite.
As technology accelerated, so did the number of options available to us. There was a time when algorithms were not nearly as refined as they are today. Think about the early days of Instagram. Your feed was simple. It showed the content from accounts you followed, in chronological order.
There was no ranking, no predictive sorting, just posts as they came. Nowadays, content is ranked, filtered and pushed toward the audiences most likely to engage with it. What you see is no longer determined simply by who you follow; it’s what the system predicts you’ll respond to.
There’s something reassuring about knowing that the content placed in front of you has been calculated around your “interests”: the friction disappears. Your brain doesn’t have to work as hard to decide what deserves your attention. If I’m being honest, in the moment, it feels great. I’d pay a premium for convenience. But at what cost?
Behind the scenes, engineers translate human behaviour into mathematics. Algorithms sort and reorder content based on the probability that you will engage with it. The result is that, over time, recommendation systems shape our exposure. The more data we feed them through clicks, likes, pauses and searches, the more our platforms mirror us back to ourselves. We see more of what we already like, or what we’re statistically likely to enjoy.
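The core idea is simpler than it sounds. Here is a toy sketch in Python (not any platform’s actual system; the topics, scores and profile are invented for illustration) of a feed sorted by predicted engagement, where topics you have never clicked on sink to the bottom:

```python
def rank_feed(posts, predict_engagement):
    """Order posts by predicted probability of engagement, highest first."""
    return sorted(posts, key=predict_engagement, reverse=True)

# Hypothetical interest profile, learned from past clicks and watch time.
user_interests = {"cooking": 0.9, "politics": 0.2, "travel": 0.6}

def predict_engagement(post):
    # Stand-in for a real model: score a post by how much this user has
    # engaged with its topic before. Unfamiliar topics score near zero,
    # which is exactly how a feed starts mirroring you back to yourself.
    return user_interests.get(post["topic"], 0.05)

posts = [
    {"id": 1, "topic": "politics"},
    {"id": 2, "topic": "cooking"},
    {"id": 3, "topic": "philosophy"},  # a topic you've never clicked on
    {"id": 4, "topic": "travel"},
]

feed = rank_feed(posts, predict_engagement)
print([p["topic"] for p in feed])  # ['cooking', 'travel', 'politics', 'philosophy']
```

Note what the sort quietly does: “philosophy” isn’t rejected, it simply never surfaces, and because you never see it, you never click it, and the profile never changes. That feedback loop is the mirroring described above.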
The same pattern goes beyond entertainment. Autocomplete functionalities shape our expression, navigation apps shape our movement, and AI tools shape our workflows. Each one increases efficiency, which is the engineering and economic goal.
But what happens when something goes wrong? Notice how responsibility diffuses. You send the wrong word in a text to someone and say, “It was autocorrect.” You watch something disappointing and think, “Well, it was recommended.” The system absorbs part of the blame. The shift is subtle but significant. What started as reduced effort slowly becomes reduced ownership.
There’s something quietly disappearing in the age of personalization: the art of accidental discovery. Think of taking a wrong turn and stumbling on a new cafe. Algorithms tend to remove the “wrong turns,” which makes things more accurate but leaves less room for serendipity and surprise.
Surprise is often where growth happens. When you allow yourself to explore, you find ideas outside of your norm and experience more discomfort. The more your platforms narrow content, the less likely you are to expand intellectually across design, fashion, politics, lifestyle or thought itself. Efficiency expands convenience, but it can shrink range.
There’s a moment in the movie Eat Pray Love where the protagonist, Elizabeth Gilbert, admits she’s lost her appetite. Not just for food, but for life itself. She longs to feel wonder again and to experience something that isn’t pre-selected or predictable. That feeling isn’t really about hunger; it’s about the desire to be surprised.
We know ease has trade-offs. Friction is often what forces us to think. When friction is engineered out, we practice thinking less. So, what kind of decision-maker do we really become when everything is optimized?
Friction is needed. Every philosophy reminds us of this in some form. Think of yin and yang, or tension and release. In a world engineered for optimization, maintaining autonomy will require something increasingly rare: intentional effort. This is not rejection of technology or fear of algorithms, but awareness. Algorithms are tools, and tools do exactly what they are designed to do.
So, the question is: are we still conscious designers, or have we become passive recipients?