The tree of life was dying, and we had run out of ideas on how to save it. We had tried everything we could think of and were hopelessly stuck when a voice chirped through a walkie-talkie, informing us that the mouse in the corner could give us a hint.
As I awaited instructions from a fake mouse in an escape room above a KFC, I couldn't help but think this was a fitting conclusion to my weeklong experiment surrendering my life to algorithms.
If our computers are a window to the online world, algorithms are key mediators whose intervention clearly impacts what we see, and therefore what we do. They filter our search results (Google, Bing), curate our social media feeds (Facebook, Twitter) and make recommendations for new products and experiences (Amazon, Yelp).
For internet users, these algorithms can help us sort through the mass of information at our fingertips. For the companies that develop them, algorithms are tools for amassing valuable data, streamlining the shopping and browsing experience and encouraging us to spend more money. Algorithms are also becoming important tools in the high-stakes arenas of workplace management, financial investment and policing.
Despite algorithms' centrality to the online experience (and increasingly our real-world experience), their inner workings are largely a mystery, since they're the closely guarded intellectual property of the companies that use and develop them. In an effort to better understand algorithms, how well they work and how well they know me, I embarked on an experiment to surrender my life to algorithms for a week. This might not be what the coders had in mind, but that didn't stop me from trying, or from gleaning some helpful insights along the way.
To facilitate the surrender of my free will, I broke down my routine daily choices into five categories: what I wear, what I watch, what I eat, what I listen to and what I do.
Dressing for algorithmic success
To start, I needed some clothes for the week. I made contact with a company called Stitch Fix, which uses algorithms to provide a box of clothing -- the "Fix" -- delivered to your door.
Stitch Fix is open about its use of algorithms in its process, but it does employ a human stylist to make the final decision. "In the styling process, algorithms can do things like detecting people's sizes and preference for price," Daragh Sibley, director of data science at Stitch Fix, told me, "but stylists bring an unparalleled ability to improvise and appreciate a client's full intent or the unique circumstances of each Fix."
We couldn't find any clothing services that were purely algorithm-based, and we even asked Stitch Fix if it would bypass the stylist to just send their algorithm's top recommendations, but that was a no-go. It appears style still has a human element to it... at least for now.
Because I was making a video about this experience, I asked for tops only from Stitch Fix, since my top half is mostly what's featured on screen. The five tops I received were all very different. Some, like the button-up pictured above, were hits. Others, like the thin sweater that my fiancee said made me look like a "hacker from a movie," were misses.
According to the letter from the stylist that accompanied the Fix, the rising temperatures in my area inspired her to send more warm-weather clothes and lighter jackets. It was a very personal detail that I assume was part of the human touch the stylist brought to the Stitch Fix equation. That human touch comes with a $20 styling fee (which Stitch Fix waived for our experiment), and that money gets credited toward any clothes the customer decides to keep.
For food and activities, I relied on the algorithm at Yelp, where folks turn for reviews of local businesses. I thought it would be like killing two birds with one stone, but it was more of a roller coaster.
The first hiccup occurred when the Yelp algorithm recommended a restaurant that was against my dietary preferences. All these algorithms require data about us to understand our likes and dislikes and to make appropriate recommendations, so this was to be expected. I went into the Yelp app and added my dietary preferences. The screen reloaded as the app assured me it was incorporating my new inputs and would provide me with what I was searching for. Ultimately, though, the restaurant I was seeking to avoid ended up in the top spot.
Since the algorithm skipped over my dietary preferences, I decided to skip over its recommendation... my first and only act of rebellion against the algorithms I had set out to surrender to.
After noticing the restaurant I was seeking to avoid paid Yelp for advertising, I reached out to Yelp to ask if advertising status impacts a restaurant's place in the algorithm. A Yelp spokesperson responded, "A user's personal preferences are a key input, among the many inputs factored into the organic search results that each individual user sees on Yelp, but it is not the only one. A business's advertising status is not one of those inputs and is not used by Yelp's algorithm to determine or rank organic search results."
The Yelp spokesperson also mentioned several other inputs that impact a restaurant's placement by the algorithm: "operating hours, service offering, menu content, overall rating, review and photo content, distance from the user, and more."
The Yelp algorithm's recommendations for activities brought me to some parks, stairs, a beautiful dam and a Magic Kingdom-themed escape room. There were also some businesses that were closed for the season or fully booked that I had to skip.
For what to watch, I surrendered to Taste.io, an algorithmic movie recommendation service that spans many different streaming platforms. Like the other algorithms I tested, Taste.io required some data about my movie preferences to inform its recommendations. After rating about 50 movies, each day I watched the first recommended movie or TV show that I hadn't seen before. They were as follows: LA 92, Still Walking, The Road, HBO's limited series John Adams and Before Sunrise.
The Taste.io algorithm definitely picked up on my interest in history with LA 92 and John Adams, and my interest in foreign and independent cinema with Still Walking and Before Sunrise. It was a bit weird watching Before Sunrise, the second film in the Before Trilogy, without having seen the first one, but I was still able to enjoy it. The Road was the one flop for me, just because I thought it took itself a bit too seriously and was generally quite a downer.
For my music algorithm, I surrendered to Apple Music, which already had some data about my musical tastes because I'd been using it for a little while to listen to albums before I buy them outright.
For the week, I listened only to Apple's algorithmically generated playlists. When getting up and going out, I listened to its Get Up! mix, and while chilling outside I listened to its Chill Out mix.
For a passionate music nerd like me, these playlists can't really compete with the ones I make for myself. My favorite playlists were the ones with the most songs I already liked, and none of the new songs I heard really stuck with me.
Neither Taste.io nor Apple immediately responded to a request for comment.
My last algorithmic task of the week was the Magic Kingdom escape room recommended by Yelp. My fiancee (bless her) joined me on this most fitting final challenge: escape the room, escape the algorithms.
The escape room proved too challenging and overwhelming for us to navigate without some sort of help. That help came in the form of hints provided by a tiny mouse in a cage in the corner.
Looking back on that moment, the escape room was something of a microcosm of the internet: filled with different bits of information and experiences and distractions. If you put the information together wrong, the tree of life might die. Your problem wouldn't get solved. You'd be led astray and end up wasting time, energy and money on something that wasn't all that fun, productive or helpful.
The mouse is a prime example of what an algorithm is supposed to be at its best: It's there to help keep your eye on the prize so you don't get overwhelmed, wander off track and have a bad time. It has more information than you do about what options are available, and it's able to communicate them to save you time.
But in the real world, not all algorithms work as well as the helpful little mouse.
Overall, the biggest lesson I learned from this weeklong experiment was that the degree to which these algorithms are able to figure out my desires and act on them is largely dependent on three things: who is developing them, why they're being developed and the data that goes into them.
The more specialized the service and the more data it gathered about me, the better it seemed to work. For example, Stitch Fix requires a style quiz on sign-up to train its algorithm, and the algorithm is ultimately checked by a human stylist. Stitch Fix also says that as you order more Fixes and give more feedback, the experience improves. More and better data leads to more precise results.
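That intuition can be sketched in a few lines of code. The toy content-based recommender below is purely my own illustration (it's not how Stitch Fix, Yelp or Taste.io actually work, since those systems are proprietary): it scores each unseen item by its tag overlap with items the user has already rated, so every new rating adds another signal to every score.

```python
def recommend(user_ratings, catalog):
    """Rank unrated items by signed tag similarity to rated ones.

    user_ratings: {item: +1 (liked) or -1 (disliked)}
    catalog: {item: set of descriptive tags}
    Returns unrated items sorted best-first.
    """
    def score(item):
        tags = catalog[item]
        total = 0.0
        for rated, rating in user_ratings.items():
            overlap = len(tags & catalog[rated])
            union = len(tags | catalog[rated]) or 1
            total += rating * overlap / union  # signed Jaccard similarity
        return total

    unrated = [item for item in catalog if item not in user_ratings]
    return sorted(unrated, key=score, reverse=True)


# Hypothetical tags for titles from my week of viewing.
catalog = {
    "LA 92": {"documentary", "history"},
    "John Adams": {"drama", "history"},
    "Before Sunrise": {"romance", "indie"},
    "The Road": {"drama", "post-apocalyptic"},
}

# Even one "like" starts pulling similar items toward the top.
print(recommend({"LA 92": 1}, catalog))
```

With a single positive rating for a history documentary, the other history title rises to the top; each additional rating refines every score, which is one simple sense in which more and better data leads to more precise results.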
Taste.io was a fun one to surrender to, since it encourages users to rate movies before getting recommendations.
Apple Music's algorithmically generated playlists were underwhelming, and Yelp's algorithm was all over the place. Neither of these algorithms encouraged me to input data, and what little data I did input didn't add a whole lot to the overall user experience.
The algorithms' greatest success of the week was in getting me to spend more money. I spent hundreds of dollars more than I normally would've on food and activities during my week of surrender. And if I had purchased my favorite clothing from Stitch Fix (which charges only for the clothes you keep, and I returned everything it sent me), it would've been hundreds more.
It's clear that the recommendations made by these algorithms are imperfect, like the human beings who produce them. The algorithms sometimes make great suggestions, and they sometimes make bad ones. When you're looking for a new restaurant to try or a new song to listen to, that's not a huge problem. However, the deployment of algorithms in our society isn't limited to these types of decisions. When the stakes are as high as someone's job, someone's health or someone's freedom, it's a different story.
This experiment taught me a lot, but my biggest takeaway is how much more I've got to learn before I can fully grasp the ever-greater role algorithms are playing in guiding our lives.