It's *the algorithm* again, feels around the interwebz, and a new emotion for the age of algorithmic anxiety—profile-litost
⭐️ Our Editors Recommend…Algorithmic Anxiety
Almost every internet platform today uses algorithmic recommendations. Facebook, Twitter, Instagram, and TikTok adapt your feed to what will be most engaging for you, based on some combination of captured demographics and online behaviors. Google Maps often reroutes us in ways that may be more convenient, based on variables we never see. The food-delivery app Seamless front-loads menu items based on your recent ordering habits, the time of day, and what is “popular near you.” E-mail and text-message systems supply predictions for what you’re about to type.
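Under the hood, these personalized feeds generally reduce to a scoring function: each candidate item gets a predicted-engagement score from signals like your past behavior, and the feed is just the candidates sorted by that score. Here’s a minimal illustrative sketch; every signal name and weight is invented for the example, not any platform’s actual model:

```python
# Toy feed ranker: scores candidate posts by "predicted engagement."
# Signals and weights are invented for illustration only.

def score(post, user):
    """Combine a few behavioral signals into one engagement score."""
    s = 0.0
    s += 3.0 * user["affinity"].get(post["author"], 0.0)   # how often you engage with this author
    s += 2.0 * user["topic_clicks"].get(post["topic"], 0)  # your recent clicks on this topic
    s -= 0.1 * post["age_hours"]                           # fresher posts rank higher
    return s

def rank_feed(posts, user):
    """Return posts sorted from most to least 'engaging' for this user."""
    return sorted(posts, key=lambda p: score(p, user), reverse=True)

user = {"affinity": {"alice": 0.9}, "topic_clicks": {"dogs": 5}}
posts = [
    {"author": "alice", "topic": "news", "age_hours": 2},
    {"author": "bob", "topic": "dogs", "age_hours": 1},
    {"author": "carol", "topic": "cats", "age_hours": 0.5},
]
feed = rank_feed(posts, user)  # bob's dog post wins: you clicked "dogs" five times
```

Even this toy version shows why the results can feel uncanny: a handful of clicks is enough to reorder everything you see.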
As Kyle Chayka of The New Yorker points out in “The Age of Algorithmic Anxiety,” it can feel a little like “an obnoxious party guest who finishes your sentences as you speak them.” Likewise, in a recent essay for Pitchfork, Jeremy D. Larson described a nagging feeling that Spotify’s algorithmic recommendations short-circuit the process of organic discovery: “Even though it has all the music I’ve ever wanted, none of it feels necessarily rewarding, emotional, or personal.” It can leave us feeling unsure of who we are and what we like, as in “do I really like that, or do I like it because I keep encountering it?” Conversely, it can evoke a sense of discomfort when the algorithm—a metaphor for the combination of online surveillance and recommendation systems that makes up much of current online experience—knows us a little too intimately, leaving us quietly ashamed or even publicly called out.
Folk theories can help us explain to ourselves what’s happening and how we might control it. Whether it’s Airbnb hosts gaming how to appear at the top of guests’ search results, prefacing a Facebook post with a fake wedding announcement, flipping quickly past unwanted TikTok videos, or thumbs-downing Netflix shows, we’ve devised hacks for managing the algorithm’s influence. Users can hardly be blamed for these seemingly irrational responses. After all, tech companies have gone out of their way to keep their systems opaque, both to manage user behavior and to protect trade secrets.
Another approach is to avoid the algorithm altogether, but you’ll quickly encounter limits. Join a new photo-sharing app like Glass and there’s very little content. Host conversations on Discord, and it’s likely you’ll encounter recommendations curated from other platforms anyway. Try to create your own rabbit holes by finding a film you like and watching others by the same director, and the platform you’re watching on will learn from that anyway. And when you check out “new items our editors love” on Etsy, you’ve got to wonder whether there are any human editors at all.
🙌 Feels on the Interwebz
I’m still playing with the format, so this week it’s time for something new. How about a lightning round of internet and emotion?
American Sign Language is evolving in all kinds of ways, and one of the most important drivers of change is video. Thriving ASL communities on YouTube and TikTok not only invent new signs for the times but also adapt existing signs to accommodate the tight space of video screens. Read more about the generational shift and how technology is changing ASL in this New York Times piece.
The research on dog-human emotion entanglement continues with two new studies suggesting that dogs can read emotion from people’s faces and that people can read emotion from dogs’ faces. Other studies have shown that humans can read dog emotion from body movement (tail wags!) or barks, so now we know the face is important too.
One of the problems with artificial intelligence that picks up on emotional signals (aka emotion AI or affective computing) is that it typically understands only one signal at a time. One method might pick up on facial expressions, another on voice, still another on text. So far, putting it all together hasn’t been easy. Researchers at the University of Trento may have cracked the code, though: they recently created an algorithm for multimodal emotion recognition that combines speech and facial recognition. Humans can understand far more than two modalities in the blink of an eye, but it’s still interesting progress.
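The simplest way to combine modalities is what researchers call “late fusion”: each model makes its own prediction, and the predictions are merged afterward. This toy sketch averages two made-up probability distributions; real systems (including the Trento work) learn the fusion rather than hard-coding it, so everything below is illustrative only:

```python
# Toy "late fusion" for multimodal emotion recognition: each modality
# produces its own probability distribution over emotions, and we take
# a weighted average to get one combined prediction. The emotion labels,
# weights, and example numbers are invented for illustration.

EMOTIONS = ["happy", "sad", "angry"]

def fuse(face_probs, speech_probs, face_weight=0.5):
    """Weighted average of two per-modality distributions; returns the top label."""
    combined = {
        e: face_weight * face_probs[e] + (1 - face_weight) * speech_probs[e]
        for e in EMOTIONS
    }
    return max(combined, key=combined.get)

# The face model is unsure; the speech model strongly hears sadness.
face = {"happy": 0.4, "sad": 0.35, "angry": 0.25}
speech = {"happy": 0.1, "sad": 0.8, "angry": 0.1}
label = fuse(face, speech)  # the confident speech signal tips the result to "sad"
```

The appeal of fusion is visible even here: a weak, ambiguous signal in one channel can be rescued by a confident one in another.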
This week I discovered a series of wise parenting TikToks by @menzennial that vividly illustrate how we minimize kids’ feelings—telling them they’re fine, shushing them, and sending them to their room until they can stop crying. The videos show just how confusing it can be for kids when they are expected to handle emotions like adults (well, some adults). Emotional maturity doesn’t come from leaving a child to cope with big feelings on their own. As a parent, I’ll confess there’s a bit of “I’m in this picture and I don’t like it,” especially since it was recommended by “the algorithm,” which brings us to this week’s feeling: profile-litost, or profilitost for short.
Friday Feeling > Profile-Litost
🔑 DEFINITION
A state of agony and torment created by the sudden sight of one’s self, reflected back in a recommendation algorithm.
See also: algorithmic anxiety, feedback theory, I’m in this picture and I don’t like it
📜 A BRIEF HISTORY
Lítost is one of those “untranslatable words,” like saudade in Portuguese or hygge in Danish. The Czech word means something along the lines of “anguish resulting from an acute awareness of one’s own misery.” Its linguistic root is the verb litovat, which means to repent or regret. Experiencing litost requires a certain self-awareness that teeters on the brink of self-loathing, where shortcomings are cast into sharp relief.
Offline, the emotion surfaces when we catch a glimpse of how others see us in an offhand comment or when we notice our reflection in a store window as we pass by. Online, it follows us everywhere.
As recommender systems became a pervasive part of life online, so too did the sudden shame of seeing our deepest desires, recent traumas, and darkest fears reflected back. Facebook and Twitter use algorithmically sequenced feeds, displaying what the platforms determine will be most engaging to each user. Spotify and Netflix introduced personalized interfaces that cater to each user’s tastes. And when TikTok’s For You tab, Instagram’s Explore page, and Pinterest’s recommendations all converge on the same suggestion, it’s natural to wonder “Is this really me?” and then “Is this who the algorithm thinks I am?” That’s profile-litost.
💬 EXPRESSION
Profile-litost is one of those emotions that is hard to articulate. Once in a while, there’s a targeted ad that cuts deep or a sequence of posts that feel a little too on the mark. When that happens, it’s common to joke “I’m in this photo and I don’t like it”, a reference to a retired Facebook reporting option for hiding photos from the timeline. It’s also appropriate to respond with “I feel attacked” or “it me”.
Because its causes are black-box, the first impulse is to decode the mystery.
This may take the form of rationalization, explaining away that uncomfortable recommendation as the result of an errant click on the strangest gifts to buy on Amazon or an atypical late-night, anxiety-fueled exploration of the military method of falling asleep in under one minute.
For the slightly more tech-savvy, it’s typical to blame others in a household or workplace, “Ah, someone else here is looking at elastic-waist pants, so I’m getting the ads too.”
Others may respond to the auto-magic of recommendation algorithms by attempting some practical magic of their own. This involves developing a folk theory about how the algorithm works, such as “I see more posts from people I recently added,” which then leads to an improvised countermove like snoozing or muting new connections for a while.
💗 EXPERIENCE
Profile-litost is like catching a sudden unflattering reflection in your phone’s camera, at an awkward angle and in unforgiving high resolution, which leads to self-loathing at seeing yourself “as you really are.” The feeling appears in your music app’s recommendations, then pops up in your social media feed as suggested likes, follows you in ads as you browse random websites, and resurfaces later as you try to make a purchase.
The fortunate few can keep this digital doppelganger at a distance. For others, there’s a more disturbing emotional journey.
The shame of self-recognition
A flicker of denial in the mental calculations that prove it is not the real you
Or defiance, expressed by confounding the algorithm through strategic clicking, unfollowing, and liking
A flash of restless anger, which may cause you to lash out at the platform, other users, brands, or yourself
Lingering sadness, regret, or self-loathing
🎉 FUN FACT
Spotify is the place where profilitost may be most evident—you’re confronted with the horror of your own musical taste every week in Discover Weekly. As if that weren’t bad enough, you can get dragged for your musical taste by a bot with How Bad Is Your Spotify? After you log in to Spotify, the bot mocks your taste as it calculates the results, a twisted mirror image of Spotify Wrapped.
Since the bot has been "trained on a corpus of over two million indicators of objectively good music, including Pitchfork reviews, record store recommendations, and subreddits you've never heard of," it clearly knows best.
🎩 PERSON OF INTEREST
Milan Kundera didn’t invent the word litost, but the Czech author is the person who has explained it with the most resonance. In The Book of Laughter and Forgetting, when the main character, a student, notices his love interest slowing the pace of her swimming to match his, he sees only his own weakness. That one small moment causes a strong rush of inadequacy, followed by the impulse to lash out. Kundera explains the feeling as, “A state of torment caused by the sudden sight of one’s own misery.”
💡 BIG PICTURE
Pervasive modern recommendation algorithms have translated the untranslatable. An emotion once known only to Czechs (or readers of Kundera) is now a widely understood facet of internet culture.
Rather than being resigned to a constant flicker of self-loathing as a backdrop to your internet experience, try to put profile-litost in perspective. It is a truth universally acknowledged that recommendations are not about you, but about generic tags like age or location or online behaviors that are unreliable at best. While an ad or a post or a song might feel a little too on the nose, it’s actually just a wild guess.
🤔 LEARN MORE
Read about litost and over one hundred other untranslatable emotions in Tiffany Watt Smith’s The Book of Human Emotions
Watch John Koenig’s Dictionary of Obscure Sorrows, which offers new words for emotions
Learn more about “algorithmic folk theories”, the explanations that we come up with to understand how algorithms work on Quartz
X Ambassadors may have captured the essence of litost in their song of the same name
🙈🙈🙈🙈🙈
That’s all the feels for this week!
xoxo
Pamela 💗