subjectivity as algorithmic impenetrability

Technology may be automating more and more of what we do, but it keeps bouncing off our semantic core. The subjective realm is impenetrable to algorithms, and it will always remain so.

Technology enthusiasts in the West usually take this as a challenge, as if the goal of technology were the complete replacement of humanity. But they are mistaken. As pointed out in an earlier post, they misunderstand how evolution works, and this misunderstanding parallels a paranoia that also has cultural origins.

  • Western cultures break the mind-body duality in favor of the body, and value the objective over the subjective. Since technology automates what is objective, it creates insecurity. People naturally feel threatened and act defensively to protect what is meaningful to them.

  • Western cultures value what humanity creates more than what creates humanity. Nature is viewed as an object to be dominated, manipulated and subjected to the human will. Since technology emerges through humanity, people naturally worry that it will treat its creators with disregard, just as they treat their own creator.

Going back to the topic of this post, here are two great examples of how the subjective realm has proved impenetrable to algorithmic infiltration.

Failure of Algorithmic Seduction: Amazon Case

When I think of creating desire, I think of my last and only visit to Milan, when a woman at an Italian luxury brand store talked me into buying a sportcoat I had no idea I wanted when I walked into the store. In fact, it wasn't even on display, so minimal was the inventory when I walked in.

She looked at me, asked me some questions, then went to the back and walked back out with a single option. She talked me into trying it on, then flattered me with how it made me look, as well as pointing out some of its most distinctive qualities. Slowly, I began to nod in agreement, and eventually I knew I had to be the man this sportcoat would turn me into when it sat on my shoulders.

This challenge isn't unique to Amazon. Tech companies in general have been mining the scalable ROI of machine learning and algorithms for many years now. More data, better recommendations, better matching of customer to goods, or so the story goes. But what I appreciate about luxury retail, or even Hollywood, is its skill for making you believe that something is the right thing for you, absent previous data. Seduction is a gift, and most people in technology vastly overestimate how much of customer happiness is solvable by data-driven algorithms while underestimating the ROI of seduction.

Eugene Wei - Invisible Asymptotes

Seduction is built on the mystique of the unfamiliar. That is why it is much easier to be captivated by someone you have just met. Data-driven algorithms, on the other hand, behave like people who have known you for years.

Seduction is also a two-way process that unfolds dynamically over time. It involves tailoring a physical form around innate desires that are revealed through interaction. Advertisements, on the other hand, are unspontaneous, one-way interactions.

Failure of Algorithmic Aesthetics: Netflix Case

Netflix came to a similar conclusion for improving its recommendation algorithm. Decoding movies’ traits to figure out what you like was very complex and less accurate than simply analogizing you to many other customers with similar viewing histories. Instead of predicting what you might like, they examine who you are like, and the complexity is captured within.

David Epstein - Range (Pages 111-112)

Algorithms can analyze only the explicit syntactic interactions between humans and make indirect inferences about the implicit semantic processes going on within. Since aesthetic judgment is a heavily semantic (subjective) affair, algorithms are better off trying to understand whose aesthetic taste is closer to whom, rather than directly making the judgment calls themselves. In other words, we discover great new songs and movies through each other. Of course, user interfaces hide away this relational complexity and we end up feeling as if the algorithms are making recommendations on their own.
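The "who you are like" approach Epstein describes is, in essence, user-based collaborative filtering: compare viewing or rating histories, find your nearest neighbors, and surface what they liked that you have not yet seen. Here is a minimal sketch of that idea; the ratings data, names, and the `recommend` helper are all invented for illustration, not any real system's API.

```python
from math import sqrt

# Hypothetical toy data: user -> {title: rating on a 1-5 scale}.
ratings = {
    "ann":  {"Heat": 5, "Alien": 4, "Amelie": 1},
    "bob":  {"Heat": 4, "Alien": 5, "Blade Runner": 4},
    "cara": {"Amelie": 5, "Before Sunrise": 4, "Heat": 1},
}

def similarity(a, b):
    """Cosine similarity over the titles both users have rated."""
    shared = set(a) & set(b)
    if not shared:
        return 0.0
    dot = sum(a[t] * b[t] for t in shared)
    norm_a = sqrt(sum(a[t] ** 2 for t in shared))
    norm_b = sqrt(sum(b[t] ** 2 for t in shared))
    return dot / (norm_a * norm_b)

def recommend(user, k=2):
    """Suggest titles that the k most similar users rated highly
    but `user` has not seen."""
    me = ratings[user]
    neighbors = sorted(
        (u for u in ratings if u != user),
        key=lambda u: similarity(me, ratings[u]),
        reverse=True,
    )
    seen = set(me)
    suggestions = []
    for u in neighbors[:k]:
        for title, score in sorted(ratings[u].items(), key=lambda x: -x[1]):
            if title not in seen and score >= 4:
                suggestions.append(title)
                seen.add(title)
    return suggestions

print(recommend("ann"))  # neighbors' favorites that ann hasn't watched
```

Note that nothing here "decodes" the movies themselves: no genre, director, or mood is ever modeled. The semantic judgment lives entirely in other people's choices, which is exactly the point of the passage above.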