In a fast-escalating world of technology, AI arrives with promises of unparalleled efficiency and the ability to sift through enormous amounts of data in less than the blink of an eye.
And I confess I was among the first on the bandwagon, asking AI to synthesize research while parking my curiosity and independent thought in favour of the illusion of control that comes with backseat driving.
But beneath that gleaming surface of innovation lies a stark reality: the more we lean into the arms of AI, the further away we drift from the uniquely human insights that truly drive understanding, and empower us to harness human creativity and innovation.
While remarkable in their own right, AI's capabilities are necessarily limited in emotional depth. True empathy and emotional insight, the qualities that underpin human interaction and provide a foundation for decision-making, are absent from even the most sophisticated AI systems.
AI can mimic the recognition of emotional cues, but it cannot feel or understand the real emotional weight behind human expressions.
This gap is not just technical; it's deeply philosophical.
We should challenge the very notion that emotional acuity can be engineered algorithmically. We should also accept that, once a mechanical impersonation passes a certain level of complexity, we may no longer be able to detect it.
But one of AI's biggest failings so far is its inability to pick up on the contextual subtleties that define human culture and society, let alone human nuance.
While AI can process the 'what' of data, it remains largely oblivious to the 'why': the complex tapestry of societal norms, cultural backgrounds, and personal experiences that informs human decisions.
This lack of contextual understanding produces solutions that, though technically correct, are practically out of touch, undermining their utility and acceptance.
A pertinent example of AI’s limitations can be seen in the world of music. AI-driven playlist generation has led to a noticeable convergence in the popularity of new music, with songs becoming increasingly similar and lacking the variation and uniqueness that human curation and creativity encourage.
Nick Cave beautifully articulates the complex, internal human struggle of creation, and argues that the emerging horror of AI is that it will be forever in its infancy: algorithms don't feel, and data doesn't suffer.
In response to a letter in The Red Hand Files noting how many fans were sending in songs ‘written in the style of Nick Cave,’ Cave wrote:
“You did that way too easily. Therefore, it’s not worth anything.”
More than an angry, defiant reaction to an encroaching, competitive artificial music creator, his response presents a compelling case for the necessity of human involvement in creation if we are to truly appreciate ‘what good looks like’ and ‘why it feels good.’
“Judging by this song… it doesn’t look good. The apocalypse is well on its way. This song sucks.”
Removing ourselves from this equation is proving to be misguided, and seems reminiscent of the “I’m Feeling Lucky” button on Google search in the ’90s…
Allowing AI to influence popular creative opinion creates a self-perpetuating cycle, one that underscores the necessity of a human touch in the engine room of creativity to maintain the diversity and richness that define creative expression.
While trust forms the bedrock of human interaction, extending that trust into our relationship with technology is critical but challenging. A mirage manifests in our interactions with AI: the rapid solutions it delivers look superficially trustworthy. Too often, we give AI systems poor prompts and thin context, then accept their outputs as good enough. That acceptance can lead us to establish fabrications as truths, building further knowledge on precarious foundations as if they were real.
Speed and ease make AI an indispensable tool when answers have to be swift. But what is gained in speed is lost in the depth of the solution, which is usually insufficient for complex issues. As Albert Einstein is often quoted as saying, “If I had an hour to solve a problem and my life depended on the solution, I would spend the first 55 minutes determining the proper question to ask, for once I know the proper question, I could solve the problem in less than five minutes.” This underscores the importance of understanding a problem properly before searching for a solution, a cumbersome but largely neglected step now that AI returns results at record speed.
AI's impersonality and its inability to participate in empathetic discussion raise the stakes for trust even higher. It is easy to imagine that AI-driven solutions, beyond feeling alien, could never be trusted to provide the basic emotional and contextual anchoring that human interactions supply. As we march to embed AI deeper into the fabric of our systems, it becomes urgent to acknowledge and correct the shortcomings of these interactions, so that our technological ecosystem remains human-centred and worthy of trust.
As we are enveloped ever more deeply in this era of AI, the promises of efficiency and of handling huge volumes of data are dazzling, to say the least. But that sparkle should not outshine something more important: human insight and creative thinking. Moving through a world increasingly in love with AI, we risk stepping away from the subtlety of understanding and innovation that flows only from the human perspective.
It is at this critical juncture that we must reiterate our commitment to keeping humanity at the core of our technological development. We should lead strongly with user experience research and human-centred design, ensuring each innovation is not only technically excellent but also underscored by empathy, ethical judgement, and a keen sense of culture.
Never has empathetic research been more important.
It is the foundation that will make our technological future not only groundbreaking but inclusive, equitable, and profoundly in sync with what we as instinctive creatures simply ‘know’ to be deep human truths.
The imperative is clear: we must not let the Autopilot take over.
The opportunity to harness this amazing intelligence is immense, but keeping humanity at the core of each innovation will ensure our new ideas and creations neither miss the mark nor fail to understand the customer.
We are always in such a rush to harness the power of data as the ‘easy answer’… but as Nick Cave so eloquently points out, data, in its rush to spit out the ‘what’, totally misses the ‘why’.
Without a deeply resonating ‘why’, an emotionally connecting reason, people tend to engage less. Whether it be a product, a brand, an image… or a song.
For these reasons, we believe the highest-performing and most creative solutions will continue to reward those who ‘do the work’.