Psychologist Abraham Maslow famously posited that human behavior can be classified into a hierarchy of needs, starting with the physiological and ending with self-actualization. I believe the same pyramid exists for human emotional needs: wanting to be seen, wanting to belong, wanting to be admired, and ultimately, wanting to be understood.
Perhaps unsurprisingly, each of those needs maps to an existing social media app behavior (see below). When we post pictures and videos to Instagram, TikTok, and BeReal, we feel seen by our family and friends, and by the larger communities we engage with. When we get requests to connect and see our contacts’ updates, we feel a sense of belonging. And when we get likes, comments, and even monetary tips on our posts, we feel admired. No app, however, has yet been able to fulfill the most elusive emotional need of all: to feel truly and genuinely understood.
That’s why, despite living in a world of retweets, upvotes, and right swipes, we still feel lonely. It’s not a coincidence that the number of new mental health apps is expanding; as we observed in our recent Marketplace 100 report, mental health was far and away the fastest-growing category. It’s also why people are often drawn to anonymous apps, despite the fact that anonymity can lead to cyberbullying.
Thanks to the rise of generative AI, we potentially have a new solution to this long-standing problem. Today’s AI chatbots are more human-like and empathetic than ever before; they’re able to analyze text inputs and use natural language processing to identify emotional cues and respond accordingly. But because they aren’t actually human, they don’t carry the same baggage that people do. Chatbots won’t gossip about us behind our backs, ghost us, or undermine us. Instead, they offer judgment-free friendship, providing us with a safe space when we need to speak freely. In short, chatbot relationships can feel “safer” than human relationships, and in turn, we can be our unguarded, emotionally vulnerable, honest selves with them.
Today’s AI chatbots are able to evoke emotions and provide companionship by asking us probing questions that go deeper into our psyche than chatbots of the past. (In fact, some are finding that these chatbots are better at providing companionship than providing facts.) By pulling information out of us, like therapists, and then retaining perfect recall of every detail we’ve ever told them, these companion chatbots can pattern match our behavior, and ultimately help us understand ourselves better.
Accessible on our mobile phones, companionship chatbots also benefit from being available and attentive 24/7: they never tire, get busy, or need to sleep. If you’re battling insomnia and want someone to talk to at odd hours, or suffer from social anxiety and want to build confidence in your communication skills, your loyal chatbot is ready to help. And with all of the world’s knowledge at the chatbot’s fingertips, one can imagine how they’ll eventually introduce us to new hobbies or languages, making them excellent companions for users who seek personal development.
While companion chatbots have clear advantages, there are still reasons why they can’t completely replace the warmth and depth of human connections. Companion chatbots can’t, for example, give us physical affection yet, nor single-handedly replace a professionally trained therapist. They are also unlikely to consistently challenge us, especially in a way that might hurt our feelings in the short term (e.g., warning us when we start behaving in ways that are detrimental to ourselves). After all, our friends are supposed to give us real feedback, even when it’s hard to swallow.
Over time, however, this “tough love” dilemma could be solved by directing companion chatbots to learn our values and decision-making processes, so that they can ask us tough questions when we need it, and help us change our behavior for the better.
One approach could be to start with an AI assistant that naturally learns your schedule, basic needs, common purchases, decision-making process, friend and colleague networks, and more. Now imagine that digital assistant got to know you so well that you became completely reliant on it to help you plan your life. Eventually, you might see such a productivity tool evolve into a trusted agent and friend, one you might even begin to depend on emotionally.
Another wedge could be a journaling app that occasionally responds with appropriate encouragement and relevant questions as you share your innermost thoughts with it. This way, a one-way engagement can become a conversation that evolves into a two-way relationship with… a companion chatbot. Or, instead of a journaling app, imagine a calendaring assistant that comments on your day, as well as gives you a prompt (and a safe place) to let off some steam.
There are many ways one could go about designing a companion chatbot (and as we’ve noted above, it doesn’t even have to launch as a stand-alone chatbot; it can evolve into one from a different wedge). The winners in this space will be those who best understand end users and can provide them with a safe, low-friction environment where they can be their unguarded, honest selves. Does this mean the best friends of the future, those we share our deepest secrets, goals, and fears with, will be bots? It’s certainly possible.
* * *