Chatbots are Social

The chatter about chatbots has died down since the spring. But the concept, and the software itself, are still in embryo.

Chatbots are software designed to replicate human interaction, a sort of more intuitive Siri with the ability to shift the dialogue within a conversation, driven either by rules or by machine learning.
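The rule-based half of that distinction is the simpler one: the bot matches what you type against a fixed set of patterns and picks a canned reply. A minimal sketch, in Python, with entirely made-up rules and responses:

```python
import re

# A toy rule-based chatbot: each rule pairs a pattern with a scripted reply.
# The patterns and copy below are illustrative, not from any real product.
RULES = [
    (re.compile(r"\b(hi|hello|hey)\b", re.I), "Hello! How can I help you today?"),
    (re.compile(r"\bhours?\b", re.I), "We're open 9am to 5pm, Monday to Friday."),
    (re.compile(r"\b(price|cost)\b", re.I), "Plans start at $10 a month."),
]

FALLBACK = "Sorry, I didn't catch that. Could you rephrase?"

def reply(message: str) -> str:
    """Return the first matching rule's response, or a fallback."""
    for pattern, response in RULES:
        if pattern.search(message):
            return response
    return FALLBACK
```

A machine-learning bot replaces that hand-written rule list with a model trained on conversations, which is what lets it handle wording no one scripted in advance.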

Facebook focused attention on chatbots when it launched its chatbot platform in April. By June 30th it had close to 11,000 chatbots available in Messenger.

Chatbots work within messaging apps, and as this simple introduction to these machine humans points out, "more people now use messenger apps than social networks."

But what's all this got to do with marketing and public relations?

That's still to be figured out. But digital sage Shel Holtz held a webinar in April (which I couldn't attend) urging communications professionals to examine them because:

Chatbots will play a big and important role in all corners of PR and corporate communications. Chatbots are a future arriving like a freight train that has lost its brakes.

There are likely more, but here are three roles for communicators in the developing chatbot ecosystem, each of them reason enough for PR and marketing skeptics to acknowledge and explore chatbots.

Coding the conversation

No offence to software engineers, but building a human interaction should be the province of people who can lend conversations charm and neighborliness and—since marketers will be all over them—a consistent brand voice. Coding the chatbot conversation will, or should, have the feel of a story, with give-and-take, not a messaged interaction or rote response.
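That story-like give-and-take can be sketched as a branching script: each step carries a line written in the brand's voice and a set of user choices that decide where the conversation goes next. The flow, brand, and copy below are hypothetical, purely to show the shape:

```python
# Illustrative conversation flow: each state holds a branded line ("say")
# and a map from recognized user choices to the next state ("next").
# All names and copy here are invented for the example.
FLOW = {
    "greet": {
        "say": "Hey there! Looking for sneakers or boots today?",
        "next": {"sneakers": "sneakers", "boots": "boots"},
    },
    "sneakers": {
        "say": "Great choice! Running or casual?",
        "next": {"running": "done", "casual": "done"},
    },
    "boots": {
        "say": "Nice! Hiking or everyday wear?",
        "next": {"hiking": "done", "everyday": "done"},
    },
    "done": {"say": "Perfect, here are some picks for you.", "next": {}},
}

def step(state: str, user_input: str) -> tuple:
    """Advance the conversation: return (next_state, bot_line)."""
    choice = user_input.strip().lower()
    # Stay in the current state (and repeat its line) on unrecognized input.
    next_state = FLOW[state]["next"].get(choice, state)
    return next_state, FLOW[next_state]["say"]
```

The point of the structure is that a writer, not just an engineer, can own the `say` strings, which is exactly where charm, neighborliness and a consistent voice live.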

Digital strategy integration 

Chatbot use should be part of a 360-degree digital strategy, not just a standalone customer service tool. And digital strategies should be about humanizing the interplay between an organization and its customers, constituents, advocates or stakeholders. The 'human', not the code, has to lead chatbot development.

Managing the problems

Microsoft learned the hard way with Tay that we are a long way from a discerning machine intelligence that can avoid human flaws like racism and misogyny. Faced with a crisis in which harm has been caused, many executives still have trouble showing empathy or making an apology, even when the evidence supports doing both. As Tay unfortunately showed, the chances of causing toxic offence with machine- or rule-based dialogue are huge. Chatbot owners need to be ready for the inevitable obnoxious and embarrassing conversational mistakes.

