I just had a conversation with a Neiman’s call-center representative – it didn’t go well (at least, not from my perspective – and probably not from hers). Neiman’s is a company that seriously needs to pay attention to its call center personnel, because most are far from sufficiently equipped for the job. But, that’s another story for another day.
Today’s Neiman’s experience, however, did put me in mind of a recent article from the WSJ about how bots are expected to help call center personnel by:

1 – prompting the Call Center Person (CCP) to “cue to empathy” if the bot detects, from the customer’s tone of voice, that the customer is frustrated or angry;

2 – indicating the customer’s level of satisfaction for the CCP, again based on the customer’s tone of voice; and

3 – keeping track of who is speaking and when. This last should be particularly useful for the supervisors and managers monitoring their CCPs; I have been known to say to a particularly vexing CCP, “I hope this call is being recorded, so your manager can review it and take appropriate action.”

Thus, it is anticipated that with these forms of assistance from the bot, the human CCP can actually be made more human.

There was a time, not so many years ago, when human CCPs knew these things themselves and could “cue to empathy” and respond with other such human reactions appropriately, as needed. With Millennials joining the workforce, however, much of that human sensitivity has been lost. It’s what happens when one raises children by technology rather than by human interaction. I think Professor Jordan Peterson of the University of Toronto would probably be among the first to say that if parents aren’t involved in making human children human (by admonishing them when they’re being rude or insensitive), then it will be up to bots to help this generation and the next gain some expertise in appropriate human interactions.