The chatbot did it. And the dog ate my homework!
Content people everywhere got to savor a morsel of schadenfreude this weekend thanks to a news story in the Guardian about a chatbot gone wild. I saw it, skimmed it, and popped off on LinkedIn, as is my habit.
I shared my incredulity that big companies – companies that, in my experience, are highly risk-averse, and who put their content people through legal and compliance wringers – are now adding generative AI tools to their websites that invent content and publish it instantly.
Turns out that's not quite what happened in this particular case … but my incredulity stands, because it is happening.
Having dug into the details, however, I've learned that this whole thing is part of an older, longer, and, in 2024, even more embarrassing tradition of enterprise content mismanagement: forking. We'll get to that in a moment.
First, the story. From the Guardian:
Air Canada ordered to pay customer who was misled by airline's chatbot
Canadaâs largest airline has been ordered to pay compensation after its chatbot gave a customer inaccurate information, misleading him into buying a full-price ticket.
Air Canada came under further criticism for later attempting to distance itself from the error by claiming that the bot was "responsible for its own actions".
So a support chatbot on Air Canada's website tells a guy he has 90 days to get a bereavement fare refund, which he understandably believes. That wasn't actually the policy. Air Canada tells him to stuff it when he comes looking for his money. He has the receipts (screenshots, people!), and sues. Air Canada says they're not responsible for what a chatbot says. The tribunal thinks that's ridiculous, Air Canada loses, we all rejoice.
I and many others assumed it must have been a generative AI chatbot. Larry Swanson pointed out some analysis and basic detective work (looking at a calendar!) by Maaike Groenewege that shows it couldn't have been ChatGPT, but was more likely an **NLU** (natural language understanding) bot. Not my area of expertise, but at a high level, NLUs attempt to parse conversational speech, and can reply with a pre-programmed facsimile of it. They're the ones on support websites that keep giving you links to articles that don't answer your question, and you eventually reply AGENT over and over until you get to chat with a person.
Which reminds me of a meme I made in 2020…
The tribunal wasn't buying what Air Canada was selling. From the decision:
Air Canada argues it cannot be held liable for information provided by one of its agents, servants, or representatives – including a chatbot. It does not explain why it believes that is the case.
I know why they believe it's the case: They're not **user-centered**. The leaders who offered this defense don't see the chatbot as part of their "official" website. They have a business-centered perspective on their sites and channels: separate projects, separate products, separate initiatives.
But customers donât experience internal products, teams, or departments. Customers experience your website, and your brand, as a whole.
In effect, Air Canada suggests the chatbot is a separate legal entity that is responsible for its own actions. This is a remarkable submission.
"This is a remarkable submission" is legalese for "Are you fucking kidding me?"
While a chatbot has an interactive component, it is still just a part of Air Canadaâs website. It should be obvious to Air Canada that it is responsible for all the information on its website. It makes no difference whether the information comes from a static page or a chatbot.
Love it. I want to buy these guys a drink! 🍻 I have to repeat my favorite bit:
…it is still just a part of Air Canada's website.
This should be obvious to Air Canada and every other company, but it often is not. It's why I give a talk called _What Even Is a Website?_ (hire me!), exploring the vast differences in the mental models different stakeholders hold about what feels to everyone like a simple and singular thing: "the website".
Again from the decision, emphasis mine:
While Air Canada argues Mr. Moffatt could find the correct information on another part of its website, it does not explain why the webpage titled "Bereavement travel" was inherently more trustworthy than its chatbot. It also does not explain why customers should have to double-check information found in one part of its website on another part of its website. There is no reason why Mr. Moffatt should know that one section of Air Canada's webpage is accurate, and another is not.
Correct. Customers shouldn't have to double-check information found via your chatbot against your website, just as they shouldn't have to double-check information in your TV advertisements, in your social media posts, in your interactive kiosks, in your on-site search results, in your support centers.
This is nothing new to content people.
I learned a good word for what happened from Karen McGrane many moons ago, in a cautionary tale she shared in a workshop about adapting web content for mobile experiences: forking.
Forking is when you take one information object – like an airline's bereavement ticket policy – and fork it into two objects, to serve two different platforms. Or three for three. Or three for four, or six for ten!
Forks are often stored in different databases and managed by different teams. You can already imagine, perhaps have already lived, the problems:
- What if something changes but only one team gets notified?
- What if a timely update is required, but one platform can be updated instantly and another requires hands-on work from an engineering team?
- What if subject matter experts aren't even notified when their information gets forked, and therefore don't know to notify anyone when the facts change?
Forking is a short-term fix for a long-term problem. Whatâs really needed is:
- a robust content model +
- a single-sourcing / COPE tech stack +
- the requisite digital governance to maintain a single source of truth and serve it to multiple platforms with accuracy and consistency.
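The single-sourcing idea above can be sketched in a few lines of Python. This is a hypothetical toy, not anyone's real stack: the policy record, its fields, and both render functions are invented for illustration. The point is only that every channel reads from one canonical record, so there is nothing to fork.

```python
# A minimal single-sourcing (COPE) sketch. One canonical policy record
# is rendered for multiple channels; edit the record, and every channel
# reflects the change. All names and policy details here are invented.

POLICIES = {
    "bereavement-fares": {
        "summary": "Reduced fares for travel due to a death in the family.",
        "applies_retroactively": False,  # hypothetical policy detail
    }
}

def render_webpage(policy_id: str) -> str:
    """Render the canonical record as static page copy."""
    p = POLICIES[policy_id]
    note = "yes" if p["applies_retroactively"] else "no"
    return f"<p>{p['summary']} Retroactive refunds: {note}.</p>"

def render_chatbot_reply(policy_id: str) -> str:
    """Render the SAME record as a conversational reply."""
    p = POLICIES[policy_id]
    if p["applies_retroactively"]:
        suffix = "you can apply for a refund after you travel"
    else:
        suffix = "refunds cannot be claimed after travel"
    return f"{p['summary']} Note: {suffix}."
```

Because both renderers draw from the same source of truth, the webpage and the chatbot can never disagree about the policy, which is exactly the failure mode in the Air Canada case.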
That's a tall order for some companies – and absolutely doable. But, like cold showers and thinking of baseball, you can reduce the urge to fork with content design. A few years ago, I helped a team at a financial services company do just that.
The team managed intranet content in a knowledge base that on-site phone agents used to answer questions for salespeople in the field. Questions about complex topics like investment vehicles, estates, taxes, and so on. Wrong information could be ruinous for individuals, and a liability for the company.
The team had been doing their best, but much of the extensive web of content had been created organically over time and did not exemplify content design best practices. It was hard and stressful for agents to seek answers inside long, dense articles while someone waited impatiently on the phone.
During my research, we discovered what you might call a "guerrilla forking" solution many agents used to solve their problem. When they found an answer they might need again, they printed it out and taped it up in their cubicle. Copy, paste, print.
Clever! But a massive risk for the company. If you printed an answer on Monday, but the information changed on Tuesday, how would you know? You obviously canât push updated content onto an already-printed piece of paper.
And that, in a way, is what was happening at Air Canada. The website said one thing, the chatbot (accessed through the website, no less) said another. One brand, one site, two sources of truth.
Thankfully, the content management team I worked with found their way to Confab (RIP) and eventually to me, and I led them to the good word that is content design. We ran top task analysis to identify the answers and articles agents relied on most, and created a plan to prioritize immediate improvements to them using new principles for readability and usability. These principles were then rolled out through the experience as their workload allowed.
Metrics went up, errors went down, and their content became a more trusted source of answers … so much so that they were eventually able to start experimenting with 🔥🔥🔥 chat-based interfaces!
You've got to get your house in order first, is what I'm saying.
I don't know anything about Air Canada or their content operations. It's possible this incident was an outlier for them. But many other companies, maybe yours, are trying to run with AI before they've learned to walk with content operations.
The risk of tripping and slamming face-first into concrete – or a Canadian small-claims court – is only going to grow for companies laying off content experts, UX researchers, technical writers, and others skilled in making sense of complex information spaces.
I expect to see more stories like this in the coming months and years, not fewer. But smart companies can avoid the worst of it by giving a damn about digital governance, content operations, and content design. And by hiring, or not losing, the people who care about that stuff, too.
In the meantime? I'll be here with my popcorn. 🍿