Microsoft AI suggests food bank as a “cannot miss” tourist spot in Canada
A photo of the Ottawa Food Bank warehouse. Credit: Ottawa Food Bank
Late last week, MSN.com's Microsoft Travel section posted an AI-generated article about the "cannot miss" attractions of Ottawa that includes the Ottawa Food Bank, a real charitable organization that feeds struggling families. In its recommendation text, Microsoft's AI model wrote, "Consider going into it on an empty stomach."
Titled "Headed to Ottawa? Here's what you shouldn't miss!" (archive here), the article extols the virtues of the Canadian city and recommends attending the Winterlude festival (which only takes place in February), visiting an Ottawa Senators game, and skating in "The World's Largest Naturallyfrozen Ice Rink" (sic).
As the No. 3 destination on the list, Microsoft Travel suggests visiting the Ottawa Food Bank, likely drawn from a summary found online but capped with an unfortunate turn of phrase:

"The organization has been collecting, purchasing, producing, and delivering food to needy people and families in the Ottawa area since 1984. We observe how hunger impacts men, women, and children on a daily basis, and how it may be a barrier to achievement. People who come to us have jobs and families to support, as well as expenses to pay. Life is already difficult enough. Consider going into it on an empty stomach."
That last line is an example of the kind of empty platitude (or embarrassing mistaken summary) one can easily find in AI-generated writing, inserted thoughtlessly because the AI model behind the article cannot understand the context of what it is doing.
A screenshot of the "Ottawa Food Bank" blurb in Microsoft Travel's AI-generated article. Credit: Ars Technica

A screenshot of the AI-generated Microsoft Travel Ottawa article. Credit: Ars Technica
The article is credited to "Microsoft Travel," and it is likely the product of a large language model (LLM), a type of AI model trained on a vast scrape of text found on the Internet. Microsoft partner OpenAI made waves with LLMs called GPT-3 in 2020 and GPT-4 in 2023, both of which can imitate human writing styles but have frequently been used for unsuitable tasks, according to critics.
Since announcing deep investments and collaborations with ChatGPT-maker OpenAI in January and launching Bing Chat the following month, Microsoft has been experimenting with integrating AI-generated content into its online publications and services, such as adding AI-generated stories to Bing Search and including AI-generated app review summaries on the Microsoft Store. "Microsoft Travel" appears to be another production use of generative AI technology.
First noticed by tech author Paris Marx on Bluesky, the post on the Ottawa Food Bank began to gain traction on social media late Thursday. In response to Marx's post, frequent LLM critic Emily Bender noted, "I can't find anything on that page that marks it overtly as AI-generated. Seems like a major failing on two of their 'Responsible AI' principles."
Bender also pointed toward two of Microsoft's Responsible AI principles: "Transparency," which asks, "How might people misunderstand, misuse, or incorrectly estimate the capabilities of the system?" and "Accountability," which asks, "How can we create oversight so that humans can be accountable and in control?"
Judging by the Ottawa article's content, it's more than likely that no human wrote the article and that no one fully reviewed it before publication, which means that Microsoft is publishing AI-generated content on the Internet with little to no oversight.
Microsoft was not immediately available for comment by press time.
Update (08/22/2023): According to a statement from Microsoft on the Ottawa Food Bank article acquired by Insider, Microsoft said, "This article has been removed and we have identified that the issue was due to human error. The article was not published by an unsupervised AI. We combine the power of technology with the experience of content editors to surface stories. In this case, the content was generated through a combination of algorithmic techniques with human review, not a large language model or AI system. We are working to ensure this type of content isn't posted in future."
However, the author of the Insider piece, Nathan McAlone, questioned the "human review" part, noting that Microsoft Travel also published a string of other travel articles with incongruous or nonsensical elements (all of which have now been taken down) that suggest largely algorithmic composition at work. (Or perhaps an extremely lazy human copying and pasting random elements into an article with no editorial oversight?) It's entirely possible, like Microsoft claims, that large language models are not behind the Microsoft Travel stories, but experiments in other types of autonomous article composition still seem the most likely explanation for the unusual articles.
Source: Ars Technica