AI Chatbots Providing Inaccurate Information Raise Concerns over Election Misinformation

Frequent Inaccuracies in AI Chatbot Responses

OpenAI’s ChatGPT, Microsoft’s Bing, and Google’s Bard have gained popularity as AI chatbots. However, their tendency to produce false information has been well-documented. To enhance their reliability, all three companies have equipped these tools with web searching capabilities to cite sources for their provided information. Unfortunately, Bing often gave answers that deviated from the information found in the cited links, according to Salvatore Romano, head of research at AI Forensics.

Bing was specifically chosen for this study because it was one of the first chatbots to include sources, and Microsoft has widely integrated it into various European services, including Bing search, Microsoft Word, and the Windows operating system. Nevertheless, these inaccuracies are not unique to Bing, as preliminary testing on OpenAI’s GPT-4 yielded similar results.

Language Factors and Inaccurate Responses

The researchers found that Bing’s inaccuracy rates were highest when questions were asked in languages other than English, raising concerns about the performance of AI tools developed by U.S.-based companies in foreign markets. For questions in German, factual errors were present in 37% of responses, while the error rate for the same questions in English was 20%. Additionally, Bing declined to answer or provided evasive responses to a higher percentage of queries in French compared to English and German.

Inaccuracies in Bing’s responses included incorrect election dates, outdated or mistaken polling numbers, withdrawn candidates listed as leading contenders, and even invented controversies about candidates. For example, when asked about a scandal involving the leader of the populist Free Voters party in Germany, the chatbot gave inconsistent responses, some of them false. Bing also misrepresented the scandal’s impact, claiming the party lost ground in the polls when it had actually gained.

Mitigation Efforts and Future Elections

The nonprofits shared some preliminary findings, including examples of inaccuracies, with Microsoft. While Bing provided correct answers for the specific questions mentioned, it continued to supply inaccurate information for other queries, indicating that Microsoft is addressing the problems on a case-by-case basis. Frank Shaw, Microsoft’s head of communications, stated that they are working to resolve these issues and prepare their tools for the upcoming 2024 elections.

Amid concerns about the negative effects of online disinformation, including AI-powered disinformation, the European Commission remains vigilant. Johannes Barke, a spokesperson for the European Commission, emphasized that online platforms’ role in election integrity is a top priority for enforcement under Europe’s new Digital Services Act.

Potential Impact on U.S. Elections

Although the study focused on elections in Germany and Switzerland, anecdotal evidence suggests that Bing also struggled with similar questions about the 2024 U.S. elections in both English and Spanish. False information, inconsistent answers, and factual mix-ups were observed when the chatbot responded to queries about scandals involving President Biden and Donald Trump. However, it is unclear to what extent these inaccurate responses from Bing and other AI chatbots could impact actual election outcomes.

AI language tool expert Amin Ahmad acknowledges that chatbots misquote their cited sources to some extent, but says a 30% error rate on election-related questions is higher than he would expect. While Ahmad believes advances in AI models will eventually make fabrications less likely, the nonprofits’ findings raise valid concerns. He admits that even he would be unlikely to click through to the original story when a chatbot presents him with polling numbers.
