Google Search results polluted by buggy AI-written code frustrate coders

https://www.theregister.com/2024/05/01/pulumi_ai_pollution_of_search/

Analysis Google has indexed inaccurate infrastructure-as-code samples produced by Pulumi AI – a developer tool that uses an AI chatbot to generate infrastructure code – and the rotten recipes are already appearing at the top of search results.

This mess started with Pulumi's decision to publish the result of its users' prompts on a curated AI Answers page. Google's crawlers indexed the resulting robo-responses – but when users find them, the AI answers are often inaccurate.

"It has happened," wrote developer Arian van Putten in a social media post over the weekend. "The number one Google result was an official Pulumi documentation page that was clearly written by an LLM (it had a disclaimer that it was) and hallucinated an AWS feature that didn't exist. This is the beginning of the end."

As The Register opined in 2022 and reported in January this year, search quality has declined because search engines index low-quality AI-generated content and present it in search results. This remains an ongoing area of concern.

Pulumi AI and its online archive of responses, AI Answers, are a case in point. Google's search crawler indexes the output of Pulumi's AI and presents it to search users alongside links to human-authored content. Software developers have found some of the resulting AI-authored documentation and code inaccurate or even non-functional.
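For readers unfamiliar with the tool, a Pulumi AI answer is typically a short program that declares cloud resources through Pulumi's SDKs. A hand-written illustration in TypeScript – not an actual AI Answers page – has this general shape:

    import * as aws from "@pulumi/aws";

    // Declare an S3 bucket configured for static website hosting.
    // This is the usual shape of a Pulumi infrastructure-as-code program;
    // the resource name and settings here are illustrative only.
    const bucket = new aws.s3.Bucket("site", {
        website: { indexDocument: "index.html" },
    });

    // Export the bucket name so it shows up in `pulumi up` output.
    export const bucketName = bucket.id;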

The problem was noted on March 21, 2024 by developer Pete Nykänen in a GitHub Issues post to the Pulumi AI code repository. "Today I was googling various infrastructure related searches and noticed a worrying trend of Pulumi AI answers getting indexed and ranking high on Google results, regardless of the quality of the AI answer itself or if the question involved Pulumi in the first place. This happened with multiple searches and will probably get even worse as the time goes on."

Others have also raised the issue.

A rising tide of muck

Nykänen told The Register in an email that he began noticing Pulumi AI search result issues around the time he posted to GitHub last month.

"As an engineer, I spend a lot of time searching for answers online and it was not difficult to notice the AI answers rising to the top of the search results overnight, even for keywords unrelated to Pulumi itself," he noted. "I filed the issue and hoped that Pulumi would rectify the situation (which they promised to do) but sadly the issue still persists."

"Documentation, especially infrastructure related, is already often incorrect, hard to find, outdated or otherwise missing. While tools like Pulumi AI can provide value to some, filling the internet with unconfirmed, possibly hallucinated, answers is actually pretty malicious. And the longer it goes on, the worse it gets."

With AI content already appearing at the top of search results and more companies building content generation tools, Nykänen said he hopes those involved in AI will consider how their work affects the integrity of the web.

"I don't think it's too late for Pulumi either and hopefully they will decide to hide their AI generated content from search engine scrapers," he suggested.

Aaron Friel, an AI engineer at Pulumi, acknowledged Nykänen's concerns, responding the following day that the company has "taken steps to remove more than half (almost two thirds) of AI Answers, and we plan to continue to ensure that these AI answers are complementary to our existing documentation."

Friel noted that Pulumi also plans to make sure its site mentions real APIs and upstream documentation. Testing generated code is also on the to-do list.

Hello? Google?

That was a month ago, and Google hasn't yet gotten the memo. When The Register tried the keywords cited by Nykänen – "aws lightsail xray" – on Monday, Pulumi AI's answer was the second search result. And when we tried again on Tuesday, it ranked at the top of the page – above the official AWS documentation.

We asked Google what it thought of the situation and a company spokesperson told us it "always aims to surface high quality information, but on some niche topics or unusual queries, there may not be a lot of high quality content available to rank highly in Search."

The search giant also reminded us that its policies mean "Low value content that’s created at scale to manipulate Search rankings is spam, however it is produced", and that recent updates to its tech "reduced low quality, unoriginal content on Search by 45 percent, and aim to tackle unhelpful content that’s designed to rank well in Search."

Microsoft's Bing search engine could be ahead of the game in terms of filtering AI-generated material: it did not have this problem for the same query, though its results included a Chat button that launched an AI-generated response if you took the bait and clicked it rather than just hitting return to submit the query. Brave Search also omitted the Pulumi AI response. DuckDuckGo, meanwhile, returned the Pulumi AI result as the fourth item on its results page for the query.

Another GitHub Issue post on Monday, referring to van Putten's complaint, has asked for the removal of Pulumi AI's answer about AWS EBS direct APIs – which Pulumi evidently does not support.
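The distinction matters because Pulumi's AWS provider does cover the EBS control plane – volumes and snapshots – but, as far as we can tell, has no resource for the block-level EBS direct APIs, so an answer that invents one won't even compile. A rough TypeScript sketch of the difference:

    import * as aws from "@pulumi/aws";

    // Real resources: the Pulumi AWS provider can create EBS volumes and
    // snapshots of them.
    const volume = new aws.ebs.Volume("data", {
        availabilityZone: "us-east-1a",
        size: 8, // GiB
    });
    const snapshot = new aws.ebs.Snapshot("data-snap", {
        volumeId: volume.id,
    });

    // The EBS direct APIs (block-level snapshot reads and writes) are a
    // separate AWS data-plane service. A hallucinated resource such as
    // `new aws.ebs.SnapshotBlock(...)` does not exist in @pulumi/aws, to
    // our knowledge, and would fail TypeScript compilation.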

Several AI hallucinations flagged in March have already been dealt with.

In an email to The Register, Pulumi co-founder and CEO Joe Duffy defended his firm's AI effort – but allowed that more drastic intervention might be called for if the issue can't be adequately addressed.

"Pulumi AI has transformed how most of our customers work, enabling them to navigate a sea of hundreds of clouds with the myriad ways you can use all of their services," Duffy explained. "We processed a 50 percent increase in prompts quarter on quarter, which is a testament to how useful our customers are finding it to their daily work."

A startup that promises to do better ...

Duffy claimed that Pulumi has tested and improved its code quality over time and has seen a double-digit improvement in the success rates for code examples quarter over quarter.

"That said, we know these aren't perfect," he conceded. "Because our AI answers are indexable by Google, they show up in search results. I'll be the first to admit, I was surprised at how highly Google is ranking these pages, since in general they have no inbound links – a far cry from how PageRank used to work – and I would have expected it to prefer our older, more mature content."

Asked when Pulumi first realized its AI had issues, Duffy acknowledged that Pulumi has been aware its AI isn't perfect since it launched last year, and has invested in improving its quality.

"We have a new typechecker loop that feeds back into the AI and improves our results," he explained. "We've tweaked it to be better at Python, and we've taught it about our cloud SDKs. All of these have had material increases in quality – and it will just keep getting better from here. Although there's been some negative sentiment on social media, far and away the feedback we get directly is that the AI is helpful, especially when just getting started in the cloud – it truly is daunting to even get started navigating hundreds of clouds each with tens of thousands of services."

Duffy revealed that Pulumi has already removed 100,000 AI answers and will take down more in future.

Despite the challenges, Duffy expects AI will improve over time. "We move fast and try innovative new ideas regularly – and sometimes they just don't work out the way we intended," he admitted. "If we can’t get to a good place quickly, we will absolutely consider delisting all of them and building back up more slowly."

Duffy added that Pulumi's AI Answers clearly state that they're the product of AI. "Despite the hallucinations, we regularly hear 'Even if imperfect, we prefer to have something 80 percent correct, [rather] than nothing at all'." ®

{
"by": "petetnt",
"descendants": 11,
"id": 40222106,
"kids": [
40224721,
40222163,
40229514,
40224817,
40224894,
40225161,
40224529,
40225773,
40224591
],
"score": 28,
"time": 1714565350,
"title": "Google Search results polluted by buggy AI-written code frustrate coders",
"type": "story",
"url": "https://www.theregister.com/2024/05/01/pulumi_ai_pollution_of_search/"
}
{
"author": "Thomas Claburn",
"date": "2024-05-01T02:21:50.000Z",
"description": "Pulumi claims it has culled bad infrastructure-as-code samples",
"image": "https://regmedia.co.uk/2021/09/08/google_shutterstock.jpg",
"logo": "https://logo.clearbit.com/theregister.com",
"publisher": "The Register",
"title": "Developers seethe as Google surfaces buggy AI-written code",
"url": "https://www.theregister.com/2024/05/01/pulumi_ai_pollution_of_search/"
}
{
"url": "https://www.theregister.com/2024/05/01/pulumi_ai_pollution_of_search/",
"title": "Developers seethe as Google surfaces buggy AI-written code",
"description": "Analysis Google has indexed inaccurate infrastructure-as-code samples produced by Pulumi AI – a developer that uses an AI chatbot to generate infrastructure – and the rotten recipes are already appearing at...",
"links": [
"https://www.theregister.com/2024/05/01/pulumi_ai_pollution_of_search/",
"https://www.theregister.com/AMP/2024/05/01/pulumi_ai_pollution_of_search/"
],
"image": "https://regmedia.co.uk/2021/09/08/google_shutterstock.jpg",
"content": "<div>\n<p><span>Analysis</span> Google has indexed inaccurate infrastructure-as-code samples produced by Pulumi AI – a developer that uses an AI chatbot to generate infrastructure – and the rotten recipes are already appearing at the top of search results.</p>\n<p>This mess started with Pulumi's decision to publish the result of its users' prompts on a curated AI Answers page. Google's crawlers indexed the resulting robo-responses – but when users find them, the AI answers are often inaccurate.</p>\n<p>\"It has happened,\" wrote developer Arian van Putten in a social media <a target=\"_blank\" href=\"https://x.com/ProgrammerDude/status/1784833971731223033\">post</a> over the weekend. \"The number one Google result was an official Pulumi documentation page that was clearly written by an LLM (it had a disclaimer that it was) and hallucinated an AWS feature that didn't exist. This is the beginning of the end.\"</p>\n<p>As <em>The Register</em> <a target=\"_blank\" href=\"https://www.theregister.com/2022/12/06/internet_ai_gpt_ios/\">opined</a> in 2022 and <a target=\"_blank\" href=\"https://www.theregister.com/2024/01/17/google_search_results_spam/\">reported</a> in January this year, search quality has declined because search engines index low-quality AI-generated content and present it in search results. This remains an <a target=\"_blank\" href=\"https://www.reddit.com/r/artificial/comments/1c7fvam/ai_has_made_google_search_so_bad_people_are/\">ongoing</a> area of concern.</p>\n<p><a target=\"_blank\" href=\"https://pulumi.com/ai\">Pulumi AI</a> and its online archive of responses, <a target=\"_blank\" href=\"https://www.pulumi.com/ai/answers\">AI Answers</a>, is a case in point. Google's search crawler indexes the output of Pulumi's AI and presents it to search users alongside links to human-authored content. Software developers have found some of the resulting AI-authored documentation and code inaccurate or even non-functional.</p>\n<p>The problem was noted on March 21, 2024 by developer Pete Nykänen in a GitHub Issues <a target=\"_blank\" href=\"https://github.com/pulumi/pulumi-ai/issues/79\">post</a> to the Pulumi AI code repository. \"Today I was googling various infrastructure related searches and noticed a worrying trend of Pulumi AI answers getting indexed and ranking high on Google results, regardless of the quality of the AI answer itself or if the question involved Pulumi in the first place. This happened with multiple searches and will probably get even worse as the time goes on.\"</p>\n<p>Others have also <a target=\"_blank\" href=\"https://www.reddit.com/r/pulumi/comments/1avos5m/your_seo_spam_with_your_ai_makes_your_tool/\">raised the issue</a>.</p>\n<h3>A rising tide of muck</h3>\n<p>Nykänen told <em>The Register</em> in an email that he began noticing Pulumi AI search result issues around the time he posted to GitHub last month.</p>\n<p>\"As an engineer, I spend a lot of time searching for answers online and it was not difficult to notice the AI answers rising to the top of the search results overnight, even for keywords unrelated to Pulumi itself,\" he noted. \"I filed the issue and hoped that Pulumi would rectify the situation (which they promised to do) but sadly the issue still persists.\"</p>\n<p>\"Documentation, especially infrastructure related, is already often incorrect, hard to find, outdated or otherwise missing. 
While tools like Pulumi AI can provide value to some, filling the internet with unconfirmed, possibly hallucinated, answers is actually pretty malicious. And the longer it goes on, the worse it gets.\"</p>\n<p>Nykänen argued that with AI content already appearing at the top of search results and more companies creating content generation tools, he hopes that those involved in AI consider how their work impacts the integrity of the web.</p>\n<p>\"I don't think it's too late for Pulumi either and hopefully they will decide to hide their AI generated content from search engine scrapers,\" he suggested.</p>\n<p>Aaron Friel, an AI engineer at Pulumi, acknowledged Nykänen's concerns, responding the following day that the developer has \"taken steps to remove more than half (almost two thirds) of AI Answers, and we plan to continue to ensure that these AI answers are complementary to our existing documentation.\"</p>\n<p>Friel noted that Pulumi also plans to make sure its site mentions real APIs and upstream documentation. Testing generated code is also on the to-do list.</p>\n<h3>Hello? Google?</h3>\n<p>That was a month ago, and Google hasn't yet gotten the memo. When <em>The Register</em> tried the keywords cited by Nykänen on Monday \"aws lightsail xray\" – <a target=\"_blank\" href=\"https://www.pulumi.com/ai/answers/bLHAi4DutXJvbJyNngGRvS/optimizing-aws-lightsail-and-x-ray-deployment\">Pulumi AI's answer</a> was the second search result. And when we tried again on Tuesday, it ranked at the top of the page – above the official AWS documentation.</p>\n<p>We asked Google what it thought of the situation and a company spokesperson told us it \"always aims to surface high quality information, but on some niche topics or unusual queries, there may not be a lot of high quality content available to rank highly in Search.\"</p>\n<p>The search giant also reminded us that it policies mean \"Low value content that’s created at scale to manipulate Search rankings is spam, however it is produced\", and that recent updates to its tech \"reduced low quality, unoriginal content on Search by 45 percent, and aim to tackle unhelpful content that’s designed to rank well in Search.\"</p>\n<p>Microsoft's Bing search engine could be ahead of the game in terms of filtering AI-generated material as it did not have this problem for the same query, though results it produced included a Chat button that launched an AI-generated response if you took the bait and clicked rather than just hitting return to submit the query. Brave Search also omitted the Pulumi AI response. 
DuckDuckGo, meanwhile, returned the Pulumi AI result as the fourth item on its search results page for the <a target=\"_blank\" href=\"https://duckduckgo.com/?va=g&amp;t=hk&amp;q=aws+lightsail+xray&amp;ia=web\">query</a>.</p>\n<ul>\n<li><a target=\"_blank\" href=\"https://www.theregister.com/2024/04/30/bruce_perens_post_open_license/\">Open Source world's Bruce Perens emits draft Post-Open Zero Cost License</a></li>\n<li><a target=\"_blank\" href=\"https://www.theregister.com/2024/04/30/european_commission_launches_proceedings_meta_misinformation/\">European Commission starts formal probe of Meta over election misinformation</a></li>\n<li><a target=\"_blank\" href=\"https://www.theregister.com/2024/04/30/kill_killer_robots_now/\">Politicians call for ban on 'killer robots' and the curbing of AI weapons</a></li>\n<li><a target=\"_blank\" href=\"https://www.theregister.com/2024/04/29/openai_hit_by_gdpr_complaint/\">OpenAI slapped with GDPR complaint: How do you correct your work?</a></li>\n</ul>\n<p>Another GitHub Issue <a target=\"_blank\" href=\"https://github.com/pulumi/pulumi-ai/issues/83\">post</a> on Monday, referring to van Putten's complaint, has asked for the removal of Pulumi AI's <a target=\"_blank\" href=\"https://www.pulumi.com/ai/answers/e2Zg4j16E4Wnd6cUhN55ML/unlocking-aws-ebs-snapshot-capabilities-with-typescript\">answer</a> about AWS EBS direct APIs – which Pulumi evidently does not support.</p>\n<p>Several AI hallucinations flagged in March have <a target=\"_blank\" href=\"https://github.com/pulumi/pulumi-ai/issues/68\">already</a> <a target=\"_blank\" href=\"https://github.com/pulumi/pulumi-ai/issues/76\">been</a> <a target=\"_blank\" href=\"https://github.com/pulumi/pulumi-ai/issues/75\">dealt with</a>.</p>\n<p>In an email to <em>The Register</em>, Pulumi co-founder and CEO Joe Duffy defended his firm's AI effort – but allowed that more drastic intervention might be called for if the issue can't be adequately addressed.</p>\n<p>\"Pulumi AI has transformed how most of our customers work, enabling them to navigate a sea of hundreds of clouds with the myriad ways you can use all of their services,\" Duffy explained. \"We processed a 50 percent increase in prompts quarter on quarter, which is a testament to how useful our customers are finding it to their daily work.\"</p>\n<h3>A startup that promises to do better ...</h3>\n<p>Duffy claimed that Pulumi has tested and improved its code quality over time and has seen a double-digit improvement in the success rates for code examples quarter over quarter.</p>\n<p>\"That said, we know these aren't perfect,\" he conceded. \"Because our AI answers are indexable by Google, they show up in search results. I'll be the first to admit, I was surprised at how highly Google is ranking these pages, since in general they have no inbound links – a far cry from how PageRank used to work – and I would have expected it to prefer our older, more mature content.\"</p>\n<p>Asked when Pulumi first realized its AI had issues, Duffy acknowledged Pulumi has been aware its AI isn't perfect since it launched last year, and has invested to improve its quality.</p>\n<p>\"We have a new typechecker loop that feeds back into the AI and improves our results,\" he explained. \"We've tweaked it to be better at Python, and we've taught it about our cloud SDKs. All of these have had material increases in quality – and it will just keep getting better from here. 
Although there's been some negative sentiment on social media, far and away the feedback we get directly is that the AI is helpful, especially when just getting started in the cloud – it truly is daunting to even get started navigating hundreds of clouds each with tens of thousands of services.\"</p>\n<p>Duffy revealed that Pulumi has already removed 100,000 AI answers and will take down more in future.</p>\n<p>Despite the challenges, Duffy expects AI will improve over time. \"We move fast and try innovative new ideas regularly – and sometimes they just don't work out the way we intended,\" he admitted. \"If we can’t get to a good place quickly, we will absolutely consider delisting all of them and building back up more slowly.\"</p>\n<p>Duffy added that Pulumi's AI Answers clearly state that they're the product of AI. \"Despite the hallucinations, we regularly hear 'Even if imperfect, we prefer to have something 80 percent correct, [rather] than nothing at all'.\" ®</p> \n </div>",
"author": "",
"favicon": "https://www.theregister.com/design_picker/13249a2e80709c7ff2e57dd3d49801cd534f2094/graphics/favicons/favicon.svg",
"source": "theregister.com",
"published": "2024-05-01t07:32:08z",
"ttr": 274,
"type": "article"
}