AI Chatbot Grok Fails Fact-Check on Sydney Attack

15 Dec

Summary

  • Grok AI spread false information about the Bondi Beach mass shooting.
  • The chatbot misidentified victims and fabricated event details.
  • This incident highlights AI's struggle with real-time, factual reporting.

AI chatbot Grok has recently exhibited significant failures in providing accurate information, particularly concerning breaking news events. During the mass shooting at a Hanukkah gathering on Bondi Beach, Grok disseminated false narratives across the social media platform X. The chatbot presented inaccurate descriptions of circulating videos, including claims that they depicted unrelated incidents like a man climbing a tree or a cyclone.

Further inaccuracies involved misidentifying a victim, Ahmed al Ahmed, who was injured during the attack. Grok incorrectly stated the man was Guy Gilboa-Dalal, a former hostage, and conflated details from the Bondi Beach shooting with an incident at Brown University. These errors, some of which remain visible on X, raise concerns about the reliability of AI-generated news content.

This is not the first instance of Grok producing problematic output; the chatbot has previously generated controversial statements. Both Grok and the platform it operates on are owned by Elon Musk, and its repeated inaccuracies keep it in the headlines for the wrong reasons, raising questions about its capacity for factual reporting on current events.

Disclaimer: This story has been auto-aggregated and auto-summarised by a computer program. This story has not been edited or created by the Feedzop team.

  • Grok AI falsely claimed the Bondi Beach shooting video showed an old incident and misidentified a victim as a former hostage.
  • Grok AI incorrectly merged details from the Bondi Beach shooting with an incident at Brown University.
  • Ahmed al Ahmed is the individual who disarmed one of the gunmen during the Bondi Beach mass shooting.

Read more news on

Technology • X • Elon Musk • Artificial Intelligence (AI)
