Well, it’s that time again. Several countries around the world, including the United States, are getting ready to hold elections this year. Since generative AI makes rapid misinformation a very real threat, Google announced that Gemini will not answer questions about global elections.
Misinformation has probably been an issue ever since mankind learned how to speak. However, it’s much more of a problem now with the advent of AI. While the technology is very capable, it is not immune to generating inaccurate results. Many AI chatbots still hallucinate and produce false information, and that can be devastating when it reaches potential voters.
Because of this, all eyes are on the companies making these AI chatbots. Companies like Google, OpenAI, xAI, Anthropic, and Meta will have to prepare for this.
Google will not let Gemini answer questions about global elections
While we are in the dark about what most of these companies plan to do, Google gave us a clue. The company announced that if anybody asks about any of the numerous elections taking place this year, Gemini will respond with the message, “I’m still learning how to answer this question. In the meantime, try Google Search.” So, if you try to ask it any questions about current US presidential candidates, you’re likely to get stonewalled.
This doesn’t only apply to the United States election. There are elections coming up in other countries as well, and since Gemini is available pretty much around the globe, those countries are just as much at risk of AI hallucinations. So, this seems like a good step for Google.
Right now, the company is struggling with AI. For starters, Gemini’s image generator was paused after it produced pictures of historical figures with inaccurate skin tones, ethnicities, and genders. This is only one example of Google’s rushed approach to AI technology. In fact, a former Google consultant spoke about this in a video.