Google has begun rolling out restrictions on the types of election-related questions its AI chatbot Gemini will answer, as it tries to prevent the spread of fake news in a year when billions of people will vote worldwide.
The technology giant said that users in India will be restricted in what they can ask Gemini, or at least in the types of questions it will respond to.
It is part of the company's efforts to limit misinformation and disinformation in a year when, according to the Centre for American Progress, more than two billion people in 50 countries will head to the polls.
Some of these elections will be contested freely and fairly, while others will not.
The countries holding votes this year include the US, Mexico, Russia and probably the UK as well.
But by far the biggest is India, where around 900 million people are registered to vote, according to Chatham House.
“With millions of eligible voters in India heading to the polls for the general election in the coming months, Google is committed to supporting the election process by surfacing high-quality information to voters, safeguarding our platforms from abuse and helping people navigate AI-generated content,” Google said in a blog post.
The tech giant laid out a series of non-AI measures it is taking to reduce the harm its platforms might be used to spread.
These include efforts to provide information directly from the Electoral Commission of India on Google Search and YouTube.
But it will also include restrictions on how Gemini can be used.
“Out of an abundance of caution on such an important topic, we have begun to roll out restrictions on the types of election-related queries for which Gemini will return responses,” the Google India team said.
“We take our responsibility for providing high-quality information for these types of queries seriously, and are continuously working to improve our protections.”