SEATTLE, Wash. - A new study from the University of Washington found that AI is capable of swaying the political opinions of its users.

Participants included self-identified Democrats and Republicans, each assigned one of three modified versions of ChatGPT: a neutral version, a left-leaning version and a right-leaning version.

Across the board, participants tended to shift toward the political lean of the chatbot they were assigned, even when their initial opinions pointed in the opposite direction.

"Democrats and Republicans were both more likely to lean in the direction of the biased chatbot they talked with than those who interacted with the base model," a press release from the UW said. "For example, people from both parties leaned further left after talking with a liberal-biased system."
