Personalized algorithms distort learning and foster overconfidence

New research shows that personalized algorithms on platforms like YouTube can hinder learning by limiting exposure to information, even for those with no prior knowledge. Participants in a study explored less material, drew incorrect conclusions, and felt overly confident in their errors. The findings highlight risks for biased understanding in everyday digital interactions.

A study published in the Journal of Experimental Psychology: General reveals how personalized recommendation systems interfere with learning processes. Conducted by Giwon Bahg as part of his doctoral dissertation at The Ohio State University, the research involved 346 online participants who had no background knowledge on the topic. To test the effects, researchers created a fictional task where participants learned to identify types of crystal-like aliens, each defined by six varying features such as shape and color.

In one condition, participants clicked to reveal all six features for each alien. In another, a personalization algorithm guided their choices, steering them toward repeatedly examining the same features and allowing them to skip the rest. Those using the algorithm viewed fewer features overall, and in a selective pattern. Later, when tested on new aliens, they categorized them incorrectly yet expressed high confidence in their answers.

"They were even more confident when they were actually incorrect about their choices than when they were correct, which is concerning because they had less knowledge," Bahg said. Now a postdoctoral scholar at Pennsylvania State University, Bahg noted that algorithms begin building biases immediately, leading to distorted views of reality.

Co-author Brandon Turner, a psychology professor at Ohio State, explained: "People miss information when they follow an algorithm, but they think what they do know generalizes to other features and other parts of the environment that they've never experienced."

The researchers illustrated the bias with a movie recommendation scenario: a user new to films from a country might only see action-thrillers, forming inaccurate assumptions about its cinema and culture. Implications extend to children learning online, where algorithms prioritize content consumption over broad exploration. "If you have a young kid genuinely trying to learn about the world, and they're interacting with algorithms online that prioritize getting users to consume more content, what is going to happen?" Turner asked. Co-author Vladimir Sloutsky, also at Ohio State, contributed to the work.
