Scientific Inquirer: Why Searching for Truth Online Might Be Making Us More Biased
Eugina Leung, assistant professor of marketing, was interviewed by Scientific Inquirer about her research finding that search engine users tend to reinforce their own beliefs and biases as a result of the search terms they use.
“Search engines like Google or ChatGPT are engineered to deliver the most relevant results for the specific words you used. So, a search for ‘dangers of caffeine’ will return a list of articles about its negative effects. The algorithm is doing its job perfectly, but the result is a narrow slice of information that matches the bias in your original query.”
To read the article in its entirety, visit scientificinquirer.com.