Scientific Inquirer: Why Searching for Truth Online Might Be Making Us More Biased

Eugina Leung, assistant professor of marketing, was interviewed by Scientific Inquirer about her research finding that search engine users tend to reinforce their existing beliefs and biases through the search terms they choose.
“Search engines like Google or ChatGPT are engineered to deliver the most relevant results for the specific words you used. So, a search for ‘dangers of caffeine’ will return a list of articles about its negative effects. The algorithm is doing its job perfectly, but the result is a narrow slice of information that matches the bias in your original query.”
To read the article in its entirety, visit scientificinquirer.com.