Scientific Inquirer: Why Searching for Truth Online Might Be Making Us More Biased

Eugina Leung, assistant professor of marketing, was interviewed by Scientific Inquirer about her research, which found that search engine users tend to reinforce their own beliefs and biases through the search terms they use.
“Search engines like Google or ChatGPT are engineered to deliver the most relevant results for the specific words you used. So, a search for ‘dangers of caffeine’ will return a list of articles about its negative effects. The algorithm is doing its job perfectly, but the result is a narrow slice of information that matches the bias in your original query.”
To read the article in its entirety, visit scientificinquirer.com.