CNET: The Scientific Reason Why ChatGPT Leads You Down Rabbit Holes

Eugina Leung, assistant professor of marketing, was interviewed by CNET for a story about her research showing that search engines tend to reinforce users' existing beliefs because of the search terms they use.
"When people look up information, whether it's Google or ChatGPT, they actually use search terms that reflect what they already believe," Eugina Leung, an assistant professor at Tulane University and lead author of the study, told me.
To read the story in its entirety, visit cnet.com.
Other Related Articles
- PsyPost: Scientists show how you’re unknowingly sealing yourself in an information bubble
- Scientific Inquirer: Why Searching for Truth Online Might Be Making Us More Biased
- CBS Detroit: How is AI impacting entry-level jobs?
- AACSB Insights: Reinventing Teamwork - AI in the Business Classroom
- The National Desk: CBS investigation finds hundreds of Meta platforms with ‘Nudify’ advertisements
- New Tulane study finds generative AI can boost employee creativity—but only for strategic thinkers
- Tulane study finds smaller companies get kinder online reviews - and empathy is the reason why
- Four honored with Freeman research awards