Artificial intelligence is becoming increasingly instrumental in daily life thanks to its ability to handle tasks that would otherwise require human intelligence. However, a new study shows that most Americans are uncomfortable with physicians using AI to manage their health. Interestingly, a majority also acknowledges AI's potential to reduce medical mistakes and to address problems with racial bias among providers.
AI adoption is growing in various sectors
Most people have already used AI-powered technology, for example when shopping on Amazon, where AI drives the site's product recommendations based on what you have previously bought. Most Americans like customized services, but that enthusiasm may not extend to healthcare.
Around 66% of participants in a survey sponsored by the Pew Research Center said they were uncomfortable with a doctor relying on AI to make a diagnosis. In addition, approximately 57% said that using AI might worsen their relationship with their provider. Interestingly, 38% felt that using AI to diagnose or recommend treatment might lead to better outcomes, 33% thought it might lead to worse outcomes, and 27% said it wouldn't make a significant difference.
Americans don’t want AI robots doing their surgery
About 6 out of 10 participants in the United States said they wouldn't want AI-driven machines to perform parts of their surgery. Also, 79% of respondents indicated that they would not want AI to play a role in their mental health care. Data security is another worry people have about AI handling medical information.
Pew associate director of research Alec Tyson said that awareness of AI is still growing and that many people are only beginning to understand these technologies. Tyson explained that when the technology touches something as personal as one's own health, that unfamiliarity is an important dynamic at play.
Overall, Americans remain concerned about the pace at which AI is being adopted in medicine and healthcare.