Is ChatGPT Listening in On Your Next Informed Consent?
The Association of Clinical Research Professionals
You may be happily using ChatGPT to produce meeting minutes or literature reviews, and your children may be secretly using it to complete essays and term papers, but is the world of medical research ready to have such artificial intelligence (AI) tools producing informed consent forms (ICFs) for clinical trial participants?
Ready or not, various applications of AI are openly and not-so-openly filtering into the mainstream operations of clinical research sponsors and study sites, as in so many other professional settings. Given the importance of the informed consent process and the forms used to explain the purpose, risk/benefit calculation, and scope of a clinical trial to participating patients, speakers slated to dive into the topic at the ACRP 2024 conference urge caution among trial team members who may be hoping ChatGPT will ease the ICF-related burdens on their list of chores.
For one thing, Jeri Burr, MS, RN, PED-BC, CCRC, FACRP, Program Director for the HEAL Pain Effectiveness Research Network and the Data Coordinating Resource Center at the University of Utah, cautions that the companies offering different ChatGPT products may, behind the scenes, be using summaries of a site’s pre-existing ICFs and the contents of new ICF drafts under development to “train” their AI models.
“It’s important to go into the ChatGPT settings and turn OFF the history so that the company will not use your online conversations about your ICFs to train its model,” Burr says. “If you are using ‘pretend’ consent documents to develop your next form, this is not an issue, but if we are looking at real consent documents, just know that the contents of your conversations are being used to train the model. Similarly, if you put up a document and ask for a summary, the document is now part of the training data.”