The Trust Equation Health Marketers Must Solve

America post Staff

Beyond one-off success stories, Harlan Schwarz, EVP of media at Inizio Evoke, noted another struggle: striking the right balance between hyper-segmented, customized messaging that speaks to one person and mass-market scalability—especially when the brand is a big blockbuster drug.

“That opens up the question of creativity and being able to find, not the lowest common denominator because you don’t want to set your bar that low, but some midpoint that is going to at least wash over enough people that that micro-targeting then helps to support,” Schwarz said.

Klick’s Kristy Quagliariello

Stopping the spread of misinformation

The discussion then turned to some of the unique challenges that AI and social media misinformation pose for health marketers.

“We really need to be smart about the brand safety of the products that we are marketing. And also just as consumers, be really diligent in who we’re trusting with the questions and the answers,” said Kristy Quagliariello, VP of programmatic media at Klick. She cited TikTok influencers who overhype peptides as alternatives to GLP-1s.

On the AI front, Roshen Mathew, chief AI and innovation officer with SSCG Media Group, worries that patients using ChatGPT for diagnoses start with false premises.

“They’ll go to ChatGPT and say, ‘My doctor says I have this. But if he’s wrong (that’s the false premise), what could I have?’ They can build an entire script to debate with their physician,” Mathew shared. “They’re coming armed to the teeth with this kind of information, and it’s frankly scary.”

CMI Media Group’s Andrew Miller

Goodcuff agreed, pointing out that with the right prompting, users can get AI to tell them whatever they want to hear.

“The reason medical school and residency take so long is because you have to learn how to process patterns and associate them with clinical presentations and weed out the noise,” he explained. “Something like AI that’s not specifically clinically trained, it’s just ingesting raw data. It’s probabilistically telling you what you want to hear, and then you’re running off with that.”
