When knowing less is worth more

America post Staff



For decades, NBCUniversal’s “The More You Know” campaign has promoted the idea that knowledge is always a public good. And there’s certainly truth in that. But we’ve all watched a movie character who starts to know too much soon meet their demise. As we navigate a reality inundated with an ever-growing amount of information, data, and artificial intelligence, the recent unveiling of Banksy’s identity suggests it’s time to reconsider the ‘more you know’ mentality.

Reuters published an investigation revealing Banksy’s identity, arguing the piece was a matter of public interest. I was surprised by the backlash that followed. While plenty of people flocked to the article, others actively avoided it simply because they prefer not to know. Perhaps the same part of us that is awed by a good magic trick and motivated to hypothesize about its secrets also revels in the mystery of Banksy. Indeed, his anonymity was critical to his impact. Obscurity allowed his work to be interpreted detached from his own socioeconomic, political, and personal identities, creating resonance with a broader audience.

Now that audience has the opportunity to know the decades-old secret. Yet, in an attention economy built entirely on information consumption, people were opting out. Deliberately.

THE OPT-OUT MOMENT

In another opting-out moment, Anthropic refused to remove two restrictions from its Pentagon contract: no mass domestic surveillance and no fully autonomous weapons. Ultimately, it lost the $200 million contract to a competitor that met the requirements.

And users noticed. The day after OpenAI announced its Pentagon deal, ChatGPT uninstalls spiked 295%. The day Anthropic refused the deal, Claude’s installs surged 37%, then rose another 51% the following day. For the first time, U.S. downloads of Claude surpassed ChatGPT, and Claude rose to number one in Apple’s U.S. App Store.

By saying no, Anthropic didn’t just earn goodwill. It earned market share.

Restraint, in this case, became a competitive advantage.

As builders of mobile apps and web products, we routinely see clients of all sizes who want to gather as much user information as possible. For 20 years, the dominant product logic has been to collect everything you can, maximize engagement, and optimize for the individual transaction. More data. More reach. More features. Each new AI model is more knowledgeable and capable than the last, and the assumption has been that capability equals value.

As AI advances, the scale of its data collection is astounding, and its ability to put that data to use in human-like ways, whether composing music, creating images and film, or formulating business strategies, is already remarkable. Do we have the restraint to resist turning to it for everything? And how can that restraint be used as a true product or brand differentiator?

WHAT DO USERS WANT?

Users are asking different questions now. Not just ‘what does this product or information do for me?’ but ‘what does this do to us?’ And they are deliberately choosing where to invest their attention based on the answers. We’re seeing a shift toward smaller, more intimate digital spaces, and trusted brands as a result.

I’ve noticed another shift within our conversations with clients. The classic line of questioning, ‘What should we build?’, is now accompanied by a new one: ‘What should we never do?’

My recommendation to product teams is to scrutinize every bit of data and establish a clear purpose for everything captured. Furthermore, be transparent and straightforward about that with users. Work out your true answers to the harder questions on everything from user data to AI strategy to brand values. Let those answers guide your business and product decisions.
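One lightweight way to put that recommendation into practice is a purpose registry: every field a product captures must declare why it is collected, and collection of anything unjustified fails loudly. This is a minimal sketch of the idea, not anything from the article; the field names and purposes are illustrative assumptions.

```python
# Minimal sketch of a "purpose registry": a product may only store fields
# that have a declared, user-facing purpose. Field names and purposes
# below are hypothetical examples, not a real product's schema.

PURPOSES = {
    "email": "account recovery and login",
    "crash_log": "diagnosing app crashes",
}

def collect(field: str, value: object, store: dict) -> None:
    """Store a value only if its field has a declared purpose."""
    if field not in PURPOSES:
        raise ValueError(f"refusing to collect '{field}': no declared purpose")
    store[field] = value

store = {}
collect("email", "user@example.com", store)  # allowed: purpose declared
try:
    collect("location", (40.0, -105.0), store)  # rejected: no purpose declared
except ValueError as err:
    print(err)
```

The same registry doubles as the transparency story: whatever you show users about what you collect and why can be generated directly from it, so the disclosure can never drift from the code.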

Here are some questions worth sitting with if you’re building a digital product right now:

  • What does your product choose not to collect, and do your users know that?
  • Can you provide a compelling user experience without this data? A comparable experience with less data?
  • Is there a principled ‘no’ your product could take that would build more trust than any new feature?
  • Whose side is your product on when an individual user’s interests conflict with those of a broader group?

The instinct can still be to collect more, track more, and optimize more. For now, that still works.

But something is shifting in what users trust. And trust, once lost, is the hardest product problem to solve.

This shift is worth paying attention to.

Because I believe the apps we’ll be using in five years won’t be the ones that keep embracing the ‘more’ mentality. We’ll use the products that deliberately choose to know less.

Brad Weber is the CEO and founder of InspiringApps.


