
Tilman Harmeling: Where Privacy, Data and Trust Finally Converge

  • May 4, 2026
  • Data Governance & Quality

For years, marketing chased scale through data. That playbook is now breaking: privacy concerns are rising and trust is eroding.

Tilman Harmeling of Usercentrics explains how privacy, data, and trust must come together. He unpacks the shift from sheer data volume to high-quality, consented signals, and shows how privacy-led marketing creates sharper insights, better AI outcomes, and more meaningful engagement, turning privacy into a genuine growth lever.


You’ve built market models and intelligence frameworks that guide global strategy at Usercentrics. How is the intersection of privacy, data, and marketing fundamentally shifting today?

For a long time, these were separate worlds. Privacy sat with legal, data with engineering, and marketing operated on top without much accountability to either. Looked at more strategically, privacy is the human element, data is the machine, and marketing is where the two meet to drive behavior. That intersection used to be driven by technology and nudging, but today it really comes down to trust.

What’s fundamentally changing now is that consumers, not regulators, are collapsing these silos. Our State of Digital Trust report, based on 10,000 users across Europe and the US, shows this clearly: 46% of users accept cookies less often than three years ago, 42% actively read consent banners before deciding, and 36% have left a website or deleted an app over privacy concerns. Regulation has matured alongside this, but it is the consumer shift that makes it irreversible. Privacy is no longer a downstream compliance task; it is the first interaction a user has with a brand and, increasingly, the factor that determines whether they stay.

Usercentrics champions Privacy-Led Marketing. How does this approach help brands cut through AI-driven complexity and create more trustworthy user experiences?

There’s a common assumption that more data leads to better marketing. In reality, consented data outperforms data collected at scale because the signal quality is fundamentally higher. When someone gives informed, intentional consent, you’re working with a real signal. These are users who trust, who are more likely to buy, and more likely to advocate. Everything else is, to some extent, guesswork. And in an AI-driven world, that guesswork creates a huge amount of noise because models are only as good as the data they’re trained on.

Privacy-led marketing flips that dynamic. Instead of feeding AI massive amounts of inferred, low-quality data, you focus on smaller, high-confidence signals. The result is better models, clearer decisions, and less wasted spend. In practice, it’s the difference between chasing someone for weeks over a product they clicked once by accident and actually responding to real intent. You end up with less data, but far more meaning, and a customer experience that feels deliberate rather than intrusive.

It’s often assumed that if something is personalized, it’s valuable. But trust doesn’t automatically follow. What does it take to intentionally design trust into GenAI-powered marketing journeys?

We often hear people say, “My phone is listening to me,” when they see an ad for something they just discussed. In reality, it’s driven by things like geofencing and predictive models, but because it’s invisible, it feels like surveillance. GenAI only amplifies this. Decisions happen instantly, autonomously, and without users understanding how or why. Trust doesn’t erode because personalization is bad; it erodes because it’s opaque.

Designing trust into that means making consent more than a one-time banner. It has to run through the entire journey, especially as AI systems connect more deeply to business data. That requires real transparency, control, and the ability for user choices to take effect immediately. People don’t just want to give preferences; they want to see them respected. In a GenAI world, transparency is what removes the “creepy” factor and replaces it with a sense of control, which ultimately leads to better experiences and stronger trust.

We’re seeing a split between “privacy fatigue” and “privacy awakening.” How should marketers interpret this shift, and why is the traditional “accept = consent” model breaking down?

These are often framed as opposing trends, but they’re really two reactions to the same problem. Privacy fatigue comes from consent experiences designed to confuse rather than inform. Dark-patterned “Accept All” buttons don’t show that people don’t care about privacy. They show the industry optimized for capture over clarity. The awakening surfaces when a breach or headline about AI training on personal data triggers concern that was always there, just suppressed by exhaustion.

The “accept equals consent” model is breaking down because a click under pressure doesn’t create a reliable signal. Without real understanding or intent, that data quickly turns into noise. Regulators globally are now codifying that view. The good news is that this is a solvable design problem, and one that directly impacts both trust and performance. The brands that treat it as such will earn consent that actually holds up.

With growing scrutiny around children’s online safety, do you think the industry has treated younger users too much like standard consumers?

Without question. A 10-year-old can be profiled, targeted, and retargeted through the same programmatic infrastructure as a 40-year-old. Children can’t evaluate privacy trade-offs. Treating their data interactions as equivalent to adult consent isn’t a regulatory gap. It’s a design failure.

The regulatory momentum reflects how serious this has become, with updated COPPA rules, COPPA 2.0 passing the US Senate unanimously, and the GDPR and AI Act raising the bar across Europe. But the real question isn’t whether companies will comply. It’s whether they take responsibility before they’re forced to. Age-appropriate design shouldn’t be a legal reaction. It should be a product principle.

You’ve positioned privacy as a trust accelerator. What separates brands that truly leverage privacy as a competitive advantage from those that treat it as a checkbox?

There’s a simple question I always come back to: who owns privacy in the organization? If the answer is “legal,” it’s a checkbox. If the CMO and the DPO are shaping campaign strategy together, that’s a company that understands something important. Privacy isn’t a constraint on marketing. It’s an input for better marketing.

The brands that turn privacy into a competitive advantage treat consent infrastructure the way they treat their CRM, as a strategic asset they continuously invest in. And they measure the impact: consent rate optimization, the relationship between transparency and lifetime value, campaign ROAS. Once you can show that privacy drives measurable outcomes, the conversation shifts from risk management to growth.

As AI, regulation, and user expectations continue to evolve, what will define successful marketing organizations over the next five years, and where will privacy-led strategies play the biggest role?

The marketing organizations that will define the next five years will be the ones that connect the dots, bringing consent, compliance, and data governance together as a unified capability rather than separate workstreams. That’s a significant opportunity, because much of the market hasn’t made that shift yet.

Organizations that build privacy into their AI workflows, treat consent as a system that evolves with technology, and use regulatory change as a catalyst rather than a constraint will develop a structural advantage over time. They’ll move more efficiently, make better use of their data, and earn deeper customer trust, while others are still trying to reconcile fragmented approaches.

Data Privacy
Data Governance
Digital Trust
Customer Trust
Privacy Led Marketing
Trust Economy
Privacy First
AI Trust
Tilman Harmeling is a data protection expert with a career focus on the business and technical complexities of privacy. He is primarily involved in data-driven projects related to consent-based marketing, like opt-in analysis and optimization, and the influence of AI on consent and preference management.

Tilman’s goals are to understand the ever-changing privacy landscape and find opportunities for innovation. He is a sought-after speaker on current privacy topics at events like PrivSec Global, OMR, DMEXCO, the BCG MarTech Series, and Leadership Beyond Borders.

Usercentrics is a global market leader in solutions for data privacy and the activation of consented data. Our technology enables customers to manage user consent for websites, apps, and CTV, helping them achieve privacy compliance. We are active in 195 countries on more than 2.3 million websites and apps, work with over 5,400 partners, and handle more than 7 billion user consents every month.

Learn more at usercentrics.com