
Inside the Room:
What CAIOs, CDOs and CMOs Are Actually Struggling With in AI Adoption

In March, we spent time with a group of CAIOs, CDOs and CMOs at an AI-focused leadership event hosted by Inspired Business Media.

What was striking wasn’t a lack of ambition. If anything, most organisations are already well into AI adoption, experimenting with tools, investing in capability, and in some cases moving quickly.

The challenge, as it came through in conversation, is less about whether to adopt AI and more about how to make it work in a way that is structured, connected and ultimately useful.

Across sectors and company sizes, the same themes kept resurfacing.

1. ‘Shadow AI’ is growing faster than it’s being managed

One of the more candid points raised was the rise of what many described as Shadow AI.

In simple terms, this is when teams start using AI tools independently, without formal oversight, coordination, or a shared framework across the organisation. It often happens with good intent. People are trying to move faster, solve problems, or improve how they work day to day.

No one is governing it. No one is connecting it.

In practice, it looks like this: sales has one set of tools in place, finance is trialling others, operations is experimenting, and marketing is often moving fastest, using AI across content, campaign optimisation and analytics.

What’s missing is coordination.

There is rarely a shared view of which tools are in use, how data is flowing between them, or what is actually delivering value. As a result, adoption appears strong on the surface, but underneath it is fragmented.

That fragmentation shows up in a few consistent ways. Data starts to move in different directions, making it harder to maintain a single, reliable view of the business. Security and compliance exposure increases as new tools are introduced without clear oversight. And organisations often find themselves paying for overlapping capabilities without a clear sense of return.

In many cases, adoption is moving faster than organisations can track or govern it. The issue isn’t that teams are moving too quickly. It’s that AI adoption is outpacing the structure around it.

2. Technical teams are moving ahead. Operational teams are falling behind

Another theme that came through clearly was how uneven progress feels across organisations.

Technical teams, particularly in data and engineering, are building at pace. There is real momentum in models, dashboards and automation.

At the same time, many operational teams are still relying on spreadsheets, email chains and manual reporting. They are waiting for data teams to produce the numbers, often working with information that is already out of date by the time it reaches them.

Several leaders described this as a ‘two-speed organisation’: one part of the business is becoming increasingly data-driven, while the other continues to operate much as it has for years. In some cases, operational teams are still working from last month’s numbers while technical teams build further ahead.

In marketing, this often shows up in a familiar way. Campaigns move quickly, channels evolve, and activity scales, but understanding performance still lags behind. The teams closest to customers and revenue are not always the ones with the most direct access to timely, reliable insight.

Over time, that gap becomes harder to ignore.

3. AI doesn’t fix bad data. It amplifies it

This was one of the clearest points of alignment in the room. AI is only as effective as the data underneath it.

In practice, many organisations are still working with data spread across multiple systems, with inconsistent definitions, duplication and gaps. When AI is layered on top of that, it doesn’t resolve those issues. It tends to surface them more clearly.

Leaders shared examples where different systems report different versions of the same number, where metrics cannot be reconciled across teams, and where outputs cannot be trusted enough to act on with confidence.

A board report says revenue is one number. Finance reports another. The CRM shows something else entirely.

For CMOs, this is often most visible in attribution and performance reporting, where different platforms and teams produce conflicting views of results. At that point, the challenge is no longer access to data, but confidence in what it is actually saying.

Until the underlying data is connected and governed, AI will struggle to deliver consistent value. 

4. AI strategy often lacks clear ownership

Ownership came up a lot.

In many SME and mid-market organisations, AI doesn’t sit neatly within one function. Technology teams are responsible for capability, operations teams are focused on execution, and finance is focused on cost and return. As a result, AI tends to sit somewhere in between.

Without clear ownership, progress can stall. Pilots are launched but not scaled. Tools are introduced without coordination. Decisions take longer than they should, because responsibility is shared but not clearly defined.

The organisations making more progress tend to approach this differently. They assign clear accountability to someone who can connect technology, operations and commercial priorities, even if that responsibility sits within an existing role.

That clarity makes a noticeable difference.

5. Businesses are over-investing in dashboards and under-investing in access

Most organisations already have dashboards, often more than they need, so the issue is not availability, but adoption.

In practice, dashboards tend to be used by a relatively small group of people who know where to look, what to ask, and how to interpret what they are seeing. Everyone else continues to rely on someone else to extract and explain the data. In many cases, dashboards are built, reviewed by a handful of people, and then rarely used again.

In marketing teams, this often means waiting on analysts to validate performance before decisions can be made, which slows down optimisation and limits responsiveness.

The shift that came through in discussion was not towards building more dashboards, but towards improving access: giving people the ability to ask a question in plain English and receive an answer they can understand and act on, without needing to navigate a BI tool or understand the underlying data structure.

In most organisations, dashboards serve a small group of specialists. The real opportunity is giving the rest of the business direct access to answers.

It’s a subtle shift, but it changes who can actually use data across the organisation.

6. Marketing is moving fastest, but struggling to connect activity to outcomes

For CMOs, AI adoption is already well underway across multiple areas of the function.

Content teams are using it to scale output, paid media is increasingly automated, and CRM teams are building more sophisticated segmentation and lifecycle programmes.

There is no shortage of activity.

What’s proving more difficult is connecting that activity to clear, measurable outcomes.

Several marketing leaders described challenges around inconsistent performance measurement across platforms, increased content volume without a clear view of impact, difficulty linking top-of-funnel activity to revenue, and the introduction of multiple tools across teams and agencies without a shared structure.

The result is a familiar tension. There is more data, more tooling and more output, but not necessarily more clarity.

And ultimately, CMOs are the ones expected to translate that into a coherent, defensible narrative at board level.

Key Takeaways

Across the discussion, a few points came through clearly.

  • Shadow AI is growing faster than organisations can track or govern it.
    Adoption is happening across teams, often without coordination or oversight.

  • Technical teams are moving ahead while operational teams are still working from last month’s numbers.
    Progress is uneven, and the gap between teams is becoming more visible.

  • AI does not fix bad data. It amplifies it.
    Without a connected, governed data foundation, AI exposes inconsistency rather than resolving it.

  • Dashboards serve a small part of your organisation. Answers open up access to everyone.
    The real barrier to adoption is access, not a lack of reporting.

  • The gap between technical and operational teams widens over time if left unaddressed.
    Without intervention, organisations drift further into a two-speed model.

Structure your AI adoption before it structures itself.
The organisations making progress are the ones putting governance and ownership in place early.

Configur connects the dots between your systems, teams, and obligations, giving you one place to see the full picture, act faster, and stay audit-ready.