AI Operating Baseline
People Are Already Using It
In most organizations, AI adoption is already underway. Someone on the finance team is using Copilot to draft reports. Marketing has a handful of people running content through ChatGPT. A few engineers are writing code with Claude. Some of it is sanctioned. A lot of it is not. Either way, it is happening.
The challenge is that individual experimentation is outrunning organizational coordination. One organization suspected a handful of people were experimenting; automated discovery in the first week found employees across four departments had signed up for AI tools independently, while the Copilot licenses the company had already purchased sat underused. That gap between what is sanctioned and what is happening is common. Use case ideas tend to arrive faster than teams can evaluate them. Leadership often has multiple views of where AI should focus and which operating area should go first. Guardrails, governance, and a shared picture of what is already in use usually lag behind the momentum to experiment. Readiness moves at two speeds: individual adoption is fast; organizational coordination takes deliberate effort.
The AI Operating Baseline is designed for exactly this stage: when individual adoption has momentum but the organization has not yet found its starting point.
What the Baseline Produces
Before committing to a larger investment, leadership typically needs four things that are rarely in place without a deliberate effort to build them.
- Current State Snapshot. What AI tools are in use across the organization, who is using them, and for what. The picture includes sanctioned tools and the shadow AI usage that most organizations suspect exists but cannot see clearly.
- Refined Objectives. A shared executive understanding of what success looks like. The Baseline surfaces conflicting priorities through facilitated conversation so leadership can resolve them before resources are committed.
- Governance Design Inputs. Which guardrails and decision rights need to be in place before the organization proceeds, and where enablement work is needed. Not a full governance framework, but the evidence-based gap analysis that tells leadership what to address next.
- Prioritized Opportunity Register. Ranked opportunities with a recommended first use case, each scored for business impact, feasibility, and data readiness. The register gives leadership the basis to commit to a first initiative with evidence behind it.
Together, these four outputs give leadership the shared context to make a first commitment with confidence.
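To make the register's ranking concrete, a simple weighted-scoring model is one way such a prioritization can work. The criteria weights, rating scale, and opportunity names below are illustrative assumptions for the sketch, not the Baseline's actual scoring method.

```python
# Illustrative sketch of scoring an opportunity register.
# Weights, the 1-5 rating scale, and the example opportunities are
# hypothetical; the Baseline's real scoring may differ.

WEIGHTS = {"business_impact": 0.40, "feasibility": 0.35, "data_readiness": 0.25}

def score(opportunity):
    """Weighted average of the opportunity's 1-5 ratings."""
    return sum(opportunity[criterion] * weight
               for criterion, weight in WEIGHTS.items())

opportunities = [
    {"name": "Finance report drafting",
     "business_impact": 4, "feasibility": 5, "data_readiness": 4},
    {"name": "Support ticket triage",
     "business_impact": 5, "feasibility": 3, "data_readiness": 2},
]

# Rank highest-scoring first; the top entry becomes the recommended
# first use case in this toy model.
ranked = sorted(opportunities, key=score, reverse=True)
for opp in ranked:
    print(f"{opp['name']}: {score(opp):.2f}")
```

The value of a model like this is less the arithmetic than the conversation it forces: leadership has to agree on the weights before the ranking means anything.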
How the Engagement Runs
The Baseline combines automated discovery tools, employee surveys, and targeted executive conversations. Discovery tools identify which AI products are in use across the organization. Where enterprise subscriptions are in place, admin-level usage analytics add visibility into adoption patterns that discovery tools alone cannot reach.
Five to six focused conversations with the right stakeholders surface objectives, governance gaps, and which operating areas are best positioned for a first initiative. The engagement closes with a facilitated prioritization session that ranks the opportunities and lands on a recommended starting point.
Two to three weeks. The pace is fast because the discovery tools automate work that would otherwise take weeks of interviews.
