Chanl
Industry & Strategy

Stop Building Dashboards. Start Shipping Signal.

Dashboards tell VPs what happened last quarter. Signal tells them which account to call today, and why. How CX is entering the post-dashboard era in 2026.

Lucas Dalamarta · Engineering Lead
April 16, 2026
10 min read
A watercolor illustration of a revenue leader turning away from a wall of dashboards to act on a single highlighted customer conversation

A VP of Customer Success I talked to last month opens her weekly dashboard every Monday at 8:30. Ticket volume down 12%. First response time trending down. CSAT holding steady. Every chart green.

By 9:00 she's closed the tab and has no idea what to do with her day.

Three of her top-ten accounts quietly cut their license count over the last two weeks. One mentioned a competitor by name on a support call last Tuesday. Another asked about a beta feature and got told "we don't support that yet." None of it is on the dashboard. The dashboard is showing averages. The churn is happening on individual rows inside those averages, and by the time the trend line bends, the customer is gone.

Support dashboards are becoming the new vanity metric. They report what happened across the aggregate. Revenue leaders need to know what to do next, per customer, this week.

Dashboards report. Leaders decide.

Most BI tooling was built for a retrospective audience: finance, ops, board packs. It answers "what happened last quarter across the portfolio." That was fine when customer experience meant survey scores reviewed monthly. It breaks in a world where the actual customer signal is embedded inside thousands of voice and chat conversations that nobody reads.

Analyst houses started pointing at this gap before it had a name. Forrester's 2026 CX predictions warn that many teams are "drifting dangerously close to the event horizon of metric obsession" and that 15% of CX teams will be eliminated by 2027, not because CX stopped mattering, but because those teams became replaceable reporting functions instead of strategic partners tied to growth. A separate CX Today analysis put it bluntly: "CX teams are stuck in the measurement trap. Real-time intelligence is the exit." [^1] [^2]

The quantitative case is uglier than that. Enterprises with more than 500 dashboards typically carry 30% to 40% redundancy across reports, and only about 25% of employees actually use the analytics tools their companies pay for. [^3] [^4] Ampcome calls it the post-dashboard era. Baytech just calls it "a graveyard of unused tabs." [^5] [^6] Whatever you want to call it, the leaders I talk to can't tell me the last time a chart on a CX dashboard caused them to pick up the phone.

The shift is the same one SaaS made a decade ago.

The SaaS rerun: reports became workflows

Early customer success was a reporting function. A CSM would open a health dashboard, stare at a red number, and hopefully remember to email the customer. Gainsight and Totango won because they moved the center of gravity from the dashboard to the playbook. Health drops below a threshold. A task lands in the CSM's queue. An email fires. An exec gets tagged. The report didn't go away, but it stopped being the unit of work.

That shift is measurable. In 2023 alone, Totango customers ran more than 110,000 playbooks through their platform. [^7] Comparisons of Gainsight and Totango today don't argue about who has the prettier chart. They argue about who has the better workflow engine, whose integration with Salesforce is cleaner, whose rules fire faster. [^8] Dashboards survived. But the thing a VP actually judges the platform on is whether the right playbook fired on the right account.

The warehouse world made the same move. Reverse ETL vendors like Hightouch, Census, Fivetran, and dbt have been pounding the same drum for years: the warehouse has to be the source of truth for operations, not just reporting. Otherwise you're asking the sales rep to leave their tool, walk to a dashboard, and come back. They won't. [^9] [^10]

Now CX is up. The conversation is the analog of the warehouse row. The playbook is the analog of the customer success workflow. What's missing is the primitive in between. That primitive is Signal.

What does a Signal actually look like?

A Signal is structured extraction per conversation, with confidence and provenance. It names what the customer said, how sure the extractor is, and which exact transcript turn it came from. That sounds obvious until you realize that almost nothing on a CX dashboard has those properties.

Here's the shape it tends to take in practice:

`signal.json`:

```json
{
  "signalId": "sig_01HV9...",
  "customerId": "acct_1204",
  "interactionId": "int_8814",
  "category": "expansion",
  "label": "multi-seat-interest",
  "confidence": 0.92,
  "severity": "high",
  "excerpt": "We'd probably want to roll this out to the APAC team next quarter.",
  "suggestedAction": "notify_csm_for_expansion_play",
  "detectedAt": "2026-04-15T14:22:17Z"
}
```

Three things are doing the real work in that object. The category lets your workflow engine route on it (expansion goes one place, risk another). The confidence lets you set thresholds so you don't wake up a CSM at 2am for a 0.41 maybe. The excerpt is the provenance: if a human wants to challenge the signal, they click through to the exact moment in the transcript. This is what separates signal from hallucination.

Categories matter, too. The ones worth extracting in CX mostly sort into four buckets:

| Category | What you're looking for | Example next action |
| --- | --- | --- |
| Intent | What the customer wants to do next | Route to the right agent or playbook |
| Risk | Churn language, frustration, escalation cues | Trigger save motion, notify CSM |
| Expansion | Seat counts, new teams, adjacent use cases | Open an expansion play |
| Features | Product mentions, missing capabilities, competitors | Feed product and positioning |

Gong and Chorus proved these categories were extractable at scale years ago. The gap has never really been detection. The gap has been what happens after detection. Does the signal sit inside a conversation intelligence UI for an AE to maybe read later, or does it flow out of that UI into the system that will actually do something with it? [^11] [^12]

[Illustration: a conversation analyst reviewing a sentiment dashboard — last 7 days: 68% positive, 24% neutral, 8% negative; top topics: Billing (342), Support (281), Onboarding (197), Upgrade (156)]

The workflow is the product

A signal that doesn't trigger an action is just a prettier piece of telemetry. This is the piece that separates the teams that are going to compound in the agent era from the teams that are going to get outrun.

Think about what "expansion detected" should actually do. A good flow looks like this: the signal lands on the customer record within minutes of the call ending. The CSM gets a notification with the customer name, the exact quote, a confidence score, and a one-click action to open the expansion play. Salesforce gets a new opportunity stub, pre-populated. The prompt for the next AI agent interaction on that account includes a note: "customer expressed expansion interest last Tuesday, reference APAC rollout."

None of that is novel. It's what Gainsight has been doing for health scores and what Gong has been doing for sales deals. The plumbing isn't exotic either: a reverse-ETL layer like Hightouch or Census to sync signals into Salesforce, an orchestrator like Inngest or Temporal to run the play, a Slack webhook to notify the human. The novelty is that the raw material is now the conversation itself, not a column in a product analytics table. And because AI agents are handling more of those conversations, the same system that produces the signal can consume it. The next call with that customer already knows the context. That's the closed loop. It's also why tools like Common Room are actively pitching "automate action on conversational data with AI agents" as a category, not a feature. [^13]
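The flow can be sketched end to end. Everything below is a stub under stated assumptions: `notify_csm`, `create_opportunity_stub`, and `agent_context_note` are hypothetical stand-ins for the Slack webhook, the reverse-ETL sync into Salesforce, and the prompt-context step — none of them are a real vendor API:

```python
# Sketch of an "expansion detected" play. All function names are hypothetical;
# in production each step would go through your webhook / reverse-ETL /
# orchestration layer rather than returning in-memory values.
def notify_csm(signal: dict) -> str:
    # Production: POST to a Slack webhook. Here: just format the message.
    return (f"Expansion signal on {signal['customerId']} "
            f"(confidence {signal['confidence']:.2f}): \"{signal['excerpt']}\"")

def create_opportunity_stub(signal: dict) -> dict:
    # Production: a reverse-ETL sync or CRM API call, pre-populating the opp.
    return {
        "accountId": signal["customerId"],
        "source": signal["signalId"],
        "stage": "expansion_detected",
    }

def agent_context_note(signal: dict) -> str:
    # Injected into the prompt for the next AI agent interaction on the account.
    return f"Customer expressed expansion interest: {signal['excerpt']}"

def run_expansion_play(signal: dict) -> dict:
    # One signal fans out to three systems: the human, the CRM, the next agent.
    return {
        "slack": notify_csm(signal),
        "crm": create_opportunity_stub(signal),
        "agent_note": agent_context_note(signal),
    }

play = run_expansion_play({
    "signalId": "sig_01HV9...",
    "customerId": "acct_1204",
    "confidence": 0.92,
    "excerpt": "We'd probably want to roll this out to the APAC team next quarter.",
})
```

The fan-out shape is the point: the same structured object feeds the notification, the CRM record, and the next agent's context, so no step depends on a human having opened a chart.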

Gartner's agent predictions basically require this architecture. If 40% of enterprise applications are going to have task-specific AI agents by end of 2026, and if agentic AI is projected to autonomously resolve 80% of common service issues by 2029, those agents need to communicate with each other through structured signals, not through dashboards that a human happens to glance at. [^14] [^15] Gartner itself expects one-third of agentic AI implementations to combine multiple agent types by 2027. "Combine" means share state. Shared state means structured extractions. Structured extractions are signals. [^16]

Dashboards vs signals: what's the actual swap?

A dashboard answers "what happened across all accounts last month." A signal answers "what this account needs today." One is optimized for a human to open and read. The other is optimized to trigger the system that will act on it. The swap looks like this on a leadership team's desk:

| Dashboards | Signals |
| --- | --- |
| Aggregate, lagging: "what happened across all accounts last month" | Per-customer, per-conversation: "what this account needs today" |
| Rendered in a UI a human has to open | Routed to the system that will act on it |
| Optimized for reading | Optimized for triggering |
| Metric: charts rendered, logins, report opens | Metric: decisions made, plays fired, revenue saved |
| Owner: analytics team | Owner: the team that owns the action |
| Churn arrives as a trend line three months after it started | Churn arrives as a specific quote from a specific call on a specific day |

Neither column is wrong. Dashboards still matter when the CFO wants a slide. But you'd be surprised how few operational decisions actually get made from them, and how many get made from a Slack message that says "the VP of Operations at CustomerCo just asked if we have an SSO roadmap."

What are the trade-offs of going signal-first?

None of this is free, and the failure modes are real enough that Gartner also predicts more than 40% of agentic AI projects will be canceled by the end of 2027, mostly because teams underestimate what it takes to productionize this stack. [^17]

Three specific trade-offs come up every time.

Extraction quality isn't perfect. Language models mislabel intent. They over-fit to phrases the last fine-tune emphasized. They miss sarcasm, regional idioms, and the conversational context that a human catches instantly. You have to build an evaluation layer alongside the extraction layer: sampled reviews, scorecards, golden-set regression. Skip that and signal quality silently decays while your workflows start firing on noise. This is the work covered in how to turn analytics into action on conversation data: the improvement loop matters as much as the detection itself.
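A golden-set regression check is the cheapest piece of that evaluation layer to stand up. Here's a minimal sketch: `extract_label` is a keyword placeholder standing in for the real extraction model, and the 0.9 floor is an arbitrary choice, not a benchmark:

```python
# Golden-set regression: hand-labeled (transcript, expected_label) pairs,
# re-scored on every model or prompt change. The extractor below is a toy
# keyword rule standing in for the real model.
def extract_label(transcript: str) -> str:
    text = transcript.lower()
    if "cancel" in text:
        return "risk"
    if "more seats" in text:
        return "expansion"
    return "intent"

GOLDEN_SET = [
    ("We might need more seats for the new team.", "expansion"),
    ("If this keeps breaking we'll cancel.", "risk"),
    ("How do I export my data?", "intent"),
]

def golden_set_accuracy(extract) -> float:
    hits = sum(1 for text, label in GOLDEN_SET if extract(text) == label)
    return hits / len(GOLDEN_SET)

# Gate the deploy on a quality floor you choose; 0.9 here is arbitrary.
assert golden_set_accuracy(extract_label) >= 0.9
```

The discipline matters more than the harness: a few hundred labeled examples re-run on every change is enough to catch the silent decay described above.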

Schema drift is real. The categories and labels that matter for your business in Q1 are not the ones that matter in Q4. A new product launch introduces new risks. A pricing change introduces new expansion triggers. Signals are opinionated data, and opinions need versioning. The teams that do this well treat their signal schema the way product teams treat API contracts: backwards compatible where possible, explicitly versioned where not.
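Versioning in practice can be as simple as a required-field set per schema version, with additive changes keeping the version and breaking changes bumping it. The `schemaVersion` field and the validation rule here are an illustration of the contract idea, not a standard:

```python
# Versioned signal schema as an API-style contract. Field names follow the
# example signal earlier in the piece; "schemaVersion" is an assumption.
SCHEMA_VERSIONS = {
    1: {"signalId", "customerId", "category", "confidence", "excerpt"},
    # v2 added "severity" as required — a breaking change, so the version bumped.
    2: {"signalId", "customerId", "category", "confidence", "excerpt", "severity"},
}

def validate(signal: dict) -> bool:
    version = signal.get("schemaVersion", 1)  # unversioned payloads read as v1
    required = SCHEMA_VERSIONS[version]
    # Extra fields are allowed: additions stay backwards compatible.
    return required <= signal.keys()
```

Consumers pin the versions they understand, so a Q4 label taxonomy can ship without silently breaking the Q1 playbooks still running.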

Review loops are non-negotiable. Someone on the team has to own the weekly pass of "here are fifty signals we fired, here's which ones were right, here's which ones triggered the wrong play." Skip it and you drift. Do it and signals get better month over month. This is the same discipline an observability team uses on an application, which is why the teams doing this well tend to borrow the playbook from production AI agent observability rather than treating it as a CX problem.
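The weekly pass itself is small enough to sketch: sample the fired signals, collect human verdicts, track precision week over week. Function names and the in-memory shapes are illustrative; real storage would live wherever your signals do:

```python
# Weekly review pass: sample ~50 fired signals, record reviewer verdicts,
# and compute precision for the week. All names here are illustrative.
import random

def sample_for_review(fired_signals: list[dict], n: int = 50, seed: int = 0) -> list[dict]:
    """Deterministic sample of this week's fired signals for human review."""
    rng = random.Random(seed)
    return rng.sample(fired_signals, min(n, len(fired_signals)))

def weekly_precision(reviews: list[tuple[dict, bool]]) -> float:
    """reviews: (signal, reviewer_marked_correct) pairs from the pass."""
    if not reviews:
        return 0.0
    return sum(ok for _, ok in reviews) / len(reviews)
```

Plotting `weekly_precision` over time is the observability-style move: signal quality becomes a monitored metric instead of an assumption.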

All of this is tractable. None of it happens by accident, and anyone telling you it's plug-and-play hasn't shipped one in anger.

The measure of the Signal era

The easiest test of whether your CX org has made the shift is the meeting test. Walk into a CS leadership review. What's on screen?

If it's a dashboard of aggregates and the conversation is about why a number moved, you're in the dashboard era. If it's a queue of this week's signals, who owns the response, and what happened to last week's, you're in the Signal era. Most orgs today are still in the first one. The ones that are going to have CX departments in three years are the ones making the switch now.

Chanl's bet is that Signal becomes the primitive every CX platform organizes around, the same way events became the primitive for product analytics a decade ago. We rebuilt our own product framing around this: Signal extracts the structured data from every interaction, Scorecards validate that the extraction is trustworthy, and Monitoring watches the signal stream itself instead of rendering more charts. The old Analytics view didn't go away. It's still there for the board pack. But it's no longer the center of the product, or the center of how we think about the work. More detail on the data side of this shift lives in Your Conversations Are Already CRM Data.

Go back to the VP from the top of this piece. Next Monday at 8:30, she doesn't open a dashboard. She opens a queue. Fourteen expansion signals from last week, three of them high-confidence, each tagged to a specific account and a specific quote. Two churn risks she needs to call personally before lunch. One competitor mention her AE lead should run down. By 9:00 she's made five decisions and kicked off two plays. That's the shift. Dashboards will still be there for the board pack, and that's fine. But the unit of work has moved, and the stacks that still treat charts as output instead of input are the ones getting quietly replaced.

Turn conversations into signal, not slides.

See how Chanl extracts structured signal from every AI agent interaction (intent, risk, expansion, features) and routes it into the system that owns the next action.

Explore Signal

Engineering Lead

Building the platform for AI agents at Chanl — tools, testing, and observability for customer experience.

The Signal Briefing

One email a week. How leading CS, revenue, and AI teams are turning conversations into decisions. Benchmarks, playbooks, and what works in production.

500+ CS and revenue leaders subscribed
