Our Approach

One clear framework.
41 providers, 1,220 sites.
Zero proprietary dependencies.

Executive Summary
A Toyota-Proprietary Digital Analytics Approach for TDDS

Designed to solve the TDDS measurement problem at scale: 1,220 dealer sites, 41 certified providers, and zero proprietary dependencies, delivering a fully operational, TMNA-owned measurement framework by November 30, 2026.

1,220
Dealer Sites
41
Certified Providers
0
Proprietary Dependencies
Nov 30
Completion Deadline
What's in This Proposal

This proposal covers three sequential phases from discovery through full network deployment, supported by governance, team, and case study materials. Use the left-hand navigation to explore any section in depth.

Our Three-Phase Delivery Plan
1
Phase 1
Integrity & Foundation
June – August 2026
Stakeholder discovery, audit, KPI framework, data layer spec, and Adobe solution design run in parallel, not sequentially. Closes with the TMNA Decision Gate in August.
2
Phase 2
Provider Enablement
September – October 2026
Tool providers begin training in September while category specs are still being finalized. QA starts at the first go-live, not at the end of the project. Categories onboard as their specs are ready.
3
Phase 3
Governance & Handoff
November 2026
Full network coverage, production validation, and remediation SLA enforcement. TMNA-owned dashboard live and knowledge transfer complete; program complete November 30, 2026.
The Merkle Difference
Adobe Scale
1,300+ Adobe credentials globally. Global Platinum Partner. One of two GenStudio Innovation Award winners worldwide.
Automotive Depth
Direct OEM tier-3 experience: 2,400 dealer sites, 7 certified providers, the closest direct analog to TDDS scope anywhere in the industry.
Enforcement, Not Oversight
4-stage QA, DQI scoring, and formal SLA cure periods turn TMNA's contractual authority over 41 providers into operational reality.
TMNA Owns Everything
Every schema, spec, and scoring framework is a TMNA-owned asset on delivery. No licensing. No access agreements. No Merkle dependency.
CDP-Ready Architecture
Web SDK migration path and identity-ready data structures built in from day one. No re-tagging when TMNA activates its CDP.
Concurrent Delivery
Audit, KPI design, and solution architecture run in parallel, compressing the path to provider enablement without sacrificing spec quality.
The Problem TMNA Is Solving
Fragmented Data Collection

Each of 41 providers implements analytics independently. TMNA cannot aggregate or compare dealer data across the network. KPIs are defined differently by every provider.

No TMNA Data Ownership

The current architecture lacks a mechanism for TMNA to retain control over data structures, tagging frameworks, and schemas. No path to an OEM-level CDP without this foundation.

Scale Governance Gap

TMNA holds contractual authority over 41 providers but no structured enforcement mechanism exists; no implementation tracking, no escalation playbook, no graduated remediation.

Zero Standardized Data Layers Today

41 providers, 5 tool categories, 1,220 sites, each with its own interpretation of what data to fire, when, and how. The result is data that cannot be trusted at the network level.

Outcomes We Will Deliver for TMNA
  • Standardized measurement framework across all 1,220 dealer sites by November 2026
  • Unified dealer performance visibility from a single TMNA-owned data source
  • TMNA-owned data architecture; no proprietary dependencies, no vendor lock-in
  • Dealer-level scorecards and network rollups for provider accountability
  • Future-ready CDP foundation with identity-ready event and attribute design
  • Full TMNA team independence from Merkle by November 30, 2026
What TMNA Gets with Merkle
  • 1,300+ Adobe-trained specialists globally: analysts, architects, and strategists with direct automotive experience
  • Global Adobe Platinum Partner with specializations in Analytics, Real-Time CDP, Target, and CJA
  • Direct tier-3 dealer ecosystem experience: 2,400 dealer sites, 7 certified providers, OEM governance models
  • Automotive OEM bench: VW, Ford, Hyundai, Bridgestone, Nissan/Infiniti; not automotive-adjacent, automotive-native
  • Full IP ownership on delivery: every spec, schema, and scoring framework is a TMNA asset from day one
Concurrent Delivery Tracks: Jun through Nov 2026
Jun '26
Kickoff Jun 29
Jul '26
Aug '26
Sep '26
Oct '26
Nov '26
Phase 1: Foundation & Framework (runs concurrently, not sequentially)
Stakeholder Onboarding
Kickoff + Discovery
KPI Framework & Design
Data Collection Audit (41)
Data Layer Spec & Tagging
Adobe Solution Design
Phase 2: Provider Enablement (by category, not all-at-once)
Provider Spec Publishing
Website Providers (7)
Spec + Train + Impl
Digital Retail (10)
Chat (8) + Trade-In (7)
Service Schedulers (9)
QA: Size A/B First
DQI Scoring A/B
Ensighten / Adobe Pipeline
Phase 3: Full Network Coverage & Handoff
Size C/D/E Rollout
Full Coverage
Production Validation All 41
Remediation SLA
TMNA Training & Handoff
Handoff Complete
TMNA Dashboard Live
Dashboard Live
Knowledge Transfer & Docs
Program Complete ◆
Phase 1: Foundation
Phase 2: Enablement
Phase 3: Coverage
QA & Governance
Documentation & Handoff
TMNA Decision Gate
Phase 1 · Integrity & Foundation

Measurement Framework

A canonical definition for every KPI: exact trigger, required parameters, and allowed values, so data means the same thing across all 1,220 dealer sites regardless of which of the 41 providers built them.

Four KPI Domains
Website Behavior

Sessions, unique visitors, new vs. returning, SRP-to-VDP progression rate, page depth, bounce rate by entry page and traffic source. The baseline layer every provider must instrument identically.

Digital Retail Engagement

DR tool launch rate (% of VDP visitors), funnel progression by step, payment estimator completion, financing step engagement rate across all 10 certified digital retail tool providers.

Tool Funnel Performance

Trade-in launch/completion and valuation completion rate across 7 providers. Service appointment start/completion across 9 service scheduler providers. Abandonment tracked at each funnel step.

Tier 1 → Tier 3 Journey

Traffic flow into, through, and out of dealer sites, enabling full-funnel optimization and messaging interception across Toyota's digital ecosystem at the OEM level.

Sample KPI Canonical Definitions
Digital Retail Launch Rate
Digital Retail Engagement
Trigger: DOMContentLoaded on DR tool iframe render within VDP
Required: provider_id · tool_type · vehicle_vin · dealer_code
AA Variable: eVar42 (tool_type) + event23 (dr_launch)
Levels: dealer | provider | size_cohort | network
VDP-to-DR Funnel Progression
Digital Retail Engagement
Trigger: postMessage event at each DR tool step transition
Required: step_name · step_number · tool_type · dealer_code
AA Variable: eVar43 (step_name) + event24 (step_complete)
Levels: dealer | provider | network
Service Appointment Completion Rate
Tool Funnel Performance
Trigger: Confirmation page render or postMessage confirmation event
Required: provider_id · appointment_type · dealer_code · step
AA Variable: eVar51 (appt_type) + event31 (appt_complete)
Levels: dealer | provider | size_cohort | network
SRP-to-VDP Progression Rate
Website Behavior
Trigger: Pageview where page_type = 'VDP' following page_type = 'SRP'
Required: page_type · dealer_code · traffic_source · session_id
AA Variable: eVar12 (page_type) + pathing report
Levels: dealer | size_cohort | network | traffic_source
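To make the canonical-definition idea concrete, here is a minimal sketch of how one KPI, Digital Retail Launch Rate, could be computed from hits that follow the definitions above. The hit shape and session logic are illustrative assumptions; in production the values live in Adobe Analytics via eVar42/event23.

```javascript
// Sketch: Digital Retail Launch Rate = sessions that launch a DR tool / VDP sessions.
// Hit shape is a simplified assumption, not the delivered spec.
function drLaunchRate(hits) {
  const vdpSessions = new Set();
  const drSessions = new Set();
  for (const h of hits) {
    if (h.page_type === 'VDP') vdpSessions.add(h.session_id);
    if (h.event === 'dr_launch') drSessions.add(h.session_id);
  }
  return vdpSessions.size ? drSessions.size / vdpSessions.size : 0;
}
```

Because every provider fires the same trigger with the same required parameters, the same calculation yields comparable numbers at the dealer, provider, size-cohort, and network levels.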
What you get
  • GVM-aligned KPI definitions in TMNA's internal standard format, ready for report suite configuration on day one
  • Four decision-driving domains: not maximum data volume, but the focused signal set that drives dealer and OEM decisions
  • Every KPI reportable at four levels: dealer, provider, size cohort, and network rollup, enabling apples-to-apples comparison for the first time
  • CDP-extensible design: event and attribute structures built for identity resolution without re-tagging

Why this matters: Prior Tier 3 analytics efforts failed because specs defined what to collect but not precisely when, resulting in incomparable data across providers. This framework closes that gap before a single provider begins implementation.

Delivery Plan

Roadmap & Gantt

Discovery, KPI design, audit, and spec work run in parallel. Provider categories onboard as their specs are ready. QA and documentation run continuously across all phases from the moment the first provider goes live.

Why Categories Don't All Start at Once
Website Providers Go First

The 7 website providers are the foundational layer: all 1,220 dealer sites run on one of them. Their data layer establishes the base schema all embedded tool events write into. They start in September, before all tool provider specs are finalized.

Tool Providers Ramp as Ready

Digital retail, chat, trade-in, and service scheduler providers begin training in September but implement on a rolling basis; their iFrame and postMessage architectures require provider-specific prep time that runs alongside website rollout.

QA Starts at First Go-Live, Not at End

As soon as the first website provider goes live, QA begins. Size A and B dealers are validated first, the highest-volume sites where data quality gaps most affect TMNA's enterprise reporting and media attribution decisions.

Documentation Runs the Full Timeline

Knowledge transfer starts in July, not November. Every spec, playbook, and GVM-aligned document is written during the phase it covers so TMNA teams can review, question, and own it before the engagement closes.

What you get from concurrent delivery
  • Faster time to first data: website providers go live in October, not November, because spec and training overlap with audit
  • Risk reduction through staging: A/B dealer QA catches patterns before the full C/D/E rollout in November
  • No bottlenecks from sequential handoffs: audit informs spec while running; spec informs training while finalizing
  • TMNA teams build capability in real time: documentation and training run throughout, not as a bolt-on at the end
Phase 1 · Integrity & Foundation

Data Collection Audit

A provider-by-provider inventory of current-state data collection across all 41 certified providers, establishing the factual baseline that makes the specification achievable, not aspirational.

The 41-Provider Ecosystem
TDDS Certified Provider Categories, All Audited Before Spec Is Written
Website
7
Foundational layer. All 1,220 sites run one of these. All 7 currently push to Adobe Analytics. Implement first.
Digital Retail
10
iFrame-embedded tools. Event passthrough architecture varies; postMessage or direct write assessed per provider.
Chat Tools
8
Tracking capability varies significantly. postMessage bridge vs. direct dataLayer write documented per provider.
Trade-In
7
Widget event inventory: start, step, valuation received, completion, abandonment. High variability in current funnel coverage.
Service Scheduler
9
Appointment funnel critical for TMNA reporting. Highest variability in current capability across the 5 categories.
Total Certified Providers, All Assessed Before a Single Spec Line Is Written
41 Providers · 1,220 Sites
What the Audit Produces
  • Website Providers (7): All 7 currently push to Adobe Analytics. Audit confirms existing data layer maturity, variable mapping, and Ensighten deployment model per provider, establishing the baseline each must standardize to.
  • Chat Tools (8): Tracking capability varies significantly. Audit assesses whether each provider supports postMessage or direct dataLayer write, documenting limitations before spec publication so the spec reflects reality.
  • Digital Retail Tools (10): iFrame architecture review per provider. Documents event passthrough feasibility and required schema by integration type, preventing spec requirements that providers physically cannot implement.
  • Trade-In (7) & Service Scheduler (9): Widget event inventory identifying which funnel steps are currently tracked, which are missing, and which require custom instrumentation, prioritizing highest-gap providers for early engagement.
  • Platform constraints: CMS, TMS, and iFrame limitations documented per provider so the spec accounts for real-world constraints before providers receive it. No aspirational requirements that fail in production.
What you get
  • A provider-by-provider current-state inventory across all 41 certified providers, the only audit of its kind in the TDDS program
  • Platform limitation documentation that prevents spec failures during implementation
  • A data quality gap map showing where inconsistencies originate today, provider-wide vs. dealer-cohort-specific
  • Prioritized remediation targets: highest-volume, highest-impact gaps addressed first

TMNA confirmed: Inconsistency in tracking tools exists due to differences in tool platforms across the 41 certified providers. The audit creates the factual baseline that makes the spec achievable rather than aspirational.

Phase 1 · Integrity & Foundation

Solution Design

A standardized, TMNA-owned data layer architecture applicable across all dealer websites and every embedded TDDS-certified third-party tool, the single schema all 41 providers implement consistently.

Data Layer Attribute Groups
Page
page_type (REQUIRED)
enum: home | SRP | VDP | service | contact | DR, fires on every pageview, drives segmentation
page_name
Descriptive title combining model and dealer, human-readable AA dimension
site_section
Top-level content category: inventory | service | about | specials
Dealer
dealer_code (REQUIRED)
TMNA 5-digit code, required on all hits. Enables network rollup and DQI attribution
region
TMNA regional designation, supports regional benchmarking in AA workspace
size_cohort (REQUIRED)
A–E stratification, drives DQI benchmarking and QA prioritization logic
Vehicle
vin (REQUIRED)
17-character VIN on VDP and DR tool events, core vehicle identity attribute
model / model_year
Toyota model name and year, powers model-level performance reporting
condition
enum: new | cpo | used, required for accurate inventory funnel segmentation
Provider / User
provider_id (REQUIRED)
TDDS-assigned slug, required on all hits from certified tool providers
provider_category (REQUIRED)
enum: website | dr | chat | tradein | scheduler, maps events to provider category
session_id / traffic_source
Anonymous session token + referring domain; no PII. Enables Tier 1→3 journey attribution
Design Principles Aligned to RFP Requirements
  • Simplicity over volume: Every field in the schema earns its place against a specific KPI requirement. No speculative attributes that add complexity without adding decision value.
  • Vendor-agnostic by design: The dataLayer object is plain JavaScript. It does not depend on Ensighten, Adobe, or any TMS to exist; any provider can write to it regardless of platform.
  • TMNA ownership enforced in the schema: The dealer_code and provider_id fields are required on every hit, ensuring TMNA can always attribute data to a specific dealer and provider without relying on vendor-side logic.
  • CDP-extensible: The user.session_id field is architected to accept a hashed identity value when TMNA activates its CDP; no re-instrumentation required.
  • iFrame event passthrough: Embedded tools communicate back to the parent page via postMessage; the schema defines the required payload structure for each tool category so providers know exactly what to fire.
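The principles above can be sketched in a few lines of plain JavaScript. Field names follow the attribute groups; the helper function, object shape, and sample values are illustrative assumptions, not the delivered spec.

```javascript
// Required-on-every-hit fields per the schema's REQUIRED flags (illustrative subset).
const REQUIRED = ['page_type', 'dealer_code', 'size_cohort', 'provider_id', 'provider_category'];

// Builds a pageview entry, enforcing TMNA attribution fields before the push.
function buildPageview(fields) {
  const missing = REQUIRED.filter((k) => !(k in fields));
  if (missing.length) throw new Error('Missing required fields: ' + missing.join(', '));
  return { event: 'pageview', ...fields };
}

// A plain array: no Ensighten, Adobe, or TMS dependency; any provider can push to it.
const dataLayer = [];
dataLayer.push(buildPageview({
  page_type: 'VDP',
  dealer_code: '04123',          // hypothetical TMNA 5-digit code
  size_cohort: 'B',
  provider_id: 'acme-web',       // hypothetical TDDS-assigned slug
  provider_category: 'website',
  vin: 'JTDBE32K000000000',      // hypothetical 17-character VIN, required on VDP
}));
```

Because dealer_code and provider_id are enforced at push time, every hit remains attributable to a dealer and a provider without vendor-side logic.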
What you get
  • A single canonical dataLayer schema applicable across all 5 provider categories, one spec, 41 implementations
  • Full TMNA ownership of the schema on delivery; no vendor licensing, no proprietary structure
  • Ensighten-native mapping: existing tag management infrastructure, no new tooling required
  • iFrame passthrough architecture documented per tool category so providers know exactly how to fire events back to the parent dataLayer
Phase 1 · Integrity & Foundation

Implementation Recommendations

Provider-specific implementation guidance derived from the TMNA-approved spec, clear enough that any certified provider can implement consistently without bespoke Merkle involvement for every site.

What the recommendations include
  • Per-category implementation guides: Separate guidance for website providers, chat tools, digital retail tools, trade-in tools, and service schedulers accounting for each category's architectural constraints.
  • Acceptance criteria: Explicit pass/fail definitions for every required event, so providers know exactly what QA validation will test.
  • Platform-specific notes: Where a provider's platform creates a constraint, the guide documents the approved workaround or alternative implementation path.
  • TMNA approval gates: No provider receives implementation guidance until the underlying spec has received explicit TMNA sign-off.
Spec Publishing

Formal spec publication to all 41 providers as a single coordinated release, so every provider starts from the same TMNA-approved document.

TMNA-Owned IP

All deliverables are TMNA-owned upon delivery. No ongoing Merkle access or license required to operate or extend them.

What you get
  • Five category-specific implementation guides covering all 41 certified providers
  • Explicit acceptance criteria for every required event: no ambiguity at QA time
  • A TMNA-approved spec serving as the permanent certification compliance standard going forward
  • Documentation structured for the certification update, enabling TDDS to embed requirements into provider governance materials permanently
Phase 2 · Provider Enablement

Data Literacy & Consulting

Structured working sessions with all 41 certified providers, translating the TMNA-approved spec into shared understanding before any implementation begins.

Working session structure
  • Provider-specific sessions: Each certified provider receives a dedicated spec review covering requirements, technical questions, and platform-specific constraints, not a generic group call.
  • Environment inventory: CMS, TMS, and iFrame architecture confirmed per provider so platform limitations are documented before implementation begins.
  • Acceptance criteria walkthrough: Providers understand exactly what QA will test, eliminating the ambiguity that produces re-work and implementation delays.
  • Direct engagement model: Merkle operates as an extension of TMNA, engaging providers directly for day-to-day execution, with TMNA providing governance and escalation support.
Existing Forums Leveraged

TDDS maintains recurring provider meetings, dealer communications, and field staff reviews. Merkle works within these structures, not around them, to minimize disruption.

No Provider Contracts

Merkle coordinates implementation as an agent of TMNA. TDDS holds all contractual authority; Merkle handles execution coordination.

What you get
  • 41 providers briefed and aligned on the TMNA-approved spec before implementation begins
  • Platform constraints documented per provider: no surprises during QA
  • A single TMNA escalation path when provider issues require contractual intervention
  • Merkle as an extension of TMNA, not an additional layer between TDDS and its providers
Phase 2 · Provider Enablement

Provider Training

Hands-on training built around the TDDS spec, giving each provider's technical team the context and confidence to implement correctly the first time.

Training approach
  • Role-specific content: Training structured for the technical implementers at each provider, with specific guidance on the TMNA data layer and the acceptance criteria they need to pass.
  • Real TDDS use cases: Training uses actual TMNA examples, not sample data, so providers understand the business context behind each event they're implementing.
  • Category cohorts: Providers grouped by category for training, so shared constraints get shared guidance and provider time is used efficiently.
  • Office-hour support: Structured Q&A availability during implementation windows so providers can resolve technical questions without blocking progress.
TMNA Team Training

Parallel training track for TMNA TDDS, Analytics, and IT teams, building internal capability to QA, monitor, and maintain the framework independently post-engagement.

Playbooks Delivered

Written implementation playbooks for each provider category: persistent reference materials that survive staff turnover and support future certification renewal cycles.

What you get
  • Per-category training for all 41 provider technical teams
  • TMNA internal capability built in parallel: your team owns QA monitoring from day one post-handoff
  • Written playbooks that outlast the engagement and support future certification renewals
  • Reduced re-work: providers who understand the spec implement it correctly the first time
Phase 2 · Provider Enablement

Implementation Guidance

Active technical support during the implementation window: resolving blockers, enforcing spec consistency, and tracking progress across all five provider categories simultaneously.

7
Website Providers
18
Digital Retail + Chat
16
Trade-In + Schedulers
Per-category implementation support
  • Website Providers (7): Phased first as the foundational layer, before embedded tools. All 7 currently push to Adobe Analytics; implementation standardizes trigger conditions, naming conventions, and required attributes.
  • Digital Retail (10): iFrame event passthrough with standardized action schema. Payment estimator and full DR flows covered.
  • Chat Tools (8): Post-message bridge vs. direct dataLayer write assessed per provider. Tracking capability varies; guidance reflects each provider's actual architecture.
  • Trade-In (7): Widget events: start, step, valuation received, completion. Abandonment tracking required at each funnel step.
  • Service Scheduler (9): Appointment funnel: start, step, confirmation, abandonment. Nine providers with high variability in current capability.
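A hedged sketch of the iFrame passthrough pattern the tool-provider guidance describes: the embedded tool posts its payload, and the parent page validates the origin and required keys before writing to the shared dataLayer. Origins, payload keys, and the helper name are assumptions for illustration.

```javascript
// Sketch of a parent-page postMessage bridge for embedded tools.
function makeBridge(dataLayer, allowedOrigins) {
  return function onMessage(event) {
    if (!allowedOrigins.includes(event.origin)) return;       // ignore unknown frames
    const { event_name, provider_id, dealer_code, step_name } = event.data || {};
    if (!event_name || !provider_id || !dealer_code) return;  // required payload keys
    dataLayer.push({ event: event_name, provider_id, dealer_code, step_name });
  };
}

// In a real page: window.addEventListener('message', makeBridge(window.dataLayer, origins));
```

Defining the payload contract once per tool category is what lets 34 tool providers with different platforms fire identical events into the same dataLayer.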
What you get
  • Active implementation tracking across all 41 providers with gap identification and escalation
  • Consistent spec adherence: no provider-specific variations that compromise standardization
  • Progress visibility for TMNA at every stage, not a black box between spec delivery and QA
  • Escalation support: Merkle documents and escalates; TMNA enforces through TDDS certification authority
Phase 2 · Provider Enablement

QA & Certification Readiness

A four-stage QA process ensuring every provider implementation is accurate, complete, and TMNA-spec-compliant before the framework goes live: zero tolerance for drift.

Four-Stage QA Process
1
Pre-Implementation
  • Spec review session with each provider
  • Environment inventory (CMS, TMS, iFrame)
  • Platform limitations documented
  • Acceptance criteria confirmed before code is written
2
Implementation Review
  • Tag audit via browser devtools + proxy
  • Event firing validation against spec
  • Data layer attribute completeness check
  • Adobe Analytics variable mapping confirmed
3
Production Validation
  • Representative live site audits per provider
  • DQI score assigned (5 dimensions)
  • Dashboard data reconciliation
  • Remediation SLA clock starts at failure
4
Ongoing Monitoring
  • Monthly DQI review per provider category
  • Alert framework for event volume drop-off
  • Certification renewal QA gate
  • TMNA team trained to run independently
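As a sketch of what a stage-2 check might look like in practice, the following compares the events observed on a test page against the spec's expected set and returns an unambiguous pass/fail. Event names and shapes are illustrative assumptions, not the delivered QA tooling.

```javascript
// Sketch: event-firing validation against explicit acceptance criteria.
function validateFiring(observed, expectedEvents) {
  const seen = new Set(observed.map((e) => e.event));
  const missing = expectedEvents.filter((name) => !seen.has(name));
  return { pass: missing.length === 0, missing };  // pass/fail, never a judgment call
}
```

The same principle applies to attribute completeness: the acceptance criteria enumerate required keys, and the check either passes or names exactly what failed.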
Enforcement Principle: TMNA holds contractual authority over all 41 providers. Merkle defines the escalation process: missed requirements trigger a formal cure period. Unresolved gaps escalate to TDDS program management for certification review. Merkle documents; TMNA enforces.
What you get
  • A four-stage QA framework that catches implementation gaps before they reach production data
  • DataTrue-integrated monitoring configured in TMNA's existing platform, not a new tool
  • Formal acceptance criteria per provider category; pass/fail is never ambiguous
  • TMNA owns QA monitoring independently from day one of post-engagement operations
Phase 3 · Governance & Handoff

Data Quality Index

A size-stratified scoring framework giving TMNA comparable, defensible data quality benchmarks across all 41 providers, and a factual basis for holding each one accountable at certification renewal.

Dealer Size Stratification: Why It Matters
Size Cohort | Monthly Sessions (Est.) | Approx. Dealer Count | QA Priority
Size A: Enterprise | 50,000+ | ~80 | First
Size B: Large | 20,000–50,000 | ~190 | First
Size C: Mid-Large | 8,000–20,000 | ~280 | Second
Size D: Mid-Small | 3,000–8,000 | ~420 | Third
Size E: Small | <3,000 | ~250 | Third
Session volumes are illustrative. Actual cohort thresholds defined collaboratively with TMNA during Phase 1 based on TDDS program data.
DQI Scorecard: Illustrative Provider View
Provider | Size A | Size B | Size C | Size D | Size E
Provider 1 | 92 | 88 | 75 | 65 | 50
Provider 2 | 80 | 85 | 82 | 78 | 70
Provider 3 | 88 | 78 | 65 | 55 | 40
Provider 4 | 72 | 80 | 83 | 81 | 74
Provider 5 | 85 | 82 | 77 | 68 | 58
Provider 6 | 60 | 72 | 78 | 80 | 76
Provider 7 | 78 | 74 | 70 | 62 | 48
≥ 80 High
65–79 Medium
< 65 Needs Remediation
Scores illustrative. Actual values generated from Ensighten/Adobe Analytics QA audits post-implementation.
Why Size-Stratified Scoring Is the Right Approach
  • Eliminates composition bias: A provider serving primarily Size A dealers will appear stronger on raw KPIs than one serving Size E dealers, even if implementation quality is identical. Raw scores reward portfolio composition, not execution quality.
  • Exposes implementation gaps: Size-stratified views reveal whether data quality inconsistencies are provider-wide or concentrated in specific dealer cohorts, a critical distinction for targeted remediation.
  • Sets fair benchmarks: TMNA holds each provider accountable to the performance norms of the segment they actually serve, not a network-wide average that disadvantages providers focused on smaller dealers.
  • Prioritizes QA resources: Size A and B dealers receive QA attention first, the 270 highest-volume sites where data quality gaps most affect TMNA's enterprise reporting and media attribution decisions.
What you get
  • A DQI scorecard across all 41 providers segmented by dealer size cohort (A through E)
  • Comparable provider performance data: an apples-to-apples view for the first time
  • A defensible accountability baseline for TDDS certification and renewal conversations
  • TMNA-owned scoring logic configured in DataTrue for ongoing independent monitoring
Phase 3 · Governance & Handoff

Remediation SLAs

Formal cure periods with defined resolution deadlines ensuring that every gap identified in QA review has an owner, a timeline, and a consequence for non-resolution.

Remediation process
  • Gap identification: DQI scoring and four-stage QA reviews surface implementation gaps precisely: exact event, exact provider, exact failure mode.
  • SLA clock starts at failure: Every failed QA check triggers a formal cure period. The provider receives a documented gap statement and the specific acceptance criteria they failed.
  • Escalation path: Gaps unresolved within the cure period are escalated to TMNA/TDDS program leadership with Merkle's documented evidence for contractual enforcement.
  • TMNA leads enforcement: Merkle tracks, documents, and escalates. TMNA holds contractual authority and leads enforcement conversations with providers directly.
What you get
  • A formal remediation process: SLA-governed cure periods, not informal follow-up emails
  • Documented evidence ready for TMNA enforcement conversations with non-compliant providers
  • Clear RACI: Merkle documents and escalates; TMNA enforces through certification authority
  • A compliance track record that informs certification renewal decisions for every provider
Phase 3 · Governance & Handoff

Provider Certification Renewal Gates

Compliance with the TMNA-approved tagging specification embedded as a mandatory condition of TDDS certification, making data quality a permanent, self-enforcing program requirement.

How certification gates work
  • Spec compliance as certification condition: TDDS will formalize the tagging spec in an update to provider governance materials, making implementation a permanent certification requirement, not a one-time project ask.
  • Annual renewal QA gate: Each provider's DQI score and QA audit results are reviewed at certification renewal; providers who have drifted face a formal remediation requirement before renewal is granted.
  • New provider onboarding standard: Any new TDDS-certified provider must demonstrate compliance as a condition of initial certification, so the framework becomes self-perpetuating.
  • Decertification backstop: Persistent non-compliance can result in decertification, the program's ultimate enforcement lever, held exclusively by TMNA/TDDS.
What you get
  • A permanently embedded compliance standard, not a project that expires in November 2026
  • Governance documentation structured for direct insertion into TDDS provider certification materials
  • Annual renewal QA gates that sustain data quality without ongoing Merkle involvement
  • New provider onboarding criteria that extend the framework to future TDDS additions automatically
Phase 3 · Governance & Handoff

Structured Handoff to TMNA Ops

A planned, documented transition that leaves TMNA's internal team fully capable of operating, monitoring, and evolving the analytics framework independently, with no ongoing Merkle dependency.

Full Documentation Suite

Tagging specifications, data layer definitions, KPI framework, QA procedures, and acceptance criteria, all in GVM-aligned format for independent internal use.

TMNA Team Training

Hands-on capability transfer for TMNA TDDS, Analytics, and IT, covering QA monitoring in DataTrue, spec evolution, and provider onboarding procedures.

DataTrue Configuration

TMNA's existing QA platform configured with an alert framework for event volume drop-off and monthly DQI review automation; monitoring runs independently from day one.

TMNA-Owned Codebase

All specifications, schemas, taxonomies, acceptance criteria, and guides are TMNA-owned upon delivery: no vendor licensing, no access agreements required.

What you get
  • Complete documentation your team can read, maintain, and extend without Merkle involvement
  • A trained internal team capable of running QA monitoring, certifying providers, and evolving the spec
  • DataTrue configured and live: TMNA owns monitoring from handoff day forward
  • Full IP ownership: every deliverable is TMNA's asset, not a licensed Merkle product

By November 30, 2026: All 41 providers have passed acceptance criteria. Validated data flows into Adobe Analytics and TMNA's visualization platform. QA monitoring is active and producing clean baseline reporting. All documentation has been delivered. The framework runs without Merkle.

Continuous · All Phases

Ongoing Governance & Enhancements

A persistent governance layer running beneath all three phases and continuing post-handoff, keeping the measurement framework accurate, current, and aligned with TMNA's evolving program needs.

Bi-Wk
QA Review Cadence
DQI
Scoring Framework
Nov 30
Independence Date
Governance operating model
  • Bi-weekly QA cadence: Regular DQI scoring reviews across all provider categories, a continuous quality signal rather than a single end-of-project audit.
  • Alert framework: Event volume drop-off monitoring in DataTrue. When a provider's tag implementation degrades, TMNA sees it in real time.
  • Enhancements roadmap: Structured process for incorporating new TDDS-certified providers, spec evolution, and KPI additions.
  • Certification renewal integration: The annual provider certification cycle includes a QA gate review, embedding governance into TDDS's existing operating rhythm.
What you get
  • A continuous quality signal throughout the engagement, not a one-time QA event
  • An enhancements roadmap TMNA can execute independently post-November 2026
  • DataTrue-configured monitoring that runs without Merkle after handoff
  • Governance embedded in TDDS's existing certification rhythm, not a separate program to manage
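The event-volume drop-off alerting described above can be illustrated with a minimal sketch. This is not DataTrue's actual API (alerting there is configured in-platform); the interface, threshold, and provider IDs below are hypothetical, chosen only to show the degradation-detection logic.

```typescript
// Illustrative sketch only: DataTrue alerting is configured in-platform.
// All names, thresholds, and values here are hypothetical assumptions.
interface ProviderVolume {
  providerId: string;           // a TDDS-certified provider
  baselineDailyEvents: number;  // e.g. a trailing 28-day average
  todayEvents: number;          // most recent day's event count
}

const DROP_THRESHOLD = 0.5; // flag providers below 50% of baseline

function volumeAlerts(volumes: ProviderVolume[]): string[] {
  return volumes
    .filter(v => v.todayEvents < v.baselineDailyEvents * DROP_THRESHOLD)
    .map(v =>
      `ALERT: ${v.providerId} event volume at ` +
      `${Math.round((v.todayEvents / v.baselineDailyEvents) * 100)}% of baseline`);
}

// One provider whose tag implementation has silently degraded, one healthy:
console.log(volumeAlerts([
  { providerId: "provider-07", baselineDailyEvents: 12000, todayEvents: 4100 },
  { providerId: "provider-12", baselineDailyEvents: 8000, todayEvents: 7900 },
]));
```

A baseline-relative check like this is what lets a drop-off surface in near real time rather than at the next scheduled audit: the healthy provider produces no alert, while the degraded one is flagged immediately.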
Delivery Model

How the Machine Runs

An adaptive transformation structure built on agile delivery, drawing on Merkle's deep bench of specialists to meet program needs at every phase while maintaining strategic alignment and accountability.

Transformation Pillars
Vision
A common vision for success and the outcomes transformation should drive: TMNA owns a standardized, durable, vendor-agnostic Tier 3 analytics framework, operable independently after November 2026.
Technology
The solutions that enable seamless experiences while maximizing existing investments: Adobe Analytics, Ensighten, DataTrue, and the TBDP data lake, all within TMNA's existing architecture.
Business Case
The impact and measurable return: comparable provider performance data, defensible accountability benchmarks, and a QA-validated data foundation enabling confident program decisions.
Operating Model
The organizational structure, processes, and capabilities to execute and sustain transformation: a clear RACI, a bi-weekly governance cadence, and structured knowledge transfer to TMNA operations.
Adaptive Team Structure
Core Team Leadership

Account Leader, Transformation Leader, Program Leader, and Product Leader provide strategic oversight, client relationship management, and workstream coordination across all three phases.

Core Workstream Leads

Analytics Lead, Technology Strategy Lead, and Implementation Lead provide oversight, executive communication, and strategic alignment while coordinating across the provider ecosystem.

Flex Team SMEs

Project Management, Business Strategy, Analytics, Change Management, Business Analysis, Platform Architecture, and Creative resources drawn from Merkle's deep bench based on workstream priorities and program phase.

Agile Delivery

Delivery follows a parallel-workstream agile model: discovery, stakeholder alignment, and prioritization run concurrently across multiple tracks, with team members and capacity allocated to active workstreams at each phase.

Adoption is Built With Your Teams
Team-Based Training
  • Role-specific playbooks for data-driven decisions, not generic training
  • Communications plan to build awareness and drive adoption
  • Training with real TMNA use cases for contextual learning, not sample data
Center of Excellence
  • Champions across TMNA teams who have lived current challenges and have a stake in overall success
  • Merkle supports standards and best practices for the COE to operate
  • First line of defense for process or training questions from their teams
Feedback Loops
  • Ongoing engagement to collect feedback, surface issues, and document what is/isn't working
  • Response cycles to address technical fixes or refine features
  • Phased rollout with pilot teams validating workflows before broader deployment
What you get
  • Strategic oversight and vision alignment: a unified transformation vision aligned with business objectives and executive buy-in
  • Right-sized resource flexibility: specialized expertise pulled in based on workstream priorities, optimizing budget while maintaining momentum
  • Accelerated time-to-value: agile delivery enables rapid response to changing requirements and faster implementation of high-impact initiatives
  • Transparent governance, no administrative burden: bi-weekly reviews and a single escalation path keep every stakeholder aligned
Our People

Meet the Team

An integrated team of analytics architects, delivery leaders, and automotive specialists certified in Adobe Analytics and CJA, with direct experience managing multi-tier dealer ecosystems at scale.

Core Team
SB
Scott Burnam
Director, Client Services
8+ years of digital analytics experience with a client-services-led approach built on partnership, accountability, and calibrated delivery. Serves as senior point of ownership for complex engagements, coordinating across analytics, engineering, privacy, and procurement teams. Has led multi-year programs for Cox Automotive, Verizon, the US GSA, and Broadcom: engagements requiring executive alignment, regulatory awareness, and steady leadership through complexity.
Client Services · Program Ownership · Cox Automotive
AG
Adriana Grajales
VP, Delivery Excellence
10+ years at Merkle | Cardinal Path, progressing from Senior Digital Project Manager to VP of Delivery Excellence. Deep foundation in digital program management and delivery leadership, with a strong track record of scaling delivery practices. Earlier experience managing Nissan USA and Infiniti USA accounts at Critical Mass adds direct automotive industry context to the Toyota team.
Delivery Excellence · Nissan / Infiniti · Program Scaling
NV
Nicholas von Hahn
Director, Analysis & Insights
Seasoned digital analytics leader with a proven track record delivering complex, enterprise-scale analytics programs. Deep expertise in web measurement, data strategy, and reporting. Certified Adobe Analytics Architect Master. Developed specialized expertise in the automotive sector, leading analytics engagements for global OEMs including Volkswagen, Ford, Hyundai, and Bridgestone, modernizing measurement frameworks and improving data quality at scale.
Adobe Architect Master · VW · Ford · Hyundai · Vancouver, BC
MW
Mike Watts
Head of Analytics & Testing
Team Lead for Merkle's Digital Experience Analytics group. Sets the vision and direction for data-driven decision-making, ensuring brands implement solutions that enhance personalized customer experiences and maximize business impact. Certified Expert in Adobe Analytics and CJA. Champions a culture of curiosity and continuous improvement, equipping analysts with the tools, methodologies, and industry best practices to push the boundaries of digital measurement.
Adobe Analytics Expert · CJA Certified · DX Analytics Lead
OK
Oshu Kapil
Analytics Manager
Experienced data enthusiast and problem solver with strong technical fluency across web analytics implementation and reporting tools. Expertise spans Adobe Launch, Adobe Experience Platform (AEP), Adobe Analytics, Customer Journey Analytics (CJA), Google Tag Manager (GTM), and GA4. Excels at transforming complex data into actionable insights. Certified Expert in Adobe Analytics and CJA.
Adobe Analytics Expert · AEP · Web SDK · GTM · GA4
AP
Akshit Panda
Analytics Sr. Manager
Digital Analytics specialist with extensive experience designing and implementing enterprise-scale measurement solutions across web and mobile platforms. Deep expertise in Adobe Analytics, Adobe Experience Platform, Customer Journey Analytics, Web SDK, and tag management frameworks. Specializes in translating complex business requirements into scalable, privacy-compliant analytics architectures. Certified Master in Adobe Analytics, Expert CJA.
Adobe Master · Web SDK · Privacy-Compliant Arch
GD
Gabriel Davis
Analytics Director
Provides impactful recommendations grounded in site data and data-backed insights. Believes good data visualization and storytelling are critical to client success. Clients have included Microsoft, Costco, Samsung, L'Oréal, Verizon, Subaru, Intel, and Shiseido. Recent focus in Adobe Analytics and CJA: building client-friendly, detailed dashboards that answer critical business questions, and regularly leading client training sessions for these tools.
Adobe Analytics Expert · Subaru · Verizon · Dashboard Design
SY
Shennie Yang
Senior Project Manager
Seasoned project manager with 8+ years of experience leading digital analytics initiatives for enterprise clients. Known for a structured and strategic approach to managing programs with multiple workstreams and stakeholders. Consistently applies best practices while delivering against project KPIs, timelines, and budgets. Industry experience includes Bridgestone, Amgen, AbbVie, Intel, Suncor, Feeding America, Universal Music Group, and Walmart eCommerce.
PMP · Bridgestone · Vancouver, Canada
QA Specialist
MN
Mykim Nguyen
Lead QA Analyst
8+ years in QA across sports media, entertainment, mobile applications, and ecommerce analytics. Built Vail Resorts' first analytics testing structure and joined their A/B testing team. Most recently at Charter Communications as QA Tester and Application Release Specialist. Adobe Analytics QA specialist with deep expertise in event validation, data layer testing, and DataTrue-based monitoring frameworks. QA Certified.
QA Certified · Adobe Analytics QA · DataTrue
NP
Naveen Puppala
Senior Analyst
Digital analytics and optimization professional with 6+ years driving data-led decision making and measurable business impact. Specializes in marketing analytics, measurement strategy, customer journey analysis, and experience optimization. Strong skill set in translating complex behavioral data into actionable insights. Industry experience: Bridgestone, Royal Bank of Canada, Bell Canada, Walmart eCommerce. Google Analytics Certified; Adobe Analytics Expert Business Practitioner; Snowflake Snow-Pro Core.
Adobe Analytics Expert · Snowflake · Bridgestone
Team credentials at a glance
  • 160+ Adobe Analytics & CJA credentialed employees across Merkle globally
  • 1,300+ Adobe credentials and certifications; 2,700+ Adobe-trained staff globally
  • Global Platinum Adobe Partner with specializations in AA, AEM, Commerce, Real-Time CDP, Target, and CJA
  • Direct automotive experience: Volkswagen, Ford, Hyundai, Bridgestone, Subaru, Nissan, Infiniti, Cox Automotive
Proof of Capability

Case Studies

Analogous engagements demonstrating Merkle's ability to standardize analytics across complex multi-site ecosystems, govern multi-vendor implementations, and deliver TMNA-scale programs.

Global Automobile Retailer Brand & Dealer Sites
Adobe Analytics Across Tier 1, 2 & 3 Dealer Network
$2M+
Incremental revenue from CTA A/B tests
25%
Increase in lead conversion rate
18%
Reduction in JIRA data quality bugs
The client had transitioned from Autometrix to Adobe Analytics and needed to optimize implementations across all brand and brand-support websites (Tiers 1 & 2) and 2,400 dealer (Tier 3) websites. They needed a partner that could create best practices, roll them out across tiers and regions in a unified way, and apply governance across brands and tiers.
  • Clean CTA data enabled A/B tests that drove over $2M in incremental revenue in a single year
  • JIRA bugs reduced by 18%; data integrity greatly improved, opening up new analytics capabilities
  • 25% increase in lead conversion rate; reporting expanded to include dealer reporting and advanced analyses previously unavailable
Global Mobility Company
Advancing Analytics Through a Modern, Scalable Implementation
93%
Reduction in analytics technical debt
100%
Analytics & CDP powered by event-driven data layer
Address growing technical debt across the analytics ecosystem, establish consistent and reliable data across platforms, and strengthen the organization's ability to deliver personalized, data-driven experiences at scale.
Implemented Adobe Web SDK and established a centralized, event-driven data layer to standardize how user interactions and business events are captured across digital experiences. Analytics tags, marketing pixels, and the real-time CDP were all redeployed to consume data directly from this unified data layer, enabling immediate data ingestion, profile enrichment, and activation across channels.
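The single-producer, many-consumer pattern behind an event-driven data layer can be sketched briefly. This is a hedged illustration modeled loosely on the Adobe Client Data Layer convention, not the actual implementation delivered in this engagement; the event names and payload fields are hypothetical.

```typescript
// Illustrative sketch of an event-driven data layer (publish/subscribe).
// Event names and payload fields below are hypothetical, not a real spec.
type DataLayerEvent = {
  event: string;          // e.g. "leadFormSubmit"
  [key: string]: unknown; // standardized payload fields
};

const dataLayer: DataLayerEvent[] = [];            // single source of truth
const subscribers: ((e: DataLayerEvent) => void)[] = [];

// Every consumer registers once; every event reaches every consumer.
function subscribe(fn: (e: DataLayerEvent) => void): void {
  subscribers.push(fn);
}

// Pages push events to one place instead of firing per-vendor tags directly.
function push(e: DataLayerEvent): void {
  dataLayer.push(e);
  subscribers.forEach(fn => fn(e));
}

// Analytics tags and the CDP consume the identical event payload:
subscribe(e => console.log(`[analytics] ${e.event}`));
subscribe(e => console.log(`[cdp] ${e.event}`));
push({ event: "leadFormSubmit", dealerSiteId: "site-0042", formType: "test-drive" });
```

Because every consumer reads the same event from the same queue, analytics, pixels, and the CDP can never disagree about what happened, which is the mechanism behind the consistency gains described above.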
Global Rental Car Company: Three Brands
Adobe Analytics & Target Total Solution Management
$150M
Incremental revenue from Adobe Target program annually
18
FTE specialist team
Three distinct brands, each generating billions of customer journeys annually. Years of accumulated technical debt across Adobe Launch and AEP implementations led to data quality issues, inaccurate attribution, and an inability to leverage insights for strategic decisions. A fragmented approach prevented unified reporting and created conflicting data that stakeholders couldn't trust.
  • Harmonized analytics across three brands into a trusted Adobe ecosystem used by Finance as the official source of truth for media attribution chargebacks
  • Improved data quality enabling confident strategic decision-making
  • Enhanced personalization through clean, reliable customer data: over $150M per year in incremental revenue
  • Established scalable processes for ongoing maintenance and continuous improvement
National B2B Telecommunications
Adobe Analytics Data Quality & Insights at Scale
$4M
Media efficiency from quality traffic score
$15M
Additional annual subscription revenue from A/B/n test
The client sought to enhance the customer experience and attract more subscribers through on-site demand generation. Previous tagging efforts were inadequate; the existing measures could only identify non-bounced site visits as quality traffic. Without visibility into subsequent actions, it was difficult to optimize targeting.
  • Introduced a new 'quality traffic score' that filters customer IDs based on likelihood to purchase, uncovering $4M in media efficiency opportunities
  • A/B/n testing on the homepage generated $15M in additional annual subscription revenue
Why these cases matter for TDDS
  • Dealer Tier 3 at scale: Merkle has governed analytics across 2,400 dealer sites with 7 development providers, directly analogous to TDDS's 1,220 sites and 41 certified providers
  • Web SDK migration expertise: Proven track record designing and executing event-driven data layer architectures, directly applicable to TMNA's planned Web SDK migration
  • Multi-brand harmonization: Experience unifying analytics across fragmented multi-vendor ecosystems into a single trusted data source, mirroring the TDDS standardization challenge
  • Data quality at enterprise scale: Remediation-first approaches with dedicated QA staff have produced measurable improvements in data integrity and business outcomes
Client References

References

Two client contacts are available for TMNA to speak with directly, both from engagements involving multi-site analytics ecosystems, standardized tagging architectures, and complex vendor governance models.

Enterprise Mobility
Contact
Christiaan Breur
Title
Lead Digital Architect
Email
christiaan.breur1@em.com
Phone
(765) 413-8554
Engagement scope: Adobe Analytics implementation across Enterprise Mobility's multi-brand digital ecosystem, including analytics standardization and tag governance across a complex platform environment.
Bridgestone
Contact
Available upon request
Relationship
Active client; multiple team members (Nicholas von Hahn, Naveen Puppala, Shennie Yang) have direct Bridgestone engagement experience
Engagement scope: Digital analytics implementation and measurement strategy for Bridgestone's automotive-adjacent digital properties, including Web SDK and Adobe Analytics work relevant to the TDDS scope.
Analogous Work Examples
  • Multi-site / multi-entity ecosystem: The global automobile retailer engagement covered Tiers 1, 2, and 3 across 2,400 dealer sites with 7 certified development providers, the closest direct analog to the TDDS scope.
  • Standardized tagging architecture: The Global Mobility Company engagement established a centralized, event-driven data layer standardizing data collection across all digital experiences, directly mirroring the TMNA data layer spec objective.
  • OEM with franchise model: Automotive experience across Volkswagen, Ford, Hyundai (Nicholas von Hahn) and Nissan/Infiniti (Adriana Grajales at Critical Mass) demonstrates familiarity with OEM-to-dealer governance structures and multi-tier analytics accountability.
  • Multi-brand governance: The Global Rental Car Company engagement unified analytics across three brands into a single trusted Adobe ecosystem, analogous to the TDDS challenge of unifying 41 certified provider implementations into one TMNA-owned data architecture.
What you get
  • Two named references available for direct contact by TMNA prior to or following proposal evaluation
  • Analogous case documentation across multi-site tagging, multi-vendor governance, and OEM franchise models
  • Direct automotive team experience: references who can speak to Merkle's approach in dealer and automotive contexts specifically

Additional references available: Merkle can provide additional client references upon request, including from healthcare, financial services, telecommunications, and retail engagements involving large-scale analytics standardization and vendor governance programs.

Merkle Delivery Framework
Additional Proposal Features