Approach for TDDS
Designed to solve the TDDS measurement problem at scale: 1,220 dealer sites, 41 certified providers, and zero proprietary dependencies, delivering a fully operational, TMNA-owned measurement framework by November 30, 2026.
This proposal covers three sequential phases from discovery through full network deployment, supported by governance, team, and case study materials. Use the left-hand navigation to explore any section in depth.
Each of 41 providers implements analytics independently. TMNA cannot aggregate or compare dealer data across the network. KPIs are defined differently by every provider.
The current architecture lacks a mechanism for TMNA to retain control over data structures, tagging frameworks, and schemas. No path to an OEM-level CDP without this foundation.
TMNA holds contractual authority over 41 providers but no structured enforcement mechanism exists; no implementation tracking, no escalation playbook, no graduated remediation.
41 providers, 5 tool categories, 1,220 sites, each with its own interpretation of what data to fire, when, and how. The result is data that cannot be trusted at the network level.
- Standardized measurement framework across all 1,220 dealer sites by November 2026
- Unified dealer performance visibility from a single TMNA-owned data source
- TMNA-owned data architecture; no proprietary dependencies, no vendor lock-in
- Dealer-level scorecards and network rollups for provider accountability
- Future-ready CDP foundation with identity-ready event and attribute design
- Full TMNA team independence from Merkle by November 30, 2026
- 1,300+ Adobe-trained specialists globally: analysts, architects, and strategists with direct automotive experience
- Global Adobe Platinum Partner with specializations in Analytics, Real-Time CDP, Target, and CJA
- Direct Tier 3 dealer ecosystem experience: 2,400 dealer sites, 7 certified providers, OEM governance models
- Automotive OEM bench: VW, Ford, Hyundai, Bridgestone, Nissan/Infiniti; not automotive-adjacent, automotive-native
- Full IP ownership on delivery: every spec, schema, and scoring framework is a TMNA asset from day one
Measurement Framework
A canonical definition for every KPI: exact trigger, required parameters, and allowed values, so that data means the same thing across all 1,220 dealer sites regardless of which of the 41 providers built them.
Sessions, unique visitors, new vs. returning, SRP-to-VDP progression rate, page depth, bounce rate by entry page and traffic source. The baseline layer every provider must instrument identically.
DR tool launch rate (% of VDP visitors), funnel progression by step, payment estimator completion, financing step engagement rate across all 10 certified digital retail tool providers.
Trade-in launch/completion and valuation completion rate across 7 providers. Service appointment start/completion across 9 service scheduler providers. Abandonment tracked at each funnel step.
Traffic flow into, through, and out of dealer sites, enabling full-funnel optimization and messaging interception across Toyota's digital ecosystem at the OEM level.
- GVM-aligned KPI definitions in TMNA's internal standard format, ready for report suite configuration on day one
- Four decision-driving domains: not maximum data volume, but the focused signal set that drives dealer and OEM decisions
- Every KPI reportable at four levels: dealer, provider, size cohort, and network rollup, enabling apples-to-apples comparison for the first time
- CDP-extensible design: event and attribute structures built for identity resolution without re-tagging
Why this matters: Prior Tier 3 analytics efforts failed because specs defined what to collect but not precisely when, resulting in incomparable data across providers. This framework closes that gap before a single provider begins implementation.
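To make "canonical definition" concrete, a KPI entry of this shape pins down the exact trigger and required parameters so the metric is computed identically everywhere. This is a hypothetical sketch in plain JavaScript; the event names, fields, and structure are illustrative assumptions, not the GVM-aligned spec itself.

```javascript
// Illustrative only: names, events, and fields below are assumptions
// for demonstration, not the TMNA-approved KPI catalog.
const kpiSpec = {
  kpi: "srp_to_vdp_progression_rate",
  definition: "Share of sessions with an SRP view that reach at least one VDP",
  numeratorEvent: "vdp_view",   // exact trigger: one hit per vehicle detail page load
  denominatorEvent: "srp_view", // exact trigger: one hit per search results page load
  requiredParams: ["dealer_code", "provider_id"],
};

// Because trigger and parameters are pinned down, the KPI is computed
// identically for any provider's sessions:
function progressionRate(sessions) {
  const sawSrp = sessions.filter((s) => s.events.includes(kpiSpec.denominatorEvent));
  if (sawSrp.length === 0) return 0;
  const reachedVdp = sawSrp.filter((s) => s.events.includes(kpiSpec.numeratorEvent));
  return reachedVdp.length / sawSrp.length;
}
```

The point of the sketch: once two providers agree on when `srp_view` and `vdp_view` fire, their progression rates become directly comparable; without that agreement, identical-looking numbers mean different things.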
Roadmap & Gantt
Discovery, KPI design, audit, and spec work run in parallel. Provider categories onboard as their specs are ready. QA and documentation run continuously across all phases from the moment the first provider goes live.
Kickoff Jun 29
The 7 website providers are the foundational layer: all 1,220 dealer sites run on them. Their data layer establishes the base schema all embedded tool events write into. They start in September, before all tool provider specs are finalized.
Digital retail, chat, trade-in, and service scheduler providers begin training in September but implement on a rolling basis; their iFrame and postMessage architectures require provider-specific prep time that runs alongside website rollout.
As soon as the first website provider goes live, QA begins. Size A and B dealers are validated first, the highest-volume sites where data quality gaps most affect TMNA's enterprise reporting and media attribution decisions.
Knowledge transfer starts in July, not November. Every spec, playbook, and GVM-aligned document is written during the phase it covers so TMNA teams can review, question, and own it before the engagement closes.
- Faster time to first data: website providers go live in October, not November, because spec and training overlap with audit
- Risk reduction through staging: A/B dealer QA catches patterns before the full C/D/E rollout in November
- No bottlenecks from sequential handoffs: audit informs spec while running; spec informs training while finalizing
- TMNA teams build capability in real time: documentation and training run throughout, not as a bolt-on at the end
Data Collection Audit
A provider-by-provider inventory of current-state data collection across all 41 certified providers, establishing the factual baseline that makes the specification achievable, not aspirational.
- Website Providers (7): All 7 currently push to Adobe Analytics. Audit confirms existing data layer maturity, variable mapping, and Ensighten deployment model per provider, establishing the baseline each must standardize to.
- Chat Tools (8): Tracking capability varies significantly. Audit assesses whether each provider supports postMessage or direct dataLayer write, documenting limitations before spec publication so the spec reflects reality.
- Digital Retail Tools (10): iFrame architecture review per provider. Documents event passthrough feasibility and required schema by integration type, preventing spec requirements that providers physically cannot implement.
- Trade-In (7) & Service Scheduler (9): Widget event inventory identifying which funnel steps are currently tracked, which are missing, and which require custom instrumentation, prioritizing highest-gap providers for early engagement.
- Platform constraints: CMS, TMS, and iFrame limitations documented per provider so the spec accounts for real-world constraints before providers receive it. No aspirational requirements that fail in production.
- A provider-by-provider current-state inventory across all 41 certified providers, the only audit of its kind in the TDDS program
- Platform limitation documentation that prevents spec failures during implementation
- A data quality gap map showing where inconsistencies originate today, provider-wide vs. dealer-cohort-specific
- Prioritized remediation targets: highest-volume, highest-impact gaps addressed first
TMNA confirmed: Inconsistency in tracking tools exists due to differences in tool platforms across the 41 certified providers. The audit creates the factual baseline that makes the spec achievable rather than aspirational.
Solution Design
A standardized, TMNA-owned data layer architecture applicable across all dealer websites and every embedded TDDS-certified third-party tool, the single schema all 41 providers implement consistently.
- Simplicity over volume: Every field in the schema earns its place against a specific KPI requirement. No speculative attributes that add complexity without adding decision value.
- Vendor-agnostic by design: The dataLayer object is plain JavaScript. It does not depend on Ensighten, Adobe, or any TMS to exist; any provider can write to it regardless of platform.
- TMNA ownership enforced in the schema: The `dealer_code` and `provider_id` fields are required on every hit, ensuring TMNA can always attribute data to a specific dealer and provider without relying on vendor-side logic.
- CDP-extensible: The `user.session_id` field is architected to accept a hashed identity value when TMNA activates its CDP; no re-instrumentation required.
- iFrame event passthrough: Embedded tools communicate back to the parent page via postMessage; the schema defines the required payload structure for each tool category so providers know exactly what to fire.
- A single canonical dataLayer schema applicable across all 5 provider categories, one spec, 41 implementations
- Full TMNA ownership of the schema on delivery; no vendor licensing, no proprietary structure
- Ensighten-native mapping: existing tag management infrastructure, no new tooling required
- iFrame passthrough architecture documented per tool category so providers know exactly how to fire events back to the parent dataLayer
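A minimal sketch of what such a plain-JavaScript dataLayer could look like in practice. All field names and values here (`dealer_code`, `provider_id`, `user.session_id`, the event names) are illustrative assumptions, not the published TMNA schema; the browser-specific parts (`window`, `postMessage` listeners) are noted in comments.

```javascript
// Hypothetical sketch of a vendor-agnostic dataLayer: no TMS dependency.
// In the browser this would be: window.tmnaDataLayer = window.tmnaDataLayer || [];
const tmnaDataLayer = [];

function pushHit(hit) {
  // Schema rule: dealer_code and provider_id are required on every hit,
  // so every row is attributable without vendor-side logic.
  if (!hit.dealer_code || !hit.provider_id) {
    throw new Error("hit rejected: dealer_code and provider_id are required");
  }
  tmnaDataLayer.push(hit);
}

// Page-level event from a website provider:
pushHit({
  event: "vdp_view",
  dealer_code: "04123",
  provider_id: "website_07",
  user: { session_id: "anon-8f3c" }, // can later carry a hashed CDP identity
});

// An embedded tool's postMessage payload, relayed by the parent page.
// In the browser: window.addEventListener("message", (e) => pushHit(e.data));
pushHit({
  event: "dr_payment_estimator_complete",
  dealer_code: "04123",
  provider_id: "dr_tool_03",
  tool_category: "digital_retail",
});
```

Because the object is plain JavaScript, any provider's stack can write to it; the TMS (Ensighten today) only maps it onward to Adobe Analytics.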
Implementation Recommendations
Provider-specific implementation guidance derived from the TMNA-approved spec, clear enough that any certified provider can implement consistently without bespoke Merkle involvement for every site.
- Per-category implementation guides: Separate guidance for website providers, chat tools, digital retail tools, trade-in tools, and service schedulers accounting for each category's architectural constraints.
- Acceptance criteria: Explicit pass/fail definitions for every required event, so providers know exactly what QA validation will test.
- Platform-specific notes: Where a provider's platform creates a constraint, the guide documents the approved workaround or alternative implementation path.
- TMNA approval gates: No provider receives implementation guidance until the underlying spec has received explicit TMNA sign-off.
Formal spec publication to all 41 providers as a single coordinated release, so every provider starts from the same TMNA-approved document.
All deliverables are TMNA-owned upon delivery. No ongoing Merkle access or license required to operate or extend them.
- Five category-specific implementation guides covering all 41 certified providers
- Explicit acceptance criteria for every required event; no ambiguity at QA time
- A TMNA-approved spec serving as the permanent certification compliance standard going forward
- Documentation structured for certification updates, enabling TDDS to embed requirements into provider governance materials permanently
Data Literacy & Consulting
Structured working sessions with all 41 certified providers, translating the TMNA-approved spec into shared understanding before any implementation begins.
- Provider-specific sessions: Each certified provider receives a dedicated spec review covering requirements, technical questions, and platform-specific constraints, not a generic group call.
- Environment inventory: CMS, TMS, and iFrame architecture confirmed per provider so platform limitations are documented before implementation begins.
- Acceptance criteria walkthrough: Providers understand exactly what QA will test, eliminating the ambiguity that produces re-work and implementation delays.
- Direct engagement model: Merkle operates as an extension of TMNA, engaging providers directly for day-to-day execution, with TMNA providing governance and escalation support.
TDDS maintains recurring provider meetings, dealer communications, and field staff reviews. Merkle works within these structures, not around them, to minimize disruption.
Merkle coordinates implementation as an agent of TMNA. TDDS holds all contractual authority; Merkle handles execution coordination.
- 41 providers briefed and aligned on the TMNA-approved spec before implementation begins
- Platform constraints documented per provider; no surprises during QA
- A single TMNA escalation path when provider issues require contractual intervention
- Merkle as an extension of TMNA, not an additional layer between TDDS and its providers
Provider Training
Hands-on training built around the TDDS spec, giving each provider's technical team the context and confidence to implement correctly the first time.
- Role-specific content: Training structured for the technical implementers at each provider, with specific guidance on the TMNA data layer and the acceptance criteria they need to pass.
- Real TDDS use cases: Training uses actual TMNA examples, not sample data, so providers understand the business context behind each event they're implementing.
- Category cohorts: Providers grouped by category for training; shared constraints, shared guidance, efficient use of provider time.
- Office-hour support: Structured Q&A availability during implementation windows so providers can resolve technical questions without blocking progress.
Parallel training track for TMNA TDDS, Analytics, and IT teams, building internal capability to QA, monitor, and maintain the framework independently post-engagement.
Written implementation playbooks for each provider category: persistent reference materials that survive staff turnover and support future certification renewal cycles.
- Per-category training for all 41 provider technical teams
- TMNA internal capability built in parallel; your team owns QA monitoring on day one post-handoff
- Written playbooks that outlast the engagement and support future certification renewals
- Reduced re-work: providers who understand the spec implement it correctly the first time
Implementation Guidance
Active technical support during the implementation window: resolving blockers, enforcing spec consistency, and tracking progress across all five provider categories simultaneously.
- Website Providers (7): Phased first as the foundational layer, before embedded tools. All 7 currently push to Adobe Analytics; implementation standardizes trigger conditions, naming conventions, and required attributes.
- Digital Retail (10): iFrame event passthrough with standardized action schema. Payment estimator and full DR flows covered.
- Chat Tools (8): Post-message bridge vs. direct dataLayer write assessed per provider. Tracking capability varies; guidance reflects each provider's actual architecture.
- Trade-In (7): Widget events: start, step, valuation received, completion. Abandonment tracking required at each funnel step.
- Service Scheduler (9): Appointment funnel: start, step, confirmation, abandonment. Nine providers with high variability in current capability.
- Active implementation tracking across all 41 providers with gap identification and escalation
- Consistent spec adherence: no provider-specific variations that compromise standardization
- Progress visibility for TMNA at every stage, not a black box between spec delivery and QA
- Escalation support: Merkle documents and escalates; TMNA enforces through TDDS certification authority
QA & Certification Readiness
A four-stage QA process ensuring every provider implementation is accurate, complete, and TMNA-spec-compliant before the framework goes live: zero tolerance for drift.
Stage 1: Readiness review
- Spec review session with each provider
- Environment inventory (CMS, TMS, iFrame)
- Platform limitations documented
- Acceptance criteria confirmed before code is written

Stage 2: Implementation validation
- Tag audit via browser devtools + proxy
- Event firing validation against spec
- Data layer attribute completeness check
- Adobe Analytics variable mapping confirmed

Stage 3: Live-site audit
- Representative live site audits per provider
- DQI score assigned (5 dimensions)
- Dashboard data reconciliation
- Remediation SLA clock starts at failure

Stage 4: Ongoing monitoring
- Monthly DQI review per provider category
- Alert framework for event volume drop-off
- Certification renewal QA gate
- TMNA team trained to run independently
- A four-stage QA framework that catches implementation gaps before they reach production data
- DataTrue-integrated monitoring configured in TMNA's existing platform; not a new tool
- Formal acceptance criteria per provider category, pass/fail is never ambiguous
- TMNA owns QA monitoring independently from day one of post-engagement operations
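The "event firing validation against spec" and attribute-completeness checks above can be automated as simple pass/fail functions, which is what makes "pass/fail is never ambiguous" achievable. A hypothetical sketch; the event names and criteria below are invented for illustration, not the published acceptance criteria.

```javascript
// Hypothetical acceptance criteria keyed by event name; the real criteria
// would come from the TMNA-approved spec per provider category.
const acceptanceCriteria = {
  trade_in_start: {
    requiredAttrs: ["dealer_code", "provider_id", "tool_category"],
    allowedValues: { tool_category: ["trade_in"] },
  },
};

// Validate one captured hit against its criteria: explicit pass/fail
// with a reason, so remediation statements can cite the exact failure mode.
function checkHit(hit) {
  const spec = acceptanceCriteria[hit.event];
  if (!spec) return { pass: false, reason: `no criteria for event "${hit.event}"` };
  for (const attr of spec.requiredAttrs) {
    if (!(attr in hit)) return { pass: false, reason: `missing required attribute: ${attr}` };
  }
  for (const [attr, allowed] of Object.entries(spec.allowedValues)) {
    if (!allowed.includes(hit[attr])) {
      return { pass: false, reason: `disallowed value for ${attr}: ${hit[attr]}` };
    }
  }
  return { pass: true };
}
```

A failed check returns the exact attribute and failure mode, which feeds directly into the documented gap statement that starts the remediation SLA clock.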
Data Quality Index
A size-stratified scoring framework giving TMNA comparable, defensible data quality benchmarks across all 41 providers, and a factual basis for holding each one accountable at certification renewal.
| Size Cohort | Monthly Sessions (Est.) | Approx. Dealer Count | QA Priority |
|---|---|---|---|
| Size A: Enterprise | 50,000+ | ~80 | First |
| Size B: Large | 20,000–50,000 | ~190 | First |
| Size C: Mid-Large | 8,000–20,000 | ~280 | Second |
| Size D: Mid-Small | 3,000–8,000 | ~420 | Third |
| Size E: Small | <3,000 | ~250 | Third |
| Provider | Size A | Size B | Size C | Size D | Size E |
|---|---|---|---|---|---|
| Provider 1 | 92 | 88 | 75 | 65 | 50 |
| Provider 2 | 80 | 85 | 82 | 78 | 70 |
| Provider 3 | 88 | 78 | 65 | 55 | 40 |
| Provider 4 | 72 | 80 | 83 | 81 | 74 |
| Provider 5 | 85 | 82 | 77 | 68 | 58 |
| Provider 6 | 60 | 72 | 78 | 80 | 76 |
| Provider 7 | 78 | 74 | 70 | 62 | 48 |
- Eliminates composition bias: A provider serving primarily Size A dealers will appear stronger on raw KPIs than one serving Size E dealers, even if implementation quality is identical. Raw scores reward portfolio composition, not execution quality.
- Exposes implementation gaps: Size-stratified views reveal whether data quality inconsistencies are provider-wide or concentrated in specific dealer cohorts, a critical distinction for targeted remediation.
- Sets fair benchmarks: TMNA holds each provider accountable to the performance norms of the segment they actually serve; not a network-wide average that disadvantages providers focused on smaller dealers.
- Prioritizes QA resources: Size A and B dealers receive QA attention first, the 270 highest-volume sites where data quality gaps most affect TMNA's enterprise reporting and media attribution decisions.
- A DQI scorecard across all 41 providers segmented by dealer size cohort (A through E)
- Comparable provider performance data: an apples-to-apples view for the first time
- A defensible accountability baseline for TDDS certification and renewal conversations
- TMNA-owned scoring logic configured in DataTrue for ongoing independent monitoring
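The composition-bias point above can be shown with a small worked example. The numbers and the scoring formula here are hypothetical; the real cohort benchmarks would come from the network's own DQI baselines, and the production scoring logic lives in DataTrue.

```javascript
// Hypothetical per-cohort DQI norms (A = enterprise ... E = small dealers).
const cohortBenchmark = { A: 85, B: 80, C: 75, D: 68, E: 55 };

// Naive raw average: rewards portfolio composition.
const rawAvg = (dealers) =>
  dealers.reduce((sum, d) => sum + d.dqi, 0) / dealers.length;

// Size-stratified score: each dealer measured against its own cohort's norm.
// 0 means "at segment norm"; positive means above it.
const stratifiedScore = (dealers) =>
  dealers.reduce((sum, d) => sum + (d.dqi - cohortBenchmark[d.cohort]), 0) /
  dealers.length;

const providerX = [ // serves mostly Size A (enterprise) dealers
  { cohort: "A", dqi: 86 }, { cohort: "A", dqi: 84 }, { cohort: "E", dqi: 55 },
];
const providerY = [ // serves mostly Size E (small) dealers
  { cohort: "E", dqi: 58 }, { cohort: "E", dqi: 57 }, { cohort: "A", dqi: 85 },
];
// Raw averages favor provider X (75.0 vs ~66.7) purely because of portfolio
// mix; stratified scores show Y actually outperforms its segments (~+1.67 vs 0).
```

Under the raw view, provider Y looks weaker; under the stratified view, Y is the better implementer. That distinction is what makes the benchmarks defensible in certification renewal conversations.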
Remediation SLAs
Formal cure periods with defined resolution deadlines ensuring that every gap identified in QA review has an owner, a timeline, and a consequence for non-resolution.
- Gap identification: DQI scoring and four-stage QA reviews surface implementation gaps precisely: exact event, exact provider, exact failure mode.
- SLA clock starts at failure: Every failed QA check triggers a formal cure period. The provider receives a documented gap statement and the specific acceptance criteria they failed.
- Escalation path: Gaps unresolved within the cure period are escalated to TMNA/TDDS program leadership with Merkle's documented evidence for contractual enforcement.
- TMNA leads enforcement: Merkle tracks, documents, and escalates. TMNA holds contractual authority and leads enforcement conversations with providers directly.
- A formal remediation process: SLA-governed cure periods, not informal follow-up emails
- Documented evidence ready for TMNA enforcement conversations with non-compliant providers
- Clear RACI: Merkle documents and escalates; TMNA enforces through certification authority
- A compliance track record that informs certification renewal decisions for every provider
Provider Certification Renewal Gates
Compliance with the TMNA-approved tagging specification embedded as a mandatory condition of TDDS certification, making data quality a permanent, self-enforcing program requirement.
- Spec compliance as certification condition: TDDS will formalize the tagging spec in an update to provider governance materials, making implementation a permanent certification requirement, not a one-time project ask.
- Annual renewal QA gate: Each provider's DQI score and QA audit results are reviewed at certification renewal; providers who have drifted face a formal remediation requirement before renewal is granted.
- New provider onboarding standard: Any new TDDS-certified provider must demonstrate compliance as a condition of initial certification; the framework becomes self-perpetuating.
- Decertification backstop: Persistent non-compliance can result in decertification, the program's ultimate enforcement lever, held exclusively by TMNA/TDDS.
- A permanently embedded compliance standard, not a project that expires in November 2026
- Governance documentation structured for direct insertion into TDDS provider certification materials
- Annual renewal QA gates that sustain data quality without ongoing Merkle involvement
- New provider onboarding criteria that extend the framework to future TDDS additions automatically
Structured Handoff to TMNA Ops
A planned, documented transition that leaves TMNA's internal team fully capable of operating, monitoring, and evolving the analytics framework independently, with no ongoing Merkle dependency.
Tagging specifications, data layer definitions, KPI framework, QA procedures, and acceptance criteria, all in GVM-aligned format for independent internal use.
Hands-on capability transfer for TMNA TDDS, Analytics, and IT covering QA monitoring in DataTrue, spec evolution, and provider onboarding procedures.
TMNA's existing QA platform configured with an alert framework for event volume drop-off and monthly DQI review automation; monitoring runs independently from day one.
All specifications, schemas, taxonomies, acceptance criteria, and guides are TMNA-owned upon delivery; no vendor licensing, no access agreements required.
- Complete documentation your team can read, maintain, and extend without Merkle involvement
- A trained internal team capable of running QA monitoring, certifying providers, and evolving the spec
- DataTrue configured and live; TMNA owns monitoring from handoff day forward
- Full IP ownership: every deliverable is TMNA's asset, not a licensed Merkle product
By November 30, 2026: All 41 providers have passed acceptance criteria. Validated data flows into Adobe Analytics and TMNA's visualization platform. QA monitoring is active and producing clean baseline reporting. All documentation has been delivered. The framework runs without Merkle.
Ongoing Governance & Enhancements
A persistent governance layer running beneath all three phases and continuing post-handoff that keeps the measurement framework accurate, current, and aligned with TMNA's evolving program needs.
- Bi-weekly QA cadence: Regular DQI scoring reviews across all provider categories; a continuous quality signal, not a single end-of-project audit.
- Alert framework: Event volume drop-off monitoring in DataTrue. When a provider's tag implementation degrades, TMNA sees it in real time.
- Enhancements roadmap: Structured process for incorporating new TDDS-certified providers, spec evolution, and KPI additions.
- Certification renewal integration: Annual provider certification cycle includes QA gate review governance embedded into TDDS's existing operating rhythm.
- A continuous quality signal throughout the engagement, not a one-time QA event
- An enhancements roadmap TMNA can execute independently post-November 2026
- DataTrue-configured monitoring that runs without Merkle after handoff
- Governance embedded in TDDS's existing certification rhythm, not a separate program to manage
How the Machine Runs
An adaptive transformation structure built on agile delivery, drawing on Merkle's deep bench of specialists to meet program needs at every phase while maintaining strategic alignment and accountability.
Account Leader, Transformation Leader, Program Leader, and Product Leader provide strategic oversight, client relationship management, and workstream coordination across all three phases.
Analytics Lead, Technology Strategy Lead, and Implementation Lead provide oversight, executive communication, and strategic alignment while coordinating across the provider ecosystem.
Project Management, Business Strategy, Analytics, Change Management, Business Analysis, Platform Architecture, and Creative resources drawn from Merkle's deep bench based on workstream priorities and program phase.
Delivery follows a parallel-workstream agile model: discovery, stakeholder alignment, and prioritization run concurrently across multiple tracks, with team members and capacity allocated to active workstreams at each phase.
- Role-specific playbooks for data-driven decisions, not generic training
- Communications plan to build awareness and drive adoption
- Training with real TMNA use cases for contextual learning, not sample data
- Champions across TMNA teams who have lived current challenges and have a stake in overall success
- Merkle supports standards and best practices for the COE to operate
- First line of defense for process or training questions from their teams
- Ongoing engagement to collect feedback, surface issues, and document what is/isn't working
- Response cycles to address technical fixes or refine features
- Phased rollout with pilot teams validating workflows before broader deployment
- Strategic oversight and vision alignment: a unified transformation vision aligned with business objectives and executive buy-in
- Right-sized resource flexibility: specialized expertise pulled in based on workstream priorities, optimizing budget while maintaining momentum
- Accelerated time-to-value: agile delivery enables rapid response to changing requirements and faster implementation of high-impact initiatives
- Transparent governance, no administrative burden: bi-weekly reviews and a single escalation path keep every stakeholder aligned
Meet the Team
An integrated team of analytics architects, delivery leaders, and automotive specialists certified in Adobe Analytics and CJA, with direct experience managing multi-tier dealer ecosystems at scale.
- 160+ Adobe Analytics & CJA credentialed employees across Merkle globally
- 1,300+ Adobe credentials and certifications; 2,700+ Adobe-trained staff globally
- Global Adobe Platinum Partner with specializations in AA, AEM, Commerce, Real-Time CDP, Target, and CJA
- Direct automotive experience: Volkswagen, Ford, Hyundai, Bridgestone, Subaru, Nissan, Infiniti, Cox Automotive
Case Studies
Analogous engagements demonstrating Merkle's ability to standardize analytics across complex multi-site ecosystems, govern multi-vendor implementations, and deliver TMNA-scale programs.
- Clean data for CTAs resulted in A/B tests with over $2M incremental revenue in a single year
- JIRA bugs reduced by 18%; data integrity greatly improved and analytics capabilities opened up
- 25% increase in lead conversion rate; reporting expanded to include dealer reporting and advanced analyses unavailable previously
- Harmonized analytics across three brands into a trusted Adobe ecosystem used by Finance as the official source of truth for media attribution chargebacks
- Improved data quality enabling confident strategic decision-making
- Enhanced personalization through clean, reliable customer data: over $150M per year in incremental revenue
- Established scalable processes for ongoing maintenance and continuous improvement
- Introduced a new 'quality traffic score' that filters customer IDs based on likelihood to purchase, uncovering $4M in media efficiency opportunities
- A/B/n testing on the homepage generated $15M in additional annual subscription revenue
- Dealer Tier 3 at scale: Merkle has governed analytics across 2,400 dealer sites with 7 development providers, directly analogous to TDDS's 1,220 sites and 41 certified providers
- Web SDK migration expertise: Proven track record designing and executing event-driven data layer architectures, directly applicable to TMNA's planned Web SDK migration
- Multi-brand harmonization: Experience unifying analytics across fragmented multi-vendor ecosystems into a single trusted data source, mirroring the TDDS standardization challenge
- Data quality at enterprise scale: Remediation-first approaches with dedicated QA staff have produced measurable improvements in data integrity and business outcomes
References
Two client contacts available for TMNA to speak with directly, both from engagements involving multi-site analytics ecosystems, standardized tagging architectures, and complex vendor governance models.
- Multi-site / multi-entity ecosystem: Global automobile retailer engagement covered Tiers 1, 2, and 3 across 2,400 dealer sites with 7 certified development providers, the closest direct analog to the TDDS scope.
- Standardized tagging architecture: Global Mobility Company engagement established a centralized, event-driven data layer standardizing data collection across all digital experiences, directly mirroring the TMNA data layer spec objective.
- OEM with franchise model: Automotive experience across Volkswagen, Ford, Hyundai (Nicholas von Hahn) and Nissan/Infiniti (Adriana Grajales at Critical Mass) demonstrates familiarity with OEM-to-dealer governance structures and multi-tier analytics accountability.
- Multi-brand governance: Global Rental Car Company engagement unified analytics across three brands into a single trusted Adobe ecosystem, analogous to the TDDS challenge of unifying 41 certified provider implementations into one TMNA-owned data architecture.
- Two named references available for direct contact by TMNA prior to or following proposal evaluation
- Analogous case documentation across multi-site tagging, multi-vendor governance, and OEM franchise models
- Direct automotive team experience: references who can speak to Merkle's approach in dealer and automotive contexts specifically
Additional references available: Merkle can provide additional client references upon request, including from healthcare, financial services, telecommunications, and retail engagements involving large-scale analytics standardization and vendor governance programs.