Description of Methodology

Last update: 11/21/2025

Introduction

Overview of CheckedUp and its Proof of Play Report Service

CheckedUp is a healthcare-focused, physician-designed point-of-care (POC) digital media platform that delivers targeted educational content and advertisements through strategically placed screens in healthcare provider (HCP) offices. CheckedUp’s network is designed to engage patients in waiting rooms, providing health education and pharmaceutical advertisements that are relevant to their conditions and treatment options.

To ensure accountability and provide value to advertisers, CheckedUp generates Proof of Play Reports, which confirm that contracted advertisements are displayed according to the terms of the media agreement. These reports provide detailed insights on ad performance, verifying that the ad plays align with expectations set by our clients. 

Media agreements are based on reaching a specified number of locations in each month of a campaign. A location is considered reached if at least one ad play occurs at that location during the given month.

Purpose and Scope of this DOM

This Description of Methodology (DOM) outlines CheckedUp’s procedures for measuring, verifying and reporting advertising content delivery across its POC network. It establishes standardized methods aligned with industry practices to ensure accurate, transparent and consistent campaign reporting.

The DOM covers all stages of the advertising delivery process, from ad insertion to proof of play reporting, including:

  • Processes for monitoring and reporting ad plays across HCP practice locations.
  • Methods for data capture and reporting.
  • Generation of performance reports providing clients with advertising campaign data.

CheckedUp is committed to maintaining transparency in its operational processes. The practices outlined in this document represent CheckedUp’s current methodologies for ad delivery and reporting. As the POC advertising industry evolves and as our operations change, CheckedUp may update these practices to align with emerging standards, technological advancements and operational adjustments.

This DOM serves as the authoritative source for the definitions, metrics, and data collection methods used in CheckedUp’s Proof of Play reporting. Clients are advised to reference this document to understand report content, limitations, and applicable methodology.

CheckedUp is currently undergoing MRC audit review. As of this version, the ad play reporting service is not yet accredited by the Media Rating Council (MRC). This DOM describes the methodology and controls currently under MRC review. All client reports reference this DOM for transparency and consistency with industry standards.

Measurement Overview

Description of What the Proof of Play Report Measures

The Proof of Play Report validates the delivery and performance of digital advertising campaigns displayed across CheckedUp’s POC network of TV screens. The report confirms the number of HCP locations where campaign ads were played, reports the total number of ad plays at each location, and provides detailed counts of healthcare professionals practicing at each location. These metrics demonstrate that the media buy was fulfilled according to the campaign’s goals and give advertisers clear visibility into their campaign’s performance.

Definition of in-scope product lines (under MRC review)

  • Explorer Waiting Room TV:
    A digital, screen-based media product consisting of CheckedUp-managed television displays installed in point-of-care (POC) healthcare locations nationwide, primarily positioned within patient waiting rooms.

    While the product is designed for waiting room environments, a small percentage of devices may be installed in alternate on-site locations due to site-specific layout constraints. These installations represent approximately 0.7% of reported TV Video Ad Plays (as of October 2025), and remain in scope for measurement and reporting within the same Explorer Waiting Room TV product line.

Definition of in-scope metrics (under MRC review)

  • Explorer Waiting Room TVs:
The count of CheckedUp Explorer TV screens installed at each listed location and active in the given reporting period.
  • TV Video Ad Plays:
    The count of individual instances where the selected ad was displayed on CheckedUp’s Explorer Waiting Room TV screens across all locations, counting only ads that ran until completion (which are the only ads logged in the system).

Definition of additional reported fields (not in scope for MRC review)

  • Contracted Locations: The number of healthcare provider locations contracted for ad delivery as defined in the media agreement.
  • Delivered Locations: The number of healthcare provider locations where the ad was successfully played; compared against the number of contracted locations to assess campaign delivery.
  • Practice Name: Name of the practice that the location falls under.
  • Location Name: The name of the location.
  • MD Count: Number of Medical Doctors (MD) at each reported location.
  • OD Count: Number of Doctors of Optometry (OD) at each reported location.
  • NP Count: Number of Nurse Practitioners at each reported location.
  • PA Count: Number of Physician Assistants at each reported location.
  • Address: Street, City, State, and Zip Code.
  • Month: The specified month used for report filtering.
  • Rich Media Ad Plays: Total number of Rich Media Ad Plays identified within the time period specified in report filtering.
  • Total Ad Count per Location: The number of times an ad was played at each location during the reporting period, indicating localized performance. The sum of TV Video Ad Plays and Rich Media Ad Plays.

Business Partners for Measurement

CheckedUp engages third-party service providers for various operational and measurement functions, including CMS development, IT infrastructure management, and external data validation.

Measurement Methodology

Data Collection Process

CheckedUp’s data collection process captures ad play events directly from its network of devices installed at healthcare provider (HCP) locations. Each time an advertisement is played, the system logs an event, recording information such as the timestamp, device identifier and ad content. This data is transmitted securely to CheckedUp’s event database using encrypted communication protocols.

    • Transmission Technology: Ad play data is transmitted via secure API connections to CheckedUp’s servers, which handle real-time event logging. The system logs ad play completion events once an ad has fully played on the device.
    • Offline Handling: If a device goes offline, ad play events are stored locally on the device for up to 60 days. When the device reconnects to the network, the stored events are transmitted to the server, ensuring no data is lost during periods of downtime.
    • Time Zone Management: Time zones are managed at the device level through a dedicated configuration setting on each TV. The CheckedUp CMS automatically assigns a time zone to each device based on the physical address of the location to which it is assigned, and this time-zone designation is synchronized to the device during routine check-ins. While the CMS-assigned time zone serves as the system default, CheckedUp may manually adjust a device’s time-zone setting in rare cases where an override is required to correct an erroneous or incomplete configuration.

      All time-based processing—including timestamp interpretation and office-hours eligibility—uses the device’s assigned local time zone to ensure accurate evaluation of ad plays relative to each healthcare location’s operating hours.
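The local-time interpretation described above can be sketched as follows. This is an illustrative example only: it assumes events are logged with UTC timestamps and that the CMS-assigned time zone is an IANA zone name; the helper and field values are hypothetical, not CheckedUp’s actual implementation.

```python
from datetime import datetime
from zoneinfo import ZoneInfo


def localize_event(utc_timestamp: datetime, device_tz: str) -> datetime:
    """Convert a UTC event timestamp into the device's assigned local time zone.

    device_tz stands in for the IANA zone name the CMS would assign from the
    location's physical address (e.g. "America/New_York").
    """
    return utc_timestamp.replace(tzinfo=ZoneInfo("UTC")).astimezone(ZoneInfo(device_tz))


# A play logged at 18:30 UTC at an Eastern-time location in October
local = localize_event(datetime(2025, 10, 6, 18, 30), "America/New_York")
# October 2025 is EDT (UTC-4), so the local wall-clock time is 14:30
```

Evaluating office-hours eligibility against this localized time, rather than the raw UTC timestamp, is what keeps a 6:30 PM UTC play from being misclassified at a location whose local clock reads 2:30 PM.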

Ad Play Logging

Ad plays are recorded only after an ad has fully completed; this post-play event methodology ensures that only completed plays are logged. The event data is stored in CheckedUp’s event tracker and synchronized with the server to ensure accurate reporting.

Auto-Play and Continuous Play Behavior

CheckedUp devices use a looping playlist model. Ads are not user-initiated; they are delivered via the playlist under autonomous device control, based on ad and campaign settings configured within the CheckedUp CMS. Because playback is automated, all measured plays reflect actual completion events captured at the device level, and no simulated or system-triggered filler events are counted. Play counts represent only completed ad playback events and are not influenced by playlist rotation frequency. 

Data Validation and Processing

Once ad play data is collected, it undergoes several validation steps to ensure the integrity and accuracy of the data:

  • Validation Checks: The system runs automated checks on ad play logs to ensure that all necessary fields (location, timestamp, and ad identifier) are present and accurate. Discrepancies such as missing or duplicate events are flagged for review.
  • Duplicate Event Detection: During data ingestion, each inbound ad play event is assigned a composite database key generated from three required fields: device_id, timestamp, and ad_id. When an incoming event matches an existing record on all three fields, the system classifies it as a duplicate. Incoming events flagged as duplicates are automatically rejected and not written to the events database, ensuring that only unique, valid completion events are stored and used for reporting.
  • Error Detection and Resolution: CheckedUp uses Sentry, an error-monitoring tool, to capture and analyze system errors. This tool sends real-time alerts when errors occur, allowing the technical team to intervene promptly. Errors such as missing logs or incorrect data transmission are addressed by generating field cases to recover inactive devices.
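The duplicate-detection rule above can be sketched with a minimal example. The dict standing in for the events database and the field values are illustrative only; the composite key mirrors the three required fields named in the text.

```python
def ingest_event(events_db: dict, event: dict) -> bool:
    """Insert an ad play event unless its composite key already exists.

    The composite key is built from the three required fields
    (device_id, timestamp, ad_id); a match on all three marks the
    inbound event as a duplicate, which is rejected and not stored.
    """
    key = (event["device_id"], event["timestamp"], event["ad_id"])
    if key in events_db:
        return False  # duplicate: reject, do not write
    events_db[key] = event
    return True


db = {}
e = {"device_id": "TV-001", "timestamp": "2025-10-06T14:30:00", "ad_id": "AD-42"}
ingest_event(db, e)        # first arrival: stored
ingest_event(db, dict(e))  # retransmission after reconnect: rejected as duplicate
```

This keying scheme is what makes the offline-handling path safe: a device that re-sends buffered events after reconnecting cannot inflate play counts, because the re-sent events collide on the same composite key.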

Data Editing and Filtering

CheckedUp maintains detailed documentation of its data pipeline and filtration procedures. This documentation is made available to clients upon request.

Time-Based Eligibility Feature

Definition of Office Hours

CheckedUp maintains a record of office hours for each location in its network. These hours are collected by our internal client relations team before each new location is installed, verified by our internal operations team after installation, and reverified by client relations quarterly thereafter.

If office hours are missing or incomplete for a location for any reason, CheckedUp applies default hours of 8 AM to 6 PM local time.

Time-Based Eligibility Filtration

CheckedUp only reports ad plays that occur during standard office hours at each HCP location. Using the timestamp captured during ad play logging, each play is evaluated against a location-specific schedule to confirm it occurred during defined operational hours. Plays falling outside of these windows are excluded from reporting, ensuring that only ad plays with a legitimate opportunity to be seen are included.

The time-based eligibility filtration process is an automated, periodic task that is executed once per day against the ad events that occurred on the prior day.

Internal Traffic Filtration

CheckedUp only reports ad plays that occur on production devices installed in point-of-care healthcare offices. Ad plays generated from internal test or demonstration devices are excluded from all in-scope metrics. Test devices are not removed during data ingestion; instead, these devices are manually designated as “Demo Devices” within the CheckedUp CMS, and this designation is applied during the reporting process to filter out all associated ad play events. Because this filtration occurs at the reporting stage, data from demo devices may exist within the event database but is systematically excluded from the calculation of in-scope metrics defined in Section 2.3.
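A reporting-stage filter of this kind can be sketched as below. The field names (`is_demo_device`, `device_id`) are illustrative placeholders, not the actual CMS schema.

```python
def reportable_events(events: list, devices: list) -> list:
    """Exclude plays from devices flagged as Demo Devices in the CMS.

    The filter runs at reporting time: demo-device events may still
    exist in the event store, but they never enter in-scope metrics.
    """
    demo_ids = {d["device_id"] for d in devices if d.get("is_demo_device")}
    return [e for e in events if e["device_id"] not in demo_ids]


devices = [
    {"device_id": "TV-001", "is_demo_device": False},  # production unit
    {"device_id": "TV-999", "is_demo_device": True},   # internal demo unit
]
events = [
    {"device_id": "TV-001", "ad_id": "AD-42"},
    {"device_id": "TV-999", "ad_id": "AD-42"},
]
# Only the production device's play survives filtration
```

Because the flag is applied manually, the accuracy of this filter depends on correct device classification, which is the risk addressed in the limitations section below.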

Metric Calculation

    • TV Video Ad Plays: The count of individual instances where the selected ad was displayed on CheckedUp’s TV screens across all locations, counting only ads that ran until completion (which are the only ads logged in the system).
    • Delivered Locations: Identified by counting the unique location IDs associated with ad play events during the reporting period. This count is compared against the number of contracted locations to assess delivery.
    • Total Ad Play Count per Location: Determined by calculating the total number of ad plays per location during the reporting period, which provides insights into localized campaign performance.
    • HCP Count: Number of healthcare professionals at a given location collected from first party and third party data.
    • Explorer Waiting Room TV Count: For each unique Location included in the report, this metric reflects the number of unique Device IDs that are both installed at that location (per installation status in the CheckedUp CMS) and active during the reporting month (as indicated by the device’s last-activity timestamp in the CheckedUp CMS). This metric represents operational devices within the network and does not necessarily correspond to the number of devices that recorded plays of the selected ad, as a device may be active without delivering that specific campaign asset.
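The play-based metrics above can be sketched as a simple aggregation over validated, filtered events. This is illustrative only: the `location_id` field name is hypothetical, and the Explorer Waiting Room TV Count is intentionally not derived here because, per the definition above, it comes from CMS installation and activity records rather than from play events.

```python
from collections import Counter


def summarize_plays(events: list) -> tuple:
    """Aggregate validated, filtered ad play events into per-location totals.

    Returns the play count per location and the Delivered Locations count
    (unique locations with at least one play in the period).
    """
    plays = Counter(e["location_id"] for e in events)
    delivered = len(plays)
    return dict(plays), delivered


events = [
    {"location_id": "LOC-A"},
    {"location_id": "LOC-A"},
    {"location_id": "LOC-B"},
]
per_location, delivered_locations = summarize_plays(events)
# per_location -> {"LOC-A": 2, "LOC-B": 1}; delivered_locations -> 2
```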

Error Handling and Data Integrity

CheckedUp maintains several layers of error handling to ensure data integrity:

  • Automated Alerts: The system generates alerts when devices go offline, allowing the CheckedUp team to take immediate action. In the event of prolonged outages, manual reviews are conducted to assess the impact on campaign delivery.
  • Manual Data Review: Campaign analytics teams review ad play logs regularly to ensure accuracy, running reports after campaign launches and on a weekly basis to identify any discrepancies in the data.

Reporting Methodology

Report Generation Process

CheckedUp’s Proof of Play Reports are generated through a structured process designed to ensure accuracy and transparency. Each TV ad play event is logged, validated and then processed into a report that summarizes key metrics for clients. Reports are generated from data logged directly by CheckedUp devices and processed through the CMS and internal reporting database. Clients do not access data directly or manipulate reports within a dashboard system.

  • Data Processing: Once ad play events are collected, the data undergoes several validation steps to ensure accuracy. This includes verifying timestamps, location IDs, device health and ad identifiers.
  • Scheduled Reporting: Ad play data is captured in real-time, but reports are typically generated at the end of a monthly reporting period.

Report Formats and Delivery

  • Standard Monthly Reports: CheckedUp delivers standard Proof of Play reports on a monthly basis. These reports summarize the key metrics, including total ad plays, delivered locations, ad frequency and device counts. Each report is tailored to the client’s campaign, highlighting the performance across the contracted HCP locations.
  • Report Delivery: Reports are delivered electronically in PDF format and include links to access location-based data in Excel format.

Quality Assurance and Validation

Before reports are delivered to clients, the data undergoes a series of quality control checks:

  • Data Integrity Checks: All reported data is cross validated with the original event logs to ensure that no data is missing, duplicated or incorrectly recorded.
  • Anomaly Detection: Any discrepancies, such as locations with unexpected ad play frequencies or inactive devices, are flagged and investigated prior to report finalization. The support team addresses device-related issues to ensure accurate reporting.
  • Manual Review: In addition to automated checks, a manual review is conducted by the campaign analytics team to verify that all data aligns with the campaign goals and media agreements.

Reporting Transparency

CheckedUp is committed to providing transparent and reliable data to its clients. Each report clearly outlines the metrics used, the methodology behind data collection and any limitations encountered during the reporting period. If any discrepancies are found, CheckedUp works closely with clients to resolve them in a timely manner.

Limitations and Restrictions

Known Limitations of the Methodology

  1. Data Transmission Delays: There is a lag between the event database collecting ad play events and those events becoming available in the reporting database when ad plays are pulled for client delivery.
  2. Offline Device Periods: A device may have played an ad while offline, in which case the associated ad play events are stored locally and are not transmitted to the events database until the device reconnects to the network. These late-arriving events may be received and logged up to 60 days after the original playback. Although CheckedUp does not routinely reissue prior-month counts solely for this reason, late-arriving events may cause ad play totals for earlier months to differ slightly when viewed in subsequent Proof of Play reports, since each monthly report includes a cumulative summary of previous reporting periods.
  3. Incomplete Event Logging: If a required field (location, timestamp, or ad identifier) is missing, the event log will not accurately reflect the ad play.
  4. Time Zone Settings Errors: Device-level time-zone inaccuracies may cause devices to record ad play events with an incorrect local timestamp. When this occurs, time-based eligibility rules (such as office-hours filtering) may be applied incorrectly, resulting in some plays being either excluded from reports when they should be included, or conversely, included when they should be excluded. Ad play events already logged to the database retain the timestamp captured at the time of logging and are not retroactively corrected after a device’s time-zone setting is updated.
  5. Office Hours Exclusion Timing: As described in Section 4.5.1 (Time-Based Eligibility Feature), CheckedUp excludes ad plays that occur outside of recorded office hours. However, ad plays from devices that were offline at the time of playback may be received after the daily office-hours exclusion process has already run. In such cases, these late-arriving events may not be automatically excluded, even if they occurred outside of operational hours.
  6. Manual Demo Device Identification: Internal traffic filtration relies on the manual application of the “Demo Device” flag in the CMS by the operations team. This introduces a risk of false positives (removing legitimate traffic) or false negatives (including internal traffic) if manual errors occur. CheckedUp conducts periodic internal reviews of device classifications to mitigate this risk.

Updates and Changes

Methodology Update Process

CheckedUp reevaluates the accuracy of its data measurement on a regular cadence and updates its processes wherever errors are found or improvements are identified.

Methodological changes are tracked in CheckedUp’s Journal of Changes (JoC). Known errors in published figures are documented in a separate Reporting Error Log maintained for a minimum of two years.

Change Notifications

Clients will be notified of any methodology changes that result in a difference of ±5% or more to the in-scope metrics (Section 2.3), upon the issuance or reissuance of the affected report.

Restatements and Error Correction

If a reporting error is discovered after a report is issued, CheckedUp will assess whether the error materially affects campaign reporting. If so, the client will be notified and a corrected report will be issued. The internal Reporting Error Log tracks all such events, including cause, resolution, and disclosure decisions.