Description: This article describes the process Innovid follows to build an effective attribution solution with individual data partners and sets expectations for attribution timescales.

This document walks data partners through the Innovid data evaluation process and explains what happens at each step.


Data evaluation process

Four steps are required before attribution is ready. The tables in this section list the data and information that may be required; the exact requirements depend on each data evaluation project.

Typically, the evaluation process takes 8-12 weeks from receipt of data, though it can take longer if there are additional complexities.

[Diagram: Data evaluation process]


Step 1: Data and test campaign definition

The following tables outline what is required to kick off the data evaluation process.

a. Data details

Data type Description Who
Viewership (if applicable) Expected format/columns of viewership data for all devices in the panel and how they are linked to devices and outcomes data. Data partner
Ad impressions (if applicable) Expected format/columns of ad impressions data for linear or digital streaming impressions and how they should be linked to holdout/control groups and outcomes data. Data partner
Holdout groups (if applicable) Expected format/columns of holdout groups for ad impressions campaign data. Data partner
Control groups (if applicable) Expected format/columns of control groups for ad impressions campaign data, or partner-specific best practice for creating control groups. If none is provided, Innovid will use its default methodology to create control groups. Data partner
Sessions and outcomes (if applicable) Expected format/columns of outcomes data, if not using the Innovid web tag or 3rd party app data, and how they should be linked to viewership or impressions data. Data partner
Upscaling approach (if applicable) Partner-specific best practice for upscaling from panel to population (geographic, demographic, or both). If none is provided, Innovid will use its default methodology; a universe file may be required. Data partner
Filtering rules (if applicable) Partner-specific rules for filtering out data as needed (e.g., duplicates or outliers). This is expected to be done at the partner level; however, rules can also be implemented by Innovid as needed. Data partner + Innovid
Data dictionary Description of all provided data files, fields, and formats. Data partner
Data update cadence Expected file updates and how they should be treated (e.g., whether files are cumulative or recent files replace historic ones). Data partner + Innovid
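As an illustration of what a data dictionary entry might cover, the sketch below describes a hypothetical viewership file. All file and field names are examples only, not a required format.

    # Hypothetical data dictionary entry for a daily viewership file.
    # All names are illustrative; the actual format is agreed with Innovid.
    viewership_schema = {
        "file": "viewership_YYYYMMDD.csv",
        "fields": {
            "device_id": "string; panel device identifier, used as the matching key",
            "session_start": "UTC timestamp; start of the viewing session",
            "session_end": "UTC timestamp; end of the viewing session",
            "channel": "string; network or station watched",
        },
    }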


b. Data matching

This is critical to ensuring that all provided data can be linked appropriately to drive final attribution results.

Data type Description Who
Matching key(s) Based on the linking rules provided in the previous section, InnovidXP will match appropriate datasets. Innovid
Crosswalk Used for linking identifiers across datasets and keeping track of active/changing IDs over time. Innovid
Cadence The frequency of crosswalk updates depends on the data file cadence in the previous section and the appropriateness of deliverables/outcomes. Innovid
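For illustration, the sketch below shows how a crosswalk might link impressions to outcomes. The file and column names are hypothetical; the actual matching keys follow the linking rules agreed in Step 1.

    import pandas as pd

    # Hypothetical files and columns; actual keys depend on the agreed linking rules.
    impressions = pd.read_csv("impressions.csv")  # includes partner_device_id
    crosswalk = pd.read_csv("crosswalk.csv")      # maps partner_device_id to household_id
    outcomes = pd.read_csv("outcomes.csv")        # includes household_id

    # Link impressions to outcomes via the crosswalk.
    matched = (
        impressions
        .merge(crosswalk, on="partner_device_id", how="inner")
        .merge(outcomes, on="household_id", how="left")
    )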


c. Test campaign details

Depending on the proposed evaluation and outcomes, InnovidXP may want to evaluate the data across several customers and/or campaigns.

Campaign detail Description Who
Customer name Provide the name of the advertiser for the test campaign. Data partner and Innovid
Campaign name and duration Provide the campaign name and duration. Data partner and Innovid
Campaign schedule Confirm whether this is a retrospective campaign or one that is yet to run. Data partner and Innovid
Campaign type Confirm whether the campaign is OTT, linear, or both. Data partner
Expected window between impression and outcome Confirm the estimated gap between the impression being served and an outcome happening. Data partner
Campaign area Confirm whether the campaign is national (across the U.S.) or regional only. Data partner and Innovid

Step 2: Data loading and matching

The following table outlines what happens during the second step of the process:

Action Description Who
a. Receiving Agree on how the data will be sent, for example, to an S3 bucket. Data partner and Innovid
b. Loading Upload millions of rows of data to the system. Innovid
c. Matching Match outcomes to ad exposures via the crosswalk. Innovid
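As an example of the receiving step, a partner delivering files to an S3 bucket might use a sketch like the one below, assuming the boto3 library. The bucket name and key are hypothetical and would be agreed with Innovid.

    import boto3

    # Hypothetical bucket and key; the actual S3 location is agreed with Innovid.
    s3 = boto3.client("s3")
    s3.upload_file(
        Filename="viewership_20240101.csv",
        Bucket="partner-data-drop",
        Key="viewership/viewership_20240101.csv",
    )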

Step 3: Data science evaluation

Innovid’s data scientists analyze and evaluate the data to determine whether it:

  • Has any significant gaps
  • Contains outliers
  • Requires additional cleaning or transformations
  • Works with our models

If required, they also identify options for alternative models. A minimal sketch of typical gap and outlier checks follows.
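This sketch assumes pandas and hypothetical file and column names; it is illustrative only, not Innovid’s actual evaluation code.

    import pandas as pd

    df = pd.read_csv("viewership.csv", parse_dates=["session_start"])

    # Significant gaps: days with no records at all.
    daily_counts = df.set_index("session_start").resample("D").size()
    missing_days = daily_counts[daily_counts == 0]

    # Outliers: flag session durations outside 1.5x the interquartile range.
    q1, q3 = df["duration_minutes"].quantile([0.25, 0.75])
    iqr = q3 - q1
    outliers = df[(df["duration_minutes"] < q1 - 1.5 * iqr) |
                  (df["duration_minutes"] > q3 + 1.5 * iqr)]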

Step 4: System and model tuning

We fine-tune the model to work with the uploaded data. This includes:

  • Any upscaling (provided by the data partner in Step 1)
  • Any extra configuration, for example, extended response windows or special metrics such as Uplift
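For illustration, the sketch below shows one simple form of panel-to-population upscaling using a universe file. The file and column names are hypothetical, and a partner’s own best practice (or Innovid’s default methodology) may differ.

    import pandas as pd

    # Hypothetical inputs: panel household counts and a universe file, by region.
    panel = pd.read_csv("panel_counts.csv")  # columns: region, panel_households
    universe = pd.read_csv("universe.csv")   # columns: region, population_households

    # Weight each region so panel totals project to population totals.
    weights = panel.merge(universe, on="region")
    weights["weight"] = weights["population_households"] / weights["panel_households"]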

Attribution results

Innovid can present the results ‘offline’ in an Excel spreadsheet. Depending on the data, Innovid may also be able to create a test platform and present the attribution results ‘online’.


Related content

FAQs: Using the Product


