Datamart replication

The datamart replication feature allows you to replicate all the data ingested by mediarithmics into an external solution of your choice. For now, we integrate with:

  • Google Cloud Platform - Pub/Sub

  • Microsoft Azure - Event Hubs (Alpha)

This module is not included in the default plan. Contact your Account manager to activate it.

How it works

We replicate the update and delete operations from Pionus for the following objects: UserPoint, UserAgent, UserDevicePoint, UserDeviceTechnicalId, UserActivity, UserSegment, UserProfile, UserAccount, UserEmail and UserPointParent.

We replicate all of these operations. No filtering is available.

Replication status

Your replication can be in one of the following statuses:

  • ACTIVE: All data processed by Pionus will be replicated to your external solution.

  • PAUSED: No data processed by Pionus will be replicated to your external solution.

  • ERROR: The system is no longer able to replicate messages. In this case, check your external solution (expired instance, invalid credentials, etc). If you can't find anything wrong, please contact your Account manager.

Initial synchronization

You can run an initial synchronization so that already existing data within the mediarithmics platform can be replicated.

All your active replications will receive a set of UPDATE operations representing all existing elements of your datamart, for example UserPoint and UserActivity objects.

Please note that if you run an initial synchronization you might receive a large volume of messages. Processing them can be expensive, depending on your cloud provider.

Output messages

We convert Pionus operations into a standardized output format: operation = {ts, doc_type, doc_id, op, value}

| Field | Type | Comment |
| --- | --- | --- |
| ts | Timestamp (Long) | The mutation date |
| doc_type | String | The object type: UserActivity, UserProfile, UserSegment, UserAgent, UserDevicePoint, UserDeviceTechnicalId, UserAccount, UserEmail, UserPoint or UserPointParent |
| doc_id | String | The object unique id. The format varies depending on doc_type |
| op | String | The operation type: UPDATE or DELETE |
| value | JSON Object | The object value. The format varies depending on doc_type |
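For reference, the same envelope can be written down as a typed structure. This is a minimal sketch based on the table above (the class name ReplicationOperation is our own, not part of the mediarithmics API):

from typing import Any, Dict, Literal, TypedDict


class ReplicationOperation(TypedDict):
    """One replicated operation, following the field table above."""
    ts: int                          # mutation date (epoch timestamp)
    doc_type: str                    # UserActivity, UserProfile, UserSegment, ...
    doc_id: str                      # unique id, format depends on doc_type
    op: Literal["UPDATE", "DELETE"]  # operation type
    value: Dict[str, Any]            # object value, format depends on doc_type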

Object formats based on doc_type

| doc_type | doc_id | value |
| --- | --- | --- |
| UserPoint | {{user_point_id}} | Empty (you already have the user_point_id in the doc_id) |
| UserAgent | {{user_point_id}}:{{vector_id}} | Browser info and device info |
| UserDevicePoint | {{user_point_id}}:{{user_device_point_id}} | Browser info and device info |
| UserDeviceTechnicalId | {{user_point_id}}:{{user_device_point_id}}:{{user_device_technical_id}} | Empty (you already have the user_device_technical_id in the doc_id) |
| UserActivity | {{user_point_id}}:{{user_activity_id}} | Detailed activity |
| UserSegment | {{user_point_id}}:{{segment_id}} | Creation timestamp and last modification timestamp |
| UserProfile | {{user_point_id}}:{{compartment_id}}:{{user_account_id}} | Detailed profile |
| UserAccount | {{user_point_id}}:{{compartment_id}}:{{user_account_id}} | Empty (you already have the user_account_id in the doc_id) |
| UserEmail | {{user_point_id}}:{{email_hash}} | User's email hash |
| UserPointParent | {{user_point_id}} | The ID of the UserPoint it was merged into (the oldest one, which is kept). Message: <current_user_point_id> merged with <the_kept_user_point_id> |
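As an illustration, the doc_id formats above can be split back into their components on the consumer side. The following helper is a hedged sketch (the function name and returned keys are our own convention):

def split_doc_id(doc_type: str, doc_id: str) -> dict:
    """Split a doc_id into named parts, following the formats listed above."""
    if doc_type in ("UserPoint", "UserPointParent"):
        return {"user_point_id": doc_id}
    user_point_id, _, rest = doc_id.partition(":")
    if doc_type in ("UserProfile", "UserAccount"):
        compartment_id, _, user_account_id = rest.partition(":")
        return {"user_point_id": user_point_id,
                "compartment_id": compartment_id,
                "user_account_id": user_account_id}
    if doc_type == "UserEmail":
        return {"user_point_id": user_point_id, "email_hash": rest}
    # UserAgent, UserDevicePoint, UserDeviceTechnicalId, UserActivity, UserSegment:
    # the remainder may itself contain ':' separated segments (e.g. 'udp:', 'mum:'),
    # so it is kept as an opaque suffix here.
    return {"user_point_id": user_point_id, "suffix": rest}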

Examples

A new activity will trigger a replicated UserActivity operation. You will receive a message in your external solution similar to the following example.

{
   "ts": 1572947762,
   "doc_type": "UserActivity",
   "doc_id": "XXXXXXX-XXXX-XXX-XXXXXXXX:XXXXXX-XXXXX-XXXX-XXXX-XXXXXXXXXX",
   "op":" UPDATE",
   "value":{
       "$type":"SITE_VISIT",
       "$source":"XXXX",
       "etc": "etc"
   }
}

A new user agent will trigger a replicated UserAgent operation like the one below.

{
   "ts":1676627112685,
   "doc_type":"UserAgent",
   "doc_id":"4700c85f-17e3-4304-aa7f-dc140173b08d:vec:32453299893",
   "op":"UPDATE",
   "value":{
      "$os_family":"LINUX",
      "$brand":null,
      "$os_version":null,
      "$form_factor":"PERSONAL_COMPUTER",
      "$carrier":null,
      "$model":null,
      "$creation_ts":0,
      "$browser_family":"FIREFOX"
   }
}

A new user device point will trigger a replicated UserDevicePoint operation like the one below.

{
   "ts":1676627112685,
   "doc_type":"UserDevicePoint",
   "doc_id":"4700c85f-17e3-4304-aa7f-dc140173b08d:udp:-32453299893",
   "op":"UPDATE",
   "value":{
      "$os_family":"LINUX",
      "$brand":null,
      "$os_version":null,
      "$form_factor":"PERSONAL_COMPUTER",
      "$carrier":null,
      "$model":null,
      "$creation_ts":0,
      "$browser_family":"FIREFOX"
   }
}

A new user device technical id will trigger replicated UserDeviceTechnicalId operations like the ones below.

// example with a MumId
{
   "ts":1676627112685,
   "doc_type":"UserDeviceTechnicalId",
   "doc_id":"4700c85f-17e3-4304-aa7f-dc140173b08d:udp:-32453299893:mum:7231822539",
   "op":"UPDATE",
   "value":{}
}

// example with an installationId
{
   "ts":1676627112685,
   "doc_type":"UserDeviceTechnicalId",
   "doc_id":"4700c85f-17e3-4304-aa7f-dc140173b08d:udp:-32453299893:ins:1001:aZmFhOTVlM2ItMGRhOC00NDZlLWFhODMtNjZlZGI0YjNiNTk2",
   "op":"UPDATE",
   "value":{}
}
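To make the handling of these messages concrete, here is a minimal, hedged sketch of a consumer-side dispatcher that decodes an operation and routes it by op and doc_type. The function name and the print statements are placeholders for your own logic; the transport-specific reception is covered in the provider sections below.

import json


def handle_operation(raw_message: bytes) -> None:
    """Decode one replicated operation and route it by op / doc_type."""
    operation = json.loads(raw_message)
    op = operation["op"]             # "UPDATE" or "DELETE"
    doc_type = operation["doc_type"]
    doc_id = operation["doc_id"]

    if op == "DELETE":
        print(f"delete {doc_type} {doc_id}")
    elif doc_type == "UserActivity":
        activity = operation["value"]    # detailed activity ($type, $source, ...)
        print(f"upsert activity {doc_id} of type {activity.get('$type')}")
    else:
        print(f"upsert {doc_type} {doc_id}")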

Upgrade of datamarts to user_point_system_version v202205

For datamarts with a user_point_system_version prior to v202205, device identifiers are stored as User Agents and replicated as UserAgent operations (doc_id example: 4700c85f-17e3-4304-aa7f-dc140173b08d:vec:32453299893).

However, for datamarts leveraging the user_point_system_version v202205, device identifiers are stored as User Device Points and User Device Technical Ids, and replicated through UserDevicePoint and UserDeviceTechnicalId operations.

In the case of a datamart that is upgraded to the user_point_system_version v202205:

  • New device identifiers are directly stored and replicated using the device point formats,

  • Existing device identifiers that were previously stored in the UserAgent format are progressively migrated.

This migration is seamless within the datamart; however, it is reflected in your datamart replication. For each migrated device identifier, you will receive:

  • A DELETE operation with the doc_type UserAgent

  • Two UPDATE operations with doc_type UserDevicePoint and doc_type UserDeviceTechnicalId

For instance, the migration of a user agent with doc_id 4700c85f-17e3-4304-aa7f-dc140173b08d:vec:7231822539 will produce:

  • 1 DELETE operation with doc_type UserAgent and the same doc_id

  • 2 UPDATE operations:

    • 1 with doc_type UserDevicePoint and the following doc_id: 4700c85f-17e3-4304-aa7f-dc140173b08d:udp:-32453299893

    • 1 with doc_type UserDeviceTechnicalId and the following doc_id 4700c85f-17e3-4304-aa7f-dc140173b08d:udp:-32453299893:mum:7231822539

After the migration, no more UserAgent operations will be produced.
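If you mirror replicated objects in your own store, no special casing is needed for the migration: applying the three messages like any other operations yields the correct end state. A minimal sketch, assuming you key your store by (doc_type, doc_id):

from typing import Dict, Tuple

# In-memory mirror keyed by (doc_type, doc_id). The migration's UserAgent DELETE
# and the two UPDATE operations are applied exactly like any other operation.
store: Dict[Tuple[str, str], dict] = {}


def apply_operation(operation: dict) -> None:
    key = (operation["doc_type"], operation["doc_id"])
    if operation["op"] == "DELETE":
        store.pop(key, None)     # idempotent: ignore if already absent
    else:                        # UPDATE acts as an upsert
        store[key] = operation["value"]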

Setting up replications

Prerequisites

You need to have an instance of the external solution where you want to replicate your mediarithmics data.

Depending on the external solution, you will need to fulfill some requirements.

Google Pub/Sub

You will need a Google Cloud project with a Pub/Sub topic to receive the replicated messages, and a Service Account allowed to publish to it (its credentials.json key file will be uploaded when configuring the replication).

Click on Create Service Account:

Give your service account a name, select the right account access (Pub/Sub Publisher, Pub/Sub Editor) and save.

Once your Service Account is created, you can generate your key:

credentials.json file example:

{
  "type": "service_account",
  "project_id": "xxx-xxx-xx",
  "private_key_id": "xxxxxxx",
  "private_key": "-----BEGIN PRIVATE KEY-----\n xxxxxxx \n-----END PRIVATE KEY-----\n",
  "client_email": "xxx@project_id.iam.gserviceaccount.com",
  "client_id": "xxxxxx",
  "auth_uri": "https://accounts.google.com/o/oauth2/auth",
  "token_uri": "https://oauth2.googleapis.com/token",
  "auth_provider_x509_cert_url": "https://www.googleapis.com/oauth2/v1/certs",
  "client_x509_cert_url": "https://www.googleapis.com/robot/v1/metadata/x509/projetc_id.iam.gserviceaccount.com"
}

NOTE: Here is the Google Pub/Sub Pricing documentation: https://cloud.google.com/pubsub/pricing
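Once the topic and the service account key are in place, you can check that replicated messages arrive by pulling from a subscription attached to the topic. The following is a minimal sketch using the google-cloud-pubsub client; the project id, subscription id and key file path are placeholders to adapt to your setup.

# pip install google-cloud-pubsub
from google.cloud import pubsub_v1
from google.oauth2 import service_account

credentials = service_account.Credentials.from_service_account_file("credentials.json")
subscriber = pubsub_v1.SubscriberClient(credentials=credentials)
subscription_path = subscriber.subscription_path("my-project-id", "my-replication-subscription")

# Pull a few messages; each message payload is one replicated operation in JSON.
response = subscriber.pull(request={"subscription": subscription_path, "max_messages": 10})
for received in response.received_messages:
    print(received.message.data.decode("utf-8"))
    subscriber.acknowledge(request={"subscription": subscription_path,
                                    "ack_ids": [received.ack_id]})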

Microsoft Azure Event Hubs (Alpha)

You will need an Event Hubs namespace with an event hub to receive the replicated messages, and a connection string with send permissions (provided as a credentials.txt file).

credentials.txt file example:

Endpoint=sb://<FQDN>/;SharedAccessKeyName=<KeyName>;SharedAccessKey=<KeyValue>

NOTE: Here is the Microsoft Azure Event Hubs Pricing documentation: https://azure.microsoft.com/en-us/pricing/details/event-hubs/
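Similarly, you can check your Event Hubs setup by receiving a few events with the azure-eventhub client. This is a hedged sketch; the connection string, event hub name and consumer group are placeholders to adapt to your setup.

# pip install azure-eventhub
from azure.eventhub import EventHubConsumerClient

CONNECTION_STR = "Endpoint=sb://<FQDN>/;SharedAccessKeyName=<KeyName>;SharedAccessKey=<KeyValue>"

client = EventHubConsumerClient.from_connection_string(
    CONNECTION_STR,
    consumer_group="$Default",
    eventhub_name="<EventHubName>",
)


def on_event(partition_context, event):
    # Each event body is one replicated operation in JSON.
    print(event.body_as_str())


# Blocks and keeps receiving; "-1" starts from the beginning of the stream.
with client:
    client.receive(on_event=on_event, starting_position="-1")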

Listing your replications

You can access replications in the datamart settings in your navigator application.

  1. Select the organisation that contains the datamart you want to replicate.

  2. Click on Settings.

  3. Click on the Datamarts tab and then click the Datamart menu entry.

  4. Select the datamart you want to replicate.

  5. In the Replications subtab, you will see a table dedicated to your Datamart Replications.

Creating & starting a replication

To create a new replication:

  1. Go to the Replications subtab.

  2. Click New Replication.

  3. Select a Replication type matching the external solution of your choice.

  4. Complete the configuration information (see Prerequisites for help with the advanced fields).

  5. Click Select a File to upload your credentials file and click on Update.

  6. Click Save Replication to create your new replication.

  7. You will see your new replication in the Replications subtab.

Example for Google Pub/Sub:

When a Replication is created, its status is automatically set to Paused. To start your replication, you will have to activate it. If the system can't replicate your datamart on activation, you will see an error.

When a replication can't be activated, it is usually due to an error in the credentials, so you might want to verify your replication configuration and your credentials file first.

Activating / pausing a replication

You can change the replication status using the status button.

If the system is no longer able to replicate the messages, the replication status will be set to ERROR. In this case, check your external solution (expired instance, invalid credentials, etc). If you can't find anything wrong, please contact your Account manager.

In case of an error with your external solution, you will need to recreate your replication to rebind it to a new, working external solution with valid credentials and the correct configuration.

Executing an initial synchronization

A dashboard listing every Initial Synchronization that was done on your datamart is available in the same Replications subtab.

For now, you will have to ask your Account manager to run an initial synchronization. Later, you will be able to run an initial synchronization yourself by clicking New Execution. You must have at least one active replication and you can't run an initial synchronization more than once a week.

We replicate all operations. No filtering is possible.

Please note that you might receive a large volume of messages while running an initial synchronization. Processing them can be expensive, depending on your cloud provider.

While an initial synchronization is running, you can't change the status of your replications. The initial synchronization will only replicate the data for active replications.

Active replications keep running during an initial synchronization, so messages from the initial synchronization and live messages (tags, imports, etc.) are mixed.
