User identifiers association

This document import allows you to merge user points by associating their user identifiers. Each line in the document represents a different user identifiers association.

This is only supported for datamarts using user point system version v201812 or later.

How-to

  1. Use the bulk import endpoints to create a document import with the USER_IDENTIFIERS_ASSOCIATION_DECLARATIONS document type and the APPLICATION_X_NDJSON mime type. Only ndjson data is supported for this import.

  2. Create an execution with your commands formatted in ndjson.

User identifiers association command

Each line will create/merge a user point that has all the specified identifiers.

| field | type | description |
| --- | --- | --- |
| identifiers | UserIdentifierResource[] | An array of user identifier resources of any type. |

A user identifier resource can have one of three shapes: email, user agent, or user account. They correspond to the different types of user identifiers.

Email

| field | type | description |
| --- | --- | --- |
| type | "USER_EMAIL" | The type of the identifier. |
| hash | String | A hash of the email. The hashing function should be unique per datamart. |
| email | String (optional) | The email address. |
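As an illustration only, an email hash is usually produced by normalising the address and then applying a cryptographic hash. The normalisation and algorithm below (lowercasing and SHA-256) are assumptions made for the sake of the example; use whichever hashing function is in place for your datamart.

# Illustrative only: lowercase the address, then hash it with SHA-256.
# Replace with the hashing function actually used in your datamart.
printf '%s' 'jane.doe@example.com' | tr '[:upper:]' '[:lower:]' | sha256sum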

User Agent

| field | type | description |
| --- | --- | --- |
| type | "USER_AGENT" | The type of the identifier. |
| user_agent_id | String | The user agent ID. |

User Account

| field | type | description |
| --- | --- | --- |
| type | "USER_ACCOUNT" | The type of the identifier. |
| user_account_id | String | The user account ID. |
| compartment_id | String (optional) | The compartment ID. If you don't provide it, the default compartment of the datamart is used. |
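For instance, the following command line (placeholder values) associates a user account with an email hash. As no compartment_id is provided, the user account ID is resolved against the default compartment of the datamart.

{ "identifiers": [ { "type": "USER_ACCOUNT", "user_account_id": "<USER_ACCOUNT_ID>" }, { "type": "USER_EMAIL", "hash": "<EMAIL_HASH>" } ] }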

Example

# Create the document import
curl -X POST \
  https://api.mediarithmics.com/v1/datamarts/<DATAMART_ID>/document_imports \
  -H 'Authorization: <YOUR_API_TOKEN>' \
  -H 'Content-Type: application/json' \
  -d '{
	"document_type": "USER_IDENTIFIERS_ASSOCIATION_DECLARATIONS",
	"mime_type": "APPLICATION_X_NDJSON",
	"encoding": "utf-8",
	"name": "<YOUR_DOCUMENT_IMPORT_NAME>"
}'
# Create the execution
curl -X POST \
  https://api.mediarithmics.com/v1/datamarts/<DATAMART_ID>/document_imports/<DOCUMENT_IMPORT_ID>/executions \
  -H 'Authorization: <YOUR_API_TOKEN>' \
  -H 'Content-Type: application/x-ndjson' \
  -d '
    { 
      "identifiers":[
        { "type": "USER_EMAIL", "hash":"<EMAIL_HASH>" }, 
        { "type": "USER_AGENT", "user_agent_id": "<USER_AGENT_ID>" }, 
        { "type": "USER_ACCOUNT", "user_account_id": "<USER_ACCOUNT_ID>",  "compartment_id": "<COMPARTMENT_ID>" }
      ]
    }
  '

You can, of course, combine different identifier types in the same command. Please note that the uploaded data is ndjson, not JSON: the commands are not separated by commas, but by a line separator \n.
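For illustration, here is a sketch of the same upload driven from a file, assuming a hypothetical associations.ndjson file that holds one association command per line (all values are placeholders):

{ "identifiers": [ { "type": "USER_EMAIL", "hash": "<EMAIL_HASH_1>" }, { "type": "USER_ACCOUNT", "user_account_id": "<USER_ACCOUNT_ID_1>" } ] }
{ "identifiers": [ { "type": "USER_AGENT", "user_agent_id": "<USER_AGENT_ID_2>" }, { "type": "USER_ACCOUNT", "user_account_id": "<USER_ACCOUNT_ID_2>", "compartment_id": "<COMPARTMENT_ID>" } ] }

# Create the execution from the file. --data-binary preserves the \n separators;
# curl's -d option would strip newlines when reading from a file.
curl -X POST \
  https://api.mediarithmics.com/v1/datamarts/<DATAMART_ID>/document_imports/<DOCUMENT_IMPORT_ID>/executions \
  -H 'Authorization: <YOUR_API_TOKEN>' \
  -H 'Content-Type: application/x-ndjson' \
  --data-binary @associations.ndjson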
