

User profiles import


Last updated 1 year ago


Use this feature to UPSERT or DELETE user profiles in your datamart.

How-to

Use the bulk import endpoints to create a document import with the USER_PROFILE document type and the APPLICATION_X_NDJSON mime type. Only ndjson data is supported for user profiles.

Then, create an execution with your user profile import commands formatted in ndjson.

User profile import command

Each line in the uploaded file can have the following properties:

| field | type | description |
| --- | --- | --- |
| operation | Enum | Either UPSERT or DELETE |
| compartment_id | String (Optional) | The Compartment ID, acting as a user identifier in correlation with user_account_id |
| user_account_id | String (Optional) | The User Account ID, acting as an identifier in correlation with compartment_id |
| email_hash | String (Optional) | The Email Hash, acting as an identifier |
| user_agent_id | String (Optional) | The User Agent ID, acting as an identifier |
| force_replace | Boolean (Optional) | Mandatory when the operation is UPSERT. If true, the User Profile is completely replaced by the object passed in the user_profile field. If false, the object passed in the user_profile field is merged with the existing User Profile of the User Point |
| merge_objects | Boolean (Optional) | Only considered when force_replace is false. Controls the behavior when both objects have the same property. If false (the default), the new object overrides the existing one. If true, the new object is deep-merged into the existing one (see the example below) |
| user_profile | JSON Object (Optional) | Mandatory when operation == UPSERT. JSON Object representing the User Profile |
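As an illustration of the command format, each import command is a single JSON object serialized onto its own line. Here is a minimal Python sketch; all IDs and profile fields below are placeholder values, not real datamart data:

```python
import json

# Two hypothetical import commands: one UPSERT and one DELETE.
commands = [
    {
        "operation": "UPSERT",
        "compartment_id": "1234",        # placeholder compartment ID
        "user_account_id": "user-001",   # placeholder account ID
        "force_replace": False,
        "merge_objects": True,
        "user_profile": {"first_name": "Jane"},  # hypothetical profile field
    },
    {
        "operation": "DELETE",
        "compartment_id": "1234",
        "user_account_id": "user-002",
    },
]

# NDJSON: one command per line, joined with "\n" (no commas, no enclosing array).
payload = "\n".join(json.dumps(command) for command in commands)
print(payload)
```

The resulting payload is what you send as the body of the execution request.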

Example

# Create the document import
curl -X POST \
  https://api.mediarithmics.com/v1/datamarts/<DATAMART_ID>/document_imports \
  -H 'Authorization: <YOUR_API_TOKEN>' \
  -H 'Content-Type: application/json' \
  -d '{
	"document_type": "USER_PROFILE",
	"mime_type": "APPLICATION_X_NDJSON",
	"encoding": "utf-8",
	"name": "<YOUR_DOCUMENT_IMPORT_NAME>"
}'
# Create the execution
curl -X POST \
  https://api.mediarithmics.com/v1/datamarts/<DATAMART_ID>/document_imports/<DOCUMENT_IMPORT_ID>/executions \
  -H 'Authorization: <API_TOKEN>' \
  -H 'Content-Type: application/x-ndjson' \
  -d '{ 
        "operation": "UPSERT",
        "compartment_id": "<COMPARTMENT_ID>", 
        "user_account_id": "<USER_ACCOUNT_ID>",
        "force_replace": false,
        "user_profile": {
              "this": "is",
              "a":"test"
        }
      }'

You can, of course, upload multiple user profiles at once. Note that the uploaded data is ndjson, not json: the different profiles are not separated by commas, but by a line separator \n.

When importing profiles with identifiers, only one identifier is allowed per line. For example, you shouldn't specify the user agent ID if the Email Hash is already used in a line.
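To make the one-identifier rule concrete, here is a hypothetical pre-upload check; the helper name and the field set are ours for illustration, not part of the API:

```python
import json

# compartment_id is not counted here: it only qualifies user_account_id.
IDENTIFIER_FIELDS = {"user_account_id", "email_hash", "user_agent_id"}

def has_single_identifier(line: str) -> bool:
    """Return True when the import command on this ndjson line uses exactly one identifier."""
    command = json.loads(line)
    return len(IDENTIFIER_FIELDS & command.keys()) == 1
```

For instance, a line carrying both an email_hash and a user_agent_id would fail this check and should be split into separate considerations before upload.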

More details on the merge_objects behavior:

# Stored profile:
{
  "my_property_1": "value1",
  "my_property_2": "value1",
  "my_array_property": ["value1"],
  "my_array_object_property": [
    {
      "my_sub_array_object_property_1": "value1",
      "my_sub_array_object_property_2": "value1"
    }
  ],
  "my_object_property": {
    "my_sub_object_property_1": "value1",
    "my_sub_object_property_2": "value1"
  }
}


# New profile in request payload
curl -X POST \
  https://api.mediarithmics.com/v1/datamarts/<DATAMART_ID>/document_imports/<DOCUMENT_IMPORT_ID>/executions \
  -H 'Authorization: <API_TOKEN>' \
  -H 'Content-Type: application/x-ndjson' \
  -d '{ 
        "operation": "UPSERT",
        "compartment_id": "<COMPARTMENT_ID>", 
        "user_account_id": "<USER_ACCOUNT_ID>",
        "force_replace": false,
        "merge_objects": true,
        "user_profile": {
          "my_property_2": "value2",
          "my_property_3": "value3",
          "my_array_property": ["value2"],
          "my_array_object_property": [
            {
              "my_sub_array_object_property_2": "value2",
              "my_sub_array_object_property_3": "value3"
            }
          ],
          "my_object_property": {
            "my_sub_object_property_2": "value2",
            "my_sub_object_property_3": "value3"
          }
        }
      }'

# New saved profile:
{
  "my_property_1": "value1",
  "my_property_2": "value2", # overridden scalar property
  "my_property_3": "value3",
  "my_array_property": ["value1","value2"], # merged arrays
  "my_array_object_property": [ # merged arrays
    {
      "my_sub_array_object_property_1": "value1",
      "my_sub_array_object_property_2": "value1"
    },
    {
      "my_sub_array_object_property_2": "value2",
      "my_sub_array_object_property_3": "value3"
    }
  ],
  "my_object_property": { # merged objects
    "my_sub_object_property_1": "value1",
    "my_sub_object_property_2": "value2", # overridden scalar property within object
    "my_sub_object_property_3": "value3"
  }
}
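The merge above can be emulated with a small recursive function. This is an illustrative Python sketch of the merge_objects=true semantics (scalars overridden, arrays concatenated, nested objects merged), not the platform's actual implementation:

```python
def merge_profiles(existing: dict, new: dict) -> dict:
    """Deep-merge `new` into `existing`:
    - nested objects are merged recursively,
    - arrays are concatenated,
    - scalars (and mismatched types) are overridden by the new value.
    """
    merged = dict(existing)
    for key, value in new.items():
        current = merged.get(key)
        if isinstance(current, dict) and isinstance(value, dict):
            merged[key] = merge_profiles(current, value)
        elif isinstance(current, list) and isinstance(value, list):
            merged[key] = current + value
        else:
            merged[key] = value
    return merged
```

Running it on the stored profile and the request payload above reproduces the saved profile shown in the example.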
