Sending documents

You'll first need to create a document import; then you'll be able to launch executions.

Create a document import

The first step in using bulk imports is to create a document import. Think of it as a configuration shared by all of your imports of the same type.

Creating a document import is done with a simple request to POST /v1/datamarts/<DATAMART_ID>/document_imports. Let's create a document import for user activities that we will call "My user activity document import". You will need to replace <DATAMART_ID> with your datamart id (found in the UI under Settings > Datamart > Datamarts) and <YOUR_API_TOKEN> with your authentication token.

curl -X POST \
  https://api.mediarithmics.com/v1/datamarts/<DATAMART_ID>/document_imports \
  -H 'Authorization: <YOUR_API_TOKEN>' \
  -H 'Content-Type: application/json' \
  -d '{
  "document_type": "USER_ACTIVITY",
  "mime_type": "APPLICATION_X_NDJSON",
  "encoding": "utf-8",
  "name": "My user activity document import"
}'

Let's unpack this:

  • for document_type, we have chosen USER_ACTIVITY in order to send user activities. Other valid values are USER_SEGMENT, USER_PROFILE, and USER_IDENTIFIERS_ASSOCIATION_DECLARATIONS.

  • mime_type should match the format you will use for your data. Valid values are APPLICATION_X_NDJSON (if you send your data as NDJSON) or TEXT_CSV (if you format your data as comma-separated values). For USER_ACTIVITY, only NDJSON is valid.

  • encoding is the encoding of the data that will be imported

  • name is the name you want to give this document import

See the API documentation on this endpoint or our guide on document imports for more information on the other document types.

If everything went well, the response should look something like this:

{
    "status": "ok",
    "data": {
        "id": "<DOCUMENT_IMPORT_ID>",
        "datafarm_key": "DF_EU_YYYY_MM",
        "datamart_id": "<DATAMART_ID>",
        "document_type": "USER_ACTIVITY",
        "mime_type": "APPLICATION_X_NDJSON",
        "encoding": "utf-8",
        "name": "My user activity document import",
        "priority": "MEDIUM"
    }
}

Let's take note of the provided <DOCUMENT_IMPORT_ID>.
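
If you are scripting these steps rather than running curl by hand, the same call can be made from any HTTP client. Here is a minimal sketch in Python, assuming the third-party requests library is available; the variable names are illustrative:

import requests

API_TOKEN = "<YOUR_API_TOKEN>"   # your authentication token
DATAMART_ID = "<DATAMART_ID>"    # your datamart id
BASE_URL = "https://api.mediarithmics.com/v1"

# Create the document import and keep its id for the executions below.
response = requests.post(
    f"{BASE_URL}/datamarts/{DATAMART_ID}/document_imports",
    headers={"Authorization": API_TOKEN},
    json={
        "document_type": "USER_ACTIVITY",
        "mime_type": "APPLICATION_X_NDJSON",
        "encoding": "utf-8",
        "name": "My user activity document import",
    },
)
response.raise_for_status()
document_import_id = response.json()["data"]["id"]
print("Document import id:", document_import_id)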

Create a document import execution

Once we have created our document import, we can start creating executions (i.e. actually sending data!).

Let's send some store visits. First, we will prepare our JSON file:

{
  "$email_hash": {
    "$email": "some.email@dummy.com"
  },
  "$type": "TOUCH",
  "$session_status": "NO_SESSION",
  "$ts": 1605262037783,
  "$events": [{
    "$event_name": "store-visit",
    "$ts": 1605262037783,
    "$properties": {}
  }],
  "$location": {
    "$country": "france",
    "$city": "paris",
    "$zip_code": "75001"
  }
}

Please check our guide on user activity imports for a complete explanation of all the properties in this payload.

Now we will convert our JSON file to NDJSON and send it in the body of the following request. If you want to learn more about the NDJSON format, check out this site.
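
NDJSON simply means one complete JSON document per line. If your activities are collected in a regular JSON array, the conversion could look like this — a sketch in Python, with hypothetical file names:

import json

# Read a regular JSON file containing an array of user activities...
with open("activities.json") as f:
    activities = json.load(f)

# ...and write one compact JSON document per line (NDJSON).
with open("activities.ndjson", "w") as f:
    for activity in activities:
        f.write(json.dumps(activity) + "\n")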

curl --location --request POST 'https://api.mediarithmics.com/v1/datamarts/<DATAMART_ID>/document_imports/<DOCUMENT_IMPORT_ID>/executions' \
--header 'Content-Type: application/x-ndjson' \
--header 'Authorization: <YOUR_API_TOKEN>' \
--data-raw '{"$email_hash": {"$email": "some.email@dummy.com"},"$type": "TOUCH","$session_status": "NO_SESSION","$ts": 1605262037783,"$events": [{"$event_name": "store-visit","$ts": 1605262037783,"$properties": {}}],"$location": {"$country": "france","$city": "paris","$zip_code": "75001"}}'

You will need to replace <DATAMART_ID> with your datamart id, <DOCUMENT_IMPORT_ID> with the document import id you got in the previous request and <YOUR_API_TOKEN> with your authentication token.

Please note that your Content-Type header must match the mime_type you set when creating the document import earlier.
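
The same execution request from a script, as a sketch reusing the variables and imports from the earlier Python snippet (the NDJSON file name is again hypothetical):

# Send the NDJSON body as a new execution of the document import.
with open("activities.ndjson", "rb") as f:
    ndjson_body = f.read()

response = requests.post(
    f"{BASE_URL}/datamarts/{DATAMART_ID}/document_imports/"
    f"{document_import_id}/executions",
    headers={
        "Authorization": API_TOKEN,
        # Must match the mime_type declared on the document import.
        "Content-Type": "application/x-ndjson",
    },
    data=ndjson_body,
)
response.raise_for_status()
execution_id = response.json()["data"]["id"]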

The response should look like this:

{
    "status": "ok",
    "data": {
        "parameters": null,
        "result": null,
        "error": null,
        "id": "<DOCUMENT_IMPORT_EXECUTION_ID>",
        "status": "PENDING",
        "creation_date": 1605271495713,
        "start_date": null,
        "duration": null,
        "organisation_id": "<ORGANISATION_ID>",
        "user_id": null,
        "cancel_status": null,
        "debug": null,
        "is_retryable": false,
        "permalink_uri": "xxxxxx",
        "num_tasks": null,
        "completed_tasks": null,
        "erroneous_tasks": null,
        "retry_count": 0,
        "job_type": "DOCUMENT_IMPORT",
        "import_mode": "MANUAL_FILE",
        "import_type": null
    }
}

Take note of the <DOCUMENT_IMPORT_EXECUTION_ID> here; you will need it to check the status of your execution.

Checking your document import execution status

You can check the status of your execution with this simple request:

curl --location --request GET 'https://api.mediarithmics.com/v1/datamarts/<DATAMART_ID>/document_imports/<DOCUMENT_IMPORT_ID>/executions/<DOCUMENT_IMPORT_EXECUTION_ID>' \
--header 'Authorization: <YOUR_API_TOKEN>'

The response should look like this:

{
    "status": "ok",
    "data": {
        "parameters": {
            "datamart_id": 1502,
            "document_import_id": 20517,
            "mime_type": "APPLICATION_X_NDJSON",
            "document_type": "USER_ACTIVITY",
            "input_file_name": "xxxxxx",
            "file_uri": "xxxxxx",
            "number_of_lines": 1,
            "segment_id": null
        },
        "result": {
            "total_success": 1,
            "total_failure": 0,
            "input_file_name": "xxxxxx",
            "input_file_uri": "xxxxxx",
            "error_file_uri": "xxxxxx",
            "possible_issue_on_identifiers": false,
            "top_identifiers": {}
        },
        "error": null,
        "id": "<DOCUMENT_IMPORT_EXECUTION_ID>",
        "status": "PENDING",
        "creation_date": 1605627687764,
        "start_date": 1605627714053,
        "duration": 1065,
        "organisation_id": "<ORGANISATION_ID>",
        "user_id": null,
        "cancel_status": null,
        "debug": null,
        "is_retryable": false,
        "permalink_uri": "xxxxxxx",
        "num_tasks": 1,
        "completed_tasks": 1,
        "erroneous_tasks": 0,
        "retry_count": 0,
        "job_type": "DOCUMENT_IMPORT",
        "import_mode": "MANUAL_FILE",
        "import_type": null,
        "end_date": 1605627715118
    }
}

Notice the PENDING status: after a while, the execution will be processed, and if you check again, the status will have changed to RUNNING and then to SUCCEEDED.
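
If you are checking from a script, a simple polling loop along these lines does the job — again a sketch reusing the variables from the earlier snippets, with an arbitrary polling interval:

import time

# Poll the execution until it leaves the PENDING/RUNNING states.
status = "PENDING"
while status in ("PENDING", "RUNNING"):
    time.sleep(10)  # arbitrary interval; pick what suits your volume
    response = requests.get(
        f"{BASE_URL}/datamarts/{DATAMART_ID}/document_imports/"
        f"{document_import_id}/executions/{execution_id}",
        headers={"Authorization": API_TOKEN},
    )
    response.raise_for_status()
    status = response.json()["data"]["status"]

print(status)  # SUCCEEDED if the import completed without errors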
