Sending documents
You'll first need to create a document import, then you'll be able to launch executions.

Create a document import

The first step if you want to use bulk imports is to create a document import. Think of it as a configuration shared by all of your imports of the same type.
Creating a document import is done with a simple request to POST /v1/datamarts/<DATAMART_ID>/document_imports. Let's create a document import for user activities that we will call "My user activity document import". You will need to replace <DATAMART_ID> with your datamart id (which can be found in the UI in Settings > Datamart > Datamarts) and <YOUR_API_TOKEN> with your authentication token.
curl -X POST \
  https://api.mediarithmics.com/v1/datamarts/<DATAMART_ID>/document_imports \
  -H 'Authorization: <YOUR_API_TOKEN>' \
  -H 'Content-Type: application/json' \
  -d '{
    "document_type": "USER_ACTIVITY",
    "mime_type": "APPLICATION_X_NDJSON",
    "encoding": "utf-8",
    "name": "My user activity document import"
  }'
Let's unpack this:
  • for document_type we have chosen USER_ACTIVITY in order to send user activities. Other valid values are USER_SEGMENT, USER_PROFILE and USER_IDENTIFIERS_ASSOCIATION_DECLARATIONS
  • mime_type should match the format you will use for your data. Valid values are APPLICATION_X_NDJSON (if you will send data in NDJSON format) or TEXT_CSV (if you format your data as comma-separated values). In the case of USER_ACTIVITY, only NDJSON is valid
  • encoding is the encoding of the data that will be imported
  • name is the name you want to give this document import
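The request body above can be sketched in Python. This is only an illustration: the helper name is ours, and the validation simply mirrors the rules listed in the bullets above.

```python
import json

# Hypothetical helper building the body for
# POST /v1/datamarts/<DATAMART_ID>/document_imports.
def build_document_import_body(name,
                               document_type="USER_ACTIVITY",
                               mime_type="APPLICATION_X_NDJSON",
                               encoding="utf-8"):
    valid_types = {"USER_ACTIVITY", "USER_SEGMENT", "USER_PROFILE",
                   "USER_IDENTIFIERS_ASSOCIATION_DECLARATIONS"}
    if document_type not in valid_types:
        raise ValueError(f"unknown document_type: {document_type}")
    # USER_ACTIVITY imports only accept NDJSON, as noted above.
    if document_type == "USER_ACTIVITY" and mime_type != "APPLICATION_X_NDJSON":
        raise ValueError("USER_ACTIVITY imports only accept APPLICATION_X_NDJSON")
    return {
        "document_type": document_type,
        "mime_type": mime_type,
        "encoding": encoding,
        "name": name,
    }

body = build_document_import_body("My user activity document import")
print(json.dumps(body))
```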
See the API documentation on this endpoint or our guide on document imports for more information on the other document types. If everything went well, the response should look something like this:
{
  "status": "ok",
  "data": {
    "id": "<DOCUMENT_IMPORT_ID>",
    "datafarm_key": "DF_EU_YYYY_MM",
    "datamart_id": "<DATAMART_ID>",
    "document_type": "USER_ACTIVITY",
    "mime_type": "APPLICATION_X_NDJSON",
    "encoding": "utf-8",
    "name": "My user activity document import",
    "priority": "MEDIUM"
  }
}
Take note of the <DOCUMENT_IMPORT_ID> provided in the response; you will need it to create executions.

Create a document import execution

Once we have created our document import, we can start creating executions (i.e. actually sending data!).
Let's send some store visits. First, we will prepare our JSON file:
{
  "$email_hash": {
    "$email": "[email protected]"
  },
  "$type": "TOUCH",
  "$session_status": "NO_SESSION",
  "$ts": 1605262037783,
  "$events": [{
    "$event_name": "store-visit",
    "$ts": 1605262037783,
    "$properties": {}
  }],
  "$location": {
    "$country": "france",
    "$city": "paris",
    "$zip_code": "75001"
  }
}
Please check our guide on user activity imports for a complete explanation of all the properties in our payload.
Now we will convert our JSON file to NDJSON and send in the body of the following request. If you want to learn more about the NDJSON format, check out this site.
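As a sketch, the conversion can be done in Python: NDJSON is simply one compact JSON document per line, newline-separated. The helper name below is ours.

```python
import json

# Serialise a list of activity dicts to NDJSON: one compact
# JSON document per line, with no newlines inside a document.
def to_ndjson(records):
    return "\n".join(json.dumps(r, separators=(",", ":")) for r in records)

activity = {
    "$email_hash": {"$email": "[email protected]"},
    "$type": "TOUCH",
    "$session_status": "NO_SESSION",
    "$ts": 1605262037783,
    "$events": [{"$event_name": "store-visit", "$ts": 1605262037783, "$properties": {}}],
    "$location": {"$country": "france", "$city": "paris", "$zip_code": "75001"},
}

payload = to_ndjson([activity])  # use this string as the request body
```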
curl --location --request POST 'https://api.mediarithmics.com/v1/datamarts/<DATAMART_ID>/document_imports/<DOCUMENT_IMPORT_ID>/executions' \
  --header 'Content-Type: application/x-ndjson' \
  --header 'Authorization: <YOUR_API_TOKEN>' \
  --data-raw '{"$email_hash": {"$email": "[email protected]"},"$type": "TOUCH","$session_status": "NO_SESSION","$ts": 1605262037783,"$events": [{"$event_name": "store-visit","$ts": 1605262037783,"$properties": {}}],"$location": {"$country": "france","$city": "paris","$zip_code": "75001"}}'
You will need to replace <DATAMART_ID> with your datamart id, <DOCUMENT_IMPORT_ID> with the document import id you got in the previous request and <YOUR_API_TOKEN> with your authentication token.
Please note that your Content-Type header must match the mime_type you set when creating the document import earlier.
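A small lookup can keep the two values in sync. The APPLICATION_X_NDJSON mapping matches the examples on this page; the TEXT_CSV mapping to text/csv is our assumption based on the standard CSV media type.

```python
# Maps a document import's mime_type to the Content-Type header
# to send on its executions.
CONTENT_TYPES = {
    "APPLICATION_X_NDJSON": "application/x-ndjson",  # matches the examples above
    "TEXT_CSV": "text/csv",                          # assumption: standard CSV media type
}

def content_type_for(mime_type):
    return CONTENT_TYPES[mime_type]
```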
The response should look like this:
{
  "status": "ok",
  "data": {
    "parameters": null,
    "result": null,
    "error": null,
    "id": "<DOCUMENT_IMPORT_EXECUTION_ID>",
    "status": "PENDING",
    "creation_date": 1605271495713,
    "start_date": null,
    "duration": null,
    "organisation_id": "<ORGANISATION_ID>",
    "user_id": null,
    "cancel_status": null,
    "debug": null,
    "is_retryable": false,
    "permalink_uri": "xxxxxx",
    "num_tasks": null,
    "completed_tasks": null,
    "erroneous_tasks": null,
    "retry_count": 0,
    "job_type": "DOCUMENT_IMPORT",
    "import_mode": "MANUAL_FILE",
    "import_type": null
  }
}
Take note of the <DOCUMENT_IMPORT_EXECUTION_ID> here; you will need it to check the status of your execution.

Checking your document import execution status

You can check the status of your execution with this simple request:
curl --location --request GET 'https://api.mediarithmics.com/v1/datamarts/<DATAMART_ID>/document_imports/<DOCUMENT_IMPORT_ID>/executions/<DOCUMENT_IMPORT_EXECUTION_ID>' \
  --header 'Authorization: <YOUR_API_TOKEN>'
The response should look like this:
{
  "status": "ok",
  "data": {
    "parameters": {
      "datamart_id": 1502,
      "document_import_id": 20517,
      "mime_type": "APPLICATION_X_NDJSON",
      "document_type": "USER_ACTIVITY",
      "input_file_name": "xxxxxx",
      "file_uri": "xxxxxx",
      "number_of_lines": 1,
      "segment_id": null
    },
    "result": {
      "total_success": 1,
      "total_failure": 0,
      "input_file_name": "xxxxxx",
      "input_file_uri": "xxxxxx",
      "error_file_uri": "xxxxxx",
      "possible_issue_on_identifiers": false,
      "top_identifiers": {}
    },
    "error": null,
    "id": "<DOCUMENT_IMPORT_EXECUTION_ID>",
    "status": "PENDING",
    "creation_date": 1605627687764,
    "start_date": 1605627714053,
    "duration": 1065,
    "organisation_id": "<ORGANISATION_ID>",
    "user_id": null,
    "cancel_status": null,
    "debug": null,
    "is_retryable": false,
    "permalink_uri": "xxxxxxx",
    "num_tasks": 1,
    "completed_tasks": 1,
    "erroneous_tasks": 0,
    "retry_count": 0,
    "job_type": "DOCUMENT_IMPORT",
    "import_mode": "MANUAL_FILE",
    "import_type": null,
    "end_date": 1605627715118
  }
}
Notice the PENDING status: after a while, the execution will be processed and, if you check again, the status will change to RUNNING and then to SUCCEEDED.
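Since the status changes over time, checking it usually means polling the endpoint above. Here is a minimal polling sketch in Python; the function names are ours, and the set of terminal failure statuses (anything other than SUCCEEDED) is an assumption, not confirmed by this page.

```python
import json
import time
import urllib.request

# SUCCEEDED comes from this page; FAILED and CANCELED are assumed
# terminal states and may differ in the real API.
TERMINAL_STATUSES = {"SUCCEEDED", "FAILED", "CANCELED"}

def is_terminal(status):
    """True once the execution will no longer change state."""
    return status in TERMINAL_STATUSES

# Hypothetical polling loop around the status endpoint shown above.
def wait_for_execution(datamart_id, import_id, execution_id, token,
                       interval_s=30, timeout_s=1800):
    url = (f"https://api.mediarithmics.com/v1/datamarts/{datamart_id}"
           f"/document_imports/{import_id}/executions/{execution_id}")
    deadline = time.time() + timeout_s
    while time.time() < deadline:
        req = urllib.request.Request(url, headers={"Authorization": token})
        with urllib.request.urlopen(req) as resp:
            execution = json.load(resp)["data"]
        if is_terminal(execution["status"]):
            return execution
        time.sleep(interval_s)
    raise TimeoutError("execution did not finish in time")
```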