Integration batch


Last updated 3 years ago


The integration batch is a plugin type used for customers' integrations. It can be a periodic or a non-periodic plugin. You can pause a recurring plugin so that all upcoming executions are canceled.

You can check the plugins key concepts page to understand the hierarchy between a plugin, its versions, its instances, and its executions.

Imagine you want to create a script that imports data every night for a customer:

  • You declare a new integration batch plugin called import-data-for-customer

  • You declare a first version, 1.0.0, for this plugin, with the code of the script and the declaration of the script parameters

  • Your script is now available for use

To execute the script, you can:

  • Create an integration batch instance that uses the code from the 1.0.0 version with specific input parameters

  • Either program the instance to create executions automatically on a cron schedule, or manually create a new execution to start now or later.
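The steps above can be sketched in Python. This is a minimal illustration, not an official client: the helper names (`plugin_request`, `instance_request`) are invented here, while the endpoint paths and payload fields follow the curl examples later on this page.

```python
# Illustrative sketch of the integration batch workflow: build the API
# calls as (method, url, payload) tuples, then send them with any HTTP
# client, passing your API token in the Authorization header.
API_BASE = "https://api.mediarithmics.com/v1"

def plugin_request(organisation_id, group_id, artifact_id):
    """Build the call that declares a new integration batch plugin."""
    return ("POST", f"{API_BASE}/plugins", {
        "organisation_id": organisation_id,
        "plugin_type": "INTEGRATION_BATCH",
        "group_id": group_id,
        "artifact_id": artifact_id,
    })

def instance_request(group_id, artifact_id, version_id, organisation_id,
                     name, cron=None):
    """Build the call that creates an instance; passing a cron makes it periodic."""
    payload = {
        "group_id": group_id,
        "artifact_id": artifact_id,
        "version_id": version_id,
        "organisation_id": organisation_id,
        "name": name,
        "archived": False,
        "ram_size": "LOW",
        "cpu_size": "LOW",
        "disk_size": "LOW",
    }
    if cron is not None:
        # cron and cron_status must be set together
        payload["cron"] = cron
        payload["cron_status"] = "ACTIVE"
    return ("POST", f"{API_BASE}/integration_batch_instances", payload)

# Example: an instance for a nightly import running at 02:00 every day.
method, url, body = instance_request(
    "com.my-client.integration_batch", "integration-batch-my-client",
    "1.0.0", "my_organisation_id", "Nightly import", cron="0 2 * * *")
```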

How-to

Plugin creation

Use the plugin creation endpoints to create a new plugin, with the plugin type set to INTEGRATION_BATCH. Everything else remains the same.

Example

# Create the plugin definition
curl -X POST \
  https://api.mediarithmics.com/v1/plugins \
  -H 'Authorization: <YOUR_API_TOKEN>' \
  -H 'Content-Type: application/json' \
  -d '{
    "organisation_id": "my_organisation_id",
    "plugin_type": "INTEGRATION_BATCH",
    "group_id": "com.my-client.integration_batch",
    "artifact_id": "integration-batch-my-client"
  }'

Plugin version creation

Use the plugin version creation endpoints to create a new plugin version with all the properties. The call and format are the same as usual.

Example

# Create the plugin version
curl -X POST \
  https://api.mediarithmics.com/v1/plugins/<PLUGIN_ID>/versions \
  -H 'Authorization: <YOUR_API_TOKEN>' \
  -H 'Content-Type: application/json' \
  -d '{
    "version_id": "1.0.0",
    "plugin_properties": [
      {
        "technical_name": "one_more_property",
        "value": {
          "value": ""
        },
        "property_type": "STRING",
        "origin": "PLUGIN",
        "writable": true,
        "deletable": true
      },
      {
        "technical_name": "provider",
        "value": {
          "value": "mediarithmics"
        },
        "property_type": "STRING",
        "origin": "PLUGIN_STATIC",
        "writable": false,
        "deletable": false
      },
      {
        "technical_name": "name",
        "value": {
          "value": "My Plugin Name"
        },
        "property_type": "STRING",
        "origin": "PLUGIN_STATIC",
        "writable": false,
        "deletable": false
      }
    ]
  }'

Plugin instance creation

For the integration batch plugin, the instance resource is called integration_batch_instances.

There are five properties specific to this plugin type: cron, cron_status, ram_size, cpu_size, and disk_size.

The cron and cron_status properties are not mandatory, as you can create non-periodic jobs. If used, they must be set together.

The ram_size, cpu_size, and disk_size properties are mandatory, and their default values are set to LOW.
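Assuming the rules above, a client-side sanity check before sending the payload might look like this in Python; `normalize_instance` and the `SIZES` set are illustrative names, not part of the API:

```python
# Fill the size defaults (LOW) and enforce the cron / cron_status pairing
# before posting an integration_batch_instances payload.
SIZES = {"LOW", "MEDIUM", "LARGE", "EXTRA_LARGE"}

def normalize_instance(payload: dict) -> dict:
    """Return a copy of the payload with defaults applied and rules checked."""
    out = dict(payload)
    for key in ("ram_size", "cpu_size", "disk_size"):
        out.setdefault(key, "LOW")  # mandatory, defaulting to LOW
        if out[key] not in SIZES:
            raise ValueError(f"{key} must be one of {sorted(SIZES)}")
    # cron and cron_status are optional, but must be set together
    if ("cron" in out) != ("cron_status" in out):
        raise ValueError("cron and cron_status must be set together")
    return out
```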

Example

# Create the plugin instance
curl -X POST \
  https://api.mediarithmics.com/v1/integration_batch_instances \
  -H 'Authorization: <YOUR_API_TOKEN>' \
  -H 'Content-Type: application/json' \
  -d '{
    "group_id": "com.my-client.integration_batch",
    "artifact_id": "integration-batch-my-client",
    "version_id": "my-version-id",
    "organisation_id": "my_organisation_id",
    "name": "The name of my instance",
    "archived": false,
    "cron": "* * * 7 *",
    "cron_status": "ACTIVE | PAUSED",
    "ram_size": "LOW | MEDIUM | LARGE | EXTRA_LARGE",
    "disk_size": "LOW | MEDIUM | LARGE | EXTRA_LARGE",
    "cpu_size": "LOW | MEDIUM | LARGE | EXTRA_LARGE"
  }'

You can perform the POST, PUT, GET, and DELETE operations on instances.

Integration batch execution creation

Executions can be created either automatically by the scheduler, using the cron defined in the instance, or manually using the API or the interface.

When creating an execution, you have to set the execution_type and expected_start_date properties.

# Create an execution
curl -X POST \
  https://api.mediarithmics.com/v1/integration_batch_instances/<INSTANCE_ID>/executions \
  -H 'Authorization: <YOUR_API_TOKEN>' \
  -H 'Content-Type: application/json' \
  -d '{
    "parameters": {
      "execution_type": "MANUAL | CRON",
      "expected_start_date": 1562595783663
    },
    "organisation_id": "1185",
    "user_id": "1007",
    "error": null,
    "status": "PENDING",
    "external_model_id": "42",
    "external_model_name": "PUBLIC_INTEGRATION_BATCH",
    "start_date": 1562595789171,
    "job_type": "BATCH_INTEGRATION"
  }'

You can perform the POST, PUT, GET, and DELETE operations on executions.

The execution_type is MANUAL when the execution is created manually (for example from the interface), and CRON when it is created automatically from the cron value set in the instance.

The expected_start_date is set from the timestamp chosen in the interface, or computed from the cron set in the instance.
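Judging by the example value above (1562595783663), expected_start_date is a Unix timestamp in milliseconds. A small Python helper to build one from a datetime; the function name is illustrative:

```python
# Convert an aware datetime to the millisecond epoch timestamp used by
# expected_start_date (and start_date) in the execution payload.
from datetime import datetime, timezone

def to_expected_start_date(dt: datetime) -> int:
    """Return the Unix timestamp of `dt` in milliseconds."""
    # round() avoids float truncation errors on the millisecond boundary
    return round(dt.timestamp() * 1000)

# 2019-07-08 14:23:03.663 UTC -> 1562595783663
ts = to_expected_start_date(
    datetime(2019, 7, 8, 14, 23, 3, 663000, tzinfo=timezone.utc))
```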

