Monitoring

Last updated 3 months ago
Logs

In your plugins, everything written through the Logger class is available as logs. Logs are written to a stream dedicated to the plugin's organisation.

To ease log searches, you may want to systematically add context information to your logs, such as the feed ID, site ID, or app ID. That way, you can filter on a specific instance of your plugin when reading logs.
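One way to add that context systematically is to wrap the logger once per plugin instance so every line carries the same searchable prefix. This is a minimal sketch: `withContext` is a hypothetical helper, and `console` stands in for the SDK's Logger class.

```javascript
// Hypothetical helper: prepend instance context (feed ID, site ID, ...)
// to every message so log lines can be filtered per plugin instance.
// `logger` stands in for the SDK's Logger; here we stub it with console.
function withContext(logger, context) {
  const prefix = Object.entries(context)
    .map(([key, value]) => `${key}=${value}`)
    .join(' ');
  return {
    info: (msg) => logger.info(`[${prefix}] ${msg}`),
    error: (msg) => logger.error(`[${prefix}] ${msg}`),
  };
}

// Usage with a stand-in logger:
const log = withContext(console, { feedId: '1234', siteId: '42' });
log.info('Feed activation received');
// logs: [feedId=1234 siteId=42] Feed activation received
```

With a fixed `key=value` prefix format, filtering on one instance in the log search becomes a simple substring or field query.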

Accessing your logs

If you don't already have access, ask your Account manager for access to your organisation's plugin logs.

Click on your organisation's stream. The stream name follows the format "{Organisation ID} {Organisation name}".

You should already have a stream for your client's organisation. If not, ask your Account manager to create one, providing them with the organisation ID, the datamart ID, and the emails and names of the people who should have access to the stream.

Once on the stream, you can see all incoming logs. You can select a time span for log visualisation, search for particular properties, and start automatic refreshes to see logs as they arrive.

Advanced filtering

You can type queries directly into the search if you know the syntax, or use the Fields panel on the left to help you generate them.
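As an illustration, a Graylog-style search query combining several fields might look like the following. The field names here are hypothetical examples and may not match your stream's exact schema:

```
artifact_id:my-feed-plugin AND level:error AND message:timeout
```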

You can select the fields you want to see with each log line, and use the arrows to show additional actions for each field.

Click on Quick values to show the different values available for a field. For example, use it on the plugin artifact ID field to filter on a particular plugin in your organisation.

Simply click the search icon on the line of the value you wish to filter on; the filter is automatically added to the search query. Start a refresh and you will now only see the logs for that particular plugin!

Log levels

You can use logs with different levels:

  • Error

  • Log

  • Info

  • Debug
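
To show how such levels typically interact, here is a minimal sketch of level filtering: messages below a configured threshold are dropped. The level ordering and the `makeLogger` shape are illustrative, not the exact SDK API.

```javascript
// Minimal sketch of level filtering: messages above the configured
// threshold are dropped. The level names mirror the list above;
// the logger shape is illustrative, not the exact SDK API.
const LEVELS = { error: 0, log: 1, info: 2, debug: 3 };

function makeLogger(sink, level = 'info') {
  const threshold = LEVELS[level];
  const emit = (lvl, msg) => {
    if (LEVELS[lvl] <= threshold) sink(`${lvl.toUpperCase()}: ${msg}`);
  };
  return {
    error: (m) => emit('error', m),
    log: (m) => emit('log', m),
    info: (m) => emit('info', m),
    debug: (m) => emit('debug', m),
  };
}

// Usage with console.log as the sink:
const logger = makeLogger((line) => console.log(line), 'info');
logger.info('Feed connected');   // emitted
logger.debug('Raw payload');     // dropped: debug is below the threshold
```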

You should aim not to generate too many logs. If that happens, mediarithmics may stop your stream from logging in order to protect all the other streams.

You can use the Debug severity for log-intensive activities, such as tracing data and function calls. Debug logs are not saved by default; you can ask your Account manager to enable them temporarily. This lets you instrument your code for tracing without generating unnecessary logs all the time.

A good practice for keeping logs clear is to emit statistics every X seconds. For example, instead of logging a message for every item, log the number of items processed every X seconds along with their status.
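The pattern above can be sketched as a small aggregator that counts per-item results and emits one summary line per interval. `StatsReporter`, `record`, and `flush` are hypothetical names, not SDK APIs; `console` again stands in for the Logger.

```javascript
// Illustrative sketch: aggregate per-item results and emit one summary
// log line per flush interval instead of logging every item.
// Names (StatsReporter, record, flush) are hypothetical, not SDK APIs.
class StatsReporter {
  constructor(logger, intervalMs = 30000) {
    this.logger = logger;
    this.counts = { ok: 0, failed: 0 };
    this.timer = setInterval(() => this.flush(), intervalMs);
  }
  record(status) {
    this.counts[status] = (this.counts[status] || 0) + 1;
  }
  flush() {
    this.logger.info(
      `Processed ${this.counts.ok} items OK, ${this.counts.failed} failed in the last interval`
    );
    this.counts = { ok: 0, failed: 0 }; // reset for the next interval
  }
  stop() {
    clearInterval(this.timer);
    this.flush(); // emit a final summary
  }
}

// Usage with console as a stand-in logger:
const stats = new StatsReporter(console, 30000);
stats.record('ok');
stats.record('ok');
stats.record('failed');
stats.stop(); // logs: Processed 2 items OK, 1 failed in the last interval
```

One summary line per interval keeps the stream well under any volume limits while still showing throughput and error rates.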

You can check your plugin's logs on https://plugin-log.mediarithmics.com/streams.

You can get familiar with the search query language in Graylog's official documentation.