Kaneda provides builtin backends to store metrics and events in persistent storage. If you want to use a custom backend, you need to subclass BaseBackend and implement your own report method, which is responsible for storing the metrics data.


Elasticsearch is a search-based NoSQL database that works very well with metrics data. It provides powerful tools to analyze data and, with Kibana, build real-time dashboards easily.


Before using Elasticsearch as a backend you need to install the Elasticsearch Python client:

pip install elasticsearch
class kaneda.backends.ElasticsearchBackend(index_name, app_name, client=None, connection_url=None, host=None, port=None, user=None, password=None, timeout=0.3)[source]

Elasticsearch backend.

  • index_name – name of the Elasticsearch index used to store metrics data. The default index name format is index_name-YYYY.MM.DD.
  • app_name – name of the app/project where metrics are used.
  • client – client instance of Elasticsearch class.
  • connection_url – Elasticsearch connection url (https://user:secret@localhost:9200). Pass either a single connection url (a string) or multiple connection urls (a list).
  • host – server host. Pass either a single host (a string) or multiple hosts (a list).
  • port – server port.
  • user – HTTP auth username.
  • password – HTTP auth password.
  • timeout – Elasticsearch connection timeout (seconds).


MongoDB is a document-oriented NoSQL database. It is a great tool for storing metrics, as it provides a powerful aggregation framework to perform data analysis.


Before using MongoDB as a backend you need to install the MongoDB Python client:

pip install pymongo
class kaneda.backends.MongoBackend(db_name, collection_name, client=None, connection_url=None, host=None, port=None, timeout=300)[source]

MongoDB backend.

  • db_name – name of the MongoDB database.
  • collection_name – name of the MongoDB collection used to store metric data.
  • client – client instance of MongoClient class.
  • connection_url – Mongo connection url (mongodb://localhost:27017/).
  • host – server host.
  • port – server port.
  • timeout – MongoDB connection timeout (milliseconds).


RethinkDB is an open-source, scalable, distributed NoSQL database built for realtime applications.


Before using RethinkDB as a backend you need to install the RethinkDB Python client:

pip install rethinkdb
class kaneda.backends.RethinkBackend(db, table_name=None, connection=None, host=None, port=None, user=None, password=None, timeout=0.3)[source]

RethinkDB backend.

  • db – name of the RethinkDB database.
  • table_name – name of the RethinkDB table. If not provided, the metric name will be used.
  • connection – connection instance to the RethinkDB server.
  • host – server host.
  • port – server port.
  • user – auth username.
  • password – auth password.
  • timeout – RethinkDB connection timeout (seconds).


You can use a logger instance from the logging module of the Python standard library. This backend is useful for debugging.

class kaneda.backends.LoggerBackend(logger=None, filename='')[source]

Logger backend.

  • logger – logging instance.
  • filename – name of the file where logger will store the metrics.


InfluxDB is an open source time series database with no external dependencies. It’s useful for recording metrics, events, and performing analytics.


Before using InfluxDB as a backend you need to install the InfluxDB Python client:

pip install influxdb


InfluxDB can store other types of data besides time series. However, it has some restrictions:

  • The metrics tags field can’t be a list, only a dict:

    # bad
    metrics.timing('user.profile_load_time', 230, tags=['login', 'edit_profile'])
    # good
    metrics.timing('user.profile_load_time', 230, tags={'from': 'login', 'to': 'edit_profile'})
  • A custom metric value field can’t be a list or a nested dict:

    # bad
    metrics.custom('', metric='query_time', value={'times': [120, 230]})
    metrics.custom('', metric='query_time', value={'times': {'start': 120, 'end': 230}})
    # good
    metrics.custom('', metric='query_time', value={'start_time': 120, 'end_time': 230})
class kaneda.backends.InfluxBackend(database, client=None, connection_url=None, host=None, port=None, username=None, password=None, timeout=0.3)[source]

InfluxDB backend.

  • database – name of the InfluxDB database.
  • client – client instance of InfluxDBClient class.
  • connection_url – InfluxDB connection url (influxdb://username:password@localhost:8086/databasename).
  • host – server host.
  • port – server port.
  • username – auth username.
  • password – auth password.
  • timeout – InfluxDB connection timeout (seconds).