Mobile Performance Index #4 – Using APIs to Collect App Performance Metrics

Using Bitbar Monitoring API to gather mobile performance metrics

One of the upsides of adopting synthetic monitoring is the wealth of metrics it yields about the overall performance of your mobile apps. Once you and your team agree on which app performance metrics to measure, it’s time to gather that data and push it to your own DevOps dashboards so that every responsible individual and team has access to it. But before you can get to a dashboard, you need to know how to collect these metrics and choose the tool that will present them.

In our Mobile Performance Benchmark project, we use the Bitbar Monitoring service to gather request timing information for top US retail mobile apps. As you may know, we have been monitoring the performance of the search functionality in these apps, and one specific request we use to benchmark them is a search for headphones. We believe that tracking that request’s duration over time gives insight into the stability, load and user experience of each app.

In Bitbar Monitoring, you can reuse your existing automated functional tests (e.g. Appium scripts) to monitor critical user flows and key functionalities. (Sign up to try it out.) At the same time, it captures full network stats for each run performed. Since the headphones search was performed in the app by an Appium script, the Monitoring service captured the HTTP request sent to the store’s API and measured its response time. The goal was to collect that request’s response time across runs and graph it in the monitoring dashboard.

When it comes to the tool for displaying this graph, we have been using the ELK stack (Elasticsearch, Logstash, Kibana), a popular open source tool stack for log monitoring and dashboard creation. The idea was to push all the network traffic capture data through Logstash to Elasticsearch and then use Kibana to query Elasticsearch for the relevant information. Kibana can graph data over time, so a custom dashboard displaying the headphones search time could be created.

Just as Bitbar provides powerful and comprehensive APIs for mobile development and testing, the same goes for gathering monitoring metrics. The Bitbar Monitoring API provides three ways to get the collected traffic capture information.

The first and most manual method is to fetch the whole traffic capture in HAR format. Each monitoring run in the Bitbar Monitoring service contains a link to a HAR file containing all the requests and responses made by the application during the run. The file can then be read, for example with a Python script, and each request-response pair sent to Logstash to be saved into Elasticsearch.

Logstash needs a filter to accept HAR entries. We used the following Logstash filter configuration to allow HAR entries to be uploaded:

filter {
  # if this is being sent to /har, then assume that it is an "entry" element from a HAR recording
  if ( [index] == 'har' ) {
    json {
      source => "message"
    }
    # use the timestamp from the request start
    date {
      match => [ "startedDateTime", "ISO8601" ]
    }
  }
}
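
For context, a filter only covers the middle of a Logstash pipeline; entries also need an input to arrive through and an output to land in Elasticsearch. The sketch below is an assumption about how such a pipeline could be wired up (port, host and the mapping of the request path to the [index] field depend on your own setup), not our exact configuration:

```
input {
  # accept entries over HTTP, e.g. PUT requests to /har
  http {
    port => 8080
  }
}
output {
  # store the filtered entries in a local Elasticsearch instance
  elasticsearch {
    hosts => ["localhost:9200"]
  }
}
```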

To get the HAR file using the Monitoring API, you can use, for example, bash:

# This script needs your API token in variable TOKEN,
# the check ID in CHECK and the number of runs to get
# HAR files for in ITEMS
for URL in $(wget -O- --header "Authorization: Bearer ${TOKEN}" "${CHECK}?items=${ITEMS}" | jq -r '.results[].assets.har'); do
  RUN_ID=$(echo "${URL}" | sed 's/.*\/runs\/[0-9]*\/\([a-z0-9\-]*\)\/.*/\1/')
  HAR="${RUN_ID}.har"
  wget "${URL}" -O "${HAR}"
  if [ -s "${HAR}" ]; then
    echo "Got har!"
    # You could send it to Logstash here, for example using Python
  fi
done

Then the HAR file can be uploaded through Logstash to Elasticsearch. We used Python for simplicity:

import simplejson as json
import requests

# URL to Logstash should be stored in variable "logstash_host"
# Path to HAR file should be stored in variable "harfile"
with open(harfile) as f:
    har = json.load(f)
log = har['log']

for entry in log['entries']:
    # Delete response content since it is not required to be stored
    del entry['response']['content']
    r = requests.put(logstash_host + "/har", data=json.dumps(entry))

After this, you should be able to see each request and its response in Elasticsearch using Kibana. Each request entry in the HAR file also contains timing information for the request, which is exactly what our search time dashboard needs.
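
For reference, the graph behind such a dashboard corresponds to an Elasticsearch aggregation over the stored entries. The query body below is a rough sketch: the field names are assumptions based on the HAR entry structure, not our exact mapping, so adjust them to match your own index:

```json
{
  "query": { "wildcard": { "request.url": "*headphones*" } },
  "aggs": {
    "runs_over_time": {
      "date_histogram": { "field": "startedDateTime", "interval": "1h" },
      "aggs": { "avg_search_time": { "avg": { "field": "time" } } }
    }
  }
}
```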

Running all these scripts seems like a lot of work. Luckily, the Bitbar Monitoring API can do some of it for you.

The second way of getting traffic capture information using the API is requesting the HAR entries directly:


By making this HTTP request with a specific run’s ID, the API returns the HAR file’s entries already parsed for you:

{
  "harstats": [
    {
      "url": "",
      "startedDateTime": "2017-08-25T12:34:08.621Z",
      "time": 553,
      "responseCode": 200,
      "responseSize": 1159,
      "timings": {
        "blocked": -1,
        "connect": 0,
        "dns": -1,
        "receive": 0,
        "send": 13,
        "ssl": 0,
        "wait": 539
      }
    }
  ],
  "more": true
}
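
In each entry, the timings object breaks the duration down by phase; per the HAR format, a value of -1 marks a phase that does not apply and should be excluded when summing. A small sketch of totalling the applicable phases (the values mirror the sample response above, in milliseconds; the helper name is ours):

```python
# Sum the applicable timing phases of a HAR entry.
# Per the HAR format, -1 means "phase does not apply" and is skipped.
def phase_total(timings):
    return sum(v for v in timings.values() if v >= 0)

# Values mirroring the sample response above
timings = {"blocked": -1, "connect": 0, "dns": -1,
           "receive": 0, "send": 13, "ssl": 0, "wait": 539}
print(phase_total(timings))  # 552
```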

Then you could use a similar Python script as before to push these entries through Logstash to Elasticsearch.
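
As a sketch of that step, the function below turns a harstats response into one JSON payload per entry, ready to be PUT to Logstash as in the earlier upload script. The helper name and the minimal sample are ours; the response shape follows the example above:

```python
import json

# Turn a "harstats" API response into one JSON payload per entry,
# ready to be sent to Logstash with requests.put as shown earlier.
def harstats_payloads(response_body):
    return [json.dumps(entry) for entry in response_body["harstats"]]

# Minimal example shaped like the response above
sample = {
    "harstats": [
        {"url": "", "time": 553, "responseCode": 200}
    ],
    "more": True
}
payloads = harstats_payloads(sample)
```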

If this is still not easy enough and you know exactly which URL’s requests you want to follow, the Monitoring API can also provide historical data across multiple runs directly. Make a request to the following endpoint to get the data you need:


This request returns every request from the HAR files of recent runs whose URL matches the regular expression given in the query.
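
To turn that historical data into points for a graph, a hypothetical helper could pull out (timestamp, duration) pairs, assuming the response uses the same harstats entry shape shown earlier:

```python
# Extract (timestamp, duration) pairs from a harstats-style response,
# ready to plot response time over time.
def timing_series(response_body):
    return [(e["startedDateTime"], e["time"])
            for e in response_body["harstats"]]

# Minimal example shaped like the earlier response
sample = {
    "harstats": [
        {"startedDateTime": "2017-08-25T12:34:08.621Z", "time": 553}
    ],
    "more": False
}
series = timing_series(sample)
```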

We hope you find this helpful. In an upcoming blog post we will cover how to build a dashboard in Kibana using HAR entries. Stay tuned.
