What’s Trending with Mobile Test Automation Frameworks


For several years, we’ve been providing the most diverse and (by far) the largest mobile device farm for our users. Bitbar Public Cloud (formerly Testdroid Cloud) has grown incrementally year by year, with more device runs executed every quarter. Every now and then it’s worth diving deep into all this data to gather and analyze the most interesting aspects of how people use their mobile test automation environments and frameworks.

Here are some facts and numbers from our Public Cloud over the last year, especially regarding the test automation frameworks used with it.

As you know, we did a major brand change from Testdroid to Bitbar last year. This also had an impact on Testdroid Cloud, which was renamed Bitbar Public Cloud. Despite that, the data presented and illustrated in this blog was gathered from our Public Cloud between January 1 and December 31, 2016.

Test Automation and Unlimited Concurrency are The Enablers

First of all, getting a huge number of test runs on a device farm is not possible without test automation. Test automation is the enabler that makes concurrent use of mobile devices possible, letting users run their app and tests simultaneously on any number of devices. This is also one of our sweet spots, as we have never limited the number of devices that users can use simultaneously for their tests.

It’s also great to see that native app and game developers use the same device roster. Yes, they use it slightly differently, and their test scripts and frameworks differ, but the same foundation serves both user groups extremely well.

Especially during the past two years, test automation has come into play and helped mobile game developers quickly and easily see how their games work across all possible device variants, and get vital debugging information about what should be fixed before releasing. This has produced tons of great data on how games can be optimized for end-user devices.

Overview of Test Runs in Bitbar Public Cloud

During 2016, we provided the most diverse (devices from all around the world, with various OS combinations, etc.) and the largest mobile device farm for Android and iOS app, game and web developers. Among this group of users are Fortune 500 companies, a dozen of the top 20 mobile game companies, as well as lots of SMBs and indie developers.

Their mobile test runs hammered thousands of our devices every day, producing an enormous amount of data: results, screenshots, videos recorded from device runs, and performance statistics on how their mobile apps and games work on these different handsets.

The total number of test runs exceeded 184 million unique device runs, the majority of them on Android devices but with a significant, quarter-by-quarter growth trend on iOS. This indicates that ‘fragmentation’ is far from solved on either major mobile platform, and that testing must take place before top companies push their apps and games to consumers.

Device runs at Bitbar Public Cloud

From the data, close to 84% of test runs were executed on Android devices (and less than 16% on iOS devices), with a slightly higher failure rate on Android (17.8%) than on iOS (15.5%). This shows that there are plenty of issues on iOS devices too, with an increasing trend. Mainly this is due to new iOS versions and new form factors on iOS devices (plus new APIs, a new notification system, etc.).
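Plugging the percentages above into the 184 million total device runs mentioned earlier gives a rough back-of-the-envelope breakdown; the shares quoted in the post are approximate, so these are illustrative figures only:

```python
# Rough arithmetic from the figures quoted in the post (approximate shares)
total_runs = 184_000_000            # total unique device runs in 2016
android_share, ios_share = 0.84, 0.16
android_fail_rate, ios_fail_rate = 0.178, 0.155

android_runs = total_runs * android_share
ios_runs = total_runs * ios_share

android_failures = android_runs * android_fail_rate
ios_failures = ios_runs * ios_fail_rate

# Blended failure rate across both platforms
blended = (android_failures + ios_failures) / total_runs
print(round(android_failures), round(ios_failures), round(blended, 3))
# → 27511680 4563200 0.174
```

Even at a lower per-run failure rate, iOS still accounts for millions of failed device runs per year at this volume, which matches the observation that iOS issues are far from rare.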

What’s really awesome to see is that concurrency has been going up quarter by quarter. The average number of mobile devices used per test run is now over 10. However, there are still lots of outliers and plenty of variation here: many users still test one device at a time, while some test automation experts use tens (or sometimes even hundreds) of devices simultaneously for their test runs.
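The fan-out pattern behind that concurrency can be sketched in a few lines; `run_on_device` and the device names below are made-up stand-ins for a real device-farm API call:

```python
from concurrent.futures import ThreadPoolExecutor

# Hypothetical stand-in for launching one test run on a cloud device;
# a real client would call the device farm's API here instead.
def run_on_device(device: str) -> dict:
    return {"device": device, "status": "passed"}

devices = [f"device-{i}" for i in range(10)]  # e.g. 10 devices per run

# Fan the same test run out to every device concurrently
with ThreadPoolExecutor(max_workers=len(devices)) as pool:
    results = list(pool.map(run_on_device, devices))

print(sum(r["status"] == "passed" for r in results))  # → 10
```

The point is simply that each device run is independent, so wall-clock time stays roughly constant whether a run targets one device or a hundred.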

The average test time also varies a lot. For example, Espresso tests typically complete quickly, while some extensive, thorough Appium test runs may take 10x the time. Naturally, it’s all about the scope of testing: more logic is typically included in Appium tests, and those runs cover functional aspects better than unit-like tests with Espresso.

The Most Popular Test Automation Frameworks

Appium has been the most popular framework for functional testing of mobile apps, games and, to a certain extent, mobile web as well (we didn’t include Selenium in these numbers, as the majority of those Selenium setups are more or less Appium-based). There are lots of good reasons why people have been using Appium (it’s cross-platform, supports literally any programming language, has great API coverage, etc.) and frankly it worked extremely well until the recent changes with iOS UI Automation.
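To illustrate the cross-platform, any-language point, here is a minimal sketch of how an Appium session is typically configured from Python. The device name, app path and server URL are placeholders, and the driver calls are commented out because they need a live Appium server and a connected device:

```python
# A minimal sketch of Appium desired capabilities for an Android run.
# All values below are placeholders, not a specific cloud configuration.
caps = {
    "platformName": "Android",
    "deviceName": "Android Emulator",  # placeholder device name
    "app": "/path/to/app.apk",         # placeholder path to the app under test
    "automationName": "UiAutomator2",
}

# Requires the Appium-Python-Client package plus a running Appium server:
# from appium import webdriver
# driver = webdriver.Remote("http://localhost:4723/wd/hub", caps)
# ...interact with the app, collect results...
# driver.quit()

print(sorted(caps))
```

The same test logic can target iOS by swapping the platform-specific capabilities, which is exactly why Appium appeals to teams shipping on both platforms.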

On Android, Espresso and UIAutomator have been very popular as well, and there are good reasons why people use and love these frameworks: Espresso provides very fast test execution, and UIAutomator provides a lightweight API that is easy to adopt and use for native apps. Both of these frameworks, however, are more or less limited to native apps. Again, the majority of game developers use either Appium or some internally developed framework.

Another promising new framework that has come into play is Robot Framework, which has quickly overtaken Robotium, one of the mainstream frameworks in the early days of Android test automation.

On iOS, Appium and Calabash, both cross-platform frameworks, have been popular choices for test automation. As many game developers have been using Appium, it’s also obvious from the data that its usage spans both major platforms, Android and iOS.

Another highly used framework on iOS was UI Automation (until the last quarter of 2016). Since then there has been (and will continue to be) a clear movement toward replacement frameworks, which I’ll cover in a bit more detail in the next chapter.

On both platforms, there are also lots of other frameworks that didn’t make this list. Some of them are internal, proprietary or simply niche frameworks that their users happen to prefer. As stated, there is no wrong choice of framework as long as it does the job efficiently and provides accurate results on how apps, games and web content work on real mobile devices.

New Trends with Mobile Test Automation Frameworks

Okay, let’s speculate a bit.

Currently, we don’t see any major changes taking place in Android test automation. Appium, Calabash and Espresso will definitely stay in high use, and will probably grow even stronger, as Google tends to favour Espresso and there are lots of Appium fans out there as well. Calabash hasn’t brought much new to the ecosystem lately, but one of its strengths is that “if you speak English, you can write a mobile test”.

The most significant changes are happening on the iOS side. As UI Automation was deprecated by Apple, XCTest and XCUITest got an excellent start toward becoming the next big frameworks in iOS test automation. However, the change by Apple also hit Appium, as it relied on UI Automation as its foundation.

Trends with mobile test automation frameworks

Now, the big question is what happens to Appium after the UI Automation deprecation. If things start to work well with Appium, there is a good chance that many UI Automation users will move to Appium instead of XCTest/XCUITest, which is what we see happening now. It’s always worth considering which framework (XCTest vs. Appium) can provide the more stable environment for test automation runs in the coming years.

If you are looking for some guidance on what’s going on with Appium (and how to use the latest version of it efficiently), take a look at this post by my colleague.

Alright, let’s speculate a bit more. Let us know in the comment section below which frameworks will fall, which will flourish, and why!



  • Mattje

    Robot Framework doesn’t directly interact with a mobile environment. Rather, it uses libraries to interface with other frameworks, e.g. Appium and Calabash.
    We use RF because it’s easy to write tests, it has nice reporting capabilities, and it can test non-mobile stuff as well.
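To illustrate the comment above: a Robot Framework library is just a Python class whose methods become keywords. The class and keyword names below are made up for illustration; in practice, libraries such as AppiumLibrary do the real device work:

```python
# Minimal sketch of a Robot Framework keyword library: plain Python
# methods become keywords ("Open App", "App Should Be Running").
# This is a stub; a real library would drive a device via Appium etc.
class DemoAppLibrary:
    def __init__(self):
        self._running = False

    def open_app(self, app_name: str):
        """Keyword usage in a .robot file:  Open App    MyApp"""
        self._running = True
        return app_name

    def app_should_be_running(self):
        """Keyword usage in a .robot file:  App Should Be Running"""
        if not self._running:
            raise AssertionError("App is not running")

# Robot Framework would load this with `Library    DemoAppLibrary`;
# here we just exercise the keywords directly.
lib = DemoAppLibrary()
lib.open_app("MyApp")
lib.app_should_be_running()
print("ok")
```

This layering is why Robot Framework could overtake Robotium so quickly: it reuses whatever mobile framework sits underneath rather than reimplementing device interaction itself.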
