Enhanced Test Reporting and Reports on Your Mobile Test Cases


Dear Testdroiders,

When it comes to test automation frameworks for mobile development, there are plenty of great options for modern app developers out there. Many of these mobile-centric test automation frameworks are actively developed and have active communities around them. We at Bitbar believe in these communities and encourage everyone to contribute to and use these great open source frameworks, as testing your mobile apps, games and web content is vital for your success. Naturally, all of these frameworks have their own traits, pros and cons, and one of the top features you should look for is how well a framework exposes issues, performance data and all the other details you need to fix your app and deliver a great user experience.


Top Mobile Testing Frameworks – Android and iOS

We rely heavily on open source technology here at Bitbar and we encourage all our users to do the same. One of the most frequently asked questions is which test automation framework works best – but unfortunately there is no straight answer. It really comes down to how the app is built, which underlying layers, frameworks and complementary software stacks you’ve used in your app, whether it has graphical content (e.g. a pure OpenGL ES implementation), whether it is input-driven, and so on. All of these characteristics determine which framework might work best for you. However, the core of all testing, implementation, processes and practices should always be considered in the context of mobile automation – and how well the framework produces useful information about potential problems in your app.

By nature, the Testdroid Cloud service is agnostic about testing, test automation and test frameworks. The entire service is built so that the most common frameworks are available to everyone in the public cloud, while Testdroid Private Cloud and Testdroid Enterprise support literally any framework, setup and integration. Because of the huge number of possible (and publicly available) test frameworks, including their various versions, the question comes down to support and which frameworks should be enabled as standard for all public cloud users – and, more importantly, how to harmonize all results, reports, logs, screenshots and test steps across these different frameworks.


Looking at the top mobile testing frameworks, the most popular and most widely used ones are Appium, Calabash, KIF (or XCTest), UI Automation, Robotium and Jasmine. All of these frameworks are supported on our platform. As of today, Appium and Calabash are the frameworks we see our customers using the most, and naturally those are the ones we update most often.
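For readers who haven’t worked with these frameworks yet, here is a minimal sketch of what such a test looks like in practice – a hypothetical Appium test written with the Appium Python client. The APK path, package name and element ID below are placeholders for illustration, not a real Bitbar sample app:

```python
# Illustrative Appium test sketch (Python client). The app path, package name
# and element ID are hypothetical placeholders.
from appium import webdriver

desired_caps = {
    "platformName": "Android",
    "deviceName": "Android Device",
    "app": "/path/to/app.apk",           # placeholder APK path
    "automationName": "UiAutomator2",
}

# Connect to a locally running Appium server.
driver = webdriver.Remote("http://localhost:4723/wd/hub", desired_caps)
try:
    # Tap a hypothetical login button by its resource id.
    driver.find_element_by_id("com.example:id/login_button").click()
    # Capture a screenshot so the report has visual evidence for this step.
    driver.save_screenshot("step_login.png")
finally:
    driver.quit()
```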

Test Reporting – The DNA to App Behavior and Performance

Updating frameworks is a straightforward task and we keep up with the latest versions of the available frameworks. Every now and then we make a bigger update where we integrate a framework even more tightly into our solution. Lately we did exactly that for Calabash Cucumber runs – and introduced a major revamp of their test result reporting.


The most important thing about test reporting and test case reports is that users get actionable and rich information about the performance of an app, the test cases, and every possible detail from the device run. We started this revamp from the Project View to give you even better real-time visibility into what is going on with your device runs. From now on, you’ll see the following project overview for all of your test runs:

[Screenshot: project overview of test runs]

Then, we wanted to improve the readability of the test run results. Some color and highlighting of the most meaningful things truly makes a difference! Now every Calabash Cucumber run report has all of the steps highlighted, with each framework action timed and nicely parsed. This is especially precious when you are trying to figure out what has gone wrong – and it saves you a lot of time. All screenshots can also be reviewed in the right context, meaning that when you highlight a test step you’ll see the corresponding screenshots in the Screenshots view.

[Screenshot: Calabash Cucumber run report with highlighted, timed steps]

All Calabash screenshots are now also mapped to each test step. While going through the test steps, you can select and check each step’s screenshots one by one. Thanks to this mapping, the reverse is also possible: if a screenshot looks a bit off, a single click takes you to the test step where it was captured.
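As a rough illustration of the idea (not the mechanism Testdroid Cloud itself uses), step-to-screenshot mapping can be approximated in your own scripts by naming each screenshot after the step that produced it, so an odd-looking image always points back to its step. The helper below is a Python sketch; the driver object and step names are assumptions:

```python
# Sketch only: encode the step index and name in each screenshot filename so
# images can be traced back to the step that produced them.
import re

def screenshot_for_step(driver, step_index, step_name):
    """Save a screenshot whose filename identifies the test step it belongs to."""
    safe_name = re.sub(r"[^A-Za-z0-9_-]+", "_", step_name).strip("_")
    filename = f"{step_index:03d}_{safe_name}.png"
    driver.save_screenshot(filename)   # standard WebDriver/Appium screenshot call
    return filename
```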

Another very important improvement is the detailed log of a test run. We’ve now separated the logs so that, for example, the Calabash/Cucumber log appears as its own file under the Log section. Furthermore, real-time data and log inspection has been significantly improved: everything works fast, and you have detailed log output for any given moment of the execution:

[Screenshots: separate Calabash/Cucumber log file in the Log section and real-time log inspection]
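If you collect device logs in your own test scripts, a similar separation is easy to approximate. The sketch below is only an illustration, not the Testdroid Cloud implementation: it pulls the Android logcat through an Appium session and writes it to its own file, with the output filename being an assumption:

```python
# Sketch: dump the Android logcat from an Appium session into a separate file
# so it can sit next to the other test artifacts.
def dump_device_log(driver, path="device_logcat.txt"):
    entries = driver.get_log("logcat")   # available for Android sessions
    with open(path, "w", encoding="utf-8") as log_file:
        for entry in entries:
            # Each entry is a dict with 'timestamp', 'level' and 'message'.
            log_file.write(f"{entry['timestamp']} {entry['level']} {entry['message']}\n")
```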

As a last note, a few more things are coming out later this week! A customer requested a better screenshot comparison view in addition to the one that was already there. This week’s release contains a new view that lets you compare screenshots step by step between selected devices. Here is a sneak peek at this brand new comparison view:

[Screenshot: step-by-step screenshot comparison view across devices]
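To give a rough idea of what such a comparison involves under the hood, here is a minimal sketch using Pillow that flags when the same step’s screenshots from two devices differ. It is only an illustration, not the algorithm behind the new view, and the example file paths are hypothetical:

```python
# Sketch: compare two step screenshots pixel-wise and report whether they
# differ beyond a simple per-channel tolerance.
from PIL import Image, ImageChops

def screenshots_differ(path_a, path_b, tolerance=0):
    """Return True if the two screenshots differ beyond the given tolerance."""
    img_a = Image.open(path_a).convert("RGB")
    img_b = Image.open(path_b).convert("RGB")
    if img_a.size != img_b.size:
        return True
    diff = ImageChops.difference(img_a, img_b)
    if diff.getbbox() is None:          # images are identical
        return False
    # getextrema() returns (min, max) per channel of the difference image.
    max_channel_diff = max(high for _, high in diff.getextrema())
    return max_channel_diff > tolerance

# Hypothetical usage: compare the same step captured on two devices.
# print(screenshots_differ("nexus5_step_003.png", "galaxy_s6_step_003.png"))
```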

All these changes will be available in Testdroid Cloud for all users, so please do provide feedback about these enhancements. You can always reach me by emailing niko dot cankar at bitbar dot com – and I’ll be happy to discuss them with you.

Happy Testing Testdroiders!
