The mobile market continues to evolve and grow. This is evidenced by the large number of handset manufacturers that continue to churn out devices, the choice of different mobile operating systems, and the range of form factors from tablets to minis to standard sizes.
Given the rise in mobile use, companies and brands are turning more and more towards developing mobile applications and launching them into the market for consumers. However, the sheer number of possible mobile configurations can present a challenge for developers of these applications. Ensuring that an app works well on a large number of devices is crucial to avoid bugs and frustrated users.
This blog looks at the differences between testing your mobile application or game on emulators and on real devices – in the context of test automation. As this is one of the most frequently asked questions regarding mobile app testing, the comparison needs to cover many aspects. We’ll take a look at what matters to end-users, how those things are seen by developers, and what eventually strikes a balance between testing on emulators and real devices.
To get a comprehensive comparison of using emulators and real devices in your app testing, we separate the different aspects of app testing into four categories: User Experience, Hardware, Software and Infrastructure. Whereas the user experience category focuses on overall usability and testing the app as a whole, the hardware and software categories are broken down into separate aspects, and infrastructure takes the surrounding environment into consideration.
CATEGORY #1 – User Experience and Usability
Testing usability and the user experience your app delivers to its end-users simply cannot be done on a desktop with a mouse and keyboard. Therefore, we picked a few things you should think about when selecting a platform for testing:
User Interactions – Frankly, not all types of user interactions – or stimuli – can be tested on an emulator. Clicking a simulator with a mouse or keyboard is different from using a finger on the screen of a mobile device. Real devices provide a real user environment, and interactions like pinching, zooming and scrolling behave considerably differently on a touchscreen.
Real-world events – Interrupts, battery consumption, and how charging affects overall performance and usage. These sorts of events can be simulated on emulators, but a simulation isn’t the real thing. Frankly, the only way to test real-world events during the execution of your mobile app is on an actual device.
Overall performance – It is important to test an app for its performance, and using an emulated or simulated environment for performance testing is not realistic – the results are meaningless. Performance lags are easy to expose on actual devices. In addition, the current Android emulator – which is based on QEMU – isn’t fast or agile on any scale. The frustration of developers and testers using a slow emulator can be avoided with real devices – physical ones or ones residing in the cloud. For comparison, uploading your APK from the Android SDK to Bitbar Testing with the run-in-cloud plugin gives you results in a minute; in that time you would barely see an emulator finish booting.
Consistency in results – You would think that running your app on an emulator would always produce the same results. Actually, because the environment is not real, results are not guaranteed to be right, as many factors come into play: your desktop setup, internet connection, file system, graphics rendering capability and so on. Only real devices provide a platform for testing the real user experience.
CATEGORY #2 – Spectrum of device configurations (Hardware)
Hardware is naturally the biggest difference in testing between emulators and real devices. However, many factors also make hardware perform differently; here are some of the most common ones:
Chipset – It’s actually amazing how differently different silicon (CPU, GPU etc.) performs. Imagine executing something targeted at high-end devices on low-end hardware. That simply doesn’t work. User experience takes the most obvious hit: performance drops, and apps/games targeted at high-end hardware deliver a bad experience when run on low-end silicon with a low clock frequency. For many apps/games the effect is severe, as activities are synced with the clock and the UI refresh can’t keep up. Users see this sort of behavior as badly rendered graphics, screen flicker, and general slowness.
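One common mitigation for the clock-sync problem above is to scale animation by elapsed wall-clock time rather than assuming a fixed frame rate. Here is a minimal, hypothetical sketch in plain Java (not tied to any real game engine); the names are illustrative only:

```java
public class DeltaTimeAnimation {
    // Advance a position at a fixed speed in units per second.
    // Scaling by elapsed time keeps motion identical regardless of
    // how many frames the device manages to render.
    public static double advance(double position, double unitsPerSecond,
                                 double elapsedSeconds) {
        return position + unitsPerSecond * elapsedSeconds;
    }

    public static void main(String[] args) {
        // A fast device rendering 60 frames vs. a slow one rendering 15:
        // both cover the same distance over one second of wall-clock time.
        double fast = 0.0, slow = 0.0;
        for (int i = 0; i < 60; i++) fast = advance(fast, 100.0, 1.0 / 60);
        for (int i = 0; i < 15; i++) slow = advance(slow, 100.0, 1.0 / 15);
        System.out.println(fast + " " + slow); // both approximately 100.0
    }
}
```

An app that instead moves a fixed distance per frame will appear to crawl on low-end silicon – exactly the slowness users complain about.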
Display – It is not always the resolution that causes a headache for developers; more and more problems relate to the quality of the display: its density, colors and overall quality. For example, many games suffer because of poor-quality displays in mobile devices. A developer wants a blue button on a certain part of the UI, but it shows up as a different shade of blue – in some cases even a totally different color. This is a very common problem, and it relates not only to the display but also to drivers. A graphics driver can break a consistent-looking UI, as we’ve seen in the Android world in the past. In addition, in some apps graphical content has gone missing entirely because of color brightness and the low density of the display.
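Density is the part of this that can at least be reasoned about numerically. Android defines a density-independent pixel (dp) as one physical pixel at the 160 dpi baseline, so px = dp × (dpi / 160). The sketch below implements that formula in plain Java to show how the same UI element occupies very different pixel counts across screens:

```java
public class DensityConversion {
    // Android's baseline density is 160 dpi ("mdpi"); one dp equals
    // one physical pixel at that density.
    private static final double BASELINE_DPI = 160.0;

    // Convert a size in dp to physical pixels for a given screen density.
    public static int dpToPx(double dp, double screenDpi) {
        return (int) Math.round(dp * (screenDpi / BASELINE_DPI));
    }

    public static void main(String[] args) {
        // The same 48 dp touch target on mdpi, xhdpi and xxhdpi screens:
        System.out.println(dpToPx(48, 160)); // 48 px
        System.out.println(dpToPx(48, 320)); // 96 px
        System.out.println(dpToPx(48, 480)); // 144 px
    }
}
```

The formula tells you the pixel count, but only a real panel tells you how those pixels actually look – which is why density bugs and color-shift bugs so often surface only on hardware.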
Memory – Developers witness this problem quite often: in many cases their app or game doesn’t run on certain Android devices because it consumes too much memory. Typically the most popular apps and games – those rated 4 and 5 stars in app stores – do not have this problem, because memory handling is implemented right. But too many of today’s apps and games are still developed primarily for high-end devices, and low-end devices can’t run them. What is clear is that this type of problem can be tackled easily when the app/game is properly tested across an array of Android devices.
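A typical memory culprit is decoding full-resolution images into views that are far smaller. Android’s documented pattern for this is to compute a power-of-two sample size for `BitmapFactory.Options.inSampleSize`; the sketch below reproduces that calculation in plain Java so the arithmetic is visible outside the Android framework:

```java
public class BitmapSampling {
    // Compute a power-of-two sample size that shrinks a source image so
    // both dimensions still cover the requested bounds. Mirrors the
    // standard inSampleSize calculation from Android's developer guides.
    public static int calculateInSampleSize(int srcWidth, int srcHeight,
                                            int reqWidth, int reqHeight) {
        int inSampleSize = 1;
        if (srcHeight > reqHeight || srcWidth > reqWidth) {
            int halfHeight = srcHeight / 2;
            int halfWidth = srcWidth / 2;
            // Keep doubling while the halved dimensions still fit the request.
            while ((halfHeight / inSampleSize) >= reqHeight
                    && (halfWidth / inSampleSize) >= reqWidth) {
                inSampleSize *= 2;
            }
        }
        return inSampleSize;
    }

    public static void main(String[] args) {
        // A 4000x3000 photo decoded for a 1000x750 view can be sampled at
        // 1/4 in each dimension, cutting memory use by roughly 16x.
        System.out.println(calculateInSampleSize(4000, 3000, 1000, 750)); // 4
    }
}
```

Skipping this step is exactly how an app that runs fine on a 6 GB flagship crashes with an out-of-memory error on a 1 GB budget device.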
Sensors – Misbehaving sensors – and we’re not talking about badly calibrated or uncalibrated sensors – cause various issues in games that take input from device handling. For GPS, the known issues are difficulty navigating indoors and in locations where satellites cannot be reached. One typical issue relates to media streaming – for example, a video meant to be shown in landscape mode played fine in landscape, but you had to rotate the device 180 degrees to see it the right way up. Frankly, there is no way to test orientation properly with an emulator, and things related to accelerometers, geolocation, and push notifications either cannot be emulated or may produce inaccurate results.
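To see why emulated sensor input is a poor substitute, consider the simplest possible orientation logic. The hypothetical sketch below derives a coarse orientation from gravity’s projection onto the device’s x (short edge) and y (long edge) axes, as an accelerometer would report them; real hardware adds noise, drift and odd mounting orientations that an emulator’s clean synthetic values never exercise:

```java
public class OrientationFromAccelerometer {
    public enum Orientation { PORTRAIT, LANDSCAPE }

    // Coarse orientation: whichever axis carries more of gravity's
    // ~9.8 m/s^2 indicates how the device is being held. A hypothetical
    // simplification; production code must also smooth noisy readings.
    public static Orientation detect(double ax, double ay) {
        return Math.abs(ay) >= Math.abs(ax) ? Orientation.PORTRAIT
                                            : Orientation.LANDSCAPE;
    }

    public static void main(String[] args) {
        // Device upright: gravity falls mostly along the y axis.
        System.out.println(detect(0.5, 9.6));  // PORTRAIT
        // Device on its side: gravity falls mostly along the x axis.
        System.out.println(detect(9.7, 0.3));  // LANDSCAPE
    }
}
```

On an emulator these inputs are typed in by hand; on a real device they arrive as a jittery stream while the user walks, tilts and flips the phone – which is where the 180-degree video bug above comes from.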
CATEGORY #3 – Platform + Customizations (Software)
Your app is software, but there is plenty of other software on a mobile device too – and that ‘other’ software makes your app perform differently across an array of devices. Let’s look at some of the most common issues experienced by app developers:
Platform/OS version – In fact, the operating system version combined with the rest of the software – typically brought in by the OEM – has an impact on how an app/game performs on any device. The well-known root cause of Android problems is the OS platform version: something that mattered to developers and end-users in a prior version may not play as important a role in the new one. For example, the version of Android executed on QEMU is not the same as you’ll find on any OEM’s Android device – no matter what the version number or API level tells you. OEMs typically add customizations to middleware, drivers, and other infrastructure software. How could any developer access these on an emulator?
OEM and Carrier customizations – What is fragmentation for developers is differentiation for OEMs. OEMs have keenly built their own user interface layers, skins and other middleware on top of vanilla Android, and this is a significant source of fragmentation for developers. Many OEMs also bring their legacy – or simply Android-tailored – software to their devices, which further breaks compatibility for developers trying to make their apps work identically across brands and models. Drivers also cause major problems, many of them graphics-related. As mentioned, a developer may see a completely different color scheme on a different Android device – nothing close to what was intended.
Dependencies on other software/apps – Sometimes apps or games depend on access to another application. For example, many apps/games include social media features, and in the worst implementations it is taken for granted that certain social media applications are preinstalled on the device before a user can actually use those social media shortcut buttons. Needless to say, this creates a situation that affects not only the design but also the implementation of the app.
CATEGORY #4 – Infrastructure (Network)
Network/WiFi – Devices experience network issues, and a slow network cannot be tested through emulation. In terms of network configuration, emulators run on a PC, connect to the LAN and access the internet through your corporate firewall. With real handsets, the connection goes through the radio interface and from there to the internet.
Every developer should carefully weigh these pros and cons of emulators and real devices to make the right decision when shaping their mobile app testing strategy. Typically, emulators are used in the initial stages of app development, and real devices are brought into the game later on. However, the platform used to foster your next big thing should be as authentic as possible – from day one. In our experience, that has been one of the most important cornerstones of creating successful apps – ones with hundreds of millions of downloads.
Learn the major cost drivers in mobile testing and how to leverage test automation to improve ROI.