
Just a month ago we started a new blog series about best practices in mobile game testing. The series has been very popular (and again, thank you, everyone, for your feedback!), so we wanted to give you a closer look at one of the most important aspects of mobile game testing – how performance interacts with battery life and user experience.

Today’s guest blogger is Sharif Sakr from GameBench, a company that builds an uncheatable benchmark for the mobile gaming industry.

Guest blog

Android game developers naturally want to make their games work well on as many devices as possible.

That sounds obvious, doesn’t it? After all, it’s the reason why a service like Bitbar Testing is so important.

But this simple, indisputable notion contains a big assumption: that popular devices already exist out there in the world, so it’s a coder’s responsibility to make games that will run on them.

This assumption is certainly true and commercially sound. But it’s not very precise, because it doesn’t lay out which devices are “popular” enough to merit support and optimization. It’s also inherently conservative, suggesting that developers be afraid of fragmentation rather than nudging at its boundaries. What’s more, it puts pretty much all of the onus on the developer when a game doesn’t run as well as it should.

Testing Mobile Game Performance

If a Nexus 5 owner complains that Asphalt 7 is a battery hog, then they’d technically be right. Our tool, GameBench, shows that Asphalt 7 burns nearly one percent of battery for every minute of game time on the Nexus 5, meaning that it’d deplete the battery from 100 percent to empty in around two hours.
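The arithmetic behind that claim is simple: a drain rate in percent-per-minute converts directly into an estimated time-to-empty. As a back-of-the-envelope sketch (the function name here is illustrative, not part of GameBench's tooling):

```python
def minutes_to_empty(drain_pct_per_min: float, start_pct: float = 100.0) -> float:
    """Estimate gameplay minutes until the battery is empty,
    assuming a constant drain rate."""
    if drain_pct_per_min <= 0:
        raise ValueError("drain rate must be positive")
    return start_pct / drain_pct_per_min

# At a full 1% per minute, a charge lasts 100 minutes of gameplay;
# "nearly 1%" therefore lands somewhere under two hours.
print(minutes_to_empty(1.0))   # 100.0
print(minutes_to_empty(0.5))   # 200.0 -- half the drain rate doubles playtime
```

The real measurement is noisier than a constant rate, of course, but this is the order-of-magnitude calculation behind "empty in around two hours".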

[Chart: mobile game testing performance and battery]

Now, one percent per minute is a scarily fast drain rate, but it wouldn’t be fair to put the blame on Asphalt 7.

The truth is that the Nexus 5 has proved to be an excellent general-purpose phone with a great price tag, but it has very poor stamina when it comes to gaming. In fact, the phone’s gaming longevity is much worse than you’d expect from looking at its spec sheet, or from reading journalists’ reviews that mention its “average” battery life.

By contrast, GameBench data shows that Asphalt 7 runs just fine on the Galaxy S4 or even on the mid-range TCL (Alcatel) Idol X+. On these two phones, the game drains the battery at as little as half the rate observed on the Nexus 5, while still maintaining relatively steady frame rates above 30fps.


Testing Battery Life – Existing Benchmarks Can’t Solve This

Is the unhappy pairing of Asphalt 7 and the Nexus 5 anybody’s fault? No, not really, and I’ve probably focused on it too much already — GameBench’s ever-expanding database contains plenty of similar examples involving other games and other phones.

But it raises the question: If developers are trying to make their games run well on mainstream devices, can hardware makers reciprocate by ensuring that their phones can run mainstream games?

Or could they at least indicate the sort of experience that a game-playing consumer might expect from their hardware?

That’s how things have evolved in PC gaming, where customers, developers, product reviewers and hardware makers all know (thanks to an agreed framework of evaluation, largely based on game frame rate charts) what will be demanded from a low-, mid- or top-end graphics card.

In the mobile space today, the only indicators of gaming ability are the branding of a phone’s processor, its core count and clock speeds, and how well it performs in synthetic benchmarks. None of these is much help in this situation. The Nexus 5 contains a Snapdragon 800 chip that excels in all these areas — so who could have guessed that it had a problem with battery life?

Teaming up – Bitbar Testing and GameBench

We’re just getting started, but eventually, we’ll be able to offer you types of data that have never been brought together in one place before (spanning things like frame rates and frame rate variability, battery drain rates and temperature, and CPU/GPU usage, all tied to a timeline of screenshots).
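To make "frame rate variability" concrete: steadiness matters as much as the headline average, because a game that oscillates between 20fps and 60fps feels worse than one locked at 30fps. A minimal sketch of how such metrics can be derived from per-frame render times (this is an illustration of the general idea, not GameBench's actual implementation):

```python
from statistics import median, pstdev

def fps_metrics(frame_times_s: list[float]) -> dict:
    """Compute median FPS and frame-rate variability from a list of
    per-frame render times, in seconds."""
    fps = [1.0 / t for t in frame_times_s if t > 0]
    return {
        "median_fps": median(fps),
        "fps_stddev": pstdev(fps),  # lower = steadier frame delivery
    }

# A steady 30fps trace (33.3 ms per frame) versus a stuttery one
# that drops to 10fps for a stretch:
steady = fps_metrics([1 / 30] * 60)
stutter = fps_metrics([1 / 30] * 50 + [1 / 10] * 10)
print(steady["fps_stddev"])   # 0.0 -- perfectly steady
print(stutter["fps_stddev"])  # noticeably higher
```

Tying a timeline of such numbers to screenshots is what lets you see not just *that* a frame-rate dip happened, but *what was on screen* when it did.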

This data can be used to evaluate your unreleased software, but equally, it can be crowdsourced to publicly rate the pairings of hundreds of real-world games with hundreds of real-world phones and tablets, so it becomes clearer which sorts of pairings work well and which don’t.


We know from talking to many of you, and from reading reviews of the GameBench app at the Play Store, that developers are keen to have access to this sort of broad, real-world data.

What’s reassuring is that device makers, chipmakers and retailers all want it too. They’ve been among the first to see the potential of GameBench’s approach and to listen to what it says — even to the point of letting it influence their marketing and their plans for future products.

It’s a big ambition, but it’s also simply a matter of shared responsibility and aligned interests: If a casual game always runs well on a mid-range phone, and a graphically immersive game always runs well on a cutting edge device, then it stands to reason that more games and more devices will be sold to more people who will appreciate them.

Sharif Sakr – Director of Business Development, GameBench


Sharif is responsible for media, PR and brand licensing strategies at GameBench, as well as for growing the company’s relationships with users and enthusiasts. He comes from a journalistic background, having worked as a technology reporter for BBC2’s Working Lunch and BBC World TV News, and as Senior European Editor for Engadget. He still writes for Forbes in his capacity as an analyst and consultant, and makes regular appearances on Bloomberg, Sky, CNBC and other channels.

Sharif graduated from Oxford University with a degree in Human Sciences, but has since chosen to specialize in mobile devices and processors rather than bipeds. He’s the author of some of the web’s most influential reviews of phones, tablets and other gadgets.

Ville-Veikko Helppi

Mobile Testing Product Expert