Higher ratings are the ‘lifeblood’ of the smartphone app world but what if they are inflated? From a report:
Rating an iPhone app takes just a second, maybe two. “Enjoying Skype?” a prompt will ask, and you tap a one-to-five star rating. Millions of people respond to these requests, giving the choice no more thought than a fleeting whim. Behind the scenes, though, an entire industry has spent countless hours and lines of code crafting this moment. The prompt, seemingly random, can be orchestrated to hit your glowing screen only at the times when you are most likely to leave a five-star review. Gaming apps will solicit a rating just after you reach a high score. Banking apps will ask when they know it’s payday. Gambling apps will prompt users just after they are dealt a perfect blackjack hand. A sports app will give the nudge only when a user’s team is winning.
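The timing trick described above amounts to a simple gating rule: only surface the rating request when the user's most recent in-app event suggests an emotional high point. A minimal sketch of such a heuristic, in Python, with entirely hypothetical event names and state (real apps reportedly layer machine learning on top of rules like these):

```python
# Hypothetical sketch of a "peak positive moment" rating-prompt gate.
# Event names and the SessionState structure are illustrative, not any
# real app's implementation.

from dataclasses import dataclass, field


@dataclass
class SessionState:
    events: list = field(default_factory=list)  # recent in-app events, newest last
    prompts_shown: int = 0                      # how many prompts the user has seen

# Events assumed to mark a high point for each kind of app in the article.
POSITIVE_TRIGGERS = {
    "new_high_score",    # gaming app: just beat a personal best
    "salary_deposited",  # banking app: payday
    "perfect_hand",      # gambling app: dealt a blackjack
    "team_winning",      # sports app: user's team is ahead
}

MAX_PROMPTS = 3  # e.g. Apple caps system review prompts at three per year


def should_request_rating(state: SessionState) -> bool:
    """Ask for a rating only when the latest event is a positive trigger
    and the prompt quota is not exhausted."""
    if state.prompts_shown >= MAX_PROMPTS:
        return False
    return bool(state.events) and state.events[-1] in POSITIVE_TRIGGERS
```

So `should_request_rating(SessionState(events=["app_open", "new_high_score"]))` fires, while an ordinary session, or one that has used up its prompt quota, stays silent. Nothing here breaks Apple's rules; the gaming lies entirely in when the question is asked.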
Apple has for a decade clamped down on “ratings farms” and “download bots” that companies use to fraudulently garner five-star scores and manipulate App Store rankings. And it has had some success. But these are blunt instruments trying to cheat the system in clear violation of Apple’s rules. The more sophisticated techniques stay within the rules but draw on behavioural psychology to understand your mood, emotions and behaviour – they are not hacking the system; they are hacking your brain. “The algorithms that are used are very hush-hush,” says Saoud Khalifah, chief executive of Fakespot, a service that analyses the authenticity of reviews on the web. “They can target you when you are euphoric, when you have a lot of dopamine. They can use machine learning to determine [when] a user will be more inclined to leave positive reviews.”