Abstract: The Apple App Store uses a curated model in which all applications must be approved by Apple before they are published. It has been argued that this provides better security than Android, where applications are published immediately and removed only if they are found to be problematic. We study the cases of Smurfs’ Village and similar applications and argue that the protection offered by this form of security is limited. We conclude that the image of Apple as a benevolent dictator acting as a neutral arbiter to promote the best user experience is dangerously naive. Apple’s economic interests will often differ from their users’ security interests.
In the Apple App Store for iOS devices such as the iPhone and iPad, all applications must be approved by Apple before they are published. That is in contrast to Android, in which submitted applications are published immediately and removed only if they are determined to be problematic. The Apple approval process has been criticized as arbitrary and confusing, and many accuse Apple of exercising excessive control. The App Store Review Guidelines are eight pages long. The introduction contains statements like the following:
- We view Apps different than books or songs, which we do not curate. If you want to criticize a religion, write a book. If you want to describe sex, write a book or a song, or create a medical app.
- We have over 350,000 apps in the App Store. We don’t need any more Fart apps.
- [If] you’re trying to get your first practice App into the store to impress your friends, please brace yourself for rejection. We have lots of serious developers who don’t want their quality Apps to be surrounded by amateur hour.
- If your app is rejected, we have a Review Board that you can appeal to. If you run to the press and trash us, it never helps.
- If it sounds like we’re control freaks, well, maybe it’s because we’re so committed to our users and making sure they have a quality experience with our products.
The rest of the Guidelines contains specific rules about what is and is not allowed. It is generally acknowledged that not all submissions are reviewed equally. For example, Apple includes a ban on offensive or mean-spirited commentary but specifically exempts professional political satirists and humorists. Additionally, well-known companies and organizations are thought to have an easier time getting their apps approved than individual developers.
Though many have criticized those restrictions as excessive, it has been argued that they provide protection against malware. Indeed, Jonathan Zittrain, among others, has worried that the iPhone will be adopted because it provides increased security even though this security comes at the cost of the freedom and flexibility that has led to significant innovation in the computer industry. But just how effectively will the Apple review process protect users? Software auditing is time-consuming and expensive. There are a variety of technical measures that would allow malicious developers to hide unapproved functionality from Apple’s reviewers. I may discuss these techniques in another, geekier, post.
In this post, I use the existence of Smurfs’ Village and similar apps to argue that a curated app store is not a security panacea. These types of apps, many of which are targeted at children, are free to download but have very expensive in-app purchases for items. For example, Smurfs’ Village, which is based on the Smurfs TV show, sold a wheelbarrow of Smurf Berries for $99. Many children amassed huge bills playing these applications on their parents’ phones. One 8-year-old girl amassed a $1400 phone bill playing Smurfs’ Village. (See http://www.digitaltrends.com/mobile/publisher-greed-little-girl-amasses-1400-iphone-bill-playing-smurfs-village/ ) There have been allegations of similar spending on other games, such as a 5-year-old girl spending $99.99 in Fishies, a 9-year-old boy spending $670 on virtual weapons for X-Mas Resort and F.A.S.T, and a 9-year-old girl spending $200 on Zombie Toxins and other items in Zombie Cafe. (http://newsandinsight.thomsonreuters.com/New_York/News/2011/05_-_May/Class_action_firms_go_after__free__Apple_apps/ ) Whether these types of applications should count as malware or badware is a question of semantics that we will not attempt to address. However, it is worth noting that the financial harm suffered by the parents of these children is likely greater than that suffered by those whose PCs or Android phones are infected with traditional malware.
Why does Apple allow these applications in their app store? An app like Smurfs’ Village might be justifiable based on laissez-faire arguments in an open market, but that argument makes no sense in the context of Apple’s curated market. It is impossible to know Apple’s exact motivation, but economic incentives may provide the answer. Apple requires that all game items be purchased using their In-App Purchase API payment system and takes a 30% cut of every transaction. Thus, Apple has an incentive for users to spend as much money as possible. Indeed, Apple’s guidelines contained no restriction on excessive pricing of in-game items even though they contained restrictions on just about everything else. (Apple’s 30% commission is much greater than the amount taken by credit card companies, and consequently Apple has a much greater incentive to allow expensive purchases.)
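To make the incentive gap concrete, here is a rough back-of-the-envelope comparison for the $99 wheelbarrow of Smurf Berries. Apple’s 30% cut is from the source; the ~2.9% + $0.30 card fee is an illustrative assumption about typical credit card processing costs, not a figure from Apple or any processor.

```python
def apple_cut(price):
    """Apple's 30% commission on an in-app purchase."""
    return round(price * 0.30, 2)

def card_cut(price, rate=0.029, flat=0.30):
    """Hypothetical credit-card processing fee (~2.9% + $0.30 assumed)."""
    return round(price * rate + flat, 2)

price = 99.00  # the wheelbarrow of Smurf Berries
print(apple_cut(price))  # 29.7
print(card_cut(price))   # 3.17
```

On this (assumed) fee structure, Apple earns roughly ten times what a card processor would on the same sale, which is the asymmetry of incentives the parenthetical above points to.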
Apple has since taken some steps to mitigate the problem. They now require a password to be entered before every purchase (previously the password would be cached for 15 minutes) and allow in-app purchases to be disabled in device settings. Whether or not these mechanisms are sufficient to protect parents from this specific type of application, in-app purchases are a rich target for exploitation. In the future, there will likely be new and different attempts to exploit them that are both creative and disturbing. In the media, there is a popular stereotype that undesirable software originates from super hackers who are either anti-social Western males or Russian gangs. The danger of this myth is that it causes us to be overly trusting of corporate software. To the extent that Apple relies on developer reputation in the approval process, it may be more likely to approve unsavory software that has the veneer of corporate legitimacy.
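The pre-fix caching behavior described above can be illustrated with a toy model: one password entry authorizes every purchase for the next 15 minutes, so a child handed the phone right after a parent types the password can buy freely with no prompt. The class and item names here are illustrative, not Apple’s actual implementation.

```python
CACHE_WINDOW = 15 * 60  # seconds the password stays cached (pre-fix behavior)

class PurchaseSession:
    """Toy model of the old App Store auth cache: a single password
    entry authorizes all purchases within the cache window."""

    def __init__(self):
        self.last_auth = None  # time of last password entry, in seconds

    def enter_password(self, now):
        self.last_auth = now

    def purchase(self, item, price, now):
        # No password prompt if we're still inside the cache window.
        if self.last_auth is not None and now - self.last_auth < CACHE_WINDOW:
            return f"charged ${price} for {item} (no prompt)"
        return "password required"

session = PurchaseSession()
session.enter_password(now=0)  # parent enters password to download the free game
print(session.purchase("Smurf Berries", 99, now=600))   # child, 10 minutes later
print(session.purchase("Smurf Berries", 99, now=1200))  # 20 minutes later
```

The first purchase sails through silently; only the second, outside the window, triggers a prompt. Requiring the password on every purchase amounts to shrinking the cache window to zero.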
It is clear that Apple’s review process does not prevent users from being harmed by unsavory applications. Indeed, the image of Apple as a benevolent dictator acting as a neutral arbiter to promote the best user experience is dangerously naive. Apple’s economic interests will often differ from their users’ security interests.