Apple has issued new rules for iOS developers relating to the new HealthKit and HomeKit platforms and to third-party app extensions, including third-party keyboards, that will be standout features of iOS 8. Apple will likely highlight these at its press event next week, where the iPhone 6 is expected to be announced, because they provide a major competitive advantage: They force developers to see the user as their ultimate customer, rather than as a means to collect data that can later be sold for profit.
The central theme of the new rules is that developers cannot use information obtained about users, their health, homes or keystrokes for advertising – either on their own or by selling or mining the data and then providing it to advertisers or data warehousing firms.
The HealthKit rules
New rules governing HealthKit were included in a revision of the license agreement for the most recent iOS 8 developer preview and were reported last week. Those restrictions prohibit developers from selling “health information collected through the HealthKit API to advertising platforms, data brokers or information resellers,” according to a report by the Financial Times, and also prohibit collected data from being used “for any purpose other than providing health and/or fitness services.”
Additional information about HealthKit restrictions and restrictions on HomeKit and extensions was revealed in an updated version of Apple’s App Store Review Guidelines, a public document that specifies the offenses that will get an app rejected or removed from the iOS App Store.
The HealthKit additions contain several interesting requirements and prohibitions.
Those restrictions are excellent for ensuring user health and medical information security. They also make it very clear that the end user is Apple’s ultimate customer, and that developers must treat the end user as their ultimate customer, rather than using the user as a tool to aggregate data for advertising or commercial data mining. As health and fitness platforms and devices proliferate, ensuring that a device maker, platform, or developer is focused solely on the user and his or her needs is a key component to establishing trust.
This isn’t just about establishing trust with the user, but also with the doctors, healthcare administrators, hospitals, insurers, and other healthcare entities that will be key to HealthKit (and any other similar platform) being a truly successful healthcare platform and solution. Apple very clearly wants HealthKit to be more than a fitness app. The company is working with top hospitals and clinics as well as at least two of the largest health IT and electronic health record vendors in North America. It is looking to change the way individuals and their healthcare teams interact and manage the full range of health-related activities. That requires that any app that can access the pool of HealthKit data on a user’s device be completely above board and trustworthy.
In the process, Apple is saying “no thank you” to any app developer or accessory maker that isn’t willing to put users and their privacy first. That means many apps may be rejected, but the quality and caliber of those that are approved will be top-notch.
It is interesting that Apple is allowing data mining for medical research, which likely includes research into specific diseases or conditions as well as public health and epidemiology, for which this kind of broad data set could be incredibly significant. Research shows that up to 90% of Americans are OK with sharing such data about themselves for research, but significant numbers want assurance that it is done anonymously and want some control over, and transparency about, how it’s used – issues that led some to question Jawbone’s use of user sleep data to study the recent California earthquake.
Given that Apple is requiring explicit user consent to share data, it clearly understands that consent remains necessary even when data is collected in the name of research, and that users should always understand and agree before their information is shared.
Nothing to do with nude photos
The rules also block apps from storing health data in iCloud – a restriction that wasn’t reported last week. It’s easy to point to the recent scandal of leaked celebrity nude photos, and the role iCloud played in it, as the reason for this rule, and many reports have seized on that. But anyone familiar with handling healthcare data, and with the privacy regulations surrounding that data, is going to see a much different story.
If healthcare providers are going to exchange information with HealthKit-enabled apps, the security provisions will have to be pretty serious to ensure providers don’t violate laws governing protected health information.
Apple needs to have a technical and legal barrier so that its data centers never see any of that data. If that happens at all, even unintentionally and without a data breach, Apple could violate privacy laws simply by handling that data without the required legal agreements with individual healthcare providers. The company certainly doesn’t want to be on the hook for a data-breach investigation or the fines that come with it.
Blocking access to iCloud was a no-brainer from the moment that HealthKit was proposed. It’s also worth noting that Apple has built an impressive team of medical and healthcare regulation experts and has done its due diligence, working with both healthcare technology companies and government agencies on how HealthKit would function under existing regulation.
In fact, the third interesting and important point is that Apple requires that developers abide by applicable law in each place or market where an app will be available. Apps that provide true medical functions, including “diagnoses, treatment advice, or control hardware designed to diagnose or treat medical conditions,” must be able to show they’ve won regulatory approval. That language likely came out of Apple’s interactions with the US Food and Drug Administration (FDA) about how the agency determines whether or not to regulate an app as a medical device.
The HomeKit rules
The HomeKit rules are much shorter, partly because home automation isn’t subject to regulation as stringent as healthcare’s, but they follow the same theme and are in some ways even stricter.
Apps accessing HomeKit cannot collect data to be used for advertising or data mining, much as is the case with HealthKit. Additionally, HomeKit data cannot be collected or used for any purpose other than home automation or to improve either the user experience of an app (and presumably a related device) or “hardware/software performance in providing home automation functionality.”
Apple also specifies that any app accessing HomeKit must be focused on home automation, saying that apps “must have a primary purpose of providing home automation services.” That’s a pretty clear-cut way of saying that you don’t get access to HomeKit’s data or its capabilities if you’re not using them in a very specific way. Again, the message is that Apple has no problem telling developers not focused on the user’s needs to take a hike.
Although there is no major regulatory or industry mandate for Apple to wall off HomeKit from advertising or data mining, the company is clearly saying that it understands users will be putting a lot of trust in the HomeKit solutions they adopt and in Apple itself. Apple is willing to lose some developers or smart devices in order to ensure its users’ privacy and physical safety. This is going to be an increasingly big concern with the Internet of Things, and Apple is setting the bar extremely high in terms of what it believes is the right approach to these types of issues.
The move also reinforces Apple’s business model. Although its content and app storefronts and related services are huge businesses in and of themselves, the core of Apple’s business is selling premium devices. Its other businesses and initiatives are about creating a premium experience for those devices.
In other words, its end customer is the person buying the iPhone, and Apple is always going to focus on that person’s experience, which allows it to dictate privacy terms in ways that companies with other business models, such as Google, may not be willing or able to do.
Extensions and keyboards
Extensions will likely end up having a larger impact on the iOS 8 experience and on many users than either HomeKit or HealthKit. They represent the first time that Apple has really let iOS apps interact with each other and provide system-level services. It should come as no surprise that Apple is once again setting very clear standards for privacy and security.
The main requirements for extensions, which include keyboards, sharing options and widgets for the iOS Notification Center, are that they cannot include advertising or marketing and that they must provide some form of useful functionality.
The review guidelines for third-party keyboards, however, are more explicit: Keyboards cannot collect information for any purpose other than enhancing keyboard functionality. The most alarming information a keyboard could capture for other purposes is actual keystrokes, which could encompass anything a user types – contacts, messages, events, and even passwords. But a keyboard could also collect information more useful for data mining, like which apps it’s used with, the times of day it’s used, or even a user’s location while typing.
Apple is pretty much saying that a keyboard must be just a keyboard and that it must work well.
Open but not too open
Overall, HealthKit, HomeKit and extensions in iOS 8 are a sign that Apple is beginning to open things up to developers and third parties in a way that’s out of character with the old Apple. I noted in June that this year’s developer conference was the first time that we really saw the type of company that Apple has become under Tim Cook’s leadership. I still believe that.
These privacy restrictions, however, show that Apple isn’t going to let iOS and its App Store become a free-for-all. It is going to retain the control and authority that it needs in order to deliver the best experience for its users. That control could prove a significant advantage. In a year when Facebook admitted that it experimented with the mental state of hundreds of thousands of its users, it’s refreshing to see a company that remains focused on what its users want and need, and doesn’t rely on selling information about them.