All Around the Mulberry Bush
The objective is to give Apple users information that will make them healthier and more fit, and even before the platform is available, Apple has upgraded the design to AI agents integrated with Apple Intelligence, making that information more ‘real-time’, personal, and meaningful. The agents are the scavengers that will poll your Apple devices for the health data they collect and bring it to Apple Intelligence for monitoring and evaluation. Apple is thought to be planning not only evaluations of your nutrition and sleep habits but also camera-based assessments of your workouts and access to educational videos produced by internal and external health experts.
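For concreteness, the sketch below shows how such an on-device agent might poll HealthKit for a day’s step count before handing it off for evaluation. Apple has not disclosed Project Mulberry’s actual architecture, so the class name and the hand-off are assumptions made purely for illustration.

import HealthKit

// Illustrative sketch only: Apple has not published Project Mulberry's design.
// The class name and the "evaluation layer" hand-off are assumptions.
final class HealthPollingAgent {
    private let store = HKHealthStore()

    // Polls HealthKit for today's cumulative step count and passes it along.
    func fetchTodaySteps(completion: @escaping (Double?) -> Void) {
        guard HKHealthStore.isHealthDataAvailable(),
              let stepType = HKQuantityType.quantityType(forIdentifier: .stepCount) else {
            completion(nil)
            return
        }
        store.requestAuthorization(toShare: [], read: [stepType]) { granted, _ in
            guard granted else { completion(nil); return }
            let startOfDay = Calendar.current.startOfDay(for: Date())
            let predicate = HKQuery.predicateForSamples(withStart: startOfDay,
                                                        end: Date(),
                                                        options: .strictStartDate)
            let query = HKStatisticsQuery(quantityType: stepType,
                                          quantitySamplePredicate: predicate,
                                          options: .cumulativeSum) { _, stats, _ in
                // Hand the aggregated value to whatever layer does monitoring/evaluation.
                completion(stats?.sumQuantity()?.doubleValue(for: .count()))
            }
            self.store.execute(query)
        }
    }
}

The key design point is that the raw samples never need to leave the device; the agent reads only what the user has authorized and forwards an aggregate, which is the pattern Apple’s existing HealthKit permission model already encourages.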
While the platform is eventually expected to delve into physical therapy, mental health, and even cardiology, the initial focus is thought to be nutrition, with monitoring and alerts leading to personalized health advice based on your data, although there has also been talk of AI-based mental health counseling and predictive analysis for chronic disease. As one might expect, Apple’s focus seems to be on the ‘user experience’, the part of the Apple persona that allows it to charge a premium for its products, but Apple is certainly not the first to move in this direction in the new age of AI. Google’s (GOOG) Fit is a similar collector of personal health data through Android’s Health Connect, a platform that allows permitted 3rd-party apps to supply and collect the data that feeds the Google Fit app. It is more a collector, aggregator, and visualizer than an advice tool, although Google is currently working to integrate that data into its other health-related services, with a tie-in that references ‘reputable sources’ on YouTube.
Amazon (AMZN) also has a health program, but its focus is more oriented toward B2B, with Amazon Pharmacy supplying information on medications and interactions, and Amazon Clinic and One Medical able to set up virtual video or text sessions with clinicians (some on staff) who can evaluate conditions, make diagnoses, and prescribe medication for relatively common illnesses. There are also companies like Noom (pvt) and MyFitnessPal (pvt) that are more specific to food and calorie management, but given the enthusiasm for AI that seems rampant across the health sector, we expect almost every health-related application to leverage AI to stay competitive.
There are a few caveats here, particularly the HIPAA regulations, which govern any health information that is maintained or transferred. Covered entities must encrypt health data, limit access, perform risk assessments, maintain audit trails, issue breach notifications, and take ‘reasonable steps’ to prevent unauthorized access to or disclosure of patient information. HIPAA is difficult enough to understand and maintain on its own, but adding AI to the mix opens everything up to new legal questions, many of which have yet to reach the courts. As liability becomes a potential issue when health-related advice is being given, we expect many new court cases that will focus not only on the potential liability of poor or incorrect data, but also on questions of algorithmic bias, inadequate software testing, and the fact that AI systems are essentially ‘black boxes’ that make it all but impossible to determine where or how an AI arrived at a particular diagnosis or conclusion.
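As one concrete illustration of the ‘encrypt health data’ requirement, the hedged sketch below uses Apple’s CryptoKit to seal a health record with AES-GCM before it is stored or transmitted. The HealthRecord type and its field names are hypothetical, and real HIPAA compliance involves far more than encryption.

import Foundation
import CryptoKit

// Hypothetical record type for illustration; the field names are assumptions.
struct HealthRecord: Codable {
    let userID: UUID
    let metric: String
    let value: Double
    let recordedAt: Date
}

// Encrypt a record with AES-GCM before it is written to disk or sent over the wire.
// The combined blob bundles nonce, ciphertext, and authentication tag together.
func seal(_ record: HealthRecord, with key: SymmetricKey) throws -> Data {
    let plaintext = try JSONEncoder().encode(record)
    let box = try AES.GCM.seal(plaintext, using: key)
    return box.combined!   // non-nil when the default 12-byte nonce is used
}

// Decrypt and decode a previously sealed record.
func open(_ data: Data, with key: SymmetricKey) throws -> HealthRecord {
    let box = try AES.GCM.SealedBox(combined: data)
    let plaintext = try AES.GCM.open(box, using: key)
    return try JSONDecoder().decode(HealthRecord.self, from: plaintext)
}

// A 256-bit key; in practice it would be generated once and stored securely.
let key = SymmetricKey(size: .bits256)

In practice the key would live in the Keychain or Secure Enclave, access would be gated per user, and every read or write would be appended to an audit log; encryption is only one of HIPAA’s technical safeguards.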
Smart lawyers will name as defendants not only site owners but also those who wrote the algorithms that run them, looking for biases that could cause hallucinations, errors in judgment, or flawed diagnoses based on poor human vetting. When AI developers are called into court to defend what data was included in an AI’s training or what process was used to draw a conclusion, high-level math will not be how a jury judges them. So while Apple jumps into the fray to provide a positive health experience through Project Mulberry and Apple Intelligence, this is not like Wikipedia, where you take things with a grain of salt; healthcare decisions affect people’s lives, and some of those decisions will be significantly influenced by the information AI healthcare tools provide. There are good and bad doctors, and sometimes doctors make mistakes, which is why malpractice insurance exists, but will there be malpractice insurance for an application that gives incorrect advice or misdiagnoses an ailment or mental condition?
[1] IP Address
Device Type & Model
Operating System
Device Identifiers (trackers like AAID, IMEI)
Screen Resolution
Installed apps (some)
Browser type & version
Cookies (Optional)
Browsing History (Optional)
Location Data (Optional)
Referring websites
App usage
Contacts & Calendar (Optional)
Photos & Videos (Optional)
In-app purchases
Search queries (Optional)
Social Media Activity
Shopping activity
Form submissions
Wi-Fi network name
Data usage
Bluetooth data
Sensor tracking
Accelerometer & Gyroscope data
Ambient Lighting data
‘like’ data
DNS lookups
…To name a few.