
Building for iOS sometimes feels like archaeology: brush away enough guidelines and you hit something older and stranger. A system that can classify violence with forensic precision still can't decide if the human body is health, lifestyle, or sin.
One day I tried to ship a private intimacy tracker–nothing scandalous, just a journal for wellbeing–and App Store Connect assigned it the 16+ rating it uses for gambling apps and "unrestricted web access". The rating itself is fine: the target audience is well past that age anyway. What baffles me is the logic.
Silk–the app I’m talking about, almost reluctantly–is a wellbeing journal in the most boring sense possible. You jot down a few words about your day–moods, closeness, symptoms, or whatever else matters to you and your partner(s). It lives entirely on-device, syncs with nothing, and phones no one. The whole point is that nothing interesting happens to your data after you close the app.
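For what it’s worth, "local-only" is less a feature than an absence of code. Here is a minimal sketch of the idea in Core Data–illustrative names, not Silk’s actual source:

```swift
import CoreData

// A plain NSPersistentContainer, deliberately not
// NSPersistentCloudKitContainer (which would opt the store into iCloud
// sync), with complete file protection so entries are encrypted at rest.
final class LocalStore {
    static let shared = LocalStore()

    let container: NSPersistentContainer = {
        // "Journal" is a placeholder model name for this sketch.
        let container = NSPersistentContainer(name: "Journal")
        if let description = container.persistentStoreDescriptions.first {
            // Keep the store file encrypted whenever the device is locked.
            description.setOption(FileProtectionType.complete as NSObject,
                                  forKey: NSPersistentStoreFileProtectionKey)
        }
        container.loadPersistentStores { _, error in
            if let error { fatalError("Store failed to load: \(error)") }
        }
        return container
    }()
}
```

No networking import, no analytics SDK, nothing to audit: the data’s whole lifecycle is one encrypted file on the device.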
And yet, from the App Store’s point of view, you can build a game with guns and cartoon violence and happily ship it to kids, while tracking your own body needs a 16+ “mature themes” label.
Welcome to the grey zone.
If you were around for the early App Store, you’ll remember its optimism: accelerometer-driven beer glasses, wobbling jelly icons, flashlight apps that set brightness to 100% because no one had ever considered the idea before. The ecosystem assumed “content” meant pictures, sound, or the occasional cow-milking simulator–not a user quietly describing part of their life to themselves.
The App Store still carries the outline of that first life. Its vocabulary came from iTunes, which came from film ratings, built for a world where "content" meant something you could point a camera at. When the App Store arrived in 2008, it reused that system because it was available–and because no one expected apps to do much beyond wobbling or making noise.
Those assumptions didn’t last. By 2009 the Store had hosted an infamous $999 app that did nothing but display a red gem, a game where you shook a crying baby until it died, and enough fart apps that one reportedly earned five figures daily. The review process was learning in public.
Against that backdrop, Apple introduced age ratings in mid-2009 with iOS 3. The strictest category, 17+, wasn't really created for gore or gambling–it was a pressure valve for novelty apps where shaking your phone made cartoon clothes fall off. Anything that might show “objectionable content”, from bikini galleries to embedded browsers, went into the same bucket.
By 2010, Apple reversed course. After Steve Jobs declared "folks who want porn can buy an Android phone," thousands of sexy-but-not-explicit apps vanished overnight in what became known as the Great App Purge. The platform moved from reactive cleanup to something more systematic.
The Age Ratings matrix Apple operates now in iOS 26 is far more precise. It defines violence with forensic granularity, subdivides gambling, categorises medical risk. It does all the things a global marketplace must do once everyone realises software can cause harm.
But the matrix still retains its original silhouette: everything is defined by "content," not context. App Review's logic is keyed to artifacts inside the bundle–screenshots, metadata, stored assets–not to what the software actually does. That works beautifully for games and media. It falls apart when the "content" is whatever the user decides to write that day.
Silk has no images, no user-generated photos, no feed, no external links. The matrix rated it 16+ anyway–the same tier as gambling apps and unrestricted browsers. The rating isn't describing what Silk does. It's describing the absence of a category that should exist.
When HealthKit launched in 2014, Apple consciously avoided anything resembling "behavioural interpretation." Heart rate or steps were fine; relationships were not. A decade later, the API surface has expanded in every direction–sleep depth, sound exposure, handwashing, environmental allergens, even a "Sexual Activity" field added quietly in iOS 9. But relational wellbeing remains conspicuously absent.
HealthKit tracks heart rate, inhaler usage, mindfulness minutes, and the more delicate end of gastrointestinal bookkeeping. Nowhere does it model intimacy, affection, or closeness–the things couples might actually want to track privately. If the platform doesn't have words for what you're building, the classification system can't label it correctly. The vocabulary doesn't exist.
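You can see the gap directly in the API surface. The identifiers below are real HealthKit symbols; the point is where they stop:

```swift
import HealthKit

// Sexual activity exists as a bare event: a timestamp plus an optional
// "protection used" metadata flag, added quietly in iOS 9.
let sexualActivity = HKObjectType.categoryType(forIdentifier: .sexualActivity)!
let protectionFlag = HKMetadataKeySexualActivityProtectionUsed

// And that is where the vocabulary ends. There is no category identifier
// for closeness, affection, or relational wellbeing–no symbol to request
// authorization for, no sample type to write.
```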
Apple is not avoiding the topic here; they’re being literal. And when a system is literal in a domain that is inherently contextual, things start to get interesting.
The first place that breaks is search.
Apple's search infrastructure is fast and strict. Search for "budget app" and you get budget apps. Search for "meditation" and you get meditation apps plus a few over-confident habit trackers. Search for the phrases people actually use when they want what Silk does–"relationship journal", "couples diary", "private moments"–and you get wedding planners, travel blogs, generic note-taking apps, and the occasional CBT worksheet. The algorithm can't read between lines it doesn't know exist.
On the developer side, metadata stops being about discoverability and becomes a small diplomatic exercise. A few terms trigger moderation, a few trigger follow-up questions, and the keyword field turns into a minefield where every word is inspected for what it might imply rather than what it means. Too specific reads like medicine, too gentle reads like romance, and anything metaphorical gets outright rejected.
This isn't new though. In 2009, Ninjawords–a perfectly useful English dictionary–was delayed and forced into 17+ because it could return definitions for swear words. Phil Schiller personally explained that since parental controls promised to filter profanity, any app displaying unfiltered words needed age-gating. Never mind that Safari could look up far worse. The rule was simple: uncurated content equals adult content, context be damned.
There's also the "massage" rule, mostly folklore but widely believed: any app with that word in its metadata triggers extended review, whether it’s physiotherapy or post-marathon recovery. The system was burned once by apps using "massage" as a euphemism and never forgot. Most of the odd heuristics you encounter today are scars from 2009.
Ambiguity in meaning becomes ambiguity in engineering. Without shared vocabulary, misalignment cascades: classification shapes search, search shapes metadata, metadata triggers review flags. The policies update faster than the taxonomy beneath them evolves.
Once you see that, the problem becomes solvable–not culturally, but technically.
At this point I did the only sensible thing: treated App Store Connect like a black box and started running experiments.
First test: keywords using only soft language–"relationship journal", "partner log", "connection tracker". Search rankings tanked. Silk dropped to page 8 for "relationship journal", outranked by printable worksheets for couples therapy. Good news: the algorithm was confident I wasn't selling anything objectionable. Bad news: it was equally confident I wasn't selling anything at all.
Replacing those with direct terms–"intimacy tracker", "sexual wellness", "couples health"–brought visibility back to pages 2–3. It also triggered longer App Review cycles and required increasingly elaborate "Review Notes" explaining why the app shouldn't be rated 18+. Same binary, same screenshots, same code–just different words in a metadata field that users never even see.
Screenshots followed the same logic. A completely sterile set–empty fields, no microcopy, generic UI–sailed through, but made Silk look like Notes with a different background. A more honest set showing what the app actually does triggered the 18+ question again. The framing changed the classification. The classification changed nothing about what the software does, but everything about where it appears and who finds it.
None of this is surprising if you assume the system is a classifier trained on categories from 2009. From the outside it feels arbitrary. From the inside it's doing exactly what it was built to do: match patterns it understands and escalate the ones it doesn’t. It just doesn't have a pattern for "private health journal that mentions bodies", even though there are lots of private health journals in the App Store these days. You can almost hear it thinking: This smells like health but reads like dating and contains the word 'intimacy.' Escalate!
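If you want that intuition in code, here is a toy caricature of a keyword-keyed classifier–my guess at the vintage heuristics, emphatically not Apple’s actual review logic:

```swift
// A toy model of a 2009-vintage content classifier: keyed to keywords,
// blind to behaviour. Every rule below is invented for illustration.
enum Verdict { case approve, escalate }

func review(keywords: [String]) -> Verdict {
    let healthWords = ["tracker", "journal", "wellbeing", "symptoms"]
    let datingWords = ["partner", "couples", "relationship"]
    let scarWords   = ["intimacy", "massage"]  // burned once, never forgot

    let smellsLikeHealth = keywords.contains { healthWords.contains($0) }
    let readsLikeDating  = keywords.contains { datingWords.contains($0) }
    let hitsAnOldScar    = keywords.contains { scarWords.contains($0) }

    // There is no pattern for "private health journal that mentions
    // bodies", so ambiguous combinations fall through to escalation.
    if smellsLikeHealth && !readsLikeDating && !hitsAnOldScar {
        return .approve
    }
    return .escalate
}
```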
Silk’s architecture was shaped by this lag in the same way Fermento’s safety checks were shaped by gaps in food-safety guidance, or Residency’s "compiler warnings" for travel emerged from inconsistent definitions of “presence” by different countries. It’s not a case study in “growth”; it’s just another example of what happens when you have to reverse-engineer the missing assumptions. When a domain refuses to state its rules, you provide the scaffolding yourself.
Most of the engineering time went into figuring out what not to build–not from fear of rejection, but from understanding how classifiers behave when they encounter undefined cases. Treat it like a compiler that hasn't learned your edge-case syntax yet and stay inside the subset of language it already understands. The discipline felt familiar–the same kind you develop when building in domains where the platform's rules aren't fully specified and you have to infer the boundaries from failed experiments.
The more carefully you specify what the app does, the less the platform has to guess on your behalf. In 2010, you could ship "Mood Scanner" apps that claimed to read emotional states from fingerprints. They still exist–the App Store didn't purge them–but try submitting one in a category App Review associates with actual health data and you'll trigger very different questions. The scrutiny isn't random; it’s contextual. It depends on how your metadata accidentally pattern-matches against old problems.
The closer Silk came to shipping, the more I understood the App Store's behaviour as conservatism–not ideological, but technical. The kind that keeps a global marketplace from accidentally approving malware. Some of this conservatism is regional: China's App Store has additional filters for "relationship content," South Korea requires separate disclosures for wellbeing data. Apple unifies this under one policy umbrella, which produces a system that's cautiously consistent across borders but not particularly imaginative about edge cases.
The post-Epic world made Apple more explicit about where liability lives. Ambiguity became expensive, underspecification became expensive, classifiers that behave "roughly right" became expensive. The safest rule became simple: if the system can't clearly state what something is, err on caution until the taxonomy expands.
The cost is that new categories appear slowly. Sleep apps lived on the periphery for years. Meditation apps bounced between "Health" and "Lifestyle" depending on screenshot aesthetics. Third-party cycle trackers existed for nearly a decade before Apple added native reproductive health tracking in 2015 and a dedicated Cycle Tracking experience in 2019. Digital wellbeing apps faced suspicion until Screen Time shipped in iOS 12. Each category began as an edge case, proved itself through user adoption, and eventually got formalised–usually announced in a single sentence at WWDC as if it had always existed.
Silk is at the beginning of that cycle. Eventually Apple will introduce a more nuanced descriptor, or HealthKit will model relational wellbeing, or the age matrix will gain more precision. The entire ecosystem will re-index overnight and everyone will move on.
It turns out the best way to handle a category that doesn’t exist is to build as if it does, then wait for the taxonomy to catch up. Until then, the grey zone is honestly not a bad neighbourhood. The users already know what the app is for. The platform will figure it out eventually.
Working in the quiet gaps of the platform? I build iOS software for the problems people don’t talk about. work@drobinin.com
Very strange article. The author is very upset that an "intimacy tracker" might receive an 18+ rating on the app store. I mean yes, younger folks do it, but the vast majority of potential customers are 18+. Why is this a problem?
Hi, OP here. Not sure what else to add beyond the first paragraph of the article:
> The rating itself is fine: the target audience is well past that age anyway. What baffles me is the logic.
I don't mind the 18+ label, even though it's up to the users what they use the app for, whether it's tracking sex, a partner's health, or personal wellbeing.
But I do find the history of age ratings and categories in the App Store and the limits they have to be quite hilarious, and figured I might as well write them down.
Hi OP! I’ll share what I mentioned below in hopes of a response from you directly, because I’m genuinely curious to hear what you think:
Seems like people should be of whatever age we consider mature before they start capturing intimate data about themselves on random platforms. If we don’t think you’re able to understand the risks of pursuing your reproductive impulses, do we think you can measure the risks of sharing data about those impulses on a platform you don’t control?
Local data or not, if I were the steward of a marketplace I’d use that position to create this kind of teaching moment for pre-developed consumers. If young people had been warned since the mid 2000s of how much of their intimacy they were handing over to Meta, ByteDance, etc. before they started, the world would certainly be better off.
Hey! I don’t disagree that people of any age should think twice before putting personal data (intimate or not) into any platform.
My point wasn’t about lowering the age rating. The issue is that Apple doesn’t have a real category for this kind of wellbeing at all. The age gate itself is sensible, but what’s funny is why it exists. It’s not "because we carefully considered how to protect teens’ data", it’s "because in 2009 the Store was drowning in farting apps, and we’ve been patching around that ever since."
So, what is your own company's approach to restricting underage use?
because your blog post is, ahem, rather less than persuasive
Urm, did you read a different article than the one linked?
Because there isn't really an argument, innit - at least none that I took notice of. Isn't it just exploring the reasons why it is like it is today? They even made it abundantly clear in the beginning (and in the comments here) that the rating is fine for the app
And for what conceivable reason would this need to make sure underage people aren't using it?
A period tracker has relevance in the context of a sexual relationship, but there is really nothing about it that needs to be censored from underage people. It is not explicit content. It's a specialized journal, that's it
I bet that the ratings are dictated not by usability but by liability.
Yes, people younger than 18 engage in sex, but this has different legal consequences than for people past 18, and Apple has no interest in wading through that legal quagmire.
[flagged]
> Hi, OP here. Not sure what else to add beyond the first paragraph of the article:
I would imagine that the confusion arose because they read past that sentence. You wrote that you don’t mind that the app you specifically made for adults to use got the rating that it did, and then sort of talked about how you don’t find the rating system to be rational.
I couldn’t tell if the subject of this article is “I think my intimacy tracking app shouldn’t have an adult rating because a user could use it for general wellbeing” or “I don’t like Fortnite”
That’s fair feedback, thank you. The point I was trying to make wasn’t "my app deserves a lower rating", it was "I built something for adults and realised there isn’t actually a correct category for it at all."
Once I noticed that gap, I went digging into the history to understand why the App Store age ratings and categories are the way they are, hence this archaeological detour of a post.
[flagged]
[flagged]
Maybe I don't read good but I'm a bit confused by the article - it feels like it's avoiding describing what the Silk app actually is for. Is it a sex tracker? or for relationship emotions? or for both plus anything else you want? Or something else?
Yeah, the author really tries everything to avoid calling it a sex tracker but the app store listing right now literally says "Sex, Partner Cycles, and Health"
I'm not sure what else they expected? 16+ seems appropriate to me. I can buy the argument for a need of additional categories, though.
I can see their point if it genuinely is just a tracker/journal. It’s effectively just words. Should books that describe sex also be age restricted?
As they raised, games with gacha mechanics and violence receive a lower rating.
I feel the complaint is less about the app receiving that rating and more the flawed logic around how they are categorized given it’s effectively a health and wellness tracker.
Books that describe sex are somewhat age restricted. Something like Looking for Alaska is rated 16+ by Common Sense Media and not in elementary or middle school libraries.
“Partner Cycles” particularly describes a domestic surveillance situation that caters to an audience I’m not looking to spend time with
It is an explicit sex tracker that does not restrict usage by children
You can imagine why Apple takes a dim view
Are you saying it is "explicit" in that it is explicitly stating itself as such or that it contains explicit content? Because the former is honest and the latter seems to be untrue. Also you are replying to every comment here calling the author of the article a "disgusting pervert" and accusing them of a lot of things and I'm not sure it's adding anything to the conversation. It's a harmless journalling app
Sure it's harmless, but I think the app has enough suggestive content to be "explicit." There is a section where you can track your toy performance with options including ropes and vibrators.
It is a tracker of sexual activity.
It has no other purpose.
It ain't livejournal, my man
I don't know what this comment means. I know this. Are you ok?
I think you’re missing the point of the post. It’s not about that specific app. It’s how Apple approaches such topics without context and with extreme prudishness.
This may be a bad global political climate for the "Big tech is bad because they restrict my sex app to users 18 and older" take.
I didn't take that message away from the blog at all. They seem perfectly content to be an 18+ app but are musing on the fact that, functionality wise, it doesn't actually have any sexual content. Just the vague suggestion that you might choose to log that information in there.
An IRL analogy is probably stores that are happy to let children shop in the vicinity of sexually suggestive items such as condoms, lube, and intimate apparel—you can get these at grocery stores even.
In all honesty, I think the front of a condom box which says "for contraception plus STI protection" is less sexually suggestive than an app that has a "Spice Library" where the user can tag activities with tags such as "Anal," "Oral," or "Intercourse."
How about a different angle, then, since you seem to be of the opinion that words related to sex can't be in the vicinity of children: book stores. There aren't age ratings for books, and an 8-year-old is not only free to wander around by themselves near all the smut–from missionary to rape to drugged group sex with hockey boy dragon princes–but they would be allowed to buy them too. The only ‘protection’ from this happening is the fact that kids aren't actually interested in this kind of stuff.
So it gets to the heart of what these age ratings are really for and what kind of things do we actually not want kids around. And it seems like the line is drawn at sexually explicit imagery. And so app stores in a way are kind of unique in that they apply age restrictions much more liberally than the wider world around them.
Seems like people should be of whatever age we consider mature before they start capturing intimate data about themselves on random platforms. If we don’t think you’re able to understand the risks of pursuing your reproductive impulses, do we think you can measure the risks of sharing data about those impulses on a platform you don’t control?
Local data or not, if I were the steward of a marketplace I’d use that position to create this kind of teaching moment for pre-developed consumers. If young people had been warned since the mid 2000s of how much of their intimacy they were handing over to Meta, ByteDance, etc. before they started, the world would certainly be better off.
> Seems like people should be of whatever age we consider mature before they start capturing intimate data about themselves on random platforms
How about we just don't do that capture at all?
Agreed. But giving adults free will is a principle of the market, so if attempting to protect the most vulnerable consumers is the best we can get from a compromise, I’m for it
It is, quite literally, a diary for your sexual activity
Condoms, lubes, and intimate apparel are not threats to children. If you found children buying condoms, lubes, or intimate apparel at unusually high rates, that would be very alarming. But, normally, they are purchasing these products for very lazy adults.
There is 0 use case for a person under the age of majority to use this application. It is only usable by a person being abused.
Precisely 0 adults have charged children with the mission of f... blogging sexual activity.
> If you found children buying condoms, lubes, or intimate apparel at unusually high rates, that would be very alarming. But, normally, they are purchasing these products for very lazy adults.
“Children buying condoms or lube at unusually high rates” is a funny phrase because I don’t think I’ve ever seen a child buy either of those things. What is the threshold between usual and unusual for these purchases in hard terms?
You've never seen a kid buy condoms because mom and dad are lazy?
I aspire to your lifestyle ;)
kids buy condoms for the same reasons kids used to buy cigarettes or beer. It would be super weird to see the same neighborhood kid buy condoms every week, but it's not particularly alarming on a one-time basis.
Maybe I am just very old, and I should be alarmed by a one-time purchase. I am relating the basic guidelines from my own youth.
It is plainly a bad idea by any standard to provide an application for teenagers to keep diaries of their sexual abuse.
> You've never seen a kid buy condoms because mom and dad are lazy?
Not that I can think of, and I’ve worked retail. I’m assuming that ‘lazy parents’ have been buying condoms and lube online for a long time now given the selection and price points available. Like you can get a hundred pack for ~$20 which is way cheaper than what you’d get at a gas station.
Gonna file this away under the "I am very old" category
Revised opinion: Maybe it should be EXTREMELY ALARMING to see a neighborhood kid make a one-time purchase of condoms or lube
edit: i have worked retail but it was 30 years ago
Wait until you see how many condoms you can buy off Amazon with a $20 Visa card. It even ships in an unmarked cardboard box, the West will fall any minute now.
Should pen and paper be restricted to 18+?
Pen and paper are not restricted by Apple's "App Store."
the epitome of the slippery slope fallacy.
Obviously not, pen and paper is the correct platform for a young person to document this type of data.
Not sarcasm, kids should truly just write this shit down instead of using a weird app that’s not accountable to them in any way.
Your daughter's teacher could write her a handwritten note to stay after class. Won't anybody think of the children?
/s in case.