
wongarsu

Karma: 30239

Created: 2015-02-15

Recent Activity

  • Yeah, I think the idea of the law is fine. If you imagine "Operating System" to mean things like Windows and iOS, or a desktop install of Fedora, "Application Store" to mean the Microsoft Store or the App Store and the like, and "Application" to mean Word and Doom and stuff like that, then it's fine. Especially if you keep in mind that there isn't any actual verification of the age; it's simply set by whoever sets up the account.

    Most of the issues only arise because in the bill "operating system", "covered application store" and "application"/"developer" have very loose definitions that match lots of things where the law doesn't make sense.

  • The size of the age brackets also puts practical limitations on it. There is only one mandated bracket for everyone who's at least 18, which prevents that attack on anyone who starts using your software after their 18th birthday. And if a 13-year-old signs up, it takes three years before you can observe the switch to the >=16,<18 bucket.

  • The California law only stipulates that there's an "accessible interface" at account setup to set the birthday or age, and an interface to query the age bracket. Plus the crap for "application stores".

    I don't think it's a very well-thought-out law. But realistically this will end up as setting some env variable for your docker containers to assure them that you are 99 years old. And yes, maybe transmitting a header to docker hub saying that you are 99 years old, probably configured via an env variable that the docker cli picks up. It's stupid, but nothing a couple of env variables wouldn't comply with.

    The real issue is when the law inevitably gets expanded to get some real teeth, and all the easy workarounds stop being legal
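The easy workaround described above really could be a couple of lines. A minimal sketch, where the variable name, header name, and registry URL are all hypothetical (nothing like them appears in the bill's text):

```python
import os

# Hypothetical: the user declares a bracket once, e.g. in their shell profile.
# Here we set it in-process for illustration.
os.environ.setdefault("DECLARED_AGE_BRACKET", "18+")

def age_bracket_header() -> dict:
    """Build the (made-up) header a CLI could attach to every registry request."""
    return {"X-Age-Bracket": os.environ["DECLARED_AGE_BRACKET"]}

# A client would then merge this into its request headers, e.g.:
#   requests.get("https://registry.example.com/v2/", headers=age_bracket_header())
print(age_bracket_header())
```

The point of the sketch is how little "compliance" this amounts to: the value is self-declared, unverified, and trivially defaulted to the adult bracket.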

  • It's not that AI facial recognition isn't smart enough. That's not the issue. It is much faster than a human, and state-of-the-art models make fewer errors than humans do (though the types of errors are not the same).

    The issue is that facial recognition is just not very reliable. Not for humans and not for machines. If you look at millions of people, some of them just look incredibly similar. Yet police apparently thought that was all the evidence they would ever need. A case so watertight there's no point in even talking to the suspect.
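The base-rate problem behind this can be made concrete with a quick calculation; the numbers below are illustrative, not from any benchmark:

```python
# Illustrative numbers: even a matcher with a seemingly excellent
# false-positive rate produces many wrong "hits" at database scale.
false_positive_rate = 1e-5    # 1 wrong match per 100,000 comparisons
database_size = 10_000_000    # faces the probe photo is compared against

expected_false_matches = false_positive_rate * database_size
print(f"expected false matches per probe: {expected_false_matches:.0f}")
```

So a single probe photo searched against a large database is expected to return on the order of a hundred lookalikes, which is why a raw match can't be treated as standalone evidence.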

  • Yeah, I could see a world where it swings exactly the opposite way for software. Writing software for yourself is becoming cheap, but gathering requirements, getting alignment between stakeholders or marketing your software isn't getting much cheaper. Maybe everyone will end up with their own in-house solution? Or maybe we end up with configurable SAP-like behemoths, but instead of an army of expensive consultants configuring the software for your use case you have AI agents taking that part

    I'm sure whatever path this takes will seem obvious in hindsight.

HackerNews