In 100 years, IPv4 will be recognized as one of the great discoveries, like calculus. IPv6 is a misnomer, really. It's a separate, and lesser, protocol. Much like other second systems, it was too ambitious and not pragmatic enough.
Rather than looking down on IPv4, we should admire how incredible its design was. Its elegance, intuitiveness, and resourcefulness have all led to it outlasting every prediction of its demise.
edit: it's satire. but likely not too far off from the reality in 6 months.
> Our process is deliberately, provably, almost tediously legal. One set of AI agents analyzes only public documentation: README files, API specifications, type definitions.
Since nearly all open source dependencies couple the implementation with the type definitions, I'm curious how this could pass the legal bar of a clean room.
Even if they claim to strip the implementation during their clean-room process -- their own staff & services still have access to the implementation while doing the stripping.
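To illustrate the coupling the comment above is pointing at: in a typical TypeScript package there is no separate spec document; the public type and its implementation sit in the same source file, so any tool extracting "only type definitions" has to read the implementation too. This is a hypothetical single-file module, not code from any real project:

```typescript
// Hypothetical library module. The exported interface (the "public
// documentation") and the implementation are coupled in one file.

// Public type definition -- what a clean-room team is allowed to see.
export interface RateLimiter {
  tryAcquire(): boolean;
}

// Implementation -- what they must NOT see -- lives right next to it.
// Stripping it requires first reading (i.e., being exposed to) it.
export function createRateLimiter(maxPerWindow: number): RateLimiter {
  let used = 0;
  return {
    tryAcquire() {
      if (used >= maxPerWindow) return false;
      used += 1;
      return true;
    },
  };
}
```

A generated `.d.ts` file would contain only the interface and the function signature, but producing it still means the generator (and whoever operates it) processed the implementation, which is the exposure problem the parent raises.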
“Big data” doesn’t have a 5GB memory cap.
I’m guessing so many devs started out on 32GB MacBooks that the NEO seems underpowered. But it wasn’t too long ago that 8GB, 1500MB/sec IO, and that many cores was an elite machine.
I did a lot of dev work on a glorified Eee PC-class Chromebook when my laptop was damaged. You don’t need a lot of RAM to run a terminal.
I’m hoping NEO resets the baseline testing environment so developers get back to shipping software that doesn’t monopolize resources. “Plays nice with others” should be part of the software developer’s creed.