Designer of bespoke analytical database engines for applications with unusual performance and scale requirements. Knows a lot about geospatial and sensor data. Went to school for chemical engineering.
email: andrew at jarbox dot org
This post is a poor exposition of Crocker’s Rules.
Crocker’s Rules were a reaction to the avoidance of direct discussion of topics where some people treat the mere act of discussing them, in any capacity, as offensive: sacred cows and taboos that carry social consequences even for asking honest questions. Practically speaking, Crocker’s Rules were a declaration that, for the person making the declaration, no good-faith discussion was offensive in itself. All taboos were open to good-faith arguments and attempts at rigorous intellectual inquiry.
This article is focused too much on communication style and not enough on the subject of communication. The latter was the crux of it. Crocker’s Rules were about being able to rigorously discuss topics that society has deemed to be beyond discussion without taking offense at the fact it is being discussed.
I was present when Crocker’s Rules were “invented”. I see a couple other handles here that may have been as well.
> There aren't many datasets exceeding that outside fundamental physics.
Just about every physical world telemetry or sensing data source of any note will generate petabytes of analytical data model in hours to days. On the high end, there are single categories of data source that aggregate to more like an exabyte per day of high-value data.
It is a completely different standard of scale than web data. In many industrial domains, the average small-to-medium sized company I come across retains tens of petabytes of data, and it has been this way for many years. The prohibitive cost is the only thing keeping them from scaling even more.
The major issue is that the large-scale analytics infrastructure developed for web data is hopelessly inadequate.
The idiomatic way to safely do pointer tagging in C++ works through uintptr_t.
If you don't care about portability or about using every theoretically available bit, then it is trivial. A maximalist implementation must be architecture-aware and isn't entirely knowable at compile time. This makes standardization more complicated, since the lowest common denominator is unnecessarily limited.
In C++ this really should be implemented through a tagged pointer wrapper class that abstracts the architectural assumptions and limitations.
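A minimal sketch of what such a wrapper might look like, using only the portable lowest-common-denominator assumption mentioned above (the tag lives in low bits freed by the pointee's alignment, checked at compile time). The class name and tag width are illustrative choices, not a standard API:

```cpp
#include <cassert>
#include <cstdint>

// Hypothetical tagged-pointer wrapper. Stores a small integer tag in the
// low bits of a sufficiently aligned pointer via uintptr_t. The static_assert
// guarantees the pointee's alignment actually leaves those bits zero, so the
// round trip through uintptr_t is lossless on common platforms.
template <typename T, unsigned TagBits = 2>
class TaggedPtr {
    static_assert(alignof(T) >= (1u << TagBits),
                  "pointee alignment must free the requested low bits");
    static constexpr std::uintptr_t kTagMask =
        (std::uintptr_t{1} << TagBits) - 1;

    std::uintptr_t bits_ = 0;

public:
    TaggedPtr(T* p, std::uintptr_t tag) {
        auto raw = reinterpret_cast<std::uintptr_t>(p);
        assert((raw & kTagMask) == 0 && "pointer not aligned as promised");
        assert(tag <= kTagMask && "tag does not fit in the low bits");
        bits_ = raw | tag;
    }

    // Mask the tag off before converting back to a pointer.
    T* ptr() const { return reinterpret_cast<T*>(bits_ & ~kTagMask); }
    std::uintptr_t tag() const { return bits_ & kTagMask; }
};
```

A more maximalist version would specialize per architecture (e.g., also claiming unused high bits on some 64-bit targets), which is exactly the non-portable, not-fully-compile-time-knowable part; the wrapper keeps that complexity behind one interface.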
Washington already has payroll taxes and revenue(!) taxes, which have been increasing. The State has had to scale back its ambitions on several of these taxes because they were driving employers to move to other States.
For highly paid tech employees, the total tax burden across employer and employee in Washington has become one of the highest in the US. Even if the employee doesn’t see all of that, the employer definitely does. Washington already has de facto income taxes by proxy and this is on top of that.
This project is an enhanced reader for Ycombinator Hacker News: https://news.ycombinator.com/.
The interface also allows you to comment, post, and interact with the original HN platform. Credentials are stored locally and are never sent to any server; you can check the source code here: https://github.com/GabrielePicco/hacker-news-rich.
For suggestions and feature requests, you can write to me here: gabrielepicco.github.io