groks vr, boardgames, logseq, python, rust, llm, linux, gitlab, tango argentino, meditation, nordic larp, mudra space awareness
likes playing board games offline and online (BGA, TTS, Board Games VR, Tabletop Playground, AllOnBoard or Vassal)
groks books and dance
interactivity - bsky.app/profile/monsuta - fsiefken at gmail - meet.hn/city/52.0907006,5.1215634/Utrecht
The README.md is 9k of dense text, but it does explain the claims: faster, more efficient, more accurate and more sensible.
Rust port feature: The implementation "passes 93.8% of Mozilla's test suite (122/130 tests)" with full document preprocessing support.
Test interpretation/sensibility: The 8 failing tests "represent editorial judgment differences rather than implementation errors." It notes four cases involving "more sensible choices in our implementation such as avoiding bylines extracted from related article sidebars and preferring author names over timestamps."
This means the results are 93.8% identical, and the remaining differences are arguably improvements. There is a further gain in extraction accuracy: document preprocessing "improves extraction accuracy by 2.3 percentage points compared to parsing raw HTML."
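To make the preprocessing point a bit more concrete, here is a minimal, std-only Rust sketch of the kind of cleanup such a preprocessing pass could do before extraction. This is my own naive illustration; the helper names and the string-scanning approach are mine, not the library's actual pipeline, which would use a real HTML parser.

```rust
/// Remove every `<tag ...>...</tag>` block (naive, case-sensitive matching).
/// Hypothetical helper for illustration only.
fn strip_tag_blocks(html: &str, tag: &str) -> String {
    let open = format!("<{tag}");
    let close = format!("</{tag}>");
    let mut out = String::with_capacity(html.len());
    let mut rest = html;
    while let Some(start) = rest.find(&open) {
        out.push_str(&rest[..start]);
        match rest[start..].find(&close) {
            Some(end) => rest = &rest[start + end + close.len()..],
            None => rest = "", // unclosed tag: drop the remainder in this naive sketch
        }
    }
    out.push_str(rest);
    out
}

/// Strip scripts and styles, then collapse whitespace runs so text-density
/// heuristics see consistent input.
fn preprocess(html: &str) -> String {
    let cleaned = strip_tag_blocks(&strip_tag_blocks(html, "script"), "style");
    cleaned.split_whitespace().collect::<Vec<_>>().join(" ")
}

fn main() {
    let raw = "<html><head><style>p { color: red }</style></head>\
               <body><script>track();</script><p>Article   text.</p></body></html>";
    println!("{}", preprocess(raw));
}
```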
Performance:
* Built in Rust for performance and memory safety
* The port uses "Zero-cost abstractions enable optimizations without runtime overhead."
* It uses "Minimal allocations during parsing through efficient string handling and DOM traversal."
* The library "processes typical news articles in milliseconds on modern hardware."
It's not explicitly stated, but based on these four points I think it's a reasonable assumption that its "millisecond" processing time is significantly faster than the original JavaScript implementation. Perhaps it's also better memory-wise. I would add a comparison benchmark (memory and processing time), perhaps with bar charts to make it clearer for people who scan-read, along with the 8 examples of differing editorial judgement.
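The timing half of such a benchmark could be as small as the sketch below. `extract_article` and the fixture path are placeholders; the port's real entry point, and a matching run of the JavaScript Readability for comparison, would have to be substituted. Memory would need a separate measurement, e.g. peak RSS of the process.

```rust
use std::time::Instant;

/// Stand-in for the port's real extraction entry point; the name and
/// signature are hypothetical, swap in the actual library call here.
fn extract_article(html: &str) -> String {
    html.to_string()
}

fn main() {
    // Any saved news article works as input; the path is just an example.
    let html = std::fs::read_to_string("fixtures/sample-article.html")
        .expect("save an article to fixtures/sample-article.html first");

    // Warm up once, then time a batch of runs and report the mean per document.
    let _ = extract_article(&html);
    let runs: u32 = 100;
    let start = Instant::now();
    for _ in 0..runs {
        let _ = extract_article(&html);
    }
    println!(
        "mean extraction time: {:?} per document over {} runs",
        start.elapsed() / runs,
        runs
    );
}
```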
For me it's like the Pebble of smart-glasses land: simple and elegant. Less is more; just calendar, tasks, notes and AI. The rest I can do on my laptop or phone (with or without other display glasses). I do wish there were a way to use the LLM on my Android phone with it and, if possible, to write my own app for it, so I'm not dependent on the internet and can have my HUD/G2 as a lightweight, custom-made AI assistant.
This project is an enhanced reader for Y Combinator's Hacker News: https://news.ycombinator.com/.
The interface also allows you to comment, post and interact with the original HN platform. Credentials are stored locally and are never sent to any server; you can check the source code here: https://github.com/GabrielePicco/hacker-news-rich.
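Not necessarily how this project fetches its data, but for anyone curious what reading Hacker News programmatically looks like, the official public Firebase API can be queried without any credentials. This sketch uses the reqwest and serde_json crates; the example only reads data, it does not post or log in.

```rust
// Cargo.toml: reqwest = { version = "0.12", features = ["blocking", "json"] }
//             serde_json = "1"
use serde_json::Value;

fn main() -> Result<(), Box<dyn std::error::Error>> {
    // The official, unauthenticated Hacker News API (read-only).
    let ids: Vec<u64> =
        reqwest::blocking::get("https://hacker-news.firebaseio.com/v0/topstories.json")?
            .json()?;

    // Fetch the current top story and print its title and URL.
    let item: Value = reqwest::blocking::get(format!(
        "https://hacker-news.firebaseio.com/v0/item/{}.json",
        ids[0]
    ))?
    .json()?;

    println!(
        "{} - {}",
        item["title"].as_str().unwrap_or("(no title)"),
        item["url"].as_str().unwrap_or("(no url)")
    );
    Ok(())
}
```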
For suggestions and feature requests you can write to me here: gabrielepicco.github.io