you can contact me at marliechiller~proton.me (replace ~ with @)
Personally, I don't think so. I can understand a mathematical axiom and reason with it. In a sequence of numbers I can tell you N + 1, regardless of where N appears in the sequence. An LLM does not "know" this the way a human does; it just applies whatever its training data suggests is most likely.
I find the use of the word intelligence to be a bit of a misnomer. Is something intelligent if all it's doing is pattern matching? Is the evolution that led to owl butterflies looking like an owl intelligent? I'm not sure.
As an aside, it's amusing that we simultaneously have this article on the front page alongside [Generative AI coding tools and agents do not work for me](https://news.ycombinator.com/item?id=44294633). LLMs are really dividing the community at the moment, and it's exhausting to keep up with what I (as a dev) should be doing to stay sharp.
Not sure I fully agree - sometimes maybe, but I think in the majority of cases it's used when people feel they don't need to explain exactly what they mean because it should already be obvious to most people.
Example. "Always look when you cross the road" is a snippet common sense, with lack of heeding to that sense resulting in you potentially being hit by a car. Even a 4 year old wouldnt need the latter explanation, but most people could articulate that if they needed to. Its just a way of speeding up communication
This project is an enhanced reader for Y Combinator Hacker News: https://news.ycombinator.com/.
The interface also allows you to comment, post, and interact with the original HN platform. Credentials are stored locally and are never sent to any server; you can check the source code here: https://github.com/GabrielePicco/hacker-news-rich.
For suggestions and feature requests you can write to me here: gabrielepicco.github.io