paulyy_y

Karma: 126
Created: 2020-09-16

Recent Activity

  • Burying the lede here: your solution for avoiding vector search is either 1) offloading to the user, expecting them to remember the right terms, or 2) using an LLM to craft the search query? And having it iterate multiple times? Holy mother of inefficiency; this agentic focus is making us all brain-dead.

    Vector DBs and embeddings are dead simple to figure out, implement, and maintain, especially for a local RAG, which is the primary context here. If I want to find my latest tabular notes on some obscure game dealing with medical concepts, I should be able to just literally type that. It shouldn't require me to remember the medical terms, or to have some local (or, god forbid, remote) LLM iterate through a dozen query combos.

    FWIW, I also think this is a matter of how well one structures their personal KB. If you follow strict metadata/structure conventions and write coherently and logically, you'll have a better chance of getting results with text matching. For someone optimizing for vector-space search and minimizing the need for upfront logical structuring, it will not work out well.
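    The core of the local vector search the comment describes is just nearest-neighbor lookup by cosine similarity over note embeddings. A minimal sketch, using a toy bag-of-words vectorizer as a stand-in for a real embedding model (the note texts, `embed`, and `search` are illustrative, not from any particular tool):

    ```python
    import math
    from collections import Counter

    # Toy corpus of personal notes; in a real local RAG these would be
    # embedded once by a local model and the vectors stored in a vector DB.
    notes = [
        "tabular notes on an obscure game dealing with medical concepts",
        "grocery list for the week",
        "meeting notes about the quarterly budget",
    ]

    def embed(text: str) -> Counter:
        # Stand-in embedding: sparse word counts. A real model would
        # return a dense vector capturing semantic similarity.
        return Counter(text.lower().split())

    def cosine(a: Counter, b: Counter) -> float:
        dot = sum(a[w] * b[w] for w in a)
        na = math.sqrt(sum(v * v for v in a.values()))
        nb = math.sqrt(sum(v * v for v in b.values()))
        return dot / (na * nb) if na and nb else 0.0

    def search(query: str, corpus: list[str]) -> str:
        # Return the note whose vector is closest to the query vector.
        q = embed(query)
        return max(corpus, key=lambda n: cosine(q, embed(n)))

    print(search("game with medical concepts", notes))
    ```

    With a real embedding model, the query would not even need to share words with the note; the toy vectorizer here only demonstrates the retrieval mechanics, not the semantic matching that makes embeddings worthwhile.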

  • Humanity just never learns. Europe will end up with draconian oversight and censorship that will be abused beyond belief by fascists. When some central entity, subject to the whims of the political temperature, controls what you can access, there can be no trust in the durability and integrity of information. Same as in the US, really, except there it's being executed to support the nascent regime without any liberal auspices.

  • Allow/deny list, hope that helps.

HackerNews