
jki275

Karma: 387

Created: 2017-07-14

Recent Activity

  • I run local models on my M1 Max. There are a number of them that are quite useful.

  • Preach.

    Golang is the best language there is for most workflows that aren't bare-metal embedded or real-time, and this is coming from a 20+ year C++ dev.

  • Commented: "Ghidra by NSA"

    The NSA doesn't do serious work?

  • Local models generally won't have as large a context window, and the quantization process does make them "dumber," for lack of a better word.

    If you try to get them to compose text, you'll end up seeing a lot less variety than you would with ChatGPT, for instance. That said, ask them to analyze a CSV file that you don't want to give to ChatGPT, or ask them to write code, and they're generally competent at it. The high-end codex-gpt-5.2 type models are smarter, may find better solutions, may track down bugs more quickly -- but the local models are getting better all the time.

  • I have an M1 Max w/ 64GB that cost me much less than that -- you don't have to buy the latest model brand new.
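The quantization point above can be made concrete with a minimal sketch: symmetric int8 weight quantization rounds each value to a coarse grid, and the round-trip error is what makes quantized models slightly "dumber." The scheme, variable names, and toy values here are illustrative assumptions, not anything from the comment itself.

```python
import numpy as np

# Toy "weight" tensor standing in for a model layer (illustrative values).
rng = np.random.default_rng(0)
w = rng.normal(0, 0.02, size=1000).astype(np.float32)

# Symmetric per-tensor int8 quantization: map the max magnitude to 127.
scale = np.abs(w).max() / 127.0
q = np.clip(np.round(w / scale), -127, 127).astype(np.int8)

# Dequantize back to float; the difference is information lost forever.
w_hat = q.astype(np.float32) * scale
err = np.abs(w - w_hat).max()
print(f"max round-trip error: {err:.6f} (one quantization step = {scale:.6f})")
```

With round-to-nearest, the worst-case error per weight is half a quantization step; lower-bit schemes (int4, etc.) widen the step and lose correspondingly more.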

HackerNews