Ken Shirriff ken.shirriff@gmail.com https://righto.com @righto.com on Bluesky @oldbytes.space/@kenshirriff on Mastodon
The article asks about the etymology of X for crystal. I looked into that a while ago. The abbreviation "xtal" has been used for "crystal" since the 1800s in medicine, geology, and chemistry, and then electronics copied the usage. This comes from the earlier use of X for the "christ" sound, as in "xmas", which goes back to the 16th century. As the article suggests, the Greek chi (Χ) is the root.
Taste has a much longer history in pg's writing: "Taste for Makers" in 2002, "How Art Can Be Good" in 2006, and "Is There Such a Thing as Good Taste?" in 2021.
(If we're venting about words, I'll bring up "opinionated", which has somehow become a positive.)
Links: https://paulgraham.com/taste.html https://www.paulgraham.com/goodart.html https://paulgraham.com/goodtaste.html
Did someone ask about Intel processor history? :-) The Intel 8080 (1974) didn't use microcode, and neither did many later processors. For instance, the 8085 (1976). Intel's microcontrollers, such as the 8051 (1980), didn't use microcode either. The RISC i860 (1989) didn't use microcode (I assume). The completely unrelated i960 (1988) didn't use microcode in the base version, but the floating-point version used microcode for the math, and the bonkers MX version used microcode to implement objects, capabilities, and garbage collection. The RISC StrongARM (1997) presumably didn't use microcode.
As for x86, the 8086 (1978) through the Pentium (1993) used microcode. The Pentium Pro (1995) introduced an out-of-order, speculative architecture with micro-ops instead of microcode. Micro-ops are kind of like microcode, but different. With microcode, the CPU executes an instruction by sequentially running a microcode routine, made up of strange micro-instructions. With micro-ops, an instruction is broken up into "RISC-like" micro-ops, which are tossed into the out-of-order engine, which runs the micro-ops in whatever order it wants, sorting things out at the end so you get the right answer. Thus, micro-ops provide a whole new layer of abstraction, since you don't know what the processor is doing.
My personal view is that if you're running C code on a non-superscalar processor, the abstractions are fairly transparent; the CPU is doing what you tell it to. But once you get to C++ or a processor with speculative execution, one loses sight of what's really going on under the abstractions.
AMD's Am9511 floating-point chip (1977) acted like an I/O device, so you could use it with any processor. You could put it in the address space, write commands to it, and read back results. (Or you could use DMA with it for more performance.) Intel licensed it as the Intel 8231, targeting it at the 8080 and 8085 processors.
Datasheet: https://www.hartetechnologies.com/manuals/AMD/AMD%209511%20F...
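To make the I/O-device idea concrete, here's a rough Python sketch of the push-operands / write-command / pop-result pattern a host CPU would use with a memory-mapped, stack-based FP chip like the Am9511. The command code, byte ordering, and float format here are simplified stand-ins, not the actual datasheet encoding, and the chip itself is mocked so the example runs anywhere.

```python
import struct

class MockFpuChip:
    """Stand-in for a memory-mapped, stack-based FP coprocessor.
    The real Am9511 exposes a data port and a command/status port;
    this mock collapses status handling and uses made-up command codes."""
    CMD_FADD = 0x10  # hypothetical command code, not the real encoding

    def __init__(self):
        self.stack = bytearray()  # the chip's internal operand stack

    def write_data(self, byte):
        """Host writes one byte to the data port: push onto the stack."""
        self.stack.append(byte & 0xFF)

    def read_data(self):
        """Host reads one byte from the data port: pop off the stack."""
        return self.stack.pop()

    def write_command(self, cmd):
        """Host writes a command; the chip operates on its stack."""
        if cmd == self.CMD_FADD:
            op1 = bytes(self.stack[-8:-4])   # operand pushed first
            op2 = bytes(self.stack[-4:])     # operand pushed second
            del self.stack[-8:]
            (x,) = struct.unpack("<f", op1)
            (y,) = struct.unpack("<f", op2)
            self.stack.extend(struct.pack("<f", x + y))

def fadd(chip, a, b):
    """Push two floats byte-by-byte, issue the add, pop the result."""
    for value in (a, b):
        for byte in struct.pack("<f", value):
            chip.write_data(byte)
    chip.write_command(MockFpuChip.CMD_FADD)
    # Popping returns the high byte first, so reverse to little-endian.
    result = bytes(chip.read_data() for _ in range(4))[::-1]
    return struct.unpack("<f", result)[0]

chip = MockFpuChip()
print(fadd(chip, 1.5, 2.25))  # 3.75
```

On real hardware the two ports would just be addresses in the memory (or I/O) space, and the driver would poll a busy bit in the status register between the command write and the result reads.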
This project is an enhanced reader for Y Combinator's Hacker News: https://news.ycombinator.com/.
The interface also allows you to comment, post, and interact with the original HN platform. Credentials are stored locally and are never sent to any server; you can check the source code here: https://github.com/GabrielePicco/hacker-news-rich.
For suggestions and feature requests, you can write to me here: gabrielepicco.github.io