...

el_pollo_diablo

Karma: 94

Created: 2023-11-30

Recent Activity

  • Capturing invariants in the type system is a two-edged sword.

    At one end of the spectrum, the weakest type systems limit the ability of an IDE to do basic maintenance tasks (e.g. refactoring).

    At the other end of the spectrum, dependent types, and especially sigma types, capture arbitrary properties that can be expressed in the logic. But then constructing values of such types requires providing proofs of those properties, and the code and proofs end up inextricably mixed in an unmaintainable mess. This does not scale well: you cannot easily add a new proof on top of existing self-sufficient code without temporarily breaking it.

    Like other engineering domains, proof engineering has tradeoffs that require expertise to navigate.
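    The proof-obligation problem above can be sketched in Lean 4 (illustrative names; `Even` and `addEven` are invented for the example, not taken from any real codebase). The subtype `{ n : Nat // Even n }` is a sigma-like type: every site that constructs a value of it must also discharge a proof, so the property is woven through the code itself.

    ```lean
    -- A property we want the type system to enforce.
    def Even (n : Nat) : Prop := n % 2 = 0

    -- A sigma-like subtype: a number packaged with evidence it is even.
    -- Constructing a value requires a proof term, here found by `decide`.
    def four : { n : Nat // Even n } := ⟨4, by decide⟩

    -- Even a trivial operation now carries a proof obligation:
    -- the sum must be shown even before the value can be built.
    def addEven (a b : { n : Nat // Even n }) : { n : Nat // Even n } :=
      ⟨a.val + b.val, by
        have ha := a.property
        have hb := b.property
        simp [Even] at *
        omega⟩
    ```

    If the property `Even` is later strengthened or changed, every such proof must be redone, which is the maintenance cost described above.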

  • Years ago the research team behind OCaml released Chamelle, a version of the language localized in French, as an April fool's joke:

    https://gallium.inria.fr/blog/ocaml-5/

  • I would put the emphasis on a different word:

    > This article is made and published by Anna Hartz, which may have used AI in the preparation

    Which, not who. They're not even sure the author is human!

  • Head and tail make sense for persistent lists in functional languages with value semantics, yes.

    The intrusive, mutable, doubly-linked loops with reference semantics under discussion are quite different. Although all entries behave identically in the structure itself, one of them is _used_ differently, as a standalone anchor point, while the others are embedded in the list's "elements".
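    A minimal C sketch of such a structure (names are hypothetical, in the style of kernel-like intrusive lists): the anchor is an ordinary link node initialized to point at itself, while the "elements" embed the same link struct inside their payload and are recovered with pointer arithmetic.

    ```c
    #include <assert.h>
    #include <stddef.h>

    /* The link participates in a circular, doubly-linked loop.
       Structurally, the anchor and the embedded links are identical. */
    struct link {
        struct link *prev, *next;
    };

    /* An empty list is an anchor pointing at itself. */
    static void list_init(struct link *anchor) {
        anchor->prev = anchor->next = anchor;
    }

    static int list_empty(const struct link *anchor) {
        return anchor->next == anchor;
    }

    /* Insert `node` just before `pos`; inserting before the anchor
       appends at the tail. */
    static void list_insert(struct link *pos, struct link *node) {
        node->prev = pos->prev;
        node->next = pos;
        pos->prev->next = node;
        pos->prev = node;
    }

    static void list_remove(struct link *node) {
        node->prev->next = node->next;
        node->next->prev = node->prev;
    }

    /* An element embeds the link; the payload is recovered from it. */
    struct item {
        int value;
        struct link link;
    };

    #define ITEM_OF(l) ((struct item *)((char *)(l) - offsetof(struct item, link)))

    int main(void) {
        struct link anchor;
        struct item a = { .value = 1, .link = { 0, 0 } };
        struct item b = { .value = 2, .link = { 0, 0 } };

        list_init(&anchor);
        assert(list_empty(&anchor));

        list_insert(&anchor, &a.link);  /* append 1 */
        list_insert(&anchor, &b.link);  /* append 2 */

        /* The anchor behaves like any node in the loop, but is only
           ever used as a starting point, never as an element. */
        assert(ITEM_OF(anchor.next)->value == 1);
        assert(ITEM_OF(anchor.prev)->value == 2);

        list_remove(&a.link);
        assert(ITEM_OF(anchor.next)->value == 2);
        return 0;
    }
    ```

    The asymmetry is purely in usage: unlinking an element and "emptying" the anchor are the same pointer operations, which is exactly why the head/tail view from persistent lists does not map onto this structure.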

  • > Are you saying the only real way to program is with generic data structures?

    Certainly not. As I said, the experienced programmer knows when (not) to use them. Some programs are better off without them, such as... most of the low-level software I write (security-critical operating systems).

    > Data tends to follow a particular path and there is probably 1 red black tree and it’s a core feature of the application. If that’s true then it’s not a big deal to write that core feature of your application.

    "It's not a big deal" are famous last words. Maybe it is not a big deal, but unless you have hard constraints or measurements that confirm that you would not be served well enough by an existing solution, it may very well be the reason why your competitors make progress while you are busy reinventing, debugging, maintaining, and obtaining certification for a wheel whose impact you overestimated.

    > It avoids premature optimization and code bloat because you tend to use complex data structures when they have a need.

    Avoiding code bloat? Agreed, a custom solution cannot be worse than a generic one. Avoiding premature optimization? Quite the contrary: going straight for the custom solution without first confirming, with actual measurements, that the generic solution is the unacceptable bottleneck, is textbook premature optimization.

    I am sorry but I do not understand what you are getting at.

HackerNews