It's often said that programming languages should have "a small set of orthogonal primitives," but I wonder if there's any empirical basis for this?
A highly relevant thread (thx @missingfaktor): twitter.com/slava_pestov/s…
https://twitter.com/dubroy/status/1500490670150144005 ∙ Archived on 2025-03-28.