The CS/IT world seems to have an unusual absence of understanding of its own history, compared to other engineering fields.
Sure, everybody with a degree is vaguely aware of Turing, and maybe they’re aware of Babbage, Lovelace, and a couple of other figures. But I rarely meet even professionals who know the lineage of the ideas they work with.
Web developers somehow limit their understanding of the history of hypertext to the idea that Tim Berners-Lee did it, and are unaware of Ted Nelson, HyperCard, NLS, and Memex. Front-end developers repeat the claim that Apple invented the GUI, or that Xerox did, but generally don’t distinguish between WIMP interfaces and other forms of GUIs. They aren’t aware of the landscape of UX going back to Raskin, Licklider, and Engelbart, nor of how the Macintosh interface drew on the Lisa and compared with contemporaries like the Amiga and GEM.
Software engineering culture, at the low end, consists mostly of uncontextualized lore, repeated and accepted without much consideration. We repeat the maxim that premature optimization is the root of all evil without recognizing the extremely limited context in which the author of that epigram would have agreed with it, or the fact that the same man would have wanted every prospective computer programmer to first obtain a doctorate in mathematics.
The proliferation of fads comes from a broader feeling among the HN crowd that the history of this domain is not worth understanding beyond a shallow level, because all the valuable ideas lie in the future. As a result, otherwise intelligent people repeatedly reinvent the wheel, not realizing that their great idea was invented, written about, and eventually rejected in the late 1950s by someone much smarter than they are.
In the sixty years that this field has had a commercial presence, we’ve picked much of the low-hanging fruit. Future progress will be harder, and if we don’t cultivate a culture of understanding history, we won’t be able to apply the wisdom of the past to the ideas of the present. When breakthroughs in computing were primarily academic, this was less of a problem: to get a paper published, you had to cite your references and give a sense of where the new idea fit in the existing domain. In a commercial environment, on the other hand, misrepresenting old things as new and cultivating ignorance of related work is encouraged from a marketing perspective; and when your work is VC-funded, marketing is the only thing that matters.