How Nvidia won the AI war
A sysadmin's retrospective on three decades of GPU computing — from texture-memory hacks to CUDA monoculture, from demoscene dreams to enterprise myopia. (Disclosure: I used Claude Sonnet to help with the writing and structuring.)

My backstory: I learned to code before most people owned a computer

It was the early 1980s. I was five. My parents put a computer in the house. Back then, we were only three kids at my elementary school with something that could pass as a computer. I don't know if they understood what they were starting. I certainly didn't. I just knew that if you talked to the machine in the right language, it would do things. That was enough.

By the time I was a teenager / young adult, "the right language" had expanded considerably. I was a hacker. Pick your hat color; I wore most of them: black, white, grey, whatever the situation called for. The demoscene pulled me in hard, because of course it did. If you've ever watched a 64KB executable push geometry and music and impossible