Years ago, in a misbegotten career move, I briefly became an IT guy. It was almost accidental. The publication I worked for had moved away from its traditional (meaning it lasted about 10 years) computer/production system. Back then, we worked on dumb terminals, and any formatting was accomplished by typing arcane commands like “cf43,14,30p.” Thus formatted, the type, as it were, went to a typesetter that spat out film, which was then pasted onto boards and trucked to the printer. I know it sounds convoluted and difficult, but I harbor a certain fondness for that setup, especially when I think of what followed.

The typesetting machine, the minicomputer that powered it, and the dumb terminals were all falling apart. So the journalists were thrown into the wonderful world of desktop publishing, which seemed really cool. Except it wasn't. Suffice it to say that our first attempt was plagued by buggy software, exploding layouts (photos inexplicably went black if they traveled from an editor's Windows PC to a layout artist's Mac and back again), and a worn-out staff that hated the technology it had to use.

With a mutiny in the cards, our publisher asked me to fix it. That involved a whole new set of software, new computers (Macs, even in those dark days before Steve Jobs' return, still worked better for publishing), and a new way of laying out the network. We swapped out mysterious printer and shared-drive names for labels like “manuscripts” and “future issues,” words that normal people could understand. Peace settled in (and I had a new job function).