Why you’ll never make really big money as an AI dev

Artificial Intelligence? How the future was back in the ’80s

Among the stupider things I said in the 1980s was a comment about Artificial Intelligence, including neural nets – or perceptrons as we called them back then – that we needed “maybe a processor that worked at a hundred megahertz and literally gigabytes of storage”. I also believed that, following our success using Fuzzy Logic to optimize cement kilns (from which my college made serious cash), Fuzzy was the future. I was wrong, and am now envious of the power you have to play with.

I now go to more conferences than any rational person should, like Intel’s recent Nervana show, and part of me feels like I’m revising my mid-1980s degree again. Neural networks can classify pictures of goats, despite the occasional confusion between women’s feet and various species of crab. Hardware vendors like Cray, Intel and Nvidia lurrve neural nets, since massively parallel stupidity masquerading as Artificial Intelligence soaks up whatever power you throw at it. Backward chaining is appearing as a “new” technique, and a big risk factor in plotting a career in AI is that, because we’re mostly mining existing but buried techniques, the stability of demand…
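
For anyone who has only met the neural-net end of AI, backward chaining is the old expert-system trick: start from a goal and work backwards through rules until you reach known facts. Below is a minimal sketch in Python; the rules and facts are invented purely for illustration and are not from the article.

# Minimal backward-chaining sketch. Rules and facts are hypothetical examples.
# Each conclusion maps to a list of alternative premise sets ("OR of ANDs").
RULES = {
    "mammal": [["has_fur"], ["gives_milk"]],
    "goat":   [["mammal", "has_horns", "eats_anything"]],
}

FACTS = {"gives_milk", "has_horns", "eats_anything"}

def backward_chain(goal, facts, rules):
    """Return True if `goal` can be derived from `facts` using `rules`."""
    if goal in facts:                      # goal is already a known fact
        return True
    for premises in rules.get(goal, []):   # try each rule that concludes `goal`
        if all(backward_chain(p, facts, rules) for p in premises):
            return True                    # every premise of this rule was proven
    return False                           # no rule establishes the goal

print(backward_chain("goat", FACTS, RULES))   # True: gives_milk -> mammal, plus horns and appetite

The point is simply that this sort of technique is decades old: the "mining" the article describes is rediscovery, not invention.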

