What Made AI Mainstream, Part 3: Big Models (Video)
Learn more about AI at the next SpeechTEK conference.
Read the complete transcript of this clip:
Paco Nathan: The third part here is about big models. Neural networks are interesting; they go back to the '40s with McCulloch and Pitts at MIT. But, again, it took a long time before we really had enough computing power for neural networks to take off. In 2006, Geoff Hinton published a paper about stacking layers and layers of neural networks on top of each other, using a thing called an autoencoder in between every layer, and they could do magic with it. They just didn't have the right data to really show it off. It took them six years to find the right data set. But, in 2012, they published a paper that took ImageNet data, a bunch of images, and ran it through deep neural network structures. One of the grad students was Alex Krizhevsky. He put some code on GitHub called AlexNet, which was basically a way of taking ImageNet and running it through a deep neural network.
Interestingly, he also added the option to use GPUs, graphical processing units, to make the thing run much, much faster. He put all of that up as open source on GitHub, and from 2012 on, it just took off. I have a friend here in DC who had the first DARPA contract for a defense-related use of deep learning. It was a two-person company, and they took on a DARPA challenge for incoming-target identification for F-16 fighters. Competing against all these huge defense contractors, they were able to apply deep learning to that data set and beat the closest competition by 25x. We've seen this down the line: deep learning has some really amazing consequences. It all started in 2012, and it'll take about 12 years to be commoditized. That gets us into the mid-2020s, so we're seeing interesting things with the big models right now.
Paco Nathan of O'Reilly Media's R & D Group discusses the role of big models in the commoditization of AI in this clip from SpeechTEK 2018.