
Thursday, February 26, 2015

Deep means complex architecture

I've been working my way through the Li Deng and Dong Yu survey monograph. My takeaway impression so far is that the "Deep" in the Deep Learning paradigm stands for hierarchical, or complex modular, architectures.

And what are the modules and learning techniques at the bottom of the hierarchy? Well, it seems the base modules are close cousins of the non-deep, i.e. shallow, nets of the '80s. The original unsupervised and supervised learning algorithms, like backpropagation, are still very much in use.
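To make the point concrete, here is a minimal sketch of one of those '80s-era algorithms: a single-hidden-layer net trained with plain backpropagation. This is an illustration of the technique only, not code from the monograph; the XOR task, network size, and learning rate are my own assumptions.

```python
# Minimal single-hidden-layer net trained with backpropagation.
# Sketch only: XOR is a stand-in task, hyperparameters are assumptions.
import numpy as np

rng = np.random.default_rng(1)
sigmoid = lambda x: 1.0 / (1.0 + np.exp(-x))

X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([[0], [1], [1], [0]], dtype=float)  # XOR targets

W1 = rng.standard_normal((2, 8)); b1 = np.zeros(8)
W2 = rng.standard_normal((8, 1)); b2 = np.zeros(1)
lr = 0.5

for step in range(20000):
    # Forward pass through two sigmoid layers.
    h = sigmoid(X @ W1 + b1)
    out = sigmoid(h @ W2 + b2)
    # Backward pass: chain rule, squared-error loss.
    d_out = (out - y) * out * (1 - out)
    d_h = (d_out @ W2.T) * h * (1 - h)
    W2 -= lr * h.T @ d_out
    b2 -= lr * d_out.sum(axis=0)
    W1 -= lr * X.T @ d_h
    b1 -= lr * d_h.sum(axis=0)

print(np.round(out.ravel(), 2))  # trained predictions for the four inputs
```

The whole algorithm is a dozen lines of linear algebra, which is exactly why these base modules survived into the deep era.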

As an analogy, the relationship is a little like that between programming languages, where C++ and Objective-C were built on top of the implementation technology of an earlier language, C.

One of the best tutorial introductions to the original material is doubtless still the pair of PDP books, the heavyweight manifesto from Rumelhart & McClelland that put connectionism on the map as a full equal of symbolic AI.



These books are still in print, ebook versions are floating around the web, and there is even a free, up-to-date software package with a manual that explains all the algorithms!

Anyway, apart from searching the literature, I've grabbed some Restricted Boltzmann Machine code and run it. Here are today's screenshots: original data on the left, reconstruction on the right. My PC is finally doing more than web browsing ... I'm getting used to installing and running Python libraries and code.
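For readers who want to try the same experiment, here is a minimal sketch of what such RBM code does: train with one step of contrastive divergence (CD-1), then reconstruct the input through the hidden layer. This is not the code or data from my run; the toy 6-bit patterns, layer sizes, and learning rate are stand-in assumptions.

```python
# Minimal Restricted Boltzmann Machine trained with CD-1.
# Sketch only: toy binary patterns stand in for the real data.
import numpy as np

rng = np.random.default_rng(0)

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

# Toy binary data: two repeated prototype patterns.
data = np.array([[1, 1, 1, 0, 0, 0],
                 [0, 0, 0, 1, 1, 1]] * 20, dtype=float)

n_visible, n_hidden = 6, 3
W = 0.1 * rng.standard_normal((n_visible, n_hidden))
b_v = np.zeros(n_visible)   # visible biases
b_h = np.zeros(n_hidden)    # hidden biases
lr = 0.1

for epoch in range(500):
    v0 = data
    # Positive phase: sample hidden units given the data.
    p_h0 = sigmoid(v0 @ W + b_h)
    h0 = (rng.random(p_h0.shape) < p_h0).astype(float)
    # Negative phase: reconstruct visibles, then re-infer hiddens.
    p_v1 = sigmoid(h0 @ W.T + b_v)
    p_h1 = sigmoid(p_v1 @ W + b_h)
    # CD-1 update: difference of data and reconstruction correlations.
    W += lr * (v0.T @ p_h0 - p_v1.T @ p_h1) / len(data)
    b_v += lr * (v0 - p_v1).mean(axis=0)
    b_h += lr * (p_h0 - p_h1).mean(axis=0)

# "Original on the left, reconstruction on the right":
recon = sigmoid(sigmoid(data @ W + b_h) @ W.T + b_v)
print(np.round(recon[0], 2))  # should be close to [1 1 1 0 0 0]
```

After training, the reconstruction of each pattern comes back close to the original, which is essentially what the side-by-side screenshots show.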



