Open-source project: gwtaylor/convnet_matlab
Repository URL: https://github.com/gwtaylor/convnet_matlab
Primary language: MATLAB (42.8%)

# convnet_matlab

Simple 2-d convolutional net demo for Matlab.

Author: Graham Taylor

Originally written April 12, 2010.

## Data

We provide 32x32 downsampled data for the Small NORB dataset:

* http://dl.dropbox.com/u/13294233/smallnorb/smallnorb-5x01235x9x18x6x2x32x32-testing-dat-matlab-bicubic.mat
* http://dl.dropbox.com/u/13294233/smallnorb/smallnorb-5x01235x9x18x6x2x96x96-testing-cat-matlab.mat
* http://dl.dropbox.com/u/13294233/smallnorb/smallnorb-5x01235x9x18x6x2x96x96-testing-info-matlab.mat
* http://dl.dropbox.com/u/13294233/smallnorb/smallnorb-5x46789x9x18x6x2x32x32-training-dat-matlab-bicubic.mat
* http://dl.dropbox.com/u/13294233/smallnorb/smallnorb-5x46789x9x18x6x2x96x96-training-cat-matlab.mat
* http://dl.dropbox.com/u/13294233/smallnorb/smallnorb-5x46789x9x18x6x2x96x96-training-info-matlab.mat

Original dataset: http://www.cs.nyu.edu/~ylclab/data/norb-v1.0-small/

Note that we only train on image 1 of the stereo pair.

## Setup

You will need to change the path defined at the top of smallnorb_makebatches.m to reflect the actual location of the NORB data, e.g.

    datasetpath = '~/Dropbox/Public/smallnorb';

## Architecture and Method

The convolutional net in this example has the following architecture:

Data -> Convolutional Layer 1 -> Subsampling Layer 1 -> Convolutional Layer 2 -> Subsampling Layer 2

The output of Subsampling L2 is vectorized and fully connected to the output layer, which is multinomial (i.e. 1-of-K). Here, the outputs correspond to object categories (K=5).

The data is connected to each map of Convolutional L1. We build a random connectivity map to determine which maps of Subsampling L1 are connected to which maps of Convolutional L2.
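One way such a random connectivity map could be built is sketched below; the map counts and fan-in are illustrative assumptions, not values taken from this repository.

```matlab
% Sketch only: random connectivity between Subsampling L1 maps and
% Convolutional L2 maps. All sizes are assumed for illustration.
numS1Maps = 6;    % assumed number of Subsampling L1 maps
numC2Maps = 16;   % assumed number of Convolutional L2 maps
fanIn     = 4;    % assumed number of S1 maps feeding each C2 map

connMap = false(numS1Maps, numC2Maps);
for j = 1:numC2Maps
    idx = randperm(numS1Maps);         % random ordering of the S1 maps
    connMap(idx(1:fanIn), j) = true;   % connect a random subset to C2 map j
end
% connMap(i,j) is true when S1 map i is an input to C2 map j.
```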
We use Carl Rasmussen's "minimize" conjugate gradient code to train the network. Therefore, we define a function which returns:

* the cross-entropy error, and
* the gradient of that error with respect to the (vectorized) parameters of the convnet.

The benefit of this is that we can use Carl's "checkgrad" function to check the gradients of the cross-entropy error with respect to the parameters of the convnet by the method of finite differences.

Note that for the first 6 epochs, only the topmost (fully-connected) weights are updated while the other parameters are held constant.
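Both minimize.m and checkgrad.m call an objective of the form [f, df] = fn(theta, ...), where theta is a single parameter vector. The sketch below shows only that interface, using a hypothetical function name (convnet_cost) and placeholder internals; it is not the repository's actual objective.

```matlab
% Sketch of the [f, df] interface expected by minimize.m and checkgrad.m.
% "convnet_cost" is a hypothetical name and its body is a placeholder.
function [f, df] = convnet_cost(theta, X, targets)
% theta   : all convnet parameters packed into one column vector
% X       : a minibatch of input images
% targets : 1-of-K target labels (K = 5 object categories)

% ... unpack theta, run the forward pass, compute the cross-entropy
% error f, then backpropagate to obtain the gradient df ...
f  = 0;                    % placeholder for the cross-entropy error
df = zeros(size(theta));   % placeholder gradient, same size as theta
end

% Example usage (theta0, X, targets assumed to exist):
%   d     = checkgrad('convnet_cost', theta0, 1e-5, X, targets);  % d should be tiny
%   theta = minimize(theta0, 'convnet_cost', 50, X, targets);     % 50 line searches
```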
## Running the demo

There are three entry points:

The bulk of the code is in:
And the optimized counterparts: