# CondenseNet

Repository: https://github.com/ShichenLiu/CondenseNet (Python 97.2%)

This repository contains the code (in PyTorch) for the paper "CondenseNet: An Efficient DenseNet using Learned Group Convolutions" by Gao Huang*, Shichen Liu*, Laurens van der Maaten and Kilian Weinberger (* authors contributed equally).

## Citation

If you find our project useful in your research, please consider citing:
## Introduction

CondenseNet is a novel, computationally efficient convolutional network architecture. It combines dense connectivity between layers with a mechanism to remove unused connections. The dense connectivity facilitates feature re-use in the network, whereas learned group convolutions remove connections between layers for which this feature re-use is superfluous. At test time, our model can be implemented using standard group convolutions, allowing for efficient computation in practice. Our experiments demonstrate that CondenseNets are much more efficient than other compact convolutional networks such as MobileNets and ShuffleNets.

*Figure 1: Learned group convolution with G = C = 3.*

*Figure 2: CondenseNets with fully dense connectivity and increasing growth rate.*

## Usage

### Dependencies

The code is written in PyTorch, so Python and PyTorch are required.

### Train

As an example, use the following command to train a CondenseNet on ImageNet:
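The original training command did not survive extraction. The sketch below is a plausible invocation, assuming the repository's `main.py` entry point and its `--model`, `--stages`, `--growth`, and `--gpu` options; the exact flag names and values are recollections and may differ from the current code:

```shell
# Hypothetical example: train CondenseNet on ImageNet across 8 GPUs.
# /PATH/TO/IMAGENET is a placeholder for your local ImageNet directory.
python main.py --model condensenet -b 256 -j 20 /PATH/TO/IMAGENET \
  --stages 4-6-8-10-8 --growth 8-16-32-64-128 --gpu 0,1,2,3,4,5,6,7 --resume
```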
As another example, use the following command to train a CondenseNet on CIFAR-10:
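As with the ImageNet command, the CIFAR-10 command was lost in extraction; a plausible sketch, assuming the same `main.py` interface (flag names and values are from memory, not verified against the current repository):

```shell
# Hypothetical example: train CondenseNet on CIFAR-10 on a single GPU.
# "cifar10" selects the dataset; stages/growth are per-block settings.
python main.py --model condensenet -b 64 -j 12 cifar10 \
  --stages 14-14-14 --growth 8-16-32 --gpu 0 --resume
```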
### Evaluation

We take the ImageNet model trained above as an example. To evaluate the trained model, use:
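The evaluation command was also stripped during extraction. A plausible sketch, assuming `main.py` accepts an `--evaluate` switch alongside the training options used above (an assumption, not confirmed by the source):

```shell
# Hypothetical example: evaluate the trained (un-converted) ImageNet model.
python main.py --model condensenet -b 64 -j 20 /PATH/TO/IMAGENET \
  --stages 4-6-8-10-8 --growth 8-16-32-64-128 --gpu 0 --resume --evaluate
```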
or use
Note that these models are still the large models. To convert a model to the group-convolution version described in the paper, use the conversion option:
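The conversion command itself is missing from this copy. A sketch of what it might look like, assuming a `--convert-from` option that points at the trained checkpoint (the flag name is a guess based on the surrounding text, not taken from the repository):

```shell
# Hypothetical example: convert a trained model to its group-convolution form.
# --convert-from is an assumed flag; /PATH/TO/MODEL is a placeholder checkpoint path.
python main.py --model condensenet -b 64 -j 20 /PATH/TO/IMAGENET \
  --stages 4-6-8-10-8 --growth 8-16-32-64-128 --gpu 0 \
  --resume --evaluate --convert-from /PATH/TO/MODEL
```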
Finally, to directly load a converted model (that is, a CondenseNet), use a converted model file in combination with the corresponding evaluation option:
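The command for loading a converted model is likewise missing. A sketch under the assumption that the repository provides a converted-model variant and an `--evaluate-from` option (both names are guesses for illustration):

```shell
# Hypothetical example: evaluate a previously converted CondenseNet directly.
# "condensenet_converted" and --evaluate-from are assumed names.
python main.py --model condensenet_converted -b 64 -j 20 /PATH/TO/IMAGENET \
  --stages 4-6-8-10-8 --growth 8-16-32-64-128 --gpu 0 \
  --evaluate-from /PATH/TO/MODEL
```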
### Other Options

We also include a DenseNet implementation in this repository.

## Results

### Results on ImageNet
### Results on CIFAR
(* trained for 600 epochs)

### Inference time on ARM platform
## Contact

[email protected]

We are working on implementations for other frameworks.