
Caffe


Caffe is a deep learning framework made with expression, speed, and modularity in mind. It is developed by the Berkeley Vision and Learning Center (BVLC) and community contributors.

Check out the project site for all the details, such as tutorial documentation, the BVLC reference models and community model zoo, installation instructions, and step-by-step examples.


Please join the caffe-users group or gitter chat to ask questions and talk about methods and models. Framework development discussions and thorough bug reports are collected on Issues.

Happy brewing!

License and Citation

Caffe is released under the BSD 2-Clause license. The BVLC reference models are released for unrestricted use.

Please cite Caffe in your publications if it helps your research:

@article{jia2014caffe,
  Author = {Jia, Yangqing and Shelhamer, Evan and Donahue, Jeff and Karayev, Sergey and Long, Jonathan and Girshick, Ross and Guadarrama, Sergio and Darrell, Trevor},
  Journal = {arXiv preprint arXiv:1408.5093},
  Title = {Caffe: Convolutional Architecture for Fast Feature Embedding},
  Year = {2014}
}

SkimCaffe Specific Description

We assume you have a recent Intel compiler and MKL installed, and a recent x86 CPU with AVX2 or AVX512 support. Tested environment: Intel compiler version 15.0.3.187 with boost 1.59.0. The direct sparse convolution and sparse fully-connected layers are currently only tested with AlexNet. Direct sparse convolution is described in more detail at: https://arxiv.org/abs/1608.01409
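
If you are unsure whether your machine meets the AVX2/AVX512 requirement, one quick check on Linux (our addition, not part of the upstream instructions) is to inspect the CPU flags:

# Print the AVX flags the CPU advertises; empty output means neither
# AVX2 nor AVX-512 is available, so the optimized sparse kernels will not apply.
grep -o -w -e avx2 -e avx512f /proc/cpuinfo | sort -u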

  1. Set up the Intel compiler environment (compilervars.sh or compilervars.csh); see the combined sketch after step 4 below

  2. Compile SpMP:

cd experiments/sparsity/SpMP
make

  3. Build Caffe as usual

  4. Test:

bzip2 -d models/bvlc_reference_caffenet/fc_0.1_ft_caffenet_0.57368_5e-05.caffemodel.bz2
build/tools/caffe.bin test -model models/bvlc_reference_caffenet/test_direct_sconv.prototxt -weights models/bvlc_reference_caffenet/fc_0.1_ft_caffenet_0.57368_5e-05.caffemodel -iterations 3
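
Put together, a full run of steps 1-4 might look like the sketch below. The compilervars.sh location and the intel64 argument are typical defaults for an Intel compiler install rather than something this repository prescribes, and the build step assumes the standard Caffe Makefile build; adjust both for your setup.

# 1. Set up the Intel compiler environment (assumed default install path).
source /opt/intel/bin/compilervars.sh intel64

# 2. Compile SpMP.
(cd experiments/sparsity/SpMP && make)

# 3. Build Caffe as usual (standard Makefile build; edit Makefile.config for your system).
cp Makefile.config.example Makefile.config
make all -j

# 4. Decompress the pruned CaffeNet model and run the sparse-convolution test.
bzip2 -d models/bvlc_reference_caffenet/fc_0.1_ft_caffenet_0.57368_5e-05.caffemodel.bz2
build/tools/caffe.bin test \
  -model models/bvlc_reference_caffenet/test_direct_sconv.prototxt \
  -weights models/bvlc_reference_caffenet/fc_0.1_ft_caffenet_0.57368_5e-05.caffemodel \
  -iterations 3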
