
Commit 923a847

Merge pull request BVLC#260 from kloudkl/fix_doc_typos
Fix doc typos
2 parents: 85bafd5 + 6ef936e

3 files changed (+15, -7 lines)

docs/feature_extraction.md (7 additions, 3 deletions)
@@ -20,11 +20,11 @@ We'll make a temporary folder to store things into.

 Generate a list of the files to process.
 We're going to use the images that ship with caffe.

-    find `pwd`/examples/images -type f -exec echo {} \; > examples/_temp/file_list.txt
+    find `pwd`/examples/images -type f -exec echo {} \; > examples/_temp/temp.txt

 The `ImagesLayer` we'll use expects labels after each filenames, so let's add a 0 to the end of each line

-    sed "s/$/ 0/" examples/_temp/file_list.txt > examples/_temp/file_list.txt
+    sed "s/$/ 0/" examples/_temp/temp.txt > examples/_temp/file_list.txt

 Define the Feature Extraction Network Architecture
 --------------------------------------------------
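
Editor's note: the bug these first two changes fix is that `sed` was reading from and writing to the same file; the shell truncates `file_list.txt` for output before `sed` ever reads it, leaving an empty list. A minimal sketch of the corrected two-step flow, assuming you run from the caffe root:

    # Sketch of the fixed pipeline (run from the caffe root; hypothetical check).
    mkdir -p examples/_temp
    # 1) Write one absolute image path per line to a scratch file.
    find `pwd`/examples/images -type f -exec echo {} \; > examples/_temp/temp.txt
    # 2) Append a dummy label "0" to every line, writing to a DIFFERENT file so
    #    sed never reads the file the shell has already truncated for output.
    sed "s/$/ 0/" examples/_temp/temp.txt > examples/_temp/file_list.txt
    head examples/_temp/file_list.txt   # e.g. /path/to/caffe/examples/images/cat.jpg 0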
@@ -48,7 +48,7 @@ Extract Features

 Now everything necessary is in place.

-    build/tools/extract_features.bin models/caffe_reference_imagenet_model examples/_temp/imagenet_val.prototxt fc7 examples/_temp/features 10
+    build/tools/extract_features.bin examples/imagenet/caffe_reference_imagenet_model examples/_temp/imagenet_val.prototxt fc7 examples/_temp/features 10

 The name of feature blob that you extract is `fc7`, which represents the highest level feature of the reference model.
 We can use any other layer, as well, such as `conv5` or `pool3`.
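
Editor's note: a hedged breakdown of the corrected command's five positional arguments, inferred from the surrounding doc text; the argument names below are descriptive, not official:

    # Sketch annotating each positional argument of extract_features.bin:
    #   arg 1: pretrained model weights   -> examples/imagenet/caffe_reference_imagenet_model
    #   arg 2: model definition prototxt  -> examples/_temp/imagenet_val.prototxt
    #   arg 3: feature blob to extract    -> fc7 (conv5 or pool3 work as well)
    #   arg 4: output LevelDB directory   -> examples/_temp/features
    #   arg 5: number of data mini-batches -> 10
    build/tools/extract_features.bin \
        examples/imagenet/caffe_reference_imagenet_model \
        examples/_temp/imagenet_val.prototxt \
        fc7 examples/_temp/features 10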
@@ -57,6 +57,10 @@ The last parameter above is the number of data mini-batches.

 The features are stored to LevelDB `examples/_temp/features`, ready for access by some other code.

+If you meet with the error "Check failed: status.ok() Failed to open leveldb examples/_temp/features", it is because the directory examples/_temp/features has been created the last time you run the command. Remove it and run again.
+
+    rm -rf examples/_temp/features/
+
 If you'd like to use the Python wrapper for extracting features, check out the [layer visualization notebook](http://nbviewer.ipython.org/github/BVLC/caffe/blob/master/examples/filter_visualization.ipynb).

 Clean Up
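
Editor's note: putting the new troubleshooting advice together, a re-runnable sketch simply clears the stale LevelDB before extracting, assuming the previously extracted features are disposable:

    # Re-runnable wrapper: extract_features.bin refuses to open an existing
    # LevelDB, so remove any leftover output directory first.
    rm -rf examples/_temp/features/
    build/tools/extract_features.bin examples/imagenet/caffe_reference_imagenet_model \
        examples/_temp/imagenet_val.prototxt fc7 examples/_temp/features 10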

docs/imagenet_training.md (2 additions, 2 deletions)
@@ -9,7 +9,7 @@ Yangqing's Recipe on Brewing ImageNet

 "All your braincells are belong to us."
     - Caffeine

-We are going to describe a reference implementation for the approach first proposed by Krizhevsky, Sutskever, and Hinton in their [NIPS 2012 paper](http://books.nips.cc/papers/files/nips25/NIPS2012_0534.pdf). Since training the whole model takes some time and energy, we provide a model, trained in the same way as we describe here, to help fight global warming. If you would like to simply use the pretrained model, check out the [Pretrained ImageNet](imagenet_pretrained.html) page. *Note that the pretrained model is for academic research / non-commercial use only*.
+We are going to describe a reference implementation for the approach first proposed by Krizhevsky, Sutskever, and Hinton in their [NIPS 2012 paper](http://books.nips.cc/papers/files/nips25/NIPS2012_0534.pdf). Since training the whole model takes some time and energy, we provide a model, trained in the same way as we describe here, to help fight global warming. If you would like to simply use the pretrained model, check out the [Pretrained ImageNet](getting_pretrained_models.html) page. *Note that the pretrained model is for academic research / non-commercial use only*.

 To clarify, by ImageNet we actually mean the ILSVRC12 challenge, but you can easily train on the whole of ImageNet as well, just with more disk space, and a little longer training time.
@@ -99,4 +99,4 @@ Parting Words

 Hope you liked this recipe! Many researchers have gone further since the ILSVRC 2012 challenge, changing the network architecture and/or finetuning the various parameters in the network. The recent ILSVRC 2013 challenge suggests that there are quite some room for improvement. **Caffe allows one to explore different network choices more easily, by simply writing different prototxt files** - isn't that exciting?

-And since now you have a trained network, check out how to use it: [Running Pretrained ImageNet](imagenet_pretrained.html). This time we will use Python, but if you have wrappers for other languages, please kindly send a pull request!
+And since now you have a trained network, check out how to use it: [Running Pretrained ImageNet](getting_pretrained_models.html). This time we will use Python, but if you have wrappers for other languages, please kindly send a pull request!

docs/installation.md (6 additions, 2 deletions)
@@ -45,7 +45,11 @@ You will also need other packages, most of which can be installed via apt-get us

     sudo apt-get install libprotobuf-dev libleveldb-dev libsnappy-dev libopencv-dev libboost-all-dev libhdf5-serial-dev

-The only exception being the google logging library, which does not exist in the Ubuntu 12.04 repository. To install it, do:
+On CentOS or RHEL, you can install via yum using:
+
+    sudo yum install protobuf-devel leveldb-devel snappy-devel opencv-devel boost-devel hdf5-devel
+
+The only exception being the google logging library, which does not exist in the Ubuntu 12.04 or CentOS/RHEL repository. To install it, do:

     wget https://google-glog.googlecode.com/files/glog-0.3.3.tar.gz
     tar zxvf glog-0.3.3.tar.gz
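
Editor's note: the hunk ends at the download and unpack steps; for completeness, the remainder of a typical autotools build of glog 0.3.3 would look like the sketch below. The configure/make steps are the standard autotools flow, assumed rather than quoted from this diff:

    # Assumed standard autotools flow for glog 0.3.3 (not shown in this hunk).
    cd glog-0.3.3
    ./configure
    make
    sudo make install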
@@ -62,7 +66,7 @@ After setting all the prerequisites, you should modify the `Makefile.config` fil

 ## Compilation

-After installing the prerequisites, simply do `make all` to compile Caffe. If you would like to compile the Python and Matlab wrappers, do
+After installing the prerequisites, simply do `make all -j10` in which 10 is the number of parallel compilation threads to compile Caffe. If you would like to compile the Python and Matlab wrappers, do

     make pycaffe
     make matcaffe
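
Editor's note: the hard-coded `-j10` is arbitrary; a portable variant sizes the job count to the machine instead, assuming GNU coreutils' `nproc` is available:

    # Hedged variant: match parallel jobs to available cores instead of hard-coding 10.
    make all -j$(nproc)
    make pycaffe
    make matcaffe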
