@@ -1,162 +0,0 @@
# # This will run on Travis' 'new' container-based infrastructure

# # Blacklist
# branches:
#   only:
#     - master

# # Environment variables + OS + other parameters
# global:
#   - GH_REPO_NAME: openpose
#   - DOXYFILE: $CI_BUILD_DIR/.doc_autogeneration.doxygen
#   # Set this in Environment Variables on travis-ci.org
#   # - GH_REPO_REF: github.com/<user_name>/openpose.git
# matrix:
#   # Use a build matrix to test many builds in parallel
#   # envvar defaults:
#   #   WITH_CMAKE: true
#   #   WITH_PYTHON: false
#   #   WITH_CUDA: true
#   #   WITH_CUDNN: true
#   #   WITH_OPEN_CL: false
#   #   WITH_MKL: false
#   include:
#     # Ubuntu 16.04
#     # Ubuntu 16.04 - Default - CMake - CUDA
#     - os: linux
#       dist: xenial
#       env: NAME="U16-default-cmake-cuda8"
#       sudo: required
#     # Ubuntu 16.04 - Python - CMake - CUDA
#     - os: linux
#       dist: xenial
#       env: NAME="U16-python-cmake-cuda8" WITH_PYTHON=true
#       sudo: required
#       # Generate and deploy documentation
#       after_success:
#         - cd $CI_BUILD_DIR
#         - chmod +x scripts/generate_gh_pages.sh
#         - ./scripts/generate_gh_pages.sh
#     # Ubuntu 16.04 - Python - CMake - CPU
#     - os: linux
#       dist: xenial
#       env: NAME="U16-python-cmake-cpu" WITH_PYTHON=true WITH_CUDA=false
#       sudo: required
#     # Ubuntu 16.04 - Python - CMake - OpenCL
#     - os: linux
#       dist: xenial
#       env: NAME="U16-python-cmake-opencl" WITH_PYTHON=true WITH_CUDA=false WITH_OPEN_CL=true
#       sudo: required
#     # Ubuntu 16.04 - Python - CMake - CPU - Debug
#     - os: linux
#       dist: xenial
#       env: NAME="U16-python-cmake-cpu-debug" WITH_PYTHON=true WITH_CUDA=false WITH_DEBUG=true
#       sudo: required
#     # Ubuntu 16.04 - Python - CMake - CPU - Unity
#     - os: linux
#       dist: xenial
#       env: NAME="U16-python-cmake-cpu-unity" WITH_PYTHON=true WITH_UNITY=true WITH_CUDA=false
#       sudo: required

#     # Mac OSX
#     # Mac OSX - Python - CMake - CPU
#     - os: osx
#       osx_image: xcode9.4 # xcode10.1 does not work with Python # Versions: https://docs.travis-ci.com/user/languages/objective-c#supported-xcode-versions
#       env: NAME="OSX-python-cmake-cpu" WITH_CUDA=false WITH_PYTHON=true
#       sudo: required
#     # Mac OSX - Python - CMake - OpenCL
#     - os: osx
#       osx_image: xcode10.1 # Versions: https://docs.travis-ci.com/user/languages/objective-c#supported-xcode-versions
#       env: NAME="OSX-default-cmake-opencl" WITH_CUDA=false WITH_OPEN_CL=true
#       sudo: required
#     # Mac OSX - Python - CMake - CPU - Debug
#     - os: osx
#       osx_image: xcode9.4 # xcode10.1 does not work with Python # Versions: https://docs.travis-ci.com/user/languages/objective-c#supported-xcode-versions
#       env: NAME="OSX-python-cmake-cpu-debug" WITH_CUDA=false WITH_PYTHON=true WITH_DEBUG=true
#       sudo: required
#     # Mac OSX - Python - CMake - CPU - Unity
#     - os: osx
#       osx_image: xcode9.4 # xcode10.1 does not work with Python # Versions: https://docs.travis-ci.com/user/languages/objective-c#supported-xcode-versions
#       env: NAME="OSX-python-cmake-cpu-unity" WITH_CUDA=false WITH_PYTHON=true WITH_UNITY=true
#       sudo: required
#     # Mac OSX - Default - CMake - CPU
#     - os: osx
#       osx_image: xcode10.1 # Versions: https://docs.travis-ci.com/user/languages/objective-c#supported-xcode-versions
#       env: NAME="OSX-default-cmake-cpu" WITH_CUDA=false
#       sudo: required

#     # # TO-DO: To be implemented
#     # # Windows
#     # # Windows - Default - CMake - CUDA
#     # - os: windows
#     #   env: NAME="W10-default-cmake-cuda8"

#     # Ubuntu (others)
#     # Ubuntu 16.04 - Default - CMake - CPU
#     - os: linux
#       dist: xenial
#       env: NAME="U16-default-cmake-cpu" WITH_CUDA=false
#       sudo: required
#     # Ubuntu 16.04 - Default - Make - CUDA
#     - os: linux
#       dist: xenial
#       env: NAME="U16-default-make-cuda8" WITH_CMAKE=false
#       sudo: required
#     # # TO-DO: To be implemented
#     # # Ubuntu 16.04 - Default - CMake - CPU MKL
#     # - os: linux
#     #   dist: xenial
#     #   env: NAME="U16-default-cmake-cpu-mkl" WITH_CUDA=false WITH_MKL=true
#     #   sudo: required
#     # # Ubuntu 16.04 - Python - CMake - OpenCL
#     # - os: linux
#     #   dist: xenial
#     #   env: NAME="U16-python-cmake-opencl" WITH_PYTHON=true WITH_CUDA=false WITH_OPEN_CL=true
#     #   sudo: required
#     # # Unnecessary/redundant ones
#     # # Ubuntu 16.04 - Default - CMake - CUDA - no cuDNN
#     # - os: linux
#     #   dist: xenial
#     #   env: NAME="U16-default-cmake-cuda8-nocudnn" WITH_CUDNN=false
#     #   sudo: required
#     # Ubuntu 14.04 - Default - CMake - CPU
#     - os: linux
#       dist: trusty
#       env: NAME="U14-default-cmake-cpu" WITH_CUDA=false
#       sudo: required
#     # Ubuntu 14.04 - Default - Make - CUDA
#     - os: linux
#       dist: trusty
#       env: NAME="U14-default-make-cuda8" WITH_CMAKE=false
#       sudo: required
#     # # Unnecessary/redundant ones
#     # # Ubuntu 14.04 - Default - CMake - CUDA
#     # - os: linux
#     #   dist: trusty
#     #   env: NAME="U14-default-cmake-cuda8"
#     #   sudo: required

# # Install apt dependencies
# addons:
#   apt:
#     packages:
#       - doxygen
#       - doxygen-doc
#       - doxygen-latex
#       - doxygen-gui
#       - graphviz

# # Install Caffe and OP dependencies
# install:
#   - if [[ "$CI_OS_NAME" == "linux" ]]; then sudo bash scripts/CI/install_deps_ubuntu.sh ; fi
#   - if [[ "$CI_OS_NAME" == "osx" ]]; then bash scripts/CI/install_deps_osx.sh ; fi
#   - if [[ "$CI_OS_NAME" == "windows" ]]; then exit 99 ; fi

# # Running CMake
# before_script:
#   - bash scripts/CI/configure.sh

# # Build your code, e.g., by calling make
# script:
#   - bash scripts/CI/run_make.sh
#   - bash scripts/CI/run_tests.sh
@@ -1,13 +0,0 @@
In order to recover Travis:
1. Uncomment all the lines of code in `.travis.yml` and move it back to the main folder (i.e., at the same level as the global README.md and CMakeLists.txt files).
2. Move `generate_gh_pages.sh` back to `scripts/generate_gh_pages.sh`.
3. Re-add the table of badges to the README.md:

| |`Default Config` |`CUDA (+Python)` |`CPU (+Python)` |`OpenCL (+Python)`| `Debug` | `Unity` |
| :---: | :---: | :---: | :---: | :---: | :---: | :---: |
| **`Linux`** | [![Status](https://travis-matrix-badges.herokuapp.com/repos/CMU-Perceptual-Computing-Lab/openpose/branches/master/1)](https://travis-ci.org/CMU-Perceptual-Computing-Lab/openpose) | [![Status](https://travis-matrix-badges.herokuapp.com/repos/CMU-Perceptual-Computing-Lab/openpose/branches/master/2)](https://travis-ci.org/CMU-Perceptual-Computing-Lab/openpose) | [![Status](https://travis-matrix-badges.herokuapp.com/repos/CMU-Perceptual-Computing-Lab/openpose/branches/master/3)](https://travis-ci.org/CMU-Perceptual-Computing-Lab/openpose) | [![Status](https://travis-matrix-badges.herokuapp.com/repos/CMU-Perceptual-Computing-Lab/openpose/branches/master/4)](https://travis-ci.org/CMU-Perceptual-Computing-Lab/openpose) | [![Status](https://travis-matrix-badges.herokuapp.com/repos/CMU-Perceptual-Computing-Lab/openpose/branches/master/5)](https://travis-ci.org/CMU-Perceptual-Computing-Lab/openpose) | [![Status](https://travis-matrix-badges.herokuapp.com/repos/CMU-Perceptual-Computing-Lab/openpose/branches/master/6)](https://travis-ci.org/CMU-Perceptual-Computing-Lab/openpose) |
| **`MacOS`** | [![Status](https://travis-matrix-badges.herokuapp.com/repos/CMU-Perceptual-Computing-Lab/openpose/branches/master/7)](https://travis-ci.org/CMU-Perceptual-Computing-Lab/openpose) | | [![Status](https://travis-matrix-badges.herokuapp.com/repos/CMU-Perceptual-Computing-Lab/openpose/branches/master/7)](https://travis-ci.org/CMU-Perceptual-Computing-Lab/openpose) | [![Status](https://travis-matrix-badges.herokuapp.com/repos/CMU-Perceptual-Computing-Lab/openpose/branches/master/8)](https://travis-ci.org/CMU-Perceptual-Computing-Lab/openpose) | [![Status](https://travis-matrix-badges.herokuapp.com/repos/CMU-Perceptual-Computing-Lab/openpose/branches/master/9)](https://travis-ci.org/CMU-Perceptual-Computing-Lab/openpose) | [![Status](https://travis-matrix-badges.herokuapp.com/repos/CMU-Perceptual-Computing-Lab/openpose/branches/master/10)](https://travis-ci.org/CMU-Perceptual-Computing-Lab/openpose) | [![Status](https://travis-matrix-badges.herokuapp.com/repos/CMU-Perceptual-Computing-Lab/openpose/branches/master/11)](https://travis-ci.org/CMU-Perceptual-Computing-Lab/openpose) |
| **`Windows`** | [![Status](https://ci.appveyor.com/api/projects/status/5leescxxdwen77kg/branch/master?svg=true)](https://ci.appveyor.com/project/gineshidalgo99/openpose/branch/master) | | | | |
<!--
Note: Currently using [travis-matrix-badges](https://github.com/bjfish/travis-matrix-badges) instead of the traditional [![Build Status](https://travis-ci.org/CMU-Perceptual-Computing-Lab/openpose.svg?branch=master)](https://travis-ci.org/CMU-Perceptual-Computing-Lab/openpose)
-->
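Step 1 above (uncommenting the archived `.travis.yml`) can be scripted instead of done by hand. A minimal sketch, where `travis_commented.yml` is a hypothetical stand-in for the commented-out file:

```shell
# Create a tiny stand-in for the commented-out .travis.yml (hypothetical content).
printf '# language: cpp\n# sudo: required\n' > travis_commented.yml
# Strip one leading "# " from every line, i.e., "uncomment" the whole file.
sed -E 's/^# //' travis_commented.yml > .travis.yml
cat .travis.yml
# prints:
# language: cpp
# sudo: required
```

Note that this removes only the outer `# ` layer, so lines that were real comments in the original file (written as `# # ...` in the archive) correctly stay commented.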
@@ -1,104 +0,0 @@
#!/bin/sh
################################################################################
# Title        : generate_gh_pages.sh
# Date created : 2016/02/22
# Notes        :
__AUTHOR__="openpose"
# Preconditions:
# - Packages doxygen doxygen-doc doxygen-latex doxygen-gui graphviz
#   must be installed.
# - Doxygen configuration file must have the destination directory empty and
#   source code directory with a $(CI_BUILD_DIR) prefix.
# - A gh-pages branch should already exist. See below for more info on how to
#   create a gh-pages branch.
#
# Required global variables:
# - CI_BUILD_NUMBER : The number of the current build.
# - CI_COMMIT       : The commit that the current build is testing.
# - DOXYFILE        : The Doxygen configuration file.
# - GH_REPO_NAME    : The name of the repository.
# - GH_REPO_REF     : The GitHub reference to the repository.
# - GH_REPO_TOKEN   : Secure token to the GitHub repository.
#
# For information on how to encrypt variables for Travis CI please go to
# https://docs.travis-ci.com/user/environment-variables/#Encrypted-Variables
# or https://gist.github.com/vidavidorra/7ed6166a46c537d3cbd2
# For information on how to create a clean gh-pages branch from the master
# branch, please go to https://gist.github.com/vidavidorra/846a2fc7dd51f4fe56a0
#
# This script will generate Doxygen documentation and push the documentation to
# the gh-pages branch of a repository specified by GH_REPO_REF.
# Before this script is used there should already be a gh-pages branch in the
# repository.
#
################################################################################

################################################################################
##### Setup this script and get the current gh-pages branch.               #####
echo 'Setting up the script...'
# Exit with nonzero exit code if anything fails
set -e

# Create a clean working directory for this script.
mkdir code_docs
cd code_docs

# Get the current gh-pages branch
git clone -b gh-pages https://git@$GH_REPO_REF
cd $GH_REPO_NAME

##### Configure git.
# Set the push default to simple, i.e., push only the current branch.
git config --global push.default simple
# Pretend to be a user called Travis CI.
git config user.name "Travis CI"
git config user.email "travis@travis-ci.org"

# Remove everything currently in the gh-pages branch.
# GitHub is smart enough to know which files have changed and which files have
# stayed the same and will only update the changed files. So the gh-pages branch
# can be safely cleaned, and it is sure that everything pushed later is the new
# documentation.
rm -rf *

# Need to create a .nojekyll file to allow filenames starting with an underscore
# to be seen on the gh-pages site. Therefore creating an empty .nojekyll file.
# Presumably this is only needed when the SHORT_NAMES option in Doxygen is set
# to NO, which it is by default. So creating the file just in case.
echo "" > .nojekyll

################################################################################
##### Generate the Doxygen code documentation and log the output.          #####
echo 'Generating Doxygen code documentation...'
# Redirect both stderr and stdout to the log file AND the console.
echo "OUTPUT_DIRECTORY = " >> $DOXYFILE
doxygen $DOXYFILE 2>&1 | tee doxygen.log

################################################################################
##### Upload the documentation to the gh-pages branch of the repository.   #####
# Only upload if Doxygen successfully created the documentation.
# Check this by verifying that the html directory and the file html/index.html
# both exist. This is a good indication that Doxygen did its work.
if [ -d "html" ] && [ -f "html/index.html" ]; then

    echo 'Uploading documentation to the gh-pages branch...'
    # Add everything in this directory (the Doxygen code documentation) to the
    # gh-pages branch.
    # GitHub is smart enough to know which files have changed and which files
    # have stayed the same and will only update the changed files.
    git add --all

    # Commit the added files with a title and description containing the Travis
    # CI build number and the GitHub commit reference that issued this build.
    git commit -m "Deploy code docs to GitHub Pages Travis build: ${CI_BUILD_NUMBER}" -m "Commit: ${CI_COMMIT}"

    # Force push to the remote gh-pages branch.
    # The output is redirected to /dev/null to hide any sensitive credential
    # data that might otherwise be exposed.
    git push --force "https://${GH_REPO_TOKEN}@${GH_REPO_REF}" > /dev/null 2>&1
else
    echo '' >&2
    echo 'Warning: No documentation (html) files have been found!' >&2
    echo 'Warning: Not going to push the documentation to GitHub!' >&2
    exit 1
fi
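The script's upload guard (push only if both the `html` directory and `html/index.html` exist) can be exercised in isolation; this sketch simulates a successful Doxygen run with throwaway files:

```shell
# Work in a scratch directory so nothing else is touched.
cd "$(mktemp -d)"
# Simulate a successful Doxygen run: directory plus index page.
mkdir -p html && echo '<html></html>' > html/index.html
# Same guard as in generate_gh_pages.sh: push only if both exist.
if [ -d "html" ] && [ -f "html/index.html" ]; then
    echo "docs found: would push"
else
    echo "no docs: abort" >&2
fi
# prints: docs found: would push
```

Deleting either `html/index.html` or the whole `html` directory flips the guard to the abort branch, which is exactly the failure mode (empty Doxygen output) the original script protects against.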
@@ -1,85 +0,0 @@
### Posting rules
1. **No duplicated posts, only 1 new post opened a day, and up to 2 opened a week**. Otherwise, strict user bans will occur.
    - Check the [FAQ](https://github.com/CMU-Perceptual-Computing-Lab/openpose/blob/master/doc/05_faq.md) section, other GitHub issues, and general documentation before posting (e.g., **low-speed, out-of-memory, output format, 0-people detected, installation issues, ...**).
    - Keep posting all your issues in the same post.
    - No bans if you are unsure whether the post is duplicated!
2. **Fill in all** of the **Your System Configuration section** if you are facing an error or unexpected behavior. Some posts (e.g., feature requests) might not require it.
3. **No questions about training or 3rd-party libraries**:
    - OpenPose only implements testing. For training, check [OpenPose train](https://github.com/CMU-Perceptual-Computing-Lab/openpose_train).
    - Caffe errors/issues: check the [Caffe](http://caffe.berkeleyvision.org) documentation.
    - CUDA/cuDNN "check failed" errors: they are usually fixed by re-installing CUDA, then re-installing the proper cuDNN version, rebooting, and re-installing OpenPose. Otherwise, check the Nvidia/CUDA/cuDNN forums.
    - OpenCV errors: install the default/pre-compiled OpenCV or check for OpenCV online help.
4. Set a **proper issue title**: add the OS (Ubuntu, Windows) and be specific (e.g., do not call it: `Error`).
5. Only English comments.
6. Remove these posting rules from your post, but follow them!
Posts which do not follow these rules will be **ignored/deleted** and those **users banned** with no further clarification.



### Issue Summary



### Executed Command (if any)
Note: add `--logging_level 0 --disable_multi_thread` to get more detailed debug information.



### OpenPose Output (if any)



### Errors (if any)



### Type of Issue
Select the topic(s) of your post, delete the rest:
- Compilation/installation error
- Execution error
- Help wanted
- Question
- Enhancement / offering possible extensions / pull request / etc.
- Other (type your own type)



### Your System Configuration
1. **Whole console output** (if errors appeared): paste the error to [PasteBin](https://pastebin.com/) and then paste the link here: LINK

2. **OpenPose version**: Latest GitHub code? Or specific commit (e.g., d52878f)? Or specific version from the `Release` section (e.g., 1.2.0)?

3. **General configuration**:
    - **Installation mode**: CMake, sh script, manual Makefile installation, ... (Ubuntu); CMake, ... (Windows); ...?
    - **Operating system** (`lsb_release -a` in Ubuntu):
    - **Operating system version** (e.g., Ubuntu 16, Windows 10, ...):
    - **Release or Debug mode**? (by default: Release):
    - Compiler (`gcc --version` in Ubuntu or VS version in Windows): 5.4.0, ... (Ubuntu); VS2015 Enterprise Update 3, VS2017 Community, ... (Windows); ...?

4. **Non-default settings**:
    - **3-D Reconstruction module added**? (by default: no):
    - Any other custom CMake configuration with respect to the default version? (by default: no):

5. **3rd-party software**:
    - **Caffe version**: default from OpenPose, custom version, ...?
    - **CMake version** (`cmake --version` in Ubuntu):
    - **OpenCV version**: pre-compiled `apt-get install libopencv-dev` (only Ubuntu); OpenPose default (only Windows); compiled from source? If so, 2.4.9, 2.4.12, 3.1, 3.2?; ...?

6. If **GPU mode** issue:
    - **CUDA version** (`cat /usr/local/cuda/version.txt` in most cases):
    - **cuDNN version**:
    - **GPU model** (`nvidia-smi` in Ubuntu):

7. If **CPU-only mode** issue:
    - **CPU brand & model**:
    - Total **RAM memory** available:

8. If **Python** API:
    - **Python version**: 2.7, 3.7, ...?
    - **Numpy version** (`python -c "import numpy; print(numpy.version.version)"` in Ubuntu):

9. If **Windows** system:
    - Portable demo or compiled library?

10. If **speed performance** issue:
    - Report OpenPose timing speed based on the [profiling documentation](https://github.com/CMU-Perceptual-Computing-Lab/openpose/blob/master/doc/06_maximizing_openpose_speed.md#profiling-speed).
@@ -1,17 +0,0 @@
# Number of days of inactivity before an issue becomes stale
daysUntilStale: 60
# Number of days of inactivity before a stale issue is closed
daysUntilClose: 7
# Issues with these labels will never be considered stale
exemptLabels:
  - bug/typo
  - enhancement
# Label to use when marking an issue as stale
staleLabel: stale/old
# Comment to post when marking an issue as stale. Set to `false` to disable
markComment: >
  This issue has been automatically marked as stale because it has not had
  recent activity. It will be closed if no further activity occurs. Thank you
  for your contributions.
# Comment to post when closing a stale issue. Set to `false` to disable
closeComment: false
@@ -1,292 +0,0 @@
name: CI

on: [push, pull_request]

env:
  GH_REPO_NAME: ${{ github.event.repository.name }}
  DOXYFILE: ${{ github.workspace }}/.doc_autogeneration.doxygen
  GH_REPO_REF: ${{ github.repositoryUrl }}

jobs:
  build:
    name: ${{ matrix.env.NAME }}
    runs-on: ${{ matrix.os }}
    env: ${{ matrix.env }}
    strategy:
      fail-fast: false
      matrix:
        include:
          # Ubuntu
          # Ubuntu 20.04 - Default - CMake - CUDA
          - os: ubuntu-20.04
            os_name: linux
            env:
              NAME: U20-default-cmake-cuda
              WITH_CUDNN: false
              CI_OS_NAME: linux
          # Ubuntu 18.04 - Default - CMake - CUDA
          - os: ubuntu-18.04
            os_name: linux
            env:
              NAME: U18-default-cmake-cuda
              WITH_CUDNN: false
              CI_OS_NAME: linux
          # Ubuntu 20.04 - Python - CMake - CUDA
          - os: ubuntu-20.04
            os_name: linux
            env:
              NAME: U20-python-cmake-cuda
              WITH_CUDNN: false
              WITH_PYTHON: true
              CI_OS_NAME: linux
              CI_BUILD_NUMBER: ${{ github.run_number }}
              CI_COMMIT: ${{ github.sha }}
              PYTHON3_VERSION: python3.8
            DOCS: true
          # Ubuntu 18.04 - Python - CMake - CUDA
          - os: ubuntu-18.04
            os_name: linux
            env:
              NAME: U18-python-cmake-cuda
              WITH_CUDNN: false
              WITH_PYTHON: true
              CI_OS_NAME: linux
          # Ubuntu 20.04 - Python - CMake - CPU
          - os: ubuntu-20.04
            os_name: linux
            env:
              NAME: U20-python-cmake-cpu
              WITH_PYTHON: true
              WITH_CUDA: false
              WITH_CUDNN: false
              CI_OS_NAME: linux
              PYTHON3_VERSION: python3.8
          # Ubuntu 18.04 - Python - CMake - CPU
          - os: ubuntu-18.04
            os_name: linux
            env:
              NAME: U18-python-cmake-cpu
              WITH_PYTHON: true
              WITH_CUDA: false
              WITH_CUDNN: false
              CI_OS_NAME: linux
          # Ubuntu 16.04 - Python - CMake - CPU
          - os: ubuntu-16.04
            os_name: linux
            env:
              NAME: U16-python-cmake-cpu
              WITH_PYTHON: true
              WITH_CUDA: false
              WITH_CUDNN: false
              CI_OS_NAME: linux
          # TO-DO (not passing)
          # # Ubuntu 20.04 - Python - CMake - OpenCL
          # - os: ubuntu-20.04
          #   os_name: linux
          #   env:
          #     NAME: U20-python-cmake-opencl
          #     WITH_PYTHON: true
          #     WITH_CUDA: false
          #     WITH_CUDNN: false
          #     WITH_OPEN_CL: true
          #     CI_OS_NAME: linux
          #     PYTHON3_VERSION: python3.8
          # Ubuntu 18.04 - Python - CMake - OpenCL
          - os: ubuntu-18.04
            os_name: linux
            env:
              NAME: U18-python-cmake-opencl
              WITH_PYTHON: true
              WITH_CUDA: false
              WITH_CUDNN: false
              WITH_OPEN_CL: true
              CI_OS_NAME: linux
          # Ubuntu 16.04 - Python - CMake - OpenCL
          - os: ubuntu-16.04
            os_name: linux
            env:
              NAME: U16-python-cmake-opencl
              WITH_PYTHON: true
              WITH_CUDA: false
              WITH_CUDNN: false
              WITH_OPEN_CL: true
              CI_OS_NAME: linux
          # Ubuntu 20.04 - Python - CMake - CPU - Debug
          - os: ubuntu-20.04
            os_name: linux
            env:
              NAME: U20-python-cmake-cpu-debug
              WITH_PYTHON: true
              WITH_CUDA: false
              WITH_CUDNN: false
              WITH_DEBUG: true
              CI_OS_NAME: linux
              PYTHON3_VERSION: python3.8
          # Ubuntu 18.04 - Python - CMake - CPU - Debug
          - os: ubuntu-18.04
            os_name: linux
            env:
              NAME: U18-python-cmake-cpu-debug
              WITH_PYTHON: true
              WITH_CUDA: false
              WITH_CUDNN: false
              WITH_DEBUG: true
              CI_OS_NAME: linux
          # Ubuntu 16.04 - Python - CMake - CPU - Debug
          - os: ubuntu-16.04
            os_name: linux
            env:
              NAME: U16-python-cmake-cpu-debug
              WITH_PYTHON: true
              WITH_CUDA: false
              WITH_CUDNN: false
              WITH_DEBUG: true
              CI_OS_NAME: linux
          # Ubuntu 20.04 - Python - CMake - CPU - Unity
          - os: ubuntu-20.04
            os_name: linux
            env:
              NAME: U20-python-cmake-cpu-unity
              WITH_PYTHON: true
              WITH_UNITY: true
              WITH_CUDA: false
              WITH_CUDNN: false
              CI_OS_NAME: linux
              PYTHON3_VERSION: python3.8
          # Ubuntu 20.04 - Default - CMake - CPU
          - os: ubuntu-20.04
            os_name: linux
            env:
              NAME: U20-default-cmake-cpu
              WITH_CUDA: false
              WITH_CUDNN: false
              CI_OS_NAME: linux

          # Mac OSX
          # Mac OSX - Python - CMake - CPU
          - os: macos-10.15
            os_name: osx
            env:
              NAME: OSX-python-cmake-cpu
              WITH_CUDA: false
              WITH_CUDNN: false
              WITH_PYTHON: true
              CI_OS_NAME: osx
          # Mac OSX - Python - CMake - OpenCL
          - os: macos-10.15
            os_name: osx
            env:
              NAME: OSX-default-cmake-opencl
              WITH_CUDA: false
              WITH_CUDNN: false
              WITH_OPEN_CL: true
              CI_OS_NAME: osx
          # Mac OSX - Python - CMake - CPU - Debug
          - os: macos-10.15
            os_name: osx
            env:
              NAME: OSX-python-cmake-cpu-debug
              WITH_CUDA: false
              WITH_CUDNN: false
              WITH_PYTHON: true
              WITH_DEBUG: true
              CI_OS_NAME: osx
          # Mac OSX - Python - CMake - CPU - Unity
          - os: macos-10.15
            os_name: osx
            env:
              NAME: OSX-python-cmake-cpu-unity
              WITH_CUDA: false
              WITH_CUDNN: false
              WITH_PYTHON: true
              WITH_UNITY: true
              CI_OS_NAME: osx
          # Mac OSX - Default - CMake - CPU
          - os: macos-10.15
            os_name: osx
            env:
              NAME: OSX-default-cmake-cpu
              WITH_CUDA: false
              WITH_CUDNN: false
              CI_OS_NAME: osx

          # TO-DO (not passing)
          # Note: CUDA jobs fail in U16 because of an issue with GCC 5.5 (https://github.com/NVIDIA/apex/issues/529). GH Actions doesn't support
          # GCC 5.4 and CUDA does not support newer than GCC 5.X. Thus, cuDNN (sh file only for Ubuntu 16) is no longer tested.
          # # Ubuntu 16.04
          # # Ubuntu 16.04 - Default - CMake - CUDA
          # - os: ubuntu-16.04
          #   os_name: linux
          #   env:
          #     NAME: U16-default-cmake-cuda8
          #     CI_OS_NAME: linux

          # # Deprecated (not working for the above issue between CUDA and the Ubuntu16 from GitHub Actions)
          # # Ubuntu 16.04 - Default - Make - CUDA
          # - os: ubuntu-16.04
          #   os_name: linux
          #   env:
          #     NAME: U16-default-make-cuda
          #     WITH_CUDNN: false
          #     WITH_CMAKE: false
          #     CI_OS_NAME: linux

    steps:
      - uses: actions/checkout@v2
        with:
          fetch-depth: 0
          submodules: recursive
      - uses: actions/setup-python@v2
        with:
          python-version: 2.x
        if: ${{ matrix.env.WITH_PYTHON }}
      - name: Install (Linux)
        run: scripts/CI/install_deps_ubuntu.sh
        if: ${{ matrix.os_name == 'linux' }}
      - name: Install (Mac OS)
        run: scripts/CI/install_deps_osx.sh
        if: ${{ matrix.os_name == 'osx' }}

      - name: Configure
        run: scripts/CI/configure.sh
      - name: Make
        run: scripts/CI/run_make.sh
      - name: Tests
        run: scripts/CI/run_tests.sh

      - name: Docs APT packages
        run: |
          # The Doxygen apt-get version for Ubuntu 20 is 1.8.17, which has some bugs fixed in 1.9.1
          # run: sudo apt-get -yq install doxygen doxygen-doc doxygen-latex doxygen-gui graphviz
          git clone https://github.com/doxygen/doxygen.git && cd doxygen && git checkout Release_1_9_1
          mkdir build && cd build
          cmake -G "Unix Makefiles" ..
          make -j`nproc`
          sudo make install
        if: ${{ matrix.DOCS }}
      - name: Generate docs
        run: |
          cd ${{ github.workspace }}
          echo 'Generating Doxygen code documentation...'
          doxygen $DOXYFILE 2>&1 | tee doxygen.log
          echo 'Creating .nojekyll and copying log...'
          echo "" > doxygen/html/.nojekyll
          cp doxygen.log doxygen/html/doxygen.log
          # Required so Doxygen links/finds the license file
          cp LICENSE doxygen/html/LICENSE
          # Required in order to link .github/media/ images with the doc without modifying doc links. Remove if using `publish_dir: doxygen/html/`
          mkdir -p doxygen_final/web/html/ && mv doxygen/html doxygen_final/web/html/doc
          mkdir -p doxygen_final/web/.github/media/ && cp -rf .github/media/ doxygen_final/web/.github/
          mkdir -p doxygen_final/web/html/.github/media/ && cp -rf .github/media/ doxygen_final/web/html/.github/
          mkdir -p doxygen_final/web/html/doc/.github/media/ && cp -rf .github/media/ doxygen_final/web/html/doc/.github/
        if: ${{ matrix.DOCS }}
      - name: Deploy Docs
        uses: peaceiris/actions-gh-pages@v3
        with:
          github_token: ${{ secrets.GITHUB_TOKEN }}
          publish_dir: doxygen_final/
          # publish_dir: doxygen/html/ # Original one, it would turn [...].github.io/openpose/web/html/doc/ into [...].github.io/openpose/, but images would not work
          destination_dir: .
          enable_jekyll: false
          force_orphan: true
        if: ${{ matrix.DOCS && github.event_name == 'push' && github.ref == 'refs/heads/master' }}
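The directory shuffle in the workflow's "Generate docs" step is easy to misread. The sketch below replays its core moves with throwaway stand-in files (no Doxygen run; all file contents are hypothetical) so the resulting layout can be inspected locally:

```shell
# Work in a scratch directory so nothing in the repo is touched.
cd "$(mktemp -d)"
# Stand-ins for Doxygen output and the repo's media folder.
mkdir -p doxygen/html .github/media
echo '<html></html>' > doxygen/html/index.html
echo "" > doxygen/html/.nojekyll
# Same moves as the workflow: generated HTML ends up under web/html/doc,
# and .github/media is mirrored next to it so image links keep resolving.
mkdir -p doxygen_final/web/html/ && mv doxygen/html doxygen_final/web/html/doc
mkdir -p doxygen_final/web/.github/media/ && cp -rf .github/media/ doxygen_final/web/.github/
ls doxygen_final/web/html/doc
# prints: index.html
```

With `publish_dir: doxygen_final/`, the published site serves the docs from `[...].github.io/openpose/web/html/doc/`, which is the trade-off the commented-out `publish_dir: doxygen/html/` line mentions.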
@ -1,139 +0,0 @@
######################### Binary Files #########################
# Compiled Object files
*.slo
*.lo
*.o
*.cuo

# Compiled Dynamic libraries
*.so*
*.dylib

# Compiled Static libraries
*.lai
*.la
*.a

# Compiled protocol buffers
*.pb.h
*.pb.cc
*_pb2.py

# Compiled python
*.pyc

# Compiled MATLAB
*.mex*

# IPython notebook checkpoints
.ipynb_checkpoints

# LevelDB files
*.sst
*.ldb
LOCK
CURRENT
MANIFEST-*

######################### IDEs Files #########################
# Editor temporaries
*.swp
*~

# Sublime Text settings
*.sublime-workspace
*.sublime-project

# Eclipse Project settings
*.*project
.settings

# QtCreator files
*.user

# PyCharm files
.idea

# OSX dir files
.DS_Store
._.DS_Store

# Vim files
cscope.*
tags
.clang_complete
.vim

######################### Windows (Visual Studio) Files #########################
*.db
*.deps
*.obj
*.opendb
*.pdb

######################### Windows (Visual Studio) Folders and Compressed files #########################
*.vs/
*x64/
# Allowing this file (removed *.user for Qt)
!*.vcxproj.user

######################### Caffe Files / Folders #########################
# User's build configuration
Makefile
Makefile.config

# Data and models are either
# 1. reference, and not casually committed
# 2. custom, and live on their own unless they're deliberately contributed
distribute/
*.caffemodel
*.caffemodel.h5
*.solverstate
*.solverstate.h5
*.binaryproto
*leveldb
*lmdb

# Camera parameters
*.xml

######################### 3rd-party Folders and Zip Files #########################
3rdparty/caffe/.git
3rdparty/caffe/.github
3rdparty/asio/
3rdparty/eigen/
3rdparty/windows/caffe/
3rdparty/windows/caffe_cpu/
3rdparty/windows/caffe_opencl/
3rdparty/windows/caffe3rdparty/
3rdparty/windows/opencv/
3rdparty/windows/freeglut/
3rdparty/windows/spinnaker/
3rdparty/*zip
3rdparty/windows/*zip

######################### Compilation (build, distribute & bins) Files / Folders #########################
*.bin
*.testbin
build
build*
.build*
*cmake_build
distribute/*
python/caffe/proto/

######################### Generated Documentation #########################
_site
docs/_site
docs/dev
docs/gathered
doxygen
html/

######################### Video Files #########################
# Testing videos
*.mp4
*.mov

######################### Validation Scripts & Testing #########################
output*/
@ -1,6 +0,0 @@
[submodule "3rdparty/caffe"]
	path = 3rdparty/caffe
	url = https://github.com/CMU-Perceptual-Computing-Lab/caffe.git
[submodule "3rdparty/pybind11"]
	path = 3rdparty/pybind11
	url = https://github.com/pybind/pybind11.git
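For reference, entries like the two above are checked out with `git submodule update --init --recursive`. As a minimal illustration of the `.gitmodules` format (the `submodule_paths` helper below is made up for this sketch and is not part of the repo), the file is INI-style and can be parsed with Python's `configparser`:

```python
import configparser

def submodule_paths(gitmodules_text):
    """Map each declared submodule path to its clone URL."""
    parser = configparser.ConfigParser()
    parser.read_string(gitmodules_text)
    return {parser[s]["path"]: parser[s]["url"] for s in parser.sections()}

text = '''[submodule "3rdparty/caffe"]
path = 3rdparty/caffe
url = https://github.com/CMU-Perceptual-Computing-Lab/caffe.git
[submodule "3rdparty/pybind11"]
path = 3rdparty/pybind11
url = https://github.com/pybind/pybind11.git
'''

print(submodule_paths(text))
```

Note the sketch assumes keys are not tab-indented; git itself accepts either form.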
@ -1,38 +0,0 @@
Asio:
- Version 1.12.1.
- Link: https://think-async.com/Asio/Download

Eigen:
- Version 3.3.8.
- Link: http://eigen.tuxfamily.org/index.php

Unix:
- Caffe:
    - Version 1.0.0, taken from the master branch on GitHub.
    - Link: https://github.com/BVLC/caffe
    - GitHub SHA: 864520713a4c5ffae7382ced5d34e4cadc608473

- Spinnaker:
    - Tested on Spinnaker 1.13.0.31 SDK - Linux Ubuntu 16.04 (64-bit) - 05/25/2018 - 41.580MB.
    - Prerequisites: https://www.ptgrey.com/tan/10685

Windows:
- Caffe & Caffe dependencies (caffe3rdparty):
    - Version 1.0.0, taken from the master branch on GitHub.
    - Link: https://github.com/BVLC/caffe/tree/windows

- FreeGLUT (only for the 3-D reconstruction demo):
    - Version 3.0.0-2 (MSVC), extracted from their official website.
    - Link: https://www.transmissionzero.co.uk/software/freeglut-devel/

- OpenCV:
    - Version 4.5.0, extracted from their official website: section `Releases`, subsection `OpenCV - 4.5.0`, `Windows` version.
    - Link: https://opencv.org/releases/

- Unzip:
    - Version 5.52.
    - Link: http://stahlworks.com/dev/?tool=zipunzip

- Wget:
    - Version wget-1.20.3-win64.
    - Link: https://eternallybored.org/misc/wget/
@ -1,99 +0,0 @@
## General

# Compiled Object files
*.slo
*.lo
*.o
*.cuo

# Compiled Dynamic libraries
*.so
*.dylib

# Compiled Static libraries
*.lai
*.la
*.a

# Compiled protocol buffers
*.pb.h
*.pb.cc
*_pb2.py

# Compiled python
*.pyc

# Compiled MATLAB
*.mex*

# IPython notebook checkpoints
.ipynb_checkpoints

# Editor temporaries
*.swp
*~

# Sublime Text settings
*.sublime-workspace
*.sublime-project

# Eclipse Project settings
*.*project
.settings

# QtCreator files
*.user

# PyCharm files
.idea

# Visual Studio Code files
.vscode

# OSX dir files
.DS_Store

## Caffe

# User's build configuration
Makefile.config

# Data and models are either
# 1. reference, and not casually committed
# 2. custom, and live on their own unless they're deliberately contributed
data/*
models/*
*.caffemodel
*.caffemodel.h5
*.solverstate
*.solverstate.h5
*.binaryproto
*leveldb
*lmdb

# build, distribute, and bins (+ python proto bindings)
build
.build_debug/*
.build_release/*
distribute/*
*.testbin
*.bin
python/caffe/proto/
cmake_build
.cmake_build

# Generated documentation
docs/_site
docs/_includes
docs/gathered
_site
doxygen
docs/dev

# LevelDB files
*.sst
*.ldb
LOCK
LOG*
CURRENT
MANIFEST-*
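Entries such as `*.caffemodel` above are glob patterns. A rough Python sketch of how such patterns match file names, using `fnmatch` (an approximation only: git's ignore rules treat `/`, `!`, and directory patterns differently, e.g. git's `*` does not cross `/`):

```python
from fnmatch import fnmatch

# A few patterns taken from the ignore list above.
ignore_patterns = ["*.caffemodel", "*.pb.h", ".build_release/*"]
candidates = ["bvlc_alexnet.caffemodel", "src/caffe/proto/caffe.pb.h",
              ".build_release/lib/libcaffe.so", "src/caffe/net.cpp"]

# A file is "ignored" here if any pattern matches it.
ignored = [f for f in candidates
           if any(fnmatch(f, p) for p in ignore_patterns)]
print(ignored)
```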
@ -1,67 +0,0 @@
dist: trusty
sudo: required

language: cpp
compiler: gcc

env:
  global:
    - NUM_THREADS=4
  matrix:
    # Use a build matrix to test many builds in parallel
    # envvar defaults:
    #   WITH_CMAKE: false
    #   WITH_PYTHON3: false
    #   WITH_IO: true
    #   WITH_CUDA: false
    #   WITH_CUDNN: false
    - BUILD_NAME="default-make"
    # - BUILD_NAME="python3-make" WITH_PYTHON3=true
    - BUILD_NAME="no-io-make" WITH_IO=false
    - BUILD_NAME="cuda-make" WITH_CUDA=true
    - BUILD_NAME="cudnn-make" WITH_CUDA=true WITH_CUDNN=true

    - BUILD_NAME="default-cmake" WITH_CMAKE=true
    - BUILD_NAME="python3-cmake" WITH_CMAKE=true WITH_PYTHON3=true
    - BUILD_NAME="no-io-cmake" WITH_CMAKE=true WITH_IO=false
    - BUILD_NAME="cuda-cmake" WITH_CMAKE=true WITH_CUDA=true
    - BUILD_NAME="cudnn-cmake" WITH_CMAKE=true WITH_CUDA=true WITH_CUDNN=true

cache:
  apt: true
  directories:
    - ~/protobuf3

before_install:
  - source ./scripts/travis/defaults.sh

install:
  - sudo -E ./scripts/travis/install-deps.sh
  - ./scripts/travis/setup-venv.sh ~/venv
  - source ~/venv/bin/activate
  - ./scripts/travis/install-python-deps.sh

before_script:
  - ./scripts/travis/configure.sh

script:
  - ./scripts/travis/build.sh
  - ./scripts/travis/test.sh

notifications:
  # Emails are sent to the committer's git-configured email address by default,
  # but only if they have access to the repository. To enable Travis on your
  # public fork of Caffe, just go to travis-ci.org and flip the switch on for
  # your Caffe fork. To configure your git email address, use:
  #     git config --global user.email me@example.com
  email:
    on_success: always
    on_failure: always

  # IRC notifications disabled by default.
  # Uncomment next 5 lines to send notifications to chat.freenode.net#caffe
  # irc:
  #   channels:
  #     - "chat.freenode.net#caffe"
  #   template:
  #     - "%{repository}/%{branch} (%{commit} - %{author}): %{message}"
@ -1,127 +0,0 @@
cmake_minimum_required(VERSION 2.8.7)
if(POLICY CMP0046)
  cmake_policy(SET CMP0046 NEW)
endif()
if(POLICY CMP0054)
  cmake_policy(SET CMP0054 NEW)
endif()

# ---[ Caffe project
project(Caffe C CXX)

# ---[ Caffe version
set(CAFFE_TARGET_VERSION "1.0.0" CACHE STRING "Caffe logical version")
set(CAFFE_TARGET_SOVERSION "1.0.0" CACHE STRING "Caffe soname version")
add_definitions(-DCAFFE_VERSION=${CAFFE_TARGET_VERSION})

# ---[ Using cmake scripts and modules
list(APPEND CMAKE_MODULE_PATH ${PROJECT_SOURCE_DIR}/cmake/Modules)

include(ExternalProject)
include(GNUInstallDirs)

include(cmake/Utils.cmake)
include(cmake/Targets.cmake)
include(cmake/Misc.cmake)
include(cmake/Summary.cmake)
include(cmake/ConfigGen.cmake)

# ---[ Options
caffe_option(CPU_ONLY "Build Caffe without CUDA support" OFF) # TODO: rename to USE_CUDA
caffe_option(USE_CUDNN "Build Caffe with cuDNN library support" ON IF NOT CPU_ONLY)
caffe_option(USE_NCCL "Build Caffe with NCCL library support" OFF)
caffe_option(BUILD_SHARED_LIBS "Build shared libraries" ON)
caffe_option(BUILD_python "Build Python wrapper" ON)
set(python_version "2" CACHE STRING "Specify which Python version to use")
caffe_option(BUILD_matlab "Build Matlab wrapper" OFF IF UNIX OR APPLE)
caffe_option(BUILD_docs "Build documentation" ON IF UNIX OR APPLE)
caffe_option(BUILD_python_layer "Build the Caffe Python layer" ON)
caffe_option(USE_OPENCV "Build with OpenCV support" ON)
caffe_option(USE_LEVELDB "Build with levelDB" ON)
caffe_option(USE_LMDB "Build with lmdb" ON)
caffe_option(ALLOW_LMDB_NOLOCK "Allow MDB_NOLOCK when reading LMDB files (only if necessary)" OFF)
caffe_option(USE_OPENMP "Link with OpenMP (when your BLAS wants OpenMP and you get linker errors)" OFF)

# This code is taken from https://github.com/sh1r0/caffe-android-lib
caffe_option(USE_HDF5 "Build with hdf5" ON)

# ---[ Dependencies
include(cmake/Dependencies.cmake)

# ---[ Flags
if(UNIX OR APPLE)
  set(CMAKE_CXX_FLAGS "${CMAKE_CXX_FLAGS} -fPIC -Wall -std=c++11")
endif()

caffe_set_caffe_link()

if(USE_libstdcpp)
  set(CMAKE_CXX_FLAGS "${CMAKE_CXX_FLAGS} -stdlib=libstdc++")
  message("-- Warning: forcing libstdc++ (controlled by USE_libstdcpp option in cmake)")
endif()

# ---[ Warnings
caffe_warnings_disable(CMAKE_CXX_FLAGS -Wno-sign-compare -Wno-uninitialized)

# ---[ Config generation
configure_file(cmake/Templates/caffe_config.h.in "${PROJECT_BINARY_DIR}/caffe_config.h")

# ---[ Includes
set(Caffe_INCLUDE_DIR ${PROJECT_SOURCE_DIR}/include)
set(Caffe_SRC_DIR ${PROJECT_SOURCE_DIR}/src)
include_directories(${PROJECT_BINARY_DIR})

# ---[ Includes & defines for CUDA

# cuda_compile() does not have per-call dependencies or include paths
# (cuda_compile() has per-call flags, but we set them here too for clarity)
#
# list(REMOVE_ITEM ...) invocations remove PRIVATE and PUBLIC keywords from collected definitions and include paths
if(HAVE_CUDA)
  # pass include paths to cuda_include_directories()
  set(Caffe_ALL_INCLUDE_DIRS ${Caffe_INCLUDE_DIRS})
  list(REMOVE_ITEM Caffe_ALL_INCLUDE_DIRS PRIVATE PUBLIC)
  cuda_include_directories(${Caffe_INCLUDE_DIR} ${Caffe_SRC_DIR} ${Caffe_ALL_INCLUDE_DIRS})

  # add definitions to nvcc flags directly
  if(Caffe_DEFINITIONS)
    set(Caffe_ALL_DEFINITIONS ${Caffe_DEFINITIONS})
    list(REMOVE_ITEM Caffe_ALL_DEFINITIONS PRIVATE PUBLIC)
    list(APPEND CUDA_NVCC_FLAGS ${Caffe_ALL_DEFINITIONS})
  endif()
endif()

# ---[ Subdirectories
add_subdirectory(src/gtest)
add_subdirectory(src/caffe)
add_subdirectory(tools)
add_subdirectory(examples)
add_subdirectory(python)
add_subdirectory(matlab)
add_subdirectory(docs)

# ---[ Linter target
add_custom_target(lint COMMAND ${CMAKE_COMMAND} -P ${PROJECT_SOURCE_DIR}/cmake/lint.cmake)

# ---[ pytest target
if(BUILD_python)
  add_custom_target(pytest COMMAND python${python_version} -m unittest discover -s caffe/test WORKING_DIRECTORY ${PROJECT_SOURCE_DIR}/python)
  add_dependencies(pytest pycaffe)
endif()

# ---[ uninstall target
configure_file(
    ${CMAKE_CURRENT_SOURCE_DIR}/cmake/Uninstall.cmake.in
    ${CMAKE_CURRENT_BINARY_DIR}/cmake/Uninstall.cmake
    IMMEDIATE @ONLY)

add_custom_target(uninstall
    COMMAND ${CMAKE_COMMAND} -P
    ${CMAKE_CURRENT_BINARY_DIR}/cmake/Uninstall.cmake)

# ---[ Configuration summary
caffe_print_configuration_summary()

# ---[ Export configs generation
caffe_generate_export_configs()
@ -1,72 +0,0 @@
# Contributing

Below you will find a collection of guidelines for submitting issues as well as contributing code to the Caffe repository.
Please read them before opening an issue or a pull request.

## Issues

Specific Caffe design and development issues, bugs, and feature requests are tracked in GitHub Issues.

*Please do not post installation, build, usage, or modeling questions, or other requests for help to Issues.*
Use the [caffe-users list](https://groups.google.com/forum/#!forum/caffe-users) instead.
This helps developers maintain a clear, uncluttered, and efficient view of the state of Caffe.
See the [caffe-users](#caffe-users) section below for guidance on posting to the users list.

When reporting an issue, it's most helpful to provide the following information, where applicable:
* What does the problem look like, and what steps reproduce it?
* Can you reproduce it using the latest [master](https://github.com/BVLC/caffe/tree/master), compiled with the `DEBUG` make option?
* What hardware and software are you running? In particular:
    * GPU make and model, if relevant,
    * operating system/distribution,
    * compiler; please also post which version (for example, with GCC run `gcc --version` to check),
    * CUDA version, if applicable (run `nvcc --version` to check),
    * cuDNN version, if applicable (version number is stored in `cudnn.h`, look for lines containing `CUDNN_MAJOR`, `CUDNN_MINOR` and `CUDNN_PATCHLEVEL`),
    * BLAS library,
    * Python version, if relevant,
    * MATLAB version, if relevant.
* **What have you already tried** to solve the problem? How did it fail? Are there any other issues related to yours?
* If this is not a build-related issue, does your installation pass `make runtest`?
* If the bug is a crash, provide the backtrace (usually printed by Caffe; always obtainable with `gdb`).
* If you are reporting a build error that seems to be due to a bug in Caffe, please attach your build configuration (either Makefile.config or CMakeCache.txt) and the output of the make (or cmake) command.
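The version checks above can be collected in one pass. A minimal sketch (hypothetical, not part of Caffe's tooling; the `tool_versions` helper is made up) that prints the version line of each relevant tool that is installed:

```python
import shutil
import subprocess

def tool_versions(tools):
    """Return {tool: first line of `tool --version`, or 'not found'}."""
    report = {}
    for tool in tools:
        if shutil.which(tool) is None:
            report[tool] = "not found"
        else:
            out = subprocess.run([tool, "--version"],
                                 capture_output=True, text=True)
            # Some tools (e.g. Python 2) print the version to stderr.
            lines = (out.stdout or out.stderr).splitlines()
            report[tool] = lines[0] if lines else "unknown"
    return report

for tool, version in tool_versions(["gcc", "nvcc", "python"]).items():
    print(f"{tool}: {version}")
```

The cuDNN version is not a command-line tool; read it from the header instead, e.g. `grep -A 2 CUDNN_MAJOR /usr/include/cudnn.h` when the header is installed there.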

If only a small portion of the code/log is relevant to your issue, you may paste it directly into the post, preferably using Markdown syntax for code blocks: triple backticks ( \`\`\` ) to open/close a block.
In other cases (multiple files, or long files), please **attach** them to the post - this greatly improves readability.

If the problem arises during a complex operation (e.g. a large script using pycaffe, or a long network prototxt), please reduce the example to the minimal size that still causes the error.
Also, minimize the influence of external modules, data, etc. - this way it will be easier for others to understand and reproduce your issue, and eventually help you.
Sometimes you will find the root cause yourself in the process.

Try to give your issue a title that is succinct and specific. The devs will rename issues as needed to keep track of them.

## Caffe-users

Before you post to the [caffe-users list](https://groups.google.com/forum/#!forum/caffe-users), make sure you look for existing solutions.
The Caffe community has encountered and found solutions to countless problems - benefit from the collective experience.
Recommended places to look:
* the [users list](https://groups.google.com/forum/#!forum/caffe-users) itself,
* the [`caffe`](https://stackoverflow.com/questions/tagged/caffe) tag on StackOverflow,
* the [GitHub issues](https://github.com/BVLC/caffe/issues) tracker (some problems have been answered there),
* the public [wiki](https://github.com/BVLC/caffe/wiki),
* the official [documentation](http://caffe.berkeleyvision.org/).

Found a post/issue with your exact problem, but with no answer?
Don't just leave a "me too" message - provide the details of your case.
Problems with more available information are easier to solve and attract good attention.

When posting to the list, make sure you provide as much relevant information as possible - recommendations for an issue report (see above) are a good starting point.
*Please make it very clear which version of Caffe you are using, especially if it is a fork not maintained by BVLC.*

Formatting recommendations hold: paste short logs/code fragments into the post (use fixed-width text for them), and **attach** long logs or multiple files.

## Pull Requests

Caffe welcomes all contributions.

See the [contributing guide](http://caffe.berkeleyvision.org/development.html) for details.

Briefly: read commit by commit, a PR should tell a clean, compelling story of _one_ improvement to Caffe. In particular:

* A PR should do one clear thing that obviously improves Caffe, and nothing more. Making many smaller PRs is better than making one large PR; review effort is superlinear in the amount of code involved.
* Similarly, each commit should be a small, atomic change representing one step in development. PRs should be made of many commits where appropriate.
* Please do rewrite PR history to be clean rather than chronological. Within-PR bugfixes, style cleanups, reversions, etc. should be squashed and should not appear in merged PR history.
* Anything nonobvious from the code should be explained in comments, commit messages, or the PR description, as appropriate.
@ -1,19 +0,0 @@
# Contributors

Caffe is developed by a core set of BAIR members and the open-source community.

We thank all of our [contributors](https://github.com/BVLC/caffe/graphs/contributors)!

**For the detailed history of contributions** of a given file, try

    git blame file

to see line-by-line credits and

    git log --follow file

to see the change log even across renames and rewrites.

Please refer to the [acknowledgements](http://caffe.berkeleyvision.org/#acknowledgements) on the Caffe site for further details.

**Copyright** is held by the original contributor according to the versioning history; see LICENSE.
@ -1,7 +0,0 @@
# Installation

See http://caffe.berkeleyvision.org/installation.html for the latest
installation instructions.

Check the users group in case you need help:
https://groups.google.com/forum/#!forum/caffe-users
@ -1,44 +0,0 @@
COPYRIGHT

All contributions by the University of California:
Copyright (c) 2014-2017 The Regents of the University of California (Regents)
All rights reserved.

All other contributions:
Copyright (c) 2014-2017, the respective contributors
All rights reserved.

Caffe uses a shared copyright model: each contributor holds copyright over
their contributions to Caffe. The project versioning records all such
contribution and copyright details. If a contributor wants to further mark
their specific copyright on a particular contribution, they should indicate
their copyright solely in the commit message of the change when it is
committed.

LICENSE

Redistribution and use in source and binary forms, with or without
modification, are permitted provided that the following conditions are met:

1. Redistributions of source code must retain the above copyright notice, this
   list of conditions and the following disclaimer.
2. Redistributions in binary form must reproduce the above copyright notice,
   this list of conditions and the following disclaimer in the documentation
   and/or other materials provided with the distribution.

THIS SOFTWARE IS PROVIDED BY THE COPYRIGHT HOLDERS AND CONTRIBUTORS "AS IS" AND
ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT LIMITED TO, THE IMPLIED
WARRANTIES OF MERCHANTABILITY AND FITNESS FOR A PARTICULAR PURPOSE ARE
DISCLAIMED. IN NO EVENT SHALL THE COPYRIGHT OWNER OR CONTRIBUTORS BE LIABLE FOR
ANY DIRECT, INDIRECT, INCIDENTAL, SPECIAL, EXEMPLARY, OR CONSEQUENTIAL DAMAGES
(INCLUDING, BUT NOT LIMITED TO, PROCUREMENT OF SUBSTITUTE GOODS OR SERVICES;
LOSS OF USE, DATA, OR PROFITS; OR BUSINESS INTERRUPTION) HOWEVER CAUSED AND
ON ANY THEORY OF LIABILITY, WHETHER IN CONTRACT, STRICT LIABILITY, OR TORT
(INCLUDING NEGLIGENCE OR OTHERWISE) ARISING IN ANY WAY OUT OF THE USE OF THIS
SOFTWARE, EVEN IF ADVISED OF THE POSSIBILITY OF SUCH DAMAGE.

CONTRIBUTION AGREEMENT

By contributing to the BVLC/caffe repository through pull-request, comment,
or otherwise, the contributor releases their content to the
license and copyright terms herein.
@ -1,125 +0,0 @@
## Refer to http://caffe.berkeleyvision.org/installation.html
# Contributions simplifying and improving our build system are welcome!

# cuDNN acceleration switch (uncomment to build with cuDNN).
USE_CUDNN := 1

# CPU-only switch (uncomment to build without GPU support).
# CPU_ONLY := 1

# uncomment to disable IO dependencies and corresponding data layers
# USE_OPENCV := 0
# USE_LEVELDB := 0
# USE_LMDB := 0

# uncomment to allow MDB_NOLOCK when reading LMDB files (only if necessary)
# You should not set this flag if you will be reading LMDBs with any
# possibility of simultaneous read and write
# ALLOW_LMDB_NOLOCK := 1

# Uncomment if you're using OpenCV 3
# OPENCV_VERSION := 3

# To customize your choice of compiler, uncomment and set the following.
# N.B. the default for Linux is g++ and the default for OSX is clang++
# CUSTOM_CXX := g++

# CUDA directory contains bin/ and lib/ directories that we need.
CUDA_DIR := /usr/local/cuda
# On Ubuntu 14.04, if cuda tools are installed via
# "sudo apt-get install nvidia-cuda-toolkit" then use this instead:
# CUDA_DIR := /usr

# CUDA architecture setting: going with all of them.
# For CUDA < 6.0, comment the *_50 through *_61 lines for compatibility.
# For CUDA < 8.0, comment the *_60 and *_61 lines for compatibility.
CUDA_ARCH := -gencode arch=compute_30,code=sm_30 \
             -gencode arch=compute_35,code=sm_35 \
             -gencode arch=compute_50,code=sm_50 \
             -gencode arch=compute_52,code=sm_52
# Deprecated
# CUDA_ARCH := -gencode arch=compute_20,code=sm_20 \
#              -gencode arch=compute_20,code=sm_21 \
#              -gencode arch=compute_30,code=sm_30 \
#              -gencode arch=compute_35,code=sm_35 \
#              -gencode arch=compute_50,code=sm_50 \
#              -gencode arch=compute_52,code=sm_52 \
#              -gencode arch=compute_60,code=sm_60 \
#              -gencode arch=compute_61,code=sm_61 \
#              -gencode arch=compute_61,code=compute_61

# BLAS choice:
# atlas for ATLAS (default)
# mkl for MKL
# open for OpenBLAS
BLAS := atlas
# Custom (MKL/ATLAS/OpenBLAS) include and lib directories.
# Leave commented to accept the defaults for your choice of BLAS
# (which should work)!
# BLAS_INCLUDE := /path/to/your/blas
# BLAS_LIB := /path/to/your/blas

# Homebrew puts openblas in a directory that is not on the standard search path
# BLAS_INCLUDE := $(shell brew --prefix openblas)/include
# BLAS_LIB := $(shell brew --prefix openblas)/lib

# This is required only if you will compile the matlab interface.
# MATLAB directory should contain the mex binary in /bin.
# MATLAB_DIR := /usr/local
# MATLAB_DIR := /Applications/MATLAB_R2012b.app

# NOTE: this is required only if you will compile the python interface.
# We need to be able to find Python.h and numpy/arrayobject.h.
PYTHON_INCLUDE := /usr/include/python2.7 \
                  /usr/lib/python2.7/dist-packages/numpy/core/include
# Anaconda Python distribution is quite popular. Include path:
# Verify anaconda location, sometimes it's in root.
# ANACONDA_HOME := $(HOME)/anaconda
# PYTHON_INCLUDE := $(ANACONDA_HOME)/include \
#                   $(ANACONDA_HOME)/include/python2.7 \
#                   $(ANACONDA_HOME)/lib/python2.7/site-packages/numpy/core/include

# Uncomment to use Python 3 (default is Python 2)
# PYTHON_LIBRARIES := boost_python3 python3.5m
# PYTHON_INCLUDE := /usr/include/python3.5m \
#                   /usr/lib/python3.5/dist-packages/numpy/core/include

# We need to be able to find libpythonX.X.so or .dylib.
PYTHON_LIB := /usr/lib /usr/local/lib
# PYTHON_LIB := $(ANACONDA_HOME)/lib

# Homebrew installs numpy in a non standard path (keg only)
# PYTHON_INCLUDE += $(dir $(shell python -c 'import numpy.core; print(numpy.core.__file__)'))/include
# PYTHON_LIB += $(shell brew --prefix numpy)/lib

# Uncomment to support layers written in Python (will link against Python libs)
# WITH_PYTHON_LAYER := 1

# Whatever else you find you need goes here.
INCLUDE_DIRS := $(PYTHON_INCLUDE) /usr/local/include
LIBRARY_DIRS := $(PYTHON_LIB) /usr/local/lib /usr/lib

# If Homebrew is installed at a non standard location (for example your home directory) and you use it for general dependencies
# INCLUDE_DIRS += $(shell brew --prefix)/include
# LIBRARY_DIRS += $(shell brew --prefix)/lib

# NCCL acceleration switch (uncomment to build with NCCL)
# https://github.com/NVIDIA/nccl (last tested version: v1.2.3-1+cuda8.0)
# USE_NCCL := 1

# Uncomment to use `pkg-config` to specify OpenCV library paths.
# (Usually not necessary -- OpenCV libraries are normally installed in one of the above $LIBRARY_DIRS.)
# USE_PKG_CONFIG := 1

# N.B. both build and distribute dirs are cleared on `make clean`
BUILD_DIR := build
DISTRIBUTE_DIR := distribute

# Uncomment for debugging. Does not work on OSX due to https://github.com/BVLC/caffe/issues/171
# DEBUG := 1

# The ID of the GPU that 'make runtest' will use to run unit tests.
TEST_GPUID := 0

# enable pretty build (comment to see full commands)
Q ?= @
@ -1,130 +0,0 @@
## Refer to http://caffe.berkeleyvision.org/installation.html
# Contributions simplifying and improving our build system are welcome!

# cuDNN acceleration switch (uncomment to build with cuDNN).
USE_CUDNN := 1

# CPU-only switch (uncomment to build without GPU support).
# CPU_ONLY := 1

# uncomment to disable IO dependencies and corresponding data layers
# USE_OPENCV := 0
# USE_LEVELDB := 0
# USE_LMDB := 0
# This code is taken from https://github.com/sh1r0/caffe-android-lib
# USE_HDF5 := 0

# uncomment to allow MDB_NOLOCK when reading LMDB files (only if necessary)
# You should not set this flag if you will be reading LMDBs with any
# possibility of simultaneous read and write
# ALLOW_LMDB_NOLOCK := 1

# Uncomment if you're using OpenCV 3
# OPENCV_VERSION := 3

# To customize your choice of compiler, uncomment and set the following.
# N.B. the default for Linux is g++ and the default for OSX is clang++
# CUSTOM_CXX := g++

# CUDA directory contains bin/ and lib/ directories that we need.
CUDA_DIR := /usr/local/cuda
# On Ubuntu 14.04, if cuda tools are installed via
# "sudo apt-get install nvidia-cuda-toolkit" then use this instead:
# CUDA_DIR := /usr

# CUDA architecture setting: going with all of them.
# For CUDA < 6.0, comment the *_50 through *_61 lines for compatibility.
# For CUDA < 8.0, comment the *_60 and *_61 lines for compatibility.
CUDA_ARCH := -gencode arch=compute_30,code=sm_30 \
             -gencode arch=compute_35,code=sm_35 \
             -gencode arch=compute_50,code=sm_50 \
             -gencode arch=compute_52,code=sm_52 \
             -gencode arch=compute_60,code=sm_60 \
             -gencode arch=compute_61,code=sm_61 \
             -gencode arch=compute_61,code=compute_61
# Deprecated
# CUDA_ARCH := -gencode arch=compute_20,code=sm_20 \
#              -gencode arch=compute_20,code=sm_21 \
#              -gencode arch=compute_30,code=sm_30 \
#              -gencode arch=compute_35,code=sm_35 \
#              -gencode arch=compute_50,code=sm_50 \
#              -gencode arch=compute_52,code=sm_52 \
#              -gencode arch=compute_60,code=sm_60 \
#              -gencode arch=compute_61,code=sm_61 \
#              -gencode arch=compute_61,code=compute_61

# BLAS choice:
# atlas for ATLAS (default)
# mkl for MKL
# open for OpenBLAS
BLAS := atlas
# Custom (MKL/ATLAS/OpenBLAS) include and lib directories.
# Leave commented to accept the defaults for your choice of BLAS
# (which should work)!
# BLAS_INCLUDE := /path/to/your/blas
# BLAS_LIB := /path/to/your/blas

# Homebrew puts openblas in a directory that is not on the standard search path
# BLAS_INCLUDE := $(shell brew --prefix openblas)/include
# BLAS_LIB := $(shell brew --prefix openblas)/lib

# This is required only if you will compile the matlab interface.
# MATLAB directory should contain the mex binary in /bin.
# MATLAB_DIR := /usr/local
# MATLAB_DIR := /Applications/MATLAB_R2012b.app

# NOTE: this is required only if you will compile the python interface.
# We need to be able to find Python.h and numpy/arrayobject.h.
PYTHON_INCLUDE := /usr/include/python2.7 \
                  /usr/lib/python2.7/dist-packages/numpy/core/include
# Anaconda Python distribution is quite popular. Include path:
# Verify anaconda location, sometimes it's in root.
# ANACONDA_HOME := $(HOME)/anaconda
# PYTHON_INCLUDE := $(ANACONDA_HOME)/include \
#                   $(ANACONDA_HOME)/include/python2.7 \
#                   $(ANACONDA_HOME)/lib/python2.7/site-packages/numpy/core/include

# Uncomment to use Python 3 (default is Python 2)
# PYTHON_LIBRARIES := boost_python3 python3.5m
# PYTHON_INCLUDE := /usr/include/python3.5m \
#                   /usr/lib/python3.5/dist-packages/numpy/core/include

# We need to be able to find libpythonX.X.so or .dylib.
PYTHON_LIB := /usr/lib /usr/local/lib
# PYTHON_LIB := $(ANACONDA_HOME)/lib

# Homebrew installs numpy in a non standard path (keg only)
# PYTHON_INCLUDE += $(dir $(shell python -c 'import numpy.core; print(numpy.core.__file__)'))/include
# PYTHON_LIB += $(shell brew --prefix numpy)/lib

# Uncomment to support layers written in Python (will link against Python libs)
# WITH_PYTHON_LAYER := 1

# Whatever else you find you need goes here.
INCLUDE_DIRS := $(PYTHON_INCLUDE) /usr/local/include
LIBRARY_DIRS := $(PYTHON_LIB) /usr/local/lib /usr/lib

# If Homebrew is installed at a non standard location (for example your home directory) and you use it for general dependencies
# INCLUDE_DIRS += $(shell brew --prefix)/include
# LIBRARY_DIRS += $(shell brew --prefix)/lib

# NCCL acceleration switch (uncomment to build with NCCL)
# https://github.com/NVIDIA/nccl (last tested version: v1.2.3-1+cuda8.0)
# USE_NCCL := 1

# Uncomment to use `pkg-config` to specify OpenCV library paths.
# (Usually not necessary -- OpenCV libraries are normally installed in one of the above $LIBRARY_DIRS.)
# USE_PKG_CONFIG := 1

# N.B. both build and distribute dirs are cleared on `make clean`
BUILD_DIR := build
DISTRIBUTE_DIR := distribute

# Uncomment for debugging. Does not work on OSX due to https://github.com/BVLC/caffe/issues/171
# DEBUG := 1

# The ID of the GPU that 'make runtest' will use to run unit tests.
TEST_GPUID := 0

# enable pretty build (comment to see full commands)
Q ?= @
@ -1,125 +0,0 @@
## Refer to http://caffe.berkeleyvision.org/installation.html
# Contributions simplifying and improving our build system are welcome!

# cuDNN acceleration switch (uncomment to build with cuDNN).
USE_CUDNN := 1

# CPU-only switch (uncomment to build without GPU support).
# CPU_ONLY := 1

# uncomment to disable IO dependencies and corresponding data layers
# USE_OPENCV := 0
# USE_LEVELDB := 0
# USE_LMDB := 0

# uncomment to allow MDB_NOLOCK when reading LMDB files (only if necessary)
# You should not set this flag if you will be reading LMDBs with any
# possibility of simultaneous read and write
# ALLOW_LMDB_NOLOCK := 1

# Uncomment if you're using OpenCV 3
# OPENCV_VERSION := 3

# To customize your choice of compiler, uncomment and set the following.
# N.B. the default for Linux is g++ and the default for OSX is clang++
# CUSTOM_CXX := g++

# CUDA directory contains bin/ and lib/ directories that we need.
CUDA_DIR := /usr/local/cuda
# On Ubuntu 14.04, if CUDA tools are installed via
# "sudo apt-get install nvidia-cuda-toolkit" then use this instead:
# CUDA_DIR := /usr

# CUDA architecture setting: going with all of them.
# For CUDA < 6.0, comment the *_50 through *_61 lines for compatibility.
# For CUDA < 8.0, comment the *_60 and *_61 lines for compatibility.
CUDA_ARCH := -gencode arch=compute_30,code=sm_30 \
        -gencode arch=compute_35,code=sm_35 \
        -gencode arch=compute_50,code=sm_50 \
        -gencode arch=compute_52,code=sm_52
# Deprecated
# CUDA_ARCH := -gencode arch=compute_20,code=sm_20 \
#             -gencode arch=compute_20,code=sm_21 \
#             -gencode arch=compute_30,code=sm_30 \
#             -gencode arch=compute_35,code=sm_35 \
#             -gencode arch=compute_50,code=sm_50 \
#             -gencode arch=compute_52,code=sm_52 \
#             -gencode arch=compute_60,code=sm_60 \
#             -gencode arch=compute_61,code=sm_61 \
#             -gencode arch=compute_61,code=compute_61

# BLAS choice:
# atlas for ATLAS (default)
# mkl for MKL
# open for OpenBLAS
BLAS := atlas
# Custom (MKL/ATLAS/OpenBLAS) include and lib directories.
# Leave commented to accept the defaults for your choice of BLAS
# (which should work)!
# BLAS_INCLUDE := /path/to/your/blas
# BLAS_LIB := /path/to/your/blas

# Homebrew puts openblas in a directory that is not on the standard search path
# BLAS_INCLUDE := $(shell brew --prefix openblas)/include
# BLAS_LIB := $(shell brew --prefix openblas)/lib

# This is required only if you will compile the matlab interface.
# MATLAB directory should contain the mex binary in /bin.
# MATLAB_DIR := /usr/local
# MATLAB_DIR := /Applications/MATLAB_R2012b.app

# NOTE: this is required only if you will compile the python interface.
# We need to be able to find Python.h and numpy/arrayobject.h.
PYTHON_INCLUDE := /usr/include/python2.7 \
        /usr/lib/python2.7/dist-packages/numpy/core/include
# Anaconda Python distribution is quite popular. Include path:
# Verify anaconda location, sometimes it's in root.
# ANACONDA_HOME := $(HOME)/anaconda
# PYTHON_INCLUDE := $(ANACONDA_HOME)/include \
#         $(ANACONDA_HOME)/include/python2.7 \
#         $(ANACONDA_HOME)/lib/python2.7/site-packages/numpy/core/include

# Uncomment to use Python 3 (default is Python 2)
# PYTHON_LIBRARIES := boost_python3 python3.5m
# PYTHON_INCLUDE := /usr/include/python3.5m \
#         /usr/lib/python3.5/dist-packages/numpy/core/include

# We need to be able to find libpythonX.X.so or .dylib.
PYTHON_LIB := /usr/lib /usr/local/lib
# PYTHON_LIB := $(ANACONDA_HOME)/lib

# Homebrew installs numpy in a non-standard path (keg only)
# PYTHON_INCLUDE += $(dir $(shell python -c 'import numpy.core; print(numpy.core.__file__)'))/include
# PYTHON_LIB += $(shell brew --prefix numpy)/lib

# Uncomment to support layers written in Python (will link against Python libs)
# WITH_PYTHON_LAYER := 1

# Whatever else you find you need goes here.
INCLUDE_DIRS := $(PYTHON_INCLUDE) /usr/local/include /usr/include/hdf5/serial
LIBRARY_DIRS := $(PYTHON_LIB) /usr/local/lib /usr/lib /usr/lib/x86_64-linux-gnu /usr/lib/x86_64-linux-gnu/hdf5/serial

# If Homebrew is installed at a non-standard location (for example your home directory) and you use it for general dependencies
# INCLUDE_DIRS += $(shell brew --prefix)/include
# LIBRARY_DIRS += $(shell brew --prefix)/lib

# NCCL acceleration switch (uncomment to build with NCCL)
# https://github.com/NVIDIA/nccl (last tested version: v1.2.3-1+cuda8.0)
# USE_NCCL := 1

# Uncomment to use `pkg-config` to specify OpenCV library paths.
# (Usually not necessary -- OpenCV libraries are normally installed in one of the above $LIBRARY_DIRS.)
# USE_PKG_CONFIG := 1

# N.B. both build and distribute dirs are cleared on `make clean`
BUILD_DIR := build
DISTRIBUTE_DIR := distribute

# Uncomment for debugging. Does not work on OSX due to https://github.com/BVLC/caffe/issues/171
# DEBUG := 1

# The ID of the GPU that 'make runtest' will use to run unit tests.
TEST_GPUID := 0

# enable pretty build (comment to see full commands)
Q ?= @
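The `-gencode` lines in `CUDA_ARCH` above follow a mechanical pattern: one `arch=compute_XY,code=sm_XY` entry (native SASS) per target compute capability, plus a final `code=compute_XY` entry that embeds PTX for the newest architecture so future GPUs can JIT-compile the kernels. A hypothetical Python helper, not part of Caffe, that generates the flags from a capability list:

```python
def gencode_flags(capabilities):
    """Build nvcc -gencode flags from compute capabilities like "6.1".

    Emits one SASS entry per capability, plus a trailing PTX entry for
    the newest capability for forward compatibility.
    """
    archs = [cap.replace(".", "") for cap in capabilities]
    flags = [f"-gencode arch=compute_{a},code=sm_{a}" for a in archs]
    flags.append(f"-gencode arch=compute_{archs[-1]},code=compute_{archs[-1]}")
    # Join with Makefile-style line continuations.
    return " \\\n        ".join(flags)

print(gencode_flags(["3.0", "3.5", "5.0", "5.2", "6.0", "6.1"]))
```

With the CUDA >= 8.0 capability list above, this reproduces the seven flags of the full `CUDA_ARCH` variant.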
@ -1,128 +0,0 @@
## Refer to http://caffe.berkeleyvision.org/installation.html
# Contributions simplifying and improving our build system are welcome!

# cuDNN acceleration switch (uncomment to build with cuDNN).
USE_CUDNN := 1

# CPU-only switch (uncomment to build without GPU support).
# CPU_ONLY := 1

# uncomment to disable IO dependencies and corresponding data layers
# USE_OPENCV := 0
# USE_LEVELDB := 0
# USE_LMDB := 0

# uncomment to allow MDB_NOLOCK when reading LMDB files (only if necessary)
# You should not set this flag if you will be reading LMDBs with any
# possibility of simultaneous read and write
# ALLOW_LMDB_NOLOCK := 1

# Uncomment if you're using OpenCV 3
# OPENCV_VERSION := 3

# To customize your choice of compiler, uncomment and set the following.
# N.B. the default for Linux is g++ and the default for OSX is clang++
# CUSTOM_CXX := g++

# CUDA directory contains bin/ and lib/ directories that we need.
CUDA_DIR := /usr/local/cuda
# On Ubuntu 14.04, if CUDA tools are installed via
# "sudo apt-get install nvidia-cuda-toolkit" then use this instead:
# CUDA_DIR := /usr

# CUDA architecture setting: going with all of them.
# For CUDA < 6.0, comment the *_50 through *_61 lines for compatibility.
# For CUDA < 8.0, comment the *_60 and *_61 lines for compatibility.
CUDA_ARCH := -gencode arch=compute_30,code=sm_30 \
        -gencode arch=compute_35,code=sm_35 \
        -gencode arch=compute_50,code=sm_50 \
        -gencode arch=compute_52,code=sm_52 \
        -gencode arch=compute_60,code=sm_60 \
        -gencode arch=compute_61,code=sm_61 \
        -gencode arch=compute_61,code=compute_61
# Deprecated
# CUDA_ARCH := -gencode arch=compute_20,code=sm_20 \
#             -gencode arch=compute_20,code=sm_21 \
#             -gencode arch=compute_30,code=sm_30 \
#             -gencode arch=compute_35,code=sm_35 \
#             -gencode arch=compute_50,code=sm_50 \
#             -gencode arch=compute_52,code=sm_52 \
#             -gencode arch=compute_60,code=sm_60 \
#             -gencode arch=compute_61,code=sm_61 \
#             -gencode arch=compute_61,code=compute_61

# BLAS choice:
# atlas for ATLAS (default)
# mkl for MKL
# open for OpenBLAS
BLAS := atlas
# Custom (MKL/ATLAS/OpenBLAS) include and lib directories.
# Leave commented to accept the defaults for your choice of BLAS
# (which should work)!
# BLAS_INCLUDE := /path/to/your/blas
# BLAS_LIB := /path/to/your/blas

# Homebrew puts openblas in a directory that is not on the standard search path
# BLAS_INCLUDE := $(shell brew --prefix openblas)/include
# BLAS_LIB := $(shell brew --prefix openblas)/lib

# This is required only if you will compile the matlab interface.
# MATLAB directory should contain the mex binary in /bin.
# MATLAB_DIR := /usr/local
# MATLAB_DIR := /Applications/MATLAB_R2012b.app

# NOTE: this is required only if you will compile the python interface.
# We need to be able to find Python.h and numpy/arrayobject.h.
PYTHON_INCLUDE := /usr/include/python2.7 \
        /usr/lib/python2.7/dist-packages/numpy/core/include
# Anaconda Python distribution is quite popular. Include path:
# Verify anaconda location, sometimes it's in root.
# ANACONDA_HOME := $(HOME)/anaconda
# PYTHON_INCLUDE := $(ANACONDA_HOME)/include \
#         $(ANACONDA_HOME)/include/python2.7 \
#         $(ANACONDA_HOME)/lib/python2.7/site-packages/numpy/core/include

# Uncomment to use Python 3 (default is Python 2)
# PYTHON_LIBRARIES := boost_python3 python3.5m
# PYTHON_INCLUDE := /usr/include/python3.5m \
#         /usr/lib/python3.5/dist-packages/numpy/core/include

# We need to be able to find libpythonX.X.so or .dylib.
PYTHON_LIB := /usr/lib /usr/local/lib
# PYTHON_LIB := $(ANACONDA_HOME)/lib

# Homebrew installs numpy in a non-standard path (keg only)
# PYTHON_INCLUDE += $(dir $(shell python -c 'import numpy.core; print(numpy.core.__file__)'))/include
# PYTHON_LIB += $(shell brew --prefix numpy)/lib

# Uncomment to support layers written in Python (will link against Python libs)
# WITH_PYTHON_LAYER := 1

# Whatever else you find you need goes here.
INCLUDE_DIRS := $(PYTHON_INCLUDE) /usr/local/include /usr/include/hdf5/serial
LIBRARY_DIRS := $(PYTHON_LIB) /usr/local/lib /usr/lib /usr/lib/x86_64-linux-gnu /usr/lib/x86_64-linux-gnu/hdf5/serial

# If Homebrew is installed at a non-standard location (for example your home directory) and you use it for general dependencies
# INCLUDE_DIRS += $(shell brew --prefix)/include
# LIBRARY_DIRS += $(shell brew --prefix)/lib

# NCCL acceleration switch (uncomment to build with NCCL)
# https://github.com/NVIDIA/nccl (last tested version: v1.2.3-1+cuda8.0)
# USE_NCCL := 1

# Uncomment to use `pkg-config` to specify OpenCV library paths.
# (Usually not necessary -- OpenCV libraries are normally installed in one of the above $LIBRARY_DIRS.)
# USE_PKG_CONFIG := 1

# N.B. both build and distribute dirs are cleared on `make clean`
BUILD_DIR := build
DISTRIBUTE_DIR := distribute

# Uncomment for debugging. Does not work on OSX due to https://github.com/BVLC/caffe/issues/171
# DEBUG := 1

# The ID of the GPU that 'make runtest' will use to run unit tests.
TEST_GPUID := 0

# enable pretty build (comment to see full commands)
Q ?= @
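The hard-coded `PYTHON_INCLUDE` paths above vary across distributions and Python builds. A small, hypothetical helper script (not part of Caffe) that prints the values to use on the current machine; it assumes only the standard library, plus NumPy when installed:

```python
import sysconfig

# Directory containing Python.h for the running interpreter.
print(sysconfig.get_paths()["include"])

# Directory containing numpy/arrayobject.h, if NumPy is available.
try:
    import numpy
    print(numpy.get_include())
except ImportError:
    print("NumPy not installed; numpy/arrayobject.h unavailable")
```

Paste the printed directories into `PYTHON_INCLUDE` (joined with backslash continuations, as in the config above).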
@ -1,129 +0,0 @@
## Refer to http://caffe.berkeleyvision.org/installation.html
# Contributions simplifying and improving our build system are welcome!

# cuDNN acceleration switch (uncomment to build with cuDNN).
USE_CUDNN := 1

# CPU-only switch (uncomment to build without GPU support).
# CPU_ONLY := 1

# uncomment to disable IO dependencies and corresponding data layers
# USE_OPENCV := 0
# USE_LEVELDB := 0
# USE_LMDB := 0

# uncomment to allow MDB_NOLOCK when reading LMDB files (only if necessary)
# You should not set this flag if you will be reading LMDBs with any
# possibility of simultaneous read and write
# ALLOW_LMDB_NOLOCK := 1

# Uncomment if you're using OpenCV 3
# OPENCV_VERSION := 3

# To customize your choice of compiler, uncomment and set the following.
# N.B. the default for Linux is g++ and the default for OSX is clang++
# CUSTOM_CXX := g++

# CUDA directory contains bin/ and lib/ directories that we need.
CUDA_DIR := /usr/local/cuda
# On Ubuntu 14.04, if CUDA tools are installed via
# "sudo apt-get install nvidia-cuda-toolkit" then use this instead:
# CUDA_DIR := /usr

# CUDA architecture setting: going with all of them.
# For CUDA < 6.0, comment the *_50 through *_61 lines for compatibility.
# For CUDA < 8.0, comment the *_60 and *_61 lines for compatibility.
CUDA_ARCH := -gencode arch=compute_30,code=sm_30 \
        -gencode arch=compute_35,code=sm_35 \
        -gencode arch=compute_50,code=sm_50 \
        -gencode arch=compute_52,code=sm_52 \
        -gencode arch=compute_60,code=sm_60 \
        -gencode arch=compute_61,code=sm_61 \
        -gencode arch=compute_61,code=compute_61
# Deprecated
# CUDA_ARCH := -gencode arch=compute_20,code=sm_20 \
#             -gencode arch=compute_20,code=sm_21 \
#             -gencode arch=compute_30,code=sm_30 \
#             -gencode arch=compute_35,code=sm_35 \
#             -gencode arch=compute_50,code=sm_50 \
#             -gencode arch=compute_52,code=sm_52 \
#             -gencode arch=compute_60,code=sm_60 \
#             -gencode arch=compute_61,code=sm_61 \
#             -gencode arch=compute_61,code=compute_61

# BLAS choice:
# atlas for ATLAS (default)
# mkl for MKL
# open for OpenBLAS
BLAS := atlas
# Custom (MKL/ATLAS/OpenBLAS) include and lib directories.
# Leave commented to accept the defaults for your choice of BLAS
# (which should work)!
# BLAS_INCLUDE := /path/to/your/blas
# BLAS_LIB := /path/to/your/blas

# Homebrew puts openblas in a directory that is not on the standard search path
# BLAS_INCLUDE := $(shell brew --prefix openblas)/include
# BLAS_LIB := $(shell brew --prefix openblas)/lib

# This is required only if you will compile the matlab interface.
# MATLAB directory should contain the mex binary in /bin.
# MATLAB_DIR := /usr/local
# MATLAB_DIR := /Applications/MATLAB_R2012b.app

# NOTE: this is required only if you will compile the python interface.
# We need to be able to find Python.h and numpy/arrayobject.h.
PYTHON_INCLUDE := /usr/include/python2.7 \
        /usr/lib/python2.7/dist-packages/numpy/core/include \
        /usr/local/lib/python2.7/dist-packages/numpy/core/include
# Anaconda Python distribution is quite popular. Include path:
# Verify anaconda location, sometimes it's in root.
# ANACONDA_HOME := $(HOME)/anaconda
# PYTHON_INCLUDE := $(ANACONDA_HOME)/include \
#         $(ANACONDA_HOME)/include/python2.7 \
#         $(ANACONDA_HOME)/lib/python2.7/site-packages/numpy/core/include

# Uncomment to use Python 3 (default is Python 2)
# PYTHON_LIBRARIES := boost_python3 python3.5m
# PYTHON_INCLUDE := /usr/include/python3.5m \
#         /usr/lib/python3.5/dist-packages/numpy/core/include

# We need to be able to find libpythonX.X.so or .dylib.
PYTHON_LIB := /usr/lib /usr/local/lib
# PYTHON_LIB := $(ANACONDA_HOME)/lib

# Homebrew installs numpy in a non-standard path (keg only)
# PYTHON_INCLUDE += $(dir $(shell python -c 'import numpy.core; print(numpy.core.__file__)'))/include
# PYTHON_LIB += $(shell brew --prefix numpy)/lib

# Uncomment to support layers written in Python (will link against Python libs)
# WITH_PYTHON_LAYER := 1

# Whatever else you find you need goes here.
INCLUDE_DIRS := $(PYTHON_INCLUDE) /usr/local/include /usr/include/hdf5/serial
LIBRARY_DIRS := $(PYTHON_LIB) /usr/local/lib /usr/lib /usr/lib/aarch64-linux-gnu /usr/lib/aarch64-linux-gnu/hdf5/serial

# If Homebrew is installed at a non-standard location (for example your home directory) and you use it for general dependencies
# INCLUDE_DIRS += $(shell brew --prefix)/include
# LIBRARY_DIRS += $(shell brew --prefix)/lib

# NCCL acceleration switch (uncomment to build with NCCL)
# https://github.com/NVIDIA/nccl (last tested version: v1.2.3-1+cuda8.0)
# USE_NCCL := 1

# Uncomment to use `pkg-config` to specify OpenCV library paths.
# (Usually not necessary -- OpenCV libraries are normally installed in one of the above $LIBRARY_DIRS.)
# USE_PKG_CONFIG := 1

# N.B. both build and distribute dirs are cleared on `make clean`
BUILD_DIR := build
DISTRIBUTE_DIR := distribute

# Uncomment for debugging. Does not work on OSX due to https://github.com/BVLC/caffe/issues/171
# DEBUG := 1

# The ID of the GPU that 'make runtest' will use to run unit tests.
TEST_GPUID := 0

# enable pretty build (comment to see full commands)
Q ?= @
@ -1,129 +0,0 @@
## Refer to http://caffe.berkeleyvision.org/installation.html
# Contributions simplifying and improving our build system are welcome!

# cuDNN acceleration switch (uncomment to build with cuDNN).
USE_CUDNN := 1

# CPU-only switch (uncomment to build without GPU support).
# CPU_ONLY := 1

# uncomment to disable IO dependencies and corresponding data layers
# USE_OPENCV := 0
# USE_LEVELDB := 0
# USE_LMDB := 0

# uncomment to allow MDB_NOLOCK when reading LMDB files (only if necessary)
# You should not set this flag if you will be reading LMDBs with any
# possibility of simultaneous read and write
# ALLOW_LMDB_NOLOCK := 1

# Uncomment if you're using OpenCV 3
OPENCV_VERSION := 3

# To customize your choice of compiler, uncomment and set the following.
# N.B. the default for Linux is g++ and the default for OSX is clang++
# CUSTOM_CXX := g++

# CUDA directory contains bin/ and lib/ directories that we need.
CUDA_DIR := /usr/local/cuda
# On Ubuntu 14.04, if CUDA tools are installed via
# "sudo apt-get install nvidia-cuda-toolkit" then use this instead:
# CUDA_DIR := /usr

# CUDA architecture setting: going with all of them.
# For CUDA < 6.0, comment the *_50 through *_61 lines for compatibility.
# For CUDA < 8.0, comment the *_60 and *_61 lines for compatibility.
CUDA_ARCH := -gencode arch=compute_30,code=sm_30 \
        -gencode arch=compute_35,code=sm_35 \
        -gencode arch=compute_50,code=sm_50 \
        -gencode arch=compute_52,code=sm_52 \
        -gencode arch=compute_60,code=sm_60 \
        -gencode arch=compute_61,code=sm_61 \
        -gencode arch=compute_61,code=compute_61
# Deprecated
# CUDA_ARCH := -gencode arch=compute_20,code=sm_20 \
#             -gencode arch=compute_20,code=sm_21 \
#             -gencode arch=compute_30,code=sm_30 \
#             -gencode arch=compute_35,code=sm_35 \
#             -gencode arch=compute_50,code=sm_50 \
#             -gencode arch=compute_52,code=sm_52 \
#             -gencode arch=compute_60,code=sm_60 \
#             -gencode arch=compute_61,code=sm_61 \
#             -gencode arch=compute_61,code=compute_61

# BLAS choice:
# atlas for ATLAS (default)
# mkl for MKL
# open for OpenBLAS
BLAS := atlas
# Custom (MKL/ATLAS/OpenBLAS) include and lib directories.
# Leave commented to accept the defaults for your choice of BLAS
# (which should work)!
# BLAS_INCLUDE := /path/to/your/blas
# BLAS_LIB := /path/to/your/blas

# Homebrew puts openblas in a directory that is not on the standard search path
# BLAS_INCLUDE := $(shell brew --prefix openblas)/include
# BLAS_LIB := $(shell brew --prefix openblas)/lib

# This is required only if you will compile the matlab interface.
# MATLAB directory should contain the mex binary in /bin.
# MATLAB_DIR := /usr/local
# MATLAB_DIR := /Applications/MATLAB_R2012b.app

# NOTE: this is required only if you will compile the python interface.
# We need to be able to find Python.h and numpy/arrayobject.h.
PYTHON_INCLUDE := /usr/include/python2.7 \
        /usr/lib/python2.7/dist-packages/numpy/core/include \
        /usr/local/lib/python2.7/dist-packages/numpy/core/include
# Anaconda Python distribution is quite popular. Include path:
# Verify anaconda location, sometimes it's in root.
# ANACONDA_HOME := $(HOME)/anaconda
# PYTHON_INCLUDE := $(ANACONDA_HOME)/include \
#         $(ANACONDA_HOME)/include/python2.7 \
#         $(ANACONDA_HOME)/lib/python2.7/site-packages/numpy/core/include

# Uncomment to use Python 3 (default is Python 2)
# PYTHON_LIBRARIES := boost_python3 python3.5m
# PYTHON_INCLUDE := /usr/include/python3.5m \
#         /usr/lib/python3.5/dist-packages/numpy/core/include

# We need to be able to find libpythonX.X.so or .dylib.
PYTHON_LIB := /usr/lib /usr/local/lib
# PYTHON_LIB := $(ANACONDA_HOME)/lib

# Homebrew installs numpy in a non-standard path (keg only)
# PYTHON_INCLUDE += $(dir $(shell python -c 'import numpy.core; print(numpy.core.__file__)'))/include
# PYTHON_LIB += $(shell brew --prefix numpy)/lib

# Uncomment to support layers written in Python (will link against Python libs)
# WITH_PYTHON_LAYER := 1

# Whatever else you find you need goes here.
INCLUDE_DIRS := $(PYTHON_INCLUDE) /usr/local/include /usr/include/hdf5/serial
LIBRARY_DIRS := $(PYTHON_LIB) /usr/local/lib /usr/lib /usr/lib/aarch64-linux-gnu /usr/lib/aarch64-linux-gnu/hdf5/serial

# If Homebrew is installed at a non-standard location (for example your home directory) and you use it for general dependencies
# INCLUDE_DIRS += $(shell brew --prefix)/include
# LIBRARY_DIRS += $(shell brew --prefix)/lib

# NCCL acceleration switch (uncomment to build with NCCL)
# https://github.com/NVIDIA/nccl (last tested version: v1.2.3-1+cuda8.0)
# USE_NCCL := 1

# Uncomment to use `pkg-config` to specify OpenCV library paths.
# (Usually not necessary -- OpenCV libraries are normally installed in one of the above $LIBRARY_DIRS.)
# USE_PKG_CONFIG := 1

# N.B. both build and distribute dirs are cleared on `make clean`
BUILD_DIR := build
DISTRIBUTE_DIR := distribute

# Uncomment for debugging. Does not work on OSX due to https://github.com/BVLC/caffe/issues/171
# DEBUG := 1

# The ID of the GPU that 'make runtest' will use to run unit tests.
TEST_GPUID := 0

# enable pretty build (comment to see full commands)
Q ?= @
@ -1,45 +0,0 @@
# Caffe

[![Build Status](https://travis-ci.org/BVLC/caffe.svg?branch=master)](https://travis-ci.org/BVLC/caffe)
[![License](https://img.shields.io/badge/license-BSD-blue.svg)](LICENSE)

Caffe is a deep learning framework made with expression, speed, and modularity in mind.
It is developed by Berkeley AI Research ([BAIR](http://bair.berkeley.edu))/The Berkeley Vision and Learning Center (BVLC) and community contributors.

Check out the [project site](http://caffe.berkeleyvision.org) for all the details, like

- [DIY Deep Learning for Vision with Caffe](https://docs.google.com/presentation/d/1UeKXVgRvvxg9OUdh_UiC5G71UMscNPlvArsWER41PsU/edit#slide=id.p)
- [Tutorial Documentation](http://caffe.berkeleyvision.org/tutorial/)
- [BAIR reference models](http://caffe.berkeleyvision.org/model_zoo.html) and the [community model zoo](https://github.com/BVLC/caffe/wiki/Model-Zoo)
- [Installation instructions](http://caffe.berkeleyvision.org/installation.html)

and step-by-step examples.

## Custom distributions

- [Intel Caffe](https://github.com/BVLC/caffe/tree/intel) (optimized for CPU with multi-node support), in particular for Xeon processors (HSW, BDW, SKX, Xeon Phi).
- [OpenCL Caffe](https://github.com/BVLC/caffe/tree/opencl), e.g. for AMD or Intel devices.
- [Windows Caffe](https://github.com/BVLC/caffe/tree/windows)

## Community

[![Join the chat at https://gitter.im/BVLC/caffe](https://badges.gitter.im/Join%20Chat.svg)](https://gitter.im/BVLC/caffe?utm_source=badge&utm_medium=badge&utm_campaign=pr-badge&utm_content=badge)

Please join the [caffe-users group](https://groups.google.com/forum/#!forum/caffe-users) or [gitter chat](https://gitter.im/BVLC/caffe) to ask questions and talk about methods and models.
Framework development discussions and thorough bug reports are collected on [Issues](https://github.com/BVLC/caffe/issues).

Happy brewing!

## License and Citation

Caffe is released under the [BSD 2-Clause license](https://github.com/BVLC/caffe/blob/master/LICENSE).
The BAIR/BVLC reference models are released for unrestricted use.

Please cite Caffe in your publications if it helps your research:

    @article{jia2014caffe,
      Author = {Jia, Yangqing and Shelhamer, Evan and Donahue, Jeff and Karayev, Sergey and Long, Jonathan and Girshick, Ross and Guadarrama, Sergio and Darrell, Trevor},
      Journal = {arXiv preprint arXiv:1408.5093},
      Title = {Caffe: Convolutional Architecture for Fast Feature Embedding},
      Year = {2014}
    }
@ -1,53 +0,0 @@
Bourne Shell
    filter remove_matches ^\s*#
    filter remove_inline #.*$
    extension sh
    script_exe sh
C
    filter remove_matches ^\s*//
    filter call_regexp_common C
    filter remove_inline //.*$
    extension c
    extension ec
    extension pgc
C++
    filter remove_matches ^\s*//
    filter remove_inline //.*$
    filter call_regexp_common C
    extension C
    extension cc
    extension cpp
    extension cxx
    extension pcc
C/C++ Header
    filter remove_matches ^\s*//
    filter call_regexp_common C
    filter remove_inline //.*$
    extension H
    extension h
    extension hh
    extension hpp
CUDA
    filter remove_matches ^\s*//
    filter remove_inline //.*$
    filter call_regexp_common C
    extension cu
Python
    filter remove_matches ^\s*#
    filter docstring_to_C
    filter call_regexp_common C
    filter remove_inline #.*$
    extension py
make
    filter remove_matches ^\s*#
    filter remove_inline #.*$
    extension Gnumakefile
    extension Makefile
    extension am
    extension gnumakefile
    extension makefile
    filename Gnumakefile
    filename Makefile
    filename gnumakefile
    filename makefile
    script_exe make
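The `filter` directives above are regular expressions that cloc applies before counting lines of code: `remove_matches` drops whole lines that match, while `remove_inline` strips a trailing match from lines that otherwise contain code. A small Python illustration of the two `make` filters (the sample lines are invented):

```python
import re

lines = [
    "# full-line comment",
    "    # indented comment",
    "BLAS := atlas  # trailing comment",
    "BUILD_DIR := build",
]

# filter remove_matches ^\s*#  -> discard pure comment lines
kept = [line for line in lines if not re.match(r"^\s*#", line)]

# filter remove_inline #.*$    -> strip trailing comments from what remains
code = [re.sub(r"#.*$", "", line).rstrip() for line in kept]

print(code)
```

Only the two assignment lines survive, which is what cloc then counts as code.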
@ -1,68 +0,0 @@

################################################################################################
# Helper function to get all list items that begin with given prefix
# Usage:
#   caffe_get_items_with_prefix(<prefix> <list_variable> <output_variable>)
function(caffe_get_items_with_prefix prefix list_variable output_variable)
  set(__result "")
  foreach(__e ${${list_variable}})
    if(__e MATCHES "^${prefix}.*")
      list(APPEND __result ${__e})
    endif()
  endforeach()
  set(${output_variable} ${__result} PARENT_SCOPE)
endfunction()
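For readers less familiar with CMake, the helper above is just a prefix filter over a list. The equivalent in Python, slightly simplified: CMake's `MATCHES` treats the prefix as a regular expression, while this sketch treats it as a plain string:

```python
def items_with_prefix(prefix, items):
    """Keep only the items that begin with the given prefix."""
    return [item for item in items if item.startswith(prefix)]

# Hypothetical use: pick the CUDA-related entries out of a mixed library list.
print(items_with_prefix("cu", ["cudart", "cublas", "boost_system", "cudnn"]))
```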

################################################################################################
# Function for generating Caffe build- and install-tree export config files
# Usage:
#   caffe_generate_export_configs()
function(caffe_generate_export_configs)
  set(install_cmake_suffix "share/Caffe")

  if(NOT HAVE_CUDA)
    set(HAVE_CUDA FALSE)
  endif()

  set(HDF5_IMPORTED OFF)
  foreach(_lib ${HDF5_LIBRARIES} ${HDF5_HL_LIBRARIES})
    if(TARGET ${_lib})
      set(HDF5_IMPORTED ON)
    endif()
  endforeach()

  # This code is taken from https://github.com/sh1r0/caffe-android-lib
  if(USE_HDF5)
    list(APPEND Caffe_DEFINITIONS -DUSE_HDF5)
  endif()

  if(NOT HAVE_CUDNN)
    set(HAVE_CUDNN FALSE)
  endif()

  # ---[ Configure build-tree CaffeConfig.cmake file ]---

  configure_file("cmake/Templates/CaffeConfig.cmake.in" "${PROJECT_BINARY_DIR}/CaffeConfig.cmake" @ONLY)

  # Add targets to the build-tree export set
  export(TARGETS caffe caffeproto FILE "${PROJECT_BINARY_DIR}/CaffeTargets.cmake")
  export(PACKAGE Caffe)

  # ---[ Configure install-tree CaffeConfig.cmake file ]---

  configure_file("cmake/Templates/CaffeConfig.cmake.in" "${PROJECT_BINARY_DIR}/cmake/CaffeConfig.cmake" @ONLY)

  # Install the CaffeConfig.cmake and export set to use with install-tree
  install(FILES "${PROJECT_BINARY_DIR}/cmake/CaffeConfig.cmake" DESTINATION ${install_cmake_suffix})
  install(EXPORT CaffeTargets DESTINATION ${install_cmake_suffix})

  # ---[ Configure and install version file ]---

  # TODO: Lines below are commented because Caffe doesn't declare its version in headers.
  # When the declarations are added, modify `caffe_extract_caffe_version()` macro and uncomment

  # configure_file(cmake/Templates/CaffeConfigVersion.cmake.in "${PROJECT_BINARY_DIR}/CaffeConfigVersion.cmake" @ONLY)
  # install(FILES "${PROJECT_BINARY_DIR}/CaffeConfigVersion.cmake" DESTINATION ${install_cmake_suffix})
endfunction()


if(CPU_ONLY)
  return()
endif()

# Known NVIDIA GPU architectures Caffe can be compiled for.
# This list will be used for the CUDA_ARCH_NAME = All option.
set(Caffe_known_gpu_archs "30 35 50 52 60 61")
# set(Caffe_known_gpu_archs "20 21(20) 30 35 50 60 61")

################################################################################################
# A function for automatic detection of GPUs installed (if autodetection is enabled)
# Usage:
#   caffe_detect_installed_gpus(out_variable)
function(caffe_detect_installed_gpus out_variable)
  if(NOT CUDA_gpu_detect_output)
    set(__cufile ${PROJECT_BINARY_DIR}/detect_cuda_archs.cu)

    file(WRITE ${__cufile} ""
      "#include <cstdio>\n"
      "int main()\n"
      "{\n"
      "  int count = 0;\n"
      "  if (cudaSuccess != cudaGetDeviceCount(&count)) return -1;\n"
      "  if (count == 0) return -1;\n"
      "  for (int device = 0; device < count; ++device)\n"
      "  {\n"
      "    cudaDeviceProp prop;\n"
      "    if (cudaSuccess == cudaGetDeviceProperties(&prop, device))\n"
      "      std::printf(\"%d.%d \", prop.major, prop.minor);\n"
      "  }\n"
      "  return 0;\n"
      "}\n")

    execute_process(COMMAND "${CUDA_NVCC_EXECUTABLE}" "--run" "${__cufile}"
                    WORKING_DIRECTORY "${PROJECT_BINARY_DIR}/CMakeFiles/"
                    RESULT_VARIABLE __nvcc_res OUTPUT_VARIABLE __nvcc_out
                    ERROR_QUIET OUTPUT_STRIP_TRAILING_WHITESPACE)

    if(__nvcc_res EQUAL 0)
      string(REPLACE "2.1" "2.1(2.0)" __nvcc_out "${__nvcc_out}")
      set(CUDA_gpu_detect_output ${__nvcc_out} CACHE INTERNAL "Returned GPU architectures from caffe_detect_gpus tool" FORCE)
    endif()
  endif()

  if(NOT CUDA_gpu_detect_output)
    message(STATUS "Automatic GPU detection failed. Building for all known architectures.")
    set(${out_variable} ${Caffe_known_gpu_archs} PARENT_SCOPE)
  else()
    set(${out_variable} ${CUDA_gpu_detect_output} PARENT_SCOPE)
  endif()
endfunction()


################################################################################################
# Function for selecting GPU arch flags for nvcc based on CUDA_ARCH_NAME
# Usage:
#   caffe_select_nvcc_arch_flags(out_variable)
function(caffe_select_nvcc_arch_flags out_variable)
  # List of arch names
  set(__archs_names "Kepler" "Maxwell" "Pascal" "All" "Manual")
  # set(__archs_names "Fermi" "Kepler" "Maxwell" "Pascal" "All" "Manual")
  set(__archs_name_default "All")
  if(NOT CMAKE_CROSSCOMPILING)
    list(APPEND __archs_names "Auto")
    set(__archs_name_default "Auto")
  endif()

  # Set CUDA_ARCH_NAME strings (so it will be shown as a drop-down list in cmake-gui)
  set(CUDA_ARCH_NAME ${__archs_name_default} CACHE STRING "Select target NVIDIA GPU architecture.")
  set_property(CACHE CUDA_ARCH_NAME PROPERTY STRINGS "" ${__archs_names})
  mark_as_advanced(CUDA_ARCH_NAME)

  # Verify CUDA_ARCH_NAME value
  if(NOT ";${__archs_names};" MATCHES ";${CUDA_ARCH_NAME};")
    string(REPLACE ";" ", " __archs_names "${__archs_names}")
    message(FATAL_ERROR "Only ${__archs_names} architecture names are supported.")
  endif()

  if(${CUDA_ARCH_NAME} STREQUAL "Manual")
    set(CUDA_ARCH_BIN ${Caffe_known_gpu_archs} CACHE STRING "Specify 'real' GPU architectures to build binaries for, BIN(PTX) format is supported")
    set(CUDA_ARCH_PTX "50" CACHE STRING "Specify 'virtual' PTX architectures to build PTX intermediate code for")
    mark_as_advanced(CUDA_ARCH_BIN CUDA_ARCH_PTX)
  else()
    unset(CUDA_ARCH_BIN CACHE)
    unset(CUDA_ARCH_PTX CACHE)
  endif()

  if(${CUDA_ARCH_NAME} STREQUAL "Fermi")
    set(__cuda_arch_bin "20 21(20)")
  elseif(${CUDA_ARCH_NAME} STREQUAL "Kepler")
    set(__cuda_arch_bin "30 35")
  elseif(${CUDA_ARCH_NAME} STREQUAL "Maxwell")
    set(__cuda_arch_bin "50")
  elseif(${CUDA_ARCH_NAME} STREQUAL "Pascal")
    set(__cuda_arch_bin "60 61")
  elseif(${CUDA_ARCH_NAME} STREQUAL "All")
    set(__cuda_arch_bin ${Caffe_known_gpu_archs})
  elseif(${CUDA_ARCH_NAME} STREQUAL "Auto")
    caffe_detect_installed_gpus(__cuda_arch_bin)
  else()  # (${CUDA_ARCH_NAME} STREQUAL "Manual")
    set(__cuda_arch_bin ${CUDA_ARCH_BIN})
  endif()

  # Remove dots and convert to lists
  string(REGEX REPLACE "\\." "" __cuda_arch_bin "${__cuda_arch_bin}")
  string(REGEX REPLACE "\\." "" __cuda_arch_ptx "${CUDA_ARCH_PTX}")
  string(REGEX MATCHALL "[0-9()]+" __cuda_arch_bin "${__cuda_arch_bin}")
  string(REGEX MATCHALL "[0-9]+" __cuda_arch_ptx "${__cuda_arch_ptx}")
  caffe_list_unique(__cuda_arch_bin __cuda_arch_ptx)

  set(__nvcc_flags "")
  set(__nvcc_archs_readable "")

  string(COMPARE LESS "${CUDA_VERSION}" "9.0" iscudaolderthan90)
  if(NOT iscudaolderthan90)
    # CUDA 9.0 dropped support for compute capability 2.x (Fermi)
    string(REPLACE "21(20)" "" __cuda_arch_bin "${__cuda_arch_bin}")
    string(REPLACE "20" "" __cuda_arch_bin "${__cuda_arch_bin}")
  endif()

  # Tell NVCC to add binaries for the specified GPUs
  foreach(__arch ${__cuda_arch_bin})
    if(__arch MATCHES "([0-9]+)\\(([0-9]+)\\)")
      # User explicitly specified PTX for the concrete BIN
      list(APPEND __nvcc_flags -gencode arch=compute_${CMAKE_MATCH_2},code=sm_${CMAKE_MATCH_1})
      list(APPEND __nvcc_archs_readable sm_${CMAKE_MATCH_1})
    else()
      # User didn't explicitly specify PTX for the concrete BIN, so we assume PTX=BIN
      list(APPEND __nvcc_flags -gencode arch=compute_${__arch},code=sm_${__arch})
      list(APPEND __nvcc_archs_readable sm_${__arch})
    endif()
  endforeach()

  # Tell NVCC to add PTX intermediate code for the specified architectures
  foreach(__arch ${__cuda_arch_ptx})
    list(APPEND __nvcc_flags -gencode arch=compute_${__arch},code=compute_${__arch})
    list(APPEND __nvcc_archs_readable compute_${__arch})
  endforeach()

  string(REPLACE ";" " " __nvcc_archs_readable "${__nvcc_archs_readable}")
  set(${out_variable} ${__nvcc_flags} PARENT_SCOPE)
  set(${out_variable}_readable ${__nvcc_archs_readable} PARENT_SCOPE)
endfunction()

################################################################################################
# Short command for cuda compilation
# Usage:
#   caffe_cuda_compile(<objlist_variable> <cuda_files>)
macro(caffe_cuda_compile objlist_variable)
  foreach(var CMAKE_CXX_FLAGS CMAKE_CXX_FLAGS_RELEASE CMAKE_CXX_FLAGS_DEBUG)
    set(${var}_backup_in_cuda_compile_ "${${var}}")

    # We remove /EHa as it generates warnings under Windows
    string(REPLACE "/EHa" "" ${var} "${${var}}")
  endforeach()

  if(UNIX OR APPLE)
    list(APPEND CUDA_NVCC_FLAGS -Xcompiler -fPIC)
  endif()

  if(APPLE)
    list(APPEND CUDA_NVCC_FLAGS -Xcompiler -Wno-unused-function)
  endif()

  cuda_compile(cuda_objcs ${ARGN})

  foreach(var CMAKE_CXX_FLAGS CMAKE_CXX_FLAGS_RELEASE CMAKE_CXX_FLAGS_DEBUG)
    set(${var} "${${var}_backup_in_cuda_compile_}")
    unset(${var}_backup_in_cuda_compile_)
  endforeach()

  set(${objlist_variable} ${cuda_objcs})
endmacro()

################################################################################################
# Short command for cuDNN detection. cuDNN may soon ship as part of the CUDA toolkit
# distribution, which is why this is a macro here rather than a FindcuDNN.cmake module.
# Usage:
#   detect_cuDNN()
function(detect_cuDNN)
  set(CUDNN_ROOT "" CACHE PATH "CUDNN root folder")

  find_path(CUDNN_INCLUDE cudnn.h
            PATHS ${CUDNN_ROOT} $ENV{CUDNN_ROOT} ${CUDA_TOOLKIT_INCLUDE}
            DOC "Path to cuDNN include directory.")

  # Dynamic libraries have different suffixes on macOS and Linux
  if(APPLE)
    set(CUDNN_LIB_NAME "libcudnn.dylib")
  else()
    set(CUDNN_LIB_NAME "libcudnn.so")
  endif()

  get_filename_component(__libpath_hist ${CUDA_CUDART_LIBRARY} PATH)
  find_library(CUDNN_LIBRARY NAMES ${CUDNN_LIB_NAME}
               PATHS ${CUDNN_ROOT} $ENV{CUDNN_ROOT} ${CUDNN_INCLUDE} ${__libpath_hist} ${__libpath_hist}/../lib
               DOC "Path to cuDNN library.")

  if(CUDNN_INCLUDE AND CUDNN_LIBRARY)
    set(HAVE_CUDNN TRUE PARENT_SCOPE)
    set(CUDNN_FOUND TRUE PARENT_SCOPE)

    file(READ ${CUDNN_INCLUDE}/cudnn.h CUDNN_VERSION_FILE_CONTENTS)

    # cuDNN v3 and beyond
    string(REGEX MATCH "define CUDNN_MAJOR * +([0-9]+)"
           CUDNN_VERSION_MAJOR "${CUDNN_VERSION_FILE_CONTENTS}")
    string(REGEX REPLACE "define CUDNN_MAJOR * +([0-9]+)" "\\1"
           CUDNN_VERSION_MAJOR "${CUDNN_VERSION_MAJOR}")
    string(REGEX MATCH "define CUDNN_MINOR * +([0-9]+)"
           CUDNN_VERSION_MINOR "${CUDNN_VERSION_FILE_CONTENTS}")
    string(REGEX REPLACE "define CUDNN_MINOR * +([0-9]+)" "\\1"
           CUDNN_VERSION_MINOR "${CUDNN_VERSION_MINOR}")
    string(REGEX MATCH "define CUDNN_PATCHLEVEL * +([0-9]+)"
           CUDNN_VERSION_PATCH "${CUDNN_VERSION_FILE_CONTENTS}")
    string(REGEX REPLACE "define CUDNN_PATCHLEVEL * +([0-9]+)" "\\1"
           CUDNN_VERSION_PATCH "${CUDNN_VERSION_PATCH}")

    if(NOT CUDNN_VERSION_MAJOR)
      set(CUDNN_VERSION "???")
    else()
      set(CUDNN_VERSION "${CUDNN_VERSION_MAJOR}.${CUDNN_VERSION_MINOR}.${CUDNN_VERSION_PATCH}")
    endif()

    message(STATUS "Found cuDNN: ver. ${CUDNN_VERSION} found (include: ${CUDNN_INCLUDE}, library: ${CUDNN_LIBRARY})")

    string(COMPARE LESS "${CUDNN_VERSION_MAJOR}" 3 cuDNNVersionIncompatible)
    if(cuDNNVersionIncompatible)
      message(FATAL_ERROR "cuDNN version 3 or higher is required.")
    endif()

    set(CUDNN_VERSION "${CUDNN_VERSION}" PARENT_SCOPE)
    mark_as_advanced(CUDNN_INCLUDE CUDNN_LIBRARY CUDNN_ROOT)
  endif()
endfunction()

################################################################################################
### Non macro section
################################################################################################

find_package(CUDA 5.5 QUIET)
find_cuda_helper_libs(curand)  # CMake 2.8.7 compatibility, which doesn't search for curand

if(NOT CUDA_FOUND)
  return()
endif()

set(HAVE_CUDA TRUE)
message(STATUS "CUDA detected: " ${CUDA_VERSION})
list(APPEND Caffe_INCLUDE_DIRS PUBLIC ${CUDA_INCLUDE_DIRS})
list(APPEND Caffe_LINKER_LIBS PUBLIC ${CUDA_CUDART_LIBRARY}
                              ${CUDA_curand_LIBRARY} ${CUDA_CUBLAS_LIBRARIES})

# cuDNN detection
if(USE_CUDNN)
  detect_cuDNN()
  if(HAVE_CUDNN)
    list(APPEND Caffe_DEFINITIONS PUBLIC -DUSE_CUDNN)
    list(APPEND Caffe_INCLUDE_DIRS PUBLIC ${CUDNN_INCLUDE})
    list(APPEND Caffe_LINKER_LIBS PUBLIC ${CUDNN_LIBRARY})
  endif()
endif()

# Setting nvcc arch flags
caffe_select_nvcc_arch_flags(NVCC_FLAGS_EXTRA)
list(APPEND CUDA_NVCC_FLAGS ${NVCC_FLAGS_EXTRA})
message(STATUS "Added CUDA NVCC flags for: ${NVCC_FLAGS_EXTRA_readable}")

# Boost 1.55 workaround, see https://svn.boost.org/trac/boost/ticket/9392 or
# https://github.com/ComputationalRadiationPhysics/picongpu/blob/master/src/picongpu/CMakeLists.txt
if(Boost_VERSION EQUAL 105500)
  message(STATUS "Cuda + Boost 1.55: Applying noinline work around")
  # Avoid warning for CMake >= 2.8.12
  set(CUDA_NVCC_FLAGS "${CUDA_NVCC_FLAGS} \"-DBOOST_NOINLINE=__attribute__((noinline))\" ")
endif()

# Disable some nvcc diagnostics that appear in boost, glog, gflags, opencv, etc.
foreach(diag cc_clobber_ignored integer_sign_change useless_using_declaration set_but_not_used)
  list(APPEND CUDA_NVCC_FLAGS -Xcudafe --diag_suppress=${diag})
endforeach()

# Setting default testing device
if(NOT CUDA_TEST_DEVICE)
  set(CUDA_TEST_DEVICE -1)
endif()

mark_as_advanced(CUDA_BUILD_CUBIN CUDA_BUILD_EMULATION CUDA_VERBOSE_BUILD)
mark_as_advanced(CUDA_SDK_ROOT_DIR CUDA_SEPARABLE_COMPILATION)

# Handle clang/libc++ issue
if(APPLE)
  caffe_detect_darwin_version(OSX_VERSION)

  # OS X 10.9 and higher uses clang/libc++ by default, which is incompatible with old CUDA toolkits
  if(OSX_VERSION VERSION_GREATER 10.8)
    # Enabled by default if and only if CUDA version is less than 7.0
    caffe_option(USE_libstdcpp "Use libstdc++ instead of libc++" (CUDA_VERSION VERSION_LESS 7.0))
  endif()
endif()

# These lists are later turned into target properties on the main caffe library target
set(Caffe_LINKER_LIBS "")
set(Caffe_INCLUDE_DIRS "")
set(Caffe_DEFINITIONS "")
set(Caffe_COMPILE_OPTIONS "")

# ---[ Boost
find_package(Boost 1.54 REQUIRED COMPONENTS system thread filesystem)
list(APPEND Caffe_INCLUDE_DIRS PUBLIC ${Boost_INCLUDE_DIRS})
list(APPEND Caffe_LINKER_LIBS PUBLIC ${Boost_LIBRARIES})

# ---[ Threads
find_package(Threads REQUIRED)
list(APPEND Caffe_LINKER_LIBS PRIVATE ${CMAKE_THREAD_LIBS_INIT})

# ---[ OpenMP
if(USE_OPENMP)
  # Ideally, this should be provided by the BLAS library IMPORTED target. However,
  # nobody does this, so we need to link to OpenMP explicitly and have the maintainer
  # flick the switch manually as needed.
  #
  # Moreover, the OpenMP package does not provide an IMPORTED target either, and the
  # suggested way of linking to OpenMP is to append to CMAKE_{C,CXX}_FLAGS.
  # However, this naïve method would force any user of Caffe to add the same kludge
  # into their build system again, so we put these options into per-target PUBLIC
  # compile options and link flags, so that they will be exported properly.
  find_package(OpenMP REQUIRED)
  list(APPEND Caffe_LINKER_LIBS PRIVATE ${OpenMP_CXX_FLAGS})
  list(APPEND Caffe_COMPILE_OPTIONS PRIVATE ${OpenMP_CXX_FLAGS})
endif()

# ---[ Google glog
include("cmake/External/glog.cmake")
list(APPEND Caffe_INCLUDE_DIRS PUBLIC ${GLOG_INCLUDE_DIRS})
list(APPEND Caffe_LINKER_LIBS PUBLIC ${GLOG_LIBRARIES})

# ---[ Google gflags
include("cmake/External/gflags.cmake")
list(APPEND Caffe_INCLUDE_DIRS PUBLIC ${GFLAGS_INCLUDE_DIRS})
list(APPEND Caffe_LINKER_LIBS PUBLIC ${GFLAGS_LIBRARIES})

# ---[ Google protobuf
include(cmake/ProtoBuf.cmake)

# ---[ HDF5
find_package(HDF5 COMPONENTS HL REQUIRED)
list(APPEND Caffe_INCLUDE_DIRS PUBLIC ${HDF5_INCLUDE_DIRS})
list(APPEND Caffe_LINKER_LIBS PUBLIC ${HDF5_LIBRARIES} ${HDF5_HL_LIBRARIES})

# This code is taken from https://github.com/sh1r0/caffe-android-lib
if(USE_HDF5)
  find_package(HDF5 COMPONENTS HL REQUIRED)
  include_directories(SYSTEM ${HDF5_INCLUDE_DIRS} ${HDF5_HL_INCLUDE_DIR})
  list(APPEND Caffe_LINKER_LIBS ${HDF5_LIBRARIES} ${HDF5_HL_LIBRARIES})
  add_definitions(-DUSE_HDF5)
endif()

# ---[ LMDB
if(USE_LMDB)
  find_package(LMDB REQUIRED)
  list(APPEND Caffe_INCLUDE_DIRS PUBLIC ${LMDB_INCLUDE_DIR})
  list(APPEND Caffe_LINKER_LIBS PUBLIC ${LMDB_LIBRARIES})
  list(APPEND Caffe_DEFINITIONS PUBLIC -DUSE_LMDB)
  if(ALLOW_LMDB_NOLOCK)
    list(APPEND Caffe_DEFINITIONS PRIVATE -DALLOW_LMDB_NOLOCK)
  endif()
endif()

# ---[ LevelDB
if(USE_LEVELDB)
  find_package(LevelDB REQUIRED)
  list(APPEND Caffe_INCLUDE_DIRS PUBLIC ${LevelDB_INCLUDES})
  list(APPEND Caffe_LINKER_LIBS PUBLIC ${LevelDB_LIBRARIES})
  list(APPEND Caffe_DEFINITIONS PUBLIC -DUSE_LEVELDB)
endif()

# ---[ Snappy
if(USE_LEVELDB)
  find_package(Snappy REQUIRED)
  list(APPEND Caffe_INCLUDE_DIRS PRIVATE ${Snappy_INCLUDE_DIR})
  list(APPEND Caffe_LINKER_LIBS PRIVATE ${Snappy_LIBRARIES})
endif()

# ---[ CUDA
include(cmake/Cuda.cmake)
if(NOT HAVE_CUDA)
  if(CPU_ONLY)
    message(STATUS "-- CUDA is disabled. Building without it...")
  else()
    message(WARNING "-- CUDA is not detected by cmake. Building without it...")
  endif()

  list(APPEND Caffe_DEFINITIONS PUBLIC -DCPU_ONLY)
endif()

if(USE_NCCL)
  find_package(NCCL REQUIRED)
  include_directories(SYSTEM ${NCCL_INCLUDE_DIR})
  list(APPEND Caffe_LINKER_LIBS ${NCCL_LIBRARIES})
  add_definitions(-DUSE_NCCL)
endif()

# ---[ OpenCV
if(USE_OPENCV)
  find_package(OpenCV QUIET COMPONENTS core highgui imgproc imgcodecs)
  if(NOT OpenCV_FOUND)  # if not OpenCV 3.x, then imgcodecs are not found
    find_package(OpenCV REQUIRED COMPONENTS core highgui imgproc)
  endif()
  list(APPEND Caffe_INCLUDE_DIRS PUBLIC ${OpenCV_INCLUDE_DIRS})
  list(APPEND Caffe_LINKER_LIBS PUBLIC ${OpenCV_LIBS})
  message(STATUS "OpenCV found (${OpenCV_CONFIG_PATH})")
  list(APPEND Caffe_DEFINITIONS PUBLIC -DUSE_OPENCV)
endif()

# ---[ BLAS
if(NOT APPLE)
  set(BLAS "Atlas" CACHE STRING "Selected BLAS library")
  set_property(CACHE BLAS PROPERTY STRINGS "Atlas;Open;MKL")

  if(BLAS STREQUAL "Atlas" OR BLAS STREQUAL "atlas")
    find_package(Atlas REQUIRED)
    list(APPEND Caffe_INCLUDE_DIRS PUBLIC ${Atlas_INCLUDE_DIR})
    list(APPEND Caffe_LINKER_LIBS PUBLIC ${Atlas_LIBRARIES})
  elseif(BLAS STREQUAL "Open" OR BLAS STREQUAL "open")
    find_package(OpenBLAS REQUIRED)
    list(APPEND Caffe_INCLUDE_DIRS PUBLIC ${OpenBLAS_INCLUDE_DIR})
    list(APPEND Caffe_LINKER_LIBS PUBLIC ${OpenBLAS_LIB})
  elseif(BLAS STREQUAL "MKL" OR BLAS STREQUAL "mkl")
    find_package(MKL REQUIRED)
    list(APPEND Caffe_INCLUDE_DIRS PUBLIC ${MKL_INCLUDE_DIR})
    list(APPEND Caffe_LINKER_LIBS PUBLIC ${MKL_LIBRARIES})
    list(APPEND Caffe_DEFINITIONS PUBLIC -DUSE_MKL)
  endif()
elseif(APPLE)
  find_package(vecLib REQUIRED)
  list(APPEND Caffe_INCLUDE_DIRS PUBLIC ${vecLib_INCLUDE_DIR})
  list(APPEND Caffe_LINKER_LIBS PUBLIC ${vecLib_LINKER_LIBS})

  if(VECLIB_FOUND)
    if(NOT vecLib_INCLUDE_DIR MATCHES "^/System/Library/Frameworks/vecLib.framework.*")
      list(APPEND Caffe_DEFINITIONS PUBLIC -DUSE_ACCELERATE)
    endif()
  endif()
endif()

# ---[ Python
if(BUILD_python)
  if(NOT "${python_version}" VERSION_LESS "3.0.0")
    # use Python 3
    find_package(PythonInterp 3.0)
    find_package(PythonLibs 3.0)
    find_package(NumPy 1.7.1)
    # Find the matching Boost.Python implementation
    set(version ${PYTHONLIBS_VERSION_STRING})

    string(REGEX REPLACE "[^0-9]" "" boost_py_version ${version})
    find_package(Boost 1.46 COMPONENTS "python-py${boost_py_version}")
    set(Boost_PYTHON_FOUND ${Boost_PYTHON-PY${boost_py_version}_FOUND})

    while(NOT "${version}" STREQUAL "" AND NOT Boost_PYTHON_FOUND)
      string(REGEX REPLACE "([0-9.]+).[0-9]+" "\\1" version ${version})

      string(REGEX REPLACE "[^0-9]" "" boost_py_version ${version})
      find_package(Boost 1.46 COMPONENTS "python-py${boost_py_version}")
      set(Boost_PYTHON_FOUND ${Boost_PYTHON-PY${boost_py_version}_FOUND})

      string(REGEX MATCHALL "([0-9.]+).[0-9]+" has_more_version ${version})
      if("${has_more_version}" STREQUAL "")
        break()
      endif()
    endwhile()
    if(NOT Boost_PYTHON_FOUND)
      find_package(Boost 1.46 COMPONENTS python)
    endif()
  else()
    # disable Python 3 search
    find_package(PythonInterp 2.7)
    find_package(PythonLibs 2.7)
    find_package(NumPy 1.7.1)
    find_package(Boost 1.46 COMPONENTS python)
  endif()
  if(PYTHONLIBS_FOUND AND NUMPY_FOUND AND Boost_PYTHON_FOUND)
    set(HAVE_PYTHON TRUE)
    if(BUILD_python_layer)
      list(APPEND Caffe_DEFINITIONS PRIVATE -DWITH_PYTHON_LAYER)
      list(APPEND Caffe_INCLUDE_DIRS PRIVATE ${PYTHON_INCLUDE_DIRS} ${NUMPY_INCLUDE_DIR} PUBLIC ${Boost_INCLUDE_DIRS})
      list(APPEND Caffe_LINKER_LIBS PRIVATE ${PYTHON_LIBRARIES} PUBLIC ${Boost_LIBRARIES})
    endif()
  endif()
endif()

# ---[ Matlab
if(BUILD_matlab)
  find_package(MatlabMex)
  if(MATLABMEX_FOUND)
    set(HAVE_MATLAB TRUE)
  endif()

  # sudo apt-get install liboctave-dev
  find_program(Octave_compiler NAMES mkoctfile DOC "Octave C++ compiler")

  if(HAVE_MATLAB AND Octave_compiler)
    set(Matlab_build_mex_using "Matlab" CACHE STRING "Select Matlab or Octave if both detected")
    set_property(CACHE Matlab_build_mex_using PROPERTY STRINGS "Matlab;Octave")
  endif()
endif()

# ---[ Doxygen
if(BUILD_docs)
  find_package(Doxygen)
endif()

if(NOT __GFLAGS_INCLUDED)  # guard against multiple includes
  set(__GFLAGS_INCLUDED TRUE)

  # use the system-wide gflags if present
  find_package(GFlags)
  if(GFLAGS_FOUND)
    set(GFLAGS_EXTERNAL FALSE)
  else()
    # gflags will use pthreads if it's available on the system, so we must link with it
    find_package(Threads)

    # build directory
    set(gflags_PREFIX ${CMAKE_BINARY_DIR}/external/gflags-prefix)
    # install directory
    set(gflags_INSTALL ${CMAKE_BINARY_DIR}/external/gflags-install)

    # we build gflags statically, but want to link it into the caffe shared library;
    # this requires position-independent code
    if(UNIX)
      set(GFLAGS_EXTRA_COMPILER_FLAGS "-fPIC")
    endif()

    set(GFLAGS_CXX_FLAGS ${CMAKE_CXX_FLAGS} ${GFLAGS_EXTRA_COMPILER_FLAGS})
    set(GFLAGS_C_FLAGS ${CMAKE_C_FLAGS} ${GFLAGS_EXTRA_COMPILER_FLAGS})

    ExternalProject_Add(gflags
      PREFIX ${gflags_PREFIX}
      GIT_REPOSITORY "https://github.com/gflags/gflags.git"
      GIT_TAG "v2.1.2"
      UPDATE_COMMAND ""
      INSTALL_DIR ${gflags_INSTALL}
      CMAKE_ARGS -DCMAKE_BUILD_TYPE=${CMAKE_BUILD_TYPE}
                 -DCMAKE_INSTALL_PREFIX=${gflags_INSTALL}
                 -DBUILD_SHARED_LIBS=OFF
                 -DBUILD_STATIC_LIBS=ON
                 -DBUILD_PACKAGING=OFF
                 -DBUILD_TESTING=OFF
                 -DBUILD_NC_TESTS=OFF
                 -DBUILD_CONFIG_TESTS=OFF
                 -DINSTALL_HEADERS=ON
                 -DCMAKE_C_FLAGS=${GFLAGS_C_FLAGS}
                 -DCMAKE_CXX_FLAGS=${GFLAGS_CXX_FLAGS}
      LOG_DOWNLOAD 1
      LOG_INSTALL 1
      )

    set(GFLAGS_FOUND TRUE)
    set(GFLAGS_INCLUDE_DIRS ${gflags_INSTALL}/include)
    set(GFLAGS_LIBRARIES ${gflags_INSTALL}/lib/libgflags.a ${CMAKE_THREAD_LIBS_INIT})
    set(GFLAGS_LIBRARY_DIRS ${gflags_INSTALL}/lib)
    set(GFLAGS_EXTERNAL TRUE)

    list(APPEND external_project_dependencies gflags)
  endif()

endif()

# glog depends on gflags
include("cmake/External/gflags.cmake")

if(NOT __GLOG_INCLUDED)
  set(__GLOG_INCLUDED TRUE)

  # try the system-wide glog first
  find_package(Glog)
  if(GLOG_FOUND)
    set(GLOG_EXTERNAL FALSE)
  else()
    # fetch and build glog from GitHub

    # build directory
    set(glog_PREFIX ${CMAKE_BINARY_DIR}/external/glog-prefix)
    # install directory
    set(glog_INSTALL ${CMAKE_BINARY_DIR}/external/glog-install)

    # we build glog statically, but want to link it into the caffe shared library;
    # this requires position-independent code
    if(UNIX)
      set(GLOG_EXTRA_COMPILER_FLAGS "-fPIC")
    endif()

    set(GLOG_CXX_FLAGS ${CMAKE_CXX_FLAGS} ${GLOG_EXTRA_COMPILER_FLAGS})
    set(GLOG_C_FLAGS ${CMAKE_C_FLAGS} ${GLOG_EXTRA_COMPILER_FLAGS})

    # depend on gflags if we're also building it
    if(GFLAGS_EXTERNAL)
      set(GLOG_DEPENDS gflags)
    endif()

    ExternalProject_Add(glog
      DEPENDS ${GLOG_DEPENDS}
      PREFIX ${glog_PREFIX}
      GIT_REPOSITORY "https://github.com/google/glog"
      GIT_TAG "v0.3.4"
      UPDATE_COMMAND ""
      INSTALL_DIR ${glog_INSTALL}
      PATCH_COMMAND autoreconf -i ${glog_PREFIX}/src/glog
      CONFIGURE_COMMAND env "CFLAGS=${GLOG_C_FLAGS}" "CXXFLAGS=${GLOG_CXX_FLAGS}" ${glog_PREFIX}/src/glog/configure --prefix=${glog_INSTALL} --enable-shared=no --enable-static=yes --with-gflags=${GFLAGS_LIBRARY_DIRS}/..
      LOG_DOWNLOAD 1
      LOG_CONFIGURE 1
      LOG_INSTALL 1
      )

    set(GLOG_FOUND TRUE)
    set(GLOG_INCLUDE_DIRS ${glog_INSTALL}/include)
    set(GLOG_LIBRARIES ${GFLAGS_LIBRARIES} ${glog_INSTALL}/lib/libglog.a)
    set(GLOG_LIBRARY_DIRS ${glog_INSTALL}/lib)
    set(GLOG_EXTERNAL TRUE)

    list(APPEND external_project_dependencies glog)
  endif()

endif()

# ---[ Configuration types
set(CMAKE_CONFIGURATION_TYPES "Debug;Release" CACHE STRING "Possible configurations" FORCE)
mark_as_advanced(CMAKE_CONFIGURATION_TYPES)

if(DEFINED CMAKE_BUILD_TYPE)
  set_property(CACHE CMAKE_BUILD_TYPE PROPERTY STRINGS ${CMAKE_CONFIGURATION_TYPES})
endif()

# ---[ If the user doesn't specify a build type, assume Release
if("${CMAKE_BUILD_TYPE}" STREQUAL "")
  set(CMAKE_BUILD_TYPE Release)
endif()

if("${CMAKE_CXX_COMPILER_ID}" STREQUAL "Clang")
  set(CMAKE_COMPILER_IS_CLANGXX TRUE)
endif()

# ---[ Solution folders
caffe_option(USE_PROJECT_FOLDERS "IDE Solution folders" (MSVC_IDE OR CMAKE_GENERATOR MATCHES Xcode))

if(USE_PROJECT_FOLDERS)
  set_property(GLOBAL PROPERTY USE_FOLDERS ON)
  set_property(GLOBAL PROPERTY PREDEFINED_TARGETS_FOLDER "CMakeTargets")
endif()

# ---[ Install options
if(CMAKE_INSTALL_PREFIX_INITIALIZED_TO_DEFAULT)
  set(CMAKE_INSTALL_PREFIX "${PROJECT_BINARY_DIR}/install" CACHE PATH "Default install path" FORCE)
endif()

# ---[ RPATH settings
set(CMAKE_INSTALL_RPATH_USE_LINK_PATH TRUE CACHE BOOLEAN "Use link paths for shared library rpath")
set(CMAKE_MACOSX_RPATH TRUE)

list(FIND CMAKE_PLATFORM_IMPLICIT_LINK_DIRECTORIES
     ${CMAKE_INSTALL_PREFIX}/${CMAKE_INSTALL_LIBDIR} __is_system_dir)
if(${__is_system_dir} STREQUAL -1)
  set(CMAKE_INSTALL_RPATH ${CMAKE_INSTALL_PREFIX}/${CMAKE_INSTALL_LIBDIR})
endif()

# ---[ Funny target
if(UNIX OR APPLE)
  add_custom_target(symlink_to_build COMMAND "ln" "-sf" "${PROJECT_BINARY_DIR}" "${PROJECT_SOURCE_DIR}/build"
                    COMMENT "Adding symlink: <caffe_root>/build -> ${PROJECT_BINARY_DIR}")
endif()

# ---[ Set debug postfix
set(Caffe_DEBUG_POSTFIX "-d")

set(Caffe_POSTFIX "")
if(CMAKE_BUILD_TYPE MATCHES "Debug")
  set(Caffe_POSTFIX ${Caffe_DEBUG_POSTFIX})
endif()

# Find the Atlas (and Lapack) libraries
#
# The following variables are optionally searched for defaults
#  Atlas_ROOT_DIR: Base directory where all Atlas components are found
#
# The following are set after configuration is done:
#  Atlas_FOUND
#  Atlas_INCLUDE_DIRS
#  Atlas_LIBRARIES
#  Atlas_LIBRARY_DIRS

set(Atlas_INCLUDE_SEARCH_PATHS
  /usr/include/atlas
  /usr/include/atlas-base
  $ENV{Atlas_ROOT_DIR}
  $ENV{Atlas_ROOT_DIR}/include
)

set(Atlas_LIB_SEARCH_PATHS
  /usr/lib/atlas
  /usr/lib/atlas-base
  $ENV{Atlas_ROOT_DIR}
  $ENV{Atlas_ROOT_DIR}/lib
)

find_path(Atlas_CBLAS_INCLUDE_DIR NAMES cblas.h PATHS ${Atlas_INCLUDE_SEARCH_PATHS})
find_path(Atlas_CLAPACK_INCLUDE_DIR NAMES clapack.h PATHS ${Atlas_INCLUDE_SEARCH_PATHS})

find_library(Atlas_CBLAS_LIBRARY NAMES ptcblas_r ptcblas cblas_r cblas PATHS ${Atlas_LIB_SEARCH_PATHS})
find_library(Atlas_BLAS_LIBRARY NAMES atlas_r atlas PATHS ${Atlas_LIB_SEARCH_PATHS})
find_library(Atlas_LAPACK_LIBRARY NAMES lapack alapack_r alapack lapack_atlas atllapack PATHS ${Atlas_LIB_SEARCH_PATHS})

set(LOOKED_FOR
  Atlas_CBLAS_INCLUDE_DIR
  Atlas_CLAPACK_INCLUDE_DIR

  Atlas_CBLAS_LIBRARY
  Atlas_BLAS_LIBRARY
  Atlas_LAPACK_LIBRARY
)

include(FindPackageHandleStandardArgs)
find_package_handle_standard_args(Atlas DEFAULT_MSG ${LOOKED_FOR})

if(ATLAS_FOUND)
  set(Atlas_INCLUDE_DIR ${Atlas_CBLAS_INCLUDE_DIR} ${Atlas_CLAPACK_INCLUDE_DIR})
  set(Atlas_LIBRARIES ${Atlas_LAPACK_LIBRARY} ${Atlas_CBLAS_LIBRARY} ${Atlas_BLAS_LIBRARY})
  mark_as_advanced(${LOOKED_FOR})

  message(STATUS "Found Atlas (include: ${Atlas_CBLAS_INCLUDE_DIR}, library: ${Atlas_BLAS_LIBRARY}, lapack: ${Atlas_LAPACK_LIBRARY})")
endif(ATLAS_FOUND)

# - Try to find GFLAGS
#
# The following variables are optionally searched for defaults
#  GFLAGS_ROOT_DIR: Base directory where all GFLAGS components are found
#
# The following are set after configuration is done:
#  GFLAGS_FOUND
#  GFLAGS_INCLUDE_DIRS
#  GFLAGS_LIBRARIES
#  GFLAGS_LIBRARY_DIRS

include(FindPackageHandleStandardArgs)

set(GFLAGS_ROOT_DIR "" CACHE PATH "Folder containing Gflags")

# We are testing only a couple of files in the include directories
if(WIN32)
  find_path(GFLAGS_INCLUDE_DIR gflags/gflags.h
            PATHS ${GFLAGS_ROOT_DIR}/src/windows)
else()
  find_path(GFLAGS_INCLUDE_DIR gflags/gflags.h
            PATHS ${GFLAGS_ROOT_DIR})
endif()

if(MSVC)
  find_library(GFLAGS_LIBRARY_RELEASE
               NAMES libgflags
               PATHS ${GFLAGS_ROOT_DIR}
               PATH_SUFFIXES Release)

  find_library(GFLAGS_LIBRARY_DEBUG
               NAMES libgflags-debug
               PATHS ${GFLAGS_ROOT_DIR}
               PATH_SUFFIXES Debug)

  set(GFLAGS_LIBRARY optimized ${GFLAGS_LIBRARY_RELEASE} debug ${GFLAGS_LIBRARY_DEBUG})
else()
  find_library(GFLAGS_LIBRARY gflags)
endif()

find_package_handle_standard_args(GFlags DEFAULT_MSG GFLAGS_INCLUDE_DIR GFLAGS_LIBRARY)


if(GFLAGS_FOUND)
  set(GFLAGS_INCLUDE_DIRS ${GFLAGS_INCLUDE_DIR})
  set(GFLAGS_LIBRARIES ${GFLAGS_LIBRARY})
  message(STATUS "Found gflags (include: ${GFLAGS_INCLUDE_DIR}, library: ${GFLAGS_LIBRARY})")
  mark_as_advanced(GFLAGS_LIBRARY_DEBUG GFLAGS_LIBRARY_RELEASE
                   GFLAGS_LIBRARY GFLAGS_INCLUDE_DIR GFLAGS_ROOT_DIR)
endif()

# - Try to find Glog
#
# The following variables are optionally searched for defaults
#  GLOG_ROOT_DIR: Base directory where all GLOG components are found
#
# The following are set after configuration is done:
#  GLOG_FOUND
#  GLOG_INCLUDE_DIRS
#  GLOG_LIBRARIES
#  GLOG_LIBRARY_DIRS

include(FindPackageHandleStandardArgs)

set(GLOG_ROOT_DIR "" CACHE PATH "Folder containing Google glog")

if(WIN32)
  find_path(GLOG_INCLUDE_DIR glog/logging.h
            PATHS ${GLOG_ROOT_DIR}/src/windows)
else()
  find_path(GLOG_INCLUDE_DIR glog/logging.h
            PATHS ${GLOG_ROOT_DIR})
endif()

if(MSVC)
  find_library(GLOG_LIBRARY_RELEASE libglog_static
               PATHS ${GLOG_ROOT_DIR}
               PATH_SUFFIXES Release)

  find_library(GLOG_LIBRARY_DEBUG libglog_static
               PATHS ${GLOG_ROOT_DIR}
               PATH_SUFFIXES Debug)

  set(GLOG_LIBRARY optimized ${GLOG_LIBRARY_RELEASE} debug ${GLOG_LIBRARY_DEBUG})
else()
  find_library(GLOG_LIBRARY glog
               PATHS ${GLOG_ROOT_DIR}
               PATH_SUFFIXES lib lib64)
endif()

find_package_handle_standard_args(Glog DEFAULT_MSG GLOG_INCLUDE_DIR GLOG_LIBRARY)

if(GLOG_FOUND)
  set(GLOG_INCLUDE_DIRS ${GLOG_INCLUDE_DIR})
  set(GLOG_LIBRARIES ${GLOG_LIBRARY})
  message(STATUS "Found glog (include: ${GLOG_INCLUDE_DIR}, library: ${GLOG_LIBRARY})")
  mark_as_advanced(GLOG_ROOT_DIR GLOG_LIBRARY_RELEASE GLOG_LIBRARY_DEBUG
                   GLOG_LIBRARY GLOG_INCLUDE_DIR)
endif()
@ -1,190 +0,0 @@
# - Find LAPACK library
# This module finds an installed fortran library that implements the LAPACK
# linear-algebra interface (see http://www.netlib.org/lapack/).
#
# The approach follows that taken for the autoconf macro file, acx_lapack.m4
# (distributed at http://ac-archive.sourceforge.net/ac-archive/acx_lapack.html).
#
# This module sets the following variables:
#   LAPACK_FOUND - set to true if a library implementing the LAPACK interface is found
#   LAPACK_LIBRARIES - list of libraries (using full path name) for LAPACK

# Note: mixing up different BLAS/LAPACK versions is not a good idea, so this
# script looks for a LAPACK library that matches your BLAS library.

# Do nothing if LAPACK was found before
IF(NOT LAPACK_FOUND)

SET(LAPACK_LIBRARIES)
SET(LAPACK_INFO)

IF(LAPACK_FIND_QUIETLY OR NOT LAPACK_FIND_REQUIRED)
  FIND_PACKAGE(BLAS)
ELSE(LAPACK_FIND_QUIETLY OR NOT LAPACK_FIND_REQUIRED)
  FIND_PACKAGE(BLAS REQUIRED)
ENDIF(LAPACK_FIND_QUIETLY OR NOT LAPACK_FIND_REQUIRED)

# Old search lapack script
include(CheckFortranFunctionExists)

macro(Check_Lapack_Libraries LIBRARIES _prefix _name _flags _list _blas)
  # This macro checks for the existence of the combination of fortran libraries
  # given by _list. If the combination is found, this macro checks (using the
  # Check_Fortran_Function_Exists macro) whether we can link against that
  # library combination using the name of a routine given by _name and the
  # linker flags given by _flags. If the combination of libraries is found and
  # passes the link test, LIBRARIES is set to the list of complete library
  # paths that have been found. Otherwise, LIBRARIES is set to FALSE.
  # N.B. _prefix is the prefix applied to the names of all cached variables that
  # are generated internally and marked advanced by this macro.
  set(_libraries_work TRUE)
  set(${LIBRARIES})
  set(_combined_name)
  foreach(_library ${_list})
    set(_combined_name ${_combined_name}_${_library})
    if(_libraries_work)
      if(WIN32)
        find_library(${_prefix}_${_library}_LIBRARY
          NAMES ${_library} PATHS ENV LIB PATHS ENV PATH)
      else(WIN32)
        if(APPLE)
          find_library(${_prefix}_${_library}_LIBRARY
            NAMES ${_library}
            PATHS /usr/local/lib /usr/lib /usr/local/lib64 /usr/lib64
            ENV DYLD_LIBRARY_PATH)
        else(APPLE)
          find_library(${_prefix}_${_library}_LIBRARY
            NAMES ${_library}
            PATHS /usr/local/lib /usr/lib /usr/local/lib64 /usr/lib64
            ENV LD_LIBRARY_PATH)
        endif(APPLE)
      endif(WIN32)
      mark_as_advanced(${_prefix}_${_library}_LIBRARY)
      set(${LIBRARIES} ${${LIBRARIES}} ${${_prefix}_${_library}_LIBRARY})
      set(_libraries_work ${${_prefix}_${_library}_LIBRARY})
    endif(_libraries_work)
  endforeach(_library ${_list})
  if(_libraries_work)
    # Test this combination of libraries.
    set(CMAKE_REQUIRED_LIBRARIES ${_flags} ${${LIBRARIES}} ${_blas})
    if(CMAKE_Fortran_COMPILER_WORKS)
      check_fortran_function_exists(${_name} ${_prefix}${_combined_name}_WORKS)
    else(CMAKE_Fortran_COMPILER_WORKS)
      check_function_exists("${_name}_" ${_prefix}${_combined_name}_WORKS)
    endif(CMAKE_Fortran_COMPILER_WORKS)
    set(CMAKE_REQUIRED_LIBRARIES)
    mark_as_advanced(${_prefix}${_combined_name}_WORKS)
    set(_libraries_work ${${_prefix}${_combined_name}_WORKS})
  endif(_libraries_work)
  if(NOT _libraries_work)
    set(${LIBRARIES} FALSE)
  endif(NOT _libraries_work)
endmacro(Check_Lapack_Libraries)


if(BLAS_FOUND)

  # Intel MKL
  IF((NOT LAPACK_INFO) AND (BLAS_INFO STREQUAL "mkl"))
    IF(MKL_LAPACK_LIBRARIES)
      SET(LAPACK_LIBRARIES ${MKL_LAPACK_LIBRARIES} ${MKL_LIBRARIES})
    ELSE(MKL_LAPACK_LIBRARIES)
      SET(LAPACK_LIBRARIES ${MKL_LIBRARIES})
    ENDIF(MKL_LAPACK_LIBRARIES)
    SET(LAPACK_INCLUDE_DIR ${MKL_INCLUDE_DIR})
    SET(LAPACK_INFO "mkl")
  ENDIF()

  # OpenBLAS
  IF((NOT LAPACK_INFO) AND (BLAS_INFO STREQUAL "open"))
    SET(CMAKE_REQUIRED_LIBRARIES ${BLAS_LIBRARIES})
    check_function_exists("cheev_" OPEN_LAPACK_WORKS)
    if(OPEN_LAPACK_WORKS)
      SET(LAPACK_INFO "open")
    else()
      message(STATUS "It seems OpenBLAS has not been compiled with LAPACK support")
    endif()
  endif()

  # GotoBlas
  IF((NOT LAPACK_INFO) AND (BLAS_INFO STREQUAL "goto"))
    SET(CMAKE_REQUIRED_LIBRARIES ${BLAS_LIBRARIES})
    check_function_exists("cheev_" GOTO_LAPACK_WORKS)
    if(GOTO_LAPACK_WORKS)
      SET(LAPACK_INFO "goto")
    else()
      message(STATUS "It seems GotoBlas has not been compiled with LAPACK support")
    endif()
  endif()

  # ACML
  IF((NOT LAPACK_INFO) AND (BLAS_INFO STREQUAL "acml"))
    SET(CMAKE_REQUIRED_LIBRARIES ${BLAS_LIBRARIES})
    check_function_exists("cheev_" ACML_LAPACK_WORKS)
    if(ACML_LAPACK_WORKS)
      SET(LAPACK_INFO "acml")
    else()
      message(STATUS "Strangely, this ACML library does not support LAPACK?!")
    endif()
  endif()

  # Accelerate
  IF((NOT LAPACK_INFO) AND (BLAS_INFO STREQUAL "accelerate"))
    SET(CMAKE_REQUIRED_LIBRARIES ${BLAS_LIBRARIES})
    check_function_exists("cheev_" ACCELERATE_LAPACK_WORKS)
    if(ACCELERATE_LAPACK_WORKS)
      SET(LAPACK_INFO "accelerate")
    else()
      message(STATUS "Strangely, this Accelerate library does not support LAPACK?!")
    endif()
  endif()

  # vecLib
  IF((NOT LAPACK_INFO) AND (BLAS_INFO STREQUAL "veclib"))
    SET(CMAKE_REQUIRED_LIBRARIES ${BLAS_LIBRARIES})
    check_function_exists("cheev_" VECLIB_LAPACK_WORKS)
    if(VECLIB_LAPACK_WORKS)
      SET(LAPACK_INFO "veclib")
    else()
      message(STATUS "Strangely, this vecLib library does not support LAPACK?!")
    endif()
  endif()

  # Generic LAPACK library?
  IF((NOT LAPACK_INFO) AND (BLAS_INFO STREQUAL "generic"))
    check_lapack_libraries(
      LAPACK_LIBRARIES
      LAPACK
      cheev
      ""
      "lapack"
      "${BLAS_LIBRARIES}"
      )
    if(LAPACK_LIBRARIES)
      SET(LAPACK_INFO "generic")
    endif(LAPACK_LIBRARIES)
  endif()

else(BLAS_FOUND)
  message(STATUS "LAPACK requires BLAS")
endif(BLAS_FOUND)

if(LAPACK_INFO)
  set(LAPACK_FOUND TRUE)
else(LAPACK_INFO)
  set(LAPACK_FOUND FALSE)
endif(LAPACK_INFO)

IF(NOT LAPACK_FOUND AND LAPACK_FIND_REQUIRED)
  message(FATAL_ERROR "Cannot find a library with LAPACK API. Please specify library location.")
ENDIF(NOT LAPACK_FOUND AND LAPACK_FIND_REQUIRED)
IF(NOT LAPACK_FIND_QUIETLY)
  IF(LAPACK_FOUND)
    MESSAGE(STATUS "Found a library with LAPACK API. (${LAPACK_INFO})")
  ELSE(LAPACK_FOUND)
    MESSAGE(STATUS "Cannot find a library with LAPACK API. Not using LAPACK.")
  ENDIF(LAPACK_FOUND)
ENDIF(NOT LAPACK_FIND_QUIETLY)

# Do nothing if LAPACK was found before
ENDIF(NOT LAPACK_FOUND)
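The Check_Lapack_Libraries macro can also be invoked directly for a custom probe. A hedged sketch, where the output prefix `MYPROBE` and the `dsyev` routine are chosen purely for illustration:

```cmake
# Probe whether liblapack plus the detected BLAS expose the dsyev routine.
check_lapack_libraries(
  MY_LAPACK_LIBRARIES   # output: full library paths, or FALSE on failure
  MYPROBE               # prefix for the cached *_LIBRARY/*_WORKS variables
  dsyev                 # routine whose presence is link-tested
  ""                    # extra linker flags
  "lapack"              # library base names to locate
  "${BLAS_LIBRARIES}"   # BLAS libraries appended to the link line
  )
```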
@ -1,28 +0,0 @@
# Try to find the LMDB libraries and headers
#  LMDB_FOUND - system has LMDB lib
#  LMDB_INCLUDE_DIR - the LMDB include directory
#  LMDB_LIBRARIES - Libraries needed to use LMDB

# FindCWD based on FindGMP by:
# Copyright (c) 2006, Laurent Montel, <montel@kde.org>
#
# Redistribution and use is allowed according to the terms of the BSD license.

# Adapted from FindCWD by:
# Copyright 2013 Conrad Steenberg <conrad.steenberg@gmail.com>
# Aug 31, 2013

find_path(LMDB_INCLUDE_DIR NAMES lmdb.h PATHS "$ENV{LMDB_DIR}/include")
find_library(LMDB_LIBRARIES NAMES lmdb PATHS "$ENV{LMDB_DIR}/lib")

include(FindPackageHandleStandardArgs)
find_package_handle_standard_args(LMDB DEFAULT_MSG LMDB_INCLUDE_DIR LMDB_LIBRARIES)

if(LMDB_FOUND)
  message(STATUS "Found lmdb (include: ${LMDB_INCLUDE_DIR}, library: ${LMDB_LIBRARIES})")
  mark_as_advanced(LMDB_INCLUDE_DIR LMDB_LIBRARIES)

  caffe_parse_header(${LMDB_INCLUDE_DIR}/lmdb.h
                     LMDB_VERSION_LINES MDB_VERSION_MAJOR MDB_VERSION_MINOR MDB_VERSION_PATCH)
  set(LMDB_VERSION "${MDB_VERSION_MAJOR}.${MDB_VERSION_MINOR}.${MDB_VERSION_PATCH}")
endif()
@ -1,44 +0,0 @@
# - Find LevelDB
#
#  LevelDB_INCLUDES  - List of LevelDB includes
#  LevelDB_LIBRARIES - List of libraries when using LevelDB.
#  LevelDB_FOUND     - True if LevelDB found.

# Look for the header file.
find_path(LevelDB_INCLUDE NAMES leveldb/db.h
          PATHS $ENV{LEVELDB_ROOT}/include /opt/local/include /usr/local/include /usr/include
          DOC "Path in which the file leveldb/db.h is located.")

# Look for the library.
find_library(LevelDB_LIBRARY NAMES leveldb
             PATHS /usr/lib $ENV{LEVELDB_ROOT}/lib
             DOC "Path to leveldb library.")

include(FindPackageHandleStandardArgs)
find_package_handle_standard_args(LevelDB DEFAULT_MSG LevelDB_INCLUDE LevelDB_LIBRARY)

if(LEVELDB_FOUND)
  message(STATUS "Found LevelDB (include: ${LevelDB_INCLUDE}, library: ${LevelDB_LIBRARY})")
  set(LevelDB_INCLUDES ${LevelDB_INCLUDE})
  set(LevelDB_LIBRARIES ${LevelDB_LIBRARY})
  mark_as_advanced(LevelDB_INCLUDE LevelDB_LIBRARY)

  if(EXISTS "${LevelDB_INCLUDE}/leveldb/db.h")
    file(STRINGS "${LevelDB_INCLUDE}/leveldb/db.h" __version_lines
         REGEX "static const int k[^V]+Version[ \t]+=[ \t]+[0-9]+;")

    foreach(__line ${__version_lines})
      if(__line MATCHES "[^k]+kMajorVersion[ \t]+=[ \t]+([0-9]+);")
        set(LEVELDB_VERSION_MAJOR ${CMAKE_MATCH_1})
      elseif(__line MATCHES "[^k]+kMinorVersion[ \t]+=[ \t]+([0-9]+);")
        set(LEVELDB_VERSION_MINOR ${CMAKE_MATCH_1})
      endif()
    endforeach()

    if(LEVELDB_VERSION_MAJOR AND LEVELDB_VERSION_MINOR)
      set(LEVELDB_VERSION "${LEVELDB_VERSION_MAJOR}.${LEVELDB_VERSION_MINOR}")
    endif()

    caffe_clear_vars(__line __version_lines)
  endif()
endif()
@ -1,110 +0,0 @@
# Find the MKL libraries
#
# Options:
#
#   MKL_USE_SINGLE_DYNAMIC_LIBRARY : use single dynamic library interface
#   MKL_USE_STATIC_LIBS            : use static libraries
#   MKL_MULTI_THREADED             : use multi-threading
#
# This module defines the following variables:
#
#   MKL_FOUND       : True if MKL is found
#   MKL_INCLUDE_DIR : include directory
#   MKL_LIBRARIES   : the libraries to link against.


# ---[ Options
caffe_option(MKL_USE_SINGLE_DYNAMIC_LIBRARY "Use single dynamic library interface" ON)
caffe_option(MKL_USE_STATIC_LIBS "Use static libraries" OFF IF NOT MKL_USE_SINGLE_DYNAMIC_LIBRARY)
caffe_option(MKL_MULTI_THREADED "Use multi-threading" ON IF NOT MKL_USE_SINGLE_DYNAMIC_LIBRARY)

# ---[ Root folders
set(INTEL_ROOT "/opt/intel" CACHE PATH "Folder containing the Intel libraries")
find_path(MKL_ROOT include/mkl.h PATHS $ENV{MKLROOT} ${INTEL_ROOT}/mkl
          DOC "Folder containing MKL")

# ---[ Find include dir
find_path(MKL_INCLUDE_DIR mkl.h PATHS ${MKL_ROOT} PATH_SUFFIXES include)
set(__looked_for MKL_INCLUDE_DIR)

# ---[ Find libraries
if(CMAKE_SIZEOF_VOID_P EQUAL 4)
  set(__path_suffixes lib lib/ia32)
else()
  set(__path_suffixes lib lib/intel64)
endif()

set(__mkl_libs "")
if(MKL_USE_SINGLE_DYNAMIC_LIBRARY)
  list(APPEND __mkl_libs rt)
else()
  if(CMAKE_SIZEOF_VOID_P EQUAL 4)
    if(WIN32)
      list(APPEND __mkl_libs intel_c)
    else()
      list(APPEND __mkl_libs intel gf)
    endif()
  else()
    list(APPEND __mkl_libs intel_lp64 gf_lp64)
  endif()

  if(MKL_MULTI_THREADED)
    list(APPEND __mkl_libs intel_thread)
  else()
    list(APPEND __mkl_libs sequential)
  endif()

  list(APPEND __mkl_libs core cdft_core)
endif()


foreach(__lib ${__mkl_libs})
  set(__mkl_lib "mkl_${__lib}")
  string(TOUPPER ${__mkl_lib} __mkl_lib_upper)

  if(MKL_USE_STATIC_LIBS)
    set(__mkl_lib "lib${__mkl_lib}.a")
  endif()

  find_library(${__mkl_lib_upper}_LIBRARY
    NAMES ${__mkl_lib}
    PATHS ${MKL_ROOT} "${MKL_INCLUDE_DIR}/.."
    PATH_SUFFIXES ${__path_suffixes}
    DOC "The path to Intel(R) MKL ${__mkl_lib} library")
  mark_as_advanced(${__mkl_lib_upper}_LIBRARY)

  list(APPEND __looked_for ${__mkl_lib_upper}_LIBRARY)
  list(APPEND MKL_LIBRARIES ${${__mkl_lib_upper}_LIBRARY})
endforeach()


if(NOT MKL_USE_SINGLE_DYNAMIC_LIBRARY)
  if(MKL_USE_STATIC_LIBS)
    set(__iomp5_libs iomp5 libiomp5mt.lib)
  else()
    set(__iomp5_libs iomp5 libiomp5md.lib)
  endif()

  if(WIN32)
    find_path(INTEL_INCLUDE_DIR omp.h PATHS ${INTEL_ROOT} PATH_SUFFIXES include)
    list(APPEND __looked_for INTEL_INCLUDE_DIR)
  endif()

  find_library(MKL_RTL_LIBRARY ${__iomp5_libs}
    PATHS ${INTEL_RTL_ROOT} ${INTEL_ROOT}/compiler ${MKL_ROOT}/.. ${MKL_ROOT}/../compiler
    PATH_SUFFIXES ${__path_suffixes}
    DOC "Path to the OpenMP runtime library")

  list(APPEND __looked_for MKL_RTL_LIBRARY)
  list(APPEND MKL_LIBRARIES ${MKL_RTL_LIBRARY})
endif()


include(FindPackageHandleStandardArgs)
find_package_handle_standard_args(MKL DEFAULT_MSG ${__looked_for})

if(MKL_FOUND)
  message(STATUS "Found MKL (include: ${MKL_INCLUDE_DIR}, lib: ${MKL_LIBRARIES})")
endif()

caffe_clear_vars(__looked_for __mkl_libs __path_suffixes __lib_suffix __iomp5_libs)
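A consumer can steer the module through its cache options before the find runs. A minimal sketch, assuming a project that prefers explicit static, multi-threaded MKL over the single dynamic library (the option values and the consuming target are illustrative):

```cmake
set(MKL_USE_SINGLE_DYNAMIC_LIBRARY OFF CACHE BOOL "" FORCE)
set(MKL_USE_STATIC_LIBS ON CACHE BOOL "" FORCE)
set(MKL_MULTI_THREADED ON CACHE BOOL "" FORCE)
find_package(MKL)
if(MKL_FOUND)
  include_directories(${MKL_INCLUDE_DIR})
  target_link_libraries(demo ${MKL_LIBRARIES})  # 'demo' is a hypothetical target
endif()
```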
@ -1,48 +0,0 @@
# This module looks for the MATLAB mex compiler
# Defines variables:
#   Matlab_DIR    - MATLAB root dir
#   Matlab_mex    - path to the mex compiler
#   Matlab_mexext - path to mexext

if(MSVC)
  foreach(__ver "9.30" "7.14" "7.11" "7.10" "7.9" "7.8" "7.7")
    get_filename_component(__matlab_root "[HKEY_LOCAL_MACHINE\\SOFTWARE\\MathWorks\\MATLAB\\${__ver};MATLABROOT]" ABSOLUTE)
    if(__matlab_root)
      break()
    endif()
  endforeach()
endif()

if(APPLE)
  foreach(__ver "R2014b" "R2014a" "R2013b" "R2013a" "R2012b" "R2012a" "R2011b" "R2011a" "R2010b" "R2010a")
    if(EXISTS /Applications/MATLAB_${__ver}.app)
      set(__matlab_root /Applications/MATLAB_${__ver}.app)
      break()
    endif()
  endforeach()
endif()

if(UNIX)
  execute_process(COMMAND which matlab OUTPUT_STRIP_TRAILING_WHITESPACE
                  OUTPUT_VARIABLE __out RESULT_VARIABLE __res)

  if(__res MATCHES 0) # Suppress `readlink` warning if `which` returned nothing
    execute_process(COMMAND which matlab COMMAND xargs readlink
                    COMMAND xargs dirname COMMAND xargs dirname COMMAND xargs echo -n
                    OUTPUT_VARIABLE __matlab_root OUTPUT_STRIP_TRAILING_WHITESPACE)
  endif()
endif()


find_path(Matlab_DIR NAMES bin/mex bin/mexext PATHS ${__matlab_root}
          DOC "Matlab directory" NO_DEFAULT_PATH)

find_program(Matlab_mex NAMES mex mex.bat HINTS ${Matlab_DIR} PATH_SUFFIXES bin NO_DEFAULT_PATH)
find_program(Matlab_mexext NAMES mexext mexext.bat HINTS ${Matlab_DIR} PATH_SUFFIXES bin NO_DEFAULT_PATH)

include(FindPackageHandleStandardArgs)
find_package_handle_standard_args(MatlabMex DEFAULT_MSG Matlab_mex Matlab_mexext)

if(MATLABMEX_FOUND)
  mark_as_advanced(Matlab_mex Matlab_mexext)
endif()
@ -1,26 +0,0 @@
set(NCCL_INC_PATHS
    /usr/include
    /usr/local/include
    $ENV{NCCL_DIR}/include
    )

set(NCCL_LIB_PATHS
    /lib
    /lib64
    /usr/lib
    /usr/lib64
    /usr/local/lib
    /usr/local/lib64
    $ENV{NCCL_DIR}/lib
    )

find_path(NCCL_INCLUDE_DIR NAMES nccl.h PATHS ${NCCL_INC_PATHS})
find_library(NCCL_LIBRARIES NAMES nccl PATHS ${NCCL_LIB_PATHS})

include(FindPackageHandleStandardArgs)
find_package_handle_standard_args(NCCL DEFAULT_MSG NCCL_INCLUDE_DIR NCCL_LIBRARIES)

if(NCCL_FOUND)
  message(STATUS "Found NCCL (include: ${NCCL_INCLUDE_DIR}, library: ${NCCL_LIBRARIES})")
  mark_as_advanced(NCCL_INCLUDE_DIR NCCL_LIBRARIES)
endif()
@ -1,58 +0,0 @@
# - Find the NumPy libraries
# This module finds if NumPy is installed, and sets the following variables
# indicating where it is.
#
# TODO: Update to provide the libraries and paths for linking npymath lib.
#
#  NUMPY_FOUND           - was NumPy found
#  NUMPY_VERSION         - the version of NumPy found as a string
#  NUMPY_VERSION_MAJOR   - the major version number of NumPy
#  NUMPY_VERSION_MINOR   - the minor version number of NumPy
#  NUMPY_VERSION_PATCH   - the patch version number of NumPy
#  NUMPY_VERSION_DECIMAL - e.g. version 1.6.1 is 10601
#  NUMPY_INCLUDE_DIR     - path to the NumPy include files

unset(NUMPY_VERSION)
unset(NUMPY_INCLUDE_DIR)

if(PYTHONINTERP_FOUND)
  execute_process(COMMAND "${PYTHON_EXECUTABLE}" "-c"
                  "import numpy as n; print(n.__version__); print(n.get_include());"
                  RESULT_VARIABLE __result
                  OUTPUT_VARIABLE __output
                  OUTPUT_STRIP_TRAILING_WHITESPACE)

  if(__result MATCHES 0)
    string(REGEX REPLACE ";" "\\\\;" __values ${__output})
    string(REGEX REPLACE "\r?\n" ";" __values ${__values})
    list(GET __values 0 NUMPY_VERSION)
    list(GET __values 1 NUMPY_INCLUDE_DIR)

    string(REGEX MATCH "^([0-9])+\\.([0-9])+\\.([0-9])+" __ver_check "${NUMPY_VERSION}")
    if(NOT "${__ver_check}" STREQUAL "")
      set(NUMPY_VERSION_MAJOR ${CMAKE_MATCH_1})
      set(NUMPY_VERSION_MINOR ${CMAKE_MATCH_2})
      set(NUMPY_VERSION_PATCH ${CMAKE_MATCH_3})
      math(EXPR NUMPY_VERSION_DECIMAL
           "(${NUMPY_VERSION_MAJOR} * 10000) + (${NUMPY_VERSION_MINOR} * 100) + ${NUMPY_VERSION_PATCH}")
      string(REGEX REPLACE "\\\\" "/" NUMPY_INCLUDE_DIR ${NUMPY_INCLUDE_DIR})
    else()
      unset(NUMPY_VERSION)
      unset(NUMPY_INCLUDE_DIR)
      message(STATUS "Requested NumPy version and include path, but got instead:\n${__output}\n")
    endif()
  endif()
else()
  message(STATUS "A Python interpreter is required to find NumPy.")
endif()

include(FindPackageHandleStandardArgs)
find_package_handle_standard_args(NumPy REQUIRED_VARS NUMPY_INCLUDE_DIR NUMPY_VERSION
                                  VERSION_VAR NUMPY_VERSION)

if(NUMPY_FOUND)
  message(STATUS "NumPy ver. ${NUMPY_VERSION} found (include: ${NUMPY_INCLUDE_DIR})")
endif()

caffe_clear_vars(__result __output __error_value __values __ver_check)
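The decimal encoding computed by the `math(EXPR ...)` call above can be illustrated in Python. The helper below is a hypothetical mirror of that expression, not part of the module:

```python
def numpy_version_decimal(version: str) -> int:
    """Encode 'MAJOR.MINOR.PATCH' the same way the module's math(EXPR) does."""
    major, minor, patch = (int(part) for part in version.split(".")[:3])
    return major * 10000 + minor * 100 + patch

# The header comment's example: version 1.6.1 encodes as 10601.
print(numpy_version_decimal("1.6.1"))
```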
@ -1,64 +0,0 @@
SET(Open_BLAS_INCLUDE_SEARCH_PATHS
    /usr/include
    /usr/include/openblas
    /usr/include/openblas-base
    /usr/local/include
    /usr/local/include/openblas
    /usr/local/include/openblas-base
    /opt/OpenBLAS/include
    $ENV{OpenBLAS_HOME}
    $ENV{OpenBLAS_HOME}/include
)

SET(Open_BLAS_LIB_SEARCH_PATHS
    /lib/
    /lib/openblas-base
    /lib64/
    /usr/lib
    /usr/lib/openblas-base
    /usr/lib64
    /usr/local/lib
    /usr/local/lib64
    /opt/OpenBLAS/lib
    $ENV{OpenBLAS}
    $ENV{OpenBLAS}/lib
    $ENV{OpenBLAS_HOME}
    $ENV{OpenBLAS_HOME}/lib
)

FIND_PATH(OpenBLAS_INCLUDE_DIR NAMES cblas.h PATHS ${Open_BLAS_INCLUDE_SEARCH_PATHS})
FIND_LIBRARY(OpenBLAS_LIB NAMES openblas PATHS ${Open_BLAS_LIB_SEARCH_PATHS})

SET(OpenBLAS_FOUND ON)

# Check include files
IF(NOT OpenBLAS_INCLUDE_DIR)
  SET(OpenBLAS_FOUND OFF)
  MESSAGE(STATUS "Could not find OpenBLAS include. Turning OpenBLAS_FOUND off")
ENDIF()

# Check libraries
IF(NOT OpenBLAS_LIB)
  SET(OpenBLAS_FOUND OFF)
  MESSAGE(STATUS "Could not find OpenBLAS lib. Turning OpenBLAS_FOUND off")
ENDIF()

IF(OpenBLAS_FOUND)
  IF(NOT OpenBLAS_FIND_QUIETLY)
    MESSAGE(STATUS "Found OpenBLAS libraries: ${OpenBLAS_LIB}")
    MESSAGE(STATUS "Found OpenBLAS include: ${OpenBLAS_INCLUDE_DIR}")
  ENDIF(NOT OpenBLAS_FIND_QUIETLY)
ELSE(OpenBLAS_FOUND)
  IF(OpenBLAS_FIND_REQUIRED)
    MESSAGE(FATAL_ERROR "Could not find OpenBLAS")
  ENDIF(OpenBLAS_FIND_REQUIRED)
ENDIF(OpenBLAS_FOUND)

MARK_AS_ADVANCED(
  OpenBLAS_INCLUDE_DIR
  OpenBLAS_LIB
  OpenBLAS
)
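Because the search paths include $ENV{OpenBLAS_HOME}, a non-standard install is typically picked up by pointing that variable at the install prefix before the find runs. A usage sketch; the prefix shown is an assumption:

```cmake
# Point the module at a hypothetical custom OpenBLAS prefix, then search.
set(ENV{OpenBLAS_HOME} "/opt/OpenBLAS-custom")
find_package(OpenBLAS REQUIRED)
message(STATUS "Linking against ${OpenBLAS_LIB}")
```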
@ -1,28 +0,0 @@
# Find the Snappy libraries
#
# The following variables are optionally searched for defaults
#  Snappy_ROOT_DIR: Base directory where all Snappy components are found
#
# The following are set after configuration is done:
#  SNAPPY_FOUND
#  Snappy_INCLUDE_DIR
#  Snappy_LIBRARIES

find_path(Snappy_INCLUDE_DIR NAMES snappy.h
          PATHS ${SNAPPY_ROOT_DIR} ${SNAPPY_ROOT_DIR}/include)

find_library(Snappy_LIBRARIES NAMES snappy
             PATHS ${SNAPPY_ROOT_DIR} ${SNAPPY_ROOT_DIR}/lib)

include(FindPackageHandleStandardArgs)
find_package_handle_standard_args(Snappy DEFAULT_MSG Snappy_INCLUDE_DIR Snappy_LIBRARIES)

if(SNAPPY_FOUND)
  message(STATUS "Found Snappy (include: ${Snappy_INCLUDE_DIR}, library: ${Snappy_LIBRARIES})")
  mark_as_advanced(Snappy_INCLUDE_DIR Snappy_LIBRARIES)

  caffe_parse_header(${Snappy_INCLUDE_DIR}/snappy-stubs-public.h
                     SNAPPY_VERSION_LINES SNAPPY_MAJOR SNAPPY_MINOR SNAPPY_PATCHLEVEL)
  set(Snappy_VERSION "${SNAPPY_MAJOR}.${SNAPPY_MINOR}.${SNAPPY_PATCHLEVEL}")
endif()
@ -1,36 +0,0 @@
# Find the vecLib libraries as part of Accelerate.framework or as a standalone framework
#
# The following are set after configuration is done:
#  VECLIB_FOUND
#  vecLib_INCLUDE_DIR
#  vecLib_LINKER_LIBS


if(NOT APPLE)
  return()
endif()

set(__veclib_include_suffix "Frameworks/vecLib.framework/Versions/Current/Headers")

exec_program(xcode-select ARGS -print-path OUTPUT_VARIABLE CMAKE_XCODE_DEVELOPER_DIR)
find_path(vecLib_INCLUDE_DIR vecLib.h
          DOC "vecLib include directory"
          PATHS /System/Library/Frameworks/Accelerate.framework/Versions/Current/${__veclib_include_suffix}
                /System/Library/${__veclib_include_suffix}
                ${CMAKE_XCODE_DEVELOPER_DIR}/Platforms/MacOSX.platform/Developer/SDKs/MacOSX.sdk/System/Library/Frameworks/Accelerate.framework/Versions/Current/Frameworks/vecLib.framework/Headers/
          NO_DEFAULT_PATH)

include(FindPackageHandleStandardArgs)
find_package_handle_standard_args(vecLib DEFAULT_MSG vecLib_INCLUDE_DIR)

if(VECLIB_FOUND)
  if(vecLib_INCLUDE_DIR MATCHES "^/System/Library/Frameworks/vecLib.framework.*")
    set(vecLib_LINKER_LIBS -lcblas "-framework vecLib")
    message(STATUS "Found standalone vecLib.framework")
  else()
    set(vecLib_LINKER_LIBS -lcblas "-framework Accelerate")
    message(STATUS "Found vecLib as part of Accelerate.framework")
  endif()

  mark_as_advanced(vecLib_INCLUDE_DIR)
endif()
@ -1,90 +0,0 @@
# Finds the Google Protocol Buffers library and compilers and extends
# the standard cmake script with version and python generation support

find_package(Protobuf REQUIRED)
list(APPEND Caffe_INCLUDE_DIRS PUBLIC ${PROTOBUF_INCLUDE_DIR})
list(APPEND Caffe_LINKER_LIBS PUBLIC ${PROTOBUF_LIBRARIES})

# As of Ubuntu 14.04, protoc is no longer a part of the libprotobuf-dev package
# and should be installed separately, as in: sudo apt-get install protobuf-compiler
if(EXISTS ${PROTOBUF_PROTOC_EXECUTABLE})
  message(STATUS "Found PROTOBUF Compiler: ${PROTOBUF_PROTOC_EXECUTABLE}")
else()
  message(FATAL_ERROR "Could not find PROTOBUF Compiler")
endif()

if(PROTOBUF_FOUND)
  # fetches the protobuf version
  caffe_parse_header(${PROTOBUF_INCLUDE_DIR}/google/protobuf/stubs/common.h VERSION_LINE GOOGLE_PROTOBUF_VERSION)
  string(REGEX MATCH "([0-9])00([0-9])00([0-9])" PROTOBUF_VERSION ${GOOGLE_PROTOBUF_VERSION})
  set(PROTOBUF_VERSION "${CMAKE_MATCH_1}.${CMAKE_MATCH_2}.${CMAKE_MATCH_3}")
  unset(GOOGLE_PROTOBUF_VERSION)
endif()

# place where to generate protobuf sources
set(proto_gen_folder "${PROJECT_BINARY_DIR}/include/caffe/proto")
include_directories("${PROJECT_BINARY_DIR}/include")

set(PROTOBUF_GENERATE_CPP_APPEND_PATH TRUE)

################################################################################################
# Modification of standard 'protobuf_generate_cpp()' with output dir parameter and python support
# Usage:
#   caffe_protobuf_generate_cpp_py(<output_dir> <srcs_var> <hdrs_var> <python_var> <proto_files>)
function(caffe_protobuf_generate_cpp_py output_dir srcs_var hdrs_var python_var)
  if(NOT ARGN)
    message(SEND_ERROR "Error: caffe_protobuf_generate_cpp_py() called without any proto files")
    return()
  endif()

  if(PROTOBUF_GENERATE_CPP_APPEND_PATH)
    # Create an include path for each file specified
    foreach(fil ${ARGN})
      get_filename_component(abs_fil ${fil} ABSOLUTE)
      get_filename_component(abs_path ${abs_fil} PATH)
      list(FIND _protoc_include ${abs_path} _contains_already)
      if(${_contains_already} EQUAL -1)
        list(APPEND _protoc_include -I ${abs_path})
      endif()
    endforeach()
  else()
    set(_protoc_include -I ${CMAKE_CURRENT_SOURCE_DIR})
  endif()

  if(DEFINED PROTOBUF_IMPORT_DIRS)
    foreach(dir ${PROTOBUF_IMPORT_DIRS})
      get_filename_component(abs_path ${dir} ABSOLUTE)
      list(FIND _protoc_include ${abs_path} _contains_already)
      if(${_contains_already} EQUAL -1)
        list(APPEND _protoc_include -I ${abs_path})
      endif()
    endforeach()
  endif()

  set(${srcs_var})
  set(${hdrs_var})
  set(${python_var})
  foreach(fil ${ARGN})
    get_filename_component(abs_fil ${fil} ABSOLUTE)
    get_filename_component(fil_we ${fil} NAME_WE)

    list(APPEND ${srcs_var} "${output_dir}/${fil_we}.pb.cc")
    list(APPEND ${hdrs_var} "${output_dir}/${fil_we}.pb.h")
    list(APPEND ${python_var} "${output_dir}/${fil_we}_pb2.py")

    add_custom_command(
      OUTPUT "${output_dir}/${fil_we}.pb.cc"
             "${output_dir}/${fil_we}.pb.h"
             "${output_dir}/${fil_we}_pb2.py"
      COMMAND ${CMAKE_COMMAND} -E make_directory "${output_dir}"
      COMMAND ${PROTOBUF_PROTOC_EXECUTABLE} --cpp_out ${output_dir} ${_protoc_include} ${abs_fil}
      COMMAND ${PROTOBUF_PROTOC_EXECUTABLE} --python_out ${PROJECT_BINARY_DIR}/include --proto_path ${PROJECT_SOURCE_DIR}/src ${_protoc_include} ${abs_fil}
      DEPENDS ${abs_fil}
      COMMENT "Running C++/Python protocol buffer compiler on ${fil}" VERBATIM)
  endforeach()

  set_source_files_properties(${${srcs_var}} ${${hdrs_var}} ${${python_var}} PROPERTIES GENERATED TRUE)
  set(${srcs_var} ${${srcs_var}} PARENT_SCOPE)
  set(${hdrs_var} ${${hdrs_var}} PARENT_SCOPE)
  set(${python_var} ${${python_var}} PARENT_SCOPE)
endfunction()
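A sketch of how such a generation function is typically called from a project; the output variable names, the .proto path, and the `caffeproto` target are illustrative assumptions:

```cmake
# Generate C++ and Python sources for caffe.proto into the build tree,
# then compile the generated C++ into a static library.
caffe_protobuf_generate_cpp_py(${proto_gen_folder} proto_srcs proto_hdrs proto_python
                               ${PROJECT_SOURCE_DIR}/src/caffe/proto/caffe.proto)
add_library(caffeproto STATIC ${proto_srcs} ${proto_hdrs})
target_link_libraries(caffeproto ${PROTOBUF_LIBRARIES})
```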
@ -1,180 +0,0 @@
|
||||
################################################################################################
|
||||
# Caffe status report function.
|
||||
# Automatically aligns the right column and selects text based on a condition.
# Usage:
#   caffe_status(<text>)
#   caffe_status(<heading> <value1> [<value2> ...])
#   caffe_status(<heading> <condition> THEN <text for TRUE> ELSE <text for FALSE> )
function(caffe_status text)
  set(status_cond)
  set(status_then)
  set(status_else)

  set(status_current_name "cond")
  foreach(arg ${ARGN})
    if(arg STREQUAL "THEN")
      set(status_current_name "then")
    elseif(arg STREQUAL "ELSE")
      set(status_current_name "else")
    else()
      list(APPEND status_${status_current_name} ${arg})
    endif()
  endforeach()

  if(DEFINED status_cond)
    set(status_placeholder_length 23)
    string(RANDOM LENGTH ${status_placeholder_length} ALPHABET " " status_placeholder)
    string(LENGTH "${text}" status_text_length)
    if(status_text_length LESS status_placeholder_length)
      string(SUBSTRING "${text}${status_placeholder}" 0 ${status_placeholder_length} status_text)
    elseif(DEFINED status_then OR DEFINED status_else)
      message(STATUS "${text}")
      set(status_text "${status_placeholder}")
    else()
      set(status_text "${text}")
    endif()

    if(DEFINED status_then OR DEFINED status_else)
      if(${status_cond})
        string(REPLACE ";" " " status_then "${status_then}")
        string(REGEX REPLACE "^[ \t]+" "" status_then "${status_then}")
        message(STATUS "${status_text} ${status_then}")
      else()
        string(REPLACE ";" " " status_else "${status_else}")
        string(REGEX REPLACE "^[ \t]+" "" status_else "${status_else}")
        message(STATUS "${status_text} ${status_else}")
      endif()
    else()
      string(REPLACE ";" " " status_cond "${status_cond}")
      string(REGEX REPLACE "^[ \t]+" "" status_cond "${status_cond}")
      message(STATUS "${status_text} ${status_cond}")
    endif()
  else()
    message(STATUS "${text}")
  endif()
endfunction()
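
# Example (illustrative values, not from this file): the three call forms above
# would print an aligned two-column summary like so:
#
#   caffe_status("General:")                                        # plain text
#   caffe_status("  Build type :" "Release")                        # heading + value
#   caffe_status("  cuDNN      :" HAVE_CUDNN THEN "Yes" ELSE "No")  # conditional text
#
# The heading is padded to 23 characters via the space-only string(RANDOM)
# placeholder, so the right column lines up regardless of heading length.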


################################################################################################
# Function for fetching Caffe version from git and headers
# Usage:
#   caffe_extract_caffe_version()
function(caffe_extract_caffe_version)
  set(Caffe_GIT_VERSION "unknown")
  find_package(Git)
  if(GIT_FOUND)
    execute_process(COMMAND ${GIT_EXECUTABLE} describe --tags --always --dirty
                    ERROR_QUIET OUTPUT_STRIP_TRAILING_WHITESPACE
                    WORKING_DIRECTORY "${PROJECT_SOURCE_DIR}"
                    OUTPUT_VARIABLE Caffe_GIT_VERSION
                    RESULT_VARIABLE __git_result)
    if(NOT ${__git_result} EQUAL 0)
      set(Caffe_GIT_VERSION "unknown")
    endif()
  endif()

  set(Caffe_GIT_VERSION ${Caffe_GIT_VERSION} PARENT_SCOPE)
  set(Caffe_VERSION "<TODO> (Caffe doesn't declare its version in headers)" PARENT_SCOPE)

  # caffe_parse_header(${Caffe_INCLUDE_DIR}/caffe/version.hpp Caffe_VERSION_LINES CAFFE_MAJOR CAFFE_MINOR CAFFE_PATCH)
  # set(Caffe_VERSION "${CAFFE_MAJOR}.${CAFFE_MINOR}.${CAFFE_PATCH}" PARENT_SCOPE)

  # or for #define Caffe_VERSION "x.x.x"
  # caffe_parse_header_single_define(Caffe ${Caffe_INCLUDE_DIR}/caffe/version.hpp Caffe_VERSION)
  # set(Caffe_VERSION ${Caffe_VERSION_STRING} PARENT_SCOPE)

endfunction()


################################################################################################
# Prints accumulated caffe configuration summary
# Usage:
#   caffe_print_configuration_summary()

function(caffe_print_configuration_summary)
  caffe_extract_caffe_version()
  set(Caffe_VERSION ${Caffe_VERSION} PARENT_SCOPE)

  caffe_merge_flag_lists(__flags_rel CMAKE_CXX_FLAGS_RELEASE CMAKE_CXX_FLAGS)
  caffe_merge_flag_lists(__flags_deb CMAKE_CXX_FLAGS_DEBUG CMAKE_CXX_FLAGS)

  caffe_status("")
  caffe_status("******************* Caffe Configuration Summary *******************")
  caffe_status("General:")
  caffe_status("  Version           :   ${CAFFE_TARGET_VERSION}")
  caffe_status("  Git               :   ${Caffe_GIT_VERSION}")
  caffe_status("  System            :   ${CMAKE_SYSTEM_NAME}")
  caffe_status("  C++ compiler      :   ${CMAKE_CXX_COMPILER}")
  caffe_status("  Release CXX flags :   ${__flags_rel}")
  caffe_status("  Debug CXX flags   :   ${__flags_deb}")
  caffe_status("  Build type        :   ${CMAKE_BUILD_TYPE}")
  caffe_status("")
  caffe_status("  BUILD_SHARED_LIBS :   ${BUILD_SHARED_LIBS}")
  caffe_status("  BUILD_python      :   ${BUILD_python}")
  caffe_status("  BUILD_matlab      :   ${BUILD_matlab}")
  caffe_status("  BUILD_docs        :   ${BUILD_docs}")
  caffe_status("  CPU_ONLY          :   ${CPU_ONLY}")
  caffe_status("  USE_OPENCV        :   ${USE_OPENCV}")
  caffe_status("  USE_LEVELDB       :   ${USE_LEVELDB}")
  caffe_status("  USE_LMDB          :   ${USE_LMDB}")
  caffe_status("  USE_NCCL          :   ${USE_NCCL}")
  caffe_status("  ALLOW_LMDB_NOLOCK :   ${ALLOW_LMDB_NOLOCK}")
  # This code is taken from https://github.com/sh1r0/caffe-android-lib
  caffe_status("  USE_HDF5          :   ${USE_HDF5}")
  caffe_status("")
  caffe_status("Dependencies:")
  caffe_status("  BLAS              : " APPLE THEN "Yes (vecLib)" ELSE "Yes (${BLAS})")
  caffe_status("  Boost             :   Yes (ver. ${Boost_MAJOR_VERSION}.${Boost_MINOR_VERSION})")
  caffe_status("  glog              :   Yes")
  caffe_status("  gflags            :   Yes")
  caffe_status("  protobuf          : " PROTOBUF_FOUND THEN "Yes (ver. ${PROTOBUF_VERSION})" ELSE "No")
  if(USE_LMDB)
    caffe_status("  lmdb              : " LMDB_FOUND THEN "Yes (ver. ${LMDB_VERSION})" ELSE "No")
  endif()
  if(USE_LEVELDB)
    caffe_status("  LevelDB           : " LEVELDB_FOUND THEN "Yes (ver. ${LEVELDB_VERSION})" ELSE "No")
    caffe_status("  Snappy            : " SNAPPY_FOUND THEN "Yes (ver. ${Snappy_VERSION})" ELSE "No")
  endif()
  if(USE_OPENCV)
    caffe_status("  OpenCV            :   Yes (ver. ${OpenCV_VERSION})")
  endif()
  caffe_status("  CUDA              : " HAVE_CUDA THEN "Yes (ver. ${CUDA_VERSION})" ELSE "No")
  caffe_status("")
  if(HAVE_CUDA)
    caffe_status("NVIDIA CUDA:")
    caffe_status("  Target GPU(s)     :   ${CUDA_ARCH_NAME}")
    caffe_status("  GPU arch(s)       :   ${NVCC_FLAGS_EXTRA_readable}")
    if(USE_CUDNN)
      caffe_status("  cuDNN             : " HAVE_CUDNN THEN "Yes (ver. ${CUDNN_VERSION})" ELSE "Not found")
    else()
      caffe_status("  cuDNN             :   Disabled")
    endif()
    caffe_status("")
  endif()
  if(HAVE_PYTHON)
    caffe_status("Python:")
    caffe_status("  Interpreter       :" PYTHON_EXECUTABLE THEN "${PYTHON_EXECUTABLE} (ver. ${PYTHON_VERSION_STRING})" ELSE "No")
    caffe_status("  Libraries         :" PYTHONLIBS_FOUND THEN "${PYTHON_LIBRARIES} (ver ${PYTHONLIBS_VERSION_STRING})" ELSE "No")
    caffe_status("  NumPy             :" NUMPY_FOUND THEN "${NUMPY_INCLUDE_DIR} (ver ${NUMPY_VERSION})" ELSE "No")
    caffe_status("")
  endif()
  if(BUILD_matlab)
    caffe_status("Matlab:")
    caffe_status("  Matlab            :" HAVE_MATLAB THEN "Yes (${Matlab_mex}, ${Matlab_mexext})" ELSE "No")
    caffe_status("  Octave            :" Octave_compiler THEN "Yes (${Octave_compiler})" ELSE "No")
    if(HAVE_MATLAB AND Octave_compiler)
      caffe_status("  Build mex using   :   ${Matlab_build_mex_using}")
    endif()
    caffe_status("")
  endif()
  if(BUILD_docs)
    caffe_status("Documentation:")
    caffe_status("  Doxygen           :" DOXYGEN_FOUND THEN "${DOXYGEN_EXECUTABLE} (${DOXYGEN_VERSION})" ELSE "No")
    caffe_status("  config_file       :   ${DOXYGEN_config_file}")

    caffe_status("")
  endif()
  caffe_status("Install:")
  caffe_status("  Install path      :   ${CMAKE_INSTALL_PREFIX}")
  caffe_status("")
endfunction()
@ -1,174 +0,0 @@
################################################################################################
# Defines the global Caffe_LINK flag. This flag is required to prevent the linker from excluding
# some objects which are not addressed directly but are registered via static constructors
macro(caffe_set_caffe_link)
  if(BUILD_SHARED_LIBS)
    set(Caffe_LINK caffe)
  else()
    if("${CMAKE_CXX_COMPILER_ID}" STREQUAL "Clang")
      set(Caffe_LINK -Wl,-force_load caffe)
    elseif("${CMAKE_CXX_COMPILER_ID}" STREQUAL "GNU")
      set(Caffe_LINK -Wl,--whole-archive caffe -Wl,--no-whole-archive)
    endif()
  endif()
endmacro()
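
# Example (hypothetical consumer target): after calling the macro, link with the
# resulting flag so that statically-registered layers survive static linking:
#
#   caffe_set_caffe_link()
#   # target_link_libraries(my_tool ${Caffe_LINK})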
################################################################################################
# Convenient command to setup source group for IDEs that support this feature (VS, XCode)
# Usage:
#   caffe_source_group(<group> GLOB[_RECURSE] <globbing_expression>)
function(caffe_source_group group)
  cmake_parse_arguments(CAFFE_SOURCE_GROUP "" "" "GLOB;GLOB_RECURSE" ${ARGN})
  if(CAFFE_SOURCE_GROUP_GLOB)
    file(GLOB srcs1 ${CAFFE_SOURCE_GROUP_GLOB})
    source_group(${group} FILES ${srcs1})
  endif()

  if(CAFFE_SOURCE_GROUP_GLOB_RECURSE)
    file(GLOB_RECURSE srcs2 ${CAFFE_SOURCE_GROUP_GLOB_RECURSE})
    source_group(${group} FILES ${srcs2})
  endif()
endfunction()
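
# Example (illustrative paths): group layer sources under a "Layers" folder in
# Visual Studio / Xcode, and headers under "Headers":
#
#   # caffe_source_group("Layers" GLOB "${PROJECT_SOURCE_DIR}/src/caffe/layers/*.cpp")
#   # caffe_source_group("Headers" GLOB_RECURSE "${PROJECT_SOURCE_DIR}/include/*.hpp")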

################################################################################################
# Collecting sources from globbing and appending to output list variable
# Usage:
#   caffe_collect_sources(<output_variable> GLOB[_RECURSE] <globbing_expression>)
function(caffe_collect_sources variable)
  cmake_parse_arguments(CAFFE_COLLECT_SOURCES "" "" "GLOB;GLOB_RECURSE" ${ARGN})
  if(CAFFE_COLLECT_SOURCES_GLOB)
    file(GLOB srcs1 ${CAFFE_COLLECT_SOURCES_GLOB})
    set(${variable} ${${variable}} ${srcs1})
  endif()

  if(CAFFE_COLLECT_SOURCES_GLOB_RECURSE)
    file(GLOB_RECURSE srcs2 ${CAFFE_COLLECT_SOURCES_GLOB_RECURSE})
    set(${variable} ${${variable}} ${srcs2})
  endif()

  # propagate the appended list back to the caller's scope
  set(${variable} ${${variable}} PARENT_SCOPE)
endfunction()

################################################################################################
# Short command getting caffe sources (assuming standard Caffe code tree)
# Usage:
#   caffe_pickup_caffe_sources(<root>)
function(caffe_pickup_caffe_sources root)
  # put all files in source groups (visible as subfolders in many IDEs)
  caffe_source_group("Include"        GLOB "${root}/include/caffe/*.h*")
  caffe_source_group("Include\\Util"  GLOB "${root}/include/caffe/util/*.h*")
  caffe_source_group("Include"        GLOB "${PROJECT_BINARY_DIR}/caffe_config.h*")
  caffe_source_group("Source"         GLOB "${root}/src/caffe/*.cpp")
  caffe_source_group("Source\\Util"   GLOB "${root}/src/caffe/util/*.cpp")
  caffe_source_group("Source\\Layers" GLOB "${root}/src/caffe/layers/*.cpp")
  caffe_source_group("Source\\Cuda"   GLOB "${root}/src/caffe/layers/*.cu")
  caffe_source_group("Source\\Cuda"   GLOB "${root}/src/caffe/util/*.cu")
  caffe_source_group("Source\\Proto"  GLOB "${root}/src/caffe/proto/*.proto")

  # source groups for test target
  caffe_source_group("Include"      GLOB "${root}/include/caffe/test/test_*.h*")
  caffe_source_group("Source"       GLOB "${root}/src/caffe/test/test_*.cpp")
  caffe_source_group("Source\\Cuda" GLOB "${root}/src/caffe/test/test_*.cu")

  # collect files
  file(GLOB test_hdrs ${root}/include/caffe/test/test_*.h*)
  file(GLOB test_srcs ${root}/src/caffe/test/test_*.cpp)
  file(GLOB_RECURSE hdrs ${root}/include/caffe/*.h*)
  file(GLOB_RECURSE srcs ${root}/src/caffe/*.cpp)
  list(REMOVE_ITEM hdrs ${test_hdrs})
  list(REMOVE_ITEM srcs ${test_srcs})

  # adding headers to make them visible in some IDEs (Qt, VS, Xcode)
  list(APPEND srcs ${hdrs} ${PROJECT_BINARY_DIR}/caffe_config.h)
  list(APPEND test_srcs ${test_hdrs})

  # collect cuda files
  file(GLOB test_cuda ${root}/src/caffe/test/test_*.cu)
  file(GLOB_RECURSE cuda ${root}/src/caffe/*.cu)
  list(REMOVE_ITEM cuda ${test_cuda})

  # add proto files to make them editable in IDEs too
  file(GLOB_RECURSE proto_files ${root}/src/caffe/*.proto)
  list(APPEND srcs ${proto_files})

  # convert to absolute paths
  caffe_convert_absolute_paths(srcs)
  caffe_convert_absolute_paths(cuda)
  caffe_convert_absolute_paths(test_srcs)
  caffe_convert_absolute_paths(test_cuda)

  # propagate to parent scope
  set(srcs ${srcs} PARENT_SCOPE)
  set(cuda ${cuda} PARENT_SCOPE)
  set(test_srcs ${test_srcs} PARENT_SCOPE)
  set(test_cuda ${test_cuda} PARENT_SCOPE)
endfunction()

################################################################################################
# Short command for setting default target properties
# Usage:
#   caffe_default_properties(<target>)
function(caffe_default_properties target)
  set_target_properties(${target} PROPERTIES
    DEBUG_POSTFIX ${Caffe_DEBUG_POSTFIX}
    ARCHIVE_OUTPUT_DIRECTORY "${PROJECT_BINARY_DIR}/lib"
    LIBRARY_OUTPUT_DIRECTORY "${PROJECT_BINARY_DIR}/lib"
    RUNTIME_OUTPUT_DIRECTORY "${PROJECT_BINARY_DIR}/bin")
  # make sure we build all external dependencies first
  if(DEFINED external_project_dependencies)
    add_dependencies(${target} ${external_project_dependencies})
  endif()
endfunction()

################################################################################################
# Short command for setting runtime directory for build target
# Usage:
#   caffe_set_runtime_directory(<target> <dir>)
function(caffe_set_runtime_directory target dir)
  set_target_properties(${target} PROPERTIES
    RUNTIME_OUTPUT_DIRECTORY "${dir}")
endfunction()

################################################################################################
# Short command for setting solution folder property for target
# Usage:
#   caffe_set_solution_folder(<target> <folder>)
function(caffe_set_solution_folder target folder)
  if(USE_PROJECT_FOLDERS)
    set_target_properties(${target} PROPERTIES FOLDER "${folder}")
  endif()
endfunction()

################################################################################################
# Reads lines from input file, prepends source directory to each line and writes to output file
# Usage:
#   caffe_configure_testdatafile(<testdatafile>)
function(caffe_configure_testdatafile file)
  file(STRINGS ${file} __lines)
  set(result "")
  foreach(line ${__lines})
    set(result "${result}${PROJECT_SOURCE_DIR}/${line}\n")
  endforeach()
  file(WRITE ${file}.gen.cmake ${result})
endfunction()

################################################################################################
# Filter out all files that are not included in selected list
# Usage:
#   caffe_leave_only_selected_tests(<filelist_variable> <selected_list>)
function(caffe_leave_only_selected_tests file_list)
  if(NOT ARGN)
    return() # blank list means leave all
  endif()
  string(REPLACE "," ";" __selected ${ARGN})
  list(APPEND __selected caffe_main)

  set(result "")
  foreach(f ${${file_list}})
    get_filename_component(name ${f} NAME_WE)
    string(REGEX REPLACE "^test_" "" name ${name})
    list(FIND __selected ${name} __index)
    if(NOT __index EQUAL -1)
      list(APPEND result ${f})
    endif()
  endforeach()
  set(${file_list} ${result} PARENT_SCOPE)
endfunction()
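
# Example (hypothetical file list): keep only the convolution and pooling tests;
# note the selector names are the test basenames with the "test_" prefix and the
# extension stripped, and caffe_main is always kept:
#
#   # set(test_srcs .../test_convolution_layer.cpp .../test_pooling_layer.cpp .../test_lrn_layer.cpp)
#   # caffe_leave_only_selected_tests(test_srcs convolution_layer,pooling_layer)
#   # -> test_srcs now holds only the convolution and pooling sources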
@ -1,55 +0,0 @@
# Config file for the Caffe package.
#
# Note:
#   Caffe and this config file depend on OpenCV,
#   so put `find_package(OpenCV)` before searching for Caffe
#   via `find_package(Caffe)`. All other lib/include
#   dependencies are hard coded in this file.
#
# After successful configuration the following variables
# will be defined:
#
#   Caffe_LIBRARIES    - IMPORTED targets to link against
#                        (there is no Caffe_INCLUDE_DIRS or Caffe_DEFINITIONS
#                        because they are specified in the IMPORTED target interface)
#
#   Caffe_HAVE_CUDA    - signals about CUDA support
#   Caffe_HAVE_CUDNN   - signals about cuDNN support


# OpenCV dependency (optional)

if(@USE_OPENCV@)
  if(NOT OpenCV_FOUND)
    set(Caffe_OpenCV_CONFIG_PATH "@OpenCV_CONFIG_PATH@")
    if(Caffe_OpenCV_CONFIG_PATH)
      get_filename_component(Caffe_OpenCV_CONFIG_PATH ${Caffe_OpenCV_CONFIG_PATH} ABSOLUTE)

      if(EXISTS ${Caffe_OpenCV_CONFIG_PATH} AND NOT TARGET opencv_core)
        message(STATUS "Caffe: using OpenCV config from ${Caffe_OpenCV_CONFIG_PATH}")
        include(${Caffe_OpenCV_CONFIG_PATH}/OpenCVConfig.cmake)
      endif()

    else()
      find_package(OpenCV REQUIRED)
    endif()
    unset(Caffe_OpenCV_CONFIG_PATH)
  endif()
endif()

# Compute paths
get_filename_component(Caffe_CMAKE_DIR "${CMAKE_CURRENT_LIST_FILE}" PATH)

# Our library dependencies
if(NOT TARGET caffe AND NOT caffe_BINARY_DIR)
  include("${Caffe_CMAKE_DIR}/CaffeTargets.cmake")
endif()

# List of IMPORTED libs created by CaffeTargets.cmake
# These targets already specify all needed definitions and include paths
set(Caffe_LIBRARIES caffe)

# CUDA support variables
set(Caffe_CPU_ONLY @CPU_ONLY@)
set(Caffe_HAVE_CUDA @HAVE_CUDA@)
set(Caffe_HAVE_CUDNN @HAVE_CUDNN@)
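
# Example (hypothetical downstream project): a consumer CMakeLists.txt would use
# this config file roughly as follows:
#
#   # find_package(OpenCV)            # first, per the note above
#   # find_package(Caffe REQUIRED)
#   # target_link_libraries(classifier ${Caffe_LIBRARIES})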
@ -1,11 +0,0 @@
set(PACKAGE_VERSION "@Caffe_VERSION@")

# Check whether the requested PACKAGE_FIND_VERSION is compatible
if("${PACKAGE_VERSION}" VERSION_LESS "${PACKAGE_FIND_VERSION}")
  set(PACKAGE_VERSION_COMPATIBLE FALSE)
else()
  set(PACKAGE_VERSION_COMPATIBLE TRUE)
  if("${PACKAGE_VERSION}" VERSION_EQUAL "${PACKAGE_FIND_VERSION}")
    set(PACKAGE_VERSION_EXACT TRUE)
  endif()
endif()
@ -1,12 +0,0 @@
/* Sources directory */
#define SOURCE_FOLDER "${PROJECT_SOURCE_DIR}"

/* Binaries directory */
#define BINARY_FOLDER "${PROJECT_BINARY_DIR}"

/* This is an absolute path so that we can run tests from any build
 * directory */
#define ABS_TEST_DATA_DIR "${PROJECT_SOURCE_DIR}/src/caffe/test/test_data/"

/* Test device */
#define CUDA_TEST_DEVICE ${CUDA_TEST_DEVICE}
@ -1,26 +0,0 @@
if(NOT EXISTS "@CMAKE_CURRENT_BINARY_DIR@/install_manifest.txt")
  message(FATAL_ERROR "Cannot find install manifest: @CMAKE_CURRENT_BINARY_DIR@/install_manifest.txt")
endif()

if(NOT DEFINED CMAKE_INSTALL_PREFIX)
  set(CMAKE_INSTALL_PREFIX "@CMAKE_INSTALL_PREFIX@")
endif()
message(${CMAKE_INSTALL_PREFIX})

file(READ "@CMAKE_CURRENT_BINARY_DIR@/install_manifest.txt" files)
string(REGEX REPLACE "\n" ";" files "${files}")
foreach(file ${files})
  message(STATUS "Uninstalling $ENV{DESTDIR}${file}")
  if(IS_SYMLINK "$ENV{DESTDIR}${file}" OR EXISTS "$ENV{DESTDIR}${file}")
    exec_program(
      "@CMAKE_COMMAND@" ARGS "-E remove \"$ENV{DESTDIR}${file}\""
      OUTPUT_VARIABLE rm_out
      RETURN_VALUE rm_retval
      )
    if(NOT "${rm_retval}" STREQUAL 0)
      message(FATAL_ERROR "Problem when removing $ENV{DESTDIR}${file}")
    endif()
  else()
    message(STATUS "File $ENV{DESTDIR}${file} does not exist.")
  endif()
endforeach()
@ -1,382 +0,0 @@
################################################################################################
# Command alias for debugging messages
# Usage:
#   dmsg(<message>)
function(dmsg)
  message(STATUS ${ARGN})
endfunction()

################################################################################################
# Removes duplicates from list(s)
# Usage:
#   caffe_list_unique(<list_variable> [<list_variable>] [...])
macro(caffe_list_unique)
  foreach(__lst ${ARGN})
    if(${__lst})
      list(REMOVE_DUPLICATES ${__lst})
    endif()
  endforeach()
endmacro()

################################################################################################
# Clears variables from list
# Usage:
#   caffe_clear_vars(<variables_list>)
macro(caffe_clear_vars)
  foreach(_var ${ARGN})
    unset(${_var})
  endforeach()
endmacro()

################################################################################################
# Removes duplicates from string
# Usage:
#   caffe_string_unique(<string_variable>)
function(caffe_string_unique __string)
  if(${__string})
    set(__list ${${__string}})
    separate_arguments(__list)
    list(REMOVE_DUPLICATES __list)
    foreach(__e ${__list})
      set(__str "${__str} ${__e}")
    endforeach()
    set(${__string} ${__str} PARENT_SCOPE)
  endif()
endfunction()

################################################################################################
# Prints list element per line
# Usage:
#   caffe_print_list(<list>)
function(caffe_print_list)
  foreach(e ${ARGN})
    message(STATUS ${e})
  endforeach()
endfunction()

################################################################################################
# Function merging lists of compiler flags into a single string.
# Usage:
#   caffe_merge_flag_lists(out_variable <list1> [<list2>] [<list3>] ...)
function(caffe_merge_flag_lists out_var)
  set(__result "")
  foreach(__list ${ARGN})
    foreach(__flag ${${__list}})
      string(STRIP ${__flag} __flag)
      set(__result "${__result} ${__flag}")
    endforeach()
  endforeach()
  string(STRIP ${__result} __result)
  set(${out_var} ${__result} PARENT_SCOPE)
endfunction()
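
# Example (illustrative flag lists): merging the Release and global flag lists
# yields a single space-separated string:
#
#   # set(CMAKE_CXX_FLAGS "-fPIC -Wall")
#   # set(CMAKE_CXX_FLAGS_RELEASE "-O3 -DNDEBUG")
#   # caffe_merge_flag_lists(__flags CMAKE_CXX_FLAGS_RELEASE CMAKE_CXX_FLAGS)
#   # -> __flags == "-O3 -DNDEBUG -fPIC -Wall"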

################################################################################################
# Converts all paths in list to absolute
# Usage:
#   caffe_convert_absolute_paths(<list_variable>)
function(caffe_convert_absolute_paths variable)
  set(__list "")
  foreach(__s ${${variable}})
    get_filename_component(__abspath ${__s} ABSOLUTE)
    list(APPEND __list ${__abspath})
  endforeach()
  set(${variable} ${__list} PARENT_SCOPE)
endfunction()

################################################################################################
# Reads set of version defines from the header file
# Usage:
#   caffe_parse_header(<file> <lines_variable> <define1> <define2> <define3> ..)
macro(caffe_parse_header FILENAME FILE_VAR)
  set(vars_regex "")
  set(__parent_scope OFF)
  set(__add_cache OFF)
  foreach(name ${ARGN})
    if("${name}" STREQUAL "PARENT_SCOPE")
      set(__parent_scope ON)
    elseif("${name}" STREQUAL "CACHE")
      set(__add_cache ON)
    elseif(vars_regex)
      set(vars_regex "${vars_regex}|${name}")
    else()
      set(vars_regex "${name}")
    endif()
  endforeach()
  if(EXISTS "${FILENAME}")
    file(STRINGS "${FILENAME}" ${FILE_VAR} REGEX "#define[ \t]+(${vars_regex})[ \t]+[0-9]+" )
  else()
    unset(${FILE_VAR})
  endif()
  foreach(name ${ARGN})
    if(NOT "${name}" STREQUAL "PARENT_SCOPE" AND NOT "${name}" STREQUAL "CACHE")
      if(${FILE_VAR})
        if(${FILE_VAR} MATCHES ".+[ \t]${name}[ \t]+([0-9]+).*")
          string(REGEX REPLACE ".+[ \t]${name}[ \t]+([0-9]+).*" "\\1" ${name} "${${FILE_VAR}}")
        else()
          set(${name} "")
        endif()
        if(__add_cache)
          set(${name} ${${name}} CACHE INTERNAL "${name} parsed from ${FILENAME}" FORCE)
        elseif(__parent_scope)
          set(${name} "${${name}}" PARENT_SCOPE)
        endif()
      else()
        unset(${name} CACHE)
      endif()
    endif()
  endforeach()
endmacro()
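
# Example (hypothetical header path): for a header containing
#   #define MDB_VERSION_MAJOR 0
#   #define MDB_VERSION_MINOR 9
# the macro would populate MDB_VERSION_MAJOR and MDB_VERSION_MINOR in the
# current scope, with the matched lines collected in the lines variable:
#
#   # caffe_parse_header(${LMDB_INCLUDE_DIR}/lmdb.h LMDB_VERSION_LINES MDB_VERSION_MAJOR MDB_VERSION_MINOR)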

################################################################################################
# Reads single version define from the header file and parses it
# Usage:
#   caffe_parse_header_single_define(<library_name> <file> <define_name>)
function(caffe_parse_header_single_define LIBNAME HDR_PATH VARNAME)
  set(${LIBNAME}_H "")
  if(EXISTS "${HDR_PATH}")
    file(STRINGS "${HDR_PATH}" ${LIBNAME}_H REGEX "^#define[ \t]+${VARNAME}[ \t]+\"[^\"]*\".*$" LIMIT_COUNT 1)
  endif()

  if(${LIBNAME}_H)
    string(REGEX REPLACE "^.*[ \t]${VARNAME}[ \t]+\"([0-9]+).*$" "\\1" ${LIBNAME}_VERSION_MAJOR "${${LIBNAME}_H}")
    string(REGEX REPLACE "^.*[ \t]${VARNAME}[ \t]+\"[0-9]+\\.([0-9]+).*$" "\\1" ${LIBNAME}_VERSION_MINOR "${${LIBNAME}_H}")
    string(REGEX REPLACE "^.*[ \t]${VARNAME}[ \t]+\"[0-9]+\\.[0-9]+\\.([0-9]+).*$" "\\1" ${LIBNAME}_VERSION_PATCH "${${LIBNAME}_H}")
    set(${LIBNAME}_VERSION_MAJOR ${${LIBNAME}_VERSION_MAJOR} ${ARGN} PARENT_SCOPE)
    set(${LIBNAME}_VERSION_MINOR ${${LIBNAME}_VERSION_MINOR} ${ARGN} PARENT_SCOPE)
    set(${LIBNAME}_VERSION_PATCH ${${LIBNAME}_VERSION_PATCH} ${ARGN} PARENT_SCOPE)
    set(${LIBNAME}_VERSION_STRING "${${LIBNAME}_VERSION_MAJOR}.${${LIBNAME}_VERSION_MINOR}.${${LIBNAME}_VERSION_PATCH}" PARENT_SCOPE)

    # append a TWEAK version if it exists:
    set(${LIBNAME}_VERSION_TWEAK "")
    if("${${LIBNAME}_H}" MATCHES "^.*[ \t]${VARNAME}[ \t]+\"[0-9]+\\.[0-9]+\\.[0-9]+\\.([0-9]+).*$")
      set(${LIBNAME}_VERSION_TWEAK "${CMAKE_MATCH_1}" ${ARGN} PARENT_SCOPE)
    endif()
    if(${LIBNAME}_VERSION_TWEAK)
      set(${LIBNAME}_VERSION_STRING "${${LIBNAME}_VERSION_STRING}.${${LIBNAME}_VERSION_TWEAK}" ${ARGN} PARENT_SCOPE)
    else()
      set(${LIBNAME}_VERSION_STRING "${${LIBNAME}_VERSION_STRING}" ${ARGN} PARENT_SCOPE)
    endif()
  endif()
endfunction()

########################################################################################################
# An option that the user can select. Accepts an optional condition to control when the option is available to the user.
# Usage:
#   caffe_option(<option_variable> "doc string" <initial value or boolean expression> [IF <condition>])
function(caffe_option variable description value)
  set(__value ${value})
  set(__condition "")
  set(__varname "__value")
  foreach(arg ${ARGN})
    if(arg STREQUAL "IF" OR arg STREQUAL "if")
      set(__varname "__condition")
    else()
      list(APPEND ${__varname} ${arg})
    endif()
  endforeach()
  unset(__varname)
  if("${__condition}" STREQUAL "")
    set(__condition 2 GREATER 1)
  endif()

  if(${__condition})
    if("${__value}" MATCHES ";")
      if(${__value})
        option(${variable} "${description}" ON)
      else()
        option(${variable} "${description}" OFF)
      endif()
    elseif(DEFINED ${__value})
      if(${__value})
        option(${variable} "${description}" ON)
      else()
        option(${variable} "${description}" OFF)
      endif()
    else()
      option(${variable} "${description}" ${__value})
    endif()
  else()
    unset(${variable} CACHE)
  endif()
endfunction()
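
# Example (illustrative): expose an option only when a condition holds; when the
# IF condition is false the cached option is removed entirely:
#
#   # caffe_option(USE_CUDNN "Build Caffe with cuDNN support" ON IF NOT CPU_ONLY)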

################################################################################################
# Utility macro for comparing two lists. Used for CMake debugging purposes
# Usage:
#   caffe_compare_lists(<list_variable> <list2_variable> [description])
function(caffe_compare_lists list1 list2 desc)
  set(__list1 ${${list1}})
  set(__list2 ${${list2}})
  list(SORT __list1)
  list(SORT __list2)
  list(LENGTH __list1 __len1)
  list(LENGTH __list2 __len2)

  if(NOT ${__len1} EQUAL ${__len2})
    message(FATAL_ERROR "Lists are not equal. ${__len1} != ${__len2}. ${desc}")
  endif()

  foreach(__i RANGE 1 ${__len1})
    math(EXPR __index "${__i} - 1")
    list(GET __list1 ${__index} __item1)
    list(GET __list2 ${__index} __item2)
    if(NOT ${__item1} STREQUAL ${__item2})
      message(FATAL_ERROR "Lists are not equal. Differ at element ${__index}. ${desc}")
    endif()
  endforeach()
endfunction()

################################################################################################
# Command for disabling warnings for different platforms (see below for gcc and VisualStudio)
# Usage:
#   caffe_warnings_disable(<CMAKE_[C|CXX]_FLAGS[_CONFIGURATION]> -Wshadow /wd4996 ...)
macro(caffe_warnings_disable)
  set(_flag_vars "")
  set(_msvc_warnings "")
  set(_gxx_warnings "")

  foreach(arg ${ARGN})
    if(arg MATCHES "^CMAKE_")
      list(APPEND _flag_vars ${arg})
    elseif(arg MATCHES "^/wd")
      list(APPEND _msvc_warnings ${arg})
    elseif(arg MATCHES "^-W")
      list(APPEND _gxx_warnings ${arg})
    endif()
  endforeach()

  if(NOT _flag_vars)
    set(_flag_vars CMAKE_C_FLAGS CMAKE_CXX_FLAGS)
  endif()

  if(MSVC AND _msvc_warnings)
    foreach(var ${_flag_vars})
      foreach(warning ${_msvc_warnings})
        set(${var} "${${var}} ${warning}")
      endforeach()
    endforeach()
  elseif((CMAKE_COMPILER_IS_GNUCXX OR CMAKE_COMPILER_IS_CLANGXX) AND _gxx_warnings)
    foreach(var ${_flag_vars})
      foreach(warning ${_gxx_warnings})
        if(NOT warning MATCHES "^-Wno-")
          string(REPLACE "${warning}" "" ${var} "${${var}}")
          string(REPLACE "-W" "-Wno-" warning "${warning}")
        endif()
        set(${var} "${${var}} ${warning}")
      endforeach()
    endforeach()
  endif()
  caffe_clear_vars(_flag_vars _msvc_warnings _gxx_warnings)
endmacro()

################################################################################################
# Helper function to get current definitions
# Usage:
#   caffe_get_current_definitions(<definitions_variable>)
function(caffe_get_current_definitions definitions_var)
  get_property(current_definitions DIRECTORY PROPERTY COMPILE_DEFINITIONS)
  set(result "")

  foreach(d ${current_definitions})
    list(APPEND result -D${d})
  endforeach()

  caffe_list_unique(result)
  set(${definitions_var} ${result} PARENT_SCOPE)
endfunction()

################################################################################################
# Helper function to get current includes/definitions
# Usage:
#   caffe_get_current_cflags(<cflagslist_variable>)
function(caffe_get_current_cflags cflags_var)
  get_property(current_includes DIRECTORY PROPERTY INCLUDE_DIRECTORIES)
  caffe_convert_absolute_paths(current_includes)
  caffe_get_current_definitions(cflags)

  foreach(i ${current_includes})
    list(APPEND cflags "-I${i}")
  endforeach()

  caffe_list_unique(cflags)
  set(${cflags_var} ${cflags} PARENT_SCOPE)
endfunction()
|
||||
|
||||
################################################################################################
# Helper function to parse current linker libs into link directories, libflags and osx frameworks
# Usage:
#   caffe_parse_linker_libs(<Caffe_LINKER_LIBS_var> <directories_var> <libflags_var> <frameworks_var>)
function(caffe_parse_linker_libs Caffe_LINKER_LIBS_variable folders_var flags_var frameworks_var)

  set(__unspec "")
  set(__debug "")
  set(__optimized "")
  set(__framework "")
  set(__varname "__unspec")

  # split libs into debug, optimized, unspecified and frameworks
  foreach(list_elem ${${Caffe_LINKER_LIBS_variable}})
    if(list_elem STREQUAL "debug")
      set(__varname "__debug")
    elseif(list_elem STREQUAL "optimized")
      set(__varname "__optimized")
    elseif(list_elem MATCHES "^-framework[ \t]+([^ \t].*)")
      list(APPEND __framework -framework ${CMAKE_MATCH_1})
    else()
      list(APPEND ${__varname} ${list_elem})
      set(__varname "__unspec")
    endif()
  endforeach()

  # attach debug or optimized libs to unspecified according to current configuration
  if(CMAKE_BUILD_TYPE MATCHES "Debug")
    set(__libs ${__unspec} ${__debug})
  else()
    set(__libs ${__unspec} ${__optimized})
  endif()

  set(libflags "")
  set(folders "")

  # convert linker libraries list to link flags
  foreach(lib ${__libs})
    if(TARGET ${lib})
      list(APPEND folders $<TARGET_LINKER_FILE_DIR:${lib}>)
      list(APPEND libflags -l${lib})
    elseif(lib MATCHES "^-l.*")
      list(APPEND libflags ${lib})
    elseif(IS_ABSOLUTE ${lib})
      get_filename_component(folder ${lib} PATH)
      get_filename_component(filename ${lib} NAME)
      string(REGEX REPLACE "\\.[^.]*$" "" filename_without_shortest_ext ${filename})

      string(REGEX MATCH "^lib(.*)" __match ${filename_without_shortest_ext})
      list(APPEND libflags -l${CMAKE_MATCH_1})
      list(APPEND folders ${folder})
    else()
      message(FATAL_ERROR "Logic error. Need to update cmake script")
    endif()
  endforeach()

  caffe_list_unique(libflags folders)

  set(${folders_var} ${folders} PARENT_SCOPE)
  set(${flags_var} ${libflags} PARENT_SCOPE)
  set(${frameworks_var} ${__framework} PARENT_SCOPE)
endfunction()
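A hedged usage sketch of the helper (the input list and the expected results below are illustrative assumptions, not taken from the original file):

```cmake
# Suppose the linker-libs list mixes debug/optimized pairs, raw flags, and absolute paths:
set(Caffe_LINKER_LIBS debug /usr/lib/libcaffe-d.so optimized /usr/lib/libcaffe.so -lm)

caffe_parse_linker_libs(Caffe_LINKER_LIBS link_dirs link_flags osx_frameworks)

# In a Release configuration this should yield roughly:
#   link_dirs      -> /usr/lib
#   link_flags     -> -lm;-lcaffe
#   osx_frameworks -> (empty)
```

The absolute path is split into a `-L`-style directory and a `-l` flag, while flags like `-lm` pass through unchanged; `caffe_list_unique` (defined elsewhere) then de-duplicates both lists.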
################################################################################################
# Helper function to detect Darwin version, i.e. 10.8, 10.9, 10.10, ....
# Usage:
#   caffe_detect_darwin_version(<version_variable>)
function(caffe_detect_darwin_version output_var)
  if(APPLE)
    execute_process(COMMAND /usr/bin/sw_vers -productVersion
                    RESULT_VARIABLE __sw_vers OUTPUT_VARIABLE __sw_vers_out
                    ERROR_QUIET OUTPUT_STRIP_TRAILING_WHITESPACE)

    set(${output_var} ${__sw_vers_out} PARENT_SCOPE)
  else()
    set(${output_var} "" PARENT_SCOPE)
  endif()
endfunction()
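A minimal usage sketch, assuming the caller only needs the version string (the variable name is an assumption):

```cmake
caffe_detect_darwin_version(OSX_VERSION)
if(OSX_VERSION)
  message(STATUS "Building on OS X ${OSX_VERSION}")
endif()
```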
@ -1,50 +0,0 @@

set(CMAKE_SOURCE_DIR ..)
set(LINT_COMMAND ${CMAKE_SOURCE_DIR}/scripts/cpp_lint.py)
set(SRC_FILE_EXTENSIONS h hpp hu c cpp cu cc)
set(EXCLUDE_FILE_EXTENSIONS pb.h pb.cc)
set(LINT_DIRS include src/caffe examples tools python matlab)

cmake_policy(SET CMP0009 NEW)  # suppress cmake warning

# find all files of interest
foreach(ext ${SRC_FILE_EXTENSIONS})
  foreach(dir ${LINT_DIRS})
    file(GLOB_RECURSE FOUND_FILES ${CMAKE_SOURCE_DIR}/${dir}/*.${ext})
    set(LINT_SOURCES ${LINT_SOURCES} ${FOUND_FILES})
  endforeach()
endforeach()

# find all files that should be excluded
foreach(ext ${EXCLUDE_FILE_EXTENSIONS})
  file(GLOB_RECURSE FOUND_FILES ${CMAKE_SOURCE_DIR}/*.${ext})
  set(EXCLUDED_FILES ${EXCLUDED_FILES} ${FOUND_FILES})
endforeach()

# exclude generated pb files
list(REMOVE_ITEM LINT_SOURCES ${EXCLUDED_FILES})

execute_process(
  COMMAND ${LINT_COMMAND} ${LINT_SOURCES}
  ERROR_VARIABLE LINT_OUTPUT
  ERROR_STRIP_TRAILING_WHITESPACE
)

string(REPLACE "\n" ";" LINT_OUTPUT ${LINT_OUTPUT})

list(GET LINT_OUTPUT -1 LINT_RESULT)
list(REMOVE_AT LINT_OUTPUT -1)
string(REPLACE " " ";" LINT_RESULT ${LINT_RESULT})
list(GET LINT_RESULT -1 NUM_ERRORS)
if(NUM_ERRORS GREATER 0)
  foreach(msg ${LINT_OUTPUT})
    string(FIND ${msg} "Done" result)
    if(result LESS 0)
      message(STATUS ${msg})
    endif()
  endforeach()
  message(FATAL_ERROR "Lint found ${NUM_ERRORS} errors!")
else()
  message(STATUS "Lint did not find any errors!")
endif()
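The script above calls `execute_process` at the top level, so it is meant to be executed in CMake script mode (`cmake -P`) rather than included in a configure step. A hedged sketch of wiring it into the build as a `lint` target (the path is an assumption about the repository layout):

```cmake
# Expose the lint script as `make lint`; the FATAL_ERROR inside the script fails the target.
add_custom_target(lint
  COMMAND ${CMAKE_COMMAND} -P ${PROJECT_SOURCE_DIR}/cmake/lint.cmake
  WORKING_DIRECTORY ${PROJECT_SOURCE_DIR}
  COMMENT "Running cpp_lint.py over the source tree" VERBATIM)
```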
@ -1,47 +0,0 @@

### Running an official image

You can run one of the automatic [builds](https://hub.docker.com/r/bvlc/caffe). E.g. for the CPU version:

`docker run -ti bvlc/caffe:cpu caffe --version`

or for GPU support (you need a CUDA 8.0-capable driver and
[nvidia-docker](https://github.com/NVIDIA/nvidia-docker)):

`nvidia-docker run -ti bvlc/caffe:gpu caffe --version`

You might see an error about libdc1394; it is harmless and can be ignored.

### Docker run options

By default Caffe runs as root, so any output files, e.g. snapshots, will be owned
by root. It also runs by default in a container-private folder.

You can change this using flags, like user (`-u`), current directory, and volumes (`-w` and `-v`).
E.g. this behaves like the usual Caffe executable:

`docker run --rm -u $(id -u):$(id -g) -v $(pwd):$(pwd) -w $(pwd) bvlc/caffe:cpu caffe train --solver=example_solver.prototxt`

Containers can also be used interactively, specifying e.g. `bash` or `ipython`
instead of `caffe`.

```
docker run -ti bvlc/caffe:cpu ipython
import caffe
...
```

The Caffe build requirements are included in the container, so this can be used to
build and run custom versions of Caffe. Also, `caffe/python` is in `PATH`, so Python
utilities can be used directly, e.g. `draw_net.py`, `classify.py`, or `detect.py`.

### Building images yourself

Examples:

`docker build -t caffe:cpu cpu`

`docker build -t caffe:gpu gpu`

You can also build Caffe and run the tests in the image:

`docker run -ti caffe:cpu bash -c "cd /opt/caffe/build; make runtest"`
@ -1,46 +0,0 @@

FROM ubuntu:16.04
LABEL maintainer caffe-maint@googlegroups.com

RUN apt-get update && apt-get install -y --no-install-recommends \
        build-essential \
        cmake \
        git \
        wget \
        libatlas-base-dev \
        libboost-all-dev \
        libgflags-dev \
        libgoogle-glog-dev \
        libhdf5-serial-dev \
        libleveldb-dev \
        liblmdb-dev \
        libopencv-dev \
        libprotobuf-dev \
        libsnappy-dev \
        protobuf-compiler \
        python-dev \
        python-numpy \
        python-pip \
        python-setuptools \
        python-scipy && \
    rm -rf /var/lib/apt/lists/*

ENV CAFFE_ROOT=/opt/caffe
WORKDIR $CAFFE_ROOT

# FIXME: use ARG instead of ENV once DockerHub supports this
# https://github.com/docker/hub-feedback/issues/460
ENV CLONE_TAG=1.0

RUN git clone -b ${CLONE_TAG} --depth 1 https://github.com/BVLC/caffe.git . && \
    pip install --upgrade pip && \
    cd python && for req in $(cat requirements.txt) pydot; do pip install $req; done && cd .. && \
    mkdir build && cd build && \
    cmake -DCPU_ONLY=1 .. && \
    make -j"$(nproc)"

ENV PYCAFFE_ROOT $CAFFE_ROOT/python
ENV PYTHONPATH $PYCAFFE_ROOT:$PYTHONPATH
ENV PATH $CAFFE_ROOT/build/tools:$PYCAFFE_ROOT:$PATH
RUN echo "$CAFFE_ROOT/build/lib" >> /etc/ld.so.conf.d/caffe.conf && ldconfig

WORKDIR /workspace
@ -1,47 +0,0 @@

FROM nvidia/cuda:8.0-cudnn6-devel-ubuntu16.04
LABEL maintainer caffe-maint@googlegroups.com

RUN apt-get update && apt-get install -y --no-install-recommends \
        build-essential \
        cmake \
        git \
        wget \
        libatlas-base-dev \
        libboost-all-dev \
        libgflags-dev \
        libgoogle-glog-dev \
        libhdf5-serial-dev \
        libleveldb-dev \
        liblmdb-dev \
        libopencv-dev \
        libprotobuf-dev \
        libsnappy-dev \
        protobuf-compiler \
        python-dev \
        python-numpy \
        python-pip \
        python-setuptools \
        python-scipy && \
    rm -rf /var/lib/apt/lists/*

ENV CAFFE_ROOT=/opt/caffe
WORKDIR $CAFFE_ROOT

# FIXME: use ARG instead of ENV once DockerHub supports this
# https://github.com/docker/hub-feedback/issues/460
ENV CLONE_TAG=1.0

RUN git clone -b ${CLONE_TAG} --depth 1 https://github.com/BVLC/caffe.git . && \
    pip install --upgrade pip && \
    cd python && for req in $(cat requirements.txt) pydot; do pip install $req; done && cd .. && \
    git clone https://github.com/NVIDIA/nccl.git && cd nccl && make -j install && cd .. && rm -rf nccl && \
    mkdir build && cd build && \
    cmake -DUSE_CUDNN=1 -DUSE_NCCL=1 .. && \
    make -j"$(nproc)"

ENV PYCAFFE_ROOT $CAFFE_ROOT/python
ENV PYTHONPATH $PYCAFFE_ROOT:$PYTHONPATH
ENV PATH $CAFFE_ROOT/build/tools:$PYCAFFE_ROOT:$PATH
RUN echo "$CAFFE_ROOT/build/lib" >> /etc/ld.so.conf.d/caffe.conf && ldconfig

WORKDIR /workspace
@ -1,106 +0,0 @@

# Building docs script
# Requirements:
#   sudo apt-get install doxygen texlive ruby-dev
#   sudo gem install jekyll execjs therubyracer

if(NOT BUILD_docs OR NOT DOXYGEN_FOUND)
  return()
endif()

################################################################################################
# Gather docs from <root>/examples/**/readme.md
function(gather_readmes_as_prebuild_cmd target gathered_dir root)
  set(full_gathered_dir ${root}/${gathered_dir})

  file(GLOB_RECURSE readmes ${root}/examples/readme.md ${root}/examples/README.md)
  foreach(file ${readmes})
    # Only use file if it is to be included in docs.
    file(STRINGS ${file} file_lines REGEX "include_in_docs: true")

    if(file_lines)
      # Since everything is called readme.md, rename it by its dirname.
      file(RELATIVE_PATH file ${root} ${file})
      get_filename_component(folder ${file} PATH)
      set(new_filename ${full_gathered_dir}/${folder}.md)

      # folder value might be like <subfolder>/readme.md. That's why make directory.
      get_filename_component(new_folder ${new_filename} PATH)
      add_custom_command(TARGET ${target} PRE_BUILD
                         COMMAND ${CMAKE_COMMAND} -E make_directory ${new_folder}
                         COMMAND ln -sf ${root}/${file} ${new_filename}
                         COMMENT "Creating symlink ${new_filename} -> ${root}/${file}"
                         WORKING_DIRECTORY ${root} VERBATIM)
    endif()
  endforeach()
endfunction()

################################################################################################
# Gather docs from examples/*.ipynb and add YAML front-matter.
function(gather_notebooks_as_prebuild_cmd target gathered_dir root)
  set(full_gathered_dir ${root}/${gathered_dir})

  if(NOT PYTHON_EXECUTABLE)
    message(STATUS "Python interpreter is not found. Can't include *.ipynb files in docs. Skipping...")
    return()
  endif()

  file(GLOB_RECURSE notebooks ${root}/examples/*.ipynb)
  foreach(file ${notebooks})
    file(RELATIVE_PATH file ${root} ${file})
    set(new_filename ${full_gathered_dir}/${file})

    get_filename_component(new_folder ${new_filename} PATH)
    add_custom_command(TARGET ${target} PRE_BUILD
                       COMMAND ${CMAKE_COMMAND} -E make_directory ${new_folder}
                       COMMAND ${PYTHON_EXECUTABLE} scripts/copy_notebook.py ${file} ${new_filename}
                       COMMENT "Copying notebook ${file} to ${new_filename}"
                       WORKING_DIRECTORY ${root} VERBATIM)
  endforeach()

  set(${outputs_var} ${outputs} PARENT_SCOPE)
endfunction()

################################################################################################
########################## [ Non macro part ] ##################################################

# Gathering is done at each 'make doc'
file(REMOVE_RECURSE ${PROJECT_SOURCE_DIR}/docs/gathered)

# Doxygen config file path
set(DOXYGEN_config_file ${PROJECT_SOURCE_DIR}/.Doxyfile CACHE FILEPATH "Doxygen config file")

# Adding docs target
add_custom_target(docs COMMAND ${DOXYGEN_EXECUTABLE} ${DOXYGEN_config_file}
                  WORKING_DIRECTORY ${PROJECT_SOURCE_DIR}
                  COMMENT "Launching doxygen..." VERBATIM)

# Gathering examples into docs subfolder
gather_notebooks_as_prebuild_cmd(docs docs/gathered ${PROJECT_SOURCE_DIR})
gather_readmes_as_prebuild_cmd(docs docs/gathered ${PROJECT_SOURCE_DIR})

# Auto detect output directory
file(STRINGS ${DOXYGEN_config_file} config_line REGEX "OUTPUT_DIRECTORY[ \t]+=[^=].*")
if(config_line)
  string(REGEX MATCH "OUTPUT_DIRECTORY[ \t]+=([^=].*)" __ver_check "${config_line}")
  string(STRIP ${CMAKE_MATCH_1} output_dir)
  message(STATUS "Detected Doxygen OUTPUT_DIRECTORY: ${output_dir}")
else()
  set(output_dir ./doxygen/)
  message(STATUS "Can't find OUTPUT_DIRECTORY in doxygen config file. Trying to use default: ${output_dir}")
endif()

if(NOT IS_ABSOLUTE ${output_dir})
  set(output_dir ${PROJECT_SOURCE_DIR}/${output_dir})
  get_filename_component(output_dir ${output_dir} ABSOLUTE)
endif()

# creates symlink in docs subfolder to code documentation built by doxygen
add_custom_command(TARGET docs POST_BUILD VERBATIM
                   COMMAND ln -sfn "${output_dir}/html" doxygen
                   WORKING_DIRECTORY ${PROJECT_SOURCE_DIR}/docs
                   COMMENT "Creating symlink ${PROJECT_SOURCE_DIR}/docs/doxygen -> ${output_dir}/html")

# for quick launch of jekyll
add_custom_target(jekyll COMMAND jekyll serve -w -s . -d _site --port=4000
                  WORKING_DIRECTORY ${PROJECT_SOURCE_DIR}/docs
                  COMMENT "Launching jekyll..." VERBATIM)
@ -1 +0,0 @@

caffe.berkeleyvision.org
@ -1,5 +0,0 @@

# Caffe Documentation

To generate the documentation, run `$CAFFE_ROOT/scripts/build_docs.sh`.

To push your changes to the documentation to the gh-pages branch of your or the BVLC repo, run `$CAFFE_ROOT/scripts/deploy_docs.sh <repo_name>`.
@ -1,7 +0,0 @@

defaults:
  -
    scope:
      path: "" # an empty string here means all files in the project
    values:
      layout: "default"
@ -1,62 +0,0 @@

<!doctype html>
<html>
  <head>
    <!-- MathJax -->
    <script type="text/javascript"
      src="http://cdn.mathjax.org/mathjax/latest/MathJax.js?config=TeX-AMS-MML_HTMLorMML">
    </script>
    <meta charset="utf-8">
    <meta http-equiv="X-UA-Compatible" content="chrome=1">
    <title>
      Caffe {% if page contains 'title' %}| {{ page.title }}{% endif %}
    </title>

    <link rel="icon" type="image/png" href="/images/caffeine-icon.png">

    <link rel="stylesheet" href="/stylesheets/reset.css">
    <link rel="stylesheet" href="/stylesheets/styles.css">
    <link rel="stylesheet" href="/stylesheets/pygment_trac.css">

    <meta name="viewport" content="width=device-width, initial-scale=1, user-scalable=no">
    <!--[if lt IE 9]>
    <script src="//html5shiv.googlecode.com/svn/trunk/html5.js"></script>
    <![endif]-->
  </head>
  <body>
    <script>
      (function(i,s,o,g,r,a,m){i['GoogleAnalyticsObject']=r;i[r]=i[r]||function(){
      (i[r].q=i[r].q||[]).push(arguments)},i[r].l=1*new Date();a=s.createElement(o),
      m=s.getElementsByTagName(o)[0];a.async=1;a.src=g;m.parentNode.insertBefore(a,m)
      })(window,document,'script','//www.google-analytics.com/analytics.js','ga');

      ga('create', 'UA-46255508-1', 'daggerfs.com');
      ga('send', 'pageview');
    </script>
    <div class="wrapper">
      <header>
        <h1 class="header"><a href="/">Caffe</a></h1>
        <p class="header">
          Deep learning framework by <a class="header name" href="http://bair.berkeley.edu/">BAIR</a>
        </p>
        <p class="header">
          Created by
          <br>
          <a class="header name" href="http://daggerfs.com/">Yangqing Jia</a>
          <br>
          Lead Developer
          <br>
          <a class="header name" href="http://imaginarynumber.net/">Evan Shelhamer</a>
        </p>
        <ul>
          <li>
            <a class="buttons github" href="https://github.com/BVLC/caffe">View On GitHub</a>
          </li>
        </ul>
      </header>
      <section>

        {{ content }}

      </section>
    </div>
  </body>
</html>
@ -1,120 +0,0 @@

---
title: Developing and Contributing
---
# Development and Contributing

Caffe is developed with the active participation of the community.<br>
The [BAIR](http://bair.berkeley.edu/)/BVLC brewers welcome all contributions!

The exact details of contributions are recorded by versioning and cited in our [acknowledgements](http://caffe.berkeleyvision.org/#acknowledgements).
This method is impartial and always up-to-date.

## License

Caffe is licensed under the terms in [LICENSE](https://github.com/BVLC/caffe/blob/master/LICENSE). By contributing to the project, you agree to the license and copyright terms therein and release your contribution under these terms.

## Copyright

Caffe uses a shared copyright model: each contributor holds copyright over their contributions to Caffe. The project versioning records all such contribution and copyright details.

If a contributor wants to further mark their specific copyright on a particular contribution, they should indicate their copyright solely in the commit message of the change when it is committed. Do not include copyright notices in files for this purpose.

### Documentation

This website, written with [Jekyll](http://jekyllrb.com/), acts as the official Caffe documentation -- simply run `scripts/build_docs.sh` and view the website at `http://0.0.0.0:4000`.

We prefer tutorials and examples to be documented close to where they live, in `readme.md` files.
The `build_docs.sh` script gathers all `examples/**/readme.md` and `examples/*.ipynb` files, and makes a table of contents.
To be included in the docs, the readme files must be annotated with [YAML front-matter](http://jekyllrb.com/docs/frontmatter/), including the flag `include_in_docs: true`.
Similarly for IPython notebooks: simply include `"include_in_docs": true` in the `"metadata"` JSON field.

Other docs, such as installation guides, are written in the `docs` directory and manually linked to from the `index.md` page.

We strive to provide lots of usage examples, and to document all code in docstrings.
We absolutely appreciate any contribution to this effort!

### Versioning

The `master` branch receives all new development, including community contributions.
We try to keep it in a reliable state, but it is the bleeding edge, and things do get broken every now and then.
BAIR maintainers will periodically make releases by marking stable checkpoints as tags and maintenance branches. [Past releases](https://github.com/BVLC/caffe/releases) are catalogued online.

#### Issues & Pull Request Protocol

Post [Issues](https://github.com/BVLC/caffe/issues) to propose features, report [bugs], and discuss framework code.
Large-scale development work is guided by [milestones], which are sets of Issues selected for bundling as releases.

Please note that since the core developers are largely researchers, we may work on a feature in isolation for some time before releasing it to the community, so as to claim honest academic contribution.
We do release things as soon as a reasonable technical report may be written, and we still aim to inform the community of ongoing development through Github Issues.

**When you are ready to develop a feature or fix a bug, follow this protocol**:

- Develop in [feature branches] with descriptive names. Branch off of the latest `master`.
- Bring your work up-to-date by [rebasing] onto the latest `master` when done.
  (Groom your changes by [interactive rebase], if you'd like.)
- [Pull request] your contribution to `BVLC/caffe`'s `master` branch for discussion and review.
  - Make PRs *as soon as development begins*, to let discussion guide development.
  - A PR is only ready for merge review when it is a fast-forward merge, and all code is documented, linted, and tested -- that means your PR must include tests!
- When the PR satisfies the above properties, use comments to request maintainer review.

The following is a poetic presentation of the protocol in code form.

#### [Shelhamer's](https://github.com/shelhamer) “life of a branch in four acts”

Make the `feature` branch off of the latest `bvlc/master`

    git checkout master
    git pull upstream master
    git checkout -b feature
    # do your work, make commits

Prepare to merge by rebasing your branch on the latest `bvlc/master`

    # make sure master is fresh
    git checkout master
    git pull upstream master
    # rebase your branch on the tip of master
    git checkout feature
    git rebase master

Push your branch to pull request it into `BVLC/caffe:master`

    git push origin feature
    # ...make pull request to master...

Now make a pull request! You can do this from the command line (`git pull-request -b master`) if you install [hub](https://github.com/github/hub). Hub has many other magical uses.

The pull request of `feature` into `master` will be a clean merge. Applause.

[bugs]: https://github.com/BVLC/caffe/issues?labels=bug&page=1&state=open
[milestones]: https://github.com/BVLC/caffe/issues?milestone=1
[Pull request]: https://help.github.com/articles/using-pull-requests
[interactive rebase]: https://help.github.com/articles/interactive-rebase
[rebasing]: http://git-scm.com/book/en/Git-Branching-Rebasing
[feature branches]: https://www.atlassian.com/git/workflows#!workflow-feature-branch

**Historical note**: Caffe once relied on a two-branch `master` and `dev` workflow.
PRs from this time are still open, but these will be merged into `master` or closed.

### Testing

Run `make runtest` to check the project tests. New code requires new tests. Pull requests that fail tests will not be accepted.

The `gtest` framework we use provides many additional options, which you can access by running the test binaries directly. One of the more useful options is `--gtest_filter`, which allows you to filter tests by name:

    # run all tests with CPU in the name
    build/test/test_all.testbin --gtest_filter='*CPU*'

    # run all tests without GPU in the name (note the leading minus sign)
    build/test/test_all.testbin --gtest_filter=-'*GPU*'

To get a list of all options `googletest` provides, simply pass the `--help` flag:

    build/test/test_all.testbin --help

### Style

- **Run `make lint` to check C++ code.**
- Wrap lines at 80 chars.
- Follow [Google C++ style](https://google.github.io/styleguide/cppguide.html) and [Google python style](https://google.github.io/styleguide/pyguide.html) + [PEP 8](http://legacy.python.org/dev/peps/pep-0008/).
- Remember that “a foolish consistency is the hobgoblin of little minds,” so use your best judgement to write the clearest code for your particular case.
@ -1,101 +0,0 @@

---
title: Deep Learning Framework
---

# Caffe

Caffe is a deep learning framework made with expression, speed, and modularity in mind.
It is developed by Berkeley AI Research ([BAIR](http://bair.berkeley.edu)) and by community contributors.
[Yangqing Jia](http://daggerfs.com) created the project during his PhD at UC Berkeley.
Caffe is released under the [BSD 2-Clause license](https://github.com/BVLC/caffe/blob/master/LICENSE).

Check out our web image classification [demo](http://demo.caffe.berkeleyvision.org)!

## Why Caffe?

**Expressive architecture** encourages application and innovation.
Models and optimization are defined by configuration without hard-coding.
Switch between CPU and GPU by setting a single flag to train on a GPU machine, then deploy to commodity clusters or mobile devices.

**Extensible code** fosters active development.
In Caffe's first year, it was forked by over 1,000 developers and had many significant changes contributed back.
Thanks to these contributors the framework tracks the state-of-the-art in both code and models.

**Speed** makes Caffe perfect for research experiments and industry deployment.
Caffe can process **over 60M images per day** with a single NVIDIA K40 GPU\*.
That's 1 ms/image for inference and 4 ms/image for learning, and more recent library versions and hardware are faster still.
We believe that Caffe is among the fastest convnet implementations available.

**Community**: Caffe already powers academic research projects, startup prototypes, and even large-scale industrial applications in vision, speech, and multimedia.
Join our community of brewers on the [caffe-users group](https://groups.google.com/forum/#!forum/caffe-users) and [Github](https://github.com/BVLC/caffe/).

<p class="footnote" markdown="1">
\* With the ILSVRC2012-winning [SuperVision](http://www.image-net.org/challenges/LSVRC/2012/supervision.pdf) model and prefetching IO.
</p>

## Documentation

- [DIY Deep Learning for Vision with Caffe](https://docs.google.com/presentation/d/1UeKXVgRvvxg9OUdh_UiC5G71UMscNPlvArsWER41PsU/edit#slide=id.p) and [Caffe in a Day](https://docs.google.com/presentation/d/1HxGdeq8MPktHaPb-rlmYYQ723iWzq9ur6Gjo71YiG0Y/edit#slide=id.gc2fcdcce7_216_0)<br>
  Tutorial presentation of the framework and a full-day crash course.
- [Tutorial Documentation](/tutorial)<br>
  Practical guide and framework reference.
- [arXiv / ACM MM '14 paper](http://arxiv.org/abs/1408.5093)<br>
  A 4-page report for the ACM Multimedia Open Source competition (arXiv:1408.5093v1).
- [Installation instructions](/installation.html)<br>
  Tested on Ubuntu, Red Hat, OS X.
- [Model Zoo](/model_zoo.html)<br>
  BAIR suggests a standard distribution format for Caffe models, and provides trained models.
- [Developing & Contributing](/development.html)<br>
  Guidelines for development and contributing to Caffe.
- [API Documentation](/doxygen/annotated.html)<br>
  Developer documentation automagically generated from code comments.
- [Benchmarking](https://docs.google.com/spreadsheets/d/1Yp4rqHpT7mKxOPbpzYeUfEFLnELDAgxSSBQKp5uKDGQ/edit#gid=0)<br>
  Comparison of inference and learning for different networks and GPUs.

### Notebook Examples

{% assign notebooks = site.pages | where:'category','notebook' | sort: 'priority' %}
{% for page in notebooks %}
- <div><a href="http://nbviewer.ipython.org/github/BVLC/caffe/blob/master/{{page.original_path}}">{{page.title}}</a><br>{{page.description}}</div>
{% endfor %}

### Command Line Examples

{% assign examples = site.pages | where:'category','example' | sort: 'priority' %}
{% for page in examples %}
- <div><a href="{{page.url}}">{{page.title}}</a><br>{{page.description}}</div>
{% endfor %}

## Citing Caffe

Please cite Caffe in your publications if it helps your research:

    @article{jia2014caffe,
      Author = {Jia, Yangqing and Shelhamer, Evan and Donahue, Jeff and Karayev, Sergey and Long, Jonathan and Girshick, Ross and Guadarrama, Sergio and Darrell, Trevor},
      Journal = {arXiv preprint arXiv:1408.5093},
      Title = {Caffe: Convolutional Architecture for Fast Feature Embedding},
      Year = {2014}
    }

If you do publish a paper where Caffe helped your research, we encourage you to cite the framework for tracking by [Google Scholar](https://scholar.google.com/citations?view_op=view_citation&hl=en&citation_for_view=-ltRSM0AAAAJ:u5HHmVD_uO8C).

## Contacting Us

Join the [caffe-users group](https://groups.google.com/forum/#!forum/caffe-users) to ask questions and discuss methods and models. This is where we talk about usage, installation, and applications.

Framework development discussions and thorough bug reports are collected on [Issues](https://github.com/BVLC/caffe/issues).

## Acknowledgements

The BAIR Caffe developers would like to thank NVIDIA for GPU donation, A9 and Amazon Web Services for a research grant in support of Caffe development and reproducible research in deep learning, and BAIR PI [Trevor Darrell](http://www.eecs.berkeley.edu/~trevor/) for guidance.

The BAIR members who have contributed to Caffe are (alphabetical by first name):
[Carl Doersch](http://www.carldoersch.com/), [Eric Tzeng](https://github.com/erictzeng), [Evan Shelhamer](http://imaginarynumber.net/), [Jeff Donahue](http://jeffdonahue.com/), [Jon Long](https://github.com/longjon), [Philipp Krähenbühl](http://www.philkr.net/), [Ronghang Hu](http://ronghanghu.com/), [Ross Girshick](http://www.cs.berkeley.edu/~rbg/), [Sergey Karayev](http://sergeykarayev.com/), [Sergio Guadarrama](http://www.eecs.berkeley.edu/~sguada/), [Takuya Narihira](https://github.com/tnarihi), and [Yangqing Jia](http://daggerfs.com/).

The open-source community plays an important and growing role in Caffe's development.
Check out the Github [project pulse](https://github.com/BVLC/caffe/pulse) for recent activity and the [contributors](https://github.com/BVLC/caffe/graphs/contributors) for the full list.

We sincerely appreciate your interest and contributions!
If you'd like to contribute, please read the [developing & contributing](development.html) guide.

Yangqing would like to give a personal thanks to the NVIDIA Academic program for providing GPUs, [Oriol Vinyals](http://www1.icsi.berkeley.edu/~vinyals/) for discussions along the journey, and BAIR PI [Trevor Darrell](http://www.eecs.berkeley.edu/~trevor/) for advice.
@ -1,82 +0,0 @@

---
title: "Installation: Ubuntu"
---

# Ubuntu Installation

### For Ubuntu (>= 17.04)

**Installing pre-compiled Caffe**

Everything, including Caffe itself, is packaged in 17.04 and higher versions.
To install the pre-compiled Caffe package, run

    sudo apt install caffe-cpu

for the CPU-only version, or

    sudo apt install caffe-cuda

for the CUDA version. Note that the CUDA version may break if your NVIDIA driver
and CUDA toolkit are not installed by APT.

[Package status of CPU-only version](https://launchpad.net/ubuntu/+source/caffe)

[Package status of CUDA version](https://launchpad.net/ubuntu/+source/caffe-contrib)

**Installing Caffe from source**

You can install all the build dependencies with a single command:

    sudo apt build-dep caffe-cpu   # dependencies for CPU-only version
    sudo apt build-dep caffe-cuda  # dependencies for CUDA version

This requires a `deb-src` line in your `sources.list`.
Continue with [compilation](installation.html#compilation).

### For Ubuntu (\< 17.04)

**General dependencies**

    sudo apt-get install libprotobuf-dev libleveldb-dev libsnappy-dev libopencv-dev libhdf5-serial-dev protobuf-compiler
    sudo apt-get install --no-install-recommends libboost-all-dev
    sudo apt-get install libgflags-dev libgoogle-glog-dev liblmdb-dev

**CUDA**: Install by `apt-get` or the NVIDIA `.run` package.
The NVIDIA package tends to follow more recent library and driver versions, but the installation is more manual.
If installing from packages, install the library and latest driver separately; the driver bundled with the library is usually out-of-date.
This can be skipped for CPU-only installation.

**BLAS**: install ATLAS by `sudo apt-get install libatlas-base-dev` or install OpenBLAS by `sudo apt-get install libopenblas-dev` or MKL for better CPU performance.

**Python** (optional): if you use the default Python you will need to `sudo apt-get install` the `python-dev` package to have the Python headers for building the pycaffe interface.

**Compatibility notes, 16.04**

CUDA 8 is required on Ubuntu 16.04.

**Remaining dependencies, 12.04**

These dependencies need manual installation in 12.04.

    # glog
    wget https://github.com/google/glog/archive/v0.3.3.tar.gz
    tar zxvf v0.3.3.tar.gz
    cd glog-0.3.3
    ./configure
    make && make install
    # gflags
    wget https://github.com/schuhschuh/gflags/archive/master.zip
    unzip master.zip
    cd gflags-master
    mkdir build && cd build
    export CXXFLAGS="-fPIC" && cmake .. && make VERBOSE=1
    make && make install
    # lmdb
    git clone https://github.com/LMDB/lmdb
    cd lmdb/libraries/liblmdb
    make && make install

Note that glog does not compile with the most recent gflags version (2.1), so until that is resolved you will need to build glog first.

Continue with [compilation](installation.html#compilation).
---
title: "Installation: Debian"
---

# Debian Installation

Caffe packages are available for several Debian versions, as shown in the
following chart:

```
Your Distro     |  CPU_ONLY  |  CUDA  | Codename
----------------+------------+--------+-------------------
Debian/oldstable|     ✘      |   ✘    | Jessie (8.0)
Debian/stable   |     ✔      |   ✔    | Stretch (9.0)
Debian/testing  |     ✔      |   ✔    | Buster
Debian/unstable |     ✔      |   ✔    | Buster
```

* `✘ ` You should take a look at the [Ubuntu installation instructions](install_apt.html).

* `✔ ` You can install Caffe with a single command by following this guide.

* [Package status of CPU-only version](https://tracker.debian.org/pkg/caffe)

* [Package status of CUDA version](https://tracker.debian.org/pkg/caffe-contrib)

Last update: 2017-07-08

## Binary installation with APT

Apart from the source-based installation methods, Debian users can install
pre-compiled Caffe packages from the official archive with APT.

Make sure that your `/etc/apt/sources.list` contains the `contrib` and `non-free`
sections if you want to install the CUDA version, for instance:

```
deb http://ftp2.cn.debian.org/debian sid main contrib non-free
```

Then update the APT cache and install Caffe directly. Note that the CPU version and
the CUDA version cannot coexist.

```
$ sudo apt update
$ sudo apt install [ caffe-cpu | caffe-cuda ]
$ caffe                                              # command line interface working
$ python3 -c 'import caffe; print(caffe.__path__)'   # python3 interface working
```

These Caffe packages should work for you out of the box. However, the CUDA version
may break if your NVIDIA driver and CUDA toolkit are not installed with APT.

#### Customizing caffe packages

Some users may need to customize the Caffe package. Full customization is beyond
the scope of this guide; here is only a brief outline of producing
customized `.deb` packages.

Make sure that there is a `deb-src` entry in your `/etc/apt/sources.list`,
for instance:

```
deb http://ftp2.cn.debian.org/debian sid main contrib non-free
deb-src http://ftp2.cn.debian.org/debian sid main contrib non-free
```

Then build the caffe deb files with the following commands:

```
$ sudo apt update
$ sudo apt install build-essential debhelper devscripts  # standard package building tools
$ sudo apt build-dep [ caffe-cpu | caffe-cuda ]          # the most elegant way to pull caffe build dependencies
$ apt source [ caffe-cpu | caffe-cuda ]                  # download the source tarball and extract
$ cd caffe-XXXX
[ ... optional, customizing caffe code/build ... ]
$ dch --local "Modified XXX"    # bump package version and write changelog
$ debuild -B -j4                # build caffe with 4 parallel jobs (similar to make -j4)
[ ... building ... ]
$ debc                          # optional, check the package contents
$ sudo debi                     # optional, install the generated packages
$ ls ../                        # optional, see the resulting packages
```

It is a BUG if the package fails to build without any change.
The changelog will be installed at e.g. `/usr/share/doc/caffe-cpu/changelog.Debian.gz`.

## Source installation

Source installation under Debian/unstable and Debian/testing is similar to that of Ubuntu, but
here is a more elegant way to pull the caffe build dependencies:

```
$ sudo apt build-dep [ caffe-cpu | caffe-cuda ]
```

Note that this requires a `deb-src` entry in your `/etc/apt/sources.list`.

#### Compiler Combinations

Some users may find that their favorite compiler doesn't work with CUDA.

```
CXX compiler |  CUDA 7.5  |  CUDA 8.0  |  CUDA 9.0  |
-------------+------------+------------+------------+
GCC-8        |     ?      |     ?      |     ?      |
GCC-7        |     ?      |     ?      |     ?      |
GCC-6        |     ✘      |     ✘      |     ✔      |
GCC-5        |   ✔ [1]    |     ✔      |     ✔      |
-------------+------------+------------+------------+
CLANG-4.0    |     ?      |     ?      |     ?      |
CLANG-3.9    |     ✘      |     ✘      |     ✔      |
CLANG-3.8    |     ?      |     ✔      |     ✔      |
```

`[1]` CUDA 7.5's `host_config.h` must be patched before working with GCC-5.

`[2]` CUDA 9.0: https://devblogs.nvidia.com/parallelforall/cuda-9-features-revealed/

Avoid the GCC-4.X series, since its `libstdc++` ABI is not compatible with GCC-5's:
you may encounter failures when linking GCC-4.X object files against GCC-5 libraries.
(See https://wiki.debian.org/GCC5 )

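To see where your toolchain falls in the table above, check the compiler's major version. The snippet below is an illustrative sketch: the version string is hardcoded so the example is self-contained, but on a real system you would feed it `$(g++ -dumpversion)` or the output of `clang++ --version`:

```shell
# Extract the major version from a GCC-style version string.
major_of() { echo "$1" | cut -d. -f1; }

# Hardcoded sample; replace "6.3.0" with "$(g++ -dumpversion)" on a real system.
major_of "6.3.0"    # prints: 6
```
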
## Notes

* Consider re-compiling OpenBLAS locally with optimization flags for the sake of
  performance. This is highly recommended for any kind of production use, including
  academic research.

* If you are installing `caffe-cuda`, APT will automatically pull some of the
  CUDA packages and the nvidia driver packages. Please be careful if you have
  manually installed or hacked the nvidia driver, the CUDA toolkit, or any other
  related components, because in that case APT may fail.

* Additionally, a manpage (`man caffe`) and a bash completion script
  (`caffe <TAB><TAB>`, `caffe train <TAB><TAB>`) are provided.
  Neither file has been merged into caffe master yet.

* The python interface is the Python 3 version: `python3-caffe-{cpu,cuda}`.
  There is no plan to support python2.

* If you encounter any problem related to the packaging system (e.g. failure to install `caffe-*`),
  please report the bug to Debian via Debian's bug tracking system. See https://www.debian.org/Bugs/ .
  Patches and suggestions are also welcome.

## FAQ

* Where is caffe-cudnn?

  The cuDNN library does not currently appear to be redistributable. If you really want
  caffe-cudnn deb packages, the workaround is to install cudnn yourself,
  hack the packaging scripts, and build your customized package.

* I installed the CPU version. How can I switch to the CUDA version?

  `sudo apt install caffe-cuda`; APT's dependency resolver is smart enough to handle this.

* Where are the examples, the models and the other documentation?

  ```
  $ sudo apt install caffe-doc
  $ dpkg -L caffe-doc
  ```
---
title: "Installation: OS X"
---

# OS X Installation

We highly recommend using the [Homebrew](http://brew.sh/) package manager.
Ideally you could start from a clean `/usr/local` to avoid conflicts.
In the following, we assume that you're using Anaconda Python and Homebrew.

**CUDA**: Install via the NVIDIA package that includes both CUDA and the bundled driver. **CUDA 7 is strongly suggested.** Older CUDA versions require `libstdc++`, while clang++ is the default compiler and `libc++` the default standard library on OS X 10.9+. This disagreement makes it necessary to change the compilation settings for each of the dependencies, which is prone to error.

**Library Path**: We find that everything compiles successfully if `$LD_LIBRARY_PATH` is not set at all, and `$DYLD_FALLBACK_LIBRARY_PATH` is set to provide CUDA, Python, and other relevant libraries (e.g. `/usr/local/cuda/lib:$HOME/anaconda/lib:/usr/local/lib:/usr/lib`).
With other `ENV` settings, things may not work as expected.

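A concrete shell setup matching the paragraph above (the exact CUDA and Anaconda paths are examples; adjust them to your installation):

```shell
# Keep LD_LIBRARY_PATH unset, and point the fallback path at CUDA,
# Anaconda Python, Homebrew, and system libraries (example paths).
unset LD_LIBRARY_PATH
export DYLD_FALLBACK_LIBRARY_PATH="/usr/local/cuda/lib:$HOME/anaconda/lib:/usr/local/lib:/usr/lib"
```
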
**General dependencies**

    brew install -vd snappy leveldb gflags glog szip lmdb
    # need the homebrew science source for OpenCV and hdf5
    brew tap homebrew/science
    brew install hdf5 opencv

If using Anaconda Python, a modification to the OpenCV formula might be needed.
Do `brew edit opencv` and change the lines that look like the two lines below to exactly the two lines below:

    -DPYTHON_LIBRARY=#{py_prefix}/lib/libpython2.7.dylib
    -DPYTHON_INCLUDE_DIR=#{py_prefix}/include/python2.7

If using Anaconda Python, HDF5 is bundled and the `hdf5` formula can be skipped.

**Remaining dependencies, with / without Python**

    # with Python pycaffe needs dependencies built from source
    brew install --build-from-source --with-python -vd protobuf
    brew install --build-from-source -vd boost boost-python
    # without Python the usual installation suffices
    brew install protobuf boost

**BLAS**: already installed as the [Accelerate / vecLib Framework](https://developer.apple.com/library/mac/documentation/Darwin/Reference/ManPages/man7/Accelerate.7.html). OpenBLAS and MKL are alternatives for faster CPU computation.

**Python** (optional): Anaconda is the preferred Python.
If you decide against it, please use Homebrew.
Check that Caffe and its dependencies are linking against the same, desired Python.

Continue with [compilation](installation.html#compilation).

## libstdc++ installation

This route is not for the faint of heart.
For OS X 10.10 and 10.9 you should install CUDA 7 and follow the instructions above.
If that is not an option, take a deep breath and carry on.

In OS X 10.9+, clang++ is the default C++ compiler and uses `libc++` as the standard library.
However, NVIDIA CUDA (even version 6.0) currently links only with `libstdc++`.
This makes it necessary to change the compilation settings for each of the dependencies.

We do this by modifying the Homebrew formulae before installing any packages.
Make sure that Homebrew doesn't install any software dependencies in the background; all packages must be linked to `libstdc++`.

The prerequisite Homebrew formulae are

    boost snappy leveldb protobuf gflags glog szip lmdb homebrew/science/opencv

For each of these formulae, run `brew edit FORMULA` and add the ENV definitions as shown:

    def install
        # ADD THE FOLLOWING:
        ENV.append "CXXFLAGS", "-stdlib=libstdc++"
        ENV.append "CFLAGS", "-stdlib=libstdc++"
        ENV.append "LDFLAGS", "-stdlib=libstdc++ -lstdc++"
        # The following is necessary because libtool likes to strip LDFLAGS:
        ENV["CXX"] = "/usr/bin/clang++ -stdlib=libstdc++"
        ...

To edit the formulae in turn, run

    for x in snappy leveldb protobuf gflags glog szip boost boost-python lmdb homebrew/science/opencv; do brew edit $x; done

After this, run

    for x in snappy leveldb gflags glog szip lmdb homebrew/science/opencv; do brew uninstall $x; brew install --build-from-source -vd $x; done
    brew uninstall protobuf; brew install --build-from-source --with-python -vd protobuf
    brew install --build-from-source -vd boost boost-python

If this is not done exactly right then linking errors will trouble you.

**Homebrew versioning**: Homebrew maintains itself as a separate git repository, and making the above `brew edit FORMULA` changes will change files in your local copy of homebrew's master branch. By default, this will prevent you from updating Homebrew using `brew update`, as you will get an error message like the following:

    $ brew update
    error: Your local changes to the following files would be overwritten by merge:
      Library/Formula/lmdb.rb
    Please, commit your changes or stash them before you can merge.
    Aborting
    Error: Failure while executing: git pull -q origin refs/heads/master:refs/remotes/origin/master

One solution is to commit your changes to a separate Homebrew branch, run `brew update`, and rebase your changes onto the updated master. You'll have to do this both for the main Homebrew repository in `/usr/local/` and the Homebrew science repository that contains OpenCV in `/usr/local/Library/Taps/homebrew/homebrew-science`, as follows:

    cd /usr/local
    git checkout -b caffe
    git add .
    git commit -m "Update Caffe dependencies to use libstdc++"
    cd /usr/local/Library/Taps/homebrew/homebrew-science
    git checkout -b caffe
    git add .
    git commit -m "Update Caffe dependencies"

Then, whenever you want to update homebrew, switch back to the master branches, do the update, rebase the caffe branches onto master and fix any conflicts:

    # Switch back to the homebrew master branches
    cd /usr/local
    git checkout master
    cd /usr/local/Library/Taps/homebrew/homebrew-science
    git checkout master

    # Update homebrew; hopefully this works without errors!
    brew update

    # Switch back to the caffe branches with the formulae that you modified earlier
    cd /usr/local
    git rebase master caffe
    # Fix any merge conflicts and commit to caffe branch
    cd /usr/local/Library/Taps/homebrew/homebrew-science
    git rebase master caffe
    # Fix any merge conflicts and commit to caffe branch

    # Done!

At this point, you should be running the latest Homebrew packages and your Caffe-related modifications will remain in place.

---
title: "Installation: RHEL / Fedora / CentOS"
---

# RHEL / Fedora / CentOS Installation

**General dependencies**

    sudo yum install protobuf-devel leveldb-devel snappy-devel opencv-devel boost-devel hdf5-devel

**Remaining dependencies, recent OS**

    sudo yum install gflags-devel glog-devel lmdb-devel

**Remaining dependencies, if not found**

    # glog
    wget https://storage.googleapis.com/google-code-archive-downloads/v2/code.google.com/google-glog/glog-0.3.3.tar.gz
    tar zxvf glog-0.3.3.tar.gz
    cd glog-0.3.3
    ./configure
    make && make install
    # gflags
    wget https://github.com/schuhschuh/gflags/archive/master.zip
    unzip master.zip
    cd gflags-master
    mkdir build && cd build
    export CXXFLAGS="-fPIC" && cmake .. && make VERBOSE=1
    make && make install
    # lmdb
    git clone https://github.com/LMDB/lmdb
    cd lmdb/libraries/liblmdb
    make && make install

Note that glog does not compile with the most recent gflags version (2.1), so until that is resolved you will need to build glog first.

**CUDA**: Install via the NVIDIA package instead of `yum` to be certain of the library and driver versions.
Install the library and latest driver separately; the driver bundled with the library is usually out-of-date.

**BLAS**: install ATLAS by `sudo yum install atlas-devel`, or install OpenBLAS or MKL for better CPU performance. For the Makefile build, uncomment and set `BLAS_LIB` accordingly, as ATLAS is usually installed under `/usr/lib[64]/atlas`.

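For the Makefile build, this typically means lines like the following in `Makefile.config` (the paths shown are the usual ATLAS locations and may differ on your system; verify them before building):

```
# In Makefile.config (illustrative; verify the paths on your system):
BLAS := atlas
BLAS_INCLUDE := /usr/include/atlas
BLAS_LIB := /usr/lib64/atlas
```
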
**Python** (optional): if you use the default Python you will need to `sudo yum install` the `python-devel` package to have the Python headers for building the pycaffe wrapper.

Continue with [compilation](installation.html#compilation).

---
title: Model Zoo
---
# Caffe Model Zoo

Lots of researchers and engineers have made Caffe models for different tasks with all kinds of architectures and data: check out the [model zoo](https://github.com/BVLC/caffe/wiki/Model-Zoo)!
These models are learned and applied for problems ranging from simple regression, to large-scale visual classification, to Siamese networks for image similarity, to speech and robotics applications.

To help share these models, we introduce the model zoo framework:

- A standard format for packaging Caffe model info.
- Tools to upload/download model info to/from Github Gists, and to download trained `.caffemodel` binaries.
- A central wiki page for sharing model info Gists.

## Where to get trained models

First of all, we bundle BAIR-trained models for unrestricted, out-of-the-box use.
See the [BAIR model license](#bair-model-license) for details.
Each one of these can be downloaded by running `scripts/download_model_binary.py <dirname>` where `<dirname>` is specified below:

- **BAIR Reference CaffeNet** in `models/bvlc_reference_caffenet`: AlexNet trained on ILSVRC 2012, with a minor variation from the version as described in [ImageNet classification with deep convolutional neural networks](http://papers.nips.cc/paper/4824-imagenet-classification-with-deep-convolutional-neural-networks) by Krizhevsky et al. in NIPS 2012. (Trained by Jeff Donahue @jeffdonahue)
- **BAIR AlexNet** in `models/bvlc_alexnet`: AlexNet trained on ILSVRC 2012, almost exactly as described in [ImageNet classification with deep convolutional neural networks](http://papers.nips.cc/paper/4824-imagenet-classification-with-deep-convolutional-neural-networks) by Krizhevsky et al. in NIPS 2012. (Trained by Evan Shelhamer @shelhamer)
- **BAIR Reference R-CNN ILSVRC-2013** in `models/bvlc_reference_rcnn_ilsvrc13`: pure Caffe implementation of [R-CNN](https://github.com/rbgirshick/rcnn) as described by Girshick et al. in CVPR 2014. (Trained by Ross Girshick @rbgirshick)
- **BAIR GoogLeNet** in `models/bvlc_googlenet`: GoogLeNet trained on ILSVRC 2012, almost exactly as described in [Going Deeper with Convolutions](http://arxiv.org/abs/1409.4842) by Szegedy et al. in ILSVRC 2014. (Trained by Sergio Guadarrama @sguada)

**Community models** made by Caffe users are posted to a publicly editable [model zoo wiki page](https://github.com/BVLC/caffe/wiki/Model-Zoo).
These models are subject to the conditions of their respective authors, such as citation and license.
Thank you for sharing your models!

## Model info format

A caffe model is distributed as a directory containing:

- Solver/model prototxt(s)
- `readme.md` containing
    - YAML frontmatter
        - Caffe version used to train this model (tagged release or commit hash).
        - [optional] file URL and SHA1 of the trained `.caffemodel`.
        - [optional] github gist id.
    - Information about what data the model was trained on, modeling choices, etc.
    - License information.
- [optional] Other helpful scripts.

This simple format can be handled through bundled scripts or manually if need be.

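For illustration, a minimal `readme.md` frontmatter covering the fields above might look like the following; every value here is a placeholder, and the field names are a sketch rather than a strict schema:

```
---
name: My Example Model
caffemodel: my_example_model.caffemodel
caffemodel_url: http://example.org/my_example_model.caffemodel
sha1: <sha1 of the .caffemodel>
caffe_commit: <commit hash used to train>
gist_id: <optional gist id>
license: unrestricted
---
```
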
### Hosting model info

Github Gist is a good format for model info distribution because it can contain multiple files, is versionable, and has in-browser syntax highlighting and markdown rendering.

`scripts/upload_model_to_gist.sh <dirname>` uploads the non-binary files in the model directory as a Github Gist and prints the Gist ID. If `gist_id` is already part of the `<dirname>/readme.md` frontmatter, it updates the existing Gist.

Try doing `scripts/upload_model_to_gist.sh models/bvlc_alexnet` to test the uploading (don't forget to delete the uploaded gist afterward).

Downloading model info is done just as easily with `scripts/download_model_from_gist.sh <gist_id> <dirname>`.

### Hosting trained models

It is up to the user where to host the `.caffemodel` file.
We host our BAIR-provided models on our own server.
Dropbox also works fine (tip: make sure that `?dl=1` is appended to the end of the URL).

`scripts/download_model_binary.py <dirname>` downloads the `.caffemodel` from the URL specified in the `<dirname>/readme.md` frontmatter and confirms its SHA1.

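The same integrity check can be done by hand. A self-contained sketch follows; the payload and its hash are stand-ins, so substitute your downloaded `.caffemodel` and the `sha1` value from the frontmatter on a real system:

```shell
# Compute a file's SHA1 and compare it to the expected value.
# "hello" is a stand-in payload so the example is self-contained.
printf 'hello' > model.bin
expected="aaf4c61ddcc5e8a2dabede0f3b482cd9aea9434d"   # sha1 of "hello"
actual=$(sha1sum model.bin | cut -d' ' -f1)
if [ "$actual" = "$expected" ]; then
    echo "SHA1 OK"
else
    echo "SHA1 mismatch" >&2
fi
rm -f model.bin
```
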
## BAIR model license

The Caffe models bundled by BAIR are released for unrestricted use.

These models are trained on data from the [ImageNet project](http://www.image-net.org/), and the training data includes internet photos that may be subject to copyright.

Our present understanding as researchers is that there is no restriction placed on the open release of these learned model weights, since none of the original images are distributed in whole or in part.
To the extent that the interpretation arises that weights are derivative works of the original copyright holder and they assert such a copyright, UC Berkeley makes no representations as to what use is allowed other than to consider our present release in the spirit of fair use in the academic mission of the university to disseminate knowledge and tools as broadly as possible without restriction.

---
title: Multi-GPU Usage, Hardware Configuration Assumptions, and Performance
---

# Multi-GPU Usage

Currently multi-GPU operation is only supported via the C/C++ paths, and only for training.

The GPUs to be used for training can be set with the `-gpu` flag on the command line of the `caffe` tool. For example, `build/tools/caffe train --solver=models/bvlc_alexnet/solver.prototxt --gpu=0,1` will train on GPUs 0 and 1.

**NOTE**: each GPU runs the batch size specified in your `train_val.prototxt`, so going from 1 GPU to 2 GPUs doubles your effective batch size. For example, if your `train_val.prototxt` specifies a batch size of 256 and you run 2 GPUs, your effective batch size is now 512. You therefore need to adjust the batch size when running on multiple GPUs and/or adjust your solver parameters, specifically the learning rate.

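The arithmetic from the note above, as a quick sketch (the numbers are the example values, not recommendations):

```shell
# Effective batch size = per-GPU batch size * number of GPUs.
batch_per_gpu=256
num_gpus=2
effective=$((batch_per_gpu * num_gpus))
echo "effective batch size: $effective"   # prints: effective batch size: 512
```
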
# Hardware Configuration Assumptions

The current implementation uses a tree reduction strategy. For example, if there are 4 GPUs in the system, 0:1 and 2:3 will exchange gradients, then 0:2 (the top of the tree) will exchange gradients; 0 will calculate the updated model, which then propagates 0->2, and then 0->1 and 2->3.

For best performance, P2P DMA access between devices is needed. Without P2P access, for example when crossing a PCIe root complex, data is copied through the host and the effective exchange bandwidth is greatly reduced.

The current implementation has a "soft" assumption that the devices being used are homogeneous. In practice, any devices of the same general class should work together, but performance and total size are limited by the smallest device being used. For example, if you combine a TitanX and a GTX980, performance will be limited by the 980. Mixing vastly different levels of boards, e.g. Kepler and Fermi, is not supported.

`nvidia-smi topo -m` will show you the connectivity matrix. You can do P2P through PCIe bridges, but not across socket-level links at this time, e.g. across CPU sockets on a multi-socket motherboard.

# Scaling Performance

Performance is **heavily** dependent on the PCIe topology of the system, the configuration of the neural network you are training, and the speed of each of the layers. Systems like the DIGITS DevBox have an optimized PCIe topology (X99-E WS chipset). In general, scaling on 2 GPUs tends to be ~1.8x on average for networks like AlexNet, CaffeNet, VGG, and GoogLeNet; 4 GPUs begin to show falloff in scaling. Generally with "weak scaling", where the batch size increases with the number of GPUs, you will see roughly 3.5x scaling. With "strong scaling", the system can become communication bound, especially with layer performance optimizations like those in [cuDNNv3](http://nvidia.com/cudnn), and you will likely see closer to mid-2.x scaling in performance. Networks that have heavy computation compared to their number of parameters tend to have the best scaling performance.