1. TensorFlow-compatible libraries
https://www.tensorflow.org/versions/r0.12/get_started/os_setup
https://www.tensorflow.org/install/install_linux
2. Pre-installation actions
https://github.com/tensorflow/tensorflow/tree/master/tensorflow/tools/docker
https://developer.nvidia.com/cuda-downloads?target_os=Linux&target_arch=x86_64&target_distro=Ubuntu
http://docs.nvidia.com/cuda/cuda-installation-guide-linux/index.html#pre-installation-actions
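Before installing CUDA, the pre-installation actions in the NVIDIA guide boil down to confirming that you have a CUDA-capable GPU, a supported gcc, and the kernel headers for the running kernel. A minimal sketch of those checks (commands from the CUDA installation guide; package names vary by Ubuntu release):
lspci | grep -i nvidia                              # is a CUDA-capable GPU present?
uname -m && cat /etc/*release                       # confirm a supported 64-bit Linux
gcc --version                                       # a supported gcc must be installed
sudo apt-get install linux-headers-$(uname -r)      # headers for the running kernel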
3. Install the Nvidia driver
http://programmingmatrix.blogspot.com/2017/10/ubuntu-1604-nvidia-gpu-driver-cuda-and.html
http://docs.nvidia.com/cuda/cuda-installation-guide-linux/index.html#ubuntu-installation
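One typical route on Ubuntu 16.04 (a sketch only; nvidia-384 is just an example, pick the driver version recommended for your GPU) is to install the driver from the graphics-drivers PPA and then confirm it loads with nvidia-smi:
sudo add-apt-repository ppa:graphics-drivers/ppa
sudo apt-get update
sudo apt-get install nvidia-384
sudo reboot
nvidia-smi        # should list the GPU and the driver version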
4. Install Nvidia Docker
https://github.com/NVIDIA/nvidia-docker
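After installing the package from the repository above, a quick smoke test (per the nvidia-docker README, assuming nvidia-docker 1.x) is to run nvidia-smi inside the official CUDA image; if the GPU shows up here, containers can see the driver:
nvidia-docker run --rm nvidia/cuda nvidia-smi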
5. Test that TensorFlow links with libcudnn
workon deep_learning_3.5
python
>>> import tensorflow
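If the import completes without an error such as "ImportError: libcudnn.so.X: cannot open shared object file", TensorFlow found cuDNN. As an extra check (assuming a TensorFlow 1.x GPU build), you can ask whether the binary was built with CUDA:
python -c "import tensorflow as tf; print(tf.test.is_built_with_cuda())"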
6. Check the TensorFlow version
python -c "import tensorflow; print(tensorflow.__version__)"
7. Test that TensorFlow is using the GPU
python -c "import tensorflow as tf; sess = tf.Session(config=tf.ConfigProto(log_device_placement=True))"
8. Set up a Docker volume for persistence
https://github.com/anurag/fastai-course-1
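The fastai-course-1 repo above shows one approach; the general idea is to mount a host directory (or a named Docker volume) into the container so notebooks and data survive container restarts. A minimal sketch, where the host path ~/tf-notebooks is an example name and /notebooks is assumed to be the image's Jupyter working directory:
nvidia-docker run -it -p 8888:8888 -v ~/tf-notebooks:/notebooks gcr.io/tensorflow/tensorflow:latest-gpu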
9. How to use Nvidia Docker
https://medium.com/@ceshine/docker-nvidia-gpu-nvidia-docker-808b23e1657
GPU support (from the TensorFlow Linux install guide)
Prior to installing TensorFlow with GPU support, ensure that your system meets all NVIDIA software requirements. To launch a Docker container with NVidia GPU support, enter a command of the following format:

$ nvidia-docker run -it -p hostPort:containerPort TensorFlowGPUImage

where:
- -p hostPort:containerPort is optional. If you plan to run TensorFlow programs from the shell, omit this option. If you plan to run TensorFlow programs as Jupyter notebooks, set both hostPort and containerPort to 8888.
- TensorFlowGPUImage specifies the Docker container. You must specify one of the following values:
- gcr.io/tensorflow/tensorflow:latest-gpu, which is the latest TensorFlow GPU binary image.
- gcr.io/tensorflow/tensorflow:latest-devel-gpu, which is the latest TensorFlow GPU binary image plus source code.
- gcr.io/tensorflow/tensorflow:version-gpu, which is the specified version (for example, 0.12.1) of the TensorFlow GPU binary image.
- gcr.io/tensorflow/tensorflow:version-devel-gpu, which is the specified version (for example, 0.12.1) of the TensorFlow GPU binary image plus source code.
We recommend installing one of the latest versions. For example, the following command launches the latest TensorFlow GPU binary image in a Docker container from which you can run TensorFlow programs in a shell:

$ nvidia-docker run -it gcr.io/tensorflow/tensorflow:latest-gpu bash

The following command also launches the latest TensorFlow GPU binary image in a Docker container. In this Docker container, you can run TensorFlow programs in a Jupyter notebook:

$ nvidia-docker run -it -p 8888:8888 gcr.io/tensorflow/tensorflow:latest-gpu

The following command installs an older TensorFlow version (0.12.1):

$ nvidia-docker run -it -p 8888:8888 gcr.io/tensorflow/tensorflow:0.12.1-gpu

Docker will download the TensorFlow binary image the first time you launch it. For more details see the TensorFlow docker readme.
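Once a container launched with -p 8888:8888 is up, point a browser on the host at http://localhost:8888; newer TensorFlow images print a Jupyter login token (or a full tokenized URL) in the container's startup log, which you will need the first time you connect.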
Next Steps
You should now validate your installation (steps 5-7 above).