Docker-based TensorFlow experimental environment
Posted: Jan 12, 2017, 13:52
This series will utilize Docker and Alibaba Cloud Container Service to help you get started with TensorFlow machine learning schemes.
• Article 1 Create a TensorFlow experimental environment - this article
• Article 2 Establish a TensorFlow Serving cluster easily
• Article 3 Straighten out a TensorFlow continuous training link
Machine learning, an important AI technology, has been widely applied in fields such as computer vision, natural language processing, and medical diagnosis. TensorFlow is an open-source distributed machine learning framework launched by Google. It is also the most-watched machine learning project on GitHub, with more than 30,000 stars.
TensorFlow offers multiple installation methods and is easy to configure, but building a TensorFlow learning environment from scratch remains challenging for beginners. Fortunately, TensorFlow also provides Docker-based deployment, so developers can get started quickly.
This article is the first of the series. In it, we will quickly set up a Docker-based TensorFlow learning environment.
Prepare the Docker environment
To establish an experimental environment using Docker and Docker Compose orchestration, we need to install Docker for Mac/Windows or install Docker and Docker Compose in Linux.
Establish a local environment
There are many TensorFlow-related learning materials available on GitHub. Among them, https://github.com/aymericdamien/TensorFlow-Examples is a good tutorial: it introduces TensorFlow features through examples that progress from simple to complex.
First, run the following commands to get the tutorial code:
git clone https://github.com/aymericdamien/TensorFlow-Examples
To run this tutorial, you need to install TensorFlow's execution environment and configure "jupyter" and "tensorboard" for interactive use.
The simplest way is to create the following docker-compose.yml template in the current directory.
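The template itself is not reproduced in this excerpt. A minimal sketch that is consistent with the container list and notes later in this article (the exact mount paths and the shared log directory layout are assumptions) might look like:

```yaml
# Sketch of docker-compose.yml for the local TensorFlow learning environment.
# Assumptions: the notebook mount point and the shared host log directory
# are inferred from the container output and notes in this article.
version: '2'
services:
  jupyter:
    image: registry.cn-hangzhou.aliyuncs.com/denverdino/tensorflow:0.9.0
    ports:
      - "8888:8888"                                # Jupyter web UI
    volumes:
      - .:/root/notebooks                          # tutorial notebooks from the current directory
      - /tmp/tensorflow_logs:/tmp/tensorflow_logs  # event logs shared with TensorBoard
  tensorboard:
    image: registry.cn-hangzhou.aliyuncs.com/denverdino/tensorflow:0.9.0
    command: tensorboard --logdir /tmp/tensorflow_logs
    ports:
      - "6006:6006"                                # TensorBoard web UI
    volumes:
      - /tmp/tensorflow_logs:/tmp/tensorflow_logs  # same event logs as Jupyter
```

Both services use the same image; sharing the log directory through a common volume is what lets TensorBoard visualize the events written from Jupyter notebooks.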
Then execute the following command to create the TensorFlow learning environment in one step:
docker-compose up -d
We can check that the Docker containers have started:
yili@yili-mbp:~/work/TensorFlow-Examples$ docker-compose ps
Name Command State Ports
tensorflowexamples_jupyter_1 /run_jupyter.sh /root/note ... Up 6006/tcp, 0.0.0.0:8888->8888/tcp
tensorflowexamples_tensorboard_1 tensorboard --logdir /tmp/ ... Up 0.0.0.0:6006->6006/tcp, 8888/tcp
You can visit TensorFlow's Jupyter interactive experimental environment at http://127.0.0.1:8888/tree in the browser.
You can also visit the model visualization tool TensorBoard at http://127.0.0.1:6006 in the browser.
Note: You can open http://127.0.0.1:8888/notebooks/4_Utils/tensorboard_basic.ipynb to experiment with TensorBoard's features. The log directory configured for the TensorBoard container in this example is "/tmp/tensorflow_logs". For your own notebooks, set the log output path in the code by referring to tensorboard_basic.
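To illustrate the note above: the notebook's log path must fall under the directory that the TensorBoard container watches. A minimal sketch (the "example" subdirectory name is a placeholder, and the writer call in the comment follows the TensorFlow 0.9-era API):

```python
import os

# The TensorBoard container in this setup watches /tmp/tensorflow_logs,
# so a notebook should write its event files somewhere under it.
logs_path = os.path.join("/tmp/tensorflow_logs", "example")  # "example" is a placeholder

# Inside the notebook (TensorFlow 0.9-era API) the summary writer would then be:
#   summary_writer = tf.train.SummaryWriter(logs_path)

print(logs_path)
```

As long as every notebook writes under /tmp/tensorflow_logs, the running TensorBoard container picks up the events without any reconfiguration.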
• Specifically, registry.cn-hangzhou.aliyuncs.com/denverdino/tensorflow:0.9.0 is built from the tensorflow/tensorflow:0.9.0 image, with only Alibaba Cloud mirrors added for the APT and PyPI sources. You can also refer to the Dockerfile at https://github.com/denverdino/tensorflow-docker to build your own image, adding the Python libraries, algorithm libraries, and other resources you need in advance.
• Thanks to the volumes mechanism, Jupyter reads the examples directly from the current notebooks directory, and the Jupyter and TensorBoard containers share the event logs through file volumes.
Deploy with Alibaba Cloud Container Service
Alibaba Cloud Container Service supports deployment with Docker Compose templates. With the template below we can easily deploy the TensorFlow learning environment to the cloud.
• With the help of the aliyun.routing label, we can easily define the access endpoints of Jupyter and TensorBoard.
• If you are using an old cluster, you need to upgrade the Container Service Agent to get the required features and stability enhancements.
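The cloud template is not reproduced in this excerpt. A sketch, assuming the same image as the local setup and using illustrative subdomain names for the aliyun.routing labels:

```yaml
# Sketch of a Compose template for Alibaba Cloud Container Service.
# Assumptions: the routing subdomains (tf-jupyter, tf-tensorboard) are
# placeholders chosen for illustration, not names from the original post.
jupyter:
  image: registry.cn-hangzhou.aliyuncs.com/denverdino/tensorflow:0.9.0
  volumes:
    - /tmp/tensorflow_logs:/tmp/tensorflow_logs
  labels:
    aliyun.routing.port_8888: tf-jupyter       # exposes Jupyter via the cluster's routing service
tensorboard:
  image: registry.cn-hangzhou.aliyuncs.com/denverdino/tensorflow:0.9.0
  command: tensorboard --logdir /tmp/tensorflow_logs
  volumes:
    - /tmp/tensorflow_logs:/tmp/tensorflow_logs
  labels:
    aliyun.routing.port_6006: tf-tensorboard   # exposes TensorBoard via the routing service
```

Each aliyun.routing.port_NNNN label maps the named subdomain to the container port, so Jupyter and TensorBoard each get their own access endpoint without publishing host ports manually.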
After a few minutes, the TensorFlow learning environment is ready to use on the cloud.
With Docker and Alibaba Cloud Container Service, we can easily set up TensorFlow learning environments both locally and on the cloud. As a standard software delivery mechanism, Docker greatly simplifies the deployment and O&M of applications. Alibaba Cloud Container Service supports container orchestration with Docker Compose and provides numerous extensions that facilitate on-cloud deployment and management of container-based microservice applications.
Alibaba Cloud Container Service will also work with the high-performance computing (HPC) team to provide machine learning solutions that integrate GPU acceleration and Docker cluster management on Alibaba Cloud, in a bid to improve machine learning efficiency on the cloud.
[Dave edited the post on Jan 13, 2017 at 17:07]