
R & Python Web Services

Hi, Habr! Containerization is an approach to software development in which an application or service, its dependencies, and its configuration (abstracted as deployment manifest files) are packaged together into a container image. In this article, we will look at creating a Docker image and using it to launch an R shell, a Python shell, and much more. Let's get started!



A containerized application can be tested as a unit and deployed as a container instance on the host operating system (OS). Docker is an open-source project for automating the deployment of applications as portable, self-contained containers that can run in the cloud or on-premises. For more information, see here.
Microsoft Machine Learning Server is a flexible enterprise platform for scalable data analysis, building intelligent applications, and discovering business-valuable insights, with full support for Python and R. The term "operationalization" refers to deploying R and Python models and code on Microsoft Machine Learning Server as web services, and then consuming those services in client applications to improve business efficiency.

In this article, we will look at how to create a Docker image containing Machine Learning Server 9.3 using Dockerfiles, and how to use it to perform the following operations:

  1. Run an R shell.
  2. Run a Python shell.
  3. Launch Jupyter Notebook.
  4. Run the one-box configuration.
  5. Run an R web service.
  6. Run a Python web service.

Required components


Any Linux virtual machine with Docker Community Edition (CE) installed. For this article, I deployed an Ubuntu 16.04 VM and installed Docker CE.

Step 1


First, we will create an image called mlserver with Machine Learning Server 9.3 installed, using the following Dockerfile:

FROM ubuntu:16.04

RUN apt-get -y update \
    && apt-get install -y apt-transport-https wget \
    && echo "deb [arch=amd64] https://packages.microsoft.com/repos/azure-cli/ xenial main" | tee /etc/apt/sources.list.d/azure-cli.list \
    && wget https://packages.microsoft.com/config/ubuntu/16.04/packages-microsoft-prod.deb -O /tmp/prod.deb \
    && dpkg -i /tmp/prod.deb \
    && rm -f /tmp/prod.deb \
    && apt-key adv --keyserver packages.microsoft.com --recv-keys 52E16F86FEE04B979B07E28DB02C46DF417A0893 \
    && apt-get -y update \
    && apt-get install -y microsoft-r-open-foreachiterators-3.4.3 \
    && apt-get install -y microsoft-r-open-mkl-3.4.3 \
    && apt-get install -y microsoft-r-open-mro-3.4.3 \
    && apt-get install -y microsoft-mlserver-packages-r-9.3.0 \
    && apt-get install -y microsoft-mlserver-python-9.3.0 \
    && apt-get install -y microsoft-mlserver-packages-py-9.3.0 \
    && apt-get install -y microsoft-mlserver-mml-r-9.3.0 \
    && apt-get install -y microsoft-mlserver-mml-py-9.3.0 \
    && apt-get install -y microsoft-mlserver-mlm-r-9.3.0 \
    && apt-get install -y microsoft-mlserver-mlm-py-9.3.0 \
    && apt-get install -y azure-cli=2.0.26-1~xenial \
    && apt-get install -y dotnet-runtime-2.0.0 \
    && apt-get install -y microsoft-mlserver-adminutil-9.3.0 \
    && apt-get install -y microsoft-mlserver-config-rserve-9.3.0 \
    && apt-get install -y microsoft-mlserver-computenode-9.3.0 \
    && apt-get install -y microsoft-mlserver-webnode-9.3.0 \
    && apt-get clean \
    && /opt/microsoft/mlserver/9.3.0/bin/R/activate.sh

Use the docker build command to create the mlserver image from the above Dockerfile:

 docker build -f mlserver-dockerfile -t mlserver .

Verify that the mlserver image was built successfully by running the following command:

 docker images 

Run R shell


 docker run -it mlserver R 



Run Python Shell


 docker run -it mlserver mlserver-python 



Launch Jupyter Notebook


 docker run -p 8888:8888 -it mlserver /opt/microsoft/mlserver/9.3.0/runtime/python/bin/jupyter notebook --no-browser --port=8888 --ip=0.0.0.0 --allow-root 

Running the above command prints a link; open it in a browser to start working with Jupyter Notebook.



Run OneBox Configuration


After installation, Microsoft Machine Learning Server can be configured as a deployment server to host analytical web services for operationalization.

FROM ubuntu:16.04

RUN apt-get -y update \
    && apt-get install -y apt-transport-https wget \
    && echo "deb [arch=amd64] https://packages.microsoft.com/repos/azure-cli/ xenial main" | tee /etc/apt/sources.list.d/azure-cli.list \
    && wget https://packages.microsoft.com/config/ubuntu/16.04/packages-microsoft-prod.deb -O /tmp/prod.deb \
    && dpkg -i /tmp/prod.deb \
    && rm -f /tmp/prod.deb \
    && apt-key adv --keyserver packages.microsoft.com --recv-keys 52E16F86FEE04B979B07E28DB02C46DF417A0893 \
    && apt-get -y update \
    && apt-get install -y microsoft-r-open-foreachiterators-3.4.3 \
    && apt-get install -y microsoft-r-open-mkl-3.4.3 \
    && apt-get install -y microsoft-r-open-mro-3.4.3 \
    && apt-get install -y microsoft-mlserver-packages-r-9.3.0 \
    && apt-get install -y microsoft-mlserver-python-9.3.0 \
    && apt-get install -y microsoft-mlserver-packages-py-9.3.0 \
    && apt-get install -y microsoft-mlserver-mml-r-9.3.0 \
    && apt-get install -y microsoft-mlserver-mml-py-9.3.0 \
    && apt-get install -y microsoft-mlserver-mlm-r-9.3.0 \
    && apt-get install -y microsoft-mlserver-mlm-py-9.3.0 \
    && apt-get install -y azure-cli=2.0.26-1~xenial \
    && apt-get install -y dotnet-runtime-2.0.0 \
    && apt-get install -y microsoft-mlserver-adminutil-9.3.0 \
    && apt-get install -y microsoft-mlserver-config-rserve-9.3.0 \
    && apt-get install -y microsoft-mlserver-computenode-9.3.0 \
    && apt-get install -y microsoft-mlserver-webnode-9.3.0 \
    && apt-get clean \
    && /opt/microsoft/mlserver/9.3.0/bin/R/activate.sh

RUN echo $'#!/bin/bash \n\
set -e \n\
az ml admin bootstrap --admin-password "Microsoft@2018" --confirm-password "Microsoft@2018" \n\
exec "$@"' > bootstrap.sh

RUN chmod +x bootstrap.sh

EXPOSE 12800

ENTRYPOINT ["/bootstrap.sh"]
CMD ["bash"]

Create the mlserver-onebox image using the above Dockerfile:

 docker build -f mlserver-onebox-dockerfile -t mlserver-onebox .

Verify that the mlserver-onebox image was built successfully by running the following command:

 docker images 

Start the onebox container with the command:

 docker run --name mlserver-onebox-container -dit mlserver-onebox 

Check the status of the container with:

 docker logs mlserver-onebox-container 

Once the log output contains the line "All Diagnostic Tests have passed", the diagnostics have completed successfully and you can use this container as a one-box.
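If you want to gate a deployment script on that line programmatically, here is a tiny sketch. The success string is the one quoted above; in practice you would feed the function the output of docker logs, while the sample text here is made up for illustration:

```python
SUCCESS_LINE = "All Diagnostic Tests have passed"

def onebox_ready(log_text):
    # True once the bootstrap diagnostics have printed their success line
    return SUCCESS_LINE in log_text

# Illustrative log snippet; replace with the real `docker logs` output
sample_logs = "Configuring web node...\nAll Diagnostic Tests have passed\n"
print(onebox_ready(sample_logs))  # True
```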

Get the container IP address by running the following command:

 docker inspect -f '{{range .NetworkSettings.Networks}}{{.IPAddress}}{{end}}' mlserver-onebox-container
 '172.17.0.3'

And use it as a one-box:

 az login --mls --mls-endpoint "http://172.17.0.3:12800" --username "admin" --password "Microsoft@2018"
 az ml admin diagnostic run

Run R Web Service


We can also build an image with a web service pre-configured inside it, so that the service is ready to use as soon as the container starts. Here is an example of building an image with an R web service for manual transmission prediction pre-configured inside it.
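The service wraps a binomial GLM (am ~ hp + wt on mtcars), so the value it returns is a logistic-regression probability. As a rough local sketch of what predict(..., type = "response") computes, with made-up coefficients rather than the actually fitted model:

```python
import math

def manual_transmission_prob(hp, wt, coef=(18.0, 0.036, -8.0)):
    # Logistic response, mirroring R's predict(carsModel, newdata, type = "response").
    # coef = (intercept, b_hp, b_wt) are illustrative numbers, NOT the fitted values.
    intercept, b_hp, b_wt = coef
    z = intercept + b_hp * hp + b_wt * wt
    return 1.0 / (1.0 + math.exp(-z))

p = manual_transmission_prob(120, 2.8)
print(0.0 < p < 1.0)  # True: the service returns a probability
```

With these illustrative coefficients, a heavier car gets a lower probability of a manual transmission, matching the sign you would expect from the fitted model.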

FROM ubuntu:16.04

RUN apt-get -y update \
    && apt-get install -y apt-transport-https wget \
    && echo "deb [arch=amd64] https://packages.microsoft.com/repos/azure-cli/ xenial main" | tee /etc/apt/sources.list.d/azure-cli.list \
    && wget https://packages.microsoft.com/config/ubuntu/16.04/packages-microsoft-prod.deb -O /tmp/prod.deb \
    && dpkg -i /tmp/prod.deb \
    && rm -f /tmp/prod.deb \
    && apt-key adv --keyserver packages.microsoft.com --recv-keys 52E16F86FEE04B979B07E28DB02C46DF417A0893 \
    && apt-get -y update \
    && apt-get install -y microsoft-r-open-foreachiterators-3.4.3 \
    && apt-get install -y microsoft-r-open-mkl-3.4.3 \
    && apt-get install -y microsoft-r-open-mro-3.4.3 \
    && apt-get install -y microsoft-mlserver-packages-r-9.3.0 \
    && apt-get install -y microsoft-mlserver-python-9.3.0 \
    && apt-get install -y microsoft-mlserver-packages-py-9.3.0 \
    && apt-get install -y microsoft-mlserver-mml-r-9.3.0 \
    && apt-get install -y microsoft-mlserver-mml-py-9.3.0 \
    && apt-get install -y microsoft-mlserver-mlm-r-9.3.0 \
    && apt-get install -y microsoft-mlserver-mlm-py-9.3.0 \
    && apt-get install -y azure-cli=2.0.26-1~xenial \
    && apt-get install -y dotnet-runtime-2.0.0 \
    && apt-get install -y microsoft-mlserver-adminutil-9.3.0 \
    && apt-get install -y microsoft-mlserver-config-rserve-9.3.0 \
    && apt-get install -y microsoft-mlserver-computenode-9.3.0 \
    && apt-get install -y microsoft-mlserver-webnode-9.3.0 \
    && apt-get clean \
    && /opt/microsoft/mlserver/9.3.0/bin/R/activate.sh

RUN echo $'library(mrsdeploy) \n\
carsModel <- glm(formula = am ~ hp + wt, data = mtcars, family = binomial) \n\
manualTransmission <- function(hp, wt) { \n\
newdata <- data.frame(hp = hp, wt = wt) \n\
predict(carsModel, newdata, type = "response") \n\
} \n\
remoteLogin("http://localhost:12800", username = "admin", password = "Microsoft@2018", session = FALSE) \n\
api <- publishService("ManualTransmissionService", code = manualTransmission, model = carsModel, inputs = list(hp = "numeric", wt = "numeric"), outputs = list(answer = "numeric"), v = "1.0.0") \n\
result <- api$manualTransmission(120, 2.8) \n\
print(result$output("answer")) \n\
remoteLogout()' > /tmp/ManualTransmission.R

RUN echo $'#!/bin/bash \n\
set -e \n\
az ml admin bootstrap --admin-password "Microsoft@2018" --confirm-password "Microsoft@2018" \n\
/usr/bin/Rscript --no-save --no-restore --verbose "/tmp/ManualTransmission.R" \n\
exec "$@"' > bootstrap.sh

RUN chmod +x bootstrap.sh

EXPOSE 12800

ENTRYPOINT ["/bootstrap.sh"]
CMD ["bash"]

Create the rmanualtransmission image using the above Dockerfile:

 docker build -f r-manualtransmission-dockerfile -t rmanualtransmission .

Verify that the rmanualtransmission image was built successfully by running the command:

 docker images 

Start the container with the command:

 docker run --name rmanualtransmission-container -dit rmanualtransmission 

Check the status of the container with:

 docker logs rmanualtransmission-container 

After verifying that the diagnostic tests were successful and the web service is published, you can start using it.

Get the container IP address by running the following command:

 docker inspect -f '{{range .NetworkSettings.Networks}}{{.IPAddress}}{{end}}' rmanualtransmission-container
 '172.17.0.3'

You can call the manual transmission R web service, and retrieve its swagger.json definition, using curl commands:

 apt-get -y install jq

 curl -s --header "Content-Type: application/json" --request POST --data '{"username":"admin","password":"Microsoft@2018"}' http://172.17.0.3:12800/login | jq -r '.access_token'
 <access token>

 curl -s --header "Content-Type: application/json" --header "Authorization: Bearer <access token>" --request POST --data '{"hp":120,"wt":2.8}' http://172.17.0.3:12800/api/ManualTransmissionService/1.0.0
 {"success":true,"errorMessage":"","outputParameters":{"answer":0.64181252840938208},"outputFiles":{},"consoleOutput":"","changedFiles":[]}

 curl -s --header "Authorization: Bearer <access token>" --request GET http://172.17.0.3:12800/api/ManualTransmissionService/1.0.0/swagger.json -o swagger.json
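The same login-and-score flow can also be driven from Python. Here is a standard-library sketch that builds the two POST requests the curl calls above issue; the host IP is the one from the example and the token is a placeholder (the requests are shown unsent, since sending them requires a running container):

```python
import json
import urllib.request

HOST = "http://172.17.0.3:12800"

def login_request(username, password):
    # POST /login exchanges credentials for a bearer token
    body = json.dumps({"username": username, "password": password}).encode()
    return urllib.request.Request(
        f"{HOST}/login", data=body,
        headers={"Content-Type": "application/json"}, method="POST")

def score_request(token, hp, wt):
    # POST to the service endpoint with the token in the Authorization header
    body = json.dumps({"hp": hp, "wt": wt}).encode()
    return urllib.request.Request(
        f"{HOST}/api/ManualTransmissionService/1.0.0", data=body,
        headers={"Content-Type": "application/json",
                 "Authorization": f"Bearer {token}"}, method="POST")

# urllib.request.urlopen(req) would send these against a live container
req = score_request("<access token>", 120, 2.8)
print(req.get_method(), req.full_url)
```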

The swagger.json file can be used to generate a client library in any language.
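For instance, the operations a service exposes can be enumerated straight from the downloaded swagger.json with a few lines of Python. The fragment below is a minimal hand-written Swagger 2.0 sample shaped like the service's definition, not the actual file:

```python
import json

def list_operations(swagger):
    # Collect (HTTP method, path) pairs from a Swagger 2.0 "paths" section
    return sorted((method.upper(), path)
                  for path, methods in swagger.get("paths", {}).items()
                  for method in methods)

# Illustrative stand-in for the real swagger.json
sample = json.loads('''{
  "swagger": "2.0",
  "paths": {
    "/api/ManualTransmissionService/1.0.0": {"post": {"operationId": "manualTransmission"}},
    "/login": {"post": {"operationId": "login"}}
  }
}''')
print(list_operations(sample))
```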

Run Python Web Service


Below is an example of building an image with a Python web service for manual transmission prediction pre-configured inside it.

FROM ubuntu:16.04

RUN apt-get -y update \
    && apt-get install -y apt-transport-https wget \
    && echo "deb [arch=amd64] https://packages.microsoft.com/repos/azure-cli/ xenial main" | tee /etc/apt/sources.list.d/azure-cli.list \
    && wget https://packages.microsoft.com/config/ubuntu/16.04/packages-microsoft-prod.deb -O /tmp/prod.deb \
    && dpkg -i /tmp/prod.deb \
    && rm -f /tmp/prod.deb \
    && apt-key adv --keyserver packages.microsoft.com --recv-keys 52E16F86FEE04B979B07E28DB02C46DF417A0893 \
    && apt-get -y update \
    && apt-get install -y microsoft-r-open-foreachiterators-3.4.3 \
    && apt-get install -y microsoft-r-open-mkl-3.4.3 \
    && apt-get install -y microsoft-r-open-mro-3.4.3 \
    && apt-get install -y microsoft-mlserver-packages-r-9.3.0 \
    && apt-get install -y microsoft-mlserver-python-9.3.0 \
    && apt-get install -y microsoft-mlserver-packages-py-9.3.0 \
    && apt-get install -y microsoft-mlserver-mml-r-9.3.0 \
    && apt-get install -y microsoft-mlserver-mml-py-9.3.0 \
    && apt-get install -y microsoft-mlserver-mlm-r-9.3.0 \
    && apt-get install -y microsoft-mlserver-mlm-py-9.3.0 \
    && apt-get install -y azure-cli=2.0.26-1~xenial \
    && apt-get install -y dotnet-runtime-2.0.0 \
    && apt-get install -y microsoft-mlserver-adminutil-9.3.0 \
    && apt-get install -y microsoft-mlserver-config-rserve-9.3.0 \
    && apt-get install -y microsoft-mlserver-computenode-9.3.0 \
    && apt-get install -y microsoft-mlserver-webnode-9.3.0 \
    && apt-get clean \
    && /opt/microsoft/mlserver/9.3.0/bin/R/activate.sh

RUN echo $'from microsoftml.datasets.datasets import DataSetMtCars \n\
import pandas as pd \n\
from revoscalepy import rx_lin_mod, rx_predict \n\
cars_model = rx_lin_mod(formula="am ~ hp + wt", data=DataSetMtCars().as_df()) \n\
mydata = pd.DataFrame({"hp":[120],"wt":[2.8]}) \n\
def manualTransmission(hp, wt): \n\
\timport pandas as pd \n\
\tfrom revoscalepy import rx_predict \n\
\tnewData = pd.DataFrame({"hp":[hp], "wt":[wt]}) \n\
\treturn rx_predict(cars_model, newData, type="response") \n\
\n\
from azureml.deploy import DeployClient \n\
from azureml.deploy.server import MLServer \n\
from azureml.common.configuration import Configuration \n\
\n\
HOST = "http://localhost:12800" \n\
context = ("admin", "Microsoft@2018") \n\
client = DeployClient(HOST, use=MLServer, auth=context) \n\
service_name = "ManualTransmissionService" \n\
service_version = "1.0.0" \n\
service = client.service(service_name).version(service_version).code_fn(manualTransmission).inputs(hp=float, wt=float).outputs(answer=pd.DataFrame).models(cars_model=cars_model).description("Manual Transmission Service").deploy() \n\
res = service.manualTransmission(120, 2.8) \n\
print(res.output("answer"))' > /tmp/ManualTransmission.py

RUN echo $'#!/bin/bash \n\
set -e \n\
az ml admin bootstrap --admin-password "Microsoft@2018" --confirm-password "Microsoft@2018" \n\
mlserver-python /tmp/ManualTransmission.py \n\
exec "$@"' > bootstrap.sh

RUN chmod +x bootstrap.sh

EXPOSE 12800

ENTRYPOINT ["/bootstrap.sh"]
CMD ["bash"]

Create the pymanualtransmission image using the above Dockerfile:

 docker build -f py-manualtransmission-dockerfile -t pymanualtransmission .

Verify that the pymanualtransmission image was built successfully by running the command:

 docker images 

Start the container with the command:

 docker run --name pymanualtransmission-container -dit pymanualtransmission 

Check the status of the container with:

 docker logs pymanualtransmission-container 

After verifying that the diagnostic tests were successful and the web service is published, you can start using it.

Get the container IP address by running the following command:

 docker inspect -f '{{range .NetworkSettings.Networks}}{{.IPAddress}}{{end}}' pymanualtransmission-container
 '172.17.0.3'

You can call the manual transmission Python web service, and retrieve its swagger.json definition, using curl commands:

 apt-get -y install jq

 curl -s --header "Content-Type: application/json" --request POST --data '{"username":"admin","password":"Microsoft@2018"}' http://172.17.0.3:12800/login | jq -r '.access_token'
 <access token>

 curl -s --header "Content-Type: application/json" --header "Authorization: Bearer <access token>" --request POST --data '{"hp":120,"wt":2.8}' http://172.17.0.3:12800/api/ManualTransmissionService/1.0.0
 {"success":true,"errorMessage":"","outputParameters":{"answer":0.64181252840938208},"outputFiles":{},"consoleOutput":"","changedFiles":[]}

 curl -s --header "Authorization: Bearer <access token>" --request GET http://172.17.0.3:12800/api/ManualTransmissionService/1.0.0/swagger.json -o swagger.json

The swagger.json file can be used to generate a client library in any language.

NOTE: You can also modify the web node's appsettings.json settings via the Dockerfile, for example to enable LDAP / AAD authentication.

Extensions


The local Docker images you have created can be pushed to the Azure Container Registry (ACR).

From there, you can create an Azure Kubernetes Service (AKS) cluster that pulls images from ACR and scales automatically in both directions via pod autoscaling.

REFERENCES:


Source: https://habr.com/ru/post/420159/

