Install Pivotal Cloud Foundry on Microsoft Azure (Prerequisites)


Throughout his career as a musician, Johnny Cash filled our lives with wisdom and important life lessons, such as the idea that you can build something big one piece at a time.

In fact, many enterprises today are following Johnny’s advice when outfitting their development teams. They are building their Platform as a Service (PaaS) one piece at a time. This, however, isn’t the only option. You can help your development teams work faster and more safely today using Pivotal Cloud Foundry (PCF) on top of Microsoft Azure.


  1. Prerequisites
  2. Install
  3. Take it for a test run - Coming Soon


Many of the preliminary steps in this article come from the official documentation; here I will go into a bit more depth, with screenshots and explanations. I also intend to follow this series of posts with more articles about operationalizing the platform and the development workflow. Let’s get started.

Setting up a Pivotal Network account


Your automated PCF install will need access to the install packages and the ability to accept the EULA on your behalf. You can create an account by navigating to Pivotal Network. Simply:

  • Click on one of the Join buttons to begin
  • Fill out the form to create a login

Capture your API token


Once your account is created, click the dropdown arrow next to your name and select Edit Profile. On your profile page, scroll all the way to the bottom, where you will see your API TOKEN.

This would be a good time to start a text file to hold all the information you will need to complete the automated setup. The API TOKEN should be your first addition to this file.
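For example, you could seed the file from the command line. The filename and variable name below are just suggestions; any scratch file works:

```shell
# Create a scratch file for the values the automated install will ask for.
# The filename is only a suggestion; use any text file you like.
cat > "$HOME/pcf-install-notes.txt" <<'EOF'
PIVOTAL_API_TOKEN=<paste your API TOKEN here>
EOF
```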

Set up an Azure account

Use your existing account, or set up a new account on Microsoft Azure.

PCF will need increased resources on your subscription. In particular, you will need:

  • 53 VMs (I know what you are thinking)

    Of those 53, 15 VMs are used during the install phase to create builds and run errands such as tests. Once the deployment is finished, those 15 VMs are destroyed, leaving your install with 38 VMs under its control. We’ll dive a bit into the architecture later, and you’ll see the reason for those VMs.

  • 1 storage account - This setup is for a 90-day trial version of PCF. In an actual production install, you will utilize multiple storage accounts
  • 3 public IP addresses
  • 1 jumpbox VM that runs the deployment

You will also need to increase your core and storage account quotas on Azure. This link will show you how to request the increase. Please get the requests in the queue before continuing.

Install the Azure Command Line Interface (CLI)

For this install I chose to run the Azure CLI in a Docker container. It’s as simple as:

docker run -it microsoft/azure-cli

Log in to your Azure account through the CLI and create a Service Principal

We need to create a Service Principal (sp) for the PCF install to use. We will use our Docker azure-cli container to log in to our subscription and then create the sp using this tool. We will then transfer the output file to our local machine and upload it to Azure during the PCF install process. Let’s get started:

Run the Azure CLI Docker image

Ensure Docker is running properly in your terminal. Try: docker run hello-world. If this works, proceed to the next step. If not, get Docker set up using this link.

Now we are ready to run a container.

docker run -it microsoft/azure-cli

Install azure-sp-tool

In your docker container, execute: npm install azure-sp-tool -g

Login to Azure through the CLI

In your docker container, execute: azure login

Follow the directions in the terminal. In a browser on your local machine open the url indicated. When prompted enter the code to authenticate.

Click Continue and you will be prompted to log in to your Azure account, or simply click the user with access to your subscription if you are already logged in as that user in another tab.

Hop back over to the terminal running your Docker container. When you see login command OK, you are ready to run the sp creation tool.

Create the Service Principal

Ensure that your desired subscription is set as the default by running azure account list.

If you have multiple subscriptions, please use this link to learn how to set the desired subscription as default.

Next, run the following command in the Docker container to create your Service Principal: azure-sp-tool create-sp

Copy the credential file to the local machine

We are going to open up another terminal and use the docker cp command to copy the file created in your container to your local machine. Without exiting your Docker container, open a new terminal and type:

eval $(docker-machine env default)

docker ps

Capture the container id. In my case it is 08639683cc69. I will use that in the following command to copy the credential file from the Docker container to my home directory ~/.

docker cp 08639683cc69:/azure-credentials.json ~/
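Before moving on, it is worth sanity-checking that the copied file parses as JSON, since the install process will fail later if the credentials are truncated or garbled. A minimal check, assuming the file landed at ~/azure-credentials.json as in the docker cp command above:

```shell
# Sanity-check that the copied credential file is valid JSON.
# ~/azure-credentials.json matches the docker cp destination above;
# adjust the path if you copied the file somewhere else.
CRED_FILE="$HOME/azure-credentials.json"
if python3 -m json.tool "$CRED_FILE" > /dev/null 2>&1; then
  echo "credential file parses as valid JSON"
else
  echo "credential file is missing or malformed" >&2
fi
```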

Protect this file. Please store it in a secure location, because it contains sensitive credentials. Once the copy is complete, go ahead and exit your Docker container.
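On Linux or macOS, one way to do that is to move the file into a dedicated directory and restrict its permissions to your user. The ~/.azure location below is just a suggestion, not a requirement of the install:

```shell
# Move the credential file into a dedicated directory and lock down
# its permissions so only your user can read it.
# The ~/.azure location is only a suggestion; any secure path works.
mkdir -p "$HOME/.azure"
if [ -f "$HOME/azure-credentials.json" ]; then
  mv "$HOME/azure-credentials.json" "$HOME/.azure/azure-credentials.json"
  chmod 600 "$HOME/.azure/azure-credentials.json"
fi
```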

Create a public key

This key will be used to identify you when SSHing into the jumpbox. You must generate 2048-bit RSA public and private key files.

  • For Linux/Unix/Mac OS X - Use ssh-keygen -t rsa -b 2048
  • For Windows - Download, install, and use PuTTYgen.
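On Linux/Unix/Mac OS X, the full command might look like the sketch below. The key path is illustrative, and -N '' creates the key without a passphrase, which is convenient for scripting but less secure; in practice, consider setting one:

```shell
# Generate a 2048-bit RSA key pair for the jumpbox.
# The pcf-jumpbox filename is illustrative; -N '' means no passphrase.
mkdir -p "$HOME/.ssh"
ssh-keygen -t rsa -b 2048 -f "$HOME/.ssh/pcf-jumpbox" -N '' -q
# Restrict access to the private key.
chmod 600 "$HOME/.ssh/pcf-jumpbox"
```

This produces the private key at ~/.ssh/pcf-jumpbox and the public key at ~/.ssh/pcf-jumpbox.pub.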

Whichever method you use, protect the key files and store them in a secure location.