Public Notes: Running Python in Azure Batch on Windows using the Azure Portal

TLDR; This post explains how to set up an Azure Batch node running Windows, using installers. It also explains how to use application packages to preload Azure Batch nodes with utilities so that tasks can be simple command lines. The use case here is “run a Python script,” but the approach should apply more broadly to “install tools, distribute files, and run command lines.”

PDF Version of this article (which has a bit better formatting)

When I start experimenting with something, I do not start out with writing code to automate everything. Sometimes, I try to use a GUI to bootstrap my process, then automate when setup is correct. Why? A lot of environments, like Azure, will allow for fast cycles from the UI tools. My latest adventure took a bit of time, so I’m documenting what I did.

Here’s the context: I am developing a mid-sized example project for scalability. If everything goes to plan, the demo will show how to solve the same problem with Azure Batch and the Azure Kubernetes Service. The demo is targeting a special kind of data scientist: an actuary. Actuaries frequently write code in one of three languages: APL, Python, and R. I’m focusing on Python.

My goals:

  1. Configure everything through the Azure portal.
  2. Install Python on the batch node at startup.
  3. Use the application packages feature to deliver the Python virtual environment prior to any tasks running.
  4. Run a command line with no resource files, to make sure I can run a script in the Python virtual environment.

What follows is essentially a lab for you to follow and do the same things I did. As time marches forward, this lab’s correctness will degrade. Hit me up on LinkedIn if you catch this and I may go back and update the details.

For all the Azure pieces, try to keep things in the same region. I’ll be using East US. This isn’t necessary, but is helpful for the parts that transfer files. Staying in the same region gives better speed.

1         Create a new project in PyCharm

  1. Open up your editor; I’m using PyCharm. Details for other editors will differ. Set the location to wherever you like. I’m naming the project BatchDemo.
  2. Set up a new environment using Virtualenv. The location will be the venv directory of your project. For the base interpreter, use the one already installed on your machine.

For me, the dialog looks like this:


  1. Click on Create.
  2. Add a new file to the project in the BatchDemo directory.
  3. Add the library to use table storage.
    1. Select File → Settings → Project: BatchDemo → Project Interpreter.
    2. In the list of packages, you’ll see a plus sign ‘+’. Click on that.
    3. Select azure-cosmosdb-table. Click on Install Package.
    4. Close the Available Packages window once done. My machine looked like this before I closed the window:


  1. Click OK on the Settings Window. You should now have a number of packages installed.
  2. Add the following code to the file. The code is fairly “Hello, world!”-ish: it adds one row to a Table Storage table named tasktable. (The libraries come from the azure.cosmosdb namespace but interact with Storage, so no, the naming is not intuitive.)
from azure.cosmosdb.table.tableservice import TableService
import datetime

def main():
    table_name = 'tasktable'
    table_service = TableService(
        account_name="<your Azure Storage Account Name>",
        account_key="<your Azure Storage Account Key>")

    if not table_service.exists(table_name):
        table_service.create_table(table_name)

    task = {
        'PartitionKey': 'tasks',
        'RowKey': str(datetime.datetime.utcnow()),
        'description': 'Do some task'
    }
    table_service.insert_entity(table_name, task)

if __name__ == "__main__":
    main()
In the account_name and account_key values, plug in the name of one of your Azure Storage accounts and a corresponding key. If you need to create a storage account, instructions are here. To get the keys, look in the same doc (or click here) and follow the instructions. If you create a new storage account, use a new resource group and name the resource group BatchPython. We’ll use that group name later too.

One last comment here: for a production app, you really should use Key Vault. The credentials are being handled this way to keep the concept count reasonably low.
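As a small step in that direction, you can at least keep the secrets out of the source file by reading them from environment variables. This is a sketch of my own; the names AZURE_STORAGE_ACCOUNT and AZURE_STORAGE_KEY are just a convention I picked, not anything the SDK requires:

```python
import os

def get_storage_credentials():
    """Read the storage account name and key from environment variables
    instead of hardcoding them in the script."""
    account_name = os.environ.get("AZURE_STORAGE_ACCOUNT")
    account_key = os.environ.get("AZURE_STORAGE_KEY")
    if not account_name or not account_key:
        raise RuntimeError(
            "Set AZURE_STORAGE_ACCOUNT and AZURE_STORAGE_KEY before running.")
    return account_name, account_key
```

Set the two variables in your shell (or, later, populate them from Key Vault) and pass the returned pair to TableService.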

Test the code by running it. You should be able to look at a table named tasktable in your storage account and see the new row. The RowKey is the current timestamp, which should be a unique enough key for our purposes.
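One caveat on that uniqueness claim: if two writers ever insert into the same partition at the same instant, identical timestamp keys would collide. A hedge I sometimes use (my own variation, not part of the script above) is to append a short random suffix:

```python
import datetime
import uuid

def make_row_key():
    """Timestamp plus a short random suffix: still roughly sortable by
    insertion time, but collision-resistant across concurrent writers."""
    stamp = datetime.datetime.utcnow().isoformat()
    return "{}_{}".format(stamp, uuid.uuid4().hex[:8])
```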

Once you have all this working and tested, let’s look at how to run this simple thing in Azure Batch. Again, this is for learning how to do some simple stuff via a Hello, World.

2         Create a batch account

In this step, we are going to create a Batch account, which I’ll refer to as batch_account from here on; your name will be different, so substitute your own string where needed.

  1. In the Azure portal, click on Create a resource.
  2. Search for Batch Service. Click on Batch Service, published by Microsoft.
  3. Click on Create.
  • Account name: For the account name, enter in batch_account [remember, this is a string you need to make up, then reuse. You’re picking something unique. I used scseelybatch]
  • Resource Group: Select BatchPython. If you didn’t create this earlier, select Create New.
  • Select a storage account to use with batch. You can use the same one you created to test the table insertion.
  • Leave the other defaults as is.
  • Click on Create.

3         Upload the Python Installer

Upload the Python installer which you want to use. I used the Windows x86-64 executable installer from here.

  1. In your storage account, create a container named installers.
    1. In the Azure Portal, navigate to your Storage Account.
    2. Select Blob Service → Browse Blobs.
    3. Click on + Container.
    4. Set the Name to installers.
    5. Click on OK.
  2. Once created, click on the installers container.
  3. Upload the Python installer from your machine.
    1. Click on Upload.
    2. In the Upload blob screen, point to your installer and click on Upload.
    3. Wait until the upload completes.
  4. Get a SAS URL for the installer.
    1. Right click on the uploaded file.
    2. Select Generate SAS.
    3. Set the expiration of the token to some time in the future. I went for 2 years in the future.
    4. Click on Generate blob SAS token and URL


Copy the Blob SAS URL. Store that in a scratch area. You’ll need it in a bit.
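Since an expired SAS token is a common cause of mysterious start-task failures later on, it can be worth sanity-checking the se (signed expiry) parameter in the URL you just copied. A small stdlib-only sketch, assuming the expiry uses the full date-time form the portal generates:

```python
import datetime
from urllib.parse import urlparse, parse_qs

def sas_expiry(sas_url):
    """Extract the 'se' (signed expiry) parameter from a SAS URL and
    return it as a datetime, e.g. 2020-07-19T21:33:00Z."""
    query = parse_qs(urlparse(sas_url).query)
    expiry = query["se"][0]
    return datetime.datetime.strptime(expiry, "%Y-%m-%dT%H:%M:%SZ")
```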

4         Create a Batch Application

  1. Going back to your machine, go to the BatchDemo directory which contains your file along with the virtual environment. Zip up BatchDemo and everything inside it into a zip file. [Mine is about 12 MB in size.]
  2. Open up your list of Resource Groups in the portal. Click on BatchPython.
  3. Click on your Batch account.
  4. Select Features → Applications.
  5. Click on Add.
    1. Application id: BatchPython
    2. Version: 1.0
    3. Application package: Select the zip file you created earlier.
    4. Click on OK.

The file will upload. When complete, you’ll have 1/20 applications installed.

  1. Click on BatchPython.
  2. Set Default Version to 1.0.
  3. Click on Save.
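Looking ahead to the automation goals in the last section, the zip from step 1 above does not have to be built by hand; the standard library can produce it. A minimal sketch, assuming you run it from the directory that contains BatchDemo:

```python
import shutil

def package_app(project_dir, archive_base):
    """Zip up the project directory (including the venv) so it can be
    uploaded as a Batch application package. Returns the .zip path."""
    return shutil.make_archive(archive_base, "zip", root_dir=project_dir)
```

For example, package_app("BatchDemo", "BatchDemo") produces BatchDemo.zip next to where you ran the script.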

5         Create a Batch Pool

  1. Open up your list of Resource Groups in the portal. Click on BatchPython.
  2. Click on your Batch account.
  3. Select Features → Pools.
  4. Click on Add.
    • Pool ID: Python
    • Publisher: MicrosoftWindowsServer
    • Offer: WindowsServer
    • Sku: 2016-Datacenter
    • Node pricing tier: Standard D11_v2 [Editorial: When experimenting, I prefer to pick nodes with at least 2 cores. 1 for the OS to do its thing, 1 for my code. I’ll do one core for simple production stuff once I have things working. This is particularly important to allow for effective remote desktop/SSH. The extra core keeps the connection happy.]
    • Target dedicated nodes: 1
    • Start Task: Enabled
    • Start Task/Command Line: We want Python installed for all users, available on the Path environment variable, and with no UI during installation.

python-3.6.6-amd64.exe /quiet InstallAllUsers=1 PrependPath=1 Include_test=0

  • Start Task/User identity: Task autouser, Admin
  • Start Task/Wait for success: True
  • Start Task/Resource files:
    • Blob Source: Same as the URL you saved from the piece labeled “Upload the Python Installer”. The SAS token is necessary.
    • File Path: python-3.6.6-amd64.exe
    • Click on Select
  • Optional Settings/Application packages
    • Click on Application Packages.
    • Application: BatchPython
    • Version: Use default version
    • Click on Select
  • Click on OK.

The pool will be created. In my experience, creating the pool and getting the node ready can take a few minutes. Wait until the node appears as Idle before continuing.


6         Run a job

  1. Open up your list of Resource Groups in the portal. Click on BatchPython.
  2. Click on your Batch account.
  3. Select Features → Jobs.
  4. Click on Add
    1. Job ID: AddARow
    2. Pool: Python
    3. Job manager, preparation, and release tasks:
      1. Mode: Custom
      2. Job Manager Task:
        1. Task ID: AddARowTask
        2. Command line:


Note on the environment variable: application packages are zip files. Batch unzips the package on the node and puts its location into an environment variable whose name depends on whether the task references the default version or a specific version: AZ_BATCH_APP_PACKAGE_<applicationId> for the default version, and AZ_BATCH_APP_PACKAGE_<applicationId>#<version> for a specific version on Windows nodes.



  • Click on Select
  • Click on OK
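To make the environment-variable note above concrete, here is a local simulation of how a task command line can be composed against the variable Batch sets on a Windows node. The BatchDemo\venv layout and the your_script.py name are my assumptions about how the zip was built, not anything Batch mandates:

```python
import os

def app_package_var(app_id, version=None):
    """Name of the environment variable Batch sets on a Windows node with
    the location of an unzipped application package:
      default version  -> AZ_BATCH_APP_PACKAGE_<applicationId>
      specific version -> AZ_BATCH_APP_PACKAGE_<applicationId>#<version>"""
    name = "AZ_BATCH_APP_PACKAGE_" + app_id
    return name + "#" + version if version else name

def task_command_line(app_id, relative_script, version=None):
    """Compose the command a task would run: the venv's python.exe plus a
    script, both inside the unzipped package."""
    root = os.environ[app_package_var(app_id, version)]
    python_exe = root + r"\BatchDemo\venv\Scripts\python.exe"
    return python_exe + " " + root + "\\BatchDemo\\" + relative_script
```

On a real node you never set the variable yourself; Batch populates it before the task's command line is interpreted.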

The job should start running immediately. Because it’s a short task, it’ll finish quickly too. Click on Refresh and you’ll probably see that the AddARowTask has completed.


You can then verify the output by opening up the table and looking at the rows. A new one should be present. I expect a row created near 21:33 on July 19; the time is recorded as UTC, and I’m in Redmond, WA, USA, which is 7 hours behind UTC.


That view is courtesy of the Azure tools present in Visual Studio 2017.

7         So, what next?

Now that you’ve done all this, what does it mean? For your batch pools, you could preload them with a common toolset. The resource files you pass in to a job can be files to operate on, independent of binaries. Your tasks’ start times can be greatly reduced by loading the prerequisites early. Could you do this with custom VMs? Sure, but then you need to keep the VMs patched. This mechanism lets you use an already-patched VM image and just install your few extra items.

This is definitely a toy example, meant to show how to do the initial setup in the portal. Here’s what you want to do for automation purposes:

  1. Script all of this.
  2. For the Python piece, add a mechanism to create the zip file after you have a successful build and test.
  3. Script the management of the binaries, creating the Batch Service, and configuring the pools and application package(s).
  4. Add an integration test to validate that the pool is ready to run.
  5. Minimize the number of secrets in the code to 0 secrets. Use Key Vault to manage that stuff.