
Thoughts on Day 1 of Money 20/20 Europe

I just finished day 1 of Money 20/20 Europe. I stuck mainly to the large sessions and to the show floor. What I saw was a repeated vision of what this group in finance sees as the next set of important problems to tackle. Everything they are doing revolves around the customer and making things better for customers. Where you sit in the financial ecosystem determines which pieces you are building and which pieces you are integrating.

From the banking side, we heard from many folks. I took the most notes from the talks by Ralph Hamers (CEO of ING Group) and Andy Maguire (Group COO at HSBC). After these two, the themes repeated, which only confirmed that their visions weren’t unique. Because banks already have the balance sheets and other nuts and bolts of a banking business, their vision is to provide a banking platform that other businesses can plug into. Any workable platform must be open: competitors need to be able to plug into it just as easily as partners. This lets the bank stay good at what it knows while partners fill the gaps with the wide variety of expertise the bank does not have, so that it can participate in new opportunities more easily. For example, many banks are finding success by going into geographies where their customers interact with them only through a digital experience: no human-to-human interaction over 99% of the time. To do this, they craft their platform and their onboarding experience to be as easy to use as possible. Several banks talked of work to reduce integration times with their platforms from months down to weeks. These efforts are paying off, allowing the banks to interact with more customers in more countries.

From the FinTech side of the house (which for this conference so far means “everyone else”, even though I know this leaves out the personal finance folks), I saw a lot of interesting technology. A lot of it focused on a few areas, all with interesting takes on how to accomplish the goals. I saw a lot of distributed ledger technology (aka blockchain) with implementations that have already gone live. It wasn’t clear to me how blockchain is being leveraged, but tomorrow promises a number of talks around the “what” and “how”. The show also has a number of folks presenting different ways to prove your identity. Many of these still rely on two factors for authentication, and many avoid passwords, PIN codes, and the like. The two factors of choice are:

  1. A biometric. The two most commonly cited are fingerprint and face.
  2. A smart phone.

So, yes, the argument that goes “What about people from [some part of the world that they think doesn’t have Android or Apple phones]?” is not under consideration. In the countries where these banks operate, they know that most of their customers have smart phones.

The final thing I noticed is that AI came up a bunch, and it was all nebulous to the speakers. When I asked some of the AI firms on the floor, the sales folks knew that they have data scientists and that those people build and maintain the models. AI/ML is being applied to Know Your Customer/Anti-Money Laundering work as well as fraud detection. Given the sales process, my guess is that the people who need the tech will talk to those who make it and then leave the nitty-gritty integration discussions to their engineers. I’m definitely looking forward to learning more there.

I also spent a bit of time on the show floor. Because it’s banking, a lot of the vendors create solutions that run in the client data center OR the cloud. For those folks: take a look at joining the Azure Marketplace. It gives your customers who run in Azure ease of deployment, and it is fairly handy even for VM-only deployments. Contact me and I can help you get on board.


Copying files from a Docker container onto the local machine

This past week, I’ve spent time wiping away my ignorance of containers. To do this, I started in my usual way:

  1. Buy a bunch of books. Probably too many.
  2. Work through books, doing exercises as I go.

The first book I’m running through is Using Docker: Developing and Deploying Software with Containers by Adrian Mouat. I’m posting this bit now to hopefully help others.

When working through the exercise in Chapter 3 to back up the Redis database, I ran this command:

docker run --rm --volumes-from myredis -v $PWD/backup:/backup debian cp /data/dump.rdb /backup/

This then emits the error message:

C:\Program Files\Docker\Docker\Resources\bin\docker.exe: Error response from daemon: Drive has not been shared.
 See 'C:\Program Files\Docker\Docker\Resources\bin\docker.exe run --help'.

This happens because I never shared the C drive with Docker. To fix it, right-click the Docker icon in your system tray and select Settings… . Then select Shared Drives and check the drive(s) on your system that you want Docker to be able to use.

Upon clicking Apply, enter your credentials. The command should now work.
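As an aside (my addition, not something from the book): if you only need the one file and the source container is running, docker cp can copy straight out of the container without the helper container and volume plumbing:

docker cp myredis:/data/dump.rdb ./backup/

Behavior around volumes and stopped containers has varied across Docker versions, so verify this against yours.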

One other note: I found that the command did not work right from cmd.exe or from some bash shells, but it worked just fine from a PowerShell window.
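My guess at the cause (an assumption on my part, not something I have confirmed): $PWD is a shell variable that PowerShell and bash expand, but cmd.exe does not, so cmd.exe hands Docker the literal string $PWD/backup. Adjusting the variable per shell should line things up:

# PowerShell: $PWD expands to the current directory
docker run --rm --volumes-from myredis -v $PWD/backup:/backup debian cp /data/dump.rdb /backup/

REM cmd.exe: use %cd% instead
docker run --rm --volumes-from myredis -v %cd%/backup:/backup debian cp /data/dump.rdb /backup/

The bash failures are likely MSYS-style path mangling in Git Bash, which rewrites paths like /backup into Windows paths before Docker sees them.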



.NET Fx version to Azure Cloud Service Mapping

Posting this here mostly for me so I can find this easily again:

I’m monitoring this URL, waiting for .NET 4.7 support to appear. I’m hopeful that we’ll see something early in 2017 Q4, but I won’t be holding my breath either 😉


Azure OS Family 5 changes to RDP/Remote Desktop prevent logins on short passwords

TL;DR: Azure OS Family 5 requires Remote Desktop passwords of at least 10 characters. Anything shorter will cause your login to fail, with the RDP client repeatedly asking you to re-enter your password.

I ran into an issue when upgrading an Azure application from OS Family 4 to OS Family 5. We configure RDP for our development deployments and, as part of that, had set special passwords for each environment. Those passwords were long enough by the standards of a few years ago, when we created them: 8 and 9 characters. OS Family 5 (Windows Server 2016) requires passwords of at least 10 characters.

As a result, we found that the deployment went fine (no errors reported) but that we simply couldn’t log in after the upgrade. Looking at the portal, we noted that adding Remote Desktop from there requires a password of at least 10 characters. We counted the characters in our passwords, adjusted the lengths, and found we could log in again.
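For reference, the OS family is selected in ServiceConfiguration.cscfg; moving from 4 to 5 is the one-attribute change sketched below (a trimmed sketch with a made-up service name, not our actual file). Note that the RDP password itself is stored certificate-encrypted in the RemoteAccess plugin settings, which may be why nothing flagged the length problem at deployment time:

<ServiceConfiguration serviceName="MyCloudService" osFamily="5" osVersion="*">
  <Role name="WebRole1">
    <ConfigurationSettings>
      <Setting name="Microsoft.WindowsAzure.Plugins.RemoteAccess.Enabled" value="true" />
      <Setting name="Microsoft.WindowsAzure.Plugins.RemoteAccess.AccountUsername" value="..." />
      <Setting name="Microsoft.WindowsAzure.Plugins.RemoteAccess.AccountEncryptedPassword" value="..." />
    </ConfigurationSettings>
  </Role>
</ServiceConfiguration>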


(Re)claiming My Blue Badge

TL;DR: Microsoft offered me a position on the Windows Azure Service Bus team and I took it. I’m ex-Microsoft and I reclaim my blue badge on February 11, 2013.

Longer version: From 2000 to 2006, I worked at Microsoft on MSDN and later on Indigo (WCF). The family loved living in Washington state and I loved my job at Microsoft. However, my wife and I don’t ever want to look at life and see ourselves doing things that we know we will regret. One of the things we were starting to regret was not letting our kids get to know their extended family. In 2006, my wife and I chose to return to the Midwest so that our three children (then 10, 5, and 3 years old) could get to know their cousins, aunts, uncles, and grandparents. Since 2006, we’ve been able to attend graduations and weddings, and generally got to visit family whenever the spirit moved us. We got to know everyone in our extended family quite well. As happens quickly, though, the kids have gotten older, other activities occupy more of their time, and getting all of us together has become harder. Essentially, Thanksgiving works great; everything else is a crap shoot.

Over the last 2 years, getting together just got tougher, so my family reevaluated our goals and wants. We decided we wanted to go back to the Pacific Northwest and I figured that, if I’m going to move there, why not work for Microsoft again? One of the teams I was interested in was the Windows Azure Service Bus team. They had an opening and after a nice, long day of interviews, they decided to take a risk on an RD and Integration MVP. I really clicked with the team, so I accepted the offer. This choice also allows me to work on one of the largest scale systems in the world on a product that ships on an Internet cadence. I’m extremely excited about this opportunity and can’t wait to get into the code.

I plan to continue recording courses for Pluralsight on weekends and evenings; the authoring/teaching bug bit me back in 1998, and Pluralsight provides a great way to scratch that itch.


New work machine

Back in 1999, I officially gave up on the desktop computer. Since then, my personal machine has always been a desktop-replacement-quality laptop. I enjoy being able to take a powerful box wherever I go. This past December, I felt a need to get a portable machine that supported Windows 8 with multi-touch. I’m floored by how light a desktop replacement can be! I wound up with a Lenovo X230 tablet. The thing is small: a 12.5” screen. I equipped it with a slice battery so that I can work a full day away from a power source. You can also easily enhance the box to make it a wonderful workhorse. I picked up the i5 configuration with the basic memory and HDD. About 2 hours after receiving the unit from Lenovo, the machine had:

  • 16GB RAM (Crucial)
  • 256GB mSATA SSD boot disk (Crucial)
  • 512GB Samsung SSD
  • Screen protector
  • Windows 8

When in its docking station, the machine drives a 27” Planar touch screen over DisplayPort and a second regular 27” Acer monitor over a USB to DVI display adapter. For the past several weeks, I’ve been using this setup to get stuff done wherever I go. I’m impressed with how small and light the X230 is. Travelling with this little machine has been pleasant. It’s easy to get work done with it on a plane, including writing code. This machine also runs virtual machines like a champ, which has been helpful for me to get my experimentation done and in just learning new stuff.

I will acknowledge that this laptop is not for everyone. For me, it met some important requirements:

  1. Support multiple HDDs: I frequently rebuild my system due to the amount of beta software I tend to run. Keeping apps on one disk and data on another means I just need to reinstall my apps; the data is automatically available.
  2. Support a lot of memory: I use VMs a lot. 16 GB seems to be a good minimum bar, though I would have preferred the 32 GB that the W520 supports.
  3. Weigh little: I wanted something light. I’m getting older and the W520 kills my back when I carry it in a backpack. The X230 is just tiny, and the power supply is super small too!
  4. Airplane friendly: I like to write code on planes. The W520 wasn’t comfortable to use in coach. The X230 is alright in those small seats.
  5. Docking station: I don’t want to think about reconnecting monitors, USB, keyboard, mouse, microphone, and more when I want to sit at a desk with bigger screens to get “big things” done. Most of the light and portable machines don’t support docking stations. The X230 does.

Given what is coming out for ultralight laptops over the next 6 months, the X230 still looks like a great option. If you are doing Win8 development and need a touch device, or just want a nice, light development machine, I highly recommend this little beauty.


Revisiting REST Versioning

I was recently asked for my opinion on REST versioning, a few years after having written an article on the topic and after having recorded some Pluralsight content on versioning as well. The questions were general purpose enough that I thought I’d share my answers on the blog. Here are the questions and my answers:

1. Given 2.5 years since the article, have you seen any shift toward one or the other method in the industry?

What I’ve seen in practice is that people only change the URL for breaking changes. They try like crazy to always use the same endpoint for everything, including new functionality. I have seen a lot of uptake of the WebAPI bits released in .NET 4.5. Some companies have gone NUTS on the ability to negotiate content types, including in applications for big companies with thousands to millions of customers.

For APIs meant to be consumed by less process-oriented folks, I see more APIs that just use JSON. The API owner then documents things for internal SDK development teams. What appears to happen is that the internal SDK teams ‘test’ the docs by building SDKs in .NET, Ruby, PHP, Java, Objective-C, and so on. When this phase is done, the QA’d REST documents and the resulting SDK documentation are published. Development of an SDK seems to be done to make API adoption easier and to reduce the amount of support needed to get API consumers up and running.

If I were leading a REST API development project, I would design a good structure, document whatever the team did for the REST API, and then lean on the SDK as the only well-supported mechanism for accessing the API. This move would let the team build a nice SDK without worrying about making sure everyone can understand the REST documentation. The reality is that most developers do not know and do not want to fully understand the intricacies of content negotiation, cache headers, and so on. They just want to build software. Your job is to worry about these things, and a good SDK makes it easier for all users to do the right things.

2. If you were building a new API today, what direction would you go?

I’m reading this as a SOAP question as well as a “how do I build a REST endpoint today?” question. The answer is: it depends. If I need the API to be consumed by internal endpoints, SOAP gives me a faster way to build things and, since the app is internal, it’s highly likely that transaction consistency, security, and an RPC-like calling convention fit in well with the existing needs. I’d still version endpoints as much as I could, but the reality is that most places roll out new versions of systems in lockstep due to business and regulatory concerns that versioning schemes simply do not address. Oftentimes, the reason for the new version is new requirements that make the old version obsolete.

For external APIs, I’d only ever use HTTP-based APIs. Then, the question is obviously: do you create new media types and use content negotiation, do you use new endpoints for version changes, or do you use something else? My preference today is to use new endpoints and worry about wiring things up correctly under the covers. Doing this allows me to monitor usage of each version with existing HTTP log-scraping tools and to see which URLs are being used most heavily. For everything else (how RESTful I am vs. just using HTTP as an RPC mechanism, data types, payloads, and so on), I’d stick with building SDKs as the preferred method to interact with the service. I value good design, but I hate arguing about things like whether or not the ETags are configured correctly. The SDK documents the API team’s decisions where those arguments happened and lets everyone else just use things.
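To make the “new endpoints for version changes” preference concrete, here is a minimal sketch in node.js using Express (my own illustration; the route and field names are hypothetical, not from any project mentioned above):

var express = require('express');
var app = express();

// Both versions wire up to the same internals under the covers.
function loadCustomer(id, includeHistory) {
    return { id: id, history: includeHistory ? [] : undefined };
}

// v1 of the API.
var v1 = express.Router();
v1.get('/customers/:id', function (req, res) {
    res.json(loadCustomer(req.params.id, false));
});

// v2 made a breaking change, so it lives at a new URL prefix.
var v2 = express.Router();
v2.get('/customers/:id', function (req, res) {
    res.json(loadCustomer(req.params.id, true));
});

// Each version gets its own URL prefix.
app.use('/v1', v1);
app.use('/v2', v2);
app.listen(8080);

Because each version has its own prefix, any HTTP log-scraping tool can report exactly which version every request hit.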

3. Do you know of any particularly good resources on the topic, off the top of your head, that don’t come up in Google and Bing?

Actually, no. It seems like a lot of great stuff was written about designing REST APIs and what I’ve found on the various search engines all seems pretty decent. Today, tools like WebAPI from Microsoft and others make it easy to do the right things as an API developer.


Body Scanners, Fluoroscopes, and the TSA

From the 1920s through about 1960, shoe stores used an amazing device to sell shoes: the fluoroscope. Depending on the year, the reason behind the device changed: it helped you fit the shoe better, it revealed any issues in your foot, it was cool to see the bones in your foot move. The fluoroscope achieved this magic through x-rays. By the 1950s, people understood that heavy x-ray exposure was really bad for you: it increases the odds of various cancers. In 1957, Pennsylvania started a pattern of governments banning the devices. The fears were along the lines of the following:

  1. Growing people (kids) shouldn’t be exposed to this many x-rays.
  2. The salespeople absorbed lots of x-rays.

Think about #2. Contemporary accounts describe the results: people got radiation burns and cancers from repeated exposure.

Many shoe salespersons put their hands into the x-ray beam to squeeze the shoe during the fitting. As a result, one saleswoman who had operated a shoe fitting fluoroscope 10 to 20 times each day over a ten year period developed dermatitis of the hands. One of the more serious injuries linked to the operation of these machines involved a shoe model who received such a serious radiation burn that her leg had to be amputated (Bavley 1950).

Interestingly enough, the Transportation Security Administration has installed thousands of x-ray machines as full body scanners. Frequent travelers are getting x-rayed several times a week. TSA employees stand next to these machines for hundreds of scans per day. Because of concerns about the health effects, a case was brought and decided against the TSA: the agency needs to figure out whether these machines are safe. Our executive branch needs to enforce this ruling, but so far it has chosen not to.

If you have some time today, I recommend that you go sign the petition to get the executive branch to carry out the decision of the judicial branch.


Validating a WRAP ACS token in node.js

A few friends and I are building a system for home automation. Specifically, it is an application that opens and closes a garage door. One of the design decisions was to write the server side in node.js but to use Azure when it made sense. One of the Azure features we are using is the Access Control Service. When a client presents a token, you need to make sure that the signature on that token is valid. That turns out to be fairly interesting if you are new to node.js; I fit that model well. After a lot of tinkering and learning, I was able to write a function that validates a wrap_access_token using node.js and its standard libraries. Here is the code, in its entirety. I include some ‘test’ code as well to allow others to verify results. I’ve already rotated the ACS signing key so that I don’t breach security too badly. This whole thing works surprisingly well.

In case you can’t read the code too well, here is what it does:

1. Parse the token into its constituent parts.

2. Pass the wrap_access_token to the function, along with the associated key.

Within the function:

1. Remove the signature part from the token since we need to verify that we generate the same signature. Since the signature is generated based on the bytes that precede it, the signature can’t be part of itself (this part is obvious when you think of it; the hard part is remembering to think of it!)

2. Unescape the signature and remember the base64 version of the signature, which is really just a byte array.

3. Generate the SHA256 HMAC signature using the shared secret/key.

4. Verify that the base64 encoding of the digest that we generated matches the one that was sent in.

5. If the signature passed in matches the one we generated, then the other entity knows the secret and can be trusted to have signed the token.

6. Party on, because the claims are valid.

The code would next need to split out the claims. The claims are just form-encoded key value pairs within the wrap_access_token. That step is left as an exercise for the reader.

var crypto = require('crypto');
var querystring = require('querystring');

function ValidateToken(token, key) {
    var hmacToken = '&HMACSHA256=';
    var indexOfToken = token.indexOf(hmacToken) + hmacToken.length;
    // The URL-escaped, base64-encoded signature sits at the end of the token.
    var swtSignature = querystring.unescape(token.substr(indexOfToken, token.length - indexOfToken));
    // Everything before &HMACSHA256= is the piece that was signed.
    var signedPiece = token.substr(0, indexOfToken - hmacToken.length);
    // The ACS signing key is itself base64 encoded.
    var keyBytes = Buffer.from(key, 'base64');
    var hmac = crypto.createHmac('sha256', keyBytes);
    hmac.update(signedPiece);
    var digest = hmac.digest('base64');
    return digest === swtSignature;
}

var theToken = ''; // paste the WRAP response body here
var theData = querystring.parse(theToken, '&', '=');
var theKey = 'Bn7TfLML5wK+R5TAa2VrO/9JANwuk3lzt/ykc4no+h0=';

console.log(ValidateToken(theData.wrap_access_token, theKey));
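Since claim splitting is left as an exercise, here is a minimal sketch of that last step (my addition, not from the original code):

var querystring = require('querystring');

// The claims are just form-encoded key/value pairs inside the wrap_access_token.
function ExtractClaims(accessToken) {
    var claims = querystring.parse(accessToken, '&', '=');
    delete claims.HMACSHA256; // the signature itself is not a claim
    return claims;
}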


REST Presentation at Chicago Software Development Community in Oakbrook

Thanks again to everyone who showed up for my presentation on REST at the Microsoft Store in Oakbrook. I’ve posted the slides and demos here. It was a great time. I’ve never presented in a store before, and I’ve never had part of the “audience” just there to shop, either. It was an interesting, unique experience, to say the least! I also enjoyed the conversations afterwards.
