Thoughts on Day 3 of Money 2020, Europe

Walking the show floor at Money 2020, one sees lots of payment providers. It's no shock, then, that today I saw a lot of talks on how payments work. This is in keeping with the theme of the show: the future of money. Over the previous two days, I heard a fair amount of talk about moving away from paper money, what to do about fiat currency versus other currencies, and the difficulty of managing payments. Today, we heard about what is possible with payments.

The facts keep stacking up around us: people prefer to pay electronically instead of with plastic or currency, and those electronic payments are happening in apps:

The first three are phone OS capabilities; the last two are apps on the phone. Electronic payments make a number of things better for consumers, retailers, and credit institutions (banks, credit card companies, etc.). Consumers get convenience. Retailers see less friction at checkout. Credit institutions get reduced fraud. So far, so good, right? Well, what I learned after this was a lot more interesting: if you are just reducing payment friction, you are leaving a lot of opportunity on the table. Opportunity for:

  • Learning about the customer. You are already collecting their buying habits; imagine what else you could learn if you did more to help.
  • Getting retailers to use your payment system. You know a lot about your customers and what they like to buy. You can now refer them to other retailers, services, and so on. Use that to convince retailers that if they use your system, you will drive more interested customers their way.
  • Delighting your customers in novel ways. WeChat has delighted its Chinese users by making it easier to send Hong Bao (a monetary gift in a red envelope) to others. Their users love this feature. Here is what the growth has looked like year over year (numbers are from a presentation by Ashley Guo of WeChat):

[Chart: year-over-year growth of WeChat Hong Bao]

You can also look at Ant Financial/Alipay. Despite the name, they are not a payment company; they are a marketing company. You use them to schedule doctor appointments, figure out how to get around by public transportation or taxi, manage vacations, discover information about products, and so on. And yes, when the service is performed or the goods are purchased, they also make sure the vendor gets paid by you. But they make it all seamless. The product has been so successful in China that it has turned the tier 1 and tier 2 cities into largely cashless areas. The services are so popular with the Chinese user base that the apps are used around the world, from high-end retailers down to businesses like Burger King.

Both WeChat and Alipay emphasized that they use the data to market to users better. The users like the targeting. When traveling, they discover attractions and restaurants that appeal to them because the app knows them so well. The businesses are happy to participate because they acquire customers who might not have found them otherwise.

What I saw today was a lot of companies thinking about how to make transactions easier by working with banks and credit card companies to remove plastic from your life. This is great. I look forward to the day when my wallet no longer bulges because of all the cards I need to carry.

I also saw something wonderful and scary: a world where things generally improve for me if I let artificial intelligence and machine learning see everything I do. By knowing what I eat, where I go, and so on, maybe the algorithms can warn me to start doing some things (walk more) and stop doing others (keep it to two coffees a day). Scary, because I worry about what would happen if all that data were combined in some nefarious way. For example, if the algorithm senses that I spend money to get out of a depression, maybe it seizes on this and drives my spending up using insight salespeople can only wish for. Or I get denied a job because the data leaks that I buy [something the employer wants to look out for: alcohol, cigarettes, etc.].

In all, a very interesting day around payments.

Thoughts on Day 2 of Money 2020, Europe

Money 2020 has many conference tracks. Given the number of questions I have seen from customers about distributed ledger technology (DLT) (Blockchain, Hyperledger Fabric, Ethereum, Corda, etc.), I attended that track. In the track, there were a fair number of panels staffed by users of DLT, consortiums looking to make their use cases more ubiquitous, and implementers of DLT. While each group had a different lens on what is going on, they tended to agree on what problems DLT solves and how to use the technology.

For finance, DLT helps eliminate a lot of the verification/validation work that happens in the middle and back office.

This has been framed as the “Do you see what I see?” (DYSWIS, pronounced “diss-wiss”) problem. Past attempts at solving DYSWIS have involved things like cryptographic signatures, where two parties compare signatures of the data. Those solutions fail for the reason signature comparison usually fails: the parties do not normalize the data the same way before signing, so the same facts produce different signatures. DLT solutions address this in different ways, but they all have mechanisms to sign facts and achieve consensus about the facts.
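
As a tiny illustration of the normalization trap (my own sketch, not something shown at the conference): two parties describing the same trade, but serializing the fields in a different order, will never produce matching digests unless they first agree on a canonical form.

# Same fact, two serializations, two different digests
echo '{"trade":"ABC-123","amount":100,"currency":"EUR"}' | sha256sum
echo '{"currency":"EUR","amount":100,"trade":"ABC-123"}' | sha256sum
# Agree on a canonical form first (here: jq sorting the keys) and the digests match
echo '{"trade":"ABC-123","amount":100,"currency":"EUR"}' | jq -cS . | sha256sum
echo '{"currency":"EUR","amount":100,"trade":"ABC-123"}' | jq -cS . | sha256sum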

Anyhow, back to the main point about saving time confirming that what I see matches what you see. The finance industry has used a strong central authority for smaller, contract-less transactions in the form of Visa, Mastercard, and others. Then, we get to more complex transactions. How complex? Consider this scenario:

A European importer purchases goods from an East African exporter. The importer prefers to pay when the goods arrive; the exporter prefers to be paid when the goods ship. Neither gets their preferred terms because of the risk involved. The importer's bank needs to issue a letter of credit. The shipper, sitting between importer and exporter, orchestrates the movement of the goods through several partners, who in turn may use other partners. The exporter also insures the goods against loss until delivery. It is normal for such a shipment to pass through around 30 entities and involve around 200 transactions [info from a presentation by TradeIX.com].

How does DLT help here? Using DLT, the importer, exporter, bank, shipper, and insurers can all see what is happening in near real time (within minutes). Because facts are attested to and sent digitally, human transcription errors disappear. This means that humans only need to verify that the numbers and such look “right” before allowing their end of a transaction to proceed, which frees up human capital for more valuable tasks.

So, one question you might ask yourself is “which DLT is right for me?” More than a few of the C-level folks on panels said a variant of “I don’t care. I just want something that works.” For those of you who care about the details and optimal choices, understand this: if you are joining a DLT consortium and it doesn’t use what you consider to be the best technology, you just need to build something that works with the consortium's choice. If you complicate things by creating translation layers between something like Corda and Ethereum, expect to be looking for a new job tomorrow (because you’ve been fired).

The great news here is that businesses now understand how to apply DLT. They have found that their normal transaction volumes of around 200 TPS are already handled by most enterprise DLT solutions. They also understand the difference between on-chain and off-chain data, so they don't put PII and other GDPR-prohibited data on the chain.

Over and over again, I heard the C-level folks say “I want DLT for the use cases where I spend a lot of time verifying that data was input correctly because that work costs too much time and slows down the business.” Then, using those facts, they want to drive cost savings elsewhere. The instant verification of the truth reduces financial risk. The reduced financial risk means the business can now make decisions sooner to further improve their ability to move money, settle accounts, and so on.

In 2018 you will hear about a number of implementations of DLT in a number of markets. At this time, it seems prudent to be familiar with the leading contenders in the space, the likes of which I named above (Hyperledger Fabric, Ethereum, Corda, and so on).

This will be an exciting year for DLT.

Thoughts on Day 1 of Money 2020, Europe

I just finished day 1 of Money 20/20 Europe. I stuck mainly to the large sessions and the show floor. What I saw was a repeated vision of what this group in finance sees as the next set of important things to be tackled. Everything they are doing revolves around the customer and making things better for customers. Where you sit in the financial ecosystem determines which pieces you are building and which pieces you are integrating.

From the banking side, we heard from many folks. I took the most notes from the talks by Ralph Hamers (CEO of ING Group) and Andy Maguire (Group COO at HSBC). After those two, the themes repeated, which only confirmed that their visions weren't unique. Because banks already have the balance sheets and other nuts and bolts of a banking business, their vision is to provide a banking platform that other businesses can plug into. Any workable platform must be open: competitors need to be able to plug into it just as easily as partners. This lets the bank stay good at what it knows while partners fill the gaps with expertise the bank does not have, so that it can participate in new opportunities more easily. For example, many banks are finding success by going into geographies where their customers only interact with them through a digital experience: no human-to-human interaction more than 99% of the time. To do this, they craft their platform and their onboarding experience to be as easy to use as possible. Several banks talked about reducing integration times with their platforms from months down to weeks. These efforts are paying off, allowing the banks to interact with more customers in more countries.

From the FinTech side of the house (which for this conference so far means “everyone else,” even though I know this leaves out the personal finance folks), I saw a lot of interesting technology, much of it focused on a few areas with interesting takes on how to accomplish the goals. I saw a lot of distributed ledger technology (aka blockchain) with implementations that have already gone live. It wasn't clear to me how blockchain is being leveraged, but tomorrow promises a number of talks around the “what” and “how.” The show also has a number of folks presenting different ways to prove your identity. Many of these still focus on two factors for authenticating while avoiding passwords, PIN codes, and the like. The primary mechanisms here are:

  1. A biometric. The two most commonly cited are fingerprint and face.
  2. A smartphone.

So, yes, the argument that goes “What about people from [some part of the world that they think doesn't have Android or Apple phones]?” is not under consideration. In the countries where the banks operate, they know that most of their customers have smartphones.

The final thing I noticed is that AI came up a bunch, and it was all nebulous to the speakers. When I asked some of the AI firms on the floor, the sales folks knew that they have data scientists and that those people build and maintain the models. AI/ML is being applied to Know Your Customer (KYC) and Anti-Money Laundering (AML) work as well as fraud detection. Given the sales process, my guess is that the people who need the tech will talk to those who make it and then have their engineers hash out the nitty-gritty of integration. I'm definitely looking forward to learning more there.

I also spent a bit of time on the show floor. Because it's banking, a lot of the vendors create solutions that run in the client's data center or in the cloud. For those folks, I'd like to let you know that you should look at joining the Azure Marketplace. It can give your customers who run in Azure ease of deployment, and it is fairly handy for VM-only deployments. Contact me and I can help you get on board.

Copying files from a Docker container onto local machine

This past week, I’ve spent time wiping away my ignorance of containers. To do this, I started in my usual way:

  1. Buy a bunch of books. Probably too many.
  2. Work through books, doing exercises as I go.

The first book I’m running through is Using Docker: Developing and Deploying Software with Containers by Adrian Mouat. I’m posting this bit now to hopefully help others.

When working through the Chapter 3 exercise to back up the Redis database, I ran the backup command:

docker run --rm --volumes-from myredis -v $PWD/backup:/backup debian cp /data/dump.rdb /backup/

This then emits the error message:

C:\Program Files\Docker\Docker\Resources\bin\docker.exe: Error response from daemon: Drive has not been shared.
 See 'C:\Program Files\Docker\Docker\Resources\bin\docker.exe run --help'.

This happens because I never shared the C: drive with Docker. To fix it, right-click the Docker icon in your system tray and select Settings…. Then select Shared Drives and check the drive(s) on your system that you want containers to be able to use. [Screenshot: Docker Settings, Shared Drives]

Upon clicking Apply, enter your credentials. The command should now work.

One other note: I found that the command did not work right in cmd.exe or in some bash shells, but it worked just fine from a PowerShell window. PowerShell expands the $PWD variable in the volume mapping to a Windows-style path; cmd.exe does not know $PWD at all, and the Unix-style paths some bash shells produce can confuse the mapping.
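
If you want to run it from cmd.exe anyway, substituting cmd's %cd% variable for $PWD should do the trick. This is an untested sketch of the same backup command, not something from the book:

rem cmd.exe variant: %cd% expands to the current directory (assumes the drive is already shared)
docker run --rm --volumes-from myredis -v %cd%/backup:/backup debian cp /data/dump.rdb /backup/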

.NET Fx version to Azure Cloud Service Mapping

Posting this here mostly for me so I can find this easily again:

https://docs.microsoft.com/en-us/azure/cloud-services/cloud-services-guestos-update-matrix

I’m monitoring this URL, waiting for .NET 4.7 support to appear. I’m hopeful that we’ll see something early in 2017 Q4, but I won’t be holding my breath either 😉

Azure OS Family 5 changes to RDP/Remote Desktop prevent logins on short passwords

TL;DR: Azure OS Family 5 requires Remote Desktop passwords of at least 10 characters. Anything shorter will cause your login to fail, with Remote Desktop repeatedly asking you to re-enter your password.

I ran into an issue when upgrading an Azure application from OS Family 4 to OS Family 5. We have RDP configured for our development deployments, and as part of that deployment we had configured special passwords for each environment. Those passwords were long enough when we added them a few years ago: 8 and 9 characters. OS Family 5 (Windows Server 2016) requires that passwords be at least 10 characters long.

As a result, we found that the deployment went fine (no errors reported) but that we simply couldn't log in after the upgrade. Looking at the portal, we noted that you must have a password of at least 10 characters to add Remote Desktop from the portal. We counted the characters in our passwords, adjusted the lengths, and found we could log in again.
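
For reference, the OS family is selected in the service configuration (.cscfg). Here is a minimal sketch of the change that triggers the new requirement; the service name is a placeholder, not our real configuration:

<!-- Bumping osFamily from 4 to 5 moves the roles to Windows Server 2016,
     which enforces the 10-character minimum on the RDP account password -->
<ServiceConfiguration serviceName="MyCloudService" osFamily="5" osVersion="*">
  <!-- role and Remote Desktop settings unchanged -->
</ServiceConfiguration>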

(Re)claiming My Blue Badge

TL;DR: Microsoft offered me a position on the Windows Azure Service Bus team and I took it. I’m ex-Microsoft and I reclaim my blue badge on February 11, 2013.

Longer version: From 2000 to 2006, I worked at Microsoft on MSDN and later on Indigo (WCF). The family loved living in Washington state and I loved my job at Microsoft. However, my wife and I don't ever want to look back at life and see ourselves doing things we know we will regret. One of the things we were starting to regret was not letting our kids get to know their extended family. In 2006, my wife and I chose to return to the Midwest so that our three children (then 10, 5, and 3 years old) could get to know their cousins, aunts, uncles, and grandparents. Since 2006, we've been able to attend graduations and weddings, and generally got to visit family whenever the spirit moved us. We got to know everyone in our extended family quite well. As happens so quickly, the kids in all the families have gotten older, other activities occupy more of their time, and that has made it harder for all of us to get together. Essentially, Thanksgiving works great; everything else is a crapshoot.

Over the last 2 years, getting together just got tougher, so my family reevaluated our goals and wants. We decided we wanted to go back to the Pacific Northwest and I figured that, if I’m going to move there, why not work for Microsoft again? One of the teams I was interested in was the Windows Azure Service Bus team. They had an opening and after a nice, long day of interviews, they decided to take a risk on an RD and Integration MVP. I really clicked with the team, so I accepted the offer. This choice also allows me to work on one of the largest scale systems in the world on a product that ships on an Internet cadence. I’m extremely excited about this opportunity and can’t wait to get into the code.

I plan to continue recording courses for Pluralsight on weekends and evenings; the authoring/teaching bug bit me back in 1998, and Pluralsight provides a great way to scratch that itch.

New work machine

Back in 1999, I officially gave up on the desktop computer. Since then, my personal machine has always been a desktop-replacement-quality laptop. I enjoy being able to take a powerful box wherever I go. This past December, I felt a need to get a portable machine that supported Windows 8 with multi-touch, and I'm floored by how light a desktop replacement can be! I wound up with a Lenovo X230 tablet. The thing is small: a 12.5” screen. I equipped it with a slice battery so that I can work a full day away from a power source. You can also easily upgrade the box to make it a wonderful workhorse. I picked up the i5 configuration with the basic memory and HDD. About two hours after receiving the unit from Lenovo, the machine had:

  • 16GB RAM (Crucial)
  • 256GB mSATA SSD boot disk (Crucial)
  • 512GB Samsung SSD
  • Screen protector
  • Windows 8

When in its docking station, the machine drives a 27” Planar touch screen over DisplayPort and a second regular 27” Acer monitor over a USB to DVI display adapter. For the past several weeks, I’ve been using this setup to get stuff done wherever I go. I’m impressed with how small and light the X230 is. Travelling with this little machine has been pleasant. It’s easy to get work done with it on a plane, including writing code. This machine also runs virtual machines like a champ, which has been helpful for me to get my experimentation done and in just learning new stuff.

I will acknowledge that this laptop is not for everyone. For me, it met some important requirements:

  1. Support multiple HDDs: I frequently rebuild my system due to the amount of beta software I tend to run. Keeping apps on one disk, data on another means I just need to reinstall my apps—the data is automatically available.
  2. Support a lot of memory: I use VMs a lot. 16 GB seems to be a good min bar for support, though I would have preferred 32 GB as is supported on the W520.
  3. Weighs little: I wanted something light. I'm getting older, and the W520 kills my back when I carry it in a backpack. The X230 is just tiny, and the power supply is super small too!
  4. Airplane friendly: I like to write code on planes. The W520 wasn’t comfortable to use in coach. The X230 is alright in those small seats.
  5. Docking station: I don’t want to think about reconnecting monitors, USB, keyboard, mouse, microphone, and more when I want to sit at a desk with bigger screens to get “big things” done. Most of the light and portable machines don’t support docking stations. The X230 does.

Given what is coming out for ultralight laptops over the next 6 months, the X230 still looks like a great option. If you are doing Win8 development and need a touch device, or just want a nice, light development machine, I highly recommend this little beauty.

Revisiting REST Versioning

I was recently asked what my opinion was on REST versioning, a few years after having written http://www.informit.com/articles/article.aspx?p=1566460 and after having recorded some stuff for Pluralsight on versioning as well. The questions were general purpose enough that I thought I’d share my answer on the blog. Here are the questions and my answers:

1. Given 2.5 years since the article, have you seen any shift toward one or the other method in the industry?

What I've seen in practice is that people only change the URL for breaking changes. They try like crazy to keep the same endpoint for everything, including new functionality. I have seen a lot of uptake of the WebAPI bits released in .NET 4.5. Some companies have gone nuts on the ability to negotiate content types, and this is for applications at big companies with thousands to millions of customers.

For APIs meant to be consumed by less process-oriented folks, I see more APIs that just use JSON. The API owner then documents things for internal SDK development teams. What appears to happen is that the internal SDK teams “test” the docs by building SDKs in .NET, Ruby, PHP, Java, Objective-C, and so on. When this phase is done, the QA'd REST documents and the resulting SDK documentation are published. Development of an SDK seems to be done to make API adoption easier and to reduce the amount of support needed to get API consumers up and running.

If I were leading a REST API development project, I would design a good structure, document whatever the team did for the REST API, and then lean on the SDK as the only well-supported mechanism for accessing the API. This lets the team build a nice SDK without worrying about whether everyone can understand the REST documentation. The reality is that most developers do not know, and do not want to fully understand, the intricacies of content negotiation, cache headers, and so on. They just want to build software. Your job is to worry about these things, and a good SDK makes it easier for all users to do the right things.

2. If you were building a new API today what direction would you go?

I'm reading this as a SOAP question as well as a “how do I build a REST endpoint today?” question. The answer is: it depends. If the API will be consumed by internal clients, SOAP gives me a faster way to build things and, since the app is internal, it's highly likely that transaction consistency, security, and an RPC-like calling convention fit well with the existing needs. I'd still version endpoints as much as I could, but the reality is that most places roll out new versions of systems in lockstep because of business and regulatory concerns that versioning mechanisms simply cannot address. Oftentimes, the reason for the new version is new requirements that make the old version obsolete.

For external APIs, I'd only ever use HTTP-based APIs. Then the question is obviously: do you create new media types and use content negotiation, do you use new endpoints for version changes, or do you use something else? My preference today is to use new endpoints and worry about wiring things up correctly under the covers. Doing this lets me monitor usage of each version with existing HTTP log-scraping tools and see which URLs are being used most heavily. For everything else (how RESTful I am versus just using HTTP as an RPC mechanism, data types, payloads, and so on), I'd stick with the SDK as the preferred method of interacting with the service. I value good design, but I hate arguing about things like whether or not the ETags are configured correctly. The SDK documents the API team's decisions where those arguments happened and lets everyone else just use things.
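
To make the two options concrete, here is a sketch against a hypothetical api.example.com service (the host, paths, and media type names are illustrative, not from any real API):

# Version in the URL: breaking changes get a new endpoint, and ordinary HTTP logs
# show exactly which versions are still being called.
curl https://api.example.com/v1/orders/42
curl https://api.example.com/v2/orders/42

# Version via content negotiation: one endpoint, and the client asks for a
# versioned media type in the Accept header.
curl -H "Accept: application/vnd.example.order.v2+json" https://api.example.com/orders/42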

3. Do you know of any particularly good resources on the topic, off the top of your head, that don't come up in Google and Bing?

Actually, no. It seems like a lot of great stuff was written about designing REST APIs and what I’ve found on the various search engines all seems pretty decent. Today, tools like WebAPI from Microsoft and others make it easy to do the right things as an API developer.

Body Scanners, Fluoroscopes, and the TSA

From the 1920s through about 1960, shoe stores used an amazing device to sell shoes: the fluoroscope. Depending on the year, the pitch behind the device changed: it helped fit the shoe better, it revealed any issues in your foot, or it was just cool to see the bones in your foot move. The fluoroscope achieved this magic through x-rays. By the 1950s, people understood that being exposed to lots of x-rays is really bad for you; overexposure increases the odds of getting various cancers. In 1957, Pennsylvania started a pattern of governments banning the devices (http://www.smithsonianmag.com/history-archaeology/Heres_Looking_at_You_Kids.html). The fears were along the lines of the following:

  1. Growing people (kids) shouldn’t be exposed to this many x-rays.
  2. The salespeople were exposed to lots of x-rays, day after day.

Think about #2. According to http://www.orau.org/ptp/collection/shoefittingfluor/shoe.htm, people got radiation burns and cancers from lots of exposures.

Many shoe salespersons put their hands into the x-ray beam to squeeze the shoe during the fitting. As a result, one saleswoman who had operated a shoe fitting fluoroscope 10 to 20 times each day over a ten year period developed dermatitis of the hands. One of the more serious injuries linked to the operation of these machines involved a shoe model who received such a serious radiation burn that her leg had to be amputated (Bavley 1950).

Interestingly enough, the Transportation Security Administration has installed thousands of x-ray machines as full-body scanners. Frequent travelers are getting x-rayed several times a week, and TSA employees stand near the machines for hundreds of scans per day. Because of concerns about the health effects, a case was brought and decided against the TSA: the agency needs to figure out whether these machines are safe. Our executive branch needs to enforce this ruling, but so far it has chosen not to.

If you have some time today, I recommend that you go to https://petitions.whitehouse.gov/petition/require-transportation-security-administration-follow-law/tffCTwDd and sign the petition to get the executive branch to carry out the decision of the judicial branch.
