Blockchain: The Silver Bullet?

Blockchain!!!

If you’ve read the news, blockchains are going to be everywhere, revolutionizing every industry along the way.

But what is it really?

A blockchain is essentially an immutable distributed database. This database is shared by all parties on the network, each having their own copy, and each able to inspect it. All participants can add data and query the database, but none can change its previous state (for example, delete a previous entry). In our view it’s better to visualise it as a distributed ledger, say a land registry, rather than a database.

Immutability

This basically means that once some data has been added to the blockchain it can never be removed. Carrying on with the land registry example, one can’t erase a record, even if it’s an error. A new entry has to be added that updates the previous entry.

There are lots of resources on the web that go into great detail explaining how this immutability is achieved, but at a high level it is the participants of the network who enforce it. For any one participant to alter the chain, it would need to convince a majority of the other participants to accept its version of the chain.
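For intuition only, here is a toy PowerShell sketch of the underlying idea: each block embeds the hash of the block before it, so tampering with any earlier entry is immediately detectable. This is purely illustrative; real blockchains add consensus, proof-of-work and much more.

# Toy hash chain: each "block" stores the hash of the previous block.
# Illustration only - not how any real blockchain is implemented.
function Get-StringHash([string]$text) {
    $sha = [System.Security.Cryptography.SHA256]::Create()
    $bytes = $sha.ComputeHash([System.Text.Encoding]::UTF8.GetBytes($text))
    return [System.BitConverter]::ToString($bytes) -replace '-', ''
}

$chain = @()
$previousHash = ''
foreach ($entry in 'Lot 1 sold to Alice', 'Lot 2 sold to Bob', 'Lot 1 transferred to Carol') {
    $block = [pscustomobject]@{
        Data         = $entry
        PreviousHash = $previousHash
        Hash         = Get-StringHash ($entry + $previousHash)
    }
    $previousHash = $block.Hash
    $chain += $block
}

# Tampering with an entry is detectable: its stored hash no longer matches,
# and the next block's PreviousHash no longer lines up either.
$chain[0].Data = 'Lot 1 sold to Mallory'
$recomputed = Get-StringHash ($chain[0].Data + $chain[0].PreviousHash)
"Chain still valid? $($recomputed -eq $chain[0].Hash)"   # False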

Bitcoin

Blockchain became popular with the rise of Bitcoin, where the concept of an immutable distributed ledger enables it to function without any central authority.

This early implementation of blockchain can be referred to as Blockchain 1.0. What has got everyone really excited is Blockchain 2.0, which introduces the concept of smart contracts. A smart contract is essentially computer code that is compiled and stored on the blockchain. Users can then update its data (variables) when interacting with it. Since the blockchain ensures immutability, we can be assured that no one will alter the code or values (without leaving a paper trail).

Using this concept we can express real-world contracts between two or more parties in code, mimicking, say, what happens in a supply chain. In a traditional supply chain we have a long list of suppliers and consumers, e.g. iron ore producer -> iron manufacturer -> steel sheet maker -> automobile body builder and so on. This leads to inefficiencies, as tracking the goods and the amounts owing is currently maintained by each party in their own supply chain and billing management systems. There is also no single version of the truth, and when things don’t match, manual reconciliation needs to take place.

Using smart contracts, all participants, from the iron ore producer to the car manufacturer, can enter into a smart contract that tracks the goods from the point of origin to the eventual destination. It can also track and trigger payments (via a cryptocurrency like Bitcoin) automatically at the agreed times.

But it’s not all rosy. There are still many issues to be solved. If you’ve read the news recently, Ethereum (a public Blockchain 2.0 platform) was hacked due to a known quirk in the execution engine that a smart contract’s programmers had not handled correctly. This led to millions of dollars of funds vanishing. This is something the industry will be watching very closely.

There are also real use cases where the contract needs to obtain information from an off-chain system, e.g. a stock ticker or a time provider, to execute some transaction. Microsoft recently announced Project Bletchley, which is touted as Blockchain 3.0. This seems to offer an answer to this problem by providing a secure way for blockchains to interact with off-chain systems.

The ASX recently announced that it is experimenting with blockchains as a means for faster stock trading reconciliation. It’s still early days, but advancements in blockchain could really change the way we do business, especially in the financial sector.

World peace? Well, that may have to wait till 4.0.

 

DevOps Deployment Framework: The Rise of DODO

Here at Webjet IT we live and breathe DevOps. Our DevOps journey has been one borne out of automation and a cultural shift in how we develop and operate production environments. This first blog will outline our view on automation and how it has helped define a framework that has improved the cycle times of our pipeline management.

Automation

We strive to automate our software delivery pipeline and ensure our software quality is of the highest standard. Automation spans the build, deployment and testing pipeline, and it involves a lot of scripting. Our pipeline consists of Octopus Deploy, mostly PowerShell scripts, TeamCity builds and Azure infrastructure.

Most of the time when we write scripts around a process, it’s a process that occurs often… at times too often. For example, the deployment script for one product looks very similar to the one for another. A good example of this is when one person writes a script to automate the deployment of an IIS website, and another person comes along a few days later wanting to do the same thing. In this scenario we end up with two implementations of the same solution. The same applies to software builds, running tests, deployment scripts and many other processes.
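To make that concrete, the kind of IIS deployment helper that tends to get written over and over looks something like this (a hypothetical sketch, not our actual script):

# Hypothetical IIS deployment helper - the kind of script several people
# end up writing independently, a few days apart.
Import-Module WebAdministration

function Publish-IisWebsite {
    param(
        [string]$SiteName,
        [string]$SourcePath,
        [string]$DestinationPath,
        [int]$Port = 80
    )

    # Copy the build output to the web server
    Copy-Item -Path "$SourcePath\*" -Destination $DestinationPath -Recurse -Force

    # Create the app pool and site if they don't exist yet
    if (-not (Test-Path "IIS:\AppPools\$SiteName")) {
        New-WebAppPool -Name $SiteName | Out-Null
    }
    if (-not (Test-Path "IIS:\Sites\$SiteName")) {
        New-Website -Name $SiteName -PhysicalPath $DestinationPath -Port $Port -ApplicationPool $SiteName | Out-Null
    }

    Restart-WebAppPool -Name $SiteName
}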

Centralised Code Repository for DevOps

Our first approach to solving the duplicated workload was to house all these “helper” scripts in a DevOps code repository where IT staff could help themselves and re-use scripts other people had written.

The problem with this solution is that isolated scripts cannot really be versioned correctly. Versions cannot be enforced, and changes to a centralised script can cause breaking changes for others using the same set of scripts.

One perfect example of this was the implementation of our scripts for automated deployments of Azure Web Apps. In our DevOps code repository we started writing a set of helper scripts that we could use to interact with the Azure API to deploy Web Apps to the Azure cloud. The scripts allowed developers to pass input and get full deployment and management capabilities for the Azure web app being deployed. Our development teams could deploy their apps with single commands, perform configuration updates and run slot swaps to make applications active between staging and production slots.
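In spirit, those helpers wrapped the Azure PowerShell cmdlets with sensible defaults, along the lines of the sketch below (the function name and parameters are illustrative, not the actual Webjet scripts):

# Illustrative wrapper around the AzureRM cmdlets - names are hypothetical.
function Deploy-WebAppToSlot {
    param(
        [string]$ResourceGroup,
        [string]$WebAppName,
        [string]$Slot = 'staging'
    )

    # Create the staging slot if it doesn't exist yet
    $existing = Get-AzureRmWebAppSlot -ResourceGroupName $ResourceGroup -Name $WebAppName `
        -Slot $Slot -ErrorAction SilentlyContinue
    if (-not $existing) {
        New-AzureRmWebAppSlot -ResourceGroupName $ResourceGroup -Name $WebAppName -Slot $Slot | Out-Null
    }

    # Deployment of the package itself is elided here; once the slot is ready,
    # swap it into production
    Switch-AzureRmWebAppSlot -ResourceGroupName $ResourceGroup -Name $WebAppName `
        -SourceSlotName $Slot -DestinationSlotName 'production'
}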

Within a matter of 2-3 weeks we were already deploying about 10 web apps to production using the centralised set of scripts.

Life was good!

The Azure API is simple to use, but often one-line commands are not good enough, and defensive coding practices usually result in many more lines of code that need to be maintained. A centralised framework for these implementations was needed.

The DevOps Web App Deployment Framework was born

We were convinced that what we had was not good enough. Although the scripts were centralised in the DevOps code repository, development teams were still required to physically copy the code into their build artifacts so that it could be used. By copying code, you lose versioning.

We created a separate code repository to house the newly formed “DevOps Azure Web App Deployment Framework” and implemented tagging as a versioning mechanism.

Development teams would then use Git submodules to access the framework, rather than copying the code out. This allowed developers to pick the version “tag” of the deployment framework they wanted to use.
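Consuming the framework this way looked roughly like the following (the repository URL and tag are placeholders):

# Add the deployment framework as a submodule (URL is a placeholder)
git submodule add https://example.com/devops/webapp-deployment-framework.git tools/deployment-framework

# Pin the submodule to a released version tag of the framework
cd tools/deployment-framework
git checkout tags/v1.2.0
cd ../..
git add tools/deployment-framework
git commit -m "Pin deployment framework to v1.2.0"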

The framework quickly evolved from there, and an Azure WebJobs deployment feature was added.

Life got even better!

Development teams were consuming the framework across a number of different Azure web app and web job solutions, and it hardly required any bug fixes. Git submodules, however, introduced their own problems, and we had to think of a better approach to consuming the framework.

PowerShell modules were exactly what we needed. They are centralised, self-contained and versioned, and many different versions can live side by side on the same machine. A PowerShell module can also be imported into a single shell instance in memory at runtime, which means it does not have to be installed on a machine if you don’t want to install it.
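A couple of standard PowerShell commands show why this fits (the version number and path below are just examples):

# Multiple versions of a module can be installed side by side...
Get-Module -ListAvailable DODO | Select-Object Name, Version

# ...and a specific version can be imported into just the current session
Import-Module DODO -RequiredVersion 1.3.0

# Or load it straight from a path at runtime without installing it at all
Import-Module 'C:\tools\DODO\DODO.psd1'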

DODO was born!

     ______ __
   {-_-_= '. `'.
    {=_=_-  \   \
     {_-_   |   /
      '-.   |  /    .===,
   .--.__\  |_(_,==`  ( o)'-.
  `---.=_ `     ;      `/    \
      `,-_       ;    .'--') /
        {=_       ;=~`    `"`
         `//__,-=~`
         <<__ \\__
         /`)))/`)))

We needed to call this module something: give it a code name that would be catchy and become popular among the teams at Webjet.

The DevOps Deployment Orchestration module (DODO) was born.

The latest code from the DevOps Web App Deployment Framework was copied out into a new repository and converted to a PowerShell module.

We decided to simplify the input and make every command of DODO take a JSON object, JSON string or JSON file path. This means the user can simply use a deployment template and pass it to a DODO command which will perform the actions.

Development teams no longer had to use Git submodules and did not need the code inside their repositories. Wherever we have an Octopus Deploy Tentacle agent, we install DODO, and development teams can simplify their deployment scripts by calling one-liner commands with JSON input.

Example: Publish-DODOAzureWebApp $parameters

$parameters would be a JSON object that houses the required deployment parameters, such as the Azure subscription ID, web app name and so on.
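Putting it together, a call might look something like this (the field names are illustrative only, not DODO’s exact template schema):

# Illustrative only - the field names below are not DODO's exact schema
$parameters = @'
{
    "subscriptionId": "00000000-0000-0000-0000-000000000000",
    "resourceGroup": "my-resource-group",
    "webAppName": "my-web-app",
    "appSettings": { "Environment": "Production" }
}
'@ | ConvertFrom-Json

Publish-DODOAzureWebApp $parameters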

Azure Deployments and Infrastructure Deployments

DODO grew very quickly to support most of the Azure infrastructure.

You can spin up an entire environment including Virtual Networks, Web Apps, SQL Servers and Databases, Redis Cache, Storage Accounts, Security Groups, Load Balancers, Automation Accounts, Runbooks, Key Vaults, Cloud Services and Virtual Machines, and perform DSC operations.

DODO also supports various Windows automation features, such as installing IIS, web applications and application pools, and configuring Application Request Routing features, rules, probes and server farms.

Future of DODO

We’ve recently created a command-line version of DODO (dodo.exe), which is just a C# console application that runs DODO operations.

Because DODO is mainly written in PowerShell, the EXE simply hosts a PowerShell runspace in C# that imports DODO into a shell and runs its commands on top of the JSON files that are passed in.

The beauty of this is that development teams no longer need any knowledge of PowerShell and can simply call dodo.exe from their deployment scripts.

Example: dodo.exe C:\path\to\deployment\template.json

As we shift more and more software into the cloud, DODO will continue to target all our automation needs, possibly also focusing more on build and testing rather than only deployments.

There is also talk of potentially moving DODO into the open-source space – but who knows 🙂

The future is bright 🙂