Ethereum DevOps with Truffle, TestRPC & Visual Studio Team Services

I have been working on automating the compilation and testing of Ethereum solidity contracts, via the use of Truffle. I’ve got the test results being published back into the portal, allowing me to see on each commit if my code still compiles and passes my tests.


I’m assuming you already have a local Truffle project that you want to set up continuous builds & testing for. If you need the tooling installed first, follow my tutorial on installing Truffle & TestRPC on Windows.

My final system lets you run “truffle test” locally and see the standard test output, while the build server modifies the test runner to emit the results in JUnit format.
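For context, here is the kind of Mocha-style test that “truffle test” runs. This is a minimal sketch assuming the MetaCoin contract and its getBalance() function from Truffle’s default example project; substitute your own contracts and assertions.

// test/metacoin.js: a minimal Truffle test (Mocha style), assuming the
// MetaCoin example contract that ships with Truffle's default project.
var MetaCoin = artifacts.require("MetaCoin");

contract("MetaCoin", function (accounts) {
  it("puts 10000 MetaCoin in the first account", function () {
    // deployed() resolves to the instance migrated onto the test network
    return MetaCoin.deployed().then(function (instance) {
      return instance.getBalance.call(accounts[0]);
    }).then(function (balance) {
      assert.equal(balance.valueOf(), 10000, "10000 wasn't in the first account");
    });
  });
});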

The Build system

The system uses the Visual Studio Team Services (VSTS) build engine to automate this. You can sign up for free, and get unlimited private Git repos.
The code can be hosted on any Git provider: within VSTS itself, or on GitHub, Bitbucket, etc.

Prepare truffle.js

A pre-step is to define the mocha test-reporter section in the truffle.js file:

mocha: {
  reporter: "spec",
  reporterOptions: {
    mochaFile: 'junitresults.xml'
  }
}

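For reference, here is what a complete truffle.js might look like with that section in place. This is a minimal sketch: the networks section (pointing at a local TestRPC instance on its default port 8545) is an assumption based on a typical Truffle setup and isn’t required for the JUnit reporting itself.

// truffle.js: a sketch of the complete file. The networks section assumes
// a local TestRPC instance on the default port 8545; only the mocha section
// is required for the JUnit reporting described in this post.
module.exports = {
  networks: {
    development: {
      host: "localhost",
      port: 8545,
      network_id: "*" // match any network id
    }
  },
  mocha: {
    reporter: "spec",
    reporterOptions: {
      mochaFile: 'junitresults.xml'
    }
  }
};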

Create a build agent

VSTS does provide hosted build agents, which are generic and can build standard .NET projects, Xamarin, etc. But because we are going to use npm packages installed globally on the box to handle the Truffle builds, we need to create our own build agent:

  • Create a new Windows VM (Can be your own hosted server, or Azure).
    e.g. Windows Server 2016 Datacentre edition on Azure
  • Install the VSTS build agent. Instructions at https://www.visualstudio.com/en-us/docs/build/admin/agents/v2-windows
    Note: DON’T select to run the service as NT AUTHORITY\NETWORK SERVICE; this will not work with TestRPC (it needs to open ports).
    Run the service as another user, or as NT AUTHORITY\SYSTEM
  • Install chocolatey
    https://chocolatey.org/install
  • Install these chocolatey packages
    • choco install git -y
    • choco install nodejs.install -y
  • Install npm packages (make sure you open a new PowerShell window so that node is in your path)
    • npm install -g npm
    • npm install -g --production windows-build-tools
    • npm install -g ethereumjs-testrpc
    • npm install -g truffle
    • npm install -g mocha
    • npm install -g mocha-junit-reporter
  • Restart the build agent so that all new paths are working

Configure VSTS build

    • Create a new build variable holding the npm global packages path, for the user you installed the npm packages as above
      variable name: npm.path
      variable value: path to npm packages e.g. C:\Users\<user>\AppData\Roaming\npm
    • Add 6 PowerShell tasks, and configure them as follows
      • Name: System version information
        Script:
        #Setting environment paths
        $ENV:Path = $ENV:Path + ";" + $env:npm_path
        npm config set prefix $env:npm_path    #only needs to be set once, will update for user
        #DEBUG
        #$env:path
        #npm list -g --depth=0
        #Display system information
        Write-Host "System version information"
        Write-Host -nonewline    "node version: " ; node -v
        Write-Host -nonewline    "npm version: ";   npm -v
        Write-Host -nonewline    "npm prefix: ";    npm prefix -g
        Write-Host -nonewline    "truffle: " ;      truffle version
    • Name: Config transform & test clean
      Script:
      # remove old test results
      rm .\junitresults.xml -ea SilentlyContinue

      # Modify the Truffle test runner to use the JUnit reporter
      Rename-Item .\truffle.js .\truffle_temp.js
      cat .\truffle_temp.js | % { $_ -replace 'reporter: "spec"', 'reporter: "mocha-junit-reporter"' } | Out-File -Encoding ASCII .\truffle.js
      rm .\truffle_temp.js

    • Name: Truffle build
      Script:
      #Setting environment paths
      $ENV:Path = $ENV:Path + ";" + $env:npm_path
      #Truffle build
      truffle compile
    • Name: Launch TestRPC
      Script:
      #Setting environment paths
      $ENV:Path = $ENV:Path + ";" + $env:npm_path
      # launch the process
      echo "launching TestRPC"
      $testrpcProcess = Start-Process testrpc -passthru
      # persist the PID to disk and display in logs
      $testrpcProcess.Id | Export-CliXml testrpcPID.xml
      cat testrpcPID.xml


    • Name: Run Truffle tests
      Script:
      #Setting environment paths
      $ENV:Path = $ENV:Path + ";" + $env:npm_path
      # Run the tests
      truffle test
    • Name: Shutdown TestRPC
      Other Settings: Enable “Always Run” (to make sure TestRPC is shut down even if an earlier task fails)
      Script:
      #Setting environment paths
      $ENV:Path = $ENV:Path + ";" + $env:npm_path
      # retrieve the PID and kill the entire process tree
      cat testrpcPID.xml
      $testrpcPID = Import-CliXml testrpcPID.xml
      taskkill /pid $testrpcPID /F /T
  • Add a new Publish Test Results task
    • Test Result Format: JUnit
      Test Result Files: junitresults.xml
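As a sanity check: after the “Config transform & test clean” task above has run on the build server, the mocha section of the rewritten truffle.js should look like the sketch below, and junitresults.xml is the file the Publish Test Results task picks up.

// The mocha section of truffle.js after the server-side transform:
mocha: {
  reporter: "mocha-junit-reporter",
  reporterOptions: {
    mochaFile: 'junitresults.xml'
  }
}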

 

Future work

Things that I would like to add in the future:

  • Figure out how to automate this on a Linux build agent (VSTS supports both Windows & Linux based build agents)
  • Automate Release Management to run “truffle migrate” to push to a Bletchley test environment (a sketch of the kind of migration script this would run is below)
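For context on that second item, “truffle migrate” executes the numbered migration scripts in the project’s migrations folder. A minimal sketch, again assuming the MetaCoin example contract:

// migrations/2_deploy_contracts.js: a minimal Truffle migration sketch,
// assuming the MetaCoin example contract.
var MetaCoin = artifacts.require("MetaCoin");

module.exports = function (deployer) {
  // deployer stages the deployment onto whichever network is targeted
  deployer.deploy(MetaCoin);
};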

Adding Application Insights to SharePoint


TL;DR I helped write an SSW rule on setting up Application Insights in SharePoint.

I’ve been adding Application Insights to a number of SSW websites (such as SSW.com.au, SSWTimepro.com and SSWLinkAuditor.com). Since being added, App Insights has been helping us keep on top of our application metrics and unhandled exceptions.

We wanted to add Application Insights to SharePoint, but we couldn’t find any useful documentation online, so I started investigating how I could do this manually. As SharePoint is an ASP.NET application, I teased apart how Visual Studio adds the App Insights hooks into projects: I created an empty git repository, created a website, checked it in, used Visual Studio to add App Insights, checked in again, then diffed all the changes.

After investigating, we discovered that it was easier than I thought. You can track the browser metrics by simply adding the App Insights JavaScript to the SharePoint master page.
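For illustration, below is a hand-expanded sketch of that JavaScript snippet; the canonical minified loader should be copied from your App Insights resource in the Azure portal, and the instrumentation key here is a placeholder. It stubs out the track* calls, queues anything logged before the SDK arrives, then loads the real SDK asynchronously.

// Paste inside a <script> tag in the SharePoint master page.
// Hand-expanded sketch of the portal-provided loader; replace the
// placeholder instrumentationKey with your own key.
window.appInsights = window.appInsights || (function (config) {
  var ai = { config: config, queue: [] };
  // Stub the track* methods so calls made before ai.0.js loads are
  // queued, then replayed once the full SDK arrives and drains the queue.
  ["trackEvent", "trackException", "trackMetric", "trackPageView", "trackTrace"]
    .forEach(function (name) {
      ai[name] = function () {
        var args = arguments;
        ai.queue.push(function () { ai[name].apply(ai, args); });
      };
    });
  // Load the real SDK asynchronously from the Microsoft CDN.
  var s = document.createElement("script");
  s.src = "//az416426.vo.msecnd.net/scripts/a/ai.0.js";
  document.getElementsByTagName("script")[0].parentNode.appendChild(s);
  return ai;
})({ instrumentationKey: "00000000-0000-0000-0000-000000000000" });

window.appInsights.trackPageView();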

For the server-side metrics, as it is an ASP.NET website, you can update the web.config file on the server to start tracking them. We found that the Application Insights Status Monitor configuration tool was the easiest way to get this done.

A full write-up of the SSW rule on setting up Application Insights in SharePoint is available for you to follow.
I have also helped write a series of SSW Rules to Better Application Insights that can help you get the most out of it.

Xbox One cloud + Geo-distributed Windows Azure datacentres

Previously I blogged about how Microsoft were creating new Windows Azure data centres in Australia https://davidburela.wordpress.com/2013/05/21/windows-azure-datacentres-coming-to-australia/.

Another recent announcement was Microsoft stating that their upcoming Xbox One gaming console will be supported “by the cloud”. From an interview:
“We touched on it briefly in our press conference, and I think it’s a really important point and a key differentiator for Xbox One. Essentially, no longer are we constrained to the processing power of Xbox One in terms of hardware; we’re able to do a lot of processing that was traditionally done on the box, in the cloud.
“It’s also been stated that the Xbox One is ten times more powerful than the Xbox 360, so we’re effectively 40 times greater than the Xbox 360 in terms of processing capabilities [using the cloud]. If you look to the cloud as something that is no doubt going to evolve and grow over time, it really spells out that there’s no limit to where the processing power of Xbox One can go.”
Source http://stevivor.com/2013/05/microsoft-xbox-australia-on-some-of-todays-lingering-xbox-one-questions/

Offloading processing power

For those who haven’t kept up to date on the announcements, the cloud support for the Xbox One has the potential to open up far more back-end processing to support games than has previously been possible. Traditionally, console games have been wholly self-contained on the console.

Cloud support enables scenarios where you need processing done but it isn’t latency sensitive. Traditionally, when you turn the console off, the game world stops until you turn it back on again. A cloud backend that can easily support “persistence” in your console game can make it richer: for example, while your console is off, the people in your castle can continue to go about their day-to-day activities, so that when you come back you can see what progress has been made in your world.

Other examples of latency-tolerant calculations are lighting effects in large areas and large physics simulations. Traditionally, physics effects have been restricted to what can be processed in real time, but larger, more complex simulations are now possible. In a first-person shooter, a player could fire numerous rockets at a large building, each impact weakening the structure. Once the building has taken enough damage, it can trigger the cloud to process a complex crumbling animation that takes each of the impact points into consideration. Waiting a few seconds for the results to come back is fine, as the building was slowly weakening to the point of collapse.

Game hosting

With quick-match games like the original Halo, traditionally one of the consoles acts as the game host, with the other players connecting to that one console. This ties up compute power on that console, as the one player needs to process game server information as well as render the scene for the local player. Cloud hosting lets developers offload that processing from the console, freeing up even more processor cycles to make the game look GOOD.

An interview with the developers of the upcoming game Titanfall goes into great detail about the benefits the Xbox One cloud brings to them: http://www.respawn.com/news/lets-talk-about-the-xbox-live-cloud/

One great quote is about how they can scale their servers based on demand, removing the usual launch-day woes of provisioning enough servers to handle the initial wave of players.

How is this different from other dedicated servers?
With the Xbox Live Cloud, we don’t have to worry about estimating how many servers we’ll need on launch day. We don’t have to find ISPs all over the globe and rent servers from each one. We don’t have to maintain the servers or copy new builds to every server. That lets us focus on things that make our game more fun. And best yet, Microsoft has data centres all over the world, so everyone playing our game should have a consistent, low latency connection to their local data centre.
Most importantly to us, Microsoft priced it so that it’s far more affordable than other hosting options – their goal here is to get more awesome games, not to nickel-and-dime developers. So because of this, dedicated servers are much more of a realistic option for developers who don’t want to make compromises on their player experience, and it opens up a lot more things that we can do in an online game.
Source http://www.respawn.com/news/lets-talk-about-the-xbox-live-cloud/

Bringing the two together

The reason I tied these two announcements together (Xbox One cloud & Microsoft opening data centres in Australia) is because it will help Australian consumers a lot. While some tasks offloaded to the cloud can be handled in a high-latency situation (economy simulations), there are others where lower latency is always better (hosting multiplayer games).

In low-latency situations Australia always gets the short end of the stick, due to our physical distance from both the USA and Europe. The announcement of Microsoft opening Windows Azure datacentres which could potentially host Xbox One servers is therefore encouraging.

If my crystal ball is correct on this one, then local cloud compute happening in Australia will be a great boon to all Australian gamers.

By David Burela

My first book has been published: Azure & Silverlight integration

It is with GREAT pride and pleasure that I am able to finally announce that my first book has been published and is available to be purchased right now!

Microsoft Silverlight 5 and Windows Azure Enterprise Integration details how enterprise Silverlight applications can be written to take advantage of the key features of Windows Azure to create scalable applications.

It is available as an eBook, in print format, and on the Kindle, Nook, etc. It can be purchased from the following websites:

By David Burela

New Windows Azure Training Kit & 1.4 SDK available

There is a new release of the Windows Azure training kit, and also an update (and subsequent refresh) of the Azure SDK available.

New training kit

The training kit, in my opinion, is always the best way to learn the newest features available in the SDKs. This latest update brings a new hands-on lab explaining how to authenticate users from a Windows Phone 7.
http://blogs.msdn.com/b/windowsazure/archive/2011/04/28/now-available-windows-azure-platform-training-kit-april-update.aspx

 

Windows Azure SDK 1.4 released

A new version of the Azure SDK is now available. The biggest feature is a new way for developers to push out development builds to an Azure server without having to repackage and wait for them to be spun up on new machines. The feature is called “web deploy” and details can be found at the link below.
http://blogs.msdn.com/b/windowsazure/archive/2011/04/15/now-available-windows-azure-sdk-1-4-refresh-with-webdeploy-integration.aspx

There was an issue with the 1.4 release which has since been fixed. If you downloaded the SDK before April 25th you will need to download the new version; if you are installing it for the first time then you are fine.
http://blogs.msdn.com/b/windowsazure/archive/2011/04/27/windows-azure-sdk-1-4-refresh-issue-resolved.aspx

By David Burela

Community report: Melbourne Azure Bizspark camp 2011

Last weekend I helped out at Melbourne’s “Windows Azure Bizspark camp”. It was a 3-day event aimed at fostering innovation while also training small start-ups in how to pitch an idea and seek funding.
You may remember that I participated in the last Bizspark camp and came runner-up with my application ‘Beachy’. This was the same event, but this time it was on Windows Azure, and I was helping out rather than attending.
More basic information about the event is on the Australian Bizspark website.

My involvement over the weekend was to assist each of the teams with their ideas, giving advice & assistance with any Windows Azure, Silverlight and Windows Phone 7 based questions. At the same time I would also sit down with each team and get them to talk me through their business ideas, getting them to focus on the one or two key points that they would pitch to the judges, and not just rely on the fact that they “wrote a lot of really cool code”. Where I could, I tried to get them to focus on what exactly they would show the judges on the final day, and to be sure to polish around those key points.

I was also impressed to see that one of the largest gaming websites (ign.com) did a blog post on what Bizspark and the Bizspark camp were: http://www.ign.com/blogs/kiera2/2011/02/05/whats-a-bizspark/

All the photos from the event have been uploaded to my Flickr account with a Creative Commons license, if anyone would like to use photos of themselves.

Day 1: Training the teams in Windows Azure

The first day consisted of a compressed training day for all the participants. Graham Elliot and Mitch Denny delivered training covering Azure basics (Windows Azure Compute, Management, Diagnostics, Storage, SQL Azure, App Fabric). The slides from the training day have been uploaded to the internet for anyone to download and go through themselves.

Another useful training guide if you wish to go through it yourself is to use the latest Windows Azure training kit.


Day 2 & 3: App development & start-up training

The rest of the event was mostly allocated to the teams, to let them build their applications. Most of the teams were 2 people, with a few of the larger ideas having teams of 3. All of the teams went to find their own space around the Microsoft office and worked on their projects. They all worked really hard while at the office, and judging by the bags under a few eyes the next morning, many of them obviously pulled all-nighters back at their own places. This was the meat of the event: the teams needed to build enough of the application to have something to show the judges the next day. But they needed to be focused and disciplined; there wasn’t enough spare time to waste on things that wouldn’t be shown. They needed to decide what they were going to demonstrate to the judges and focus on enabling enough of their app to demo.


Of course, there was a pizza lunch to keep the attendees going!


Industry talks

During those 2 days, Catherine Eibner organised 2 speakers to come and talk during the lunch breaks. This helped break the day up a bit and gave the attendees a chance to clear their heads.


On the Saturday, Ross Hill (twitter: @RossHill) talked to the attendees about the importance of networking when creating your own start-up. Events are important and enable you to meet the type of people you need to help get your start-up off the ground. Ross co-founded one of the most successful entrepreneurial networking events, The Hive, which runs in Melbourne, Sydney and Brisbane. I recommend you attend The Hive if you live in one of those cities and are interested in the entrepreneurial spirit.

I unfortunately didn’t record Ross’ talk, but I did manage to get some of the URLs he listed of other good events to attend.

Jack Delosa’s talk

On the Sunday, there was a talk by Jack Delosa (@JackDelosa) who runs a group called Entourage which “is a movement which encourages and facilitates entrepreneurship in the 18-35 demographic”.  More information about Entourage can be found at http://www.the-entourage.com.au/

Everyone on Twitter was amazed at how much information he was able to cram into 45 minutes. He distilled down the basics of what anyone thinking of creating a startup needs to know: from the idea, to raising capital, to creating your business plan/Information Memorandum/executive summary.

This is a MUST WATCH video for anyone thinking of creating their own start-up company.

Jack Delosa

http://vimeo.com/19889493

Final afternoon

On the final afternoon, some of the judges sat in a room and allowed each of the teams a chance to do a practice pitch. This gave the teams a first run, and feedback on the sorts of things they should focus on when doing the final pitch to the judges. This in itself was a valuable experience for all of the attendees.

All the judges arrived and the teams took turns doing their 5-minute pitch, followed by 5 minutes of question time from the judges.

Idea pitch by http://Rome2Rio.com

The judges then spent half an hour debating amongst themselves, trying to determine the winner.

Awards ceremony

http://vimeo.com/19892066

The final winners were:

  1. http://Rome2Rio.com
  2. http://blog.mapdojo.com/
  3. TrendFrendz

The winners of the social networking awareness competition

Everyone talking after the event has finished

Videos of the pitches

I recorded all 11 pitches from the teams, which I will upload slowly over the next few weeks.

Other coverage of the Azure Bizspark camp

By David Burela

Azure free to trial for 30 days

If you currently have an MSDN account (paid, or through a Bizspark account), then you should be taking advantage of the 750 free compute hours you get each month. Just look at your subscriber benefits.

However, if you still haven’t been able to give Windows Azure a try, you can now get a 30-day free trial: https://blogs.technet.com/b/webtech/archive/2010/11/24/it-s-back-30-days-windows-azure-platform-no-credit-card-required.aspx

By David Burela