My notes from PDC Session ES 01

Notes from ES 01: Developing and deploying your first cloud service

The talk is about 90% demos, so I'm noting down what he does

In this session he aims to create a blog website hosted on Azure using ASP.NET MVC & Azure storage

Azure SDK provides a “consistent, familiar development” environment.

Can use .Net, IIS7, WCF

The ‘cloud on your desktop’ development environment is a lot like Cassini. When I develop my website and click run, my webpage is launched. But instead of Cassini, the developer fabric spins up instances and runs my website.

Can develop in Visual Web Developer Express

We have a “definition file” and a “configuration file”. They provide metadata about the project

The definition file defines the roles and the endpoints. What configurations are there? How many instances do we want of each role? e.g. 2 web roles and 3 worker roles. It also defines the configuration settings

The config file is like web.config so you can provide values for config settings of your app
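To make the split concrete, here's a minimal sketch of the two files. The element names are from the CTP schema as best I recall, and the service/role names and instance counts are my own illustrative guesses, not from the talk:

```xml
<!-- ServiceDefinition.csdef – roles, endpoints, and the *names* of settings -->
<ServiceDefinition name="MyBlogService">
  <WebRole name="WebRole">
    <InputEndpoints>
      <InputEndpoint name="HttpIn" protocol="http" port="80" />
    </InputEndpoints>
    <ConfigurationSettings>
      <Setting name="AccountName" />
    </ConfigurationSettings>
  </WebRole>
  <WorkerRole name="WorkerRole" />
</ServiceDefinition>

<!-- ServiceConfiguration.cscfg – the *values*, like web.config, plus instance counts -->
<ServiceConfiguration serviceName="MyBlogService">
  <Role name="WebRole">
    <Instances count="2" />
    <ConfigurationSettings>
      <Setting name="AccountName" value="devstoreaccount1" />
    </ConfigurationSettings>
  </Role>
  <Role name="WorkerRole">
    <Instances count="3" />
  </Role>
</ServiceConfiguration>
```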

Can build and package services using command prompt.

Cspack.exe packages up your service & config file

Csrun.exe can spin the dev fabric up using the command line
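Roughly, the two tools chain together like this (flag syntax is from the CTP SDK as best I recall — check `cspack /?` and `csrun /?` locally):

```shell
REM Package the service definition (plus the built role binaries) into a .cspkg
cspack ServiceDefinition.csdef /out:MyBlogService.cspkg

REM Spin up the dev fabric and run the package against the local config
csrun MyBlogService.cspkg ServiceConfiguration.cscfg
```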

So he used those tools to allow Eclipse to develop against it. Meh, as if you'd use Eclipse over VWD Express

So what Azure gives us is scalability for free, zero-downtime upgrades, and it all works with existing tools and skills

Horizontal scaling: we have one server, so let's just keep adding more servers alongside it to help out. But what does this mean for state?

Separate that state out from the app, and put it into the durable store.

Durable storage means: Blobs, tables & queues.

Simple interface; can pull data out via REST and ADO.NET Data Services

He opens up an ASP.NET MVC project (you have to use their provided sample, as some tweaks are needed to get MVC working on Azure)

He adds a reference to the “storage client” library. It is a sample library included in the SDK to help you interface with the durable storage.

First you get a container for the blob  – var container = BlobStorage.Create(getBlobStorageAccount….)

Then add things to the container – container.CreateBlob(new blobProperties…, new BlobContents…)

To retrieve, create a container again. Then container.ListBlobs(….
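Pieced together, the blob code looks roughly like this. This is a sketch against the CTP “StorageClient” sample library — type and method names are as best I remember them, and the container/blob names are my own:

```csharp
// Sketch only – depends on the StorageClient sample library shipped in the SDK.
var account = StorageAccountInfo.GetDefaultBlobStorageAccountFromConfiguration();
var storage = BlobStorage.Create(account);

// Get a container object and create it as public (readable via plain URL)
var container = storage.GetBlobContainer("posts");
container.CreateContainer(null, ContainerAccessControl.Public);

// Add a blob to the container
container.CreateBlob(
    new BlobProperties("hello.txt"),
    new BlobContents(Encoding.UTF8.GetBytes("Hello, cloud!")),
    true); // overwrite if it already exists

// Retrieve: create the container object again, then enumerate
foreach (BlobProperties blob in container.ListBlobs(string.Empty, false))
{
    // blob.Name, blob.Uri, etc.
}
```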

Goes into the service definition file to define the config settings in the app: AccountName, AccountSharedKey, BlobStorageEndpoint

Then goes into the config file to define the values for those config settings. (puts in the BlobEndPoint of the developer cloud instance on the pc 127.0.0.1), and account key.
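For the local dev fabric those values end up looking something like this — the 127.0.0.1:10000 blob endpoint and the `devstoreaccount1` account are the well-known development-storage defaults, but treat the exact setting names as approximate:

```xml
<ConfigurationSettings>
  <Setting name="AccountName" value="devstoreaccount1" />
  <!-- the well-known development storage key is documented in the SDK -->
  <Setting name="AccountSharedKey" value="..." />
  <Setting name="BlobStorageEndpoint" value="http://127.0.0.1:10000/" />
</ConfigurationSettings>
```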

Defines a global container that is public so that anyone can retrieve data from the storage via a URL

Tables

Now from blobs to tables

Need to create a data model first

He is going to use the client .dll of ADO.NET Data Services

Makes his data model inherit from TableStorageEntity

PartitionKey is how the data in the table is partitioned (I think for multi-tenancy in a table?)

RowKey is like the rowId, so just a GUID will do, unless you want them in order for some reason

Creates a new class that inherits from TableStorageDataServicesContext
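So the data model ends up looking something like this. Again a sketch against the SDK's sample classes — the entity name, its properties, and the constructor logic are my own illustration:

```csharp
// Sketch – TableStorageEntity and TableStorageDataServicesContext are from the SDK samples.
public class BlogPost : TableStorageEntity
{
    public BlogPost() { }

    public BlogPost(string blogName)
    {
        PartitionKey = blogName;              // how the table is partitioned (e.g. per blog)
        RowKey = Guid.NewGuid().ToString();   // unique row id; a GUID is fine unless you need ordering
    }

    public string Title { get; set; }
    public string Body { get; set; }
}

public class BlogDataContext : TableStorageDataServicesContext
{
    public BlogDataContext(StorageAccountInfo account) : base(account) { }

    public IQueryable<BlogPost> BlogPosts
    {
        get { return CreateQuery<BlogPost>("BlogPosts"); }
    }
}
```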

NEED to have SQL Server Express on the dev machine to use the table storage in the dev tools

Meh, he spends ages showing off his MVC skills. I don’t caaaaaaaare. *wastes another 20mins or so*

Queues / Worker role

Azure isn’t just websites. Can use it to crunch lots of data

Use a worker role to achieve the background crunching of data. Get the web role to pump items into a queue, then get our worker role to loop forever, retrieving items from the queue and processing them

while (true)
{
    var msg = queue.GetMessage();
    if (msg != null)
    {
        // ... process the message ...
        queue.DeleteMessage(msg);
    }
}

Notice that GetMessage doesn't dequeue the message, and that we manually delete it

What GetMessage does is give us a message, but then hides it from other workers for a short visibility timeout. This gives us durability. If one of our servers bursts into flames, or crashes, the message is still there and can be picked up and processed by another worker automatically.
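For completeness, the web-role side just pushes messages in. Something like this sketch, again using the CTP sample library (queue name and message text are my own):

```csharp
// Sketch – QueueStorage / Message are from the StorageClient sample library in the SDK.
var account = StorageAccountInfo.GetDefaultQueueStorageAccountFromConfiguration();
var queueStorage = QueueStorage.Create(account);

var queue = queueStorage.GetQueue("workitems");
queue.CreateQueue();                          // no-op if it already exists
queue.PutMessage(new Message("process post 42"));
```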

Queues are actually data sitting on top of tables

Clouddrive

Is a powershell plugin

Lets you use the storage in Azure as if it were a drive on your system

Can browse queues, the blobs, etc.

Nice for debugging

Debugging

Can debug on the local machine really easily. Set your breakpoint and it all just works

But debugging in the cloud is only available through logging. There will be a lot more functionality over time, though

RoleManager.WriteToLog(eventLogName, message) – where the event log name is one of Critical / Error / Warning / Information

Can get a windows live alert when you get a critical error message.

Can get the logs from your app on the Azure.com dashboard

How to deploy to the cloud

  • Write the code
  • Get VS to package
  • Upload to Azure.com
  • Profit

By David Burela
