Copying a blob snapshot to another blob

Recently I had a question in the comments on my blog post Backing up a Windows Azure Virtual Machine using PowerShell –

Can a snapshot blob be copied to another location instead of overwriting the original blob?

The short answer is yes, of course: snapshots present themselves more or less like any other blob. There is, however, a subtle point around how to reference the snapshot, so I thought it was worth demonstrating with a quick console app. Here are the key steps –

I created a console app and added the necessary references using

Install-Package WindowsAzure.Storage

I already had a few blobs I could play with, so I just added a couple of lines to connect to the account and create a snapshot of one of them –

string storageConnection = ConfigurationManager.AppSettings["StorageConnectionString"];
CloudStorageAccount storageAccount = CloudStorageAccount.Parse(storageConnection);
CloudBlobClient blobClient = storageAccount.CreateCloudBlobClient();

CloudBlobContainer container = blobClient.GetContainerReference("metars");
CloudBlockBlob blob = container.GetBlockBlobReference("EGLL.txt");

CloudBlockBlob snapshot = blob.CreateSnapshot();


As you can see, CreateSnapshot returns a blob, but – even as demos go – it is unrealistic to want to copy a snapshot immediately after its creation, so I drop this object and start from scratch.

To find the snapshot again I need to enumerate all the blobs in the container, indicating to the platform that I wish to see snapshots (and, for that purpose, flattening the result as well).

As I’m only interested in this specific blob at this point I can add its name as the prefix –

string storageConnection = ConfigurationManager.AppSettings["StorageConnectionString"];
CloudStorageAccount storageAccount = CloudStorageAccount.Parse(storageConnection);
CloudBlobClient blobClient = storageAccount.CreateCloudBlobClient();

CloudBlobContainer container = blobClient.GetContainerReference("metars");

IEnumerable<IListBlobItem> blobAndSnapshots = container.ListBlobs( 
       prefix: "EGLL.txt",        
       useFlatBlobListing: true,         
       blobListingDetails: BlobListingDetails.Snapshots); 


Note: this will return the blob AND all its snapshots, so to enumerate the snapshots only, ignore any item whose SnapshotTime value is null – that is the base blob.
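For illustration – a minimal sketch (not from the original post) of filtering the listing down to just the snapshots; it assumes the blobAndSnapshots variable from the snippet above:

```csharp
//the base blob comes back with SnapshotTime == null, so keep only
//the items that carry a snapshot time
List<CloudBlockBlob> snapshotsOnly = blobAndSnapshots
    .OfType<CloudBlockBlob>()
    .Where(b => b.SnapshotTime != null)
    .ToList();
```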

Once again – the result is a list of blobs, so one can easily create a copy of the snapshot using the standard blob operations. A simple blob copy could look something along these lines –

            //create a reference to the target blob
            ICloudBlob newCopy = container.GetBlockBlobReference("target.txt");
            //create a callback to note the completion of the copy
            AsyncCallback callBack = new AsyncCallback(CopyCompleted);
            //start the copy
            newCopy.BeginStartCopyFromBlob(sourceBlob.Uri, callBack, null);


Note: in the ‘real world’ using the asynchronous interface properly, with its Begin and End methods, would be more useful, but this pattern is well documented.
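For example – a hedged sketch (not from the original post) of wrapping the Begin/End pair in a Task via Task.Factory.FromAsync, assuming the newCopy and sourceBlob references from the snippet above:

```csharp
//wrap the APM pair in a Task so the copy can be awaited
//rather than tracked through a callback
Task<string> copyTask = Task.Factory.FromAsync<string>(
    newCopy.BeginStartCopyFromBlob(sourceBlob.Uri, null, null),
    newCopy.EndStartCopyFromBlob);
//the returned value is the copy id; the copy operation itself
//continues server-side and its progress is visible via CopyState
string copyId = await copyTask;
```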

But here’s the subtle point – whether a blob is a snapshot or not, its Uri property will refer to the main blob and not to any snapshot, so using the copy code above with sourceBlob referencing a snapshot will copy the main blob and not the referenced snapshot.

To copy the snapshot a specific URL needs to be constructed and used; here’s the modified code –

            //construct the snapshot's uri
            string snapshotUrl = getSnapshotUrl(sourceBlob);
            //create a reference to the target blob
            ICloudBlob newCopy = container.GetBlockBlobReference("target.txt");
            //create a callback to note the completion of the copy
            AsyncCallback callBack = new AsyncCallback(CopyCompleted);
            //start the copy
            newCopy.BeginStartCopyFromBlob(new Uri(snapshotUrl), callBack, null);

.
.
.
        private static string getSnapshotUrl(CloudBlockBlob snapshotBlob)
        {
            string encodedTime = System.Web.HttpUtility.UrlEncode(snapshotBlob.SnapshotTime.Value.ToString("yyyy-MM-ddTHH:mm:ss.fffffffZ"));
            return string.Format("{0}?snapshot={1}", snapshotBlob.Uri, encodedTime);
        }


 

As you can see – to copy the snapshot and not the main blob, the source URL needs to be changed to include the snapshot’s timestamp. Doing so creates a new blob with the snapshot’s contents.

In this case I used the same container, but I could have used any other container or indeed another storage account.


Note: when copying a blob snapshot to another storage account one has to obtain, and provide, a shared access signature (SAS) token to the copy command (see an example of such a copy here). When doing so it is important that the token is the first query string parameter and that the snapshot detail is provided after it (separated by an ampersand (‘&’), of course).
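To illustrate the ordering – a hedged sketch (the variable names and policy values are mine, not from the original post). The SAS token, generated here on the base blob, already starts with ‘?’, so the snapshot detail is appended after it with an ampersand –

```csharp
//generate a read-only SAS token on the base blob
//(the one-hour expiry is purely illustrative)
string sas = baseBlob.GetSharedAccessSignature(new SharedAccessBlobPolicy
{
    Permissions = SharedAccessBlobPermissions.Read,
    SharedAccessExpiryTime = DateTimeOffset.UtcNow.AddHours(1)
});
//encode the snapshot's timestamp as before
string encodedTime = System.Web.HttpUtility.UrlEncode(
    snapshotBlob.SnapshotTime.Value.ToString("yyyy-MM-ddTHH:mm:ss.fffffffZ"));
//the SAS token must be the first query string parameter,
//with the snapshot detail following after an '&'
string sourceUrl = string.Format("{0}{1}&snapshot={2}",
    snapshotBlob.Uri, sas, encodedTime);
//start the cross-account copy from the snapshot
targetBlob.BeginStartCopyFromBlob(new Uri(sourceUrl), callBack, null);
```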

Integrating BizTalk and BizTalk [Services]

Over the last few months I have constantly been asked how BizTalk Services (formerly known as Windows Azure Integration Services) compares to BizTalk and whether one can be used instead of the other…

Whilst, generally speaking, I like the fact that this has been renamed to BizTalk Services (even if it means that searching for information about it online will be somewhat difficult, given the raft of publications on BizTalk out there), I suspect that this will only add to the confusion.

In any case – my answer has been, and will be for the foreseeable future – BizTalk Services, for the most part, is a great complementary component to BizTalk. Sure – in some cases it can be a component in a solution that does not involve BizTalk, but it is very far from doing everything that BizTalk does, and – as far as I know – there are no plans to make it so.

So – if the two are so complementary – how does one integrate BizTalk and BizTalk Services?

Going in one direction – BizTalk Services to BizTalk Server – is very easy: one can publish a web service on BizTalk and consume it from BizTalk Services, perhaps leveraging the Relay Service, part of the Windows Azure Service Bus, to traverse the firewall. In some cases one might want to publish the message from BizTalk Services to a queue or a topic on the Windows Azure Service Bus and read it from BizTalk using the built-in capability. There are plenty of options to choose from.

Going in the other direction, though, is less flexible. At the time of writing BizTalk Services only supports three ways to deliver messages to it – HTTP, FTP (OK – including FTP/S, so maybe four) and SFTP. To my surprise there is no built-in capability for reading messages from the Windows Azure Service Bus, nor the ability to publish a web service.

Considering all these options, I suspected that the HTTP route is the best one for integrating with BizTalk Server, so I decided to give it a go. Here’s how it went –

The first step was to use the MessageSender sample to confirm that my bridge is working correctly (baby steps are important!)

Next, I wanted to make sure BizTalk was working (and that I still remembered how to configure it!) so I created a simple, messaging-only, scenario – a simple File Receive Location routed to a File Send Port.

Procrastination over, it was time to get serious, so I went on to add a second send port, this time using the WCF-WebHttp adapter configured with my bridge’s address, which is easily found in its properties –

image

It is important to use HTTPS and not HTTP, I found (trying to be lazy). Using the latter will result in the following error –

<Error><Code>400</Code><Detail>Transport security is required to protect the security token.</Detail></Error>

BizTalk Services protects me from myself and prevents me from exchanging security tokens without SSL. Good or bad – you decide…

 

At this point my port configuration looked like this –

image

– and I figured it was time to give it a go. As I hadn’t integrated ACS yet I didn’t expect to get through to the bridge, but it was a useful experiment in its own right – firstly to see that authentication is indeed required (no real doubt there), but also to see the end-to-end behaviour.

Sure enough, dropping a message for the File receive adapter, I quickly received a suspended instance with the following error –

image

Which, unsurprisingly, looks very much like what I get in a browser –

<Error>
<Code>401</Code>
<Detail>Manage claim is required for this operation..TrackingId:b44aa846-7105-4869-beb5-d7830eaeb2df_G0,TimeStamp:6/12/2013 9:02:58 PM</Detail>
</Error>

Good – so I’m failing consistently, that’s progress 🙂 Now all I needed to do was add the required bits to obtain the ACS token and add it to the outgoing request…..easy 😦

To do this I needed a solution that would allow me to obtain the ACS token for this request and inject it as an HTTP header into the outgoing request.

At this point I was sure I was going to need to implement a custom WCF behaviour that would get the ACS token for the request and manipulate the outgoing message’s HTTP headers to include it, but then it hit me – BizTalk already knows how to do that, because it has built-in support for Windows Azure Service Bus integration, which uses exactly the same mechanism.

A quick look at the WCF-WebHttp adapter configuration revealed a little tick box and some properties behind it –

image image

Awesome – a big chunk of work I had expected to have to do – creating and configuring a custom behaviour to call ACS, obtain a token and add it to every outgoing message – was all done for me!

At this point I had to try again, and it looked promising; alas, I got another error – this time about not being able to establish a trust relationship for the SSL/TLS secure channel.

This makes sense – it is exactly the issue I had when initially deploying my BizTalk Services project, and of course it is because I’m using a self-signed certificate, so I promptly added the certificate to the Trusted Root Certification Authorities store on my BizTalk server and tried again.

Well – nearly – it looks like I did get to BizTalk Services, as I got back a tracking id, although I didn’t quite manage to see anything in the BizTalk Services tracking page. Unfortunately, the tracking id came with an error message –

The incoming request is not recognised as a namespace policy put request: Unsupported or missing Content-Type

Ah! – BizTalk to the rescue again – at this point I was happy to hardcode everything, I just wanted to get this to work! 🙂 It turns out the WCF-WebHttp binding lets you set static HTTP headers to add to the outgoing message –

image

That done, I was ready for another attempt – message into BizTalk and……..nothing. No suspended instance….did it actually work? A quick check in the BizTalk Services tracking page revealed that it had! BizTalk really made this easier than I expected, and now I can integrate my on-premises BizTalk environment with BizTalk Services and, through that…..the world! 🙂

Still to check: what happens on failure on the BizTalk Services side – do we still get an error on the BizTalk side (so that we don’t lose the message)?

 

Cross posted on the Solidsoft Blog

BizTalk Services – Xml Schema Editor

Here’s another short blurb on BizTalk Services – this time the XML Schema Editor.

The team has clearly, and thankfully, ‘borrowed’ the Schema Editor that we all know and love from BizTalk Server.

The editor allows creating and modifying XML schemas without needing to worry about the exact syntax of the underlying XSD, making the process much less painful and more productive. After all – selecting from menus and setting properties is something we’re all used to; typing lots of delicate XML isn’t.

image

In addition to the editor itself, right-clicking on an XSD file in the solution explorer reveals three options that exist in BizTalk as well –

image

Validate Schema – checks that the schema itself is valid
Validate Instance – checks whether a given XML file is valid according to the schema
Generate Instance – generates an XML file based on the schema

In all my years of BizTalking I have hardly ever used the first option – not so much because I am an XSD guru (I was! :-)) but because the editor does a very good job of saving me from myself.

The other two options are very useful indeed though, whether during the development process itself or during debugging and troubleshooting.

Sadly – at this point in the preview they may give the misleading impression that they do not work (Richard – did you manage to fall into that trap as I did initially?). BizTalk used to pop open the output window whenever executing either operation, where information about the files used (with links!) and the outcome was written. BizTalk Services does write to the output window, but without popping it open the unsuspecting user (that’s me!) is oblivious to that fact and thinks it did nothing!

It’s amazing what difference a small UI behaviour can make..

 

Note: as is the case with BizTalk, the input file for validation and the location of the output file for instance generation are set as properties of the XSD file in the solution explorer:

image

Below are the screenshots showing that BizTalk Services does indeed report to the output window; I just wish it became visible at that point:

image

image

image

image

 

Cross posted on the Solidsoft Blog

A hidden gem in BizTalk Services?

A fellow Solidsofter – Ian McQueen – pointed out a real (hidden?) gem in BizTalk Services I hadn’t heard about.

On the off-chance it isn’t just me, here it is –

Did you know that you get a BizTalk Standard license to use on-premises with BizTalk Services Premium edition?

We spotted this here

image

At this point I could not find any more details on this, but it is very interesting indeed, as it opens up the capability for more advanced integration with on-premises systems in conjunction with the ease and cost-effectiveness of BizTalk Services.

FTP and BizTalk Services

Continuing my exploration of BizTalk Services, I thought I’d start with a simple end-to-end scenario, albeit one that roughly maps to a real customer requirement we have right now –

I wanted a bridge to pick up a file from an FTP location and drop it onto a Service Bus Queue (from which BizTalk 2013 would pick it up, but that’s another story…)

 

I created a simple pass-through bridge configuration to begin with, with my topic details on one side and the FTP details on the other –

image

For the FTP server I deployed a small IaaS instance on Azure and configured an FTP server on it; I then entered all the details for the topic and configured the FTP endpoint.

That done, I deployed the solution, only to see an error stating –

Failed to connect to the FTP server using specified configuration. Error message – ‘The remote certificate is invalid according to the validation procedure.’

It was nice to see such a detailed error, and indeed I quickly realised that the Use SSL property on the FTP source is on by default, but I had not configured SSL for my FTP site, so I promptly changed that to False and re-deployed.

Unfortunately, this didn’t work either, and this time the error wasn’t that useful – “The underlying connection was closed: An unexpected error occurred on a receive.” – but it was useful to be able to get more details in the BizTalk Services (Silverlight-based) portal, accessible through the Windows Azure Portal –

image

In the BizTalk Services Portal there’s a tracking tab, and in it I found the details of this particular issue –

image

Failed to connect to the FTP server using specified configuration. Error message – ‘The remote server returned an error: 227 Entering Passive Mode (100,84,86,13,192,39)..’

As far as I understand this is an issue with my FTP setup more than anything else – it is fair to expect to need to use passive mode for FTP (BizTalk Services will block the incoming FTP connections required for active mode), and my Azure-based FTP server configuration will struggle to accept incoming connections on semi-random ports, which I believe is what’s happening here (and if you want to read more, I found this very useful).

I decided to ignore this self-introduced issue and use another, public, FTP server I have access to, but it was good to see the level of error detail one can get from the BizTalk portal with ease.

With this ‘proper’ FTP server, and after a bit more fiddling with settings and re-deploying, it worked, and I could see my ‘messages’ being picked up from the FTP location and placed on the Service Bus Queue.

 

Cross posted on the Solidsoft blog

Deploying to BizTalk Services

The busy life at SolidSoft got in the way, but I finally got some hands-on time with the recently re-released (in preview) Windows Azure BizTalk Services.

It didn’t take long before I hit the first hurdle. It took a bit of time to get my BizTalk Services account set up, but lots has been written about that already, so I won’t repeat it. Beyond that – I created my first test project and wanted to deploy it, which didn’t work first time.

What is easily missed is that the deployment from Visual Studio to your BizTalk Services account uses SSL, established using the certificate you upload during account creation. This certificate needs to be installed in your certificate store, but if, like me and most others, you use a self-signed certificate, you must ensure you also install it in the Trusted Root Certification Authorities store.

Without that, the deployment from Visual Studio will fail with the error –

The underlying connection was closed: Could not establish trust relationship for the SSL/TLS secure channel.

Cross posted on the Solidsoft blog
