October 26, 2011
I’ve been preparing a demonstration for a customer about SQL Azure Reporting, so I’ve been playing around with it a little and thought I’d share, at a high level, what I’ve done (nothing fancy, I’m afraid, but if you’ve never looked at it, this should give you an idea of what’s involved).
The first step was to get a data source to work with, and at the moment that means SQL Azure database(s), which – of course – makes perfect sense. I promptly created a SQL Azure database server and, using the SQL Azure Migration Wizard, migrated good old Northwind onto it.
Now that I have a data source with some familiar data, it was time to create a report.
Given that I’m by no means a reporting expert, and that reporting isn’t really the point of the demonstration, I didn’t try to get too creative and simply created a report of customers by country.
I started by opening Visual Studio 2008 and creating a new project of type ‘Report Server Project Wizard’.
The first step in the wizard was to define a data source, and it’s great that SQL Azure is an entry in the list of possible types; all that’s needed is to provide the connection string, and the UI helps make that easy too.
It was simply a case of typing in my database server name and credentials and providing the database name. The only other thing I needed to do was to set TrustServerCertificate to True in the properties accessed through the Advanced button.
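For reference, the resulting connection string looks something like the sketch below – the server, user and database names are placeholders, not the ones I actually used:

```text
Server=tcp:myserver.database.windows.net;Database=Northwind;
User ID=myuser@myserver;Password=...;Encrypt=True;TrustServerCertificate=True
```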
I then used the Query Builder to select the entire Customers table and carried on with the wizard, specifying Tabular format, grouping by Country and listing the detail fields (you can see I’ve been very creative).
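The query the builder produced was nothing more than a plain select over the table – something along these lines (the grouping by Country happens in the report layout, not in the query itself):

```sql
SELECT CustomerID, CompanyName, ContactName, City, Country
FROM Customers
```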
Then, at the last page of the wizard, it was time to specify the deployment location. I replaced the default value of http://localhost/ReportServer with the address of my Azure-based SQL Reporting ‘server’, which I copied from the management portal.
This, of course, is not necessary at this stage; it is perfectly fine to start working against a local reporting server and to deploy the report later, either through the management portal or by changing the server property in the report project’s properties and deploying from Visual Studio.
With the wizard complete I could now run my report from Visual Studio and see the results. The only thing I noticed was that I had to provide the credentials to the data source every time I ran the report.
This might be desirable in some cases, but I wanted a more streamlined experience, so I stored the database credentials in the data source. The report file itself will be protected by the management portal and its login, so these don’t get compromised.
With the data source credentials sorted, I could now deploy the project straight from Visual Studio, and after a minute or so it was visible in the management console. Clicking on the report renders it successfully –
So – at this point the report is fully operational and can be accessed via a publicly available URL. Access is governed by username/password pairs set up through the admin console and permissions set on the report itself (or a folder), and that’s probably good enough for many scenarios for departmental reports inside the organisation.
For more public reports, ones available to external parties for example, I think that re-hosting the report in a web role and leveraging ACS (the Access Control Service) for access control would be a lot more flexible and manageable, and so I moved on to do this as well –
Embedding the report simply meant, in my little example anyway, using the ReportViewer control on an ASP.NET page; I configured the ServerReport property of the viewer with the relevant URIs and made sure to set the control’s ProcessingMode property to ‘Remote’.
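In markup this amounts to something like the following sketch – the report server URL and report path are placeholders for the values copied from the portal:

```aspx
<%-- ReportViewer in remote mode, pointed at the Azure-hosted report server --%>
<rsweb:ReportViewer ID="ReportViewer1" runat="server" ProcessingMode="Remote">
    <ServerReport ReportServerUrl="https://myserver.reporting.windows.net/ReportServer"
                  ReportPath="/Demo/CustomersByCountry" />
</rsweb:ReportViewer>
```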
I then used code to assign the fixed credentials to Reporting Services. Once again – my application is going to be protected by ACS and this is server-side code, so I am comfortable with embedding these in the code (they should be in configuration, of course…)
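The usual way to do this is to implement the viewer’s IReportServerCredentials interface and hand an instance to the ServerReport; a minimal sketch, with placeholder user name, password and server name, might look like this:

```csharp
using System.Net;
using System.Security.Principal;
using Microsoft.Reporting.WebForms;

// A sketch only: supplies fixed forms credentials to SQL Azure Reporting.
// The user name, password and authority below are placeholders.
public class FixedReportServerCredentials : IReportServerCredentials
{
    public WindowsIdentity ImpersonationUser { get { return null; } }
    public ICredentials NetworkCredentials { get { return null; } }

    public bool GetFormsCredentials(out Cookie authCookie,
        out string userName, out string password, out string authority)
    {
        authCookie = null;
        userName = "reportUser";                       // placeholder
        password = "reportPassword";                   // placeholder (should come from configuration)
        authority = "myserver.reporting.windows.net";  // placeholder
        return true; // tell the viewer to use forms authentication
    }
}

// Assigned once, e.g. in Page_Load:
// ReportViewer1.ServerReport.ReportServerCredentials = new FixedReportServerCredentials();
```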
At this point I could run my little ASP.NET application locally, and it would successfully access the report in Reporting Services and display it on screen –
The last step, then, was to add support for STS.
I made all the necessary configuration in the management portal, copied the WS-Federation metadata URL, and then used the Add STS Reference wizard to make the necessary configuration changes to my application –
The result of the wizard was a set of entries added to my web.config, to which I added, inside the <authorization> element under <system.web>, the following –
<deny users="?" />
Running the application now automatically redirects me to ACS, and – as I have configured two possible identity providers (Windows Live ID and Google) – I am presented with a selection screen –
Choosing the provider I want, I am redirected to the login screen, hosted by the identity provider, and from there back to my application. The second time I access my reporting application these redirects will still happen, but cookies held by all the parties will remember me and I won’t need to sign in again until I sign out or the cookies expire.
The only thing to note is that the ACS configuration includes the URL of the application, so once tested locally this needs to change to include the URL on Windows Azure. Once that’s done and the application is deployed to Windows Azure, I can browse to my reporting application, log in using, for example, my Windows Live ID, and view a report on SQL Azure Reporting.