The SQL Azure team has announced that they are going to introduce a new feature for SQL Azure code-named “Data Explorer” (shouldn't code names be named after animals and food?). From the product team's blog, “Data Explorer” is a new concept which provides an innovative way to gain new insights from the data you care about. And the data you care about need not be in SQL Azure or SQL Server; it can be in any form, such as Excel, files (unstructured) and the Windows Azure Marketplace (list based).
My complaint about traditional data analytics tools such as SQL Server Analysis Services is the lack of integration with external data sources to provide more information about your data. With “Data Explorer”, you can discover additional data which you may need, as new datasets and data services from the Windows Azure Marketplace are automatically recommended for you.
“Data Explorer” is not available yet, but you can sign up for the preview here: https://dataexplorer-signup.sqlazurelabs.com
For screenshots and detailed explanations, check out the official blog: http://blogs.msdn.com/b/dataexplorer/
Microsoft doesn't actively talk about their datacentres, especially what they have running inside. Even the team managing the centres, known as Microsoft Global Foundation Services (they have their own website!), doesn't get much limelight for themselves. But today I found that David Aiken has a blog post with a couple of videos on what is powering our Windows Azure instances.
Below is a video showing the inside of the datacentres running not just Windows Azure but also the more familiar Windows Live (aka Hotmail), Office 365 and Xbox Live. Also shown is the Dublin datacentre, running the latest Generation 4 modular datacentre design.
Grab more videos here: http://www.davidaiken.com/2011/10/07/our-datacenters-are-awesomeness-in-a-box/
I noticed the Data Sync button on the left when I logged in to my Windows Azure portal this morning. SQL Azure Data Sync is the answer for customers who have web applications on SQL Azure but want a local on-premise copy for different purposes; an example would be a web store database sync'ed back to a local SQL Server instance to run Analysis Services. Another example would be synchronizing SQL Azure databases in different regions in a geographically distributed database design.
Sync Framework 2.1 and the October 4.0 CTP already provide a basic application framework to synchronize databases from different sources and editions, but most customers would like an out-of-the-box solution that works in most cases and leaves the specific scenarios to developers.
Since this service is still a CTP, there is no charge for trying it out; however, Windows Azure bandwidth charges apply when data is downloaded back to a local database or transferred between regions.
Since this is a preview, only the US and Europe regions are available at the moment. This means that if your SQL Azure database is hosted in the APAC region like mine, I believe it will incur some data transfer charges (I need to investigate further and ask my local technology evangelist).
I set up an instance of the service based in North Central US; note that your subscription info will be updated. Sync pairings are grouped into sync groups, with specific rule sets applied to each group of databases.
Once you have provisioned your Data Sync service, you can choose to set up synchronization between your local SQL Server and Azure, or sync between 2 SQL Azure instances.
SQL Azure to SQL Azure is a no-brainer, and since my SQL Server sits behind a firewall, I wondered how that would work, so I chose the first option. Data Sync Services provides a pretty intuitive UI for the setup.
Now I try to add a local SQL Server to the sync group.
Just like in Sync Framework, I have options to choose the sync direction.
Since the DB will be
Now I want to check whether this works with SQL Express, so I proceed to download the agent.
The agent runs as a Windows Service, so I have to specify the user account to use.
Strange, I got an error message. Let me try granting local admin rights to the service account.
Yup, it works!
Proceed back to the Azure portal to generate the agent key.
After that, I have to register the key in my local agent.
You have to add the DB you would like to sync by clicking on the Register button.
Then head back to the portal and click on 'Get Database List'.
And yes! It looks like I can use the agent to connect to any database in my local network. The next step is to add a SQL Azure database to the sync group.
This part is not as intuitive, as I need to enter the details manually. It could be because I only created the database just now.
Step 4: set up the schedule for when the sync happens and the conflict resolution rules.
Next, you have to define the datasets (tables) to be synchronized.
For someone with no prior experience of SQL Azure Data Sync, this screen looks a bit puzzling, especially when both the local and the cloud version of my database use the same name. Let me try my luck with the first one, then click 'Get Latest Schema'.
Oops?! Later I realized the first option points to the DB on SQL Azure, which is empty, hence the error. Let's choose the second one.
And objects get populated on the screen. Note the message in red: 'Some tables/columns do not meet the schema requirements. See the Report. You may still create a sync group by selecting the remaining tables/columns from the list.' Clicking on the 'See the Report' link tells me (on the next screen) what's wrong with my database schema.
Most of them are caused by the absence of a clustered index or the use of User Defined Datatypes, which are not supported on SQL Azure.
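For tables that fail only because they lack a clustered index, the usual fix is to add one before retrying the sync group. A minimal sketch; the table and column names here (dbo.OrderDetail, OrderDetailID) are hypothetical, not from my actual database:

```sql
-- Hypothetical table/column names, for illustration only.
-- SQL Azure requires every table to have a clustered index;
-- a clustered primary key satisfies that requirement.
ALTER TABLE dbo.OrderDetail
    ADD CONSTRAINT PK_OrderDetail PRIMARY KEY CLUSTERED (OrderDetailID);
```

Columns based on User Defined Datatypes would have to be converted to the equivalent built-in types before those tables can be included in the dataset.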
Note that I can also define filters for the dataset. Also note that if at this point you reselect the database from the drop-down list on top, you will lose whatever changes you made previously, even when you change back to the database you were just working on.
Now, at the end of the process, I see a deploy instruction at the top of the screen.
And boom! I got another not-so-helpful error message. I suspect it's because my database on the cloud is empty. Let me retry with something simpler.
Hmm, when I reprovision the database on the local sync agent (Pic 1), the changes are not updated on the SQL Azure portal and the Refresh button is greyed out (Pic 2). Looks like I have to wait some time for the information to be refreshed.
So on my 2nd try to add the SQL Azure database, the drop-down list works.
Voilà! This time, with the simpler database, the deployment works. I will come back to this later; this post is just to test out Data Sync in a simple scenario.
Since the database on SQL Azure is still empty, a sync will upload all tables in the dataset to the database. Let's confirm the sync via the log viewer.
Then I take a look at the SQL Azure database via SQL Server Management Studio to make sure everything works.
Note that Data Sync Services created its own tracking tables for each of the tables in the dataset. I checked my local database and it has the same set of new tables.
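If you want to see which extra objects the provisioning step created, you can list them from SSMS. A rough sketch; the '%tracking%' name filter is an assumption about the naming convention, so check what actually appears in your own database:

```sql
-- List user tables whose names suggest they are Data Sync tracking tables
-- (the '%tracking%' pattern is an assumption about the naming convention)
SELECT s.name AS schema_name, t.name AS table_name
FROM sys.tables AS t
JOIN sys.schemas AS s ON s.schema_id = t.schema_id
WHERE t.name LIKE '%tracking%'
ORDER BY s.name, t.name;
```

Running the same query against both the local and the SQL Azure database should show the matching sets of tracking tables mentioned above.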
That's my first experience with the SQL Azure Data Sync service. Having played with Sync Framework for the past 3 years, I find there are still features not available out of the box in Data Sync for which a custom application (normally built using Sync Framework) is required, for example:
1. Custom synchronization rules
2. Sync with SQL Server Compact
3. Database sync between SQL Azure and non-SQL Server databases such as Oracle and MySQL
4. Data sync between 2 or more SQL Server instances in different locations (no SQL Azure instance involved here)
My company, for instance, is working on a product based on scenarios 1 and 4.
Microsoft has already published a FAQ on Data Sync here.
I used my company's MSDN subscription to sign up for Windows Azure's MSDN offer. Basically, I was allocated 750 hours of extra small compute for the first 6 months after signing up. Even now that the offer is over and I am paying the full price whenever I use Azure, Microsoft Online Services (which manages the billing) still sends me a reminder of my so-called excess usage. Below is a screen clip taken from my email client.
What all this means is that if you accidentally exceed your usage, you had better delete your instance if it is only used for testing. Do not just shut down the instance, because it will still incur charges.
It was a great event, with around 180 attendees turning up. I have uploaded my slides and demo source code at the Azure in 1 day CodePlex website here: http://azureoneday.codeplex.com