LocalResource for Diagnostics, the size config matters

In one of my Azure projects, we use local storage to temporarily hold some logging data files and then leverage Windows Azure Diagnostics (WAD) to transfer these files to a storage account.

We encountered a weird problem when we tried to configure the size of that local resource, which I'll call LocalStorageLogDump below. My initial assumption about the sizeInMB setting was that as long as it stayed below the virtual machine's local storage limit, which is about 200 GB for a small instance, everything would be fine. However, when I put 5000 MB (5 GB) in the config, the diagnostics monitor failed to start.

By reflectoring the diagnostics assembly, and later reading the following article, I realized that any local resource used by WAD has to be smaller than a local resource named DiagnosticStore, which defaults to 4 GB and does not appear in the .csdef file. After I explicitly added that configuration entry and gave it a larger value, WAD came back to work.

http://msdn.microsoft.com/en-us/library/microsoft.windowsazure.diagnostics.diagnosticmonitorconfiguration.overallquotainmb.aspx

<LocalResources>
  <LocalStorage name="LocalStorageLogDump" cleanOnRoleRecycle="true" sizeInMB="5000" />
  <LocalStorage name="DiagnosticStore" cleanOnRoleRecycle="false" sizeInMB="10000" />
</LocalResources>
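
For reference, the same overall quota is exposed in code as DiagnosticMonitorConfiguration.OverallQuotaInMB, the property the article above documents. Here is a minimal sketch of raising it in a role's OnStart, assuming the classic WAD 1.x API and the standard diagnostics connection string setting name:

// Minimal sketch (classic WAD 1.x API): raise the overall diagnostics quota
// to fit within the enlarged DiagnosticStore local resource (10000 MB above).
using Microsoft.WindowsAzure.Diagnostics;
using Microsoft.WindowsAzure.ServiceRuntime;

public class WorkerRole : RoleEntryPoint
{
    public override bool OnStart()
    {
        DiagnosticMonitorConfiguration config =
            DiagnosticMonitor.GetDefaultInitialConfiguration();

        // Must stay below the DiagnosticStore size configured in the .csdef.
        config.OverallQuotaInMB = 8192;

        DiagnosticMonitor.Start(
            "Microsoft.WindowsAzure.Plugins.Diagnostics.ConnectionString", config);

        return base.OnStart();
    }
}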

[Update: 9/16/2013]

According to this post, the cleanOnRoleRecycle attribute is better set to "false", in case you encounter the same or a similar issue.


A tool to simplify and automate your SQL Azure database backup

So you have one or more SQL Azure databases whose data you take seriously, and you want to back them up on a regular basis, just as you would for on-premises databases. However, the backup/restore story in Azure is different from on-premises; what you can do is leverage the existing options provided by the Azure platform. The following is a list of references you will want to check out.

Business Continuity in SQL Azure
http://msdn.microsoft.com/en-us/library/windowsazure/hh852669.aspx

SQL Azure Backup and Restore Strategy
http://social.technet.microsoft.com/wiki/contents/articles/1792.sql-azure-backup-and-restore-strategy.aspx

How to Use Data-Tier Application Import and Export with SQL Azure
http://social.technet.microsoft.com/wiki/contents/articles/2639.aspx

Copying Databases in SQL Azure
http://msdn.microsoft.com/en-us/library/windowsazure/ff951624.aspx

However, if you really want to automate the backup operation and save some manual effort, you need to write code or scripts to orchestrate all of these existing options and get the most out of them.
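
As one example of such a building block, the database-copy option from the "Copying Databases in SQL Azure" article above can be driven from a few lines of C#. This is only a sketch; the server, database, and credential values are placeholders:

// Sketch: start a server-side database copy on SQL Azure.
// Placeholders: myserver, MyDb, and the admin credentials.
using System;
using System.Data.SqlClient;

class BackupByCopy
{
    static void Main()
    {
        // CREATE DATABASE ... AS COPY OF must be run against the master database.
        var connectionString =
            "Server=tcp:myserver.database.windows.net;Database=master;" +
            "User ID=admin@myserver;Password=<password>;Encrypt=True;";

        using (var conn = new SqlConnection(connectionString))
        {
            conn.Open();
            // The copy runs asynchronously on the server side;
            // poll sys.dm_database_copies to track its progress.
            using (var cmd = new SqlCommand(
                "CREATE DATABASE MyDb_Backup AS COPY OF MyDb", conn))
            {
                cmd.ExecuteNonQuery();
            }
        }

        Console.WriteLine("Copy started; poll sys.dm_database_copies for status.");
    }
}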

Now, here is the good news: a tool, SQL Azure Data Protector, has been created to do all of these things for you. All you need to do is:

  • Download and install the little tool on the machine where you usually schedule Ops tasks
  • Create a profile (configuration such as server names, credentials, locations, storage account, etc.)
  • Schedule a periodic task using, for example, Windows Task Scheduler

If you want to understand in more detail how this tool works, you can read the documentation; of course, you can also check out the source code directly.

Till next time,

-Jack


win 8 doesn’t like WEP Wi-Fi connection

My newly installed win 8 system failed to connect to my home Wi-Fi and provided no useful error information. I tried many times, and it always failed. Then I reconfigured my router to change the security type to WPA2-Personal with AES as the encryption type, and now win 8 connects to my home Wi-Fi like a charm.

WEP is basically insecure anyway, so if you encounter the same issue as mine, you can try the same approach and see if it comes to the rescue.

Till next time,

-Jack


I like this statement: The 100% coverage practice is costly only when it forces substandard code to be well designed.

It is from a post by Patrick Smacchia on Simple-Talk.

Patrick wrote an essay about writing unit test code in C#, in which I personally found a lot of good points that I could not agree with more. For example:

  • Not all classes need to be 100% covered by tests or even covered at all.
  • The 100% coverage practice is costly only when it forces substandard code to be well designed.

The second one especially I find inspirational and encouraging. From time to time I have faced situations where, for certain pieces of code or logic, it was difficult to write a comprehensive test suite covering all the branches. Honestly, most of the time I would choose to leave the coverage short of 100% and accept the fact. But after reading this, from now on I will try my best to achieve 100% coverage, not only by writing comprehensive test cases, but also by refactoring the "difficult" logic into well-designed, intuitive, enjoyable-to-read code.
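
As a hypothetical illustration of what such a refactoring can look like (the names here are invented for the example), pulling branch-heavy logic out of a method that also does I/O into a small pure function makes every branch trivial to cover:

// Hypothetical example: the discount rule used to live inside a method that
// also hit the database, so its branches were awkward to cover with tests.
// As a pure static function, 100% branch coverage is a few trivial test cases.
public static class Pricing
{
    public static decimal ApplyDiscount(decimal price, int itemCount, bool isVip)
    {
        if (isVip) return price * 0.8m;           // VIP customers get 20% off
        if (itemCount >= 10) return price * 0.9m; // bulk orders get 10% off
        return price;                             // no discount otherwise
    }
}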
