Monday, October 29, 2007

Quick method to initialize a strongly-typed DataRow in C#

Last night I needed a routine to initialize all the fields in a DataRow of a strongly-typed DataSet. I'm working in Visual Studio 2005 and the .Net Framework 2.0. Below is the code that I came up with. Note that I'm slowly switching over to C# for new projects.

This code does not cover every possible data type, but it did the trick for me:

private void InitializeNewDataRow(DataRow dr, System.Data.DataColumnCollection dcs)
{
    object[] fields = dr.ItemArray;
    DataColumn[] dca = new DataColumn[fields.Length];
    dcs.CopyTo(dca, 0);
    // Start at 1, leaving column 0 (e.g. an auto-increment key) untouched.
    for (int i = 1; i < fields.Length; i++)
    {
        Type t = dca[i].DataType;
        // int is an alias for Int32, so the integral types collapse to one test.
        if (t == typeof(Int16) || t == typeof(Int32) || t == typeof(Int64))
            fields[i] = 0;
        else if (t == typeof(bool))
            fields[i] = false;
        else if (t == typeof(string))
            fields[i] = string.Empty;
        else if (t == typeof(decimal))
            fields[i] = 0m;
        else if (t == typeof(DateTime))
            fields[i] = DateTime.MinValue;
    }
    dr.ItemArray = fields;
}
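As a quick sanity check, here is a self-contained usage sketch. The People table and its columns are invented purely for illustration; note how every field except column 0 gets a sensible default instead of DBNull:

```csharp
using System;
using System.Data;

static class DataRowDemo
{
    public static DataRow MakeInitializedRow()
    {
        // Hypothetical schema, just for the demo.
        var table = new DataTable("People");
        table.Columns.Add("Id", typeof(int));      // column 0: left untouched
        table.Columns.Add("Name", typeof(string));
        table.Columns.Add("Age", typeof(int));
        table.Columns.Add("Active", typeof(bool));

        DataRow row = table.NewRow();              // all fields start as DBNull
        InitializeNewDataRow(row, table.Columns);
        return row;                                // Name = "", Age = 0, Active = false
    }

    static void InitializeNewDataRow(DataRow dr, DataColumnCollection dcs)
    {
        object[] fields = dr.ItemArray;
        var dca = new DataColumn[fields.Length];
        dcs.CopyTo(dca, 0);
        for (int i = 1; i < fields.Length; i++)
        {
            Type t = dca[i].DataType;
            if (t == typeof(Int16) || t == typeof(Int32) || t == typeof(Int64))
                fields[i] = 0;
            else if (t == typeof(bool)) fields[i] = false;
            else if (t == typeof(string)) fields[i] = string.Empty;
            else if (t == typeof(decimal)) fields[i] = 0m;
            else if (t == typeof(DateTime)) fields[i] = DateTime.MinValue;
        }
        dr.ItemArray = fields;
    }
}
```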
Hope it helps.
Joe Kunk

Monday, October 08, 2007

Server and PC virtualization as a Disaster Recovery strategy

Please excuse me as I take a short diversion from application development topics for this post.

I spent a stint as an Information Technology Director several years ago. One of the difficult challenges I faced was implementing a viable disaster recovery plan to protect the business and ensure continued operations in the event of any kind of business interruption.

The best option available at the time was to contract with a firm that would provide compatible servers and PCs within a specified timeframe in the case of a business interruption - basically a mobile computer center in the back of a semi-truck. Using recent off-site backup tapes, we would recreate our computing environment and continue to operate.

My concern was that backup tapes and off-site tape procedures are often less than 100% reliable. The slightest error on one of the tapes could invalidate the restore. Differences between the supplied hardware and that used in-house could adversely affect the operation of the server. While test sessions were available from the vendor to help ensure that everything was ready to go prior to an interruption, there was always the knowledge that bringing up a duplicate server/PC environment would be difficult and time-consuming under the best conditions. To be clear, I was much more concerned about successfully recreating the server environment than the end-user PC environment.

The reason for this post is to point out that server and PC virtualization software systems have advanced to the point that they can serve as the basis for a new and more effective approach to disaster recovery. I won't mention vendor or product names but a quick Internet search for "server virtualization" will provide the current list.

In a nutshell, virtualization software allows the creation of a virtual machine image that represents all the resources of a computer as a single file or collection of files, including the hardware, operating system, disk drives, applications, data, etc. The host application loads and runs the virtual machines and allocates physical resources among them.

This virtualization layer does carry a performance penalty, but with the latest multi-core servers and PCs it can be a minor concern.

Now consider this: what if you implemented all the daily production servers in your data center as virtual servers, such that the actual physical servers rarely ran more than the virtualization software?

Virtual server backup then becomes a simple file copy to an external disk drive. One-terabyte (1,000 gigabyte) external drives are available now for just a couple hundred dollars each. By rotating backups among at least two external drives, you always have your latest server backup off site.

Most PC users will not want to work inside a virtual machine image on a regular basis. It is best to create a master end-user PC configuration that has all the necessary software installed and properly configured. This configuration can be written to an image file by software designed to provide hard drive cloning and restored onto a client PC when needed. Restoration tends to be fast and has the advantage of providing a uniform end-user starting point for the disaster recovery environment.

Under this scenario, disaster recovery becomes much more viable: to get properly running servers, all you have to do is obtain any sufficiently powerful server that can run the virtualization host software, load and run the saved server virtual machine file, and you're up! You are no longer dependent on obtaining emergency server hardware that is configured exactly like your in-house servers. In my opinion, this can make the difference between a successful disaster recovery and one that is disappointing at the least.

Attach a couple PCs that have been updated with the reference end-user configuration image, and you are up and running in your own business environment, just the way that you are used to seeing it.

I am no longer responsible for implementing this kind of strategy, but it makes so much sense to me that I wanted to share it in hopes that it can benefit those who are facing this challenge today.

Hope it helps,

Joe Kunk

Friday, October 05, 2007

WOW! - Microsoft to Release the Source Code for the .NET Framework Libraries

Last Wednesday, Scott Guthrie announced on his blog the upcoming availability of the source code for portions of the .Net Framework. This is huge! What better way to learn .Net?!

For more information, see his post at http://weblogs.asp.net/scottgu/archive/2007/10/03/releasing-the-source-code-for-the-net-framework-libraries.aspx.

Joe Kunk

Thursday, October 04, 2007

Dynamically create a simple array in VB.Net

Using an array is more efficient than a collection if an array can suffice, but in the past I have found myself resorting to a collection simply because I did not know the number of elements needed until runtime. It turns out it is easy to create a simple System.Array at runtime and specify its length at that time.

For example, to create a string array of 5 elements (0 to 4):
Dim MyArray As System.Array
MyArray = System.Array.CreateInstance(GetType(String), 5)

which is equivalent to the following code at compile time:
Dim MyArray(4) As String

The same technique can be used to dynamically create arrays of any type at runtime.

Note that the above example assumes Option Strict is off. If Option Strict is on, you must use the .SetValue and .GetValue methods to access the array elements, as in this sample:

'Option Strict On
Dim mya As System.Array
mya = System.Array.CreateInstance(GetType(String), 3)
mya.SetValue("Word1", 0)
mya.SetValue("Word2", 1)
mya.SetValue("Word3", 2)
MessageBox.Show("Element (2) is " & _
Convert.ToString(mya.GetValue(2))) 'Returns "Word3"
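Since I mentioned I'm switching to C#, here is the same technique there (a sketch mirroring the VB sample above). One nicety in C#: the returned Array can be cast straight to string[], so you can index it normally instead of going through SetValue/GetValue:

```csharp
using System;

class ArrayDemo
{
    static void Main()
    {
        // Length is only known at runtime in the real scenario;
        // 3 is just for the demo.
        int length = 3;
        Array mya = Array.CreateInstance(typeof(string), length);
        mya.SetValue("Word1", 0);
        mya.SetValue("Word2", 1);
        mya.SetValue("Word3", 2);

        // Cast back to string[] for ordinary indexed access.
        string[] words = (string[])mya;
        Console.WriteLine("Element (2) is " + words[2]); // prints "Element (2) is Word3"
    }
}
```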

Hope that helps.
Joe Kunk