Archive for September, 2009

If you’re into Silverlight/WPF development you’ll surely appreciate Karl Shifflett’s XAML Power Toys (http://tinyurl.com/karlpowertoys), an awesome free tool for WPF/Silverlight development.

The XAML Power Toys full feature set includes:

  • Create ViewModel Class – from a VB.NET or C# code window, easily create a ViewModel stub that includes commands and an exposed data class. Optionally you can elect to re-implement all Model properties on the ViewModel.
  • Create Silverlight DataForm For Selected Class – quickly create a DataForm complete with bindings that are easily associated with properties on the source class
  • Create WPF or Silverlight DataGrid For Selected Class – quickly create a DataGrid complete with bindings that are easily associated with properties on the source class
  • Create WPF ListView For Selected Class – quickly create a ListView complete with bindings that are easily associated with properties on the source class
  • Create Business Form For Selected Class – quickly create a form complete with bindings that are easily associated with properties on the source class
  • Create Business Form – quickly create a form without selecting an entity class.  Great for creating unbound forms or just laying out a Grid.
  • Show Fields List For Selected Class – display a list of class fields similar to Microsoft Access.  Allows dragging of fields to create controls
  • Extract Properties To Style – allows selecting a control, choosing desired properties and have those selected properties extracted to a newly created style
  • Group Into – allows selecting one or more controls and grouping them inside a new parent control.  Many options are provided
  • Change Grid To Flow Layout – allows selecting one or more controls; removes all MinWidth, MinHeight and Margin properties and sets all row and column definitions to Auto.
  • Chainsaw Minimize Cider Designer XAML – allows selecting one or more controls; removes all MinWidth, MinHeight, x:Name, Name and Margin properties and sets all row and column definitions to Auto.
  • Remove Margins – allows selecting one or more controls and removes the Margin property from them
  • Edit Grid Columns and Rows – allows selecting a grid and then adding or removing rows and columns
  • Set Defaults For Created Controls – allows customizing the initial properties that are set when the software creates a new control
  • About Box – see the version of XAML Power Toys installed on your system.  The current version of XAML Power Toys is always displayed at the top of this page below the title

I’m starting a new series of blog posts in which I’m going to share a few tips and tricks I’ve picked up over the past few years. I’m going to start with one of the most common mistakes I come across when reading code.

To demonstrate this, I’m going to use LINQ to SQL as my data access method (although the problem can be found in any kind of data access technology) and use the same model I used in my earlier Caching series.

As you can see the model is very simple and contains just three entities: a Person, its Phones and its Email addresses.

Next I’m going to create a web form in which I want to display a list of Persons along with their email addresses. To do that I’m going to add a Repeater control, which I’m going to bind to the Persons retrieved from the store.

<asp:Repeater ID="personGrid" runat="server" OnItemDataBound="personGrid_ItemDataBound">
  <ItemTemplate>
    <asp:Label ID="lblName" runat="server" Text='<%# string.Format("{0} {1}", Eval("FirstName"), Eval("LastName")) %>'></asp:Label> |
    <asp:Label ID="lblEmail" runat="server" Text='Email' Font-Bold="True" Font-Italic="True"></asp:Label>
  </ItemTemplate>
  <SeparatorTemplate><br /></SeparatorTemplate>
</asp:Repeater>

Since the Email address is not in the same entity as the Person, I’m going to take advantage of the repeater’s OnItemDataBound event to fetch each person’s Email address. So typically you would find something like this in the code behind:

protected void Page_Load(object sender, EventArgs e)
{
	if (!Page.IsPostBack)
	{
		DataBind();
	}
}

public override void DataBind()
{
	base.DataBind();

	using (AdventureworksDataContext context = new AdventureworksDataContext())
	{
		var query = from p in context.Persons
		            select p;

		personGrid.DataSource = query.Take(10);
		personGrid.DataBind();
	}
}

protected void personGrid_ItemDataBound(object sender, RepeaterItemEventArgs e)
{
	Label lblEmail = e.Item.FindControl("lblEmail") as Label;
	Person currentPerson = e.Item.DataItem as Person;

	if (lblEmail != null && currentPerson != null)
	{
		using (AdventureworksDataContext context = new AdventureworksDataContext())
		{
			var personEmail = context.EmailAddresses.Where(ea => ea.BusinessEntityID == currentPerson.BusinessEntityID).FirstOrDefault();
			lblEmail.Text = (personEmail != null) ? personEmail.EmailAddress1 : "";
		}
	}
}

This will work just fine, but what some developers don’t realize is that it issues one extra query against the database for every Person record that gets bound (the classic “N+1 selects” problem), because each time a Person record is data bound I query for its Email address in the ItemDataBound event handler.
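One way to see those extra round trips for yourself is to point the DataContext’s `Log` property at a `TextWriter` and watch the generated SQL. A minimal sketch (a quick console test against the same model; `AdventureworksDataContext` and the entity names come from the post, and the body mirrors what the ItemDataBound handler does):

```csharp
using System;
using System.Linq;

// DataContext.Log echoes every SQL command the context sends,
// so the N+1 pattern is plainly visible in the console output.
using (AdventureworksDataContext context = new AdventureworksDataContext())
{
    context.Log = Console.Out;

    var persons = context.Persons.Take(10).ToList();   // 1 SELECT
    foreach (var p in persons)
    {
        // Mirrors the ItemDataBound handler: one more SELECT per person.
        var email = context.EmailAddresses
            .Where(ea => ea.BusinessEntityID == p.BusinessEntityID)
            .FirstOrDefault();
    }
}
```

Running this against AdventureWorks prints eleven SELECT statements for ten rows, which is exactly the cost hidden inside the event handler.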

What we could do instead is take advantage of the new features in .NET 3.5 and LINQ, such as data projection, data shaping and anonymous types, to prepare a read-only view to bind to the repeater. A more optimized and much cleaner version of this code would look something like this:

protected void Page_Load(object sender, EventArgs e)
{
	if (!Page.IsPostBack)
	{
		DataBind();
	}
}

public override void DataBind()
{
	base.DataBind();

	using (AdventureworksDataContext context = new AdventureworksDataContext())
	{
		var query = from p in context.Persons
			    select new { 
				FirstName = p.FirstName, 
				LastName = p.LastName, 
				Email = p.EmailAddresses.Select(e => e.EmailAddress1).FirstOrDefault() };

		personGrid.DataSource = query.Take(10);
		personGrid.DataBind();
	}
}

//protected void personGrid_ItemDataBound(object sender, RepeaterItemEventArgs e)
//{
//  Label lblEmail = e.Item.FindControl("lblEmail") as Label;
//  Person currentPerson = e.Item.DataItem as Person;

//  if (lblEmail != null && currentPerson != null)
//  {
//    using (AdventureworksDataContext context = new AdventureworksDataContext())
//    {
//      var personEmail = context.EmailAddresses.Where(ea => ea.BusinessEntityID == currentPerson.BusinessEntityID).FirstOrDefault();
//      lblEmail.Text = (personEmail != null) ? personEmail.EmailAddress1 : "";
//    }
//  }
//}

This way I can also remove the OnItemDataBound event handler completely from the repeater (or keep it just for visual tweaks, which is my preference) and bind the Email label to the new field of the anonymous type I’ve just created. The page markup is going to look something like this:

<asp:Repeater ID="personGrid" runat="server">
  <ItemTemplate>
    <asp:Label ID="lblName" runat="server" Text='<%# string.Format("{0} {1}", Eval("FirstName"), Eval("LastName")) %>'></asp:Label> |
    <asp:Label ID="lblEmail" runat="server" Text='<%# Eval("Email") %>' Font-Bold="True" Font-Italic="True"></asp:Label>
  </ItemTemplate>
  <SeparatorTemplate><br /></SeparatorTemplate>
</asp:Repeater>

Although this kind of behavior is pretty obvious and I’m sure most of you are already aware of it, there are times when it is disguised and rather difficult to spot. For example, the following code has exactly the same problem, since the GetEmailAddress method is going to be called once per Person record, even though there is no ItemDataBound handler.

<asp:Repeater ID="personGrid" runat="server">
  <ItemTemplate>
    <asp:Label ID="lblName" runat="server" Text='<%# string.Format("{0} {1}", Eval("FirstName"), Eval("LastName")) %>'></asp:Label> |
    <asp:Label ID="lblEmail" runat="server" Text='<%# GetEmailAddress((int)Eval("ID")) %>' Font-Bold="True" Font-Italic="True"></asp:Label>
  </ItemTemplate>
  <SeparatorTemplate><br /></SeparatorTemplate>
</asp:Repeater>
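The post never shows GetEmailAddress itself, so here is a hedged sketch of what such a page method presumably looks like; each call opens its own context and runs its own query, which is exactly the hidden per-row cost (the method name comes from the markup above, everything else is an assumption based on the earlier handler):

```csharp
// Hypothetical implementation of the GetEmailAddress helper referenced
// in the markup: binding N rows still issues N separate SELECTs, just
// without an ItemDataBound handler to give the pattern away.
protected string GetEmailAddress(int businessEntityId)
{
    using (AdventureworksDataContext context = new AdventureworksDataContext())
    {
        var personEmail = context.EmailAddresses
            .Where(ea => ea.BusinessEntityID == businessEntityId)
            .FirstOrDefault();
        return (personEmail != null) ? personEmail.EmailAddress1 : "";
    }
}
```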

And of course this behavior doesn’t apply only to repeaters but to every list control (DropDownList, GridView, CheckBoxList etc.) that is bound to fields belonging to more than a single entity.


GPS Tools


If you’re into geo-app development, the following tools should come in handy.

GeoFrameworks and Jon Person released their GPS Framework for .NET on CodePlex! The really nice thing about this framework is that it works with both the full .NET Framework and the Compact Framework. Whether you’re developing for a Windows Mobile, embedded, laptop or desktop computer, it should handle all your needs.

Just a few of the features are:

  • Automatic detection of serial GPS devices (or devices found via a virtual serial port).
  • Automatic detection of Bluetooth devices (when using the Microsoft Bluetooth stack.)
  • Automatic baud rate detection.
  • Automatic recovery of lost connections.
  • Advanced GPS precision via Kalman filtering.
  • Support for desktops and mobile devices.
  • Support for real-time GPS data without relying on Microsoft’s GPS API.
  • Support for control and monitoring of precision.
  • A set of animated gauge controls for desktops and mobile devices (Altimeter, Compass, Speedometer, SatelliteViewer, SatelliteSignalBar).

    What a great honor this is, I’m featured on the home page of the Greek MSDN site ;-).


    Ok, you’ve developed your cloud application and you’re ready to publish it to Azure. But wait a minute: how are you going to upload your database to SQL Azure? You can’t just do a backup and restore to the cloud, so what are your options?

    Well, actually there are two ways you could deploy a database to SQL Azure. With the first one, you manually create your DB schema installation script and run it using SQL Server Management Studio, hoping that everything will work well (a lot of SQL Server features are just not there yet, e.g. multi-part column names).

    But you can also use the SQL Azure Migration Wizard to do that for you. George Huey’s SQL Azure Migration Wizard (MigWiz) offers a streamlined alternative to the SQL Server Management Studio (SSMS) Script Wizard for generating SQL Azure schemas that conform to the service’s current Data Definition Language (DDL) limitations. You can download the binaries, source code, or both from CodePlex. You can learn more about MigWiz and watch a screencast in Wade Wegner’s SQL Azure Migration Wizard post.


    A few weeks back Microsoft announced the first CTP of SQL Azure, its new cloud relational persistence offering. In this post I will show you how you can connect to and query your SQL Azure databases from your PC. There are two ways you can interact with SQL Azure: one is through the SQLCMD command line program and the other is through SQL Server Management Studio. To connect to SQL Azure using the latter, follow these simple steps:

    1. Procure an activation code for SQL Azure (since it is in CTP now): http://connect.microsoft.com
    2. Go to http://lx.azure.microsoft.com/ and activate your key. Then you can access SQL Azure from the portal (https://sql.azure.com) and create or drop databases, find out the connection strings and so on.
    3. To access SQL Azure from SQL Server Management Studio, open SQL Server Management Studio 2008
    4. Cancel the login dialog box that will pop up
    5. Click on ‘New Query’
    6. Choose Database engine as the server type
    7. Enter the server name of the SQL Azure server. This information can be obtained from the SQL Azure portal. (no prefix, no suffix, no protocol should be appended).
    8. Enter the login name (<login_name>@servername)
    9. Enter password
    10. Click on Options
    11. Enter the database name (master, or any other database you have created). If you don’t specify one, you won’t be able to switch databases later, since the USE statement is not supported.
    12. Specify the TCP protocol
    13. Click Connect

    After that you’ll be able to run your SQL commands against your cloud SQL Server.
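The SQLCMD route mentioned at the start works the same way; a sketch of the invocation, with placeholder server, login and password values you would substitute from the SQL Azure portal (note the login@servername form and the explicit database name, since USE is not supported):

```shell
# Connect to the master database on a SQL Azure server.
# "myserver", "mylogin" and "mypassword" are placeholders from the portal.
sqlcmd -S tcp:myserver.database.windows.net -U mylogin@myserver -P mypassword -d master
```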


    Before picking up where I left off yesterday, I have to first make a small disclaimer. I received a couple of comments stating that the caching solution I presented wasn’t that sophisticated or complete, or that the cache keys I used were not the right ones, and the answer is of course “yes”. The solution I presented was far from perfect, but it served well as a simplified demonstration of the problems one might face when caching LINQ to SQL entities.

    Ok, now that we’re done with the formalities, let’s see what was wrong with the solution shown earlier. To demonstrate the problem, I’m going to add a single button to my web form and attach an event handler to it. In the event handler I’m going to fetch the Person from the cache and lazy-load his phones.

    public partial class _Default : System.Web.UI.Page
    {
    	protected void Page_Load(object sender, EventArgs e)
    	{
    		if (!IsPostBack)
    		{
    			DataBind();
    		}
    	}
    
    	public override void DataBind()
    	{
    		base.DataBind();
    
    		employeesRepeater.DataSource = new List<Person>() { GetPerson("M") };
    		employeesRepeater.DataBind();
    
    	}
    
    	protected void btnGetPhone_Click(object sender, EventArgs e)
    	{
    		Person person = GetPerson("M"); //This person comes from the cache
    		var phones = person.PersonPhones.ToList(); //Attempt to lazy load throws exception as Context has been disposed
    	}
    
    	protected Person GetPerson(string lastNameStartsWith)
    	{
    		using (AdventureworksDataContext context = new AdventureworksDataContext())
    		{
    			return CacheGet<Person>(
    					() => context.Persons.Where(p => p.LastName.StartsWith(lastNameStartsWith)).FirstOrDefault(),
    					string.Format("Person_LastName_StartsWith:{0}", lastNameStartsWith));
    		}
    	}
    
    	protected T CacheGet<T>(Func<T> loader, string cacheKey) where T : class
    	{
    		var cachedObject = this.Cache[cacheKey] as T;
    		if (cachedObject == null)
    		{
    			cachedObject = loader.Invoke();
    			this.Cache[cacheKey] = cachedObject;
    		}
    		return cachedObject;
    	}
    }

    If you run this code and click the button you’ll end up with the following exception:
    Cannot access a disposed object.
    Object name: ‘DataContext accessed after Dispose.’.

    That’s the real problem when caching LINQ to SQL entities: they hold a dependency (a reference) on the DataContext that created them, and as such they cannot safely be cached. There are ways to work around this problem by detaching the entities and re-attaching them to the currently loaded DataContext. But since there is no supported way to actually detach an entity from its context, you can only rely on hacks to do it. One way to detach an entity is to serialize it (of course the entities must be declared serializable), cache the serialized form, and then deserialize it when retrieving it from the cache and attach it to the current context. Another is to manually detach an entity (set certain properties to null), and yet another is to manually clone it. Each of these methods has advantages and disadvantages, like the fact that you’ll lose the entire object graph if you serialize it, or the complexity of the manual detaching path. One thing is for sure though: there is no easy way around it.
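A minimal sketch of the serialize-to-detach workaround. This assumes the DBML serialization mode is set to Unidirectional, so the generated entities carry [DataContract]/[DataMember] attributes; `PersonDto` below is a hypothetical stand-in for such a generated entity, and `DetachBySerializing` is a helper name I made up:

```csharp
using System;
using System.IO;
using System.Runtime.Serialization;

// Hypothetical stand-in for a generated LINQ to SQL entity; with the DBML
// serialization mode set to Unidirectional, the designer emits these
// [DataContract]/[DataMember] attributes on the real entities.
[DataContract]
public class PersonDto
{
    [DataMember] public string FirstName { get; set; }
    [DataMember] public string LastName { get; set; }
}

public static class EntityDetacher
{
    // Round-trips the entity through the serializer: the copy that comes
    // back carries the same data but no reference to the original
    // DataContext, so it is safe to put in the cache. Note that only the
    // serialized members survive; the rest of the object graph is lost.
    public static T DetachBySerializing<T>(T entity) where T : class
    {
        var serializer = new DataContractSerializer(typeof(T));
        using (var stream = new MemoryStream())
        {
            serializer.WriteObject(stream, entity);
            stream.Position = 0;
            return (T)serializer.ReadObject(stream);
        }
    }
}
```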

    What’s my preferred way, you ask? I guess the most painful one (;-)), but at the same time the most scalable and robust: build my own custom POCO entity model, populate it using whatever data access technology, and cache those entities.


    A caching solution is always necessary if you want to build scalable applications that handle lots of users and heavy data access without requiring more hardware resources. I’ve already talked about caching when using Entity Framework, but is there a way to use caching in LINQ to SQL, and what problems might one face?

    To explore caching in LINQ to SQL I’ve built a small web site with a single page. This page uses LINQ to SQL to access the AdventureWorks sample database, which you can download from CodePlex.

    The very simple LINQ to SQL model that I’ve created has three entities: a Person and its Phones and Email addresses.


    The site’s default page contains a repeater that is databound to the first Person whose last name begins with the letter M, fetched by a call to the GetPerson method.

    public partial class _Default : System.Web.UI.Page
    {
      protected void Page_Load(object sender, EventArgs e)
      {
        if (!IsPostBack)
        {
          DataBind();
        }
      }
    
      public override void DataBind()
      {
        base.DataBind();
    
        employeesRepeater.DataSource = new List<Person>() { GetPerson("M") };
        employeesRepeater.DataBind();
    
      }
    
      protected Person GetPerson(string lastNameStartsWith)
      {
        using (AdventureworksDataContext context = new AdventureworksDataContext())
        {
          return context.Persons.Where(p => p.LastName.StartsWith(lastNameStartsWith)).FirstOrDefault();
        }
      }
    }

    This works, but imagine tens of thousands of users accessing this page at the same time asking for the same thing. That would result in thousands of SQL commands reaching the SQL Server. In this case the SQL command is trivial, but what would happen if the query were more complex? The answer is that this page would become significantly slower as more users tried to access it.

    To overcome this, I’ve introduced a simple Caching mechanism to the page utilizing the System.Web.Caching.Cache Object of the Page and changed the GetPerson method to use this caching mechanism.

    protected Person GetPerson(string lastNameStartsWith)
    {
      using (AdventureworksDataContext context = new AdventureworksDataContext())
      {
        return CacheGet<Person>(
          () => context.Persons.Where(p => p.LastName.StartsWith(lastNameStartsWith)).FirstOrDefault(),
          string.Format("Person_LastName_StartsWith:{0}", lastNameStartsWith));
      }
    }
    
    protected T CacheGet<T>(Func<T> loader, string cacheKey) where T : class
    {
      var cachedObject = this.Cache[cacheKey] as T;
      if (cachedObject == null)
      {
        cachedObject = loader.Invoke();
        this.Cache[cacheKey] = cachedObject;
      }
      return cachedObject;
    }
    

    Now only the first call to the GetPerson method results in a query being made to the SQL Server; all subsequent calls will be served by the cache (that is, of course, if the cache key, a.k.a. the last name prefix, matches), and it’s going to be equally fast for one user or a million users. So what’s the problem? Anyone care to guess?

    To be continued…