Localizing multiple sites

This has been a really fun week. After spending a week at Microsoft (MVP Summit), I came back to attempt to get my sites together. I found that there were some changes that forced me to do a bit of a big-bang integration, and my QA guy was busy on other work. Oh, the joys of working at a startup.

Localization 101

I have done little localization exercises in the past, but nothing of this sort. I have two live sites that essentially share the same code base. In an ideal world, they should share 100% of the business logic (via business libraries) and the pages should be fairly similar with CSS and master pages being the primary changes. This is not an ideal world.

First, let me explain the setup, and then much of this will make sense.

We control two production sites right now, called My Jaguar Watch and My Land Rover Watch. These sites are used as vehicle recovery and tracking sites.

Step 1: Move config elements to the config file

This should have been done when we were building the site, but it was nearly impossible to move the whole dev team onto a unified model when we were making up much of the model as we were coding. As it stood, a lot of the elements that change between the two sites were hard-coded. This meant you could not copy and paste code from one to another and simply send out a new build. You had to make sure the page did not have key words in it that should not appear on the other site. In addition, the hard-coded bits meant we could not share the code files in our source repository (SourceSafe at this time), which meant code could get out of sync. Ouch!

I quickly found that I could move the site name and the car make to the config files and use the ConfigurationManager class to pull these values. I did the same for customer and dealer support numbers, which made the code fairly neutral. I could now share the code-behind files without issue.
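For illustration, the per-site values might be stored and read like this. The key names, the phone number, and the SiteSettings wrapper are my own inventions for the sketch, not the actual production keys:

```csharp
// Hypothetical web.config entries -- one config file per site:
//   <appSettings>
//     <add key="SiteName" value="My Jaguar Watch" />
//     <add key="Make" value="Jaguar" />
//     <add key="CustomerSupportNumber" value="800-555-0100" />
//   </appSettings>

using System.Configuration;

public static class SiteSettings
{
    // Reading from config keeps the code-behind neutral between sites;
    // only the config file differs per deployment.
    public static string SiteName
    {
        get { return ConfigurationManager.AppSettings["SiteName"]; }
    }

    public static string Make
    {
        get { return ConfigurationManager.AppSettings["Make"]; }
    }
}
```

With that in place, the same code-behind compiles into both sites and the branding follows the config file.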

Step 2: Figure out a method to use tokens in the page

While I could have done this a variety of ways, I wanted a control that would automatically check whether it needed to be "de-tokenized" and automagically do the work. To handle this, I made a control that inherited from Label, which I called DeTokenizedLabel. Yeah, the name was stupid, but it did the work necessary. The method needed to get this working properly was RenderControl:

        public override void RenderControl(HtmlTextWriter writer)
        {
            // Check whether there are tokens to replace
            if (base.Text.IndexOf('{') > -1)
            {
                HttpContext context = HttpContext.Current;
                Dictionary<string, string> dict =
                    (Dictionary<string, string>)context.Application["dictionary"];

                base.Text = TokenController.DeTokenizeControl(base.Text, dict);
            }

            base.RenderControl(writer);
        }

Fairly simple. I will likely refine it before it goes out to production. When I created the second class, I pushed the actual "de-tokenizing" method into its own class.

    public static class TokenController
    {
        public static string DeTokenizeControl(string text, Dictionary<string, string> dict)
        {
            if (dict == null)
            {
                // Get the dictionary from application state
                HttpContext context = HttpContext.Current;
                dict = (Dictionary<string, string>)context.Application["dictionary"];
            }

            foreach (KeyValuePair<string, string> pair in dict)
            {
                if (text.Contains(pair.Key))
                {
                    text = text.Replace(pair.Key, pair.Value);
                }
            }

            return text;
        }
    }

The check for the dictionary may seem like a bit of overkill (perhaps it should be removed from the calling classes?), but it served another purpose later. Now, there is one flaw in the program thus far. Did you spot it? If not, here it is: the control is tightly coupled to a dictionary object that contains the replacement text. If you do not load the application dictionary, any page using these controls will blow up. It’s okay in this instance, as I am not making this available for human consumption, so I will fix it in a later refactor, along with the duplicated retrieval of the dictionary.
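One way the later refactor might guard against the missing dictionary, sketched here without the HttpContext lookup so it stands alone (TokenHelper and DeTokenize are my own names, not the production code):

```csharp
using System.Collections.Generic;

public static class TokenHelper
{
    // Returns the text unchanged when no dictionary is available,
    // so a page using tokenized controls degrades gracefully instead
    // of throwing a NullReferenceException.
    public static string DeTokenize(string text, IDictionary<string, string> dict)
    {
        if (string.IsNullOrEmpty(text) || dict == null)
            return text;

        foreach (KeyValuePair<string, string> pair in dict)
            text = text.Replace(pair.Key, pair.Value);

        return text;
    }
}
```

The trade-off is that a missing dictionary now fails silently (raw tokens show on the page) rather than loudly, which is friendlier in production but easier to miss in testing.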

Now, I can set a control's text to something like:

Welcome {Make} owner to {SiteName}

and have it render

Welcome Jaguar owner to My Jaguar Watch.
Welcome Land Rover owner to My Land Rover Watch.

Page Title Tokens

This one stumped me a bit, until I had some time away from the application. Since every page needed the site name, I had to have a solution that worked on all pages. Options:

  1. Load into the master page. This is doable, but a bit too messy for my liking.
  2. Add code to every Page_Load – with 55 pages times two sites, this was cumbersome
  3. Make a WatchSitePage that inherits from System.Web.UI.Page

I opted for number three. Since I already had my detokenize method in the TokenController class (yeah, the names still suck), I had the info I needed. I will have to experiment on which method to use, but here is the basic code for the new base class for my site pages.

public class WatchSitePage : Page
{
    protected override void OnLoadComplete(System.EventArgs e)
    {
        this.Title = TokenController.DeTokenizeControl(this.Title, null);
        base.OnLoadComplete(e);
    }
}

The DeTokenizeControl method already handles the creation of the dictionary. New idea: cache the dictionary in a static field once it is loaded. Okay, back to our regularly scheduled program.
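A rough sketch of that caching idea. The loader is passed in as a delegate since the real dictionary comes from application state; TokenDictionary and Get are names I made up for the sketch:

```csharp
using System;
using System.Collections.Generic;

public static class TokenDictionary
{
    private static Dictionary<string, string> _cache;
    private static readonly object _lock = new object();

    // Runs the loader exactly once and reuses the result thereafter.
    // In the real site the loader would read application state or the
    // database; here it is a stand-in supplied by the caller.
    public static Dictionary<string, string> Get(Func<Dictionary<string, string>> loader)
    {
        if (_cache == null)
        {
            lock (_lock)
            {
                if (_cache == null)
                    _cache = loader();
            }
        }
        return _cache;
    }
}
```

The double-checked lock matters here because ASP.NET serves requests on multiple threads, so two pages could otherwise race to load the dictionary at startup.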

Localizing the site

I started trying to localize with the standard ASP.NET 2.0 method. If you have not done this, it is fairly simple. First, you have to make sure all of your text is in controls. If you do not want to use a bunch of labels (labels render to <SPAN> tags, so they are okay for all but the purists), you can use the Localize control. I have not seen what this renders to, so don’t ask … yet.

To localize with local page resources, you open a page and then choose Generate Local Resources from the Tools menu. This will create a resource file with entries for your page. You can then use the meta:resourceKey attribute to bind a control to a particular resource entry. For example, suppose you have this particular setup:

     <asp:Label runat="server">Label1</asp:Label>
     <asp:Label runat="server">Label2</asp:Label>

When you generate the local resources, an entry called Label1.Text is created, which you set to "Foo", and one called Label2.Text, which you set to "bar". The XML in the resource file looks like this:

<data name="Label1.Text" xml:space="preserve">
  <value>Foo</value>
</data>
<data name="Label2.Text" xml:space="preserve">
  <value>bar</value>
</data>

You can then point the labels at the localized resources (stored in pageName.aspx.resx for the default language):

     <asp:Label runat="server" meta:resourceKey="Label1">Label1</asp:Label>
     <asp:Label runat="server" meta:resourceKey="Label2">Label2</asp:Label>

So far, this is the basic stuff you can get off of the QuickStarts for ASP.NET. Notice that the meta:resourceKey attribute does not require that you include the .Text, as a single resource key can set multiple properties on the same control from the same resource file. There is an equivalent explicit syntax, which is <%$ Resources:Label1.Text %>. Note that you will need the .Text for explicit naming.

External Resource Files

External resource files are a bit more difficult, but fortunately Michèle Leroux Bustamante has already solved the problem. I used her CustomResourceProviders project, in particular the ExternalResourceProvider classes. From here, I was able to use the following syntax to set up the title:

<%@ Page Language="C#" AutoEventWireup="true" 
    CodeFile="Default.aspx.cs" Inherits="_Default" 
    Title="<%$ Resources:WatchSiteResources|PageTitles, _default %>" %>

I am awaiting the need to fully internationalize the site name from My Jaguar Watch to the equivalent in all languages. This will make simple token replacements a bit more complex, but I can do it by having a nested localized hierarchy instead of a dictionary.
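A hedged sketch of what that nested hierarchy might look like: a per-culture outer dictionary wrapping the token dictionaries. The CultureTokens class, the Lookup method, and the French value are placeholders of my own, not real translations or production code:

```csharp
using System.Collections.Generic;

public static class CultureTokens
{
    // Outer key: culture name; inner dictionary: token -> localized value.
    // The values below are illustrative placeholders only.
    public static readonly Dictionary<string, Dictionary<string, string>> Map =
        new Dictionary<string, Dictionary<string, string>>
        {
            { "en-US", new Dictionary<string, string> { { "{SiteName}", "My Jaguar Watch" } } },
            { "fr-FR", new Dictionary<string, string> { { "{SiteName}", "My Jaguar Watch (fr placeholder)" } } }
        };

    public static string Lookup(string culture, string token)
    {
        Dictionary<string, string> inner;
        if (Map.TryGetValue(culture, out inner))
        {
            string value;
            if (inner.TryGetValue(token, out value))
                return value;
        }
        return token; // fall back to the raw token when no translation exists
    }
}
```

The token-replacement loop stays the same; only the dictionary retrieval changes to select the inner dictionary for the current culture first.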

Until next time!

MVP Summit, Day 3

Day 3 (Wednesday, March 14)

Welcome to deep dive day, as they have dubbed it. The concept here is we are taking a “deep dive” into the underlying technologies. As this is a year after all of the major releases, this is more of a “get caught up on what we have completed” year than a “look at what we are going to do” year. If you have played with the May CTP (2006) of Orcas, you already know what we are learning; there is just more done now.

Session one was an overview session. As is typical in the MVP events, it is the first chance many MVPs have to bitch about things they think the product teams are doing incorrectly. To facilitate, the C# team, wisely, had an intro and then let the MVPs loose.

They then separated us into three groups so we could have smaller sessions, some surrounding white boards.

Session two dealt with the new features of C# 3.0. There are 12 new keywords in C# 3.0, and most of them deal with LINQ. As a piece of trivia, the longest new keyword is “descending”. Part of this session was for feedback on the feature set in “Orcas + 1”, the yet-unnamed version of Visual Studio after Orcas ships. Most of what was asked was far beyond what the majority of C# programmers will ever touch. It will be interesting to see what comes out of these talks, although you will not see the fruition until after the Orcas timeframe.

We were also versed on pieces that are going to be dropped from LINQ, which I will not blog about until I see it out on an official Microsoft blog, as I do not want to cross my NDA and lose my MVP status. There aren’t many, at least not in the talk we had, so do not panic. They have also, as far as I know, never been in any CTP, so you are not missing things you used to have, like you did with Whidbey.

Session three was a deeper dive into LINQ to SQL and was the most informative session I attended. At one point, the session devolved to the point that it got Anders involved, and it was proven that LINQ truly does lazy load.

I then played the role of bad MVP and jumped out of my track. Fortunately, all of the SQL and developer tracks are hosted at the MSCC, so I did not have to jump buildings. I am considering doing it today (jumping buildings) for the Sandcastle session, but ADO.NET Entities sounds appealing, so I probably will stay here at the MSCC.

The session I jumped to was a session on ASP.NET vNext. I cannot blog about any of the features, but feel free to go to MIX this year and you will have the opportunity to see quite a bit of it. I would love to share the pictures, but I cannot do that either. Just trust me that a lot of this stuff is cool.

At night I went to a developer dinner and spent much of the time talking to Kathleen Dollard about everything other than programming. As two long-term MVPs, we also reminisced about the “early days”. I bailed out of the party early to attempt to get some sleep, as I am still on Central time and waking up at 5 AM every morning. It was a perfect plan, but I did not anticipate the loud drunk guys at 2:30 AM. That is a story for another blog entry.

MVP Summit, days 1 and 2

Hello again from Redmond. As usual, I incur the penalty of a four-and-a-half-hour flight, lack of sleep (still on Central time) and overindulgence in networking (okay, number 3 is not that bad) to get some time with the insiders at Microsoft. Of course, this year the sessions are not as startling, as there are not a huge number of alpha products being shown (Orcas has been out in CTP since last May – 2006 for archaeologists reading this site from the far future).

Once again, I am playing the role of bad MVP and flipping from track to track to see things that better fit my professional role (I am also too rebellious to be pigeonholed). NOTE: If Sean O is reading this, Rafael and Ben have both strongly encouraged me to stick to the sessions in my session planner. I am just hard of hearing.

Fortunately, this year, I can share a lot of the information I am learning here without breaking my NDA. I will google to make sure it is public prior to saying anything, just to be sure.

Day 1 (Monday):
Could not find a direct flight that fit my schedule on Cattle Car Airlines (Southwest), so I had to fly through Chicago Midway. To get in at a decent hour, I was up at 5:30 AM Central time, with about three and a half hours of sleep under my belt. Six hours and forty-five minutes of flying sucks.

My first duty was a focus session on Newsgroups and Consumer front ends to community. Very interesting focus group session and I discovered it had a $50 Amazon gift cert as a perk. I probably need a new book about now.

The next duty was the Americas dinner, with perhaps the most horrid Master of Ceremonies in the world. So bad, in fact, that nobody was paying attention, even when giving out schwag. Before dinner, I met up with fellow MVPs (Billy Hollis, Kathleen Dollard and my old FrontPage buddies – Patrick Altman might like to rib me on that one). After dinner, we had a SupportSpace party where they handed out heavily loaded Gameworks cards (good for 90 days; too bad there is not one in Nashville). For those not familiar, SupportSpace is aiming towards being a true expert-exchange site, where experts can get online and help people, instantly (even on their machine) for real cash (unlike the outsource-overrun sites where someone will build a casino site for five dollars).

Day 2 (Tuesday):
Normally, day 2 (day one of actual sessions) is a marketing "rah, rah" show. Fortunately, there was not a lot to "rah rah" this year, as there are not a plethora of alpha products coming out. Oh, sure, there are some additional extensions, like PLINQ (LINQ for Parallel processing – cool stuff), but the vNext products, overall, are known (Orcas first CTPed in May of 2006).

The Bill Gates session was good, but really the least sexy session, IMO. For the "Bill Gates is God" people, I am sure this session was far more exciting than it was for me. I sat, online, working on how to fix a problem at work. 🙂 At least this year, there was no major sucking up during Q&A, like there was in 2001. No, I am not going to name names, but any old MVPs who were there know the guilty individual.

Next, there was a developer futures session with Somasegar. I sat through about half of this, but found the session focused a bit too much on VSTO in the middle, so I took a temporary sabbatical from the session. I have nothing against VSTO, personally, but the deployment aspect sucks right now. I missed the parts about the Expression bits being placed into Orcas, but I was already familiar. It is blogged here, for those who might be lost.

My third session was on LINQ, primarily LINQ to XML and LINQ to SQL (and, of course, PLINQ). I have used OR mappers for quite some time, and while LINQ is quite a bit more than a typical OR mapper, I think it can do great things. Microsoft will, of course, have to provide good examples of how to use LINQ properly in your layers (or libraries) to avoid having it become another cluster. Anders Hejlsberg used most of our hour and a half to demo … demo … demo. I found myself using my digital camera to take pix of code, as I could not type into the Orcas CTP fast enough. As I am working with the CTP and Anders was using what I assume is a dog food build (an internal build, monikered from the statement "eat your own dog food"), some of the code broke in my CTP build. I guess I will have to wait.

The final session was Don Box and Chris Anderson. If you have never seen these two interact, it is an interesting experience. The talk is formulated largely from audience questions, although certain bits are planned. This particular session was on MDA or Model Driven Architecture and much of it showed modeling (using DSLs that have a GUI to model, ala Visual Studio Team Architect) is just one way of modeling. Great session.

That night, we had a party at the Museum of Flight. While there were many great things I could have done (flight simulator rides, caricature artists, jam sessions (Rob Foster would have loved this) and karaoke), I spent most of my time networking with the SQL Server MVPs.

I am now in day three, in a session (how rude of me), so I will sign off and write more later. And, no, I am still not going to break my NDA (as a bad MVP, I will definitely hit some of the futures sessions in areas other than C# – sorry Rafael, maybe next April I will be good).

Perfmon in Vista

A friend of mine, Patrick Altman, shared this one with me and I wanted to pass it along.

In Vista, there is a new feature of the performance monitor that is really neat: the report view. To get to it, you have to run perfmon /report. It will then collect data for 60 seconds and give you a report of your system’s health.

The first section of the report is a set of warnings on potential problems with your system, like services or drivers that failed or lack of disk space. Very useful.

One that he pointed out was DISK >> Hot Files, which shows the files with the greatest amount of activity.