The TDD Session at DevLink – Anatomy of a Train Wreck


If you paid attention to #DevLink on Twitter, you already know about the lunch session on TDD. If not, here is a brief synopsis of the events, seen through the eyes of one of the participants.

It is not my goal to point fingers here or to try to document every event. Instead, I would like to show how a train wreck happens, in the hope that you can notice the signs and avoid one.

How This All Began

I was not involved in the beginning of the TDD session. It was incubated in an Open Spaces session I was not participating in. I was in an Open Spaces session on SOA, run by James Bender, and only heard of the TDD lunch session when Alan Stevens joined in at the end of our session.

It is my understanding that Michael Wood was talking to a fellow DevLinker (is it okay to use DevLink in this manner?) about TDD. The idea, at that time, was to have five or six guys around a laptop talking about TDD to show the value. I am not sure how the idea expanded, but I imagine that Alan Stevens and a few others saw the value of expanding the idea into a demo. Pinging Alan today, I got the idea that the “incubation” was extremely fast.

By the time Alan came down to photograph and participate in our session, the idea was already a demo in one of the session rooms, with a projector, etc.

The Players

Here is the list of players, in no particular order:

Gregory A. Beamer (me) – I am a proponent of TDD (Test Driven Development) and am actively looking at how BDD (Behavior Driven Development) fits into testing. I am not a purist on TDD, although I am passionate about tests.

Alan Stevens – Alan is a proponent of TDD, but even a more active proponent of community. Alan was responsible for the Open Spaces sessions at DevLink.

Corey Haines – Corey is a strong proponent of both TDD and BDD. From conversations, it is evident Corey is more of a purist than I am.

Steve Harman – Steve is a guy who, according to others, “goes from 0 to 60 in 2 seconds”. Steve is passionate about TDD and BDD and speaks on a very deep level about things he is passionate about. I would love to attend one of his 300 level sessions some day.

Jim Holmes – This was my first opportunity to meet Jim Holmes. After the “fiasco”, Jim was facilitating the post mortem discussion between Michael, Steve, myself and a few others. Jim also helped referee the session.

Michael Wood – Michael was involved in the original discussion (five or six people and a laptop) and was also involved in some of the facilitation of the post mortem.

Mike Eaton – Mike acted as the primary referee in the session and attempted to keep it under control.

The Session

I walked into the session late, so I did not see the first few minutes. The session used the example of a DevLink type of lunch where X number of people had to be fed. This example was chosen at the session rather than set up beforehand.

From the post mortem, it was my understanding that there was no pre-session meeting (in the hall?) so the session just flowed.

There was also a question as to what the session’s goal was. Here are different ideas I heard about the session:

Demo to teach beginners something about TDD

Open session demoing TDD and showing the variety of thoughts people have about TDD

Session illustrating the right way to do TDD for those who do not know TDD

While these goals might seem complementary, the overarching goals for a “true” beginner’s session and a session showing the right way are different, as is the goal for an open session with a demo.

The Train Wreck

The first clue of a train wreck came when the test class was renamed for the second or third time. The first confirmation came when the participants were informed that the #DevLink Twitter stream was abuzz about the session. Some of the nicer comments:

Good idea, poor execution on the #devlink TDD demo. If this is aimed at "new to TDD," there’s WAY too much religion, not enough fundamentals.

Macs don’t support TDD – so I’ve learned at #devlink open spaces – also you flip the bird a lot when doing TDD

Mechanics and mindset both played a part. There was the Mac, which turned out to be less of a problem once you got on it than previously thought. There was also the religion: how purely should you worship your *DD. And there was a lot of minutia, especially surrounding the naming of classes and methods.

The biggest “sin” of the session was that we lost the core audience. Although not everyone was aware of it, the session was touted as a “beginner’s TDD” session. Talking to Sarah Dutkiewicz after the session, I found out she had walked out, completely confused, before I even walked in.

Lessons Learned

The first lesson we can learn here is that you have to have a goal in mind, and that goal has to be communicated amongst the participants. If we had had a “quick two minute meeting in the hall” prior to the session, the wreck could have been avoided.

Second, you should keep the list of participants down to a reasonable number. And, while you might want various points of view in the mix, make sure that all are willing to focus on the same goal. I am not sure this would have been a problem with anyone in this group. While we do have different views on TDD, everyone in the group is a consummate professional and willing to tone things down a bit for a common goal.

Third, while it is nice to make up scenarios on the fly, the tried and true examples are often the best place to start. As Steve suggested after the session, the typical “canonical” bank transaction scenario would have been a better starting point, as every person in the room would have recognized it and the participants would have been familiar with it.
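For flavor, here is roughly what that canonical starting point looks like. This is my own sketch, not code from the session; the BankAccount type and the test names are invented for illustration.

```csharp
using System;

public class BankAccount
{
    public decimal Balance { get; private set; }

    public void Deposit(decimal amount)
    {
        if (amount <= 0)
            throw new ArgumentOutOfRangeException("amount");
        Balance += amount;
    }

    public void Withdraw(decimal amount)
    {
        if (amount > Balance)
            throw new InvalidOperationException("Insufficient funds");
        Balance -= amount;
    }
}

public static class BankAccountTests
{
    // First red/green cycle: a deposit increases the balance.
    public static void DepositIncreasesBalance()
    {
        BankAccount account = new BankAccount();
        account.Deposit(100m);
        if (account.Balance != 100m)
            throw new Exception("Expected a balance of 100 after depositing 100");
    }

    // Second cycle: a withdrawal decreases the balance.
    public static void WithdrawDecreasesBalance()
    {
        BankAccount account = new BankAccount();
        account.Deposit(100m);
        account.Withdraw(40m);
        if (account.Balance != 60m)
            throw new Exception("Expected a balance of 60 after withdrawing 40");
    }
}
```

The point is not the code itself, but that everyone in the room already knows the domain, so the discussion can stay on the red/green/refactor rhythm instead of on the scenario.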

Fourth, even if it is an open session, there has to be a “boss” or a “referee”. This is the person who owns the show and is responsible for keeping control of the session and getting the players back on track. The players have to walk in agreeing that the referee has the power to pull things back into place.

Summary

It would be easy, as some tweets have suggested, to define this train wreck by the personalities involved. But this would not do justice to the reality that even the most consummate professional can get off track. It is my hope that the lessons above help you notice the signs and avoid a wreck of your own.


Meshing Out – What I took home from DevLink


Before getting into this entry too deep, I wanted to congratulate John Kelley and crew for another successful event. I also wanted to congratulate Alan Stevens and crew for the wonderful Open Spaces sessions. I attended only one regular session, but the Open Spaces more than made up for missing the regular sessions.
 
I ended up trying to sign on a bit too late, so I had to hork a pass from someone who was not attending on day 2. I will remember to get my pass earlier next year and avoid the long lines. 🙂 Thanks to Rebel Bailey for lending me his pass on Saturday.
 

Mesh is Cool

I attended Jeff Blankenburg’s impromptu session on new stuff from Microsoft. Although Photosynth, Seadragon and Popfly are kewl, I have seen them enough times to get very little out of that part of the session. Then Jeff showed Live Mesh. Here is something that immediately fits my needs.

For my work, I often have to sync up more than one device. Now I have 5 GB on the web to sync up with. I already have my current project shared as a folder and I am currently downloading it to my laptop. In addition, I love the fact that I can control my home machine from my laptop without having to load a lot of extraneous software. And only people in my mesh can do this, and only those with a proper logon. Very cool.

Where I see this going … anyone at Microsoft listening … is merging in Live Meeting so I can use the Mesh to immediately go to a sharing session or share, from a meeting, to my mesh. Company meshes. Group meshes. People will pay for this stuff. Trust me.

Microsoft is already hinting at an SDK for Mesh so I can create apps for it, so the future is nearly here.

Impromptu can mean a train wreck

The TDD session crashed. It was not a total disaster, but it needs to be done better next time. I have completed an entry on this, but I am awaiting Alan Stevens’ review of the article before putting it onto this blog. My goal on that one is dissecting the train wreck and providing a constructive post mortem. The last thing I want is to inadvertently tick someone off.

Open Spaces are fun and productive

I did not think an open session would be as useful as it was. I spent most of my time in these sessions, firing thoughts off some of the best in the industry … and loved it. I will gladly do this again.

Final Thoughts

DevLink is an inexpensive alternative to the big name conferences. It is shaping up to be a premiere conference, despite the low price. Hats off to the guys who put this together and make it purr. You guys really deserve a big hand. Next time can we hold it closer to downtown again? 😉

Peace and Grace,
Greg

Stupid SQL Server 2008 Tricks


The title is just a nod to David Letterman and his Stupid Pet Tricks and Stupid Human Tricks segments. It is probably not the best title, as will be revealed. A better title:
 
SQL Server 2008 Management Studio renders SQL Server 2005 Management Studio useless when working with projects
 
But "Stupid SQL Server 2008 Tricks" sure fits on the title line much better.
 
Since I have already released the gist of the "problem", I will explain the symptoms.
 

Reproing the Problem

Yesterday, I created a new SQL Server project in SQL Server 2005 Management Studio. After I was finished, the Solution Explorer showed nothing. Zip, zilch, nada. I then recreated the project, with the same name, and got an error.
 
[Screenshot: the error after recreating the Test project]
 
If you want to get the same error, the repro steps are very easy:
 
1. Install SQL Server 2008 as a stand-alone install (not an upgrade)
2. Open SQL Server 2005 Management Studio
3. Create a project called Test and allow it to create the solution (the default)
4. When it appears to do nothing, create a project called Test again and allow it to create a solution (the default)
 
You can alter steps 3 and 4 and not create the solution folder. As long as the setting is the same the second time, you will get the error.
 
I then looked in Windows Explorer and, yes, there was a project there (Test.ssmssqlproj), but there was no solution file. Strange. So I tried to open the project file. Still nothing in SQL Server 2005 Management Studio.
 
But, then I had an epiphany. Try opening the project in SQL Server 2008 Management Studio. Yes, it works.
 

Incompatibility of Tools

What this boils down to is that there is some incompatibility in the underlying assemblies/DLLs that Management Studio uses to work its magic. In addition to this incompatibility, I have found a few more.

When working with a SQL Server 2005 instance in SQL 2008 Management Studio, you cannot alter the definition of a table. Attempting to alter a SQL 2005 table in SQL 2008 Management Studio, even one you created with SQL 2008 Management Studio, fails with this largely useless error:

[Screenshot: the largely useless error dialog]

Now, looking at this, you might think: yeah, the error box seems useless, but it allows you to save it as a text file. That sounds like it might be a cool feature if you are thinking, as I did, that the text file is a drop-and-recreate script for the table. If you were thinking that, you and I were both wrong. Here is what the text file contains:

/*
   Friday, August 22, 2008 9:17:05 AM
   User:
   Server: BNA-GBEAMER2
   Database: UnitWarehouse
   Application:
*/

SimHistory

Now, if your first thought is "that is a completely useless text file", you and I are tracking. If I were doing a lot of work and saved this file off and then came back a few weeks later to look at the file (unlikely, but let’s play along), would it mean ANYTHING to me? I can picture this: "Greg, what the *#&% were you doing on August 22 at 9:17 AM with the SimHistory table?" (I only know SimHistory is a table because I know my schema.) Worse, imagine someone else looking at this text file. It might make sense if the file contained something like this:

/*
   Friday, August 22, 2008 9:17:05 AM
   User:
   Server: BNA-GBEAMER2
   Database: UnitWarehouse
   Application: 
   Object: SimHistory

*/

Attempt to alter object in Management Studio failed as saving changes to SQL Server 2005 objects is not permitted.

While saving off this file is still useless to me, it at least logs something.

What This Means

To me, the interesting point of this finding is that it leaves me three choices with SQL Server 2008.

1. Switch all of my databases to 2008 and only use SQL Server 2008 tools
2. Get rid of SQL 2008 completely until I am ready for it
3. Use SQL 2008 MS for my projects and SQL 2005 MS for my visual schema work

Number 1 is not an option until I know SQL 2008 databases can be attached to SQL 2005, as I have SQL 2005 database servers. I suspect there is some incompatibility here, given the error altering schema. I could be wrong, as this could just be a tool/database impedance mismatch. I do not have the time to investigate, so I will be pessimistic and safe.

Number 2 is a risky option now that I have a SQL 2008 instance installed, as I may end up with a hefty reinstall for 2005 to fix the tools. I cannot afford this time either, although not installing 2008 at all might have been a good option at one time. Unfortunately, I have to investigate 2008 for my job.

Number 3 is the sanest option right now, although it does require keeping both open. As long as I have 2008 open with a solution, it is easy to tell them apart. Since I am only working with it for projects, this is not a horrible ‘trade off’.

Peace and Grace,
Greg

Microsoft Heartland Influencer’s Summit


I ended up going to the Heartland Influencer’s Summit today. My main reason for going was to talk to Brian Prince, the Heartland Architect Evangelist, which I accomplished later. It was a productive talk. I left with a bit more of an open mind about a few things (nothing related directly to the Summit) and a thought about User Interface.
 
Much of the Summit was under a "gentleman’s NDA", so I cannot go into details about the specifics, but that is not the purpose of this entry.

Paradigms 

The first thing of note is how pervasive paradigms are. I have, more than once, caught myself trapped in a paradigm of thinking. In general, I find I am more susceptible to getting caught this way with things I find appealing. If I have an affinity for something, I am less likely to challenge my thought process.
 
Let me get a bit more specific, with no names, as they are not important. In the Summit, they gave away desk light sabers as gifts. Here is a picture of mine sitting on my desk (yes, that is a Michelob Porter in the background – best option in the "cheaper" beers).
 
[Photo: the light saber on my desk]
 
Alan Stevens and I both assembled ours (typical geeks) and a conversation about Star Wars ensued. It was generally agreed that Episodes 1, 2 and 3 sucked and that midichlorians were stupid. Then the comment burst forth that there were "certain universal myths that were not exclusive to Christianity" (paraphrased, not a true quote). While I understand the nature in which the comment was made, the bulk of the evidence for the virgin birth/savior being a universal myth comes from a single source about the Mithras cult which is largely conjecture on the part of the author. It is very possible, if not likely, that these stories do have a universal nature, but the statement was delivered with such force it revealed the speaker’s own paradigm.
 
There is nothing wrong with being in a paradigm. The comment just struck me as ironic after our previous conversations about how, using higher level constructs, one did not get caught on syntax. I am not mentioning names as I might have been stuck in my own paradigm at the moment and there is no reason to start a peeing contest over a statement.
 
What is important is many developers are stuck in paradigms without realizing it, as they start to see syntax as a programming construct and not an artifact of language. I believe both the author of the statement and myself agree that there are paradigms in programming, but it is interesting how easy it is to fail to realize we are in one even after we have advanced. It is another thing I have to keep myself in check over as we move forward.

Bad UI

Now this second one may come across as strange, as it is in no way related to programming, except in the abstract (abstractions are good). Look at this picture:
[Photo: the faucet]
 
And here is an "enhanced" shot to illustrate the "bad UI".
[Photo: the faucet, annotated]
Notice three things. In red, there is a motion sensor. In green, there is a short distance between the back of the sink and the curve of the faucet. In blue, note the water. I should have taken a picture from the side, but the water pours only about 2 inches away from the back of the sink.
 
There are a couple of things wrong. First, the motion sensor points straight out, which forces you to hold your hands up above the bowl to use it. Added to this is the fact that the water pours so close to the back of the sink. Combined, these cause a lot of water to pour down the back of the sink and eventually end up at the front, where it can deposit on the pants of anyone who stands close to the sink.
 
In the process of discussing UI (we are talking about a bunch of geeks talking about things "in nature"), we did determine a possible advantage to this design. Anyone who pees horizontally on his pants can blame it on the sink.
 

Fish bowl

One positive takeaway I got from this Summit was the fishbowl. This, to me, is almost as neat as sliced bread. Okay, old and tired analogy. As neat as scrum meetings done properly. Another geek analogy. ARRRRGGGHHHH!!!

The fishbowl is designed to let a bunch of people work a topic without it becoming a free-for-all. You set up six chairs in the middle of the room. One is for the facilitator, who makes sure the conversation remains on topic; the other five are for participants. One chair must always remain empty. This is important, as it always leaves space for others to enter the conversation. When a person enters, one of the current participants must leave. The last rule is that nobody outside the inner circle can talk (like chickens at a scrum meeting?). I was surprised at how well this flowed. We got through, without conflict or too many tangents, some topics that might otherwise have degraded. This will become another repertoire item for me. Very cool.

Well, that is it for now.

Peace and Grace,
Greg

VB to C# Conversion programs (altered review)


On August 8, I compared a demo of Instant C# (all I have) against VB Conversions. It was a review prompted by the needs of a friend. Once I found some conversion issues with Instant C#, I posted a review despite it being a demo. One of the company’s employees added a comment, and I have since added to the review. The review has been updated at the original URL.
 
Dave, the employee mentioned, has indicated there are some updates for Instant C# forthcoming, so I am willing to review the changed product, along with any VB Conversions changes, once they are ready.
 
This is actually a bit of a waste of my time, as I have no need of either product. I am willing to spend the time, however, in the interest of both fairness and providing you, the reader, with the most current information. I am opinionated, but I try to be as fair as I possibly can and I truly wish both parties the best of luck should the opportunity arise to review both again.
 
Peace and Grace,
Greg

The Politics of Poverty


I saw a commentary by Glenn Beck on the CNN website today about poverty. Unfortunately, his commentary rails on Democrats in city hall, where I find the bigger story is the idea that hand-outs do not help most people get out of poverty. While hand-out programs might be considered a hallmark of Democratic policy, I feel the fact that these mayors are pushing them is more important than their party. I imagine this was Beck’s intent, but it gets lost in the railing.

Poverty and Change

In life, the biggest changes are most often precipitated by some form of challenge. I am sure there are exceptions to this rule, but looking over my life, as well as lives of others I know, it is the moments of extreme challenge that have brought about the most extreme change. If there were programs to focus people on positive outcomes through their hardships rather than programs to relieve their pain by paying them, I believe we would see a much more radical change in the poverty level.

I have recently spent some time with the Total Transformation Program. If you are not familiar, this program is heavily advertised on the radio here in Nashville. The program, created by child behavioral specialist James Lehman, shows parents how to alter their child’s behavior in positive ways by changing how they respond to acting out. One of the primary realizations I had through the program is that my children’s acting out is due to their inability to cope, because they do not have proper problem-solving skills. One of the things I do that exacerbates the situation is picking up after them.

Now, these sound like very different beasts, but there are some very astounding similarities, when you look underneath the hood. When I pick up my daughters’ things, I am perpetuating the myth they are incapable of solving the problem and making it a reality. I relieve some of their pain (by removing their need to clean up) and I also remove the challenge of having to solve the problem. In like manner, when our primary focus is on paying away the pain of the poor, we remove the challenge of having to solve the problem.

Teaching Men to Fish

The following has become a mantra in Conservative circles:

Give a man a fish, feed him for a day
Teach a man to fish, feed him for a lifetime

On the surface, this is the key to poverty and change. We have to get people learning to fish if we want them to rise from the ashes of their former lives. But this is an oversimplification, as starving men do not learn. We have to both give men fish and teach them to fish to solve the problem. This means there are certain types of “handouts” that need to exist.

The upshot here is that both Democrats and Republicans, or if you like, Liberals and Conservatives, are right. One might have a slightly higher rate of success with its program alone, but I am sure we would find that neither type of program by itself is as effective as both together.

Peace and Grace,
Greg

Refactoring the Disconnected LINQ to SQL Repository


I am a bit blog-happy tonight, but I am on a roll. In this post, I want to cover refactoring. To do this, I will be refactoring the solution I created in the last few posts (Repository Pattern in LINQ to SQL). In the last post, I had the following routine to get the primary key for a LINQ to SQL object.

private string GetPrimaryKeyName(T entity)
{
    Type type = entity.GetType();

    foreach (PropertyInfo prop in type.GetProperties())
    {
        object[] attributes = prop.GetCustomAttributes(true);

        foreach (object o in attributes)
        {
            //Not every attribute is a ColumnAttribute, so test the cast
            ColumnAttribute attribute = o as ColumnAttribute;

            if (attribute != null && attribute.IsPrimaryKey)
                return prop.Name;
        }
    }

    return null;
}

To me, this is a lot of work just to find a primary key column. There has to be an easier way. But, before I can refactor, I need a test. Actually a suite of tests, as I want nothing to change. Here are a couple of tests that exercise the save method, which requires the name of a primary key.

[TestMethod()]
public void SaveNewSimStatusType()
{
    IDataContextFactory dataContextFactory = new SimStatusTypeDataContextFactory();
    IRepository<SimStatusType> target = new SqlRepository<SimStatusType>(dataContextFactory);
    string expectedName = "This is a test";
    int expectedValue = _simStatusTypeId + 1;

    SimStatusType entity = new SimStatusType();
    entity.SimStatusTypeId = 0;
    entity.SimStatusTypeName = expectedName;

    SimStatusType actual = target.Save(entity);
    isDirty = true;

    Assert.AreEqual(expectedName, actual.SimStatusTypeName, "name is different");
}

And

/// <summary>
///A test for Insert
///</summary>
[TestMethod()]
public void UpdateASimStatusType()
{
    IDataContextFactory dataContextFactory = new SimStatusTypeDataContextFactory();
    IRepository<SimStatusType> target = new SqlRepository<SimStatusType>(dataContextFactory);
    string expectedName = "This is a test";
    string changedName = "This is a second test";
    int simStatusTypeId;

    //Create entity
    SimStatusType entity = new SimStatusType();
    entity.SimStatusTypeId = 0;
    entity.SimStatusTypeName = expectedName;
    SimStatusType actual = target.Insert(entity);
    simStatusTypeId = actual.SimStatusTypeId;

    isDirty = true;

    Assert.AreEqual(expectedName, actual.SimStatusTypeName, "name is different");

    actual.SimStatusTypeName = changedName;
    target.Save(actual);

    Assert.AreEqual(changedName, actual.SimStatusTypeName, "name is different");
}

Now, I really need to refactor these tests a bit, as the assertion messages tell me little and the names of the routines are not quite explicit enough for me, but that is another blog entry. Feel free to critique the tests, if you must, as I am more than happy to learn things.
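As a sketch of what I mean, here is the first test rewritten with an intention-revealing name and assertion messages that state the expectation. The SimStatus and InMemoryRepository types below are stand-ins I invented so the sketch is self-contained; they are not the SqlRepository code from the previous posts.

```csharp
using System;
using System.Collections.Generic;

// Stand-in entity and repository, invented so this sketch compiles on its own.
public class SimStatus
{
    public int Id;
    public string Name;
}

public class InMemoryRepository
{
    private readonly Dictionary<int, SimStatus> _store = new Dictionary<int, SimStatus>();
    private int _nextId = 1;

    // Mimics the Save() semantics from the posts: an id of 0 means insert.
    public SimStatus Save(SimStatus entity)
    {
        if (entity.Id == 0)
            entity.Id = _nextId++;
        _store[entity.Id] = entity;
        return entity;
    }
}

public static class RepositoryTests
{
    // The name now says what is being tested and what should happen.
    public static void Save_NewEntity_AssignsIdAndKeepsName()
    {
        InMemoryRepository target = new InMemoryRepository();
        SimStatus entity = new SimStatus { Id = 0, Name = "This is a test" };

        SimStatus actual = target.Save(entity);

        // The messages state the expectation instead of just "name is different".
        if (actual.Id == 0)
            throw new Exception("Expected Save to assign a non-zero id to a new entity");
        if (actual.Name != "This is a test")
            throw new Exception("Expected the saved name to be 'This is a test', was '" + actual.Name + "'");
    }
}
```

When a test like this fails weeks later, the name and the message alone tell you what broke, without re-reading the test body.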

When I run these tests, they come up green, which means they are passing. I then look into how one might pull a primary key from a LINQ to SQL object. For my first refactor, I find that you can get the primary key off of the DataContext mapping. So I change my GetPrimaryKeyName() routine to this:

private string GetPrimaryKeyName(T entity)
{
    using (System.Data.Linq.DataContext context = _dataContextFactory.Context)
    {
        MetaTable table = context.Mapping.GetTable(typeof(T));
        return table.RowType.IdentityMembers[0].Name;
    }
}

That is a huge reduction in the number of lines. Run tests. Still green. Okay, so I lucked out this time. But I am now creating a context inside a context, as the calling routine already has one open. Here is the save method as it stands now.

public T Save(T entity)
{
    using (System.Data.Linq.DataContext context = _dataContextFactory.Context)
    {
        T databaseEntity;
        string idName = GetPrimaryKeyName(entity);
        int id = (int)GetIdValue(entity, idName);

        //Then
        if (0 == id)
        {
            //This is a save
            return Insert(entity);
        }
        else
        {
            databaseEntity = context.GetTable<T>().First(s => s == entity);
            LoadDatabaseEntity(entity, ref databaseEntity, idName);
        }

        context.SubmitChanges();
        return databaseEntity;
    }
}

This one just screams refactor, as I only need the context for the extent of the routine. The first option is to call GetPrimaryKeyName() outside of the context call. The other is to move the metadata pull into the save routine. Of the two, I am opting for the latter until I see a need for this elsewhere. While I do see a possibility, I am going to follow the rules of refactoring and only refactor as needed. Thus, a separate routine is overkill. My save routine gets changed to this:

public T Save(T entity)
{
    using (System.Data.Linq.DataContext context = _dataContextFactory.Context)
    {
        T databaseEntity;
        string idName = context.Mapping.GetTable(typeof(T))
                            .RowType.IdentityMembers[0].Name;
        //string idName = GetPrimaryKeyName(entity);
        int id = (int)GetIdValue(entity, idName);

        //Then
        if (0 == id)
        {
            //This is a save
            context.GetTable<T>().InsertOnSubmit(entity);
            context.SubmitChanges();
            return entity;
        }
        else
        {
            databaseEntity = context.GetTable<T>().First(s => s == entity);
            LoadDatabaseEntity(entity, ref databaseEntity, idName);
        }

        context.SubmitChanges();
        return databaseEntity;
    }
}

I have now deleted the GetPrimaryKeyName() routine. Now, you might say, "but you said you would likely resurrect this code. Isn’t it better to just comment it out?" I am a bit torn on this one, but I am inclined to say no, as I can use the refactoring tools in either Visual Studio or ReSharper (great product if you do not have it) to extract this method later. While extraction is hardly necessary for one line, even with multiple lines it would be prudent to delete now and extract later, as I would likely find some improvements in the code by then. Commenting out instead leads to copying and pasting changes, which is not a good idea when you have refactoring tools.

Just remember, the basic rule here is red … green … refactor. You then aim for green again. In my case, the change was so simple it did not break anything, but I have had cases where a refactor broke code. This is why tests are mandatory. If you are not using unit tests, you should ask yourself "why am I not using them?"
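To make the red … green … refactor rhythm concrete, here is a toy example (mine, not from the repository code): the same test guards both the original implementation and the refactored one, so any behavior change shows up as red immediately.

```csharp
using System;
using System.Linq;

public static class RefactorDemo
{
    // "Before": the verbose version that first made the test green.
    public static int SumOfSquaresLoop(int[] values)
    {
        int total = 0;
        foreach (int v in values)
            total += v * v;
        return total;
    }

    // "After": refactored to LINQ; the behavior must not change.
    public static int SumOfSquaresLinq(int[] values)
    {
        return values.Sum(v => v * v);
    }

    // The one test that must stay green across the refactor.
    public static void SumOfSquares_SameResultBeforeAndAfterRefactor()
    {
        int[] input = { 1, 2, 3 };
        if (SumOfSquaresLoop(input) != 14)
            throw new Exception("Loop version is broken (red)");
        if (SumOfSquaresLinq(input) != 14)
            throw new Exception("Refactor changed behavior (red)");
    }
}
```

If the refactored version had quietly changed behavior, the second check would have gone red before the old code was ever deleted.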

Peace and Grace,
Greg