Monday, December 10, 2007

The Proactive Pair Programmer

It has been a little over three weeks since the MediaLogic software developers decided to adopt pair programming. Teamwork has always been a very important part of the software developer culture at MediaLogic. I like to think that we have created a culture where no one person is the superstar. We all have our particular strengths and weaknesses. When faced with a difficult or seemingly impossible problem, it is comforting to know that you can simply ask for help. This open environment, coupled with the right people, can help you produce amazing code.

While pairing with various developers I have realized that one must be focused and proactive. Pair programming can be very difficult at first. I found that the most difficult part of pair programming was paying attention when I did not have the keyboard in my hands. Secondly, I learned that the inactive partner must be proactive and think ahead. Being proactive and forward-thinking will greatly improve your focus and concentration. When we pair at MediaLogic we do a little thing called ping-ponging; the steps are as follows, with a minimal code sketch after the list:

  1. Adam writes a test.
  2. Mo makes the test pass.
  3. Refactor if needed.
  4. Run tests and check for green bar.
  5. Mo writes a test.
  6. Adam makes the test pass.
  7. Repeat steps 1 through 6.
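
To make this concrete, here is a minimal sketch of one ping-pong round using MbUnit (the same framework our build script runs). The Calculator class and its Add method are hypothetical, invented purely for this illustration:

using MbUnit.Framework;

public class Calculator
{
    // Step 2: Mo writes just enough code to make the test pass.
    public int Add(int first, int second)
    {
        return first + second;
    }
}

[TestFixture]
public class CalculatorTest
{
    // Step 1: Adam writes a test that pins down the next bit of behaviour.
    [Test]
    public void Should_add_two_numbers()
    {
        Assert.AreEqual(5, new Calculator().Add(2, 3));
    }
}

After a quick refactor and a green bar, the roles swap: Mo writes the next test and Adam makes it pass.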

During the first step my pair must be thinking about how to satisfy the test and what the next test will be. Furthermore, the inactive partner must be constantly challenging and questioning the design. You and your pair must keep in mind that it is the code being attacked, not the developer.

In my experience pair programming forces you to write better code. When pairing it is more difficult to suggest taking a shortcut. However, I must admit that this drive for quality depends entirely on the team. For example, if two weak programmers who take shortcuts pair together, the quality of the code will be poor. On the other hand, if at least one of the developers is strong, shortcuts will be less likely and quality will go up. I consider myself lucky, as I work with a small team of exceptional software developers. Over the last few weeks I have cut very few corners as a result of pair programming. As a result we are producing higher quality code that will be more maintainable down the road.

Another benefit of pair programming is cross-training and exposure to new ideas and ways of thinking. After one week of pair programming I feel like I have learned more than I normally do in a month of working on my own.

Lastly, I have noticed that I am no longer distracted by email, instant messenger and my urge to read blogs. When pairing you do not have the luxury of checking your email or favourite blog. Let's face it, it would be poor etiquette to read your email with your pair sitting beside you.

Pair programming is a very exhausting and rewarding experience. Pairing requires that you try to stay one step ahead of your partner. At times this will require an immense amount of brain power to figure out your partner's next move. Hopefully this will help you develop a passion for yelling out "I know what you're doing" or "I know where you are going with this". This passion, or skill of active thinking, can spill over into other parts of life too.

Another benefit of pair programming is that it allows you to share success and failure. Failures and frustrating problems are never fun to deal with. Pairing allows you to attack problems together as a team. No two people think alike. This diversity allows each person to contribute different ideas and perspectives on how a problem should be solved. A pair also allows you to talk your problem out. Sometimes it's just good to have someone to bounce your ideas off of. The process of explaining your problem out loud can be a very helpful tool. Beyond overcoming problems, pair programming also allows you to share success, and I find that shared success is much more rewarding. This may be a paradigm shift, but pair programming allows human interaction to be a more pervasive element in the software development industry.

Cryptic NAnt Scripts

I've only had the privilege of working with a few NAnt scripts, but most of them are very cryptic. They use property names delimited by periods. The following property name is distinguishable but not very clear.

<property name="base.dir" value="${project::get-base-directory()}" />
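
Compare that with a camelCase equivalent, which reads much more naturally and is the convention I use in the script below:

<property name="baseDirectory" value="${project::get-base-directory()}" />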

With the advent of fluent interfaces and our striving to make our code more readable, let's make our NAnt build scripts readable too. I'm sure that NAnt has a naming convention, but who cares? Let's be pragmatic about this and make our build files easier to read. Let us think of those who are looking at NAnt for the first time. On a side note, Resharper's NAnt support kicks ass. My only complaint is that I cannot rename a fileset ID.

Here is an example of the type of build script that I would like to introduce.

<?xml version="1.0"?>
<project name="ProtocolBuilder" default="test">

  <property name="debug" value="true" />

  <property name="baseDirectory" value="${project::get-base-directory()}" />
  <property name="buildDirectory" value="${baseDirectory}\build" />
  <property name="toolsDirectory" value="${baseDirectory}\tools" />
  <property name="libraryDirectory" value="${baseDirectory}\lib" />
  <property name="sourceDirectory" value="${baseDirectory}\src" />
  <property name="applicationSourceDirectory" value="${sourceDirectory}\app" />
  <property name="testSourceDirectory" value="${sourceDirectory}\test" />

  <property name="applicationLibraryName" value="${project::get-name()}.dll" />
  <property name="testLibraryName" value="${project::get-name()}.test.dll" />
  <property name="xunitConsoleArguments" value="${testLibraryName} /sr /report-type:Text /rf:${buildDirectory} /rnf:Report" />

  <fileset id="libraryFilesSet">
    <include name="${libraryDirectory}\**\*.dll" />
  </fileset>

  <fileset id="toolsForTestingFileSet">
    <include name="${toolsDirectory}\mbunit\bin\**\*.dll" />
    <include name="${toolsDirectory}\rhino.mocks\Rhino.Mocks.dll" />
  </fileset>

  <fileset id="testReferencesFileSet">
    <include name="${toolsDirectory}\mbunit\bin\MbUnit.Framework.dll" />
    <include name="${toolsDirectory}\rhino.mocks\Rhino.Mocks.dll" />
    <include name="${buildDirectory}\${applicationLibraryName}" />
    <include name="${libraryDirectory}\**\*.dll" />
  </fileset>

  <fileset id="applicationSourceFileSet">
    <exclude name="${applicationSourceDirectory}\**\AssemblyInfo.cs" />
    <include name="${applicationSourceDirectory}\**\*.cs" />
  </fileset>

  <!-- PsKill closes any stray Notepad instances so an open Report.txt cannot block deleting the build directory -->
  <target name="killNotepad">
    <exec program="${toolsDirectory}\pskill.exe"
          commandline="notepad.exe"
          failonerror="false" />
  </target>

  <target name="initialize" depends="killNotepad">
    <delete dir="${buildDirectory}" />
    <mkdir dir="${buildDirectory}" />
  </target>

  <target name="compileSourceCode" depends="initialize">
    <csc output="${buildDirectory}\${applicationLibraryName}" target="library" debug="${debug}">
      <sources refid="applicationSourceFileSet" />
      <references refid="libraryFilesSet" />
    </csc>
  </target>

  <target name="compileSourceCodeAndTests" depends="compileSourceCode">
    <csc output="${buildDirectory}\${testLibraryName}" target="library" debug="${debug}">
      <sources>
        <include name="${testSourceDirectory}\**\*.cs" />
        <exclude name="${testSourceDirectory}\**\AssemblyInfo.cs" />
      </sources>
      <references refid="testReferencesFileSet" />
    </csc>
  </target>

  <target name="copyTestDependencies">
    <copy todir="${buildDirectory}" flatten="true">
      <fileset refid="toolsForTestingFileSet" />
    </copy>
    <copy todir="${buildDirectory}" flatten="true">
      <fileset refid="libraryFilesSet" />
    </copy>
  </target>

  <target name="test" depends="compileSourceCodeAndTests, copyTestDependencies">
    <exec basedir="${toolsDirectory}\mbunit\bin"
          useruntimeengine="true"
          workingdir="${buildDirectory}"
          failonerror="false"
          program="mbunit.Cons.exe"
          commandline="${xunitConsoleArguments}" />
  </target>

</project>
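
If this script is saved as the only .build file in the project root, running nant with no arguments executes the default "test" target declared on the project element, while nant compileSourceCode compiles just the application library without building or running the tests.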

Sunday, December 9, 2007

NAnt Build Script Improvement - Being Pragmatic

For the last six months our team has dealt with the annoyance of forgetting to close our Report.txt file before running the test target of our NAnt script. If you have no clue what I am talking about, here is a little background on our build process. Our team does not use Visual Studio to build our projects; rather, we use a NAnt script. Our NAnt script has a target called "test" that runs all tests in the solution using an xunit framework. The results of the tests are written to a Report.txt file in the same directory where the code is built. If the Report.txt file is not closed before running the test target, you receive an error that the directory is in use. This error occurs because the build script is trying to delete a directory in which a file is in use.

Even though we only waste several seconds a day, or minutes in a week, we decided to get pragmatic about the problem. One of our developers created a target called KillNotepad that uses a SysInternals tool called PsKill. PsKill is a program that lets you specify the name or id of a process that you would like to terminate (kill). Please be warned that this will kill all instances of Notepad. Fortunately, I use Notepad++ rather than Notepad, so when I kill Notepad, Notepad++ stays open. The build script I am using is very similar to the one described on JP's blog; you can also find it in the Nothing But .Net source code from November 2007: http://jpboodhoo.googlecode.com/svn/trunk/

The original clean target simply tries to delete the build directory and fails if the Report.txt file is open.

<target name="clean">
  <delete dir="${build.dir}" />
</target>

The new clean target is now dependent on the KillNotepad target. This means that the KillNotepad target will run before the clean target.


<target name="clean" depends="KillNotepad">
  <delete dir="${build.dir}" />
</target>

<target name="KillNotepad">
  <exec program="${tools.dir}\pskill.exe"
        commandline="notepad.exe"
        failonerror="false" />
</target>

The following is just one example of how you can solve this problem. The wonderful thing is that we do not all think alike; I'm sure you can come up with a different, perhaps better, solution.
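
For example, here is a sketch of one alternative. Since the report folder is configurable through mbunit.Cons.exe's /rf: switch (the same switch our script already uses), you could write Report.txt outside the build directory entirely, so an open report never collides with the clean target. The property names below are hypothetical:

<!-- hypothetical alternative: keep the report outside the directory that clean deletes -->
<property name="reports.dir" value="${project::get-base-directory()}\reports" />
<property name="xunit.console.args"
          value="${test.library.name} /sr /report-type:Text /rf:${reports.dir} /rnf:Report" />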

 

Saturday, December 8, 2007

Quick Tip: How To Read and Write a Test

This is just a quick post. Hopefully in the next little while I can post a more in-depth description of how a test should be read and written. If you have questions, leave a comment and I'll make sure to follow up in my next post if I am able to.

If you are using JP's Resharper templates, may I suggest changing your "record" live template to place the cursor in the playback block. The reason I suggest placing the cursor in the playback is based on how a test should be read. A test should be read and written bottom-up. By working bottom-up you are explicitly setting a goal: to write code above the last line that will satisfy your assertion. Let's work through some sample code to get a better understanding. I am assuming that you are familiar with Rhino Mocks and the system-under-test concept (CreateSUT is a factory method).

[Test]
public void Should_Leverage_the_task_to_retrieve_a_list_of_interviewee_Roles_Builders()
{
    using (mockery.Record())
    {
        Expect.Call(mockTask.RetrieveRoleBuilders()).Return(new List<IBuilder>());
    }

    using (mockery.Playback())
    {
        CreateSUT().Build(mockXmlTextWriter);  // start reading here
    }
}

The preceding test simply states:
"When the build method is called, we expect the task to call its RetrieveRoleBuilders method and return a list of IBuilders."

This can be refined even further to:
"When a build occurs we expect that the task will retrieve a list of role builders."

Change your record live template so that you write tests bottom-up.

The original record live template:

using (mockery.Record())
{
    $END$
}
using (mockery.Playback())
{

}

The modified live template:

using (mockery.Record())
{

}
using (mockery.Playback())
{
    $END$
}