Thursday, December 30, 2010

Red-Green-Refactor

This is quite possibly the last blog post of the year, and I want to wish everyone a happy and productive 2011.  Well, what a year it has been.  I should probably do a 2010 retrospective in a later post, but for now, I want to stress the importance of unit testing.

Currently, my team is working on a partner integration project where we call a few web services to initiate a process and, in return, expose a few services (URIs) to receive the process results.  We are using WCF, C#, .NET 3.5, LINQ to SQL, and SQL Server as the backend.

Only a few months into design, specification, and development, we have made decent progress.  There are two separate projects used for testing the code base, and the code itself has been factored into three projects (Services, Business, and Data Access), which is quite typical for .NET enterprise projects.

I am not sure if you belong to the camp that tests its code, or whether you prefer the FDD method (Faith-Driven Development: "I believe my code will work").  I plucked the term from a relatively recent tweet by @lazycoder, though it may have been a retweet; you would have to dig to find out who originally coined it.  FDD is a practice that is easy to fall into, especially when you are time constrained and just want to declare "I am done."  I hope you belong to the TDD (Test-Driven Development) camp instead.

So, why unit test?  I have seen and read many blog posts about this, but I would rather share my own opinion on why and how it works for me.  First and foremost, it makes me think about my code and what I want to deliver.  You have one feature you want to develop, and develop well.  Once you have tested and worked out that feature, you move on to the next.
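
To make the red-green-refactor cycle concrete, here is a minimal sketch using NUnit; the OrderCalculator class and its discount rule are hypothetical names I made up for illustration.  Write the failing test first (red), make it pass with the simplest code (green), then refactor with the test as your safety net.

using NUnit.Framework;

[TestFixture]
public class OrderCalculatorTests
{
    // Red: this test is written before OrderCalculator.Total exists.
    [Test]
    public void Total_AppliesTenPercentDiscount_OverThreshold()
    {
        var calculator = new OrderCalculator(100m, 0.10m);
        Assert.AreEqual(135m, calculator.Total(150m));
    }
}

// Green: the simplest implementation that passes; now refactor safely.
public class OrderCalculator
{
    private readonly decimal _threshold;
    private readonly decimal _rate;

    public OrderCalculator(decimal threshold, decimal rate)
    {
        _threshold = threshold;
        _rate = rate;
    }

    public decimal Total(decimal subtotal)
    {
        return subtotal > _threshold ? subtotal * (1 - _rate) : subtotal;
    }
}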

It also makes you think about writing decoupled code.  Ask yourself: how many times do you "new up" an object inside a method?  Could you use dependency injection instead?  By the way, there is a great book (currently in MEAP) by Mark Seemann that I have been reading along with, called Dependency Injection in .NET; I suggest you check it out.  Any time you create an instance of an object inside a method, you are creating coupling.  That coupling causes pain when those instances are not available, and it only gets worse if you decide not to unit test.  But you already knew that, right?
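
Here is a minimal before-and-after sketch (IPartnerService and the initiator classes are hypothetical names, loosely inspired by our integration project): the first version news up its dependency, while the second takes it through the constructor so a test can substitute a fake.

// Coupled: the dependency is newed up inside the method,
// so any test has to hit the real partner web service.
public class CoupledProcessInitiator
{
    public void Start(int partnerId)
    {
        var client = new PartnerServiceClient();
        client.Initiate(partnerId);
    }
}

// Decoupled: the dependency arrives via constructor injection,
// so a unit test can pass in a fake IPartnerService.
public interface IPartnerService
{
    void Initiate(int partnerId);
}

public class PartnerServiceClient : IPartnerService
{
    public void Initiate(int partnerId)
    {
        // stand-in for the real web service call
    }
}

public class ProcessInitiator
{
    private readonly IPartnerService _service;

    public ProcessInitiator(IPartnerService service)
    {
        _service = service;
    }

    public void Start(int partnerId)
    {
        _service.Initiate(partnerId);
    }
}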

The majority of the time, you also get to see and test the edge cases that would otherwise be overlooked.  You can easily fake, stub, or mock an object and write test cases for those edges, and you can plan for and exercise exceptions in isolation.  Don't let the user interface be the first place where your code's behavior is discovered.
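
Continuing the hypothetical sketch above, a hand-rolled stub lets you exercise a failure path (here, a partner timeout) that would be nearly impossible to trigger on demand against the real service:

using System;
using NUnit.Framework;

// A stub that simulates the partner service timing out.
public class TimingOutPartnerService : IPartnerService
{
    public void Initiate(int partnerId)
    {
        throw new TimeoutException("Simulated partner timeout");
    }
}

[TestFixture]
public class ProcessInitiatorEdgeCaseTests
{
    [Test]
    public void Start_SurfacesTimeout_WhenPartnerServiceTimesOut()
    {
        var initiator = new ProcessInitiator(new TimingOutPartnerService());
        Assert.Throws<TimeoutException>(() => initiator.Start(42));
    }
}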

Increased Decoupling

I can't say it enough.  Do you want decoupled code?  Better design?  Then you should unit test.  Refactoring becomes an easy thing to do, because unit testing lets you focus on one thing at a time, not on code three or four levels up or down the call stack.

In short (and I want to keep it short), if you are writing enterprise applications, I see no way around unit tests (plus integration tests).  Oh, before I leave, please check out ReSharper, a great Visual Studio plugin that especially helps with this process, along with many other enhancements.

Thanks for reading and Happy New Year!

Baskin

Thursday, September 2, 2010

When everyone agrees

Don't you find it odd when everyone agrees, or at least when it feels that way?  Then there is an eerie silence… even odder in remote WebEx/Skype-type meetings.  I have been working with off-shore development for over a year now and have mixed feelings about the interaction.  Most of the time, it feels like we all agree, yet tangible outcomes do not happen.

First of all, I am not sure the rules of the game are set the same for on-shore (US-based) and off-shore (outside the US) teams.  There are definite differences in our training backgrounds, our craftsmanship, and our values.  As a senior software developer and junior architect trained in the US and actively practicing for over 13 years, I value good software craftsmanship: applications that are tested, designed, and developed according to best practices so that they stay maintainable and scalable.  That is what I strive for, and what I encourage and mentor others to do, including my soon-to-be-former off-shore team.  However, the road traveled was rocky, and we ran into a few obstacles.

So what happened?  Good question.  My off-shore team had an ingrained way of doing things, and it was hard for them to learn test-driven development, understand tier-based architecture, avoid creating large "GOD" classes, write explicit, modular functions rather than monolithic top-to-bottom "know-it-all, do-it-all" methods, and avoid code duplication.  Quite a few of these were documented by Jeff Atwood back in 2006 (Coding Horror) and by many others before and since.  Unfortunately, my team was mostly VB6 developers, with a few exceptions.  How do I know?  Because I was one.  I am not saying that being a VB6 developer is bad, but times have changed, and maybe you should too.  And please stop using Hungarian notation, for crying out loud… #pet_peeve

Writing Legacy Code and continuing to do so

Yet with the releases of .NET 2.0, 3.5, and now 4.0, the influence of open source (especially on the web), and the changing face of software development (Agile, Scrum, TDD, BDD, etc.), it was time for me to change back in 2005, and then to change others.  You can't write code the way it was written 10 years ago (waterfall, monolithic, client/server, and so on).  Somehow my off-shore team never received this message, probably because no one delivered it.  They were content delivering applications with the look and feel of the late '90s and early 2000s.  Eyebrows were raised and points were made over WebEx, but not much happened.  I bet they did not see my eyebrows, though :)

According to Michael Feathers' "Working Effectively with Legacy Code", legacy code is code that has no tests.  I am not going to recount the details of his book or why writing zero tests is bad, but the point is this: if you want to write good, scalable, readable, and usable applications, you had better have tests around your code and features.  This was one of the first things I tried to set guidelines around with the off-shore team, with mixed results.  Yes, they are writing tests, but most of the tests fulfill a requirement rather than exercising the code and its logic, and they do nothing to help refactoring.  Ugh.
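
To illustrate the difference (with made-up names), compare a test that merely executes code to satisfy a coverage checkbox with one that actually pins down behavior:

using NUnit.Framework;

public static class PasswordPolicy
{
    public static bool IsStrong(string password)
    {
        return !string.IsNullOrEmpty(password) && password.Length >= 8;
    }
}

[TestFixture]
public class PasswordPolicyTests
{
    // A "requirement-fulfilling" test: it runs the code but asserts nothing,
    // so it passes no matter what the method returns.
    [Test]
    public void IsStrong_Runs()
    {
        PasswordPolicy.IsStrong("abc");
    }

    // A real test: it pins down the behavior, including the edge cases,
    // and gives you a safety net for refactoring.
    [Test]
    public void IsStrong_RejectsShortOrEmptyPasswords()
    {
        Assert.IsFalse(PasswordPolicy.IsStrong(""));
        Assert.IsFalse(PasswordPolicy.IsStrong("short"));
        Assert.IsTrue(PasswordPolicy.IsStrong("longenough"));
    }
}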

Usage of exceptions, and when to throw them, created another interesting exchange with the off-shore team.  In .NET, exceptions should be used judiciously; as the name implies, they should be an "exception", like an "accident" you did not see coming, not part of the regular workflow, such as a user entering a wrong password.  In one incident, I seemed to have convinced the developer that this was the case, but come the next day, I found the same kind of code under a different signature.  A total "Gaaa!" moment.  Yikes.  Again, there was not much reinforcement: the application works, users are happy*.  Next feature, please.
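
Here is a small sketch of the point (the class and method names are hypothetical): a wrong password is an expected outcome, so it should be a return value, not a thrown exception.

using System;

public class AuthService
{
    // Anti-pattern: an expected outcome modeled as an exception.
    public void Login(string user, string password)
    {
        if (!CredentialsMatch(user, password))
            throw new InvalidOperationException("Wrong password");
    }

    // Better: the expected outcome is a return value;
    // exceptions are reserved for the truly exceptional.
    public bool TryLogin(string user, string password)
    {
        return CredentialsMatch(user, password);
    }

    private bool CredentialsMatch(string user, string password)
    {
        return false; // stand-in for the real credential check
    }
}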

N-Tiered Architecture and Separation of Concerns

Even though you may have created a multi-project solution, that alone does not guarantee your project is tiered and respects the notion of separation of concerns.  When I joined, I introduced the Model-View-Controller architecture for the ASP.NET development.  At first I had a different off-shore team, and we also struggled, especially with less-than-stellar user experience requirements.  You have to understand how difficult it is to try to write a Windows application inside a web browser, and my team was set on achieving just that, with the support of the product owners.

Another sticking point with my off-shore team was the overuse of stored procedures.  Okay, there are two camps out there on this, but come on, it is 2010: we have O/RM tools, including LINQ and the Entity Framework, data transfer objects, business classes, and so on.  And how do you test a stored procedure, especially its edge conditions?  Below is part of one of those sprocs.

-- ...
My_DateTime <=DATEADD(minute, 59, @MyDatetime))
-- ...

Above, what happens if the programmer uses "<" instead?  How do you catch that?  And what does "59" mean?  (Hint: minutes.)  Let me tell you, long stored procedures with convoluted business logic are certainly not my cup of tea.  With the existing legacy database, we had a few thousand stored procedures written over the lifetime of the applications.  God knows which ones are in use and which are deprecated, but can you imagine the effort to review and clean them up?  I digress.
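
For contrast, here is a sketch (with hypothetical names) of the same window check pulled into C#, where the magic number gets a name and the boundary condition can actually be unit tested:

using System;
using NUnit.Framework;

public static class ProcessingWindow
{
    // The sproc's magic "59", given a name.
    private const int WindowMinutes = 59;

    // Inclusive upper bound, mirroring the sproc's "<=" comparison.
    public static bool IsWithinWindow(DateTime candidate, DateTime windowStart)
    {
        return candidate <= windowStart.AddMinutes(WindowMinutes);
    }
}

[TestFixture]
public class ProcessingWindowTests
{
    [Test]
    public void Boundary_IsInside_OneMinutePast_IsOutside()
    {
        var start = new DateTime(2010, 9, 2, 12, 0, 0);
        Assert.IsTrue(ProcessingWindow.IsWithinWindow(start.AddMinutes(59), start));
        Assert.IsFalse(ProcessingWindow.IsWithinWindow(start.AddMinutes(60), start));
    }
}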


Anyway, it has been quite an interesting journey and experience.  I have some good takeaways, and I developed very good friendships over WebEx and Skype.  But I do miss the opportunity for face time: a chat with a developer or peer, preferably over a beer, about some design pattern, or an intense code review.  So I turned in my resignation, and now I am off to new ventures.  Let us not forget to have discussions, and remember that when everyone agrees, it does not mean all is well.  Because, really, they don't!


Happy Coding! (just be sure to leave "code smells" in the rear-view mirror)

Monday, May 10, 2010

Handling duplicate Area/Controller names in ASP.NET MVC 2

When dealing with duplicate names for an "Area" and a "Controller" in ASP.NET MVC, we can run into issues, and we need to understand the ASP.NET routing mechanism to produce the proper links. In this post, I will walk through the problem and the fix for ASP.NET MVC 2 on Visual Studio 2010.

First create an ASP.NET MVC 2 application from File -> New -> Project, and choose the project type shown below.


Next create an "AdministrationController" and an associated "Administration" folder inside the "Views" folder. Also create an empty view page (Index.aspx) inside that folder.

Place the following markup inside the Content2 section of this view page.

<h2>Administration in Root</h2>

Then create a link to this controller/view from the "Home/Index.aspx" file as shown below.

<%=Html.ActionLink("Administration in Root", "Index", "Administration") %>

Make sure the link works as expected.

Stop the debugger, and add a new area by right-clicking the project and selecting Add -> "Area". Name the area "Administration".
Create an "Administration" controller as in the application's root, but this time inside the "Administration" area. Create the associated view folder and Index.aspx page, and place the following text in the content area.


<h2>Administration in Area</h2>


Run the application as before, and verify that the "Administration in Root" link now fails with a message similar to "Multiple types were found that match the controller named 'Administration'":


Add another link to the "Administration" area from the "Home/Index.aspx".


<%= Html.ActionLink("Administration in Area", "Index", "Administration", new { area = "Administration" }, null) %>


This link will now work from the Home/Index.aspx page, hitting "Administration in Area" as shown below:

Now we need to fix the "Administration" link to the application's root path.
To do that, we add the optional "namespaces" parameter to the routes.MapRoute call, as shown below.


routes.MapRoute(
    "Default", // Route name
    "{controller}/{action}/{id}", // URL with parameters
    new { controller = "Home", action = "Index", id = UrlParameter.Optional }, // Parameter defaults
    new string[] { "DuplicateAreaControllerName.Controllers" } // Namespace hint for controller lookup
);


With the namespace added to routes.MapRoute(...), the earlier Html.ActionLink(...) to the application's root "Administration" screen works as before. Ideally you will not run into this, but my team hit the problem while trying to develop a similar feature set inside an area with the same name. ASP.NET MVC 1.0 does not appear to have a direct solution, but ASP.NET MVC 2.0 has better built-in support, although it is not very obvious at first. Hopefully you will be careful with your naming and avoid such situations in the first place.
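
For reference, the area scaffolding generates an AreaRegistration class roughly like the one below (the project namespace is assumed); routes mapped through the AreaRegistrationContext default to the registration class's own namespace, which is why only the root route needed the explicit hint.

using System.Web.Mvc;

namespace DuplicateAreaControllerName.Areas.Administration
{
    public class AdministrationAreaRegistration : AreaRegistration
    {
        public override string AreaName
        {
            get { return "Administration"; }
        }

        public override void RegisterArea(AreaRegistrationContext context)
        {
            // Controller lookup for this route is scoped to this class's
            // namespace by default, so the area's AdministrationController
            // does not collide with the one in the application root.
            context.MapRoute(
                "Administration_default",
                "Administration/{controller}/{action}/{id}",
                new { action = "Index", id = UrlParameter.Optional }
            );
        }
    }
}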

Monday, April 5, 2010

A tour around Open Data Protocol (OData)

It should come as no surprise that the data collected in many places is hard to get to. Given the ubiquity of web technologies, the need for information locked in silo'ed databases, file systems, and static web sites is greater than ever. Google has already established its own GData protocol and is offering solutions around it. Representational State Transfer (REST) and the simplicity of web protocols are also driving these initiatives.

[Image: OpenDataProtocol]

Microsoft recently rebranded its ADO.NET Data Services (formerly known as "Astoria") effort around the Open Data Protocol, OData. I have been dabbling with OData recently, and what is possible out of the box is quite impressive. Then again, I haven't built a full-fledged application with it yet, but first looks are pretty decent.

OData builds on a set of well-established protocols to help create HTTP-based data services, enabling the exposure of and access to a variety of sources (databases, file systems, web sites). For an in-depth look, please check out OData.org.
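
To get a feel for the protocol before we build anything: every resource gets a URI, and query options compose on top of it. A few illustrative requests against a hypothetical Northwind-backed service root:

/Products                                  -- the Products entity set
/Products(1)                               -- the product with key 1
/Products(1)/Category                      -- navigate to its category
/Products?$filter=UnitPrice gt 20          -- server-side filtering
/Products?$orderby=ProductName&$top=10     -- ordering and paging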

As of today, using Visual Studio 2010 RC, we can already build OData applications, and there are a few blog posts out there on the subject. In VS 2010, start by creating an ASP.NET Web Application from File –> New –> Project (see below).

[Image: CreateNewSampleWebOData]

Next, add a new ADO.NET Entity Data Model to the project: right-click the project, choose "Add –> New Item…", and pick ADO.NET Entity Data Model.

[Image: ADONETEntityModel]

Give it a proper name. In this example, I am going to use the Northwind database, since it is pretty common and at one point shipped with SQL Server; nowadays it is available as a separate download. Once you run the installer, attach the database files to the pertinent SQL Server instance.

The wizard walks you through generating the EDMX file: first choose the "Generate from database" option, then point to your SQL Server instance and pick the Northwind database.

As shown below, I used the "Categories" and "Products" tables for this.

[Image: GenerateDatabaseObjects]

Now pay attention to your Model namespace, because it will affect your Web.config file; repeating these steps will clutter the file, and you may end up with **Model1, **Model2 names in your configuration. Click "Finish". You have now created a simple ADO.NET Entity model, and it is time to expose it via a WCF Data Service.

In the same web project, add a new WCF Data Service: right-click the project, choose "Add –> New Item…", and pick "WCF Data Service" under the "Web" tab. I am starting to like VS2010's organization of items, but sometimes things still don't make sense, or maybe I "haven't adjusted to it yet".

[Image: CreateNewWCFDataService]

Once you have created the service, open it in the source editor and notice the class declaration. Change its type parameter to the name of your entity container:

public class WcfDataService1 : DataService<NorthwindEntities>

Next, add the two lines below to the generated InitializeService method to set the access rules for these entity sets.

config.SetEntitySetAccessRule("Categories", EntitySetRights.All);
config.SetEntitySetAccessRule("Products", EntitySetRights.All);


Those two lines in place, you can check your work so far by setting the service as the "Start page" and viewing the results in the browser (shown below for IE 8).

[Image: baseEntity]



Finally, to consume and query this service, we can create a simple console application and add it to the solution. I wrote a simple function to display the products in the console window (I borrowed a few ideas from Scott Hanselman's much better and more elegant "Creating an OData API for StackOverflow including XML and JSON in 30 minutes"; for in-depth coverage, please check out his blog).



First we need to add a service reference: right-click the Console project and choose "Add Service Reference". Copy and paste the web service URL you created in the previous step, then hit "Discover". Again, pay attention to your namespaces.



[Image: TechMasterService]



Again, if you think this is too simplistic, you are not alone. At some point we will have to worry about security, credentials, and perhaps performance, but not right now. Still, it is impressive what you can accomplish with a few mouse clicks out of the box.



For VS 2010 RC, you can also download and install the "Open Data Protocol Visualizer" from the "Tools –> Extension Manager" menu item. Once that is in place, it is pretty slick to see in action.



Right-click the service reference in Solution Explorer and choose the "View in Diagram" option. This opens up a nice canvas for object visualization; notice the new tab called "Open Data Model Browser". Very nice, I thought. You can drop your entities onto the canvas and look at your columns, etc.



[Image: OpenDataModelBrowser]



Finally, to query the referenced entity model, I added the code below to the start-up of the console application.




class Program
{
    static void Main(string[] args)
    {
        CallingODataService();
    }

    public static void CallingODataService()
    {
        var tm = new TechMasterEntities(
            new Uri("http://localhost:xxxx/TechMasterService.svc"));

        var products = from p in tm.Products
                       select p;

        foreach (Product p in products)
        {
            Console.WriteLine(p.ProductName);
        }

        Console.ReadLine();
    }
}



Don't forget to change the port number and the service name to your own. Partial results from the console are shown below.

[Image: partialResults]
That is pretty much it for now. I am looking forward to learning more about the OData protocol and to exposing silo'ed data over HTTP in secure and simple ways. Happy Coding!