I have the opportunity to review and analyze a lot of different application code bases, across a number of different technology stacks. Some of these are custom software applications that Alliance is building or maintaining for our clients. Some are open source packages we use in our work. Others are analyzed for our clients as part of our Application Assessment and software testing outsourcing solution.
One of the things that continues to surprise me is the wide variance in code quality. After seeing so many different applications, created by so many different development teams of varying skill levels, I know I shouldn't be surprised at some of the things I see. Yet every now and then a particular issue jumps out as so obvious that I have to ask: why didn't the original developers write the code better?
One recent example involves an Application Architecture Analysis we did for a government client using the CAST Application Intelligence tool. The application was a medium-sized .NET web application connecting to an Oracle database. It was a port of an existing PowerBuilder desktop app, and it exhibited a lot of the classic problems of simplistic porting. Each screen in the desktop application was directly mapped to a web screen, without regard for whether the navigation and state management - not to mention the browser round trips - made sense in a web application.
At the start of the engagement, the client identified that they had concerns around the correct handling of database connections. CAST is a great tool for finding problems like this in custom .NET applications, as the .NET analyzer is able to identify specific methods in which a database connection is opened but not closed. It's certainly much faster and more accurate to browse an easy-to-use dashboard pointing to the exact 18 locations in the 400,000 Lines of Code (400 kLOC) that should be checked than to manually search ASP.NET pages and code-behind files, dozens of VS.NET projects, and hundreds of C# classes to find the needle in this haystack.
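The defect class CAST flags here typically looks like the following sketch. The class, method, and query names are hypothetical (the actual application code is not reproduced here), and I'm assuming Oracle's ODP.NET provider for illustration; the pattern is the same for any ADO.NET provider:

```csharp
using System;
using Oracle.DataAccess.Client; // assumed ODP.NET provider reference

public class CustomerRepository
{
    // Leaky pattern: the connection is opened, but if ExecuteScalar()
    // throws, Close() is never reached and the physical connection is
    // not returned to the pool. Under load, the pool is exhausted.
    public int CountCustomers(string connectionString)
    {
        OracleConnection conn = new OracleConnection(connectionString);
        conn.Open();
        OracleCommand cmd = new OracleCommand(
            "SELECT COUNT(*) FROM customers", conn);
        int count = Convert.ToInt32(cmd.ExecuteScalar()); // may throw
        conn.Close(); // skipped entirely on exception
        return count;
    }
}
```

On the happy path this code works, which is exactly why leaks like this survive functional testing and only surface under sustained load - and why static analysis that checks every exit path is so effective at finding them.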
Once the specific problems were identified and fixed, the database connection leaks were eliminated, and the application was able to proceed through user testing. A perfect quick win for Application Risk Assessment & Code Quality analysis.
One of the surprising (yeah, I know, I shouldn't be surprised!) things in this case was that in many parts of the code base the developers had already taken advantage of the "using" keyword in C# to automatically perform resource management of the database connection. This is a much simpler approach than trying to enforce correct usage of finally{} blocks for resource cleanup, and it has been available since the first version of C#. Yet many developers are not aware of it and do not use it regularly! This is a simple best practice that would enable higher quality code with less effort and fewer defects.
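To make the comparison concrete, here is the same hypothetical count query written both ways (again assuming an ODP.NET-style provider; the names are illustrative, not the client's code). The compiler expands a using statement into an equivalent try/finally that calls Dispose(), so you get guaranteed cleanup without writing the boilerplate yourself:

```csharp
using System;
using Oracle.DataAccess.Client; // assumed ODP.NET provider reference

public class CustomerRepositoryFixed
{
    // Manual cleanup: correct, but verbose and easy to get wrong.
    public int CountWithFinally(string connectionString)
    {
        OracleConnection conn = null;
        try
        {
            conn = new OracleConnection(connectionString);
            conn.Open();
            OracleCommand cmd = new OracleCommand(
                "SELECT COUNT(*) FROM customers", conn);
            return Convert.ToInt32(cmd.ExecuteScalar());
        }
        finally
        {
            if (conn != null) conn.Close(); // must remember the null check
        }
    }

    // Same guarantee, far less ceremony: both the command and the
    // connection are disposed even if ExecuteScalar() throws.
    public int CountWithUsing(string connectionString)
    {
        using (OracleConnection conn = new OracleConnection(connectionString))
        using (OracleCommand cmd = new OracleCommand(
            "SELECT COUNT(*) FROM customers", conn))
        {
            conn.Open();
            return Convert.ToInt32(cmd.ExecuteScalar());
        }
    }
}
```

The using version is not just shorter; it removes the two spots where the finally version can be botched (forgetting the null guard, or constructing the connection outside the try block).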
And that means higher productivity. Sounds like an obvious win to me!
These are the types of best practices we teach to our teams as part of our RightWare Software Development Process.