Paying Too Much for Custom Application Implementation

Face it. Even if you have a team of entry-level coders implementing custom application software, you’re probably still paying too much.

Here’s what I mean:

You already pay upfront for foolproof design and detailed requirements. If you leverage more technology to implement your application, rather than spending more on coders, your ROI can go up significantly.

For entry-level coders to implement software, they need extra-detailed designs: typically, designs detailed enough that a coder can simply repeat patterns and fill in blanks from reasonably structured requirements. Even then, coders make mistakes, suffer misunderstandings and other costly failures, and take months to finish (assuming nothing changes in the requirements during that time).

But, again… if you have requirements and designs that are already sufficiently structured and detailed, how much more effort is it to get a computer to repeat the patterns and fill in the blanks instead? Leveraging technology through code generation can help a lot.
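
To make that concrete, here is a minimal sketch of "repeating the pattern and filling in the blanks" from a structured data dictionary. The entity names, table names, and the repository pattern itself are hypothetical examples, not taken from any project described here:

```python
from string import Template

# A tiny "data dictionary": each entry drives one pass through the pattern.
# The entities and table names are hypothetical.
data_dictionary = [
    {"entity": "Customer", "table": "customers"},
    {"entity": "Invoice", "table": "invoices"},
]

# The pattern an entry-level coder would otherwise copy, paste, and edit by hand.
pattern = Template('''\
class ${entity}Repository:
    """Data access for the ${table} table (generated code)."""
    def get_all(self, db):
        return db.execute("SELECT * FROM ${table}").fetchall()
''')

# "Repeat the pattern and fill in the blanks" for every dictionary entry.
for entry in data_dictionary:
    print(pattern.substitute(entry))
```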

Code generation becomes a much less expensive option in cases like that because:

  • There’s dramatically less human error and misunderstanding.
  • Generators can do the work of a team of offshored implementers in moments… and repeat the performance over and over again at the whim of business analysts.
  • Quality Assurance gets much easier… it's just a matter of testing each pattern, rather than each detail. (And while you're at it, you can generate unit tests as well; see the sketch after this list.)
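
Since the generator already walks every dictionary entry, emitting a matching unit test per entry is a small extension. A sketch continuing the hypothetical example above:

```python
from string import Template

# One generated smoke test per dictionary entry, alongside the generated code.
test_pattern = Template('''\
def test_${table}_get_all(db_fixture):
    """Generated test: the ${entity} pattern works against a test database."""
    rows = ${entity}Repository().get_all(db_fixture)
    assert isinstance(rows, list)
''')

for entry in data_dictionary:  # the same dictionary that drives the code generator
    print(test_pattern.substitute(entry))
```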

Code generation is not perfect: it takes very experienced developers to architect and implement an intelligent code generation solution. Naturally, such solutions also tend to require experienced people to maintain them (in sufficiently dynamic systems, there will always be implementation pattern changes). There's also the one-off stuff that just doesn't make sense to generate… (but that all has to be done anyway).

Actual savings will vary (and in some cases may not be realized until a later iteration of the application), but they typically depend on how large your metadata (data dictionary) is, how well it is structured, and how well your designs lend themselves to code generation. If you plan for code generation early on, you'll probably get more out of the experience. Retrofitting generation can definitely be done (been there, done that, too), but it can be painful.

Projects I've worked on that used code generation happened to focus generation techniques mostly on database and data access layer components and/or UI. Within those components, we were able to achieve 75-80% generated code in the target assemblies. This meant that, in one case, from the data dictionary we were able to generate all of our database schema and most of our stored procedures. In that case, for every item in our data dictionary, we estimated that we were generating roughly 2,500 lines of compilable, tested code. In our data dictionary of about 170 items, that translated into over 400,000 lines of code.
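
A scaled-down sketch of that kind of schema and stored procedure generation follows; a real data dictionary would carry column types, keys, and indexes, and everything here is illustrative rather than taken from the project described above:

```python
# Each dictionary item describes one table. The tables, columns, and the
# single VARCHAR type are placeholders for a much richer real dictionary.
items = [
    {"table": "customers", "columns": ["id", "name", "email"]},
    {"table": "orders", "columns": ["id", "customer_id", "total"]},
]

for item in items:
    cols = ",\n    ".join(f"{col} VARCHAR(255)" for col in item["columns"])
    # Generate the schema for this item...
    print(f"CREATE TABLE {item['table']} (\n    {cols}\n);")
    # ...and a matching stored procedure from the same entry.
    col_list = ", ".join(item["columns"])
    print(f"CREATE PROCEDURE select_all_{item['table']} AS\n"
          f"    SELECT {col_list} FROM {item['table']};\n")
```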

By contrast, projects where code generation was not used generally took longer to build, especially in cases where the data dictionaries changed during the development process. There's no solid apples-to-apples comparison, but consider hand-writing about 300,000 lines of UI code while the requirements are changing. Trying to nail down every detail (and change) by hand was a painstaking process, and the changes forced us to adjust the QA cycle accordingly, as well.

Code generation is not a new concept. There are tons of tools out there, as demonstrated by this comparison of a number of them on Wikipedia. Interestingly, some of the best tools for code generation can be as simple as XSL transforms (which opens up the tool set even more). Code generation may also already be built into your favorite dev tools. For example, Microsoft's Visual Studio has had a code generation utility known as T4 built into it for the past few versions. And that's just scratching the surface.
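
As one example of the XSL-transform approach, here is a minimal sketch driven from Python via the lxml library; the XML data dictionary and the emitted class stubs are invented for illustration:

```python
from lxml import etree  # third-party: pip install lxml

# A toy XML data dictionary (hypothetical element and attribute names).
dictionary = etree.XML("""
<entities>
  <entity name="Customer"/>
  <entity name="Order"/>
</entities>
""")

# An XSL stylesheet that emits plain text: one class stub per entity.
stylesheet = etree.XML("""
<xsl:stylesheet version="1.0"
                xmlns:xsl="http://www.w3.org/1999/XSL/Transform">
  <xsl:output method="text"/>
  <xsl:template match="/entities">
    <xsl:for-each select="entity">
      <xsl:text>class </xsl:text>
      <xsl:value-of select="@name"/>
      <xsl:text>Repository: ...&#10;</xsl:text>
    </xsl:for-each>
  </xsl:template>
</xsl:stylesheet>
""")

transform = etree.XSLT(stylesheet)
print(str(transform(dictionary)))
```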

So it's true… code generation is not for every project, but any project that has a large data dictionary (one that might need to change midstream) is an immediate candidate in my mind. It's especially great for user interfaces, database schemas and access layers, and even a lot of transform code, among others.

It’s definitely a thought worth considering.

IT Cost Cutting and Revenue Enhancing Projects

In the current economic climate, CIOs and IT managers are constantly pushed to "do more with less". However, blindly following this mantra can be a recipe for disaster. IT budgets are getting squeezed and there are fewer resources to go around, but literally trying to "do more with less" is the wrong approach. The "do more" framing implies that IT operations were not running efficiently and there was a lot of fat to trim; quite often, that is simply not the case. It is not always possible to find a person or a piece of hardware sitting idle that can be cut from the budget without impacting something, even though most IT departments still have plenty of opportunities to save cost. The right slogan should be something along the lines of "work smarter" or "smart utilization of shrinking resources"; not exactly catchy, but it conveys what is really needed.

When times are tough, IT departments tend to hunker down and act like hibernating bears – they reduce all activity (especially new projects) to a minimum and try to ride out the winter, not recognizing the opportunity that a recession brings. A more productive approach is to rethink your IT strategy: initiate new projects that enhance your competitive advantage, cut those that don't, and reinvigorate the IT department around better alignment with business needs and a more efficient cost structure. The economic climate and the renewed focus on cost reduction provide the much-needed impetus to push through new initiatives that couldn't get traction before. Corporate strategy guru Richard Rumelt says,

“There are only two paths to substantially higher performance, one is through continued new inventions and the other requires exploiting changes in your environment.”

Inventing something substantial and new is not always easy, or even possible, but as luck would have it, the winds of change are blowing pretty hard these days, both in technology and in the business environment. Cloud computing has emerged as a disruptive technology and is changing the way applications are built and deployed. Virtualization is changing the way IT departments buy hardware and build data centers. There is a renewed focus on enterprise-wide information systems, and the emergence of new software and techniques has made business intelligence affordable and easy to deploy. These are all signs of major changes afoot in the IT industry. On the business side of the equation, the current economic climate is reshaping the landscape, and a new breed of winners and losers is sure to emerge. What is needed is the vision, strategy, and will to capitalize on these opportunities and turn them into competitive advantage. Recently, a health care client of ours spent roughly $1 million on a BI and data strategy initiative and realized $5 million in savings in the first year due to increased operational efficiency.
 
Broadly speaking, IT initiatives can be evaluated along two dimensions: cost efficiency and competitive advantage. Cost efficiency measures a project's ability to lower the cost structure and help you run operations more efficiently. Projects along the competitive advantage dimension provide greater insight into your business and/or market trends and help you gain an edge on the competition. Quite often, projects along this dimension rely on an early mover's advantage, which over time may turn into a "me too" as competitors jump aboard the same bandwagon. The life of such a competitive advantage can be extended by superior execution, but over time it will fade; think of the supply-chain automation that gave Dell its competitive advantage in its early years. Therefore, such projects should be approached with a sense of urgency, as each passing day erodes the potential for higher profits. In this framework, each project has a component of each dimension and can be plotted along both to help you prioritize the projects that can turn a recession into an opportunity for gaining a competitive edge. Here are six initiatives that can help you break out of IT hibernation, lower your cost structure, and gain an edge on the competition:

Figure 1: Categorization of IT Projects

Figure 2: Key Benefits
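
To illustrate the framework in Figure 1, here is a minimal sketch that scores and ranks projects along the two dimensions; the project names, scores, and equal weighting are all made up for illustration:

```python
# Hypothetical projects scored 0-10 on the two dimensions of Figure 1.
projects = {
    "Data center virtualization": {"cost_efficiency": 9, "competitive_advantage": 3},
    "Business intelligence rollout": {"cost_efficiency": 5, "competitive_advantage": 8},
    "Cloud migration pilot": {"cost_efficiency": 7, "competitive_advantage": 6},
}

# Rank by a simple unweighted sum; a real exercise would weight the dimensions
# to reflect business strategy and plot each project on the two axes.
ranked = sorted(projects.items(),
                key=lambda kv: kv[1]["cost_efficiency"] + kv[1]["competitive_advantage"],
                reverse=True)

for name, scores in ranked:
    total = scores["cost_efficiency"] + scores["competitive_advantage"]
    print(f"{name}: {total}")
```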

In the current economic climate, no project can go far without an ROI justification, and calculating ROI for an IT project, especially one that does not directly produce revenue, can be notoriously hard. While calculating ROI for these projects is beyond the scope of this article, I hope to return to this issue soon with templates to help you get through the scrutiny of the CFO's office. For now, I will leave you with the thought that ROI can be thought of in terms of three components:

  • A value statement
  • Hard ROI (direct ROI)
  • Soft ROI (indirect ROI)

Each one is progressively harder to calculate and requires an additional level of rigor and detail, but each improves the accuracy of the calculation. I hope to discuss this subject in more detail in future blog entries.
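
As a simple illustration of the hard (direct) ROI component, take the health care example above: a $1 million investment returning $5 million in first-year savings works out as follows.

```python
# Hard (direct) ROI for the health care BI initiative mentioned above.
investment = 1_000_000          # BI and data strategy spend
first_year_savings = 5_000_000  # realized operational savings

hard_roi = (first_year_savings - investment) / investment
print(f"Hard ROI: {hard_roi:.0%}")  # prints "Hard ROI: 400%"
```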