Neuma White Paper:

CM: THE NEXT GENERATION - Evaluating and Selecting a CM/ALM Tool

Selecting a CM/ALM tool is no small task these days. The CM/ALM tool is no longer just another developer tool, such as a compiler or IDE. Instead, it forms the backbone of a development organization. It will strongly influence an organization's ability to measure and control quality, meet schedules, comply with requirements and assess development costs. It will also be the central focus for all development processes. It will support a key knowledge base, and will be viewed as the vault for both software and product assets.


A CM/ALM tool is also a key communication tool. Wrongly implemented, it will impede both process and communication. Successfully implemented, it will add significant value for all users, reducing workload and providing effective decision-making capability. Looking at the evolution of CM tools, you will see backbone technology capabilities increase dramatically from generation to generation.

We'll look at a few key areas of evaluating and selecting a CM/ALM tool in this article, with a clear focus on the different generations of CM technology:

  • I. Surveying the Landscape

  • II. Let the Vendors Help You Create Your Requirements

  • III. Your CM/ALM Requirements

  • IV. Vendor RFIs and Responses

  • V. In-house Evaluation

  • VI. Payment Considerations

I. Surveying the Landscape
Evaluation can be a long process or a short one; it depends on how you proceed. It may be possible to use the input from other evaluators, but if you do, make sure you're working to the same set of requirements and that the information is accurate and up to date. Some CM vendors move their technology ahead much more rapidly than others. Often some of the best solutions are overlooked, or there is a bias in the evaluation due to experience with past systems.

Don't start by restricting yourself to Open Source solutions. Why? There are two big reasons: the cost of the software will not come close to being your biggest cost, and you'll find it much more difficult to discover advanced requirements, as Open Source CM often caters to a lower common denominator of functionality. At the same time, don't exclude Open Source solutions. It's important to measure them against your requirements and to get a feel for where they sit with respect to commercial offerings.

Don't restrict yourself to the most-used solutions. There are a few old and clunky systems out there that have been around forever, it seems, and which may have been advanced at one point in time, but are much less so now.

Here's a set of questions to ask all prospective vendors:

1. How long does it take to install?

2. What hardware do I need to do a full evaluation?

3. Do I need additional 3rd party software?

4. Can I perform the installation myself?

5. How quickly can I load in my existing data (so that I may do a more accurate evaluation)?

6. How much training will I need to evaluate?

7. How much customization will I need before evaluation?

These questions should be a starting point. Assuming you have a fixed amount of time and resources for evaluation, you may want to get a good feel for how many tools you can evaluate. Be careful not to make assumptions here. The simplest tools may be very easy to install and evaluate - but so may the most comprehensive tools. And a number of tools can eat through your evaluation budget before you have a chance to look elsewhere. I strongly recommend looking at the easiest-to-install/evaluate tools first. If you look at a difficult one first, you may assume that all tools will require that amount of effort, and you may be much less likely to cast your net wide enough.

II. Let the Vendors Help You Create Your Requirements
Do you know what your requirements are? Have you been through this process many times before, or is this the first time? Regardless, your requirements are, at least in part, formed by your perceptions of what technology is available. And in fact, many get hooked on one solution just because they see some technology innovation that they haven't seen before. I know the CM industry as a whole is advancing slowly, but there are a few vendors that have quite a number of innovations - not just nice gimmicks or fancy reports, but real time savers and process support.

Even if you don't choose to evaluate some tools, it's important to look at what they have to offer. Give yourself at least a couple of hours on each of 10 tools. I'm certain you'll find some new requirements that are important to your environment. But how can you optimize those couple of hours to get the most out of them? Here's a secret: vendors like to be involved in the requirements phase so that they can get their key features in as requirements - it increases their chances of success. So take advantage, but make sure that you cover a large set of vendors. A vendor will come in to do a 1- or 2-hour presentation, or perhaps they'll do a Web presentation for you. And they'll be sure to let you know what you really need!

Beyond that, you might even want to ask the vendor for CM/ALM requirements. Vendors deal with a lot of companies looking for solutions. Of course, there's always a lot of valuable input from cmcrossroads.com.

III. Your CM/ALM Requirements
Just because the vendors will have some new requirements for you doesn't mean you don't need a head start. Still, it's hard to put together a CM/ALM requirements checklist. Why? Because there are a lot of requirements, many of which are easy to take for granted. So it might be a good idea to specify requirements with a broad title and then a detailed paragraph which zooms in on the specific capabilities that are important from your perspective.

At the end of this article, I have prepared a fairly lengthy, yet certainly incomplete, checklist of things I would have on my list. Some are basic, some are not. Some are musts, some are wants. I've organized them by CM/ALM generations, from 1st to 4th generation.

It's important to understand each of these requirements, at least to the extent that you can verify or otherwise ascertain that tools you are considering support them.

Generically, you should be looking for process-based tools. Prior to establishing your requirements, put together an overview of your process and identify the type of support you're looking for from your tools. Process-centric tools should help you match your process better. And you should carefully explore the differences between their process capabilities and the out-of-the-box configuration. If you need a lot of customization to take the out-of-the-box configuration to the point where it realizes the promised capabilities you require, look clearly at the cost of that customization.

Often requirements will look at functionality but fail to do a significant total cost of ownership (TCO) assessment. These are your resource requirements for implementing a solution. TCO includes items such as:

  • Cost of evaluation

  • Cost of supporting platform (hardware and software), and the effort to set them in place

  • Cost of tool licenses (all tools for the ALM solution, not just version and change control), including all options

  • Cost of annual maintenance and both minor and major upgrades

  • Cost of training (administrators, CM Managers and end users), including lost salary for the time trainees are on course

  • Cost of customization, including both ease of customization (how long) and rates (how much)

  • Cost of administration, including applying upgrades, backups, performing multiple site synchronization tasks, and platform maintenance

  • Cost of down time: does it affect 1 user or all, and what is the track record for the tool

  • Cost of data loss (what is the track record for the tools), and cost of recovery

  • Savings due to productivity increases, for all roles

  • Cost of moving from your existing platform to the new one

  • Cost of tool glue if you have to integrate multiple tools, including ongoing salaries to support the glue

  • Costs incurred due to extra platforms/licenses/administration to support poor scalability, and cost of tuning.

This list helps you to understand why the tool licenses themselves are only a portion of the acquisition cost. You may want your vendors to give you a cost percentage breakdown for a 50 person project, using all of the above factors over a 3 or 5 year period.
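
To make that percentage breakdown concrete, here is a minimal sketch (written in Python, with entirely invented figures - these are illustrative placeholders, not quotes from any vendor) of a 5-year TCO summary for a hypothetical 50-person project. The point to notice is how small a slice the licenses themselves can be:

    # Hypothetical 5-year TCO for a 50-person project; all figures invented.
    costs = {
        "evaluation":        25_000,
        "platform":          40_000,   # hardware/software to host the tool
        "licenses":         250_000,   # full ALM suite, all options
        "maintenance":      200_000,   # annual fees and upgrades over 5 years
        "training":          75_000,   # courses plus trainee salary time
        "customization":     60_000,
        "administration":   150_000,   # part-time admin effort over 5 years
        "migration":         30_000,   # moving off the existing platform
        "integration_glue":  50_000,   # building and supporting tool glue
    }
    total = sum(costs.values())
    for item, cost in sorted(costs.items(), key=lambda kv: -kv[1]):
        print(f"{item:18s} ${cost:>9,}   {cost / total:5.1%}")
    print(f"{'TOTAL':18s} ${total:>9,}")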

IV. Vendor RFIs and Responses
Once you have your requirements clear, put together an RFI, asking vendors how well they meet each requirement. Now the problem you're going to face is that each vendor will have different criteria for responding. Your requirement is clear, but "does our tool meet the requirement" may not be. And some vendors will answer "yes" if a small bit of the requirement can be met by spending a few days configuring their tool, while others will answer "no" because they only meet it 75%. Most will add comments if they fall short and answer "no", but will avoid comments if they fall short but answer "yes". As long as they can justify their response at some point in the future - if it becomes an issue - a vendor is generally OK with the response.

What I would recommend is simply rating how well they meet the requirement on a scale of 0 to 5. This doesn't really give them a chance to avoid the heart of the question. At least now, if they say "5" and you purchase the tool and it's only a "2", you have more of a legal leg to stand on. As well, the 0 to 5 scale gives you a chance to zero in on the weaker (and stronger) features of their tool more easily. You'll still get a lot of subjectivity (a 4 for one vendor is a 2 for another), but your overall impression of the tool will help you to adjust the ratings from your perspective. And as most of a vendor's responses will be a 5, if they're in the business with a good solution, you've reduced the set of issues you have to look at more closely.
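
If you attach a weight to each requirement, the 0 to 5 ratings roll up into a simple comparative score, and the low ratings give you a ready-made probe list for the evaluation phase. A minimal Python sketch, where the vendors, requirements, weights and ratings are all hypothetical:

    # Weighted roll-up of 0-5 RFI self-ratings; all data is hypothetical.
    weights = {"change packages": 5, "multi-site": 4, "web access": 2}
    responses = {
        "Vendor A": {"change packages": 5, "multi-site": 3, "web access": 5},
        "Vendor B": {"change packages": 4, "multi-site": 5, "web access": 2},
    }
    max_score = 5 * sum(weights.values())
    for vendor, ratings in responses.items():
        score = sum(weights[req] * ratings[req] for req in weights)
        # Ratings below 3 mark the features to examine closely in-house.
        weak = [req for req, val in ratings.items() if val < 3]
        print(f"{vendor}: {score}/{max_score}, probe list: {weak or 'none'}")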

Don't be unreasonable with your RFI. For example, if a vendor is a day late submitting the response, take that as an indication of their support team - don't disqualify the entire response. You're only hurting your own cause in that case. If you really need to be accountably fair (e.g. government procurement), give demerit points for late submissions rather than having a hard deadline. On one hand, the vendor has seen most of these requirements before. On the other hand, they may have a new release that required more work for the response, or perhaps they're overloaded with sales prospects.

Do not eliminate any vendor based on their response. Instead, use the rankings to order how you'll do your evaluation. You may find that a couple of vendors were too lenient in their responses. Or maybe, in fact likely, you'll find a few key questions you forgot to include, like cost of customization, data porting time, and the ability to export data from their tool. These may be significant requirements that you just missed the first time through. Don't be afraid to issue a secondary RFI.

V. In-house Evaluation
The rubber meets the road when it's time to evaluate. You will find a very wide range of evaluation requirements - perhaps wider than you were led to believe by your interpretation of the response results. Maybe it can be installed in minutes, ...after a two-week course. Perhaps the response time is sub-second, ...on the top of the line quad-core system with 16GB of memory. Did you realize that you needed such a machine to do a "proper" evaluation once all of your data was loaded?

Most likely, you'll find vendors have worked hard at making evaluation easy. It's your first real view of the tool and they want a successful outcome. Some vendors will offer to come in and help you evaluate. My recommendation: take them up on the offer, BUT only after you have had a chance to do the evaluation yourself. If you have a difficult time without them there, perhaps they can show you why when they arrive. But if they need to hand-hold all the way through the evaluation, you have a clear indication of how much consulting support and training you'll require.

Some tools will install in minutes and be fully up and running. Other tools will install in minutes and be partially up and running. You need to ensure that you can evaluate all of the major components: all of the ALM functions, their global development solution, upgrade procedures, basic customization, etc. You want a feel for how much training and how much consulting you're going to need. You also want to be able to measure the level of administration required. And, of course, you'll want to see how easy each user role is to perform with the tool.

Make sure that your evaluation is taking place with your data. It's fine to go through the checklist once with their "demo" repository. But you need to know how easy it is to load in your data, or at least a significant portion of it. And you need to look at performance with a larger, not perfectly tuned, set of data. The history graph was fine, and very pretty, with only 2 branches and 4 revisions in each. How usable is it when you scale to 10 branches with dozens of revisions in each? As well, evaluating with your own data will help others on the team to more easily participate with real scenarios.

So which tools do you evaluate first? The highest ranked tools, right? Not necessarily. You do not want to get bogged down evaluating a tool. Pick your top 3 or 5 and then address them based on how easy they are to evaluate. Some tools take weeks to evaluate, and others require extensive preparation, such as special repository platforms or special hardware. Move those to the end of your list. There is little correlation between how much preparation is required and how functional the tool is. Some good tools require significant setup. Some virtually none. Some bad tools require a lot of setup. Some virtually none. Some tools require significant training prior to evaluation. Some require a day or two. Some vendors will charge for this training, others might not. Some will charge you only if you don't purchase their tool.

As you're working through your evaluation, give your vendors a chance... let them know what you like and don't like about their solution. Maybe they'll clarify some features for you: like maybe that "Yes" for "MultiSite" really just meant that you can have a VNC connection to your server network. Or maybe a customization that you've been trying to do for 3 days really could have been done in 3 minutes if you were a bit more familiar with the tool. You'll be working with your vendor somewhat to do your initial customization - perhaps they'll throw in some free customization during your evaluation period. This is more likely to be true if customization is easy to do.

Likely you're going to have different preferences from different evaluators. A CM Manager may be very familiar with one tool, having had several weeks of training on it and having used it for several years. But perhaps another tool only requires a few days of training in total because of its more intuitive design. Perhaps a Project Manager likes one tool, while developers like another. This can be a really difficult situation. So find out why, and then bring the vendors in to speak to the situation. Maybe the developers don't appreciate that the "process-oriented" tool is actually going to save them time and effort. Maybe the project manager can be persuaded to live with the other tool if certain features are pre-configured or otherwise delivered with the solution. So turn it around and make your vendors show you how agile they are.

The biggest caveat with respect to evaluation is not to expend all (or a significant portion) of your evaluation resources with a single vendor. This can happen easily. You may have picked the wrong vendor to evaluate first or second and they may require that you do training and spend a lot of time customizing and loading your data. Having gone down that path and having spent your resources, you'll be much more reluctant to pursue other evaluations. Set aside a few days for each tool. After that time, move on to the next couple. Then make a decision which one or two you're going to dive into in more detail.

For large corporate purchases, you really should be willing to run pilot projects with a couple of vendors, possibly in sequence, unless you see a clear winner. Your key issues will likely not surface until you have run a pilot for a few months. Your evaluation results should go out to all projects for feedback. Some will agree, some won't. Get the ones that don't on side by having them point out why they don't agree, and then present a demo to them to refute the claim, or let them run a small evaluation too. If it's a close decision between two tools, you're likely not going to have a win-win situation - you'll likely have two camps, with the loser subsequently pointing fingers at the winner. But that's life. The best road forward might be to spend a bit more time focused on the "process" evolution and get the two camps to demonstrate how easily the process is going to evolve with each tool.

VI. Payment Considerations

Economy. Up. Down. It's definitely down now. It would be great to get a new tool in to make us more effective, but the dollars just aren't there. Some vendors will have a solution for you. Some won't. Push the vendors: "If your solution is so good, let us use it for a year for 10% down, and if we like it we'll buy it." They may say "no". They may say "25% down". They may say "yes".

Even if you're not economically constrained, this is a good idea. It will let you know how much faith the vendors have in their own tools. Be careful though. Some vendors may say yes, knowing full well that most of their revenue will come from training and consulting. So you need to examine these as well.

I think it's quite reasonable for a vendor to say: 33% up front, 33% after 6 months and the balance in a year, with the option to opt out at any time - covering training, consulting and license costs. Perhaps the maintenance costs can be pro-rated to 6 months for the first period - or perhaps they're already included (i.e. buried) in the initial license cost. Maybe you'll need them unbundled so that your capital costs are lower. Beware of vendors that require you to pay the full amount up front. Maybe the tool will do everything they say, but maybe it will cost you an arm and a leg in consulting services. And if you've sunk your budget up front, and now have to pay significant consulting fees or risk looking like you made a bad choice, you're not going to be happy, and neither is your boss.

Vendors know that CM/ALM is a significant backbone application. They should be more than willing to help you through a pilot project with partial payment up front. If you're acquiring for a large project or for a corporate standard, let the vendor know and negotiate a "pilot" project price and payment schedule. And let them know that you're not willing to pay more than a certain amount for customization and consulting. If they can't meet the terms, you really have to be somewhat suspicious of the technology, or at least of the ultimate cost of the solution.

What if they guarantee that their solution is best? Well, be careful about the terms of the guarantee. Some vendors are willing to give away the tool (i.e. give you your money back) while they collect training and consulting revenue. Now they may get a bad name if you really don't like their solution, but there may be a legal clause covering publication of your negative impressions. Still, some guarantees are a bona fide claim that a vendor believes their tool is the best. If so, you'll also notice a willingness from them to admit that their tool is perhaps not the right choice for your specific requirements.

THE BOTTOM LINE
The bottom line is: don't use up your budget and resources until you're sure you have a good solution. It may not be perfect and maybe you end up with the second best instead of the best fit, but make sure you have a good fit before all of the money is placed on the table. If a CM/ALM vendor thinks that's unreasonable, tell them you're not interested. They want the sale and will either admit that their tool has some issues with respect to your requirements, or will propose reasonable payment terms to you.

Go into an evaluation prepared. Pay close attention to the big items. Make sure the tool is flexible enough so that you don't have to pay too close attention to the little items. Make sure you can get into the new tool easily, but also make sure it's easy enough to get out if things don't work out.

Look at your vendor as a partner. You'll be dealing with a small number of vendor personnel, whether a large company or a small company. Make sure they understand you want to minimize risk. They likely do too because every successful implementation of their tool is free marketing and a set of potential reference customers.

CM REQUIREMENTS
Apart from the article proper, I've tried to put together a number of CM/ALM tool functional requirements that may be critical to successful acquisition. This list is not complete. But it will likely contain numerous items you have not yet considered.

First Generation CM Requirements
First generation (1G) CM tools were available through the 1970s and 1980s, but their use continued well into the new millennium (e.g. Visual SourceSafe). Such tools were file-based and focused on basic checkout/in and build capabilities. Some of the requirements included:

1 Item and Revision Identification: Uniquely identify all repository items and consistently identify item revisions.

2 Check-out and Check-in with comments: Check files in and out of the repository and add "reason" comments to the checked out/in files.

3 Exclusive Check-out: Checkout of a file by one person can preclude concurrent checkout of the same file by another person.

4 File Retrieval (aka. R/O checkout): Retrieve either the latest or an earlier revision of a file from the repository to a workspace directory.

5 Revision Comparison (aka. Diff / Delta): Comparison of line differences between the workspace file and the latest repository file, or between any two revisions of the file (see the sketch following this list).

6 Baseline Definition Capability and Reproducibility: Define (and name) a consistent set of file revisions as a baseline such that the same set of file revisions can be retrieved repeatedly.

7 Basic Build/Make Tool Support: Ability to transform the files in a workspace into a working build, using third party compilers/linkers/etc.

8 Basic File Merge Capability: Ability to merge differences between two files into a third file.

9 Basic Scripting (usually OS Scripting Language): Scripting capability which allows automation of build, retrieval of a set of files, basic information reporting.

10 Basic Branching Capability: Revision identification allows definition of branches within an otherwise sequential set of revisions of files. This allows parallel development/tracking of changes for a file.

11 Basic Branch/Revision Reports: Ability to report on existing branches and revisions of a file (file history), as well as the contents (i.e. file revisions) of a baseline.

12 Consistent Backup: A consistent backup capability for the entire CM repository is supported.
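
Requirement 5 is easy to picture using Python's standard difflib module. This is only a minimal sketch with hard-coded, hypothetical revision contents; a real 1G tool would first retrieve the two revisions from its repository:

    # Line-level comparison of two file revisions (1G requirement 5).
    import difflib

    rev_1_1 = ["int main() {", "    return 0;", "}"]
    rev_1_2 = ["int main() {", "    puts(\"hello\");", "    return 0;", "}"]

    # Emit a unified diff between the two revisions.
    for line in difflib.unified_diff(rev_1_1, rev_1_2,
                                     fromfile="foo.c@1.1",
                                     tofile="foo.c@1.2", lineterm=""):
        print(line)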

Second Generation CM Requirements
Second Generation (2G) CM had peak usage in the 1990s-2000s, and moved CM solutions ahead significantly. As vendors struggle to reach 3G capabilities, 2G tools will continue to be used well into the 2010s. 2G CM tools begin to integrate more of the ALM suite into the solution. This includes Problem Tracking, Requirements Management and Change Management. In particular, change packaging is a significant move forward in the 2nd Generation, allowing changes to be viewed logically, rather than file by file. Today, the vast majority of CM solutions are based on 2G tools (though some still cling to 1G tools).

Key 2G CM tool requirements include:

1 Unix and Windows Concurrent Platform Support: Support for clients must include, concurrently, both Unix/Linux and Windows platforms. Ideally, the same holds for servers, but this is not a hard requirement of 2G tools.

2 Scalability of Solution to hundreds of users: Hundreds of users are able to share a CM repository with reasonable performance characteristics, and ideally without having to partition the repository to have acceptable performance.

3 Change-packaging capabilities (aka. Updates, Changesets): Collecting the files which implement a logical change into a permanently defined change package; the ability to check in and do delta reports on a change package basis; the ability to promote changes, rather than files; the ability to define baselines based on change status rather than individual file status. Ideally, this capability is central to the CM tool and does not require a separate database/user interface to maintain change definitions (a sketch of this model follows the list).

4 Revisioning of Directory Structure: As product development evolves, the CM tool is able to track which directories are used to hold files in both newer and older baselines, and ideally within any view of the repository.

5 Automated Baseline Definition Support: The CM tool must support automation of baseline definitions using change package definitions/promotion levels and other related change tracking data. Baseline definition should not be a tedious, manual operation.

6 Sharing of revisions across development Releases/Streams: The CM tool should not force copying of all files in order to support a new release or development stream. Instead, fixes made to older streams should apply to newer streams if the file has not yet been modified in the newer stream.

7 Software Bulk-Loading Capability: The CM tool provides a means to load in entire product file sets, or at least to load in larger portions of a product in a single operation, rather than the file-by-file method of 1G systems.

8 Parallel Checkout: The CM tool supports the option of using parallel checkouts within a release or stream. In this scenario, even within a given development stream, the option can exist to permit more than one person to make changes to the file concurrently. Ideally, the CM tool detects when this happens and notifies the user checking in code whenever a parallel checkin conflict arises (i.e. checking in files that have been modified by others since being checked out).

9 Branch Label Management or Equivalent: The ability must exist to clearly label branches to support the CM branching strategy of the project. Whether arbitrary branching is used or stream-based branching, labelling must support all of the branching functions (e.g. parallel checkouts, parallel development streams, promotion levels, build/baseline definitions, etc.). Ideally, first order objects other than branches exist to support most of these functions.

10 Support for Makefiles and Build Scripts: The CM tool not only stores Makefiles and build scripts, but also can support creation of these based on the set of files in a product. Often this capability is inherent in the related IDE (Integrated Development Environment) tools.

11 Distributed Build Capability (possibly thru tool integration): The CM tool allows very large products to be built across multiple machines to reduce build times. Ideally, the distribution of compile and link operations is performed automatically by the CM solution.

12 Integration with MS SCC compliant IDEs: Microsoft has defined a "de facto" Source Code Control Interface supported by many IDEs and CM tools. Although this interface has changed significantly over time, CM tools should be able to support the majority of SCC compliant IDEs on the most common versions of the interface. This allows the IDEs to plug in to the CM tools which support the MS SCCI. Note that Microsoft has not declared this a standard.

13 Basic Workspace Support: A 2G tool allows you to populate a workspace easily, to compare the contents of the workspace to a specific view (e.g. baseline) within your repository, and lends support to automating the synchronization of differences found in such comparisons. Ideally the CM tool allows for easy synchronization on a daily basis, but also allows for isolation from changes being deposited to the repository.

14 Integrated Problem/Issue/Task Tracking: Problem tracking must be integrated to the extent that problem reports can be traced against change packages, and the states of these problems can be automatically promoted as change status is promoted. Similarly, other tasks/feature tracking capabilities should be integrated so that full traceability from change packages to "reasons" for the change is possible. It must also be possible to generate reports and queries which can translate from problems/features/tasks to change packages, and vice versa.

15 Multiple File Retrieval: The CM tool supports retrieval of multiple files at a time. This includes the ability to retrieve either an entire product source tree, or at least entire source directories, in a single operation. Such retrieval should be possible for any baseline or for any "view" of the CM repository. Retrieval of files of a change package is also a frequently required operation. Restriction based on file type, or based on wild cards, is another important aspect of this requirement.

16 Scripting and Basic Configuration Capability: A 2G CM tool has significantly more functionality than a 1G tool. Because of this, more customization is required by each project. So 2G CM tools need some scripting and other customization capabilities to enable a project to support usage, data and process requirements specific to each project. As well, scripting is often required to support implementation of triggers and integration of tools.

17 Basic Rules/Triggers Capability, including email: A 2G CM tool allows extension of process definition through rules and triggers. These are often used to integrate process across tools or to signal information to other users via email.

18 State-based Promotion Model: A 2G CM tool defines states or promotion levels for change packages (possibly as file states, but ideally not), for problem reports, and for features/tasks. These can be customized to the requirements of the project and the model is generally expressed through a state/transition diagram or equivalent. This allows state-based tracking of product/project status.

19 Basic Repository Data Security: Access to the CM repository must be controlled. Both access to data and to the ability to change the data needs to be controlled.

20 Graphical User Interface: The command line interface (CLI), while still important, gives way, in normal usage, to a GUI-based interface in 2G tools. The GUI needs to be sufficiently useful and intuitive that there is a tendency to migrate to it from the CLI interface. Ideally, a substantial amount of GUI customization can be done with a reasonable amount of effort.

21 Context View Capability: The 2G CM tool allows the user to specify a "context" through which to view files (and perhaps other information). This eliminates the need to specify specific revisions of files within the tool. Instead, the CM tool uses the context to determine which revision is implied. Context can be static (e.g. a baseline) or dynamic (e.g. the latest checked-in version for release 2).

22 Basic Reporting Capabilities: Reporting capabilities extend beyond basic file history and baseline definitions, to cover change summaries, problem and feature reports and requirements information. Better tools will permit interactive queries with drill-down capabilities. Ideally, most reports do not require an "expert" to specify or run the report.

23 Graphical Differencing/Merge Tools: Difference and merge tools supported with the 2G CM tool are more intuitive to read and are interactive, in the case of merge tools. Typically, color is used to highlight differences, and menus/mouse clicks are used to navigate differences and to support merge operations.

24 Remote Access Capabilities: Support for accessing and using the CM repository from outside its resident internet domain is provided for by 2G tools. This requirement can usually be fulfilled outside of the CM tool proper (e.g. VPN).

25 Comprehensive On-line Help: On-line help is required for all CM tools, and with the additional capabilities, even more so. Ideally, on-line help can be easily customized to reflect tool customizations made by the project.

26 Graphical Navigation of Source Tree and History: The GUI-based interface must support navigation of the source code tree (in a directory/file manner) as well as navigation of source history for individual files.

27 Distributed Development Support: Basic means of distributing development geographically. Changes between sites need to be synchronized, or at least sent to a central site, with reasonable frequency (e.g. daily or weekly).
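
Change packaging (requirement 3) is the defining 2G concept, and a few lines of Python make it concrete. This is a minimal sketch under invented names - not any vendor's actual schema - showing a logical change that groups file revisions, carries a "reason" and a promotion status, and lets a baseline be derived from change status rather than file-by-file status:

    # Sketch of a 2G change package; all names and states are hypothetical.
    from dataclasses import dataclass, field

    @dataclass
    class ChangePackage:
        change_id: str
        reason: str            # the problem report or task it implements
        status: str = "open"   # open -> checked_in -> tested -> released
        revisions: list = field(default_factory=list)  # (file, revision)

    change = ChangePackage("C1042", reason="PR-317: crash on empty input")
    change.revisions += [("parser.c", "1.4"), ("parser.h", "1.2")]
    change.status = "checked_in"

    # Baselines defined by change status rather than individual file status:
    changes = [change]
    baseline = [rev for c in changes if c.status == "checked_in"
                for rev in c.revisions]
    print(baseline)   # [('parser.c', '1.4'), ('parser.h', '1.2')]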

Third Generation CM/ALM Requirements
With third generation (3G) CM tools, we move more dramatically beyond functional improvements to operational cost reduction and ease-of-use issues, while expanding into full ALM coverage. Although peak usage of 3G CM solutions is not expected prior to the 2010s, there are mature solutions available today. There is also strong impetus to move to a 3G solution because of the reduced cost of operation and the better acceptance by users. 3G tools expand the CM user base both horizontally, as additional applications are added in, and vertically, as ease of use supports use by executive management and administrative staff, as well as by the traditional technical staff.

3G CM tools are a significant advance in technology, even more so than the jump from 1G to 2G. However, because of the advances in technology, support requirements are reduced, allowing improved pricing over 2G tools and lower budgets for internal CM operations. This may be somewhat offset by the extended breadth of coverage, which requires additional process specification and training, covering a new class of users.

A brief description of some key requirements follows:

1 Low Administration: Internal support teams for CM administration should be small for 3G tools, ranging from a part-time position for smaller projects, to a 2-person team for very large projects.

2 Fully Interoperable Windows/Unix, Big/Little Endian: Interoperability between Windows and Unix/Linux extends to both servers and clients, with equivalent functionality available across platforms. Support for easy migration between big and little endian (typically Intel and non-Intel) platforms must be supported.

3 Platform-independent scripting integrated with repository data: Scripting must not be dependent on the platform used. It is expected that the underlying platforms will change at least once during the lifetime of the CM tool for your project.

4 Fast Roll-out and Upgrade Capabilities: 3G systems are easy to evaluate and roll-out. Similarly, upgrades are relatively painless and typically do not involve any significant down time.

5 Seamless Integration of CM/ALM Applications: In a 3G CM tool, applications are seamlessly integrated so that a single user interface, a single set of base training, a single repository and a common process engine are used. Forms, reports, customizations and administration are no longer managed in an application-specific manner, but in a common manner across the life cycle. Administration and upgrade processes are dramatically simplified.

6 Extensive High-Level Configurability: Process, GUI, Schema: 3G CM/ALM tools provide relatively simple customization capabilities, allowing the user interface, the CM data and the CM/ALM processes to precisely track the corporate models.

7 Easy Bulk-loading Capability for End Users: The user should be able to load in data easily. This supports evaluation in the context of the customer's data as well as rapid capture of existing data.

8 Stream-based automated branching: Sufficient mechanisms are provided by the CM tool so that branching is not overloaded and can be used to support stream-based development, where predominantly, there is a single branch (or at least a small fixed number of branches) per development stream.

9 Change Package-based CM Model and Processes: The CM tool not only supports change packages, but the entire process embedded in the tool supports the change model as well.

10 Change-based Promotion Model: There is a change-based promotion model which is used to drive promotion of changes through the system. This can be used to automate the build process by harvesting changes at a given promotion level for inclusion into a nightly build (see the sketch following this list).

11 Automatic Build/Make/ANT File Generation: Ideally, a 3G CM tool supports the automatic generation of build and Make files.

12 Workspace synchronization/rebasing automation: Workspace synchronization is automated once initiated by the user. This may involve some user interaction to resolve merge conflicts.

13 Interactive Build/Release Comparisons: The user interface allows the comparison of builds and/or releases from the perspective of change packages, source code differences, problem/feature differences, and requirements addressed. It should be possible to summarize and zoom in to such content/differences.

14 Support for Multiple Baseline Bulk-loading: Loading of multiple baselines of data from the pre-existing CM tool is supported. As such, the tool does not have to be used only going forward, but can replace virtually all usage of pre-existing tools.

15 Data Filtering/Find on Data Browsers: Data browsers within the tool support some level of filtering or "find" operation.

16 Queued Exclusive Checkouts: Exclusive checkouts can be queued in the case that a file is already checked out.

17 Eclipse Integration: A 3G tool should support integration with Eclipse development environments.

18 Rapid Performance: A 3G CM tool must be responsive. It must be possible to use it during meetings without delaying proceedings. A poor performing tool dramatically reduces the user-friendliness of the tool, and this is unacceptable for 3G solutions.

19 File System Browser Integration: At some level, it should be possible to view file revisions in the repository directly from the operating system platform. Typically, 3G capability might be restricted to a file explorer view, or might automatically mirror various CM configurations in real file system directories.

20 Formal Support of Stream-Based Development: A 3G tool must permit a natural development cycle which meshes with the marketing releases of the products under development. As most products follow a repetitive release cycle, the CM tool must support an organization which clearly maps development efforts, planning data and file branches onto the streams of release development.

21 Minimizing Branch/Merge - Elimination of Labeling: Branching and merging are rationalized in a 3G tool by the introduction of first order objects and mechanisms which support various CM operations without the need to use branching. Branching should be primarily reserved for parallel development (ideally, parallel development streams only), providing an intuitive branching strategy. Labelling of branches, changes, file revisions, etc. is now inferred from user context, user operations and traceability data rather than through a manual labelling technique. All "labels" are now automatically supplied by the tool.

22 Integrated Process Workflow Capability: The state-based process model of a 2G tool is expanded to full state-based object flow, with roles, permissions, rules, triggers and tracking, and to workflow capabilities across objects.

23 Access Control Beyond File System: Fine grained access control is provided based on roles defined in the CM tool. This goes beyond the traditional file system (owner, group, world) access control, to very specific role-based, and even user-based, permissions.

24 Multiple-site Distributed Development Capability: Distributed development is supported, while maintaining the ability to create consistent backups. Distribution of data applies to all elements of the ALM, not just source code.

25 Scalability to hundreds per server/platform: The CM tool should not require more than one server at a single site unless several hundred users are using the CM repository at that site.

26 End-to-end Traceability (Requirements to Builds/Test Cases): Traceability navigation is supported through various browsers such that it is always easy to map between one set of artifacts and other related artifacts. This includes requirements traceability to test cases and actual test results, from build definitions to feature and defect content, from source code to change packages and back to requirements, etc.

27 Advanced Data Import Capabilities: It must be easy to import data from existing solution components. This includes source code, documents and data (e.g. problem reports, tasks, etc.).

28 Project Management with Gantt Charts and WBS Support: A basic project management capability must support Agile development (priority-based feature/task driven development), preferably with charting to show plans and progress. As well, the tool supports a work breakdown structure (WBS) so that a project may be easily decomposed into workable units.

29 Full ALM Suite, from Requirements Tracking through to Test Suite Management: The CM/ALM suite must cover version control, change control, document management, requirement tracking, test case management, build and release management, problem tracking, feature/task management and ideally can be extended to cover other development areas (e.g. lab time assignment).

30 Real-time Metrics to Support Decision Making: A 3G tool must allow easy harvesting of up-to-the-minute metrics that can be used for risk assessment, quality assurance and decision making.

31 High Reliability and Availability: The CM/ALM repository content must be available at all times (less than 24 hours of outage per year). A larger outage is permissible for the repository content change capability (e.g. it may be disabled for consistent backups), providing it does not impede the project significantly.

32 Data Transaction Journaling and Data Recovery Capabilities: All changes to the repository, source code or otherwise, must be clearly tracked so that it is always possible to know who did what, and when. There must also be support for recovery in the case of a disk crash or other repository corruption. If transaction journals remain intact through such problems, it should be possible to recover without any loss of data.

33 Advanced Backup and Redundancy Capabilities: As data volumes can grow significantly, advanced backup techniques must be used so that full backups can be performed in reasonable time frames. This is especially true if changes to the repository are disabled during backups. There should be sufficient redundancy such that operation of the CM/ALM environment can survive disk crashes/errors, network problems, etc.

34 Web Access Interfaces: At a minimum, a web interface must allow access to information within the repository to support field personnel, or specific contractor or customer access. Controls must be sufficient to adequately ensure that access is restricted according to the user. Ideally, updates to the repository (e.g. feature requests, problem reports or replies) can be performed through the web interface.

35 Flexible Reporting and Interactive Query: A 3G CM/ALM tool must support flexible reporting, within a specific application and across applications. It must be possible to produce both summary and key detailed reports directly from the tool (although in some cases some scripting may have to be put in place ahead of time to define the reports). As well, the tool must provide various interactive query capabilities so that high level data displays may be drilled down to reveal details, and so that traceability information can be traversed directly through mouse clicks.

36 Selectable Differencing/Merge Tools: As various developers and technologies have preferences for how differences are treated, it should be possible to select the difference/merge tools that a user will use with the overall CM/ALM tool. However, the CM/ALM tool must provide generally adequate tools out-of-the-box.

37 Basic Management Dashboards with Drill Down: A 3G CM/ALM tool must provide dashboards to present overall status for various roles. There should be, at a minimum, product, project and configuration management dashboards which support the roles of the product manager, project managers, and configuration manager. Dashboards should present appropriate summary information into which the user can drill down to get additional information.
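
Change-based promotion (requirement 10) is one of the easier 3G concepts to sketch. Here is a minimal Python illustration, with a hypothetical promotion ladder and hypothetical change records, of harvesting every change at or above a given promotion level into a nightly build:

    # Harvest changes for a nightly build by promotion level (hypothetical).
    LEVELS = ["open", "checked_in", "reviewed", "ready_to_build", "released"]

    changes = [
        {"id": "C201", "status": "ready_to_build"},
        {"id": "C202", "status": "reviewed"},
        {"id": "C203", "status": "ready_to_build"},
    ]

    def harvest(changes, level):
        """Select all changes promoted at least to `level`."""
        cutoff = LEVELS.index(level)
        return [c["id"] for c in changes
                if LEVELS.index(c["status"]) >= cutoff]

    print(harvest(changes, "ready_to_build"))   # ['C201', 'C203']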

Fourth Generation CM/ALM Requirements
The future of CM/ALM tools is in the fourth generation (4G). Although peak usage for these tools may not be attained until well into the 2020s, there will be at least one 4G tool available before the end of the decade. If you are looking at tools, an understanding of 4G requirements will help you to measure existing tools and their potential to evolve. As well, many tools already meet some 4G requirements.

Fourth Generation (4G) CM Tools (peak usage forecast 2020s-2030s)

1 Small Footprint: Despite growing hardware capabilities, a small footprint for a CM/ALM tool is a strong indicator of vendor longevity. It is also a strong indicator of tool simplicity, which is ever more important as functionality continues to increase through the generations.

2 Zero Administration Operation: For the most part, an administrator is not required for a 4G tool. A part-time administrator may be required to deal with specific issues (e.g. applying an upgrade), but these should be intuitive and require minimal manual intervention other than high level directives (e.g. verify synchronization of all sites).

3 Scalability to thousands of users per server/platform: A 4G system must support thousands of users across an enterprise, as organizations begin to consolidate multiple projects/products into a single repository to leverage re-use, process engineering and data mining.

4 Fully Synchronous Multi-site: Multiple sites must be able to synchronize themselves without manual intervention. Near-zero maintenance should apply even to distributed development support. Automatic recovery from network outages must be supported. Data segregation (e.g. ITAR) must be fully supported within the framework of having multiple sites.

5 Full Interoperability between 32- and 64-bit Platforms: As a mixture of platforms will persist for quite a while, especially across geographic sites, the 4G tool must support full interoperability between 32- and 64-bit platforms. There must not be any significant effort required to switch a server or a client between platforms, and each must be able to concurrently interoperate with the other.

6 Unified Configuration of all Native and Web interfaces: In cases where the web interface technology is different from the native platform interface, it must be possible to use a single configuration specification for those parts of user interface common to both web and native platforms.

7 Server-only Installation (only trivial Client Upgrades Required): Visible client installation (other than establishing a short cut or other reference to the 4G tool) must be absent. Client side upgrades must follow directly from the central server(s) upgrade, without any client-side intervention.

8 Trivial Bulkloading and Multiple Revision/Baseline Bulk-loading: It must be possible to load in software trees by a simple drag-and-drop, copy/paste or other such trivial mechanism, although additional supporting details may be queried as part of the operation. It should be possible to bulk load multiple baselines of a source tree such that the history between the baselines, and between the individual items within the baseline, are maintained.

9 Fully Automated Configuration Management: Configuration management tasks should be reduced to high level requests. Tedious tasks such as labelling, creating baseline definitions, establishing context views, creating promotion views, etc. must be fully automated, subject only to the parameters required by the high level requests. Users, including the CM managers, should focus only on change management.

10 Bulk Build/ANT/Make File Generation: Automatic generation of Makefiles, build scripts and ANT files should replace the need to create and maintain such files. Generation of the files should proceed based on having sufficient information within the repository, and especially in the data structure and relationships.

11 Advanced Workspace Management: Workspaces are clearly linked to change packages. Multiple changes are supported in a workspace. The 4G tool reports, through visual or other effects, on the attributes of the workspace so that, as well as check-out/in status, attributes such as read-only/read-write state, missing files and file differences are actively portrayed in the source tree view of the CM tool.

12 Source Code and Source Revision Searching: Source code search capabilities are extended in the 4G tools so that operations common in IDEs, such as source code searching, can be extended to the CM repository. This allows, for example, searching across revisions of a source file, or searching through a specific baseline or other context view.

13 File Revisioning augmented with Full Data Revisioning: Along with the normal source code revisions, the 4G CM tool allows revision of finer levels of data including descriptions, data elements and even process workflow.

14 Context-based dependency analysis and layering support: The 4G CM tool allows interactive query of relationships between groups of files to support impact analysis and layering support. For example, it should be possible to identify, using a simple query, whether or not layering is violated because of out-of-layer "include" operations.

15 Promotable Directory Structure Changes: In the same way that file changes may be promoted or rolled back through the status associated with the change, it must be possible to promote or roll back changes to the directory structure. By adjusting a context view, any changes placed into or pulled out of the view must have such structural differences immediately reflected in the view.

16 Dynamic Variant Capabilities: Variant changes are supported such that the change can be dynamically applied to any context view. Hence if a product variant is formed by changing a set of files, that variant may be applied automatically, and dynamically, to any build definition, any user view or any baseline to produce the variant effect for that view. This helps to reduce multiple variant baselines down to a single baseline with variant options which may be specified at build time.

17 Product/Sub-product Management: The 4G CM/ALM tool will manage multiple products, including products which are contained within other products. Such a product hierarchy helps to automate the order of building products (i.e. product dependencies), but also helps in navigating the set of product assets of a company so that a specific product release context may be easily selected.

18 Rename Operation Preserving History: Renaming of files which are part of a configuration must preserve the history of existing configurations, while at the same time allowing the history of the file evolution to be preserved.

19 Update Based on Workspace Changes: A 4G CM/ALM tool can look at a workspace and automatically generate an update package, including checkouts and structural (i.e. directory) changes, that is ready for peer review and subsequent check-in. As such, directory-centric development (in addition to repository-centric development) is fully supported.

20 End-to-end Impact Analysis: From a line of code back to the potentially affected requirements, from a failed test case to the potential set of requirements, from a requirement change to the potentially affected set of documents, source code and test cases, the CM/ALM tool provides impact analysis capabilities to pare down the analysis activity (see the sketch following this list).

21 Configurable Unified Process Support: The 4G CM/ALM tool includes unified process support which is easily customized for each organization or project. The support is apparent both in work flow and in on-line process documentation that is easy to navigate.

22 Integrated RAD Capability to Extend Integration Set: It is possible to extend the ALM suite to cover other aspects of the product or project. A rapid application development capability within the engine of the 4G tool allows additional applications to be created and seamlessly integrated into the suite. Example applications might include: time sheet management, sales prospect management, lab resource usage, meeting management.

23 Change and Revision Control of Requirement Items: Requirements tracking includes full revision control of requirements as well as change packaging for requirements. The change package becomes the central point of traceability for the collection of changed requirements.

24 Project and Quality Metrics and Forecasting Capabilities: The CM/ALM tool supports the production of a wide variety of project and quality metrics, as well as customization capabilities for new metrics, and allows some level of project forecasting based on these measures.

25 Customer Request Tracking: The 4G CM/ALM tool supports customer request tracking with the ability to report to a customer the status of all completed and outstanding (i.e. not delivered) requests.

26 Peer Review Tracking Support: The 4G CM/ALM tool supports on-line peer reviews through interactive change-centric review panels which allow navigation through difference/delta displays and identification of actions and issues resulting from the review.

27 Test Run Management and Metrics: The 4G CM/ALM tool tracks the results of test runs against various builds so that it is possible to identify the builds against which tests last passed or failed, and so that sufficient metrics can be provided to indicate both the suitability and the expected time frame for release of a build.

28 Data Management: The 4G CM/ALM tool provides full data management for development, including traceability and the ability to easily adapt the schema to track any and all significant project and product data.

29 Dynamic Management Dashboard Capability, Customizable: 4G CM/ALM dashboards are dynamic so that they can traverse products, builds, releases, streams or any other potentially dynamic selection criteria (i.e. without having to re-launch the dashboard). In this sense, it is possible to navigate through project or product status across releases, or to navigate through a list of build comparisons. Dashboards are easily customized to have exactly the information desired by the role or by the activity being performed.

30 Electronic Authorizations: Electronic authorizations, including signing of reviews or documents, are supported across the entire suite of ALM applications.

31 Warm-stand-by Disaster Recovery: The 4G CM/ALM tool supports warm standby disaster recovery so that a disk crash, an explosion or a natural disaster does not impede the work of clients - a manual or automated redirection of the client/server connection leaves the client with the ability to continue his/her work without any loss of progress or context.

32 Checkpoint/Recovery Capability: The 4G CM/ALM tool allows relatively small checkpoints to be created and used as consistent recovery points. Ideally, transactions created after the checkpoint can be applied automatically to the checkpoint after recovery if desired.

33 Ultra High Reliability and Availability: Data is available 99.95% of the time, with down time of less than 4 hours per year.

34 ITAR Data Segregation: Both physical and logical data segregation are supported so that a common set of data can be shared with proper regard to which data is visible to which users.

35 Recovery from Malicious/Subtle Data Corruption: The 4G CM/ALM tool supports identification of and full recovery from data sabotage or subtle data corruption, with minimal loss of data or effort.

36 Proven Longevity of the Tool on Projects (12+ years): The 4G CM/ALM tool has proven its longevity with at least a dozen years of operation on projects of significant size and constant activity.

37 Extensive Report Formats (XML, Spreadsheet, HTML, Text, etc.): Reporting capabilities are extensive to support intranets, data export and various other reporting needs.

38 Interactive Browsers (Hyperdata, Tree-browse, form browse, etc): The 4G CM/ALM tool includes a variety of history, tree, hyperdata and form-based browsers to interactively navigate the wealth of data.

39 Executive Summary and Interactive Drill-down Capabilities: Interactive abilities include drill-down capabilities on charts, graphs and summary displays, down to the finest details.

40 Security - File Access Logging: The CM/ALM tool supports the option of full file access logging so that it is possible to identify both who has modified and who has accessed a file managed within the tool.

41 Pre-populated Role-based Information Tabs and/or Dashboards: The 4G CM/ALM tool starts up with a configurable set of pre-populated role-based information sets, each a single click away. This allows instant access to the most frequently used summaries and information displays without having to request the information.

42 Configurable, Organized Role-based Information: The 4G tool has a role-based, intuitive interface showing a user, based on his/her role(s), the set of tasks, assignments, plans, notices or other information particular to that user. The user is able to navigate his/her various to-do lists and review them in a prioritized fashion.
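
End-to-end impact analysis (requirement 20) reduces, at its core, to walking a traceability graph. A minimal Python sketch, with an invented set of traceability links, of collecting everything potentially affected by a change to one artifact:

    # Transitive impact analysis over traceability links (hypothetical data).
    traces = {
        "REQ-12":       ["DOC-3", "SRC-parser.c", "TEST-45"],
        "SRC-parser.c": ["CHANGE-C201"],
        "CHANGE-C201":  ["BUILD-1088"],
    }

    def impacted(artifact, seen=None):
        """Collect all artifacts transitively reachable from `artifact`."""
        seen = set() if seen is None else seen
        for nxt in traces.get(artifact, []):
            if nxt not in seen:
                seen.add(nxt)
                impacted(nxt, seen)
        return seen

    print(sorted(impacted("REQ-12")))
    # ['BUILD-1088', 'CHANGE-C201', 'DOC-3', 'SRC-parser.c', 'TEST-45']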

The End... or Not The End
Does this leave anything for the 5th generation? Believe me, there's still a long way to go after attaining even a 4G CM/ALM solution. However, beyond the 4th generation, CM/ALM reaches out far beyond the world of development and technology.

I've laid out a rough guide for selecting tools. I've categorized requirements by the CM/ALM Generation in which I think they belong. I'm sure others have additional requirements, or perhaps finer details for these. I'd love to, in fact I'm sure everybody would like to, see a matrix of existing tools against these requirements. I'd love even more to see a matrix of vendor plans for 2010 against this set of requirements.

If you are doing a review of tools and would like to measure tools against these requirements, we'd love to collect the results by Tool.YearOfRelease. If I collect enough, I'm willing to find someone to support an on-line spreadsheet of the results.


Joe Farah is the President and CEO of Neuma Technology. Prior to co-founding Neuma in 1990 and directing the development of CM+, Joe was Director of Software Architecture and Technology at Mitel, and in the 1970s a Development Manager at Nortel (Bell-Northern Research), where he developed the Program Library System (PLS), still heavily in use by Nortel's largest projects. A software developer since the late 1960s, Joe holds a B.A.Sc. degree in Engineering Science from the University of Toronto. You can contact Joe by email at farah@neuma.com.