Neuma White Paper:
CM: THE NEXT GENERATION of Seamlessly Integrated CM/ALM Suites
There are a lot of CM/ALM solutions out there - so where are we headed? In my opinion, the future of CM/ALM will be defined by the level of tool suite integration, more than by any other factor, in the 3rd and 4th Generations of tools. And basic "integration" will not cut it. Putting tools together into a single package with some glue and triggers to help them interact is helpful, but will fall short of market demand. "Seamless Integration" will be a requirement. No advance is more important to the next generation of CM tools. What about cost? What about ease-of-use? How about traceability? The answer is simple: First Seamless Integration, and the rest will follow. If you haven't seen a Seamlessly Integrated tool suite, you may not fully grasp this statement. But Seamlessly Integrated tools are the ones that will dominate in the future. Let's look at why.
What is Seamless Integration? Let's start out by considering solutions which integrate tools from various vendors, or even various tools from the same vendor. Some do a nice job. But the honeymoon ends after the initial shipment. These tools are glued together and it's not easy to change the glue. If you want to expose a new feature from one tool to another, it's a lot of hard work. If you want to upgrade one of the tools, you're probably better off waiting until someone else has done it first, or until a new integrated suite version is released. It's not only risky, but difficult. Not only are the tools dependent on one another, but to a large extent, they're dependent on the specific release features that have been integrated. Change a feature and you can break the integration.
So then, what is Seamless Integration? With seamless integration, the user sees one tool. The database is a single database shared by all applications. There's one, consistent user interface across all applications in the suite. Changes to one application are visible to all the other applications. Releases are done by tool suite, not by tool. And the glue that otherwise is used to hold the pieces together, is replaced by a process and data engine, with rich query, change and data navigation.
Typically in a seamlessly integrated suite of tools, both the set of applications and the set of functions available to each application are customized to the user's roles. Data navigation is not restricted to a single application, but flows as necessary between applications. On the infrastructure side, there's one consistent set of capabilities, whether it be backups, multiple site operation, workflow capabilities or reporting.
I've seen dozens of CM requirements documents requiring integration of the CM tool with a particular Problem Tracking tool or Requirements Management tool. And as a vendor it's important that we can put a checkmark beside that requirement. However, almost continuously over the past 25 years, I've had the privilege of using CM/ALM tools with seamlessly integrated applications - even going back to the days well before the GUI came into being. I would not consider using or recommending tools with traditional loose integration - it's like stepping backwards in time. With traditional integration, the focus stays on supporting processes (rather than advancing them) - getting the tools to talk sufficiently to each other to support the process, and then maintaining that level of interaction over time.
One of the most fundamental capabilities of a seamlessly integrated CM tool is an unprecedented level of traceability. This assumes, of course, that the process and data schema support such requirements. But when they do, the expressiveness of the query capabilities is generally sufficient to allow, not only reporting of traceability information, but point-and-click navigation of the same. Some tools do this better than others. I find it is essential to be able to take a build, and in a single click generate a prioritized set of testing activities which address each problem report or feature addressed by that build. Equally important are the following types of tasks:
- Generating a list of the problems fixed and features addressed by a build
- Identifying which test cases have been run against a build, and which have succeeded
- Identifying which requirements are missing test cases
- Easily translating from a line of code to a change, and to the associated change request or requirement which produced it
- Rolling up Gantt charts and producing risk reports based on the actual time sheets against each WBS activity
- Generating a quarterly report for each customer on the state of that customer's change requests, both problems and features
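The kind of single-click traceability described above falls out naturally when all of the applications share one repository. The following sketch shows the idea against a deliberately simplified schema; the table and record names are illustrative assumptions, not any particular vendor's data model.

```python
import sqlite3

# Build a tiny in-memory repository shared by the "CM" and "test"
# applications: changes link builds to problem reports, and test runs
# link builds to test-case results.
db = sqlite3.connect(":memory:")
db.executescript("""
CREATE TABLE change  (id INTEGER PRIMARY KEY, build TEXT, problem TEXT);
CREATE TABLE testrun (id INTEGER PRIMARY KEY, build TEXT, testcase TEXT,
                      passed INTEGER);
INSERT INTO change  VALUES (1, 'B42', 'PR-101'), (2, 'B42', 'PR-102');
INSERT INTO testrun VALUES (1, 'B42', 'TC-7', 1), (2, 'B42', 'TC-8', 0);
""")

def problems_fixed(build):
    """List the problem reports addressed by a given build."""
    rows = db.execute(
        "SELECT DISTINCT problem FROM change WHERE build = ? "
        "ORDER BY problem", (build,))
    return [r[0] for r in rows]

def failing_tests(build):
    """List the test cases that ran against a build and failed."""
    rows = db.execute(
        "SELECT testcase FROM testrun WHERE build = ? AND passed = 0 "
        "ORDER BY testcase", (build,))
    return [r[0] for r in rows]

print(problems_fixed("B42"))  # ['PR-101', 'PR-102']
print(failing_tests("B42"))   # ['TC-8']
```

Because both queries run against the same database, no glue code is needed to reconcile the CM tool's view of a build with the test tool's view of it - which is exactly why these reports can be produced in seconds rather than weeks.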
I shouldn't have to wait for someone to put this information together. If it takes weeks, days, hours or even minutes to produce it, something is missing. First, it means that not everyone will have access to the information - it's too complex or it takes too long. Also, your CM tool will not provide up-to-the-minute data for use in meetings, CCB or otherwise. You'll spend valuable resources every time you need this type of information, and you'll probably spend more time trying to reduce the effort of getting it.
Backups and Restore Capabilities
One good indication of seamless integration is how well backup and restore operations can be carried out. If you have separate tools being managed separately, you may find it difficult to maintain consistency between them when you restore from backup. If the architectures were not designed with consistency across applications in mind, it's not likely that this will be the result. This in turn means that when you are most pressed for time, on a restore operation, you'll have to spend additional time getting a consistent view together. On the other hand, most single-database solutions can easily deal with this issue. Or at least they can until it comes to managing multiple sites.
When several tools are integrated things only get more complex. And upgrades pose significant risks to the consistency strategies. However, when the tools are sharing the same repository, consistency is much easier to manage.
Related to this is the ability to checkpoint the solution at a particular point in time. If this cannot be done rapidly, either significant down time results or consistency is at risk. When the architecture is built to support the application suite, it may also be designed to support rapid, consistent checkpointing.
Another related topic is how tool recovery works after a system outage. In a seamlessly integrated solution, there is a single recovery procedure, typically automated. In loosely integrated systems, each tool has a recovery procedure and then there is a recovery and consistency strategy that has to be addressed between each pair of tools.
Multiple Site Operation
In this world of international companies, multi-site operation is increasingly a key component of ALM solutions. Look at your existing solution. Does it provide you with a real multi-site capability or is it restricted to source management? Perhaps there's a separate mechanism to handle problem reporting. What about the other applications? In a seamlessly integrated solution, there is a single, consistent mechanism. Whether it's a divide and conquer strategy, where data is split off to the appropriate site, or a full replication solution, where all data is available at all sites, the solution must work across all applications.
Total Cost of Operation
As you can see from the above, the administration and total cost of operation of a seamlessly integrated solution is dramatically lower. Additional cost reduction comes from the improved traceability. And a seamlessly integrated tool suite is typically less costly to deploy and upgrade.
In loosely integrated tools, each tool has to be learned. Customization of each tool may be different requiring different training for each. Several help desks, perhaps both internal and external, are drawn upon for supporting the solution. Costs and complexity escalate. And finger-pointing on the cause of problems, between vendors or between teams within a vendor, becomes inevitable.
If workflow is different between applications, additional complexities and cost arise. And advances are more difficult to come by because they have to be achieved in each tool rather than in a single underlying architecture.
Where does integration end?
It's nice to have your Configuration Management integrated with your Problem Reporting and your Change Management. Is more required? If Requirements Management and Test Suite Management are tied in as well, the solution is even more attractive. So what about Build and Release Management, Project Management, Customer Requests and CRM? What about Time Sheets? Peer Reviews? Pre-sales Support? Documentation? Org Charts?
When do we have too much integration? In my experience, I'd gladly welcome all of these in a single integrated suite. In fact, for the most part, that's how our own environment (at Neuma) works. A single integrated suite. It's true that few on the product team need that wide a view of the world. But when done properly, it sure makes life easier. These pieces are all tightly woven. Or more properly put, the processes are.
And that's the key - to look at the processes. It's important that management data is integrated from end to end. The output of one phase is the input to another and eventually closure is required by verifying outputs from later phases against requirements of earlier phases.
Horizontal vs. Vertical Integration
When we discuss tool integration, we need to talk about both vertical and horizontal integration. Horizontal integration brings together tools that serve the same general types of users: management, or various classes of end users. An ALM tool integrates management tools. An IDE integrates implementation tools. An Operating System integrates (or supports integration of) run-time programs.
There is a very high level of data interchange within horizontal integrations. In an IDE, source code and resource editors work with compilers which work with linkers and debuggers. It allows a developer to rapidly move from one "application", such as editing, to another, such as debugging. Ideally, the user sees a single IDE tool. In an ALM tool, users move from requirements to test cases, from change packages to problem reports and features, from baselines to workspaces. A successful ALM tool integration will make the end-to-end integration behave like a single tool.
Vertical integration is much more difficult. There are not a few phases or applications to move between. There are a myriad of them. From countless CM/ALM tools to numerous IDEs. From source tree browsers to various editors, difference tools and merge tools. It is complex. Whereas an IDE integration will concern itself with a specific tool suite (e.g. the Visual Studio family, or the J2EE environment), a vertical implementation must span all sorts of variations.
Vertical integration is best addressed through standard APIs that must be adhered to by all tools. An Editor should be able to take a filename and a line number or object identifier and go to it. A Merge Tool should take two or three filenames and a few standard options. The higher level tools can then invoke them appropriately. There are hundreds, if not thousands, of common editors. The API must provide the standardization so that these can plug into any CM environment to provide a simple "view" operation, for example. Such a collection of vertical integration APIs establishes a Framework for vertical tool integration.
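A minimal sketch of such a plug-in API might look like the following. The class and method names are my own illustrative assumptions: the point is that the CM environment talks only to a small abstract interface, so any conforming editor or merge tool can be registered without changing the CM code.

```python
from abc import ABC, abstractmethod

class Editor(ABC):
    """Standard API: take a filename and a line number and go to it."""
    @abstractmethod
    def view(self, filename: str, line: int = 1) -> str:
        ...

class MergeTool(ABC):
    """Standard API: take two or three filenames and merge them."""
    @abstractmethod
    def merge(self, base: str, ours: str, theirs: str) -> str:
        ...

# The CM environment keeps a registry of conforming tools; plugging in
# a new editor is a registration, not an integration project.
REGISTRY: dict[str, Editor] = {}

def register_editor(name: str, editor: Editor) -> None:
    REGISTRY[name] = editor

class LoggingEditor(Editor):
    """Trivial stand-in that just reports what it was asked to open."""
    def view(self, filename, line=1):
        return f"viewing {filename}:{line}"

register_editor("logview", LoggingEditor())
print(REGISTRY["logview"].view("main.c", 120))  # viewing main.c:120
```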
Standard APIs can, by their nature, be restrictive or complex. Take for example the "APIs" for integrating CM tools with Microsoft's Visual Studio or with Eclipse. The former is restrictive - for example, it does not support the concept of Change Package (aka Update) based CM very well. The latter is complex - it's no simple matter to integrate a CM tool with Eclipse. Overall, I would expect a fairly basic API to interact with a CM tool, but one that at least caters to the concept of a Change Package/Update. I think Microsoft is actually fairly close with its unofficial SCC API. However, they complicate things by continuously changing the semantics of the API from their IDE perspective - although officially they have not changed it, since the SCC API was never officially supported.
For more basic tools, such as editors, difference/merge engines, compilers, etc. standard APIs can be much simpler, and even data-driven (rather than procedural), as is the case with "File Type" behaviour in the Windows registry.
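A data-driven binding in that spirit can be as simple as a table mapping file types to command templates, so that a new tool plugs in by editing data rather than code. The tool names and templates below are illustrative assumptions.

```python
import os

# Table-driven tool binding, in the spirit of "File Type" behaviour in
# the Windows registry: each extension maps its actions to a command
# template. Adding support for a new file type is a data change only.
FILE_TYPE_ACTIONS = {
    ".c":   {"view": "editor {file} +{line}", "diff": "difftool {old} {new}"},
    ".doc": {"view": "wordview {file}",       "diff": "docdiff {old} {new}"},
}

def command_for(action, file, **params):
    """Look up the command template for a file's type and fill it in."""
    ext = os.path.splitext(file)[1]
    template = FILE_TYPE_ACTIONS[ext][action]
    return template.format(file=file, **params)

print(command_for("view", "main.c", line=42))  # editor main.c +42
```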
From an ALM tool perspective, Horizontal integration can work very well. The key here is that the horizontal management tools share the same database and user interface, the same process structure and query architecture, the same transaction-based system and multiple site management strategy. In other words, the applications need to be built on a common platform. If not, complexities will arise and integration will be restrictive or complex, just as with vertical integration.
Seamless integration needs to be applied to horizontal integration. End-to-end. You can decide how far end-to-end is, but in my experience, the more I have, the more I want. CM/ALM tools are central to product development and its support. As the ends expand, they become a critical component of the business management tool suite. Hopefully, you have some idea from this article how Seamless Integration leads to lower costs, ease of use, productivity, etc. As the ends are expanded, the savings can be realized across a wider scope, and ultimately across the entire company. Your training costs go down, your staff is more focused on its core product requirements, and your quality and customer feedback improve.
So if there's one requirement that I have to put at the top of my list for a CM/ALM tool suite it's Seamless Integration. What's yours?
Joe Farah is the President and CEO of Neuma Technology . Prior to co-founding Neuma in 1990 and directing the development of CM+, Joe was Director of Software Architecture and Technology at Mitel, and in the 1970s a Development Manager at Nortel (Bell-Northern Research) where he developed the Program Library System (PLS) still heavily in use by Nortel's largest projects. A software developer since the late 1960s, Joe holds a B.A.Sc. degree in Engineering Science from the University of Toronto. You can contact Joe by email at firstname.lastname@example.org