In many ways, the software development process resembles manufacturing. Regardless of whether the product is a widget or code, typically one part of the organization identifies the need, then different departments get involved in scoping functions or features, identifying end markets, design, manufacture, delivery, and post-sales support. The problem is that the process tends to be disconnected, with each group throwing the fruits of its labors over the wall to the next.
In manufacturing and software development alike, there have been numerous efforts to integrate the life cycle. In reality, turf battles and the fact that each group requires different kinds of tools to do its job have hindered the goal of a seamless life cycle.
For instance, in manufacturing, design engineers produce bills of materials focused on technical specifications. Yet, in the back office, manufacturing bills of materials contend with parameters such as grades and quantities. Although there is some common data, you wouldn’t use the same tool to create engineering and inventory-focused bills.
The same goes for software development. A developer who writes code may perform a few perfunctory tests before throwing it over the wall to QA, which tests the app more thoroughly. Consequently, the two groups require tools with different levels of detail and functionality for testing.
In a nutshell, that’s been a large part of the problem deterring vendors from building tools that integrate the software development life cycle, from requirements gathering and use case analysis through modeling, version control, construction, testing and debugging, to final deployment and operation. Back in the 1980s, IBM’s AD/Cycle failed because it tried to enforce a data architect’s view on software development. Although Rational was somewhat more successful when it acquired the products that became Rational Suite, the suite suffered from neglect as the company failed to update it with common technology that would have provided a better alternative to file transfer-based integration.
Now Microsoft has upped the ante with Visual Studio Team System (VSTS), a framework that taps web services and a metadata store (Microsoft insists it’s not a repository!). It should provide a degree of integration in managing development akin to what the company accomplished with the common front-end integration of Visual Studio. On the bright side, Team System relieves providers of life cycle tools, such as Compuware, Serena, Borland, and dozens of niche providers, of having to write their own interfaces, repositories, or frameworks.
The challenge, of course, is that Microsoft is best known for developer tools and has little experience with tools serving other constituencies, such as testers. Rational had that problem when many veteran UML modelers were put off by XDE, the developer-oriented successor to Rational Rose. Will Microsoft fare any better this time?
Our answer is a qualified yes. True, the product won’t be out until next year, and it’s not a solution if you develop in Java or on other platforms (a major limitation). But the combination of Microsoft’s critical mass in the developer community, the hunger of niche tool vendors to get their wares accepted by wider audiences, and the emergence of standards such as web services, which will be used for communications between the functional parts, dictates that Microsoft’s approach to unifying the life cycle will be taken more seriously than most of its predecessors.
B2B trading hype notwithstanding, the premise of linking trading partners electronically is as old as enterprise computing itself. In the late 1960s, the trucking industry’s experiments with exchanging routine transactions such as purchase orders and shipping notices eventually led to Electronic Data Interchange (EDI). And although EDI standards eventually emerged, securing electronic “handshakes” between trading partners remained elusive because different companies applied the standards differently. Consequently, only top-tier companies could afford the time and expense of making EDI work.
The crux of the problem is that, while it’s relatively simple to standardize lower-level protocols, such as opening or closing a message, going higher up the value chain to specify, for instance, what fields to include in a forecast cuts to the heart of how companies differentiate themselves. So, while my company is proud of its lean forecasts, your company includes customer data because that provides more insight on demand patterns for your products. Not surprisingly, ambitious standards efforts, such as ebXML, which attempted to address all facets of establishing electronic handshakes, failed to gain traction.
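The forecast example above can be made concrete with a small sketch. The field names, values, and the naive schema check below are hypothetical, invented purely for illustration; real EDI and ebXML mappings are far more involved.

```python
# Hypothetical illustration of the forecast-field problem described above.
# Both "forecasts" could travel in the same standardized lower-level envelope,
# but their business payloads differ, so the handshake between firms fails.
lean_forecast = {"sku": "A-100", "period": "2004-Q3", "quantity": 5000}
rich_forecast = {"sku": "A-100", "period": "2004-Q3", "quantity": 5000,
                 "customer": "Acme Corp", "region": "EMEA"}

def fields_match(sender: dict, receiver_schema: set) -> bool:
    """A naive handshake check: does the sender supply exactly the
    fields the receiver's schema expects?"""
    return set(sender) == receiver_schema

# The receiver expects the richer schema, so the lean forecast is rejected
# even though both sides nominally "speak the standard."
receiver_schema = set(rich_forecast)
print(fields_match(lean_forecast, receiver_schema))   # False
print(fields_match(rich_forecast, receiver_schema))   # True
```

The point of the sketch is that agreement on the envelope does nothing to reconcile the payloads, which is exactly where companies differentiate themselves.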
With web services promising to democratize B2B commerce for businesses of all shapes and sizes, the challenge of creating electronic handshakes grows even steeper. Not surprisingly, web services bodies have effectively addressed specifying how a service is described, listed, and requested, and now they are branching out to higher-level functions such as asserting what kind of security is enforced, how identity is declared, how business processes are choreographed, and how quality of service requirements are described. Another body, WS-I, is providing standard test cases for all the handshaking. And, echoing EDI history, where vertical sectors such as automotive defined extensions to the standards, the same thing is occurring with the specification of XML business vocabularies for areas like law, insurance, and financial services.
But what’s missing is a standard framework for describing or assembling service contracts. Admittedly, some pieces of the puzzle are falling into place, such as WSDL, which describes the service; UDDI, which provides a registry of services; LegalXML, which provides contract language; WS-ReliableMessaging, which addresses how to specify the way service messages are to be delivered; and BPEL, which “choreographs” business processes. But there is no standard framework for organizing all the service and identity descriptors that in aggregate define the relationship between customer and provider, and the services to which customers are entitled. For now, that’s the domain of individual products, such as Infravio’s new X-Registry, which provides a metadata repository for such descriptors.
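To make the gap concrete, here is a minimal sketch of what such an aggregating framework might organize. Every field name and value below is hypothetical; none comes from a real specification. The sketch simply shows the existing descriptors collected into one contract record, the job no standard currently does.

```python
from dataclasses import dataclass, field

# Hypothetical "WS-Contract"-style record aggregating the descriptors
# named above. The field names are invented for illustration only; they
# sketch the kind of framework the standards bodies have not yet defined.
@dataclass
class ServiceContract:
    wsdl_url: str            # WSDL: describes the service interface
    uddi_key: str            # UDDI: where the service is registered
    legal_terms: str         # LegalXML: contract language
    delivery_policy: str     # WS-ReliableMessaging: delivery requirements
    process_definition: str  # BPEL: business process choreography
    entitlements: list = field(default_factory=list)  # services the customer may invoke

    def covers(self, service_name: str) -> bool:
        """Is the customer entitled to invoke this service?"""
        return service_name in self.entitlements

contract = ServiceContract(
    wsdl_url="http://example.com/forecast?wsdl",
    uddi_key="uddi:example.com:forecast-service",
    legal_terms="(LegalXML contract text would go here)",
    delivery_policy="exactly-once",
    process_definition="forecast-approval.bpel",
    entitlements=["getForecast", "submitOrder"],
)
print(contract.covers("getForecast"))   # True
```

Today each of these descriptors lives in its own silo (or in a proprietary registry such as X-Registry); the missing standard is the record type that ties them together.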
Admittedly, standards are no panacea, as they won’t ensure handshakes will succeed. They simply harmonize formats and vocabularies for building and testing messages and content. Still, we’d like to see the standards community attempt a WS-Contract framework that describes service provider relationships. History tells us that would be an uphill battle, but we think it would be one worth fighting.
Given the reality of software industry consolidation, it’s open season for analysts anytime somebody’s quarterlies don’t quite add up. With BEA’s disappointing software license numbers, it’s probably a case of making a mountain out of a molehill. But that shouldn’t obscure some more important issues, such as what the future software industry will look like.
The immediate question is whether BEA’s numbers make it more likely that Oracle will buy the company. The more important question, however, is whether “independent” software vendors (the “I” in ISV) can survive in a maturing industry. Arguably, in today’s market, contracts are more likely to be won by what we’ll call “Enterprise” Software Vendors (ESVs) with critical mass in core platforms, databases, enterprise applications, or services operations. Nobody’s getting fired for buying IBM or Microsoft.
The ISV vs. ESV issue is in large part driving Oracle’s contested bid for PeopleSoft. Sure, the idea ain’t pretty, and we don’t like it any more than most of you do. PeopleSoft’s product has the larger share and is functionally superior. But as we’ve maintained previously, antitrust law isn’t about protecting customer investments; it’s about protecting competition. And the ERP space remains too darn crowded.
The poster child for ISVs is Mercury Interactive, which has managed to fend off Compuware and Rational for leadership in the software testing space. But even here, Mercury realizes it can’t stay inside its niche, so it’s expanding into IT governance, a new market where competition is still marginal.
Back to BEA: can ISVs still survive in middleware? Today, with the appserver having become a de facto extension of the database or OS, BEA has wisely redirected its emphasis toward portal and enterprise application integration, backing them with a simplified developer environment. Problem is, IBM and Oracle are going there too.
But it can be argued that integration remains a viable ISV market because, outside IBM, the main players are independents. And here, BEA has a potential advantage as a later entrant: it is not burdened with a costly, pre-J2EE legacy.
But BEA blinked on an opportunity that could have provided an even faster jumpstart in enterprise integration. A few years back, it passed on developing an Enterprise Service Bus (ESB), an approach that some say could be even cheaper than funneling integration through an appserver (BEA’s ESB brain trust then left for Sonic). In so doing, BEA failed to capitalize on the aggressiveness that originally propelled WebLogic to a two-year head start in the J2EE server space.
Nope, BEA’s fate is hardly cast in stone. And yup, in some cases, ISVs will survive. But now that everybody (including BEA) is building ESBs, wouldn’t it have been great if BEA had been out ahead on this one as well?
Please excuse us if we’re a bit jaded about the latest hype over non-Microsoft clients. The emergence of Linux and the fact that Microsoft Office has not yet secured a stranglehold in fast-growing regions like China and India suggest that some form of Linux desktop may actually prove a viable alternative for what Apple used to call “the rest of us.”
But displacing the existing base of Microsoft Office in developed economies? With Office the de facto standard for editable business content, moving to non-Microsoft alternatives hasn’t been worth the inconvenience of format translation.
So our interest was piqued when IBM threw its hat into the ring, because its focus was not exactly on replacing Microsoft Office. IBM’s new Lotus Workplace products do offer some modest productivity tools, useful largely for read-only work. And, according to Michael Rhodin, who helped spearhead much of the new offering, Sarbanes-Oxley might be prodding companies to take another look at the traditional Microsoft Office model of scattered documents on every desktop. Well, maybe.
But more importantly, IBM Lotus Workplace is about managing client-side costs. Not that this is anything new; we’ve seen plenty of asset management, automated software delivery, locked-down desktop, and anorexic client approaches over the years. In the end, people don’t like giving up the perception of freedom that the fat desktop has given them. IBM’s initiative differs in that it extends a popular idea, the enterprise portal (which personalizes information delivery), to software distribution and life cycle management (which personalizes functionality).
It does so by combining Lotus collaboration, WebSphere Portal, and Tivoli software distribution and identity management into a server-based provisioning engine built on industrial-strength technology. The engine is designed to provision all the goodies dynamically, which is a technical way of saying that IBM has finally come up with The On Demand client. Intriguing, yes, but the dependence on the network to make all this happen in real time tells us that IBM has lots of tweaking ahead of it.
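The idea of portal-style personalization applied to client software can be sketched in a few lines. The role names and component lists below are invented for illustration and have nothing to do with IBM’s actual implementation; the sketch only shows the shape of the approach: the server, not a desktop image, decides per user what functionality gets delivered.

```python
# Hypothetical sketch of server-side client provisioning: just as a portal
# personalizes which information a user sees, the provisioning engine
# personalizes which client components a user receives. All names here
# are invented for illustration.
ROLE_COMPONENTS = {
    "analyst": ["collaboration", "document-viewer", "portal-home"],
    "manager": ["collaboration", "document-editor", "approvals", "portal-home"],
    "default": ["portal-home"],
}

def provision(role: str) -> list:
    """Return the client components the server should deliver for a role."""
    return ROLE_COMPONENTS.get(role, ROLE_COMPONENTS["default"])

print(provision("analyst"))
print(provision("contractor"))  # unknown role falls back to the minimal client
```

The fallback to a minimal client for unrecognized roles is the "locked-down" half of the bargain; the per-role lists are the personalization half. It also makes plain why the network matters: if the server can’t be reached, nothing gets provisioned.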