08.23.06

Mincing Words

Posted in Application Development, Application Lifecycle Management (ALM), SOA & Web Services at 5:45 pm by Tony Baer

One of the neat things about SOAs (service-oriented architectures) is that they’re supposed to promote reuse. Reuse has long been one of the holy grails of software engineering, but if you’re going to achieve it, you need a place to store the assets you’d like to repurpose.

BEA’s announcement that it’s buying repository provider Flashline as part of its SOA governance strategy reminded us that there’s still confusion over what registries and repositories do.

Dating back to the days of CASE, repositories were supposed to be the place where all the goodies about software development were stored. The problem was, no consensus ever emerged as to what actually went into them. Do you deposit the code itself? Metadata about attributes like platforms and dependencies? Other artifacts like test cases? Or do you simply keep a card catalog of pointers?

With all the fuzziness about what repositories were supposed to do, no wonder the market stalled on takeoff. Even Microsoft got scared off after investing millions in a joint effort with the old TI (Texas Instruments) Software. Today Microsoft has resurrected the repository idea – somewhat – in Visual Studio Team System, though it labels it a “data warehouse.”

So when the UDDI registry standard emerged as one of the first building blocks of web services, you’d be excused for thinking of it as a new form of repository. As first conceived, UDDI was supposed to be the catalog or yellow pages that service requests would search at run time to discover the right service.

With UDDI version 3, most players in the SOA space began offering their own registries or partnered with companies that provide them. At that point Flashline’s CEO Charles Stack grew quite vocal in claiming that UDDI registries were overblown (since the BEA acquisition, he’s ratcheted down the tone). For a brief time, Flashline offered $50,000 to any customer upgrading from a UDDI registry to its repository.

Some players like Infravio are trying to bridge both functions with a modular, single-product approach. They claim that the back-and-forth communication between separate registries and repositories exacts a performance hit at run time.

Others like Flashline, of course, say the two are, and should remain, separate: repositories are where you put service artifacts and metadata at design time, while registries are where you list the service descriptions and policies that are accessed at run time.

At the risk of splitting hairs, we’ll side with Flashline on this one. At run time, you’re more interested in checking whether a service is the right one, whether policies will allow it to be exposed, and under what conditions. You don’t need information about all the back-end dependencies that are of more concern to developers. And while your system is busy parsing XML, the last thing it needs is to be searching through a complex database.
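
To make that split concrete, here’s a minimal sketch in Python – the names, fields, and publish/discover functions are entirely hypothetical illustrations, not any vendor’s actual schema or API – showing how a rich design-time repository record differs from the lean entry a registry serves up at run time:

```python
from dataclasses import dataclass
from typing import Dict, List, Optional

# Hypothetical, simplified models -- not any vendor's actual schema.

@dataclass
class RepositoryEntry:
    """Design-time view: everything a developer cares about."""
    service_name: str
    version: str
    source_location: str        # pointer to code or WSDL artifact
    dependencies: List[str]     # back-end systems this service touches
    test_cases: List[str]       # associated QA artifacts
    owner: str

@dataclass
class RegistryEntry:
    """Run-time view: just enough to find and invoke the service."""
    service_name: str
    endpoint_url: str
    policy_refs: List[str]      # e.g. security or SLA policies to enforce

def publish(repo_entry: RepositoryEntry, endpoint_url: str,
            policy_refs: List[str]) -> RegistryEntry:
    """Promote an approved design-time asset into the lean run-time catalog."""
    return RegistryEntry(
        service_name=repo_entry.service_name,
        endpoint_url=endpoint_url,
        policy_refs=policy_refs,
    )

def discover(registry: Dict[str, RegistryEntry], name: str) -> Optional[RegistryEntry]:
    """Run-time lookup: a cheap key search, no design-time baggage."""
    return registry.get(name)

if __name__ == "__main__":
    credit_check = RepositoryEntry(
        service_name="CreditCheck",
        version="1.2",
        source_location="svn://repo/services/creditcheck",
        dependencies=["mainframe CICS", "customer DB"],
        test_cases=["tc-101", "tc-102"],
        owner="payments team",
    )
    registry = {
        "CreditCheck": publish(credit_check,
                               "https://esb.example.com/credit",
                               ["wss-policy", "gold-sla"])
    }
    print(discover(registry, "CreditCheck"))
```

The point of the sketch: the run-time lookup is a cheap key search over a handful of fields, while the heavyweight design-time metadata stays in the repository where developers need it.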

But what role repositories play in governance remains a gray area. Maybe it makes sense to use them for pushing approved services out to the registry. Then again, that overlaps with what source code and version control tools from IBM/Rational, Borland, Microsoft, Serena, and others already do. Should we treat services any differently?

What about when improper service requests are made at run time? Should the registry check back with a repository, or with a run-time management engine from providers like AmberPoint or SOA Software? Our sense is that repositories are overkill for run time, and that you may as well use tools designed for fast response. We believe repositories are where you store and version policy, not where you enforce it.

Significantly, BEA is positioning the Flashline acquisition as part of an SOA governance strategy. We agree that the important word is “part,” because BEA still relies on third parties for registry and run-time services management. And in the future, we expect IBM to ramp up its alliance with (and probably acquire) WebLayers as its SOA governance response.

Services have finally given repositories a purpose for being, but we’re still a ways from figuring out where they actually fit in.

08.11.06

Content is King – Act 2

Posted in IT Governance, Technology Market Trends at 9:46 am by Tony Baer

A glance at the ongoing double-digit growth of the storage industry should convince you of one fact: the world’s appetite for data is fast outpacing whatever gains the technology price/performance curves can deliver.

So if you subscribe to the notion that at least 80% of corporate data is unstructured – that is, it sits outside databases – the content management folks should be having a field day.

Content management was rooted in the classic client/server document management or imaging tools that heavily regulated sectors like aerospace, healthcare, and pharmaceuticals relied on. But those markets peaked early because of their narrow appeal and high cost barriers.

With the emergence of the web, and the need to keep websites current to draw traffic, content became king. That sparked a wave of web content management tools that grabbed attention and revenues from their more staid document counterparts.

Ironically, the unrelated events of 9/11, followed several months later by the demise of Enron, reversed fortunes for the content folks. Content was still king, but for a different reason. Sure, web sites needed to stay current, but lots of cheap commodity tools kept vendor margins down. In the wake of Enron, though, new corporate governance mandates, combined with a wave of customer privacy laws, placed content management in a new light. And, restyled as “enterprise” content management, the document folks regained their groove.

Today, four enterprise content players own roughly half the market. After yesterday’s IBM announcement of its intention to buy FileNet, that number will dwindle to three.

Compliance drove IBM to fork out $1.6 billion for a company that complemented and, in some areas, competed with it. If you have any doubt about IBM’s investment, consider that after the Enron verdicts, the backlash pushing to soften SOX pretty much evaporated. And following recent disclosures of privacy breaches at AOL and the VA, privacy laws are, if anything, likely to be tightened.

For much of the corporate world, documenting who has access to what version of the truth and when is becoming a matter of survival.

Although there was some competition in the business process management portions of their businesses, for the most part FileNet’s content workflows fill some key gaps for IBM. The two companies are already pretty familiar with each other, with a track record of joint engagements and a presence at many of the same Global 2000 accounts. In many ways, the deal is redolent of IBM’s move on asset management player MRO Software, which it announced its intention to buy last week.

Admittedly, FileNet, given its image management heritage, has had its share of challenges prodding customers to migrate to P8, which provides more robust, dynamic workflows. Sounds like a rich invitation for IBM Global Services.

When you think about it, the deal for FileNet looks a bit anticlimactic. Heck, it’s been nearly three years since EMC bought Documentum, leaving barely two major independents standing. In the meantime, IBM bought a company that links content stores – FileNet, Documentum, or whatever – to its mainstream data integration platform. So when it finally came to FileNet, we’re left wondering: what took IBM so long?