10.30.02

Pac Man

Posted in Application Development, Application Lifecycle Management (ALM), Middleware at 10:27 pm by Tony Baer

We’ve said it before: when buying software, corporate customers prefer umbrella packages to best of breed. While not everything lends itself to big packages, the list keeps getting shorter each day as apps and tools are increasingly subsumed into (choose one) the operating system, application server, enterprise application, or database.

And so it goes with portals, which are essentially a function of whatever your company’s dominant data source or ISV is. A portal is theoretically one of the easiest software investments to cost justify: who can argue against one that personalizes the information account reps see if it helps them shorten the sales cycle?

But unlike enterprise applications such as ERP or CRM, portals are tactical, not destination buys. Why buy a specialized portal if your dominant source of information comes from a particular application or set of databases? Or, in content management provider Vignette’s view, why not treat the portal simply as a way to provide access to content? That was its rationale for buying portal vendor Epicentric. But the Pac-Man process won’t end there, because content is no longer king (was it ever?); business processes are. That makes content/portal management logically an extension of operating systems, ERP/CRM systems, databases, or, in environments where multiple sources must be integrated, app servers or EAI brokers. We wouldn’t be surprised if somebody else snaps up Vignette next.

What about application development tools? Long ago, Microsoft popularized the integrated development environment (IDE), which made standalone code editors, debuggers, and other point tools obsolete. And with its latest version, VS.NET, Microsoft adds APIs that bind third-party niche tools even more tightly to its common UI shell, not to mention to the underlying .NET middleware framework. Rival IBM is doing exactly the same thing with the open source Eclipse framework on the Java side.

Borland, a niche tools survivor if there ever was one, is now treading the same path, thanks to its recent acquisition of Starbase and, just now, TogetherSoft. Until now, Borland has survived on its wits, building best-of-breed IDEs and niche languages such as Delphi. With the new strategy, Borland is taking a page out of Rational’s book, something that rivals CA and Compuware have been less successful at to date.

On paper, the Borland/TogetherSoft/Starbase synergy is compelling, because the tools boast superior ease of use and, in the case of TogetherSoft Control Center, ease of integration between UML modeling and the Java IDE. (The product forced Rational’s hand, prompting its release of XDE.)

Borland might have an easier time integrating its tools, something Rational has accomplished only in a loose sense since it began its acquisition spree five years ago. But the real challenge will be market share, not technology. Its biggest hurdle? Overcoming Rational’s lucrative relationship with IBM’s ubiquitous Global Services. Spurned by Oracle, which is bulking up its own JDeveloper tools, Borland will likely align more closely with BEA, which lacks tools and is vying with Oracle to become the non-Microsoft/IBM “third way.”

10.16.02

The Customer Is Always Wrong

Posted in Technology Market Trends at 10:26 pm by Tony Baer

Going by the laws of supply and demand, today should be a corporate technology buyer’s market. According to a survey just published by Walker Information, a firm that conducts customer satisfaction studies across all industries, that might not be the case.

Surveying over 2,000 corporate IT professionals, Walker found that roughly 75% of respondents would retain their current IT vendors. The rub, however, is that a third of that group (roughly a quarter of all respondents) felt trapped in their buying decisions because the cost of switching to something better was simply exorbitant. Significantly, the survey found that this applied more to software than to hardware.

Years ago, vendors called this phenomenon “account control.”

Obviously, there’s nothing new or startling here. The same phenomenon occurred during the heyday of legacy systems, most of which were highly customized. While client/server provided the technology disruption, the actual impetus for the software migrations of the 1990s included changes to the business environment that dictated integrated back-end systems, not to mention the necessity of Y2K compliance.

Ironically, the client/server solutions that replaced legacy systems weren’t as open as advertised. Relying on industry standard SQL databases didn’t make the data any more open or accessible, since each ERP package structured information differently. Furthermore, the ascendance of Microsoft Office as the de facto standard corporate desktop suite further constrained IT buying decisions.

Even if the stars align for migration today, it will be a far costlier affair than 5, 10, or 15 years ago because of the simple fact that more people in the organization have computers and are connected to enterprise applications.

Consequently, the traumatic event that makes software migrations feasible won’t come from technology, but business practices. Yes, maybe Linux or web services might become disruptive technologies, but their impact will likely be restricted to software pricing or licensing practices. Instead, the best hope for making software clean sweeps thinkable would be if existing enterprise applications prove too rigid (or enterprise-centric) to support emerging trends such as collaborative business.

10.11.02

Speed Kills

Posted in Technology Market Trends at 10:25 pm by Tony Baer

Is it our imagination, or is history really going faster these days? Last century — barely a couple of years ago — the business world dragged itself to the rhythm of “Internet Time.” The immediacy of e-business prodded enterprises to assume risk — almost any risk — in the hope of becoming first to market with new products, services, or (excuse the cliché) some radical transformational paradigm.

Media, technology vendors, and market analysts all played parts in blessing the doctrine of business velocity. For instance, Gartner Group pushed the “Zero Latency Enterprise” (ZLE), a concept for bulldozing bottlenecks in internal processes and information flows. Hindsight being 20/20, it’s clear that ZLE was a purely reactive strategy. While speed might have been great for some processes, such as order confirmation or credit scoring, it proved wholly inappropriate for others, such as any form of master planning or strategic analysis.

Not that we hadn’t witnessed the limitations of speed previously. When UNIX technical workstations emerged in the 1980s, doubling and quadrupling the computing power delivered to engineering desktops, the result wasn’t faster design analyses but more thorough runs that produced better-quality products.

So when we saw Gartner headline its 2002 Symposium with the “Real-Time Enterprise” theme, we initially thought we were caught in a flashback from Startup.com. Hearing them out, we learned that unlike ZLE, the Real-Time Enterprise didn’t necessarily call for the entire organization to go real time. Just pick five or ten of the most critical business processes.

Yes, there are precedents for real-time processes in various verticals, such as the banking industry’s “Straight-Through Processing” (STP) initiatives, the healthcare industry’s HIPAA mandates, and the utility industry’s task of managing power grids. But must these exceptions form the rules for the rest of us?

In this economy, approaching any CxO with a high concept is not exactly a career-enhancing move. Our conversations with users at Gartner Symposium confirmed this perception, as well as the fact that the burden of proof is on Gartner to demonstrate that this catchphrase is not just another ploy for selling consulting services.