
Stack envy: Impressions for Oracle OpenWorld 2010

Last year, the anticipation of the unveiling of Fusion apps was palpable. Although we're not finished with Oracle OpenWorld 2010 yet – we still have the Fusion middleware analyst summit tomorrow, plus loose ends regarding Oracle's Java strategy – by now our overall impressions are fairly code complete.

In his second conference keynote – which unfortunately turned out to be almost a carbon copy of his first – Larry Ellison boasted that they “announced more new technology this week than anytime in Oracle’s history.” Of course, that shouldn’t be a heavy lift given that Oracle is a much bigger company with many more products across the portfolio, and with Sun, has a much broader hardware/software footprint at that.

On the software end – and post-Sun acquisition, we have to make that distinction – it's hard to follow up last year's unveiling of Fusion apps. The Fusion apps are certainly a monster in size, with over 5000 tables and 10,000 task flows representing five years of development. Among other things, the embedded analytics provide the context long missing from enterprise apps like ERP and CRM, which previously required you to slip into another module as a separate task. There is also good integration of process modeling, although BPM models developed using either of Oracle's modeling tools won't yet be executable. For now, Fusion apps will not change the paradigm of model first, then develop.

A good sampling of coverage and comment can be found from Ray Wang, Dennis Howlett, Therese Poletti, Stefan Ried, and for the Java side, Lucas Jellema.

The real news is that Fusion apps, excluding manufacturing, will be in limited release by year end and general release in Q1. That’s pretty big news.

But at the conference, Fusion apps took a back seat to Exadata, the HP- (and soon to be SPARC-) based database appliance unveiled last year, and the Exalogic cloud-in-a-box unwrapped this year. It's no mystery that growth in the enterprise apps market has been flat for quite some time, with the main greenfield opportunities going forward being midsized businesses and the BRIC countries. Yet Fusion apps will be overkill for small-to-midsized enterprises that won't need such a rich palette of functionality (NetSuite is more likely their speed), which leaves the emerging economies as the prime growth target. The reality is that most enterprises are not about to replace the very ERP systems that they implemented as part of modernization or Y2K remediation efforts a decade ago. At best, Fusion will be a gap filler, picking up where current enterprise applications leave off, which provides a potential growth opportunity for Oracle, but not exactly a blockbuster one.

Nonetheless, as Oracle was historically a software company, the bulk of attendees, along with the press and analyst community (including us), pretty much tuned out all the hardware talk. That likely explains why, if you subscribed to the #oow10 Twitter hashtag, you heard nothing but frustration from software bigots like ourselves and others who got sick of the all-Exadata/Exalogic-all-the-time treatment during the keynotes.

In a memorable metaphor, Ellison stated that one Exalogic device can schedule the entire Chinese rail system, and that two of them could run Facebook – to which a Twitter user retorted, how many enterprises have the computing load of a Facebook?

Frankly, Larry Ellison has long been at the point in his life where he can afford to disregard popular opinion. He gave a pure hardware talk Sunday night, then did it almost exactly again on Wednesday (although on the second go-round we were also treated to a borscht belt routine taking Salesforce's Marc Benioff down more than a peg over who has the real cloud). Who is going to say no to the guy who sponsored and crewed on the team that won the America's Cup?

But if you look at the dollars and sense opportunity for Oracle, it’s all about owning the full stack that crunches and manages the data. Even in a recession, if there’s anything that’s growing, it’s the amount of data that’s floating around. Combine the impacts of broadband, sensory data, and lifestyles that are becoming more digital, and you have the makings for the data counterpart to Metcalfe’s Law. Owning the hardware closes the circle. Last year, Ellison spoke of his vision to recreate the unity of the IBM System 360 era, because at the end of the day, there’s nothing that works better than software and hardware that are tuned for each other.

So if you want to know why Ellison is talking about almost nothing else except hardware, it’s not only because it’s his latest toy (OK, maybe it’s partly that). It’s because if you run the numbers, there’s far more growth potential to the Exadata/Exalogic side of the business than there is for Fusion applications and middleware.

And if you look at the positioning, owning the entire stack means deeper account control. It's the same strategy behind the entire Fusion software stack, which uses SOA to integrate internally and with the rest of the world. But Fusion apps and middleware remain optimized for an all-Oracle Fusion environment, underpinned by a declarative Application Development Framework (ADF) and tooling that is designed specifically for that stack.

So on one hand, Oracle's pitch that big database processing works best on optimized hardware can sound attractive to CIOs seeking to relieve one of their nagging points of pain. But the flipside is that, given Oracle's reputation for aggressive sales and pricing, will the market be receptive to giving Oracle even more control? To some extent the question is moot; with Oracle having made so many acquisitions, enterprises that followed a best-of-breed strategy can easily find themselves unwittingly becoming all-Oracle shops by default.

Admittedly, the entire IT industry is consolidating, but each player is vying for different combinations of the hardware, software, networking, and storage stack. Arguably, applications are the most sensitive layer of the IT technology stack because that is where the business logic lies. As Oracle asserts greater claim to that part of the IT stack and everything around it, it requires a strategy for addressing potential backlash from enterprises seeking second sources when it comes to managing their family jewels.

First Takes from Rational Innovate 2010

To paraphrase Firesign Theatre, how can you be in two places at once when you're not anywhere at all? We would have preferred being in at least two places if not more today, what with Microsoft TechEd in New Orleans, IBM Rational's Innovate conference in Orlando, and for spare change, PTC's media and analyst day just a cab ride away.

Rational’s message, which is that software is the invisible glue of smarter products, was much more business grounded than its call a year ago for collaborative software development, which we criticized back then as more like a call for repairing your software process as opposed to improving your core business.

The ongoing name changes of the conference reflect Rational's repositioning, on which the Telelogic acquisition closed the circle. Two years ago, the event was called the Rational Software Development Conference; last year they eliminated the word "Development," and this year "Software" gave way to "Innovate." The vanishing of "Software" from the conference title is consistent with the invisible glue motif.

Software is the means, not the end. Your business needs to automate its processes or make better products in a better way. Software gets you there. Just as IBM's message is Smarter Planet, Rational has emphasized Smarter Products rather than "Smarter Business Processes." It's not just a matter of force-fitting to a corporate slogan; Rational estimates compound annual growth of its systems-of-systems business (ergo, mostly the Telelogic side) to be well into the double digits over the next 4-5 years, compared to a fraction of that for its more traditional enterprise software modernization and IT optimization businesses.

Telelogic played the starring role for Smart Products. The core of the strategy is a newly announced Integrated Product Management umbrella of products and services for helping companies gain better control over their product lifecycles. Great lead, but for now scant detail. IBM's strategy leverages Telelogic's foothold with companies making complex engineered products, pairing it with other assets such as IBM's vertical industry frameworks. We also see strong potential synergy with Maximo, which completes the product lifecycle with the piece that follows the product's service life.

IBM’s product management strategy places it on a collision course with the PLM community. IBM lays claim to managing the logical constraints of product development – coming from its base in requirements and portfolio management. By contrast, IBM claims that PLM vendors know only the physical constraints. The PLM folks – especially PTC – are developing their own counter strategies. For starters, there is PTC’s plan to offer Eclipse tooling that will start with its own branded support of CollabNet’s open source Subversion source code repository. Folks, this is the beginning of a grudge match that for now is only reinforcing the culture/turf divides demarcating software from the more traditional physical engineering disciplines.

Rational also introduced a workbench concept: a promising way to use SOA to mix and mash capabilities from across its wide portfolio to address specific vertical industry problems or horizontal software development requirements. For now, though, it is mostly vision. These workbenches take products – mostly Jazz offerings – and mash them up using the SOA architecture of the Jazz framework to create configurable integrations for addressing specific business and software delivery problems. We saw a demo of an early example: purpose-built integrations providing role-based views that correlate requirements to functional and performance tests, through to specific builds, accessed through the tools that BAs, testers, and developers use. On the horizon, IBM Rational is planning vertical workbenches that apply Rational tools with some of its vertical industry frameworks to address segments such as Android mobile device development, cybersecurity, and smarter cities.

The idea for the workbenches is that they would not be rigid products but instead configurable mixes and matches of Rational and partner content, through interfaces developed via OSLC, IBM's not-really-open-source community for building Jazz interfaces. Although a good use for IBM's varied software and industry framework portfolio, the workbenches will be challenging to sell, as they are neither standard catalog products nor, ideally, customized systems integrations. Sales needs to think out of the box to sell these workbenches, while customers need assurance that they are not paying for one-off systems integration engagements.

The good news is that, with IBM's expanding cloud offerings for Rational, these workbenches could be readily composed and delivered through the cloud on much shorter lead times than conventional packaged software. Aiding the workbenches is a new flexible token licensing system that expands on a model originated by Telelogic. Tokens are generic licenses that give you access to any piece of software within a designated group, allowing the customer to mix and match software, or for IBM to do so through its Rational workbenches. IBM is combining it with the idea of term licensing to make this suited for cloud customers who are allergic to perpetual licensing. For now, tokens are available only for Telelogic and Jazz offerings, but IBM Software Group is investigating applying the model to its other brands.
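
To make the token mechanics concrete, here is a minimal sketch of how such a shared pool might behave; the product names and token costs are hypothetical, not IBM's actual rate card.

```python
# A minimal sketch of token-based licensing: a shared pool of generic
# tokens that any tool in a designated group can draw from and return.
# Product names and token costs below are hypothetical.

class TokenPool:
    def __init__(self, total_tokens):
        self.total = total_tokens
        self.checked_out = {}  # product -> tokens currently in use

    def check_out(self, product, cost):
        """Draw 'cost' tokens from the shared pool to run 'product'."""
        in_use = sum(self.checked_out.values())
        if in_use + cost > self.total:
            raise RuntimeError("not enough tokens free in the pool")
        self.checked_out[product] = self.checked_out.get(product, 0) + cost

    def check_in(self, product):
        """Return a product's tokens to the pool for use elsewhere."""
        self.checked_out.pop(product, None)

pool = TokenPool(total_tokens=100)
pool.check_out("requirements_tool", cost=40)  # hypothetical cost
pool.check_out("test_manager", cost=30)
pool.check_in("requirements_tool")            # frees 40 tokens
pool.check_out("modeling_tool", cost=60)      # now fits within the pool
```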

So how do you cost justify these investments for the software side of smarter products? Rational GM Danny Sabbah’s keynote on software econometrics addressed the costing issue based on Rational’s invisible glue, means not end premise. We agree with Sabbah that traditional metrics for software development, such as defect rates, are simply internal metrics that fail to express business value. Instead, Sabbah urges measuring business outcomes of software development.

Sabbah's arguments are hardly new. They rehash old debates over the merits of "hard" vs. "soft," or tangible vs. intangible, costing. Traditionally, new capital investments, such as buying new software (or paying to develop it), were driven by ROI calculations that computed how much money you'd save; in most cases, those savings came from direct labor. Those were considered "hard" numbers because it was fairly straightforward to calculate how much labor some piece of automation would save. Savings are OK for the bottom line but do nothing for the top line. However, if you automated a process that allowed you to shorten product lead time by 3 weeks, how much money would you make with the extra selling time, and by getting to market earlier, how much benefit would accrue from becoming first to market? Common sense says that, all other factors being equal, getting a jump in sales should translate to more revenue and, in turn, bolster your competitive position. But such numbers were considered "soft" because there were few ways to scientifically quantify the benefits.
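
A back-of-envelope sketch makes the contrast plain; every figure below is hypothetical, chosen only to show why one number is easy to defend and the other is not.

```python
# "Hard" number: direct labor eliminated by automation is simple arithmetic.
hours_saved_per_week = 120        # hypothetical labor hours automated away
loaded_hourly_rate = 75.0         # hypothetical $/hour
hard_savings = hours_saved_per_week * loaded_hourly_rate * 52
print(f"Hard savings: ${hard_savings:,.0f}/year")  # defensible in a budget review

# "Soft" number: revenue from 3 weeks of extra selling time rests on a
# forecast nobody can verify in advance.
weeks_earlier = 3                 # lead time shaved off by the new process
revenue_per_week = 250_000        # hypothetical $/week of incremental sales
soft_gain = weeks_earlier * revenue_per_week
print(f"Soft gain: ${soft_gain:,.0f}/release")     # real, but hard to prove
```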

Sabbah’s plea for software econometrics simply revives this debate.

IBM offers to buy Sterling Commerce

We should have seen this one coming. IBM's offer to buy Sterling Commerce for $1.4 billion from AT&T closes a major gap in the WebSphere portfolio, extending IBM's array of internal integrations externally to B2B. It's a logical extension, and IBM is hardly the first to travel this path: webMethods began life as a B2B integration firm before morphing into EAI, and later SOA and BPM middleware, before being acquired by Software AG. In turn, Tibco recently added Foresight Software as an opportunistic extension for taking advantage of a booming market in healthcare B2B transactions.

But neither Software AG's nor Tibco's moves approach the scope of Sterling Commerce's footprint in B2B trading partner management, a business that grew out of its heritage as one of the major EDI (electronic data interchange) hubs. The good news is the degree of penetration that Sterling has; the other (we won't call it "bad") news is all the EDI legacy, which provides great fodder for IBM's Global Business Services arm to address a broader application modernization opportunity.

Sterling’s base has been heavily in downstream EDI and related trading partner management support for retailers, manufacturers, and transportation/freight carriers. Its software products cover B2B/EDI integration, partner onboarding into partner communities (an outgrowth of the old hub and spoke patterns between EDI trading partners), invoicing, payments, order fulfillment, and multi-channel sales. In effect, this gets IBM deeper into the supply chain management applications market as it already has Dynamic Inventory Optimization (DIOS) from the Maximo suite (which falls under the Tivoli umbrella), not to mention the supply chain optimization algorithms that it inherited as part of the Ilog acquisition which are OEM’ed to partners (rivals?) like SAP and JDA.

Asked if acquisition of Sterling would place IBM in competition with its erstwhile ERP partners, IBM reiterated its official line that it picks up where ERP leaves off – but that line is getting blurrier.

But IBM’s challenge is prioritizing the synergies and integrations. As there is still a while before this deal closes – approvals from AT&T shareholders are necessary first – IBM wasn’t about to give a roadmap. But they did point to one no-brainer: infusing IBM WebSphere vertical industry templates for retail with Sterling content. But there are many potential synergies looming.

At top of mind are BPM and business rules management, which could make trading partner relationships more dynamic. There are obvious opportunities for WebSphere Business Modeler's Dynamic Process Edition, WebSphere Lombardi Edition's modeling, and/or Ilog's business rules. For instance, a game-changing event such as Apple's iPad entering or creating a new market for tablets could provide the impetus for changes to product catalogs, pricing, promotions, and so on; a BPM or business rules model could facilitate such changes as an orchestration layer that acts in conjunction with some of the Sterling multi-channel and order fulfillment suites (a sketch of the idea follows below). Other examples include master data management, which can be critical when managing the sale of families of like products through the channel; and of course Cognos/BI, which can be used for evaluating the profitability or growth potential of B2B relationships.
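
As a sketch of that orchestration idea (the event types, conditions, and actions are all hypothetical, and this is not Ilog's or WebSphere's actual API), a rules layer might map market events to catalog and pricing changes like this:

```python
# Hypothetical rules layer: each rule pairs a condition on an incoming
# market event with an action against catalog/pricing systems.

RULES = [
    (lambda e: e["type"] == "new_category" and e["category"] == "tablet",
     lambda e: print("Add tablet SKUs to catalog; review netbook pricing")),
    (lambda e: e["type"] == "competitor_price_cut",
     lambda e: print(f"Trigger promotion review for {e['sku']}")),
]

def on_event(event):
    """Fire every rule whose condition matches the incoming event."""
    for condition, action in RULES:
        if condition(event):
            action(event)

# A game-changing event ripples into catalog and pricing changes:
on_event({"type": "new_category", "category": "tablet"})
```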

Altimeter Group's Ray Wang voiced a question that was on many of our minds: why would AT&T give up Sterling? IBM responded by citing potential partnership opportunities, but to our mind, AT&T has its hands full attaining network parity with Verizon Wireless and is simply not a business solutions company.

Tibco’s Hits and Misses

Messaging is Tibco's business, but it has had a mixed track record when it comes to making the messaging around its message-oriented view of the world. It starts off on the right foot. Its perennial tagline, The Power of Now, has become timelier in a world where the ability to respond is reinforced by the headlines. Just take the last couple of weeks, for example: last Thursday's weird Wall Street meltdown, and before that, the arrest of the foiled Times Square bomber.

Tibco is hardly alone in voicing such messaging. IBM's Smarter Planet and Progress Software's Operational Responsiveness are also about the need for systems that think on their feet. Yet Tibco's DNA gives it a unique claim to this space, as the company was born around fast, reliable messaging. Two years ago, Tibco CEO Vivek Ranadive made the case for event-driven predictive intelligence. Now Tibco is talking about the need, in global marketing head Ram Menon's words, "to humanize the story better." That's always been a stretch for this technology-driven company whose vision has long been driven by a shy, technology-centric CEO. Toward that goal, Tibco is taking a step or two forward, but unfortunately also a step back.

The good news is that Tibco is fleshing out its "Power of Now" tagline. We saw the first of a new series of simple, straightforward visual ads with short statements of business outcomes, like how Tibco's event processing helps defense agencies clear the fog of war, underscored by the tagline.

Then Tibco unveiled a new tagline, the Two-Second Advantage, which makes the case that if you have just enough information quickly enough, you don't need the complete picture to make the right decision. Tibco's on a roll there, a message backed up by the surprisingly irreverent Ronald K. Noble, the brash New Yorker who heads Interpol, who made the case that such an advantage can have life-or-death implications in crime fighting, especially when it comes to border control.

The problem is that just when you’ve thought that Tibco finally has gotten on message, it reverts back to its geeky self and steps on top of it. The latest case is its CEO’s Enterprise 3.0 concept that, when debuted in front of a room of analysts, floated like a lead balloon.

His numbering is oversimplified and cuts against popular perception: Ranadive terms Enterprise 2.0 as client/server, rather than the social computing weave that is now seeping into enterprise systems – including Tibco's. But bloggers of record Sandy Kemsley and Brenda Michelson summed it up best. Kemsley: "Enterprise 3.0 is becoming a joke amongst the analysts attending here today (we're tweeting about staging an intervention)…" Michelson: "We like the 2 second advantage message, but 'Enterprise 3.0' doesn't resonate, it won't be meaningful to Business Execs and CIOs."

Tibco doesn’t need new messaging, it just has to bring out the best of what it already has. It can humanize “The Power of Now” by appending the question, “What does it really mean?” And from that, “The two-second advantage,” and all the business cases that manifest it, become the logical response.

‘Nuff said.

With acquisitions and organic product development, Tibco's portfolio is broader than ever, and not surprisingly, this year's event carried announcements of a large number of product upgrades and introductions. For us the highlight was ActiveMatrix BPM, which finally puts Tibco's business process management engine on the same Eclipse development platform and runtime as the rest of Tibco's service orchestration products. It is a completely new product (this is not iProcess, which becomes Tibco's legacy BPM offering, rooted in the original Staffware acquisition). This is a major development for a vendor that has accumulated a large portfolio of individual products over the years, with the harshest critique being the need for multiple runtime engines: ActiveMatrix, iProcess, BusinessWorks, Rendezvous, EMS, etc. The new BPM offering fills a critical gap in the SOA-oriented ActiveMatrix product family.

Our critique here – as with IBM – is that the use of Eclipse as the design-time platform appeals more to developers than business stakeholders. But the fact that ActiveMatrix BPM is intended to be an execution platform means that the so-called nirvana of having business people design their own business processes is the type of stuff you do in a room with a whiteboard. Fortuitously, Tibco does have something in the works, as it previewed Design Collaborator, a new process definition tool that suspiciously resembled IBM BPM Blueprint; we hope that Tibco designs it so that it can feed BPMN models into ActiveMatrix BPM and doesn't become a dead-end product.

There were other introductions, such as the none-too-original, retro-named PeopleForms (which sounds like the name of one of Oracle's legacy PeopleSoft offerings), which for now only churns out SharePoint-like forms-driven apps in beta. PeopleForms addresses a low end of the market not served by Tibco and was developed by what's left of the old General Interface team; eventually it will be beefed up into something more useful with workflow. We also hope that there might be some rationalization with Design Collaborator, so that this product doesn't wind up becoming a standalone curiosity.

But the most profound impression came from an acquisition that Tibco completed only in March. Our best-of-the-day award from the analyst sessions went to the demonstration of Netrics, a tiny 15-person outfit out of Princeton, NJ that has developed a patented algorithmic pattern matching program that really fleshes out the "two-second advantage" message by providing proximate matches that should be "good enough" to make decisions. Netrics' technology assigns algorithms as metadata that score the identity of names of people or things; using that metadata, it quickly reduces large data sets to find probable matches. Those probable matches can be filtered to include or exclude misspellings and typos. Netrics' technology has ready applicability to identifying event patterns and golden copies of data – and as such, Tibco's initial plans are to incorporate the technology into Tibco Business Events (their CEP offering) and Master Data Management. On the horizon, it provides a pattern matching approach that complements the text mining that is often used in national security applications.
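
Netrics' algorithm itself is patented and the details aren't public, so the following is only a stand-in sketch of the general idea (scoring proximate name matches and filtering a data set down to the probable ones), using Python's standard difflib rather than anything Netrics actually ships:

```python
# Stand-in for patented matching: score name similarity in [0, 1],
# tolerant of misspellings and typos, then keep the probable matches.
from difflib import SequenceMatcher

def score(a, b):
    """Similarity ratio between two names, case-insensitive."""
    return SequenceMatcher(None, a.lower(), b.lower()).ratio()

records = ["Jon Smith", "John Smyth", "Joan Smitt", "Jane Doe"]
query = "John Smith"

scored = sorted(((name, score(query, name)) for name in records),
                key=lambda pair: pair[1], reverse=True)
probable = [(name, s) for name, s in scored if s > 0.8]
print(probable)  # probable matches first; "Jane Doe" falls below threshold
```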

Netrics is not a replacement for data quality – that remains a major gap in Tibco's product suite. The two-second advantage implies having data that is "good enough" when you perform event processing and must make snap decisions. But over the long haul, you'll need the kind of feedback loop and reality check on those decisions that business intelligence provides – and for that, you'll need data that is better scrubbed.

SOA what

There is a core disconnect between what gets analysts and journalists excited and what gains traction with the customers who consume the technologies that keep our whole ecosystem in business. OK, guilty as charged: we analysts get off on hearing about what's new and what's pushing the envelope, but that's the last thing that enterprise customers want to hear. Excluding reference customers (who have a separate set of motivations that often revolve around a vendor productizing something that would otherwise be custom developed), most want the tried and true, or at least innovative technology that has matured past the rough spots and is no longer version 1.

It's a thought that crystallized as we bounced impressions of this year's IBM SOA Impact event off colleagues like Dorothy Alexander and Marcia Kaufman, who shared perceptions that, while this year's headlines or trends seemed a bit anticlimactic, there was real evidence that customers were actually "doing" whatever it is that we associate with SOA.

Forget about the architectural journeys that you've heard accompany SOA; SOA is an enterprise architectural pattern that is a means to an end. It's not a new argument; it was central to the "SOA is dead" debate that flared up with Anne Thomas Manes' famous or infamous post of almost a year and a half ago, and the subsequent debates and hand-wringing that ensued.

IBM’s so-called SOA conference, Impact, doesn’t include SOA in its name, but until now SOA was the implicit rationale for this WebSphere middleware stack conference to exist. But more and more it’s about the stack that SOA enables, and more and more, about the composite business applications that IBM’s SOA stack enables. IBM won’t call it the applications business. But when you put vertical industry frameworks, business rules, business process management, and analytics together, it’s not simply a plumbing stack. It becomes a collection of software tools and vertical industry templates that become the new de facto applications that bolt atop and aside the core application portfolio that enterprises already have and are not likely to replace. In past years, this conference was used to introduce game changers, such as the acquisition of Webify that placed IBM Software firmly on the road to verticalizing its middleware.

This year the buzz was about something old becoming something new again. IBM's acquisition of Cast Iron, as dissected well by colleagues Dana Gardner and James Governor, reflects the fact that after all these years of talking about flattened architectures, especially using the ESB style, enterprise integration (or application-to-application, A2A) hubs never went out of style. There are still plenty of instances of packaged apps out there that need to be interfaced. The problem is no different from a decade ago, when the first wave of EAI hubs emerged to productize systems integration of enterprise packages.

While the EAI business model never scaled well in its time because it demanded too much customization, experience, the commoditization of templates, and the emergence of cheap appliances have since provided economic solutions to this model. More importantly, the emergence of multi-tenanted SaaS applications, like Salesforce.com, Workday and many others, has imposed a relatively stable target data schema plus a need for integration of cloud and on-premises applications. Informatica has made a strong run with its partnership with Salesforce, but Informatica is part of a broader data integration platform that for some customers is overkill. By contrast, niche players like Cast Iron, which only do data translation, have begun to thrive with a blue-chip customer list.
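
The pattern that made such integrations commoditizable is easier to see in miniature: a declarative field map from an on-premises schema to a stable SaaS target. The sketch below is hypothetical (the field names on both sides are invented), but it captures the template idea.

```python
# Hypothetical template-style integration: a declarative map from
# on-premises ERP columns to a stable SaaS CRM schema.

FIELD_MAP = {
    # on-prem ERP column -> SaaS CRM field
    "CUST_NO":   "AccountNumber",
    "CUST_NAME": "Name",
    "PHONE_1":   "Phone",
}

def transform(erp_row):
    """Translate one on-premises record into the SaaS target schema."""
    return {saas_field: erp_row[erp_col]
            for erp_col, saas_field in FIELD_MAP.items()}

print(transform({"CUST_NO": "10042", "CUST_NAME": "Acme Corp",
                 "PHONE_1": "555-0199"}))
# {'AccountNumber': '10042', 'Name': 'Acme Corp', 'Phone': '555-0199'}
```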

Of course, Cast Iron is not IBM's first appliance play. That distinction goes to DataPower, which originally made its name with specialized appliances that accelerated compute-intensive XML processing and SSL encryption. While we were thinking about potential synergy, such as applying some of DataPower's XML acceleration technology to A2A workloads, IBM's middleware head Craig Hayman responded to us that IBM saw Cast Iron's technology as a separate use case. But they did demonstrate that Cast Iron's software could, and would, literally run on DataPower's own iron.

Of course, you could say that Cast Iron overlaps the application connectors from IBM's Crossworlds acquisition, but those connectors, which were really overlay applications (Crossworlds used to call them "collaborations"), have been repurposed by IBM as BPM technology for WebSphere Process Server. Arguably, there is much technology from IBM's Ascential acquisition focused purely on data transformation that also overlaps here. But Cast Iron's value-add to IBM is the way those integrations are packaged, and the fact that they have been developed especially for integrations to and from SaaS applications – no more and no less. IBM has gained the right-sized tool for the job. IBM has decided to walk a safe tightrope here; it doesn't want to weigh down Cast Iron's simplicity (a key strength) with added bells and whistles from the rest of its integration stack. But the integration doesn't have to go in only one direction – weighing down Cast Iron with richer but more complex functionality. IBM could go the opposite way and infuse some of this A2A transformation as services that could be transformed and accelerated by the traditional DataPower line.

This is similar to an issue that IBM has faced with Lombardi, a deal that it closed back in January. They've taken the obvious first step of "blue washing" the flagship Lombardi Teamworks BPM product, which is now rebranded IBM WebSphere Lombardi Edition and bundled with WebSphere Application Server 7 and DB2 Express under the covers. The more pressing question is what to do with Lombardi's elegantly straightforward Blueprint process definition tool and IBM WebSphere BlueWorks BPM, which is more of a collaboration and best-practices definition tool than a modeling tool (and still in beta). The good news is that IBM is trying to do the right thing in not cluttering Blueprint (now rebranded IBM BPM Blueprint); the bad news is that there is still confusion from IBM's mixed messages of a consistent branding umbrella but uncertainty regarding product synergy or convergence.

Back to the main point however: while SOA was the original impetus for the Impact event, it is now receding to a more appropriate supporting role.

AmberPoint finally gets acquired

Thanks go out to Oracle this morning for finally putting us out of our suspense. AmberPoint was one of a dwindling group of still-standing independents delivering runtime governance for SOA environments.

It’s a smart move for Oracle as it patches some gaps in its Enterprise Manager offering, not only in SOA runtime governance, but also with business transaction management – and potentially – better visibility to non-Oracle systems. Of course, that visibility will in part depend on the kindness of strangers as AmberPoint partners like Microsoft and Software AG might not be feeling the same degree of love going forward.

We're surprised that AmberPoint was able to stay independent for as long as it had, because the task that it performs is simply one piece of managing the runtime. When you manage whether services are connecting and delivering the right service levels to the right consumers, you are ultimately looking at a larger problem, because services do not exist on their own desert island. Neither should runtime SOA governance. As we've stated again and again, it makes little sense to isolate runtime governance from IT service management. The good news is that with the Oracle acquisition, there are potential opportunities not only for converging runtime SOA governance with application management, but, as Oracle digests the Sun acquisition, for providing full visibility down to the infrastructure level.

But let’s not get ahead of ourselves here as the emergence of a unified, Oracle on Sun turnkey stack won’t happen overnight. And the challenge of delivering an integrated solution will be as much cultural as technical, as the jurisdictional boundary between software development and IT operations blurs. But we digress.

Nonetheless, over the past couple of years, AmberPoint itself has begun reaching out from its island of SOA runtime, as it extended its visibility to business transaction management. AmberPoint is hardly alone here, as we've seen a number of upstarts like AppDynamics or BlueStripe (typically formed by veterans of Wily and HP/Mercury) burrowing down into the space of instrumenting transactions from hop to hop. Transaction monitoring and optimization will become the next battleground of application performance management, and it is one that IBM, BMC, CA, HP, and Compuware are hardly likely to passively watch from the sidelines.

As for whether runtime SOA governance demands a Switzerland-style independent vendor approach, that leaves it up to the last one standing, SOA Software, to fight the good fight. Until now, AmberPoint and SOA Software have competed for the affections of Microsoft; AmberPoint has offered an Express web services monitoring product that is a free plug-in for Visual Studio (a version is also available for Java); SOA Software offers extensive .NET versions of its service policy, portfolio, repository, and service manager offerings.

Nonetheless, although AmberPoint isn't saying anything outright about the WebLogic share of its 300-customer installed base, that platform was first among equals when it came to R&D investment and presence. BEA previously OEM'ed the AmberPoint management platform, an arrangement that Oracle ironically discontinued; well, in this case, the story ends happily ever after. As for SOA Software, we would be surprised if this deal didn't push it into a closer embrace with Microsoft.

Postscript: Thanks to Anne Thomas Manes for updating me on AmberPoint's alliances. They are/were with SAP, Tibco, and HP, in addition to Microsoft. Their Software AG relationship has faded in recent years.

Of course all this M&A rearranges the dance floor in interesting ways. Oracle currently OEMs HP’s Systinet as its SOA registry, an arrangement that might get awkward now that Oracle’s getting into the hardware business. That will place into question virtually all of AmberPoint’s relationships.

BI and CEP: Sybase and Aleri postscript

OK, we've gotten off the horn with Sybase. It's obvious that for now, the deal simply formalizes the partnership that both companies (actually all three, if you also count Coral8) already had in place around Sybase's RAP platform. On one hand, Sybase has little work to do because the products in question already integrate with its offerings, but it still has its work cut out rationalizing the Aleri stack. At this point, Sybase has not yet thought about the broader synergies with its BI offerings, but as we state below, it is not simply about integrating products. It is about positioning BI as a broader portfolio that ranges from static, historical trend analysis to operational intelligence and feed-forward predictive analytics. You may not use the same tools or techniques, but in each case you are looking to synthesize insight by transforming data (as opposed to querying databases).

But it also brings back the debate over whether CEP and BI are converging or different creatures. A few months ago, Streambase CTO Richard Tibbetts responded to a Twitter discussion that involved us, stating that CEP and BI are different paradigms: BI implies the need for a human to make decisions, whereas CEP is about real-time data transformation. Our contention is that it's about ends, not means. So, true, CEP is not simply fast BI. It's a different form, which we'd characterize as adaptive, closed-loop intelligence that is event-driven. Your systems analyze torrents of data to make discoveries, and based on those discoveries, identify a set of events and apply policies to act on them. In turn, you can apply good old-fashioned BI to determine if your system is looking for the right events or applying the right policies. Or you can parse those streams to predict what will happen. That's still BI, because the action comes from analyzing and transforming data, but it's not your father's BI.
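
A minimal sketch of that closed loop, with the thresholds, event shapes, and policy all hypothetical: events are parsed as they stream in, a policy fires when a pattern is detected, and outcomes are logged so conventional BI can later judge whether the system watched for the right events.

```python
# Hypothetical closed-loop event processing: detect a pattern in-stream,
# act on a policy, and keep an audit trail for after-the-fact BI review.
from collections import deque

window = deque(maxlen=10)   # last N ticks of a (hypothetical) price feed
outcomes = []               # audit trail that BI reviews later

def act(policy, price):
    outcomes.append({"policy": policy, "price": price})  # the feedback loop

def on_tick(price):
    window.append(price)
    if len(window) == window.maxlen:
        drop = (window[0] - price) / window[0]
        if drop > 0.05:                # event: >5% slide across the window
            act("halt_orders", price)  # policy applied in real time

for p in [100, 101, 99, 98, 97, 96, 95, 94, 93, 92]:
    on_tick(p)
print(outcomes)  # BI later asks: did we halt at the right moments?
```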

Whatever Event Processing

Consolidation in the software business is like the force of gravity. Although there will always be best of breed solutions, ultimately as a particular solution space matures, it doesn’t do so in isolation. No technology is an island.

But of course, there's always been the question: what to do about Complex Event Processing (CEP)? Obviously, only a dropout from marketing could have ever dreamt up a product name whose emphatic message is, "this product is too difficult for you." (Actually, the name just evolved out of academic research.)

Research from our day job at Ovum revealed that there is no single set of use cases, but instead that there are uses for event processing that differ based on complexity of the event or the latency at which events must be processed.

The technology itself is not new – financial services firms have used their own routines to drive algorithmic trading or couch risk for years. What's new – as in new during the past 10 years – is that a commercial software market has evolved out of it. But the market has struggled because of a number of factors that start with the question, "What is CEP, or whatever you call it?"

For now we won't get hung up on names. Let's stick with Whatever Event Processing and invent a new TLA: WEP II, so as to distinguish it from Wired Equivalent Privacy. OK, we're just joking. Suffice it to say that we are talking about a technology that parses out events that would not otherwise be humanly perceptible and translates them into actionable intelligence.

Nonetheless, whatever you call it – complex event processing, business event processing, or just plain event processing – is a technology that was never meant to stand alone. There is logical synergy with business process management, as processes can trigger a chain of events or vice versa. There is a similar symbiosis with rules processing – you can use rules to parse and identify unique chains of events, or unique chains of events that are identified can trigger response through rules or policy management. And of course there is a synergy with SOA, as event processing can be exposed as a service that may be consumed by other applications.

But what became clear to us was that the very act of parsing and analyzing event streams, whether through time-based SQL approaches or through rules processing for identifying specific occurrences that must be dealt with, is a prime example of business intelligence. This form of BI does not replace other established uses, ranging from look-back historical analysis to quasi-real time BI where data warehouses or data marts are trickle fed data to keep them almost current. Instead, parsing events as they occur can provide a snapshot of what is occurring now, and form the basis for feed-forward predictive analytics.
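
The time-based SQL style mentioned above boils down to windowed aggregation over a stream. Here is a hypothetical sketch in plain Python of what a streaming query along the lines of "count orders per SKU in 60-second windows" computes (the event fields and window width are invented):

```python
# Tumbling-window aggregation: the core of time-based streaming SQL,
# approximated over a timestamped event stream. All data is hypothetical.
from collections import Counter

events = [  # (epoch_seconds, sku)
    (0, "A123"), (12, "A123"), (30, "B456"), (61, "A123"), (75, "B456"),
]

WINDOW = 60  # seconds per non-overlapping (tumbling) window

def tumble(stream, width):
    """Group events into fixed time windows and count per key."""
    windows = {}
    for ts, sku in stream:
        windows.setdefault(ts // width, Counter())[sku] += 1
    return windows

for w, counts in sorted(tumble(events, WINDOW).items()):
    print(f"window {w}: {dict(counts)}")
# window 0: {'A123': 2, 'B456': 1}
# window 1: {'A123': 1, 'B456': 1}
```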

Streambase will vehemently disagree with us, but the days of the standalone CEP vendor are over. More specifically, there is room for maybe one or two strong independents – just as Informatica and Teradata have survived as independents. (But as Seth Grimes has pointed out, that also leaves Teradata currently standing by the wall at a party without a CEP date.)

So our first take on Sybase's acquisition of Aleri is just that – an inevitable act of industry consolidation, just as Informatica recently swept up Agent Logic, initially to boost its U.S. federal business. Sybase and Aleri have already been on a first date: a two-year-old arrangement in which Sybase integrated Aleri's event processing engine as an input to its RAP trading platform. But secondly, the potential BI connection hasn't escaped us. Sybase could absorb the Aleri technology – actually there are three of them, which translates to a lot of migration and consolidation. More to the point, when "Sybaleri" grows up, it could add event processing to its BI portfolio as a piece of look-forward technology.

As (almost) the last man standing, Streambase is having its fun, offering an “amnesty” to Sybaleri customers. When we inquired whether Sybaleri customers were criminals, Streambase CEO Mark Palmer jokingly reassured us that they were just “misguided souls.”

We’ll be speaking with Sybase and Streambase later this afternoon and may update this post later.

Oracle’s Sun Java Strategy: Business as Usual

In an otherwise pretty packed news day, we’d like to echo @mdl4’s sentiments about the respective importance of Apple’s and Oracle’s announcements: “Oracle finalized its purchase of Sun. Best thing to happen to Sun since Java. Also: I don’t give a sh#t about the iPad. I said it.”

There's little new in observing that, on the platform side, Oracle's acquisition of Sun is a means for turning the clock back to the days of turnkey systems in a post-appliance era. History truly has come full circle, as Oracle in its original database incarnation was one of the prime forces that helped decouple software from hardware. Fast forward to the present, and customers are tired of complexity and just want things that work. Actually, that idea was responsible for the emergence of specialized appliances over the past decade for performing tasks ranging from SSL encryption/decryption to XML processing, firewalls, email, and specialized web databases.

The implication here is that the concept is elevated to the enterprise level; instead of a specialized appliance, it's your core instance of Oracle databases, middleware, or applications. And even there, it's but a logical step forward from Oracle's past practice of certifying specific configurations of its database on Sun (Sun was, and now has become again, Oracle's reference development platform). That's in essence the argument for Oracle to latch onto a processor architecture that is overmatched in investment by Intel for the x86 line. The argument could be raised that in an era of growing interest in cloud, Oracle is fighting the last war. That would be the case – except for the certainty that your data center has just as much chance of dying as your mainframe.

At the end of the day, it’s inevitably a question of second source. Dana Gardner opines that Oracle will replace Microsoft as the hedge to IBM. Gordon Haff contends that alternate platform sources are balkanizing as Cisco/EMC/VMware butts their virtualized x86 head into the picture and customers look to private clouds the way they once idealized grids.

The highlight for us was what happens to Sun's Java portfolio, and as it turns out, the results are not far from what we anticipated last spring: Oracle's products remain the flagship offerings. From looking at respective market shares, it would be pretty crazy for Oracle to have done otherwise.

The general theme was that – yes – Sun’s portfolio will remain the “reference” technologies for the JCP standards, but that these are really only toys that developers should play with. When they get serious, they’re going to keep using WebLogic, not Glassfish. Ditto for:
• Java software development. You can play around with NetBeans, which Oracle’s middleware chief Thomas Kurian characterized as a “lightweight development environment,” but again, if you really want to develop enterprise-ready apps for the Oracle platform, you will still use JDeveloper, which of course is written for Oracle’s umbrella ADF framework that underlies its database, middleware, and applications offerings. That’s identical to Oracle’s existing posture with the old (mostly) BEA portfolio of Eclipse developer tools. Actually, the only thing that surprised us was that Oracle didn’t simply take NetBeans and set it free – as in donating it to Apache or some more obscure open source body.
• SOA, where Oracle’s SOA Suite remains front and center while Sun’s offerings go on maintenance.

We’re also not surprised as to the prominent role of JavaFX in Oracle’s RIA plans; it fills a vacuum created when Oracle terminated BEA’s former arrangement to bundle Adobe Flash/Flex development tooling. In actuality, Oracle has become RIA agnostic, as ADF could support any of the frameworks for client display, but JavaFX provides a technology that Oracle can call its own.

There were some interesting distinctions in identity and access management, where Sun inherited some formidable technologies that, believe it or not, originated with Netscape. Oracle Identity Management will grab some provisioning technology from the Sun stack, but otherwise Oracle's suite will remain the core attraction. But Sun's identity and access management won't be put out to pasture, as it will be promoted for midsized web installations.

There are much bigger pieces to Oracle's announcements, but we'll finish with what becomes of MySQL. In short, there's nothing surprising about the announcement that MySQL will be maintained in a separate open source business unit – the EU would not have allowed otherwise. But we've never bought into the story that Oracle would kill MySQL. Both databases aim at different markets. Just about the only difference that Oracle's ownership of MySQL makes – besides reuniting it under the same corporate umbrella as the InnoDB data store – is that, well, like yeah, MySQL won't morph into an enterprise database. Then again, even if MySQL had remained independent, it arguably was never going to evolve into the same class of database as Oracle, because the product would have lost its beloved simplicity.

The more relevant question for MySQL is whether Oracle will fork development to favor Solaris on SPARC. This being open source, there would be nothing stopping the community from taking the law into its own hands.

BPM Pure Play days numbered with Progress acquisition of Savvion

Is it more than coincidence that acquisitions tend to come in waves? Just weeks after IBM's announcement to snap up Lombardi just before Christmas, Progress responds with an agreement to put Savvion out of its misery. In such a small space that is undergoing active consolidation, it is hard not to know who's in play.

Nonetheless, Progress’s acquisition confirms that BPM’s pure play days are numbered, if you expect executable BPM.

The traditional appeal of BPM was that it was a business stakeholder-friendly approach to developing solutions that didn’t rely on IT programmatic logic. The mythology around BPM pure-plays was that these were business user, not IT-driven software buys. In actuality, they simply used a different language or notation: process models with organizational and workflow-oriented semantics as opposed to programmatic execution language. That stood up only as long as you used BPM to model your processes, not automate them.

Consequently, it is not simply the usual issues of vendor size and viability that are driving IT stack vendors to buy up BPM pure plays. It is that, but more importantly, if you want your BPM tool to become more than documentware or shelfware, you need a solution with a real runtime. And that means you need IT front and center, and the stack people right behind it. Even with the emergence of BPMN 2.0, which adds support for executables, the cold hard fact is that anytime anything executes in software, IT must be front and center. So much for bypassing IT.

Progress's $49 million offer is a great exit strategy for Savvion. The company, although profitable, has grown very slowly over its 15 years. Even assuming the offer was at a 1.5x multiple, Savvion's extremely modest revenue base is not exactly something that a large global enterprise could gain confidence in. Savvion was in a challenging segment: a tiny player contending for enterprise, not departmental, BPM engagements. If you are a large enterprise, would you stake your enterprise BPM strategy on a slow-growing player whose revenues are barely north of $10 million? It wasn't a question of whether, but when, Savvion would be acquired.

Of course that leads us to the question as to why Progress couldn’t get its hands on Savvion in time to profit from Savvion’s year-end deals. It certainly would have been more accretive to Progress’ bottom line had they completed this deal three months ago (long enough not to disrupt the end of year sales pipeline).

Nonetheless, Savvion adds a key missing piece for Progress's Apama event processing strategy (you can read Progress/Apama CTO John Bates' rationale here). There is a symbiotic relationship between event processing and business process execution; you can have events trigger business processes, or vice versa. There is some alignment with the vertical industry templates that both have been developing, especially for financial services and telcos, which are the core bastions (along with logistics) for EP. And with the Sonic ESB, Progress has a pipeline for ferrying events.

In the long run, there could also be a legacy renewal play by using the Savvion technology to expose functionality for Progress OpenEdge or DataDirect customers, but wisely, that is now a back burner item for Progress which is not the size of IBM or Oracle, and therefore needs to focus its resources.

Although Progress does not call itself a stack player, it is evolving de facto stacks in capital markets, telcos, and logistics.

Event processing, a.k.a. Complex Event Processing (CEP, a forbidding label) or Business Event Processing (a friendlier label that actually doesn't mean much), is still an early adopter market. In essence, this market fulfills a niche where events are not humanly detectable and require some form of logic to identify and then act upon. The market itself is not new; capital markets have developed homegrown event processing algorithms for years. What's new (as in, what's new in the last decade) is that this market has started to become productized. More recently, SQL-based approaches have emerged to spread high-end event processing to a larger audience.

Acquiring Savvion ups the stakes with Tibco, which has a similar pairing of technologies in its portfolio. Given ongoing consolidation, that leaves Pegasystems, Appian, Active Endpoints, plus several open source niche pure plays still standing. Like Savvion, Pega is also an enterprise company, but it is a public company with roughly 10x Savvion's revenues, and it has still managed to grow in the 25% range in spite of the recession. While in one way it might make a good fit with SAP (both have their own entrenched, proprietary languages), Pega is stubbornly independent and SAP is acquisition-averse. Pega might be a fit with one of the emerging platform stack players like EMC or Cisco. On second thought, the latter would be a much more logical target for web-based Appian or even Active Endpoints, both still venture-funded but also promising growth players that at some point will get swept up.