10.02.11

Tibco: The Best Surprise is No Surprise

Posted in BPM, Business Intelligence, Data Management, Fast Data, Middleware at 8:20 pm by Tony Baer

Tibco has been firing on all cylinders of late. In earnings and revenues, it has kept up with the Joneses in the enterprise software neighborhood, posting respectable 25% revenue and 30+% software license growth in its most recent quarter, year over year, as we've noted in several of our recent Ovum research notes.

It is beginning to make the turn from its geeky roots towards more solution selling to the business side, in tone and deed. Ever since the 2007 Spotfire acquisition – which brought real-time analytic visualizations – it has made several buys targeted more at the business than strictly the IT or CIO side. They include Netrics, for fuzzy logic technologies for pattern matching; Loyalty Lab, for managing customer affinity programs; and Nimbus, a recent addition, which adds process discovery and management of the manual activities that comprise the other 80% of what happens inside an enterprise.

Of course it's not as if Tibco were trying to pull an HP by doing a 180 on its business strategy (heaven forbid, we don't need any more senseless Silicon Valley soap operas!). Infrastructure plays, such as FTL ultra-low-latency messaging or the DataSynapse data grid, remain core to Tibco's 2-second advantage mission. It's just that, in modest but growing cases, the raw technology is being packaged as a black box underneath more business-focused solutions. For instance, Tibco is packaging solutions for retail such as Active Catalog and Active Fulfillment that under the hood bundle Tibco Business Events (CEP), ActiveMatrix BPM, and other pieces.

Of course, such transformations don't happen overnight, as field sales must get up to speed and become accustomed to calling on new entry points at target prospects. Not surprisingly, Tibco is also ramping up vertical solutions, but on an opportunistic basis. An example: we met with a European telco customer that is using Business Events for monitoring devices (in this case, water meters), which may present an opportunity for Tibco to develop an M2M (machine-to-machine) event-driven integration solution that could be more widely applied to segments such as utilities or logistics/transportation.

Several of its recent acquisitions, such as Foresight, a healthcare payer EDI gateway, and OpenSpirit, which provides data integration for upstream oil and gas processing, are strictly vertical plays. Loyalty Lab, which provides analytics for customer affinity programs, has helped make retail one of its fastest growing verticals, coming from a near-zero client base a few years back.

Tibco is traveling a similar road as IBM, but is starting from a much earlier point in developing vertical solutions. As Tibco lacks the professional services presence of IBM, it has to cherry-pick its vertical opportunities.

At this point, the major disrupters for Tibco are big data and mobility.

For mobile, the challenge is delivering alerts from Tibco's Business Events and Spotfire engines to clients; tibbr, its internal collaboration messaging platform, provides the logical environment for bringing its event feeds out to mobile devices. This could be bolstered by the recent Nimbus acquisition, both for input (process discovery, using mobile devices to snap a picture, for instance) and output (communicating how to perform manual processes out to the field).

Big data positioning and productization for Tibco is also a work in progress. Its message buses can in some cases handle enormous amounts of data; its business events engine could also provide feeds if Tibco can make the sensing agent more lightweight; and its BPM offering could be configured to trigger on specific event patterns that may involve crunching enormous volumes of event feeds.

But there is a brave new world of variably structured data that is becoming fair game for enterprises to sense and respond to. We don't expect Tibco to buy its own Advanced SQL platform or create its own Hadoop distribution, as Tibco is not about data at rest, nor is it a database player (OK, its MDM offering does have to store master and reference data). Nonetheless, delivering the 2-second advantage in a world where the data is getting bigger and more heterogeneous raises the urgency for Tibco to distinguish itself by extending its visibility.

When the executive marketing team asked us for our impressions this year, our thoughts were, well, that there was hardly anything newsworthy. That's not necessarily a bad thing: during a strategy roadmap presentation at this year's Tibco TUCON conference, a timeline of Tibco acquisitions showed roughly a half dozen entries for 2010 and just one for this year. Over the past year Tibco has been preoccupied with absorbing its new acquisitions and so – Nimbus excluded – has not been active on this front lately. For instance, Tibco has integrated the Netrics fuzzy pattern matching engine into Business Events, where it belongs. It has similarly blended the recently acquired data grid technology with Business Events. Check out Sandy Kemsley's post for a more detailed blow-by-blow on how Tibco has rounded out its product portfolio over the past year.

With the swoon on Wall Street, Tibco has left its $250 million cash stash alone, in spite of the fact that there are plenty of acquisition targets available at reasonable prices right now as a lot of venture funds are looking for exits. In its CFO's words, the company is not as enormous as IBM or Oracle, where acquisitions don't disrupt the entire company. Nonetheless, we expect that 2012 will be a more active year for acquisitions – we hope that a data quality provider makes the top of the shopping list.

04.15.11

A Week of BPM

Posted in BPM, Cloud, Enterprise Applications, Enterprise Integration, Middleware at 2:54 am by Tony Baer

We're closing out a week that's been bookended by a couple of BPM-related announcements from IBM and upstart Active Endpoints. The common thread is the quest for greater simplicity; the difference is that one de-emphasizes the B word itself, with a focus on being a simple productivity tool that makes complex enterprise apps (in this case Salesforce.com) easier, while the other amplifies a message loud and clear that this is the BPM that you always wanted BPM to be.

The backstory is that BPM has traditionally appealed to a narrow audience of specially skilled process analysts and modelers, and has yet to achieve mass market status. Exhibit One: Pegasystems, one of the larger independent BPM players still standing, is a $330 million company. In the interim, there has been significant consolidation in this community as BPM has become one of the components expected in a middleware stack. A milestone was Oracle's latest Fusion BPM 11g R1, which unifies the engines and makes them first-class citizens of the Fusion middleware stack. While such developments have solidified BPM's integration story, on its own that has not made BPM the enterprise's next ERP.

The irony is that BPM has long been positioned as the business stakeholder's path to application development, with the implication that a modeling/development environment using the terms of business process rather than programmatic commands should appeal to a broader audience. The drawback is that to get there, most BPM tools relied on proprietary languages that limited use to… you guessed it… a narrow cadre of business process architects.

Just over a year ago, IBM acquired Lombardi, one of the more innovative independents that has always stressed simplicity. Not that IBM was lacking in BPM capability, but its existing offerings were based on aging engines centered respectively around integration and document management/workflow use cases. As IBM software has not been known for its simplicity (many of its offerings still consist of multiple products requiring multiple installs, or a potpourri of offerings targeted for separate verticals or use cases), the fear was that Lombardi would get swallowed alive and emerge unrecognizable.

The good news is that Lombardi's technology and product have remained alive and well. We'll characterize IBM Business Process Manager 7.5 as Lombardi giving IBM's BPM suite a heart transplant: it dropped in a new engine to reinvigorate the old. As for the peanut gallery, Janelle Hill characterized it as IBM getting a Lombardotomy; Neil Ward-Dutton and Sandy Kemsley described it as a reverse takeover; Ward-Dutton debated Clay Richardson over whether the new release was more than just a new paint job, while Bruce Silver assessed it as IBM's BPM endgame.

So what did IBM do in 7.5? The modeling design environment and repository are Lombardi’s, with the former IBM WebSphere Process Server (that’s the integration BPM) now being ported over to the Lombardi engine. It’s IBM’s initial answer to Oracle’s unification of process design and runtimes with the latest Fusion BPM.

IBM is not all the way there yet. Process Server models, which were previously contained in a flat file, are now stored in the repurposed Lombardi repository, but that does not yet make the models fully interoperable. You can just design them in the same environment and store them in the same repository. That’s a good start, and at least it shows that the Lombardi approach will not get buried. Beyond the challenge of integrating the model artifacts, the bigger question to us is whether the upscaling and extending of Lombardi to cover more use cases might be too much of a good thing. Can it avoid the clutter of Microsoft Office that resulted from functional scope creep?

As for FileNet, IBM's document-centric BPM, that's going to wait. It's a very different use case – and note that even as Oracle unified two of its BPM engines, it notably omitted the document management piece. Document-centric workflow is a well-ingrained use case and has its own unique process patterns, so the open question is whether it is realistic to expect that such a style of process management can fit in the same engine, or whether it should simply exist as external callable workflows.

At the other end of the week, we were treated to the unveiling of Active Endpoints' new Cloud Extend release. As we tweeted, this is a personality transplant for Active Endpoints, as the company's heritage has been with the geeky BPEL standard and an even geekier-branded tool, ActiveVOS.

OK, the Cloud branding is a break from geekdom, but it ain’t exactly original – there’s too much Cloud this and Cloud that going around the ISV community these days.

More to the point, Cloud Extend does not really describe what Active Endpoints is doing with this release. Cloud Extend is not cloud-enabling your applications; it is just enabling an application that, in this case, happens to run in the cloud (the company also has an on-premises version of this tool with the equally unintuitive brand Socrates).

In essence, Cloud Extend adds a workflow shell to Salesforce.com so that you can design workflows in a manner that appears as simple as creating a Visio diagram, while providing the ability to save and reuse them. There's BPEL and BPMN underneath the hood, but in normal views you won't see them. It also has neat capabilities that help you filter out extraneous workflow activities when working on a particular process. The result is screens that drive users to interact with Salesforce in a consistent manner, replacing the custom coding of systems integrators with a tool that should help those same integrators perform the Salesforce customization job quicker. Clearly, Salesforce should be just the first stop for this technology; we'd expect Active Endpoints to target other enterprise apps with its engine in due course.

We hate to spill the beans, but under the covers, this is BPM. But that’s not the point – and in fact it opens an interesting argument as to whether we should simply take advantage of the technology without having to make a federal case about it. It’s yet another approach to make the benefits of BPM more accessible to people who are not modeling notation experts, which is a good thing.

12.01.10

IBM’s Software Complex

Posted in Application Development, BPM, Business Intelligence, Database, Enterprise Applications, Enterprise Integration, IT Services & Systems Integration, Middleware, Technology Market Trends at 1:28 am by Tony Baer

Sometimes the news is that there is no news. Well, Steve Mills did tell us that IBM is investing the bulk of its money in software and that between now and 2015, it would continue to make an average of $4 – 5 billion worth of strategic acquisitions per year. In other words, it would continue its current path, making acquisitions for the strategic value of technology with the guideline of having them become revenue accretive within 2 – 4 years. Again, nothing new, as if there were anything wrong with that.

The blend of acquisition and organic development is obviously bulking up the Software Group's product portfolio, which in itself is hardly a bad thing; there is more depth and breadth. But the issue that IBM has always faced is that of complexity. The traditional formula has always been: we have the pieces, and we have the services to put them together for you. Players like Oracle compete with a packaged apps strategy; in more specialized areas such as project portfolio management, rivals like HP and CA Technologies counter that they have one product where IBM splits it in two.

IBM continues to deny that it is in the apps business, but it shows architectural slides of a stack based on middleware, along with horizontal "solutions" such as the SPSS Decision Manager offering (more about that shortly); vertical industry frameworks, which specify processes, best practices, and other assets that can be used to compose industry solutions; and, at the top of the stack, solutions that IBM and/or its services group develops. It's at the peak of the stack that the difference between "solutions" and "applications" becomes academic. Reviewing Oracle's yet-to-be-released Fusion applications, we see a similar architecture that composes solutions from modular building blocks.

So maybe IBM feels self-conscious about the term application as it doesn’t want to be classed with Oracle or SAP, or maybe it’s the growing level of competition with Oracle that made Mills rather defensive in responding to an analyst’s question about the difference between IBM’s and Oracle’s strategy. His response was that IBM’s is more of a toolkit approach that layers atop the legacy that will always be there, which is reasonable, although the tone was more redolent of “you [Oracle] can’t handle the truth.”

Either way, whether you sell a solution or a packaged application at the enterprise level, assembly will still be required. Services will be needed to integrate the pieces and/or train your people. Let's be adults and put that debate behind us. For IBM, it's time to get back to Issue One: defusing complexity. When you're dealing with enterprise software, there will always be complexity. But when the choice is richness or simplicity, IBM tends to aim for the former. The dense slides with small print made the largely middle-aged analyst audience more self-conscious than normal of the inadequacies of their graduated eyeglasses or contacts.

OK, if you're IBM facing an analyst crowd, you don't want to oversimplify the presentation into the metaphorical equivalent of the large-print weekly for the visually impaired. You must prove that you have depth. You need a memorable, coherent message (Smarter Planet was great when it debuted two years ago). But most importantly, you need coherent packaging and delivery to market.

IBM Software Group has done a good job of repurposing technologies across brands to fill defined product needs; it still has its work cut out for its goal of making the software group sales force brand agnostic. That is going to take time.

As a result, good deeds don't go unpunished, with IBM's challenges around SPSS Decision Manager a case in point. The product, an attempt to craft a wider market for SPSS capabilities, blends BI analytics from Cognos, rules management from Ilog, and event processing from WebSphere Business Events into a predictive analytics solution, aimed at line-of-business users, for fashioning business strategy.

For instance, if you are in the claims processing group of an auto insurance company, you can use form-based interfaces to vary decision rules and simulate the results, to ensure that accident calls from 19-year-old drivers, or from those who have not yet contacted the police, are not fast-tracked for settlement.
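To make that concrete, here's a minimal sketch of the kind of fast-track rule the claims example describes. The class, field names, and thresholds are our own illustration, not IBM's API; in Decision Manager a business user would vary these conditions through forms and simulations rather than write code.

```java
// Hypothetical illustration of the claims fast-track decision described above.
// The names (Claim, fastTrack) and thresholds are ours, not IBM's API.
public class FastTrackRules {

    static class Claim {
        int driverAge;
        boolean policeContacted;
        double estimatedDamage;

        Claim(int driverAge, boolean policeContacted, double estimatedDamage) {
            this.driverAge = driverAge;
            this.policeContacted = policeContacted;
            this.estimatedDamage = estimatedDamage;
        }
    }

    // A claim is fast-tracked for settlement only if none of the
    // simulated risk conditions apply.
    static boolean fastTrack(Claim c) {
        if (c.driverAge < 21) return false;   // young-driver rule
        if (!c.policeContacted) return false; // no police report yet
        return c.estimatedDamage < 5000.0;    // small-claim threshold
    }

    public static void main(String[] args) {
        Claim c = new Claim(19, false, 1200.0);
        System.out.println("Fast track? " + fastTrack(c)); // false: two rules trip
    }
}
```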

The problem with Decision Manager is that it is not a single SKU or install; IBM has simply pre-integrated components that you still must buy a la carte. IBM Software is already integrating product technologies; it now needs to attend to integrating delivery.

09.23.10

Stack envy: Impressions for Oracle OpenWorld 2010

Posted in Application Development, BPM, Business Intelligence, Data Management, Database, Enterprise Applications, Enterprise Integration, Java, Middleware, SaaS (Software as a Service), SOA & Web Services at 2:19 am by Tony Baer

Last year, the anticipation of the unveiling of Fusion apps was palpable. Although we’re not finished with Oracle OpenWorld 2010 yet – we still have the Fusion middleware analyst summit tomorrow and still have loose ends regarding Oracle’s Java strategy – by now our overall impressions are fairly code complete.

In his second conference keynote – which unfortunately turned out to be almost a carbon copy of his first – Larry Ellison boasted that they “announced more new technology this week than anytime in Oracle’s history.” Of course, that shouldn’t be a heavy lift given that Oracle is a much bigger company with many more products across the portfolio, and with Sun, has a much broader hardware/software footprint at that.

On the software end – and post-Sun acquisition, we have to make that distinction – it's hard to follow up last year's unveiling of Fusion apps. The Fusion apps are certainly a monster in size, with over 5,000 tables and 10,000 task flows, representing five years of development. Among other things, the embedded analytics provide the context long missing from enterprise apps like ERP and CRM, which previously required you to slip into another module as a separate task. There is also good integration of process modeling, although BPM models developed using either of Oracle's modeling tools won't be executable; for now, Fusion apps will not change the model-then-develop paradigm.

A good sampling of coverage and comment can be found from Ray Wang, Dennis Howlett, Therese Poletti, Stefan Ried, and for the Java side, Lucas Jellema.

The real news is that Fusion apps, excluding manufacturing, will be in limited release by year end and general release in Q1. That’s pretty big news.

But at the conference, Fusion apps took a back seat to Exadata, the HP-based (and soon to be SPARC-based) database appliance unveiled last year, and Exalogic, the cloud-in-a-box unwrapped this year. It's no mystery that growth in the enterprise apps market has been flat for quite some time, with the main greenfield opportunities going forward being midsized businesses and the BRIC countries. Yet Fusion apps will be overkill for small to midsized enterprises that don't need such a rich palette of functionality (NetSuite is more likely their speed), which leaves the emerging economies as the prime growth target. The reality is most enterprises are not about to replace the very ERP systems that they implemented as part of modernization or Y2K remediation efforts a decade ago. At best, Fusion will be a gap filler, picking up where current enterprise applications leave off, which provides a potential growth opportunity for Oracle, but not exactly a blockbuster one.

Nonetheless, as Oracle was historically a software company, the bulk of attendees along with the press and analyst community (including us) pretty much tuned out all the hardware talk. That likely explains why, if you subscribed to the #oow10 Twitter hashtag, you heard nothing but frustration from software bigots like ourselves who got sick of the all-Exadata/Exalogic-all-the-time treatment during the keynotes.

In a memorable metaphor, Ellison stated that one Exalogic device can schedule the entire Chinese rail system, and that two of them could run Facebook – to which a Twitter user retorted, how many enterprises have the computing load of a Facebook?

Frankly, Larry Ellison has long been at the point in his life where he can afford to disregard popular opinion: give a pure hardware talk Sunday night, then do it almost exactly again on Wednesday (although on the second go-round we were also treated to a borscht belt routine taking Salesforce's Marc Benioff down more than a peg on who has the real cloud). Who is going to say no to the guy who sponsored and crewed on the team that won the America's Cup?

But if you look at the dollars and sense opportunity for Oracle, it’s all about owning the full stack that crunches and manages the data. Even in a recession, if there’s anything that’s growing, it’s the amount of data that’s floating around. Combine the impacts of broadband, sensory data, and lifestyles that are becoming more digital, and you have the makings for the data counterpart to Metcalfe’s Law. Owning the hardware closes the circle. Last year, Ellison spoke of his vision to recreate the unity of the IBM System 360 era, because at the end of the day, there’s nothing that works better than software and hardware that are tuned for each other.

So if you want to know why Ellison is talking about almost nothing else except hardware, it’s not only because it’s his latest toy (OK, maybe it’s partly that). It’s because if you run the numbers, there’s far more growth potential to the Exadata/Exalogic side of the business than there is for Fusion applications and middleware.

And if you look at the positioning, owning the entire stack means deeper account control. It's the same strategy behind the entire Fusion software stack, which uses SOA to integrate internally and with the rest of the world. But Fusion apps and middleware remain optimized for an all-Oracle Fusion environment, underpinned by a declarative Application Development Framework (ADF) and tooling designed specifically for that stack.

So on one hand, Oracle’s pitch that big database processing works best on optimized hardware can sound attractive to CIOs that are seeking to optimize one of their nagging points of pain. But the flipside is that, given Oracle’s reputation for aggressive sales and pricing, will the market be receptive to giving Oracle even more control? To some extent the question is moot; with Oracle having made so many acquisitions, enterprises that followed a best of breed strategy can easily find themselves unwittingly becoming all-Oracle shops by default.

Admittedly, the entire IT industry is consolidating, but each player is vying for different combinations of the hardware, software, networking, and storage stack. Arguably, applications are the most sensitive layer of the IT technology stack because that is where the business logic lies. As Oracle asserts greater claim to that part of the IT stack and everything around it, it requires a strategy for addressing potential backlash from enterprises seeking second sources when it comes to managing their family jewels.

05.24.10

IBM offers to buy Sterling Commerce

Posted in BPM, Enterprise Applications, Enterprise Integration, IT Services & Systems Integration, Middleware, SOA & Web Services, Supply Chain Management at 4:19 pm by Tony Baer

We should have seen this one coming. IBM's offer to buy Sterling Commerce for $1.4 billion from AT&T closes a major gap in the WebSphere portfolio, extending IBM's array of internal integrations externally to B2B. It's a logical extension, and IBM is hardly the first to travel this path: webMethods began life as a B2B integration firm before morphing into EAI, and later SOA and BPM middleware, before getting acquired by Software AG. In turn, Tibco recently added Foresight Software as an opportunistic extension for taking advantage of a booming market in healthcare B2B transactions.

But neither Software AG's nor Tibco's moves approach the scope of Sterling Commerce's footprint in B2B trading partner management, a business that grew out of its heritage as one of the major EDI (electronic data interchange) hubs. The good news is the degree of penetration that Sterling has; the other (we won't call it "bad") news is all the EDI legacy, which provides great fodder for IBM's Global Business Services arm to address a broader application modernization opportunity.

Sterling’s base has been heavily in downstream EDI and related trading partner management support for retailers, manufacturers, and transportation/freight carriers. Its software products cover B2B/EDI integration, partner onboarding into partner communities (an outgrowth of the old hub and spoke patterns between EDI trading partners), invoicing, payments, order fulfillment, and multi-channel sales. In effect, this gets IBM deeper into the supply chain management applications market as it already has Dynamic Inventory Optimization (DIOS) from the Maximo suite (which falls under the Tivoli umbrella), not to mention the supply chain optimization algorithms that it inherited as part of the Ilog acquisition which are OEM’ed to partners (rivals?) like SAP and JDA.

Asked if acquisition of Sterling would place IBM in competition with its erstwhile ERP partners, IBM reiterated its official line that it picks up where ERP leaves off – but that line is getting blurrier.

But IBM’s challenge is prioritizing the synergies and integrations. As there is still a while before this deal closes – approvals from AT&T shareholders are necessary first – IBM wasn’t about to give a roadmap. But they did point to one no-brainer: infusing IBM WebSphere vertical industry templates for retail with Sterling content. But there are many potential synergies looming.

At top of mind are BPM and business rules management, which could make trading partner relationships more dynamic. There are obvious opportunities for WebSphere Business Modeler's Dynamic Process Edition, WebSphere Lombardi Edition's modeling, and/or Ilog's business rules. For instance, a game-changing event such as Apple's iPad entering or creating a new market for tablets could provide the impetus for changes to product catalogs, pricing, promotions, and so on; a BPM or business rules model could facilitate such changes as an orchestration layer acting in conjunction with the Sterling multi-channel and order fulfillment suites. Other examples include master data management, which can be critical when managing the sale of families of like products through the channel; and of course Cognos BI, which can be used for evaluating the profitability or growth potential of B2B relationships.
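As a rough sketch of that orchestration pattern – our own illustration, not IBM or Sterling code – a market-disrupting event might flow through a simple rule that flags every affected catalog entry for repricing:

```java
import java.util.List;

// Our own sketch of the event-to-rules orchestration pattern described above;
// the class names are illustrative, not IBM WebSphere or Sterling APIs.
public class CatalogOrchestration {

    record MarketEvent(String category, String description) {}
    record CatalogItem(String sku, String category, double price) {}

    // A simple rule: when a disruptive event hits a category, mark every
    // item in that category down 15% pending a promotion review.
    static CatalogItem reprice(CatalogItem item, MarketEvent event) {
        if (item.category().equals(event.category())) {
            return new CatalogItem(item.sku(), item.category(), item.price() * 0.85);
        }
        return item;
    }

    public static void main(String[] args) {
        MarketEvent ipadLaunch =
                new MarketEvent("tablet", "new entrant redefines category");
        List<CatalogItem> catalog = List.of(
                new CatalogItem("TAB-100", "tablet", 499.0),
                new CatalogItem("LAP-200", "laptop", 899.0));
        catalog.stream()
               .map(item -> reprice(item, ipadLaunch))
               .forEach(item -> System.out.println(item.sku() + " -> $" + item.price()));
    }
}
```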

Altimeter Group's Ray Wang voiced a question that was on many of our minds: why would AT&T give up Sterling? IBM responded about the potential partnership opportunities, but to our mind, AT&T has its hands full attaining network parity with Verizon Wireless and is just not a business solutions company.

05.12.10

Tibco’s Hits and Misses

Posted in Application Development, BPM, Business Intelligence, Data Management, Enterprise Integration, Middleware, SOA & Web Services at 2:35 am by Tony Baer

Messaging is Tibco's business, but it has had a mixed track record when it comes to crafting the messaging around its message-oriented view of the world. It starts off on the right foot. Its perennial tagline, The Power of Now, has become timelier in a world where the need to respond quickly is reinforced by the headlines. Just take the last couple of weeks, for example: last Thursday's weird Wall Street meltdown, and before that, the arrest of the foiled Times Square bomber.

Tibco is hardly alone in voicing such messaging. IBM's Smarter Planet and Progress Software's Operational Responsiveness are also about the need for systems that think on their feet. Yet Tibco's DNA gives it a unique claim to this space, as the company was born around fast, reliable messaging. Two years ago, Tibco CEO Vivek Ranadive made the case for event-driven predictive intelligence. Now Tibco is talking about the need, in global marketing head Ram Menon's words, "to humanize the story better." That's always been a stretch for this technology-driven company whose vision has long been driven by a shy, technology-centric CEO. Towards that goal, Tibco is taking a step or two forward, but unfortunately also a step back.

The good news is that Tibco is fleshing out its "Power of Now" tagline. We saw the first of a new series of simple, straightforward visual ads with short statements of business outcomes, like how Tibco's event processing helps defense agencies clear the fog of war, underscored by the tagline.

Then Tibco unveiled a new tagline, the Two-Second Advantage, which makes the case that if you have just enough information quickly enough, you don't need the complete picture to make the right decision. Tibco's on a roll there, a message backed up by the surprisingly irreverent Ronald K. Noble, the brash New Yorker who heads Interpol, who made the case that such an advantage can have life-or-death implications in crime fighting, especially when it comes to border control.

The problem is that just when you think Tibco has finally gotten on message, it reverts to its geeky self and steps on it. The latest case is its CEO's Enterprise 3.0 concept which, when debuted in front of a room of analysts, went over like a lead balloon.

His numbering is oversimplified and cuts against popular perception: Ranadive terms Enterprise 2.0 as client/server, rather than the social computing weave that is now seeping into enterprise systems – including Tibco's. But bloggers of record Sandy Kemsley and Brenda Michelson summed it up best. Kemsley: "Enterprise 3.0 is becoming a joke amongst the analysts attending here today (we're tweeting about staging an intervention)…" Michelson: "We like the 2 second advantage message, but 'Enterprise 3.0' doesn't resonate, it won't be meaningful to Business Execs and CIOs."

Tibco doesn’t need new messaging, it just has to bring out the best of what it already has. It can humanize “The Power of Now” by appending the question, “What does it really mean?” And from that, “The two-second advantage,” and all the business cases that manifest it, become the logical response.

‘Nuff said.

With acquisitions and organic product development, Tibco's portfolio is broader than ever, and not surprisingly, this year's event carried announcements of a large number of product upgrades and introductions. For us the highlight was ActiveMatrix BPM, which finally puts Tibco's business process management engine on the same Eclipse development platform and runtime as the rest of Tibco's service orchestration products. It is a completely new product (this is not iProcess, which becomes Tibco's legacy BPM offering, rooted in the original Staffware acquisition). This is a major development for a vendor that has accumulated a large portfolio of individual products over the years, with the harshest critique being the need for multiple runtime engines: ActiveMatrix, iProcess, BusinessWorks, Rendezvous, EMS, etc. The new BPM offering fills a critical gap in the SOA-oriented ActiveMatrix product family.

Our critique here – as with IBM – is that the use of Eclipse as the design-time platform appeals more to developers than business stakeholders. And the fact that ActiveMatrix BPM is intended to be an execution platform means that the so-called nirvana of having business people design their own business processes remains the type of stuff you do in a room with a whiteboard. Fortuitously, Tibco does have something in the works: it previewed Design Collaborator, a new process definition tool that suspiciously resembles IBM BPM Blueprint. We hope that Tibco designs it to feed BPMN models into ActiveMatrix BPM so it doesn't become a dead-end product.

There were other introductions, such as the none-too-original, retro-named PeopleForms (which sounds like the name of one of Oracle's legacy PeopleSoft offerings), which for now only churns out SharePoint-like forms-driven apps as a beta. PeopleForms, developed by what's left of the old General Interface team, addresses a low end of the market not served by Tibco; eventually it will be beefed up into something more useful with workflow. We also hope that there might be some rationalization with Design Collaborator, so that this product doesn't wind up becoming a standalone curiosity.

But the most profound impression came from an acquisition that Tibco completed only in March. Our best-of-the-day award from the analyst sessions goes to the demonstration of Netrics, a tiny 15-person outfit out of Princeton, NJ that has developed a patented, algorithmic pattern matching program that really fleshes out the "two-second advantage" message by providing proximate matches that should be "good enough" to make decisions. Netrics' technology assigns algorithms as metadata that scores the identity of names of people or things; using that metadata, it quickly reduces large data sets to find probable matches. Those probable matches can be filtered to include or exclude misspellings and typos. Netrics' technology has ready applicability to identifying event patterns and golden copies of data – and as such, Tibco's initial plans are to incorporate the technology into Tibco Business Events (its CEP offering) and Master Data Management. On the horizon, it provides a pattern matching approach that complements the text mining often used in national security applications.
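Netrics' actual matching algorithms are patented and proprietary, so the sketch below is only a stand-in using plain edit distance; the point is the pattern – score the candidates, keep the "good enough" ones – rather than the specific math.

```java
import java.util.List;

// Illustrative only: plain Levenshtein distance as a stand-in for Netrics'
// patented matching. Score candidates against a target, keep the ones above
// a "good enough" cutoff, misspellings and typos included.
public class FuzzyMatch {

    static int editDistance(String a, String b) {
        int[][] d = new int[a.length() + 1][b.length() + 1];
        for (int i = 0; i <= a.length(); i++) d[i][0] = i;
        for (int j = 0; j <= b.length(); j++) d[0][j] = j;
        for (int i = 1; i <= a.length(); i++)
            for (int j = 1; j <= b.length(); j++) {
                int cost = a.charAt(i - 1) == b.charAt(j - 1) ? 0 : 1;
                d[i][j] = Math.min(Math.min(d[i - 1][j] + 1, d[i][j - 1] + 1),
                                   d[i - 1][j - 1] + cost);
            }
        return d[a.length()][b.length()];
    }

    // Similarity in [0,1]: 1.0 is an exact match.
    static double similarity(String a, String b) {
        int maxLen = Math.max(a.length(), b.length());
        return maxLen == 0 ? 1.0 : 1.0 - (double) editDistance(a, b) / maxLen;
    }

    public static void main(String[] args) {
        String target = "Jonathan Smith";
        List<String> candidates = List.of("Jonathon Smyth", "John Smith", "Jane Doe");
        candidates.stream()
                  .filter(c -> similarity(target, c) >= 0.7) // "good enough" cutoff
                  .forEach(c -> System.out.printf("%s (score %.2f)%n",
                                                  c, similarity(target, c)));
    }
}
```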

Netrics is not a replacement for data quality – that remains a major gap in Tibco's product suite. The two-second advantage implies having data that is "good enough" when you perform event processing and must make snap decisions. But over the long haul, you'll need the kind of feedback loop and reality check on those decisions that business intelligence provides – and for that, you'll need data that is better scrubbed.

05.04.10

SOA what

Posted in BPM, Cloud, Data Management, Enterprise Applications, Enterprise Integration, Middleware, SOA & Web Services at 2:29 am by Tony Baer

There is a core disconnect between what gets analysts and journalists excited, and what gains traction with the customers who consume the technologies that keep our whole ecosystem in business. OK, guilty as charged: we analysts get off on hearing about what's new and what's pushing the envelope, but that's the last thing that enterprise customers want to hear. Excluding reference customers (who have a separate set of motivations that often revolve around a vendor productizing something that would otherwise be custom developed), most want the tried and true, or at least innovative technology that has matured past the rough spots and is no longer version 1.

It's a thought that crystallized as we bounced impressions of this year's IBM SOA Impact event with colleagues like Dorothy Alexander and Marcia Kaufman, who shared perceptions that, while this year's headlines or trends seemed a bit anticlimactic, there was real evidence that customers were actually "doing" whatever it is that we associate with SOA.

Forget about the architectural journeys that you've heard about with SOA; SOA is an enterprise architectural pattern that is a means to an end. It's not a new argument; it was central to the "SOA is dead" debate that flared up with Anne Thomas Manes' famous or infamous post of almost a year and a half ago, and the subsequent debates and hand wringing that ensued.

IBM's so-called SOA conference, Impact, doesn't include SOA in its name, but until now SOA was the implicit rationale for this WebSphere middleware stack conference to exist. More and more, though, it's about the stack that SOA enables, and about the composite business applications built atop it. IBM won't call it the applications business. But when you put vertical industry frameworks, business rules, business process management, and analytics together, it's not simply a plumbing stack. It becomes a collection of software tools and vertical industry templates that become the new de facto applications that bolt atop and alongside the core application portfolio that enterprises already have and are not likely to replace. In past years, this conference was used to introduce game changers, such as the acquisition of Webify, which placed IBM Software firmly on the road to verticalizing its middleware.

This year the buzz was about something old becoming something new again. IBM's acquisition of Cast Iron, dissected well by colleagues Dana Gardner and James Governor, reflects the fact that after all these years of talk about flattened architectures, especially using the ESB style, enterprise integration (or application-to-application, A2A) hubs never went out of style. There are still plenty of instances of packaged apps out there that need to be interfaced. The problem is no different from a decade ago, when the first wave of EAI hubs emerged to productize systems integration of enterprise packages.

While the EAI business model never scaled well in its time because of the need for too much customization, experience, the commoditization of templates, and the emergence of cheap appliances have since provided economic solutions to this model. More importantly, the emergence of multi-tenanted SaaS applications, like Salesforce.com, Workday, and many others, has imposed a relatively stable target data schema plus a need for integration of cloud and on-premises applications. Informatica has made a strong run with its partnership with Salesforce, but Informatica is part of a broader data integration platform that for some customers is overkill. By contrast, niche players like Cast Iron, which do only data translation, have begun to thrive with blue-chip customer lists.
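The core of that productization is mundane but valuable: declarative field mapping against a known, stable SaaS target schema. A minimal sketch follows; the field names are invented for illustration, and real Cast Iron integrations are configured through templates rather than hand-coded.

```java
import java.util.LinkedHashMap;
import java.util.Map;

// Minimal sketch of A2A field mapping against a stable SaaS target schema.
// Field names are invented for illustration; this is the shape of the
// problem, not any vendor's product.
public class A2AMapper {

    // Declarative source-to-target mapping: on-premises CRM -> SaaS schema.
    static final Map<String, String> FIELD_MAP = Map.of(
            "cust_name", "Name",
            "cust_phone", "Phone",
            "acct_owner", "OwnerId");

    static Map<String, String> toSaasRecord(Map<String, String> onPremRecord) {
        Map<String, String> out = new LinkedHashMap<>();
        FIELD_MAP.forEach((src, dst) -> {
            String v = onPremRecord.get(src);
            if (v != null) out.put(dst, v); // fields outside the map are dropped
        });
        return out;
    }

    public static void main(String[] args) {
        Map<String, String> legacy = Map.of(
                "cust_name", "Acme Corp",
                "cust_phone", "555-0100",
                "internal_flag", "X"); // not in the target schema
        // Prints the mapped Name and Phone; internal_flag is dropped.
        System.out.println(toSaasRecord(legacy));
    }
}
```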

Of course, Cast Iron is not IBM's first appliance play. That distinction goes to DataPower, which originally made its name with specialized appliances that accelerated compute-intensive XML processing and SSL encryption. While we were thinking about potential synergy, such as applying some of DataPower's XML acceleration technology to A2A workloads, IBM's middleware head Craig Hayman responded that IBM sees Cast Iron's technology as a separate use case. But they did demonstrate that Cast Iron's software could, and would, literally run on DataPower's own iron.

Of course, you could say that Cast Iron overlaps the application connectors from IBM's CrossWorlds acquisition, but those connectors, which were really overlay applications (CrossWorlds used to call them "collaborations"), have been repurposed by IBM as BPM technology for WebSphere Process Server. Arguably, there is much technology from IBM's Ascential acquisition focused purely on data transformation that also overlaps here. But Cast Iron's value add to IBM is the way those integrations are packaged, and the fact that they have been developed especially for integrations to and from SaaS applications – no more and no less. IBM has gained the right-sized tool for the job. IBM has decided to walk a safe tightrope here; it doesn't want to weigh down Cast Iron's simplicity (a key strength) with added bells and whistles from the rest of its integration stack. But the integration doesn't have to go in that one direction of weighing down Cast Iron with richer but more complex functionality. IBM could go the opposite direction and infuse some of this A2A transformation as services that could be transformed and accelerated by the traditional DataPower line.

IBM faces a similar issue with Lombardi, a deal that it closed back in January. It has taken the obvious first step in "blue washing" the flagship Lombardi Teamworks BPM product, which is now rebranded IBM WebSphere Lombardi Edition and bundled with WebSphere Application Server 7 and DB2 Express under the covers. The more pressing question is what to do with Lombardi's elegantly straightforward Blueprint process definition tool and IBM WebSphere BlueWorks BPM, which is more of a collaboration and best-practices definition tool than a modeling tool (and still in beta). The good news is that IBM is trying to do the right thing in not cluttering Blueprint (now rebranded IBM BPM Blueprint); the bad news is that there is still confusion, with IBM's mixed messages of a consistent branding umbrella but uncertainty regarding product synergy or convergence.

Back to the main point however: while SOA was the original impetus for the Impact event, it is now receding to a more appropriate supporting role.

03.16.10

Pegasystems Starts Growing Up

Posted in BPM, Enterprise Applications, Java, Middleware at 12:45 am by Tony Baer

We'd be the first to admit our surprise that Pegasystems has thrived as well as it has. Our initial impression of the company, formed roughly four or five years ago, was of an interesting, rather eccentric bunch whose absent-minded professors had great ideas but little business savvy. At the time, the company was marginally profitable.

Maybe their professors weren't that absent-minded and their approach not so pedantic after all, as the company has been on a winning streak for the past ten quarters, scoring 25% growth last year as the rest of the economy (and software industry) tanked. Bucking the downturn, the company scored big gains among established clients across the financial services industries, who used Pega's process "solution frameworks" covering areas such as loan origination and underwriting, wholesale banking, and retail bank account opening.

Pegasystems is on the right side of history, having embraced vertical frameworks – an approach that you also find IBM taking. In business for roughly 25 years, Pega didn't see sales take off until it began rolling out a series of templates or frameworks that provide a 60% solution, eliminating the need to model commodity processes from scratch.

Either way, Pega's success bears out our observation that vertical templates are the future of enterprise applications — using the framework as a raw template, applications will be composed from existing applications and data sources rather than written or implemented as packaged applications from scratch.

Growth last year added $35 million to the company's cash cushion, leaving it with a healthy $200 million in the bank. But cash in a consolidating industry is trash when your rivals are either acquiring or getting acquired left and right. And so the question was what Pega would do with its cash.

We now have the answer: Pega announced yesterday its intent to acquire Chordiant, whose specialty is dissecting, analyzing, and optimizing a company's experiences with its customers. The deal, at $167 million in cash, actually nets out to about $116 million when you factor in Chordiant's $51 million cash position. Pega's solicited offer trumped an abortive unsolicited $105 million offer back in January from CDC, an aspiring Hong Kong-based enterprise applications provider. Chordiant has come down a few notches over time, with business flattening to $75 million last year, down from $115 million a couple of years ago. Pega's $5/share bid is about 10% of the company's 2000 dot-com peak, but a 30% premium over its current valuation.

Pega got a good deal, and Wall Street agreed, as shares of both companies rose on the heels of the announcement. The deal reflects the fact that Chordiant provides Pega two opportunities: (1) deepen its presence in financial services accounts by going into the front office, and (2) gain a new beachhead in telecom, where it currently has but a single critical-mass client. Although telco could broaden Pega's addressable market, the deal wouldn't work if the solutions weren't complementary.

Pegasystems offers a highly sophisticated, rules-driven approach to defining, modeling, and executing business processes. It offers roughly 30 industry-specific templates, and well over a dozen cross-industry frameworks covering customer process management, control and compliance, procurement, and so on.

By contrast Chordiant covers what it calls “customer experience management,” which tracks customer interactions and offers predictive analytics for optimizing cross-selling, upselling, or customer retention strategies, or for predicting risk or churn. It also offers vertical templates for financial services, healthcare, and telecom. Chordiant’s predictive analytics have adaptive capabilities where the rules can change based on trends in customer response; if a promotion offer proves not as attractive as initially forecast, the rules can adjust the algorithm to reflect reality.
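A toy version of that adaptive loop – our own illustration, nothing to do with Chordiant's actual engine – might update an offer's predicted acceptance rate as real responses come in, demoting a promotion that underperforms its forecast:

```java
// Toy illustration of the adaptive loop described above -- our own sketch,
// not Chordiant's engine. The predicted acceptance rate drifts toward
// observed reality, so an underperforming promotion gets demoted.
public class AdaptiveOffer {

    double predictedAcceptRate;      // initial forecast from the model
    final double learningRate = 0.1; // how fast the forecast adapts

    AdaptiveOffer(double initialForecast) {
        this.predictedAcceptRate = initialForecast;
    }

    // Exponentially weighted update after each customer response.
    void observe(boolean accepted) {
        double outcome = accepted ? 1.0 : 0.0;
        predictedAcceptRate += learningRate * (outcome - predictedAcceptRate);
    }

    boolean worthPromoting() {
        return predictedAcceptRate >= 0.3; // demote below 30% acceptance
    }

    public static void main(String[] args) {
        AdaptiveOffer offer = new AdaptiveOffer(0.5); // optimistic forecast
        boolean[] responses = {false, false, true, false, false, false, false, false};
        for (boolean r : responses) offer.observe(r);
        // The rate has drifted to ~0.27, so the offer is demoted.
        System.out.printf("Adjusted rate: %.2f, keep promoting? %b%n",
                          offer.predictedAcceptRate, offer.worthPromoting());
    }
}
```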

The potential synergy is where Chordiant optimizes customer-facing front office processes while Pega's BPM frameworks optimize the corresponding back office processes such as loan origination. On paper, it looks like yin and yang. But there are basic architectural differences between the products, as decision management consultant and author James Taylor has pointed out. Keep in mind that Taylor has traditionally been skeptical of Pega's approach of embedding rules inside its process engine rather than loosely coupling the two. But he makes valid points: Chordiant handles rules differently from Pega; the potential synergy between the two is great; but the company needs to take care that technical differences do not "derail the technical integration or cause the merged company to merge its operations without merging its products."

So on paper, Pega has made a sound deal. As the company is not yet experienced in digesting acquisitions of this size, its success in consummating the Chordiant acquisition will become a predictive indicator of the company’s ability to survive and grow in a consolidating market where it will be expected to make more such deals.

02.04.10

Whatever Event Processing

Posted in BPM, Business Intelligence, Data Management, Database, Enterprise Applications, Middleware, SOA & Web Services at 5:21 pm by Tony Baer

Consolidation in the software business is like the force of gravity. Although there will always be best of breed solutions, ultimately as a particular solution space matures, it doesn’t do so in isolation. No technology is an island.

But of course, there's always been the question: what to do about Complex Event Processing (CEP)? Obviously, only a dropout from marketing could have dreamt up a product name whose emphatic message is, "this product is too difficult for you." (Actually, the name just evolved out of academic research.)

Research from our day job at Ovum revealed that there is no single set of use cases, but rather that uses for event processing differ based on the complexity of the event and the latency at which events must be processed.

The technology itself is not new – financial services firms have used their own routines to drive algorithmic trading or hedge risk for years. What's new – as in new during the past 10 years – is that a commercial software market has evolved out of it. But the market has struggled because of a number of factors that start with the question, "What is CEP, or whatever you call it?"

For now we won't get hung up on names. Let's stick with Whatever Event Processing and invent a new TLA: WEP II, so as to distinguish it from the Wi-Fi encryption protocol of the same acronym. OK, we're just joking. Suffice it to say that we are talking about a technology that parses out events that would not otherwise be humanly perceptible and translates them into actionable intelligence.

Nonetheless, whatever you call it – complex event processing, business event processing, or just plain event processing – this is a technology that was never meant to stand alone. There is logical synergy with business process management, as processes can trigger a chain of events or vice versa. There is a similar symbiosis with rules processing – you can use rules to parse and identify unique chains of events, or identified chains of events can trigger responses through rules or policy management. And of course there is a synergy with SOA, as event processing can be exposed as a service that may be consumed by other applications.
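As a deliberately simplified example of rules parsing an event chain, consider a rule that fires when one account produces three failed logins inside a ten-second window; the names and thresholds here are ours, not any vendor's:

```java
import java.util.ArrayDeque;
import java.util.Deque;

// Deliberately simplified event-pattern rule: fire when one account logs
// three failures inside a 10-second sliding window. Names and thresholds
// are ours, not any vendor's CEP engine.
public class LoginPatternRule {

    record Event(String account, long timestampMillis) {}

    static final long WINDOW_MS = 10_000;
    static final int THRESHOLD = 3;

    private final Deque<Event> window = new ArrayDeque<>();

    // Returns true when the pattern is detected for this event's account.
    boolean onFailedLogin(Event e) {
        window.addLast(e);
        // Evict events that have aged out of the sliding window.
        while (!window.isEmpty()
                && e.timestampMillis() - window.peekFirst().timestampMillis() > WINDOW_MS) {
            window.removeFirst();
        }
        long hits = window.stream()
                          .filter(ev -> ev.account().equals(e.account()))
                          .count();
        return hits >= THRESHOLD;
    }

    public static void main(String[] args) {
        LoginPatternRule rule = new LoginPatternRule();
        System.out.println(rule.onFailedLogin(new Event("acct-7", 0)));    // false
        System.out.println(rule.onFailedLogin(new Event("acct-7", 3000))); // false
        System.out.println(rule.onFailedLogin(new Event("acct-7", 6000))); // true
    }
}
```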

But what became clear to us was that the very act of parsing and analyzing event streams, whether through time-based SQL approaches or through rules processing for identifying specific occurrences that must be dealt with, is a prime example of business intelligence. This form of BI does not replace other established uses, ranging from look-back historical analysis to quasi-real-time BI, where data warehouses or data marts are trickle-fed data to keep them almost current. Instead, parsing events as they occur can provide a snapshot of what is occurring now, and form the basis for feed-forward predictive analytics.

StreamBase will vehemently disagree with us, but the days of the standalone CEP vendor are over. More specifically, there is room for maybe one or two strong independents – just as players like Informatica and Teradata have survived as independents. (But as Seth Grimes has pointed out, that also leaves Teradata currently standing by the wall at a party without a CEP date.)

So our first take on Sybase's acquisition of Aleri is just that – an inevitable act of industry consolidation, just as Informatica recently swept up Agent Logic, initially to boost its U.S. federal business. Sybase and Aleri have already been on first dates, with a two-year-old relationship in which Sybase integrated Aleri's event processing engine as an input to its RAP trading platform. But secondly, the potential BI connection hasn't escaped us. Sybase could add the Aleri technology to that portfolio – actually there are three technologies, which translates to a lot of migration and consolidation. More to the point, when "Sybaleri" grows up, it could offer event processing in its BI portfolio as a piece of look-forward technology.

As (almost) the last man standing, StreamBase is having its fun, offering an "amnesty" to Sybaleri customers. When we inquired whether Sybaleri customers were criminals, StreamBase CEO Mark Palmer jokingly reassured us that they were just "misguided souls."

We'll be speaking with Sybase and StreamBase this afternoon and may update this post afterward.

01.11.10

BPM Pure Play days numbered with Progress acquisition of Savvion

Posted in BPM, Enterprise Integration, Java, Middleware, SOA & Web Services at 12:36 pm by Tony Baer

Is it more than coincidence that acquisitions tend to come in waves? Just weeks after IBM's pre-Christmas announcement that it would snap up Lombardi, Progress responds with an agreement to put Savvion out of its misery. In such a small space that is undergoing active consolidation, it is hard not to know who's in play.

Nonetheless, Progress’s acquisition confirms that BPM’s pure play days are numbered, if you expect executable BPM.

The traditional appeal of BPM was that it was a business stakeholder-friendly approach to developing solutions that didn’t rely on IT programmatic logic. The mythology around BPM pure-plays was that these were business user, not IT-driven software buys. In actuality, they simply used a different language or notation: process models with organizational and workflow-oriented semantics as opposed to programmatic execution language. That stood up only as long as you used BPM to model your processes, not automate them.

Consequently, it is not simply the usual issues of vendor size and viability that are driving IT stack vendors to buy up BPM pure plays. It is that, but more importantly, if you want your BPM tool to become more than documentware or shelfware, you need a solution with a real runtime. And that means you need IT front and center, and the stack people right behind. Even with the emergence of BPMN 2.0, which adds support for executables, the cold hard fact is that anytime anything executes in software, IT must be front and center. So much for bypassing IT.

Progress's $49 million offer is a great exit strategy for Savvion. The company, although profitable, has grown very slowly over its 15 years. Even assuming the offer was at a 1.5x multiple, Savvion's business – with revenues barely north of $10 million – is not exactly something that a large global enterprise could gain confidence in. Savvion was in a challenging segment: a tiny player contending for enterprise, not departmental, BPM engagements. If you are a large enterprise, would you stake your enterprise BPM strategy on a slow-growing player of that size? It wasn't a question of whether, but when, Savvion would be acquired.

Of course that leads us to the question of why Progress couldn't get its hands on Savvion in time to profit from Savvion's year-end deals. It certainly would have been more accretive to Progress's bottom line had it completed this deal three months ago (long enough not to disrupt the end-of-year sales pipeline).

Nonetheless, Savvion adds a key missing piece to Progress's Apama event processing strategy (you can read Progress/Apama CTO John Bates' rationale here). There is a symbiotic relationship between event processing and business process execution; events can trigger business processes or vice versa. There is some alignment in the vertical industry templates that both have been developing, especially for financial services and telcos, which are the core bastions (along with logistics) for event processing. And with the Sonic ESB, Progress has a pipeline for ferrying events.

In the long run, there could also be a legacy renewal play in using the Savvion technology to expose functionality for Progress OpenEdge or DataDirect customers, but wisely, that is now a back-burner item for Progress, which is not the size of IBM or Oracle and therefore needs to focus its resources.

Although Progress does not call itself a stack player, it is evolving de facto stacks in capital markets, telcos, and logistics.

Event processing, a.k.a. Complex Event Processing (CEP, a forbidding label) or Business Event Processing (a friendlier label that actually doesn't mean much), is still an early adopter market. In essence, it fulfills a niche where events are not humanly detectable and require some form of logic to identify and then act upon. The market itself is not new; capital markets have developed homegrown event processing algorithms for years. What's new (as in, what's new in the last decade) is that this market has started to become productized. More recently, SQL-based approaches have emerged to spread high-end event processing to a larger audience.

Acquiring Savvion ups the stakes with Tibco, which has a similar pairing of technologies in its portfolio. Given ongoing consolidation, that leaves Pegasystems, Appian, Active Endpoints, plus several open source niche pure plays still standing. Like Savvion, Pega is also an enterprise company, but it is a public company with roughly 10x revenues that has still managed to grow in the 25% range in spite of the recession. While in one way it might make a good fit with SAP (both have their own entrenched, proprietary languages), Pega is stubbornly independent and SAP is acquisition-averse. Pega might be a fit with one of the emerging platform stack players like EMC or Cisco. On second thought, the latter would be a much more logical target for web-based Appian or even Active Endpoints, both still venture-funded, but also promising growth players that at some point will get swept up.
