Ever since IBM exited the applications business, it has been steadily inching its way back up the value chain from pure infrastructure software. Over the past few years, IBM has unleashed a string of initiatives seeking to deliver not only infrastructure software and the integration services to accompany it, but gradually more software that delivers content aimed at the needs of specific scenarios in specific verticals. Naturally, with a highly diversified organization like IBM, there have been multiple initiatives with, of course, varying levels of success.
It started with the usual scenario among IT service providers seeking to derive reusable content from client engagements. Then followed a series of acquisitions for capabilities targeted at vertical industries: fixed asset management for capital-intensive sectors such as manufacturing or utilities; product information management for consumer product companies; commerce for B2B transactions; online marketing analytics, and so on. Then came the 2007 acquisition of Webify, which we thought would lead to a new generation of SOA-based, composite vertical applications (disclosure: we were still drinking the SOA Kool-Aid at the time). At the time, IBM announced there would be Business Fabric SOA frameworks for telco, banking, and insurance, which left us waiting for the shoe to drop for more sectors. Well, that’s all they wrote.
Last year, IBM Software Group (SWG) reorganized into two uber-organizations: Middleware under Robert Leblanc, and Solutions under Mike Rhodin. Both presented at SWG’s 2011 analyst forum on what the reorg meant. What was interesting was that, for organizational purposes, this was a very ecumenical definition of Middleware: it included many of the familiar products from the Information Management, Tivoli, and Rational brand portfolios, and as such was far more encompassing (e.g., it also included the data layer).
More to the point, once you get past middleware infrastructure, what’s left? At his presentation last year, Rhodin outlined five core areas: Business Analytics and Optimization; Smarter Commerce; Social Business; Smarter Cities; and Watson Solutions. He also outlined IBM’s staged process for developing new markets: incubation, where the missionary work is done; “make a market,” where the product and market are formally defined and materialized; and “scale a market,” which is self-explanatory. Beyond that, we still wondered what makes an IBM solution.
This year, Rhodin fleshed out the answer. To paraphrase, Rhodin said that “it’s not about creating 5000 new products, but creating new market segments.” Rhodin defined segments as markets that are large enough to have visible impact on a $100 billion corporation’s top line. Not $100 million markets, but instead, add a zero or two to it.
An example is Smarter Cities, which began with the customary reference customer engagements to define a solution space. IBM had some marquee urban infrastructure engagements with Washington DC, Singapore, Stockholm, and other cities, out of which came its Intelligent Operations Center. IBM is at an earlier stage with Watson Solutions, with engagements at WellPoint (for approving procedures) and Memorial Sloan-Kettering Cancer Center (healthcare delivery) fleshing out a Smart Healthcare solution.
Of these, Smarter Analytics (not to be confused with Smart Analytics System – even big companies sometimes run out of original brand names) is the most mature.
The good news is that we have a better idea of what IBM means when it says solutions – it’s not individual packaged products per se, but groups of related software products, services, and systems. And we know at a very high level where IBM is going to focus its solutions efforts.
Plus ça change… IBM has always been about software, services, and systems – although in recent years the first two have taken center stage. The flip side is that some of these solution areas are overly broad. Smarter Analytics is a catch-all covering the familiar areas of business intelligence and performance management (much of the Cognos portfolio), predictive analytics and analytical decision management (much of the SPSS portfolio), and analytic applications (Cognos products tailored to specific line organizations like sales, finance, and operations).
It has never been in doubt that for IBM, solutions meant addressing the line of business rather than just IT. That’s certainly a logical strategy for IBM to spread its footprint within the Global 2000. The takeaway from getting a better definition of IBM’s Solutions business is that it gives us an idea of the scale and acquisition opportunities that IBM is after.
Informatica is within a year or two of becoming a $1 billion company, and the CEO’s stretch goal is to get to $3 billion.
Informatica has been on a decent tear: a string of roughly 30 consecutive growth quarters, growth over the last six years averaging 20%, and 2011 revenues nearing $800 million. CEO Sohaib Abbasi took charge back in 2004, lifting Informatica out of its midlife crisis by ditching an abortive foray into analytic applications and instead expanding from the company’s data transformation roots to data integration. Getting the company to its current level came largely through a series of acquisitions that expanded the category of data integration itself. While master data management (MDM) has been the headliner, other recent acquisitions have targeted information lifecycle management (ILM), complex event processing (CEP), and low-latency messaging (Ultra Messaging), along with filling gaps in its B2B and data quality offerings. While some of those pieces were obvious additions, others, such as Ultra Messaging or event processing, were not.
Abbasi is talking about a stretch goal of $3 billion in revenue. The obvious chunk is to deepen the company’s share of existing customer wallets. We’re not at liberty to say how many, but Informatica had a significant number of six-figure deals. Getting more $1 million-plus deals will help, but on their own they won’t triple revenue.
So how to get to $3 billion?
Obviously, two strategies: deepen the existing business while using the original formula to expand the footprint of what counts as data integration.
First, the existing business. Of the current portfolio, MDM is likely best primed to allow Informatica to penetrate the installed base more deeply. Most of its data integration clients haven’t yet done MDM, and it is not a trivial investment. And for MDM clients who may have started with a customer or product domain, there are always more domains to tackle. During Q&A, Abbasi described MDM as having as much potential addressable market as the traditional ETL and data quality segments.
The addition of SAP and Oracle veteran Dennis Moore to the Informatica MDM team points to the classic tightrope for any middleware vendor that claims it’s not in the applications game – build more “solutions” or jumpstart templates to confront the same generic barrier that packaged applications software was designed to surmount: provide customers an alternative to raw toolsets or custom programming. For MDM, think industry-specific “solutions” like counter-party risk, or horizontal patterns like social media profiles. If you’re Informatica, don’t think analytic applications.
That’s part of a perennial debate (or rant) on whether middleware is the new enterprise application: you implement for a specific business purpose as opposed to a technology project, such as application or data integration, and you implement with a product that offers patterns of varying granularity as a starting point. Informatica MDM product marketing director Ravi Shankar argues it’s not an application because applications have specific data models and logic that become their own de facto silos, whereas MDM solutions reuse the same core metadata engine for different domains (e.g., customer, product, operational process). Our contention? If it solves a business problem and it’s more than a raw programming toolkit, it’s a de facto application. If anybody else cares about this debate, raise your hand.
MDM is typically a very dry subject, but demo’ing a social MDM straw man showing a commerce application integrated into Facebook sparked Twitter debate among analysts in the room. The operative notion is that such a use of MDM could update the customer’s (some might say, victim’s) profile with the associations that they make in social networks. An existing Informatica higher education client that shall remain anonymous actually used MDM to mine LinkedIn to prove that its grads got jobs.
This prompts the question: just because you can do it, should you? When a merchant knows just a bit too much about you – and your friends (who may not have necessarily opted in) – that more than borders on creepy. Informatica’s Facebook MDM integration was quite effective; as a pattern for social business, well, we’ll see.
So what about staking new ground? When questioned, Abbasi stated that Informatica had barely scratched the surface with productizing around several megatrend areas that it sees impacting its market: cloud, social media, mobile, and Big Data. More specifically:
• Cloud continues to be a growing chunk of the business. Informatica doesn’t have all of its tooling up in the cloud, but it’s getting there. Consumption of services from the Informatica Cloud continues to grow at a 100 – 150% annual run rate. Most of the 1500 cloud customers are new to Informatica. Among recent introductions is a wizard-driven Contact Validation service that verifies and corrects postal addresses from over 240 countries and territories. A new rapid connectivity framework further eases the ability of third parties to OEM Informatica Cloud services.
• Social media – there were no individual product announcements here per se, just acknowledgment that Informatica’s tools must increasingly parse data coming from social feeds. That covers MDM, data profiling, and data quality. Much of it leverages HParser, the Hadoop data parsing tool released late last year.
• Mobile – for now this is mostly a matter of making Informatica tools and apps (we’ll use the term) consumable on small devices. On the back end, there are opportunities for optimizing, virtualizing, and replicating data on demand to the edges of highly distributed networks. Aside from newly announced features such as iPhone and Android support for monitoring the Informatica Cloud, for now Informatica is making a statement of product direction.
• Big Data – Informatica, like other major BI and database vendors, has discovered Big Data with a vengeance over the past year. The ability to extract from Hadoop is nothing special – other vendors have that – but Informatica took a step ahead with the release of HParser last fall. In general there is growing opportunity for tooling in a variety of areas touching Hadoop, with Informatica’s data integration focus being one of them. We expect Informatica’s core tools to be extended to not only parse or extract from Hadoop but, increasingly, work natively inside HDFS, on the assumption that customers are no longer simply using it as a staging platform; a minimal sketch of that idea follows this list. We also see opportunities in refinements to HParser, such as templates or other shortcuts for deciphering sensory data. ILM is another obvious one: while Facebook et al might not archive or deprecate their Hadoop data, mere mortal enterprises will have to bite the bullet. Data quality in Hadoop in many cases may not demand the same degree of vigilance as SQL data warehouses, creating demand for lighter-weight data profiling and cleansing tooling. And for other real-time, web-centric use cases, alternative stores like MongoDB, Couchbase, and Cassandra may become new Informatica data platform targets.
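To make the “work natively inside HDFS” idea concrete, here is a minimal sketch, assuming nothing about Informatica’s actual tooling: a plain Hadoop MapReduce mapper (standard org.apache.hadoop APIs) that parses semi-structured records where they live in HDFS rather than extracting them to a staging area first. The pipe-delimited record layout and field names are invented for illustration.

```java
import java.io.IOException;
import org.apache.hadoop.io.IntWritable;
import org.apache.hadoop.io.LongWritable;
import org.apache.hadoop.io.Text;
import org.apache.hadoop.mapreduce.Mapper;

// Parses pipe-delimited sensor readings in place inside HDFS and emits
// a count per device, rather than exporting the raw files to an
// external staging area for transformation first.
public class SensorParseMapper
        extends Mapper<LongWritable, Text, Text, IntWritable> {

    private static final IntWritable ONE = new IntWritable(1);
    private final Text deviceId = new Text();

    @Override
    protected void map(LongWritable key, Text value, Context context)
            throws IOException, InterruptedException {
        // Hypothetical record layout: deviceId|timestamp|reading
        String[] fields = value.toString().split("\\|");
        if (fields.length == 3) {          // skip malformed records
            deviceId.set(fields[0]);
            context.write(deviceId, ONE);
        }
    }
}
```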
What, no exit talk?
Abbasi commented at the end of the company’s annual IT analyst meeting that this was the first time in recent memory that none of the analysts asked who would buy Informatica, and when. Buttonholing him after the session, we got his take, which, very loosely translated into Survivor terms, is that Informatica has avoided getting voted off the island.
At this point, Informatica’s main rivals – Oracle and IBM – have bulked up their data integration offerings to the point where an Informatica acquisition would no longer be gap filling; it would simply be a strategy of taking out a competitor – and with Informatica’s growth, an expensive one at that. One could then point to dark horses like EMC, Tibco, Teradata, or SAP (for obvious reasons we’ve omitted HP). A case might be made for EMC, or for SAP if it remains serious about raising its profile as a database player – but we believe both have bigger fish to fry. Never say never. But otherwise, the common thread is that data integration will not differentiate these players, and therefore it is not strategic to their growth plans.
Tibco has been firing on all cylinders of late. In earnings and revenues, it has kept up with the Joneses in the enterprise software neighborhood, posting respectable 25% revenue and 30+% software license growth in its most recent quarterly year-over-year results, as we’ve noted in several of our recent Ovum research notes.
It is beginning to make the turn from its geeky roots toward more solution selling to the business side, in tone and deed. Ever since the 2007 Spotfire acquisition – which brought real-time analytic visualizations – it has made several buys targeted more at the business than strictly the IT or CIO side. They include Netrics, for fuzzy-logic pattern matching; Loyalty Lab, for managing customer affinity programs; and Nimbus, a recent addition, which adds process discovery and management of the manual activities that comprise the other 80% of what happens inside an enterprise.
Of course, it’s not as if Tibco were trying to pull an HP by doing a 180 on its business strategy (heaven forbid, we don’t need any more senseless Silicon Valley soap operas!). Core infrastructure plays, such as FTL ultra-low-latency messaging or the DataSynapse data grid, remain core to Tibco’s two-second advantage mission. It’s just that, in modest but growing cases, the raw technology is being packaged as a black box underneath more business-focused solutions. For instance, Tibco is packaging solutions for retail such as Active Catalog and Active Fulfillment that under the hood bundle Tibco Business Events (CEP), ActiveMatrix BPM, and other pieces.
Of course, such transformations don’t come overnight, as there is the need to get field sales up to speed and accustomed to calling on new entry points at target prospects. Not surprisingly, Tibco is also ramping up vertical solutions, but on an opportunistic basis. An example: we met with a European telco customer that is using Business Events for monitoring devices (in this case, water meters), which may present an opportunity for Tibco to develop an M2M (machine-to-machine) event-driven integration solution that could be applied more widely to segments such as utilities or logistics/transportation.
Several of its recent acquisitions, such as Foresight, a healthcare payer EDI gateway, and OpenSpirit, for data integration in upstream oil and gas processing, are strictly vertical plays. Loyalty Lab, which provides analytics for customer affinity programs, has helped make retail one of Tibco’s fastest-growing verticals, coming from a near-zero client base a few years back.
Tibco is traveling a similar road as IBM, but is starting from a much earlier point in developing vertical solutions. As Tibco lacks the professional services presence of IBM, it has to cherry-pick its vertical opportunities.
At this point, the major disrupters for Tibco are big data and mobility.
For mobile, the challenge is delivering alerts from Tibco’s Business Events and Spotfire engines to clients; tibbr, its internal collaboration messaging platform, provides the logical environment for bringing its event feeds out to mobile devices. This could be bolstered by its recent Nimbus acquisition, both for input (process discovery, using mobile devices to snap a picture, for instance) and output (communicating how to perform manual processes out to the field).
Big Data positioning and productization for Tibco is also a work in progress. Its message buses can in some cases handle enormous amounts of data; its business events engine could also provide feeds if Tibco can make the sensing agent more lightweight; and its BPM offering could be configured to trigger on specific event patterns, which may involve crunching enormous volumes of event feeds.
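To illustrate the pattern-triggered BPM idea, here is a generic sketch of our own in plain Java, not Tibco’s Business Events API: a sliding-window detector that fires a callback, which could start a BPM process instance, once enough matching events arrive within a time window. It assumes the caller feeds it only events that already match the pattern of interest.

```java
import java.util.ArrayDeque;
import java.util.Deque;
import java.util.function.Consumer;

// Fires a callback when `threshold` matching events arrive within
// `windowMillis` -- the kind of event pattern that could trigger a BPM
// process instance. Hypothetical sketch, not Tibco's API.
public class EventPatternTrigger {
    private final Deque<Long> timestamps = new ArrayDeque<>();
    private final int threshold;
    private final long windowMillis;
    private final Consumer<String> onPattern;

    public EventPatternTrigger(int threshold, long windowMillis,
                               Consumer<String> onPattern) {
        this.threshold = threshold;
        this.windowMillis = windowMillis;
        this.onPattern = onPattern;
    }

    public void onEvent(String event, long now) {
        timestamps.addLast(now);
        // Evict events that have fallen out of the sliding window.
        while (!timestamps.isEmpty()
                && now - timestamps.peekFirst() > windowMillis) {
            timestamps.removeFirst();
        }
        if (timestamps.size() >= threshold) {
            onPattern.accept(event);  // e.g., start a BPM process here
            timestamps.clear();
        }
    }
}
```

Scaled up, the hard part is exactly what the paragraph above suggests: keeping the sensing agent this lightweight while the volume of event feeds grows.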
But there is a brave new world of variably structured data that is becoming fair game for enterprises to sense and respond to. We don’t expect Tibco to buy its own advanced SQL analytics platform or create its own Hadoop distribution, as Tibco is not about data at rest, nor is it a database player (OK, its MDM offering does have to store master and reference data). Nonetheless, delivering the two-second advantage in a world where the data is getting bigger and more heterogeneous raises the urgency for Tibco to distinguish itself by extending its visibility into that data.
When we were asked by the executive marketing team for our impressions this year, our thoughts were, well, that there was hardly anything newsworthy. That’s not necessarily a bad thing: during a strategy roadmap presentation at this year’s Tibco TUCON conference, a timeline of Tibco acquisitions showed roughly a half dozen entries for 2010 and just one for this year. Over the past year Tibco has been preoccupied with absorbing the new acquisitions and so – Nimbus excluded – has not been active on this front lately. For instance, Tibco has integrated the Netrics fuzzy pattern matching engine into Business Events, where it belongs. It has similarly blended the recently acquired data grid technology with Business Events. Check out Sandy Kemsley’s post for a more detailed blow-by-blow on how Tibco has rounded out its product portfolio over the past year.
With the swoon on Wall Street, Tibco has left its $250 million cash stash alone, in spite of the fact that there are plenty of acquisition targets available at reasonable prices right now, as a lot of venture funds are looking for exits. In its CFO’s words, the company is not as enormous as IBM or Oracle, where acquisitions don’t disrupt the entire company. Nonetheless, we expect that 2012 will grow more active on the acquisition front – we hope that a data quality provider makes the top of the shopping list.
We’re closing out a week that’s been bookended by a couple of BPM-related announcements from IBM and upstart Active Endpoints. The common thread is the quest for greater simplicity; the difference is that one de-emphasizes the B word itself, with a focus on being a simple productivity tool that makes complex enterprise apps (in this case Salesforce.com) easier, while the other amplifies a message loud and clear that this is the BPM that you always wanted BPM to be.
The backstory is that BPM has traditionally appealed to a narrow audience of specially skilled process analysts and modelers, and has yet to achieve mass-market status. Exhibit One? One of the larger independent BPM players still standing, Pegasystems, is a $330 million company. In the interim, there has been significant consolidation in this community as BPM has become one of the components expected in a middleware stack. A milestone was Oracle’s latest Fusion BPM 11g R1, which unifies the engines and makes them first-class citizens of the Fusion middleware stack. While such developments have solidified BPM’s integration story, on its own that has not made BPM the enterprise’s next ERP.
The irony is that BPM has long been positioned as the business stakeholder’s path to application development, with the implication that a modeling/development environment that uses the terms of business process rather than programmatic commands should appeal to a broader audience. The drawback is that to get there, most BPM tools relied on proprietary languages that limited use to… you guessed it… a narrow cadre of business process architects.
Just over a year ago, IBM acquired Lombardi, one of the more innovative independents, which had always stressed simplicity. Not that IBM was lacking in BPM capability, but its existing offerings were based on aging engines centered around integration and document management/workflow use cases, respectively. As IBM software has not been known for its simplicity (many of its offerings still consist of multiple products requiring multiple installs, or a potpourri of offerings targeted at separate verticals or use cases), the fear was that Lombardi would get swallowed alive and emerge unrecognizable.
The good news is that Lombardi’s technology and product have remained alive and well. We’ll characterize IBM Business Process Manager 7.5 as Lombardi giving IBM’s BPM suite a heart transplant: it dropped in a new engine to reinvigorate the old. As for the peanut gallery, Janelle Hill characterized it as IBM getting a Lombardotomy; Neil Ward-Dutton and Sandy Kemsley described it as a reverse takeover; Ward-Dutton debated Clay Richardson on whether the new release was more than just a new paint job, while Bruce Silver assessed it as IBM’s BPM endgame.
So what did IBM do in 7.5? The modeling design environment and repository are Lombardi’s, with the former IBM WebSphere Process Server (that’s the integration BPM) now being ported over to the Lombardi engine. It’s IBM’s initial answer to Oracle’s unification of process design and runtimes with the latest Fusion BPM.
IBM is not all the way there yet. Process Server models, which were previously contained in a flat file, are now stored in the repurposed Lombardi repository, but that does not yet make the models fully interoperable. You can just design them in the same environment and store them in the same repository. That’s a good start, and at least it shows that the Lombardi approach will not get buried. Beyond the challenge of integrating the model artifacts, the bigger question to us is whether the upscaling and extending of Lombardi to cover more use cases might be too much of a good thing. Can it avoid the clutter of Microsoft Office that resulted from functional scope creep?
As for FileNet, IBM’s document-centric BPM, that’s going to wait. It’s a very different use case – and note that even as Oracle unified two of its BPM engines, it notably omitted the document management piece. Document-centric workflow is a well-ingrained use case with its own unique process patterns, so the open question is whether it is realistic to expect that such a style of process management can fit in the same engine, or whether it should simply exist as external callable workflows.
At the other end of the week, we were treated to the unveiling of Active Endpoints’ new Cloud Extend release. As we tweeted, this is a personality transplant for Active Endpoints, as the company’s heritage has been with geeky BPEL and an even geekier-branded tool, ActiveVOS.
OK, the Cloud branding is a break from geekdom, but it ain’t exactly original – there’s too much Cloud this and Cloud that going around the ISV community these days.
More to the point, Cloud Extend does not really describe what Active Endpoints is doing with this release. Cloud Extend is not cloud-enabling your applications; it is simply enabling an application that, in this case, happens to run in the cloud (the company also has an on-premises version of this tool with the equally unintuitive brand Socrates).
In essence, Cloud Extend adds a workflow shell to Salesforce.com so that you can design workflows in a manner that appears as simple as creating a Visio diagram, while providing the ability to save and reuse them. There’s BPEL and BPMN underneath the hood, but in normal views you won’t see them. It also has neat capabilities that help you filter out extraneous workflow activities when working on a particular process. The result is screens that drive users to interact with Salesforce in a consistent manner, replacing custom coding with a tool that should help systems integrators perform the same Salesforce customization job quicker. Clearly, Salesforce should be just the first stop for this technology; we’d expect Active Endpoints to target other enterprise apps with its engine in due course.
We hate to spill the beans, but under the covers, this is BPM. But that’s not the point – and in fact it opens an interesting argument as to whether we should simply take advantage of the technology without having to make a federal case about it. It’s yet another approach to make the benefits of BPM more accessible to people who are not modeling notation experts, which is a good thing.
Sometimes the news is that there is no news. Well, Steve Mills did tell us that IBM is investing the bulk of its money in software and that between now and 2015, it would continue to make an average of $4 – 5 billion worth of strategic acquisitions per year. In other words, it would continue on its current path, making acquisitions for the strategic value of the technology with the guideline of having them become revenue accretive within 2 – 4 years. Again, nothing new, as if there were anything wrong with that.
The blend of acquisition and organic development is obviously bulking up the Software Group’s product portfolio, which in itself is hardly a bad thing; there is more depth and breadth. But the issue that IBM has always faced is complexity. The traditional formula has always been: we have the pieces, and we have the services to put them together for you. Players like Oracle compete with a packaged apps strategy; in more specialized areas such as project portfolio management, rivals like HP and CA Technologies say that they have one product where IBM splits it in two.
IBM continues to deny that it is in the apps business, but it shows architectural slides of a stack based on middleware along with horizontal “solutions” such as the SPSS Decision Manager offering (more about that shortly); vertical industry frameworks that specify processes, best practices, and other assets that can be used to compose industry solutions; and then, at the top of the stack, solutions that IBM and/or its services group develops. It’s at the peak of the stack that the difference between “solutions” and “applications” becomes academic. Oracle’s yet-to-be-released Fusion applications reveal a similar architecture, composing solutions from modular building blocks.
So maybe IBM feels self-conscious about the term application as it doesn’t want to be classed with Oracle or SAP, or maybe it’s the growing level of competition with Oracle that made Mills rather defensive in responding to an analyst’s question about the difference between IBM’s and Oracle’s strategy. His response was that IBM’s is more of a toolkit approach that layers atop the legacy that will always be there, which is reasonable, although the tone was more redolent of “you [Oracle] can’t handle the truth.”
Either way, whether you sell a solution or a packaged application at the enterprise level, assembly will still be required. Services will be needed to integrate the software and/or train your people. Let’s be adults and put that debate behind us. For IBM, it’s time to get back to Issue One: defusing complexity. When you’re dealing with enterprise software, there will always be complexity. But when it comes to richness or simplicity, IBM tends to aim for the former. The dense slides with small print made the largely middle-aged analyst audience more self-conscious than normal of the inadequacies of their graduated eyeglasses or contacts.
OK, if you’re IBM facing an analyst crowd, you don’t want to oversimplify the presentation into the metaphorical equivalent of the large-print weekly for the visually impaired. You must prove that you have depth. You need to show a memorable, coherent message (Smarter Planet was great when it debuted two years ago). But most importantly, you need coherent packaging and delivery to market.
IBM Software Group has done a good job of repurposing technologies across brands to fill defined product needs; it still has its work cut out for it in its goal of making the software group sales force brand-agnostic. That is going to take time.
As a result, no good deed goes unpunished, with IBM’s challenges with SPSS Decision Manager a great case in point. The product, an attempt to craft a wider market for SPSS capabilities, blends BI analytics from Cognos, rules management from Ilog, and event processing from WebSphere Business Events to develop a predictive analytics solution for fashioning business strategy, aimed at line-of-business users.
For instance, if you are in the claims processing group of an auto insurance company, you can use form-based interfaces to vary decision rules and simulate the results, ensuring that accident calls from 19-year-old drivers, or from drivers who have not yet contacted the police, are not fast-tracked for settlement.
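To make that concrete, here is a toy version of such a decision rule in plain Java. The field names and age threshold are invented; the point of a product like Decision Manager is that a business user would adjust and simulate such rules through forms rather than code.

```java
// Hypothetical claim-triage rule of the kind a business user would
// tune through Decision Manager's forms; names and thresholds invented.
public class ClaimTriage {

    public static class Claim {
        int driverAge;
        boolean policeReportFiled;
        Claim(int driverAge, boolean policeReportFiled) {
            this.driverAge = driverAge;
            this.policeReportFiled = policeReportFiled;
        }
    }

    // Fast-track only low-risk claims: driver at least 21 and a police
    // report on file; everything else goes to manual review.
    public static boolean fastTrack(Claim c) {
        return c.driverAge >= 21 && c.policeReportFiled;
    }

    public static void main(String[] args) {
        System.out.println(fastTrack(new Claim(19, false))); // false
        System.out.println(fastTrack(new Claim(45, true)));  // true
    }
}
```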
The problem with Decision Manager is that it is not a single SKU or install; IBM has simply pre-integrated components that you still must buy a la carte. IBM Software is already integrating product technologies; it now needs to attend to integrating delivery.
Last year, the anticipation of the unveiling of Fusion apps was palpable. Although we’re not finished with Oracle OpenWorld 2010 yet – we still have the Fusion middleware analyst summit tomorrow and still have loose ends regarding Oracle’s Java strategy – by now our overall impressions are fairly code complete.
In his second conference keynote – which unfortunately turned out to be almost a carbon copy of his first – Larry Ellison boasted that Oracle had “announced more new technology this week than anytime in Oracle’s history.” Of course, that shouldn’t be a heavy lift given that Oracle is a much bigger company with many more products across the portfolio, and, with Sun, has a much broader hardware/software footprint at that.
On the software end – and post-Sun acquisition, we have to make that distinction – it’s hard to follow up last year’s unveiling of Fusion apps. The Fusion apps are certainly a monster in size, with over 5000 tables and 10,000 task flows, representing five years of development. Among other things, the embedded analytics provide the context long missing from enterprise apps like ERP and CRM, which previously required you to slip into another module as a separate task. There is also good integration of process modeling, although for now BPM models developed using either of Oracle’s modeling tools won’t be executable. For now, Fusion apps will not change the paradigm of model first, then develop.
A good sampling of coverage and comment can be found from Ray Wang, Dennis Howlett, Therese Poletti, Stefan Ried, and for the Java side, Lucas Jellema.
The real news is that Fusion apps, excluding manufacturing, will be in limited release by year end and general release in Q1. That’s pretty big news.
But at the conference, Fusion apps took a back seat to Exadata, the HP-based (and soon to be SPARC-based) database appliance unveiled last year, and the Exalogic cloud-in-a-box unwrapped this year. It’s no mystery that growth in the enterprise apps market has been flat for quite some time, with the main greenfield opportunities going forward being midsized businesses or the BRIC countries. Yet Fusion apps will be overkill for small-to-midsized enterprises that won’t need such a rich palette of functionality (NetSuite is more likely their speed), which leaves the emerging economies as the prime growth target. The reality is that most enterprises are not about to replace the very ERP systems they implemented as part of modernization or Y2K remediation efforts a decade ago. At best, Fusion will be a gap filler, picking up where current enterprise applications leave off, which provides a potential growth opportunity for Oracle, but not exactly a blockbuster one.
Nonetheless, as Oracle was historically a software company, the bulk of attendees, along with the press and analyst community (including us), pretty much tuned out all the hardware talk. That likely explains why, if you subscribed to the #oow10 Twitter hashtag, you heard nothing but frustration from software bigots like ourselves and others who got sick of the all-Exadata/Exalogic-all-the-time treatment during the keynotes.
In a memorable metaphor, Ellison stated that one Exalogic device can schedule the entire Chinese rail system, and that two of them could run Facebook – to which a Twitter user retorted, how many enterprises have the computing load of a Facebook?
Frankly, Larry Ellison has long been at the point in his life where he can afford to disregard popular opinion. Give a pure hardware talk Sunday night, then do it almost exactly again on Wednesday (although on the second go-round we were also treated to a Borscht Belt routine taking Salesforce’s Marc Benioff down more than a peg on who has the real cloud). Who is going to say no to the guy who sponsored and crewed on the team that won the America’s Cup?
But if you look at the dollars and sense opportunity for Oracle, it’s all about owning the full stack that crunches and manages the data. Even in a recession, if there’s anything that’s growing, it’s the amount of data that’s floating around. Combine the impacts of broadband, sensory data, and lifestyles that are becoming more digital, and you have the makings for the data counterpart to Metcalfe’s Law. Owning the hardware closes the circle. Last year, Ellison spoke of his vision to recreate the unity of the IBM System 360 era, because at the end of the day, there’s nothing that works better than software and hardware that are tuned for each other.
So if you want to know why Ellison is talking about almost nothing else except hardware, it’s not only because it’s his latest toy (OK, maybe it’s partly that). It’s because if you run the numbers, there’s far more growth potential to the Exadata/Exalogic side of the business than there is for Fusion applications and middleware.
And if you look at the positioning, owning the entire stack means deeper account control. It’s the same strategy behind the entire Fusion software stack, which uses SOA to integrate internally and with the rest of the world. But Fusion apps and middleware remain optimized for an all-Oracle Fusion environment, underpinned by a declarative Application Development Framework (ADF) and tooling designed specifically for that stack.
So on one hand, Oracle’s pitch that big database processing works best on optimized hardware can sound attractive to CIOs seeking to address one of their nagging pain points. But the flip side is that, given Oracle’s reputation for aggressive sales and pricing, will the market be receptive to giving Oracle even more control? To some extent the question is moot; with Oracle having made so many acquisitions, enterprises that followed a best-of-breed strategy can easily find themselves unwittingly becoming all-Oracle shops by default.
Admittedly, the entire IT industry is consolidating, but each player is vying for different combinations of the hardware, software, networking, and storage stack. Arguably, applications are the most sensitive layer of the IT technology stack because that is where the business logic lies. As Oracle asserts greater claim to that part of the IT stack and everything around it, it requires a strategy for addressing potential backlash from enterprises seeking second sources when it comes to managing their family jewels.
We should have seen this one coming. IBM’s offer to buy Sterling Commerce from AT&T for $1.4 billion closes a major gap in the WebSphere portfolio, extending IBM’s array of internal integrations externally to B2B. It’s a logical extension, and IBM is hardly the first to travel this path: webMethods began life as a B2B integration firm before morphing into EAI, and later SOA and BPM middleware, before being acquired by Software AG. In turn, Tibco recently added Foresight Software as an opportunistic extension for taking advantage of a booming market in healthcare B2B transactions.
But neither Software AG’s or Tibco’s moves approach the scope of Sterling Commerce’s footprint in B2B trading partner management, a business that grew out of its heritage as one of the major EDI (electronic data interchange) hubs. The good news is the degree of penetration that Sterling has; the other (we won’t call it “bad”) news is all the EDI legacy, which provides great fodder for IBM’s Global Business Services arm to address a broader application modernization opportunity.
Sterling’s base has been heavily in downstream EDI and related trading partner management support for retailers, manufacturers, and transportation/freight carriers. Its software products cover B2B/EDI integration, partner onboarding into partner communities (an outgrowth of the old hub and spoke patterns between EDI trading partners), invoicing, payments, order fulfillment, and multi-channel sales. In effect, this gets IBM deeper into the supply chain management applications market as it already has Dynamic Inventory Optimization (DIOS) from the Maximo suite (which falls under the Tivoli umbrella), not to mention the supply chain optimization algorithms that it inherited as part of the Ilog acquisition which are OEM’ed to partners (rivals?) like SAP and JDA.
Asked if acquisition of Sterling would place IBM in competition with its erstwhile ERP partners, IBM reiterated its official line that it picks up where ERP leaves off – but that line is getting blurrier.
But IBM’s challenge is prioritizing the synergies and integrations. As there is still a while before this deal closes – approvals from AT&T shareholders are necessary first – IBM wasn’t about to give a roadmap. But it did point to one no-brainer: infusing IBM WebSphere vertical industry templates for retail with Sterling content. Many other potential synergies loom.
Top of mind are BPM and business rules management, which could make trading partner relationships more dynamic. There are obvious opportunities for WebSphere Business Modeler’s Dynamic Process Edition, WebSphere Lombardi Edition’s modeling, and/or Ilog’s business rules. For instance, a game-changing event such as Apple’s iPad entering or creating a new market for tablets could provide the impetus for changes to product catalogs, pricing, promotions, and so on; a BPM or business rules model could facilitate such changes as an orchestration layer acting in conjunction with some of the Sterling multi-channel and order fulfillment suites. Other examples include master data management, which can be critical when managing the sale of families of like products through the channel, and of course Cognos/BI, which can be used for evaluating the profitability or growth potential of B2B relationships.
Altimeter Group’s Ray Wang voiced a question that was on many of our minds: why would AT&T give up Sterling? IBM responded with talk of potential partnership opportunities, but to our mind, AT&T has its hands full attaining network parity with Verizon Wireless and is simply not a business solutions company.
Messaging is Tibco’s business, but it has had a mixed track record when it comes to making the messaging around its message-oriented view of the world. It starts off on the right foot. Its perennial tagline, The Power of Now, has become timelier in a world where the ability to respond is reinforced by the headlines. Just take the last couple of weeks, for example: last Thursday’s weird Wall Street meltdown, and before that, the arrest of the foiled Times Square bomber.
Tibco is hardly alone in voicing such messaging. IBM’s Smarter Planet and Progress Software’s Operational Responsiveness are also about the need for systems that think on their feet. Yet Tibco’s DNA gives it a unique claim to this space, as the company was born around fast, reliable messaging. Two years ago, Tibco CEO Vivek Ranadive made the case for event-driven predictive intelligence. Now Tibco is talking about the need, in global marketing head Ram Menon’s words, “to humanize the story better.” That’s always been a stretch for this technology-driven company, whose vision has long been driven by a shy, technology-centric CEO. Toward that goal, Tibco is taking a step or two forward, but unfortunately also a step back.
The good news is that Tibco is fleshing out its “Power of Now” tagline. We saw the first of a new series of simple, straightforward visual ads with short statements of business outcomes, like how Tibco’s event processing helps defense agencies clear the fog of war, underscored by the tagline.
Then Tibco unveiled a new tagline, the Two-Second Advantage, which makes the case that if you have just enough information quickly enough, you don’t need the complete picture to make the right decision. Tibco’s on a roll there, a message backed up by the surprisingly irreverent Ronald K. Noble, the brash New Yorker who heads Interpol, who made the case that such an advantage can have life-or-death implications in crime fighting, especially when it comes to border control.
The problem is that just when you think Tibco has finally gotten on message, it reverts to its geeky self and steps on its own message. The latest case is its CEO’s Enterprise 3.0 concept, which, when it debuted in front of a room of analysts, went over like a lead balloon.
His numbering is oversimplified and cuts against popular perception: Ranadive terms Enterprise 2.0 as client/server, rather than the social computing weave that is now seeping into enterprise systems – including Tibco’s. But bloggers of record Sandy Kemsley and Brenda Michelson summed it up best. Kemsley: “Enterprise 3.0 is becoming a joke amongst the analysts attending here today (we’re tweeting about staging an intervention)…” Michelson: “We like the 2 second advantage message, but ‘Enterprise 3.0’ doesn’t resonate, it won’t be meaningful to Business Execs and CIOs.”
Tibco doesn’t need new messaging, it just has to bring out the best of what it already has. It can humanize “The Power of Now” by appending the question, “What does it really mean?” And from that, “The two-second advantage,” and all the business cases that manifest it, become the logical response.
With acquisitions and organic product development, Tibco’s portfolio is broader than ever, and not surprisingly, this year’s event carried announcements of a large number of product upgrades and introductions. For us the highlight was ActiveMatrix BPM, which finally puts Tibco’s business process management engine on the same Eclipse development platform and runtime as the rest of Tibco’s service orchestration products. A completely new product (this is not iProcess, which becomes Tibco’s legacy BPM offering, rooted in the original Staffware acquisition), this is a major development for a vendor that has accumulated a large portfolio of individual products over the years, the harshest critique being the need for multiple runtime engines: ActiveMatrix, iProcess, BusinessWorks, Rendezvous, EMS, etc. The new BPM offering fills a critical gap in the SOA-oriented ActiveMatrix product family.
Our critique here – as with IBM – is that the use of Eclipse as the design-time platform appeals more to developers than business stakeholders. And the fact that ActiveMatrix BPM is intended to be an execution platform means that the so-called nirvana of having business people design their own business processes remains the type of stuff you do in a room with a whiteboard. Fortuitously, Tibco does have something in the works: it previewed Design Collaborator, a new process definition tool that suspiciously resembled IBM BPM Blueprint; we hope that Tibco designs it to feed BPMN models into ActiveMatrix BPM so it doesn’t become a dead-end product.
There were other introductions, such as the none-too-original, retro-named PeopleForms (which sounds like the name of one of Oracle’s legacy PeopleSoft offerings), which for now only churns out SharePoint-like forms-driven apps as a beta. PeopleForms addresses a low end of the market not served by Tibco and was developed by what’s left of the old General Interface team; eventually it will be beefed up into something more useful with workflow. We also hope that there might be some rationalization with Design Collaborator, so that this product doesn’t wind up becoming a standalone curiosity.
But the most profound impression came from an acquisition that Tibco completed only in March. Our best-of-the-day award from the analyst sessions goes to the demonstration of Netrics, a tiny 15-person outfit out of Princeton, NJ that has developed a patented, algorithmic pattern matching program that really fleshes out the “two-second advantage” message in providing proximate matches that should be “good enough” to make decisions. Netrics’ technology assigns algorithms as metadata that score the identity of names of people or things; using that metadata, it quickly reduces large data sets to find probable matches. Those probable matches can be filtered to include or exclude misspellings and typos. Netrics’ technology has ready applicability to identifying event patterns and golden copies of data – and as such, Tibco’s initial plans are to incorporate the technology into Tibco Business Events (its CEP offering) and its master data management offering. On the horizon, it provides a pattern matching approach that complements the text mining often used in national security applications.
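For a feel of how “probable matches that are good enough” works, here is a minimal fuzzy-matching sketch using plain edit distance. Netrics’ patented algorithms are proprietary and surely more sophisticated; treat this only as an illustration of scoring and filtering candidate names.

```java
import java.util.List;

// Scores candidate names against a query by normalized edit distance
// and keeps the probable matches. Illustrative only; Netrics' actual
// matching algorithms are proprietary and more sophisticated.
public class FuzzyMatch {

    static int editDistance(String a, String b) {
        int[][] d = new int[a.length() + 1][b.length() + 1];
        for (int i = 0; i <= a.length(); i++) d[i][0] = i;
        for (int j = 0; j <= b.length(); j++) d[0][j] = j;
        for (int i = 1; i <= a.length(); i++)
            for (int j = 1; j <= b.length(); j++)
                d[i][j] = Math.min(
                    Math.min(d[i - 1][j] + 1, d[i][j - 1] + 1),
                    d[i - 1][j - 1]
                        + (a.charAt(i - 1) == b.charAt(j - 1) ? 0 : 1));
        return d[a.length()][b.length()];
    }

    // 1.0 = identical; a cutoff like 0.8 tolerates misspellings/typos.
    static double score(String query, String candidate) {
        int maxLen = Math.max(query.length(), candidate.length());
        return maxLen == 0 ? 1.0
               : 1.0 - (double) editDistance(query, candidate) / maxLen;
    }

    public static void main(String[] args) {
        List<String> candidates =
            List.of("Jon Smith", "John Smith", "Jane Smythe");
        candidates.stream()
            .filter(c -> score("Jhon Smith", c) >= 0.8) // typo tolerated
            .forEach(System.out::println);  // keeps the two Smiths
    }
}
```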
Netrics is not a replacement for data quality – that remains a major gap in Tibco’s product suite. The two-second advantage implies having data that is “good enough” when you perform event processing and must make snap decisions. But over the long haul, you’ll need the kind of feedback loop and reality check on those decisions that business intelligence provides – and for that, you’ll need data that is better scrubbed.
There they go again. Barely a month after announcing the acquisition of message broker Rabbit Technologies, SpringSource is adding yet one more piece to its middleware stack: it has announced the acquisition of Gemstone for its distributed data caching technology.
SpringSource’s Rod Johnson told us that he was planning to acquire such a technology even before VMware came into the picture, but make no mistake about it, VMware’s presence upped the ante.
SpringSource has been looking to fill out its stack vs. Oracle and IBM ever since its cornerstone acquisition of Covalent (which brought the expertise behind Apache Tomcat and bequeathed the world tc Server) two years ago. Adding Gemstone’s Gemfire becomes SpringSource’s response to Oracle Coherence and IBM WebSphere XD. The technologies in question allow you to replicate data from varied sources into a single logical cache, which is critical if those sources are highly dispersed.
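As a toy illustration of the single-logical-cache idea (with none of the partitioning, replication, or eviction machinery that makes Coherence, XD, or Gemfire actual products), here is a read-through cache in plain Java that fronts several dispersed data sources behind one lookup:

```java
import java.util.List;
import java.util.Map;
import java.util.concurrent.ConcurrentHashMap;
import java.util.function.Function;

// One logical cache fronting several dispersed sources: callers do a
// single get(); on a miss, the sources are consulted in order and the
// result cached. Real data grids add replication, eviction, and
// partitioning on top of this idea.
public class LogicalCache<K, V> {
    private final Map<K, V> cache = new ConcurrentHashMap<>();
    private final List<Function<K, V>> sources;  // e.g., DBs, services

    public LogicalCache(List<Function<K, V>> sources) {
        this.sources = sources;
    }

    public V get(K key) {
        return cache.computeIfAbsent(key, k -> {
            for (Function<K, V> source : sources) {
                V value = source.apply(k);  // read-through on a miss
                if (value != null) return value;
            }
            return null;  // not found in any source; nothing cached
        });
    }

    public static void main(String[] args) {
        Map<String, String> onPrem = Map.of("k1", "on-prem value");
        Map<String, String> cloud  = Map.of("k2", "cloud value");
        LogicalCache<String, String> cache =
            new LogicalCache<>(List.of(onPrem::get, cloud::get));
        System.out.println(cache.get("k2"));  // cloud value, now cached
    }
}
```

The point of the sketch is the calling convention: consumers see one get(), regardless of where the data physically lives.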
So what about VMware? Wasn’t SpringSource planning to grow its stack anyway? There are deeper stakes at play: VMware’s aspiration to make cloud and virtualization virtually synonymous – or at least to make virtualization essential to the cloud – falls apart if you don’t have a scalable, high-performance way to manage and access data. Enterprises using the cloud are not likely to move all their data there; they need a solution that allows hybrid strategies, which will invariably involve managing and accessing a mix of cloud-based and on-premises data resources efficiently. Distributed data caching is essential.
So the next question is why SpringSource, a historically open source company that has always made open source acquisitions, didn’t buy open source Terracotta instead. Chances are, were SpringSource still independent, it probably would have, but VMware brings deeper pockets and deeper aspirations. Gemstone is the company that sold object-oriented databases back in the 90s; once it grew obvious that it (and other OODBMS rivals like ObjectStore) weren’t going to become the next Oracle, it adapted its expertise to caching. Gemfire emerged in 2002 and provided Wall Street and defense agencies an off-the-shelf alternative to homegrown development or a best-of-breed strategy. By comparison, although Terracotta boasts several Wall Street clients, its core base is in web caching for high-traffic B2C websites.
Bottom line: VMware needs the scale.
There are other interesting pieces that Gemstone brings to the party. It is currently developing SQLFabric, a project that embeds the Apache Derby open source relational database into Gemfire to make its distributed data grid fully SQL-compliant, which would be very strategic to VMware and SpringSource. It also has a shot-in-the-dark project, MagLev, which is more of a curiosity for the mother ship. Conceivably it could provide the impetus for SpringSource to extend to the Ruby environment, but it would require a lot more development work to productize.
Obviously as the deal won’t close immediately, both entities must be coy about their plans other than the obvious commitment to integrate products.
But there’s another angle that will be worth exploring once the ink dries: SpringSource has been known for simplicity. The Spring framework provided a way to abstract the complexity out of Java EE, while tc Server, based on Tomcat, carries but a subset of the bells and whistles of full Java EE stacks. But Gemfire is hardly simple, and the market for distributed data grids has been limited to organizations with extreme processing needs, extreme expertise, and extreme budgets. Yet the move to cloud will mean, as noted above, that the need for logical data grids trickles down to more of the enterprise mainstream, even if the scope of the problem won’t be as extreme. It would make sense for the Spring framework to extend its dependency injection to a “lite” version of Gemfire (Gemcloud?) to simplify the hassle of managing data inside and outside the cloud.
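If Spring did extend its dependency injection to such a “lite” cache, the wiring might look like the sketch below. Only the Spring @Configuration/@Bean idiom itself is real; GemcloudCache echoes the hypothetical name above, and its API is entirely our invention.

```java
import java.util.concurrent.ConcurrentHashMap;
import org.springframework.context.annotation.AnnotationConfigApplicationContext;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;

// Hypothetical: a stand-in for an imagined "lite" distributed cache.
class GemcloudCache {
    private final ConcurrentHashMap<String, Object> data = new ConcurrentHashMap<>();
    private final String region;
    GemcloudCache(String region) { this.region = region; }
    void put(String key, Object value) { data.put(key, value); }
    Object get(String key) { return data.get(key); }
}

class CustomerService {
    private final GemcloudCache cache;
    CustomerService(GemcloudCache cache) { this.cache = cache; }
    // The service never learns where the data physically lives.
    Object lookup(String id) { return cache.get(id); }
}

// Only this wiring idiom is real Spring; the cache API is invented.
@Configuration
public class CacheConfig {

    @Bean
    public GemcloudCache customerCache() {
        return new GemcloudCache("customers");  // imagined "lite" grid
    }

    @Bean
    public CustomerService customerService(GemcloudCache customerCache) {
        return new CustomerService(customerCache);  // injected by type
    }

    public static void main(String[] args) {
        AnnotationConfigApplicationContext ctx =
            new AnnotationConfigApplicationContext(CacheConfig.class);
        ctx.getBean(CustomerService.class);  // wired by Spring DI
        ctx.close();
    }
}
```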
There is a core disconnect between what gets analysts and journalists excited and what gains traction with the customers who consume the technologies that keep our whole ecosystem in business. OK, guilty as charged: we analysts get off on hearing about what’s new and what’s pushing the envelope, but that’s the last thing that enterprise customers want to hear. Excluding reference customers (who have a separate set of motivations that often revolve around a vendor productizing something that would otherwise be custom-developed), most want the tried and true, or at least innovative technology that has matured past the rough spots and is no longer version 1.
It’s a thought that crystallized as we bounced impressions of this year’s IBM SOA Impact event off colleagues like Dorothy Alexander and Marcia Kaufman, who shared our perception that, while this year’s headlines or trends seemed a bit anticlimactic, there was real evidence that customers were actually “doing” whatever it is that we associate with SOA.
Forget about the architectural journeys that you’ve heard about with SOA; SOA is an enterprise architectural pattern that is a means to an end. It’s not a new argument; it was central to the “SOA is dead” debate that flared up with Anne Thomas Manes’ famous or infamous post of almost a year and a half ago, and to the subsequent debates and hand-wringing that ensued.
IBM’s so-called SOA conference, Impact, doesn’t include SOA in its name, but until now SOA was the implicit rationale for this WebSphere middleware stack conference to exist. More and more, though, it’s about the stack that SOA enables and the composite business applications built on it. IBM won’t call it the applications business. But when you put vertical industry frameworks, business rules, business process management, and analytics together, it’s not simply a plumbing stack. It becomes a collection of software tools and vertical industry templates that become the new de facto applications, bolting atop and alongside the core application portfolio that enterprises already have and are not likely to replace. In past years, this conference was used to introduce game changers, such as the acquisition of Webify, which placed IBM Software firmly on the road to verticalizing its middleware.
This year the buzz was about something old becoming something new again. IBM’s acquisition of Cast Iron, dissected well by colleagues Dana Gardner and James Governor, reflects the fact that after all these years of talk about flattened architectures, especially those using the ESB style, enterprise integration (or application-to-application, A2A) hubs never went out of style. There are still plenty of instances of packaged apps out there that need to be interfaced. The problem is no different from a decade ago, when the first wave of EAI hubs emerged to productize systems integration of enterprise packages.
While the EAI business model never scaled well in its time because of the need for too much customization, experience, the commoditization of templates, and the emergence of cheap appliances have since provided economic solutions to this model. More importantly, the emergence of multi-tenanted SaaS applications, like Salesforce.com, Workday, and many others, has imposed a relatively stable target data schema, plus a need for integration of cloud and on-premises applications. Informatica has made a strong run with its partnership with Salesforce, but Informatica is part of a broader data integration platform that for some customers is overkill. By contrast, niche players like Cast Iron, which only do data translation, have begun to thrive with a blue chip customer list.
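At bottom, what an integration appliance like Cast Iron productizes is repeatable schema mapping between an on-premises source and a stable SaaS target, plus the connectors, templates, and monitoring around it. Here is a stripped-down sketch of just the mapping kernel, with field names invented for illustration:

```java
import java.util.HashMap;
import java.util.Map;

// Bare-bones field mapping from an on-premises record to a stable
// SaaS-style target schema -- the kernel of what appliances like Cast
// Iron package with connectors and templates. Field names invented.
public class SchemaMapper {
    // source field -> target field (e.g., a Salesforce-like schema)
    private static final Map<String, String> FIELD_MAP = Map.of(
        "CUST_NM",   "Name",
        "CUST_PHON", "Phone",
        "ZIP_CD",    "BillingPostalCode");

    public static Map<String, String> map(Map<String, String> source) {
        Map<String, String> target = new HashMap<>();
        FIELD_MAP.forEach((from, to) -> {
            String v = source.get(from);
            if (v != null) target.put(to, v.trim());  // light cleansing
        });
        return target;
    }

    public static void main(String[] args) {
        System.out.println(map(Map.of(
            "CUST_NM", "Acme Corp ", "CUST_PHON", "555-0100")));
        // -> {Name=Acme Corp, Phone=555-0100}
    }
}
```

The stable target schema is what makes such mappings reusable across customers, which is exactly why the old bespoke EAI economics no longer apply.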
Of course, Cast Iron is not IBM’s first appliance play. That distinction goes to DataPower, which originally made its name with specialized appliances that accelerated compute-intensive XML processing and SSL encryption. While we were thinking about potential synergy, such as applying some of DataPower’s XML acceleration technology to A2A workloads, IBM’s middleware head Craig Hayman responded that IBM saw Cast Iron’s technology as a separate use case. But they did demonstrate that Cast Iron’s software could, and would, literally run on DataPower’s own iron.
Of course, you could say that Cast Iron overlaps the application connectors from IBM’s CrossWorlds acquisition, but those connectors, which were really overlay applications (CrossWorlds used to call them “collaborations”), have been repurposed by IBM as BPM technology for WebSphere Process Server. Arguably, there is much technology from IBM’s Ascential acquisition, focused purely on data transformation, that also overlaps here. But Cast Iron’s value-add to IBM is the way those integrations are packaged, and the fact that they have been developed especially for integrations to and from SaaS applications – no more and no less. IBM has gained the right-sized tool for the job. IBM has decided to walk a safe tightrope here: it doesn’t want to weigh down Cast Iron’s simplicity (a key strength) with added bells and whistles from the rest of its integration stack. But the integration doesn’t have to go in one direction, weighing down Cast Iron with richer but more complex functionality; IBM could go the opposite direction and infuse some of this A2A transformation as services that could be transformed and accelerated by the traditional DataPower line.
This is similar to the issue IBM has faced with Lombardi, a deal it closed back in January. It has taken the obvious first step in “blue washing” the flagship Lombardi Teamworks BPM product, which is now rebranded IBM WebSphere Lombardi Edition and bundled with WebSphere Application Server 7 and DB2 Express under the covers. The more pressing question is what to do with Lombardi’s elegantly straightforward Blueprint process definition tool and IBM WebSphere BlueWorks BPM, which is more of a collaboration and best-practices definition tool than a modeling tool (and still in beta). The good news is that IBM is doing the right thing in not cluttering Blueprint (now rebranded IBM BPM Blueprint); the bad news is that there is still confusion from IBM’s mixed messages of a consistent branding umbrella but uncertainty regarding product synergy or convergence.
Back to the main point however: while SOA was the original impetus for the Impact event, it is now receding to a more appropriate supporting role.