Category Archives: Enterprise Applications

Is the sky the limit for Flash and In-Memory Databases?

Big Data is getting bigger, and Fast Data is getting faster, because of the continually declining cost of all things infrastructure. Ongoing commoditization of powerful multi-core CPUs, storage media, and connectivity has made scale-out Internet data centers possible, and with them, scale-out data platforms such as Hadoop and the new generation of Advanced SQL/NewSQL analytic data stores. Bandwidth is similarly going crazy; while the lack of 4G may make bandwidth seem elusive to mobile users, growth of bandwidth for connecting devices and things has become another fact taken for granted.

Conventional wisdom is that similar trends are impacting storage, and until recently, that was the Kool-Aid that we swallowed. For sure, the macro picture is that declining price and ascending density curves are changing the conversation when it comes to deploying data. The type of media on which you store data is no longer just a price/performance tradeoff, but increasingly an architectural consideration in how data is processed and how the applications that run on that data are engineered. Bigger, cheaper storage makes bigger analytics possible; faster, cheaper storage makes more complex and functional applications possible.

At 100,000 feet, such trends for storage are holding, but dig beneath the surface and the picture gets more nuanced. And those nuances are increasingly driving how we design our data-driven transaction applications and analytics.

Cut through the terminology
But before we dive into the trends, let’s get our terminology straight, because the term memory is used much too loosely (does it mean DRAM or Flash?). For this discussion, we’ll stick with the following conventions:
• CPU cache is the memory on the chip that temporarily holds data being worked on by the processor.
• DRAM memory is the fastest storage layer that sits outside the chip, and is typically parceled out in GBytes per compute core.
• Solid State Drives (SSDs), based on Flash memory, are the silicon-based, faster substitute for traditional hard drives; they are typically sized at hundreds of GBytes (with some units just under a terabyte), but they are not as fast as DRAM.
• Hard disk, or “disk,” is the workhorse that now scales economically up to 1 – 3 TBytes per spindle.
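To make the hierarchy concrete, here is a rough sketch of how the tiers trade latency against capacity, and how a designer might pick the cheapest tier that still meets a latency budget. The latency figures are ballpark orders of magnitude assumed for illustration, not vendor specifications.

```python
# Rough, illustrative storage hierarchy -- ballpark figures only, not vendor specs.
STORAGE_TIERS = [
    # (name, approx. random-access latency in seconds, typical capacity per unit)
    ("CPU cache", 1e-9, "MBytes on chip"),
    ("DRAM",      1e-7, "GBytes per compute core"),
    ("Flash SSD", 1e-4, "hundreds of GBytes per drive"),
    ("Hard disk", 1e-2, "1 - 3 TBytes per spindle"),
]

def cheapest_tier_meeting_latency(max_latency_s):
    """Pick the slowest (and therefore cheapest) tier that still meets a latency budget."""
    for name, latency, capacity in reversed(STORAGE_TIERS):
        if latency <= max_latency_s:
            return name, capacity
    return STORAGE_TIERS[0][0], STORAGE_TIERS[0][2]

if __name__ == "__main__":
    for budget in (1e-1, 1e-3, 1e-6):
        tier, capacity = cheapest_tier_meeting_latency(budget)
        print(f"latency budget {budget:>8.0e}s -> {tier} ({capacity})")
```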

So what’s best for which?
For hard drives, conventional wisdom has been that they keep getting faster and cheaper. Turns out, only the latter is true. The cheapness of 1- and 3-TByte drives has made scale-out Internet data centers possible, and with it, scale-out Big Data analytic platforms like Hadoop. Hard disk continues to be the medium of choice for large volumes of data because individual drives routinely scale to 1 – 3 TBytes. And momentary supply chain disruptions like the 2011 Thailand floods aside, the supply remains more than adequate. Flash drives simply don’t get as fat.

But if anything, hard drives are getting slower because it’s no longer worthwhile to try speeding them up. With Flash being at least 10 – 100x faster, there’s no way that disk will easily catch up even if the technology gets refreshed. Flash is actually pulling the rug out from under demand for 7200-RPM disks (currently the state of the art for disk). Not surprisingly, disk technology development has hit the wall.

Given current price trends, where some analysts expect Flash to reach price parity with disk in the next 12 – 18 months (or maybe sooner), there will be less reason for your next transaction system to be disk-based. In fact, there is good reason to be a bit skeptical about how soon the supply of SSD Flash will ramp up adequately for the transaction system market; but SSD Flash will gradually make its way to prime time. Conversely, with disk likely to remain fatter in capacity than Flash, it will be best suited for active archiving that keeps older data otherwise bound for tape live, and for Big Data analytics, where the need is for volume. Nonetheless, the workhorse of large Hadoop and similar disk-based Big Data analytic or active archive clusters will likely be the slower 5400-RPM models.

So what about even faster modes of storage? In the past couple of years, DRAM prices crossed the threshold where it became feasible to deploy DRAM for persistent storage rather than just caching of currently used data. That cleared the way for the in-memory database (IMDB), which is often a code word for all-DRAM storage.

In-memory databases are hardly new, but until the last 3 – 4 years they were highly specialized. Oracle TimesTen, one of the earliest commercial offerings, was designed for tightly coupled, specialized transactional applications; other purpose-built in-memory data stores have existed in capital markets for at least a decade. But DRAM prices dropped enough to bring in-memory into the enterprise mainstream. Kognitio opened the floodgates when it reincarnated its MOLAP cube and row store analytic platform as an in-memory platform on industry-standard hardware just over 5 years ago; SAP put in-memory in the spotlight with HANA for analytics and transactional applications; followed by Oracle, which reincarnated TimesTen as Exalytics for running Oracle Business Intelligence Enterprise Edition (OBIEE) and Essbase.

Yet an interesting blip happened on the way to the “inevitable” all-in-memory database future: last spring, DRAM prices stopped dropping. In part this was attributable to consolidation of the industry down to fewer suppliers. But the larger driver was that the wisdom of the crowd – i.e., that DRAM was now ready for prime time – got ahead of itself. Yes, the laws of supply and demand will eventually shift the trajectory of memory pricing. But no, that won't change the fact of life that, no matter how cheap, DRAM (and cache) will always be premium storage.

In-memory databases are dead, long live tiered databases
The sky is not the limit for DRAM in-memory databases. The rush to in-memory will morph into an expansion of data tiering. And actually that’s not such a bad thing: do you really need to put all of that data in memory? We think not.

IBM and Teradata have shunned all-in-memory architectures; their contention is that the 80/20 rule should govern which data goes into memory. And under their breaths, even the all-in-memory database folks have fallbacks for paging data between disk and memory. If designed properly, this is not constant paging, but rather a process that only occurs for the rare out-of-range query. Kognitio has a clever pricing model where it doesn't charge you for the disk, just for the volume of memory. As for HANA, disk is designed into the system for permanent offline storage, but SAP quietly adds that it can also be utilized for paging data during routine operation. Maybe SAP shouldn't be so quiet about that.
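As a minimal sketch of that 80/20 pattern (a toy illustration, not any vendor's actual paging design), the idea is essentially an LRU cache in front of a slower store: hot rows live in a bounded in-memory structure, and only the rare out-of-range query falls back to disk.

```python
from collections import OrderedDict

class TieredStore:
    """Toy two-tier store: a bounded in-memory 'hot' tier over a slower 'cold' tier.

    This is a conceptual sketch of memory/disk tiering, not any vendor's design.
    """

    def __init__(self, cold_store, hot_capacity):
        self.cold = cold_store            # e.g., a dict standing in for disk
        self.hot = OrderedDict()          # LRU-ordered in-memory tier
        self.hot_capacity = hot_capacity

    def get(self, key):
        if key in self.hot:               # served from DRAM: the common case
            self.hot.move_to_end(key)
            return self.hot[key]
        value = self.cold[key]            # rare out-of-range query pages in from disk
        self.hot[key] = value
        if len(self.hot) > self.hot_capacity:
            self.hot.popitem(last=False)  # evict the least recently used entry
        return value

# Usage: if most queries hit a small hot set, the cold tier is rarely touched.
disk = {f"order-{i}": {"total": i * 10} for i in range(1_000)}
store = TieredStore(disk, hot_capacity=200)
print(store.get("order-7"))   # first access pages in from 'disk'
print(store.get("order-7"))   # second access is served from memory
```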

There's one additional form of tiering to consider for highly complex analytics: the boost that can come from pipelining computations inside chip cache. Oracle is looking to similar techniques for further optimizing upcoming generations of its Exadata database appliance platform. It's a technique that's part of IBM's recent BLU architecture for DB2. High-performance analytic platforms such as SiSense also incorporate in-chip pipelining to reduce balance-of-system costs (e.g., by requiring less DRAM).

It’s all about balance of system
Balance of system is hardly new, but until recently it meant trading off CPU and bandwidth against tiers of disk. Application and database design in turn focused on distributing or sharding data to place the most frequently accessed data on the disk, or the portions of disk, that could be accessed the fastest. New forms of storage, including Flash and DRAM, add a few new elements to the mix. You'll still configure storage (along with processors and interconnects) for the application and vice versa, but you'll have a couple of new toys in your arsenal.

For Flash, that means fast OLTP applications that could add basic analytics, such as what Oracle's recent wave of In-Memory Applications promises. For in-memory, that would mean OLTP applications with even more complex analytics and/or what-if simulations embedded inline, such as what SAP is promising with its recently introduced Business Suite and CRM applications on HANA.

For in-memory, we'd contend that in most cases, configurations that keep 100% of data in DRAM will remain overkill. Unless you are running a Big Data analytic problem that is supposed to encompass all of the data, you will likely work with just a fraction of it. Furthermore, IBM, Oracle, and Teradata are incorporating data skipping features into their analytic platforms that deliberately filter out irrelevant data so it is not scanned. There are many ways to speed processing before resorting to the fastest storage option.
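Data skipping itself is simple to picture. Here is a minimal sketch, assuming block-level min/max metadata (often called zone maps); it is illustrative only and not how any particular vendor implements the feature. Blocks whose value range cannot match the predicate are never read.

```python
# Toy data-skipping scan using per-block min/max metadata ("zone maps").
# Illustrative only -- not any specific vendor's implementation.

def build_blocks(values, block_size=4):
    """Split a column into blocks and record min/max metadata for each."""
    blocks = []
    for i in range(0, len(values), block_size):
        chunk = values[i:i + block_size]
        blocks.append({"min": min(chunk), "max": max(chunk), "rows": chunk})
    return blocks

def scan_with_skipping(blocks, low, high):
    """Return matching values, skipping blocks whose range cannot overlap [low, high]."""
    hits, scanned = [], 0
    for block in blocks:
        if block["max"] < low or block["min"] > high:
            continue                      # skipped: this block is never scanned
        scanned += 1
        hits.extend(v for v in block["rows"] if low <= v <= high)
    return hits, scanned

# Mostly ordered data (e.g., timestamps) makes skipping very effective.
column = list(range(100))
blocks = build_blocks(column, block_size=10)
matches, blocks_scanned = scan_with_skipping(blocks, low=42, high=45)
print(matches, f"scanned {blocks_scanned} of {len(blocks)} blocks")
```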

Storage will become an application design option
Although we're leery about hopping on the 100% DRAM in-memory bandwagon, smartly deployed, DRAM-based in-memory storage could truly transform applications. When you eliminate the latency, you can embed complex analytics inline with transactional applications, run more complex analytics, or make it feasible for users to run more what-if simulations to inform their decisions.

Examples include transaction applications that differentiate how to fulfill orders from gold-, silver-, or bronze-level customers based on levels of service and cost of fulfillment. It could help mitigate risk when making operational or fiduciary decisions by allowing more permutations of scenarios to be run. It could also enhance Big Data analytics by tiering the more frequently used data (and logic) in memory.

Whether to use DRAM or Flash will be a function of data volume and problem complexity. No longer will the inclusion of storage tiers be simply a hardware platform design decision; it will also become a configuration decision for application designers.

The Other Shoe Drops: SAP puts ERP on HANA

It was never a question of whether SAP would bring its flagship product, Business Suite, to HANA, but when. And when I saw this while parking the car at my physical therapist's over the holidays, I should've suspected that something was up: SAP at long last was about to announce … this.

From the start, SAP has made clear that its vision for HANA was not a technical curiosity, positioned as some high-end niche product or sideshow. In the long run, SAP was going to take HANA to Broadway.

SAP product rollouts on HANA have proceeded in logical, deliberate fashion. Start with the lowest hanging fruit, analytics, because that is the sweet spot of the embryonic market for in-memory data platforms. Then work up the food chain, with the CRM introduction in the middle of last year – there’s an implicit value proposition for having a customer database on a real-time system, especially while your call center reps are on the phone and would like to either soothe, cross-sell, or upsell the prospect. Get some initial customer references with a special purpose transactional product in preparation for taking it to the big time.

There’s no question that in-memory can have real impact, from simplifying deployment to speeding up processes and enabling more real-time agility. Your data integration architecture is much simpler and the amount of data you physically must store is smaller. SAP provides a cute video that shows how HANA cuts through the clutter.

For starters, when data is in memory, you don't have to denormalize or resort to tricks like sharding or striping of data to enhance access to “hot” data. You also don't have to create staging servers to perform ETL if you want to load transaction data into a data warehouse. Instead, you submit commands or routines that, thanks to processing speeds SAP claims are up to 1000x faster than disk, convert the data almost instantly into the form in which you need to consume it. And when you have data in memory, you can perform more ad hoc analyses. In the case of production and inventory planning (a.k.a. the MRP portion of ERP), you could run simulations when weighing the impact of changing or submitting new customer orders, or when judging the impact of changing sourcing strategies as commodity prices fluctuate. Beta customer John Deere achieved positive ROI based solely on the benefits of implementing it for pricing optimization (SAP has roughly a dozen customers in ramp-up for Business Suite on HANA).
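To give a feel for the kind of what-if run being described, here is a toy sketch of a sourcing simulation over in-memory data. It is purely illustrative: the item names, quantities, and prices are made up, and this is not SAP's MRP logic.

```python
# Toy what-if simulation over in-memory data: re-price a sourcing plan under
# different commodity-price scenarios. Purely illustrative -- not SAP MRP logic.

BILL_OF_MATERIALS = {          # hypothetical components and quantities per unit
    "steel_kg": 12.0,
    "copper_kg": 1.5,
    "resin_kg": 3.0,
}

BASE_PRICES = {"steel_kg": 0.80, "copper_kg": 8.50, "resin_kg": 2.10}  # $ per kg

def unit_cost(prices):
    """Material cost of one finished unit at the given commodity prices."""
    return sum(qty * prices[item] for item, qty in BILL_OF_MATERIALS.items())

def simulate(scenarios, units=10_000):
    """Evaluate total material spend for a batch under each price scenario."""
    results = {}
    for name, adjustments in scenarios.items():
        prices = {item: BASE_PRICES[item] * adjustments.get(item, 1.0)
                  for item in BASE_PRICES}
        results[name] = units * unit_cost(prices)
    return results

scenarios = {
    "baseline": {},
    "copper +20%": {"copper_kg": 1.20},
    "steel -10%, resin +15%": {"steel_kg": 0.90, "resin_kg": 1.15},
}
for name, spend in simulate(scenarios).items():
    print(f"{name:>25}: ${spend:,.0f}")
```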

It’s not a question of whether you can run ERP in real time. No matter how fast you construct or deconstruct your business planning, there is still a supply chain that introduces its own lag time. Instead, the focus is how to make enterprise planning more flexible, enhanced with built-in analytics.

But how hungry are enterprises for such improvements? To date, SAP has roughly 500 HANA installs, primarily for Business Warehouse (BW), where the in-memory data store was a logical upgrade for analytics and demand for in-memory is more established. On the transactional side, it's more of an uphill battle, as enterprises are not clamoring to conduct forklift replacements of their ERP systems, let alone their databases. Changing both is no trivial matter, and in fact, changing databases is even rarer because of the specialized knowledge that is required. Swap out your database, and you might as well swap out your DBAs.

The best precedent is Oracle, which introduced Fusion Applications two years ago. Oracle didn't necessarily see Fusion as a replacement for E-Business Suite, JD Edwards, or PeopleSoft. Instead it viewed Fusion Apps as a gap filler for new opportunities among its installed base or for the rare greenfield enterprise install. We'd expect no less from SAP.

Yet in the exuberance of rollout day, SAP was speaking of the transformative nature of HANA, claiming it “Reinvents the Real-Time Enterprise.” It’s not the first time that SAP has positioned HANA in such terms.

Yes, HANA is transformative when it comes to how you manage data and run applications, but let’s not get caught down another path to enterprise transformation. We’ve seen that movie before, and few of us want to sit through it again.

So what does IBM mean when they say they’re in the solutions business?

Ever since IBM exited the applications business, it has been steadily inching its way back up the value chain from pure infrastructure software. Over the past few years IBM has unleashed a string of initiatives seeking to deliver not only infrastructure software and the integration services to accompany it, but gradually more bits of software that deliver content aimed at the needs of specific scenarios in specific verticals. Naturally, with a highly diversified organization like IBM, there have been multiple initiatives with, of course, varying levels of success.

It started with the usual scenario among IT service providers seeking to derive reusable content from client engagements. Then followed a series of acquisitions for capabilities targeted at vertical industries: fixed asset management for capital-intensive sectors such as manufacturing or utilities; product information management for consumer product companies; commerce for B2B transactions; online marketing analytic capabilities, and so on. Then came the acquisition of Webify in 2007, which we thought would lead to a new generation of SOA-based, composite vertical applications (disclosure: we were still drinking the SOA Kool-Aid at the time). At the time, IBM announced there would be Business Fabric SOA frameworks for telco, banking, and insurance, which left us waiting for the shoe to drop for more sectors. Well, that's all they wrote.

Last year, IBM Software Group (SWG) reorganized into two uber-organizations: Middleware, led by Robert Leblanc, and Solutions, under Mike Rhodin. Both presented at SWG's 2011 analyst forum on what the reorg meant. What was interesting was that for organizational purposes, this was a very ecumenical definition of Middleware: it included many of the familiar products from the Information Management, Tivoli, and Rational brand portfolios, and as such was far more encompassing (e.g., it also included the data layer).

More to the point, once you get past middleware infrastructure, what's left? At his presentation last year, Rhodin outlined five core areas: Business Analytics and Optimization; Smarter Commerce; Social Business; Smarter Cities; and Watson Solutions. And he outlined IBM's staged process for developing new markets, expressed as incubation, where the missionary work is done; “make a market,” where the product and market are formally defined and materialized; and scale a market, which is self-explanatory. Beyond that, we still wondered what makes an IBM solution.

This year, Rhodin fleshed out the answer. To paraphrase, Rhodin said that “it’s not about creating 5000 new products, but creating new market segments.” Rhodin defined segments as markets that are large enough to have visible impact on a $100 billion corporation’s top line. Not $100 million markets, but instead, add a zero or two to it.

An example is Smarter Cities, which began with the customary reference customer engagements to define a solution space. IBM had some marquee urban infrastructure engagements with Washington DC, Singapore, Stockholm, and other cities, out of which came its Intelligent Operations Center. IBM is at an earlier stage with Watson Solutions, with engagements at WellPoint (for approving procedures) and Memorial Sloan-Kettering Cancer Center (healthcare delivery) fleshing out a Smart Healthcare solution.

Of these, Smarter Analytics (not to be confused with Smart Analytics System – even big companies sometimes run out of original brand names) is the most mature.

The good news is that we have a better idea of what IBM means when it says solutions – it's not individual packaged products per se, but groups of related software products, services, and systems. And we know at a very high level where IBM is going to focus its solutions efforts.

Plus ca change… IBM has always been about software, services, and systems – although in recent years the first two have taken front stage. The flip side is that some of these solutions areas are overly broad. Smarter Analytics is a catch-all covering the familiar areas of business intelligence and performance management (much of the Cognos portfolio), predictive analytics and analytical decision management (much of the SPSS portfolio), and analytic applications (Cognos products tailored to specific line organizations like sales, finance, and operations).

It hasn't been in doubt that for IBM, solutions meant addressing the line of business rather than just IT. That's certainly a logical strategy for IBM to spread its footprint within the Global 2000. The takeaway from getting a better definition of IBM's Solutions business is that it gives us an idea of the scale – and the acquisition opportunities – that IBM is after.

HP does a 180 – Now it’s Apotheker’s Company

HP chose the occasion of its Q3 earnings call to drop the bomb. The company that under Mark Hurd's watch focused on Converged Infrastructure, spending almost $7 billion to buy Palm, 3COM, and 3PAR, is now pulling a 180: ditching both the PC and Palm hardware businesses, and making an offer to buy Autonomy, one of the last major independent enterprise content management players, for roughly $11 billion.

At first glance, the deal makes perfect sense, given Leo Apotheker’s enterprise software orientation. From that standpoint, Apotheker has made some shrewd moves, putting aging enterprise data warehouse brand Neoview out of its misery, following up weeks later with the acquisition of Advanced SQL analytics platform provider Vertica. During the Q3 earnings call, Apotheker stated the obvious as to his comfort factor with Autonomy: “I have spent my entire professional life in software and it is a world that I know well. Autonomy is very complementary.”

There is potential synergy between Autonomy and Vertica, with Autonomy CEO Mike Lynch (who will stay on as head of the unit, reporting to Apotheker) claiming that Autonomy's user screens provide the long-missing front end to Vertica, and that both would be bound by a common “information layer.” Of course, with the acquisition not being final, he did not give details on what that layer is, but for now we'd assume that integration will be at the presentation and reporting layer. There is clearly a lot more potential here — Vertica for now only holds structured data while Autonomy's IDOL system holds everything else. In the long run we'd love to see federated metadata and also an extension of Vertica to handle unstructured data, just as Advanced SQL rivals like Teradata's Aster Data already do.

Autonomy, according to my Ovum colleague Mike Davis who has tracked the company for years, is one of only three ECM providers that have mastered the universal document viewer – Oracle’s Stellent and an Australian open source player being the others. In contrast to HP (more about that in a moment), Autonomy is quite healthy with the latest quarterly revenues up 16% year over year, operating margins in the mid 40% range, and a run rate that will take the company to its first billion dollar year.

Autonomy is clearly a gem, but HP paid dearly for it. During Q&A on the earnings call, a Wall Street analyst took matters back down to earth, asking whether HP really got such a good deal, given that it was paying roughly 15% of its market cap for a company that will only add about 1% to its revenues.

Great, expensive acquisition aside, HP’s not doing so well these days. Excluding a few bright spots, such as its Fortify security software business, most of HP’s units are running behind last year. Q3 net revenue of $31.2 billion was up only 1% over last year, but down 2% when adjusted for constant currency. By contrast, IBM’s most recent results were up 12% and 5%, respectively, when currency adjusted. Dennis Howlett tweeted that it was now HP’s turn to undergo IBM’s near-death experience.

More specifically, HP Software was the bright spot with 20% growth year over year and 19.4% operating margin. By contrast, the printer and ink business – long HP’s cash cow – dropped 1% year over year with the economy dampening demand from the commercial side, not to mention supply chain disruptions from the Japanese tsunami.

By contrast, services grew only 4% and are about to kick off yet another round of transformation. John Visentin, who ran HP's Enterprise Services in the Americas region, comes in to succeed Ann Livermore. The problem, as Ovum colleague John Madden puts it, is that HP's services business “has been in a constant state of transformation,” which is making some customers' patience wear thin. Ever since acquiring EDS, HP has been trying – and trying – to move the legacy outsourcing business higher up the value chain, with its sights literally set in the cloud.

The trick is that as HP aims higher up the software and services food chain, it deals with a market that has longer sales cycles and long-term customer relationships that prize stability. Admittedly, when Apotheker was named CEO last fall, along with the appointment of enterprise software veteran Ray Lane to the board, the conventional wisdom was that HP would train its focus on enterprise software. So to that extent, HP's strategy over the past 9 months has been almost consistent – save for earlier pronouncements on the strategic role of the tablet and WebOS business inherited with Palm.

But HP has been around for much longer than 9 months, and its latest shifts in strategy must be viewed with a longer perspective. Traditionally an engineering company, HP grew into a motley assortment of businesses. Before spinning off its geeky Agilent unit in 1999, HP consisted of test instruments, midrange servers and PCs, a token software business, and lest we forget, that printer business. Since then:
• The 2001 acquisition of Compaq, under Carly Fiorina's watch, cost a cool $25 billion. It pitted HP against Dell and caused HP to assume an even more schizoid personality as both a consumer and an enterprise brand.
• Under Mark Hurd’s reign, software might have grown a bit (they did purchase Mercury after unwittingly not killing off their OpenView business), but the focus was directed at infrastructure – storage, switches, and mobile devices as part of the Converged Infrastructure initiative.
• In the interim, HP swallowed EDS, succeeding at what it failed to do with its earlier ill-fated pitch for PwC.

Then (1) Hurd gets tossed out and (2) almost immediately lands at Oracle; (3) Oracle pulls support for HP Itanium servers, (4) HP sues Oracle, and (5) its Itanium business sinks through the floor.

That sets the scene for today’s announcements that HP is “evaluating a range of options” (code speak for likely divestment) for its PC and tablet business – although it will keep WebOS on life support as its last gasp in the mobile arena. A real long shot: HP’s only hope for WebOS might be Android OEMs not exactly tickled pink about Google’s going into the handset business by buying Motorola’s mobile unit.

There is a logical rationale for dropping those businesses – PCs have always been a low-margin business in both sales and service, in spite of what HP claimed was an extremely efficient supply chain. Although a third of its business, PCs delivered only 13% of HP's profits, and have been declining in revenue for several years. PCs were big enough to provide a distraction and low enough margin to become a drain. And with Palm, HP gained an elegant OS, but with a damaged brand that was too late to become the iOS alternative – Google had a 5-year head start. Another one bites the dust.

Logical moves, but it's fair to ask: what is an HP? Given HP's twists, turns, and about-faces, that's a difficult question to answer. OK, HP is shedding its consumer businesses – except printers and ink, because in normal times they are too lucrative – but HP still has all this infrastructure business. It hopes to rationalize all this by becoming a provider of cloud infrastructure and related services, with a focus on information management solutions.

As mentioned above, enterprises crave stability, yet HP's track record over the past decade has been anything but stable. To be an enterprise provider, a technology vendor must demonstrate that it has a consistent strategy and staying power, because enterprise clients don't want to be left with orphaned technologies. To its credit, today's announcements show the fruition of Apotheker's enterprise software-focused strategy. But HP's enterprise software customers and prospects need assurance that HP won't pull another about-face when it comes time for Apotheker's successor.

Postscript: Of course we all know how this one ended up. One good 180 deserved another. Exit Apotheker stage left. Enter Meg Whitman stage right. Reality has been reversed.

A Week of BPM

We're closing out a week that's been bookended by a couple of BPM-related announcements from IBM and upstart Active Endpoints. The common thread is the quest for greater simplicity; the difference is that one de-emphasizes the B word itself, focusing on being a simple productivity tool that makes a complex enterprise app (in this case Salesforce.com) easier, while the other amplifies a message loud and clear that this is the BPM that you always wanted BPM to be.

The backstory is that BPM has traditionally appealed to a narrow audience of specially skilled process analysts and modelers, and has yet to achieve mass-market status. Exhibit One? One of the larger independent BPM players still standing, Pegasystems, is a $330 million company. In the interim, there has been significant consolidation in this community as BPM has become one of the components expected in a middleware stack. A milestone has been Oracle's latest Fusion BPM 11g R1, which unifies the engines and makes them first-class citizens of the Fusion middleware stack. While such developments have solidified BPM's integration story, on its own that has not made BPM the enterprise's next ERP.

The irony is that BPM has long been positioned as the business stakeholder's path to application development, with the implication that a modeling/development environment that uses the terms of the business process rather than programmatic commands should appeal to a broader audience. The drawback is that to get there, most BPM tools relied on proprietary languages that limited use to… you guessed it… a narrow cadre of business process architects.

Just over a year ago, IBM acquired Lombardi, one of the more innovative independents, and one that always stressed simplicity. Not that IBM was lacking in BPM capability, but its existing offerings were based on aging engines centered around integration and document management/workflow use cases, respectively. As IBM software has not been known for its simplicity (many of its offerings still consist of multiple products requiring multiple installs, or a potpourri of offerings targeted at separate verticals or use cases), the fear was that Lombardi would get swallowed alive and emerge unrecognizable.

The good news is that the Lombardi technology and product have remained alive and well. We'll characterize IBM Business Process Manager 7.5 as Lombardi giving IBM's BPM suite a heart transplant; it dropped in a new engine to reinvigorate the old. As for the peanut gallery, Janelle Hill characterized it as IBM getting a Lombardotomy; Neil Ward-Dutton and Sandy Kemsley described it as a reverse takeover; Ward-Dutton debated Clay Richardson over whether the new release was more than just a new paint job, while Bruce Silver assessed it as IBM's BPM endgame.

So what did IBM do in 7.5? The modeling design environment and repository are Lombardi’s, with the former IBM WebSphere Process Server (that’s the integration BPM) now being ported over to the Lombardi engine. It’s IBM’s initial answer to Oracle’s unification of process design and runtimes with the latest Fusion BPM.

IBM is not all the way there yet. Process Server models, which were previously contained in a flat file, are now stored in the repurposed Lombardi repository, but that does not yet make the models fully interoperable. You can just design them in the same environment and store them in the same repository. That’s a good start, and at least it shows that the Lombardi approach will not get buried. Beyond the challenge of integrating the model artifacts, the bigger question to us is whether the upscaling and extending of Lombardi to cover more use cases might be too much of a good thing. Can it avoid the clutter of Microsoft Office that resulted from functional scope creep?

As for FileNet, IBM's document-centric BPM, that's going to wait. It's a very different use case – and note that even as Oracle unified two of its BPM engines, it notably omitted the document management piece. Document-centric workflow is a well-ingrained use case and has its own unique process patterns, so the open question is whether it is realistic to expect that such a style of process management can fit in the same engine, or whether it should simply exist as external callable workflows.

At the other end of the week, we were treated to the unveiling of Active Endpoints' new Cloud Extend release. As we tweeted, this is a personality transplant for Active Endpoints, as the company's heritage has been with the geekier BPEL and its even geekier-branded tool, ActiveVOS.

OK, the Cloud branding is a break from geekdom, but it ain’t exactly original – there’s too much Cloud this and Cloud that going around the ISV community these days.

More to the point, Cloud Extend does not really describe what Active Endpoints is doing with this release. Cloud Extend is not cloud-enabling your applications; it is simply extending an application that, in this case, happens to run in the cloud (the company also has an on-premises version of this tool with the equally unintuitive brand Socrates).

In essence, Cloud Extend adds a workflow shell to Salesforce.com so that you can design workflows in a manner that appears as simple as creating a Visio diagram, while providing the ability to save and reuse them. There are BPEL and BPMN under the hood, but in normal views you won't see them. It also has neat capabilities that help you filter out extraneous workflow activities when working on a particular process. The result is that you have screens that drive users to interact with Salesforce in a consistent manner, replacing custom coding with a tool that should help systems integrators perform the same Salesforce customization job more quickly. Clearly, Salesforce should be just the first stop for this technology; we'd expect Active Endpoints to target other enterprise apps with its engine in due course.

We hate to spill the beans, but under the covers, this is BPM. But that’s not the point – and in fact it opens an interesting argument as to whether we should simply take advantage of the technology without having to make a federal case about it. It’s yet another approach to make the benefits of BPM more accessible to people who are not modeling notation experts, which is a good thing.

Big Data analytics in the cloud could be HP’s enterprise trump card

Unfortunately, scheduling conflicts have kept us from attending Leo Apotheker's keynote today before the HP Analyst Summit in San Francisco. But yesterday, he tipped his cards on his new software vision for HP before a group of investment analysts. HP's software focus is not to reinvent the wheel – at least when it comes to enterprise apps. Apotheker had to put to rest any notion that he's about to stage a grudge match and buy the company that dismissed him. There is already plenty of coverage here, interesting comment from Tom Foremski (we agree with him about SAP being a non-starter), and the Software Advice guys who are conducting a poll.

To some extent this comes as little surprise given HP's already stated plans for WebOS and its recently announced acquisition of Vertica. We do have one question though: what happened to Converged Infrastructure?

For now, we're not revisiting the acquisition stakes, although if you follow the #HPSummit Twitter tag today, you'll probably see lots of ideas floating around after 9am Pacific time. We'll instead focus on the kind of company HP wants to be, based on its stated objectives.

1. Develop a portfolio of cloud services from infrastructure to platform services and run the industry’s first open cloud marketplace that will combine a secure, scalable and trusted consumer app store and an enterprise application and services catalog.

This hits two points on the checklist. The first is providing a natural market for all those PCs that HP sells; the second is stating that HP wants to venture higher up the food chain than just selling lots of iron. That certainly makes sense. The last part is where we have a question: offering cloud services to consumers, the enterprise, and developers sounds at first blush like HP wants its cloud to be all things to all people.

The good news is that HP has a start on the developer side where it has been offering performance testing services for years – but is now catching up to providers like CollabNet (with which it is aligned and would make a logical acquisition candidate) and Rally in offering higher value planning services for the app lifecycle.

In the other areas – consumer apps and enterprise apps – HP is starting from square one. It obviously must separate the two, as cloud is just about the only thing that the two have in common.

For the consumer side, HP (like Google Android and everyone else) is playing catchup to Apple. It is not simply a matter of building it and expecting they will come. Apple has built an entire ecosystem around its iOS platform that has penetrated content and retail – challenging Amazon, not just Salesforce or a would-be HP, using its user experience as the basis for building a market for an audience that is dying to be captive. For its part, HP hopes to build WebOS to have the same “Wow!” factor as the iPhone/iPad experience. It’s got a huge uphill battle on its hands.

For the enterprise, it's a more wide-open space where only Salesforce's AppExchange has made any meaningful mark. Again, the key is a unifying ecosystem, with the most likely outlet being enterprise outsourcing customers of HP's Enterprise Services (the former EDS operation). The key principle is that when you build a marketplace, you have to identify who your customers are and give them a reason to visit. A key challenge, as we've stated in our day job, is that enterprise solutions are not the enterprise equivalent of those $2.99 apps that you'll see in the AppStore. The experience at Salesforce – the classic inversion of the long tail – is that the market is primarily for add-ons to the Salesforce.com CRM application or for use of the Force.com development platform, but most entries simply get buried deep down the list.

Enterprise apps marketplaces are not simply going to provide a cheaper channel for solutions that still require consultative sells. We’ve suggested that they adhere more to the user group model, which also includes forums, chats, exchanges of ideas, and by the way, places to get utilities that can make enterprise software programs more useful. Enterprise app stores are not an end in themselves, but a means for reinforcing a community — whether it be for a core enterprise app – or for HP, more likely, for the community of outsourcing customers that it already has.

2. Build webOS into a leading connectivity platform.
HP clearly hopes to replicate Apple's success with iOS here – the key being that it wants to extend the next-generation Palm platform to its base of PCs and other devices. This one's truly a Hail Mary pass designed to rescue the Palm platform from irrelevance in a market where iOS, Android, Adobe Flash, Blackberry, and Microsoft Windows 7/Silverlight are battling it out. Admittedly, mobile developers have always tolerated fragmentation as a fact of life in this space – but of course that was when the stakes (with feature phones) were rather modest. With smart devices – in all their varied form factors from phone to tablet – becoming the next major consumer (and to some extent, enterprise) frontier, there's a fresh battle for mindshare. That mindshare will be built on the size of the third-party app ecosystem that these platforms attract.

As Palm was always more of an enterprise than a consumer platform – before the Blackberry eclipsed it – HP's likely WebOS venue will be the enterprise space. That's another uphill battle against Microsoft (which has the office apps), Blackberry (with its substantial corporate email base), and yes, Apple, where enterprise users are increasingly sneaking iPhones in the back door, just like they did with PCs 25 years ago.

3. Build presence with Big Data
Like (1), this also hits a key checkbox for where to sell all those HP PCs. HP has had a half-hearted presence with the discontinued Neoview business. The Vertica acquisition was clearly the first one that had Apotheker's stamp on it. Of HP's announced strategies, this is the one that aligns closest with the enterprise software strategy that we've all expected Apotheker to champion. Obviously Vertica is the first step here – and there are many logical acquisitions that could fill this out, as we've noted previously regarding Tibco, Informatica, and Teradata. The important point is that classic business intelligence never really suffered through the recession, and arguably, Big Data is becoming the next frontier for BI – not just a nice-to-have, but increasingly an expected cost of competition.

What's interesting so far is that in all the talk about Big Data, there's been relatively scant attention paid to utilizing the cloud to provide the scaling to conduct such analytics. We foresee a market where organizations that don't necessarily want to buy all that hardware, and that run large advanced analytics on an event-driven basis, consume the cloud for their Hadoop – or Vertica – runs. Big Data analytics in the cloud could be HP's enterprise trump card.

HP’s acquisition of Vertica: Out with the old, in with the new

Ever since Leo Apotheker assumed the reins at HP – assisted by Ray Lane on the board – we have been awaiting the first hint that the new regime would take HP up the enterprise software food chain.

Apotheker has finally dropped the other shoe: HP has signed a definitive agreement to acquire Vertica, one of a growing family of columnar analytic databases. The announcement came barely two weeks after HP finally cut Neoview loose. It's a pretty logical course, as the Neoview technology was dated: inherited from HP's acquisition of Compaq, which in turn had swallowed Tandem – you get the picture – Neoview was the odd man out. Out with the old wave, in with the new, as Vertica is one of a group of emerging analytic database vendors. In this case, it is one of the first, but not the only, column-oriented, massively parallel (MPP) analytic database players out there.

Although it's Valentine's Day, we'll avoid the temptation to call this a sweetheart deal. But it is a shrewd acquisition for HP because, as one of a group of emerging players, Vertica is a good, modest-sized first step for Apotheker's software strategy. Vertica isn't so big that the acquisition cost would break the bank, nor so large a player as to add major organizational disruption. The move also has merit for other reasons: first, as a consumer of commodity hardware, Vertica provides a good fit for HP to add value with more appliances. And as implied above, it provides HP a foothold into the direction that analytics are going: towards Big Data (the ability to process Internet-scale troves of data) and Fast Data (the ability to process huge chunks of data in real time). Vertica promotes its ability to scale and provide near real-time performance.

One of many companies founded by Ingres and Postgres pioneer Michael Stonebraker, Vertica was founded in 2005, has amassed roughly $30 million in venture backing, and even in recession-wracked 2009 found its business tripling; the company now has over 150 customers. So the company is clearly a going concern.

Of course, Vertica is hardly unique here. It has a handful of rivals that are also commercializing columnar database technologies, with one, ParAccel, having drawn almost as much venture backing. Column-oriented technology has proven useful for analytic applications because (1) the need is not for retrieving individual records (rows) and (2) scanning by columns rather than rows saves lots of processing overhead. Admittedly, the technology is hardly new; Sybase acquired a startup in the mid 1990s, and Sybase IQ is an established product with a large installed base. But it is based on older SMP technology, as MPP was not economical back in the 90s. MPP-oriented columnar databases emerged over the past five years thanks to cheapening commodity hardware and increasing memory density — bringing massively parallel implementations within the budgets of mere mortals. Vertica also rides the Big Data wave, as it (like others) provides support for Hadoop for access to humongous Internet-scale data sources.
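A minimal sketch of why column orientation helps analytics (an illustrative in-memory layout, not Vertica's actual storage format): an aggregate over one column only has to touch that column when the table is stored column-wise, whereas a row-wise layout drags every field of every record through the scan.

```python
# Row-store vs. column-store layouts of the same toy table.
# Illustrative only -- not Vertica's (or anyone's) on-disk format.

rows = [                                   # row-oriented: one record per entry
    {"order_id": i, "region": "EMEA" if i % 2 else "AMER", "amount": 100.0 + i}
    for i in range(1_000)
]

columns = {                                # column-oriented: one array per field
    "order_id": [r["order_id"] for r in rows],
    "region":   [r["region"] for r in rows],
    "amount":   [r["amount"] for r in rows],
}

# Analytic query: total amount. The row scan touches every field of every row;
# the columnar scan reads only the 'amount' column.
total_from_rows = sum(r["amount"] for r in rows)
total_from_columns = sum(columns["amount"])
assert total_from_rows == total_from_columns
print(total_from_columns)
```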

HP had to make a move like this, and not because they now have an enterprise software guy in charge. With IBM and Netezza, Oracle and Exadata, and EMC and Greenplum, HP’s rivals were rapidly buying up analytic database appliance providers (we’re waiting for Dell to make its move). They were not buying up (or developing) analytic database hardware systems just because they were there, but because there is an increasing feeding frenzy for the ability to process Big Data and Fast Data as a result of:
1. Commodity hardware and lower cost memory, as mentioned above
2. Actual use cases that bring tangible benefits, such as sentiment analysis of social media
3. Use of predictive analytics becoming, not a competitive differentiator, but a cost of doing business
4. Open source technologies and APIs from Internet giants like Google, Facebook, and Yahoo bringing Big Data to the masses

This is not the first time that BI has entered a commoditization wave; it's just that the bar keeps rising. Two examples: Microsoft's packaging of OLAP as an option for SQL Server commoditized multi-dimensional databases, and enterprise applications began packaging their own analytics. In neither case did the market for advanced analytics disappear; rather, the SASs, Cognoses, and Business Objects of the world had to advance their capabilities. To this, add support for inline hardware platforms that perform the heavy lifting of processing huge amounts of data, or of processing data in (or near) real time. History will repeat itself again.

For HP, this is a logical first step towards raising its profile in the enterprise software space. It still has significant heavy lifting ahead of it, as it must balance the ongoing converged infrastructure initiatives that are melding 3COM, 3PAR, and Palm into the business. HP’s danger is getting distracted. With Vertica, it is a modest, digestible step and proof of concept as to whether HP can aim higher in its enterprise sales pitch.

HP puts Neoview out of its misery

Rumors of Neoview’s death are no longer exaggerated. It was revealed this week that HP has decided to pull the plug on support for its data warehousing system by 2014. Speculation got louder after last week’s announcement that HP was partnering with Microsoft to release a $2 million data warehouse appliance built around SQL Server.

The problem was that Neoview never really fit in at HP. It wasn't the timing – when HP announced the product in 2007, the economy was still booming, and more importantly, technology advancement was bringing Big Data and in-memory Fast Data within the reach of organizations beyond the Global 2000. In the past decade, a number of startups have emerged offering advanced analytic databases, giving Teradata and Sybase IQ new competition.

Until the ascension of Leo Apotheker, there were few people at HP who understood the software business, and even fewer who really understood enterprise software. HP's existing software business, which surprisingly doubled under Mark Hurd's watch, remains a modest 3% of revenues. The good news is that in the last few years, HP has made software a business — software is no longer an oxymoron over there. But the kind of software that HP sells is primarily infrastructure- and geek-oriented: IT infrastructure and service management, application lifecycle management, and telecom network node management. Try to figure out the synergy with business analytics.

Neoview itself was software that HP ended up with and half-heartedly (if that) tried to market. Originating as the NonStop SQL database of the former Tandem Computers fault tolerant midrange platform, the technology came to HP through Compaq’s 1997 acquisition of Tandem, and the subsequent HP acquisition of Compaq in 2002.

Ironically, HP's cutting the cord on Neoview actually gives Apotheker a fresh chance to re-energize this side of the HP software business. There are numerous rising analytic database players out there; beyond Netezza, which was acquired by IBM, Greenplum, which was scooped up by EMC, and Sybase, which is now part of SAP, you have many upstarts that could provide HP golden opportunities to do more SQL Server-like deals on terms that would likely be far more favorable to HP, while providing dress rehearsals for eventual acquisition.

The trick of course is that HP has to decide what it wants to be under the new Apotheker regime. Under Hurd, the mantra was converged infrastructure, which drove the Palm, 3Com, and 3PAR deals. So at this point, HP has some major digestion on its plate before looking for more big deals. We’ve speculated previously about the types of deals that Apotheker could do to push HP up the enterprise software food chain. But we’d suggest that HP take a deep breath: partner, staff up, and get to know this business first rather than repeat the same mistake twice.

IBM’s Software Complex

Sometimes the news is that there is no news. Well, Steve Mills did tell us that IBM is investing the bulk of its money in software and that between now and 2015, it would continue to make an average of $4 – 5 billion worth of strategic acquisitions per year. In other words, it would continue its current path: it will keep making acquisitions for the strategic value of the technology, with the guideline of having them become revenue-accretive within 2 – 4 years. Again, nothing new – not that there is anything wrong with that.

The blend of acquisition and organic development is obviously bulking up the Software Group's product portfolio, which in itself is hardly a bad thing; there is more depth and breadth. But the issue that IBM has always faced is that of complexity. The traditional formula has always been: we have the pieces, and we have the services to put them together for you. Players like Oracle compete with a packaged apps strategy; in more specialized areas such as project portfolio management, rivals like HP and CA Technologies point out that they have one product where IBM splits it in two.

IBM continues to deny that it is in the apps business, but it shows architectural slides of a stack that is based on middleware along with horizontal “solutions” such as the SPSS Decision Manager offering (more about that shortly); vertical industry frameworks that specify processes, best practices, and other assets that can be used to compose industry solutions; and, at the top of the stack, solutions that IBM and/or its services group develops. It's at the peak of the stack that the difference between “solutions” and “applications” becomes academic. Reviewing Oracle's yet-to-be-released Fusion applications, there is a similar architecture that composes solutions from modular building blocks.

So maybe IBM feels self-conscious about the term application as it doesn’t want to be classed with Oracle or SAP, or maybe it’s the growing level of competition with Oracle that made Mills rather defensive in responding to an analyst’s question about the difference between IBM’s and Oracle’s strategy. His response was that IBM’s is more of a toolkit approach that layers atop the legacy that will always be there, which is reasonable, although the tone was more redolent of “you [Oracle] can’t handle the truth.”

Either way, whether you sell a solution or a packaged application at the enterprise level, assembly will still be required. Services will be needed to integrate it and/or train your people. Let's be adults and get that debate behind us. For IBM, it's time to get back to Issue One: defusing complexity. When you're dealing with enterprise software, there will always be complexity. But when it comes to richness or simplicity, IBM tends to aim for the former. The dense slides with small print made the largely middle-aged analyst audience more self-conscious than normal about the inadequacies of their graduated eyeglasses or contacts.

OK, if you’re IBM facing an analyst crowd, you don’t want to oversimplify the presentation into the metaphorical equivalent of the large print weekly for the visually impaired. You must prove that you have depth. You need to show a memorable, coherent message (Smarter Planet was great when it débuted two years ago). But most importantly, you need to have coherent packaging and delivery to market.

IBM Software Group has done a good job of repurposing technologies across brands to fill defined product needs; it still has its work cut out for its goal of making the software group sales force brand agnostic. That is going to take time.

As a result, no good deed goes unpunished, with IBM's challenges with SPSS Decision Manager a great case in point. The product, an attempt to craft a wider market for SPSS capabilities, blends BI analytics from Cognos, rules management from ILOG, and event processing from WebSphere Business Events into a predictive analytics solution for fashioning business strategy, aimed at line-of-business users.

For instance, if you are in the claims processing group of an auto insurance company, you can use form-based interfaces to vary decision rules and simulate the results, to ensure that accident calls from 19-year-old drivers, or from those who have not yet contacted the police, are not fast-tracked for settlement.
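As a toy illustration of the kind of rule being varied (the field names and thresholds are made up, and this is not SPSS Decision Manager's actual rule syntax), the fast-track check might look like this:

```python
# Hypothetical fast-track rule for auto claims -- field names and thresholds are
# invented for illustration; this is not SPSS Decision Manager syntax.

FAST_TRACK_MIN_DRIVER_AGE = 20   # a parameter a business user might vary and re-simulate

def fast_track(claim):
    """Return True if a claim can skip manual review under the current rules."""
    if claim["driver_age"] < FAST_TRACK_MIN_DRIVER_AGE:
        return False                      # young drivers go to an adjuster
    if not claim["police_report_filed"]:
        return False                      # no police report: hold for review
    return claim["estimated_damage"] < 5_000

claims = [
    {"driver_age": 19, "police_report_filed": True,  "estimated_damage": 1_200},
    {"driver_age": 45, "police_report_filed": True,  "estimated_damage": 3_500},
    {"driver_age": 38, "police_report_filed": False, "estimated_damage": 900},
]
print([fast_track(c) for c in claims])    # [False, True, False]
```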

The problem with Decision Manager is that it is not a single SKU or install; IBM has simply pre-integrated components that you still must buy a la carte. IBM Software is already integrating product technologies; it now needs to attend to integrating delivery.

Leo Apotheker to target HP’s forgotten business

Ever since its humble beginnings in the Palo Alto garage, HP has always been kind of a geeky company – in spite of Carly Fiorina’s superficial attempts to prod HP towards a vision thing during her aborted tenure. Yet HP keeps talking about getting back to that spiritual garage.

Software has long been the forgotten business of HP. Although – surprisingly – the software business was resuscitated under Mark Hurd’s reign (revenues have more than doubled as of a few years ago), software remains almost a rounding error in HP’s overall revenue pie.

Yes, Hurd gave the software business modest support. Mercury Interactive was acquired under his watch, giving the business a degree of critical mass when combined with the legacy OpenView business. But during Hurd's era, there were much bigger fish to fry beyond all the internal cost cutting, for which Wall Street cheered but insiders jeered. Converged Infrastructure has been the mantra, reminding us one and all that HP was still very much a hardware company. The message remained loud and clear with HP's recent 3PAR acquisition, at a heavily inflated $2.3 billion, which was concluded in spite of the interim leadership vacuum.

The dilemma that HP faces is that, yes, it is the world’s largest hardware company (they call it technology), but the bulk of that is from personal systems. Ink, anybody?

The converged infrastructure strategy was a play at the CTO's office. Yet HP is a large enough company that it needs to compete in the leagues of IBM and Oracle, and for that it needs to get meetings with the CEO. Ergo, the rumors of feelers made to IBM Software's Steve Mills, the successful offer to Leo Apotheker, and the agreement for Ray Lane to serve as non-executive chairman.

Our initial reaction was one of disappointment; others have felt similarly. But Dennis Howlett feels that Apotheker is the right choice “to set a calm tone” and signal that there won't be a massive, debilitating reorg in the short term.

Under Apotheker’s watch, SAP stagnated, hit by the stillborn Business ByDesign and the hike in maintenance fees that, for the moment, made Oracle look warmer and fuzzier. Of course, you can’t blame all of SAP’s issues on Apotheker; the company was in a natural lull cycle as it was seeking a new direction in a mature ERP market. The problem with SAP is that, defensive acquisition of Business Objects notwithstanding, the company has always been limited by a “not invented here” syndrome that has tended to blind the company to obvious opportunities – such as inexplicably letting strategic partner IDS Scheer slip away to Software AG. Apotheker’s shortcoming was not providing the strong leadership to jolt SAP out of its inertia.

Instead, Apotheker’s – and Ray Lane’s for that matter – value proposition is that they know the side of the enterprise market that HP doesn’t. That’s the key to this transition.

The next question becomes acquisitions. HP has a lot on its plate already. It took at least 18 months for HP to digest the $14 billion acquisition of EDS, which provided a critical mass of IT services and data center outsourcing business. It is still digesting nearly $7 billion of subsequent acquisitions of 3Com, 3PAR, and Palm to make its converged infrastructure strategy real. HP might be able to get backing to make new acquisitions, but the dilemma is that Converged Infrastructure is a stretch in the opposite direction from enterprise software. So it's not just a question of whether HP can digest another acquisition; it's an issue of whether HP can strategically focus in two different directions that ultimately might come together, but not for a while.

So let’s speculate about software acquisitions.

SAP, the most logical candidate, is, in a narrow sense, relatively “affordable” given that its stock is roughly 10 – 15% off its 2007 high. But SAP would obviously be the most challenging given its scale; it would be difficult enough for HP to digest SAP under normal circumstances, but with all the converged infrastructure stuff on its plate, it's back to the question of how you can be in two places at once. Infor is a smaller company, but as it is also a polyglot of many smaller enterprise software firms, it would present HP additional integration headaches that it doesn't need.

HP may have little choice but to make a play for SAP if IBM or Microsoft were unexpectedly to actively bid. Otherwise, its best bet is to revive the relationship which would give both companies the time to acclimate. But in a rapidly consolidating technology market, who has the luxury of time these days?

Salesforce.com would be a logical play, as it would reinforce HP Enterprise Services' (formerly EDS) outsourcing and BPO business. It would be far easier for HP to get its arms around this business. The drawback is that Salesforce.com would not be very extensible as an application, as it uses a proprietary stored-procedures database architecture. That would make it difficult to integrate with a prospective ERP SaaS acquisition, which would otherwise be the next logical step in growing the enterprise software footprint.

Informatica is often brought up – if HP is to salvage its Neoview BI business, it would need a data integration engine to help bolster it. Better yet, buy Teradata, which is one of the biggest resellers of Informatica PowerCenter – that would give HP a far more credible presence in the analytics space. Then it would have to ward off Oracle – which has an even more pressing need for Informatica to fill out the data integration piece of its Fusion middleware stack. But with Teradata, there would at least be a real anchor for the Informatica business.

HP has to decide what kind of company it needs to be, as Tom Kucharvy summarized well a few weeks back. Can HP afford to converge itself in another direction? Can it afford not to? Leo Apotheker has a heck of a listening tour ahead of him.