03.15.11

Big Data analytics in the cloud could be HP’s enterprise trump card

Posted in Application Development, Application Lifecycle Management (ALM), Big Data, Business Intelligence, Cloud, Data Management, Enterprise Applications, OS/Platforms, Outsourcing, Rich Internet Apps., SaaS (Software as a Service) at 10:47 am by Tony Baer

Unfortunately, scheduling conflicts kept us from attending Leo Apotheker’s keynote today at the HP Analyst Summit in San Francisco. But yesterday, he tipped his cards on his new software vision for HP before a group of investment analysts. HP’s software focus is not to reinvent the wheel – at least when it comes to enterprise apps. Apotheker has to put to rest any notion that he’s about to stage a grudge match and buy the company that dismissed him. There is already plenty of coverage here, an interesting comment from Tom Foremski (we agree with him about SAP being a non-starter), and the Software Advice guys, who are conducting a poll.

To some extent this comes as little surprise, given HP’s already stated plans for WebOS and its recently announced acquisition of Vertica. We do have one question, though: what happened to Converged Infrastructure?

For now, we’re not revisiting the acquisition stakes, although if you follow the #HPSummit Twitter tag, you’ll probably see lots of ideas floating around after 9am Pacific time today. We’ll instead focus on the kind of company HP wants to be, based on its stated objectives.

1. Develop a portfolio of cloud services from infrastructure to platform services and run the industry’s first open cloud marketplace that will combine a secure, scalable and trusted consumer app store and an enterprise application and services catalog.

This hits two points on the checklist: first, provide a natural market for all those PCs that HP sells; second, venture higher up the food chain than just selling lots of iron. That certainly makes sense. The last part is where we have a question: offering cloud services to consumers, the enterprise, and developers sounds at first blush like HP wants its cloud to be all things to all people.

The good news is that HP has a head start on the developer side, where it has been offering performance testing services for years – but it is now catching up to providers like CollabNet (with which it is aligned, and which would make a logical acquisition candidate) and Rally in offering higher-value planning services for the app lifecycle.

In the other areas – consumer apps and enterprise apps – HP is starting from square one. It obviously must separate the two, as cloud is just about the only thing that the two have in common.

For the consumer side, HP (like Google Android and everyone else) is playing catch-up to Apple. It is not simply a matter of building it and expecting they will come. Apple has built an entire ecosystem around its iOS platform that has penetrated content and retail – challenging Amazon, not just Salesforce or a would-be HP – using its user experience as the basis for building a market for an audience that is dying to be captive. For its part, HP hopes to build WebOS to have the same “Wow!” factor as the iPhone/iPad experience. It’s got a huge uphill battle on its hands.

For the enterprise, it’s a more wide-open space where only Salesforce’s AppExchange has made any meaningful mark. Again, the key is a unifying ecosystem, with the most likely outlet being enterprise outsourcing customers of HP’s Enterprise Services (the former EDS operation). The key principle is that when you build a marketplace, you have to identify who your customers are and give them a reason to visit. A key challenge, as we’ve stated in our day job, is that enterprise applications are not the equivalent of those $2.99 apps that you’ll see in the AppStore. The experience at Salesforce – the classic inversion of the long tail – is that the market is primarily for add-ons to the Salesforce.com CRM application or use of the Force.com development platform, but most entries simply get buried deep down the list.

Enterprise apps marketplaces are not simply going to provide a cheaper channel for solutions that still require consultative sells. We’ve suggested that they adhere more to the user group model, which also includes forums, chats, exchanges of ideas, and, by the way, places to get utilities that can make enterprise software programs more useful. Enterprise app stores are not an end in themselves, but a means of reinforcing a community – whether for a core enterprise app or, more likely for HP, for the community of outsourcing customers that it already has.

2. Build webOS into a leading connectivity platform.
HP clearly hopes to replicate Apple’s success with iOS here – the key being that it wants to extend the next-generation Palm platform to its base of PCs and other devices. This one’s truly a Hail Mary pass designed to rescue the Palm platform from irrelevance in a market where iOS, Android, Adobe Flash, Blackberry, and Microsoft Windows 7/Silverlight are battling it out. Admittedly, mobile developers have always tolerated fragmentation as a fact of life in this space – but of course that was when the stakes (with feature phones) were rather modest. With smart devices – in all their varied form factors, from phone to tablet – becoming the next major consumer (and to some extent, enterprise) frontier, there’s a fresh battle for mindshare. That mindshare will be built on the size of the third-party app ecosystem that these platforms attract.

As Palm was always more an enterprise than a consumer platform – before the Blackberry eclipsed it – HP’s likely WebOS venue will be the enterprise space. That means another uphill battle with Microsoft (which has the office apps), Blackberry (with its substantial corporate email base), and yes, Apple, where enterprise users are increasingly sneaking iPhones in the back door, just like they did with PCs 25 years ago.

3. Build presence with Big Data
Like (1), this also hits a key checkbox for where to sell all those HP PCs. HP has had a half-hearted presence here with the discontinued Neoview business. The Vertica acquisition was clearly the first one that had Apotheker’s stamp on it. Of HP’s announced strategies, this is the one that aligns closest with the enterprise software strategy that we’ve all expected Apotheker to champion. Obviously Vertica is the first step here – and there are many logical acquisitions that could fill this out, as we’ve noted previously regarding Tibco, Informatica, and Teradata. The important point is that classic business intelligence never really suffered through the recession, and big data is arguably becoming the next frontier for BI – not just a nice-to-have, but increasingly an expected cost of competition.

What’s interesting so far is that in all the talk about Big Data, there’s been relatively scant attention paid to utilizing the cloud to provide the scaling needed to conduct such analytics. We foresee a market where organizations that don’t necessarily want to buy all that infrastructure, and that run large advanced analytics on an event-driven basis, consume the cloud for their Hadoop – or Vertica – runs. Big Data analytics in the cloud could be HP’s enterprise trump card.

12.01.10

IBM’s Software Complex

Posted in Application Development, BPM, Business Intelligence, Database, Enterprise Applications, Enterprise Integration, IT Services & Systems Integration, Middleware, Technology Market Trends at 1:28 am by Tony Baer

Sometimes the news is that there is no news. Well, Steve Mills did tell us that IBM is investing the bulk of its money in software, and that between now and 2015 it would continue to make an average of $4 – 5 billion worth of strategic acquisitions per year. In other words, it would continue its current path: making acquisitions for the strategic value of their technology, with the guideline of having them become revenue accretive within 2 – 4 years. Again, nothing new – as if there were anything wrong with that.

The blend of acquisition and organic development is obviously bulking up the Software Group’s product portfolio, which in itself is hardly a bad thing; there is more depth and breadth. But the issue that IBM has always faced is complexity. The traditional formula has always been: we have the pieces, and we have services to put them together for you. Players like Oracle compete with a packaged-apps strategy; in more specialized areas such as project portfolio management, rivals like HP and CA Technologies claim to offer one product where IBM splits the function in two.

IBM continues to deny that it is in the apps business, but its architectural slides show a stack based on middleware, along with horizontal “solutions” such as the SPSS Decision Manager offering (more about that shortly); vertical industry frameworks that specify processes, best practices, and other assets that can be used to compose industry solutions; and, at the top of the stack, solutions that IBM and/or its services group develops. It’s at the peak of the stack that the difference between “solutions” and “applications” becomes academic. Reviewing Oracle’s yet-to-be-released Fusion applications, you see a similar architecture that composes solutions from modular building blocks.

So maybe IBM feels self-conscious about the term “application” because it doesn’t want to be classed with Oracle or SAP, or maybe it’s the growing level of competition with Oracle that made Mills rather defensive in responding to an analyst’s question about the difference between IBM’s and Oracle’s strategies. His response was that IBM’s is more of a toolkit approach that layers atop the legacy that will always be there – which is reasonable, although the tone was more redolent of “you [Oracle] can’t handle the truth.”

Either way, whether you sell a solution or a packaged application at the enterprise level, assembly will still be required. Services will be needed to integrate the software and/or train your people. Let’s be adults and get that debate behind us. For IBM, it’s time to get back to Issue One: defusing complexity. When you’re dealing with enterprise software, there will always be complexity. But when it comes to richness versus simplicity, IBM tends to aim for the former. The dense slides with small print made the largely middle-aged analyst audience more self-conscious than usual about the inadequacies of their graduated eyeglasses or contacts.

OK, if you’re IBM facing an analyst crowd, you don’t want to oversimplify the presentation into the metaphorical equivalent of the large-print weekly for the visually impaired. You must prove that you have depth. You need to show a memorable, coherent message (Smarter Planet was great when it debuted two years ago). But most importantly, you need to have coherent packaging and delivery to market.

IBM Software Group has done a good job of repurposing technologies across brands to fill defined product needs; it still has its work cut out to meet its goal of making the Software Group sales force brand-agnostic. That is going to take time.

As a result, good deeds don’t go unpunished, and IBM’s challenges with SPSS Decision Manager are a great case in point. The product, an attempt to craft a wider market for SPSS capabilities, blends BI analytics from Cognos, rules management from Ilog, and event processing from WebSphere Business Events into a predictive analytics solution, aimed at line-of-business users, for fashioning business strategy.

For instance, if you are in the claims processing group of an auto insurance company, you can use form-based interfaces to vary decision rules and simulate the results – for example, to ensure that accident calls from 19-year-old drivers, or from those who have not yet contacted the police, are not fast-tracked for settlement.
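
To make the flavor of such a rule concrete, here is a minimal sketch in Python – our own illustration, since Decision Manager expresses rules through forms rather than hand-written code; all names and thresholds below are hypothetical:

    from dataclasses import dataclass

    @dataclass
    class Claim:
        driver_age: int            # age of the driver filing the claim
        police_contacted: bool     # has a police report been initiated?

    def fast_track(claim: Claim) -> bool:
        """Return True if the claim qualifies for fast-track settlement."""
        if claim.driver_age <= 19:        # young-driver risk rule
            return False
        if not claim.police_contacted:    # no police report yet
            return False
        return True

    # "Simulating" a rule change amounts to re-running historical claims
    # against the modified predicate and comparing outcomes.
    print(fast_track(Claim(driver_age=19, police_contacted=True)))   # False
    print(fast_track(Claim(driver_age=45, police_contacted=True)))   # True

The point of a tool like Decision Manager is that a business user varies the thresholds and conditions through forms and simulations, rather than a developer editing code.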

The problem with Decision Manager is that it is not a single SKU or install; IBM has simply pre-integrated components that you still must buy a la carte. IBM Software is already integrating product technologies; it now needs to attend to integrating delivery.

10.01.10

Leo Apotheker to target HP’s forgotten business

Posted in Application Development, Business Intelligence, Data Management, Database, Enterprise Applications, IT Infrastructure, IT Services & Systems Integration, Networks, Outsourcing, SaaS (Software as a Service), Storage, Systems Management, Technology Market Trends at 1:35 pm by Tony Baer

Ever since its humble beginnings in the Palo Alto garage, HP has always been kind of a geeky company – in spite of Carly Fiorina’s superficial attempts to prod HP towards a vision thing during her aborted tenure. Yet HP keeps talking about getting back to that spiritual garage.

Software has long been the forgotten business of HP. Although – surprisingly – the software business was resuscitated under Mark Hurd’s reign (revenues more than doubled over a few years), software remains almost a rounding error in HP’s overall revenue pie.

Yes, Hurd gave the software business modest support. Mercury Interactive was acquired under his watch, giving the business a degree of critical mass when combined with the legacy OpenView business. But during Hurd’s era, there were much bigger fish to fry, beyond all the internal cost cutting for which Wall Street cheered but insiders jeered. Converged Infrastructure has been the mantra, reminding one and all that HP was still very much a hardware company. The message remains loud and clear with HP’s recent 3PAR acquisition at a heavily inflated $2.3 billion, which was concluded in spite of the interim leadership vacuum.

The dilemma that HP faces is that, yes, it is the world’s largest hardware company (they call it technology), but the bulk of that is from personal systems. Ink, anybody?

The converged infrastructure strategy was a play for the CTO’s office. Yet HP is a large enough company that it needs to compete in the leagues of IBM and Oracle, and for that it needs to get meetings with the CEO. Ergo the rumors of feelers made to IBM Software’s Steve Mills, the successful offer to Leo Apotheker, and the agreement for Ray Lane to serve as non-executive chairman.

Our initial reaction was one of disappointment; others have felt similarly. But Dennis Howlett feels that Apotheker is the right choice “to set a calm tone” – that there won’t be a massive, debilitating reorg in the short term.

Under Apotheker’s watch, SAP stagnated, hit by the stillborn Business ByDesign and the hike in maintenance fees that, for the moment, made Oracle look warmer and fuzzier. Of course, you can’t blame all of SAP’s issues on Apotheker; the company was in a natural lull cycle as it sought a new direction in a mature ERP market. The problem with SAP is that, the defensive acquisition of Business Objects notwithstanding, the company has always been limited by a “not invented here” syndrome that has tended to blind it to obvious opportunities – such as inexplicably letting strategic partner IDS Scheer slip away to Software AG. Apotheker’s shortcoming was not providing the strong leadership to jolt SAP out of its inertia.

Instead, Apotheker’s – and Ray Lane’s for that matter – value proposition is that they know the side of the enterprise market that HP doesn’t. That’s the key to this transition.

The next question becomes acquisitions. HP has a lot on its plate already. It took at least 18 months for HP to digest the $14 billion acquisition of EDS, which provided a critical mass in IT services and data center outsourcing. It is still digesting nearly $7 billion of subsequent acquisitions – 3Com, 3PAR, and Palm – made to turn its converged infrastructure strategy into reality. HP might be able to get backing for new acquisitions, but the dilemma is that Converged Infrastructure is a stretch in the opposite direction from enterprise software. So it’s not just a question of whether HP can digest another acquisition; it’s a question of whether HP can strategically focus in two different directions that ultimately might come together, but not for a while.

So let’s speculate about software acquisitions.

SAP, the most logical candidate, is in a narrow sense relatively “affordable,” given that its stock is roughly 10 – 15% off its 2007 high. But SAP would obviously be the most challenging given its scale; it would be difficult enough for HP to digest SAP under normal circumstances, but with all the converged infrastructure work on its plate, it’s back to the question of how you can be in two places at once. Infor is a smaller company, but as it is also a polyglot of many smaller enterprise software firms, it would present HP with additional integration headaches that it doesn’t need.

HP may have little choice but to make a play for SAP if IBM or Microsoft were unexpectedly to make an active bid. Otherwise, its best bet is to revive the relationship, which would give both companies time to acclimate. But in a rapidly consolidating technology market, who has the luxury of time these days?

Salesforce.com would be a logical stab, as it would reinforce HP Enterprise Services’ (formerly EDS) outsourcing and BPO business. It would be far easier for HP to get its arms around that business. The drawback is that Salesforce.com would not be very extensible as an application, because it uses a proprietary stored-procedures database architecture. That would make it difficult to integrate with a prospective ERP SaaS acquisition, which would otherwise be the next logical step in growing the enterprise software footprint.

Informatica is often brought up – if HP is to salvage its Neoview BI business, it needs a data integration engine to help bolster it. Better yet, buy Teradata, which is one of the biggest resellers of Informatica PowerCenter – that would give HP a far more credible presence in the analytics space. HP would then have to ward off Oracle, which has an even more pressing need for Informatica to fill out the data integration piece of its Fusion middleware stack. But with Teradata, there would at least be a real anchor for the Informatica business.

HP has to decide what kind of company it needs to be as Tom Kucharvy summarized well a few weeks back. Can HP afford to converge itself in another direction? Can it afford not to? Leo Apotheker has a heck of a listening tour ahead of him.

05.24.10

IBM offers to buy Sterling Commerce

Posted in BPM, Enterprise Applications, Enterprise Integration, IT Services & Systems Integration, Middleware, SOA & Web Services, Supply Chain Management at 4:19 pm by Tony Baer

We should have seen this one coming. IBM’s offer to buy Sterling Commerce from AT&T for $1.4 billion closes a major gap in the WebSphere portfolio, extending IBM’s array of internal integrations externally to B2B. It’s a logical extension, and IBM is hardly the first to travel this path: webMethods began life as a B2B integration firm before morphing into EAI, and later SOA and BPM middleware, prior to its acquisition by Software AG. In turn, Tibco recently added Foresight Software as an opportunistic extension for taking advantage of a booming market in healthcare B2B transactions.

But neither Software AG’s nor Tibco’s moves approach the scope of Sterling Commerce’s footprint in B2B trading partner management, a business that grew out of its heritage as one of the major EDI (electronic data interchange) hubs. The good news is the degree of penetration that Sterling has; the other (we won’t call it “bad”) news is all the EDI legacy, which provides great fodder for IBM’s Global Business Services arm to address a broader application modernization opportunity.

Sterling’s base has been heavily in downstream EDI and related trading partner management support for retailers, manufacturers, and transportation/freight carriers. Its software products cover B2B/EDI integration, partner onboarding into partner communities (an outgrowth of the old hub-and-spoke patterns between EDI trading partners), invoicing, payments, order fulfillment, and multi-channel sales. In effect, this gets IBM deeper into the supply chain management applications market, where it already has Dynamic Inventory Optimization (DIOS) from the Maximo suite (which falls under the Tivoli umbrella), not to mention the supply chain optimization algorithms that it inherited as part of the Ilog acquisition, which are OEM’ed to partners (rivals?) like SAP and JDA.

Asked if the acquisition of Sterling would place IBM in competition with its erstwhile ERP partners, IBM reiterated its official line that it picks up where ERP leaves off – but that line is getting blurrier.

But IBM’s challenge is prioritizing the synergies and integrations. As there is still a while before this deal closes – approvals from AT&T shareholders are necessary first – IBM wasn’t about to give a roadmap. But it did point to one no-brainer: infusing IBM WebSphere’s vertical industry templates for retail with Sterling content. Many other potential synergies loom.

At top of mind are BPM and business rules management, which could make trading partner relationships more dynamic. There are obvious opportunities for WebSphere Business Modeler’s Dynamic Process Edition, WebSphere Lombardi Edition’s modeling, and/or Ilog’s business rules. For instance, a game-changing event such as Apple’s iPad creating a new market for tablets could provide the impetus for changes to product catalogs, pricing, promotions, and so on; a BPM or business rules model could facilitate such changes as an orchestration layer acting in conjunction with some of the Sterling multi-channel and order fulfillment suites. Other examples include master data management, which can be critical when managing the sale of families of like products through the channel; and of course Cognos/BI, which can be used for evaluating the profitability or growth potential of B2B relationships.

Altimeter Group’s Ray Wang voiced a question that was on many of our minds: why would AT&T give up Sterling? IBM responded by citing potential partnership opportunities, but to our mind, AT&T has its hands full attaining network parity with Verizon Wireless and is simply not a business solutions company.

03.10.10

HP analyst meeting 2010: First Impressions

Posted in Application Development, Application Lifecycle Management (ALM), Business Intelligence, Cloud, Data Management, IT Infrastructure, IT Services & Systems Integration, Legacy Systems, Networks, Outsourcing, Systems Management at 12:34 am by Tony Baer

Over the past few years, HP under Mark Hurd has steadily gotten its act together, refocusing on the company’s core strengths with an unforgiving eye on the bottom line. Sitting at HP’s annual analyst meeting in Boston this week, we found ourselves comparing notes with our impressions from last year. Last year, our attention was focused on Cloud Assure; this year, it’s the integration of EDS into the core business.

HP now bills itself as the world’s largest pure IT company and ninth in the Fortune 500. Of course, there’s the consumer side of HP that the world knows. But with the addition of EDS, HP finally has a credible enterprise computing story (as opposed to being an enterprise server company). Now we’ll get plenty of flak from our friends at HP for that one, as HP has historically had the largest market share for SAP servers. But let’s face it: prior to EDS, the enterprise side of HP was primarily a distributed (read: Windows or UNIX) server business. Professional services was pretty shallow, with scant knowledge of the mainframes that remain the mainstay of corporate computing. Aside from communications and media, HP’s vertical industry practices were few and far between. HP still lacks the vertical breadth of IBM, but with EDS it has gained critical mass in sectors ranging from federal to manufacturing, transport, financial services, and retail, among others.

Having EDS also makes credible initiatives such as Application Transformation, a practice that helps enterprises prune, modernize, and rationalize their legacy application portfolios. Clearly, Application Transformation is not a purely EDS offering; it originated with Ann Livermore’s Enterprise Business group and draws upon HP Software assets such as discovery and dependency mapping, Universal CMDB, PPM, and the recently introduced IT Financial Management (ITFM) service. But to deliver, you need bodies – people who know the mainframe, where most of the apps being harvested or thinned out live. And that’s where EDS helps HP flesh this out into a real service.

But EDS is so 2009; the big news on the horizon is 3Com, a company that Cisco left in the dust before it rethought its product line and carved out a highly noticeable 30% market share for network devices in China. Once the deal is closed, 3Com will be front and center in HP’s converged computing initiative, which until now primarily consisted of blades and ProCurve VoIP devices. HP gains a much wider range of network devices to compete head-on as Cisco itself goes up the stack into a unified server business. Once the 3Com deal is closed, HP will have to invest significant time, energy, and resources to deliver on the converged computing vision with an integrated product line, rather than a bunch of offerings that fill the squares of a PowerPoint matrix chart.

According to Livermore, the company’s portfolio is “well balanced.” We’d beg to differ when it comes to software, which accounts for a paltry 3% of revenues (a figure that, our friends at HP reiterated, understates the real contribution of software to the business).

It’s the side of the business that suffered from (choose one) benign or malign neglect prior to the Mark Hurd era. HP originated network node management software for distributed networks, an offering that eventually morphed into the former OpenView product line. Yet HP was so oblivious to its own software products that at one point its server folks promoted bundling a rival product from CA. Nonetheless, somehow the old HP managed not to kill off OpenView or OpenCall (the product now at the heart of HP’s communications and media solutions) – although we suspect that was probably more out of neglect than intent.

Under Hurd, software became strategic, a development that led to the transformational acquisition of Mercury, followed by Opsware. HP had the foresight to place the Mercury, Opsware, and OpenView products within the same business unit, as – in our view – the application lifecycle should encompass managing the runtime (although to this day HP has not really integrated OpenView with Mercury Business Availability Center; the products still appeal to different IT audiences). But there are still holes – modest ones on the ALM side, but major ones elsewhere, like in business intelligence, where Neoview sits alone, or in the converged computing stack and cloud-in-a-box offerings, which could use strong identity management.

Yet if HP is to become a more well-rounded enterprise computing company, it needs more infrastructural software building blocks. To our mind, Informatica would make a great addition: it would bring more attention to Neoview as a credible BI business, and Informatica’s data transformation capabilities could play key roles in the Application Transformation service.

We’re concerned that, with the integration of 3Com set to consume considerable energy in the coming year, the software group may not have the resources to conduct the transformational acquisitions needed to more firmly entrench HP as an enterprise computing player. We hope to be proven wrong.

03.17.09

The Network is the Computer

Posted in Cloud, IT Infrastructure, IT Services & Systems Integration, Linux, Networks, OS/Platforms, Storage, Systems Management, Technology Market Trends, Virtualization at 1:50 pm by Tony Baer

It’s sometimes funny that history takes some strange turns. Back in the 1980s, Sun began building its empire in the workgroup by combining two standards: UNIX boxes with TCP/IP networks built in. Sun’s “The Network is the Computer” message declared that computing was of little value without the network. Of course, Sun hardly had a lock on the idea: Bob Metcalfe devised the law stating that the value of a network is proportional to the square of the number of nodes connected, and Digital (DEC) (remember them?) actually scaled out the idea at the division level while Sun was elbowing its way into the workgroup.
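
For reference, Metcalfe’s formulation values a network by its number of distinct pairwise connections, which grows with the square of the node count:

    V(n) \propto \binom{n}{2} = \frac{n(n-1)}{2} \sim n^2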

Funny that DEC was there first, but it only got the formula half right – bundling a proprietary OS to a standard networking protocol. Fast forward a decade and Digital was history, while Sun was the dot in dot-com. Go a few more years, and as Linux made even a “standard” OS like UNIX look proprietary, Sun suffered DEC’s fate (OK, it hasn’t been acquired – yet – and still has cash reserves, if it could only figure out what to do when it finally grows up), while bandwidth and blades became commodity enough that businesses started thinking the cloud might be a cheaper, more flexible alternative to the data center. Throw in a very wicked recession, and companies are starting to think that the numbers around the cloud – cheap bandwidth, commodity OS, commodity blades – might provide the avoided-cost dollars they’ve all been looking for. That is, if they can be assured that placing data out in the cloud won’t create any regulatory or privacy headaches.

So today, after dropping hints for months, Cisco finally made it official: its Unified Computing System is to provide, in essence, a prepackaged data center:

Blades + Storage Networking + Enterprise Networking in a box.

By now you’ve probably read the headlines – UCS is supposed to bring what observers like Dana Gardner term an iPhone-like unity to the piece parts that pass for data centers. It combines blades, network devices, and storage management with VMware’s virtualization platform (as you might recall, Cisco owns a $150 million chunk of VMware) to provide, in essence, a data center appliance in the cloud.

In a way, UCS is a closing of the circle that began with mainframe host/terminal architectures of a half century ago: a single monolithic architecture with no external moving parts.

Of course, just as Sun wasn’t the first to exploit the TCP/IP network but got the lion’s share of the credit, Cisco is hardly the first to bridge the gap between compute and networking nodes. Sun already has a Virtual Network Machines Project for processing network traffic on general-purpose servers, while its Project Crossbow is supposed to make networks virtual as part of its OpenSolaris project. That sounds to us like a nice open source research project limited to the context of the Solaris OS. Meanwhile, HP has ramped up its ProCurve business, which aims at the heart of Cisco territory. Ironically, the dancer left on the sidelines is IBM, which sold off its global networking business to AT&T over a decade ago, and its ROLM network switches nearly a decade before that.

It’s also not Cisco’s first foray out of the base of the network OSI stack. Anybody remember Application-Oriented Networking? Cisco’s logic – building a level of content-based routing into its devices – was supposed to make the network “understand” application traffic. Yes, it secured SAP’s endorsement for the rollout, but who were you really going to sell this to in the enterprise? Application engineers didn’t care for the idea of ceding some of their domain to their network counterparts. On the other hand, Cisco’s successful foray into storage networking proves that the company is not a one-trick pony.

Several factors make UCS different this go-round. The commoditization of hardware and firmware, along with the emergence of virtualization and the cloud, makes the division between networking, storage, and the data center OS artificial. The recession makes enterprises hungry for found money, and the maturation of the cloud incents cloud providers to buy prepackaged modules to cut acquisition costs and improve operating margins. Cisco’s lineup of partners is also impressive – VMware, Microsoft, Red Hat, Accenture, BMC, etc. – but names and testimonials alone won’t make UCS fly. The fact is that IT has no more hunger for data center complexity, the divisions between OS, storage, and networking no longer add value, and cloud providers need a rapid way of prefabricating their deliverables.

Nonetheless, we’ve heard lots of promises of all-in-one before. The good news is that this time around there’s lots of commodity technology and standards available. But if Cisco is to offer a real alternative to IBM, HP, or Dell, it’s got to make the data-center-in-a-box – or cloud-in-a-box – a reality.

10.02.08

Is Social Media vs. Knowledge Management a generational war?

Posted in Enterprise Applications, IT Services & Systems Integration, Web 2.0 Apps at 11:52 am by Tony Baer

The Y2K issue a decade ago brought to light a critical problem facing many organizations: what happens when your most experienced minds retire? That’s especially critical in the case of skillsets for technologies, architectures, or methodologies that are no longer in vogue. That brought forth the idea that, if you can’t prevent the passage of time, there might as well be some way to harvest the experience gained from it.

The question is whether you do so in a carefully organized, top-down fashion or instead encourage a culture of more informal, organic knowledge sharing. There’s no single silver bullet, but what’s always disturbed us has been those top-down enterprise knowledge management projects that appeared to us as little more than make-work for highly paid enterprise consultants. The problem we had with classical Knowledge Management was that the whole idea seemed too difficult to put boundaries around: just where exactly do you draw the lines on a knowledge management project? Even ERP projects, which were notable for their cost overruns, had more tangible targets: implement a new transaction system that, in many cases, would require reengineering of business processes. And we know how well contained those projects ended up.

Along came Web 2.0 and social media, which provided new technologies for the grassroots – no longer did anyone have to wait for some project manager to start a harvesting session that would then be converted into retrievable assets in some application requiring significant custom coding. Instead, the notion behind wikis, blogs, microblogs, chats, forums, and so on is to use the right tool for the purpose as the purpose arises. Some call it fun. We’ve thought of the new social media as the next-generation Knowledge Management.

A couple of days ago, Xerox researcher (and of course blogger) Venkatesh G. Rao wrote about social media and knowledge management as a generational war. He spoke of several occasions where he was asked to give token advice to knowledge management project leads or consultants, with the idea of finding a middle ground to update knowledge management practices and make them more webby and agile. We thought he got kind of carried away with the generational argument, which was rife with stereotypes. Sure, in all likelihood Twittering, Facebook, et al. tend to hit a younger demographic, but use of Web 2.0 tools is definitely not restricted to people under 30.

We’re more in agreement with another facet of his argument, that conventional knowledge management is more of a waterfall process, whereas social media tends to be more agile. It’s very much analogous to the different methodologies of software development, not to mention the idea of top-down vs. bottom-up.

Conventional knowledge management initiatives tend to be top-down affairs, driven by plenty of advance planning: designating experts and thought leaders, harvesting or codifying their insights, developing applications and databases, and inputting it all. By contrast, social media are designed for use when the muse hits. The result might not be comprehensive, but it provides a fast outlet. Through processes such as folksonomies, it applies a Wikipedia-like grassroots approach to classifying or giving meaning to knowledge; through easy-to-use technologies with liberal use of tagging, it helps insight originators get their thoughts down in a retrievable form at whatever moment they have the time to contribute.
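
Mechanically, a folksonomy is little more than an inverted index from freely chosen tags to content. Here is a minimal sketch in Python (our own illustration; the item names and tags are hypothetical):

    from collections import defaultdict

    # Map each tag to the set of items carrying it. Contributors tag freely;
    # no centrally designed taxonomy is required.
    tag_index = defaultdict(set)

    def tag_item(item, *tags):
        for tag in tags:
            tag_index[tag.lower()].add(item)

    tag_item("Post-mortem: Y2K remediation", "COBOL", "legacy", "lessons")
    tag_item("Notes from the ERP cutover", "ERP", "lessons")

    # Retrieval is a simple set intersection over tags.
    print(tag_index["lessons"] & tag_index["cobol"])   # {'Post-mortem: Y2K remediation'}

Classification emerges from the aggregate of individual tagging decisions rather than from an up-front schema – which is precisely the bottom-up quality at issue here.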

The battle is really about whether learning is a bottom-up rather than a top-down exercise. Many of us have been around for years contributing “folk” knowledge, but until recently lacked the tools to share it. The idea that the debate between knowledge management and social media is a generational divide is hogwash.

05.13.08

HP and EDS: Picking Up where Carly Left Off

Posted in IT Services & Systems Integration, Outsourcing, Technology Market Trends at 10:44 am by Tony Baer

After the news leaked yesterday, HP confirmed this morning that it would acquire EDS for north of $12 billion. The obvious driver, as colleague Dana Gardner noted, is that with IBM and its global services colossus, and the growth of outsourced, cloud, or on-demand computing, enterprise customers were going to demand a viable alternative – an Avis to IBM’s Hertz, so to speak.

Of course the deal brings back memories of where Mark Hurd’s predecessor left off: the attempted purchase of PwC Consulting for $18 billion back in 2000. Barely a couple of years later, IBM scooped PwC up for roughly a fifth of that price. Besides the ridiculous price tag (these were pre-inflated dollars), there was the question of culture clash. HP’s techie culture seemed a poor fit for PwC’s suit-and-tie atmosphere; as we maintained, IBM and PwC were a much better fit.

And while $12 billion still sounds like a lot of money, it’s probably about half of what HP would have paid back in 2000, in those pre-inflated dollars.

But what a difference a few years and a more focused senior management team make. Not only has Hurd rationalized the Compaq acquisition, but for the first time his team has actually cultivated HP Software as more than an oxymoron, bulking it up with some shrewd acquisitions. Admittedly, success at HP Software doesn’t automatically portend similar results with EDS, which has been through the acquisition game before. More importantly, compared to Fiorina, Hurd exercises the far more hands-on management style that will be necessary to pull such a transformative deal off.

Among the challenges is reorienting EDS away from IBM to promote HP infrastructure. Given that EDS is stronger in infrastructure outsourcing than in mainstream systems integration, that might be a smoother shift, but it will require an internal migration of expertise. EDS is already part way down the transformation road, having been slimmed down by CEO Ronald Rittenmeyer (he stays on as business unit head), and before that, Michael Jordan (no, not the Air Jordans guy).

Significantly, HP will preserve EDS’s identity and autonomy, handing over some of Ann Livermore’s services operations. With more engaged management, HP stands a better chance this time of making such an acquisition work.

04.24.08

Still Room for Billion-Dollar Plays: A Conversation with M.R. Rangaswami

Posted in Cloud, Database, Enterprise Applications, Open Source, OS/Platforms, Outsourcing, SaaS (Software as a Service), Technology Market Trends at 10:02 pm by Tony Baer

On the eve of last year’s Software conference, Sand Hill Group principal M.R. Rangaswami spoke about the prospects for innovation in a consolidating software industry. Evidently there was some room left for innovation – witness Sun’s billion-dollar acquisition of MySQL. According to Rangaswami, it points to the fact that there’s life – and value – in open source software companies beyond Red Hat.

In fact, 2007 was a year of second acts, with Salesforce joining the ranks of billion-dollar software companies. On the eve of Software 2008 next week, we had a return engagement with M.R. to get his take on what’s gone down over the past year. His first point broke the conventional wisdom that no other software company could actually crack the established order, given ongoing consolidation. “People questioned whether there would ever be another billion-dollar software company again, although of course Marc Benioff doesn’t call it that,” he noted.

But Rangaswami added that the conventional wisdom wasn’t totally off, referring to the fact that a lot of promising break-outs are being gobbled up before they get the chance to go public – MySQL being Exhibit A. There’s plenty of cash among the old guard to snap up promising upstarts.

Nonetheless, there are invisible limits to the acquisition trend, most notably among SaaS (Software as a Service) providers. He ascribes the reticence to the fact that conventional software firms are scared of the disruptive effects that on demand providers could have in cannibalizing their existing businesses.

Going forward, Rangaswami expects some retrenchment. We’d put it another way – with last year’s 5 – 6% growth in IT spending, it was almost impossible for any viable ISV not to make money. Even if, perish the thought, we had been CFO of some poor ISV last year, it would have ended up in the black in spite of us.

But this year, with IT spending growth anticipated in the more modest 1 – 2% range, if that, there are going to be a lot of missed numbers. IBM cleaned up in Q1, but Oracle’s and Microsoft’s numbers failed to impress (Microsoft last year was coasting on Vista upgrades). Rangaswami advises ISVs to keep the lid on development costs (he expects to see more offshoring this year), but he also says that ISVs should be “smarter” with their marketing budgets. “Do a lot more with programs that are online and use Web 2.0 technologies as opposed to some of the more traditional approaches,” he advised, pointing to channels like podcasts and YouTube. “Most people watch TV on YouTube these days,” he said, only slightly exaggerating.

Of course, Rangaswami says you can’t ignore the emergence of social computing, for which Facebook has for now become the poster child. We admit to being a bit put off by the superficial atmosphere of the place – and not being under 35, why should we care what somebody did last night or who their favorite band is? But it’s become conventional wisdom that some form of social networking is bound to emerge for more professional purposes, like engaging your customers, that goes beyond the forums and chat rooms of user groups, the occasional regional or annual conferences, or the resume-oriented purpose of LinkedIn. In fact, one recent startup, Ringside Networks, is offering a “social appserver” with which businesses can use Facebook apps to build their own communities on their own sites.

But Rangaswami asks: why not use some of the less serious aspects of social computing to conduct real business? Like getting your New England customers together at the next Red Sox game (just make sure that one of your New York customers doesn’t slip onto the invite list by mistake).

The theme of this year’s Software 2008 conference is what Rangaswami terms “Platform Shift.” After the upheavals of the open systems and Internet eras, it appeared that the software industry was coalescing around the Java and .NET platforms. But then on-demand began making the Java vs. .NET differences irrelevant. For instance, if you want to write to Salesforce’s platform, you do it in a stored-procedures language that is like, but isn’t, Java. On the horizon you have Amazon’s EC2 cloud and the Google Apps platform, and you could probably also consider Facebook to be another platform ecosystem (there are thousands of apps already written to it).

The good news is that tough times actually encourage customers to buy a couple of on-demand seats out of petty cash, because doing so sharply limits risk.

The result is that barriers to entry for new software solution providers are lower than ever. You don’t have to ask customers to install software, and you don’t have to build the on-demand infrastructure to host it. Just build the software, choose whose cloud you want to host it on, pay only by the drink, and start marketing. According to Rangaswami, the cloud might remove the hurdles to getting started, but getting your brand to emerge from the fog may prove the toughest challenge. “Sales and marketing in this new world will be totally different.”

04.15.08

Integration Competency Centers: Optimization vs. Investment

Posted in Business Intelligence, Data Management, Database, Enterprise Applications, Enterprise Integration, IT Services & Systems Integration at 12:04 pm by Tony Baer

Whenever we hang out at enterprise architect conferences, our first question to those folks is, how do you make yourselves relevant? All too often, the ideals of enterprise architecture get short-circuited by tactical needs (the point of pain must be resolved yesterday); budget constraints (we don’t want to pay more for up front architecting so somebody else gets a free ride from reuse); and of course politics (no explanation needed).

Evidently the same barb has been aimed at proponents of Integration Competency Centers (ICCs) ever since Gartner began writing about them circa 2002, and since Integration Consortium head John Schmidt and now-Informatica colleague David Lyle authored a book on how to create and run them back in 2005.

In response, Schmidt, who recently joined Informatica to head its newly formed ICC consulting practice, collaborated with colleagues on a white paper (registration required) outlining some practical suggestions for making the economic case. For Informatica’s customer base, Schmidt is preaching to the already converted: a survey of the base indicates that 43% either have a center already established or are in the midst of rolling one out.

The first half of the paper, offering a primer on ICCs, literally borrows a few pages from Schmidt’s book. There’s no single kind of integration center; missions range from pure advice to delivering actual services, with varying levels of control. For instance, some may simply provide a repository of best practices, act in a standards advisory or policing role, or deliver actual services.

Finally, midway through, the paper cuts to the chase, starting by laying out the models for how ICCs are financed: a la carte by project, centrally funded operations taxed as part of IT overhead, or anything in between.

Our take is that the case for any kind of central funding is obviously tougher today, given the near-term likelihood of the economy and IT budgets heading south.

Furthermore, central funding often clouds the transparency that is increasingly being demanded of IT, not to mention the fact that, just as with taxes, it’s easy to see the costs and difficult to enumerate the value. Significantly, the paper says that while transparency is a goal, there is a clear limit. “Note that cost transparency does not mean detailed cost accounting; that is a mistake and invites your customer (internal or otherwise) to run your business,” it notes, offering as an example the fact that you don’t expect a restaurant to disclose the recipes for the dinners it serves you. “Transparency does not mean you should allow the customer to wander into the kitchen to see how the food is prepared or to provide details about where the ingredients are purchased.”

If funding is more project-based, that dredges up the issue of who pays the freight for sound architecture, especially if the goal is some degree of reuse. Whether you are building a data integration platform or a service-oriented architecture (or both), there’s going to be some extra effort expended up front to ensure that the architecture does not result in a one-off solution. And that raises the question: why should my project pay for somebody else’s reuse? Short of central funding, budgetary or salary incentives are probably the best route for rewarding the finer aspects of human nature.

But that brings up a related dilemma facing ICCs: if shared infrastructure is involved, how do you make the cost case? As the paper notes, typical hurdles are pinpointing and communicating the right outcome metrics, the mismatch between project cycles and budgeting/internal planning horizons, and securing stakeholder buy-in, especially given the typical pace of staff turnover.

The paper lists four possible strategies: the quick-win scenario of small, incremental projects targeting immediate pain points; executive vision, which is great as long as you have the same C-level team in place; riding the “wave” of a large project with tangible ROI where you implement some of the future building blocks (a sly way of slipping in upfront payment for future benefits); and, probably the toughest of the bunch, creating that wave (e.g., we’ve just consummated a dozen acquisitions and we’re drowning in redundancy).

What was useful were the case study examples backing these methods, which illustrated some of the real-world hurdles you’d face. For instance, the typical hassles of getting the numbers for the business case, which often degenerates into political tugs of war: teams reluctant to generate time or cost numbers on the excuse that nobody has time to collect them, masking lukewarm support for the idea of integration centers or fear of losing control to some central authority. Or another case where an IT executive was leery about quoting the cost of a la carte development, so as not to publicize how costly his team of developers really was – with the resolution being to split the difference (more modest costs that were still just high enough to support the business case).

For the example of a team that mounted a “create the wave” strategy, it took six months to assemble all the numbers – 5 years of history, 3 years of projections, and monthly project costs for 2 years (the spreadsheet exceeded 13 MB) – showing that a $20 million investment would yield a $25 million annual net operational saving.
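
Taken at face value, those figures imply a simple payback of well under a year:

    \text{payback} = \frac{\$20\text{M investment}}{\$25\text{M annual savings}} \approx 0.8\ \text{years} \approx 10\ \text{months}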

In many ways, the outlook for forming ICCs mirrors that for SOA, or any other EA initiative. If your organization is already invested in a strategy, it pays to have somebody tasked with optimizing it; but with the economy going south, for the rest of us the “quick win” approach described by the authors is going to be the most realistic. And to deal with inevitable resistance, maybe it doesn’t make sense to initially term it a competency center, because to cynics and budget hawks that might sound too much akin to the 100-year troop commitment myth. Instead, the most expedient course would probably be to promote this as a way to optimize and stretch existing investments, rather than invest new dollars.
