The old adage about the shoemaker’s children going barefoot has long been associated with IT organizations, which have often been the last to adopt the kinds of automated solutions for running their own shops that they have implemented for the rest of the enterprise.
Of course, packaging up what IT does is difficult because, compared to line organizations, technology groups’ processes and activities are so diverse. On one hand, IT is regarded as a digital utility responsible for keeping the digital lights on; not surprisingly, that is the dominant impression, given that, according to a wide range of surveys, infrastructure and software maintenance easily consumes 70% or more of the IT budget (the exact figure depends on whose survey you read). What’s left over is where IT is supposed to operate in project delivery mode, delivering the technologies that support business innovation. In that mode, IT juggles several roles, including systems integration, application development, and project management. And, if IT is to function properly, it is supposed to govern itself.
With ITIL emerging as (fill in the blank) a process enabler or yet another layer of red tape, the goal has been to make the utility side of IT a more consistent business with repeatable processes that improve service levels and reduce cost. Adherence to the ITIL framework, or the broader discipline of IT Service Management, is supposed to make the business feel it is getting more value for its IT dollars (better service can translate to competitive edge, especially if you heavily leverage the web as a channel for dealing with business partners or customers). But the brass ring is supposed to come from the way IT overtly supports business innovation through project delivery. Like any business, IT needs to manage its investments, a concern that has driven the emergence of Project Portfolio Management (PPM) as a means for IT organizations to evaluate how well projects are meeting their budgets, schedules, and defined goals. In some ways, it’s tempting to label PPM as ERP for IT, since it’s intended as an application for planning where IT should direct its resources.
Five years ago, Mercury’s (now part of HP) acquisition of Kintana, which was intended to grow the company out of its development/testing niche, began putting PPM on the map. Over the next few years, each of the major development tools players acquired its way into this space.
Of course, the devil’s in the details. PPM is not a simple answer to a complex problem: it requires decent data from IT projects, encompassing not only accomplished milestones from a project management system, but also feedback or reports from quality management or requirements analysis tools, to ensure that the software, no matter how mission-critical, isn’t getting bogged down with insurmountable defects or veering away from intended scope. Significantly, at least one major player, IBM, is rethinking its whole PPM approach, and will likely split its future solutions into separate project, portfolio, and program management streams.
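To make the roll-up concrete, here is a minimal sketch of the kind of portfolio check a PPM tool performs over data fed in from project management, quality, and requirements tools. The field names, thresholds, and logic are our own illustrative assumptions, not any vendor’s actual model:

```python
from dataclasses import dataclass

@dataclass
class Project:
    """One row in a hypothetical PPM roll-up (all field names are our own)."""
    name: str
    budget: float            # approved budget
    spent: float             # actuals from the project management system
    milestones_planned: int  # milestones due to date
    milestones_done: int     # milestones actually completed
    open_defects: int        # fed in from a quality management tool
    scope_changes: int       # fed in from a requirements analysis tool

def portfolio_flags(projects, defect_limit=25, scope_limit=5):
    """Flag projects that are over budget, behind schedule, or drifting."""
    flags = {}
    for p in projects:
        issues = []
        if p.spent > p.budget:
            issues.append("over budget")
        if p.milestones_done < p.milestones_planned:
            issues.append("behind schedule")
        if p.open_defects > defect_limit:
            issues.append("defect backlog")
        if p.scope_changes > scope_limit:
            issues.append("scope drift")
        if issues:
            flags[p.name] = issues
    return flags

# Example: one troubled project, one healthy one
troubled = Project("CRM rollout", 100.0, 120.0, 10, 8, 30, 2)
healthy = Project("Portal", 50.0, 40.0, 5, 5, 3, 1)
print(portfolio_flags([troubled, healthy]))
```

The point of the sketch is that none of the individual checks is hard; the hard part, as noted above, is getting decent data from several different tools into one place.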
Against this backdrop, Innotas has carved a different path. Founded by veterans of Kintana, the company aims to make the new generation a kinder, gentler PPM for the rest of us. Delivered as a SaaS offering, it not surprisingly differentiates itself using the Siebel/Salesforce.com metaphor (and in fact is a marketing partner on Salesforce’s AppExchange). While we haven’t yet caught a demo to attest that Innotas PPM really is simpler, the company did grow 4x to a hundred customers last year, expects to double this year (recession notwithstanding), and just received a modest $6 million shot of Series C funding for the usual late-round expansion of sales and marketing.
The dilemma of governance is that, lacking bounds, you can often wind up spinning your wheels chasing the last detail, whether you actually need it or not. Not surprisingly for a top-down solution aimed at IT, the PPM market has been fairly limited. Significantly, in its press release, Innotas cited the size of the SaaS market rather than the PPM market to promote its upside potential. While that sidesteps the question, there’s always the classic China-market argument of a relatively empty market just waiting to be educated. In this case, Innotas points out that barely 300 of the Fortune 1000 have bought PPM tools so far, and that doesn’t even count the 50,000 midmarket companies that have yet to be penetrated.
Our comeback is that, as with any midmarket solution, the burden is on the vendor to make the case that midmarket companies, with their more modest IT software portfolios, have resource management problems complex enough to warrant such a solution. Nonetheless, the PPM market needs solutions that can at least give you an 80% answer, because most organizations don’t have the time or resources to maintain fully staffed program management offices whose mission is to do exactly that.
It shouldn’t be surprising that IT’s automation needs often fall to the bottom of the stack: because most companies are not in the technology business, investments in people, processes, or technologies designed to improve IT show up only as expenses on the bottom line. So while IT is the organization responsible for helping the rest of the business adopt automated solutions, IT often goes begging when it comes to investing in its own operational improvements.
In large part that explains the 20-year “overnight success” of ITIL. Conceived as a framework for codifying what IT actually does for a living, the latest revisions of ITIL describe a service lifecycle that provides opportunity for IT to operate more like a services business that develops, markets, and delivers those services to the enterprise as a whole. In other words, it’s supposed to elevate areas that used to pass for “help desk” and “systems management” into conversations that could migrate from middle manager to C-level.
Or at least that’s what it’s cracked up to be. If you codify what an IT service is, then define the actions involved in every step of its lifecycle, you have the outlines of a business process that can be made repeatable. And as you codify processes, you gain opportunities to attach more consistent metrics that track performance.
We’ve studied ITIL and have spoken with organizations about their rationale for adoption. Although the hammer of regulatory compliance might look like the obvious impetus (e.g., if you are concerned about regulations covering corporate governance or protecting the sanctity of customer identity, you want audit trails that cover data center operations), we also found that scenarios such as corporate restructuring or merger and acquisition played a hand. At an ITIL forum convened by IDC in New York last week, we found that was exactly the case for Motorola, Toyota Financial Services, and Hospital Corp. of America, each of whom sat on a panel to reflect on their experiences.
They spoke of establishing change advisory boards to cut down on the incidence of unexpected changes (which tend to break systems), formalizing all service requests (to reduce the common practice of buttonholing), reporting structures (which, not surprisingly, varied widely across organizations), and what to put in the Configuration Management Database (CMDB), stipulated by the ITIL framework as the definitive store of what you are running and how you are running it.
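A minimal sketch shows why the CMDB matters to a change advisory board: record configuration items (CIs) and the dependencies between them, and the change-impact question answers itself. The dict-based storage, attribute names, and assumption of acyclic dependencies are our own simplifications:

```python
# Toy CMDB: each configuration item (CI) has a type, free-form
# attributes, and a list of CIs it depends on.
cmdb = {}

def add_ci(ci_id, ci_type, attrs=None, depends_on=()):
    """Register a configuration item and the CIs it depends on."""
    cmdb[ci_id] = {
        "type": ci_type,
        "attrs": attrs or {},
        "depends_on": list(depends_on),
    }

def impact_of(ci_id):
    """Return all CIs that (transitively) depend on ci_id -- the question
    a change advisory board asks before approving a change.
    Assumes the dependency graph is acyclic."""
    impacted = set()
    for other, record in cmdb.items():
        if ci_id in record["depends_on"]:
            impacted.add(other)
            impacted |= impact_of(other)
    return impacted

# Example: changing the database touches the app and the portal above it
add_ci("db01", "database", {"os": "linux"})
add_ci("app01", "application", depends_on=["db01"])
add_ci("portal", "service", depends_on=["app01"])
print(impact_of("db01"))
```

Real CMDBs add versioning, reconciliation with discovery tools, and far richer relationship types, but the core value is this dependency graph.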
But we also came across comments that, for all its lofty goals, ITIL could wind up erecting new silos for an initiative that was supposed to break them down. One attendee, from a major investment bank, was concerned that adoption of the framework would bring pressure for certifications that would wind up pigeonholing professionals into discrete tasks, a la Frederick Taylor. Another mused about excessive reliance on arbitrary metrics for the sake of metrics, because that is what looks good in management presentations. Others questioned whether initiatives such as adding a change advisory board, or adding a layer of what were, in effect, “account representatives” between IT and the various business operating units it serves, would in turn create new layers of bureaucracy.
What’s more interesting is that the concerns of attendees were hardly isolated voices in the wilderness. John Willis, who heads Zabovo, a third-party consulting firm specializing in IBM Tivoli tools, recently posted responses from the Tivoli mailing list to the question of whether ITIL actually matters. There was no shortage of answers. Not surprisingly, there were plenty of detractors. One respondent characterized ITIL as “merely chang[ing] the shape of the hoops I jump through, and not the fact that there are hoops…” Another termed it a “$6 buzzword/management fad,” while another claimed that ITIL makes much ado over the obvious: “The Helpdesk is NOT an ITIL process, it is merely a function…”
But others stated that, despite the hassles, the real problem is defining processes or criteria more realistically. “Even when I’m annoyed, I still believe ITIL or ITIL-like processes should be here to stay, but management should be more educated on what constitutes a serious change to the environment…” Others claimed that ITIL formalizes what should already be your IT organization’s best practices and doesn’t really invent anything new. “Most shops already perform many of the processes and don’t recognize how close they are to being ITIL compliant already. It is often a case of refining a few things.”
Probably the comment that best summarized the problem is that many organizations, not surprisingly, are “forgetting that ITIL is not an end in and of itself. Rather, it is a means to an end,” adding that this point is often “lost on both the critics and cheerleaders of Service Management.”
We’re not surprised that adoption of ITIL is striking nerves. It’s redolent of the early battles surrounding those old MRP II, and later ERP, projects that often degenerated into endless business process reengineering efforts for their own sake. Ironically, promoters of early Class A MRP II initiatives justified their efforts on the ideal that integration would let everyone read from the same sheet of music and therefore enable the organization to act in unison. In fact, many of those early implementations simply cemented the functional or organizational walls they were supposed to tear down. And although, in the name of process reengineering, they were supposed to make organizations more responsive, many of those implementations wound up enshrining the inflexible planning practices that drove operational cost structures through the roof.
The bottom line is that IT could use a dose of process, but not the arbitrary kind that looks good in management PowerPoints. The ITIL framework presents an opportunity to rationalize the best practices that already are, or should be, in place. Ideally, it could provide the means for correlating aspects of service delivery, such as maintaining uptime and availability, with metrics that actually reflect the business value of meeting those goals. For the software industry, ITIL provides a nice common target for developing packaged solutions, just as the APICS framework did for enterprise applications nearly 30 years ago. That’s the upside.
The downside is that, like any technical profession, IT has historically operated in its own silo away from the business, where service requests were simply thrown over the wall with little or no context. As a result, the business has hardly understood what it is getting for its IT dollars, while IT has had little concept of the bigger picture of why it is doing its job, other than to keep the machines running. IT has good reason to fear process improvement exercises that don’t appear grounded in reality.
ITIL is supposed to offer the first step: defining what IT does as a service, and the tasks or processes involved in delivering it. The cultural gap that separates IT from the business is both the best justification for adopting ITIL and the greatest obstacle to achieving its goals.
Our recent take on the out-of-control HP boardroom investigation was that it was a prime example of project management gone bad. While bad project management didn’t cause the crisis, it obviously exacerbated it.
VC Eric Risley put a fresh face on the issue by recalling some advice on corporate governance that his father, a retired GE executive and entrepreneur, passed along to him.
Risley’s dad cast the issue in the context of the corporate lifecycle. While a company is in startup phase, it’s typically looking for “adult supervision” that’s focused, not necessarily on governance, but on tactical matters like getting an experienced hand on operations and budgets. Governance becomes an objective once the company matures and is thrust onto a wider stage after going public.
Rewind back a year, to the sense of shock as allegations emerged that Mercury, a company that sold IT governance tools, was having governance issues of its own. A year later, the shock has turned to numbness with revelations of options questions snaring heavyweights like Apple, BEA, Microsoft, VeriSign, and many others. Evidently, options gaming was far more prevalent across Silicon Valley than we thought.
Given Risley’s dad’s advice, maybe the fact that more tech firms are being caught in a wider net shouldn’t be so surprising. Remember, this is a sector that values firms that preserve their entrepreneurial spirit, a quality that connotes organizations that remain highly agile, are non-bureaucratic, and retain the energy and appetite of a startup.
Obviously, being hungry is preferable to being stalled by inertia. But has our value system indirectly given short shrift to the qualities of transparency that are expected in a post-Enron world?
At first glance, there’s something about the HP board saga that doesn’t ring true. On one side, you have a chairwoman who was a student of governance suffering a lapse of judgment that could put her (and others) in legal jeopardy. Meanwhile, a dissident board member whose success was frequently built by throwing away the rules proved to be the one who blew the whistle.
What’s wrong with this picture?
An article in yesterday’s Wall Street Journal documenting the dissension and politics on the HP board provided good insights on a culture that allowed the situation to careen out of control.
You could spin the saga in any number of ways: the outsider battling against entrenched interests, the innocent being set up, the woman of the year battling to be taken seriously in a highly macho boardroom culture, or the need to vet board nominees more thoroughly.
But reality is hardly black and white. Although the smoking gun has yet to be uncovered, it’s clear that the pretexting occurred because project leadership lost sight of the big picture.
To recap, the investigation began in the wake of boardroom leaks during debates over former CEO Carly Fiorina’s future. Based on the solid premise that boardroom leaks compromised corporate governance, the investigation veered off course at some point once somebody cleared the use of legally questionable tactics to obtain the phone records of suspect board members.
Consequently, a project conceived with the goal of sound governance ultimately compromised it. Governance broke down when somebody at the top authorized the pretexting, or failed to adequately manage subordinates to keep the process on hard ground.
While the consequences of the HP case may prove far more severe than a blown budget or project schedule, the scenario should still look rather familiar to any seasoned IT professional. Put another way, how often do project teams get so wrapped up in the details that they lose sight of the overall goals?
Consider the case of a major global investment bank’s compliance project. In this case, one of the bank’s compliance strategies is building systems that adequately separate and document risk. And, as part of the development cycle, IT is documenting the business requirements so that the system delivers the right performance, supports the necessary scalability, maintains appropriate security levels, and contains the right functionality.
Unfortunately, in the quest to document requirements, the team has found itself being evaluated on its progress in feeding those requirements into the tool that manages them. Yet there are no metrics in place, and nobody is taking ownership of vetting the accuracy, reliability, or quality of the requirements. Ultimately, if the system is built on faulty requirements, compliance will prove problematic.
The consequences of project tunnel vision could become exacerbated as IT organizations get serious about adopting best-practices frameworks such as COBIT for governance, ITIL for service management, or ISO 17799 for IT security. The goals may be admirable, but as the HP board learned the hard way, the devil emerges when the details block the big picture.
In the past couple years, the perceived impact of IT on the world has seesawed. First, we were scared that things would fall apart after the dawn of the new millennium. When they didn’t, conventional wisdom was that IT no longer mattered.
Call us jaded. When we first stumbled across a session at this year’s Gartner Symposium entitled “The End of the IT Department,” our first reaction was we’d heard it all before. But curiosity got the better of us, and we ended up hearing the kinds of provocative arguments that stimulate thought, rather than Crossfire-style knee-jerk reactions.
Neil MacDonald, the Gartner analyst leading the research, opened by reviewing findings of CIOs losing influence. Since 2002, nearly a quarter of them stopped reporting to the CEO, with most CEOs ranking IT at the bottom half of their concerns. Adding insult to injury was anecdotal evidence that IT is becoming a drag on corporate change. In the 70s and 80s, the system was typically down when you needed it. Today, it’s probably obsolete by the day it goes live because IT couldn’t keep up with all the changes happening to the business.
Looking a decade out, MacDonald predicted a dramatic redistribution of IT tasks. While strategy, administrative, architecture, and vendor management tasks remained internal, he predicted that software development, maintenance, IT operations, and support would be automated or outsourced. Excluding technology strategy and architecture, remaining tasks would be absorbed into the business units. The result? By 2015, the IT organization would shrink to about 15% of its current size, barely large enough to justify retaining a CIO.
Although we question some assumptions, such as the likelihood of application development being automated, MacDonald’s arguments about absorption of IT into the business are compelling. As the logical extension of well-known arguments for IT to better align, MacDonald foresees today’s business systems analyst morphing into tomorrow’s process architects. While IT professionals are logical candidates, with more powerful, graphical process management tools, people from the business side could also vie for the same jobs.
Yet the notion of the IT organization disappearing is all too redolent of 19th-century calls to close the Patent Office. As we’ve said time and again, because IT can embed the intellectual property of an enterprise, technology innovation won’t end, and neither will the need for visionaries who understand the competitive advantages that new technology can contribute across business units. No single operating unit is likely to own such knowledge.