Shutting the barn doors

Security is one of those things that have become everybody’s business. Well, maybe not quite everybody’s, but for software developers, the growing reality of web-based application architectures means this is something they have to worry about, even if they were never taught about back doors, buffer overflows, or SQL injection in their computer science programs.

Back when software programs were entirely internal, or even during Web 1.0, when Internet applications consisted of document dispensaries or remote database access, security could be adequately controlled through traditional perimeter protection. We’ve said it before: as applications evolved to full web architectures that graduated from remote queries against a database to more dynamic interaction between applications, perimeter protection became the 21st century equivalent of a Maginot Line.

Security is a black box to most civilians, and for good reason. Even in the open source world, where you have the best minds constantly hacking away, users of popular open source programs like Firefox are still on the receiving end of an ongoing array of patches and updates. In this cat and mouse game, hackers are constantly discovering new back doors that even the brightest software development minds couldn’t imagine.

In an ideal world, developers would never write bugs or leave open doors. In the real world, they need automated tools that ferret out what their training never provided, or what they wouldn’t be able to uncover through manual checks anyway. A couple of years ago, IBM Rational acquired Watchfire, whose AppScan does so-called “black box” testing, or ethical hacking, of an app once it’s on a testbed; today, IBM bought Ounce Labs, whose static (or “white box”) testing provides the other half of the equation.
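To make the distinction concrete, here is a minimal sketch (in Java, with the class and table names invented purely for illustration) of the kind of flaw a static, white box analyzer flags by tracing tainted data through source code, and that a black box scanner probes from the outside with crafted input: a SQL injection hole, alongside the parameterized query that closes it.

```java
import java.sql.Connection;
import java.sql.PreparedStatement;
import java.sql.ResultSet;
import java.sql.SQLException;
import java.sql.Statement;

// Hypothetical account lookup, purely for illustration.
public class AccountLookup {

    // Vulnerable: user input is concatenated straight into the SQL string.
    // A static ("white box") analyzer flags this tainted data flow in source;
    // a dynamic ("black box") scanner finds it by sending input such as
    // ' OR '1'='1 against the running application.
    public ResultSet findByOwnerUnsafe(Connection conn, String owner) throws SQLException {
        Statement stmt = conn.createStatement();
        return stmt.executeQuery(
                "SELECT * FROM accounts WHERE owner = '" + owner + "'");
    }

    // Safer: a parameterized query keeps the input out of the SQL grammar.
    public ResultSet findByOwnerSafe(Connection conn, String owner) throws SQLException {
        PreparedStatement stmt = conn.prepareStatement(
                "SELECT * FROM accounts WHERE owner = ?");
        stmt.setString(1, owner);
        return stmt.executeQuery();
    }
}
```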

With the addition of Ounce, IBM Rational claims it has the only end-to-end web security testing solution. For its part, HP, like IBM, also previously acquired a black box tester (SPI Dynamics) and currently covers white box testing through a partnership with Fortify (we wouldn’t be surprised if at some point HP ties the knot on that one as well). For IBM Rational, though, the deal means it has assembled the basic piece parts but does not yet have an end-to-end solution; Ounce needs to be integrated with AppScan first. And in a discussion with colleague Bola Rotibi, we agreed that presenting a testbed, no matter how unified, is just the first step. She suggested modeling – a kind of staged approach where a model is tested first to winnow out architectural weaknesses. To that we would add requirements, which would make an airtight case for a targeted solution in which security testing becomes an exercise driven by corporate (and, where appropriate, regulatory) policy.

While the notion of application security testing is fairly new, the theme of proactive testing early in the application lifecycle is anything but. The more things change, the more they don’t.

Software AG’s Follow-up Act

Who says there are no second acts in life?

After having caught its breath with the webMethods acquisition almost exactly two years ago, Software AG has struck again with an offer to buy roughly half the shares of IDS Scheer from the company’s founders. The offer, worth roughly $320 million, is still subject to regulatory review.

Both deals are similar in that they are major, but their impacts will be different. webMethods expanded the Software AG business horizontally, adding critical mass to a new SOA middleware business that it was only beginning to build. Additionally, webMethods was a less mature business with more headroom for growth. By contrast, IDS Scheer simply deepens one of Software AG’s existing businesses: webMethods BPM. It adds the ARIS process modeling language, which would provide yet another onramp for webMethods BPM customers. And IDS Scheer is a pretty mature business, with the bulk of its installed base being large SAP customers who have used the ARIS language to model their SAP applications. There obviously aren’t a lot of new SAP installations going in these days.

In other ways, though, webMethods could give IDS Scheer the jolt that the ARIS business could use. While Software AG’s numbers continued to grow in spite of the recession, IDS Scheer’s business has flattened out, with what little growth there is attributable to maintenance streams.

For Software AG, IDS Scheer’s maintenance streams resemble those of its legacy ETS data management business, which has provided the company the annuity revenue flow to fund its acquisitions. But that’s where the similarity ends. The webMethods BPM business, which is much earlier in its growth curve, represents a potential greenfield base for ARIS. Better yet for Software AG, it provides a foothold into the SAP customer base where the company has not been heavily present. And, although SAP is also a player in the middleware space with NetWeaver, it has not been terribly active with BPM.

More interestingly, it throws down a gauntlet to Oracle, which currently OEMs the ARIS language as one of the options for its Fusion BPM middleware stack. Although Oracle promotes Fusion’s “hot pluggable” best of breed strategy, probably the last place Oracle wants best of breed is in the BPM stack. With ARIS providing a direct onramp to webMethods BPM, and in turn the Software AG SOA stack, continuation of the OEM deal provides Software AG the opportunity for a wedge strategy.

As for IBM, making ARIS native to the webMethods BPM suite provides a line of defense against WebSphere incursion into the SAP installed base. Although hardly a show stopper, it provides Software AG yet another tool in its arsenal to compete with IBM WebSphere.

Just about the only thing that surprised us in this announcement was that SAP didn’t act first. Its customers, after all, form the majority of the ARIS base.

Postscript: Here’s hoping that maybe we’ll have a chance to hear Professor Scheer’s mean baritone sax at Software AG events.

Software Abundance in a Downturn

The term “get” is journalism-speak (remember journalism?) for a hard-to-get interview. And so we’re jealous once more of one of RedMonk/Michael Cote’s latest gets: Grady Booch at last month’s Rational Software Conference.

In a rambling sit-down, Booch made an interesting point about software being an abundant resource and how that jibes with the current economic slowdown. Although his eventual conclusion – that it pays to invest in software because it can help you deal with a downturn more effectively (and derive competitive edge) – was not surprising, the rationale was.

It’s this: Booch calls software an abundant resource. Using his terms, it’s fungible and flexible; there’s lots of it and lots of developers around; and better yet, it’s not a natural extractive resource subject to zero-sum economics. That’s for the most part true, although, unless you’re getting your power off solar, some resource must be consumed to provide the juice to your computer.

Booch referred to Clay Shirky’s concept that a cognitive surplus now exists as a result of the leisure time freed up by the industrial revolution. He contends that highly accessible, dispersed computing networks have started to harness this cumulative cognitive resource. Exhibit A was the back-of-the-envelope calculation by Shirky and IBM’s Martin Wattenberg that Wikipedia alone has provided an outlet for 100 million cumulative hours of collected human thought. That’s a lot of volunteer effort toward what, depending on your viewpoint, is a contribution to or an organization of human wisdom. Of course, other examples include the open source software that floats in the wild like the airborne yeasts that magically transform grains into marvelous Belgian Lambics.

Booch implied that software has become an abundant resource, although he deftly avoided the trap of calling it “free,” as that term brings plenty of baggage with it. As pioneers of today’s software industry discovered back in the 1980s, the fact that software came delivered on cheap media (followed today by cheap bandwidth) concealed the human capital value it represented. There are many arguments over what the value of software is today – is it the proprietary logic, the peace of mind, or the technical support? Regardless of what it is, there is value in software, and it is value that, unlike the value of material goods, is not always directly related to supply and demand.

But of course there is a question as to the supply of software, or more specifically, the supply of minds. Globally this is a non-issue, but in the US the matter of whether there remains a shortage of computer science grads or a shortage of jobs for the few that are coming out of computer science schools is still up for debate.

There are a couple of other factors to add to the equation of software abundance.

The first is “free” software; OK, Grady didn’t fall into that rat hole, but we will. You can use free stuff like Google Docs to save money on the cost of Microsoft Office, or you can use an open source platform like Linux to avoid the overhead of Windows. Both have their value, but that value is not going to make or break the business fortunes of a company. By nature, free software will be commodity software because everybody can get it, so it confers no strategic advantage on the user.

The second is the cloud. It makes the software that’s out there more readily accessible because, if you’ve got the bandwidth, we’ve got the beer. Your company can implement new software with less of the usual pain because it doesn’t have to do the installation and maintenance itself. Well, not totally – it depends on whether your provider is using the SaaS model, where they handle all the plumbing, or whether you’re using a raw cloud, where installation and management are a la carte. But assuming your company is using a SaaS provider or somebody that mediates the raw cloud, software to respond to your business need is more accessible than ever. As with free or open source software, the fact that this is widely available means that the software will be commodity; however, if your company is consuming a business application such as ERP, CRM, MRO, or supply chain management, competitive edge will come from how you configure, integrate, and consume that software. That effort will be anything but free.

The bottom line is that Abundant Software is not about the laws of supply and demand. There is at once plenty of software and plenty of software developers, and not enough of either to go around. Software is abundant, but not always the right software, or if it is right, it takes effort to make it righter. Similarly, being abundant doesn’t mean that the software that is going to get your company out of the recession is going to be cheap.

UPDATE — Google Docs is no longer free.

Oracle Fusion 11g Middleware: Executed According to Plan

Today’s announcement by Oracle of the rollout of Fusion Middleware 11g is a bit anticlimactic, in that the details are pretty much according to the plan that came out exactly a year ago today. Although the Fusion stack comprises multiple parts, internally developed and acquired, the highlight is that it represents the fruition of the BEA acquisition. Oracle had Fusion middleware prior to acquiring BEA, but there’s little question that BEA was the main event. WebLogic filled the donut hole in the middle of the Fusion stack with a server that was far more popular than Oracle Containers for Java EE (OC4J). Singlehandedly, BEA catapulted Oracle Fusion into the ranks of the major middleware players.

Oracle largely stuck to the previously announced roadmap for convergence of the BEA products, with the only surprises being in the details. As planned, Oracle incorporated WebLogic as the strategic Java platform and JDeveloper as the primary development environment, kept dual business process modeling paths, and drove master data management, data integration, and identity management largely from Oracle offerings with some added BEA content.

Although the Oracle Fusion product portfolio came from far more diverse sources than BEA’s (Oracle was obviously a more aggressive acquirer), the result is far more unified than anything that BEA ever fielded. Before getting swallowed by Oracle, BEA had multiple portal, development, and integration technologies lacking a common framework. By comparison, Oracle has emphasized a common framework for mashing the pieces together.

That’s rooted in Oracle’s heritage of developing native tools and utilities, dating back to the Oracle Forms 4GL and the various utilities for managing the Oracle database; the tools were sufficiently native that they typically were confined to Oracle shops. But that approach to native tooling has since morphed into the development of a broader framework that is optimized for Oracle platforms. It’s an outgrowth of the mentality at Oracle that good is the enemy of best, and that what Oracle is building is a platform rather than discrete products.

It’s an approach that also makes Oracle’s tagline of Fusion being standards-based more nuanced. Yes, the Fusion products are designed to support Oracle’s “hot pluggable” best of breed strategy of working with other vendors’ products, but for designing and managing the Fusion environment, Oracle has you surrounded with native tooling if you want it. Call it a subtle pull for encouraging customers to add more Oracle content.

That explains how, six or seven years ago, Oracle began developing what has become the Application Development Framework (ADF) as its own model-view-controller alternative to the Apache Struts framework that it used in early versions of the JDeveloper Java tool. That approach has carried through to this day with JDeveloper, which provides a higher-level, declarative approach to development that would not fit with traditional Eclipse IDEs. And it applies to Oracle Enterprise Manager (EM), which does not necessarily compete with BMC, CA, HP, or IBM Tivoli in application management, but provides the last mile of declarative deployment, monitoring, and performance testing capabilities for the Fusion platform.

Bringing together the Oracle and BEA technologies resulted in some synergies where the value was greater than the sum of the parts. A good example is the pairing of BEA’s quasi-real-time JRockit JVM with the Oracle Coherence data grid, a distributed caching layer for Java objects. In essence, JRockit juices up the performance of Coherence, which is used whenever you need higher performance with frequently used objects; conversely, Coherence provides a high-end, clustered enterprise platform that makes an excellent use case for JRockit.
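For readers who haven’t touched Coherence, here is a minimal sketch of what that caching layer looks like to a Java developer; the cache name and keys are invented for illustration. Coherence exposes a named, cluster-distributed map with java.util.Map-style semantics, and JRockit’s role in the pairing is simply to make the JVM underneath it faster.

```java
import com.tangosol.net.CacheFactory;
import com.tangosol.net.NamedCache;

// Minimal sketch of Coherence as a distributed cache for frequently used
// objects; the "prices" cache and ticker keys are invented for illustration.
public class PriceCacheExample {
    public static void main(String[] args) {
        // Join (or start) the cluster and obtain a named, distributed cache.
        NamedCache prices = CacheFactory.getCache("prices");

        // put/get behave like java.util.Map, but entries are partitioned
        // across the cluster rather than held in a single JVM's heap.
        prices.put("ORCL", 21.50);
        Double last = (Double) prices.get("ORCL");
        System.out.println("Last cached price: " + last);

        CacheFactory.shutdown();
    }
}
```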

As noted, while the broad outlines of Fusion 11g are hardly any mystery, there are some interesting departures that occurred along the way. One of the more notable was in BPM, where Oracle added another option to its runtime strategy for Oracle BPM Suite. Originally, Oracle BPEL Process Manager was to be the runtime, requiring BPM users to map their process models to BPEL, essentially an XML-based sequential programming language that lacks process semantics. A year later, the OMG is putting the finishing touches on BPMN 2.0, a process modeling notation that has added support for executable models. And so with the release of 11g, Oracle BPM Suite users will gain the option of bypassing BPEL as long as their processes are not that transactionally complex.

Make no mistake about it, the Fusion 11g migration was a huge reengineering project, involving nearly 2000 development projects and over 5000 product enhancements. So it’s a shame that Oracle did not take the opportunity to re-architect its middleware stack by migrating it to a microkernel architecture, of which OSGi is the most prominent example. Oracle WebLogic Server is OSGi-based, but the BPM/SOA stack is not. Oracle remains mum as to whether it plans to adopt a microkernel architecture throughout the rest of the Fusion stack.

So why are we all hot and bothered about this? OSGi, or the principle of dynamic, modular microkernels in general, offers the potential to vastly reduce Java’s footprint through deployment of highly compact servers that contain only the Java modules necessary to run. The good news is that this is potentially a highly economical, energy-efficient, space-efficient green strategy. The bad news is that it’s not enough for the vendor to adopt a microkernel; the user has to learn how to selectively and dynamically deploy the modules.
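To show what “only the modules that are necessary” means in practice, here is a minimal, hypothetical OSGi bundle: a standard BundleActivator plus the manifest headers (shown in comments) that declare exactly what the bundle imports and exposes, so a runtime can load just the modules an application actually needs. The bundle and package names are invented for illustration.

```java
package com.example.pricing;

import org.osgi.framework.BundleActivator;
import org.osgi.framework.BundleContext;

// Minimal OSGi bundle activator; the framework calls start/stop as the
// bundle is dynamically installed, started, stopped, or updated at runtime.
//
// The bundle's MANIFEST.MF declares its identity and dependencies, e.g.:
//   Bundle-SymbolicName: com.example.pricing
//   Bundle-Version: 1.0.0
//   Bundle-Activator: com.example.pricing.Activator
//   Import-Package: org.osgi.framework
// Because every dependency is explicit, a server can be assembled from only
// the bundles an application actually uses, rather than one monolithic stack.
public class Activator implements BundleActivator {

    @Override
    public void start(BundleContext context) {
        System.out.println("Started " + context.getBundle().getSymbolicName());
    }

    @Override
    public void stop(BundleContext context) {
        System.out.println("Stopped " + context.getBundle().getSymbolicName());
    }
}
```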

But as we noted last week, OSGi seems to have lost its momentum of late. In our Ovum research last year, we believed that OSGi was going to become the de facto standard for Java platforms as IBM and SpringSource fully migrated their stacks, and as rivals provided at least tacit support. A year later, Oracle’s silence is deafening. We believe that Oracle’s pending acquisition of Sun adds some interesting dynamics to the plot, as Sun has continued to speak out of both sides of its mouth on the topic: supporting OSGi for its open source Glassfish Java platform, while putting its weight behind Project Jigsaw, which aims to redefine Java modularity as JSR 294. Unfortunately, the announcement of Fusion 11g has not cleared matters up.