12.13.10

The Second Wave of Analytics

Posted in Big Data, Business Intelligence, Cloud, Data Management, Database, e-Commerce, Java, Open Source, SaaS (Software as a Service) at 3:17 am by Tony Baer

Throughout the recession, business intelligence (BI) was one of the few growth markets in IT. Transactional systems that report “what” is happening are simply the price of entry for remaining in a market; BI and analytics systems answer the question of “why” something is happening and, ideally, provide actionable intelligence so you know “how” to respond. Not surprisingly, understanding the whys and hows is essential for maximizing the top line in growing markets, and for pinpointing the path to survival in down markets. The latter is the reason BI kept growing while the rest of IT stalled.

Analytic databases are cool again. Teradata, the analytic database provider with a 30-year track record, had its strongest Q2 in what was otherwise a lousy 2010 for most IT vendors. Over the past year, IBM, SAP, and EMC made major acquisitions in this space, while some of the loudest decibels at this year’s Oracle OpenWorld were over the Exadata optimized database machine. There are a number of upstarts with significant venture funding, ranging from Vertica to Cloudera, Aster Data, ParAccel, and others, that are not only charting solid growth but whose varied approaches reveal that the market is far from mature and that there remains plenty of demand for innovation.

We are seeing today a second wave of innovation in BI and analytics that matches the ferment and intensity of the 1995-96 era, when data warehousing and analytic reporting went commercial. There isn’t any one thing driving BI innovation. At one end of the spectrum you have Big Data, and at the other end, Fast Data — the actualization of real-time business intelligence. Advances in commodity hardware, memory density, and parallel programming models, along with the emergence of NoSQL, open source statistical programming languages, and the cloud, are bringing all of this within reach. There is more and more data everywhere that’s begging to be sliced, diced, and analyzed.

The amount of data being generated is mushrooming, but much of it will not necessarily be persisted to storage. For instance, if you’re a power company that wants to institute a smart grid, moving from monthly to daily meter reads multiplies your data volumes by a factor of 30, and if you decide to take readings every 15 minutes, multiply all that again by roughly a factor of 100 (96 readings a day instead of one). Much of this data will be consumed as events. Even where some of it is persisted, traditional relational models won’t handle the load. The issue is not only the overhead of operating all that iron, but the concurrent need for additional equipment, space, HVAC, and power.

Unlike the past, when the biggest databases were maintained inside the walls of research institutions, public sector agencies, large telcos, or banks, today many of the largest data stores on the Internet are being opened up through APIs, such as Facebook’s. Big databases are no longer restricted to use by big companies.

Compare that to the 1995-96 time period, when relational databases, which made enterprise data accessible, reached critical mass adoption; rich Windows clients, which put powerful apps on the desktop, became the enterprise standard; and new approaches emerged for optimizing data storage and productizing the kind of enterprise reporting pioneered by Information Builders. And with it all came the debates: MOLAP vs. ROLAP, star vs. snowflake schema, and ad hoc vs. standard reporting. Ever since, BI has become ingrained in enterprise applications, as reflected by the recent consolidations in which IBM, SAP, and Oracle acquired Cognos, Business Objects, and Hyperion, respectively. How much more establishment can you get?

What’s old is new again. When SQL relational databases emerged in the 1980s, conventional wisdom was that the need for indexes and related features would limit their ability to perform or scale enough to support enterprise transactional systems. Moore’s Law and the emergence of client/server made a mockery of that argument, until the web, the proliferation of XML data, smart sensory devices, and the realization that unstructured data contained valuable morsels of market and process intelligence in turn made a mockery of the argument that relational was the enterprise database end-state.

In-memory databases are nothing new either, but the same hardware commoditization trends that helped mainstream SQL have also brought the costs of these engines down to earth.

What’s interesting is that there is no single source or style of innovation. Just as 1995 proved a year of discovery and debate over new concepts, today you are seeing a proliferation of approaches: different strategies for massively parallel, shared-nothing architectures; columnar databases; massive networked and hierarchical file systems; and SQL vs. programmatic approaches. It is not simply SQL vs. a single post-SQL model, but variations that mix and match SQL-like programming with various approaches to parallelism, data compression, and use of memory. And don’t forget the application of analytic models to complex event processing, for identifying key patterns in long-running events or combing through streaming data that arrives in torrents too fast and too large to ever consider putting into persistent storage.

This time, much of the innovation is coming from the open source world, as evidenced by projects like Hadoop, the Java-based distributed computing platform that implements the MapReduce parallel programming model pioneered by Google; the Hive project, which makes MapReduce look like SQL; and the R statistical programming language. Google has added fuel to the fire by releasing to developers its BigQuery and Prediction API for analyzing large data sets and applying predictive algorithms.
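
The programming model itself is simple enough to sketch. Below is the canonical word-count job against Hadoop’s Java MapReduce API; the class, job, and path names are illustrative, and API details vary by Hadoop version. The mapper emits a (word, 1) pair per token, and the framework shuffles the pairs by key so each reducer can sum the counts for its words in parallel.

```java
import java.io.IOException;

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.io.IntWritable;
import org.apache.hadoop.io.LongWritable;
import org.apache.hadoop.io.Text;
import org.apache.hadoop.mapreduce.Job;
import org.apache.hadoop.mapreduce.Mapper;
import org.apache.hadoop.mapreduce.Reducer;
import org.apache.hadoop.mapreduce.lib.input.FileInputFormat;
import org.apache.hadoop.mapreduce.lib.output.FileOutputFormat;

public class WordCount {

    // Map step: runs in parallel over splits of the input files,
    // emitting a (word, 1) pair for every token it sees.
    public static class TokenMapper extends Mapper<LongWritable, Text, Text, IntWritable> {
        private static final IntWritable ONE = new IntWritable(1);
        private final Text word = new Text();

        @Override
        protected void map(LongWritable offset, Text line, Context context)
                throws IOException, InterruptedException {
            for (String token : line.toString().split("\\s+")) {
                if (!token.isEmpty()) {
                    word.set(token);
                    context.write(word, ONE);
                }
            }
        }
    }

    // Reduce step: the framework groups the pairs by word, so each
    // call simply sums the counts for one word.
    public static class SumReducer extends Reducer<Text, IntWritable, Text, IntWritable> {
        @Override
        protected void reduce(Text word, Iterable<IntWritable> counts, Context context)
                throws IOException, InterruptedException {
            int sum = 0;
            for (IntWritable count : counts) {
                sum += count.get();
            }
            context.write(word, new IntWritable(sum));
        }
    }

    public static void main(String[] args) throws Exception {
        // Wire the two steps into a job; input and output paths come
        // from the command line (e.g., HDFS directories).
        Job job = Job.getInstance(new Configuration(), "wordcount");
        job.setJarByClass(WordCount.class);
        job.setMapperClass(TokenMapper.class);
        job.setReducerClass(SumReducer.class);
        job.setOutputKeyClass(Text.class);
        job.setOutputValueClass(IntWritable.class);
        FileInputFormat.addInputPath(job, new Path(args[0]));
        FileOutputFormat.setOutputPath(job, new Path(args[1]));
        System.exit(job.waitForCompletion(true) ? 0 : 1);
    }
}
```

Hive’s contribution is that a job like this collapses into a single SQL-like GROUP BY query, which opens the model to analysts who don’t write Java.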

These are not simply technology innovations looking for problems; use cases for Big Data and real-time analysis are mushrooming. Want to extend your analytics from structured data to blogs, emails, instant messaging, wikis, or sensory data? Want to convene the world’s largest focus group? There’s sentiment analysis to be conducted on Facebook; trending topics for Wikipedia; power distribution optimization for smart grids; and predictive analytics for use cases such as real-time inventory analysis for retail chains or strategic workforce planning.

Adding icing to the cake was an excellent talk at a New York Technology Council meeting by Merv Adrian, a 30-year veteran of the data management field who will soon be joining Gartner. He outlined the content of a comprehensive multi-client study on analytic databases that can be downloaded free from Bitpipe.

Adrian speaks of a generational disruption occurring in the database market, one that attacks new forms of an age-old problem: how to deal with expanding datasets while maintaining decent performance. It’s as mundane as that. But the explosion of data, coupled with the commoditization of hardware and increasing bandwidth, has exacerbated matters to the point where we can no longer apply the brute force approach of tweaking relational architectures. “Most of what we’re doing is figuring out how to deal with the inadequacies of existing systems,” he said, adding that the market and the state of knowledge have not yet matured to the point where we’re thinking about how the data management scheme should look logically.

So it’s not surprising that competition has opened wide for new approaches to solving the Big and Fast Data challenges; the market has not yet matured to the point where there is one approach, or a handful of consensus approaches, around which to build a critical mass practitioner base. The spate of vendor acquisitions Adrian describes over the past year is just a hint of things to come.

Watch this space.

08.12.08

Worldwide Wait 2.0

Posted in e-Commerce, Mobile, Networks, Technology Market Trends, Web 2.0 Apps at 2:40 pm by Tony Baer

A hallmark of Web 2.0 is that the web is supposed to become more dynamic. That dynamism has been energized by critical mass broadband penetration, which in the U.S. now reaches over half of all households.

But unless you’re lucky enough (like us) to live within the Verizon FiOS service area, the future that’s supposedly already here is … not here yet. We’ve seen several fresh reminders over the past few weeks of the lack of connectivity, and of how the issue is related to the fact that, while China is building cities, superhighways, metro lines, and networks, our physical and electronic infrastructure remains stuck in the 1960s.

No wonder that between 2001 and now, the U.S. dropped from fourth to 15th in broadband penetration. A proposed remedy by FCC chairman Kevin Martin to fund DSL-equivalent free WiMax access through royalties on wireless spectrum might contribute but a drop in the bucket.

Over the past few weeks, we’ve been reminded of the penalties the U.S. is paying for dropping the ball on Internet infrastructure. We’ve also been reminded of the inertia of the media and entertainment industry in fully embracing new technologies to revive what is a stagnant (in the case of music) or threatened (in the case of film) market. And we’ve been reminded of the resulting gap between hype and reality when it comes to the capabilities of the dynamic, location-based Internet that supposedly is already here today — but in reality is not.

Here are a few cases.

Basic Connectivity. About a month ago, we spent a lovely week up on the Maine coast. People who move to Deer Isle do so because they cherish the isolation — it’s 40 miles to the nearest McDonald’s. But unless you’re lucky enough to live on Highway 15, the main road, chances are you’re still relying on dial-up Internet access. That is, if you’re lucky enough to get a dial-up line of any kind, because the copper wire phone system on Deer Isle is fully tapped out: you need to wait for somebody to move or die before getting a new line. About 18 months ago, Verizon sold off the landlines to FairPoint Communications, which subsequently decided that the infrastructure was too obsolete to continue investing in. It promises — someday — to replace copper with fiber. You want mobile instead? Only a single minor carrier provides cell phone coverage. By contrast, back in 2003 we vacationed on the other side of the Gulf of Maine in Nova Scotia, where virtually every town of any size had not only broadband but cellular coverage.

The hype of 3G. Adding 3G support to the iPhone was supposed to make it a true mobile Internet device. Maybe it does — it certainly has a great UI and operating environment — but don’t take the Apple commercials literally, as this entry from agile development and Ruby on Rails tools firm 37signals attests. Our mobile infrastructure, which was built on a divide-and-conquer rather than an interchangeable, standards-based strategy, continues to deliver coverage that is spotty and inferior to the rest of the developed world’s.

Internet Home Media. There has been lots of press over the idea of dynamic movie downloads from the likes of Netflix. But when it comes down to old-fashioned home entertainment — the stuff where you’re going to use that home theater 100-inch flat screen and 5.1 sound — don’t count on Internet streaming just yet, wrote colleague Andrew Brust recently.

****

There are several issues here:

1. A national failure to mobilize to renew our nation’s infrastructure (we’re too hung up on keeping taxes low and letting the market sort it out to pay for it), which touches broader policy issues.
2. The inertia of certain sectors that feel threatened but could otherwise profit if they could only think out of the box.
3. Hype continues to outrace reality.

06.23.08

Back to the Future

Posted in .NET, Application Development, e-Commerce, Java, Open Source at 8:40 am by Tony Baer

We had an interesting conversation with Peter Cooper-Ellis, the guy who ran product management at BEA from the time it acquired WebLogic and who is now taking on a similar role with SpringSource. In the wake of the Oracle acquisition, it’s not surprising that Cooper-Ellis jumped ship.

But in making the jump to SpringSource, Cooper-Ellis has come full circle. As BEA was digesting its WebLogic acquisition, Cooper-Ellis was there when the Java stack was being built. Now at SpringSource, with its Eclipse Equinox OSGi-based server container, he’s part of an open source company that’s helping deconstruct it. So we explored some history with him and compared notes. To summarize, Cooper-Ellis sees a bit of history repeating today: a decade ago, it was the drive for a unified middle-tier stack to make the web interactive; today, it’s the goal of a dynamic lightweight stack that uses simpler constructs. In other words, a technology framework that actually delivers on the old promise of Internet time.

Let’s rewind the tape a bit. In the 90s, BEA (originally called Independence Technology) was formed to make a business in middleware. It thought its business would come from transaction monitors, but those addressed only the tiny portion of the market with transaction loads huge enough to justify buying another layer of software. Instead, the killer app for middleware turned out to be the appserver. When the web caught on, there was demand to add the kind of data-driven interactivity that had become real with client/server. BEA bought WebLogic, a company that bet (and won) that EJBs would become a standard (which they did with J2EE in 1999).

The good news was that J2EE (later joined by rival .NET) provided the standard middle tier that made e-commerce bloom (if you’re going to sell something, you need a database behind your online ordering system). The bad news was that J2EE was obese, proving overkill for anybody who wasn’t an Amazon, eBay, online banking, or travel reservations site; it was engineered for transaction-intensive, highly distributed, data-centric websites. Not surprisingly, the complexity of J2EE subsequently spawned a backlash in favor of Plain Old Java Objects (POJOs), supported by a growing array of open source frameworks made famous in 2004 by then-Burton Group analyst Richard Monson-Haefel as the Rebel Frameworks. More recently, there has been surging interest in dynamic scripting languages that let you manipulate data using higher-level constructs.

But so far, all these technologies were about development, not run time. That’s where Eclipse Equinox comes in. Leveraging the OSGi component model that Eclipse embraced for the IDE, Equinox extends the idea to run time. Like Java, OSGi was conceived for a different purpose (Java was for set-top boxes, while OSGi was the framework for provisioning services in the smart, networked home); you could consider them fraternal twins reunited at adolescence. Eclipse reincarnated OSGi as the dynamic service bundle, first for its IDE (where developers could swap different vendor plug-ins at will), and more recently as a new run time.

That’s where Cooper-Ellis views OSGi as giving Java appservers a second wind. In place of installing the entire Java EE stack, OSGi lets you provision only the features you need at run time. So if you add distributed nodes, pull in the JMS bundle; if traffic spikes, hot-deploy clustering support; and so on. The advantage is that if you don’t need these or other bundles, you can run the server on a very tiny footprint of code, which reduces overhead and potentially makes it blazingly fast.
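
To make the bundle idea concrete, here is a minimal sketch of what such a hot-deployable capability looks like against the plain OSGi framework API; the clustering service itself is hypothetical. The framework calls start() when the bundle is installed and stop() when it is removed, so the capability comes and goes without restarting the server.

```java
import org.osgi.framework.BundleActivator;
import org.osgi.framework.BundleContext;
import org.osgi.framework.ServiceRegistration;

// Hypothetical capability this bundle contributes; in an appserver the
// equivalent might be clustering, JMS, and the like.
interface ClusteringService {
    void join(String nodeName);
}

public class ClusteringActivator implements BundleActivator {
    private ServiceRegistration registration;

    // Called by the OSGi framework when the bundle is hot-deployed,
    // e.g., in response to a traffic spike.
    public void start(BundleContext context) {
        ClusteringService service = new ClusteringService() {
            public void join(String nodeName) {
                System.out.println(nodeName + " joined the cluster");
            }
        };
        // Publish the capability into the server's service registry.
        registration = context.registerService(
                ClusteringService.class.getName(), service, null);
    }

    // Called when the bundle is removed; the server shrinks back to
    // its small footprint with no restart.
    public void stop(BundleContext context) {
        registration.unregister();
    }
}
```

A bundle also declares its identity and dependencies in manifest headers (Bundle-SymbolicName, Bundle-Activator, Import-Package, and so on), which is how the container works out what to wire in at run time.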

That was what BEA was trying to do with the micro-Service Architecture (mSA) it announced roughly 18 months before Oracle swooped in, and it was how BEA built WebLogic Event Server, its complex event streaming engine. The product used only the Java EE features it needed, such as availability, security, and user management; dispensed with much of the rest of the stack; and supported development with POJOs, including support for the Spring framework.

OSGi/Eclipse Equinox is part of the same return to simplicity that originally spawned the popularity of POJOs and the rebel frameworks. Beyond the Java space, it’s the same trend that has driven the popularity of dynamic scripting languages as a faster means of developing the relatively straightforward data-driven apps that are the mainstream of the web, and it’s also the driving force behind Ajax (which is easy enough that casual developers, business analysts, or web designers can grow dangerous). Each of these has catches and limitations, but they are evidence that, for the rest of us, the 80/20 rule lives when it comes to developing for the web.

05.18.08

Microsoon

Posted in e-Commerce, Technology Market Trends at 11:50 pm by Tony Baer

As we mentioned a couple weeks back, we didn’t think that Microsoft’s walking away from its original Yahoo offer was absolutely final.

We expected that if Yahoo’s proposed ad syndication deal with Google hit an antitrust wall, Microsoft would be back. Microsoft has just issued a statement saying exactly that; although it might not derail a Yahoo-Google ad deal in the short run, it is actively declaring that it will be next in line should the antitrust folks shoot things down.

And of course, in the past couple weeks, activist investor Carl Icahn has jumped into the game at Yahoo. You might recall that he recently bought a chunk of BEA, which eventually succumbed to Oracle after playing similarly hard-to-get.

Here’s an excerpt from Microsoft’s statement:
“Microsoft is considering and has raised with Yahoo! an alternative that would involve a transaction with Yahoo! but not an acquisition of all of Yahoo! Microsoft is not proposing to make a new bid to acquire all of Yahoo! at this time, but reserves the right to reconsider that alternative depending on future developments and discussions that may take place…”

It’s going to be a while before the fat lady sings.

05.04.08

Microwhen?

Posted in e-Commerce, Technology Market Trends at 10:18 pm by Tony Baer

For now, Microhoo is history. With Steve Ballmer walking away this weekend from Yahoo’s $37/share demands, inboxes are bound to be overflowing Monday morning with all the postmortems.

We’ll keep ours short and sweet. It’s obvious why Microsoft ditched its otherwise successful M&A model to snag Yahoo: it’s in the unusual position of playing underdog, in this case to Google’s online ad machine. It’s obvious why Yahoo turned down what is probably the best offer it’s going to get: if it must surrender independence, it would rather do so with Google, in a way that keeps the rank and file, not to mention the antitrust folks, happy.

One of the obvious arguments against the Microhoo deal is that Yahoo’s software platform is largely open source, while Microsoft’s obviously isn’t. But the open/closed source dichotomy shouldn’t be such a deal killer because Microsoft of late has shown welcome pragmatism toward open source, as evidenced by Sam Ramji’s keynote to EclipseCon a couple months back.

As we noted when the deal first surfaced, Microsoft faced a similar situation roughly 15 years ago, when the company was blindsided by the rise of the Internet. At the time, Bill Gates called a management retreat and effectively steered a U-turn. This time around, Microsoft is punting rather than reaching back into its bench. Admittedly, life’s more complex today: when the modern web emerged, you had Netscape, which had market penetration but no business, whereas today, in Google, Microsoft has met its match. Also, there’s the question of whether Ray Ozzie really has the moxie to fill Gates’ shoes.

There’s obviously no shortage of debate going round.

Veteran Microsoft watcher Mary Jo Foley couldn’t like the deal less. “Microsoft’s decision to walk restores my faith in the future of the company,” she wrote, adding that her view is evidently shared by at least some within Microsoft as well. The Gillmor Gang convened on Kentucky Derby Day and concluded that “Google is a big winner, Microsoft is not dead, and lives to bid another day.” Larry Dignan speculates that, besides Google, Facebook and MySpace (weren’t they the ones eclipsed by Facebook?) are also winners as, guess what, they’ve got the only other beachfront property left. Ovum’s David Mitchell counters that the deal would have helped Microsoft in Asia where it could have leveraged Yahoo’s penetration in China. My colleague Andrew Brust, who’s finally returned to blogging, offers a more idealistic view of why the deal should happen: it not only fills the online ad gap, but provides a great platform for monetizing its AQuantive ad server acquisition. “AQuantive’s ad serving platform and Razor Fish’s agency savvy could combine really well with the reach that Yahoo’s network of sites would provide. Add Silverlight to the equation, with its rich media capabilities, and things get really exciting…”

Our take? Yahoo will try sidling up to Google, but if (1) YaGoogle (or GoogleHoo?) can’t get past antitrust (remember, this is about ads) and (2) Microsoft still hasn’t spent its $44 billion, Ballmer will return with an offer that will sound a lot like $33, or even less.

02.01.08

Closeout Special

Posted in e-Commerce, Open Source, Technology Market Trends at 6:35 pm by Tony Baer

It’s tough when you get caught on the west coast, on your day off, and wake up to find that the world has already reacted to what is likely to be the tour de force of IT industry consolidation for 2008. By now you’ve already read that Microsoft, previously spurned, is finally going hostile in a takeover attempt for Yahoo. Its price, $44.6 billion, is more than enough to jolt potential white knights like AT&T, Comcast, Time Warner (which didn’t do so well the last time it made a bid, or more officially, got swallowed by an Internet portal), or even News Corp. for that matter.

Having built one of the web’s most successful first generation portals, Yahoo has continually struggled to find Act Two. Microsoft has also struggled with online strategy, not for lack of trying; its portal attempts date back as far as its largely moribund MSNBC partnership. Of course, the difference between the two is that unlike Yahoo, Microsoft has that annuity business called Office and Windows.

The good news for Microsoft is that Yahoo is likely to come cheaper than it would have a couple of years ago. The bad news is that Yahoo’s position is far weaker, and that in the interim, Google has so far remained invincible despite the textbook vulnerability of a core business that remains a one-trick pony. Nonetheless, Dana Gardner felt Microsoft should have jumped much earlier on a deal that “should have been an obvious merger for them a long time ago.” From published reports, it’s evident that Microsoft has wanted to make the move for some time, but was leery about resorting to a hostile bid.

Unquestionably, the deal is an attempt to build critical mass vs. Google and, arguably, a wide variety of media outlets that until now did not necessarily deem themselves Microsoft rivals. Clearly, this is a battle for consumer eyeballs, meaning online ad dollars. If you have any doubt, compare growth numbers for mobile device providers (except Motorola) vs. anybody in the high tech industry except Google. Or compare the numbers going to Comdex vs. the Consumer Electronics Show. Oops, there is no more Comdex.

As sideshows go, the deal would represent a federation of two leading developer networks. And although Yahoo’s is largely based on open source, such a deal would give a combined Microsoft/Yahoo (Microhoo?) real legs for Microsoft’s current open source strategy, which essentially amounts to “if you can’t beat ‘em, join ‘em.” Specifically, it would be a way to really scale Microsoft’s WAMP stack strategy, as a diagram shared by ZDNet Microsoft watcher Mary Jo Foley after her conversation with Microsoft’s open source lab director illustrates.

Much of the blogosphere views the deal as a circling of the wagons. eWeek’s Steven Vaughan-Nichols stated that “Microsoft has been beginning to decline” as it struggles with the lukewarm reception to Vista and the fact that Google, and to an obviously lesser extent OpenOffice, are making inroads into Microsoft’s dominance on the desktop. “If Microsoft is to avoid aging into a last-generation technology company it must make a major move like trying to buy Yahoo. The company literally has no choice about it from where I sit.”

Mary Jo Foley couldn’t agree less. “I — and, apparently, many Microsoft shareholders (given the hit Microsoft’s stock took today) — am more in the “hate it” than “love it” camp.” Besides technology overlap and cultural differences, not to mention “the staggering price tag that has folks, including yours truly, up in arms,” Foley slammed the move as betraying Microsoft’s core competence in remaining focused on its own platform. Gardner added that, given the conflicting platforms, the whole deal “spells a significant period of confusion. And that’s for consumers, IT buyers, enterprise CIOs, and advertisers as well.”

What bothers us is that this seems to be a Hail Mary pass, in that Microsoft is departing from what, as Foley described, it has historically done quite well. That is, with some notable past exceptions, Microsoft has typically practiced a steady but shrewd acquisition strategy that targeted small startups whose technologies could readily extend its core products. Obviously, Yahoo is an extreme departure, modeled after the ridiculously sized minority equity stake in Facebook, and intended to jumpstart Microsoft in consumer markets orbiting outside its domain.

So while we have a relatively easy time figuring that Microsoft could make this a laboratory for its WAMP open source strategy, we have a much harder time with several bigger issues. First, can Microsoft actually pull off a big acquisition, not to mention gain clearance from the SEC, the EU, and both companies’ shareholders? Second, while each company has strong spots in the battle for online eyeballs, will scale be enough to propel what are isolated areas of success into the critical mass necessary to take on Google or News Corp?

We’d like to see Microsoft duplicate its original success embracing the Internet, which it did coming from a late start. Nearly 15 years ago, Bill Gates mobilized the company in a turnaround that remains to this day a classic business school case study. At the time, the company had the executive vision, not to mention the depth of talent, to pull off such a 180. On this go-round, we’d rather see Microsoft do what it does best instead of making a huge, highly speculative gamble.

12.07.06

Divided at Birth

Posted in e-Commerce, Security, Standards Development, Technology Market Trends at 3:38 am by Tony Baer

In many ways, the IM market is déjà vu all over again. In an age of universal, standards-based Internet connectivity, IM remains a bastion of proprietary technology.

So if you’re on Microsoft Live Messenger, don’t even think about pinging buddies who happen to be using AOL’s AIM system unless you use a special gateway. And so you end up with segregated communities that can speak to each other only with great difficulty, if at all. It’s kind of funny that this still exists in the Internet age.

But 15 years ago, that’s exactly how email worked. When the business world began using the Internet, proprietary email systems added gateways. It was complicated and expensive, but the precedent was set. When ISPs emerged, providing direct access to the Internet with free, standards-based email, you could say that the rest was history.

Obviously, the transition forced huge dislocations on the first generation email industry, but of course, it opened up new e-business and service opportunities that dwarfed what came before.

Consequently, at first glance, you can’t help but conclude that by keeping their technologies proprietary, IM vendors are shooting themselves in the foot. Even if standards commoditized their IM services, think of the additional higher-value services that could be unleashed as a result.

Well, maybe.

Looking at email as precedent, the good news is that connectivity has grown virtual, cheap, and ubiquitous. But the bad news hits you in the face when you log in every morning, courtesy of the spam, malware, and phishing attacks that clog your corporate networks and personal mailboxes.

Don’t give the AOLs or Microsofts of the world too much credit here. Their prime motivation remains protecting their turf. To date, they’ve barely paid lip service to supporting interoperability standards. The usual suspects, including IBM, Microsoft, Yahoo, and AOL, each developed their own dialects of SIMPLE, which meant that there was effectively no standard.

But what broke the ice was Google’s endorsement last year of XMPP, the protocol developed by open source IM server provider Jabber. As a consumer brand, Google is too hard to ignore. And just this week, IBM bit the bullet by agreeing to add support for the upstart standard.

Nonetheless, the opening up of IM is not about standards.

Look at Google. It lists roughly a half dozen mostly minor third-party IM systems with which it interoperates via XMPP. Now, Google is not the kind of provider that would waste its time with custom links. Its standard practice is to publish a single API and expect that many will come.

But supporting XMPP doesn’t guarantee interoperability. For IM services linking to Google, it depends on the platform; in the case of Trillian, it applies only to the paid deluxe version rather than the free download.

Nonetheless, reluctance to open up IM networks isn’t limited to vendors. Corporate customers are equally leery. They obviously don’t want IM chats to sag under the weight of spam, and security managers are not exactly gung ho over the prospect of their own carefully authenticated users exchanging chats with buddies going by anonymous handles like “pigsty123.”

Not surprisingly, corporate IM providers are approaching interoperability with 10-foot poles. This week, IBM signed interoperability agreements with AOL, Google, and Yahoo using variants of the rival standards that are in place. The hitch in shaking hands is preserving the authentication mechanisms, so that a Sametime user with credentials won’t compromise the security of the group using AOL Instant Messenger.

IBM’s announcement this week wasn’t its first bout with interoperability. Several years ago, IBM and AOL agreed to support embedding of each other’s clients, so a Sametime user could sign on as an AOL AIM user or vice versa.

Not surprisingly, the idea broke down because neither provider could exert control over the foreign clients that were now running native within their supposedly protected enclaves.

05.24.04

Shaking Hands

Posted in e-Commerce, SOA & Web Services, Standards Development, Supply Chain Management at 2:00 am by Tony Baer

B2B trading hype notwithstanding, the premise of linking trading partners electronically is as old as enterprise computing itself. In the late 1960s, the trucking industry’s experiments with exchanging routine transactions such as purchase orders and shipping notices eventually led to Electronic Data Interchange (EDI). And although EDI standards eventually emerged, securing electronic “handshakes” between trading partners remained elusive because different companies applied the standards differently. Consequently, only top tier companies could afford the time and expense of making EDI work.

The crux of the problem is that, while it’s relatively simple to standardize lower-level protocols, such as opening or closing a message, going higher up the value chain to specify, for instance, what fields to include in a forecast cuts to the heart of how companies differentiate themselves. So while my company is proud of its lean forecasts, your company includes customer data because that provides more insight into demand patterns for your products. Not surprisingly, ambitious standards efforts such as ebXML, which attempted to address all facets of establishing electronic handshakes, failed to gain traction.

With web services promising to democratize B2B commerce for businesses of all shapes and sizes, the challenge of creating electronic handshakes grows even steeper. Web services bodies have effectively addressed specifying how a service is described, listed, and requested, and now they are branching out to higher-level functions such as asserting what kind of security is enforced, how identity is declared, how business processes are choreographed, and how quality-of-service requirements are described. Another body, WS-I, is providing standard test cases for all the handshaking. And, echoing EDI history, where various vertical sectors such as automotive defined extensions to standards, the same thing is occurring with the specification of XML business vocabularies for areas like law, insurance, and financial services.

But what’s missing is a standard framework for describing or assembling service contracts. Admittedly, some pieces of the puzzle are falling into place, such as WSDL, which describes the service; UDDI, which provides a registry of services; LegalXML, which provides contract language; WS-ReliableMessaging, which addresses how to specify the way service messages are to be delivered; and BPEL, which “choreographs” business processes. But there is no standard framework for organizing all the service and identity descriptors that in aggregate define the relationship between customer and provider, and the services to which customers are entitled. For now, that’s the domain of individual products, such as Infravio’s new X-Registry, which provides a metadata repository for such descriptors.

Admittedly, standards are no panacea; they won’t ensure that handshakes succeed. They simply harmonize the formats and vocabularies for building and testing messages and content. Still, we’d like to see the standards community attempt a WS-Contract framework that describes service provider relationships. History tells us that would be an uphill battle, but we think it would be one worth fighting.
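
For the sake of argument, here is a purely hypothetical sketch, as a plain Java value object, of what a WS-Contract descriptor might aggregate. Every name here is invented, since no such schema exists; the point is only that the pieces named above want a single organizing wrapper.

```java
import java.net.URL;

// Hypothetical sketch only: roughly what a WS-Contract descriptor might
// aggregate if the standards community took up the idea. The fields
// simply gather the existing descriptors into one customer-provider
// contract.
public class ServiceContract {
    URL wsdl;                // WSDL: what the service looks like
    String uddiBusinessKey;  // UDDI: where the provider is registered
    URL legalTerms;          // LegalXML: the contract language
    String deliveryPolicy;   // WS-ReliableMessaging: how messages are delivered
    String processRole;      // BPEL: the role the service plays in a process
    String entitledConsumer; // identity: who is entitled to invoke it

    // The contract, not the registry, becomes the authority on who
    // gets which service under what terms.
    public boolean covers(String consumerId) {
        return entitledConsumer.equals(consumerId);
    }
}
```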

12.20.01

It Takes Two to Tango

Posted in e-Commerce, SOA & Web Services at 5:52 pm by Tony Baer

When they first emerged a couple years back, B2B exchanges were supposed to streamline everything from procurement to planning, providing new opportunities for businesses to forge partnerships, tap new markets, and save money.

Since then, roughly 90% of all trading exchanges have vanished, an attrition rate that is normal for any new business or product sector. But the B2B implosion taught a few other lessons as well, such as:
* It wasn’t going to change the existing 20-year-old trend of organizations consolidating–not expanding–their trading relationships to a few trusted partners.
* It wasn’t changing the brute economics of procurement. Promises of savings by eliminating the middleman sounded all too much like “I can get it for you wholesale.”
* Post 9/11, the idea of using trading exchanges to open up business processes struck the wrong chord.

But reports of the death of B2B marketplaces are exaggerated. Instead, the ones making the cut are reinventing old private EDI (Electronic Data Interchange) trading hubs, where a single behemoth–say GM or Sears–creates a trading ring for its own suppliers. This time, substitute “Extranet” for “EDI Value-Added [read: Private] Network.”

The advantage on this go-round is that the technology (based on XML web services) is far more flexible and less expensive. But web services technologies are works in progress. Today’s standards–such as SOAP (Simple Object Access Protocol) messaging, WSDL (Web Services Description Language), and UDDI (Universal Description, Discovery, and Integration) registries–are just the beginning. We still need standards for all those services that normal transaction systems provide to make trading safe, like security, authentication, prioritization, acknowledgement, and rollback.
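
To ground that point, here is a minimal sketch of SOAP messaging from Java, using the SAAJ API; the purchase-order vocabulary, namespace, and class name are hypothetical. SOAP standardizes the envelope, and everything the paragraph above says is still missing (security, acknowledgement, rollback) would have to ride in headers with no agreed format.

```java
import javax.xml.soap.MessageFactory;
import javax.xml.soap.Name;
import javax.xml.soap.SOAPBodyElement;
import javax.xml.soap.SOAPEnvelope;
import javax.xml.soap.SOAPMessage;

public class PurchaseOrderMessage {
    public static void main(String[] args) throws Exception {
        // Build an empty, standards-conformant envelope.
        SOAPMessage message = MessageFactory.newInstance().createMessage();
        SOAPEnvelope envelope = message.getSOAPPart().getEnvelope();

        // The body carries the business payload; the namespace and
        // element names below are invented, which is the point:
        // SOAP standardizes the envelope, not the vocabulary inside it.
        Name orderName = envelope.createName("PurchaseOrder", "po", "urn:example:orders");
        SOAPBodyElement order = envelope.getBody().addBodyElement(orderName);
        order.addChildElement("sku").addTextNode("A-1001");
        order.addChildElement("quantity").addTextNode("12");

        message.saveChanges();
        message.writeTo(System.out); // serialize the XML envelope
    }
}
```

Running it prints the serialized envelope; two trading partners that agree on the envelope can still disagree on everything inside it, which is the EDI lesson all over again.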

Regardless of whether the economy recovers or sputters, B2B marketplaces will play incremental roles, making it a bit easier for smaller trading partners to link electronically with their big brothers, while providing the starting point for some industry sectors, such as automotive, to increase collaborative product development or similar processes.

The Internet didn’t change everything, and going forward, neither will B2B. It will simply allow enterprises to consummate partnerships that would have happened anyway.

12.10.01

Show Me the Money

Posted in e-Commerce, Standards Development at 5:51 pm by Tony Baer

An old proverb of the technology industry is that the nice thing about standards is that there are so many of them to choose from. To be fair, occasionally technology vendors take the high road and agree on universal standards, with the Internet and, more recently, XML web services being the best-known examples. Whether a technology actually gets standardized boils down to one question: can you make money selling your own proprietary version? For HTTP, HTML, and the newer XML technologies, the answer is clearly no.

But the current debate over digital IDs is another story. By providing a standard framework for storing personal identifiers and preferences, digital ID systems let you shop anywhere online without having to constantly re-key your login names or passwords. Better yet, compared to cookies, you don’t have to be a programmer to edit your preferences.

Conceivably, if just one vendor had a lock on the technology, it could tax every online transaction. That’s put green in the eyes of companies concerned about Microsoft’s Passport digital ID system, which reportedly claims some 200 million account holders. And that’s why prospects for standards are cloudy.

But this isn’t just any ABM (Anyone But Microsoft) tale, because it includes more than just Sun, Oracle and their friends. The Liberty Alliance, Sun’s answer to Passport, aims to develop its own digital ID interoperability standards, and has already attracted the likes of AOL, American Airlines and American Express (and those are just the A’s). They’ve even “invited” Microsoft to join.

Will Liberty fare any better than, say, Java, UNIX, or the Mac, all of which Microsoft largely deflected away from its desktop empire? The key is that financial services players, not technology players, control the dollars of online commerce. Outside the computer industry, major players may play with Microsoft, but on their own terms. That’s exactly what happened in the cable industry, where AT&T, the largest provider, bought Microsoft set-top box technology, but not exclusively.

Sun claims to be negotiating with other major credit card providers, but our guess is that at least one or two major issuers will endorse Passport to keep Sun and Liberty honest. History will likely repeat itself in emerging fields like automotive telematics, where Microsoft and various flavors of Java are competing. With the feds folding their cards in the Microsoft antitrust case, players outside the computer industry will end up enforcing their own frontier justice.
