Bypass Operation

One of the few sleepers in technology today, WiFi could soon grow ubiquitous. Based on normal laptop replacement rates, industry estimates point to 20–25 million units from laptops alone, not even counting other devices like PDAs, appliances, or phones.

The quick hits are inside homes and offices. But well-publicized rollouts in Starbucks, plus the blooming of free public hot spots, have popularized the idea of broadband almost anywhere. Well, let's not get carried away here: cell phones still don't work away from cities or interstates. Nonetheless, the success of wired broadband in hotels at $10/night has proven that road warriors will buy reasonably priced high-speed access.

Of course, speed bumps await, like crowding and vulnerability to eavesdropping, but most of these problems will be resolved. The major hang-up will be financing the build-out.

Call us crazy, but we’re hazarding a few predictions:

* Technology: In the short run, WiFi will become a victim of its own success. Emerging alternatives that operate on less congested bands could resolve bottlenecks. As for security, fixes such as digital signatures and encryption are already here; enterprises just have to enforce their use.

* Cost: Although individual hot spots are not that expensive, covering an entire metro area is another story (see the back-of-envelope sketch below). Existing and emerging technologies might resolve small parts of the problem, but it will cost billions of dollars to reach critical-mass coverage. That will dictate a triage approach to build-out: premium, enterprise-grade secure services introduced first to downtown cores, office parks, hotels, and airports, where business users are sitting down with both hands on the keyboard. The truly mobile Internet, strung out on cell towers along the highway, will have to await not just capital, but telematics innovations that make the hands-free Internet possible.

* Market Development: Mobile carriers will open the battle, but lose the war. Their business models (private networks, dedicated hardware), technology (circuit-switched), and time charges are precisely what Internet users have rejected. (It's amazing that cell customers still tolerate these 1970s-vintage business practices.)

The ultimate victors will be ISPs/wide area backbone providers (who already own the customer), aligned non-exclusively with third parties (like Cometa) that construct the build-out. There will be two tiers of service: enterprise class, which is secured, with the option of priority bandwidth; and consumer service, which is unsecured, less profitable, and competing against piggybacked, informal neighborhood access.
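
How quickly do those billions add up? A back-of-envelope sketch helps; every figure below (coverage radius, metro footprint, per-hot-spot cost, metro count) is our own assumption for illustration, not a sourced estimate.

```python
import math

# All figures are assumptions for illustration, not sourced data.
coverage_radius_m = 100      # assumed usable outdoor reach of one hot spot
metro_area_km2 = 1_000       # assumed footprint of one mid-size metro area
cost_per_hotspot = 5_000     # assumed hardware + backhaul + installation ($)
num_metros = 50              # assumed metro count for critical-mass coverage

# Area one hot spot covers, in square kilometers.
hotspot_area_km2 = math.pi * (coverage_radius_m / 1000) ** 2

hotspots_per_metro = metro_area_km2 / hotspot_area_km2
total_cost = hotspots_per_metro * cost_per_hotspot * num_metros

print(f"~{hotspots_per_metro:,.0f} hot spots per metro")
print(f"~${total_cost / 1e9:,.1f} billion for {num_metros} metros")
```

Even with these deliberately modest figures, the tab lands around $8 billion, which is why triage, not blanket coverage, is the likely path.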

No wonder we recently heard one observer characterize WiFi as “the great mobile carrier bypass.”

Skeletons and Demons

A few months back, we had an interesting discussion on the history of the relational database management system (RDBMS) with Oracle VP Ken Jacobs, a guy also known as Mr. DBA. An outgrowth of IBM research, RDBMSs languished until midrange platforms made the idea feasible, creating unexpected openings for startups like Oracle.

A revolutionary notion at the time, RDBMS systems theoretically altered the balance of power, making information more accessible to business analysts. Nonetheless, it wasn’t until the emergence of user-friendly client/server applications that business users could finally do things like write reports without the help of programmers or DBAs.

Over the next 25 years, RDBMS systems scaled up and out. Nonetheless, even today, they still don't house the world's largest transaction systems. Go to a bank, call the phone company, file an insurance claim, or scream over a utility bill, and in all likelihood the data came from some tried-and-true 25-year-old legacy system. The growth of RDBMS systems notwithstanding, the truism that 70% of the world's data resides on legacy systems remains as valid as ever.

Like Rome, most of the world’s classic transaction systems weren’t built in a day. Attaining such levels of resilience typically took years of trial and error. And, chances are, the minds behind all that resilience are no longer around. Anywhere.

Those systems were opened a bit by the web, which added HTML screen scrapers. Today, with short-term economic uncertainties and long-term upheavals calling for more agile or — dare we say it — real-time enterprises, the need to pry open legacy systems has grown more acute.

Data profiling tool providers, such as Relativity Technologies, and consulting firms, such as offshore outsourcer Cognizant Technology Solutions, have been pushing the notion of legacy renewal for several years. Evidently, their rumblings are being heard. Now IBM Global Services is joining the crowd, announcing new legacy renewal service offerings that leverage the domain knowledge and life-cycle tools of its recent PwC Consulting and Rational acquisitions.

What caught our ear was that ROI tools would be part of the mix. While not unusual (vendor ROI tools are pretty common these days, for obvious reasons), we wondered: how can you quantify projects like this, which typically have plenty of unknowns? Yes, Y2K projects in general dealt successfully with skeletons in the closet, but in a tighter spending environment, we'll be interested to see how IBM factors in the uncertainties of programmers past.
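
One way to put numbers on those unknowns, sketched below purely for illustration, is to treat them as distributions and report a range of outcomes instead of a single ROI figure. The savings range, cost distribution, and three-year horizon are all our assumptions; we have no visibility into how IBM's tools actually work.

```python
import random

SIMULATIONS = 10_000

def simulate_roi():
    # Benefit: assumed annual savings, anywhere from $1M to $3M.
    annual_savings = random.uniform(1.0, 3.0)    # $ millions
    # Cost: assumed renewal effort, skewed upward by the skeletons
    # programmers past left behind (most likely $2M, could hit $5M).
    cost = random.triangular(1.5, 5.0, 2.0)      # $ millions
    # Simple three-year horizon, undiscounted.
    return (3 * annual_savings - cost) / cost

results = sorted(simulate_roi() for _ in range(SIMULATIONS))
print(f"Median ROI: {results[len(results) // 2]:.0%}")
print(f"10th-percentile (downside) ROI: {results[len(results) // 10]:.0%}")
```

The point is less the arithmetic than the honesty: a single-number ROI hides exactly the uncertainty that makes legacy projects risky.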

Growing Pains

Not surprisingly, data warehouses are getting bigger and more current. A just-released study from The Data Warehousing Institute (TDWI), surveying over 700 companies, revealed parallel trends: organizations loading over 500 Gbytes into their data warehouses, and organizations requiring "near real-time" feeds. The proportion of companies doing either is expected to triple in the next 18 months.

According to Wayne Eckerson, TDWI director of research, the big challenge is brains, not brawn. With today's ETL (extraction, transformation, and loading) tools delivering up to 10x the throughput of a few years back, the real problem is knowing what all that data means.

Most ETL tools have adequate meta data repositories, which identify source data records, how they are transformed, and when they are loaded into data warehouses. However, BI (business intelligence) usually requires multiple tools, because no single vendor offers an adequate soup-to-nuts solution. Consequently, there is a need to correlate the physical characteristics of data (record structure, transformation processes, etc.) with its logical identifiers (business objects). And that's where existing meta data stores fall down.
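
To make the gap concrete, here's a minimal sketch, our own toy model rather than any vendor's repository schema, of the cross-reference that's usually missing: physical lineage on one side, the business object analysts actually recognize on the other.

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class PhysicalColumn:
    source: str       # originating system and table
    column: str       # physical field name
    transform: str    # ETL transformation applied
    load: str         # load schedule into the warehouse

@dataclass(frozen=True)
class BusinessObject:
    name: str         # logical identifier business users recognize
    definition: str   # agreed business meaning

# The cross-reference most meta data stores lack: two physical
# columns from different tools, resolved to one business object.
catalog = {
    PhysicalColumn("billing.CUST_MSTR", "CST_NM", "trim + uppercase", "nightly"):
        BusinessObject("Customer Name", "Legal name of the account holder"),
    PhysicalColumn("crm.CONTACTS", "FULL_NAME", "pass-through", "hourly"):
        BusinessObject("Customer Name", "Legal name of the account holder"),
}

for phys, logical in catalog.items():
    print(f"{phys.source}.{phys.column} -> {logical.name} "
          f"({phys.transform}, loaded {phys.load})")
```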

As we’ve noted before, problems such as these prove that BI is becoming a victim of its own success. That’s not surprising, given that BI is one of the few cost-justifiable IT investments during recessions.

While there is some hope on the horizon, with the OMG's Common Warehouse Metamodel (CWM) appearing to gain critical-mass vendor support, technology and lip service to standards won't resolve the problem. Just look at the track record of the Open Applications Group (OAG), which developed standards for interfacing common ERP business processes as if they were EDI (electronic data interchange) transactions. Today, EAI (enterprise application integration) remains as costly and elusive as ever.

No, we think that people and organizational inertia are the real hurdles. We recall a conversation with data warehousing guru Ralph Kimball, who told us of the difficulty of getting one of his clients, a diversified global enterprise, to agree on a standard data definition for "customer." Semantic gaps like that often make technology obstacles look downright trivial in comparison.
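
For a flavor of why that's so hard, consider a made-up illustration (the divisions, keys, and counts below are all hypothetical):

```python
# Two divisions of the same hypothetical enterprise, two incompatible
# notions of "customer" hiding behind the same word.
retail = {
    "grain": "household",            # retail counts households
    "customers": {"H-1001": {"name": "Smith household", "accounts": 3}},
}
commercial = {
    "grain": "legal entity",         # commercial counts legal entities
    "customers": {"E-2001": {"name": "Smith Holdings LLC", "subsidiaries": 2}},
}

# Naively summing "customers" across divisions mixes households with
# legal entities: the total is meaningless until the enterprise agrees
# on one definition, or an explicit mapping between the two.
total = len(retail["customers"]) + len(commercial["customers"])
print(f"Naive enterprise-wide customer count: {total}  # apples + oranges")
```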