02.18.10

The death of Flash is exaggerated

Posted in .NET, Application Development, Java, Rich Internet Apps., Web 2.0 Apps at 5:26 pm by Tony Baer

In spite of a belated challenge from Microsoft, Adobe's Flash framework has arguably remained the de facto standard for formal Rich Internet Applications (RIAs). But Flash's very existence has now been called into question by Adobe's latest cold war with Apple.

Steve Jobs has slammed Adobe for being lazy; his motives, of course, are debatable, as we'll get into below. Yes, Flash is buggy and there are lots of security holes. That's because, as a full RIA client framework, the technology is being called upon to play a much wider role than the one for which it was originally designed: bringing multimedia to static web pages.

We were reminded of this when an Asian reporter contacted us to ask whether Flash's very market survival was now in question. But let's get real: the only reason we're having this discussion is Apple's rejection of Flash for the new and overly hyped iPad.

The conflict between Apple and Adobe is nothing new, and in a way is rather ironic. Adobe PostScript helped make the Mac what it is for creative professionals, establishing it as the de facto standard for desktop publishing. Fast forward to the present, and Apple views Adobe technology as a threat to its revenue stream.

Apple has honed a very clever business model for its mobile products that is actually a throwback to the golden days of turnkey systems, circa 1979. In this model, the hardware supplier controls what software goes on the machine and hands the customer a box with functionality that is ready to go, which gives the hardware provider control over the revenue stream. The only difference between 1979 and now is that, while the hardware provider used to supply the software itself, today it comes from third parties who pay for the privilege of selling content to the iPod audience, and a mix of content and software to the iPhone, and now the iPad, market.

The problem for Apple, however, is that the Flash framework could give third-party software and content providers a bypass around Apple's Berlin Wall and its fees. Adobe is therefore an existential threat to Apple's annuity stream.

Consequently, while Steve Jobs isn’t off base in criticizing Flash’s technical vulnerabilities, the real driver is cold hard cash.

Whether denial of access to the iPad threatens Adobe depends on whether the iPad will have the same transformational impact on the mobile Internet space that the iPod and iPhone had on music and cell phones. Based on what's out now, we think the iPad is more hype than substance, and actually represents a step back to a Web 0.9 experience for Internet users: it lacks multitasking, not to mention the Flash content that is ubiquitous across the web. Others are obviously rushing to come out with their iPad wannabes, most of them likely with Flash support. A new tablet market category will emerge and steal thunder from the netbook.

Admittedly, multitasking could be fixed in a forthcoming rev, but we think that Apple has drawn a line in the sand regarding Flash. Maybe Apple has something up its sleeve, like its own answer to Flash, Silverlight, or JavaFX. Or maybe Apple eventually promotes HTML 5 as its RIA strategy. That's the draft W3C standard that would bring RIA support right back into the mother ship, eliminating the need for those pesky add-ons or reliance on loosey-goosey Ajax. But HTML 5 is way off in the future. It is currently a working draft, deficient in areas such as security and codec support; the W3C won't likely approve it until 2011 at the earliest, and after that, it will be years before it reaches critical mass adoption, if ever.

But let's just pretend that the iPad has the same transformative impact on the market as the iPod or iPhone. By 2011, there's a definite trend away from netbooks to tablets, Apple's rivals roll out their wannabes, but web developers find that much of their audience is drifting away from Flash. (Fat chance.) That's where things could get really weird. Microsoft, which has been watching from the sidelines, wants a game changer. It must decide which is its worse enemy: Apple or Adobe. If the former, it scraps Silverlight for Flash, because what use is there in being #3? If the latter, it embraces HTML 5 under the guise of industry standards support. Sound unlikely? Actually, there's a precedent. Years ago, Bill Gates promoted Dynamic HTML as Microsoft's industry-standard alternative to Java clients (we saw him make the pitch at a Gartner event back in 1999). Who'da thunk that DHTML would eventually become one of the pillars that made Ajax possible?

Back to our original point: the iPad is overhyped, it will gain some market share, but it won’t kill off Flash.

01.27.10

Oracle’s Sun Java Strategy: Business as Usual

Posted in Application Development, Cloud, Data Management, Database, Enterprise Applications, Java, Linux, Middleware, OS/Platforms, Rich Internet Apps., SOA & Web Services, Web 2.0 Apps at 5:58 pm by Tony Baer

In an otherwise pretty packed news day, we’d like to echo @mdl4’s sentiments about the respective importance of Apple’s and Oracle’s announcements: “Oracle finalized its purchase of Sun. Best thing to happen to Sun since Java. Also: I don’t give a sh#t about the iPad. I said it.”

There's little new in observing that, on the platform side, Oracle's acquisition of Sun is a means for turning the clock back to the days of turnkey systems in a post-appliance era. History truly has come full circle, as Oracle in its original database incarnation was one of the prime forces that helped decouple software from hardware. Fast forward to the present, and customers are tired of complexity and just want things that work. That idea was responsible for the emergence over the past decade of specialized appliances for tasks ranging from SSL encryption/decryption to XML processing, firewalls, email, and specialized web databases.

The implication here is that the concept is elevated to the enterprise level: instead of a specialized appliance, it's your core instance of Oracle databases, middleware, or applications. And even there, it's but a logical step forward from Oracle's past practice of certifying specific configurations of its database on Sun (Sun was, and has now become again, Oracle's reference development platform). That's in essence the argument for Oracle to latch onto a processor architecture that is overmatched by Intel's investment in the x86 line. The argument could be raised that in an era of growing interest in cloud, Oracle is fighting the last war. That would be the case – except for the certainty that your data center has just as much chance of dying as your mainframe did, which is to say, almost none.

At the end of the day, it's inevitably a question of second source. Dana Gardner opines that Oracle will replace Microsoft as the hedge to IBM. Gordon Haff contends that alternate platform sources are balkanizing as Cisco/EMC/VMware butts its virtualized x86 head into the picture and customers look to private clouds the way they once idealized grids.

The highlight for us was what happens to Sun's Java portfolio, and as it turns out, the results are not far from what we anticipated last spring: Oracle's products remain the flagship offerings. Looking at the respective market shares, it would have been pretty crazy for Oracle to do otherwise.

The general theme was that – yes – Sun's portfolio will remain the "reference" technologies for the JCP standards, but that these are really just toys for developers to play with. When developers get serious, they're going to keep using WebLogic, not GlassFish. Ditto for:
• Java software development. You can play around with NetBeans, which Oracle’s middleware chief Thomas Kurian characterized as a “lightweight development environment,” but again, if you really want to develop enterprise-ready apps for the Oracle platform, you will still use JDeveloper, which of course is written for Oracle’s umbrella ADF framework that underlies its database, middleware, and applications offerings. That’s identical to Oracle’s existing posture with the old (mostly) BEA portfolio of Eclipse developer tools. Actually, the only thing that surprised us was that Oracle didn’t simply take NetBeans and set it free – as in donating it to Apache or some more obscure open source body.
• SOA, where Oracle’s SOA Suite remains front and center while Sun’s offerings go on maintenance.

We’re also not surprised as to the prominent role of JavaFX in Oracle’s RIA plans; it fills a vacuum created when Oracle terminated BEA’s former arrangement to bundle Adobe Flash/Flex development tooling. In actuality, Oracle has become RIA agnostic, as ADF could support any of the frameworks for client display, but JavaFX provides a technology that Oracle can call its own.

There were some interesting distinctions in identity and access management, where Sun inherited some formidable technologies that, believe it or not, originated with Netscape. Oracle Identity Management will grab some provisioning technology from the Sun stack, but otherwise Oracle's suite will remain the core attraction. Sun's identity and access management won't be put out to pasture, however; it will be promoted for midsized web installations.

There are much bigger pieces to Oracle's announcements, but we'll finish with what becomes of MySQL. In short, there's nothing surprising in the announcement that MySQL will be maintained in a separate open source business unit – the EU would not have allowed otherwise. But we've never bought into the story that Oracle would kill MySQL; the two databases aim at different markets. Just about the only difference that Oracle's ownership of MySQL makes – besides reuniting it under the same corporate umbrella as the InnoDB data store – is that, well, like yeah, MySQL won't morph into an enterprise database. Then again, even if MySQL had remained independent, it arguably was never going to evolve into the same class of database as Oracle, because the product would lose its beloved simplicity along the way.

The more relevant question for MySQL is whether Oracle will fork development to favor Solaris on SPARC. This being open source, there would be nothing stopping the community from taking the law into its own hands.

03.30.09

Web 2.0 Maginot Line

Posted in Application Development, Security, Web 2.0 Apps at 7:59 am by Tony Baer

Last week we spent a couple of lovely but unseasonably cold early spring days locked inside a hotel near the Boston convention center for HP's annual analyst conference covering the third of the company that is not PCs or printers. While much of what we heard or saw was under non-disclosure, we won't be shot for telling you about a 20-minute demonstration given by Caleb Sima on how Web 2.0 apps can be honey pots for hackers. You can restrict access to your site as strictly as possible and use SSL to go out over secure HTTP, but if your Web 2.0 site uses a rich browser client to perform all the authentication locally, you may as well have built a Maginot Line.

The point was to demonstrate a new freebie utility from HP, SWFScan, which scans Flash files for security holes: you point it at a website, and it decompiles the code and identifies vulnerabilities. OK, pretty abstract sounding. But Sima did a live demo, conducting a Google search for websites with rich Flash clients that included logins, then picking a few actual sites at random (well, maybe he did the search ahead of time to get an idea of what he'd pick up, but that's showbiz). When he entered a URL, the tool scanned the web page, decompiled the SWF (the Flash file format that contains the ActionScript), and displayed in plain English all the instances of password entry and processing. So why bother with the trouble of network sniffing when all you have to do is run a botnet that automates Google searches, hits web pages, decompiles the code, and forges log-ins? Sima then showed the same thing with database queries, giving hackers yet simpler alternatives to SQL injection.
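
To make the Maginot Line concrete, here is a minimal sketch, in JavaScript (which ActionScript closely resembles), of the kind of client-side credential check such a decompiler lays bare. Every name and value here is invented for illustration:

```javascript
// Hypothetical client-side login check -- the anti-pattern a SWF decompiler surfaces.
var ADMIN_USER = "admin";
var ADMIN_PASS = "s3cret!"; // compiled into the binary, but recovered
                            // as plain text by any decompiler

function showPrivateDashboard() { /* render the "protected" screen */ }
function showError(msg)         { /* display msg to the user */ }

function login(user, pass) {
  // All of the "protection" happens in the client; the server never checks.
  if (user === ADMIN_USER && pass === ADMIN_PASS) {
    showPrivateDashboard();
  } else {
    showError("Invalid login");
  }
}
```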

Point taken. Web 2.0 is fine as long as authentication is conducted using Web 1.0 design.

10.02.08

Is Social Media vs. Knowledge Management a generational war?

Posted in Enterprise Applications, IT Services & Systems Integration, Web 2.0 Apps at 11:52 am by Tony Baer

The Y2K issue a decade ago brought to light a critical problem facing many organizations: what happens when your most experienced minds retire? That's especially critical in the case of skillsets for technologies, architectures, or methodologies that are no longer in vogue. It brought forth the idea that, if you can't prevent the passage of time, there should at least be some way to harvest the experience gained from it.

The question is whether you do so in a carefully organized, top-down fashion or instead encourage a culture of more informal, organic knowledge sharing. There's no single silver bullet, but what's always disturbed us are those top-down enterprise knowledge management projects that appeared to us as little more than make-work for highly paid enterprise consultants. The problem we had with classical Knowledge Management was that the whole idea seemed too difficult to put boundaries around: just where exactly do you draw the lines on a knowledge management project? Even ERP projects, which were notable for their cost overruns, had more tangible targets: implement a new transaction system that, in many cases, would require reengineering of business processes. And we know how well contained those projects ended up.

Along came Web 2.0 and social media, which gave the grassroots technologies for sharing knowledge without waiting for some project manager to run a harvesting session that then gets converted into retrievable assets in some application requiring significant custom coding. Instead, the notion behind wikis, blogs, microblogs, chats, forums and so on is to use the right tool for the purpose as the purpose arises. Some call it fun. We've thought of the new social media as the next generation of Knowledge Management.

A couple of days ago, Xerox researcher (and of course blogger) Venkatesh G. Rao wrote about social media and knowledge management as a generational war. He spoke of several occasions where he was asked to give token advice to knowledge management project leads or consultants, with the idea of finding a middle ground to update knowledge management practices and make them more webby and agile. We thought he got rather carried away with a generational argument that was rife with stereotypes. Sure, in all likelihood Twittering, Facebook et al. tend to hit a younger demographic, but use of Web 2.0 tools is definitely not restricted to people under 30.

We’re more in agreement with another facet of his argument, that conventional knowledge management is more of a waterfall process, whereas social media tends to be more agile. It’s very much analogous to the different methodologies of software development, not to mention the idea of top-down vs. bottom-up.

Conventional knowledge management initiatives tend to be top-down affairs, driven by plenty of advance planning: designating experts and thought leaders, harvesting or codifying their insights, developing applications and databases, and inputting it all. By contrast, social media are designed for use when the muse hits. The result might not be comprehensive, but it provides a fast outlet. Through processes such as folksonomies, it applies a Wikipedia-like grassroots approach to classifying or giving meaning to knowledge, and its easy-to-use technologies, with liberal use of tagging, help insight originators get their thoughts down in a retrievable form at whatever point they have the time to contribute.

The battle is really about whether learning is a bottom-up or a top-down exercise. Many of us have been around for years contributing "folk" knowledge, but until recently we lacked the tools to share it. The idea that the debate between knowledge management and social media is a generational divide is hogwash.

08.27.08

Try this at home?

Posted in Application Development, Rich Internet Apps., Web 2.0 Apps at 1:39 pm by Tony Baer

An elusive goal of software development has been an easy-to-use platform with which end users can write their own programs without having to rely on developers. Of course, the very notion of "writing programs" is not exactly the kind of thing you would expect your grandmother to do, not to mention business stakeholders who do not fall under the category of "power users." To date, that goal has been realized only in the common office productivity tools equipped on just about every desktop, which provide bare-bones features for extending a spreadsheet or word-processed document with a macro, and to varying extents in hobbyist programs like the kinder, simpler photo editors thrown in gratis with Windows or Mac platforms. But for the most part these are automation tools, not programming tools.

At the enterprise level, that of course has been the goal of BPM offerings, which are supposed to let business stakeholders model – in business terms rather than executable code – how their business processes run, or should run. Mashups in turn were supposed to provide extremely simple alternatives for integrating applications by focusing on the presentation rather than the data, logic, or transaction levels. And while the snowballing proliferation of enterprise mashup tools compete on a number of features to make them safe for the business, the one common thread to most of them is that they minimize the need to drop down into coding JavaScript. However, no matter how visual mashup tools are, you still need developers or power users at some point in the lifecycle, whether to vet objects or sources that can be safely mashed up without violating some corporate policy, or to deal with the complexities of JavaScript under the hood.

Yesterday, the Mozilla Foundation fired the first shot in its attempt to transform the browser into a natural language mashup tool accessible to non-programmers. The project, appropriately titled Ubiquity, is supposed to enable anybody – not just JavaScript developers – to casually mash things up while performing tasks like sending email. Let's say you want to throw a party and invite a bunch of friends to a restaurant. Instead of signing up with a site like Evite, simply name the restaurant, hit an option key, type in "Map," and voila, a Google Map with the location of the restaurant populates your email. Want some reviews or a display of the menu? Press the option key again, enter a command like "Yelp," and type in natural language that you want some reviews or a menu. Of course, you can do similar things today by embedding links, but this makes the process a lot more direct.
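
Under the hood, a Ubiquity command is just a small piece of JavaScript registered with the browser. The sketch below is patterned on the early 0.1 command tutorial; treat the specific names (CmdUtils.CreateCommand, noun_arb_text, displayMessage) as indicative of that early API rather than definitive:

```javascript
// A minimal Ubiquity command in the 0.1-era style: name it, declare what
// free text it takes, preview it live, and act on execute.
CmdUtils.CreateCommand({
  name: "echo",
  takes: { "your shout": noun_arb_text },   // arbitrary-text argument
  preview: function(pblock, theShout) {
    // Rendered in the Ubiquity panel as the user types.
    pblock.innerHTML = "Will echo: " + theShout.text;
  },
  execute: function(theShout) {
    displayMessage("You said: " + theShout.text);
  }
});
```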

Ubiquity is still at what Mozilla calls the 0.1 phase, which is the equivalent of a community preview alpha. In the long run, we doubt that Ubiquity will gain critical mass as a standalone offering. Instead, we'd expect third parties to write Firefox plug-ins that make the process much more graphical, so you don't have to type in a question like "find me some reviews," or more context-centric, such as party, meeting, or travel planners, and so on.

It's a technology concept that could also lend itself to other leading portal sites like Facebook, Yahoo News, and so on for adding context-centric productivity drop-down choices to embellish messaging, wikis, micro-blogging, or other uses limited only by the imagination. Keep your eye on this. On the other hand, don't be lulled into the notion that Ubiquity will finally make developers non-ubiquitous, at least in the enterprise: at some point, companies still need to exercise adequate controls over the behavior of software and the data that it exposes.

08.12.08

Worldwide Wait 2.0

Posted in e-Commerce, Mobile, Networks, Technology Market Trends, Web 2.0 Apps at 2:40 pm by Tony Baer

A hallmark of Web 2.0 is that the web is supposed to become more dynamic. That dynamism has been energized by critical mass broadband penetration, which in the U.S. now reaches over half of all households.

But unless you're lucky enough (like us) to live within the Verizon FiOS service area, the future that's supposedly already here is … not here yet. We've had several fresh reminders over the past few weeks of our lack of connectivity, and of how, while China is building cities, superhighways, metro lines, and networks, our physical and electronic infrastructure remains stuck in the 1960s.

No wonder that between 2001 and now, the U.S. dropped from fourth to 15th in broadband penetration. A proposed remedy by FCC chairman Kevin Martin to fund DSL-equivalent free WiMAX access through royalties on wireless spectrum would amount to but a drop in the bucket.

Over the past few weeks, we’ve been reminded of the penalties that the U.S. is paying for letting the ball drop when it comes to Internet infrastructure. We’ve also been reminded about the inertia of the media and entertainment industry in fully embracing the new technologies to revive what is a stagnant (in the case of music) or threatened (in the case of film) market. And we’ve been reminded about the resulting difference between hype and reality when it comes to the capabilities of the dynamic, location-based Internet that supposedly is already here today — but in reality is not.

Here are a few cases.

Basic Connectivity. About a month ago, we spent a lovely week on the Maine coast. People who move to Deer Isle do so because they cherish the isolation — it's 40 miles to the nearest McDonald's. But unless you're lucky enough to live on Highway 15, the main road, chances are you're still relying on dial-up Internet access. That is, if you're lucky enough to get a dial-up line of any kind, because the copper wire phone system on Deer Isle is fully tapped out: you need to wait for somebody to move or die before getting a new line. About 18 months ago, Verizon sold off the landlines to FairPoint Communications, which subsequently decided that the infrastructure was too obsolete to keep investing in. It promises — someday — to replace copper with fiber. You want mobile instead? Only a single minor carrier provides cell phone coverage. By contrast, back in 2003 we vacationed on the other side of the Gulf of Maine in Nova Scotia, where virtually every town of any size had not only broadband but cellular coverage.

The hype of 3G. Adding 3G support to the iPhone was supposed to make it a true mobile Internet device. Maybe it does — it certainly has a great UI and operating environment — but don't take the Apple commercials literally, as this entry from agile development and Ruby on Rails tools development firm 37 Signals attests. Our mobile infrastructure — which was built on a divide-and-conquer rather than an interchangeable, standards-based strategy — continues to deliver coverage that is spotty and inferior to the rest of the developed world.

Internet Home Media. There has been lots of press about the idea of dynamic movie downloads from the likes of Netflix. But when it comes down to old-fashioned home entertainment — the stuff where you're going to use a home theater's 100-inch flat screen and 5.1 surround sound — don't count on Internet streaming just yet, wrote colleague Andrew Brust recently.

****

There are several issues here:

1. A national failure to mobilize to renew our nation’s infrastructure (we’re too hung up on keeping taxes low and letting the market sort it out to pay for it) that touches broader policy issues.
2. The inertia of certain sectors that feel threatened but could otherwise profit if they could only think out of the box.
3. Hype continues to outrace reality.

07.23.08

Words and Pictures

Posted in Application Development, Business Intelligence, Cloud, Data Management, Database, SaaS (Software as a Service), SOA & Web Services, Web 2.0 Apps at 12:36 am by Tony Baer

Data is one of those things that tends to get hidden behind black boxes. Before data becomes accessible, somebody must jump through the hoops to either access the data or make access intuitive for the rest of us. There's been no shortage of tools over the years to do things like hide the SQL, or make non-SQL data sources look like SQL, and so on. Data transformation and integration used to be an exceptionally thorny problem until, just over a decade ago, the emergence of data warehousing demanded a better solution, and Informatica invented it.

Informatica's innovation was borrowing visual development techniques from 4GL RAD tools and backing them with a metadata engine to make data transformations visual and reusable. For his latest act, Informatica founder Gaurav Dhillon has returned to his roots, pushing ETL into the Web 2.0 world. His new venture, SnapLogic, exploits the emergence of RESTful web services, a.k.a. Web-Oriented Architecture (WOA), which represents data not as columns or rows in a relational table, but as web links that are searchable by Google. It combines that with a Microsoft Excel-like front end recognizable to power users, rather than the 4GL metaphors used by developers, and places the technology atop a commodity open source Apache Tomcat Java server. Using these technologies, you can connect to a growing array of data services, such as those provided by commercial providers like StrikeIron, or those already in the public or open source domain.
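
To make the WOA point concrete, here is a minimal sketch of what "data as a web link" means in practice: the dataset is addressed by a plain URL and fetched like any other web resource. The endpoint, field names, and JSON payload below are hypothetical:

```javascript
// A dataset addressed as a URL, fetched like any web page.
var xhr = new XMLHttpRequest();
xhr.open("GET", "http://services.example.com/customers?region=west", true);
xhr.onreadystatechange = function () {
  if (xhr.readyState === 4 && xhr.status === 200) {
    // Rows arrive as plain records, ready for a spreadsheet-like front end.
    // (2008-era browsers would use the json2.js shim instead of JSON.parse.)
    var rows = JSON.parse(xhr.responseText);
    for (var i = 0; i < rows.length; i++) {
      // e.g., pipe rows[i].name and rows[i].revenue into the display grid
    }
  }
};
xhr.send(null);
```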

SnapLogic's latest deals reveal its goal of making ETL available to SMBs that previously judged the technology too expensive and complex. It has inked a deal with SugarCRM to provide a high-level front end that hides the complexity of SugarCRM's web services, and it has signed a deal to host its ETL services in Amazon's EC2 compute cloud. The goal is to make the process as low-touch as possible.

Significantly, none of what SnapLogic is doing is totally new: providers like CastIron commoditize in an appliance the most popular data transformations that Informatica and others already perform, while Salesforce simplifies access to its web services with features such as Microsoft Outlook plug-ins. The difference is SnapLogic's go-to-market strategy and its leveraging of REST and open source, which make the technology more platform-independent and more affordable than even Salesforce's low (compared to Siebel et al.) prices.

It's an auspicious start, but there's no free lunch. RESTful services are certainly less complex and lighter weight than SOAP web services, as you don't have complex headers and a bewildering array of standards to contend with. REST is simply about data access, retrieval, and updating – which is both its greatest strength and its greatest weakness. If all you want is data services, REST does the job far more efficiently than SOAP calls to WSDL-described services. However, REST is not extensible the way web services are, meaning you cannot make safeguards like authentication, authorization, or access control intrinsic. In a risk-averse, increasingly compliance-driven business world, where leaks of consumer credit card numbers have grown all too routine, you need to incorporate those safeguards yourself. Add to that the dangers of the unprotected cloud, as Computerworld's Ephraim Schwartz reported in detail.
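
A quick sketch of what "not intrinsic" means in practice: where SOAP can carry WS-Security tokens inside the message envelope itself, a RESTful call gets whatever safeguards you bolt on per request, typically an HTTP header. The endpoint and credentials here are hypothetical:

```javascript
// Safeguards bolted onto a REST call by hand; nothing in REST itself
// mandates or standardizes this.
var xhr = new XMLHttpRequest();
xhr.open("GET", "https://services.example.com/orders/1234", true);
// HTTP Basic auth over SSL -- one of many ad hoc conventions services adopt.
xhr.setRequestHeader("Authorization", "Basic " + btoa("apiuser:apipassword"));
xhr.onreadystatechange = function () {
  if (xhr.readyState === 4 && xhr.status === 401) {
    // Every service invents its own challenge, token, or renewal flow.
  }
};
xhr.send(null);
```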

The good news is that the problem is so general and widespread as to lend itself to the same kind of open source commoditization that could spawn a new generation of partners with which the SnapLogics of the world could round out their vision.

06.13.08

Life’s Getting Interesting Again

Posted in .NET, Agile Development, Application Development, Java, Middleware, Open Source, Rich Internet Apps., SOA & Web Services, Technology Market Trends, Web 2.0 Apps at 12:39 am by Tony Baer

A conversation this week with database veteran Jnan Dash reminded us of one salient fact regarding computing, and more specifically software platforms: there never was, and never will be, a single grand unifying platform that totally flattens the playing field and eradicates all differences.

Dash should know, having been part of the teams that developed DB2 and, after that, Oracle; he currently keeps himself off the street by advising tools companies that have gotten past the startup phase. For now, his gig is advising Curl, developer of a self-contained RIA language that combines a declarative GUI with OO business logic, and which had the misfortune of emerging before its time (the term Rich Internet Application had yet to be coined).

Curl provides an answer to unifying one piece of the process – developing the rich front end. But it's a far cry from the false euphoria over "write once, run anywhere" that emerged during Web 1.0, when the train of thought was a single language (Java, or later C#) for logic on a single mid-tier back end, and a universal HTML, HTTP, and TCP/IP stack for connectivity to the front end. Of course, not all web browsers were fully W3C compliant; in the end, bandwidth killed the idea of Java applets (the original vision for RIA), and disputes between Sun and Microsoft gave rise to a Java/.NET duopoly on the back end. The end result was not only a dumbed-down thin client that was little more than a green screen with a pretty face, but also a dumbed-down IDE market, as the Java/.NET duopoly effectively made development tooling a commodity. Frankly, it made the tools market quite boring.

That's in marked contrast to the swirl of competition that characterized the 4GL client/server era a few years before, when the emergence of two key standards (SQL databases and Windows clients) provided a stable enough target to spawn a vibrant market of competing languages and IDEs that rapidly pushed innovation. Competition between VB, SQL Windows, PowerBuilder, Delphi and others spawned a race for ease of use, a secondary market for visual controls, simplified database connectivity, and the birth of ideas like model-driven development and unified software development lifecycles.

What’s ironic is that today, roughly a decade later, we’re still trying to get to many of those goals. Significantly, as technology grew commodity, most of the innovation shifted to process methodology (witness the birth of the Agile Manifesto back in 2001).

While agile methodologies are continuing to evolve, we sense that the pendulum of innovation is shifting back to technology. In a talk on scaling agile at the Rational Software Development Conference last week, Scott Ambler told agile folks to, in effect, grow up and embrace some more traditional methods – like performing some modeling before you start – if they're trying agile on an enterprise scale.

More to the point, the combined impact of the emergence of Web 2.0, the rise of open source, and a desire to simplify development – epitomized by what former Burton analyst Richard Monson-Haefel (now an evangelist with Curl) termed the J2EE rebel frameworks – spawned a new diversity of technology approaches and architectures.

Quoted in an article by John Waters, XML co-inventor and Sun director of web technologies Tim Bray recently acknowledged some of the new diversity in programming languages. “Until a few years ago, the entire world was either Java or .NET… And now all of a sudden, we have an explosion of new languages. We are at a very exciting inflection point where any new language with a good design basis has a chance of becoming a major player in the software development scene.”

Beyond languages, a partial list of innovations might include:
• A variety of open source frameworks like Spring or Hibernate that are abstracting (and simplifying use of) Java EE constructs and promoting a back to basics movement with Plain Old Java Objects (POJOs) and rethinking of the appserver tier;
• Emergence of mashups as a new path for accessible development and integration;
• Competition between frameworks and approaches for integrating design and development of Internet apps too rich for Ajax;
• Emergence of RESTful style development as simpler alternatives for data-driven SOA; and in turn,
• New competition for what we used to call component-based development; e.g., whether components should be formed at the programming language level (EJB, .NET) vs. web services level (Service Component Architecture, or SCA).

In short, there are no pat answers to developing or composing applications; it's no longer simply a matter of choosing vanilla or chocolate for the back end and using generic IDEs to churn out logic and presentation. In other words, competition has returned to software development technologies and architectural approaches, making the marketplace interesting once again.

06.12.08

Don’t Try This at Home

Posted in Application Development, Rich Internet Apps., SOA & Web Services, Web 2.0 Apps at 4:06 pm by Tony Baer

How often have you heard vendors extol their products as being so simple that people from the business side can take charge and configure their own reports, manage their portals, or, if you listen to all the enterprise mashup providers, assemble neat little personalized disposable apps without having to call on IT? We've seen our share of easy-to-use end-user tools that look pretty impressive, and at times we have drunk the Kool-Aid ourselves.

Reviewing the proceedings of a panel session at this past week's Enterprise 2.0 conference, prolific BPM blogger Sandy Kemsley gave us a fresh shot of common sense. Commenting on a panel that covered how mashups could consume data from basic, ubiquitous sources such as Atom/RSS feeds, SOAP, RESTful services, etc., Kemsley reminded us that putting together a mashup is akin to piecing together a jigsaw puzzle: you have to know something about how the pieces fit together. She stated that you need to consider the interfaces, and concluded, "Realistically, business users still can't do mashups, in spite of what the vendors tell you…"

She noted that dragging and dropping is, literally, only the tip of the iceberg, as you need to know how those pieces interact (isn't that the point of doing a mashup?). Otherwise, if you just stick two, three, or more siloed data sources on your screen that don't interact, you're simply putting together a portal page, which may be OK in and of itself. It's the difference between a dynamic mashup of a Google Map that shows the locations of the sales leads you overlaid atop it, and a Google Map next to a static table that doesn't show where on the map those leads are. And, as we wrote after a conversation with Informatica's Ashutosh Kulkarni a few months back, issues such as architectural integrity, customer privacy protection, or access control may not necessarily be forefront in the end user's mind.
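
The difference is easy to see in code. Here is a minimal sketch using the Google Maps API of that era (v2: GMap2, GLatLng, GMarker); the leads array is invented, where a real mashup would pull it from a CRM feed:

```javascript
// Leads overlaid on the map -- the pieces interact. Dumping the same array
// into an HTML table beside the map would just be a portal page.
var map = new GMap2(document.getElementById("map"));
map.setCenter(new GLatLng(40.75, -73.99), 11);

var leads = [
  { name: "Acme Corp", lat: 40.7484, lng: -73.9857 },
  { name: "Initech",   lat: 40.7061, lng: -74.0087 }
];

for (var i = 0; i < leads.length; i++) {
  map.addOverlay(new GMarker(new GLatLng(leads[i].lat, leads[i].lng),
                             { title: leads[i].name }));
}
```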

Admittedly, enterprise mashup providers like Serena and IBM remind us that their offerings provide protected sandboxes within which business users can mash safely vetted assets to limit or eliminate the possibility of data breaches. Clearly, mashups have the potential to make disposable applications more accessible to the rest of us. Just don’t forget to get some adult supervision.

06.02.08

Another Stab at Taming Ajax

Posted in Rich Internet Apps., Standards Development, Web 2.0 Apps at 10:36 pm by Tony Baer

Nexaweb this week is trying a new tack for taming Ajax development. The problem is that JavaScript lacks strong typing, so until now the best solution has been for all the tools players to ship their own custom frameworks for generating the JavaScript libraries used in Ajax. The alternative has been to develop directly in JavaScript, something that is barely a side specialty for most web app developers. Nexaweb's idea is to propose extensions to the Dojo toolkit that impose an XML-based abstraction layer, reducing the need to shell out into raw JavaScript coding, much the way the Spring framework eliminates the need to write all those J2EE artifacts while still taking advantage of J2EE services. It's an interesting shot in the dark from a player that is a decently sized fish in the small fishbowl that is the Rich Internet Application (RIA) tooling market.
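
Nexaweb's proposed extension syntax isn't spelled out here, but Dojo's existing declarative markup gives the flavor of the abstraction: widget behavior declared as XML-ish attributes that the parser turns into JavaScript objects, so the page author writes markup instead of raw script. Dojo 1.x-era syntax, with a hypothetical handler:

```html
<!-- Declarative Dijit markup: the parser instantiates the widget from
     attributes, so no imperative widget-construction code is written. -->
<script type="text/javascript" src="dojo/dojo.js"
        djConfig="parseOnLoad: true"></script>
<script type="text/javascript">
  dojo.require("dijit.form.Button");
  function submitOrder() { /* application logic still lives in script */ }
</script>
<button dojoType="dijit.form.Button" onClick="submitOrder()">
  Submit order
</button>
```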

In so doing, Nexaweb is tilting at windmills because the Ajax style, and the culture that has emerged around it, has tended to be very fast and loose – and resistant to standards. That accounts for the rash of enterprise mashup hubs and protected sandboxes, as they are based on the idea that, while you may not be able to clean up Ajax-style development, if you place firmly guarded borders around it (e.g., carefully vet what can be mashed up), at least you won’t smother it.
