New Trends

There are a number of exciting trends in the industry today. I'll try to explain where I see things going, and the impact they're going to have. It's always difficult to predict the future, as previous pundits have discovered to their embarrassment. I believe my observations have some validity because I've spent two decades in this business and have personally witnessed many battles. There are two types of winners I've seen: short-term and long-term. Some solutions never survive the test of time, while others become staples of the industry. Without further ado, here are my observations and predictions.

Microsoft

Animosity toward the Microsoft juggernaut is showing up all over the 'net. It's being fueled by a number of recent events as well as long-simmering disaffection. People are openly and angrily making their opinions known in a number of forums. Unlike in the past, the Microsoft apologists are rarely heard from these days, perhaps indicating they too have had enough. So how did the world's largest software company get into this state?

People have complained for years that the thrust of Microsoft products and upgrades has been increased functionality: more bells and whistles. This approach has brought us larger and more complex applications while doing little to improve reliability. When I needed to upgrade to Office 97 on my IBM ThinkPad, the installation process indicated that I needed 127MB of free space in order to upgrade! Since the laptop was only equipped with an 850MB drive, I was forced to delete useful data in order to load the application suite.

Some of the functionality Microsoft adds is of questionable value. The capabilities of Word, for example, seem oriented more toward the requirements of professional publishers, previously the domain of Mac applications, than toward the typical business user. This added functionality made the product so fragile that I limited documents which included graphics to no more than twenty pages. After repeatedly losing significant amounts of work when the application inevitably crashed, I wouldn't even dream of dealing with larger documents. Even timed automatic backups didn't help much.

Microsoft has demonstrated a paternalistic attitude, suggesting that they alone know what's good for users. They have included features such as plug-and-play, better known to the industry as plug-and-pray, and the wondrous registry. They have steadily reduced the availability of tools to make changes to the way the system operates, claiming that it's become too complex for mere mortals to have access to Windows internals. They have integrated the browser with Windows 98, telling us that we should appreciate what they have done for us.

The Microsoft trial has demonstrated the arrogance and bullying tactics used on a regular basis in the company. The use of doctored videotapes to make points demonstrates a cavalier attitude toward the court and the judge. The admissions of senior executives on the stand support allegations that Microsoft engaged in questionable practices in order to stifle competition and control the industry. The Senate hearing on July 24th, 1998, heard additional allegations, among them that Microsoft intentionally "broke" RealPlayer so that it couldn't compete with Microsoft's own Media Player.

The full text of the court's findings of fact is available online.

Finally, it was recently revealed that Microsoft was again trying to collect information regarding users' configurations. A similar mechanism first cropped up in Windows 95 and was even more insidious: it searched for all Microsoft software on the LAN to which the computer was connected. Microsoft used the same lame excuses then as now, attempting to blame it on debugging code which had accidentally been included in the shipped version. People have gotten quite adept at smelling a whitewash, and the vitriol generated by this situation has been unprecedented, in my experience.

So what's the upshot of all this? I think it spells the beginning of the end of Microsoft's domination of the desktop. Efforts are currently underway (see below) to provide a new platform which offers users what they really want: compact, reliable and inexpensive computing. There's also the potential impact of JavaBeans as a new computing paradigm, combining small, specific tools in order to create a complete application. Note that this is not significantly different from the visual development environments currently in use.

I'm not sure that Microsoft ever really regarded the interests of their customers highly. The company way has been, from the beginning, to buy promising technologies and wrap them up in Microsoft packaging. Let's not forget that Bill Gates did not write DOS; he purchased it from Seattle Computer Products. Similarly, Microsoft Mail was a packaged version of Network Courier, developed by a company in Vancouver, British Columbia. Finally, the base code for the Microsoft browser is licensed from a company called Spyglass, which based their work on the freely distributed NCSA Mosaic browser.

Both company representatives and Microsoft apologists have frequently claimed that the company fosters innovation in the industry. This has become a tired argument, since the real innovations seem to come from companies which are either bought or buried by Microsoft. The strategies discussed in the Halloween documents demonstrate that Microsoft is not above developing proprietary extensions to previously standard protocols in order to protect their market position. They also seem to have no qualms about using Fear, Uncertainty and Doubt (FUD) tactics in their efforts.

Taken together, the level of frustration people are feeling becomes more understandable. Microsoft is working to improve their own situation, not the situation of the consumers. In fact, Microsoft seems to hold most users of their products in contempt. Read the words of Steven Sinofsky regarding the recently raised security concerns. A company which has lost touch with the needs of its customers is doomed to failure. IBM learned a painful lesson when they failed to respond to a changing universe; it looks as though Microsoft is going to have to learn a lesson or two themselves.


Linux

It certainly helps to be able to spot the trends in this industry! Three weeks ago I purchased a new 8.3 GB disk for my Compaq Presario and the Red Hat version of Linux. Linux is a home-brewed implementation of a UNIX-like system which is free of licensing fees. It is also freely available over the Internet, so why did I pay the $30 for Red Hat? It's a fairly complete package, including the Apache web server, Netscape Navigator, X11R6 and even single-user, non-commercial versions of Sybase and DB2.

Of course, I'm not the only one interested in this operating system. Through a combination of cost, availability, reliability and its identification by Microsoft as a competitor to Windows, Linux has recently become a media darling. Companies which had previously eschewed a royalty-free, user-supported amalgam started to come out of the closet in droves to support the "new" O/S.

IBM's recent announcement that it would pre-install Linux on its full range of desktops and laptops shows how popular Red Hat Linux has become. IBM has even announced a port of its flagship database product, DB2, to the Linux platform. Sun is supporting Linux on its UltraSPARC processor architecture as well. Even Intel recently announced that it would be working with Cygnus to produce a compiler which will generate code that can utilize the MMX and mathematical features of the Pentium II and III chips.

You might be surprised to hear how many companies are utilizing Linux to provide services. Add Raven to Apache and you can support SSL and provide a secure server. The JServ component of Apache permits the creation of servlets which can form one of the middle layers of an n-tier architecture. And trust me on this one: I know for a fact that one company is using just such an approach for a mission-critical application involving credit cards and on-line authorization via CyberCash.
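To make that middle tier more concrete, here is a minimal sketch of the kind of servlet JServ hosts. The class name, the request parameter and the gateway comment are illustrative assumptions on my part, not code from the application mentioned above.

    import java.io.IOException;
    import java.io.PrintWriter;
    import javax.servlet.ServletException;
    import javax.servlet.http.HttpServlet;
    import javax.servlet.http.HttpServletRequest;
    import javax.servlet.http.HttpServletResponse;

    // Minimal middle-tier servlet: accepts a request from the presentation
    // layer, applies business rules, and returns a result to the caller.
    public class AuthorizationServlet extends HttpServlet {
        public void doPost(HttpServletRequest req, HttpServletResponse res)
                throws ServletException, IOException {
            String amount = req.getParameter("amount"); // supplied by the front end
            res.setContentType("text/plain");
            PrintWriter out = res.getWriter();
            // A real deployment would call out to the payment gateway here
            // instead of simply echoing the request back.
            out.println("received authorization request for " + amount);
            out.close();
        }
    }

The servlet runs inside the web server, so the browser-facing layer never needs to know which database or payment gateway sits behind it.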

I see a great future ahead for Linux. Despite warnings about fragmentation, I believe that Linus and his lieutenants will be able to keep things under control. The vested interests of thousands of programmers world-wide will attest to that! Companies such as Red Hat are making installation a breeze, and the only remaining issue is how to address the GUI. I would like to see a plug-and-play option so that users can select their favorite desktop and use it on any Linux system, but I might have to write that capability myself. The recent acknowledgement of this stable and powerful operating system is overdue, but welcome nonetheless.


CORBA

The Common Object Request Broker Architecture is going to take off in the year 2000. There are a couple of reasons for this: a growing awareness and understanding of object-oriented systems, and the need to rationalize business logic. Also, after the Y2K problems have been addressed, this is the technology next in line to warrant the attention of IT management. I'll explain how we've reached this nexus in more detail below, but I firmly believe that this will be a very significant technology, to the point where I'm investing my own time and money in learning how to use it.

Even though Smalltalk was introduced to an uncomprehending world back in the '70s, object-oriented programming didn't really take off, in my humble opinion, until the recent introduction of Java by Sun Microsystems. Based on a write-once, run-anywhere foundation, Java provides what many have considered to be the Holy Grail of programming: true cross-platform portability. Combined with elegant and powerful extensions, such as Java DataBase Connectivity (JDBC), the environment lives up to its potential. Even IBM is now supporting Java across their entire range of systems, up to and including the largest mainframes.
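As a small illustration of how approachable JDBC makes database work, here is a minimal sketch. The driver class, connection URL, table and column names are assumptions for illustration only; each database vendor supplies its own driver and URL format.

    import java.sql.Connection;
    import java.sql.DriverManager;
    import java.sql.ResultSet;
    import java.sql.Statement;

    public class JdbcExample {
        public static void main(String[] args) throws Exception {
            // Load a vendor-supplied JDBC driver (the class name varies by vendor).
            Class.forName("acme.sql.Driver");
            // The same code runs unchanged on any platform with a Java VM.
            Connection con = DriverManager.getConnection(
                    "jdbc:acme://dbhost/orders", "user", "password");
            Statement stmt = con.createStatement();
            ResultSet rs = stmt.executeQuery("SELECT id, status FROM shipments");
            while (rs.next()) {
                System.out.println(rs.getInt("id") + " " + rs.getString("status"));
            }
            rs.close();
            stmt.close();
            con.close();
        }
    }

Swap the driver and URL, and the identical code talks to Sybase, DB2 or any other database with a JDBC driver.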

Object-orientation is important in that it effectively hides the implementation of a process. As long as an object is well designed, its internals can be modified or even completely replaced without changing the external interfaces. Well-defined objects are also inherently reusable. This reusability is key to leveraging the investment that many companies have already made in automating their business processes. It shouldn't be necessary to reimplement a business process time and again whenever a new application is developed.
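A short sketch makes the point; the interface and class names below are invented for illustration.

    // Callers program against the interface, so the internals can be
    // replaced without touching client code.
    public interface ShippingRate {
        double quote(double weightKg, String destination);
    }

    // One implementation; it could later be swapped for a version that
    // calls a remote service, and callers would never know.
    class FlatRate implements ShippingRate {
        public double quote(double weightKg, String destination) {
            return 5.00 + 1.25 * weightKg;
        }
    }

    class Client {
        // The client depends only on the interface, never the concrete class.
        static void printQuote(ShippingRate rates) {
            System.out.println(rates.quote(12.0, "Vancouver"));
        }

        public static void main(String[] args) {
            printQuote(new FlatRate());
        }
    }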

Reusability in the next millennium is going to be worlds away from earlier paradigms which used libraries of functions to provide common functionality. With the proliferation of intranets, management as well as developers should not need to be concerned with where a particular logical entity resides. Similarly, resources should be able to adjust dynamically to a changing load. Finally, the entire system should be tolerant of faults, up to and including catastrophic failures.

Various architectures have been proposed to address some of these issues. There was the Distributed Computing Environment (DCE) and even an entire company (Forte) dedicated to providing solutions to the perceived problem. Unfortunately, the solutions were either too complex (DCE) or too proprietary (Forte) to justify the corporate buy-in required. Customers require a standards-based solution which will leverage the investment already made in the infrastructure and the codifying of business logic.

The term "business logic" is beginning to appear more often in the popular press, although many people are not familiar with the meaning of the phrase. Simply put, business logic is the collection of processes performed by a business in order to operate. A shipping company, for example, needs to be able to accept orders, make pickups and deliveries, and bill the customer. They need to be able to pay their employees and track the shipments in progress. Each of these processes is actually a collection of subprocesses, which might share common functionality.

Let me provide a concrete example. In a multinational corporation, both accounts receivable and payroll need to be calculated in the local currency. Corporate reports, however, will need to be generated in the currency of the country in which the company is registered. The currency conversion routines, therefore, should be consistent across multiple applications. Additionally, currency conversion rates fluctuate over time, and using separate mechanisms for different processes will not provide a consistent view of the financial state of the company.
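A sketch of what such a shared routine might look like follows; the rates, currency codes and method names are invented for illustration.

    import java.util.HashMap;
    import java.util.Map;

    // One conversion routine shared by accounts receivable, payroll and
    // reporting, so every application sees the same rates.
    public class CurrencyConverter {
        private final Map rates = new HashMap(); // currency code -> rate

        public CurrencyConverter() {
            rates.put("CAD", new Double(0.68)); // CAD to reporting currency, illustrative
            rates.put("GBP", new Double(1.61)); // GBP to reporting currency, illustrative
        }

        // Every application calls the same method, so reports reconcile.
        public double toReportingCurrency(double amount, String currency) {
            Double rate = (Double) rates.get(currency);
            if (rate == null) {
                throw new IllegalArgumentException("no rate for " + currency);
            }
            return amount * rate.doubleValue();
        }
    }

Because the rate table lives in one place, updating it once keeps receivables, payroll and corporate reporting consistent.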

Many companies have invested considerable time and effort in rationalizing their business processes. Duplication of effort is both wasteful and potentially injurious. Reusability of business logic is essential in order to maximize efficiency and to ensure a coherent view of the corporate situation. Code which was developed from the '50s through the '90s needs to be accessible to new applications being developed in order to further the goals of the company. Applications such as Decision Support Systems (DSS) rely on the integrity and consistency of the underlying data store.

Given all of these considerations, CORBA provides the capabilities required to propel implementing companies into the next millennium. CORBA permits the "objectification" of processes implemented in various languages, even COBOL or FORTRAN. Access to these objects can be transparent to an application running anywhere on a corporate intranet. Reusability is enhanced, and the power of the Java platform can be leveraged to deploy robust applications to the users.
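As a sketch of what the client side of such an arrangement might look like in Java, assume an IDL interface named CurrencyConverter has already been compiled to stubs; CurrencyConverterHelper below would be the generated helper class, not something written by hand, and the method mirrors the conversion routine sketched earlier. This is an illustration under those assumptions, not a complete CORBA deployment.

    import org.omg.CORBA.ORB;

    public class CorbaClient {
        public static void main(String[] args) {
            // Initialize the ORB; vendor and transport details stay hidden.
            ORB orb = ORB.init(args, null);

            // The object reference (IOR) would normally come from a naming
            // service or a file; here it is passed on the command line.
            org.omg.CORBA.Object obj = orb.string_to_object(args[0]);

            // Narrow the generic reference to the typed interface generated
            // from the IDL; the server behind it may be COBOL, C++ or Java.
            CurrencyConverter converter = CurrencyConverterHelper.narrow(obj);

            System.out.println(converter.toReportingCurrency(100.0, "CAD"));
        }
    }

The client neither knows nor cares where the object lives on the intranet or what language implements it; that transparency is the point.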


Virtual Companies

One of the lessons learned from Linus Torvalds' Linux initiative is that it is eminently possible to develop applications using the skills and resources of geographically dispersed individuals. Even Microsoft has acknowledged the power of this mechanism, viz. the Halloween documents. Future endeavors will utilize the skills of individuals without regard to where they might be located, as location will not materially impact the development effort. There are, as always, a number of contributing factors to this scenario, and I will attempt to spotlight some of them here.

Good programmers are few and far between. Please don't take offense at this statement, but I personally believe it to be true. There are a lot of people who code quite proficiently, and they regularly implement capable applications which address specified requirements. There is a small subset of programmers, however, who have insights hidden from the pedestrian masses and can literally change the world. If you doubt this assertion, consider the biography of Bill Joy.

Given that really good programmers are rare, it behooves us to determine how to best make use of the available talent. Traditional structures, save for some obvious exceptions, don't have the capability to harness and retain these mavericks. Not only that, but the talent pool is shrinking, as students eschew Computer Science in favor of Law or Medicine, which they perceive to be more financially lucrative. This is a particularly ironic situation since some 300,000 IT jobs are going begging in the US and employers are using devices like "signing bonuses" to attract employees, enticements which used to be reserved for professional athletes.

Truly talented developers can practically set their own price. I once charged US$250/hour, or US$2000/day (plus expenses) for mainframe consulting. Considering that the cost of operating a data centre at that time ran into the millions of dollars per annum, $2000/day was considered a drop in the bucket. Capable individuals are hired at these stratospheric rates because the work they do can be justified on a cost/benefit basis. If it only costs you $10,000 to improve or correct a process which will save you $10,000 per month, you've made a good investment.

This is why it doesn't make sense to be studying Visual Basic at this point in time. Programmers in this field have become a "dime-a-dozen" commodity. Earnings are based on supply and demand economics; skills which are in short supply will be richly rewarded. This is why it is vitally important to be able to discern industry trends and keep ahead of the curve.

Many companies these days are arriving at the conclusion that an internal Information Technology (IT) department doesn't make fiscal sense. It's a somewhat short-sighted view, but that's a personal bias which doesn't belong in this discussion. When "farming out" development of desired applications, customers have traditionally approached systems integrators or consulting companies, two examples being SHL Systemhouse and Andersen Consulting. Such engagements have often yielded less than optimal results due to the nature of the companies involved.

The following illustrations are provided from my personal experience, and are not intended to vilify the referenced parties. The Systemhouse approach is to bid technically qualified people who, due to other obligations, never seem to be available to work on the actual, signed contract. The Andersen approach (we in the industry call them Androids, by the way) is to make a professional presentation (read: suits and ties) and then dump a completely unqualified team (typically a half-dozen or more) into the customer's premises.

Given the nature of business today, what's a company to do when they need an application developed but don't want to go down the traditional path? The solution is to hire a "virtual company", one which exists for the duration of the contract and then disbands. Why would a company risk such an approach? I'll answer that question with a question: if you had a UNIX project, would you want Bill Joy to lead that project? If you've read any of the links in the biography cited above, you most certainly would. He's the person who gave us vi, BSD, Sun Microsystems, and so much more.

When you contract with a virtual company, you're taking advantage of a facet of natural selection. Good programmers strive toward a higher level of competency and knowledge. They will not sub-contract to programmers of lesser skill, since they "don't suffer fools gladly." This helps to assure the virtual company's ability to deliver. Another factor at work here is peer pressure. Failure to deliver will require the remainder of the team to make extraordinary efforts to complete the unfinished portion, almost ensuring that the failing party will not be invited to participate in future cooperative endeavors.

Other factors also favor the virtual company approach. With the availability of inexpensive hardware, coupled with the ubiquitous Linux, developers have access to processing power previously available only at centralized facilities. High-speed telecommunications are also being rolled out at reasonable cost, ranging from the ISDN line I use at home to ADSL and cable modems. Developers can work together from opposite coasts in order to integrate and debug the completed application. This synergistic approach to development will continue to find favor among those seeking high-quality applications.


Copyright (c) 1999 by Phil Selby