
Architectural Shifts and Developer Challenges

Visual Studio .NET 2003

To fully appreciate the significance of the Microsoft .NET Platform, it is necessary to examine the architectural shift that produced it, its causes, and its continuing impact on the computing community.

Architectural Shifts

When Internet technology, notably the Web, moved into the computing mainstream in the middle of the 1990s, the model for business computing changed dramatically. This shift moved the industry away from client/server computing, which until that time had been complex, costly, and largely proprietary.

You can characterize the Web model as loosely connected tiers of diverse collections of information and applications that reside on a broad mix of hardware platforms. Remember, the driving force behind the Internet since its inception has been the desire to provide a common information-delivery platform that is scalable, extensible, and highly available. This platform is flexible by design and not limited to one or two computing tiers. The only real limits to application development in the Internet world are computer capacity and the imagination of the application designer.

As the Web browser rapidly became ubiquitous and Web servers proliferated throughout companies, it was clear that — despite the best efforts of client/server software producers to Web-enable their products — developers needed a radically different way of thinking about the application model. The idea of the "lowest common denominator" approach to business computing hit the developer community first. Obviously, new techniques and tools were required to meet the technology shifts and challenges facing developers.

Technology Shifts and Developer Challenges

Once the Internet revolution took hold and new technologies appeared, developers faced several challenges that existing design models and tools could not address adequately. The following issues were central to the developers' dilemma:

Heterogeneous environments

One of the earliest, and perhaps biggest, challenges was the need to build applications that could readily fit into heterogeneous environments. Most large organizations had a mix of terminals, rich clients, and thin (Web) clients. In addition to accommodating the client base, new applications had to interact with legacy data and applications hosted on mainframe and midrange computers, often from different hardware vendors.

Scalability

Prior to the influx of Internet technologies, scalability was a relatively easy issue to manage. At that time, the computing environment was essentially a closed system because the amount of remote access by staff, customers, or business partners was limited. This meant that the number of users and their usage patterns for given applications and services were well known. Strategic planners had ample historical data on which to base their projections for scaling the computing environment to match consumer demand.

Furthermore, the application development life cycle typically spanned several years. Once again, planners had ample time to plan for system and application scaling.

Finally, personal computers still had not realized their full potential and corporations had only begun deploying personal computers throughout their enterprise. In fact, many viewed personal computers as something slightly smarter than a terminal. As time passed, there was an expectation that the personal computer would become part of any given application.

While the personal computer was redefining how people worked, Internet technology, notably the Web, altered the corporate mindset. Initially, corporations viewed this new technology as an ideal, low-cost method for sharing information throughout the organization. Not only was it inexpensive, but it also made it very easy for users to do their own development, and internal Web sites (intranets) quickly appeared on the computing landscape.

The traditional foundation for scalability planning started to erode, and when companies opened their doors to the outside world, it crumbled completely. The new design paradigm said that you had to design systems to accommodate a user base ranging from fewer than one hundred users to more than one million.

Rapid application development and deployment

The intranet and Internet phenomena highlighted the possibility of, and need for, rapid application deployment. The corporate intranet experience clearly demonstrated that you could build business applications quickly. An added bonus was the simplicity of URL-based deployment. The result was that business managers and users began to question the entire traditional development platform and processes. They were no longer prepared to wait several years before being able to use an application. From an investment perspective, the business community questioned any investment in applications that would be legacy systems by the time they were completed.

Businesses refined the notion of rapid application development even further as they expanded their applications horizon from the intranet to the Internet. To be competitive, developers needed to create applications virtually on demand for immediate use — just-in-time (JIT) development. To achieve this, they needed to completely revamp and revitalize their approach to applications development.

Platform administration and management

As with any aspect of computer technology, things are not perfect in the Internet/Web world. The information technology (IT) professionals that embraced this new application model discovered that their newfound freedom and flexibility brought a completely new set of administration and management issues. These issues revolved around clients, applications, and hosts.

The browser, coming as it did from the grass roots, left most organizations without a browser standard. (Day-to-day support and upgrade issues themselves were often a logistical nightmare.) From a development perspective, the lack of standardization meant that application designers had to accommodate the core and extended HTML rendering capabilities of each browser version.
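The article does not show code, but the accommodation it describes often took the form of server-side branching on the requesting browser. The following is an illustrative sketch only (the function name and User-Agent checks are hypothetical, not from the original article), showing how a page generator of the era might emit simpler markup for older browsers and richer markup for newer ones:

```python
# Illustrative sketch (hypothetical names): branch generated HTML on a
# simplified User-Agent check, as early Web applications often did to
# cope with divergent browser rendering capabilities.

def render_nav(user_agent: str) -> str:
    """Return navigation markup tailored to the requesting browser."""
    ua = user_agent.lower()
    if "msie 3" in ua or "mozilla/2" in ua:
        # Older browsers: fall back to plain links, no table layout.
        return '<a href="/home">Home</a> | <a href="/help">Help</a>'
    # Newer browsers of the era: use a table-based layout.
    return ('<table><tr><td><a href="/home">Home</a></td>'
            '<td><a href="/help">Help</a></td></tr></table>')

print(render_nav("Mozilla/2.0 (compatible; MSIE 3.02; Windows 95)"))
print(render_nav("Mozilla/4.0 (compatible; MSIE 5.5; Windows NT 5.0)"))
```

Every new browser version multiplied the branches such code had to carry, which is exactly the maintenance burden the paragraph above describes.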

Application deployment was even more difficult to manage because system administrators had to contend with large numbers of content publishers rather than a single developer group. The management of this aspect of Web-based computing became increasingly difficult as businesses accepted the idea of providing data-driven, dynamic content. The need to include diverse data stores and accommodate several different scripting languages further broadened the scope of the Web programming model.

Any Webmaster from the initial days of Internet-based business applications can attest to the hours of painstaking, manual work required to keep even a medium-sized site operating properly and continuously. Another aspect of the Internet phenomenon was the users' expectation of access 24 hours a day, every day, and adding servers to accommodate increased Web site traffic only increased the demand for support.

Unfortunately, the Web's designers and advocates neglected to include a set of tools for managing the platform, leaving it to the IT community to develop a solution.

Network-aware applications

The final challenge facing developers resulted from advances in portable computing technology and the declining cost of portable computers such as laptops, notebooks, and palmtops. Coupled with the global access made possible by the Internet, mobile computing has grown at a rate comparable to the Web. Recent figures indicate that laptop computer sales now exceed desktop sales.

Offline, or disconnected, use is no longer the exception. The user community expects to be able to use applications and services in both online and offline modes. The application developer must be able to provide this capability in an application.

See Also

Designing Distributed Applications | An Overview of Distributed Applications

© 2015 Microsoft