IT Strategies: Past, Present and Future

Updated: November 17, 2009

Up until the 1980s we lived in the era of mainframe exclusivity, where strict controls and IT disciplines were the norm. This era limited us to mainframe reports as our exclusive medium for information. Custom report requests took weeks or months to deliver because of the sheer volume of requests. There was a hunger within corporations for information: information to help guide decisions. But the mainframe reports only teased the taste buds of information-hungry business leaders.

The 1980s

This hunger led to the explosion of distributed computing in the 80s. I use "distributed computing" in a very broad sense here to include all distributed systems of the time: midrange (Unix), Wintel (Novell, PC LAN) and desktop PCs (DOS, Windows/386). A multitude of others have sprung up since these "pioneering" systems were introduced, and they belong in this category as well. In this decade smaller departments took their standard mainframe reports, imported them into their own distributed systems and created their own customized data stores…with customized reporting. These new "distributed" systems sprang up throughout the enterprise. Different systems in different locations, all creating storehouses of information…all using different operating systems, different databases and different applications that didn't talk with each other.

The 1990s

In the early 90s corporations realized for the first time that, with all these departments running their own distributed systems, they had lost track of how much was being spent on IT. Each department had its IT costs buried in its own departmental budget. In these distributed environments many support staff performed dual roles. We saw accounting clerks performing systems administration functions, and tech-savvy clerks became the informal help desk that department users called when help was needed.

By the mid 90s we saw a major effort in corporations to centralize their IT functions into one IT organization. This was the only way corporations could regain a grasp of their total IT spending. Through this IT centralization effort, departments once again lost control of their systems, their data and, most importantly, their information. Central IT became the Technology Police, telling departments what information they could have and what they couldn't.

The new centralized IT organizations of the 90s faced several challenges. Their charter: provide the whole enterprise with IT support within a set budget. Many questions had to be answered. Should Centralized IT force departments to surrender their systems and technical staff, or should incentives be provided for them to comply? What services should Centralized IT provide: everything the departments needed or wanted, or only pre-defined services? How would Centralized IT services be funded: would the departments pay per user, or would it be a corporate expense? How should IT projects be funded: by the requesting organization or by a central IT budget? Every corporation wrestled through these questions and came up with a different IT model.

The New Millennium

When the 2000s rolled around we saw IT formalizing on best practices. These practices developed around a set of basic services provided across most system types, such as hardware/software standards, system updates/upgrades, tape backups, off-site storage, help desk services, change control, etc. These base services evolved from efforts to trim down IT overhead. In the 2000s outsourcing of data center operations became more plausible as companies sought additional cost reductions. During this period the phrase "managing your mess for less" became the unofficial mantra of IT outsourcing providers, who took over the mess Centralized IT had created and charged a lower rate for delivering the same basic services.

By 2005 we heard a new battle cry to lower costs even further. Enter the age of "off-shoring". IT providers heeded the call for lower rates by delivering services with low-cost overseas resources. If the early 2000s mantra was "your mess for less", the mid 2000s mantra was "your mess for even less".

2010 and Beyond

As we embark on the next decade, IT leaders will awaken to a new day where they realize "your mess for even less" still costs too much! Their new mantra will be "why are we living with this mess in the first place?" IT leadership will stop and evaluate why they are providing the services they provide. Why are they backing up everything at each site to tape and sending it to off-site storage, as if everything at remote sites were critical information? Why are they backing up everything at their data centers to tape when the only things that must go to tape are critical records for archiving? What low-value services are they providing that don't need to be performed at all? Over the next decade IT will redefine itself and eliminate the mess altogether.

In 2010, IT organizations will restructure themselves around three strategies. First, IT will fine-tune the services they offer. For example, they will provide full services to critical systems and data only, and they will tolerate longer outages on non-critical systems. They will change their approach to backups, incorporating temporary disk storage as a replacement for tape backups on non-critical systems, particularly in remote branch offices. IT will surgically eliminate all low-value services on a system-by-system basis, providing services only where needed.

Second, IT will outsource basic services such as help desk and systems administration to IT providers who can deliver them for a fraction of in-house costs. Third, IT leaders will redirect their focus from providing generic IT services to providing the essential technologies that key business groups need to grow the business. IT will transition from cost center and overhead to revenue enabler. IT will once and for all get rid of "its mess" and transform into a true business partner.

In 2010 the new IT organization will put less emphasis on system-specific technical skills and more on senior technology professionals who can generate business value by incorporating technologies. By the end of the next decade IT organizations will consist of high-end technical resources who design business-specific solutions across multiple frameworks: diverse, multi-platform solution designers supported by a handful of implementers and system administrators.
