Delivering Efficiency and Cost Reductions: Industry Trends and Impacts

Over the past two weeks, I covered how you should deliver IT cost reductions in the near term. Executed well, these near-term actions also yield longer-term benefits such as a better workforce balance and the elimination of redundant or low-value systems. But how do you deliver improvements in cost and efficiency that are both material and sustainable? First, you must put in place long-term tactics that you will relentlessly pursue. I will cover these tactics over the next two weeks, but first I think it is important to understand the impact of technology trends on IT and how they change the way you should tackle cost reduction.

To quickly recap, the CIO must understand the business trends that have shaped IT over the past several years:

– Cost reduction and efficiency have become a constant drumbeat for almost every IT shop over the past 4 years

– Businesses are becoming more and more reliant on IT to deliver their services and enable their operations

– Technology, almost regardless of industry, is becoming a larger and larger portion of the product itself

Thus, cost reduction must be done in a way that also improves capability, ensuring the viability of your business. This is no mean feat.

But several technology trends of the past 5 years make it possible to achieve cost reductions while increasing capability: the consumerization of technology, smartphones and mobility, more automated workflow and application tools, and virtualization.

IT in the ’60s and even the ’70s was originally the domain of government, universities, and the defense and space industries. From the ’70s until the ’90s, it was truly a corporate domain; then came the widespread use of the PC and the adoption of the Internet. But even with greatly increased personal use of computing, technology was still heavily driven by corporate computing. In the past 5 years, though, this has changed with the growing consumerization of technology. The best chips, the most scalable software, and the largest databases and systems are now delivered for consumer devices, not for corporate systems. So it is important as a corporate technologist to recognize this trend and to always look for ways to leverage consumer devices and hardware, and the approaches used to build consumer systems and networks, back into the corporate environment.

A good example is the increasing implementation of bring your own device (BYOD) at large companies. Instead of IT dictating and maintaining a set of corporate client devices, the shift is to allow employees to use the device they prefer and to enable corporate computing on that device through a secure sandbox application such as Good Technology’s. IT benefits from reduced device cost and maintenance while employee productivity and satisfaction increase. Small to mid-sized companies can take advantage of cloud offerings of generic corporate services such as email, sales management, and HR. The key for large companies is to shed legacy equipment, systems, and processes fast enough to take advantage of such offerings. I am familiar with one global corporate technology company that is currently trying to implement a client computing approach from 2002, complete with physical tokens (instead of software tokens on the employee’s cell phone), restricted, standard corporate hardware for mobile devices (instead of BYOD), and crippled capabilities (instead of implementing a sandbox and enabling the rest of the device). The result is a disgruntled workforce that thinks IT doesn’t get it, and a more costly configuration.

And as far behind as some large companies are in implementing a consumerized, modern client infrastructure, their application areas are often further behind, having failed to take advantage of the dramatically improved capabilities of workflow and application construction. Companies are overloaded either with legacy, fully proprietary, custom systems that are often poorly architected and brittle, or with package implementations that have been overly customized and are many versions behind, resulting in a massive backlog of required feature improvements and updates. And all of this is expensive. To get to a better future with lower costs and greater speed and flexibility, the technology team must be willing to take this application portfolio and do three things:

– identify those core proprietary or customized legacy systems that are critical to the business and offer unique competitive capabilities, and work with the business to make the investment to architect them properly and bring them into the modern era

– cull those over-customized package applications or legacy systems that do not provide competitive value and, once and for all, stay on the straight and narrow path of a vanilla release of packaged or cloud-based software, wasting no further resources or time in these areas

– leverage the advanced workflow and application-building tools in a rapid development or scrum approach to go after areas of operations and the business that have long been neglected in terms of automation, workflow, and technology. Where before only very large functions could be automated or addressed by technology, these new tools let you define and automate the processes of functions as small as 5 or 10 staff within a matter of 6 to 8 weeks, generating a rapid improvement cycle for the business (see the sketch after this list). With this much better and more timely ROI, you can set up small SWAT teams to tackle inefficiencies throughout the business divisions, driving operational cost reductions and quality improvements that could never have been addressed with your previous techniques.
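To make that third point concrete, here is a minimal sketch of the kind of small-team approval workflow these tools let you stand up in days rather than months. It is plain Python, purely for illustration; the step names, roles, and routing rules are assumptions I have made up for the example, not the API of any particular workflow product.

```python
from dataclasses import dataclass, field

@dataclass
class Request:
    requester: str
    description: str
    step: int = 0
    history: list = field(default_factory=list)

# Each step pairs an approver role with a rule deciding whether the
# step applies to a given request. (Roles and rules are hypothetical.)
WORKFLOW = [
    ("team_lead", lambda r: True),                       # always reviews
    ("finance", lambda r: "purchase" in r.description),  # only for spend
    ("department_head", lambda r: True),                 # final sign-off
]

def advance(request: Request, approver_role: str) -> bool:
    """Record one approval; return True once the workflow is complete."""
    # Skip any steps whose rule does not apply to this request.
    while request.step < len(WORKFLOW) and not WORKFLOW[request.step][1](request):
        request.step += 1
    if request.step >= len(WORKFLOW):
        return True
    expected_role = WORKFLOW[request.step][0]
    if approver_role != expected_role:
        raise ValueError(f"waiting on {expected_role}, not {approver_role}")
    request.history.append(f"approved by {expected_role}")
    request.step += 1
    return request.step >= len(WORKFLOW)

req = Request("analyst", "new laptop purchase")
advance(req, "team_lead")
advance(req, "finance")
print(advance(req, "department_head"), req.history)
```

Even at this toy scale the point holds: the process of a 5- or 10-person function reduces to a short, explicit definition that a small SWAT team can iterate on weekly, instead of waiting on a multi-year automation program.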

The final technology trend is, of course, virtualization. With virtualization (and TCP/IP), the IT industry is coming full circle back to the mainframe constructs of the 1960s. In essence, with virtualization and cloud computing, we are going back to the future, where computing is one utility pool that all end devices can access. The difference, of course, is that we now have a heterogeneous pool with global accessibility versus a homogeneous pool within one corporation or even just one department. By effectively employing virtualization (in fact, by mandating that all applications be virtualized on compute and storage), you can reduce your infrastructure server and storage costs by 30 to 50%. If you are one of the few that has not yet begun virtualization, get on the bandwagon. And if you are still below a 50% virtualization threshold (i.e., less than 50% of your compute or storage capacity is virtualized or pooled), then get going.
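As a back-of-the-envelope check on that math, here is a short sketch of how you might size the opportunity. Every input is an illustrative assumption (the server counts, the 10:1 consolidation ratio, the $5,000 fully loaded annual cost per server); substitute your own inventory and cost figures.

```python
# Illustrative virtualization math; all inputs are assumptions.
physical_servers = 400            # workloads still on standalone servers
virtual_machines = 300            # workloads already virtualized
consolidation_ratio = 10          # assumed VMs hosted per physical server
annual_cost_per_server = 5_000    # assumed fully loaded cost (power, space, support)

total_workloads = physical_servers + virtual_machines
virtualized_share = virtual_machines / total_workloads
print(f"virtualized share: {virtualized_share:.0%}")   # ~43%: below the 50% threshold

# Virtualizing the remaining servers collapses them onto far fewer hosts.
hosts_needed = -(-physical_servers // consolidation_ratio)  # ceiling division
servers_eliminated = physical_servers - hosts_needed
annual_savings = servers_eliminated * annual_cost_per_server
print(f"servers eliminated: {servers_eliminated}")          # 360
print(f"estimated annual savings: ${annual_savings:,}")     # $1,800,000
```

The exact figures will vary with your consolidation ratio and cost structure, but the shape of the result is why savings in the 30 to 50% range are credible.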

In sum, the technology trends of the past 5 or 10 years can help you get to lower cost with increased capability. But our legacies, particularly at large corporations, hold us back from leveraging these technologies. This is where IT management leadership is required. You must accelerate the conversions and cull those systems that cannot make the leap and are not critical to the company. And most importantly, you must ensure your engineers and design leads adopt these approaches with vigor and energy.

My next posts will cover the long-term approach to reducing IT costs in more depth. In particular, we will focus on how to use improved quality to eliminate rework and cost for your team.

Best,

Jim

