Leading IT Transformation
Leading IT Transformation is targeted at organizations that are struggling with various issues impacting information technology implementations. Perhaps your teams are plagued with schedule and cost overruns. There may be a lack of trust or partnership between IT and the business. The business may need IT to deliver value faster, with better collaboration. Often there is an increasing demand for responsive IT processes and infrastructure that meet the business’ growing and ever-changing needs. One or, more likely, a combination of these factors may have led the organization to conclude it needs to transform its IT and business interactions to “fix” what is broken. This course will give learners tools to address these situations.
Possible segments may include:
– Creating a 90 Day Plan (within 30 days);
– Conducting a Current State Assessment (Days 1-30 of 90 days);
– Creating a Future State Design (Days 31-60 of 90 days);
– Developing a realistic Transformation Roadmap (Days 61-90 of 90 days); and
– Execution of Transformation Roadmap (Day 91+ – usually 3-5 years in total).
Other possible segments include:
– Effective Leadership (including concepts of Servant Leadership, collaborative partnerships);
– Effective Organizational Design (getting the “right people in the right seats”);
– Leading Organizational Change;
– Effective Team-Building (to enhance teamwork and collaboration);
– Developing Meaningful Metrics;
– Communicating Progress (Successes and Challenges);
– Implementing/Improving Agile Practices (Kanban and Scrum);
– Building an Effective IT Outsourcing or Co-Sourcing Model;
– Software Package/Service Evaluation;
– IT Vendor Negotiations; and
– IT Vendor Relationship Management
The 90 Day Plan is intended to jumpstart the transformation process by timeboxing the discovery, design and high-level planning efforts to 30 days each. The first 30 days are spent understanding where the company or business area is by interviewing key stakeholders (executive leaders, area leaders, staff, other business areas that do or should collaborate, etc.) to understand perceptions of the current state. These perceptions may not be accurate, but they are true from those individuals’ viewpoints. The next 30 days are spent designing the ‘future state’. This involves researching industry leading practices, published standards, leading competitor practices, etc., to develop the future state design. The final 30 days are spent developing a roadmap to reach that future state in a very practical way, anticipating (and addressing) the challenges likely to be encountered.
The objective is not to achieve perfection in the 30-day increments, but rather to gain the most value in the shortest period of time, applying the 80/20 Rule. In the Current State Assessment, the objective is to learn 80% of what we need to know about what is good and bad about the current state in the first 30 days, and learn the remainder (e.g., outlier situations) over time. Likewise, the objective in the Future State Design is to get it 80% right, with adjustments made as more is learned. The Roadmap is not ‘set in stone’; rather, it provides high-level targets for important milestones which can be communicated to obtain buy-in from leadership, IT staff and business areas. Overall, it should be about 80% accurate, with more accuracy in the next 6 months and less accuracy 2-3 years out. The Roadmap is a living document, with timing adjustments expected as more is learned along the way.
A Brief History of the Use of Technology in Business
A company’s IT infrastructure is its backbone. It provides the foundation to manage business processes, serve customers and work with vendors. Technology has advanced tremendously over the past seven decades. A laptop today has roughly the same power as a supercomputer had 20-30 years ago. Information systems cover many more functions of the organization than before, have become more integrated, and allow visibility into vast amounts of data to inform business decisions. Technological advancements have continued at a relentless pace, impelling businesses worldwide to modernize their IT systems to gain competitive advantages.
Tracing the IT infrastructure evolution
1930-1950: The era of electronic accounting machines relied on large, cumbersome machines with hardwired programs, used for sorting, adding and reporting data.
1959-present: Mainframe computers were the first powerful computers to provide virtual memory and multi-tasking and to support thousands of remote terminals. This era was defined by centralized computing managed by programmers and system operators. Minicomputers later decentralized computing to individual business units.
1981-present: The IBM PC made an appearance in 1981, attracting the attention of businesses across America. The PC era had arrived, and the big winners were Intel and Microsoft, who formed an alliance to build the software and hardware platforms for personal computing and businesses. This ‘Wintel’ partnership weakened with the growth of mobile computing; in 2017, Microsoft announced partnerships with chipmakers Qualcomm and Cavium to power its PCs and Azure cloud computing service.
Starting from 1983, the client server era reshaped the way computers were used. In this form of computing, computers (“clients”) are networked to server computers that provide services and capabilities to clients. Client/server computing provided a host of benefits, including integrated services, centralized management, improved data sharing, and data interchangeability and interoperability.
From 1992 onwards, enterprise internetworking began connecting computers and related devices across departments, facilitating data accessibility. Also known as a corporate network, this architecture uses the TCP/IP communications protocols.
Evolution of management information systems and IT management roles
A management information system (MIS) is a computer system that collects and stores information and includes tools for analyzing that information. The tools support processes, operations, business intelligence and IT. Past management information systems operated independently of other company systems. They were found only on mainframe computers, and the information they processed was used only by company management. Today, information systems, as they are commonly called, serve different organizational levels.
In the mid-1960s to the mid-70s, information systems were centralized and reserved for governance and management issues. The information systems and their reports were controlled by the accounting department. They relied on mainframe computers, programming languages such as assembly and Fortran, databases and early networks.
As the benefits of organization-wide implementation of information systems became apparent, initiatives were formed to explore the scope of additional information system projects, culminating in the adoption of minicomputers and mid-range computers.
In the third era, from the mid-80s to the late 90s, information became decentralized and the role of the Chief Information Officer (CIO) emerged to plan the purchase and management of information systems for various organizational departments.
The fourth era, beginning in the late 1990s, made systems and data more accessible to all organizational employees. Technologies utilized included social media, search engines and laptops, tablets and smartphones.
Recent years have seen the emergence of cloud computing-based information systems. The value of cloud computing derives from the availability of resources in a flexible and economical manner. As a model for delivering software, platforms and infrastructure on an on-demand basis, the cloud offers businesses huge cost-saving potential.
Problems with older technology
Relying on old technology to run operations is risky for several reasons. Not only can it disrupt operations, it can also negatively impact users.
Reduced productivity: Older systems may slow down not because of age but under the weight of new software, which requires newer and better hardware than aging systems have. This bloat makes PCs and laptops slow, hindering productivity, and poses a problem when employees need to download apps or use cloud services. If the operating system freezes, or installations and integrations don’t complete in the time they should, performance suffers at both the individual and organizational levels.
Increased cyberattack risk: Outdated software can no longer be maintained or patched, which means nothing can be done when vulnerabilities are found. Such systems fall prey easily to sophisticated cyberattacks, potentially compromising corporate networks and systems. Moreover, outdated software cannot integrate with new applications and does not work smoothly on new devices. If software is no longer supported by its manufacturer, it should be retired. Major upgrades should be considered when operations become noticeably inefficient, time-consuming and costly.
High maintenance costs: On average, legacy systems make up 31% of an organization’s technology. Gartner defines a legacy application as “an information system that may be based on outdated technologies but is critical to day-to-day operations.” The costs associated with maintaining legacy systems can sometimes be greater than the costs of modernizing the corporate IT infrastructure, as frequent maintenance and hardware changes add up quickly. Organizations should also consider the opportunities missed because of outdated technology. There is also the problem of finding software engineers who have the experience and willingness to work with older systems. The COVID-19 pandemic exposed the shortage of COBOL developers: organizations still running 40-year-old systems written in COBOL (a language that is itself 60 years old) struggled to find qualified developers to support and maintain their platforms and systems.
Signs that your company’s technology needs an update:
– Systems have become slow and clunky
– Systems have come under repeated cyberattacks
– Downtimes have increased
– Legacy systems do not support the company’s growing operations
– IT security struggles to meet industry compliance standards
– Electricity bills are high because older computers use more energy than the latest laptops
Agile Development Practices
Agile product development processes have become popular and very effective in delivering early business value from IT projects, especially when business requirements cannot be well-defined and the business does not know what it wants until it sees it. Agile practices, including Scrum and Kanban, are used extensively today as they provide huge benefits over traditional “waterfall” IT development methodologies and traditional project management methods.
Life before game-changing Agile
What did software development look like in the past? And why did we move on from those approaches to Agile and DevOps? Imagine being the IT lead of a technology company in the 80s. You would identify a problem and plan a solution. The methodology followed would be to define the requirements and scope of the work; design a solution based on those requirements; build the solution; test the solution; fix any problems discovered during testing; and launch the solution.
As product requirements were pre-determined, it was difficult (i.e., expensive) to make changes once development had commenced. Bringing a complete product to market took years, during which the problem the product was intended to solve changed, rendering the solution less effective or ‘dead on arrival’. Long development times also meant that the market segments needing the solution had to wait years before a satisfactory product arrived. Often, the end result was a delivered system that didn’t meet business or customer needs. In other cases, projects were abandoned mid-flight when it became apparent that the nature of the problem had changed and the solution had lost relevance. Frustration was commonplace among business and IT (both staff and leadership), resulting in poor relationships and a lack of trust.
The 90s were dedicated to changing software development and delivery approaches, leading to Scrum, Pragmatic Programming and DSDM, among other methods, and setting the stage for Agile, which first appeared as a manifesto in 2001. The Agile Manifesto laid out the core values of the proposed approach, emphasizing individuals and interactions, working software, customer collaboration, and responding to change. By 2015, Agile adoption had skyrocketed, becoming near-ubiquitous across development teams.
Learners enrolling in the Leading IT Transformation program will understand how the adoption of Agile methodology can help them execute IT initiatives successfully. They may employ Agile product development techniques for a range of IT (and non-IT) priorities, including automating business processes, boosting customer engagement, developing software applications, and achieving ‘quick wins’ for digital transformation initiatives.
The advantages of implementing Agile practices will become apparent in the following ways:
– Creating business value through better products or improved processes in less time
– Ensuring that staff make the best use of their time on priority tasks
– Producing noticeable and trackable improvements in processes and products
– Having greater control over projects and improving project predictability
– Improving collaboration between the business and IT, and amongst IT application development and infrastructure areas
– Creating opportunities for team engagement and lifting team morale
The enduring power of Kanban
Kanban started life as a planning system in the 1940s on the Toyota manufacturing line. Its goal was to manage and control work and inventory at each stage of production optimally. As a visual system that shows both a workflow and the work passing through it, Kanban can be applied to anything, in any industry. Nike used Kanban to standardize work and reduce overtime at its supplier factories, salvaging its reputation, which had come under attack from activists, and setting new standards for employee well-being in its industry. Jaguar leveraged the system to shorten time-to-market and improve new product design.
What makes Kanban so relevant in the digital world is its facilitation of team collaboration, work item visibility and efficient work planning. Companies may use Kanban boards to manage IT project work from development to deployment. Leading IT Transformation includes Kanban implementation and fine-tuning, helping learners apply Kanban where it has the potential to make a meaningful difference. The advantages of Kanban as a way of working include workflow transparency, the ability to deliver a solution continuously, flexibility in managing changing or incoming project requirements, reduced wasted work, increased efficiency and productivity, and focused team member effort on priority work, with clarity and without confusion.
Kanban is considered an Agile framework, although it doesn’t necessarily involve incremental and iterative development. It focuses on visualizing the work items on Kanban boards, to increase transparency and collaboration among team members (including business and IT staff). Kanban can be very useful in visually prioritizing work, identifying blockers (things that prevent progress) and demonstrating progress with the backlog of small maintenance projects, as well as when a solution is being developed and/or implemented over a long development cycle/duration. Kanban is also a powerful change management method.
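The mechanics described above can be illustrated with a minimal sketch: a board of columns with work-in-progress (WIP) limits, cards that move between columns, and blocked cards that cannot progress. The class and column names below are hypothetical, for illustration only, and do not represent any particular Kanban tool.

```python
# Minimal, illustrative Kanban board: columns with WIP limits,
# card movement between columns, and blocker tracking.
# All names and limits here are made up for the example.

class KanbanBoard:
    def __init__(self, columns):
        # columns: dict mapping column name -> WIP limit (None = unlimited)
        self.limits = dict(columns)
        self.cards = {name: [] for name in columns}
        self.blocked = set()

    def add(self, card, column="To Do"):
        self._check_limit(column)
        self.cards[column].append(card)

    def move(self, card, src, dst):
        # A blocker prevents progress until it is resolved.
        if card in self.blocked:
            raise ValueError(f"{card!r} is blocked; resolve the blocker first")
        self._check_limit(dst)
        self.cards[src].remove(card)
        self.cards[dst].append(card)

    def block(self, card):
        self.blocked.add(card)

    def unblock(self, card):
        self.blocked.discard(card)

    def _check_limit(self, column):
        # Enforcing WIP limits keeps the team focused on finishing work
        # before starting more.
        limit = self.limits[column]
        if limit is not None and len(self.cards[column]) >= limit:
            raise ValueError(f"WIP limit reached for {column!r}")


board = KanbanBoard({"To Do": None, "In Progress": 2, "Done": None})
board.add("Fix login bug")
board.add("Upgrade database")
board.move("Fix login bug", "To Do", "In Progress")
board.block("Upgrade database")
```

The WIP limit on the ‘In Progress’ column is what distinguishes a Kanban board from a simple to-do list: exceeding it raises an error, surfacing the bottleneck instead of hiding it.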
An understanding of Kanban’s applications and advantages will help learners of this program apply change quickly or gradually in support of IT transformation. By implementing Kanban, they will be able to provide managers and staff a visualization of the work, create a feedback and review system, identify and manage bottlenecks, spot and use opportunities for improvement, and introduce the right performance metrics.
In-house versus Outsourcing
Software powers virtually everything around us. Every company has a need for software that supports or improves operations, solves customer challenges and provides a competitive advantage. Software development and maintenance can be staffed in-house with employees, outsourced, or co-sourced. The success of software development depends on a range of factors. The team working on it must possess the right technological skills to design and develop the software as it was intended, to serve a particular function or set of users.
Organizations that do not have the necessary internal resources to develop and/or support the IT application portfolio, or are unable to cost-effectively meet the IT infrastructure needs, may engage the services of an external vendor (e.g., a software development company, software support vendor or infrastructure provider). Outsourcing is practiced by small and large companies alike and has been around since the 1980s.
One of the earliest examples of outsourcing dates to 1989, when Eastman Kodak outsourced the design, building and management of a data center to IBM. Hundreds of Kodak staffers were transferred to IBM. Other companies took note, and the IT outsourcing industry began. The growth of computer networks in the 90s led to the creation of application service providers (ASPs). Over the years, managed service providers (MSPs) have formed to deliver applications, networks, security and infrastructure remotely via pay-as-you-go models that enable businesses to save costs and increase or reduce the resources they pay for as they scale their business up or down.
A company may outsource an entire function or some activities only. Surveys indicate that 59% of businesses use outsourcing to reduce their operating expenses. The top outsourcing countries are India, Ukraine, Poland and Argentina.
The decision to outsource the company’s IT infrastructure management, cybersecurity monitoring or software development is informed by the realities facing the organization and the strategic advantages that outsourcing brings. Common reasons for outsourcing include controlling operating costs.