A decade ago, many still questioned the relevance of digital technology. While Internet penetration was already significant, e-commerce made up less than 4% of retail sales. Mobile and cloud computing were just getting started and artificial intelligence was still more science fiction than reality.
Yet today, all of those things are not only viable technologies, but increasingly key to effectively competing in the marketplace. Unfortunately, implementing these new technologies can be a thorny process. In fact, research by McKinsey found that fewer than one third of digital transformation efforts succeed.
For the most part, these failures have less to do with technology and more to do with managing the cultural and organisational challenges that a technological shift creates. It’s relatively easy to find a vendor that can implement a system for you, but much harder to prepare your organisation to adapt to new technology. Here’s what you need to keep in mind:
Probably the most common trap that organisations fall into is focusing on technology rather than on specific business objectives. All too often, firms seek to “move to the cloud” or “develop AI capabilities.” That’s a sure sign you’re headed down the wrong path.
“The first question you have to ask is what business outcome you are trying to drive,” Roman Stanek, CEO at GoodData, told me. “Projects start by trying to implement a particular technical approach and not surprisingly, front-line managers and employees don’t find it useful. There’s no real adoption and no ROI.”
So start by asking yourself business-related questions, such as “How could we better serve our customers through faster, more flexible technology?” or “How could artificial intelligence transform our business?” Once you understand your business goals, you can work your way back to the technology decisions.
Technological change often inspires fear. One of the most basic mistakes many firms make is to use new technology to replace humans and cut costs rather than to augment and empower them to improve performance and deliver added value. This not only kills employee morale and slows adoption, it usually delivers worse results.
A much better approach is to use technology to improve the effectiveness of human employees. For example, one study cited by a White House report during the Obama Administration found that while machines had a 7.5% error rate in reading radiology images and humans had a 3.5% error rate, when humans combined their work with machines the error rate dropped to 0.5%.
The best way to do this is to start with the most boring and tedious tasks. Those are what humans are worst at. Machines don’t get bored or tired. Humans, on the other hand, thrive on interaction and like to solve problems. So instead of looking to replace workers, look instead to make them more productive.
Perhaps most importantly, this approach can actually improve morale. Factory workers actively collaborate with robots that they themselves program to handle low-level tasks. In some cases, soldiers build such strong ties with robots that do dangerous jobs that they hold funerals for them when they “die.”
Another common mistake is to think that you can make a major technological shift and keep the rest of your business intact. For example, shifting to the cloud can save on infrastructure costs, but the benefits won’t last long if you don’t figure out how to redeploy those resources in some productive way.
For example, when I talked to Barry Libenson, Global CIO of data giant Experian, about his company’s shift to the cloud, he told me that “The organisational changes were pretty enormous. We had to physically reconfigure how people were organised. We also needed different skill sets in different places so that required more changes and so on.”
The shift to the cloud made Experian more agile, but more importantly it opened up new business opportunities. It allowed the company to create Ascend, a “data on demand” platform that lets its customers make credit decisions based on near-real-time data, and which is now its fastest growing business.
“All of the shifts we made were focused on opening up new markets and serving our customers better,” Libenson says, and that’s what helped make the technological shift so successful. Because it was focused on business results, it was that much easier to get everybody behind it, gain momentum and create a true transformation.
Consider how different work was 20 years ago, when Windows 95 was still relatively new and only a minority of executives regularly used programs like Word, Excel and PowerPoint. We largely communicated by phone and memos typed up by secretaries. Data analysis was something you did with a pencil, paper and a desk calculator.
Clearly, the nature of work has changed. We spend far less time quietly working away at our desks and far more interacting with others. Much of the value has shifted from cognitive skills to social skills as collaboration increasingly becomes a competitive advantage. In the future, we can only expect these trends to strengthen and accelerate.
To understand what we can expect, look at what’s happened in the banking industry. When automatic teller machines first appeared in the early 1970s, most people thought they would lead to fewer branches and tellers, but actually just the opposite happened. Today, more than twice as many bank tellers are employed as in the 1970s, because they do things that machines can’t do, like solve unusual problems, show empathy and up-sell.
That’s why we need to treat any technological transformation as a human transformation. The high value work of the future will involve humans collaborating with other humans to design work for machines. Get the human part right and the technology will take care of itself.