Back to the Future: Getting back to integrated systems
Back when computers were a shiny new concept, the hardware and software were tightly linked together. This tight coupling was mandatory because of the costs and technical immaturity of computing. The most powerful computers in the 1970s had less computing power than a smartphone has today. Over time, as systems became more powerful and less expensive, vendors began decoupling hardware from software. This gave rise to both the minicomputer and the PC markets. Business leaders loved the new freedom presented by being able to own and control their own systems. As a result, in the late 1980s and early 1990s, there was a renaissance of innovation in software – in packaged applications, tools, and software development technologies. I write extensively about the technology transitions brought about by these changes in my book, Smart or Lucky? How Technology Leaders Turn Chance into Success.
This innovation also gave rise to innovative distributed computing technologies like client/server and web-based approaches that transformed both the technology industry and business itself. As with any industrial change, there were unintended consequences to the new frontier of distributed computing. Suddenly there were thousands of independent systems inside data centers that were difficult to manage. Ironically, in the aggregate these distributed systems were as complicated and costly as the mainframe systems that businesses were rebelling against. So it is not surprising that an interesting trend is taking hold that is upending the computer industry again. Let’s call it back to the future. We are now seeing a transition back to vertically integrated systems.
Apple has adhered to vertical integration for most of its history. The company has been able to distinguish itself in the market and avoid the commodity trap by vertically integrating its hardware with its operating system. By doing this, Apple was able to achieve a seamless customer experience – at a premium price – that its competitors have not been able to match. But what about the enterprise market? There is a profound change emerging. Vendors including IBM, HP, Oracle, and EMC are increasingly tuning their hardware and software to work as optimized systems.
Why is this necessary? After all, systems have become more powerful and capable than ever. There is also the assumption that hardware has become a commodity. Therefore, if hardware is a commodity there is no problem with simply adding more and more systems into the mix until the right amount of power is available. So, why the need to change? The answer is actually quite simple. The more moving parts you add into a system, the more energy, time, and money you have to spend making the parts work as a unified system.
This is not a new problem. In fact, it has been addressed over the past decade with virtualization technology, which in essence provides an operating system layer on top of systems so that individual systems can be managed and controlled more efficiently. Virtualization has begun to dramatically transform distributed computing, making it possible to leverage the available resources in an effective manner. But virtualization in isolation is still not enough, because it can have unintended consequences. For example, virtualization works by creating images of resources that can be used to execute a variety of processes. These copies consume space and resources on a system. In addition, orphaned images potentially open holes in a company’s security firewall. When images are not removed after they are no longer needed, they consume system resources, compromise security, and undo many of the benefits.
The consequences and complexity of managing piece parts have led to a new trend: vendors beginning to package and optimize their hardware with software. If software can be injected into hardware, the hardware becomes much more flexible and can be changed and modified on demand. Ironically, in many ways the very nature of hardware is changing so that it has the modularity and flexibility that we used to associate only with software. With this added flexibility, hardware is now being coupled more directly with the supporting software elements to create a new generation of vertically integrated systems. The focus isn’t just on the underlying performance of the hardware; these systems also incorporate best practices for a specific industry. For example, a vertically integrated system could be optimized for the needs of a healthcare research firm that needs to manage massive amounts of data but does not need strong transaction management support. Therefore, the hardware would be optimized to handle large amounts of data. The system would then include the software elements needed by that organization, including software designs that have proven effective in supporting this type of business requirement.
The result of this approach is a tighter integration of hardware and software that are designed not as independent elements force-fit together but as an orchestrated system. As with anything else, there are hitches. This is a trend that should serve customers well, but a vertically integrated system has to be designed with modularity at the core. The system must be able to support change. Of course, it is much easier to create a tightly integrated system, but that could be dangerous because of its rigidity. When you look to solve business problems through vertical integration, make sure that it is implemented as a set of modular services that can evolve with business needs.