IT industry monetization has evolved dramatically; quantum computing adds yet more choice, and therefore more complexity, for purchase decision makers
For years, technology buyers had to consider hardware choices ahead of anything else. Software solutions were built to the specifications of proprietary hardware architectures and operating systems, so business outputs were inextricably linked to the hardware vendors and to the software vendors that wrote to their platforms. Chart 1 depicts the core influences that dictated not only how technology was purchased but also how the industry itself evolved. Choke points in innovation across the continuum of compute, storage and networking brought new entrants into the market, easing the bottlenecks and helping to spur the scientific and engineering advances needed to keep pace with business demand for “faster, better, cheaper” solutions.
The only fundamental difference between the two charts, from the historical picture on the left to the current picture on the right, is the removal of input and output devices from the equation, thanks to cheaper device access for human interaction with what can broadly be defined as the compute network. Science and engineering have brought down price points while simultaneously ramping up compute power. The software abstractions behind virtualization, however, set off a series of innovations that have driven far greater complexity into the purchase equation, which has not only disrupted technology vendor business models but also radically transformed the way businesses can consume and deploy technology for competitive advantage.
Cloud, the first of two major inflection points for business buyers
From the IT side of the business, the three transformative elements have been virtualization, standardization and automation. Virtualization abstracted compute power from physical infrastructure, standardization established rules of engagement between systems, and automation accelerated deployment and stripped labor out of the process. This reality has flipped the axis for IT vendors and business buyers alike, making infrastructure a derived decision rather than a primary or constraining one.
From the business side, virtualization does not necessarily apply, but standardization and automation certainly do if technology is to be fully exploited for competitive advantage. To capitalize on compute today, business units and trading partners have to reach consensus on business rules and then secure strict human compliance with those rules when interacting with the systems. Without that compliance, automation will require labor remediation that puts a business at a competitive cost disadvantage against enterprises whose systems hold clean data from compliant human inputs. That clean data is what fuels the outputs that machine learning (ML) and artificial intelligence (AI) generate. Blockchain, also known as distributed ledger technology, codifies those business rules into smart contracts that can distribute the clean data as needed to ecosystem participants, who can then apply that data in their AI systems for business value creation.
This reality was on full display several years ago during an EY event breakout session discussing how ServiceNow could be used to improve clients’ business processes. The presenter, who was being peppered with “What about this?” questions from an eager and interested audience, answered both vaguely and succinctly by saying with a wry smile, “We can do whatever you want; you just have to make up your minds.” This is the business process change management necessary to establish the business rules or standardization that can now be automated through all elements of the business value chain. However, there must be industrywide consensus for the full power of blockchain and AI/ML in business processes to be unleashed.