
FAIT ACCOMPLI – The Curiously Difficult-to-Predict Future of Software

September 16, 2020 (updated October 9, 2020)

TWO SIDES OF THE SAME COIN

We at Harbor often refer to business architecture and technology architecture as two sides of one coin. By business architecture we mean the full set of pre-existing conditions inside a business, including the evolution of everything a company does with data, applications, and systems. Technology architecture, by contrast, refers to the set of tools a company adopts to go out and do things. Today those two architectures are becoming more tightly coupled, but that coupling also tends to militate against innovation, or even against doing anything unique.

When it comes to preparing for the global information economy, most people assume the IT, telco, and automation technologists are on the case, taking care of business. The well-managed creation of the future is essentially viewed as a fait accompli. This is equally true of both developers and adopters of new Smart Systems and IoT technologies—whether it’s the IT and telco “arms merchants” on the development side, or the automation vendors and, worse yet, the CIOs on the adoption side.

We think that perspective is painfully naive. The simple truth is that humanity is still trying to build a future on inadequate systems and structures from the past. We might as well be wandering through the uncharted jungles of the 21st century with walkie-talkies and reconnaissance reports from 25 years ago. For all the silicon-based “intelligence” permeating every aspect of our lives, we still live in a brutally dumb world.

HISTORY BLINDS US

The problem with IT in the enterprise is that its roots go back to batch processing. Think of a giant soup pot into which you’re always throwing more data. You’re trying to predict what will happen next week based upon what happened last week. Even if you add AI and machine learning to that soup, it’s still batch processing.

Then wireless carriers came along with real-time processing at scale. This was a huge development, but the carriers had no idea what an application was outside of a cell phone. Then we got embedded systems and automation, and enterprises finally had a thoroughly “stateful” awareness of the world—spatial, temporal, time-series, real-time. But they were still just trying to control things. They weren’t leveraging the inherently valuable information available from the data in these systems.
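The batch-versus-stateful distinction above can be made concrete with a toy sketch (purely illustrative; the names `batch_forecast` and `StatefulSensor` are our own, not drawn from any vendor's platform). A batch system re-crunches accumulated history to answer questions about the past; a stateful real-time system maintains running state and can act on each event the moment it arrives:

```python
from collections import deque

def batch_forecast(last_week_readings):
    """Batch style: reprocess the whole historical 'soup pot'
    to predict next week from last week's average."""
    return sum(last_week_readings) / len(last_week_readings)

class StatefulSensor:
    """Streaming style: keep a rolling window of recent readings
    and decide on every event without reprocessing history."""
    def __init__(self, window=5, alarm_threshold=100.0):
        self.window = deque(maxlen=window)   # bounded state, old readings fall off
        self.alarm_threshold = alarm_threshold

    def ingest(self, reading):
        self.window.append(reading)
        rolling_avg = sum(self.window) / len(self.window)
        # Real-time awareness: act on current state immediately
        return "ALARM" if rolling_avg > self.alarm_threshold else "OK"

# Batch: one pass over last week's data, useful only after the fact
forecast = batch_forecast([90, 95, 98])

# Streaming: stateful, per-event decisions as readings arrive
sensor = StatefulSensor(window=2, alarm_threshold=96.0)
statuses = [sensor.ingest(r) for r in [90, 95, 98, 99]]
```

The batch function can only describe what already happened; the stateful object can trigger an action the instant a reading pushes its rolling average over a threshold, which is exactly the kind of real-time, stateful awareness the carriers and embedded-systems vendors introduced.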

Squint your eyes and the story looks like this: Once there was enterprise, then there was the enterprise plus web, then there was the enterprise plus web and mobile. Today we have the enterprise plus literally everything—including IoT but not ending there. IoT is simply the latest expansion of this domain, and it will eventually become just another transformational fact of life, much the way mobile has.

There is a huge opportunity at the intersection of these three evolutions: batch, wireless, and embedded. But most, if not all, of the so-called platform players have been unable to evolve their views of software beyond the blinders they wear—blinders that trap them inside their supply-side technical development cultures and prevent them from clearly seeing the future.

ORCHESTRATING INNOVATION FOR FUTURE ENTERPRISES

The modern business enterprise has been deconstructing for decades. Companies used to develop the logistics, tools, and processes they needed right inside their four walls. Today, no one thinks of a company as bound by the four walls of a building. Companies are now ecosystems: value-delivery networks made up of a disassembled set of business functions and entities, some owned directly, many sub-contracted, but all requiring orchestrated data and information. Because enterprises have diverse users, functions, and entities, all with an overabundance of data flows and interactions, they need purpose-built tools to orchestrate the value those interactions present. Two critical forces underlie this shift:

  • The expanding number and diversity of new applications enterprises want to develop, along with the corresponding failure of software development organizations to keep up with the rapidly growing demand; and
  • The advent of multiple new classes of data-driven applications, including AI, machine learning solutions, and Internet of Things (IoT) applications.

Enterprises are struggling today to turn the operational data generated by their people, machines, and fleets into tangible business value. Data is often trapped in machines, equipment, and incompatible systems, or stored locally on workstations and drives. Extracting value from diverse data types and disparate data sources requires special skills that are in short supply, including cloud server provisioning, data science, and fluency in multiple programming languages. Most development organizations have become overwhelmed by the proliferation of new application requirements.

The collapsing of the traditional boundaries in today’s enterprise has coincided with the collapsing of the technology stack, which used to force so many expensive decisions about how an enterprise would become digital. Once upon a time, buying IBM hardware versus one of its competitors was a huge decision. Then Linux came along and suddenly that decision was history. Even more painful debates arose around the choice of programming languages, with advocates making impassioned cases for their preferred way of writing code—most of which no one even uses anymore.

Today we have containerization, languages matter less, embedded virtualization is radically altering distributed edge computing systems, and microservices are becoming pervasive. All of this is enabling big changes in the way we think about software, but it is also creating a huge amount of complexity.

We’re entering a world of vast and diversified services from giant horizontal service providers, with a huge number of wildly unique vertical specialists building on top of them. The days of the monolithic app are over. Software tools, applications, and infrastructure are each changing rapidly while converging in ways that cause huge disruptions in the marketplace, including:

  • Software is infiltrating virtually every market, discipline, and niche, indisputably eating into the traditional profit and revenue models across most industries;
  • Software is not only displacing hardware and physical equipment systems, it’s displacing services and labor;
  • In this new chapter every company, no matter its legacy business, is becoming a software player—often with scant appreciation for the vastly different realities of that world.

The forces at work in the software arena make strategic decision-making extremely difficult. The velocity of change conspires with the number of variables in play to overtax many managers’ ability to make confident and informed decisions.

In this light, is the creation of a global information economy really just a fait accompli? A done deal? We have good reasons to doubt it.


Reproduced with permission. Visit Harbor Research, an ASHB member, to view this article in its original publication format and to download Harbor Research’s overview of the “Smart Systems and IoT Software Opportunity.”