
In simple terms, digital transformation is a series of digital initiatives, each aimed at delivering a specific capability. Carrying out these initiatives successfully requires a robust development framework that integrates several advanced technologies. Let's look at how smart digital transformation works, first from the bigger picture and then technology by technology.

The Bigger Picture

Many people see smart digital transformation architectures as a networking stack. However, the real value lies in viewing these technologies as part of a cyber-physical system. At the center, there’s the software-defined product, which collects data from internal sensors and interfaces with external systems to gather additional data. All of this is connected through a networking fabric. From this perspective, it becomes clear that smart products or operations are defined by software and driven by data.

Role of the Digital Twin

In this context, the digital twin plays a crucial role as the generator of value, with other technologies supporting it. The transformation of data into useful information is where the incremental value of a digital transformation comes from. Internal data is collected from sensors in the Internet of Things (IoT), while external data is sourced from online systems. Both sources of data are processed by the digital twin and data analytics, creating software-defined and data-driven outcomes. This is similar to what big tech companies have been doing with their software products for years, and now we can apply the same principles to physical products and operations.

Key Technologies

The key technologies involved in smart products or operations are the digital twin, IoT, data analytics, extended reality, and blockchain. These are the tools used to implement each digital initiative that contributes to the overall digital transformation.

Digital Twin

The digital twin is the most important technology in digital transformation: every other technology is deployed to support it, and it houses the software structure and the data analytics. Think of the digital twin as a simulation. It consists of a collection of models and applications, along with their respective data structures.

Models are not new. We’ve had models in computer-aided design (CAD), animation, and video games for many years, and they can simulate various things. However, in our case, the digital twin simulates how value is created. It’s worth noting that there are two instances of the model and its data structure: one for the application and another for the analytics.

The digital twin model is dynamic and operates within a closed feedback loop, incorporating real-world data from its own sensors and from external systems. These models are represented statistically because the real world is too complex to capture precisely. Like all statistical models, they begin by sampling cause and effect and become more accurate over time.
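To make the closed feedback loop concrete, here is a minimal sketch (all names and sample values hypothetical) of a twin parameter whose estimate sharpens as cause-and-effect samples arrive from the sensors:

```python
class TwinParameter:
    """Running statistical estimate of one physical parameter.

    The digital twin refines this estimate each time a new
    cause-and-effect sample arrives, so the model grows more
    accurate over time."""

    def __init__(self):
        self.n = 0
        self.mean = 0.0

    def update(self, sample: float) -> float:
        """Fold one new sensor sample into the running estimate."""
        self.n += 1
        self.mean += (sample - self.mean) / self.n
        return self.mean


# Feedback loop: each observed sample nudges the model.
heat_rate = TwinParameter()
for observed in [2.1, 1.9, 2.0, 2.2, 1.8]:
    estimate = heat_rate.update(observed)
```

The running-mean update is the simplest possible statistical model; a production twin would estimate many parameters and their uncertainties, but the shape of the loop is the same.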

Application Component

The application component of the digital twin defines how the product functions, its usability, and its utility. It also provides interfaces for humans and data, orchestrating the flow of data to and from the model. Most importantly, it acts upon the model to generate value. Let’s take a simple example of a baseball video game. The application would be the video game itself, and the models would be the different players in the game. The application effectively serves as a simulation that accesses the models (the synthetic baseball players), and potentially allows humans to play the game through an interface. This is how the digital twin operates within smart products or operations.
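The application/model split in the baseball example can be sketched as follows; the class names, players, and batting averages are purely illustrative:

```python
class PlayerModel:
    """Model: a synthetic baseball player whose parameters would be
    sampled from real-world performance data."""

    def __init__(self, name: str, batting_avg: float):
        self.name = name
        self.batting_avg = batting_avg

    def swing(self, roll: float) -> bool:
        """The model maps a random draw to a simulated outcome."""
        return roll < self.batting_avg


class GameApp:
    """Application: orchestrates the models and exposes an
    interface a human (or another program) can call."""

    def __init__(self, players):
        self.players = players

    def at_bat(self, name: str, roll: float) -> str:
        player = next(p for p in self.players if p.name == name)
        return "hit" if player.swing(roll) else "out"


game = GameApp([PlayerModel("Ruth", 0.342), PlayerModel("Mays", 0.301)])
result = game.at_bat("Ruth", 0.25)  # roll below 0.342, so a hit
```

Note how the application never contains baseball knowledge itself; it only orchestrates the models and mediates the human interface, which is exactly the division of labor inside a digital twin.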

Significantly, once the physical product is represented mathematically, it behaves much like software: it can interact with other software through application programming interfaces (APIs). Models come in two forms. Statistical models are implemented by linking statistical libraries into the application. Learning models, often referred to as machine learning, are built by training on cause-and-effect data collected from the Internet of Things (IoT) and are then executed in real time through machine learning libraries linked to the application.

Dual Purpose of Models

The models within our digital twin serve two purposes. First, offline in the analytics phase, the collected data is used to build the models and fit the parameters of specific statistical or machine learning algorithms. Second, at runtime, the models reside in the application and are executed, a step sometimes called inference. This execution is what generates value. As mentioned earlier, models are dynamic: they continuously cycle through being built, run, improved, and run again.
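A minimal sketch of the two phases, assuming a simple linear model and hypothetical load/temperature data: the parameters are fitted offline, persisted, then loaded and executed (inference) at runtime:

```python
import json

# --- Analytics phase (offline): build the model from collected data ---
xs = [1.0, 2.0, 3.0, 4.0]   # cause (e.g., load on a motor) - invented data
ys = [3.1, 5.0, 7.1, 8.9]   # effect (e.g., temperature rise)

n = len(xs)
mx, my = sum(xs) / n, sum(ys) / n
slope = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
         / sum((x - mx) ** 2 for x in xs))
intercept = my - slope * mx

# Persist the fitted parameters, as a deployment step might.
artifact = json.dumps({"slope": slope, "intercept": intercept})

# --- Runtime phase (inference): the application loads and executes it ---
params = json.loads(artifact)

def predict(x: float) -> float:
    """Inference: execute the model inside the application."""
    return params["slope"] * x + params["intercept"]

estimate = predict(5.0)
```

The same model artifact thus lives twice, once in the analytics pipeline where it is built and once in the application where it is run, matching the two instances noted earlier.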

Application Deployment

The application is written, compiled, and executed on various computing platforms, including the cloud, fog, mist (which is really the embedded system), mobile devices, and desktop platforms. It connects to external systems through APIs, so the application must be linked to external system libraries. The digital twin forms the foundation of virtualization, which is essentially what we do when we transform physical entities into digital form. The digital twin simulates all the data flowing through our smart products or operations.

Internet of Things (IoT)

All successful smart digital transformations rely on data to generate value. The Internet of Things (IoT) plays a crucial role in collecting that data. Therefore, it’s essential to have a well-designed IoT network spanning different layers like mist, fog, and cloud.

Understanding IoT

Think of IoT as an extension of the internet into physical objects. It connects the embedded system in the product (known as the endpoint) to the IoT cloud or platform. IoT acts as the nervous system for all smart products and operations, whether it’s consumer devices like smartwatches and smart appliances, or business machinery, factories, and cities.

Purpose of IoT

The main purpose of IoT is to collect data from sensors, but it can also send data to actuators like servo motors. Sensor data from endpoints is packaged into protocols and transmitted through proprietary and IP networks to the application located somewhere in the network fabric.
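As a sketch of that packaging step, here is one plausible shape for an MQTT-style (topic, payload) pair; the topic scheme and field names are assumptions for illustration, not a standard:

```python
import json
import time

def package_reading(device_id: str, sensor: str, value: float):
    """Package one sensor reading the way an IoT endpoint might
    publish it, e.g., over MQTT: a topic plus a JSON payload."""
    topic = f"plant/{device_id}/{sensor}"
    payload = json.dumps({
        "device": device_id,
        "sensor": sensor,
        "value": value,
        "ts": int(time.time()),   # timestamp for ordering downstream
    })
    return topic, payload

topic, payload = package_reading("pump-07", "vibration", 0.83)
```

A real endpoint would hand this pair to an MQTT or similar client library for transmission; the packaging itself is just structured serialization.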

Variety of Sensors

There are millions of sensors available, and their prices keep decreasing each year. These sensors can measure various aspects of the physical world, such as:

  • Physics (e.g., accelerometers)
  • Chemistry (e.g., CO sensors)
  • Biology (e.g., glucose sensors)

The application also collects external data by interacting with data services, business systems, and other smart products or operations through their software interfaces. Data services can include weather or mapping data, while business systems may involve CRMs, PLMs, or ERP systems to manage data within a company.

Connecting Smart Products

Connecting smart products to other smart products has a transformative impact. The ability to orchestrate these connections creates a network effect that enhances the value creation process. Once the sensor and external data are collected, the digital twin application determines how to process it. Typically, the data is stored in a data lake or fed into models being built in the cloud. The application can then use this data to generate useful information for users or operators. The IoT is implemented using networking hardware and middleware in the mist, fog, and cloud layers.

Data Journey

The data begins its journey within the smart product or operation. The sensors, actuators, and embedded systems (circuit boards) within the smart product or operation are collectively referred to as the mist.

The data is transmitted within the product or operation through an internal network called the operational technology (OT) network, which consists of gateways and switches. From there, the data is passed to the information technology (IT) network, usually via wireless protocols like Bluetooth or Wi-Fi. The IT network, composed of routers and switches, transports the data to the internet, which then leads it to the digital twin running on an application server in the cloud. The cloud is optimized to handle the specific data and protocols required by IoT and is known as the IoT cloud.
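The hops above can be sketched as a chain of hand-offs; the layer names come from the text, while the frame structure and values are illustrative:

```python
def ot_gateway(frame: dict) -> dict:
    """OT network: the gateway stamps the frame and forwards it."""
    return {**frame, "path": frame.get("path", []) + ["ot-gateway"]}

def it_router(frame: dict) -> dict:
    """IT network: routers carry the frame toward the internet."""
    return {**frame, "path": frame["path"] + ["it-router"]}

def iot_cloud(frame: dict) -> dict:
    """IoT cloud: the digital twin's application server receives it."""
    return {**frame, "path": frame["path"] + ["iot-cloud"]}

# A reading born in the mist traverses OT -> IT -> cloud.
reading = {"sensor": "temp", "value": 71.3}
delivered = iot_cloud(it_router(ot_gateway(reading)))
```

Each hand-off in practice also changes protocol (e.g., a fieldbus frame becomes a Wi-Fi packet becomes an MQTT message), but the topology of the journey is as composed here.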

IoT Cloud and Platforms

The IoT cloud can be obtained from major cloud providers. Alternatively, an IoT platform, which serves as specialized middleware integrating the cloud, fog, and sometimes the mist, can be used. The IoT platform is offered as a service and often includes runtime analytics and an application development environment. It’s important to note that successful digital transformation is not primarily about networking. Networking is the infrastructure that supports the value creation process.

Data Analytics and Machine Learning

The main value gained from a digital transformation comes from turning data into useful information. Analytics and machine learning play a crucial role in transforming the data, making them essential for any deployment of smart products or operations.

Role of Data Analytics

Data analytics combines traditional statistical techniques with modern machine learning algorithms. Statistics is familiar territory: fitting a line to data points with linear regression in Excel, for example. Interestingly, linear regression is considered both a statistical technique and a supervised machine learning algorithm; the distinction lies in how it is used. If it's a one-time fit on fixed data, it's statistics. If you're training a model on one dataset to make predictions about another, and continuously improving the model with large datasets, it's machine learning.

Machine Learning Explained

Machine Learning is essentially statistics wrapped in learning software that adjusts its model parameters to improve accuracy. In the example of linear regression, machine learning techniques modify the coefficients to find the best fit. When dealing with a small number of parameters, it’s manageable. However, in complex deep learning networks with thousands or even hundreds of thousands of parameters, machine learning is necessary. Analytics takes the data and transforms it into a model that can be queried by application software to extract useful information.
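A minimal sketch of that learning loop, using gradient descent to adjust the two coefficients of a linear model toward the best fit; the data is invented to lie exactly on y = 2x + 1:

```python
# Learning software: adjust the model's coefficients step by step
# so the fit error shrinks.
xs = [0.0, 1.0, 2.0, 3.0]
ys = [1.0, 3.0, 5.0, 7.0]   # exactly y = 2x + 1

w, b = 0.0, 0.0             # model parameters (slope, intercept)
lr = 0.05                   # learning rate

for _ in range(2000):       # each pass nudges the parameters downhill
    grad_w = sum(2 * (w * x + b - y) * x for x, y in zip(xs, ys)) / len(xs)
    grad_b = sum(2 * (w * x + b - y) for x, y in zip(xs, ys)) / len(xs)
    w -= lr * grad_w
    b -= lr * grad_b
```

With two parameters this is overkill (the closed-form fit is instant), which is exactly the point in the text: gradient-style learning earns its keep when the parameter count grows into the thousands.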

Key Tasks of Data Analytics

Data analytics performs four main tasks: identification, prediction, grouping, and association. Let’s examine each of these tasks individually.

  • Identification: Data analytics identifies things using classification algorithms, such as facial recognition.
  • Prediction: It predicts things using regression and dimensionality reduction algorithms, like forecasting vaccine effectiveness.
  • Grouping: It groups things together using clustering algorithms, such as spam filtering.
  • Association: It associates things together using association algorithms, such as in medical diagnoses.
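As one concrete instance of the grouping task, here is a minimal k-means clustering sketch in plain Python; the "message score" data is invented for illustration:

```python
import random

def kmeans(points, k, iters=20, seed=0):
    """Minimal k-means: group 1-D points into k clusters."""
    rng = random.Random(seed)
    centers = rng.sample(points, k)          # pick initial centers
    for _ in range(iters):
        # Assign each point to its nearest center.
        clusters = [[] for _ in range(k)]
        for p in points:
            i = min(range(k), key=lambda c: abs(p - centers[c]))
            clusters[i].append(p)
        # Move each center to the mean of its cluster.
        centers = [sum(c) / len(c) if c else centers[i]
                   for i, c in enumerate(clusters)]
    return sorted(centers)

# Two obvious groups: "normal" vs "spammy" message scores.
scores = [0.1, 0.2, 0.15, 0.9, 0.95, 0.85]
centers = kmeans(scores, 2)
```

The algorithm is told only how many groups to look for, not what they mean; discovering the structure from unlabeled data is what makes grouping an unsupervised task.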

Types of Machine Learning

Machine learning encompasses different types of learning software categorized as supervised learning, unsupervised learning, and reinforcement learning algorithms. Let’s explore each of these types.

  • Supervised Learning: This type is used for identification tasks, like Amazon Alexa’s speech recognition, and prediction tasks, such as GE’s jet engine predictive maintenance. It’s called supervised learning because labeled data is required. For instance, in speech recognition, each waveform is labeled as a word.
  • Unsupervised Learning: This type is employed for grouping or association tasks, like recommender systems used by Netflix and Amazon, or customer segmentation in large-scale advertising systems like LinkedIn ads.
  • Reinforcement Learning: This type is used for decision-making, like EA Sports game AI and Roomba robot navigation.
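To show what "labeled data" buys us, here is a minimal supervised-learning sketch: a perceptron that adjusts its weights only when a prediction disagrees with a label. The fault-detection framing and all numbers are hypothetical:

```python
# Each training example carries a label: flag a machine reading
# as "faulty" (1) or "ok" (0) from two sensor features.
data = [
    ((0.2, 0.1), 0), ((0.3, 0.2), 0), ((0.1, 0.3), 0),  # labeled ok
    ((0.9, 0.8), 1), ((0.8, 0.9), 1), ((0.7, 0.7), 1),  # labeled faulty
]

w = [0.0, 0.0]
b = 0.0

def predict(x):
    """Linear threshold unit over the two features."""
    return 1 if w[0] * x[0] + w[1] * x[1] + b > 0 else 0

for _ in range(50):                   # perceptron training loop
    for x, label in data:
        error = label - predict(x)    # nonzero only on a mistake
        w[0] += error * x[0]
        w[1] += error * x[1]
        b += error
```

Without the labels there would be no `error` signal to learn from, which is precisely why supervised learning demands labeled data, as in waveform-to-word pairs for speech recognition.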

Implementation of Data Analytics

Data analytics is implemented by data scientists or teams who use data analytics software libraries programmed in languages like R, Python, or MATLAB. These libraries are integrated into the runtime software of smart products or operations. Analytics are utilized within the application or runtime environment to execute models, also known as inference, anywhere in the computing infrastructure, including the edge.

Role of Analytics Outside the Application

Analytics are also employed outside the application, typically in the cloud, to create models and analyze them to examine the past or make predictions about the future. When considering incorporating machine learning into your smart product or operations, think about how it can enhance the perception (seeing, hearing, feeling) to improve identification, prediction, grouping, or association capabilities.

Extended Reality

Sometimes, it’s beneficial to take the information generated by our software analytics and apply it in the physical world. This is where extended reality (XR) comes into play. XR uses sensory devices to present information from smart products or operations in a more useful and actionable way. It goes beyond simply viewing information on a flat screen by extending our ability to see, hear, and touch with helpful information.

Enhancing Senses with Extended Reality (XR)

For instance, XR can provide field service personnel with enhanced senses, improving their ability to see, hear, and sometimes even feel. Most XR deployments focus on extending vision by displaying information in the user’s field of view. However, XR can also extend hearing by playing sounds in the user’s ears and extend touch by applying forces to the user’s hands or body.

Types of Extended Reality

There are different ways to extend reality, ranging from physical world reality to a combination of physical and virtual reality, and finally to virtual world reality. We have assisted reality, augmented reality (AR), mixed reality (MR), holographic reality (HR), and virtual reality (VR). Let’s explore each one.

  • Assisted Reality: Displays 2D virtual information in the 3D physical world. Examples include heads-up displays in cars or smart glasses like Google Glass.
  • Augmented Reality: Displays 2.5D virtual information in the 3D physical world using different 3D coordinate systems. Pokemon Go is a popular example.
  • Mixed Reality: Displays 3D virtual information in the 3D physical world using the same coordinate system. The HoloLens is a great example.
  • Holographic Reality: Displays 3D virtual information in the 3D physical world, but with different coordinate systems. It’s like looking at a 3D screen that’s separate from our own reality.
  • Virtual Reality: Displays completely synthetic 3D virtual information. Oculus systems from Facebook are a well-known example.

Converting Information in XR Systems

In XR systems, information is converted into visual, audio, or haptic formats:

  • Visual Information: Displayed in the user’s field of view and can be in various dimensions (1D, 2D, 3D, or even 4D). Examples include overlaying contextual information on malfunctioning parts, guiding technicians through complex procedures, and training users in rare real-world scenarios.
  • Audio Information: Converted into the audio frequency domain and played in mono, stereo, or spatial sound through the user’s ears. Examples include reading pre-written texts triggered by an application, announcing AI-discovered findings, and translating raw sensor data into sound for specialists to interpret.
  • Haptic Information: Converted into haptic force and applied to the body, usually the hands, using specialized actuators. Examples include amplifying vibrations of a part for specialists to feel, providing navigation feedback for performing tasks accurately, and delivering alerts and vibrations to notify users.
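As a sketch of the audio path, translating raw sensor data into sound can be as simple as mapping the value into an audible frequency range; the ranges chosen here are arbitrary:

```python
def to_tone(value, lo, hi, f_min=220.0, f_max=880.0):
    """Map a raw sensor value into the audio range: the larger the
    reading, the higher the pitch a specialist hears."""
    value = max(lo, min(hi, value))            # clamp to the sensor range
    fraction = (value - lo) / (hi - lo)        # normalize to 0..1
    return f_min + fraction * (f_max - f_min)  # frequency in Hz

# Vibration amplitude 0..1 mapped into a 220-880 Hz tone.
freq = to_tone(0.5, 0.0, 1.0)
```

A real sonification pipeline would then synthesize and spatialize the tone, but the essential conversion is this mapping from the sensor's domain into the audio frequency domain.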

XR Hardware and Application Interaction

XR hardware is worn or held by users of smart products or operations. This hardware includes screens or wearables like visual headsets, earphones, or haptic devices. The smart application controls the hardware by interacting with the sensory components through a specific API, enabling the transfer of information to the sensory device for rendering. Extended reality enhances how people use their senses to interpret and apply the information generated by smart products or operations.

Extended Reality (XR) enhances the way we see, hear, and touch information from smart products or operations. It includes various types like assisted reality, augmented reality, mixed reality, holographic reality, and virtual reality. XR systems convert information into visual, audio, or haptic formats and display it through hardware like visual headsets, earphones, or haptic devices.


Blockchain

Blockchain is a technology that establishes trust between parties using algorithms. It records a history of transactions or events in a secure and unchangeable way. It can also execute smart contracts, which are software programs that run when specific conditions are met.

Understanding Blockchain

Bitcoin is a well-known example of blockchain, but it uses a public and open form of blockchain that is not suitable for business purposes. Instead, businesses use permissioned blockchain, which eliminates the need for mining and its associated high costs and environmental impact. An example of a business using blockchain is Bumble Bee Seafood, which uses it to provide transparency in their tuna supply chain.

Benefits of Blockchain

The main benefit of blockchain is the ability to trust the data shared between parties without relying on a central authority. This has significant technical implications because it enables trusted transactions and contracts between machines. It also has important business implications, as it reduces the need for intermediaries and can save time and money. For example, wiring money between banks can be expensive and time-consuming due to the involvement of intermediaries, but blockchain can streamline this process.

Real-World Examples

There are various real-world examples of smart contracts using blockchain. For instance, vending machines can use blockchain to autonomously order parts when they detect a malfunction. Lockheed Martin uses blockchain to verify the origins and history of electronic components to combat counterfeiting and improve supply chain management.

Blockchain Process

Using a blockchain involves a few steps: parties initiate a transaction or event; the network verifies it; the verified data is grouped into a block; each block is hashed and linked to the hash of the previous block; and the updated ledger is distributed to every computer on the blockchain network, ensuring trust and consensus among the participants.

Security and Immutability

Blockchain offers trust through immutability. Any change in the data recorded in the chain would result in a change in the signature. Additionally, the distributed nature of blockchain makes it highly secure. Any attempt to forge the signatures would require control over a majority of the network’s computers, which is not practical.
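The hash-chaining and tamper detection described above can be sketched in a few lines. This toy ledger, with invented supply-chain events, is not a real blockchain (no network, no consensus), just the immutability mechanism:

```python
import hashlib
import json

def block_hash(block: dict) -> str:
    """Hash the block's contents, including the previous block's hash."""
    canonical = json.dumps(block, sort_keys=True).encode()
    return hashlib.sha256(canonical).hexdigest()

def add_block(chain: list, data: dict) -> None:
    prev = chain[-1]["hash"] if chain else "0" * 64
    block = {"data": data, "prev": prev}
    block["hash"] = block_hash({"data": data, "prev": prev})
    chain.append(block)

def verify(chain: list) -> bool:
    """Recompute every hash; any edit to past data breaks the chain."""
    prev = "0" * 64
    for block in chain:
        if block["prev"] != prev:
            return False
        if block_hash({"data": block["data"], "prev": block["prev"]}) != block["hash"]:
            return False
        prev = block["hash"]
    return True

chain = []
add_block(chain, {"event": "tuna caught", "vessel": "A-17"})
add_block(chain, {"event": "tuna canned", "plant": "P-03"})
ok_before = verify(chain)               # True: untouched history
chain[0]["data"]["vessel"] = "B-99"     # tamper with recorded history
ok_after = verify(chain)                # False: the signature no longer matches
```

Because each block's hash covers the previous block's hash, altering any past entry invalidates every signature after it, which is why forging history requires recomputing, and out-voting, the whole network.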

Implementing Permissioned Blockchain

Permissioned blockchain can be implemented by building a private network or using a blockchain-as-a-service platform. However, it’s important to note that blockchain technology is still in its early stages. Businesses should carefully consider whether it aligns with their value proposition before adopting it. Waiting for more mature and scalable solutions may be a wise approach for some.

Blockchain is a technology that provides secure, immutable records of transactions or events. It is beneficial for establishing trust without a central authority and can execute smart contracts. Permissioned blockchains are more suitable for business purposes compared to public blockchains like Bitcoin. Blockchain’s decentralized nature offers significant security and efficiency benefits, although the technology is still developing and may not be suitable for every business at this time.