9 obstacles that might be blocking you from digital transformation


According to Gartner, by 2023, businesses that value and strive for data sharing will outperform those that don’t. At the same time, Gartner predicts that through 2022, fewer than 5% of companies will get their data-sharing programs right, i.e., will be able to locate and correctly identify trusted data.

When it comes to businesses, the ability to deliver a proof of concept in less than a month and achieve ROI in less than six months is key to gaining internal buy-in for digital transformation. Still, there are plenty of potential blockers that can prevent your organization from getting the green light.

In this article, we discuss the nine obstacles that can hinder digital transformation. We also shed light on why working with a trusted vendor (instead of building the process from scratch) gives you the best chance of success.

Two approaches to digital transformation

There are two ways companies can approach digital transformation:

  1. Using analytical data for statistical analysis and optimization purposes
  2. Using operational data to digitalize traditional processes.

In the latter case, it’s crucial to ensure that data used in these processes is up-to-date (i.e., reflects the state of real objects in real time). And this can’t be done without the right data management strategy and infrastructure in place.

Bear in mind that to unlock the full potential of digital transformation, it’s recommended to combine both approaches: digitalize traditional processes so that shared processes run on real-time data and are optimized based on historical data. And this can’t be achieved without secure data governance. However, regardless of your chosen path, most of the pain points discussed below apply.

Common obstacles standing in the way of digital transformation

1. Lack of buy-in

Many businesses recognize the benefits that come with digital transformation, yet despite their best efforts, many projects fail. One of the reasons is the lack of senior management buy-in. To increase the chances of success, and to persuade management that digital transformation is worth pursuing, it’s vital to start small – with one business case. Build a proof of concept in less than a month and give it six months to prove its value, no longer. If it works, expand it to other areas – this way it will be a lot easier to gain management buy-in.


It’s important to mention that the success of digital transformation isn’t limited to reliable technology – it also involves people. If employees don’t see any value in the newly provided solutions and refuse to use them, the entire implementation will go to waste. That’s why it’s crucial to put the right digital strategy in place before starting any digital transformation project. Among other things, it should account for workers’ pain points and abilities, and define the goals the organization is trying to achieve through digital transformation.

Truth be told, only companies that invest in the right people, technology, and processes will be able to become digital-first. Addressing only one or two of these factors simply won’t be enough to go through digital transformation successfully.

2. Inhibited access to data

Many organizations tend to limit access to their data. Unfortunately, by discouraging data sharing, they also nurture existing silos. While this stems from the fear of data breaches, such an approach keeps companies from reaching their full business potential.

As Gartner aptly puts it, the “don’t share data unless” mentality should be replaced with a “must share data unless” one. If you use an infrastructure that guarantees the highest levels of data security and access control, then nothing will stand in the way of embracing this mindset.

This, in turn, will come with plenty of benefits. Your leaders and data team members will be able to refer to the right data whenever the need arises. Thus, they’ll be able to achieve agility, make well-informed decisions, and get the most out of digital transformation.

3. Siloed & dispersed data in legacy systems

Here, we observe a lack of a single source of truth, i.e., comprehensive and trusted information about an object, or about the actions of an employee or member of an organization.

To give you a sense of how legacy systems influence operations, let’s refer to a joint study by scholars at Harvard Business School and the Stockholm School of Economics. They surveyed a number of large corporations that existed long before the big tech revolution of the 1990s, and compared them to companies set up in the 21st century. All of these businesses operate in a few major industries, including finance, healthcare, and manufacturing.

To determine whether any of these organizations ran on legacy servers, the researchers asked whether they relied on at least one third-party maintenance (TPM) provider for servers before 2016.

The results? The study found that organizations relying on legacy servers displayed, on average, 12% poorer data architecture coherence compared to organizations that reported no legacy servers.


So, what can you do to relieve yourself of the dependency on legacy systems? You can turn to Trusted Twin – we provide the means for setting up, storing, and handling business-relevant objects. These are created from data from multiple sources (i.e., different legacy systems or even different organizations).

Trusted Twin isn’t another copy of all your data. We are the place where selected pieces of data from your silos are connected and converted into business-relevant knowledge that fuels your digitized processes (i.e., objects powered by the Digital Twin concept).

4. Data is disconnected

Companies use different legacy systems, which makes it hard to identify the same objects across them – especially when each application maintains its own isolated data pockets. As a result, many businesses cannot gather all the necessary information about a single object: different systems use different identities, which makes matching hard.

Managing different identities (i.e., connecting and aggregating distributed knowledge of a single object) becomes even more vital when cooperating with external partners in a larger ecosystem. For example, if we combine multiple smaller datasets owned by a few partners who work together, we get a larger dataset through synergy, and all partners participating in a shared process can benefit from it. However, this is only possible with correct identity matching.

The issue of disconnected data can be easily tackled with a platform like Trusted Twin. It provides the means for aggregating data from different systems by managing and translating custom identities. Thanks to Trusted Twin, you gain a place where comprehensive, aggregated, business-relevant knowledge about an object is available to your digital processes.
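To make the idea of identity matching concrete, here is a minimal, hypothetical sketch (not Trusted Twin’s actual API): each system knows the same real-world object under a different local identifier, and a translation table maps every system-specific ID to one canonical ID so attributes can be aggregated.

```python
# Hypothetical sketch of identity matching. The systems, IDs, and field
# names below are invented for illustration only.
from collections import defaultdict

# Translation table: (system, local_id) -> canonical object ID
IDENTITY_MAP = {
    ("crm", "CUST-0042"): "obj-1",
    ("erp", "900017"): "obj-1",
    ("iot", "sensor-ab12"): "obj-2",
}

def aggregate(records):
    """Merge records from different systems into per-object knowledge."""
    objects = defaultdict(dict)
    for system, local_id, attributes in records:
        canonical = IDENTITY_MAP.get((system, local_id))
        if canonical is None:
            continue  # unknown identity -- in practice, flag for matching
        objects[canonical].update(attributes)
    return dict(objects)

records = [
    ("crm", "CUST-0042", {"name": "Acme Ltd"}),
    ("erp", "900017", {"open_invoices": 3}),
]
print(aggregate(records))
# {'obj-1': {'name': 'Acme Ltd', 'open_invoices': 3}}
```

The key design point is that partners never need to agree on a shared ID scheme up front – only the translation layer has to know how the identities relate.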

5. Poor data accessibility

There are a few factors that can hinder access to data – the use of a variety of mechanisms, different technologies, and, perhaps most importantly, the lack of operational (real-time) data access.

Accessibility is such an important factor in digital transformation because it enables efficient collaboration between multiple unassociated partners in shared processes. Say there’s an interruption in the data flow or a data exchange delay, and you can’t access up-to-date information. Upon such an event, all of the partners would risk basing their decisions on obsolete data.

Calculating the ROI of investing in a system that guarantees availability is easy: think of the potential costs and repercussions your organization would face if your processes operated on incomplete or invalid data. If you make data accessibility a priority, you won’t have to worry about misleading data, and you’ll also be able to spot and react to real-time events faster.

6. Data that isn’t normalized and standardized

Standardization and normalization are two data processing techniques used in the data transformation process. To leverage your data to its full potential, there needs to be a way to represent it cohesively. Data scientist Clare Fiu is right when she says that “without standardization, it will not only be hard to derive the right insights; there could even be an incorrect output, which can be costly”.


When it comes to operational data, it’s essential to create a business abstraction layer, i.e., one where various sites and systems operate on the same standardized and normalized business objects. The reason why these objects need to follow the same standards/norms is that they will serve as a single source of truth. Thus, they will be vital for the fulfillment of the digitalized process.
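For numeric fields, the two techniques mentioned above have simple textbook definitions. The sketch below is illustrative only – it shows z-score standardization (mean 0, standard deviation 1) and min-max normalization (rescaling into [0, 1]), two common ways to bring values from different systems onto a comparable scale before they feed a shared business object.

```python
# Illustrative implementations of two basic data transformation techniques.

def standardize(values):
    """Z-score standardization: result has mean 0 and std deviation 1."""
    mean = sum(values) / len(values)
    std = (sum((v - mean) ** 2 for v in values) / len(values)) ** 0.5
    return [(v - mean) / std for v in values]

def normalize(values):
    """Min-max normalization: rescale values into the [0, 1] range."""
    lo, hi = min(values), max(values)
    return [(v - lo) / (hi - lo) for v in values]

readings = [10.0, 20.0, 30.0]
print(normalize(readings))    # [0.0, 0.5, 1.0]
print(standardize(readings))  # mean of the result is 0
```

Which technique fits depends on the field: normalization suits bounded scores, while standardization is the usual choice when values from different sources have different spreads.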

7. Data quality differs for data stored in different legacy systems (no single source of truth)

According to a study by Experian, 68% of businesses agree that poor data quality has a negative impact on their digital transformation initiatives. Data quality refers to “the overall utility of a dataset(s) as a function of its ability to be easily processed and analyzed for other uses”. It is subject to change, which means it can also deteriorate, especially when different legacy systems are used and there is no single source of truth.

In the case of operational data, where data is aggregated from different systems in the form of “standardized shared objects” you can implement data verification mechanisms to ensure high data quality. What’s more, working on “shared objects” lets you easily change data sources while keeping the objects unchanged. As a result, the shared process isn’t disturbed, while data quality is significantly improved. 

After determining the structure of the shared object, each partner (or each of the systems) decides how to supply it with the most reliable data. They also have the option to implement procedures for automatic verification of the data with other sources. 
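One way such automatic verification can work is a schema of per-field rules checked before a partner’s data is written into the shared object. The sketch below is a hypothetical example – the field names and rules are assumptions, not part of any real shared-object definition.

```python
# Hypothetical verification step for a shared object. Each field of the
# agreed structure gets a rule; data that breaks a rule is reported
# instead of silently entering the shared process.

SHARED_OBJECT_SCHEMA = {
    "serial_number": lambda v: isinstance(v, str) and len(v) > 0,
    "temperature_c": lambda v: isinstance(v, (int, float)) and -50 <= v <= 150,
}

def verify(payload):
    """Return a list of quality problems; an empty list means the data passes."""
    problems = []
    for field, rule in SHARED_OBJECT_SCHEMA.items():
        if field not in payload:
            problems.append(f"missing field: {field}")
        elif not rule(payload[field]):
            problems.append(f"invalid value for {field}: {payload[field]!r}")
    return problems

print(verify({"serial_number": "SN-1", "temperature_c": 21.5}))  # []
print(verify({"temperature_c": 999}))  # reports two problems
```

Because the checks live next to the shared object rather than inside any one source system, every partner’s data is held to the same quality bar.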

8. Integration mess (need for a full graph of integrations)

Did you know that more than 80% of digital transformation projects fail due to integration problems? This stems primarily from two reasons:

  • the need to manage multiple suppliers, which is time-consuming, as companies have to find the right solutions and then maintain them
  • companies’ inability to handle ever-growing data volumes, sources, and types of data.

By using a platform like Trusted Twin:

  • The business logic layer (operating on standard objects) is separated from the integration layer (supplying these standard objects with data from different systems)
  • The number of integrations is minimized. Suppliers don’t have to integrate with each other in a full graph of integrations; instead, each integrates with a single platform dedicated to this type of activity. They have access to many integration methods and remain independent of one another when choosing them.
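The arithmetic behind the “full graph” problem is worth spelling out: n systems integrating pairwise need n·(n−1)/2 connections, while a hub-and-spoke setup needs only n. This quick sketch shows how fast the gap grows.

```python
# Pairwise (full graph) vs hub-and-spoke integration counts for n systems.

def full_graph_integrations(n):
    """Every system integrates with every other: n choose 2 connections."""
    return n * (n - 1) // 2

def hub_integrations(n):
    """Every system integrates once, with a central platform."""
    return n

for n in (5, 10, 20):
    print(f"{n} systems: {full_graph_integrations(n)} pairwise "
          f"vs {hub_integrations(n)} via a hub")
# 20 systems: 190 pairwise integrations vs 20 via a hub
```

Each new system added to a full graph brings n−1 new integrations to build and maintain; with a hub, it brings exactly one.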

9. Lack of data governance

A proper data governance program makes sure that your data is legible, of the highest quality, relevant, and constantly available. Data that has these characteristics has all the prerequisites needed to bring your business tangible results. 

Yet, McKinsey has found that many organizations follow data governance programs that are simply ineffective or even obsolete. So, it’s not ‘just’ about companies losing out on data-driven opportunities or wasting internal resources. It’s also about staying compliant with local and international legislation that regulates how data can be shared and accessed. Take the EU’s Data Act, for one. The document, published in early 2022, aims to restore the balance in data-sharing contracts between different parties – from users of IoT devices, who want to gain access to the data they generate (currently used exclusively by the device manufacturer), to rebalancing the data-sharing negotiation power between small businesses and organizations that currently have a stronger “bargaining position”.

The EU Data Act is a good example because it partially amends the Database Directive that has been in force since the 1990s. This means that all organizations subject to EU legislation now need to revise their data governance strategy and adapt their infrastructure to align with these latest changes. If you do so internally, you have to monitor changes continuously and apply costly technical refinements.

Luckily, you can avoid this by partnering up with Trusted Twin – we are an easy way to ensure your data-sharing compliance, allowing you to focus entirely on leveraging your data potential.

The best way to approach digital transformation – summary

When it comes to undergoing digital transformation, it’s worth starting with a small business case. Invest a month in the preliminary work. Once you’ve delivered proof of concept, give it another six months to show the return on investment.

Don’t be discouraged by the blockers, such as infrastructure and integration mess. Instead, work with the right partner, who will tackle the technical aspects, allowing you to focus on your business goals.

A platform like Trusted Twin:

  • Makes digital transformation faster, cheaper, and less risky
  • Lets you enter a secure path to digital transformation, even if you don’t have an extensive IT team or experience
  • Takes all issues related to availability, scalability, reliability, and security off your shoulders
  • Allows you to deliver proof of concept in a month, and ROI in six months
  • Stays on top of all the newest data regulations, like the EU’s GDPR and Data Act.

See how we can support your digital transformation process – you can start off with a Free plan and scale as you grow.

Related articles

For more information about how to use the Trusted Twin platform in your application’s architecture or technology stack, please contact hello@trustedtwin.com

Or schedule a video consultation with us through Calendly