Building shared processes around operational data helps improve business operations. However, to reap the full benefits for your organization, you need a solid data sharing policy and infrastructure in place. This will allow you to ensure the security, scalability, and flexibility of your data for many years to come.
In this piece, we share our first-hand experiences in handling operational data (i.e. shared objects) and building shared processes around them.
11 data sharing best practices worth including in your data sharing policy and strategy
Given that operational and analytical data serve different purposes, your data sharing policy should approach each of these types differently. When it comes to operational data, the policy must reflect the specificity and use cases of real-time data shared among collaborating partners (as opposed to just one party). Here are 11 tips that will help you get data sharing right:
1. Don’t give your data to others
“Sharing data” and “giving data” aren’t the same thing. When it comes to sharing operational data, you shouldn’t give it away, i.e., transfer ownership to a third party. Instead, you should only allow others to use your data when needed, while retaining continuous control over its visibility and how it’s used. This can be done by setting the right access rules, which can take the form of predefined permission lists or user-defined expressions. This approach is a bit like the saying “eating your cake and having it, too” – you grant data access to those who need it, all the while retaining control of what happens to it. This is the exact approach to shareable data we use at Trusted Twin. We guarantee the safety of your data, irrespective of the number of partners involved in the shared process.
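To make the distinction concrete, here is a minimal sketch of field-level access rules. All names (`shared_order`, `can_read`, the partner identifiers) are hypothetical and purely illustrative – they do not represent the Trusted Twin API, just the general pattern of a permission list attached to a shared object:

```python
# Illustrative sketch only: a shared object whose owner grants
# per-field read access to named partners via a permission list.

shared_order = {
    "owner": "acme-shop",
    "data": {"order_id": "A-1023", "status": "shipped", "margin": 0.31},
    # Each partner sees only the fields it was explicitly granted.
    "readers": {"fastship-logistics": ["order_id", "status"]},
}

def can_read(obj, partner, field):
    """The owner sees everything; partners see only granted fields."""
    if partner == obj["owner"]:
        return True
    return field in obj["readers"].get(partner, [])

print(can_read(shared_order, "fastship-logistics", "status"))  # True
print(can_read(shared_order, "fastship-logistics", "margin"))  # False
```

The key design point is that the rules live with the owner: revoking or narrowing access is a change to `readers`, not a renegotiation with a party that already holds a copy of the data.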
2. Don’t share everything
When it comes to sharing data, be selective. Don’t share all the data you have in your system. Instead, share only the data that is crucial and relevant to your shared processes. For example, let’s imagine you run an e-commerce store and use a CRM to keep all your client information, including purchase history, delivery dates, invoices, etc. You’re planning to build a shared process with your delivery partner. Instead of giving them access to all the data stored in your CRM, share only the information relevant to this specific shared process, i.e., delivery-related data such as the order and delivery address.
By being selective about the data you share, you’ll be able to maintain better control over it and ensure security and compliance with data protection policies and laws. Operational data sharing is not about creating yet another copy of databases you already have. Its purpose is to aggregate data from many sources and create value by sharing it in a secure and effective way.
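The CRM example above boils down to projecting a narrow, process-specific view out of a wider internal record. A minimal sketch (field names are invented for illustration):

```python
# Illustrative sketch: share only the delivery-relevant fields
# of a full CRM record with the delivery partner.

crm_record = {
    "customer_id": "C-501",
    "purchase_history": ["A-0991", "A-1004"],
    "invoice_total": 149.90,
    "order_id": "A-1023",
    "delivery_address": "12 Oak St, Springfield",
}

# The only fields this shared process actually needs.
DELIVERY_FIELDS = {"order_id", "delivery_address"}

def shared_view(record, allowed_fields):
    """Project a record down to the fields relevant to the shared process."""
    return {k: v for k, v in record.items() if k in allowed_fields}

print(shared_view(crm_record, DELIVERY_FIELDS))
```

Everything outside `DELIVERY_FIELDS` (purchase history, invoices) simply never leaves your domain, which keeps both your exposure and your compliance scope small.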
3. Don’t think from the “structure of your data” point of view
Instead, think from the shared process perspective. Define objects that are used in shared processes and aggregate operational data contributed by all the partners. These objects should be common to every partner – as opposed to the data structures used in each partner’s IT systems, which might vary.
In other words, in this approach, shared objects are the links between collaborating partners. Meanwhile, data structures are unique for each of them, and strongly connected to the database technologies used in each partner’s internal domain IT systems.
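One way to picture this is that each partner writes a small adapter between its own internal structure and the one common shared object. The mappers and field names below are hypothetical, chosen only to show the pattern:

```python
# Illustrative sketch: two partners with different internal schemas
# both map onto a single common shared object.

def shop_to_shared(shop_row):
    """The shop's CRM uses its own column names internally."""
    return {
        "order_id": shop_row["OrderNo"],
        "delivery_address": shop_row["ShipAddr"],
    }

def courier_from_shared(shared):
    """The courier's system uses different names again."""
    return {
        "parcel_ref": shared["order_id"],
        "dest": shared["delivery_address"],
    }

# The shared object is the link; internal structures stay private.
shared = shop_to_shared({"OrderNo": "A-1023", "ShipAddr": "12 Oak St"})
courier_row = courier_from_shared(shared)
```

Neither partner ever needs to know the other’s internal schema or database technology; only the shared object’s shape is agreed on.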
4. Don’t fix the structure of the shared object
Another best practice that you should implement in your data sharing policy is keeping your shared object’s structure open rather than fixed. Most processes tend to change over time. To enable modifications and improvements, you need the ability to enrich your shared objects. You should be able to do that independently of other partners, without harming their part of the shared process. This is only possible if you adopt a dynamic object structure. With a fixed object structure, all potential changes and improvements will be much more difficult and costly to make.
Let’s go back to the CRM example we discussed above. If you decide to enrich your object with more data, such as a guaranteed delivery time, this change won’t negatively affect the process for the remaining participants if you use a dynamic object structure. With a fixed object structure, however, every change might create process disruptions, or at least extra work for every process participant.
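A short sketch of why a dynamic structure makes enrichment safe: a partner that reads the shared object defensively (known keys only, unknown keys ignored) is unaffected when another partner adds a field. The field name `guaranteed_delivery` and the handler are invented for illustration:

```python
# Illustrative sketch: enriching a dynamic (schemaless) shared object
# does not break partners that ignore fields they don't know.

shared_order = {"order_id": "A-1023", "delivery_address": "12 Oak St"}

def courier_handler(obj):
    """The courier reads only the keys it knows about."""
    return (obj["order_id"], obj.get("guaranteed_delivery"))

before = courier_handler(shared_order)

# Another partner enriches the object with a new field...
shared_order["guaranteed_delivery"] = "2024-06-01"

# ...and the courier's existing code keeps working, now seeing more data.
after = courier_handler(shared_order)
```

With a fixed schema, the equivalent change would typically require a coordinated migration across every participant’s integration at once.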
5. Don’t set a fixed integration strategy and technology. Give yourself a choice
As you develop your processes, you might need more data coming from an increasing number of systems. Bear in mind that data sources might change while shared business objects remain the same. You might, for example, need to switch from one data source to another one that requires a different integration technology. A fixed integration strategy could therefore seriously limit your ability to improve processes.
6. Keep your data’s visibility and ownership in mind
It’s possible that at some point, you will need to simultaneously manage more and more bilateral data sharing policies with an increasing number of partners. For this reason, when it comes to sharing data with third parties, you should have an easy way of deciding which data is shared with whom.
This will allow you to monitor and manage access to data flexibly, at any point in time. This is particularly important, given that you also need to ensure that you always align with any data governance policies.
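Managing many bilateral policies becomes tractable when visibility is a single, explicit, per-partner mapping that can be inspected and changed at any time. A minimal sketch (partner names and fields are hypothetical):

```python
# Illustrative sketch: one explicit per-partner visibility map,
# so "who sees what" can be audited and changed at any moment.

visibility = {
    "fastship-logistics": {"order_id", "delivery_address"},
    "billing-bureau": {"order_id", "invoice_total"},
}

def visible_fields(partner):
    """Which fields a given partner is currently allowed to see."""
    return visibility.get(partner, set())

# Revoking a partner's access is one small, auditable change:
visibility.pop("billing-bureau", None)
```

Because the map is the single source of truth, aligning with a data governance policy is a matter of reviewing one structure rather than chasing grants scattered across systems.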
7. Focus on the business goal, not on the infrastructure
Without a doubt, it’s the business objectives and outcomes that matter the most when it comes to building processes around shareable data. You might have to treat setting up the whole IT infrastructure for sharing operational data as a separate project, which will definitely prove time-consuming and costly (not to mention the ongoing infrastructure maintenance). It’s a major investment requiring significant effort, and it isn’t the core part of your business.
It might be simply a distraction not worth pursuing. Instead, consider outsourcing whatever you can to a trusted technology partner that will give you the needed support and ready-to-use infrastructure. Focus your resources and efforts on what really matters for your business.
8. Let all the partners decide on their integration strategy and technology
In terms of sharing data with third parties, don’t force them into using the same integration strategy as you. Always give others the freedom to select what works best for them. Letting all the partners decide independently on their integration methods not only lowers the costs but also saves time. It’s crucial to mention that forcing your partners into using the same integration strategy as you might actually prevent them from participating in the process in the first place.
9. Remember the cost of data sharing
The cost of creating an infrastructure that enables data sharing between multiple partners can be very high, especially when the process involves significant amounts of data. For this reason, it’s important that each partner has a clear understanding of the costs, and that everyone pays for themselves. They should also be made aware that the infrastructure costs might go up as more data is added to the process.
10. Choose scalable technologies
In this regard, ‘scalable’ means two things: allowing for the same levels of availability and responsiveness you had at the beginning, even at scale.
Operational data sharing is about real-time access to up-to-date data, at scale. Assuming the number of objects grows exponentially, you won’t want to see increased response times when accessing data. Even more so, you won’t want to see any data loss disrupting the entire process.
Make sure that you use a platform and infrastructure that you can fully rely on, especially when you plan to grow fast.
11. Operational data sharing is about simultaneously balancing integration and separation
Make integration for new partners easy and smooth, all the while ensuring that it won’t jeopardize the entire ecosystem. The key here is to make sure that the data of each partner is separated, i.e., that their actions can never break the process or negatively interfere with other partners.
To put this into relatable terms, let’s imagine one of the partners falls victim to a DDoS attack or data breach. If such an event is targeted at one of the parties in the process, they should be isolated so that the breach or attack doesn’t affect anyone else.
Balancing between integration and separation is possible with a solution like Trusted Twin. It’s an API-first platform that allows for easy integration of operational, real-time data in the form of shared objects. Trusted Twin acts as a data exchange layer, and addresses the challenge of integrating multiple systems directly with one another. Instead, it works as a trusted point where the data relevant for real-time processes is securely and reliably stored and shared.
Shareable data – using it right in your shared processes
Sharing data between multiple partners provides significant benefits to everyone involved in the shared process. However, to ensure data safety and process efficiency, there are a number of rules you should account for in your data sharing policy, including:
- Being selective about the data you share
- Opting for a dynamic object structure
- Using the right data sharing platform, i.e., one that is scalable and secure.
Most of the best practices we discussed fall under one of the following categories: data governance, access management, security, availability, and scalability. Trusted Twin is a solution built for operational data sharing, and it therefore addresses most of these issues.