We launched the idea of logical twins, the next evolution of digital twins.

Most people I have conversations with find my business because they, too, want a portable, safe-to-use and logical approach to processing structured information for their business. They would rather use logic than code.

The approach they seek needs to be easy and flexible, but sturdy enough for processing information crucial for their business. Some of the people I talk to know programming, but none of them, including me, want to have to build custom software to get their business information challenges solved. It’s just too painful to get software created before the processing logic is understood and known.

The logical twins enable a logical business approach to digital twinning

Starting with software is like putting the cart before the horse. Software ought to be built with an understanding of what it is meant to solve first. Start by getting the business information and its processing structured, and figure out the processing logic so it’s both defined and known.

We are constantly bombarded with solutions that solve for specific needs. But we also know they drive substantial integration costs: spreadsheets, apps, and SaaS solutions that only solve fractional parts of the problem, like almost every other solution out there.

Such solutions thrive on messy data.

Solutions that feed on messy data lock us into a closed and local world, explicitly or implicitly. A world that ties us down. We get tied to the vendor and limited in our actions. At the end of the day, we just want to process our information securely, accurately and without fear of making mistakes. And walk into the beautiful sunset.

But picking a solution just feels overwhelming.

Why do we have to make it such a big decision? It’s because it is… So we resort to building yet another spreadsheet; it’s affordable and known to fit our budget, and we don’t see the hidden costs…

Explainable augmentation is the only way

As a business professional, I have a constant unease about my inability to fully understand the logic of the solutions I will be picking. I have this fear that the delete button is not going to do what it’s expected to do, that it will do something bad. I worry about it because in many systems there is no easy way to go back to a previous version without resorting to backups.

With a simple spreadsheet, I can make a copy, try the thing I want, and I immediately know if it works. It’s clunky but it is understandable and works well. But I also know it is neither what I want nor elegant.

Spreadsheets feel nice because I know exactly what to expect, even if I know damn well that this pesky VLOOKUP never works like I want it to, until I force it to. But the logic of a spreadsheet, albeit fragile, is possible to check and verify manually. I know I can trust my, or my team’s, thinking that was put into the sheet, even if it’s slow and cumbersome.

I imagined there must be a similar, but better way. One with accurate and structured information. And where computers would do the work for me.

There must be a better way because I believe we created computers to augment us humans. The only way to augment me is in a way I trust. In a way that makes it possible to understand every part of that augmentation.

No voodoo computer magic for me: I trust what I can understand.

Most solutions are just painful

For almost every solution I have come to work with, it took me a long time to understand its logic well enough to trust it. Eventually I learned to trust the solutions, or just had to trust the vendor’s assessments that things were correct, and hope for the best.

It usually took me quite some time to break things down into individual pieces of logic, if it was even possible. There was no easy way to verify the logic; the solutions were hidden in plain code. As an enterprise and solution architect I had to dig deep to uncover the theories things were built on. They are built for the current paradigm; I believe we need to build for a new one.

This itch to peek inside probably comes from my childhood, where I used a screwdriver to disassemble both things I was allowed to disassemble and things I probably ought not to have. All I wanted was to understand what was within, and how it worked. I think many of us feel that urge.

The great business professionals I’ve met along the way also wanted to understand how things worked in their domains. And the best ones learned in a way deeper and richer than everyone else, and then built solutions correctly with their understanding. They usually ended up making a ton of money from improving their businesses or creating completely new ones, and instead of blaming humans, they explored how to make things better.

I hear conversations that focus on how great the technology is, and that blame people for making mistakes. They are missing the point. What is important is to explore how technology can be used to augment humans and improve quality, and to see how it can enable skilled and experienced business professionals to produce higher quality work. It’s not primarily about automating work away; the game (in my view) is about augmentation for better results.

In the new algorithmic economy, business professionals will be augmented with both generative AI to ideate and deductive AI to validate ideas and potential solutions. The logical twins, in time, enable both modes, which are required to be effective in the algorithmic economy. We recently heard of the Google DeepMind system that solves International Mathematical Olympiad geometry problems on par with the best humans using a similar approach.

At first, the algorithmic paradigm shift will be slow, and then it will shift fast. Remember that exponential curves measured in whole numbers look indistinguishable from zero for a very long time.

We want technology to augment us, not replace us. This is an important distinction worth remembering.

How to build high quality structured information

What is key is the desire to be effective: to gain the ability to understand the processing logic of the structured information in one’s domain. Without a repeatable process, it’s impossible to improve incrementally. Learning and removing errors only happens by observing patterns, learning from mistakes, and building those learnings into a repeatable, preferably automated, articulation of a process.

Business software solutions are mostly, and unfortunately, built separately from business professionals, treating them as “customers” of the software solution. These practices are changing, but only slowly, into a partnership, or a fully integrated team of builders. I firmly believe that business professionals and domain experts need to be an inherent part of the twinning process, and have tools available to articulate the business information processing as logic.

The big issue driving this separation of concerns is that we have not separated the encoding of business logic (which a business professional can understand) from the software that drives it (which computer wizards understand). A great collaboration would be possible if the logic could be expressed as portable metadata.

Building software with logic as metadata

I believe the key to the logical twinning approach lies in enabling business professionals to store business processing metadata, declarative logic, as separate, individually stored blocks of logic that perform specific and deterministic tasks on structured information, such as a validation, a query or a state change. Every state change, query or validation should be something a business professional can look at and understand individually. Such pieces of logic metadata for queries and state changes should be exposed to software programmers to build solutions from.
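To make this concrete, here is a minimal sketch in Python of what one such individually stored block of logic could look like as readable metadata, with a tiny evaluator applying it deterministically. The rule shape, the field names and the evaluator are illustrative assumptions, not the actual DFRNT or TerminusDB format.

```python
# A hypothetical, minimal illustration of "logic as metadata": the rule is
# plain data that a business professional can read, not code buried in an app.

invoice_must_have_positive_total = {
    "@type": "Validation",
    "name": "InvoiceTotalIsPositive",
    "applies_to": "Invoice",
    "field": "total_amount",
    "condition": "greater_than",
    "value": 0,
}

def validate(document: dict, rule: dict) -> bool:
    """Deterministically evaluate one stored validation block against a document."""
    if document.get("@type") != rule["applies_to"]:
        return True  # the rule does not apply to this kind of document
    field_value = document.get(rule["field"])
    if rule["condition"] == "greater_than":
        return field_value is not None and field_value > rule["value"]
    raise ValueError(f"Unknown condition: {rule['condition']}")

invoice = {"@type": "Invoice", "number": "INV-1001", "total_amount": 1250.0}
print(validate(invoice, invoice_must_have_positive_total))  # True
```

The point of the sketch is that the rule itself is plain data a business professional can read and reason about, while the generic evaluation machinery stays out of their way.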

Storing logic is nothing new, though; a similar concept dates back half a century, with something called stored procedures in relational databases. Unfortunately, they were not made easy to use, readable, portable, or safe for day-to-day use by business professionals, and they lacked forgiving mechanisms like version control and the ability to easily check out a temporary workspace to simulate something.

The data in relational databases is quite hard to work with, as columns and rows of interconnected tables are challenging to design well. The stored data is not structured as the objects business professionals think of, like a sales order or an invoice, and the individual data points are hard for business professionals to reason about and work with.

Stored procedures require software engineers and APIs to protect them, and they have significant challenges when it comes to reasoning, or recursive, logic. Some business professionals learned a reasoning language from the 1970s called Prolog the hard way, and achieved moderate or even significant success with it, but most did not manage to industrialise their efforts, because building databases is very hard.

The key lies in enabling business professionals to store business processing metadata, declarative logic, as individually stored blocks of logic that operate on structured information.

Where did the idea of the logical twin come from?

I’ve helped large global enterprises, some with 150,000 employees and more, with knowledge management, contract management systems, B2B supply chain integration hubs, and long-term enterprise architecture planning for systems with lifecycles of up to half a century (!). These are massive challenges requiring the ability to process large amounts of connected information quickly, and to build structured sets of connected data that do not fit into spreadsheets. And this was a challenge I met over and over again.

Computer programming is something I’ve enjoyed for almost four decades now. Building custom systems while leading projects as a business professional was simply not feasible, though. I learned the hard way that you either drive an initiative or you build the content in it. Not both; doing both wears you out quickly.

What happened, though, is that similar patterns emerged across the initiatives in how I needed to structure information. Two decades later I was still missing the generic toolchain I wanted. I found that something could be built to solve the unique needs of business professionals who need information processing in such environments. It was also possible to use the same methods to enable structured information collaboration amongst business professionals across organisations, which I learned was a real need during COVID, but that’s a story for another day.

The discovery of the logical twins

I realized that logical digital twins, logical twins for short, make structuring business information a much easier process for business professionals who understand structured information and logic but don’t necessarily know how to build software:

  1. Encode and describe your business information (it’s a no-brainer; it’s useful even if you decide not to use logical twins later),
  2. import business data from spreadsheets or create it using the forms,
  3. define your first logical metadata that draws a conclusion you need,
  4. once you have encoded your business logic, invite user experience and behavioural specialists and software engineers to build high performance software applications around your logical interfaces (using standard APIs such as GraphQL, REST and logical APIs),
  5. leverage the information and logic of your logical twin incrementally.

I decided to build the implementation of my logical twins on an open source foundation: a Prolog logic engine reached through guardrails provided by the open source TerminusDB, a semantic knowledge graph solution. For the technically inclined, the core is built on JSON-LD, RDF and a digital-twin-oriented schema meta-model. It was important to keep things open to enable full data portability, and only data objects that adhere to the schema are permitted!
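As a rough illustration of that idea, the sketch below shows a schema expressed as a document and a data object that adheres to it, written as Python dictionaries. The exact syntax of TerminusDB schema documents differs; the class name, the fields and the naive conformance check are assumptions made for the example.

```python
# An illustrative sketch (not the exact DFRNT/TerminusDB syntax) of the idea:
# the schema is itself a document, and only data that adheres to it is accepted.

sales_order_schema = {
    "@type": "Class",
    "@id": "SalesOrder",
    "order_number": "xsd:string",
    "customer": "xsd:string",
    "total_amount": "xsd:decimal",
}

conforming_order = {
    "@type": "SalesOrder",
    "order_number": "SO-2023-001",
    "customer": "Acme Corp",
    "total_amount": 980.50,
}

def adheres_to_schema(doc: dict, schema: dict) -> bool:
    """Naive conformance check: every schema field must be present on the document."""
    fields = [key for key in schema if not key.startswith("@")]
    return doc.get("@type") == schema["@id"] and all(f in doc for f in fields)

print(adheres_to_schema(conforming_order, sales_order_schema))         # True
print(adheres_to_schema({"@type": "SalesOrder"}, sales_order_schema))  # False
```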

Prolog is an accurate foundational AI engine that does not hallucinate. It gives precise results based on declarative logic and has existed for decades. It’s only recently, with modern hardware and software, that it became possible to build large-scale industrial solutions with it as a core part.
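The sketch below illustrates what that kind of precise, declarative deduction means in practice, using plain Python in place of a Prolog engine. The invoice facts and the single rule are invented for the example.

```python
# A tiny illustration of deterministic deduction: facts plus a declarative rule
# always yield the same conclusions, never a guess. Plain Python stands in for
# a Prolog/datalog engine here; the facts are made up for the example.

facts = {
    ("status", "INV-1001", "unpaid"),
    ("due_date_passed", "INV-1001"),
    ("status", "INV-1002", "paid"),
}

def derive_overdue(facts: set) -> set:
    """Rule: an invoice is overdue if it is unpaid AND its due date has passed."""
    derived = set()
    for (_, invoice, value) in {f for f in facts if f[0] == "status"}:
        if value == "unpaid" and ("due_date_passed", invoice) in facts:
            derived.add(("overdue", invoice))
    return derived

print(derive_overdue(facts))  # {('overdue', 'INV-1001')} -- the same answer every run
```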

We believe the right way to leverage portable logical twins is by co-locating logic, schema and data in a single unit of distribution with full version history and workspaces. Enabling businesses to work with data products in this new, effective and logical way is what logical twins are all about.
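Conceptually, such a unit of distribution might be pictured as in the sketch below: schema, data and logic travel together, and a temporary workspace is just an independent copy you can experiment in and discard. The structure and the branch mechanics are a simplification invented for illustration, not DFRNT’s internals.

```python
# A conceptual sketch (not DFRNT's internals) of a "data product" that co-locates
# schema, data and logic in one unit, with branches for safe experimentation.
from copy import deepcopy

data_product = {
    "schema": {"SalesOrder": ["order_number", "customer", "total_amount"]},
    "data": [{"@type": "SalesOrder", "order_number": "SO-1", "total_amount": 500}],
    "logic": [{"name": "TotalIsPositive", "field": "total_amount", "min": 0}],
}

branches = {"main": data_product}

def branch(source: str, name: str) -> None:
    """Create a temporary workspace as an independent copy of an existing branch."""
    branches[name] = deepcopy(branches[source])

# Experiment in a workspace without touching the main line of history.
branch("main", "simulate-price-change")
branches["simulate-price-change"]["data"][0]["total_amount"] = 550

print(branches["main"]["data"][0]["total_amount"])                   # 500, untouched
print(branches["simulate-price-change"]["data"][0]["total_amount"])  # 550
```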

The modern logical twins approach to structured information rests on a few core constructs, and this is what we provide to make it happen:

  • A data modeller (ours is complete and the best one for TerminusDB),
  • The ability to build information structures visually,
  • The ability to store and use logic as metadata within a single portable logical twin, which is unique to our approach,
  • Assurance that every piece of information conforms to its definition,
  • Full version history, roll-back to known good versions, temporary workspaces (“branches”) for experiments and simulations, and synchronized copies for cross-organisational collaboration,
  • Education, learning, services and a community of practitioners.

This system is available today and our first customers have been using the solution since the beginning of 2023. Read more about them on the website!

Business professionals as the data janitors of today

Most business professionals I encounter, myself included, work with presentation-oriented spreadsheet data because their options are limited. Some go as far as calling the work we do “the work of data janitors”. We manually move and fix data, and some of us do it all the time.

The business professionals I talk to believe that augmented business professionals will outperform every data janitor, no PhD required. The reason is that they apply their skills and experience in combination with automation of the dirty work that computers can do faster, better and cheaper through tools like the new logical twins. What is offered is a precise, effective, and accurate logical solution for business professionals.

I believe that with logical twins, you can be the hero of your story. You get a durable and logical approach to tackle messy data, and you are enabled to build solutions that last and that use metadata instead of code to drive the work.

Augment your team and let the computers do the dirty work.

Know that enabling your business professionals to incrementally encode business logic will help you meet specific business needs, such as automatically verifying that rules are fulfilled by the data, making complex recursive queries, or performing transformations of data and simulations using declarative logic (datalog).
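As an example of the recursive kind of question this enables, the sketch below asks which components, directly or indirectly, go into a product, the classic bill-of-materials expansion. Pure Python stands in for datalog here, and the part names are invented.

```python
# A sketch of a recursive query: "which components, directly or indirectly,
# go into product P1?" -- the kind of question datalog-style logic answers well.
# Part names are invented; a real logical twin would express this declaratively.

contains = {  # direct "parent contains child" facts
    ("P1", "Assembly-A"),
    ("Assembly-A", "Bolt-M6"),
    ("Assembly-A", "Bracket"),
    ("Bracket", "Bolt-M6"),
}

def all_components(product: str) -> set:
    """Transitive closure over the 'contains' relation."""
    found, frontier = set(), {product}
    while frontier:
        direct = {child for (parent, child) in contains if parent in frontier}
        frontier = direct - found
        found |= direct
    return found

print(sorted(all_components("P1")))  # ['Assembly-A', 'Bolt-M6', 'Bracket']
```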

Congratulations on reading all the way to here. Sign up below!

Subscribe to receive updates on how to leverage TerminusDB data products, exciting new features in DFRNT and how to build solutions using a data-centric architecture. By providing your email, you agree to our terms of service and privacy policy.


Next Steps

Let's get started!

Getting started with DFRNT is easy. We include everything you need to get started with your first graph data product.

The DFRNT Twinfox hypergraph platform helps you specify, build, collaborate on and share model-based linked data products on your own, and export visualisations, definitions and data.

Get started on your DFRNT journey by signing up through the link below and we'll set you up for a demo and free trial.


