Software is math. Every class is a theorem. The compiler is the proof. And unit tests check our work.
Historical Modeling
I cultivated a technique for system analysis and design based on immutable historical facts. I started thinking about this idea in 2001 while working on a distributed gift card system, which had the unique ability to redeem value without making a network connection. The project struggled with all of the classic data management problems of keeping the balance correct at every local node.
For the next system, I started with the assertion that things that don't change are easy to keep in sync. From there, I discovered and documented a small set of rules governing immutable data. Then I modeled not only this new application, but also every real-world problem that interested me. Through this exercise, I became convinced that immutability applies to a large class of problems. Learn this technique through a series of lessons.
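To make the idea concrete, here is a minimal sketch in TypeScript of a historical model for that gift card domain. The names are my own illustration, not the schema of any of the systems described below; the point is that nothing is ever updated in place, so every node that holds the same facts computes the same balance.

```typescript
// A minimal sketch of historical modeling: every record is an immutable
// fact, identified by its contents and its predecessors. Nothing is ever
// updated or deleted; new knowledge is recorded as new facts.

// The card itself is a fact with no predecessors.
interface GiftCard {
  type: "GiftCard";
  cardNumber: string;
}

// Issuing value and redeeming value are separate facts that point back at
// their predecessors. The current balance is not stored anywhere; it is
// computed from the history.
interface Issue {
  type: "Issue";
  card: GiftCard;
  amount: number;
  issuedAt: string; // ISO timestamp, part of the fact's identity
}

interface Redemption {
  type: "Redemption";
  issue: Issue; // the value being redeemed
  amount: number;
  redeemedAt: string;
}

// Two nodes that have exchanged the same set of facts agree on the
// balance, regardless of the order in which the facts arrived.
function balance(issues: Issue[], redemptions: Redemption[]): number {
  const issued = issues.reduce((sum, i) => sum + i.amount, 0);
  const redeemed = redemptions.reduce((sum, r) => sum + r.amount, 0);
  return issued - redeemed;
}
```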
Open Source
Once I started building systems using the principles of immutability, I saw common problems and patterns of implementation.
APIs needed to exchange not trees but graphs.
Databases needed to support multiple foreign keys.
Queries that include WHERE NOT EXISTS were common, and needed to be fast (see the sketch below).
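To make these concrete, here is a minimal sketch in TypeScript, using hypothetical names rather than anything from the projects described below. Facts that reference several predecessors form a graph rather than a tree, and the recurring question is which facts have no successor of a given type, which a relational store answers with WHERE NOT EXISTS.

```typescript
// A minimal sketch of the recurring shape, with hypothetical names.
// A fact can reference several predecessors (multiple foreign keys in a
// relational store), so the data forms a graph rather than a tree.
interface FactReference {
  type: string;
  hash: string; // content-addressed identity of the fact
}

interface Fact {
  reference: FactReference;
  fields: Record<string, unknown>;
  predecessors: Record<string, FactReference[]>;
}

// The recurring query: facts of one type that have no successor of another
// type. This is the in-memory equivalent of SQL's WHERE NOT EXISTS.
function withoutSuccessor(
  candidates: Fact[],
  allFacts: Fact[],
  successorType: string,
  role: string
): Fact[] {
  return candidates.filter(candidate =>
    !allFacts.some(successor =>
      successor.reference.type === successorType &&
      (successor.predecessors[role] ?? []).some(
        p => p.hash === candidate.reference.hash
      )
    )
  );
}
```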
So I started building abstractions that solved these problems in the general case.
The first was the Peer-to-Peer Game Engine. One of the domains that I modeled most often was that of board games. So I wanted a system that would let you express the rules of a game, and then distribute the moves among a cloud of connected players. Napster and Freenet were popular at the time, and Java Applets were big. So I built PPGE in Java. The one and only game I built with it was Kriegspiel, blind chess.
I recognized that this idea could serve more than games, so I re-branded the project as the Peer-to-Peer Correspondence Engine. Around the same time, I adopted .NET as my platform of choice, so I translated the system into C#. This is how the open source project Correspondence was born.
Correspondence featured a compiler that generated C# code from a model specification. I called this language Factual. It would later become the analysis language that appears in The Art of Immutable Architecture as a quick way to describe a model.
Correspondence was written for a version of .NET that predated asynchronous programming. It targeted Windows Forms and XAML, and it was tightly coupled to my other open source project at the time, Update Controls. As technology moved on, Correspondence could not keep up.
In 2015, I was on a road trip with my family. I had plenty of time in the car to think about the future of Correspondence. I decided to consider what it would look like if it adopted the idioms of JavaScript. When I got home, I started creating Jinaga.
Jinaga
What started as the JavaScript version of Correspondence quickly became much more. Jinaga is an immutable runtime, the operating system for the decision substrate. Jinaga is how individuals will take control of their own data, businesses will connect their internal systems, and organizations will collaborate.
Currently, Jinaga.JS is a data management framework for React. I am building Jinaga.NET to support Blazor and MAUI applications. The two will interconnect using a common network protocol, so a Jinaga.JS backend running in Node will support Jinaga.NET on the frontend.
The next step is to create a hosted service. Application developers will sign up for Jinaga in the cloud. They will configure their models, complete with security rules. Then they can distribute Web and mobile applications that connect to this server.
After that, I'll construct a consumer-focused interface. Individuals will manage their own personal data using skills similar to those required by Excel and SharePoint. They will be in charge of their own data models, instead of relying upon an application developer to build exactly what they need.
With this suite of offerings, Jinaga will make the jump to the enterprise. I'll build a team that can support businesses with training and consulting. And we'll adapt the frameworks and services to meet their needs. The focus will be on helping organizations create immutable decision substrates to integrate their business units.
Eventually, we will add a mechanism for publishing endpoints so that other immutable organizations can interoperate. They will each have their own models. We will automate the creation of secure adapters between organizations. The vision is for standard interfaces to emerge, unlocking a new ecosystem of collaboration.
The Art of Immutable Architecture
This vision is documented in my first book. Much like my own journey, it starts with the technology problems that immutability aims to solve. It teaches the reader how to analyze a problem using Historical Modeling. It shows which assertions they can make about their systems, and which ones they cannot. It gives them the mathematical tools to understand why it works and when it doesn't.
The book then goes on to show how to implement an immutable system on top of the tools that we currently use. You can translate a Historical Model directly into SQL. You can extract messages from the model to publish on a message bus. These are techniques I currently use to build systems for clients at Improving. But these are only a bridge to the larger vision.
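As a rough sketch of what that translation can look like, here is the gift card model from earlier rendered as insert-only tables, with names I chose for illustration rather than the book's exact schema. Each fact type becomes a table, each predecessor becomes a foreign key, and the balance is a query over history rather than a column that gets updated.

```typescript
// A rough sketch of translating a historical model into SQL, with
// illustrative names. Each fact type becomes an insert-only table;
// each predecessor becomes a foreign key.
const schema = `
  CREATE TABLE GiftCard (
    GiftCardId INT IDENTITY PRIMARY KEY,
    CardNumber NVARCHAR(50) NOT NULL UNIQUE
  );

  CREATE TABLE Issue (
    IssueId INT IDENTITY PRIMARY KEY,
    GiftCardId INT NOT NULL REFERENCES GiftCard(GiftCardId),
    Amount DECIMAL(10, 2) NOT NULL,
    IssuedAt DATETIME2 NOT NULL
  );

  CREATE TABLE Redemption (
    RedemptionId INT IDENTITY PRIMARY KEY,
    IssueId INT NOT NULL REFERENCES Issue(IssueId),
    Amount DECIMAL(10, 2) NOT NULL,
    RedeemedAt DATETIME2 NOT NULL
  );
`;

// The balance is computed from history, never updated in place.
const balanceQuery = `
  SELECT
    (SELECT COALESCE(SUM(Amount), 0)
       FROM Issue
      WHERE GiftCardId = @giftCardId)
    -
    (SELECT COALESCE(SUM(r.Amount), 0)
       FROM Redemption r
       JOIN Issue i ON i.IssueId = r.IssueId
      WHERE i.GiftCardId = @giftCardId) AS Balance;
`;
```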
The book ends by showing how all of the requirements of an immutable runtime can be derived mechanically from the model. An application-agnostic database can support any immutable model, even as it evolves over time. Queries are derived from the Factual specifications. Pipelines are inverted to update caches and user interfaces. And the system determines which facts a client is interested in based solely on the projections it displays. These algorithms are implemented in Jinaga. They form the basis for the immutable runtime, and they will help us build a world-wide decision substrate supporting value generation across organizations.
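As a greatly simplified illustration of the inversion idea, with hypothetical names rather than the actual Jinaga implementation: a specification that walks forward from a game to its moves can be turned around, so that when a new move arrives the runtime knows which game's projection to refresh.

```typescript
// A greatly simplified sketch of pipeline inversion, using hypothetical
// names rather than the actual Jinaga implementation.

// Forward direction: given a game, project the moves to display.
interface Game { type: "Game"; gameId: string; }
interface Move { type: "Move"; game: Game; index: number; square: string; }

function movesOfGame(game: Game, allMoves: Move[]): Move[] {
  return allMoves.filter(m => m.game.gameId === game.gameId);
}

// Inverted direction: when a new Move fact arrives, walk the specification
// backwards (Move -> Game) to find which projection is affected, so only
// that cache or view is refreshed.
type Refresh = (game: Game) => void;

function onMoveArrived(move: Move, refresh: Refresh): void {
  refresh(move.game);
}
```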