What’s Event Sourcing (ES) for? Why deviate from the all-encompassing, cosy domain data models for keeping and storing application state? Why jump through all the hoops of ES and risk overengineered code?
The other day I read an article about Event Sourcing by Tobias Richling in issue 7/2019 of the German .NET magazine dotnetpro. I was happy to see the topic discussed - but from the second page of the article onward, my disagreement with the author grew. I disagreed with his approach to explaining Event Sourcing, which he rooted in object state and described as "'just' a pattern for persistence."
Sorry, no, that’s not what Event Sourcing is to me. Quite the contrary!
ES has nothing (!) to do with single objects, or collections of objects, or object graphs.
ES has nothing (!) to do with persistence, either.
ES might be used in conjunction with objects or persistence, but that does not define ES; it’s not mandatory, and it’s not why I’m so excited about ES.
Even Martin Fowler’s article on ES does not capture what makes ES so revolutionary to me. Here’s a telltale excerpt:
The most obvious thing we've gained by using Event Sourcing is that we now have a log of all the changes.
He’s talking about events as representing changes. But changes to what? To an application state, or more specifically to the simple domain model, consisting of a ship object, that he’s using.
But I argue: events are not changes, they are not differences. There does not need to be a reference point for events. If a change happened, the question is: what changed? A change always refers to something that changed.
ES does not require such a reference for any event. A "something" that changes, and whose changes need to be recorded instead of overwriting each other, is not the point of ES.
Events as experiences
ES events are more universal, broader, less specific: ES events are just descriptions of things happening. Whatever that may be. ES events are traces of experiences. The only thing that needs to exist for that is an experiencing system.
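To make that concrete, here is a minimal sketch (in Python, with invented event names purely for illustration) of events as plain records of "things happening". Note that nothing here references an object or describes a before/after diff of some state:

```python
from dataclasses import dataclass, field
from datetime import datetime, timezone

# An event is just a description of something that happened.
# It does not point at an object, and it is not a state diff.
@dataclass(frozen=True)
class Event:
    name: str    # what happened, e.g. "visitor arrived"
    data: dict   # whatever details were perceived along with it
    at: datetime = field(default_factory=lambda: datetime.now(timezone.utc))

# The "experiencing system" simply appends its experiences to a log.
log: list[Event] = []
log.append(Event("visitor arrived", {"who": "Alice"}))
log.append(Event("question asked", {"text": "Is anyone here?"}))
log.append(Event("visitor left", {"who": "Alice"}))

print([e.name for e in log])
# → ['visitor arrived', 'question asked', 'visitor left']
```

The log records experiences as they occurred; whether anything "changed", and what that something might be, is left entirely open.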
Maybe an object happens to change. Fine, use an event to record that. However, to me that is only a special case. And it’s a case which requires an abstraction to exist already: an object.
Where does such an abstraction come from, though?
Does it exist on its own; is it absolute? That, to me, kind of sounds like Platonism.
I don’t believe in that (anymore). Instead I’ve come to value the constructivist approach to "explaining the world". It refutes the idea that "the act of knowing produces a copy of the order of reality." To constructivists the mind is not passively observing an independently existing reality. Rather, each mind constructs (builds) its own model of reality from purely personal experiences.
This is very plausible to me because it explains what I see in the world, e.g. people with very different beliefs, opinions, understanding of the same worldly events, or why cognitive reframing works.
There is no mirror image of the world in people’s heads. There is also no homunculus sitting inside of people’s heads interpreting their perceptions.
And people (or all organisms, for that matter) are doing fine, aren’t they? They are impressively adaptable. They even grow from a single cell to a full-blown being with trillions of cells. What a wonder to behold!
People (or all organisms, for that matter) don’t need fixed or pre-defined models of the world hard-wired into their heads. The more hard-wired any aspect of a worldview (model) is, I’d say, the harder it is for an organism to deal with change. Take cognitive biases as an example; I think they are pretty hard-wired "rules" for how to interpret perceptions. And what suffering they cause!
Anyway, what I wanted to say: a fixed model of the world to me is a recipe for disaster in a changing world. It might be honed for very specific circumstances, it might allow very efficient and effective behavior in certain situations – but it’s the antithesis of adaptability. At least adaptability during an organism’s lifetime. Over generations even a fixed model might change and thus adapt members of a species to a new environment. But that’s little solace for a single member confronted with an environment which its particular fixed model does not fit.
The same is true for software, I think. Software with a single and fixed (data) model is effective and efficient in a very narrow range of situations. Much software development goes into imagining all possible situations an application might encounter (requirements analysis) and then deriving from that a single (data) model to help deal with them (design).
This is really hard stuff! And it does not become easier when considering all the unknown changes in the environment in the near and far future. Such single (data) models thus need to be concrete and efficient for imminent situations, but at the same time be open and flexible for unknown situations some day.
Somehow software teams pull that feat off again and again. Or at least they think so, until their progress slows down or even comes to a halt. Because they realize their (data) models have been patched and tweaked into exhaustion.
So I guess it’s more accurate to say, software teams are trying this single fixed (data) model approach again and again, and shoot themselves in the foot with it again and again. I’ve done so myself very often in the past. And I’ve seen many teams suffering from calcified single (data) models (be they in memory or persistent). Now I’m eager to find a different approach. ES to me is very promising in that regard.
The single model fallacy
Now that I’m thinking about it… I guess I haven’t seen a single project where this kind of fixed (data) model thinking has led to a happy ending. Hence I’d call it the "single model fallacy". It’s a fallacy to believe you can get the single model right for now and at the same time keep it easy to change for the unknown future to come.
To me this kind of software Platonism is dead. And neither O/R mappers for RDBMS, nor document databases, nor graph databases, nor the like are going to change that.
The fundamental flaw in all those meta-models is the single (data) model. Some technologies may make it easier (or harder) to adapt the schema of such a model. But they don’t question the model. A fixed (data) model might be more or less open for change, but still it’s a single model.
Reframing is not part of the picture of single (data) models. Slight and slow change might be ok, but nothing radically new. The fixed (data) model is how the world is perceived. A software with a single (data) model can only serve a world which conforms to this model. And then it has to fear the world changing… A single (data) model cannot possibly embrace change.
And that might have been ok for many decades. Software created in the image of machines, or maybe later software created in the image of bureaucratic organizations. Because the world was changing more slowly than today, or was easier to understand in the first place. Or maybe just because software systems were necessarily simpler/smaller due to resource constraints.
It worked for the structure of software and for the software production process.
Until it did not anymore. At least for the software production process. Enter: Agility.
The core insight of Agility: software cannot be built in one pass. It needs incremental iterative development to be of constant and increasing value in an ever changing environment.
And now’s the time to realize it does not work for the internal structure of software anymore either. We need to rethink the basic paradigm we’re basing software structure on. And I don’t mean „the monolith“. That has already been broken up.
No, the monolithic architecture and its successors, the distributed and then the micro-service architecture, are merely the platform to implement functionality on. They answer the question: how can and should functionality be distributed?
What I mean, though, is the paradigm underlying (!) a software system of any structure, be that a monolith or a network of micro-services.
Moving from monolith to micro-services does not force anyone to question the single, fixed (data) model. But that is what needs to be challenged, I think.
I understand if you flinch at this suggestion. Event Sourcing can never be the default, right? It’s not efficient enough, it causes overhead to build (compared to the traditional approach) etc. etc.
Time for a new paradigm
But I’d like to invite you to clear your mind! Let go of all that you’ve learned as a programmer. Start at day one of your career and imagine that what you get told is very, very different from what you remember.
Imagine you get told that software systems are like organisms: triggered by stimuli and producing responses based on perceptions. And each reaction is a cascade of events modulated by data received from the environment and by past events (experiences).
And these strange virtual organisms observe their environment as well as what’s happening internally. And wherever it appears to be helpful, they build local and temporary abstractions to make their reactions more efficient and effective. But as soon as those abstractions start to hamper them, they amend them or even throw them away. Because any such abstraction is an illusion, since it necessarily is out of sync with the ever-changing facts (eventual consistency) and/or misses now-pertinent data.
Just imagine developer newbies were taught programming in such a radically different way. They would not feel any pain from having their worldview challenged. ES would be their first worldview. And that’s very different from the situation of any software developer steeped in RDBMS and OOP from the beginning.
By the way, I think the same is true for Functional Programming (FP) as opposed to OOP when it’s the first paradigm you learn. And it would be true for asynchronous programming as opposed to synchronous programming as it’s taught first today. And it would be true for test-first development from day one on as opposed to test negligence or test later as it’s commonly taught today.
Your gut reaction "ES cannot possibly be the default!" is - sorry to say - no (!) proof of anything (except that you haven’t been brought up as a programmer with ES as your default worldview).
Sure, I don’t have proof either that ES would really make that much of a radically positive difference if it were the default. But I dare to imagine; I dare to challenge the status quo.
Please understand me right: I’m not against all the cherished paradigms and technologies like OOP or RDBMS or document databases or monolithic software or what not. I don’t want to deny them their usefulness.
However, what I want to get across is: they are optimizations. And as that they are comparatively narrow, inflexible solutions - with all the good and bad attributes of narrow and inflexible.
Platonism influenced Christianity and, I’d say, even current software development. But the world has changed since then. Why not follow constructivism and pluralism not only "spiritually" (as the West, at least, has increasingly done in the past 50 years or so), but also in software development?
I strongly believe the malleability, the flexibility, the suppleness of software will greatly benefit from moving to an "ES first" worldview. It’s a very fundamental shift and not for everyone, sure. But I find it worth entertaining the idea, letting it inspire me, and trying.
Software systems are not about a single (data) model; they should not be "monistic". Instead they should fundamentally embrace pluralism. There is room for more than one model. In fact there is room for any number of models in a software system. As many as are helpful to deal with all the different triggers assailing it.
Models are means to an end. Not more. They are not the purpose of software. Hence it’s not your job as a software developer to devise models. Your job is to endow software with valuable behavior. And that probably will be easier or only possible if software keeps track of its experiences. And experiences are nothing more than encountered/produced events.
There is no model in experience. Models are generated, abstracted, built from experiences. But why stick to just one model?
Set your applications free to construct all sorts of models from their experiences! Events are first, models are second.
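As a sketch of that "events first, models second" idea (with invented event names and models, purely for illustration): the same event log can be folded into as many models as there are questions to answer, and each model can be discarded and rebuilt from the events at any time:

```python
# One shared stream of experiences: plain (name, data) pairs.
events = [
    ("item added",   {"sku": "A", "qty": 2}),
    ("item added",   {"sku": "B", "qty": 1}),
    ("item removed", {"sku": "A", "qty": 1}),
]

def stock_levels(events):
    """Model 1: current quantity per SKU, derived from the events."""
    levels = {}
    for name, data in events:
        sign = 1 if name == "item added" else -1
        levels[data["sku"]] = levels.get(data["sku"], 0) + sign * data["qty"]
    return levels

def activity_count(events):
    """Model 2: how much has happened overall? A different question,
    a different model, built from the very same events."""
    return len(events)

print(stock_levels(events))    # → {'A': 1, 'B': 1}
print(activity_count(events))  # → 3
```

Neither model is "the" model; both are temporary projections. If tomorrow a new question arises, a third fold over the same log answers it, without touching the events or the existing models.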
That’s what Event Sourcing is about, I believe.