Easy will make you dizzy?
Damian Płaza (@raimeyuu)
ORM - Oh My Model
Imagine that we are dealing with a project that requires writing migration scripts manually when database schema changes.
We try to keep backward compatibility to avoid downtime as much as we can, but sometimes it is not possible.
Each time we want to introduce a change, we need to carefully analyze the impact, understand the current schema and figure out the most appropriate new schema.
We don't want to introduce big changes to minimize the risk of breaking something, so we spend time designing new data models.
Sometimes we would like to get a data model that fits the problem domain, but it would require substantial changes, so we decide to make a trade-off and use an imperfect but pragmatic data model.
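To make that trade-off tangible, here is a minimal sketch of such a hand-written, backward-compatible migration - the table, the column names and the SQLite setup are purely illustrative assumptions, not taken from any real project:

```python
# A hand-rolled migration sketch (illustrative schema), using SQLite from the
# standard library so it stays runnable end to end.
# The point: every step is ours to analyze - we add a nullable column and
# backfill it instead of renaming, so older application versions keep working.
import sqlite3

def migrate(conn: sqlite3.Connection) -> None:
    cur = conn.cursor()
    # Additive change only: the old "name" column stays for backward compatibility.
    cur.execute("ALTER TABLE customers ADD COLUMN full_name TEXT")
    cur.execute("UPDATE customers SET full_name = name WHERE full_name IS NULL")
    conn.commit()

if __name__ == "__main__":
    conn = sqlite3.connect("sales.db")
    # Tiny schema so the sketch can be run as-is (the migration itself is one-shot).
    conn.execute("CREATE TABLE IF NOT EXISTS customers (id INTEGER PRIMARY KEY, name TEXT)")
    migrate(conn)
```

Writing this by hand is slow, but every line forces us to answer a design question: what changes, what stays, and what must keep working while both versions are live.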
OMG, no ORM?
Now imagine that we are using an ORM (Object-Relational Mapping) library that automatically generates migration scripts based on the changes in our data model.
Thanks to the ORM, we are delegating migration script generation to the tool.
When we get the data model wrong, we can quickly and easily fix the weak design.
We exchanged the time-consuming, error-prone and manual process of writing migration scripts for a fast, autogenerated one.
We got this flexibility and speed, but do they come for free?
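For contrast, a minimal sketch of the ORM-centric flow - here assuming SQLAlchemy with Alembic as one concrete stack; the model and field names are invented for illustration:

```python
# An ORM-centric sketch (assuming SQLAlchemy + Alembic as one possible stack).
# We only edit the model class; the migration script is produced by the tool,
# e.g. `alembic revision --autogenerate -m "add full_name"`.
from sqlalchemy import Column, Integer, String
from sqlalchemy.orm import declarative_base

Base = declarative_base()

class Customer(Base):
    __tablename__ = "customers"

    id = Column(Integer, primary_key=True)
    name = Column(String(200))
    # One line added here - the impact analysis we used to do while writing
    # the migration by hand now happens only if we remember to review the
    # generated script.
    full_name = Column(String(200), nullable=True)
```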
So you want us to help you model it?
Imagine that we are working on a project that supports a B2C sales process - a multi-tenant, multi-step one.
We used collaborative methods (like Event Storming) to build a shared understanding of this process.
Thankfully we were able to apply Domain-Driven Design heuristics and we ended up with a set of bounded contexts.
We identified proper capabilities that need to be provided in order to satisfy the requirements.
Process and subject matter experts successfully confirmed and validated our understanding by looking at the acceptance tests we wrote together.
It took some time, as many people were involved and various perspectives and experiences collided with each other.
Ah, so you want to model it yourself?
Now imagine that process and subject matter experts got introduced into a low-code platform for visualizing workflows and connecting systems together.
In the blink of an eye, they can create steps, define conditions, wire up data sinks and connect elements together.
Sometimes they struggle with "the platform" and that's when "IT nerds" are called to help them out.
They are able to ship new sales workflows in a matter of days, sometimes even hours.
The organization exchanged the time-consuming, error-prone and manual process of modeling for a fast, automated and visual one.
We got this flexibility and speed, but do they come for free?
What should be the new scenario?
Another context - let's think about a web application that allows users to register new support cases and support agents to respond to them.
The whole team is working on it - we could say it is a cross-functional team, as it includes product engineers, a product designer, a product owner and a product architect.
Turns out that there's a request from the customers buying this software from us - they would like to have another scenario supported, as there's a pattern occurring when handling support cases.
The team gathers, analyzes the request, discusses it and comes up with a solution.
It takes some iterations in which they are able to refine the solution based on the feedback they are getting from pilot customers.
Turns out that the feature is a great success, and since they use feature flags they release it to other customers in the blink of an eye.
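As a side note, such a flag-guarded rollout could look roughly like this - the flag name, the tenants and the handler are made up for illustration:

```python
# A feature-flag sketch (flag name, tenants and handler are illustrative).
# The new scenario ships dark: deployed to everyone, enabled per customer.
ENABLED_FLAGS: dict[str, set[str]] = {
    "support-case-escalation": {"pilot-customer-1", "pilot-customer-2"},
}

def is_enabled(flag: str, tenant_id: str) -> bool:
    return tenant_id in ENABLED_FLAGS.get(flag, set())

def handle_support_case(case: dict, tenant_id: str) -> str:
    if is_enabled("support-case-escalation", tenant_id):
        return "handled with the new scenario"
    return "handled with the existing scenario"
```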
"Create a new scenario based on the codebase and rules I give you"
Next time, the product owner is equipped with a modern tool - "one of those AI assistants that can generate apps in seconds".
The product owner describes the scenario in natural language and the AI assistant generates a new scenario based on the existing codebase and rules.
It works almost like magic - there are some hiccups, but the product owner calls product engineers and product architect to figure out "the technical stuff" so that it all compiles and passes all tests.
Soon, yet another scenario is supported and the product owner is able to focus on the business value and market fit.
The team exchanged the time-consuming, error-prone and manual process of sitting together and discussing the requirements for a fast, automated and AI-powered one.
We got this flexibility and speed, but do they come for free?
You probably missed the point
You might be thinking, dear Reader, that I went nuts and I am advocating that we stop using ORMs, low-code platforms and "AI assistants".
Who doesn't want to write data migration scripts manually?!
But if you reflect for a little while on those three little stories, do you think they share something?
There was a problem which got solved by introducing a tool that "automates" some of the work.
It comes with some new abstractions (here: abstract concepts people need to learn to use the tool and communicate effectively), but it gives us so much power, doesn't it?
Each time we advanced our capabilities by using the tool, we also delegated some part of the work to it.
What about the process of designing?
Did we lose something when those parts got delegated, or should we say, "automated"?
Easy will make you dizzy?
Turns out that we might misunderstand the problems we are dealing with (as we tried to discuss in The ambiguity of problems).
Those problems provide us with a very important signal - a diagnostic one, telling us that we might be doing something that is not so rational at this point.
When new tools are introduced, some of those "signals" are gone.
A classical trade-off - we get something in exchange for something else.
Here, in those stories, easiness of iterating and delivery speed were gained at the cost of the feedback we used to get while designing - whether we were designing data models (database schemas), bounded contexts supporting business processes, or web applications and new scenarios.
It might be that the more design aspects we delegate "to the tools", the less we might actually be able to design.
One might say: "but we are designing on a totally different level now - no one is designing using assembly language, right?".
Definitely - I wouldn't like to write assembly language for a web application either.
I am rather warning us, dear Reader, that using tools might make fools out of us (as in Tools make fools?).
If something is getting easier, we might be losing something - something that might eventually turn out to be beneficial and positive for us, for our surroundings and for the world around us.
The world of "AI"
I can't say I like the label "AI" when it comes to all the tools we are getting, especially when non-technical people are commenting on them.
I still consider them "tools", which makes it possible to use the metaphor of making fools out of ourselves.
Or of getting dizzy when we overdose on "AI fixes everything" fumes.
The industry will transform itself into something different, no doubt about it.
But I am afraid that we might be losing something important while getting better tools - possibly our industry ancestors said something similar when assembly language stopped being everyone's daily tool.
I strongly believe that designing as a skill will be even more important than ever.
Whether we take ORMs, low-code platforms or "AI assistants", we certainly got cheaper iterations and faster delivery - agile software development at its best, one could say.
I want to make it easier to deliver value to the customers by shipping high-quality software.
Easiness of "code production" does not always mean "high-quality software", does it?
Let's recall the first sentence of Craig Larman's great book, Applying UML and Patterns:
Programming is fun, but building high quality software is hard.
As "typing was never the problem", optimizing code generation/production isn't a problem either.
One could easily extrapolate and conclude:
Code (over)production is fun, but building high quality software is hard.
It's not code production that's hard, it's designing that is hard
Often, when people are not familiar with specifying behavior through tests (also known as TDD - Test-Driven Development), they say that writing tests is hard.
It's not that writing tests is hard, it's designing that is hard.
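To make that concrete, here is a small sketch - the rule, the names and the SLA are invented; typing the test takes seconds, while deciding what "overdue" means and which module owns that rule is the actual design work:

```python
# Typing this test is trivial; the hard part happened before typing:
# deciding that "overdue" is a rule of its own, what its inputs are,
# and where it lives. (Names and the SLA rule are invented for illustration.)
from datetime import datetime, timedelta

def is_overdue(opened_at: datetime, now: datetime, sla: timedelta) -> bool:
    return now - opened_at > sla

def test_case_becomes_overdue_after_sla_passes():
    opened = datetime(2024, 1, 1, 9, 0)
    assert is_overdue(opened, opened + timedelta(hours=9), sla=timedelta(hours=8))
    assert not is_overdue(opened, opened + timedelta(hours=7), sla=timedelta(hours=8))
```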
Learn software design - at various levels - from data models, through domain models, to software architecture and user experience - and more!
To my current knowledge, it is an evergreen skill that will never go out of fashion.
And no, I am not talking solely about designing.
I am talking mostly about learning.