New tools, old rules

Attention!

In this article, by rules I mean patterns/guides/observations/heuristics/principles/laws in general.

A day of a typical software engineer

npm install X to get the package we want (and maybe need).

Deploy to the cloud because it's easy, sometimes cheap, and prepares us for massive scale.

Leverage React, or another view library, to craft beautiful user interfaces.

Use a SaaS app for sketching and collaborating in real time, generate a markdown file, and send it to team members.

Add logging and monitoring in a single step, and the entire product gets top-grade observability.

Leverage CQRS to have narrow, vertically sliced behaviors in the system.

Prompt GPT and chat about implementing a feature with highly encapsulated and flexible OO code.

Easy.

Does it sound familiar to you, dear Reader?

A typical horror of a software engineer

Installed 3rd party packages, wrapped in our leaky abstractions, show up everywhere in the codebase.

Applications are fast to deploy but slow to develop - everything is connected to everything else.

User interfaces have strange glitches, as numerous booleans controlling app state are spread across components (a sketch of this follows the list; see related: Bool considered harmful?)

The generated markdown file contains persistence models (aka data models), as it was created with "noun-oriented" thinking - and totally ignores behaviors and transformations. (see related: Concepts, Entities, Data)

Logged information comes in massive amounts and soon we need to parse plain text to capture crucial business facts.

Implemented Commands and Queries are full of collaborators, which makes them bloated and their tests full of fixtures. (see related: I, interface)

Suggested code is full of inheritance, indirection, and layers, and the only hint of modularization is in the class names: "*Module".
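
To make the boolean-state horror above a bit more concrete, here is a minimal sketch (the type and state names are hypothetical) of how a handful of independent booleans allows impossible states, while a single discriminated union keeps the component honest:

```typescript
// A hypothetical fetch state modeled with independent booleans.
// Nothing stops isLoading and hasError from being true at the same time.
type RequestFlags = {
  isLoading: boolean;
  hasError: boolean;
  isEmpty: boolean;
  hasData: boolean;
};

// The same state as a discriminated union: every variant is explicit,
// and impossible combinations simply cannot be represented.
type RequestState<T> =
  | { kind: "loading" }
  | { kind: "failed"; error: string }
  | { kind: "empty" }
  | { kind: "loaded"; data: T };

function render(state: RequestState<string[]>): string {
  switch (state.kind) {
    case "loading":
      return "Spinner";
    case "failed":
      return `Error: ${state.error}`;
    case "empty":
      return "Nothing to show";
    case "loaded":
      return state.data.join(", ");
  }
}
```

Nothing React-specific here - it's the old rule of making illegal states unrepresentable, wearing a modern TypeScript outfit.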

Too fast, too dubious

We could keep enumerating tools, for instance mentioning k8s and other modern "equipment", but that's not the point.

Nowadays, we need to move fast (I wonder whether people weren't saying the same 20 years ago?) and deliver quickly.

We need resilience and reliability.

Apps must be snappy, modern-looking, and highly interactive.

To achieve that, we grab the latest advancements from the IT industry.

Nothing wrong with that - it would be weird (and unprofessional) to argue for using punch cards or building web applications in assembly.

Tools and their functions, the roles they play, shouldn't be neglected in any way.

But such behavior might root us in a "new good, old bad" way of reasoning.

There will always be new, shiny tools

Year by year, human creativity manifests, materializes, and expresses itself through technical evolution.

It's really amazing that one person is able to build a SaaS product alone, run it in the cloud, and utilize serverless - a marvelous experience.

Amazing times to live in.

As an industry, we don't have a very long tradition. New tools are (re)invented periodically; sometimes they get new names, maybe they are faster.

It's a matter of evolution, one might say.

The new glittery tool presented a month ago shows how much better it is than its competitor released half a year ago.

There will always be new, shiny tools.

Each trying to advertise its strengths, power-ups, and so on.

What about the almanacs?

Books that were published 10 years ago, 20 years ago, 30 years ago.

Articles published 10 years ago. Probably using an old-fashioned, not-so-cool version of your favorite language.

What about them?

It turns out they hold knowledge that is hidden from regular sight.

Of course, not every book or article carries "the value", the essential teaching one would wish for.

Among all of them, one can find eternal practices about modularization, encapsulation, decomposition, composition, cohesion, coupling, revealing intentions through abstractions, modeling, and more.

Unfortunately, it seems they got buried under the dust of time.

Compared to tooling, it doesn't seem "sexy" enough to read about modeling, "patterns" for achieving modularization, or heuristics for decomposition. (btw. I wrote a bit about a similar topic in Software Engineering for busy parents)

This might be considered the "art"-like part of software engineering - it might even be considered not so deterministic, far from real engineering.

In contrast, tools are here - they are quite tangible, visible, "touchable".

You can read the docs and you "know it all" - of course, I am not saying there are no nitty-gritty details or fishy solutions.

Old rules never get old, huh?

Somehow "old" guides, observations, and insights remain evergreen.

Why?

"Problems" of decomposition, composition, responsibilities assignment, roles/interface discovery, capabilities mapping, domain modeling, encapsulation, cohesion, coupling will always be there.

Why?

Because they are essential.

They are inherent.

They are built into the problem we are trying to solve. (you might be interested in Essentially bounded, Accidentally unlimited)

There are beautiful and mind-changing books that might be considered "old".

And the fact is - they are old. But the hidden gems of knowledge, subtle sapphires of mental models, created through experience and reflection - it's all there.

The problem of rewrite

I don't want to claim I have a magical orb that lets me see the future.

But all this focus on the tools, on their glitter, might actually distract people coming into the field from what is also important (note I didn't say tools are not important 😉).

In times when microservices grow like mushrooms in the forest after heavy rain, it shouldn't be hard to rewrite a piece of capability delivered by a microservice.

Just stop developing that crappy, poorly written code in C# or Java - set up a new service, use the strangler pattern, and write it in Rust (btw. I really like Rust!).

What are the chances of another rewrite in a few months? Of course, it's not that easy to predict (even with my imaginary magical orb!).

But I believe you get the point.

Being able to quickly tear down a module/microservice/service/command handler/function/put-the-name-of-the-thing-you-want-to-rewrite and build a new one doesn't solve the problem of failing at modularization and the other tenets of high-quality software.

I feel that GPT and other prompting thingies show this beautifully - they work like signal amplifiers.

Give them the correct input signal and you will get what you wanted in return.

Give them a lack-of-knowledge signal and you will get an amplified lack of knowledge.

Learn tools, study rules

At the time of writing, I think there's a subtle difference between learning tools and studying rules.

We learn tools like Azure Functions to leverage serverless computing for solving our business problems, without being too concerned about infrastructure.

Of course, we can learn about various triggers, and so on - still, it's quite a finite job.

What's more, we can even discover new ways of using it!

Contrary to that, studying rules requires time.

A lot of time.

It's something like going back and forth from saying "OO design patterns are utter BS" to "I will decorate this handler with validation behavior".
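
As an illustration of the second attitude, here is a minimal sketch (the handler and command names are hypothetical) of decorating a command handler with validation behavior, so the handler itself stays focused on the business rule:

```typescript
// A hypothetical command and its handler.
interface Handler<TCommand> {
  handle(command: TCommand): Promise<void>;
}

type RegisterUser = { email: string };

class RegisterUserHandler implements Handler<RegisterUser> {
  async handle(command: RegisterUser): Promise<void> {
    // core business behavior only - no validation noise here
    console.log(`registering ${command.email}`);
  }
}

// A decorator that wraps any handler with validation behavior.
class ValidatingHandler<TCommand> implements Handler<TCommand> {
  constructor(
    private readonly inner: Handler<TCommand>,
    private readonly validate: (command: TCommand) => string[],
  ) {}

  async handle(command: TCommand): Promise<void> {
    const errors = this.validate(command);
    if (errors.length > 0) {
      throw new Error(`Invalid command: ${errors.join("; ")}`);
    }
    await this.inner.handle(command);
  }
}

// Composition: the decades-old decorator pattern, applied to a modern "vertical slice".
const handler: Handler<RegisterUser> = new ValidatingHandler(
  new RegisterUserHandler(),
  (cmd) => (cmd.email.includes("@") ? [] : ["email looks invalid"]),
);
```

The tool (language, framework, mediator library) can change; the rule of separating cross-cutting concerns from the core behavior is the part that stays.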

Applying patterns in the wrong contexts makes them anti-patterns.

Noticing that requires reflection, self-awareness, and time.

I think that's why it is so important to go back to the "roots" and study fundamentals (from time to time).

To reflect on them.

To rewire our brains.

To change our understanding of the concepts we grasped so long ago.

What's worth noting is that "the essential" parts give good ground for applying tools.

Learn tools, study rules.