Three Ds

Marko Djukic
Oct 9, 2021 · 4 min read


With the recent start of my new role, I have been thrust back into the hands-on coding, which I have to admit I haven’t done in a long while.

I haven’t stopped enjoying coding, but it has been a steep learning curve to step back into it after being so removed from the keyboard with years of management. But challenge accepted.

In some other post I’ll cover the actual project, the coding languages, framework choices (look how many there are now!), architectures, UI/UX, etc. There were, however, two bigger challenges I grappled with at the outset, and I thought I’d share the experience:

  1. How do I maximize the coding output I get from my time?
  2. How do I ensure my code is open for others to collaborate on and doesn’t end up a pile of technical sh… debt, known and managed only by me?

Both made me initially concentrate on what I ended up calling the Three Ds, possibly at the expense of generating any meaningful code for a while.

Development

I wanted a standardized, repeatable development process and environment. My ultimate nirvana would be that any developer could fire up their editor and get into our code with the least effort.

We standardized internally on Microsoft VS Code, and I focused on getting the entire development flow defined using devcontainers and Docker, with some additional bootstrapping scripts and tooling. Even the VS Code settings and extensions that help productivity can be baked into the devcontainer. All of this is saved down to the same Git repo, meaning the environment I see as a developer is exactly what any other developer gets.
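As a rough sketch of what that baking-in looks like (service names, paths, and the extension list here are illustrative, not our actual config), a devcontainer definition might be:

```json
// .devcontainer/devcontainer.json — illustrative only
{
  "name": "app-dev",
  "dockerComposeFile": "docker-compose.yml",
  "service": "app",
  "workspaceFolder": "/workspace",
  "customizations": {
    "vscode": {
      "extensions": [
        "ms-python.python",
        "ms-azuretools.vscode-docker"
      ],
      "settings": {
        "editor.formatOnSave": true
      }
    }
  },
  "postCreateCommand": "./scripts/bootstrap.sh"
}
```

Because this file lives in the repo, VS Code offers to reopen the project inside the container, and every developer lands in the same tooling.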

If you haven’t yet seen Development Containers from VS Code, definitely check it out further.

Deployment

The code had to be easy to stand up as a functional app. Locally, this meant running it in the Docker setup that is part of the devcontainer. But it also needed to be viewable by other collaborators in some hosted environment, and eventually pushed to live. So, we needed infrastructure as code and some CI pipelines.

At Ingenii we all had a strong Terraform background, but recently we’ve been impressed with how things are done in Pulumi. After some experimentation we made the decision to fully embrace Pulumi and make our infrastructure as code Pythonesque!

We integrated the Pulumi infrastructure into our project code. This standardizes the recipe for the rollout of the environment: application stack, database, networking, etc. And with everything in a mono-repo, there is an elegance to working only in Python, flipping seamlessly between the app code and the infra code without having to make a mental switch of coding languages.
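To give a flavor of Pythonic infrastructure as code, here is a minimal, hedged sketch of a Pulumi program standing up an Azure web app (resource names, SKUs, and the image path are placeholders, not our actual stack):

```python
# __main__.py — illustrative Pulumi sketch, not our real infrastructure
import pulumi
from pulumi_azure_native import resources, web

rg = resources.ResourceGroup("app-rg")

plan = web.AppServicePlan(
    "app-plan",
    resource_group_name=rg.name,
    kind="Linux",
    reserved=True,  # required for Linux plans
    sku=web.SkuDescriptionArgs(name="B1", tier="Basic"),
)

app = web.WebApp(
    "app",
    resource_group_name=rg.name,
    server_farm_id=plan.id,
    site_config=web.SiteConfigArgs(
        # run the app as a container image (placeholder registry/path)
        linux_fx_version="DOCKER|myregistry.azurecr.io/app:latest",
    ),
)

# surface the hostname so CI can report back the URL
pulumi.export("url", app.default_host_name)
```

The appeal is that this sits next to the application code, in the same language, in the same repo.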

The next part of the puzzle was running GitHub Actions on any repository code changes to:

  • Build new Docker images for the app stack as the code is updated, with all the environment tagging, testing, etc.
  • Use Pulumi to push the infra and app stack to Azure: stand up the app, configure the DNS and networking.
  • Stand up the database and restore any data.
  • For code changes without infra changes, make sure the Azure App Services refresh correctly.
  • Send back the URL (if it is a test/dev Azure-hosted environment).
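Condensed, the workflow has roughly this shape (a hedged sketch: job names, the registry path, and secrets are placeholders, and the real pipeline does much more tagging and testing):

```yaml
# .github/workflows/deploy.yml — illustrative shape only
name: build-and-deploy
on:
  push:
    branches: [main]

jobs:
  build:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v2
      - name: Build and push app image
        run: |
          docker build -t myregistry.azurecr.io/app:${{ github.sha }} .
          docker push myregistry.azurecr.io/app:${{ github.sha }}

  deploy:
    needs: build
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v2
      - uses: pulumi/actions@v3
        with:
          command: up
          stack-name: dev
        env:
          PULUMI_ACCESS_TOKEN: ${{ secrets.PULUMI_ACCESS_TOKEN }}
```

The `pulumi up` step is what makes a plain code change and an infra change go through the same door.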

Database

The final piece was ensuring a seamless database experience between environments, from what’s being run on a laptop doing local development, all the way to production. Some other challenges to consider were:

  • How do we make sure we don’t lose data in production between infrastructure updates?
  • Can we provide dummy data to development environments, rather than distribute around production data?
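On that second question, the rough shape of an answer is a seeding script that generates synthetic rows for development databases instead of copying production data around. A minimal, purely illustrative sketch using only the Python standard library (the table shape and column names are made up):

```python
# seed_dummy.py — illustrative dummy-data generator, stdlib only
import random
import string


def random_email() -> str:
    """Build a plausible but fake email address."""
    user = "".join(random.choices(string.ascii_lowercase, k=8))
    return f"{user}@example.com"


def dummy_users(n: int, seed: int = 42) -> list:
    """Generate n synthetic user rows; seeded so every dev gets the same data."""
    random.seed(seed)
    return [
        {"id": i, "email": random_email(), "active": random.random() < 0.8}
        for i in range(1, n + 1)
    ]


rows = dummy_users(5)
```

Seeding the generator matters: reproducible dummy data means two developers can compare behavior against identical local databases.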

This is still a work in progress. For now, we’ve standardized on Postgres and use Docker Compose locally in VS Code devcontainers to run a separate DB instance for local development. Pulumi then takes care of orchestrating the Azure-hosted Postgres. Still to solve are the dummy data and the protection of production data.
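The local side of that is small. A hedged Compose sketch (service names and credentials are placeholders, never production values):

```yaml
# docker-compose.yml — illustrative local dev stack
version: "3.8"
services:
  app:
    build: .
    depends_on: [db]
    environment:
      DATABASE_URL: postgresql://dev:dev@db:5432/app
  db:
    image: postgres:13
    environment:
      POSTGRES_USER: dev
      POSTGRES_PASSWORD: dev
      POSTGRES_DB: app
    volumes:
      - pgdata:/var/lib/postgresql/data  # persists across container rebuilds
volumes:
  pgdata:
```

The named volume means a devcontainer rebuild doesn’t wipe your local database, which keeps the edit–rebuild loop painless.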

Time to Shine

On the one hand, it felt like a lot of time was lost initially which could have been spent on pure coding. But that could be a bit of a false impression, as I’m not sure I would have been any more efficient the other way: hand-cranking out infrastructure, rebuilding Docker containers manually, etc.

On the other hand, this approach was hugely vindicated last week. A new developer started, his first day on Monday only a couple of days after his interviews and contracts.

So, his Monday morning start gave us very little prep time to put together instructions on how to get into the project and get started. I pulled together a brief one-page email with some bullets, but the key ones were:

  • Install VS Code and Git clone this repo
  • Start coding, the app will be available on this URL …

He started at 8am, we had a 30min run through of the code, and he successfully released a new feature by 9am.

Couldn’t have been happier with the result. The Three Ds worked seamlessly.

And if you’re curious what the rest of the team and I are up to with Data Engineering and Quantum Computing, check out: https://ingenii.dev/
