Hello everyone, and welcome to the Sprkl Expert Talk. This time we’ll discuss quality code, specifically around Node.js design patterns and performance. In our expert talks, we host a prominent developer in each episode and explore topics that bring value to the developer community.
I’m Raz, a software engineer at Sprkl Personal Observability, and I’m the person asking those questions.
At Sprkl, we bring observability to developers: we provide distributed tracing with a personalized view of the effects of your code changes as you build software.
Check out Sprkl on the VS Code marketplace. It’s really exciting stuff.
This time we sat with Luciano Mammino, a senior architect at FourTheorem, the author of Node.js Design Patterns (3rd edition) and an active member of the Node.js community, to talk about quality code in Node.js (backend): Performance and design patterns.
Just a heads up: we decided to split the interview into a few posts, so some of you will actually read it 🙂 This article features critical takeaways from the first part of the discussion. We also have the video recording of the entire session, and the other parts of the article are coming out soon. We’ll keep this post updated.
My conversation with Luciano was incredibly inspiring, and I learned a lot from him. I hope he also learned from me and that you guys will get as much value from this interview as I did.
🦁 I’m Luciano Mammino, a senior architect at FourTheorem specializing in cloud computing. Most of the time, we help companies move to the cloud; if they’re already in the cloud, we help them optimize their cloud workflows. We prefer serverless architecture and mostly work with AWS. So this is what I do most of my time at work.
“Serverless architecture helps you focus more on business logic and delivery rather than on operating systems, the latest security patches, and infrastructure provisions.”
🦁 I think we have this vision where, in the future, the cloud will become more of a commodity. We’ll access compute more and more as a commodity, where all the concerns of managing that compute layer will be the cloud provider’s responsibility.
And as developers, we want to focus more on business logic. So serverless now seems to be the best option for that vision. It’s not perfect; there are still many rough edges, but it seems to be the thing that makes you focus more on business logic and delivery rather than thinking about operating systems, the latest security patches, and infrastructure provisions.
We’ve been doing some interesting work, even using serverless outside APIs, which is probably the most common use case today. We even use it for high-performance computing, like doing statistical modeling.
There are a bunch of talks/papers coming out about the work we did, and we’ll be publishing them with AWS.
So you talked earlier about your biggest achievement – your book. So you’re the co-author of the Node.js Design Patterns book. Why Node.js?
So I decided that if I really learned something from this book, I should be able to build a few projects. And one of the projects I created was a CLI application that allowed me to download entire picture galleries of high-quality pics from Flickr, which was not a feature that was available on the website at the time.
And I just wanted to download that big gallery with hundreds of pictures from an event. So I was like, this seems like the use case where I can try to put some of the learnings into practice. So it was working, but I didn’t stop there. I was like, okay, maybe I should make this open source because maybe other people will find it helpful.
I started asking people for feedback to see if things could improve. Maybe there were obvious issues I didn’t see, because I was still inexperienced. And funnily enough, I ended up asking for feedback on a forum where Mario was active and got so much feedback from the people there. So the story is that we stayed in touch, then ended up living in the same city, Dublin, Ireland, just by coincidence, and the rest is history. :)
“Node.js, with its dynamic and async nature, creates the need for new patterns.”
Most of these principles can help make Node.js code better and more maintainable. However, it’s also interesting to acknowledge that Node.js, with its dynamic and async nature, creates the need for new patterns, and the book explores them, showing what problems they solve and how to utilize them.
Finally, I want to mention that the book goes a bit beyond design patterns. It aims to be more of a guide to all the techniques that might help developers ramp up their Node.js game, including more advanced topics.
“The constructor pattern is a kind of a builder pattern that I haven’t seen in any other language. In classical OOP languages, a constructor is a unique method to initialize a newly created object once the memory has been allocated.”
There are interesting things with proxies, which tend to differ slightly from the classic proxy definition, partly because there is a class called Proxy in ES2015 that is not really meant to be used as the classic proxy pattern. Many patterns become more apparent in Node.js because of the dynamic nature of the language and the runtime, for example, being single-threaded and so on.
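To make this concrete, here is a minimal sketch (all names are made up for illustration) of how the ES2015 Proxy class intercepts a fundamental operation like property access, something the classic "same-interface surrogate" proxy pattern doesn’t cover:

```javascript
// Hypothetical example: wrap an object so every property read is recorded.
const reads = [];
const user = { name: 'Luciano', role: 'architect' };
const observedUser = new Proxy(user, {
  get(target, prop, receiver) {
    reads.push(String(prop));           // record which property was accessed
    return Reflect.get(target, prop, receiver);
  },
});

console.log(observedUser.name); // "Luciano" — and reads now contains ["name"]
```

The same trap mechanism also powers lazy initialization, validation, and virtual properties, which is why Proxy-based patterns feel quite different from their classic OOP counterparts.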
“Working 100% in Node.js means more productivity, less context switching, fewer utilities, less code duplication, no rewriting of the same models in different languages, and ultimately fewer problems to deal with.”
And sometimes, you end up rewriting the same models and the same business logic around the models in both places in different languages. And it might not match 100%.
So yeah, there were different kinds of libraries and utilities already available at the time, but I felt that just working 100% Node.js would be better and you could be more productive. So that was my main motivation, I think.
“With functional programming, you end up writing much more declarative code. But there are trade-offs; it’s known that with a declarative mindset, you cannot optimize as much as you can optimize imperative code. If you commit 100% to one pattern or the other, then you’re probably going to miss out.”
🦁 Yeah, it’s a good question. I don’t think I side with either one, and I don’t even feel like an expert in either; functional programming, especially, feels like something you’re never an expert in. But at the same time, I think there is value in both approaches. They give you different trade-offs.
So I would be one of those people who say it’s okay to use both in moderation; you’ll probably get the most value possible from both approaches, depending on the type of problem you’re trying to solve.
In the case of functional programming, there are a few things that I like. Generally, I like that you end up writing much more declarative code, which is very useful.
For example, when you’re doing front end, for instance writing React code, you want to describe the final state of your view: which final pieces of the DOM you want to render at a given point in time, based on your actual application state, without saying how to get there. You don’t want to describe all the DOM manipulation needed to go from one state to another in your application. In that case, that kind of mindset is very useful. But there are trade-offs; for instance, it’s known that with a declarative mindset, you cannot optimize as much as you can optimize imperative code.
So most of the time, it works well from a code writing experience and code reading experience. But you are also trading a little bit of performance in most cases.
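The imperative/declarative trade-off can be sketched with a tiny example (the data and names are made up): both versions compute the same result, but the declarative chain allocates an intermediate array, which is the small performance cost mentioned above.

```javascript
const nums = [1, 2, 3, 4, 5];

// Imperative: describe *how* to build the result, step by step.
const doubledEvensImperative = [];
for (const n of nums) {
  if (n % 2 === 0) doubledEvensImperative.push(n * 2);
}

// Declarative: describe *what* the result is; the engine decides how.
// filter() produces an intermediate array before map() runs.
const doubledEvensDeclarative = nums.filter((n) => n % 2 === 0).map((n) => n * 2);

console.log(doubledEvensImperative);  // [4, 8]
console.log(doubledEvensDeclarative); // [4, 8]
```

For most code the readability win outweighs the extra allocation; it only tends to matter in hot paths over large collections.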
Object-oriented programming, on the other hand, is useful, for instance, when you have to manage different classes in a tree structure.
One case that exists in Node.js is the Streams library, where you have different classes for different kinds of streams. For example, you have Readable streams and Transform streams. Transform streams are a specific subtype of Duplex streams, so the object-oriented approach comes in handy there, enabling you to describe all these different variations of objects that are actually quite similar and need to be composed.
So you can reuse all the similar things, but at the same time, you can apply their differences by extending classes and using the template pattern where you extend a base class and replace the methods for which you want to change the behavior.
I think, again, you can probably go a long way by trying to use the best of both worlds without overdoing it. Because I believe if you commit 100% to one pattern or the other, then in some cases you’re probably going to miss out on some of the advantages, just for the sake of being a purist on one option or the other.
“Some people in the community will still prefer using a more functional approach, even when dealing with Streams. Still, Streams have this kind of hierarchy of objects, which made sense in the original implementation to go with a more object-oriented approach.”
🦁 It’s interesting to see the last few changes they (the community) have been making to the Streams library. First, they started adopting more functional programming ideas; for instance, they recently introduced map and filter directly as stream methods.
Also, there are (this has been out for a while, I think a few years now) different ways to instantiate your custom streams without having to create a class. You call a function and provide the method definitions you want, and it abstracts all the complexity of creating your own class and replacing specific methods.
“Streams are complex because it’s a complex problem.”
🦁 I think Streams are complex because it’s a complex problem.
Also, I think the documentation doesn’t give you a good background; instead, it assumes that you already know the background: this is the API; here, use it.
So, if you find some resources that explain all the background around streams, how they work, why they exist, and the good use cases, then all the API makes sense. And then you can prefer either the pure object-oriented approach or these wrappers that are a little more functional and can probably give you more declarative code when you use them.
Watch our friend Erick Wendel explaining everything you should know about Node.js Streams (exclusive Node 18 features included):
“In functional programming – I like using immutables. But at the same time, you are paying with more memory.”
I want to talk a little more about functional programming: you talked about the declarative style, React and state management and how you find them useful. In the backend, the most useful thing for me is immutability.
In Node.js, we mostly don’t use locks because it’s single-threaded, with all these async jobs, so you don’t need to lock anything. But even then, if your data is immutable, there are no race conditions, and you know that if you change something, you get a new instance that doesn’t interfere with other system components. So this is what I find interesting in functional programming: I like using immutables.
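A minimal sketch of that immutable style in Node.js (the object here is made up for illustration): instead of mutating state in place, you freeze it and derive new instances with the changes applied.

```javascript
// Freeze the original so accidental mutation is prevented.
const order = Object.freeze({ id: 42, status: 'pending' });

// Instead of mutating, produce a *new* object with the change applied.
const shipped = { ...order, status: 'shipped' };

console.log(order.status);   // "pending" — the original is untouched
console.log(shipped.status); // "shipped"
```

Note that `Object.freeze` and the spread operator are both shallow; deeply nested state needs a library or careful copying at each level.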
🦁 I agree, and another use case I like talking about is Redux. It forces you, one way or another, not to mutate the previous state every time you generate a new one: you create a clone with just the few changes you want to apply. The nice side effect is that you get an entire history of changes that you can literally replay, because you never lose anything. You never override your changes; you just create a convenient sequence. So, for instance, when you need to debug something, if there is an exception, you can rewind and get the series of actions the user performed and understand the journey from the beginning of the application to that particular exception.
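A tiny Redux-style sketch (the reducer and actions are invented for illustration): each action produces a fresh state object, so keeping every past state gives you a replayable history.

```javascript
// A minimal Redux-style reducer: never mutates, always returns a new state.
function reducer(state, action) {
  switch (action.type) {
    case 'increment':
      return { ...state, count: state.count + 1 };
    default:
      return state;
  }
}

// Because old states are never mutated, we can simply keep them all.
const history = [{ count: 0 }];
for (const action of [{ type: 'increment' }, { type: 'increment' }]) {
  history.push(reducer(history[history.length - 1], action));
}
console.log(history); // [ { count: 0 }, { count: 1 }, { count: 2 } ]
```

Rewinding to any point in the debugging session is then just picking an earlier entry from `history`.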
Although, I think it’s fair to say that you pay for it in terms of memory, because you will be using more and more memory as you allocate more and more objects. So you must be careful when dealing with big objects: every time you make a change, you’re basically creating an entire copy of that object.
There are ways to limit that effect, but at the same time, you are still paying with more memory when you go for an immutable approach.
This reminds me a bit of Event Sourcing. You can also replay history to whichever point you need if you want to review some past state. It’s probably where Redux took its inspiration; they just applied it to the front end.
We’ll stop here for now. I hope you learned something about Node.js design patterns. Stay tuned for our next session, which hopefully includes more interesting questions that you’ll find valuable.
For more insights, follow us on Twitter, or LinkedIn, or ping me if you have any thoughts about our discussion.
Enjoy your reading!