EuroRust - Tackling Memory Leaks
During EuroRust, Ayaz Hafiz’s presentation on memory leaks in Rust applications caught our attention. The talk centred around a real-world scenario where a Rust web application encountered unexpected memory issues, leading to a deeper exploration of memory allocators and their impact on application performance. Rust is known for its memory safety and performance benefits. So, what happened?
The Technical Challenge
The application in question was a multi-threaded async web server using the Tokio runtime. The core issue revolved around how memory allocation was managed in a concurrent environment. In this setup, requests to the server, while being handled asynchronously, could shift between threads due to Tokio’s work-stealing scheduler. This scheduler is designed to optimise task execution across multiple threads by allowing idle threads to take over suspended tasks from busy ones.
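To make the scheduling behaviour concrete, here is a minimal sketch of such a server, assuming the tokio crate with its default multi-threaded, work-stealing runtime (and the "full" feature set). The handler, buffer size, and address are illustrative, not taken from the talk.

```rust
use tokio::net::{TcpListener, TcpStream};

// Hypothetical request handler, for illustration only.
async fn handle(stream: TcpStream) {
    // This buffer is allocated on whichever worker thread polls the task first...
    let mut buf = vec![0u8; 64 * 1024];
    // ...but after this await point the task may be resumed, and the buffer
    // eventually dropped, on a different worker thread, because Tokio's
    // work-stealing scheduler moves suspended tasks between threads.
    let _ = stream.readable().await;
    let _ = stream.try_read(&mut buf);
}

#[tokio::main] // defaults to the multi-threaded, work-stealing runtime
async fn main() -> std::io::Result<()> {
    let listener = TcpListener::bind("127.0.0.1:8080").await?;
    loop {
        let (stream, _) = listener.accept().await?;
        tokio::spawn(handle(stream));
    }
}
```

Allocating on one thread and freeing on another is exactly the pattern that stresses an allocator's cross-thread bookkeeping.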
This behaviour led to a complex scenario:
- Concurrent Task Handling: Async tasks, such as processing web requests, were inherently concurrent. A task might start on one thread but, upon suspension and resumption, could be completed by another thread.
- Memory Allocation Challenges: The application suffered from memory fragmentation. This was driven partly by allocations with highly variable lifetimes and sizes, and partly by allocation and deallocation operations being contended across threads.
- Impact on Performance: The initial choice of memory allocator, mimalloc, was not optimal for this use case. Although mimalloc is known for its performance in multi-threaded applications, it did not deliver the expected benefits here, particularly in an I/O-bound scenario with limited resources.
The Solution and Tips
The resolution was a switch from mimalloc to mallocng. This change improved memory-management efficiency and addressed the fragmentation issue.
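For reference, opting into a specific global allocator in a Rust binary is only a few lines. This is a hedged sketch assuming the mimalloc crate; removing the attribute falls back to the platform's default allocator, which on musl-based targets is mallocng, the allocator the talk moved to. The exact wiring used in the talk's application was not shown.

```rust
// Cargo.toml (assumed): mimalloc = "0.1"
use mimalloc::MiMalloc;

// Route all heap allocations in this binary through mimalloc.
// Deleting these two lines reverts to the system allocator.
#[global_allocator]
static GLOBAL: MiMalloc = MiMalloc;

fn main() {
    let v: Vec<u64> = (0..1_000).collect(); // served by the allocator above
    println!("allocated {} elements", v.len());
}
```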
From this talk, there were several points we liked:
- Thread Management: Minimise sharing of resources between threads where possible. This can reduce contention and improve performance.
- Memory Management: Consider the characteristics of different memory allocators and choose one that aligns with your application’s architecture and requirements.
- Design Considerations: Employ strategies like reducing reference counting (Rc) where possible, separating synchronous and asynchronous threads, and using scoped threads to create scoped memory pools (see the sketch after this list).
Our Perspective
At tenics we know that understanding these nuances of memory management and thread handling is crucial to building good software. Embedded systems often operate under tight resource constraints and high reliability requirements. The insights from this talk highlight the importance of choosing the right tools and strategies, not just in terms of programming languages but also in terms of runtime characteristics and memory allocation strategies.
Rust’s promise of safety and efficiency is particularly appealing in this context. However, as this case study shows, it’s also essential to understand the underlying behaviours of runtime environments and memory allocators to fully leverage Rust’s capabilities in embedded systems.
Looking Ahead
We believe that Rust is a key player in the future of embedded systems. As Rust continues to mature, we anticipate it will play an increasingly significant role in our projects and in the embedded systems community at large. However, as this talk shows, Rust is not a silver bullet: it alleviates a lot of pain in software development, but a deep understanding of the underlying system and the tools available to us is still essential.