In projects with constrained hardware resources and the need to minimize chip costs, strict binary code size limits are critical. However, assessing how code changes affect the binary size is challenging, as it is heavily influenced by factors like compiler/linker effects, data structure padding, and a variety of hardware-specific code variants. Relying solely on build-breaking size thresholds provides feedback too late, making reactive binary size reduction time-consuming and costly.
To solve this, we partnered with a customer to develop a proactive, integrated system that combines static analysis and continuous monitoring to deliver precise, early feedback. Our static analysis warns developers while coding about changes that potentially increase the binary size. These analyses detect issues such as duplicated string literals, unnecessary data structure padding, or over-precise data types.
Furthermore, continuous monitoring tracks the binary size for every build. When a build deviates beyond the acceptable threshold, the system provides a granular breakdown across code variants, attributing the growth directly to the specific source code changes that caused it. Initial experiences confirm a significant improvement in transparency, leading to effective adherence to size limits and a quantifiable reduction in overall binary sizes.
In this presentation, I will introduce the fundamentals of binary size analysis, provide concrete recommendations for process integration, and share practical experiences from deploying this approach.