FPGA tools for more predictive needs in critical applications

“Find bugs earlier.” Every software developer has heard that mantra. In many ways, SoC and FPGA design has become very similar to software development – but in a few crucial ways, it is very different. Those differences raise a new question we should be asking about uncovering defects: earlier than when?

Structured development methodology was a breakthrough in computing, leading to the idea of structured programming and modular approaches to code. Systems engineers banished “goto” statements and embraced data flow diagrams, defining system inputs and outputs that can in turn be exploded into more detailed low-level functions with their own inputs and outputs. Functions are decoupled so that a change in one does not affect others, and abstracted so that their design details are kept from the rest of the system. They are coded and tested stand-alone, then connected into subsystems, then into the system, and everything usually works as expected.

The results of structured programming are rather spectacular: code is easier to develop, easier to test, more reusable, and more manageable. Initial development time shrank, but the bigger impact came in the lifecycle, where enhancements and maintenance tasks went from huge problems to achievable objectives. Capturing and understanding programming metrics brings predictability to the software development lifecycle (SDLC). Once requirements are determined, teams can accurately size a project, load resources, and estimate a schedule.

Those all sound like great things, especially for teams working on anything labeled “-critical”. Whether the job is mission-, life-, or safety-critical, the approach is similar. Teams in mil/aero, medical, and industrial segments usually seek structured processes and procedures that reduce schedule and risk through planning, visibility, and predictability. Requirements are carefully bounded and traceable, and once frozen, even minute changes come under intense scrutiny.

In contrast, more modern agile methods can produce stunning results where requirements are more dynamic. Defects arrive as a continuous stream, popping up quickly and receiving immediate attention. Agile teams tend to value working software over documentation, and often have difficulty accurately projecting completion dates. Adaptive methods are often just too much entropy for risk-averse teams and customers in the critical realm.

[Figure: Predictive SDLC vs. Adaptive SDLC]

Tension between adaptive and predictive methods shows up prominently in FPGA design. FPGAs are the ultimate in adaptive hardware, taking shape around a user-defined design. The process at first appears structured, with high-level hardware description languages, reusable blocks of IP, and automated synthesis tools. Entropy creeps in as a virtual representation of a design – even one heavily simulated – is translated into physical constructs in the FPGA. Known-good functional blocks can suddenly break down at integration and hardware debug, victims of a variety of rule violations and anomalous behavior.
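As a small, hypothetical illustration of how a block can look correct in isolation yet violate a basic design rule, consider a combinational decoder sketched in Verilog, one of the HDLs such flows rely on. The module and signal names are invented for the example; the point is the incomplete case statement, which a block-level testbench may never exercise but which synthesis implements as an unintended latch, exactly the kind of issue static rule checking is meant to catch.

```verilog
// Hypothetical decoder fragment. A block-level testbench that never drives
// req to 2'b00 will pass, but the missing default branch means 'grant' must
// hold its previous value, so synthesis infers an unintended latch.
module arb_decode (
    input  wire [1:0] req,
    output reg  [1:0] grant
);
    always @* begin
        case (req)
            2'b01: grant = 2'b01;
            2'b10: grant = 2'b10;
            2'b11: grant = 2'b01;  // requester 0 wins on contention
            // no default branch: rule checkers flag this as latch inference
        endcase
    end
endmodule
```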

Without effective automation, finding and remedying those behavioral issues turns into a random manual exercise. The problems worsen incrementally, as new changes and re-synthesis can introduce new, unforeseen defects. The question of “earlier than when” becomes a recurring one, with the schedule clock and expectations for a predictable result restarting as changes are planned and integrated. The job is finding critical bugs right now, prior to beginning each synthesis run, by quickly verifying the entire FPGA design through accurate testing and regression analysis.

What kinds of things crop up in an FPGA? Most vendor-supplied synthesis tools weed out the simple design rule violations, but more sophisticated checks are needed for bus contention, register conflicts, and race conditions. Another class of problems involves clock domains and what happens when logic crosses them, with the potential for metastability. High-performance tools can check for and insert synchronizer constructs and re-timing logic, an area where FPGA design teams often become mired without automation assistance. Re-timing and synchronization are excellent examples of things that can change at each iteration and, left unchecked, can lead to problems.
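To make the synchronizer idea concrete, here is a minimal sketch of the conventional two-flip-flop construct for a single-bit signal entering a new clock domain. The module and signal names are invented for the example, and multi-bit buses generally need handshakes or FIFOs instead, so treat this strictly as the single-bit case.

```verilog
// Minimal two-flop synchronizer sketch for a single-bit clock domain crossing.
// Names (sync_2ff, clk_dst, async_in, sync_out) are illustrative only.
module sync_2ff (
    input  wire clk_dst,   // destination-domain clock
    input  wire rst_n,     // active-low reset in the destination domain
    input  wire async_in,  // signal arriving from another clock domain
    output wire sync_out   // synchronized version, safe to use in clk_dst
);
    // Two back-to-back registers give a metastable first stage time to
    // resolve before the value is consumed by downstream logic.
    reg meta_ff, sync_ff;

    always @(posedge clk_dst or negedge rst_n) begin
        if (!rst_n) begin
            meta_ff <= 1'b0;
            sync_ff <= 1'b0;
        end else begin
            meta_ff <= async_in;  // first stage: may go metastable
            sync_ff <= meta_ff;   // second stage: filtered value
        end
    end

    assign sync_out = sync_ff;
endmodule
```

The value of automated CDC analysis is spotting crossings that bypass a structure like this, and making sure later re-timing or re-synthesis does not quietly undo it at the next iteration.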

Automating critical manual processes in FPGA verification is the entire mission for Blue Pearl Software. A graphical front end for all Blue Pearl tools, driven by a Tcl shell, manages results from advanced analysis engines and presents information visually. Analyze RTL provides static design and rule checking, a CDC tool analyzes clock domain crossings (CDCs) using patent-pending User Grey Cell modeling and other techniques, and an SDC tool handles Synopsys Design Constraints – all before time-consuming synthesis, improving result quality and shortening the overall design cycle with fewer iterations. Integration with industry-standard tools and flows on either Windows or Linux means the FPGA design process is enhanced, not completely altered.

Perhaps just as important as finding more of the actual errors quickly is sorting and filtering them by severity, up to and including not reporting items that do not present an actual problem – often the case in CDC analysis. By managing reporting within the Blue Pearl suite, design teams are steered to issues requiring direct attention and are not bothered by items of limited or no importance. Results are presented in an executive-friendly dashboard and a design cycle manager, improving visibility and aiding teams accustomed to higher levels of predictability.

Some applications are overtly declared -critical, but as I’ve been sharing lately, applications such as the connected car, the IoT, and wearables are also taking on increasingly critical expectations. Accelerating verification is really about finding bugs right now, before declaring an FPGA design ready for synthesis – at every iteration of the design. Predictability is worth an extra step. In future posts, we will drill down into the capabilities of the Blue Pearl Software suite of tools and how they support industry-standard FPGA design flows, including integration services.