Are FPGA Vendor Provided Tools All I Need?

Anyone who has designed with the latest breed of FPGAs, complete with scalar, RF, and DSP engines, your choice of hard and soft processors, and both custom and standard interfaces, understands why most FPGA projects are behind schedule. In fact, according to a recent Wilson Research study, respondents reported that 68% of FPGA designs are delivered behind schedule.

When talking with design teams about this, the typical response boils down to, ‘we are too busy being productive to be more productive’. Meaning, we have a current flow, we use the FPGA vendor provided tools, and we don’t have the time or the people to evaluate new tools and methodologies. That said, with today’s FPGA complexity and the need for high-reliability designs and systems, this can be a costly mistake.

To convince FPGA development teams that change is needed, EDA companies have come up with cute marketing slogans such as “Shift Left” and “Verify as you Code”. The proposed benefit is that the sooner you catch issues, the faster and less costly they are to fix. While this would seem obvious, it typically takes management’s commitment to high-reliability and streamlined design practices to realize the true benefit of adding additional tools into the flow.

While FPGA vendor provided tools are necessary, by themselves they are not sufficient when it comes to streamlining high-reliability FPGA design. So why adopt a third-party lint tool like Visual Verification Suite’s Analyze RTL? To answer this question, we asked Adam Taylor, Founder and Lead Consultant at Adiuvo Engineering & Training Ltd., for the top 10 reasons his team adopted the Visual Verification Suite for their work with the European Space Agency (ESA) aimed at improving the usability of the ESA soft-core IP.

For background, the ESA soft-core IP (ESA IP portfolio) was developed to promote and consolidate the use of standardized functions, protocols and/or architectures such as SpaceWire, CAN, TMTC, and more. Adam and his team have been reviewing the cores to ensure they are clean of syntax, structural, and clock domain crossing issues.

Here is Adam’s response…

  1. Ease of use – No steep learning curve to using and becoming proficient with the tool.
  2. Focuses on issues – Provides filtered reports, path-based schematics, and cross probing to quickly find issues and then assign must-fix or won’t-fix waivers.
  3. Design Enablement – Low ‘noise’ text reports provide significant information on the structure of the design to help optimise if necessary – they also help designers understand legacy designs and pre-existing IP blocks.
  4. Find issues earlier in the design cycle – Enter simulation and synthesis with higher-quality code. The later issues are found, the more costly they are to fix.
  5. Design Scenarios – Ensure the configuration of generics does not introduce any corner cases when developing IP, e.g. a generic value resulting in an overflow that is not caught until much later.
  6. FSM viewer – Ensure no illegal, deadlocked, or unmapped states are in the FSM – simulation requires you to ask the right question to find such a state, or worse, you find it after hours of simulation, which then must be run again.
  7. Design metrics – Tracking of warnings, errors, and ‘Must Fix / Won’t fix’ waivers over time allows assessment of the maturity of the code and the engineering effort to fix it. This results in more accurate program management estimations as to the state of the design.
  8. Design Sign off – You know the required tests were actually run – good for “goods-in inspection” of code as well as for understanding the impact of code changes.
  9. Easy creation of custom packages for company design rules – Automates the design review process by enabling design reviews to be consistent and focus on assigning must/won’t fix waivers.
  10. Built-in safety packages (DO-254), industry standard checks (STARC), and FPGA specific libraries, on your choice of Linux or Windows, to streamline setup and deployment.
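To make points 5 and 6 concrete, here is a minimal, hypothetical VHDL fragment (not taken from the ESA cores) showing the kind of FSM defect a lint tool or FSM viewer flags immediately, but which simulation only reveals if you happen to drive the design into the bad state:

```vhdl
-- Hypothetical three-state-plus-one controller. A lint/FSM check would
-- flag two issues here:
--   1. ERR is a deadlocked state: once entered, there is no exit.
--   2. RECOVER is unreachable: no transition ever targets it.
type state_t is (IDLE, RUN, ERR, RECOVER);
signal state : state_t := IDLE;

process (clk)
begin
  if rising_edge(clk) then
    case state is
      when IDLE    => if start = '1' then state <= RUN; end if;
      when RUN     => if fault = '1' then state <= ERR; end if;
      when ERR     => null;            -- deadlock: no exit transition
      when RECOVER => state <= IDLE;   -- unreachable: nothing enters RECOVER
    end case;
  end if;
end process;
```

In simulation, finding the deadlock requires a testbench that asserts `fault` and then checks for recovery – i.e., asking the right question – whereas structural analysis reports both problems without any stimulus.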

The Visual Verification Suite augments FPGA vendor tools by generating complete timing constraints for false and multicycle paths, and by reporting functional design, FSM, and clock domain crossing issues that can be fixed before simulation, synthesis, and physical implementation. This considerably reduces the number of iterations in the flow. To find out how your team can benefit by verifying as they code, request a demo from the Blue Pearl team.
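For readers less familiar with timing exceptions, the constraints being generated are standard SDC/XDC false-path and multicycle directives like the sketch below; the clock, cell, and pin names are placeholders, and in practice a generator derives the `-from`/`-to` endpoints from analysis of the RTL rather than by hand:

```tcl
# Hypothetical XDC/SDC fragment: the kinds of exceptions a constraint
# generator emits. Clock and pin names are illustrative only.

# Paths between two asynchronous clocks are not timed.
set_false_path -from [get_clocks clk_a] -to [get_clocks clk_b]

# A quasi-static configuration register is given two cycles for setup,
# with the hold requirement moved back accordingly.
set_multicycle_path 2 -setup -from [get_pins cfg_reg[*]/Q] -to [get_pins slow_reg[*]/D]
set_multicycle_path 1 -hold  -from [get_pins cfg_reg[*]/Q] -to [get_pins slow_reg[*]/D]
```

Writing these exceptions by hand is error-prone – a missed or over-broad false path silently hides real timing violations – which is why automated generation from the analyzed design is valuable.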