
Formal Verification: Theorem proving


How many times in the course of a project have you heard the term Formal Verification? This relatively short article on theorem proving assumes that you know the basic definition of Formal Verification; here we look at what theorem proving is in relation to it.

Theorem proving techniques relate the specification and the implementation as a theorem in logic, to be proven in the context of a proof calculus. The implementation acts as a provider of axioms and assumptions for the proof. Like the math classes we had in school, it involves proving lemmas and theorems by deduction, rewriting, and induction. This is mainly used for verification at a higher level of abstraction, usually algorithmic verification, and it is most successful on complex problems and systems such as distributed algorithms and real-time systems.
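To make that concrete, here is a minimal sketch in the Lean theorem prover (my own illustration, not taken from any of the tools mentioned below). The recursive implementation rev supplies the definitions, and the specification "reversing a list twice gives back the original" is stated and proven as a theorem by induction and rewriting:

  -- Implementation: a recursive list-reversal function.
  def rev : List Nat → List Nat
    | [] => []
    | x :: xs => rev xs ++ [x]

  -- Helper lemma, proven by induction on xs.
  theorem rev_append (xs ys : List Nat) :
      rev (xs ++ ys) = rev ys ++ rev xs := by
    induction xs with
    | nil => simp [rev]
    | cons x xs ih => simp [rev, ih]

  -- Specification, proven as a theorem: rev is its own inverse.
  theorem rev_rev (l : List Nat) : rev (rev l) = l := by
    induction l with
    | nil => rfl
    | cons x xs ih => simp [rev, rev_append, ih]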

The first tools for this method became available in the 1970s. Currently there exists a variety of tools that are more or less automatic, some of them being ACL2, NQTHM, PVS, HOL, Isabelle, STeP...

However, this technique requires strong user expertise and time-intensive guidance of the proof. The usual theorem prover tools are based on first-order predicate logic, higher-order predicate logic, recursive functions, induction, etc. E.g., "The ACL2 logic is a first order logic of total recursive functions providing mathematical induction..."

Brushing up on the basics:
First-order predicate logic:
Predicate logic in which predicates take only individuals as arguments and quantifiers only bind individual variables.
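For instance, in the statement below (a Lean sketch; IsEven is my own toy predicate), the quantifier binds only the individual variable n, never the predicate itself:

  -- First-order: ∀ ranges over individuals (naturals), not predicates.
  def IsEven (n : Nat) : Prop := ∃ k, n = 2 * k

  example : ∀ n : Nat, IsEven n → IsEven (n + 2) :=
    fun n ⟨k, hk⟩ => ⟨k + 1, by omega⟩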

Induction: A powerful technique for proving that a theorem holds for all cases in an infinite set. The proof has two steps, the basis and the induction step. Roughly, in the basis the theorem is proved to hold for the "ancestor" case, and in the induction step it is proved to hold for all "descendant" cases.
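Here is that two-step structure in Lean (this particular lemma already ships with Lean as Nat.zero_add; the primed name is just to avoid a clash):

  theorem zero_add' (n : Nat) : 0 + n = n := by
    induction n with
    | zero => rfl                         -- basis: the "ancestor" case
    | succ n ih => rw [Nat.add_succ, ih]  -- step: the "descendant" cases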

Higher-order predicate logic: Predicate logic in which predicates take other predicates as arguments and quantifiers bind predicate variables. For example, second-order predicates take first-order predicates as arguments. Order-n predicates take order-(n-1) predicates as arguments (n > 1).
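In Lean, this is the difference between quantifying over naturals and quantifying over predicates on naturals (a small sketch):

  -- Second-order: the theorem binds the predicate variables P and Q
  -- themselves, not just the individual n.
  theorem forall_and_left (P Q : Nat → Prop)
      (h : ∀ n, P n ∧ Q n) : ∀ n, P n :=
    fun n => (h n).left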

In summary, theorem proving tools use a sophisticated combination of decision procedures, conditional rewriting, induction and recursion, propositional simplification, and complex heuristics to orchestrate all of that. The tool improves over its lifetime by automatically making use of lemmas proven by the user.
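A taste of those ingredients in Lean (illustrative only; commercial tools package the same ideas behind far more automation):

  -- A decision procedure settles a decidable proposition outright.
  example : 3 * 4 = 12 := by decide

  -- Rewriting: simp rewrites with a database of lemmas, which can
  -- include lemmas previously proven (and registered) by the user.
  example (n : Nat) : (n + 0) * 1 = n := by simp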

Bottomline:
Very powerful, mainly at the algorithmic level. However, you should go get an expert :-)

Motivation: What else can we talk about in verification?


Many of you already know that verification efforts are as important as, or more important than, the design efforts themselves. They cannot be ignored or treated as an afterthought; they are part of the chip design process itself. You may also have known or learnt about the productivity gap, the significance of verification, the very famous Pentium bug, Moore's law, why verification cannot be done by simulation alone, the benefits of formal verification and the complexities that tag along :-), the benefits of model checking, what equivalence checking is and how those algorithms work, etc. So, what part of the verification landscape are you yet to explore or understand? Is there something I can explain or detail? Do you want to contribute? Please comment!

Formal Verification or Equivalence Checking


Design verification must show that the design, expressed at the RTL or structural level, implements the operations described in the data sheet or whatever other specification exists.

Verification at the RTL level can be accomplished by means of simulation, but there is a growing tendency to supplement simulation with formal methods such as model checking. At the structural level, the use of equivalence checking is becoming standard procedure. In this operation the RTL model is compared to a structural model, which may have been synthesized by software or created manually. Equivalence checking can determine whether the two levels of abstraction are equivalent; if they differ, it can identify where they differ and what logic values cause a difference in response.
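As a toy sketch of the idea in Python (real equivalence checkers rely on canonical forms like BDDs or on SAT solvers, never brute-force enumeration; the mux functions are my own example):

  from itertools import product

  def check_equivalence(f, g, num_inputs):
      """Exhaustively compare two combinational functions."""
      for bits in product([0, 1], repeat=num_inputs):
          if f(*bits) != g(*bits):
              return bits   # the logic values that cause a difference
      return None           # equivalent on every input pattern

  # An "RTL-style" 2:1 mux vs. a gate-level view of the same mux.
  def rtl_mux(s, a, b):
      return a if s else b

  def gate_mux(s, a, b):
      return (s & a) | ((1 - s) & b)

  print(check_equivalence(rtl_mux, gate_mux, 3))  # None -> equivalent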

Update: 18th Dec 2008!
Another series of steps in equivalence checking goes beyond what was described above...

RTL to Pre-Synthesis Netlist!
Pre-Synthesis Netlist vs. Post-Synthesis Netlist!

When are DFT and Formal Verification used?


DFT:
  • tests for manufacturing defects like stuck-at "0" or "1" faults.
  • tests that a set of rules was followed during the initial design stage.

Formal verification:

  • Verification of the operation of the design, i.e., to see whether the design follows the spec.
  • gate netlist == RTL (equivalence checking).
  • uses mathematical analysis to check for equivalence.