VLSI Unit-V
Synthesis Process:
Circuit Design Flow:
The VLSI IC circuit design flow is shown in the figure below. The various levels of design
are numbered, and the blocks show processes in the design flow. Specifications come first:
they abstractly describe the functionality, interface, and architecture of the digital IC
circuit to be designed.
RTL description is done using HDLs. This RTL description is simulated to test functionality.
From here onwards we need the help of EDA tools. RTL description is then converted to a
gate-level netlist using logic synthesis tools. A gate-level netlist is a description of the circuit
in terms of gates and connections between them, which are made in such a way that they
meet the timing, power and area specifications.
Finally, a physical layout is made, which will be verified and then sent to fabrication.
The three domains of the Gajski-Kuhn Y-chart lie on radial axes. Each of the domains can
be divided into levels of abstraction, using concentric rings.
• Behavioural domain: which specifies what the system does, i.e. its functionality.
• Structural domain: which specifies how modules are connected together to effect the
prescribed behaviour.
• Physical domain: which specifies the layout used to build the system according to the
structural description.
At the top level (outer ring), we consider the architecture of the chip; at the lower levels
(inner rings), we successively refine the design into finer detailed implementations.
4. Flattening:
Benefits of Using Synthesis (or) Advantages:
1. It forces higher level of abstraction
2. Easy debugging
3. Code portability
4. Designer can guide synthesizer to optimize the design for speed, power or area.
5. Synthesis allows technology independent coding
Simulation:
Types of Simulation:
Design capture tools:
1. HDL Design
2. Schematic Design
3. Layout Design
4. Floor Planning
5. Chip Composition
1.HDL Design:
HDL design can be used for designing integrated circuits like processor or any other kind of
digital logic chip.
2. Schematic Design:
A collection of components may be gathered into a module, for which an icon can be defined.
The icon is a diagram that stands for the collection of components within the module.
Given figure shows typical schematic for a module and its schematic icon.
Primarily, schematic editors are menu-based graphic editors with operations such as:
3. Layout Design:
Layout, too, can be captured via code or interactive graphics editors. Layout editors, like
schematic editors, are based on drawing editors. A layout editor might interface to a
design rule checking (DRC) program to allow interactive checking of DRC errors, and to a
layout extraction program to examine circuit connectivity issues.
4. Floor Planning:
Design Verification Tools:
To verify the functionality of a CMOS chip, a certain set of verification tools is used for
testing against the functional specification.
Given figure shows a conventional flow through a set of design tools to produce a working
CMOS chip from a functional specification.
1. Simulation Tools
2. Timing Verifiers
3. Network Isomorphism
4. Netlist Comparison
5. Layout Extraction
6. Back Annotation
7. Design Rule Verification
8. Pattern Generation
1. Simulation Tools:
Simulators are probably the most often used design tools. A simulator uses mathematical
models to represent the behavior of circuit components. Given specific input signals, the
simulator solves for the signals inside the circuit. Simulators come in a wide variety
depending on the level of accuracy and the simulation speed desired:
Circuit level simulation
• The most detailed and accurate simulation technique is referred to as circuit analysis.
As the name suggests, these simulators operate at the circuit level. Circuit simulators are
used to verify the performance of CMOS circuits, but their results should not be assumed to
predict the performance of fabricated designs exactly.
Logic level simulation
• Logic level simulators provide the ability to simulate larger designs than circuit-level
simulators. Logic simulation is the use of simulation software to predict the behavior
of digital circuits described in hardware description languages. Simulation can be performed
at varying degrees of physical abstraction, such as the transistor level, gate level, register-
transfer level (RTL), electronic system level (ESL), or behavioral level.
2. Timing Verifiers: Timing verifiers determine the longest delay path in a circuit to
optimize performance and to make sure that the clock cycles are correct.
Designers traditionally simulated with unit-delay simulators to verify functionality, then
ran simulators with delays to check for timing problems. The detection of such problems is
pattern dependent. In other words, if the critical timing vector is not exercised, the critical
path will not be found. A timing verifier takes a different approach to temporal verification:
the delays through all paths in a circuit are evaluated in a pattern-independent manner.
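The pattern-independent evaluation above can be sketched as a longest-path search over a gate-level netlist. This is an illustrative sketch, not a real timing verifier; the netlist format and net names are assumptions:

```python
def longest_delay_path(gates, delays):
    """Pattern-independent longest-path search, as a timing verifier performs.

    gates:  maps each gate's output net to the list of its input nets
    delays: maps each gate's output net to that gate's propagation delay
    Nets that are no gate's output are treated as primary inputs (time 0).
    """
    memo = {}

    def arrival(net):
        if net not in gates:          # primary input
            return 0.0
        if net not in memo:
            memo[net] = delays[net] + max(arrival(i) for i in gates[net])
        return memo[net]

    # the critical path delay is the worst arrival time over all gate outputs
    return max(arrival(n) for n in gates)
```

Note that no input vectors are applied at all: every path is considered, so the critical path cannot be missed the way a pattern-dependent simulation can miss it.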
3. Network Isomorphism:
Network isomorphism is used to prove that two networks are equivalent and therefore should
function equivalently. This is often used to ensure that only those circuits requiring detailed
simulation expend expensive compute cycles.
An electrical network may be represented by a graph where the vertices of the graph are
devices such as MOS transistors, bipolar transistors, diodes, resistors, and capacitors. The
arcs are the connections between devices. These are the electrical nodes in the circuit.
Two electrical circuits are equivalent if the graphs representing them are isomorphic.
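A full graph-isomorphism check is expensive, so practical tools compare structural signatures. The sketch below uses a simplified device signature (device type plus the connection degrees of its nodes); equal signature multisets are a necessary condition for isomorphism, not a proof. The netlist format and names are assumptions:

```python
from collections import Counter

def device_signatures(netlist):
    """netlist: list of (device_type, terminal_nets) tuples."""
    # count how many device terminals touch each electrical node
    fanout = Counter(net for _, nets in netlist for net in nets)
    # per-device signature: its type plus the sorted degrees of its nodes
    return Counter(
        (dev, tuple(sorted(fanout[n] for n in nets)))
        for dev, nets in netlist
    )

def probably_isomorphic(a, b):
    """Necessary (not sufficient) test that two circuit graphs match."""
    return device_signatures(a) == device_signatures(b)
```

Because only node degrees are compared, the check is insensitive to net renaming, which is exactly what is needed when comparing a schematic netlist against an extracted layout netlist.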
4. Netlist Comparison:
In the comparison phase, the verification tool compares the electrical circuits from the
schematic netlist and the layout extracted netlist. The netlist comparison process also uses
the LVS (Layout versus Schematic) rule check.
5. Layout Extraction:
6. Back Annotation:
Back annotation is the term that describes the step of feeding layout information back to the
circuit design. Back annotation is the process of adding the extra delay caused by the
parasitic components back into the original timing analysis, which only has the timing from
the cells’ delay.
Gate-level simulation and static timing analysis (STA) are the two most commonly
used approaches in verifying a chip’s timing performance. Both of the methods can verify the
chip’s operating speed against the design specification.
7. Design Rule Verification:
8. Pattern Generation:
Pattern generation is the last step in the sequence that starts with the architecture of a chip
and ends with a database suitable for manufacture. It is the operation of creating the data that
is used for mask making. Nowadays most semiconductor operations use electron-beam-generated
masks. These machines expose the masks in a raster-scan style similar to a television.
A common format is the Electron Beam Exposure System (EBES) format. The following
steps must be completed to create an EBES file.
TEST AND TESTABILITY
Testing:
• Testing is a process of verification: a known input is applied to a unit and the resulting
response is evaluated. In other words, the response from a circuit is compared with a known,
or predictable, response. The testing process is equally applicable at all levels, from
transistors and gates through microcells and chips to printed circuit boards and systems.
• Testing is used not only to find the fault-free devices, PCBs, and systems but also to
improve production yield at the various stages of manufacturing by analyzing the cause of
defects when faults are encountered.
Role of Testing:
• Testing of a system is an experiment in which the system is exercised and its resulting
response is analyzed to ascertain whether it behaved correctly.
• If incorrect behavior is detected, a second goal of a testing experiment may be to diagnose,
or locate, the cause of the misbehaviour.
• The role of testing is to detect whether something went wrong and the role of diagnosis is to
determine exactly what went wrong, and where the process needs to be altered.
• Therefore, correctness and effectiveness of testing is most important for quality products
(another name for perfect products.).
• The benefits of testing are quality and economy.
Testing may be two types
➢ Functionality test
➢ Manufacturing test
Principle of testing
• The response of the circuit is compared with the expected response.
• The circuit is considered good if the responses match. Obviously, the quality of the tested
circuit will depend upon the thoroughness of the test vectors.
• Generation and evaluation of test vectors is one of the important concepts in the testing.
• A testable circuit is defined as a circuit whose internal nodes of interest can be set to 0 or 1,
and in which any change to the desired logic value at the node of interest, due to a fault, can
be observed externally.
• The VLSI development process is illustrated in Fig. 9.2, where it can be seen that some
form of testing is involved at each stage of the process. Based on a customer or project
need, a VLSI device requirement is determined and formulated as a design specification.
Designers are then responsible for synthesizing a circuit that satisfies the design
specification and for verifying the design. Design verification is a predictive analysis that
ensures that the synthesized design will perform the required functions when manufactured.
When a design error is found, modifications to the design are necessary and design
verification must be repeated. As a result, design verification can be considered as a form of
testing.
• Once verified, the VLSI design then goes to fabrication. At the same time, test engineers
develop a test procedure based on the design specification and fault models associated with
the implementation technology. A defect is a flaw or physical imperfection that may lead
to a fault. Due to unavoidable statistical flaws in the materials and masks used to
fabricate ICs, it is impossible for 100% of any particular kind of IC to be defect-free.
• Thus, the first testing performed during the manufacturing process is to test the ICs
fabricated on the wafer in order to determine which devices are defective. The chips that
pass the wafer-level test are extracted and packaged. The packaged devices are retested to
eliminate those devices that may have been damaged during the packaging process or put
into defective packages. Additional testing is used to assure the final quality before going to
market. This final testing includes measurement of such parameters as input/output timing
specifications and voltage levels.
Fault Models:
Fault models are necessary for generating and evaluating a set of test vectors. Generally, a good fault
model should satisfy two criteria:
(1) It should accurately reflect the behaviour of defects.
(2) It should be computationally efficient in terms of fault simulation and test pattern generation.
1. Stuck-at-fault model:
Example 1:
Example 2:
2. Transistor level stuck fault model (or) Stuck-Open and Stuck-Short faults:
3. Bridging fault model:
4. Delay Fault Model:
Fault simulation:
Fault Simulation is defined as the process of measuring the quality of test. It consists of simulating a
circuit in the presence of faults. Any input pattern or sequence of input patterns that produces a
different output response in a faulty circuit from that of the fault-free circuit is a test vector, or
sequence of test vectors, that will detect the faults. Fault simulation is performed using gate-level
model and functional level model.
The main goals of fault simulation:
Measuring the effectiveness of the test patterns
Guiding the test pattern generator program
Generating fault dictionaries
Fault simulation serves following functions:
1. Confirms detection of fault
2. Computes fault coverage
3. Diagnostics of circuit
4. Identifies areas of circuit where fault coverage is inadequate.
• The mechanics of testing for fault simulation, as illustrated in fig. First, a set of target faults
(fault list) based on the CUT is enumerated. Often, fault collapsing is applied to the
enumerated fault set to produce a collapsed fault set to reduce fault simulation or fault grading
time. Then, input stimuli are applied to the CUT, and the output responses are compared with
the expected fault-free responses to determine whether the circuit is faulty. For fault
simulation, the CUT is typically synthesized down to a gate-level design (or circuit netlist).
• Ensuring that sufficient design verification has been obtained is a difficult step for the
designer. Although the ultimate determination is whether or not the design works in the
system, fault simulation illustrated in fig, can provide a rough quantitative measure of the level
of design verification much earlier in the design process.
• Fault simulation also provides valuable information on portions of the design that need further
design verification, because design verification vectors are often used as functional vectors
(called functional testing) during manufacturing test.
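The fault-grading mechanics above can be demonstrated on a toy netlist. The two-gate circuit, net names, and fault list below are assumptions chosen only to illustrate stuck-at fault simulation, not any particular tool:

```python
def simulate(inputs, stuck=None):
    """Evaluate a toy netlist d = NOT(AND(a, b)); `stuck` = (net, value)
    optionally forces a stuck-at fault on one net."""
    nets = dict(inputs)

    def force(net):
        if stuck is not None and stuck[0] == net:
            nets[net] = stuck[1]

    force('a'); force('b')
    nets['c'] = nets['a'] & nets['b']; force('c')
    nets['d'] = 1 - nets['c'];         force('d')
    return nets['d']

def grade_tests(test_vectors, faults):
    """Fault-grade a test set: a fault counts as detected if some vector
    makes the faulty output differ from the fault-free output."""
    detected = set()
    for vec in test_vectors:
        good = simulate(vec)            # expected fault-free response
        for f in faults:
            if simulate(vec, stuck=f) != good:
                detected.add(f)
    return len(detected) / len(faults)  # fault coverage
```

Exhaustive vectors detect all eight stuck-at faults on this circuit; a single vector detects only those faults it happens to excite and propagate, which is exactly why fault grading of a test set matters.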
Functional Testing:
In this testing every entry in the truth table for the combinational logic circuit is tested to
determine whether it produces the correct response. In practice, functional testing is considered by
many designers and test engineers to be testing the CUT as thoroughly as possible in a system-like
mode of operation. In either case, one problem is the lack of a quantitative measure of the defects
that will be detected by the set of functional test vectors.
Structural Testing:
The approach of structural testing is to select specific test patterns based on circuit structural
information and a set of fault models. Structural testing saves time and improves test efficiency, as
the total number of test patterns is decreased because the test vectors target specific faults that
would result from defects in the manufactured circuit. Structural testing cannot guarantee detection
of all possible manufacturing defects, as the test vectors are generated based on specific fault
models; however, the use of fault models does provide a quantitative measure of the fault-detection
capabilities of a given set of test vectors for a targeted fault model. This measure is called fault
coverage and is defined as:
Fault coverage = Number of detected faults / Total number of faults
It may be impossible to obtain fault coverage of 100% because of the existence of undetectable faults.
An undetectable fault means there is no test to distinguish the fault-free circuit from a faulty circuit
containing that fault. As a result, the fault coverage can be modified and expressed as the fault
detection efficiency, also referred to as the effective fault coverage, which is defined as:
Fault detection efficiency = Number of detected faults / (Total number of faults − Number of undetectable faults)
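As a worked check of the two formulas (the numbers are invented for illustration):

```python
def fault_coverage(detected, total):
    """Fault coverage = detected faults / total faults."""
    return detected / total

def fault_detection_efficiency(detected, total, undetectable):
    """Effective coverage: undetectable faults are removed from the base."""
    return detected / (total - undetectable)

# e.g. 950 of 1000 enumerated faults detected, 20 provably undetectable
print(fault_coverage(950, 1000))                            # 0.95
print(round(fault_detection_efficiency(950, 1000, 20), 4))  # 0.9694
```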
Fault coverage (FC) is linked to the yield (Y) and the defect level (DL) by the following expression: DL = 1 − Y^(1 − FC).
Design for Testability (DFT):
• Design for testing or design for testability (DFT) consists of IC design techniques that add
testability features to a hardware product design.
• The added features make it easier to develop and apply manufacturing tests to the designed
hardware.
• The purpose of manufacturing tests is to validate that the product hardware contains no
manufacturing defects that could adversely affect the product's correct functioning.
• DFT plays an important role in the development of test programs and as an interface for test
application and diagnostics.
• Two important attributes related to testability are controllability and observability.
• Controllability is the ability to establish a specific signal value at each node in a circuit by
setting values on the circuit's inputs.
• Observability is the ability to determine the signal value at any node in a circuit by
controlling the circuit's inputs and observing its outputs.
Avoid Asynchronous Logic:
• The speed of an asynchronous logic circuit can be faster than that of its synchronous logic
counterpart.
• The design and test of an asynchronous logic circuit are more difficult than for a synchronous
logic circuit, and its state transition times are difficult to predict.
• The operation of an asynchronous logic circuit is sensitive to input test patterns, often
causing race problems and hazards in which momentary signal values are opposite to the
expected values.
Avoid Redundant Logic:
• Some designed-in logic redundancy is used to mask a static hazard condition for reliability.
• The redundant node cannot be observed since the primary output value cannot be made
dependent on the value of the redundant node.
• This means that certain fault conditions on the node cannot be detected, such as a node SA1
of the function F.
Avoid Delay-Dependent Logic:
Scan-Based Techniques
The goal of the scan path technique is to reconfigure a sequential circuit, for the purpose of testing,
into a combinational circuit. Since a sequential circuit is based on a combinational circuit and some
storage elements, the technique of scan path consists in connecting together all the storage elements
to form a long serial shift register. Thus the internal state of the circuit can be observed and
controlled by shifting (scanning) out the contents of the storage elements. The shift register is then
called a scan path.
• The storage elements can be D, J-K, or R-S types of flip-flops, but simple latches
cannot be used in a scan path. However, the structure of the storage elements is slightly
different from the classical ones. Generally, the selection of the input source is achieved
using a multiplexer on the data input, controlled by an external mode signal. This multiplexer
is integrated into the D flip-flop; the D flip-flop is then called an MD flip-flop
(multiplexed-D flip-flop).
• The sequential circuit containing a scan path has two modes of operation: a normal mode
and a test mode which configures the storage elements in the scan path.
• In the normal mode, the storage elements are connected to the combinational circuit, in the
loops of the global sequential circuit, which is considered then as a finite state machine.
• In the test mode, the loops are broken and the storage elements are connected together as a
serial shift register (scan path), receiving the same clock signal. The input of the scan path is
called scan-in and the output scan-out. Several scan paths can be implemented in the same
complex circuit if necessary, thus having several scan-in inputs and scan-out outputs.
• A large sequential circuit can be partitioned into sub-circuits, containing combinational sub-
circuits, associated with one scan path each. Efficiency of the test pattern generation for a
combinational sub-circuit is greatly improved by partitioning, since its depth is reduced.
• Before applying test patterns, the shift register itself has to be verified by shifting in all ones
i.e. 111...11, or zeros i.e. 000...00, and comparing.
1. Set test mode signal, flip-flops accept data from input scan-in
2. Verify the scan path by shifting in and out test data
3. Set the shift register to an initial state
4. Apply a test pattern to the primary inputs of the circuit
5. Set normal mode, the circuit settles and can monitor the primary outputs of the circuit
6. Activate the circuit clock for one cycle
7. Return to test mode
8. Scan out the contents of the registers, simultaneously scan in the next pattern
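Steps 1–8 can be mimicked with a small software model of a scan path. The class below is an illustrative sketch of a chain of MD flip-flops (the names and interface are assumptions):

```python
class ScanChain:
    """A scan path: storage elements that shift serially in test mode
    and load their functional D inputs in normal mode."""

    def __init__(self, length):
        self.regs = [0] * length       # regs[0] is nearest scan-in

    def shift(self, scan_in_bits):
        """Test mode: shift bits in serially; return bits shifted out."""
        out = []
        for b in scan_in_bits:
            out.append(self.regs[-1])  # scan-out is the last flip-flop
            self.regs = [b] + self.regs[:-1]
        return out

    def capture(self, d_inputs):
        """Normal mode: one clock; each flip-flop loads its D input."""
        self.regs = list(d_inputs)
```

Shifting in all ones and then all zeros verifies the shift register itself (step 2); a capture in normal mode followed by a scan-out makes the internal state observable at the scan-out pin.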
Built-in Self-Test (BIST):
• Built-in self-test (BIST) is a design technique in which parts of a circuit are used to
test the circuit itself.
A test vector generator produces the test vectors to be applied to the circuit under test.
The response of a good circuit may be determined using the simulator tool of a CAD system.
The expected responses must be stored on the chip for comparison during testing.
An n-bit LFSR can generate at most 2^n − 1 test patterns, since the all-zeros state is not
allowed.
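A quick sketch of this property, using a 4-bit LFSR whose feedback taps come from the primitive polynomial x^4 + x^3 + 1 (the coding style is an assumption; any primitive polynomial gives the full 2^n − 1 cycle):

```python
def lfsr_patterns(seed=0b0001, taps=(3, 2), nbits=4):
    """Generate the pattern cycle of an n-bit LFSR (XOR feedback).

    taps are the bit positions XORed to form the new bit shifted in;
    (3, 2) corresponds to x^4 + x^3 + 1, which is primitive for n = 4.
    """
    state = seed
    patterns = []
    while True:
        patterns.append(state)
        fb = 0
        for t in taps:
            fb ^= (state >> t) & 1
        state = ((state << 1) | fb) & ((1 << nbits) - 1)
        if state == seed:              # cycle closed
            break
    return patterns
```

`len(lfsr_patterns())` is 15 = 2^4 − 1, and the all-zeros pattern never appears: once in state 0 the register would stay locked there, which is why 0 is excluded from the cycle.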
Single-input compressor circuit (SIC):
In PRBSG it is not attractive to store a large number of test responses on the chip. A practical
solution is to compress the results of the tests into a single pattern. This can be done using an
LFSR circuit: instead of providing the feedback signals as the input, a compressor circuit is
included; this is called the Single-Input Compressor Circuit (SIC).
• After applying a number of test vectors, the resulting values of p drive the SIC and, coupled
with the LFSR functionality, produce a four-bit pattern.
• The pattern generated by the SIC is signature of the tested circuit for the given sequence of
tests.
• The signature can be compared against a predetermined pattern to see if the tested circuit is
working properly.
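The compression step can be sketched as an LFSR whose feedback is XORed with each incoming response bit (a single-input signature register). The tap choice and bit width below are assumptions; because the compression is linear, any single-bit error in the response stream is guaranteed to change the signature:

```python
def sic_signature(response_bits, taps=(3, 2), nbits=4):
    """Compress a single-output response stream into an n-bit signature."""
    state = 0
    for bit in response_bits:
        fb = bit                        # response bit enters the feedback
        for t in taps:
            fb ^= (state >> t) & 1
        state = ((state << 1) | fb) & ((1 << nbits) - 1)
    return state
```

In use, the good circuit's signature is computed once (by simulation) and stored; at test time the signature recomputed from the actual responses is compared against it.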
Multiple input compressor circuit (MIC):
If the circuit under test has more than one output, then an LFSR with multiple inputs can be used.
A four-bit signature provides a good mechanism for distinguishing among different sequences of
four-bit patterns that may appear on the inputs of this multiple-input compressor circuit (MIC).
The effectiveness of the BIST approach depends on the length of the LFSR generator and
compressor circuits. Longer shift registers give better results.
Advantages of BIST:
• Low cost
• High Quality Test
• Faster Fault Detection
• Ease of Diagnostics
• Reduce maintenance and repair cost.