- Open Access
- Authors: P. Swetha, K. Divya Lakshmi, Dr. B. Sudarshan
- Paper ID: IJERTV10IS070062
- Volume & Issue: Volume 10, Issue 07 (July 2021)
- Published (First Online): 12-07-2021
- ISSN (Online): 2278-0181
- Publisher Name: IJERT
- License: This work is licensed under a Creative Commons Attribution 4.0 International License
An Assessment of Low-Power VLSI Testing Requires A Test Data Compression Architecture
1 P. Swetha, 2 K. Divya Lakshmi, 3 Dr. B. Sudarshan
1,2 Assistant Professor, Department of Electronics and Communication Engineering
3 Associate Professor, Department of Mechanical Engineering
College of Engineering, Kadapa, Andhra Pradesh, India 516003
Abstract:- With semiconductor technology's ever-increasing integration capability, today's large integrated circuits require an ever-increasing volume of test data, resulting in longer test times and higher memory requirements on the tester. Larger test data sets demand not only more tester memory but also significantly longer testing time, and these remain the stumbling blocks in system testing. These issues are addressed by test data compression, a technique that reduces the test data volume without compromising the system's overall performance. In this work, test sets for the ISCAS'89 benchmark circuits are compressed, with the MINTEST test sets used as the test vectors. The main purpose of the work is to apply coding schemes such as variable-length and fixed-length coding to reduce test data volume and, as a result, tester memory.
Key words: test data compression, tester memory, MINTEST vectors, ISCAS'89.
INTRODUCTION:
Since the invention of the integrated circuit (IC), Moore's law has predicted that the number of transistors integrated per square inch on a die will double roughly every eighteen months. As digital ICs pervade everyday life, the focus is shifting to smaller, more complex designs built in improved process technologies. The size of the transistors shrinks every few years while circuit frequencies keep increasing, and as this trend continues, testing VLSI circuits becomes increasingly challenging [1]. Testing of a VLSI circuit design is required to determine whether the manufactured system is good or bad. Continuous advances in semiconductor fabrication technology have produced ICs with billions of transistors. SOCs are massive integrated circuits that can include all of the electronic circuitry required for a complete system. Such system chips are often very large ICs with millions of transistors that contain many reusable, predesigned silicon circuit blocks. Embedded cores in system chips provide a wide range of functions, including processors, MPEG coding/decoding, memory, and so on. The IC fabrication process is not free from flaws and imperfections, such as shorts to power or ground and extra material, which lead to failures and defects. As a result, every IC produced should go through a testing procedure, whose goal is to ensure that the produced IC has no manufacturing flaws. Because the testing process is complex, a design technique known as design for testability (DFT) is examined, with the goal of making the testing phase simpler and faster [2].
Test data compression is essential for reducing test time and power. This work describes a lossless data compression strategy for reducing the quantity of test data that must be stored on the tester and subsequently transferred to the chip. Instead of the full test vector, a smaller amount of compressed data is communicated from the tester to the cores of the SOC for each test vector.
COMPRESSED DATA TAKES LESS TIME TO TRANSFER:
An SOC contains a variety of hardware modules, referred to as cores; these are reusable, predesigned silicon circuit blocks. Test vector compression for such embedded cores is achieved with coding techniques, which fall into two families: variable-length coding and fixed-length coding. Golomb coding, Huffman coding, and arithmetic coding are examples of variable-length coding [3]. A sketch contrasting the two families follows.
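To make the distinction concrete, here is a minimal Python sketch comparing a fixed-length encoding with a prefix-free variable-length one on the same skewed symbol stream. The stream and codebook are invented for this illustration, not taken from the paper:

```python
import math

def fixed_length_bits(symbols):
    # Every distinct symbol gets the same width: ceil(log2(#symbols)) bits.
    width = max(1, math.ceil(math.log2(len(set(symbols)))))
    return width * len(symbols)

def variable_length_bits(symbols, codebook):
    # Frequent symbols get shorter code words (prefix-free codebook).
    return sum(len(codebook[s]) for s in symbols)

stream = list("AAAAABBCD")                                 # skewed frequencies
codebook = {"A": "0", "B": "10", "C": "110", "D": "111"}   # prefix-free

print("fixed-length   :", fixed_length_bits(stream), "bits")            # 18
print("variable-length:", variable_length_bits(stream, codebook), "bits")  # 15
```

With skewed frequencies the variable-length codebook wins (15 bits versus 18 bits here); with uniform symbol frequencies a fixed-length code would be just as good.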
RELATED WORK:
Many test data compression schemes have been proposed to date, each with its own complications. The most prominent are statistical coding, run-length coding, Golomb coding, selective Huffman coding, Tunstall coding, LZW coding, heterogeneous compression, FDR coding, run-length Huffman coding, multilevel Huffman coding, bitmask and dictionary selection methods, and variable-to-variable Huffman coding [4].
RUN LENGTH CODING:
For any compression approach, certain significant quality factors should be addressed, such as the amount of compression, area overhead, reduction in test time, compression scalability, and so on [5]. The compression algorithm for scan vectors encodes runs of repeated values. In typical scan data, runs of 0s with lengths below 20 occur frequently, and the frequency of runs of length l decreases as l grows within the range 0 to 20; shorter run lengths are therefore mapped to shorter code words.
The FDR code is similar to the Golomb code, but uses a variable group size. There are several run-length codes, including simple run-length codes, Golomb codes, frequency-directed run-length (FDR) codes, extended FDR codes, shifted alternating FDR codes, and so on. Run-length encoding of a binary stream works as in the example below:

Original test data: 00 11111 0000
Run lengths: 2, 5, 4
The don't care bits are filled so as to lengthen the runs and achieve maximum compression. Other strategies, such as run-based reordering, are used to extend the run lengths and improve compression, overall test time, and power [6, 7]. A sketch of run-length extraction with don't care filling, followed by Golomb coding of the run lengths, is given below.
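The sketch assumes a simple filling rule (each X copies the preceding specified bit) and a Golomb group size of m = 4; the test vector is a variant of the example above with X bits inserted. It is illustrative, not the paper's exact encoder:

```python
def fill_dont_cares(vector):
    # Map each X to the preceding specified bit so runs grow longer
    # (bits before the first specified bit default to '0').
    out, last = [], "0"
    for bit in vector:
        last = bit if bit in "01" else last
        out.append(last)
    return "".join(out)

def run_lengths(vector):
    # Collapse maximal runs of identical bits into (bit, length) pairs.
    runs, count = [], 1
    for prev, cur in zip(vector, vector[1:]):
        if cur == prev:
            count += 1
        else:
            runs.append((prev, count))
            count = 1
    runs.append((vector[-1], count))
    return runs

def golomb(length, m=4):
    # Golomb code word: q ones, a terminating zero, then the remainder
    # in log2(m) bits (m is assumed to be a power of two here).
    q, r = divmod(length, m)
    return "1" * q + "0" + format(r, f"0{m.bit_length() - 1}b")

vector = "0X111110X00"                  # the example above, with X bits added
filled = fill_dont_cares(vector)        # -> "00111110000"
runs = run_lengths(filled)              # -> [('0', 2), ('1', 5), ('0', 4)]
print(runs)
print([golomb(n) for _, n in runs])     # -> ['010', '1001', '1000']
```

Filling the don't cares recovers exactly the run lengths 2, 5, 4 of the original example, and each run length then maps to one short Golomb code word.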
Rectangular Encoding:
Several strategies have been proposed to increase the basic scheme's encoding efficiency; weighted pattern testing is one BIST scheme that exploits the characteristics of the test data. The first stage in rectangle encoding is to divide the test cubes into clusters so as to maximize pattern-wise correlation within each cluster, using a clustering procedure. A test cube's scan slices are then divided into a series of variable-length rectangles, and all test cubes within a cluster are partitioned in the same way. Within each rectangle, the largest collection of scan chains with compatible values is identified.
Figs. 1-3 illustrate how rectangular encoding is carried out on the test cubes; a sketch of the compatibility check follows Fig. 3. If scan chain reordering were used in conjunction with this strategy to increase correlation, test compression would improve considerably.
Fig. 1: Example of Test Cubes
Fig. 2: Example of Rectangles
Test cube T1, scan chains Sc1-Sc4, scan slices B1-B7:

| T1  | B1 | B2 | B3 | B4 | B5 | B6 | B7 |
|-----|----|----|----|----|----|----|----|
| Sc1 | 0  | X  | X  | 1  | 1  | 1  | X  |
| Sc2 | X  | 0  | 0  | X  | 1  | 1  | X  |
| Sc3 | 0  | X  | X  | X  | X  | 1  | 1  |
| Sc4 | X  | X  | X  | X  | 1  | 1  | 0  |
Fig. 3: Format of Rectangles
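The following sketch illustrates the compatibility test on the Fig. 3 data: two scan-chain segments are compatible when, column by column, the values agree or at least one is a don't care. The exhaustive search over groups is illustrative only; the paper does not specify the clustering procedure in this detail:

```python
from functools import reduce
from itertools import combinations

# Scan-chain segments of test cube T1 from Fig. 3.
slices = {
    "Sc1": "0XX111X",
    "Sc2": "X00X11X",
    "Sc3": "0XXXX11",
    "Sc4": "XXXX110",
}

def compatible(a, b):
    # Column-wise: values must agree, or at least one must be X.
    return all(x == y or x == "X" or y == "X" for x, y in zip(a, b))

def merge(a, b):
    # Resolve don't cares against the specified partner bit.
    return "".join(y if x == "X" else x for x, y in zip(a, b))

# Find the largest group of mutually compatible scan chains.
best = max(
    (g for r in range(1, len(slices) + 1)
       for g in combinations(slices, r)
       if all(compatible(slices[a], slices[b]) for a, b in combinations(g, 2))),
    key=len,
)
merged = reduce(merge, (slices[s] for s in best))
print("largest compatible group:", best)    # ('Sc1', 'Sc2', 'Sc3')
print("merged slice:", merged)              # one pattern stands in for all three
```

For the Fig. 3 data, Sc1, Sc2, and Sc3 are mutually compatible (Sc4 conflicts with Sc3 in B7), so a single merged pattern can represent three of the four scan chains within the rectangle.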
Compression Method:
Variable-length codes are used in this paper to achieve test compression [8]. A variable-length Huffman coding example is studied for compression. Huffman codes can encode any collection of deterministic scan vectors, in terms of both compression and decompression schemes, and Huffman coding is used to evaluate cores with internal scan [9]. A Huffman code is a statistically optimal code: among all uniquely decodable variable-length codes, it provides the shortest average code word length. The Huffman tree is built during Huffman coding [10]; the code word for the binary string corresponding to each leaf is found by following the path from the root to that leaf.
Fig. 4: Huffman Tree Construction
The amount of compression that statistical coding can achieve is determined by how skewed the frequency of occurrence of the various code words is [11]. Don't care bits allow a block to be encoded with more than one possible code word, giving additional flexibility. A full Huffman code would provide the most compression, but it would require a complicated decoder and might not meet the constraint on the minimum code word size. Huffman coding is used to implement the suggested compression and decompression strategy.
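As an illustration, here is a minimal block-based Huffman encoder (a full Huffman code over fixed-size blocks, not the selective variant of [1]); the test set and the four-bit block size are invented for the example, and don't care handling is omitted:

```python
import heapq
from collections import Counter

def huffman_codebook(freqs):
    # Classic Huffman construction: repeatedly merge the two least
    # frequent subtrees; an integer tiebreaker keeps the heap comparable.
    heap = [[f, i, {sym: ""}] for i, (sym, f) in enumerate(freqs.items())]
    heapq.heapify(heap)
    tiebreak = len(heap)
    while len(heap) > 1:
        lo = heapq.heappop(heap)
        hi = heapq.heappop(heap)
        merged = {s: "0" + c for s, c in lo[2].items()}
        merged.update({s: "1" + c for s, c in hi[2].items()})
        heapq.heappush(heap, [lo[0] + hi[0], tiebreak, merged])
        tiebreak += 1
    return heap[0][2]

def encode(test_set, block=4):
    # Split the test set into fixed-size blocks; frequent blocks
    # receive the shortest code words.
    blocks = [test_set[i:i + block] for i in range(0, len(test_set), block)]
    codebook = huffman_codebook(Counter(blocks))
    return "".join(codebook[b] for b in blocks), codebook

test_set = "0000000011110000000011111111"   # skewed block distribution
bits, codebook = encode(test_set)
print("codebook:", codebook)                 # e.g. {'1111': '0', '0000': '1'}
print(f"{len(test_set)} bits -> {len(bits)} bits")
```

Decompression walks the same tree bit by bit, which is why a full Huffman code over many distinct blocks implies a large decoder, and why selective variants encode only the most frequent blocks.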
Testing Results:
The table below presents the experimental results: test vectors for several ISCAS'89 benchmark circuits were compressed, using the MINTEST test sets as input. For each benchmark circuit, the compression percentage achieved is reported. The designs were described in HDL and implemented using the ModelSim and Xilinx tools. The table also compares the results against existing methods.
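For reference, the percentages below are assumed to follow the standard definition of compression used in the test compression literature, where $T_D$ is the number of original test data bits and $T_E$ the number of encoded bits:

$$\text{Compression (\%)} = \frac{T_D - T_E}{T_D} \times 100$$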
Table 1: Compression (%) of alternative compression schemes on the MINTEST test sets.

| Circuit | Golomb | Rice algorithm | Tunstall | FDR | Proposed (Selective Huffman) |
|---------|--------|----------------|----------|-----|------------------------------|
| s9234   | 46     | 56             | 60       | 60  | 68.1                         |
| s13207  | 82     | 32             | 35       | 61  | 62.3                         |
| s38417  | 30     | 46             | 69       | 57  | 63.5                         |
Fig. 5: Compression results on the MINTEST test sets
Huffman coding is an effective method of compressing test data; the coding approach significantly reduces test time and memory requirements. In this study, we applied our algorithm to various benchmark circuits and compared our results against existing test compression techniques, attaining a substantially greater compression ratio [1-15].
REFERENCES
- [1] Abhijit Jas, Jayabrata Ghosh-Dastidar, Mom-Eng Ng and Nur A. Touba, 2003. An Efficient Test Vector Compression Scheme Using Selective Huffman Coding, IEEE Trans. Comput.-Aided Des. Integr. Circuits Syst., 22(6): 797-806.
- [2] Chandra, A. and K. Chakrabarty, 2000. Test Data Compression and Decompression for System-on-a-Chip Using Golomb Codes, VLSI Test Symposium, pp: 113-120.
- [3] Krishna, C.V. and N.A. Touba, 2002. Reducing Test Data Volume Using LFSR Reseeding with Seed Compression, in Proc. of the IEEE International Test Conference (ITC).
- [4] Chandra, A. and K. Chakrabarty, 2001. System-on-a-Chip Test Data Compression and Decompression Architectures Based on Golomb Codes, IEEE Trans. Comput.-Aided Des. Integr. Circuits Syst., 20(3): 355-368.
- [5] Kavousianos, X., E. Kalligeros and D. Nikolos, 2008. Multilevel-Huffman Test-Data Compression for IP Cores with Multiple Scan Chains, IEEE Trans. Very Large Scale Integr. (VLSI) Syst., 16(7): 926-931.
- [6] Kavousianos, X., E. Kalligeros and D. Nikolos, 2008. Test Data Compression Based on Variable-to-Variable Huffman Encoding with Codeword Reusability, IEEE Trans. Comput.-Aided Des. Integr. Circuits Syst., 27(7): 1333-1338.
- [7] Kanad Basu and Prabhat Mishra, 2010. Test Data Compression Using Efficient Bitmask and Dictionary Selection Methods, IEEE Trans. Very Large Scale Integr. (VLSI) Syst., 18(9).
- [8] Sundarraj, M., 2013. Study of Compact Ventilator, Middle-East Journal of Scientific Research, ISSN: 1990-9233, 16(12): 1741-1743.
- [9] Sundar Raj, M., T. Saravanan and R. Udayakumar, 2013. Energy Conservation Protocol for Real Time Traffic in Wireless Sensor Networks, Middle-East Journal of Scientific Research, ISSN: 1990-9233, 15(12): 1822-1829.
- [10] Sundar Raj, M. and T.R. Vasuki, 2013. Automated Anesthesia Controlling System, Middle-East Journal of Scientific Research, ISSN: 1990-9233, 15(12): 1719-1723.
- [11] Sundar Raj, M., T. Saravanan and R. Udayakumar, 2013. Data Acquisition System Based on CAN Bus and ARM, Middle-East Journal of Scientific Research, ISSN: 1990-9233, 15(12): 1857-1860.