Accelerated Analysis of Simulation Dumps through Parallelization on Multicore Architectures / Appello, D.; Bernardi, P.; Calabrese, A.; Littardi, S.; Pollaccia, G.; Quer, S.; Tancorre, V.; Ugioli, R. (2021), pp. 69-74. Paper presented at the 24th International Symposium on Design and Diagnostics of Electronic Circuits and Systems (DDECS 2021), held in Austria, 2021 [DOI: 10.1109/DDECS52668.2021.9417048].
Accelerated Analysis of Simulation Dumps through Parallelization on Multicore Architectures
Bernardi P.; Calabrese A.; Littardi S.; Quer S.
2021
Abstract
With the explosion in size of off-the-shelf SoCs and the advent of novel techniques related to failure modes, commercial ATPG and fault-simulation engines are often insufficient to measure coverage for very specific metrics. In these cases, many researchers first store the simulation trace during the analysis phase, then collect the desired statistics in a post-processing step. In this framework, the Value Change Dump (VCD) is a very commonly used file format for recording simulation traces. The goal of this paper is twofold. On the one hand, we illustrate some Burn-In (BI) related metrics that cannot be evaluated by current commercial fault simulators and ATPG engines; these metrics are instead evaluated through a post-processing analysis of memory dumps in VCD format. On the other hand, we reduce the evaluation time and the memory required to analyze huge VCD files by exploiting optimization techniques drawn from modern programming features and careful parallelization. With this strategy, we can analyze simulation dumps of more than 250 GBytes in less than one hour, an improvement of two orders of magnitude over previous tools, with consequently higher scalability and testing power.
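The abstract does not detail the authors' implementation; purely as an illustration of how parallel post-processing of a large VCD dump can be organized, the Python sketch below splits the file into line-aligned byte ranges, has each worker count value changes per signal identifier, and merges the partial counters. All function names (find_chunks, count_toggles, parallel_toggle_count) and the toggle-count metric are illustrative assumptions, not the paper's tool, which also handles the VCD header and more elaborate BI metrics.

import os
from collections import Counter
from multiprocessing import Pool

def find_chunks(path, n_chunks):
    # Split the file into byte ranges aligned to newline boundaries,
    # so no value-change line straddles two workers.
    size = os.path.getsize(path)
    step = max(1, size // n_chunks)
    offsets = [0]
    with open(path, "rb") as f:
        for i in range(1, n_chunks):
            f.seek(i * step)
            f.readline()              # advance to the next full line
            offsets.append(f.tell())
    offsets.append(size)
    return [(path, offsets[i], offsets[i + 1]) for i in range(n_chunks)]

def count_toggles(args):
    # Count value changes per signal identifier inside one byte range.
    # Sketch only: header subtleties (e.g. the $dumpvars block, whose
    # initial values are counted here too) are ignored.
    path, start, end = args
    toggles = Counter()
    with open(path, "rb") as f:
        f.seek(start)
        while f.tell() < end:
            line = f.readline()
            if not line:
                break
            if line.startswith((b"#", b"$")):   # timestamps / directives
                continue
            if line[:1] in b"01xXzZ":           # scalar change, e.g. "0!"
                toggles[line[1:].strip()] += 1
            elif line[:1] in b"bBrR":           # vector/real, e.g. "b1010 %"
                parts = line.split()
                if len(parts) == 2:
                    toggles[parts[1]] += 1
    return toggles

def parallel_toggle_count(path, workers=8):
    with Pool(workers) as pool:
        partials = pool.map(count_toggles, find_chunks(path, workers))
    total = Counter()
    for c in partials:
        total.update(c)
    return total

if __name__ == "__main__":
    counts = parallel_toggle_count("dump.vcd")  # hypothetical input file
    print(counts.most_common(10))

Line-aligned chunking lets each worker scan its range independently, so the only inter-process traffic is the merge of the small per-signal counters at the end, which is one plausible reason such post-processing scales well on multicore machines.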
File | Type | License | Size | Format
---|---|---|---|---
09417048.pdf (restricted access) | 2a Post-print, publisher's version / Version of Record | Non-public - private/restricted access | 1.93 MB | Adobe PDF
https://hdl.handle.net/11583/2909176