
SCIENTIA SINICA Informationis, Volume 47, Issue 1: 127-143 (2017). https://doi.org/10.1360/N112015-00213

Mutual information analysis for digital circuits

More info
  • Received: Jan 6, 2016
  • Accepted: Apr 14, 2016
  • Published: Oct 25, 2016

Abstract

Information theory is an effective tool for studying information-flow systems. Information entropy measures the average uncertainty of a random variable, and mutual information measures the dependence between two random variables. Because mutual information, which is often used to characterize the processing capability of a communication system, is closely related to channel capacity and circuit power, we use information theory to analyze digital circuits in this paper. We discuss the variation trends of mutual information and its influencing factors for four typical circuit models: symmetric, asymmetric, feedback, and non-feedback circuits. We also prove that a circuit with feedback has fault tolerance and a mutual information gain. The analysis method is applicable to other circuits and can provide a theoretical basis for circuit design.
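
For reference, the quantities the abstract relies on are the standard information-theoretic ones; a minimal sketch of the definitions follows, with X denoting a circuit input and Y its (possibly noisy) output. The symbols X and Y are illustrative placeholders, not notation taken from the paper.

H(X) = -\sum_x p(x) \log p(x)    (entropy: the average uncertainty of X)

I(X;Y) = \sum_{x,y} p(x,y) \log \frac{p(x,y)}{p(x)\,p(y)} = H(X) - H(X|Y)    (mutual information: the dependence between X and Y, i.e. the reduction in uncertainty about the input once the output is observed)

For a noiseless, invertible mapping H(X|Y) = 0, so I(X;Y) = H(X); gate noise and many-to-one logic both lower I(X;Y), which is why this quantity can track a circuit's processing capability.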


Funded by

National Natural Science Foundation of China (Grant No. 61371104)

National High Technology Research and Development Program of China (863 Program) (Grant No. 2014AA01A707)


