SCIENCE CHINA Information Sciences, Volume 62, Issue 6: 062408(2019) https://doi.org/10.1007/s11432-018-9863-6

Circuit design of RRAM-based neuromorphic hardware systems for classification and modified Hebbian learning

• Accepted: Mar 28, 2019
• Published: May 7, 2019

Abstract

This paper proposes a solution to the learning of neuromorphic hardware systems based on metal-oxide resistive random access memory (RRAM) arrays, which are used as binary electronic synapses. A modified Hebbian learning method is developed to update the binary synaptic weights, and mixed-signal circuits are designed to implement it. The circuits are verified by SPICE simulations, and system-level simulations are also conducted to verify the capability of the neuromorphic system to process relatively large databases. The results show that the system achieves a high processing speed ($10^6$ examples per second) for both classification and learning, and a high recognition accuracy (up to 95.6%) on the MNIST database.
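To make the scheme concrete, the following is a minimal sketch of classification with binary excitatory/inhibitory RRAM synapses and a winner-take-all (WTA) Hebbian update. It is not the paper's implementation: the array sizes, the copy-the-input update rule, and the current model (an excitatory LRS synapse conducts when its input fires, an inhibitory LRS synapse when it rests, per Table 1) are illustrative assumptions.

```python
import random

random.seed(0)
N_IN, N_HID = 784, 10                        # input pixels, hidden neurons

# Binary RRAM synapses: HRS -> 0, LRS -> 1 (hypothetical initial states).
w_exc = [[random.randint(0, 1) for _ in range(N_HID)] for _ in range(N_IN)]
w_inh = [[random.randint(0, 1) for _ in range(N_HID)] for _ in range(N_IN)]

def forward(x):
    # Per Table 1: excitatory LRS conducts for a firing input,
    # inhibitory LRS conducts for a resting input; HRS never conducts.
    return [sum(x[i] * w_exc[i][j] + (1 - x[i]) * w_inh[i][j]
                for i in range(N_IN))
            for j in range(N_HID)]

def hebbian_update(x, winner):
    # WTA: only the winner's synapses are updated. Assumed rule:
    # active inputs -> excitatory LRS / inhibitory HRS (LTP);
    # inactive inputs -> excitatory HRS / inhibitory LRS (LTD).
    for i, xi in enumerate(x):
        w_exc[i][winner] = xi
        w_inh[i][winner] = 1 - xi

x = [random.randint(0, 1) for _ in range(N_IN)]  # binarized input example
winner = max(range(N_HID), key=lambda j: forward(x)[j])
hebbian_update(x, winner)
```

After the update, the winner's excitatory column stores the input pattern and its inhibitory column stores the complement, so re-presenting the same example drives the winner to the maximal current.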

Acknowledgment

This work was supported by National Natural Science Foundation of China (Grant Nos. 61421005, 61334007, 61604005).


• Figure 1

(Color online) An overview of the neuromorphic system in this work. The outputs of Layer 1 are directly conveyed to Layer 2 as inputs. A postsynaptic neuron in Layer 1 and the corresponding presynaptic neuron in Layer 2 form a hidden neuron together. Note that Layer 2 uses delayed clocks.

• Figure 2

Communications among modules in each layer.

• Figure 3

(Color online) An analogy between biological synapses and RRAM synapses. (a) A biological synapse; (b) RRAM synapses. In our modified Hebbian learning method, there are two types of synapses: excitatory synapses and inhibitory synapses.

• Figure 4

(Color online) (a) Measured transient response of an RRAM device during a SET process. The device switches from HRS to LRS. (b) Measured transient response of an RRAM device during a RESET process. The device switches from LRS to HRS. (c) Measured and simulated resistance distribution of the fabricated RRAM devices. The devices have two stable resistance states and show reliable LTD and LTP behaviors under successive RESET and SET pulses. The measured results indicate that the RRAM devices can be used as qualified binary synapses.

• Figure 5

(Color online) Schematic diagram of the 1/3 bias scheme used in this work. The RRAM cell in the middle is selected, while others are not. By applying operation voltages ($V_X$ and 0 on the selected columns and rows, $\frac13V_X$ and $\frac23V_X$ on other columns and rows) on each column and each row simultaneously, the selected RRAM cell can be switched to either HRS or LRS.
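The bias arithmetic behind Figure 5 can be checked with a short sketch (function names and the 4x4 array size are illustrative, not from the paper): the selected cell sees the full $V_X$, while every half-selected or unselected cell sees at most $V_X/3$, below the switching threshold.

```python
def third_bias(selected_row, selected_col, n_rows, n_cols, vx):
    """Row/column voltages under the 1/3 bias scheme.

    Selected column: Vx; selected row: 0 -> full Vx across the target cell.
    Other columns: Vx/3; other rows: 2Vx/3 -> at most |Vx/3| elsewhere.
    """
    cols = [vx if c == selected_col else vx / 3 for c in range(n_cols)]
    rows = [0.0 if r == selected_row else 2 * vx / 3 for r in range(n_rows)]
    return rows, cols

def cell_voltage(rows, cols, r, c):
    # Voltage across the RRAM cell at (row r, column c).
    return cols[c] - rows[r]
```

With $V_X = 3$ V, the selected cell sees 3 V; cells sharing only its row or column see 1 V; fully unselected cells see -1 V, i.e. $|V|\le V_X/3$ everywhere except the target.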

• Figure 6

(Color online) Flowchart for the neuromorphic system based on the modified Hebbian learning.

• Figure 7

(Color online) An example of the transitions of RRAM cells during the modified Hebbian learning. (a) Example input; (b) LTD; (c) LTP.

• Figure 8

(Color online) Analog parts of the proposed neuron circuits. (a) The analog part (either an E-part or an I-part) of a presynaptic neuron. $V_{\rm PRE}$ is the output of the analog part of the presynaptic neuron and connects to a column of the RRAM array. (b) The analog part of a postsynaptic neuron. The Synapses terminal is an I/O port that links to a row of the RRAM array.

• Figure 9

(Color online) The SPICE transient simulations of analog parts of the proposed neuron circuits. The overlapped pulses from both sides can change the state of the RRAM synapse. At about $t=500$ ns, RRAM_1 changes to HRS, while RRAM_2 changes to HRS first, and then back to LRS. Then the two cells keep their resistance states unchanged during the subsequent operations. (a) Waveforms of signal CLK_1M, VPRE_1, VPRE_2 and VPOST; (b) waveforms of signal VRRAM_1, IRRAM_1, VRRAM_2 and IRRAM_2.

• Figure 10

(Color online) (a) Circuit design of a threshold controller; (b) the SPICE transient simulations of the threshold controller. A clock signal is applied to the threshold controller, and a dynamic threshold that rises gradually and declines rapidly is generated.
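The rise-slowly/fall-fast behavior of the threshold controller can be modeled in discrete time. This is a toy sketch, not the circuit: the step size, period, and reset level are illustrative parameters chosen only to reproduce the qualitative shape in Figure 10(b).

```python
def dynamic_threshold(n_steps, rise=0.02, period=50):
    """Toy model of the threshold controller's output: the threshold
    ramps up by `rise` each clock tick and collapses to zero at the
    start of every period (gradual rise, rapid decline)."""
    v, trace = 0.0, []
    for t in range(n_steps):
        v = 0.0 if t % period == 0 else v + rise
        trace.append(v)
    return trace
```

Within each period the trace is monotonically increasing, and the drop back to zero happens in a single step, mirroring the measured waveform qualitatively.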

• Figure 11

Digital parts of the proposed neuron circuits and the neuron manager module. (a) The digital part of a presynaptic neuron, implemented by only a positive-edge-triggered D flip-flop. (b) The digital parts of a postsynaptic neuron. These circuits accept the output signal from the comparator in the analog part of the neuron together with a series of control signals, and output the firing state of the neuron. (c) The neuron manager. It generates the control signals for the digital parts of the postsynaptic neurons, and implements the WTA rule by sending a Sign signal that indicates whether a winner neuron has occurred.

• Figure 12

(Color online) The SPICE transient simulations of digital parts of the proposed neuron circuits during unsupervised learning. No label is needed in unsupervised learning, and the postsynaptic neurons compete with each other until one of them becomes the winner and fires. (a) Waveforms of signal CLK_1M_D1, CMP1, CMP2, and CMP3; (b) waveforms of signal Fire1, Fire2, Fire3, and Sign.
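The WTA arbitration enforced by the neuron manager can be sketched in a few lines. This is a hedged simplification of the digital behavior in Figures 11-12: among neurons whose comparator output is high, at most one fires; picking the lowest-indexed one is an assumption made here only to keep the tie-break deterministic.

```python
def winner_take_all(cmp_outputs):
    """WTA sketch: cmp_outputs[i] is True when neuron i's input current
    exceeds the shared dynamic threshold. At most one neuron fires
    (here: the first above threshold); Sign flags whether any winner
    occurred."""
    fires = [False] * len(cmp_outputs)
    for i, high in enumerate(cmp_outputs):
        if high:
            fires[i] = True          # first above-threshold neuron wins
            break
    return fires, any(fires)         # (Fire signals, Sign)
```

Combined with a threshold that rises gradually, this reproduces the competition in Figure 12: as the threshold climbs, comparator outputs drop out one by one until a single neuron remains and fires.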

• Figure 13

(Color online) The SPICE transient simulations of digital parts of the proposed neuron circuits during supervised learning. Along with each example, a label is input to force one of the postsynaptic neurons to fire. (a) Waveforms of CLK_1M_D1, Label1, Label2, Label3, and Sign; (b) waveforms of CMP1, CMP2, CMP3, Fire1, Fire2, and Fire3.

• Figure 14

(Color online) (a) Circuit design of an NRP controller. (b) The SPICE transient simulations of the NRP controller. Together, the NRP selection and the neuron's own firing determine whether the neuron is allowed to fire during subsequent operations.

• Figure 15

(Color online) (a) Circuit design of a learning controller. (b) The SPICE transient simulations of the learning controller. The learning controller generates two signals that are crucial for learning: the pulse of Learning immediately follows the pulse of Direction, with a fixed time difference of 100 ns.

• Figure 16

Resistance matrices of part of the RRAM arrays before and after learning. Before learning, the initial states of RRAM synapses are arbitrary; after learning, information is stored into the RRAM synapses, which also indicates the memory of the example is created in the system. Excitatory synapses and inhibitory synapses show opposite evolutions.

• Figure 17

(Color online) (a) Recognition accuracy as a function of the number of hidden neurons. With inhibitory synapses, the system achieves a higher recognition accuracy (up to 95.6%). (b) Recognition accuracy as a function of device variation. Although accuracy decreases as device variation increases, the system shows relatively good tolerance: with 10000 hidden neurons, it maintains a high recognition accuracy ($>$90%) for device variations of up to 20%.

• Figure 18

(Color online) The impact of line resistance on the recognition accuracy. $R_{\rm L}$ is the line resistance between two adjacent RRAM cells, and $R_{\rm LRS}$ is the average resistance of RRAM cells in the LRS. Systems with more hidden neurons are more sensitive to the IR-drop issue.

• Table 1   Synaptic current levels

  Synapse type          State   Presynaptic fires   Presynaptic rests
  Excitatory synapse    LRS     1                   0
  Excitatory synapse    HRS     0                   0
  Inhibitory synapse    LRS     0                   1
  Inhibitory synapse    HRS     0                   0
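The current levels in Table 1 amount to a small lookup, sketched below (the unit-current normalization and function name are illustrative): only LRS synapses conduct, excitatory ones when the presynaptic neuron fires and inhibitory ones when it rests.

```python
def synaptic_current(kind, state, presyn_fires):
    """Unit current contribution of one binary synapse, per Table 1:
    an excitatory LRS synapse conducts when its presynaptic neuron
    fires, an inhibitory LRS synapse conducts when it rests, and HRS
    synapses never conduct."""
    if state == "HRS":
        return 0
    if kind == "excitatory":
        return 1 if presyn_fires else 0
    if kind == "inhibitory":
        return 0 if presyn_fires else 1
    raise ValueError(f"unknown synapse kind: {kind}")
```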
• Table 2   Resource utilization of digital circuits of the system with different numbers of hidden neurons

  Hidden neurons   Logic utilization (in ALMs)   Total registers
  10               88                            913
  100              441                           1273
  1000             4188                          4861


Copyright 2020 Science China Press Co., Ltd. All rights reserved.