Turing complete neural computation based on synaptic plasticity
Authors:
Jérémie Cabessa
Authors' affiliations:
Laboratory of Mathematical Economics and Applied Microeconomics (LEMMA), University Paris 2 – Panthéon-Assas, 75005 Paris, France; Institute of Computer Science, Czech Academy of Sciences, 18207 Prague 8, Czech Republic
Published in:
PLoS ONE 14(10)
Category:
Research Article
DOI:
https://doi.org/10.1371/journal.pone.0223451
Abstract
In neural computation, the essential information is generally encoded into the neurons via their spiking configurations, activation values or (attractor) dynamics. The synapses and their associated plasticity mechanisms are, by contrast, mainly used to process this information and implement the crucial learning features. Here, we propose a novel Turing complete paradigm of neural computation where the essential information is encoded into discrete synaptic states, and the updating of this information is achieved via synaptic plasticity mechanisms. More specifically, we prove that any 2-counter machine (and hence any Turing machine) can be simulated by a rational-weighted recurrent neural network employing spike-timing-dependent plasticity (STDP) rules. The computational states and counter values of the machine are encoded into discrete synaptic strengths. The transitions between those synaptic weights are then achieved via STDP. These considerations show that a Turing complete synaptic-based paradigm of neural computation is theoretically possible and potentially exploitable. They support the idea that synapses are not only crucially involved in information processing and learning, but also in the encoding of essential information. This approach represents a paradigm shift in the field of neural computation.
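To make the synaptic encoding concrete, the sketch below illustrates the general flavor of this idea in toy Python code: a counter value is held in a single rational weight (here via the illustrative encoding 2^-n), and an STDP-style rule steps a synapse through a small set of discrete states. All names, constants, and the encoding itself are assumptions made for illustration only; this is not the actual network construction proved in the paper.

```python
# A minimal, hypothetical sketch (not the paper's construction): discrete
# synaptic states updated by a toy STDP rule, plus a rational-weight
# encoding of a counter value. All constants and names are illustrative.

from dataclasses import dataclass

@dataclass
class Synapse:
    """A synapse whose strength ranges over a small set of discrete rational states."""
    weight: float = 0.5
    levels: tuple = (0.25, 0.5, 0.75, 1.0)  # admissible discrete synaptic states

    def stdp_update(self, t_pre: float, t_post: float) -> None:
        """Step to the next state if the pre-synaptic spike precedes the
        post-synaptic one (potentiation), and to the previous state otherwise
        (depression): an STDP-style rule restricted to discrete states."""
        i = self.levels.index(self.weight)
        if t_post > t_pre and i < len(self.levels) - 1:
            self.weight = self.levels[i + 1]  # causal pairing: potentiate
        elif t_post < t_pre and i > 0:
            self.weight = self.levels[i - 1]  # anti-causal pairing: depress

# A counter value n can be stored in one rational weight, e.g. as 2**-n,
# so that increment/decrement become simple multiplicative weight updates
# and the zero test becomes a threshold test on the weight.

def encode(n: int) -> float:
    return 2.0 ** -n

def increment(w: float) -> float:
    return w / 2.0            # n -> n + 1

def decrement(w: float) -> float:
    return min(w * 2.0, 1.0)  # n -> max(n - 1, 0)

def is_zero(w: float) -> bool:
    return w == 1.0           # weight 1.0 encodes counter value 0

if __name__ == "__main__":
    s = Synapse()
    s.stdp_update(t_pre=1.0, t_post=3.0)  # pre before post -> potentiation
    print(s.weight)                       # 0.75

    w = encode(0)
    w = increment(increment(w))           # counter now holds 2
    print(w, is_zero(w))                  # 0.25 False
```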
Keywords:
Action potentials – Language – Machine learning algorithms – Neurons – Recurrent neural networks – Synapses – Synaptic plasticity – Neural pathways