Dr. Robert Legenstein

Full Professor

Institute Head
Institute of Machine Learning and Neural Computation
Graz University of Technology


Speaker of the Graz Center for Machine Learning

Action Editor for TMLR




Publications

[82] Anand Subramoney, Guillaume Bellec, Franz Scherr, Robert Legenstein, and Wolfgang Maass.
Fast learning without synaptic plasticity in spiking neural networks.
Scientific Reports, 14(1):8557, 2024. (Link to PDF)

[81] Maximilian Baronig and Robert Legenstein.
Context association in pyramidal neurons through local synaptic plasticity in apical dendrites.
Frontiers in Neuroscience, 17:1276706, 2024. (Link to PDF)

[80] Ozan Özdenizci and Robert Legenstein.
Adversarially robust spiking neural networks through conversion.
arXiv preprint arXiv:2311.09266, 2023. (Link to PDF)

[79] Thomas Limbacher, Ozan Özdenizci, and Robert Legenstein.
Memory-dependent computation and learning in spiking neural networks through Hebbian plasticity.
IEEE Transactions on Neural Networks and Learning Systems, 2023. (Link to PDF)

[78] Romain Ferrand, Maximilian Baronig, Thomas Limbacher, and Robert Legenstein.
Context-dependent computations in spiking neural networks with apical modulation. In
International Conference on Artificial Neural Networks, pages 381-392. Springer, 2023. (Link to PDF)

[77] Francisco Javier Klaiber Aboitiz, Robert Legenstein, and Ozan Özdenizci.
Interaction of generalization and out-of-distribution detection capabilities in deep neural networks. In
32nd International Conference on Artificial Neural Networks (ICANN), 2023.

[76] Adam Sebestyen, Ozan Özdenizci, Robert Legenstein, and Urs Hirschberg.
Generating conceptual architectural 3D geometries with denoising diffusion models. In
41st eCAADe Conference: Education and Research in Computer Aided Architectural Design in Europe, 2023.

[75] Horst Petschenig and Robert Legenstein.
Quantized rewiring: hardware-aware training of sparse deep neural networks.
Neuromorphic Computing and Engineering, 3(2):024006, 2023.

[74] Ceca Kraisnikovic, Spyros Stathopoulos, Themis Prodromakis, and Robert Legenstein.
Fault pruning: Robust training of neural networks with memristive weights. In
20th International Conference on Unconventional Computation and Natural Computation, 2023.

[73] Ozan Özdenizci and Robert Legenstein.
Restoring vision in adverse weather conditions with patch-based denoising diffusion models.
IEEE Transactions on Pattern Analysis and Machine Intelligence, 2023. (Link to arXiv PDF)

[72] Thomas Limbacher, Ozan Özdenizci, and Robert Legenstein.
Memory-enriched computation and learning in spiking neural networks through Hebbian plasticity.
arXiv preprint arXiv:2205.11276, 2022. (Link to arXiv PDF)

[71] Horst Petschenig, Marta Bisio, Marta Maschietto, Alessandro Leparulo, Robert Legenstein, and Stefano Vassanelli.
Classification of whisker deflections from evoked responses in the somatosensory barrel cortex with spiking neural networks.
Frontiers in Neuroscience, 16, 2022. (Link to PDF)

[70] Ozan Özdenizci and Robert Legenstein.
Improving robustness against stealthy weight bit-flip attacks by output code matching. In
Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition (CVPR), 2022. (Link to PDF)

[69] Agnes Korcsak-Gorzo, Michael G Müller, Andreas Baumbach, Luziwei Leng, Oliver J Breitwieser, Sacha J van Albada, Walter Senn, Karlheinz Meier, Robert Legenstein, and Mihai A Petrovici.
Cortical oscillations support sampling-based computations in spiking neural networks.
PLoS Computational Biology, 18(3):e1009753, 2022. (Link to PDF)

[68] Jyotibdha Acharya, Arindam Basu, Robert Legenstein, Thomas Limbacher, Panayiota Poirazi, and Xundong Wu.
Dendritic computing: Branching deeper into machine learning.
Neuroscience, 2021. (Link to PDF)

[67] Ceca Kraisnikovic, Wolfgang Maass, and Robert Legenstein.
Spike-based symbolic computations on bit strings and numbers. In
Neuro-Symbolic Artificial Intelligence: The State of the Art, pages 214-234. IOS Press, 2021. (Paper link; link to bioRxiv PDF)

[66] D. Salaj, A. Subramoney, C. Kraisnikovic, G. Bellec, R. Legenstein, and W. Maass.
Spike-frequency adaptation supports network computations on temporally dispersed information.
eLife, 10:e65459, 2021. (PDF). (Supplementary material PDF; link to eLife version; link to bioRxiv version PDF)

[65] Manuel Traub, Robert Legenstein, and Sebastian Otte.
Many-joint robot arm control with recurrent spiking neural networks. In
2021 IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS), pages 4918-4925, 2021. (Link to PDF)

[64] Manuel Traub, Martin V. Butz, Robert Legenstein, and Sebastian Otte.
Dynamic action inference with recurrent spiking neural networks. In
30th International Conference on Artificial Neural Networks (ICANN 2021), pages 233-244, 2021. (Link to PDF)

[63] Ozan Özdenizci and Robert Legenstein.
Training adversarially robust sparse networks via Bayesian connectivity sampling. In
Proceedings of the 38th International Conference on Machine Learning, Marina Meila and Tong Zhang, editors, volume 139 of
Proceedings of Machine Learning Research, pages 8314-8324. PMLR, 18-24 Jul 2021. (Link to PDF)

[62] Anand Subramoney, Guillaume Bellec, Franz Scherr, Robert Legenstein, and Wolfgang Maass.
Revisiting the role of synaptic plasticity and network dynamics for fast learning in spiking neural networks.
bioRxiv, 2021. (Link to PDF)

[61] A. Rao, R. Legenstein, A. Subramoney, and W. Maass.
A normative framework for learning top-down predictions through synaptic plasticity in apical dendrites.
bioRxiv 2021/433822, 2021. (PDF).

[60] Thomas Limbacher and Robert Legenstein.
H-mem: Harnessing synaptic plasticity with Hebbian memory networks.
Advances in Neural Information Processing Systems 33, 2020. (Link to PDF in pre-proceedings)

[59] G. Bellec, F. Scherr, A. Subramoney, E. Hajek, D. Salaj, R. Legenstein, and W. Maass.
A solution to the learning dilemma for recurrent networks of spiking neurons.
Nature Communications, 11:3625, 2020. (PDF). (Supplementary material PDF; supplementary movies PDF). (Commentary by Manneschi, L. & Vasilaki, E. (2020). An alternative to backpropagation through time. Nature Machine Intelligence, 2(3), 155-156. PDF)

[58] Christophe Verbist, Michael G Müller, Huibert D Mansvelder, Robert Legenstein, and Michele Giugliano.
The location of the axon initial segment affects the bandwidth of spike initiation dynamics.
PLOS Computational Biology, 16(7):e1008087, 2020. (Link to journal PDF)

[57] T. Limbacher and R. Legenstein.
Emergence of stable synaptic clusters on dendrites through synaptic rewiring.
Frontiers in Computational Neuroscience, 14:57, 2020. (Link to journal PDF)

[56] M. G. Müller, C. H. Papadimitriou, W. Maass, and R. Legenstein.
A model for structured information representation in neural networks of the brain.
eNeuro, 7(3), 2020. (Journal link to PDF)

[55] W. Maass, C.H. Papadimitriou, S. Vempala, and R. Legenstein.
Brain computation: a computer science perspective. In
Computing and Software Science (LNCS 10000), pages 184-199. Springer, 2019. (Link to PDF)

[54] J. Kaiser, M. Hoff, A. Konle, J. C. V. Tieck, D. Kappel, D. Reichard, A. Subramoney, R. Legenstein, A. Roennau, W. Maass, and R. Dillmann.
Embodied synaptic plasticity with online reinforcement learning.
Frontiers in Neurorobotics, 13(81), 2019. (PDF). (Journal link to PDF)

[53] C. Pokorny, M. J. Ison, A. Rao, R. Legenstein, C. Papadimitriou, and W. Maass.
STDP forms associations between memory traces in networks of spiking neurons.
Cerebral Cortex, 30(3):952-968, 2020. (PDF). (Supplementary material PDF). (Journal link to PDF)

[52] Y. Yan, D. Kappel, F. Neumärker, J. Partzsch, B. Vogginger, S. Höppner, S. Furber, W. Maass, R. Legenstein, and C. Mayr.
Efficient reward-based structural plasticity on a SpiNNaker 2 prototype.
IEEE Transactions on Biomedical Circuits and Systems, 13(3):579-591, June 2019. (PDF).

[51] G. Bellec, F. Scherr, E. Hajek, D. Salaj, R. Legenstein, and W. Maass.
Biologically inspired alternatives to backpropagation through time for learning in recurrent neural nets.
arXiv:1901.09049, January 2019. (PDF).

[50] C. Liu, G. Bellec, B. Vogginger, D. Kappel, J. Partzsch, F. Neumärker, S. Höppner, W. Maass, S. B. Furber, R. Legenstein, and C. G. Mayr.
Memory-efficient deep learning on a SpiNNaker 2 prototype.
Frontiers in Neuroscience, 2018. (PDF).

[49] G. Bellec, D. Salaj, A. Subramoney, R. Legenstein, and W. Maass.
Long short-term memory and learning-to-learn in networks of spiking neurons.
32nd Conference on Neural Information Processing Systems (NIPS 2018), Montreal, Canada, 2018. (PDF).

[48] R. Legenstein, W. Maass, C. H. Papadimitriou, and S. S. Vempala.
Long term memory and the densest K-subgraph problem.
In Proc. of Innovations in Theoretical Computer Science (ITCS), 2018. (PDF).

[47] G. Bellec, D. Kappel, W. Maass, and R. Legenstein.
Deep rewiring: training very sparse deep networks.
International Conference on Learning Representations (ICLR), 2018. (PDF).

[46] R. Legenstein, Z. Jonke, S. Habenschuss, and W. Maass.
A probabilistic model for learning in cortical microcircuit motifs with data-based divisive inhibition.
arXiv:1707.05182, 2017. (PDF).

[45] Z. Jonke, R. Legenstein, S. Habenschuss, and W. Maass.
Feedback inhibition shapes emergent computational properties of cortical microcircuit motifs.
Journal of Neuroscience, 37(35):8511-8523, 2017. (PDF).

[44] D. Kappel, R. Legenstein, S. Habenschuss, M. Hsieh, and W. Maass.
A dynamic connectome supports the emergence of stable computational function of neural circuits through reward-based learning.
eNeuro, 2 April 2018. (PDF).

[43] M. A. Petrovici, S. Schmitt, J. Klähn, D. Stöckel, A. Schroeder, G. Bellec, J. Bill, O. Breitwieser, I. Bytschok, A. Grübl, M. Güttler, A. Hartel, S. Hartmann, D. Husmann, K. Husmann, S. Jeltsch, V. Karasenko, M. Kleider, C. Koke, A. Kononov, C. Mauch, P. Müller, J. Partzsch, T. Pfeil, S. Schiefer, S. Scholze, A. Subramoney, V. Thanasoulis, B. Vogginger, R. Legenstein, W. Maass, R. Schüffny, C. Mayr, J. Schemmel, and K. Meier.
Pattern representation and recognition with accelerated analog neuromorphic systems.
arXiv:1703.06043, 2017. (PDF).

[42] S. Schmitt, J. Klähn, G. Bellec, A. Grübl, M. Güttler, A. Hartel, S. Hartmann, D. Husmann, K. Husmann, S. Jeltsch, V. Karasenko, M. Kleider, C. Koke, A. Kononov, C. Mauch, E. Müller, P. Müller, J. Partzsch, M. A. Petrovici, S. Schiefer, S. Scholze, V. Thanasoulis, B. Vogginger, R. Legenstein, W. Maass, C. Mayr, R. Schüffny, J. Schemmel, and K. Meier.
Neuromorphic hardware in the loop: Training a deep spiking network on the BrainScaleS Wafer-Scale System. In
IEEE International Joint Conference on Neural Networks (IJCNN) 2017, pages 2227-2234, 2017. (PDF).

[41] R. Legenstein, C. H. Papadimitriou, S. Vempala, and W. Maass.
Assembly pointers for variable binding in networks of spiking neurons.
arXiv preprint arXiv:1611.03698, 2016. (PDF).

[40] A. Serb, J. Bill, A. Khiat, R. Berdan, R. Legenstein, and T. Prodromakis.
Unsupervised learning in probabilistic neural networks with multi-state metal-oxide memristive synapses.
Nature Communications, 7:12611, 2016. (Journal link to the PDF)

[39] Z. Yu, D. Kappel, R. Legenstein, S. Song, F. Chen, and W. Maass.
CaMKII activation supports reward-based neural network optimization through Hamiltonian sampling.
arXiv:1606.00157v2, 2016. (Link to the PDF)

[38] D. Kappel, S. Habenschuss, R. Legenstein, and W. Maass.
Synaptic sampling: A Bayesian approach to neural network plasticity and rewiring. In
Advances in Neural Information Processing Systems 28, C. Cortes, N. D. Lawrence, D. D. Lee, M. Sugiyama, and R. Garnett, editors, pages 370-378. Curran Associates, Inc., 2015. (PDF).

[37] J. Bill, L. Buesing, S. Habenschuss, B. Nessler, W. Maass, and R. Legenstein.
Distributed Bayesian computation and self-organized learning in sheets of spiking neurons with local lateral inhibition.
PLOS ONE, 10(8):e0134356, 2015. (Journal link to the PDF)

[36] R. Legenstein.
Nanoscale connections for brain-like circuits.
Nature, 521:37-38, 2015. (PDF).

[35] D. Kappel, S. Habenschuss, R. Legenstein, and W. Maass.
Network plasticity as Bayesian inference.
PLOS Computational Biology, 11(11):e1004485, 2015. (Journal link to the PDF)

[34] R. Legenstein.
Recurrent network models, reservoir computing. In
Encyclopedia of Computational Neuroscience, pages 1-5. Springer New York, 2014.

[33] J. Bill and R. Legenstein.
A compound memristive synapse model for statistical learning through STDP in spiking neural networks.
Frontiers in Neuroscience, 8(214):1-18, 2014. (Journal link to PDF)

[32] R. Legenstein and W. Maass.
Ensembles of spiking neurons with noise support optimal probabilistic inference in a dynamically changing environment.
PLOS Computational Biology, 10(10):e1003859, 2014. (Journal link to the PDF)

[31] A. V. Blackman, S. Grabuschnig, R. Legenstein, and P. J. Sjöström.
A comparison of manual neuronal reconstruction from biocytin histology or 2-photon imaging: morphometry and computer modeling.
Frontiers in Neuroanatomy, 8, 2014. (Journal link to the PDF)

[30] G. Indiveri, B. Linares-Barranco, R. Legenstein, G. Deligeorgis, and T. Prodromakis.
Integration of nanoscale memristor synapses in neuromorphic computing architectures.
Nanotechnology, 24:384010, 2013. (PDF).

[29] G. M. Hoerzer, R. Legenstein, and Wolfgang Maass.
Emergence of complex computational structures from chaotic neural networks through reward-modulated Hebbian learning.
Cerebral Cortex, 24:677-690, 2014. (PDF). (Supplementary material PDF)

[28] R. Legenstein and W. Maass.
Branch-specific plasticity enables self-organization of nonlinear computation in single neurons.
The Journal of Neuroscience, 31(30):10787-10802, 2011. (PDF). (Commentary by R. P. Costa and P. J. Sjöström in Frontiers in Synaptic Neuroscience PDF)

[27] R. Legenstein, N. Wilbert, and L. Wiskott.
Reinforcement learning on slow features of high-dimensional input streams.
PLoS Computational Biology, 6(8):e1000894, 2010. (PDF).

[26] M. Jahrer, A. Töscher, and R. Legenstein.
Combining predictions for accurate recommender systems. In
KDD '10: Proceedings of the 16th ACM SIGKDD international conference on Knowledge discovery and data mining, pages 693-702, New York, NY, USA, 2010. ACM. (PDF).

[25] R. Legenstein, S. M. Chase, A. B. Schwartz, and W. Maass.
A reward-modulated Hebbian learning rule can explain experimentally observed network reorganization in a brain control task.
The Journal of Neuroscience, 30(25):8400-8410, 2010. (PDF).

[24] R. Legenstein, S. A. Chase, A. B. Schwartz, and W. Maass.
Functional network reorganization in motor cortex can be explained by reward-modulated Hebbian learning. In
Proc. of NIPS 2009: Advances in Neural Information Processing Systems, D. Koller, D. Schuurmans, Y. Bengio, and L. Bottou, editors, volume 22, pages 1105-1113. MIT Press, 2010. (PDF).

[23] L. Buesing, B. Schrauwen, and R. Legenstein.
Connectivity, dynamics, and memory in reservoir computing with binary and analog neurons.
Neural Computation, 22(5):1272-1311, 2010. (PDF).

[22] B. Schrauwen, L. Buesing, and R. Legenstein.
On computational power and the order-chaos phase transition in reservoir computing. In
Proc. of NIPS 2008, Advances in Neural Information Processing Systems, volume 21, pages 1425-1432. MIT Press, 2009. (PDF).

[22b] B. Schrauwen, L. Buesing, and R. Legenstein.
Supplementary material to: On computational power and the order-chaos phase transition in reservoir computing. In
Proc. of NIPS 2008, Advances in Neural Information Processing Systems, volume 21. MIT Press, 2009. (PDF).

[21] Andreas Töscher, Michael Jahrer, and Robert Legenstein.
Improved neighborhood-based algorithms for large-scale recommender systems. In
KDD-Cup and Workshop. ACM, 2008. (PDF).

[20] R. Legenstein, D. Pecevski, and W. Maass.
A learning theory for reward-modulated spike-timing-dependent plasticity with application to biofeedback.
PLoS Computational Biology, 4(10):e1000180, 2008. (Journal link to the PDF)

[19] R. Legenstein, D. Pecevski, and W. Maass.
Theoretical analysis of learning with reward-modulated spike-timing-dependent plasticity. In
Proc. of NIPS 2007, Advances in Neural Information Processing Systems, volume 20, pages 881-888. MIT Press, 2008. (PDF).

[18] S. Klampfl, R. Legenstein, and W. Maass.
Spiking neurons can learn to solve information bottleneck problems and extract independent components.
Neural Computation, 21(4):911-959, 2009. (PDF).

[17] R. Legenstein and W. Maass.
On the classification capability of sign-constrained perceptrons.
Neural Computation, 20(1):288-309, 2008. (PDF).

[16] S. Klampfl, R. Legenstein, and W. Maass.
Information bottleneck optimization and independent component extraction with spiking neurons. In
Proc. of NIPS 2006, Advances in Neural Information Processing Systems, volume 19, pages 713-720. MIT Press, 2007. (PDF).

[15] R. Legenstein and W. Maass.
Edge of chaos and prediction of computational performance for neural circuit models.
Neural Networks, 20(3):323-334, 2007. (PDF).

[14] R. Legenstein and W. Maass.
What makes a dynamical system computationally powerful? In
New Directions in Statistical Signal Processing: From Systems to Brains, S. Haykin, J. C. Principe, T.J. Sejnowski, and J.G. McWhirter, editors, pages 127-154. MIT Press, 2007. (PDF).

[13] R. Legenstein, C. Naeger, and W. Maass.
What can a neuron learn with spike-timing-dependent plasticity?
Neural Computation, 17(11):2337-2382, 2005. (PDF).

[13a] R. Legenstein and W. Maass.
Additional material to the paper: What can a neuron learn with spike-timing-dependent plasticity? Technical report, Institute for Theoretical Computer Science, Graz University of Technology, 2004. (PDF).

[12] R. Legenstein and W. Maass.
A criterion for the convergence of learning with spike timing dependent plasticity. In
Advances in Neural Information Processing Systems, Y. Weiss, B. Schoelkopf, and J. Platt, editors, volume 18, pages 763-770. MIT Press, 2006. (PDF).

[11] T. Natschläger, N. Bertschinger, and R. Legenstein.
At the edge of chaos: Real-time computations and self-organized criticality in recurrent neural networks. In
Advances in Neural Information Processing Systems 17, Lawrence K. Saul, Yair Weiss, and Léon Bottou, editors, pages 145-152. MIT Press, Cambridge, MA, 2005. (PDF).

[10] W. Maass, R. Legenstein, and N. Bertschinger.
Methods for estimating the computational power and generalization capability of neural microcircuits. In
Advances in Neural Information Processing Systems, L. K. Saul, Y. Weiss, and L. Bottou, editors, volume 17, pages 865-872. MIT Press, 2005. (PDF).

[9] R. A. Legenstein and W. Maass.
Wire length as a circuit complexity measure.
Journal of Computer and System Sciences, 70:53-72, 2005. (PDF).

[8] R. Legenstein, H. Markram, and W. Maass.
Input prediction and autonomous movement analysis in recurrent circuits of spiking neurons.
Reviews in the Neurosciences (Special Issue on Neuroinformatics of Neural and Artificial Computation), 14(1-2):5-19, 2003. (PDF).

[7] W. Maass, R. Legenstein, and H. Markram.
A new approach towards vision suggested by biologically realistic neural microcircuit models. In
Biologically Motivated Computer Vision. Proc. of the Second International Workshop, BMCV 2002, Tübingen, Germany, November 22-24, 2002, H. H. Bülthoff, S. W. Lee, T. A. Poggio, and C. Wallraven, editors, volume 2525 of
Lecture Notes in Computer Science, pages 282-293. Springer (Berlin), 2002. (PDF).

[6] R. A. Legenstein.
The Wire-Length Complexity of Neural Networks. PhD thesis, Graz University of Technology, 2002. (PDF).

[5] R. A. Legenstein and W. Maass.
Neural circuits for pattern recognition with small total wire length.
Theoretical Computer Science, 287:239-249, 2002. (PDF).

[4] R. A. Legenstein.
On the complexity of knock-knee channel routing with 3-terminal nets.
Technical Report, 2002. (PDF).

[3] R. A. Legenstein and W. Maass.
Optimizing the layout of a balanced tree.
Technical Report, 2001. (PDF).

[2] R. A. Legenstein and W. Maass.
Foundations for a circuit complexity theory of sensory processing. In
Proc. of NIPS 2000, Advances in Neural Information Processing Systems, T. K. Leen, T. G. Dietterich, and V. Tresp, editors, volume 13, pages 259-265, Cambridge, 2001. MIT Press. (PDF).

[1] R. A. Legenstein.
Effizientes Layout von Neuronalen Netzen [Efficient Layout of Neural Networks]. Master's thesis, Technische Universität Graz, September 1999.

[-] R. Legenstein, S. A. Chase, A. B. Schwartz, and W. Maass.
A model for learning effects in motor cortex that may facilitate the brain control of neuroprosthetic devices.
38th Annual Conference of the Society for Neuroscience, Program 517.6, 2008.

[-] R. Legenstein and W. Maass.
An integrated learning rule for branch strength potentiation and STDP.
39th Annual Conference of the Society for Neuroscience, Program 895.20, Poster HH36, 2009.


Contact

Dr. Robert Legenstein
Institute of Machine Learning and Neural Computation
Inffeldgasse 16b/I
8010 Graz
Austria

+43 / 316 873 5824
robert.legenstein@tugraz.at