Scalable Parallel Algorithms
In addition to tools for parallel programming, we also develop highly scalable parallel algorithms for selected application areas. One example is simulating the evolution of the neural network in the brain. This network is not hardwired: even in the mature brain, new connections between neurons are formed and existing ones are deleted, a phenomenon called structural plasticity. The dynamics of the connectome are key to understanding how learning, memory, and healing after lesions such as stroke work. However, to predict which neurons connect to each other, a naïve algorithm would compute connection probabilities for all pairs of neurons, an effort that grows with the square of the number of neurons. This quadratic growth is prohibitive for large-scale simulations with millions of neurons and beyond. Inspired by hierarchical methods for solving n-body problems in particle physics, we have developed a scalable approximation algorithm that reduces the complexity to O(n log² n) without any notable impact on the quality of the results.
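The hierarchical idea can be illustrated with a minimal sketch (not the published algorithm; kernel shape, tree layout, and all names are illustrative assumptions). A binary tree over neuron positions stores, for each subtree, the number of vacant synaptic elements and their centroid. A source neuron then selects a connection target by descending the tree, weighting each child by its vacant elements and a distance kernel evaluated at its centroid, so each selection costs O(log n) instead of scoring all n candidates:

```python
import math
import random

SIGMA = 0.75  # width of the distance kernel; illustrative value


def kernel(d):
    """Gaussian attraction kernel: nearby neurons are more likely targets."""
    return math.exp(-(d * d) / (SIGMA * SIGMA))


class Node:
    """Binary tree over neurons sorted by 1-D position.

    Each node summarizes its subtree by the total number of vacant
    synaptic elements and their (element-weighted) centroid position.
    """

    def __init__(self, neurons):
        # neurons: list of (position, vacant_elements), sorted by position
        self.vacant = sum(v for _, v in neurons)
        self.centroid = (
            sum(p * v for p, v in neurons) / self.vacant if self.vacant else 0.0
        )
        if len(neurons) == 1:
            self.leaf = neurons[0][0]  # position identifies the neuron here
            self.children = []
        else:
            mid = len(neurons) // 2
            self.leaf = None
            self.children = [Node(neurons[:mid]), Node(neurons[mid:])]


def pick_target(root, source_pos, rng):
    """Descend the tree, choosing a child with probability proportional to
    vacant elements x distance kernel at the child's centroid.

    One selection touches O(log n) nodes instead of all n neurons.
    (Simplification: self-connections are not excluded.)
    """
    node = root
    while node.leaf is None:
        weights = [
            c.vacant * kernel(abs(source_pos - c.centroid)) for c in node.children
        ]
        total = sum(weights)
        if total == 0:
            return None  # no vacant elements anywhere in this subtree
        node = node.children[0] if rng.random() * total < weights[0] else node.children[1]
    return node.leaf


# Toy instance: 16 neurons on a line, each with 0-3 vacant elements.
rng = random.Random(42)
neurons = sorted((rng.uniform(0, 10), rng.randint(0, 3)) for _ in range(16))
root = Node(list(neurons))
target = pick_target(root, source_pos=5.0, rng=rng)
```

The subtree summaries play the same role as the aggregated pseudo-particles in Barnes-Hut-style n-body solvers: distant groups of neurons are treated as a single weighted point rather than inspected individually.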
- Sebastian Rinke, Markus Butz-Ostendorf, Marc-André Hermanns, Mikaël Naveau, Felix Wolf: A Scalable Algorithm for Simulating the Structural Plasticity of the Brain. Journal of Parallel and Distributed Computing, 2018. [PDF] [DOI] [BibTeX]
- Sebastian Rinke, Mikaël Naveau, Felix Wolf, Markus Butz-Ostendorf: Critical Periods Emerge from Homeostatic Structural Plasticity in a Full-Scale Model of the Developing Cortical Column. In: The Rewiring Brain: A Computational Approach to Structural Plasticity in the Adult Brain, Academic Press, San Diego, pages 177-202, 2017. [BibTeX]
- Sebastian Rinke, Markus Butz-Ostendorf, Marc-André Hermanns, Mikaël Naveau, Felix Wolf: A Scalable Algorithm for Simulating the Structural Plasticity of the Brain. In Proc. of the 28th International Symposium on Computer Architecture and High Performance Computing (SBAC-PAD), Los Angeles, CA, USA, pages 1-8, October 2016. [PDF] [DOI] [BibTeX]