Learning and memory rely on synaptic communication in which intracellular signals are transported to the nucleus to stimulate transcriptional activation. Memory-induced transcriptional increases are accompanied by alterations to the epigenetic landscape an ...
The formation and storage of memories have been under deep investigation for several decades. Nevertheless, the precise contribution of each brain region involved in this process and the interplay between them across memory consolidation are still largely de ...
Memory formation and storage rely on multiple interconnected brain areas, the contribution of which varies during memory consolidation. The medial prefrontal cortex, in particular the prelimbic cortex (PL), was traditionally found to be involved in remote ...
Short-term synaptic plasticity and modulations of the presynaptic vesicle release rate are key components of many working memory models. At the same time, an increasing number of studies suggest a potential role of astrocytes in modulating higher cognitiv ...
Long-term memory formation relies on synaptic plasticity, neuronal activity-dependent gene transcription, and epigenetic modifications. Multiple studies have shown that HDAC inhibitor (HDACi) treatments can enhance individual aspects of these processes and ...
Episodic autobiographical memories are characterized by a spatial context and an affective component. But how do affective and spatial aspects interact? Does affect modulate the way we encode the spatial context of events? We investigated how one element o ...
Where memories are stored in the brain is an age-old question in psychology and neuroscience alike. In particular, whether hippocampus-encoded memories are transferred to the cortex or remain hippocampus-dependent over time has not been definitively answered ...
In this thesis, we propose model order reduction techniques for high-dimensional PDEs that preserve structures of the original problems and develop a closure modeling framework leveraging the Mori-Zwanzig formalism and recurrent neural networks. Since high ...
We consider the problem of training a neural network to store a set of patterns with maximal noise robustness. A solution, in terms of optimal weights and state update rules, is derived by training each individual neuron to perform either kernel classifica ...
Even the largest neural networks make errors, and once-correct predictions can become invalid as the world changes. Model editors make local updates to the behavior of base (pre-trained) models to inject updated knowledge or correct undesirable behaviors. ...