Investigating how language representations and computations in humans compare to those in artificial models
This line of work investigates the extent to which we, as humans, share representations and computational principles with artificial neural network models, even though these models emerged in entirely different ways (a minimal sketch of the kind of encoding analysis typically used to quantify this alignment follows the publication list below).
Tuckute, G., Kanwisher, N., Fedorenko, E. (2024): Language in Brains, Minds, and Machines, Annual Review of Neuroscience 47, doi: https://doi.org/10.1146/annurev-neuro-120623-101142.
AlKhamissi, B., Tuckute, G., Bosselut^, A., Schrimpf^, M. (2024): Brain-Like Language Processing via a Shallow Untrained Multihead Attention Network, arXiv; doi: https://doi.org/10.48550/arXiv.2406.15109.
Tuckute, G., Finzi, D., Margalit, E., Zylberberg, J., Chung, SY., Fyshe, A., Fedorenko, E., Kriegeskorte, N., Yates, J., Grill-Spector, K., Kar, K. (2024): How to optimize neuroscience data utilization and experiment design for advancing primate visual and linguistic brain models?, arXiv; doi: https://doi.org/10.48550/arXiv.2401.03376.
Tucker*, M., Tuckute*, G. (2023): Increasing Brain-LLM Alignment via Information-Theoretic Compression, 37th Conference on Neural Information Processing Systems (NeurIPS 2023), UniReps Workshop, url: https://openreview.net/forum?id=WcfVyzzJOS.
Tuckute*, G., Feather*, J., Boebinger, D., McDermott, J. (2023): Many but not all deep neural network audio models capture brain responses and exhibit correspondence between model stages and brain regions, PLoS Biology 21(12), doi: https://doi.org/10.1371/journal.pbio.3002366.
Kauf*, C., Tuckute*, G., Levy, R., Andreas, J., Fedorenko, E. (2023): Lexical semantic content, not syntactic structure, is the main contributor to ANN-brain similarity of fMRI responses in the language network, Neurobiology of Language 5(1), doi: https://doi.org/10.1162/nol_a_00116.
Schrimpf, M., Blank*, I., Tuckute*, G., Kauf*, C., Hosseini, E. A., Kanwisher, N., Tenenbaum^, J., Fedorenko^, E. (2021): The neural architecture of language: Integrative modeling converges on predictive processing, PNAS 118(45), doi: https://doi.org/10.1073/pnas.2105646118.
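Much of the alignment work above relies, in one form or another, on an encoding-model analysis: model activations for a set of stimuli are mapped onto measured brain responses with regularized linear regression, and the quality of that mapping on held-out stimuli serves as the alignment score (e.g., the "neural predictivity" metric in Schrimpf et al., 2021). The sketch below is only a generic illustration of that recipe on randomly generated placeholder data; the variable names and parameter choices are assumptions for illustration, not the exact pipeline of any paper listed here.

```python
# A generic, illustrative sketch of the encoding-model ("neural predictivity") analysis
# commonly used to quantify brain-model alignment. All data below are random
# placeholders; this is not the exact pipeline of any publication listed above.
import numpy as np
from scipy.stats import pearsonr
from sklearn.linear_model import RidgeCV
from sklearn.model_selection import KFold

rng = np.random.default_rng(0)
n_sentences, n_model_units, n_voxels = 200, 768, 50

# Placeholder "ANN activations" (one feature vector per sentence) and simulated
# "fMRI responses" that are a noisy linear readout of those activations.
X = rng.standard_normal((n_sentences, n_model_units))
Y = X @ rng.standard_normal((n_model_units, n_voxels)) * 0.05 \
    + rng.standard_normal((n_sentences, n_voxels))

scores = np.zeros(n_voxels)
kfold = KFold(n_splits=5, shuffle=True, random_state=0)
for train_idx, test_idx in kfold.split(X):
    # Map model features to each voxel with cross-validated ridge regression...
    mapping = RidgeCV(alphas=np.logspace(-2, 4, 7)).fit(X[train_idx], Y[train_idx])
    Y_pred = mapping.predict(X[test_idx])
    # ...and score held-out predictions with Pearson correlation, averaged over folds.
    scores += np.array([pearsonr(Y_pred[:, v], Y[test_idx, v])[0]
                        for v in range(n_voxels)]) / kfold.get_n_splits()

print(f"Mean cross-validated predictivity across simulated voxels: {scores.mean():.3f}")
```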
Development of biologically plausible, principled artificial models that perform linguistic computations more like humans
This line of work uses insights from human language processing to build more biologically plausible neural network models and to develop tools for comparing representations and outputs between humans and neural networks (this is a relatively new direction, so most of these projects are in preparation or under review).
BinHuraib, T., Tuckute, G., Blauch, N. M. (2024): Topoformer: brain-like topographic organization in Transformer language models through spatial querying and reweighting, International Conference on Learning Representations (ICLR 2024), Re-Align Workshop, url: https://openreview.net/forum?id=3pLMzgoZSA.
Wolf, L., Tuckute, G., Kotar, K., Hosseini, E., Regev, E., Wilcox, E., Warstadt, A. (2023): WhisBERT: Multimodal Text-Audio Language Modeling on 100M Words, Proceedings of the 2023 Conference on Empirical Methods in Natural Language Processing (EMNLP), CoNLL-CMCL Shared Task: BabyLM Challenge, doi: https://doi.org/10.48550/arXiv.2312.02931.
Tuckute*, G., Sathe*, A., Wang, M., Yoder, H., Shain, C., Fedorenko, E. (2022): SentSpace: Large-scale benchmarking and evaluation of text using cognitively motivated lexical, syntactic, and semantic features, Proceedings of the 2022 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies: System Demonstrations, pages 99–113, Association for Computational Linguistics (ACL), url: https://aclanthology.org/2022.naacl-demo.11.
Earlier Research
I was introduced to the intersection of neuroscience and artificial intelligence through the domain of vision, where I worked on decoding semantic features from EEG signatures (Tuckute et al., 2019: Single Trial Decoding of Scalp EEG Under Natural Conditions) and on decoding attentional states using real-time EEG neurofeedback (Tuckute et al., 2021: Real-Time Decoding of Attentional States Using Closed-Loop EEG Neurofeedback).
Before that, in late high school, I was fascinated by quantum physics. I did one project on quantum tunneling in Bose-Einstein condensates and another on sequential storage and readout of laser light in a diamond for quantum relays (supervised by Dr. Jacob Broe and Dr. Klaus Moelmer). I was also a finalist in two national research competitions: “The Junior Researcher's Project” by the University of Copenhagen (December 2012) and the “Young Researchers” competition by Danish Science Factory (April 2013).