Aaron Traylor
 Contact
  Premise: All students at Brown University have a "brown.edu" email address.
  Hypothesis: I have a "brown.edu" email address.
  Completely tangentially, my email address is aaron_traylor@<mask>.edu.
Bio
I am a sixth-year PhD student studying computer science at Brown University. I work with Professor Ellie Pavlick in the LUNAR (Language UNderstanding And Representation) Lab, and I am also advised by Professor Roman Feiman of the Brown Language and Thought Lab.
I study the reasoning capabilities of large language models.
As of May 2023, I'm a summer intern for Microsoft Turing. I worked on the team last summer as well!
In the summer of 2019, I worked as an intern at Megagon Labs.
Previously, I worked in Professor Andrew McCallum's Information Extraction and Synthesis Laboratory and collaborated with Nicholas Monath and Rajarshi Das.
Research Interests
My research focuses on how humans, large language models, and other modern machine learning systems use logic. Logical reasoning is fundamental to human cognition, and humans apply it flexibly across a wide variety of contexts. How would a model have to behave for us to say that it has a similar capacity for logical reasoning or inference, especially if it does not explicitly represent logic symbolically? And what is the fundamental source of logic in the human brain: are logical capabilities learned or innate? I approach these questions from computational, philosophical, and human developmental perspectives.
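As a toy illustration of this behavioral framing (a sketch of my own, not taken from any of the papers below), the snippet checks whether a system's true/false judgments on composed sentences agree with the truth tables for AND and OR; the model_judges_true stub is a hypothetical stand-in for a real model.

from itertools import product

# Two toy propositions with fixed truth values in a tiny "world".
WORLD = {"the block is red": True, "the block is heavy": False}

def model_judges_true(sentence: str) -> bool:
    """Hypothetical stand-in for a real model's true/false judgment."""
    if " and " in sentence:
        left, right = sentence.split(" and ", 1)
        return WORLD[left] and WORLD[right]
    if " or " in sentence:
        left, right = sentence.split(" or ", 1)
        return WORLD[left] or WORLD[right]
    return WORLD[sentence]

def consistent_with(judge, connective, op) -> bool:
    # Compare the system's judgments against the connective's truth table.
    for a, b in product(WORLD, repeat=2):
        if judge(f"{a} {connective} {b}") != op(WORLD[a], WORLD[b]):
            return False
    return True

print(consistent_with(model_judges_true, "and", lambda x, y: x and y))  # True
print(consistent_with(model_judges_true, "or", lambda x, y: x or y))    # True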
I am interested in natural language understanding (NLU), computational cognitive science, neuro-symbolic frameworks, textual inference, reinforcement learning, and representation learning.
As a pet project, I am interested in research involving competitive Pokémon. The game is unusual among zero-sum strategy games for several reasons, chiefly its emphasis on the "draft" phase: both players first select their own action space within the game, and only afterwards take actions in the environment. Draft phases appear in other games that AI systems have successfully played (League of Legends, DotA 2, Honor of Kings), but in Pokémon the draft phase scales to an astronomically large state space (roughly 10^200), meaning the game will require novel research before an AI can achieve superhuman strength.
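For a sense of scale, here is a rough back-of-envelope of the team-building space, using counts I am assuming purely for illustration (number of species, move pool sizes, items, and so on); the precise exponent depends on what you count, but any reasonable tally lands at an astronomical number.

from math import comb, log10

SPECIES = 1000                 # assumed number of distinct species
MOVE_POOL = 100                # assumed learnable moves per species (choose 4)
ITEMS = 200                    # assumed number of held items
ABILITIES = 3                  # up to three possible abilities per species
NATURES = 25                   # natures that modify stat growth
EV_SPREADS = comb(127 + 5, 5)  # ~508 EVs, in steps of 4, split across 6 stats
IV_SPREADS = 32 ** 6           # IVs of 0-31 in each of 6 stats

per_slot = comb(MOVE_POOL, 4) * ITEMS * ABILITIES * NATURES * EV_SPREADS * IV_SPREADS
teams = comb(SPECIES, 6) * per_slot ** 6

print(f"~10^{log10(teams):.0f} possible teams under these assumptions")

This counts only team construction; folding in the in-battle decisions and the opponent's hidden choices only grows the space further.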
Education
Brown University. 2018-Ongoing. PhD Candidate. Computer Science.
University of Massachusetts Amherst. 2014-2018. BS. Computer Science.
Publications
Aaron Traylor, Roman Feiman, and Ellie Pavlick. Can Neural Networks Learn Implicit Logic from Physical Reasoning? EMNLP 2023. PDF.
Brandon Prickett, Aaron Traylor, and Joe Pater. Learning Reduplication with a Neural Network that Lacks Explicit Variables. Journal of Language Modelling 2022. PDF.
Aaron Traylor, Roman Feiman, and Ellie Pavlick. AND does not mean OR: Using Formal Languages to Study Language Models’ Representations. ACL 2021. PDF.
Aaron Traylor, Roman Feiman, and Ellie Pavlick. Transferring Representations of Logical Connectives. NALOMA workshop at ACL 2020. PDF.
Nikita Bhutani, Aaron Traylor, Chen Chen, Xiaolan Wang, Behzad Golshan, and Wang-Chiew Tan. SAMPO: Unsupervised Knowledge Base Construction for Opinions and Implications. AKBC 2020. PDF.
Derek Tam, Nicholas Monath, Ari Kobren, Aaron Traylor, Rajarshi Das, and Andrew McCallum. Optimal Transport-Based Alignment of Learned Character Representations for String Similarity. ACL 2019. PDF.
Aaron Traylor, Chen Chen, Behzad Golshan, Xiaolan Wang, Yuliang Li, Yoshihiko Suhara, Jinfeng Li, Cagatay Demiralp, and Wang-Chiew Tan. Enhancing Review Comprehension with Domain-Specific Commonsense. arXiv preprint. PDF.
Brandon Prickett, Aaron Traylor, and Joe Pater. Seq2Seq Models With Dropout Can Generalize Reduplication. SIGMORPHON at EMNLP 2018. PDF.
Aaron Traylor*, Nicholas Monath*, Rajarshi Das, and Andrew McCallum. Learning String Alignments for Entity Aliases. AKBC Workshop at NIPS 2017. (* equal contribution) PDF.
Haw-Shiuan Chang, Abdurrahman Munir, Ao Liu, Johnny Tian-Zheng Wei, Aaron Traylor, Ajay Nagesh, Nicholas Monath, Patrick Verga, Emma Strubell, and Andrew McCallum. Extracting Multilingual Relations under Limited Resources: TAC 2016 Cold-Start KB Construction and Slot-Filling Using Compositional Universal Schema. NIST TAC KBP Workshop 2016. Notebook version: PDF.