Summary

High-level cognition inevitably involves multiple component processes, which are difficult to distinguish at the neural level. We apply model-guided componential analysis to disaggregate components of verbal analogical reasoning, a hallmark of human intelligence. This approach integrates a sequential task design with representational and encoding analyses of fMRI data. The analyses were guided by three computational models of lexical and relation semantics that vary in the specificity of their relation representations. Word2vec-concat is nonrelational (based solely on individual word meanings); Word2vec-diff computes the generic relation between any word pair; and BART derives relational similarity from a set of learned abstract semantic relations (e.g., synonym, antonym, cause-effect). The predictions derived from BART, based on its learned relations, showed the strongest correlation with neural activity in regions including the left posterior parietal cortex (during both relation representation and relation comparison) and rostrolateral prefrontal cortex (during relation comparison). Model-guided componential analysis shows promise as an approach to discovering the neural basis of propositional thought.
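For concreteness, the minimal sketch below illustrates how the three pair representations differ, using invented toy word vectors. Word2vec-concat concatenates the two word vectors, Word2vec-diff takes their element-wise difference, and the BART-like code is approximated here by a linear projection onto a small set of "learned relation" dimensions. The toy vectors, the projection matrix `W`, and the `relational_similarity` helper are hypothetical stand-ins for illustration only, not the actual BART model or the paper's fMRI analysis pipeline.

```python
import numpy as np

# Toy word vectors (in the paper, lexical vectors come from Word2vec);
# these 4-dimensional values are invented purely for illustration.
vec = {
    "hot":   np.array([0.9, 0.1, 0.3, 0.2]),
    "cold":  np.array([0.1, 0.9, 0.3, 0.2]),
    "tall":  np.array([0.8, 0.2, 0.7, 0.1]),
    "short": np.array([0.2, 0.8, 0.7, 0.1]),
}

def concat_rep(a, b):
    """Word2vec-concat: nonrelational pair code -- the two word vectors side by side."""
    return np.concatenate([vec[a], vec[b]])

def diff_rep(a, b):
    """Word2vec-diff: a generic relation code -- the element-wise difference of the vectors."""
    return vec[a] - vec[b]

def bart_like_rep(a, b, relation_weights):
    """BART-style code (simplified sketch): project the word pair onto a set of
    abstract relation dimensions (e.g., synonym, antonym), one score per relation."""
    pair = np.concatenate([vec[a], vec[b]])
    return relation_weights @ pair

def relational_similarity(r1, r2):
    """Cosine similarity between two pair representations, used to compare
    the relation in one pair (A:B) with the relation in another (C:D)."""
    return r1 @ r2 / (np.linalg.norm(r1) * np.linalg.norm(r2))

# Hypothetical 'learned relations' matrix (2 relations x 8 pair dimensions).
W = np.random.default_rng(0).normal(size=(2, 8))

# Compare the hot:cold pair with the tall:short pair under each representation.
for name, rep in [("concat", concat_rep),
                  ("diff", diff_rep),
                  ("BART-like", lambda a, b: bart_like_rep(a, b, W))]:
    sim = relational_similarity(rep("hot", "cold"), rep("tall", "short"))
    print(f"{name:10s} similarity(hot:cold, tall:short) = {sim:.2f}")
```

Under this toy setup, the similarity scores produced by each representation would serve as candidate predictors of neural pattern similarity; the paper's analyses use the models' own representations rather than this simplified projection.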