Subject relative clauses (SRCs) are typically processed more easily than object relative clauses (ORCs), but this difference is diminished by an inanimate head-noun in semantically non-reversible ORCs (“The book that the boy is reading”). In two eye-tracking experiments, we investigated the influence of animacy on the online processing of semantically reversible SRCs and ORCs using lexically inanimate items that were perceptually animate due to motion (e.g., “Where is the tractor that the cow is chasing”). In Experiment 1, 48 children (aged 4;5–6;4) and 32 adults listened to sentences that varied in the lexical animacy of the NP1 head-noun (Animate/Inanimate) and relative clause (RC) type (SRC/ORC), with an animate NP2, while viewing two images depicting opposite actions. As expected, inanimate head-nouns facilitated the correct interpretation of ORCs in children; however, online data revealed that children were more likely to anticipate an SRC as the RC unfolded when an inanimate head-noun was used, suggesting that processing was sensitive to perceptual animacy. In Experiment 2, we repeated our design with inanimate (rather than animate) NP2s (e.g., “Where is the tractor that the car is following”) to investigate whether our online findings were due to increased visual surprisal at an inanimate entity acting as agent, or to similarity-based interference. We again found greater anticipation of an SRC in the inanimate condition, supporting our surprisal hypothesis. Across the experiments, offline measures showed that lexical animacy influenced children's interpretation of ORCs, whereas online measures revealed that, as RCs unfolded, children were sensitive to the perceptual animacy of lexically inanimate NPs, a sensitivity not reflected in the offline data.
Overall, measures of syntactic comprehension, inhibitory control, verbal short-term memory, and working memory did not predict children's accuracy in RC interpretation, with the exception of a positive correlation with a standardized measure of syntactic comprehension in Experiment 1.
Date made available: 2020
Publisher: Open Science Framework