Object size is represented by functionally distinct sectors along the ventral visual pathway. The early visual cortex encodes objects' sensory-retinal size. Subsequently, the occipitotemporal cortex computes objects' canonical size based on statistical regularities of visual features. Although the neurocomputation of size has been studied in a “bottom-up” sensory-driven framework, little is known about how perceptual size information is transformed into conceptual knowledge and how this computation is modulated by “top-down” goal-driven signals. Using continuous theta burst stimulation, we demonstrated that behavioral goals shape the neurocognitive network underpinning object size. We manipulated the congruency of perceptual versus conceptual object size, which provides a robust behavioral probe of implicit size knowledge. Neurostimulation was targeted at the lateral occipital cortex (LOC), a key region for object perception, or the anterior temporal lobe (ATL), a “hub” of supramodal conceptual processing. We observed striking contextual modulation of the neurocognitive architecture: when human participants judged perceptual size, the congruency effect was significantly attenuated by LOC stimulation but remained resilient to ATL stimulation. By contrast, when they judged conceptual size, both LOC and ATL stimulation eradicated the otherwise robust effect. Our findings demonstrate disparate functional profiles of the LOC and ATL, providing the first evidence of a malleable network that adaptively alters its division of labor in response to top-down task states. The LOC, regardless of task demand, automatically represents “bottom-up” statistical regularities of visual conformation (reflecting typical object size), whereas the ATL contributes to this computation when the context requires semantically based linkage of visual attributes to object recognition.