The integration of visual and haptic input can facilitate object recognition. Yet vision may dominate visuo-haptic interactions, as it is more effective than haptics at processing several object features in parallel and at recognizing objects beyond reaching space. The maximum likelihood approach to multisensory integration predicts that haptics, as the less efficient sense for object recognition, gains more from integrating additional visual information than vice versa. To test for asymmetries between vision and touch in visuo-haptic interactions, we measured regional changes in brain activity with functional magnetic resonance imaging while healthy individuals performed a delayed-match-to-sample task. We manipulated whether sample and target objects matched in identity, hypothesizing that only coherent visual and haptic object features would activate unified object representations. The bilateral object-specific lateral occipital cortex, fusiform gyrus, and intraparietal sulcus showed increased activation for crossmodal compared to unimodal matching, but only for congruent object pairs. Critically, the visuo-haptic interaction effects in these regions depended on the sensory modality that processed the target object, being more pronounced for haptic than for visual targets. This preferential response of visuo-haptic regions indicates a modality-specific asymmetry in crossmodal matching of visual and haptic object features, suggesting a functional primacy of vision over touch in visuo-haptic object recognition.