Multimodal cues can improve behavioral responses by enhancing the detection and localization of sensory cues and by reducing response times. Across species, studies have shown that multisensory integration of visual and olfactory cues can improve response accuracy. However, in real-world settings, sensory cues are often noisy: visual and olfactory cues can be degraded, masked, or mixed, making the target cue less clear to the receiver. In this study, we use an associative learning paradigm (Free-Moving Proboscis Extension Reflex, FMPER) to show that multimodal cues may improve the accuracy of bees' responses to noisy cues. Adding a noisy visual cue improves the accuracy of responses to a noisy olfactory cue, even though neither the clear nor the noisy visual cue is sufficient on its own when paired with a novel olfactory cue. These results may provide insight into the neural mechanisms underlying multimodal processing and into the effects of environmental change on pollination services.