The human brain continuously receives sensory input from the dynamic physical world via various sensory modalities. In many cases, a single physical event generates simultaneous input to more than one modality; for example, a ball hitting the ground generates both visual and auditory input. The human brain has developed mechanisms that exploit the correlations between inputs to different modalities to form a unified and stable percept. Recently, there has been considerable psychophysical, neurophysiological, and computational research interest in the mechanisms underlying crossmodal interactions in general and auditory-visual interactions in particular. The current thesis makes three significant contributions to the field of auditory-visual interactions. First, I designed a comprehensive psychophysical study of the interactions between auditory and visual motion mechanisms for three motion configurations: horizontal, vertical, and motion-in-depth. I showed that simultaneous presentation of a strong motion signal in one modality influences perception of a weak motion signal in the other modality, regardless of whether the weak motion is presented in the visual or the auditory modality. I further observed that crossmodal aftereffects were induced only when subjects adapted to spatial motion in the visual modality, not in the auditory modality. However, adaptation to auditory spectral motion did induce vertical visual motion aftereffects; to my knowledge, this is the first report of auditory-induced visual aftereffects. Second, I conducted psychophysical experiments on the effects of spectral attention on the visual and the auditory motion mechanisms and showed that attention modulates motion mechanisms similarly within the two modalities. Third, I developed a neurophysiologically relevant computational model that offers a possible explanation for crossmodal interactions between the auditory and visual motion mechanisms. In addition, I developed a model that explains the observed experimental findings on the role of spectral attention in modulating motion aftereffects. The results of both model simulations agree closely with the human behavioral data obtained from the experiments.