Young soccer players are taught to communicate verbally with their teammates. For instance, when a player is receiving the ball with his back to goal, teammates will yell “Turn” or “Man on” to assist the player with his first touch on the ball.
Several years ago, while watching a Long Island University soccer practice, I saw head coach T.J. Kostecky emphasize players taking a quick look over their shoulder before they received the pass. He wanted players not to yell at each other, but instead for the receiver to use his vision to enhance his decision-making with his first touch.
Basketball coaches certainly want their players to talk, but most emphasize communication on defense. There is less instruction from a teammate when a player receives a pass. However, cutters often will yell “Trailer!” when filling the lane on a fast break, or name their cut when using a screen (“Fade!”). This communication is intended to improve decision-making and court vision.
In Embodied Wisdom: The Collected Papers of Moshe Feldenkrais, Feldenkrais introduces an interesting idea with regard to learning and court vision:
“As a child begins to be trained in reading and writing, his hearing is gradually withdrawn from most of the space around him. He learns to pay increasing attention, sometimes exclusively, to that sector of space which he sees. In general, it is the case that we see only a small part of the space around us, even though in hearing we hear from all around us” (p. 46).
What if the court vision of players like Steve Nash, frequently described as eyes in the back of one’s head, has nothing to do with vision, but hearing? If we see only a small part of the area around us – our peripheral vision – but have 360-degree hearing, maybe expert performers have more acute hearing than their peers? Rather than testing different measures of visual acuity to determine the differences in experts, we should test for hearing differences.
“He will listen – mostly to his ears, checking the eyes for accuracy and detail” (p. 47).
When I switch lanes while driving, I often initiate the lane change before making a visual scan. I have a sense of the presence or absence of another car. I attribute this sense to previous scanning and pattern recognition: if there was a car there previously, and that car passes me, I sense that there is an opening.
What if this sense is not pattern recognition, but hearing? What if I sense the presence or absence of a car next to me based on barely perceptible auditory signals? I hear the absence of a car next to me, and I use my eyes to check for accuracy and detail before committing fully to the lane change.
Similarly, when Nash races down court with the dribble, he may sense a teammate filling the lane through barely perceptible auditory cues, and he uses his eyes at the last possible second to check for accuracy and detail.