“For every visual aid a mainstreamed D/HH student decides to focus on in the classroom, two or more important visuals go ignored. How does one ensure they have access to all that is available?”
My childhood education consisted of mainstream public school in the days before the ADA. People often ask me what my educational experience was like. Each reflection stirs up yet another memory to add to my already-complicated answer bank. During my childhood, people would ask me the ever-popular questions, “How is school? Do you like school?” Like most kids, my responses would largely depend upon my mood at the time, and my mood in turn depended upon the kind of day I had just experienced. School in general could be considered “fun” while a particular class was identified as “frustrating,” in spite of that subject being a personal favorite otherwise. One probably could have argued that I disliked school, and I would have been unable to put my finger on a specific reason--unless “I’m tired” qualified as a good enough answer.
Multimedia in the 1970s-1990s consisted largely of carbon-copied handouts giving way to Xerox machine products. Overhead projectors qualified as “high-tech,” and educational films came on 16mm reels of film without captions. The classroom was in no way “learner friendly” by today’s standards. On the other hand, maybe the simplicity of the times could be viewed today as “learner friendly” when compared to the current plethora of complicated technological advances requiring students to develop a talent for controlling distractions and maintaining focus. At any rate, a typical classroom setup involved a blackboard, a teacher’s desk, a pull-down screen, and papers. For a deaf student relying solely upon lipreading for communication access, the visual processing mode was on overdrive and had no “off” button. Maneuvering my visual attention from the teacher’s lips to the handout or blackboard and back again, then scanning the room to find out which classmate was asking a question and attempting to follow the discussion, I commonly missed important information. I much preferred self-teaching. For me, that involved the pretense of following along in classes, then digging into my textbooks once I got home to make sense of the limited information I had understood at school. I often brought home library books related to the subject matter I was struggling with.
Studies suggest that deafness has a profound effect on the brain, with a strong correlation between language and brain functioning. The brain must have some form of language in order to fuel comprehension and shape the thought process. Language development is closely tied to communication; when communication is restricted, language development will be as well. For a mainstreamed student, ensuring full and equal communication access should drive program goals, supports, and services in order to create the best environment for language development. Literacy learning, in turn, is influenced by language development. Teachers today may not necessarily change the way they teach when informed that a deaf or hard of hearing student is in their classroom. They may instead turn to models that cater to and encourage language development for these students, such as creating an environment with a large amount of printed matter, making use of interpreters, collaborating with teacher aides assigned to specific students, or encouraging peer assistance. In the course of their education and experience, teachers learn about the Spatial Contiguity Principle and the Temporal Contiguity Principle. For those not familiar with these principles, the Spatial Contiguity Principle states that “students learn better when corresponding words and pictures are presented near rather than far from each other on a page or screen.” The Temporal Contiguity Principle states that “students learn better when corresponding words and pictures are presented simultaneously rather than successively.” These principles are proving to be a boost to deaf and hard of hearing students in the mainstream setting.
The Clerc Center of Gallaudet University presented a webinar not long ago about visual split-attention in the classroom. I had the privilege of participating in this informative webinar and discovered explanations for many of the emotions I experienced in my own childhood classroom settings. I cannot say enough positive things about the information presented. Much of it is available in an article on split attention in the 2012 issue of Odyssey, a Gallaudet newsmagazine. I encourage teachers and parents of mainstreamed students to locate this article, written by Susan Mather and Diane Clark, both professors with PhDs. The webinar and article focused on the visual span of a student in the classroom, both horizontally and vertically. The key to communication access success begins with ensuring that the teacher, the visual aid in use, and the interpreter (or whatever assistive model is in play) all fall within the line of sight of the student. Teachers tend to discuss handouts, refer to PowerPoint slides, play snippets of videos while calling attention to key parts, and encourage class discussion concurrently. When several visual aids happen simultaneously, students who depend on their eyes more than their ears find themselves forced to choose where to look--in other words, quickly choosing what NOT to learn. Looking back at my own experience, I recall times I would look away from my teacher’s lips to take a quick peek at the visual she was referring to, then dart my attention back to her lips only to experience momentary confusion because I had missed a sentence or two. In classroom discussions, I played the “guess who is talking now” game: scanning the room quickly to find the student speaker, finally locating her by the end of her comment, missing the comment itself, and then starting all over again in an attempt to discover who was responding next. It was a vicious cycle for my already-overworked brain, and by the end of the class I desperately wanted a nap--ten minutes to rest my overtired brain.
Even in my recent college experience with an interpreter, my eyes would still dart around because the interpreter was assigned to one corner of the room, the professor stood in the opposite corner, a handout sat on my desk, and a PowerPoint was projected in the center of the wall before me. After this webinar, I realize how much more relaxing it would have been had my professor and interpreter stood within my line of vision to the PowerPoint slide. Studies show that hearing students learn better when presented with two sources of sensory input, usually visual and auditory; having access to both simultaneously goes a long way toward retaining information. Deaf or hard of hearing students must convert auditory information into visual information, then combine that with the “normal” visual information, in essence doing double duty. If those visuals are scattered about, the brain can quickly overload, information may not be retained as well, and the focus is not always where it should be.
Mather and Clark recognized the unique challenge that split attention poses in a mainstream setting, to the point of suggesting that learning strategies tailored for visual adaptation will help raise the performance levels of deaf and hard of hearing students. Making the classroom more visual incorporates strategies such as maintaining eye contact, hand waving, raising hands to speak, ensuring there is no overlap of conversation, and using facial expressions to emphasize key points or questions.
(See an adapted list in the Mather and Clark article on split attention referenced in the editor’s note below.)
Above all, any visual adaptation to the educational experience must be evaluated to ensure it falls within the line of sight of the deaf or hard of hearing student(s) and is not set up in such a way that the eyes behave as though there were a tennis match in the room.
Editor’s note: Find the 2012 issue of Odyssey and the article on split attention by Mather and Clark online: http://www.gallaudet.edu/Images/Clerc/articles/