- Define kinesics.
- Define haptics.
- Define vocalics.
- Define proxemics.
- Define chronemics.
- Provide examples of types of nonverbal communication that fall under these categories.
- Discuss the ways in which personal presentation and environment provide nonverbal cues.
Just as verbal language is broken up into various categories, there are also different types of nonverbal communication. As we learn about each type of nonverbal signal, keep in mind that nonverbals often work in concert with each other, combining to repeat, modify, or contradict the verbal message being sent.
The word kinesics comes from the root word kinesis, which means “movement,” and refers to the study of hand, arm, body, and face movements. Specifically, this section will outline the use of gestures, head movements and posture, eye contact, and facial expressions as nonverbal communication.
There are three main types of gestures: adaptors, emblems, and illustrators (Andersen, 1999). Adaptors are touching behaviors and movements that indicate internal states typically related to arousal or anxiety. Adaptors can be targeted toward the self, objects, or others. In regular social situations, adaptors result from uneasiness, anxiety, or a general sense that we are not in control of our surroundings. Many of us subconsciously click pens, shake our legs, or engage in other adaptors during classes, meetings, or while waiting as a way to do something with our excess energy.

Public speaking students who watch video recordings of their speeches often notice nonverbal adaptors that they didn’t know they used. In public speaking situations, people most commonly use self- or object-focused adaptors. Common self-touching behaviors like scratching, twirling hair, or fidgeting with fingers or hands are considered self-adaptors. Some self-adaptors manifest internally, as coughs or throat-clearing sounds. My personal weakness is object adaptors. Specifically, I subconsciously gravitate toward metallic objects like paper clips or staples holding my notes together and catch myself bending or fidgeting with them while I’m speaking. Other people play with dry-erase markers, their note cards, the change in their pockets, or the lectern while speaking. Use of object adaptors can also signal boredom, as when people play with the straw in their drink or peel the label off a bottle of beer. Smartphones have become common object adaptors, as people fiddle with their phones to help ease anxiety.

Finally, adaptors directed at other people are more common in social situations than in public speaking situations, given the speaker’s distance from audience members. These other-focused adaptors involve adjusting or grooming others, similar to how primates like chimpanzees pick things off each other. It would definitely be strange for a speaker to approach an audience member and pick lint off his or her sweater, fix a crooked tie, tuck in a tag, or pat down a flyaway hair in the middle of a speech.
Emblems are gestures that have a specific agreed-on meaning. They differ, however, from the signs used by hearing-impaired people and others who communicate using American Sign Language (ASL). Even though emblems have a generally agreed-on meaning, they are not part of a formal sign system like ASL that is explicitly taught to a group of people. A hitchhiker’s raised thumb, the “OK” sign with thumb and index finger connected in a circle and the other three fingers sticking up, and the raised middle finger are all examples of emblems that have an agreed-on meaning or meanings within a culture. Emblems can be still or in motion; for example, circling the index finger at the side of your head says “He or she is crazy,” and rolling your hands over and over in front of you says “Move on.”
Emblems are gestures that have a specific meaning. In the United States, a thumbs-up can mean “I need a ride” or “OK!” Kreg Steppe – Thumbs Up – CC BY-SA 2.0.
Just as we can trace the history of a word, or its etymology, we can also trace some nonverbal signals, especially emblems, to their origins. Holding up the index and middle fingers in a “V” shape with the palm facing in is an insult gesture in Britain that basically means “up yours.” This gesture dates back centuries to the period in which the primary weapon of war was the bow and arrow. When archers were captured, their enemies would often cut off these two fingers, which was seen as the ultimate insult and worse than being executed since the archer could no longer shoot his bow and arrow. So holding up the two fingers was a provoking gesture used by archers to show their enemies that they still had their shooting fingers (Pease & Pease, 2004).
Illustrators are the most common type of gesture and are used to illustrate the verbal message they accompany. For example, you might use hand gestures to indicate the size or shape of an object. Unlike emblems, illustrators do not typically have meaning on their own and are used more subconsciously than emblems. These largely involuntary and seemingly natural gestures flow from us as we speak but vary in terms of intensity and frequency based on context. Although we are never explicitly taught how to use illustrative gestures, we do it automatically. Think about how you still gesture when having an animated conversation on the phone even though the other person can’t see you.
Head Movements and Posture
I group head movements and posture together because they are often both used to acknowledge others and communicate interest or attentiveness. In terms of head movements, a head nod is a universal sign of acknowledgement in cultures where the formal bow is no longer used as a greeting. In these cases, the head nod essentially serves as an abbreviated bow. An innate and universal head movement is the headshake back and forth to signal “no.” This nonverbal signal begins at birth, even before a baby has the ability to know that it has a corresponding meaning. Babies shake their head from side to side to reject their mother’s breast and later shake their head to reject attempts to spoon-feed (Pease & Pease, 2004). This biologically based movement then sticks with us to be a recognizable signal for “no.” We also move our head to indicate interest. For example, a head up typically indicates an engaged or neutral attitude, a head tilt indicates interest and is an innate submission gesture that exposes the neck and subconsciously makes people feel more trusting of us, and a head down signals a negative or aggressive attitude (Pease & Pease, 2004).
There are four general human postures: standing, sitting, squatting, and lying down (Hargie, 2011). Within each of these postures there are many variations, and when combined with particular gestures or other nonverbal cues they can express many different meanings. Most of our communication occurs while we are standing or sitting. One interesting standing posture involves putting our hands on our hips and is a nonverbal cue that we use subconsciously to make us look bigger and show assertiveness. When the elbows are pointed out, this prevents others from getting past us as easily and is a sign of attempted dominance or a gesture that says we’re ready for action. In terms of sitting, leaning back shows informality and indifference, straddling a chair is a sign of dominance (but also some insecurity because the person is protecting the vulnerable front part of his or her body), and leaning forward shows interest and attentiveness (Pease & Pease, 2004).
We also communicate through eye behaviors, primarily eye contact. While eye behaviors are often studied under the category of kinesics, they have their own branch of nonverbal studies called oculesics, which comes from the Latin word oculus, meaning “eye.” The face and eyes are the main point of focus during communication, and along with our ears our eyes take in most of the communicative information around us. The saying “The eyes are the window to the soul” is actually accurate in terms of where people typically think others are “located,” which is right behind the eyes (Andersen, 1999). Certain eye behaviors have become tied to personality traits or emotional states, as illustrated in phrases like “hungry eyes,” “evil eyes,” and “bedroom eyes.” To better understand oculesics, we will discuss the characteristics and functions of eye contact and pupil dilation.
Eye contact serves several communicative functions, including regulating interaction, monitoring interaction, conveying information, and establishing interpersonal connections. In terms of regulating communication, we use eye contact to signal to others that we are ready to speak, or we use it to cue others to speak. I’m sure we’ve all been in that awkward situation where a teacher asks a question, no one else offers a response, and he or she looks directly at us as if to say, “What do you think?” In that case, the teacher’s eye contact is used to cue us to respond. During an interaction, eye contact also changes as we shift from speaker to listener. US Americans typically shift eye contact while speaking—looking away from the listener and then looking back at his or her face every few seconds. Toward the end of our speaking turn, we make more direct eye contact with our listener to indicate that we are finishing up. While listening, we tend to make more sustained eye contact, not glancing away as regularly as we do while speaking (Martin & Nakayama, 2010).
Aside from regulating conversations, eye contact is also used to monitor interaction by taking in feedback and other nonverbal cues and to send information. Our eyes bring in the visual information we need to interpret people’s movements, gestures, and eye contact. A speaker can use his or her eye contact to determine if an audience is engaged, confused, or bored and then adapt his or her message accordingly. Our eyes also send information to others. People know not to interrupt when we are in deep thought because we naturally look away from others when we are processing information. Making eye contact with others also communicates that we are paying attention and are interested in what another person is saying. As we will learn in Chapter 5 “Listening”, eye contact is a key part of active listening.
Eye contact can also be used to intimidate others. We have social norms about how much eye contact we make with people, and those norms vary depending on the setting and the person. Staring at another person in some contexts could communicate intimidation, while in other contexts it could communicate flirtation. As we learned, eye contact is a key immediacy behavior, and it signals to others that we are available for communication. Once communication begins, if it does, eye contact helps establish rapport or connection. We can also use our eye contact to signal that we do not want to make a connection with others. For example, in a public setting like an airport or a gym where people often make small talk, we can avoid making eye contact with others to indicate that we do not want to engage in small talk with strangers. Another person could use eye contact to try to coax you into speaking, though. For example, when one person continues to stare at another person who is not reciprocating eye contact, the person avoiding eye contact might eventually give in, become curious, or become irritated and say, “Can I help you with something?” As you can see, eye contact sends and receives important communicative messages that help us interpret others’ behaviors, convey information about our thoughts and feelings, and facilitate or impede rapport or connection. This list reviews the specific functions of eye contact:
- Regulate interaction and provide turn-taking signals
- Monitor communication by receiving nonverbal communication from others
- Signal cognitive activity (we look away when processing information)
- Express engagement (we show people we are listening with our eyes)
- Convey intimidation
- Express flirtation
- Establish rapport or connection
Pupil dilation is a subtle component of oculesics that doesn’t get as much scholarly attention in communication as eye contact does. Pupil dilation refers to the expansion and contraction of the pupil, the black circular opening at the center of our eyes, and is considered a biometric form of measurement; it is involuntary and therefore seen as a valid and reliable form of data collection, as opposed to self-reports on surveys or in interviews, which can be biased or misleading. Our pupils dilate when light is scarce and contract when light is plentiful (Guerrero & Floyd, 2006). Pain, sexual attraction, general arousal, anxiety/stress, and information processing (thinking) also affect pupil dilation. Researchers measure pupil dilation for a number of reasons. For example, advertisers use pupil dilation as an indicator of consumer preferences, assuming that more dilation indicates arousal and attraction to a product. We don’t consciously read others’ pupil dilation in our everyday interactions, but experimental research has shown that we subconsciously perceive it, which affects our impressions and communication. In general, dilated pupils increase a person’s attractiveness. Even though we may not be aware of this subtle nonverbal signal, we have social norms and practices that may be subconsciously based on pupil dilation. Take, for example, the notion of mood lighting and the common practice of creating a “romantic” ambiance with candlelight or the light from a fireplace. Softer and more indirect light leads to pupil dilation, and although we intentionally manipulate lighting to create a romantic ambiance rather than to dilate our pupils, the dilated pupils are still subconsciously perceived, which increases perceptions of attraction (Andersen, 1999).
Our faces are the most expressive part of our bodies. Think of how photos are often intended to capture a particular expression “in a flash” to preserve for later viewing. Even though a photo is a snapshot in time, we can still interpret much meaning from a human face caught in a moment of expression, and basic facial expressions are recognizable by humans all over the world. Much research has supported the universality of a core group of facial expressions: happiness, sadness, fear, anger, and disgust. The first four are especially identifiable across cultures (Andersen, 1999). However, the triggers for these expressions and the cultural and social norms that influence their displays are still culturally diverse. If you’ve spent much time with babies you know that they’re capable of expressing all these emotions. Getting to see the pure and innate expressions of joy and surprise on a baby’s face is what makes playing peek-a-boo so entertaining for adults. As we get older, we learn and begin to follow display rules for facial expressions and other signals of emotion and also learn to better control our emotional expression based on the norms of our culture.
Smiles are powerful communicative signals and, as you’ll recall, are a key immediacy behavior. Although facial expressions are typically viewed as innate and several are universally recognizable, they are not always connected to an emotional or internal biological stimulus; they can actually serve a more social purpose. For example, most of the smiles we produce are primarily made for others and are not just an involuntary reflection of an internal emotional state (Andersen, 1999). These social smiles, however, are slightly but perceptibly different from more genuine smiles. People generally perceive smiles as more genuine when the other person smiles “with their eyes.” This type of smile, sometimes called a Duchenne smile, is difficult if not impossible to fake, because the muscles around the eye that are activated when we spontaneously or genuinely smile are not under our voluntary control. It is the involuntary and spontaneous contraction of these muscles that moves the skin around our cheeks, eyes, and nose to create a smile that’s distinct from a fake or polite smile (Evans, 2001). People can tell the difference between these smiles, which is why photographers often engage in cheesy joking with adults or use props with children to induce a genuine smile before they snap a picture.
Our faces are the most expressive part of our body and can communicate an array of different emotions. Elif Ayiter – Facial Expression Test – CC BY-NC-ND 2.0.
We will learn more about competent encoding and decoding of facial expressions in Section 4.3 and Section 4.4, but since you are likely giving speeches in this class, let’s examine the role of the face in public speaking. Facial expressions help set the emotional tone for a speech. To set a positive tone before you start speaking, briefly look at the audience and smile to communicate friendliness, openness, and confidence. Beyond your opening and welcoming facial expressions, facial expressions communicate a range of emotions and can be used to infer personality traits and to make judgments about a speaker’s credibility and competence. Facial expressions can communicate that a speaker is tired, excited, angry, confused, frustrated, sad, confident, smug, shy, or bored. Even if you aren’t bored, for example, a slack face with little animation may lead an audience to think that you are bored with your own speech, which isn’t likely to spark their interest. So make sure your facial expressions communicate an emotion, mood, or personality trait that your audience will view favorably and that will help you achieve your speech goals. Also make sure your facial expressions match the content of your speech. When delivering something lighthearted or humorous, a smile, bright eyes, and slightly raised eyebrows will nonverbally enhance your verbal message. When delivering something serious or somber, a furrowed brow, a tighter mouth, and even a slight head nod can enhance that message. If your facial expressions and speech content are not consistent, your audience could become confused by the mixed messages, which could lead them to question your honesty and credibility.