
Non-Instrumental Movement Inhibition

From Wikipedia, the free encyclopedia

Non-Instrumental Movement Inhibition (NIMI) is an aspect of body language in which a person stops fidgeting because they are interested in what they are watching. For example, when a young child is rapt watching a cartoon, they often sit motionless with their mouth open; this motionlessness is NIMI. As such, it is a psychological phenomenon and a form of embodied behavior, in which gestures and body movements reflect the thoughts and emotions in a person's mind. This phenomenon differs from almost all other body language because it interprets what does not happen (i.e. an absence of movement) rather than basing an interpretation on a specific gesture. During NIMI, visual engagement or attention subconsciously leads to lower levels of fidgeting (and of other non-instrumental movements).[1]

The movements and actions that are inhibited during NIMI are not strictly limited to fidgeting. Non-instrumental movements are bodily actions that are not related to the goal of the current task; for example, when the goal in a classroom is to listen to a lecture, attentive listeners will not talk to their neighbors or make a call on their phone. Non-instrumental (unnecessary) movements include fidgeting, scratching, postural micromovements (e.g. sitting forward in a chair), certain emotional expressions (e.g. shrugging), and even breathing. To use breathing as an example, a person watching a tense movie might momentarily stop regular breathing, and this pause is also an example of NIMI. NIMI is important for recognizing boredom[2][3] during human-robot interaction, human-computer interaction, computer-aided learning with automated tutoring systems, market research, and experience design.[4]

Historical evidence


The original observation that, in a seated audience, interest is associated with diminished fidgeting, and that boredom doubles the amount of human movement, was made by Francis Galton in 1885.[5] Modern evidence that movement inhibition (and NIMI) is quantifiable and related to flow or interest came from a series of papers on automated tutoring systems by Sidney D'Mello and colleagues.[6] Using a non-visual task, Paul Seli and collaborators showed that increased episodes of mind wandering led to an increase in fidgeting, presumably because attention requires comparative stillness (maintaining that stillness is described as "a secondary task").[7] Nadia Bianchi-Berthouze and colleagues demonstrated that engagement in games (and human-computer interaction) could lead to either increased or decreased movement, depending on the motivational nature of the movements involved in accomplishing the task.[8] Harry Witchel and colleagues named the inhibitory phenomenon NIMI,[1] and demonstrated that the visual aspect of the human-computer interaction task was the most powerful contributor to the inhibitory effect on movement.[9] They also demonstrated that, during individual human-computer interaction in instrumentally identical reading comprehension tasks, interest itself was sufficient to diminish movement.[9] This was reflected in experiments by Patrick Healey and colleagues in a seated audience at a dance performance.[10]

Controversy


While it is known that frustration[11] and restlessness can lead to increased movement during human-computer interaction, it remains controversial whether the NIMI that occurs during engagement is actually an inhibition of a baseline amount of physiologically required movement.

References

  1. ^ a b Witchel, Harry; Westling, Carina; Tee, Julian; Healy, Aoife; Needham, Rob; Chockalingam, Nachiappan (2014). "What does not happen: Quantifying embodied engagement using NIMI and self-adaptors" (PDF). Participations: Journal of Audience and Reception Studies. 11 (1): 304–331.
  2. ^ Gurney-Read, Josie (2016-02-23). "Computers can detect boredom by how much you fidget". The Telegraph (London). ISSN 0307-1235. Retrieved 2017-11-28.
  3. ^ Gregoire, Carolyn (2016-03-09). "Computers Can Now Read Our Body Language". Huffington Post. Retrieved 2017-11-28.
  4. ^ Nuwer, Rachel (2016). "Now computers can tell when you are bored: That ability could lead to more engaging coursework and machines that better understand human emotions". Scientific American. 314 (5): 15. doi:10.1038/scientificamerican0516-15. PMID 27100240.
  5. ^ Galton, Francis (1885-06-25). "The Measure of Fidget". Nature. 32 (817): 174–175. Bibcode:1885Natur..32..174G. doi:10.1038/032174b0. S2CID 30660123.
  6. ^ D'Mello, Sidney; Chipman, Patrick; Graesser, Art (2007). "Posture as a predictor of learner's affective engagement" (PDF). Proceedings of the 29th Annual Cognitive Science Society: 905–910.
  7. ^ Seli, Paul; Carriere, Jonathan S. A.; Thomson, David R.; Cheyne, James Allan; Martens, Kaylena A. Ehgoetz; Smilek, Daniel (2014). "Restless mind, restless body". Journal of Experimental Psychology: Learning, Memory, and Cognition. 40 (3): 660–668. doi:10.1037/a0035260. PMID 24364721.
  8. ^ Bianchi-Berthouze, Nadia (2013-01-01). "Understanding the Role of Body Movement in Player Engagement". Human–Computer Interaction. 28 (1): 40–75. doi:10.1080/07370024.2012.688468. ISSN 0737-0024. S2CID 9630093.
  9. ^ a b Witchel, Harry J.; Santos, Carlos P.; Ackah, James K.; Westling, Carina E. I.; Chockalingam, Nachiappan (2016). "Non-Instrumental Movement Inhibition (NIMI) Differentially Suppresses Head and Thigh Movements during Screenic Engagement: Dependence on Interaction". Frontiers in Psychology. 7: 157. doi:10.3389/fpsyg.2016.00157. ISSN 1664-1078. PMC 4762992. PMID 26941666.
  10. ^ Theodorou, Lida; Healey, Patrick (2017). "What can Hand Movements Tell us about Audience Engagement?" (PDF). Proceedings of the Cognitive Science Society Annual Meeting, London 2017.
  11. ^ Kapoor, Ashish; Burleson, Winslow; Picard, Rosalind W. (August 2007). "Automatic Prediction of Frustration". Int. J. Hum.-Comput. Stud. 65 (8): 724–736. CiteSeerX 10.1.1.150.1347. doi:10.1016/j.ijhcs.2007.02.003. ISSN 1071-5819.