
15 - Interpersonal Synchrony: From Social Perception to Social Interaction

from Part II - Machine Analysis of Social Signals

Published online by Cambridge University Press: 13 July 2017

Mohamed Chetouani (University Pierre et Marie Curie)
Emilie Delaherche (University Pierre et Marie Curie)
Guillaume Dumas (University Pierre et Marie Curie)
David Cohen (University Pierre et Marie Curie)
Judee K. Burgoon (University of Arizona)
Nadia Magnenat-Thalmann (Université de Genève)
Maja Pantic (Imperial College London)
Alessandro Vinciarelli (University of Glasgow)

Summary

Introduction

Synchrony refers to individuals’ temporal coordination during social interactions (Cappella, 2005). Analyzing this phenomenon is complex, requiring the perception and integration of multimodal communicative signals. The evaluation of synchrony has received multidisciplinary attention because of its role in early development (Feldman, 2003), language learning (Goldstein, King, & West, 2003), and social connection (Harrist & Waugh, 2002). Initially, instances of synchrony were identified directly in the data by trained observers. Several methods have since been proposed to evaluate interactional synchrony, ranging from behavior microanalysis (Cappella, 1997) to global perception of synchrony (Bernieri, Reznick, & Rosenthal, 1988). Behavioral synchrony has now captured the interest of researchers in fields such as social signal processing, robotics, and machine learning (Prepin & Pelachaud, 2011; Kozima, Michalowski, & Nakagawa, 2009).

In this chapter, we focus on describing and defining synchrony for the development of computational models. The chapter begins with a review of evidence of interpersonal synchrony from different research domains (psychology, clinical research, neuroscience, and biology). We then introduce a working definition of interpersonal synchrony (see Proposed Definition). The chapter surveys evaluation models and methods from the psychology literature (see Non-computational Methods of Synchrony Assessment) and from social signal processing (see Fully Automatic Measures of Synchrony). Finally, it discusses a number of challenges that remain to be addressed (see Conclusions and Main Challenges).

Non-verbal Evidence of Interpersonal Synchrony

Among social signals, synchrony and coordination have received attention only recently (Ramseyer & Tschacher, 2010; Delaherche et al., 2012). Condon and Ogston (1967) first proposed a microanalysis of human behavior (body motion and speech intonation) and demonstrated the existence of interactional synchrony: the coordination between listener's and speaker's body movements, or between the listener's body movement and the speaker's pitch and stress variations. Bernieri et al. (1988) define coordination as the “degree to which the behaviors in an interaction are non-random, patterned or synchronized in both form and timing”. Kendon (1970) raised fundamental questions about the conditions under which interactional synchrony arises and its function in interaction. By synchronizing with the speaker, the listener demonstrates an ability to anticipate what the speaker is going to say; in this way, the listener gives feedback to the speaker and smooths the running of the encounter.

Type: Chapter
In: Social Signal Processing, pp. 202–212
Publisher: Cambridge University Press
Print publication year: 2017


References

Al Moubayed, S., Baklouti, M., Chetouani, M., et al. (2009). Generating robot/agent backchannels during a storytelling experiment. In Proceedings of the IEEE International Conference on Robotics and Automation (pp. 3749–3754).
Altmann, U. (2011). Studying movement synchrony using time series and regression models. In A. Esposito, R. Hoffmann, S. Hübler, & B. Wrann (Eds), Program and Abstract of the COST 2102 Final Conference held in conjunction with the 4th COST 2102 International Training School on Cognitive Behavioural Systems (p. 23).
Ashenfelter, K. T., Boker, S. M., Waddell, J. R., & Vitanov, N. (2009). Spatiotemporal symmetry and multifractal structure of head movements during dyadic conversation. Journal of Experimental Psychology: Human Perception and Performance, 35(4), 1072–1091.
Bernieri, F. J., Reznick, J. S., & Rosenthal, R. (1988). Synchrony, pseudosynchrony, and dissynchrony: Measuring the entrainment process in mother–infant interactions. Journal of Personality and Social Psychology, 54(2), 243–253.
Boucenna, S., Anzalone, S., Tilmont, E., Cohen, D., & Chetouani, M. (2014). Learning of social signatures through imitation game between a robot and a human partner. IEEE Transactions on Autonomous Mental Development, 6(3), 213–225.
Campbell, N. (2009). An audio-visual approach to measuring discourse synchrony in multimodal conversation data. In Proceedings of Interspeech (pp. 2159–2162), September, Brighton, UK.
Cappella, J. N. (1997). Behavioral and judged coordination in adult informal social interactions: Vocal and kinesic indicators. Journal of Personality and Social Psychology, 72, 119–131.
Cappella, J. N. (2005). Coding mutual adaptation in dyadic nonverbal interaction. In V. Manusov (Ed.), The Sourcebook of Nonverbal Measures: Going Beyond Words (pp. 383–392). Mahwah, NJ: Lawrence Erlbaum.
Champagne, F., Diorio, J., Sharma, S., & Meaney, M. J. (2001). Naturally occurring variations in maternal behavior in the rat are associated with differences in estrogen-inducible central oxytocin receptors. Proceedings of the National Academy of Sciences, 98(22), 12736–12741.
Chartrand, T. L. & Bargh, J. A. (1999). The chameleon effect: The perception–behavior link and social interaction. Journal of Personality and Social Psychology, 76(6), 893–910.
Chatel-Goldman, J., Schwartz, J.-L., Jutten, C., & Congedo, M. (2013). Non-local mind from the perspective of social cognition. Frontiers in Human Neuroscience, 7, 107.
Chittaranjan, G., Aran, O., & Gatica-Perez, D. (2011). Inferring truth from multiple annotators for social interaction analysis. In Neural Information Processing Systems (NIPS) Workshop on Modeling Human Communication Dynamics (HCD) (p. 4).
Cohn, J. F. (2010). Advances in behavioral science using automated facial image analysis and synthesis. IEEE Signal Processing Magazine, 27(November), 128–133.
Condon, W. S. & Ogston, W. D. (1967). A segmentation of behavior. Journal of Psychiatric Research, 5, 221–235.
Delaherche, E., Boucenna, S., Karp, K., et al. (2013). Social coordination assessment: Distinguishing between shape and timing. In Multimodal Pattern Recognition of Social Signals in Human–Computer Interaction (vol. 7742, pp. 9–18). Berlin: Springer.
Delaherche, E. & Chetouani, M. (2010). Multimodal coordination: Exploring relevant features and measures. In Second International Workshop on Social Signal Processing, ACM Multimedia 2010.
Delaherche, E. & Chetouani, M. (2011). Characterization of coordination in an imitation task: Human evaluation and automatically computable cues. In 13th International Conference on Multimodal Interaction.
Delaherche, E., Chetouani, M., Mahdhaoui, M., et al. (2012). Interpersonal synchrony: A survey of evaluation methods across disciplines. IEEE Transactions on Affective Computing, 3(3), 349–365.
Dumas, G., Nadel, J., Soussignan, R., Martinerie, J., & Garnero, L. (2010). Inter-brain synchronization during social interaction. PLoS ONE, 5(8), e12166.
Feldman, R. (2003). Infant–mother and infant–father synchrony: The coregulation of positive arousal. Infant Mental Health Journal, 24(1), 1–23.
Feldman, R. (2007). Parent–infant synchrony and the construction of shared timing: Physiological precursors, developmental outcomes, and risk conditions. Journal of Child Psychology and Psychiatry and Allied Disciplines, 48(3–4), 329–354.
Goldstein, M. H., King, A. P., & West, M. J. (2003). Social interaction shapes babbling: Testing parallels between birdsong and speech. Proceedings of the National Academy of Sciences of the United States of America, 100(13), 8030–8035.
Gratch, J., Wang, N., Gerten, J., Fast, E., & Duffy, R. (2007). Creating rapport with virtual agents. In IVA '07: Proceedings of the 7th International Conference on Intelligent Virtual Agents (pp. 125–138). Berlin: Springer.
Gravano, A. & Hirschberg, J. (2009). Backchannel-inviting cues in task-oriented dialogue. In Proceedings of InterSpeech (pp. 1019–1022).
Guedeney, A., Guedeney, N., Tereno, S., et al. (2011). Infant rhythms versus parental time: Promoting parent–infant synchrony. Journal of Physiology-Paris, 105(4–6), 195–200.
Harrist, A. W. & Waugh, R. M. (2002). Dyadic synchrony: Its structure and function in children's development. Developmental Review, 22(4), 555–592.
Huang, L., Morency, L.-P., & Gratch, J. (2011). A multimodal end-of-turn prediction model: Learning from parasocial consensus sampling. In The 10th International Conference on Autonomous Agents and Multiagent Systems AAMAS '11 (vol. 3, pp. 1289–1290).
Kelso, J. A. S., Dumas, G., & Tognoli, E. (2013). Outline of a general theory of behavior and brain coordination. Neural Networks, 37(1), 120–131.
Kendon, A. (1970). Movement coordination in social interaction: Some examples described. Acta Psychologica, 32, 100–125.
Kipp, M. (2008). Spatiotemporal coding in ANVIL. In Proceedings of the 6th International Conference on Language Resources and Evaluation, LREC, Marrakech.
Kozima, H., Michalowski, M., & Nakagawa, C. (2009). Keepon. International Journal of Social Robotics, 1, 3–18.
Lakens, D. (2010). Movement synchrony and perceived entitativity. Journal of Experimental Social Psychology, 46(5), 701–708.
Lee, C., Katsamanis, A., Black, M. P., et al. (2011). An analysis of PCA-based vocal entrainment measures in married couples' affective spoken interactions. In Proceedings of InterSpeech (pp. 3101–3104).
Mahdhaoui, A. & Chetouani, M. (2011). Understanding parent–infant behaviors using nonnegative matrix factorization. In Proceedings of the Third COST 2102 International Training School Conference on Toward Autonomous, Adaptive, and Context-Aware Multimodal Interfaces: Theoretical and Practical Issues (pp. 436–447). Berlin: Springer.
Messinger, D. M., Ruvolo, P., Ekas, N. V., & Fogel, A. (2010). Applying machine learning to infant interaction: The development is in the details. Neural Networks, 23(8–9), 1004–1016.
Michalowski, M. P., Simmons, R., & Kozima, H. (2009). Rhythmic attention in child–robot dance play. In Proceedings of RO-MAN 2009, Toyama, Japan.
Michelet, S., Karp, K., Delaherche, E., Achard, C., & Chetouani, M. (2012). Automatic imitation assessment in interaction. In Human Behavior Understanding (vol. 7559, pp. 161–173). Berlin: Springer.
Morency, L.-P., Kok, I., & Gratch, J. (2008). Predicting listener backchannels: A probabilistic multimodal approach. In Proceedings of the 8th International Conference on Intelligent Virtual Agents IVA '08 (pp. 176–190). Berlin: Springer.
Nadel, J., Carchon, I., Kervella, C., Marcelli, D., & Roserbat-Plantey, D. (1999). Expectancies for social contingency in 2-month-olds. Developmental Science, 2(2), 164–173.
Oullier, O., De Guzman, G. C., Jantzen, K. J., Kelso, J. A. S., & Lagarde, J. (2008). Social coordination dynamics: Measuring human bonding. Social Neuroscience, 3(2), 178–192.
Ozkan, D., Sagae, K., & Morency, L.-P. (2010). Latent mixture of discriminative experts for multimodal prediction modeling. Computational Linguistics, 2, 860–868.
Pentland, A., Lazer, D., Brewer, D., & Heibeck, T. (2009). Using reality mining to improve public health and medicine. Studies in Health Technology and Informatics, 149, 93–102.
Perry, A., Troje, N. F., & Bentin, S. (2010). Exploring motor system contributions to the perception of social information: Evidence from EEG activity in the mu/alpha frequency range. Social Neuroscience, 5(3), 272–284.
Petridis, S., Leveque, M., & Pantic, M. (2013). Audiovisual detection of laughter in human machine interaction. In Affective Computing and Intelligent Interaction ACII 2013 (pp. 129–134).
Prepin, K. & Gaussier, P. (2010). How an agent can detect and use synchrony parameter of its own interaction with a human? In A. Esposito, N. Campbell, C. Vogel, A. Hussain, & A. Nijholt (Eds), Development of Multimodal Interfaces: Active Listening and Synchrony (pp. 50–65). Berlin: Springer.
Prepin, K. & Pelachaud, C. (2011). Shared understanding and synchrony emergence: Synchrony as an indice of the exchange of meaning between dialog partners. In ICAART 2011 International Conference on Agents and Artificial Intelligence (vol. 2, pp. 25–30).
Ramseyer, F. & Tschacher, W. (2006). Synchrony: A core concept for a constructivist approach to psychotherapy. Constructivism: The Human Sciences, 11, 150–171.
Ramseyer, F. & Tschacher, W. (2010). Nonverbal synchrony or random coincidence? How to tell the difference. In A. Esposito, N. Campbell, C. Vogel, A. Hussain, & A. Nijholt (Eds), Development of Multimodal Interfaces: Active Listening and Synchrony (pp. 182–196). Berlin: Springer.
Ramseyer, F. & Tschacher, W. (2011). Nonverbal synchrony in psychotherapy: Coordinated body movement reflects relationship quality and outcome. Journal of Consulting and Clinical Psychology, 79(3), 284–295.
Richardson, D. C. & Dale, R. (2005). Looking to understand: The coupling between speakers' and listeners' eye movements and its relationship to discourse comprehension. Cognitive Science, 29(6), 1045–1060.
Richardson, D., Dale, R., & Shockley, K. (2008). Synchrony and Swing in Conversation: Coordination, Temporal Dynamics, and Communication. Oxford: Oxford University Press.
Richardson, M. J., Marsh, K. L., Isenhower, R. W., Goodman, J. R. L., & Schmidt, R. C. (2007). Rocking together: Dynamics of intentional and unintentional interpersonal coordination. Human Movement Science, 26(6), 867–891.
Saint-Georges, C., Chetouani, M., Cassel, R., et al. (2013). Motherese in interaction: At the crossroad of emotion and cognition? (A systematic review.) PLoS ONE, 8(10), e78103.
Saint-Georges, C., Mahdhaoui, A., Chetouani, M., et al. (2011). Do parents recognize autistic deviant behavior long before diagnosis? Taking into account interaction using computational methods. PLoS ONE, 6(7), e22393.
Shockley, K., Santana, M.-V., & Fowler, C. A. (2003). Mutual interpersonal postural constraints are involved in cooperative conversation. Journal of Experimental Psychology: Human Perception and Performance, 29(2), 326–332.
Sun, X., Lichtenhauer, J., Valstar, M., Nijholt, A., & Pantic, M. (2011). A multimodal database for mimicry analysis. In J. Luo (Ed.), Affective Computing and Intelligent Interaction (pp. 367–376). Berlin: Springer.
Sun, X., Truong, K., Nijholt, A., & Pantic, M. (2011). Automatic visual mimicry expression analysis in interpersonal interaction. In Proceedings of IEEE International Conference on Computer Vision and Pattern Recognition (CVPR-W'11), Workshop on CVPR for Human Behaviour Analysis (pp. 40–46).
Thórisson, K. R. (2002). Natural turn-taking needs no manual: Computational theory and model, from perception to action. In B. Granström, D. House, & I. Karlsson (Eds), Multimodality in Language and Speech Systems (pp. 173–207). Dordrecht, Netherlands: Kluwer Academic.
Varni, G., Volpe, G., & Camurri, A. (2010). A system for real-time multi-modal analysis of nonverbal affective social interaction in user-centric media. IEEE Transactions on Multimedia, 12(6), 576–590.
Viaux-Savelon, S., Dommergues, M., Rosenblum, O., et al. (2012). Prenatal ultrasound screening: False positive soft markers may alter maternal representations and mother–infant interaction. PLoS ONE, 7(1), e30935.
Ward, N. G., Fuentes, O., & Vega, A. (2010). Dialog prediction for a general model of turn-taking. In Proceedings of InterSpeech (pp. 2662–2665).
Weisman, O., Delaherche, E., Rondeau, M., et al. (2013). Oxytocin shapes parental motion during father–infant interaction. Biology Letters, 9(6).
Weisman, O., Zagoory-Sharon, O., & Feldman, R. (2012). Oxytocin administration to parent enhances infant physiological and behavioral readiness for social engagement. Biological Psychiatry, 72(12), 982–989.
Wiltermuth, S. S. & Heath, C. (2009). Synchrony and cooperation. Psychological Science, 20(1), 1–5.
