ACM Multimedia 2019

ConfLab: Meet the Chairs!

A multi-modal conference living lab!

Meet peers and chairs of ACM MM 2019 while co-creating a community data set you could be using for your own research.

How it Works

1. Subscribe

Subscribe to our emails using the form on this page and we will keep you informed about the signup process.

2. Meet the Chairs on Tuesday

Our data collection will take place on Tuesday from 15:30 to 16:30 in the Rhodes hall. When you arrive, we will check whether you have agreed to donate your data for research. You will then fill in a short survey about your research interests and your experience with the MM community. We will give you our newly designed MINGLE Midge, to be worn around your neck. After that, you are free to meet peers and the conference's organizational chairs at this event.

3. Tutorial and Debrief on Friday

Join us in learning more about the science and technology behind ConfLab, including discussions on privacy, ethics, and data sharing.

4. Research and the Future

Your data will help advance research on social interaction analysis in the wild. It will be shared with the research community in pseudonymised form under an End User License Agreement, to be used only for non-commercial and non-governmental research. We also hope to use it to set future grand challenges.


Subscribe to our notifications to stay informed about the signup process!

You can change your mind at any time by clicking the unsubscribe link in the footer of any email you receive from us or by contacting us at

We use Mailchimp as our email platform. By agreeing below, you acknowledge that your information will be transmitted to Mailchimp for processing. Learn more about Mailchimp's privacy practices here.

When you sign up for ConfLab you will need to consent to donating your data for research. We take your privacy very seriously. A preliminary version of the consent form is below.

Still not sure if you want to sign up? Check out our FAQ page for more information.

What data are you contributing?

We will be collecting the following data as part of this event:

Acceleration and proximity

Our newly designed MINGLE Midge wearable device records acceleration and proximity during your interactions. Acceleration readings can be used to infer some of your actions like walking and gesturing. It is worn around the neck like a conference badge.
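As a hypothetical illustration of the kind of inference acceleration data supports (the Midge's actual processing pipeline is not described here), a minimal Python sketch could flag movement such as walking or gesturing from the variance of the acceleration magnitude over a short window; the threshold and the example windows are made-up values:

```python
import math

def magnitude(sample):
    """Euclidean norm of one (x, y, z) accelerometer sample, in g."""
    x, y, z = sample
    return math.sqrt(x * x + y * y + z * z)

def is_moving(window, threshold=0.01):
    """Flag movement (e.g. walking or gesturing) when the variance of the
    acceleration magnitude over the window exceeds a made-up threshold.
    A still wearer reads a near-constant ~1 g, so the variance stays low."""
    mags = [magnitude(s) for s in window]
    mean = sum(mags) / len(mags)
    variance = sum((m - mean) ** 2 for m in mags) / len(mags)
    return variance > threshold

# Synthetic examples: a still wearer vs. a gesturing one.
still = [(0.0, 0.0, 1.0)] * 20
active = [(0.0, 0.0, 1.0), (0.3, 0.1, 1.2),
          (-0.2, 0.0, 0.8), (0.1, -0.3, 1.1)] * 5
```

Real analysis of the Midge recordings would of course rely on learned models rather than a single hand-picked threshold; the sketch only conveys the intuition.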


Overhead video

Overhead cameras will be mounted to capture the interaction. These videos will be used to annotate behavior and to detect social actions, such as speaking, and conversational groups.

Low-frequency audio

The MINGLE Midge will also record low-frequency audio. This low sampling rate is enough to recognize whether you are speaking, but not enough to reconstruct the content of your speech, giving us valuable information without compromising your privacy.
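To sketch why a low sampling rate protects speech content (this is an illustration with made-up numbers, not the Midge firmware): block-averaging a full-rate signal acts as a crude low-pass filter and downsampler, and the short-term energy of the result still separates speech from silence even though the spectral detail needed to recover words is gone:

```python
import math

def decimate(signal, factor):
    """Crude low-pass + downsample: average consecutive blocks of samples."""
    return [sum(signal[i:i + factor]) / factor
            for i in range(0, len(signal) - factor + 1, factor)]

def has_voice_activity(low_rate_signal, threshold=0.05):
    """Flag speech when the mean energy of the low-rate signal exceeds a
    made-up threshold. Energy survives heavy downsampling; intelligible
    speech content does not."""
    energy = sum(s * s for s in low_rate_signal) / len(low_rate_signal)
    return energy > threshold

# Synthetic 0.1 s clips at a 44.1 kHz full rate: near-silence vs. a loud
# 200 Hz tone standing in for voiced speech.
silence = [0.001 * math.sin(2 * math.pi * 200 * t / 44100) for t in range(4410)]
speech = [0.5 * math.sin(2 * math.pi * 200 * t / 44100) for t in range(4410)]
```

Decimating by a factor of 44 leaves roughly a 1 kHz stream, on which `has_voice_activity` still distinguishes the two clips.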

Survey measures

Your research interests and level of experience within the MM community will be linked to the data above via a numerical identifier.
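For illustration only (the actual ConfLab data layout is not specified here), linkage via a numerical identifier might look like the following sketch, where every record carries the pseudonymous ID but never a name:

```python
# Hypothetical pseudonymised storage: survey answers and sensor metadata
# are keyed by an arbitrary participant ID. Names are never stored
# alongside the research data.
survey = {17: {"interests": "multimodal analysis", "mm_experience_years": 3}}
sensor = {17: {"midge_id": "A2", "acceleration_samples": 51234}}

def linked_record(pid):
    """Join survey and sensor data for one pseudonymous participant ID."""
    return {**survey.get(pid, {}), **sensor.get(pid, {})}
```

The join recovers a complete research record per participant while keeping identities out of the dataset itself.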

Hayley Hung

Associate Professor: Social Perceptive Computing

Ekin Gedik

Postdoctoral researcher: Multi-modal Social Experience Modelling

Bernd Dudzik

PhD Student: Memory as Context for Induced Affect from Multimedia

Stephanie Tan

PhD Student: Multi-modal Head Pose Estimation & Conversation Detection

Chirag Raman

PhD Student: Multi-modal Group Evolution Modelling

Jose Vargas

PhD Student: Multi-modal Conversational Event Detection


ConfLab is an initiative of the Socially Perceptive Computing Lab, Delft University of Technology.

We have over 10 years of experience in developing automated behavior analysis tools and in collecting large-scale social behavior data in the wild.

We are partially supported by the Netherlands Organisation for Scientific Research (NWO) MINGLE project and by the organization of ACM Multimedia 2019.

Have any questions?