Conferences Suck At Measurement!

Photo: Evaluation scale by Bill Sodeman (2014.04.22)

Conference professionals suck at measurement.

If you trust your conference smile sheet evaluations as a barometer of how effective your conference education was, you are fooling yourself, says learning research psychologist Dr. Will Thalheimer.

Ouch! The truth hurts!

The Emptiness Of Smile Sheet Evaluations

Thalheimer points to research showing that smile sheet evaluations have roughly a 0.09 correlation with whether learning actually occurred. That’s basically no correlation at all.

Smile sheets are not related to learning.

Smile-sheet ratings that are good don’t guarantee that the learning intervention was effective.

Smile-sheet ratings that are bad don’t necessarily mean that the learning intervention was poor. ~ Dr. Will Thalheimer.
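To see why a correlation of about 0.09 is "basically no correlation at all," recall that the share of variance in one measure accounted for by another is the square of the correlation coefficient. A minimal sketch of that arithmetic (plain Python, illustrative only, with the 0.9 case added purely for contrast):

```python
# The coefficient of determination: the fraction of variance in
# learning outcomes that smile-sheet ratings can account for is r squared.
def variance_explained(r: float) -> float:
    """Return the fraction of variance explained by a correlation r."""
    return r ** 2

# Correlation reported for smile sheets vs. actual learning (~0.09):
print(f"{variance_explained(0.09):.2%}")  # -> 0.81%

# For comparison, a genuinely strong correlation of 0.9:
print(f"{variance_explained(0.9):.2%}")   # -> 81.00%
```

In other words, even taking the figure at face value, smile-sheet ratings would account for well under one percent of the differences in what attendees actually learned.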

Still, we praise and extol our smile sheet summaries. We hold them up to board members and future conference prospects as proof of how successful our event was. We promote vague realities.

We don’t realize how seriously flawed our measurement is. Those happy-face assessments cost us dearly, sucking the life out of any quality-improvement process.

Encouraging Bias Towards Entertainment

To paraphrase Thalheimer,

By only seeking attendee opinions about education sessions, we’ve created a bias toward entertainment, motivation and charisma. We fail at evaluating attendees’ true learning, remembering and application as well as the session’s content validity.

By measuring only when the attendees are in conference sessions, we can’t evaluate if the attendees remember and apply the content at work.

We can measure if the participants have immediate understanding and recall. However, our evaluations don’t measure if the session supported long-term attitude, behavior and skill change.

In short, we don’t know if they ever applied what they heard at the conference. Or if it really helped them at all.

Smile Sheet Artificial Responses

Recently I attended a conference where the organizers texted a happy face and a frown face to attendees immediately after every session, meal and reception, along with a box for text comments. Participants were to tap the appropriate face and text it back to the organizers.

Wow! What type of information could that possibly provide to the conference organizers beyond attendees’ immediate gratification? That evaluation process has very little to do with conference improvement or real learning.

Usually our smile sheet evaluations fail to examine whether participants can apply the information in their profession. We don’t even know if our conference education improves workplace performance.

In short, we don’t get the attendee feedback we need to make quality conference improvements.

The Bad News

Most of us will continue to use smile sheet evaluations anyway!

It’s easier. It’s what we’ve always done. We don’t know any other ways.

Unfortunately, many conference organizers and hosts don’t care if learning occurred. They only care that their paying attendees walked out of the room with temporary and possibly fleeting satisfaction. It’s all about the short-lived high, the cotton-candy fluff, the enjoyment of the experience.

And due to a lack of accurate information from valid evaluations, we make very poor decisions about future conference education and networking.

We’ve created a black hole of information with our current measurement procedures.

The Good News

Is there a better way? Yep! We’ll discuss better strategies in our next post.

What are some good questions you’ve seen on smile sheet evaluations that are an improvement over the traditional questions? What should conference organizers measure to help improve future conferences?

6 comments
  1. Measuring requires planning. Planning requires responding to measuring. Most humans do not like to do either. They’d rather thrive on inspiration and positive emotional responses, and blame their problems on things out of their control.

  2. […] conferences suck at collecting data. (Oh, we’re good at collecting registration and fees but that’s about […]

  3. I think asking what specific things you learned and how do you plan to apply what you learned are at least a start. I don’t know if it works for everyone, but for me, writing down an intention to do something ups the chances I actually will do it.

    I like the presenter who once had us all write down what we intended to change on a postcard we self-addressed, which he then sent to us three months down the road. You can’t necessarily measure the results that way, but it may help to jog memories and maybe get people to act on what they learned.

    Looking forward to your next post! This is a problem with pretty much all conferences I’ve been to.

  4. We’re making a run at ratings for conferences and events similar to TripAdvisor.com. Check out http://www.conferenceiq.com

  5. […] Recently, I wrote about how most conference organizers are really bad at measurement. […]

  6. […] actually provide adequate evaluation strategies. Some provide smile sheet strategies that are grossly biased as Dr. Will Thalheimer has proven. His research and writings show a better smile sheet strategy to […]
