Week 5: Bums in Seats Tells You Almost Nothing
Most post‑event reports start the same way.
‘We had 420 people in the room. Registration exceeded target by 15%. Standing room only.’
It sounds impressive. It looks good on a slide. And it tells you almost nothing about whether the event actually worked.
Attendance is a reach number, not an impact number. It says people showed up. It does not say what changed because they were there.
That is the gap we need to close.
The Problem with Counting Bodies
Counting bodies is easy. It is also lazy.
You can fill a room by making attendance compulsory, putting the right name on the invite or offering decent food and a day out of the office. None of that means the event did anything beyond occupying time and budget.
An event worked only if something that matters to the organisation is different afterwards: a belief, a behaviour, a relationship, a decision, a number on a report.
If you cannot point to that difference, all you have is a busy room.
Three Better Questions Than ‘How Many Turned Up?’
Before we talk about metrics, it helps to change the questions.
Instead of ‘How many people attended?’, ask ‘Who did we get in the room?’ Did we get the right people, or just a lot of people? For a sales kick‑off that might mean quota‑carrying reps and their leaders, not every possible staff member. For a partner event it might mean the right tier of partner, not every reseller in the database. The difference matters.
Then ask ‘What did we want them to think, feel or do differently?’ If you cannot answer this in one or two sentences, there is no way to measure success honestly. ‘Informed’ and ‘engaged’ are not enough. Did you want them to back a new strategy? Sell differently? Renew membership? Stay with the organisation? Invest more? Get specific.
Finally, ask ‘What happened in the weeks after the event that would not have happened otherwise?’ This is where the real evidence lives. Did sales conversations move? Did internal projects unblock? Did leaders see different behaviour on the ground? Did members renew at higher rates? Events do not exist in a vacuum. They are supposed to shift something. If nothing shifted, why did we do it?
Once you have answers to those three, the metrics start to make sense.
Activity vs Outcome
We find it useful to separate activity metrics from outcome metrics.
Activity metrics are the usual suspects: registrations versus attendance, session popularity, satisfaction scores (‘rate the conference out of 10’), social media mentions and photos. They tell you about the event as an event. They are not useless, but they are very easy to spin.
Outcome metrics look different. They are specific to the event’s purpose.
For a sales conference, did opportunity volume, win rates or average deal size change in the quarter after the event compared to before? For a member congress, did renewals, event re‑bookings or uptake of member services shift in the following months? For an internal leadership summit, did teams actually implement the changes agreed in the room? Were decisions made faster? Did staff engagement or retention move in the areas you focused on?
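If you can export the raw data, that comparison does not need to be fancy. Here is a minimal sketch in Python of a before/after check on sales outcomes; the event date, file name and column names (close_date, won, amount) are all hypothetical, not a real export format.

    import pandas as pd

    # Hypothetical event date and export; adjust to your own data.
    EVENT_DATE = pd.Timestamp("2024-03-15")
    deals = pd.read_csv("opportunities.csv", parse_dates=["close_date"])

    windows = {
        "quarter before": (EVENT_DATE - pd.DateOffset(months=3), EVENT_DATE),
        "quarter after": (EVENT_DATE, EVENT_DATE + pd.DateOffset(months=3)),
    }

    for label, (start, end) in windows.items():
        w = deals[deals["close_date"].between(start, end)]
        print(f"{label}: {len(w)} opportunities, "
              f"win rate {w['won'].mean():.0%}, "
              f"avg deal size {w['amount'].mean():,.0f}")

A comparison like this will not prove causation on its own, but it puts the event on the same page as the numbers it was supposed to move.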
None of these are perfect. All of them are better than ‘we filled the ballroom.’
A Simple Timing Framework
You do not need a PhD in data to do this well. You just need a bit of discipline around timing.
Before the event, ask two things: ‘What is the one thing we would like to be able to say, truthfully, three months after this event?’ and ‘What evidence would convince a sceptical CFO or CEO that the event helped?’ Agree those answers before you lock the agenda. Otherwise you will end up defending a very expensive day out based on nothing more than how happy people looked in the photos.
During the event, build the bridge. Are we collecting anything more than smile sheets? Are we capturing who attended which sessions in a way that can be linked to later behaviour? Are we giving people clear, simple next steps to take afterwards? This is where tools like Pollinate’s data layer come into their own for broadcasts and live streams, but the principle is the same for physical events: know who did what, not just who turned up.
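For a physical event, ‘know who did what’ can be as simple as scanning badges at the door of each session and keeping the records in a shape you can later join to CRM or HR data. A hypothetical sketch of that shape (the field names are illustrative, not a real Pollinate schema):

    from dataclasses import dataclass
    from datetime import datetime

    @dataclass
    class SessionScan:
        attendee_id: str   # the same ID used in your CRM or HR system
        session_id: str    # which talk or workshop they walked into
        scanned_at: datetime

    scans = [
        SessionScan("emp-0042", "cross-sell-workshop", datetime(2024, 3, 15, 11, 0)),
        SessionScan("emp-0107", "plenary", datetime(2024, 3, 15, 9, 0)),
    ]

The detail that matters is the shared ID: without it, session data can never be linked to later behaviour.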
After the event, look for patterns. Did the people who engaged most with a particular topic behave differently afterwards? Did teams whose leaders attended respond differently to change? Did segments of your audience (members, partners, customers) show different results? The idea is not to prove a neat causal link for everything. It is to see whether the event showed up in the numbers and decisions you care about.
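Continuing the illustration, a rough sketch of that kind of pattern check: join session attendance to a per‑person outcome and compare the groups. All names and numbers here are made up.

    import pandas as pd

    # Made-up data: who scanned into the cross-sell workshop, and each
    # person's cross-sell deals in the following quarter.
    attendance = pd.DataFrame({
        "attendee_id": ["emp-0042", "emp-0107"],
        "session_id": ["cross-sell-workshop", "plenary"],
    })
    outcomes = pd.DataFrame({
        "attendee_id": ["emp-0042", "emp-0107", "emp-0300"],
        "cross_sell_deals_next_q": [4, 1, 0],
    })

    workshop = attendance[attendance["session_id"] == "cross-sell-workshop"]
    merged = outcomes.merge(workshop, on="attendee_id", how="left")
    merged["attended_workshop"] = merged["session_id"].notna()

    # Average next-quarter cross-sell deals, attendees vs non-attendees.
    print(merged.groupby("attended_workshop")["cross_sell_deals_next_q"].mean())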
Two Very Different Successes
Two real conference outcomes illustrate the difference.
In one, we had 600 people in the room, high satisfaction scores and glowing feedback about the entertainment. Six months later, sales performance and staff turnover looked exactly the same as before. The client called it a success. We did not.
In another, attendance was slightly below target and a few people grumbled about the time it took to get to the venue. But in the next quarter, the client saw a measurable lift in cross‑selling between the two product lines we had focused on, regional leaders using the same language and story we had developed in the plenary, and an internal project that had stalled for a year finally getting agreed and funded.
On paper, the first event looks better. In reality, the second one did what it was meant to do.
That is the shift we are talking about.
What This Means for Your Next Brief
The next time you are writing or receiving an event brief, try this small change.
Instead of ‘We need a national roadshow for 400 customers,’ say ‘In three months’ time, we want to see X, Y and Z changed in this group. We believe a national roadshow is the right lever. Help us design and measure it so we will know if that actually happened.’
Once you frame it that way, ‘bums in seats’ becomes what it really is: a useful context metric, but nowhere near the headline.
The headline is always the same question: what changed because this event happened?
If you cannot answer that, it does not matter how full the room was.
Next week: four questions every event brief should answer, and why most briefs do not.