Over the past several years of attending a large number of conferences, I’ve noticed a significant change in how information is shared with attendees, and in how folks learn most effectively.
Think back to even ten years ago: most conferences were attended by folks who wanted to gain new information, and were willing to travel (in some cases great distances) to learn. It made little difference whether in the information technology space, the social sciences, or even medicine: all conferences were positioned as a way for folks to learn the latest news in their area of expertise. Vendors, of course, were often arranged in a large “expo” where they could market their wares to attendees.
Now, of course, things have changed dramatically: most products and vendors are only a short web search away. News about products or services that used to take months to disseminate through a particular sales channel is now often released in preview form months before general availability, allowing early adopters not only to learn the information first, but to help influence the design of the products themselves. Even conference sessions themselves are often available (as is the case with Microsoft Ignite) as soon as they are delivered, for consumption over the web without any cost. Without the need to market in person, expo attendance at many conferences has dropped dramatically, in some cases causing the cancellation of the conference itself.
In this environment, it becomes more difficult to say what the value of attending a conference is. I’ve heard some folks say “networking”, and others indicate that, er, attending parties as an HR “reward” justifies attendance. I don’t find the latter a compelling reason, but I am sympathetic to the former – seeing folks in person whom one may have only emailed or video-conferenced with does have a meaningful effect on how you interact with them. The challenge, of course, is that none of this has anything to do with learning anything new. Are conferences still a useful way for staff to learn?
In this age of previews, the answer is “not really”. Any organization looking to gain mastery over a new product is far better served kicking the tires (or, in the case of my organization, “drinking your own champagne”) before it is ready for prime time. By the time a product hits general availability, that milestone is really more a recognition that the product is fully baked and can support a service level agreement than any indication of new features. Worse: most conferences compress learning into one giant week of education. Study after study has shown that massed practice is inferior to spaced practice – learning small bits of information each day over the course of a year. Many conferences exacerbate the negatives of massed practice by combining a constant stream of information with social activities in the evenings that reduce retention (e.g. actual champagne!). The fact that folks prefer to go to conferences, and believe they are learning there, says more about the appeal of the format than its effectiveness.
Without a solid learning component, or the ability to broadly market services, why still go to conferences? The short answer is those face-to-face meetings mentioned earlier. Large firms tend to be populated by lots of leaders – and unless one is located in the same town as the organization’s headquarters, it can be difficult to get even a thirty-minute meeting with someone capable of translating direct feedback into action. Additionally, there are some conferences (e.g. Microsoft Envision) that are not designed for the general public, and the ability to speak with a smaller subset of critical peers is certainly worth the time and energy. Both of these scenarios provide a solid rationale for attending – though this may mean a smaller number of folks travelling – and that ultimately drives success for everyone involved.