Saturday, May 26, 2018
No matter how digital we may be today there are some devices that we don't seem to ever come to terms with.
My first example is the projector. At every conference or meeting I attend, someone has problems with these deceptively simple devices, and sometimes these problems escalate into lengthy battles as the audience murmurs sympathetically and knowingly in the background. We've all been there. You have your slides ready and connect the laptop to the projector, but nothing happens ... no signal detected. Even hardened tech professionals can be reduced to looking like embarrassed novices when confronted with a cranky projector. Projectors are totally unpredictable creatures that can be affectionate and happy one minute and then suddenly act as if they've never seen you before. They tend to be faithful to only certain devices and bitter enemies to all the rest, and they really object to newcomers. Most conferences therefore play safe and insist on uploading all presentations to a computer that they know the projector will trust. Anyone who tries to plug in an outsider simply gets what's coming to them. If you're just going to show a slideshow then that's fine, but if you're going to log into different web services and tools during your session, it's much harder to do so on a strange device that may object to the sites you try to log into, whereas your own device just works seamlessly. Even when you do make contact with the projector, it nearly always chooses a bizarre screen resolution so that your screen appears magnified, and you need to play around with various controls to get something the audience can see properly.
If I were asked to be counsel for the defense, I would probably build my case on how difficult it is for a poor simple projector to adapt to the myriad of settings and applications that people have on their devices. Older projectors simply can't keep up with the pace of change, and maybe it's unrealistic to expect them to do so. However, I do wish we could find a way for laptops and projectors to understand each other a little better.
My other example is the microphone. Here there are two issues: the devices themselves and our attitudes towards them. Wireless microphones have a habit of running out of battery power in the middle of a session, or there's some loose connection somewhere that cuts off the sound at regular intervals. If there's no reserve device close at hand, this can result in major interruptions and irritation. This problem is of course easy to remedy with good preparation. The other, trickier issue is people's extreme reluctance to use microphones at all. Even if the venue has microphones ready to use, there are always speakers who ask the rhetorical question, "I don't need a microphone, do I?" and the audience seldom objects. However, those whose hearing is not 100% will seldom raise an objection even if they can hardly hear what is being said. If we are serious about inclusion in education, the default should be to use a microphone. It doesn't hurt, and everyone can hear you.
I admit that headsets can be awkward to put on, but do it before you start and you'll be fine. Handheld microphones are trickier, and you need to hold them close to your mouth. I've seen so many speakers gesticulating with their microphone hand or holding the mike too far from their mouth, so that only the front rows can hear them at all. But with a bit of concentration and a positive attitude it works well, and everyone can hear you. Let's see microphones as inclusive technology and use them better.
Friday, May 18, 2018
Virtually every conference has a panel discussion where a number of decision makers and experts discuss the main themes of the conference. It is a good opportunity to hear these experts state their positions and, hopefully, engage in a lively and stimulating debate. However, although there is interaction on stage, the audience is seldom involved apart from a handful of questions from those who dare to speak up. We get to hear the panel's ideas and arguments, but how can they get to hear the audience's perspectives? There is an enormous amount of experience and expertise in the room that the panel members would learn a lot from hearing. Politicians and policy makers need to learn more about practical grassroots experience and thereby gain deeper insights into the issues they need to address. To do this, I think we need to flip the panel discussion.
One way could be for the panel to announce a few key questions (one at a time) and ask the audience to work in small groups and write answers in a collaborative document. After a few minutes, everyone in the hall has hopefully contributed to the discussion, and then the panel could comment on the answers. Then repeat the procedure as necessary. A lot of the session would be fairly silent as the participants write and confer, but the activity level would be much higher than in a normal panel discussion. Another idea, which would work at a smaller conference where there are quite a few decision makers, is to divide the participants into groups, send them to smaller group rooms and assign a small number of decision makers to each group. The experts' role would simply be to ask questions, let the group discuss and take notes on the answers. The experts would therefore focus on listening. At the end, the panel could comment on what they had heard from the discussions.
By using methods like this we can harvest ideas from all participants and give extremely useful input to the invited experts that they would never get from a traditional set-up. The conference could therefore become a greater learning experience, even for the invited guest speakers.
Tuesday, May 15, 2018
The danger of lectures is that they create the illusion of teaching for teachers, and the illusion of learning for learners.
This quote, generally attributed to one of my favourite authors, Albert Camus (though I can't find what work it is taken from after about twenty minutes of searching), is often used by those who want to scrap the traditional lecture and replace it with more active forms of learning where the teacher facilitates and mediates rather than being the headline act. The global stereotype of higher education is the gigantic lecture hall filled with students and the brilliant professor on the stage. It's what many students expect and what a lot of institutions still try to provide, though most lectures fall far short of the ideal. Lectures are popular because they are easy to produce, can be delivered to large groups of students and are based on the view of education as consumption of content. But today many institutions are moving towards pedagogical models that focus on active learning, co-creation and collaboration, and the physical landscape of the university is changing rapidly as more and more active learning spaces replace the old lecture halls and fixed-desk classrooms. Some universities have gone as far as to scrap the lecture hall completely, though they continue to produce lectures in digital format on their media platforms.
However, I believe that the lecture still has an important role to play in education as long as it is used wisely and sparingly. That was reinforced for me after reading an article by Michael Merrifield in Times Higher Education, University lecturers should be engaging raconteurs, claiming that the value of a lecture lies in its ability to engage and inspire, and that as such the lecturer must be, above all, a storyteller, a performance artist. It's not about going through the facts and theories that can be read in a book or article, it's about building a narrative that will inspire, provoke thought and challenge the audience.
So what is the point of a lecture? To be honest, I think it is something rather simple. It is to impart knowledge the lecturer currently has but the students do not, through a narrative that is more entertaining than reading the same material out of a book. So, when lecturing, I am not a sage on a stage, a phrase that is clearly intended as deprecating as well as being conveniently alliterative. I am, hopefully, an entertaining storyteller, which also sounds deprecating, but I don’t think it is.
Maybe lectures are about creating illusions but not in the sense implicit in the quote at the beginning of this post. The secret to a good lecture is creating the illusion of a compelling narrative, where you teach ideas and concepts by weaving them into a story with elements of surprise, suspense and inquisitive engagement. The lecture should be an event rather than an everyday ritual and as such it can be a very valuable teaching tool but only when well planned and delivered with enthusiasm. If you want to lecture then you need to ask yourself these questions:
- Are you sure that a lecture is the best way to engage the learners in this topic?
- How can you engage them in your narrative? e.g. a short teaser video/quiz to stimulate interest before the lecture, interaction using digital tools, short buzzgroup activities, creating suspense, use of props.
- What happens after the lecture? Is there a (digital) space for reflection, questions, follow-up work?
Your enthusiasm and ability to communicate effectively can make all the difference. Above all, make it unmissable!
Sunday, May 6, 2018
A recurring theme this year is the redefinition of free. I keep returning to this because I believe we are in the midst of a radical change in the way we use the internet. The internet of the nineties was free because it was mostly lightweight text-based pages, run and written by enthusiastic pioneers. Once the content started getting more sophisticated and demanded much more work to produce, the people who produced it needed to get paid for their work. But since free had become the default, the money had to be made somehow, and so advertising became the solution. Now that everything is powered by extremely sophisticated advertising, lobbying and disinformation, we suddenly realise that we have sold our every click, like and thought, and our integrity, for the fleeting rewards of the "free" internet. Now the model seems to be in a process of change, except that we're not yet sure which way to go.
I can recommend an interesting angle on this in a TechCrunch article, Subscription Hell. It's about the increasing number of content providers, tools, platforms and storage services that are suddenly imposing sometimes rather hefty subscriptions for services that used to be free, or freemium services that radically reduce the scope of the free version in order to force users to go pro. The change may not seem so great from the perspective of the company, but when you have become used to using a wide range of services and platforms, the prospect of paying for them all can be daunting.
I’m frustrated with this hell. I’m frustrated that the web’s promise of instant and free access to the world’s information appears to be dying. I’m frustrated that subscription usually means just putting formerly free content behind a paywall. I’m frustrated that the price for subscriptions seems wildly high compared to the ad dollars that the fees substitute for. And I’m frustrated that subscription pricing rarely seems to account for other subscriptions I have, even when content libraries are similar.
News media in many countries are disappearing behind paywalls, often leaving behind, as meager compensation, a simplified free version where all content simply drowns in a sea of ads. I follow many news media from around the world and appreciate the opportunity to read about world news from different perspectives. If they all put up paywalls, I'd have to choose which ones I am willing to subscribe to, and my perspectives would be seriously narrowed. Similarly in education, I have been forced to abandon useful tools because I can't justify the new subscription cost. It's often not the individual subscription that's the problem, it's multiplying that figure by 10 or 30 or 50.
I understand that all these services cost money to produce and the people who do that work need to be paid. If the advertising and data harvesting model is flawed and must be regulated, then we have to accept that a new model for financing the internet needs to be found. I pay for a few services and tools, but I'm still dependent on the "free" ones. The article suggests bundles of similar services and discounts for subscribing to several. Many are also discussing the model of micro-payments based on volume of use rather than flat-rate subscriptions. With the growth of digital transactions and the increased security available, this is more feasible than before. But if we want to move away from the exploitative model of today, where you are the product, then we have to find new ways to pay as we go. Are the days of free drawing to a close?
Subscription hell is real, but that doesn’t mean the business model is flawed. Rather, we need to completely transform our thinking around these models, including the marketing behind them and the features that they offer. We also need to consider consumers and their wallets more holistically, since no one buys a subscription in a vacuum. For too long, paywall playbooks have just been copied rather than innovated upon. It’s time for product leaders to step up and build a better future.