Tuesday, December 29, 2009

Technology in the classroom - fond memories

The year was 1969 and Armstrong and Aldrin were soon to land on the moon using less computing power than the average modern cellphone. My primary school class were also making a giant leap by being allowed to follow the country's first TV-broadcast sex education series. Up till then there was little use of television in the classroom, but this awkward subject seemed ideal for TV since it meant that teachers would be released from teaching potentially embarrassing lessons. Furthermore, all pupils would get the same message.

As I remember, the series was very well done considering the taboos of the period and we got the message with a minimum of fuss. However, all parents had to give written consent to their children seeing the series and a small group were excluded, having to sit in another room and draw or read. Maybe some parents didn't like the idea of such a subject being taught by TV or maybe they thought we should remain sweet and innocent a few more years. Whatever the reason I think the medium was excellent for the purpose and stands in stark contrast to the more traditional method of sex education I encountered the following year.

I was in a new school then and only a few of the pupils had seen the revolutionary television series. So it was the job of the biology teacher to update us all on the facts of life. He was a decent teacher but on this subject he was rather shy. We first got a confusing lesson about all the vital organs, with all the names in Latin and diagrams that reminded me mostly of marine invertebrates. Many didn't even realise we were getting sex education. The final and most memorable part of this process was when we watched a film about the mating habits of locusts. They touched each other with their antennae for a while before the male jumped up on the female and they stayed locked together and motionless for some time. I remember one lad asking the teacher if they enjoyed it and he said it was hard to tell. The lad replied promptly, "Look sir, I think that one is smiling!"

That was it. Goodness knows how the kids who hadn't seen the TV series managed to work out the intricacies of sexual intimacy. I'm not sure what the moral of this story is but it's a good example of how far we've advanced in tackling this sensitive subject and how the "good old days" of education were often less effective than we'd like to remember.

Photo: Cheryl Recca, Stockvault.com

Monday, December 21, 2009

Being bored

How often do you simply sit staring into space, unable to think of anything to do? Or maybe you don't have an iPhone yet. The fact is that we are seldom in such situations nowadays since we can always watch, listen to or read something on some kind of mobile device. Failing that, you're never far from a TV screen or piped muzak. Waiting for a bus or train used to be dull but now I can listen to music or podcasts, update Facebook or Twitter (e.g. "I'm waiting for a bus"), check the latest news or sports results and even watch highlights from a match. Now you barely notice that the bus went swishing past you ten minutes ago. The soundtrack of our lives keeps playing wherever we are.

In our always-on society we simply haven't any excuse for being bored. But boredom can be beneficial. Those quiet moments give us time to think and that may even lead to creative thinking. A blog post by Mattias Klang (in Swedish) lists a number of things he will miss in the future: bookshops selling more than just bestsellers, newspapers, notebooks and pens, letter writing and of course non-productive time. That non-productive, quiet time is under threat. It's becoming impossible to resist the temptation to connect.

Those of us brought up in the days of one or two TV channels and not many more options on the radio had plenty of media-free time to contemplate. Are we therefore better at handling silence and inactivity than today's youngsters? Is quiet time an essential part of our lives that is now under threat from media bombardment? I feel it may be, and that we all need to be confronted with boredom now and again, but it's not something we willingly volunteer for. It's easy to say "just switch off" but much harder to do.

Friday, December 18, 2009


I read a lot of magazines and probably subscribe to one too many; I just can't kick the habit. It means that I seldom get around to reading books because magazines get in the way. I also enjoy visiting newsagents and browsing the ever-increasing range of magazines on offer. There you really see the overwhelming volume of production available today. On the net you mostly look at one site at a time, but on the newsstands you see them all side by side.

But is all this soon to go, swept away by the same forces that are undermining video hire stores and record shops? Magazines are glossy, attractive and full of top-quality photography. Today's e-book readers like the Kindle just can't compete with the paper versions, but what happens when tablet readers get the same graphics as the glossy mags, plus links to video, animations and interactive content?

This film is a vision of a possible e-magazine reader from the Swedish publisher Bonnier (Digital magazines: Bonnier Mag+ Prototype). We're not there yet but soon will be and the big question is whether people will be willing to pay for this sort of attractive content. Could this be a way for publishers to earn money from content? The device is slim and no bigger than the average paper magazine but should of course be able to store hundreds of magazines. Now if that device can also act as a computer screen so I don't need to carry several devices then I'm very interested. When devices like this come on the market the newsagents could be in trouble.

Mag+ from Bonnier on Vimeo.

Putting it simply

I mentioned recently how hard it is to explain educational technology to friends and family and it's always good to get practical tips on how to explain our much-loved terminology in plain English.

Christopher D Sessums offers a wonderfully simple definition of Web 2.0 that I'd just like to pass on (see A simple definition: Web 2.0). Make further suggestions on his blog.

Web 1.0 = me
Web 2.0 = me + you

Web 1.0 = read
Web 2.0 = read + write

Web 1.0 = connecting ideas
Web 2.0 = connecting ideas + connecting people

Web 1.0 = search
Web 2.0 = recommendations of friends/others

Web 1.0 = find
Web 2.0 = share

Web 1.0 = techies rule
Web 2.0 = everybody rules

Wednesday, December 16, 2009

Making the grade

Finding a reliable, objective and fair way of quantifying learning is the El Dorado of education. Grades are the standard way of showing how much you have learned at school and how well the school has taught you. There are plenty of calls today to go back to more standardised tests so that schools' quality and efficiency can be assessed. Parents want to send their children to the best schools: those with the best results.

As a result, students learn just enough to pass the tests and constantly ask teachers whether this information will be in the exam; otherwise it isn't worth learning. Teachers teach to ensure that the students pass the exams, so that the school remains high in the "league table" and gets generous funding. Students with good grades then naturally expect to get the good jobs.

I admit I was pretty good at the art of passing exams but when I look back I didn't really understand how to apply what I had learned until much later. The CV looked good but did that really mean much? I could have learned so much more if I had been more aware.

Clay Burrell's blog post, Why "academic excellence" no longer cuts it today, claims that mere grades are far from enough today. Passing the exam only takes you half way and Burrell names "withitness" as a vital factor: the ability to really learn and to apply that knowledge. It's what gets you the job when there are several other candidates with top grades. It's about having a natural curiosity to find out more and to go beyond the limitations of the set curriculum. In most careers there is no textbook you can learn by heart, no set learning objectives. To succeed you have to go outside the walls and explore, take risks, sometimes fail and above all be open to new ideas.

Monday, December 14, 2009

Glass ceiling

Working with net-based education is fascinating and a never-ending learning process. I often have to revise my views and have no doubt displayed a few inconsistencies since this blog began. The frustrating side is that, despite so much evidence that education can benefit greatly from technology, there is so little enthusiasm from educational leaders. Edtech conferences are nearly always stimulating but tend to be gatherings of the converted; the top decision makers are conspicuous by their absence. As a result there's a massive disconnect between the edtech community and the leaders.

I read Bill Ferriter's blog post, Retaining net gen teachers: an impossible dream, with great interest, nodding in agreement at most of it. His point is that innovative, "net gen" teachers all too often leave the profession after getting little or no response to their creative ideas. I'm not sure about the net gen label he uses, as there are plenty of older people who are much more net gen than many teenagers. Let's call them innovative teachers instead.

These innovators soon become frustrated at the built-in conservatism in education and leave to find more stimulating work in the business world instead. Maybe it's all part of the educational cycle where those who enjoyed and thrived in a traditional school environment then study to become teachers and continue the tradition. To break the circle we need more disruptive teachers, especially those who did not enjoy their schooldays. But how?

"Our senior leaders do a ton of talking about the power found in collaborative teams but do little to create the kinds of structures that might make achieving something worthwhile alongside motivated colleagues possible ... Not only will it be difficult within the current structures to find the resources to reimagine our profession, I see little political will to make the kinds of changes necessary to retain Net Generation teachers."

Thursday, December 10, 2009

Internet safety - who needs most help, children or adults?

The UK Council for Child Internet Safety has launched an initiative to introduce compulsory lessons in internet safety for all primary school pupils from 2011 (see also a BBC article, Internet safety for children targeted). Good to see coordinated action being taken in this important field but it's not just a case of simply warning the children. Adults must be much more aware of what goes on on the net and the key skill of digital literacy for all comes to mind.

In response to the news there's a highly relevant blog post, So shoot me.., that looks at the real dangers facing children on the net but also stresses the need for adults to learn to become better role models (be sure to read the comments to this post). The net is just a reflection of society and there's a sad lack of respect for other people's feelings and beliefs, not just on the net but in other media. Net bullying, hate campaigns and abusive comments are there for all to see on many websites and children absorb these impressions. Let's help the kids to use the net responsibly but we adults have to radically clean up our act too. It's like insisting on your kids wearing a seat belt or cycle helmet and then not doing so yourself.

Tuesday, December 8, 2009

The texting myth

One of the most prolific urban myths in recent years is that teenagers' cellphone texting is seriously damaging their writing skills. Tales of students handing in school assignments full of text abbreviations are passed around the net but is there any truth behind them?

It's refreshing to get the answer from one of the most respected authorities in language and communication, David Crystal, in his new book, the aptly named Txtng: The Gr8 Db8. There's an interview with him in Visual Thesaurus, David Crystal on the myth of texting, where he states that the texted assignment was really a hoax put out on the net to stir up feelings and then became a truth that people were only too willing to believe. Internet myths are much stronger than myths of the past since they can become global "truths" in a matter of hours.

Abbreviations are used in SMS texting and, indeed, in the more adult arena of Twitter due to space restrictions. We're forced to cut out all embellishments and focus on the bare bones. Teenagers, argues Crystal, are able to cope easily with different registers of language and realize clearly when texting language is appropriate. Interviews with many teenagers reveal that they can't imagine why anyone would use texting abbreviations in school work; it simply doesn't belong there and they all realise that. In addition, by analysing large numbers of text messages Crystal found that only around 10% of words were abbreviated at all, thereby deflating the whole debate.
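Crystal reached his figure by analysing real message corpora; purely as a toy illustration of the kind of counting involved (the abbreviation list and sample messages below are invented for the example, not his data or method), the calculation might look something like this in Python:

```python
# A hypothetical mini-list of texting abbreviations, for illustration only.
ABBREVIATIONS = {"gr8", "b4", "u", "r", "lol", "thx", "pls", "2moro"}

def abbreviation_rate(messages):
    """Return the fraction of words that appear in the abbreviation list."""
    # Split every message into words and strip trailing punctuation.
    words = [w.lower().strip(".,!?") for msg in messages for w in msg.split()]
    if not words:
        return 0.0
    abbreviated = sum(1 for w in words if w in ABBREVIATIONS)
    return abbreviated / len(words)

sample = [
    "See you b4 the match, thx for the ticket!",
    "Running late, meet me at the usual place",
]
# 2 abbreviations out of 17 words, i.e. roughly 12%.
print(f"{abbreviation_rate(sample):.0%} of words abbreviated")
```

On a realistic corpus a count like this tends to come out low, which is the point Crystal makes: most words in text messages are spelled out in full.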

There's nothing new about abbreviated forms, of course. I certainly used them in my note-taking at lectures at university and they certainly didn't get reproduced in my essays. Property terms like des res and all mod cons (desirable residence, all modern conveniences) have been with us for many years without any fears for estate agents' literacy skills. Somehow the use of devices that many adults still feel uncomfortable with makes old habits suddenly seem threatening.

Monday, December 7, 2009

Distractions and their price

My favourite themes at present seem to be multitasking and backchannels and I return to the former once again. New and interesting articles on multitasking just keep coming and the latest one to catch my attention is new research into the effects of pop-ups on our computer screens. In this case it's not the brash, flashing pop-up ads that explode in your face on sites of a dubious nature, it's the pop-up alerts we get to tell us that a new e-mail, tweet or Facebook update has arrived. How much do such interruptions disturb our concentration?

That's the theme of research by Dr Helen Hodgetts and Professor Dylan Jones of Cardiff University entitled Now, where was I? Cognitive models and support mechanisms for interrupted task performance. They show that these interruptions break our cognitive focus and that it can take a minute or two to get back on track even when the interruption was of little significance. Not surprisingly, the louder or more obvious the alert, the greater the disturbance. Evidently discreet audio alerts can give us time to decide whether to attend to or ignore the incoming message, thereby maintaining concentration. The moral of the story is that alerts should be as discreet as possible and that we should be able to personalize them according to the situation.

I also plead guilty to allowing alerts to interrupt me while trying to concentrate on reading or writing (right now, though, I've only got background music). I think most of us find it difficult to turn off the e-mail, instant messaging, Twitter, cellphone etc when we really need to concentrate. I really must shut them down more often even if it is fun to communicate.

For more on this research read a report on Live Science, Workers should turn off visual alerts, and from Wales Online, Curse of the computer pop-up costs us so dear.

Saturday, December 5, 2009

Backchannel guide

After several recent posts about the pitfalls of back channel communication at conferences I was pleased to find that someone has written a practical guide to this area. It's called The Backchannel Book and is a community wiki full of information on the various tools you can use (Twitter, Yammer, various IM tools, document sharing etc.), how to use them and related articles.

One page that particularly caught my attention, given the recent reports of Twitter heckling and disruptive behaviour, is the Backchannel Agreement. This is a list of guidelines outlining a conference code of conduct aimed at organisers, presenters and participants. It's a very relevant checklist for any conference and, with some adaptation, the basis of a code of conduct for any class as well.

Friday, December 4, 2009

Terms of participation

Howard Rheingold is a guy I'd like to meet but until that happens I enjoy watching his video contributions and therefore paste in his latest thoughts on digital literacy, The internet as playground and factory.

The Internet as Playground and Factory - Howard Rheingold from Voices from The Internet as Play on Vimeo.

Read also an article by him on Encyclopedia Britannica Blog, Is multitasking evil? Or are most of us illiterate?

So what line of work are you in then?

Do you have the same problem as I have when people ask what you do for a living? Explaining distance learning can be hard enough, since a lot of people have no idea it exists, but how do you start explaining social media and how they can be relevant for education? It can be quite a shock to the system to meet people who have no idea what you're talking about. How do you get the message across clearly, briefly and without frightening them away?

I had such an experience today and I fear that in my enthusiasm to enlighten I just succeeded in confusing. Most people still see the classroom as the model for all education and the net as, at best, a source of entertainment. The connection between the two is unclear. Many such people are teachers, working hard and teaching well in most cases. But the potential of the net for accessing knowledge and connecting with others hasn't become apparent to them. How to start explaining?

Then I saw an excellent blog post by Shelly Terrell called Most teachers don't live there which provides a convincing and positive set of arguments for teachers who are doubtful of the value of the net in education. If we are educators shouldn't we participate in discussions with our colleagues around the world? Shouldn't we compare our own work with others and learn from each other? Shouldn't we help students use the net responsibly? To do this we need to be out there reading and writing blogs, participating in forums and sharing our knowledge.

"Technology is not the enemy and ignorance is not bliss. If we don’t show students how to use social media and technology, then we cannot complain when they use this in unhealthy ways."

Monday, November 30, 2009

Let's talk

The debate on the misuse of Twitter back channels at conferences continues and I have to mention another good post on the subject from a participant at the Web 2.0 Expo, Michelle Riggen-Ransom: Web 2.0 Expo: Harshtags, Twecklers and the Silence of the Death Star. She suggests that Twitter flows at conferences should not simply be beamed onto the screen behind the speaker; there should be a moderator function. Admittedly the hecklers would still be able to send their wisecracks but at least they wouldn't be magnified on the big screen.

The other main point in this post is also worrying. The participants were so engrossed in their laptops and cellphones that there was very little direct conversation, one of the main attractions of going to a conference in the first place. I've had the same experience a few times: at break times you look around for people to meet but everyone is too busy typing to notice you. In the end you just find a corner and start typing too, so you look as if you're busy.

Are we hiding behind our devices, afraid of real human contact? Social media can certainly extend the reach of a conference and I have "participated" in several via Twitter, Second Life or web meeting. We can also bring the delegates closer together by providing a pre-conference community site to make contacts. But the main event is actually meeting all these net contacts face to face and discussing over a coffee or an evening drink.

As Michelle concludes:
"Next time you’re at a conference, try putting away the iPhone or the Blackberry during breaks. If you disagree with a presenter, seek them out afterwards, write a thoughtful blog post or contact them via Twitter to start a conversation. Say hello to people. Be open. You could meet someone IRL (!) who could become a friend, a mentor or business partner, or even start a project that makes the world a better place for your being in it."

Thursday, November 26, 2009

Spam spam spam

I read recently that over 90% of all e-mail in the world is spam. Despite this it's still the most popular means of written communication. I suppose the world's spam filters must be doing a good job otherwise we would have given up by now. However, even if the spam count is low many people feel engulfed by the sheer volume of non-spam e-mail. It's a long time since we actually enjoyed getting e-mail.

If e-mail has become passé then we get our pleasure from other services. I still think it's fun getting a comment on my blogs or someone mentioning me on Twitter (sad, I know). However, there are signs that the spammers are taking over even there. There's a good post on James Clay's blog E-learning Stuff called Ten reasons why Twitter will eventually wither and die. He lists ten threats to Twitter, mostly to do with the spamming and sabotage that are already creeping in. The sheer openness of the service makes it extremely vulnerable to attack and if your identity gets used for spamming or worse you will of course stop using Twitter (or whatever other service). Similarly, bloggers give up when their blogs get bombed by abusive spammers.

Could the openness of the social web be its ultimate downfall? The potential for constructive collaboration is enormous, but so is the potential for sabotage and trashing. How do we protect our net freedom without restricting it in some way?

Wednesday, November 25, 2009

Tweckling II, the speaker's view

The use of Twitter to digitally shout down a speaker at the recent Web 2.0 Expo (see previous post) has produced a lot of welcome debate about how the relative anonymity of the net allows some people to behave in a thoroughly disrespectful manner. The speaker at that conference, Danah Boyd of Microsoft Research, has written an admirably honest analysis of the presentation on her blog Apophenia, Spectacle at Web 2.0 expo ... from my perspective.

She was unable to see the Twitter flow during her presentation but felt increasingly uncomfortable as members of the audience laughed without apparent reason. Unaware of the wisecracks and derogatory remarks going on behind her back, she could not respond and her presentation suffered accordingly. The back channel had taken centre stage and she was powerless. Of course, if someone had actually asked a question or made a direct comment she could have reacted and dealt with the issue, but that didn't happen.

The issue is of course one of respect. If you're using Twitter or other such tools professionally, isn't it best to include a photo of yourself and adopt a name close to your own? I don't see the point of hiding your identity, especially at a conference where the whole point is to interact and meet people. If you are identifiable you are accountable for your comments, and people can easily see who is disrupting the session.

This topic has certainly sparked off a debate and Danah's blog post has so far received 105 comments.

Saturday, November 21, 2009

Not waving but drowning

I finally got a Google Wave invitation and logged in a couple of weeks ago. That's it - so far. It's still in quarantine until I have time to work out what to do with it. I'm not sure why I'm keeping it at arm's length since it must be one of the most awaited (and hyped) applications of all time and I've read plenty of rave reviews from people I trust. I think I got a bit turned off by the whole business of sending out a limited number of invitations (according to Google anyway) and letting the world fight over them. Talk about creating demand. Very clever marketing of course.

Already I have a few contacts in my Wave box and I clicked on one of the conversations. It was a long column of messages and embedded documents resembling a long chat session. I immediately felt stressed. Google claim that this will sweep away e-mail and I welcome that. The trouble is that right now I have so many communication channels that I can't find room for yet another, especially one with only a select band of users. When Wave is ready to incorporate my e-mail as well as my contacts in Facebook, Skype, Twitter etc. then I'll be really interested, but I really don't want yet another communication app open on my screen.

Wave is not the first app I've kept in quarantine a while. I signed up for Twitter months before I even sent my first tweet. I signed up and then watched it sit there for a while as I tried to think of something useful I could do with it. Now it's one of my favourite tools and a great source of information. Maybe I need time to adjust and Wave will be a hit when I finally decide to examine it.

If you're already using Wave you will realize that I still haven't learned the basics yet, but I suppose I am experiencing the same feelings many teachers and colleagues get when they hear me waxing lyrical about the wonders of Web 2.0: interested yet hesitant to open Pandora's box and let all the demons out. Good to get a reality check, basically.

Thursday, November 19, 2009

Tweckling - the negative side of conference back channels

I have previously written about public discussion forums that are often sabotaged by self-styled experts who enjoy humiliating any new members who dare to ask a simple, honest question. These are nearly always anonymous users hiding behind a deliberately cryptic name and a picture of a cartoon character. Anonymity can foster brutality.

Now we have a new term to add to the dozens already spawned by Twitter: tweckling. This means heckling a speaker via Twitter, especially at conferences. Many conferences use Twitter as an effective channel for audience participation, allowing participants to comment on speakers, share links to further information on the topic under discussion and network with each other. However, the tool can be used in a more destructive manner, as described in an article in The Chronicle of Higher Education, Conference Humiliation: They're tweeting behind your back. Here a speaker was criticized openly on the conference Twitter flow and was basically subjected to digital heckling. The audience can sit silently and apparently attentive whilst shouting down the speaker in the digital space. In some cases the presentation can be silently drowned out by the flow of wisecracks. The speaker, not having time to read the steady flow of comments, is powerless. Further examples of Twitter in class, both positive and negative, are in another Chronicle article, Teaching with Twitter: not for the faint of heart.

Of course it's not the fault of the tool, Twitter, but rather another example of the confusion between private and public communication. There's a big difference between writing a quick note to my neighbour that I don't think much of the speaker and broadcasting my views to the whole auditorium and the world. Let's keep the discussion respectful and open. If we're using social media professionally we should not hide our identity.

As a PS to this post I have just seen an article on CNN (Can the law keep up with technology?) discussing the problems the legal world is having dealing with developments in the digital space and in particular offensive remarks made on Twitter.

Tuesday, November 17, 2009

Terms of service

Every time you sign up for a new service or download an update you get that annoying window with the terms of service. Does anyone read them? We merrily click "Yes, I accept" and move on to the more interesting business of starting the application. We blindly trust that there's nothing unfair or restrictive in these terms and hope for the best. We could be signing ourselves away to lifetime enslavement for all we know. Our trust is complete.

Maybe that's the whole point, even from the service provider's perspective. The terms are usually several pages long, in very small print and written in lawyer-friendly language. Just sign here please, sir/madam. By accepting, we can't claim ignorance if we break the agreement. Is there any way of providing a short summary in plain English without compromising the agreement? It is rather important that we understand at least roughly what we're accepting, and maybe it's time to press for simpler terms. The summary could even carry a note that the full and legally binding conditions are contained in the legal version but that the summary gives a fair representation of those terms.

Saturday, November 14, 2009

Digital divide

In a recent Guardian article on the growth of open educational resources (Any student, any subject, anywhere) there was a quote from David Wiley of Brigham Young University in Utah:

"I don't know whether in future the people who answer questions, provide content and provide the degree will be in the same institution. It's likely that institutions will specialise in just one of those areas and then form partnerships with other institutions that play other roles."

There are already net institutions like Peer 2 Peer University and University of the People using open educational resources and building their courses around student-driven collaborative learning. Obviously there will be a need for universities or other organisations that specialize in examination, providing self-learners and collaborative learners with the opportunity to get academic recognition for their efforts. I now realize that such examination specialists are already up and running, according to an article in e-Campus News, Credit by exams expands student options.

Evidently two institutions, Excelsior College and Pearson VUE, already offer thousands of students the chance to sit exams without having attended classes. This is of course an extremely attractive way of saving considerable sums in tuition fees, with exams at Excelsior costing a mere $85. It opens the way for students to study on open courses, or simply through pure self-study, without putting themselves deeply in debt. However, to be successful you will need to be highly disciplined, have excellent digital competence and have built up a wide personal learning environment to provide reference, support and encouragement.

While there are plenty of resourceful students who can meet these demands, it is even more important to find ways of helping new students gain access to education; they need hands-on guidance in how to use the net and filter information. Those who have no experience of higher education and who are not so digitally literate need teachers/mentors who are close at hand, preferably face-to-face. If you feel intimidated by computers and the net there isn't much comfort in knowing that all resources and support are on the net. Local learning centres and libraries are already working on this in many countries, but often with low funding or through temporary injections of project money. New learners are easily discouraged and if they meet technical difficulties they will drop out. Support must be local and accessible.

Collaborative net-based learning has enormous potential for those with the necessary skills but the majority of people who would benefit from open education lack the skills to get on board. The open courses and examination forms are great for the already initiated. I hope we can find equally creative ways of narrowing the digital divide so open education can benefit the majority.

Tuesday, November 10, 2009

Going public

If you say something controversial at a meeting, in class or even at a party there's an ever-increasing likelihood that your comment will be broadcast to the world almost instantly. Someone in the audience will have a smartphone and can inform all their contacts via Facebook or Twitter almost before you've finished your sentence. Someone may even be filming you.

This can have positive effects of course and can extend the reach of a conference or class but in many cases this sort of social reporting can have damaging effects. It just needs someone to misunderstand a comment or willfully misrepresent what was said to start all sorts of malicious rumours. I read a while ago that many celebrity parties ban cellphones because people can't relax if there's the risk that anything they do or say may be out on the net within seconds.

These themes are discussed in a new BBC article called Social media challenge social rules. The writer, Bill Thompson, admits to tweeting and sending photos during a recent conference but wonders where we should draw the line on this. Gossip has never travelled faster or further than today and maybe we need to develop a new sense of respect for what may or may not be communicated.

We have the ability to communicate with the world, and suddenly all of us have to consider issues previously only considered by newspaper editors. When we send a tweet or make a blog post we are publishing in the public domain and have to consider the consequences. Remarks that you can make to a close friend in private may not be appropriate to broadcast. A vital part of the digital competence that needs to be taught in schools and colleges is a sense of appropriacy and respect for others' feelings. You never know who may read your text or see your photo. Maybe we need to learn to be more critical of what we publish.

Monday, November 9, 2009


Say what you will about Twitter, but no-one can deny the diversity of content and wealth of imagination that's out there. I can't help spreading the word about a particularly bizarre Twitter service: Big Ben! Yes, London's famous chimes can now be heard across the twittersphere, though in text format. If you subscribe to @big_ben_clock you will get a tweet every hour on the hour saying quite simply BONG up to 12 times, depending on the time. Gripping stuff indeed.
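The account's rule is simple enough to sketch in a few lines of Python. This is purely an illustrative guess at how such an automated service might generate its tweets, not the actual implementation:

```python
def bong_tweet(hour_24):
    """Return the hourly tweet text for a given hour (0-23).

    The clock strikes on a 12-hour dial, so 13:00 gets one BONG
    and both midnight and noon get twelve.
    """
    strikes = hour_24 % 12 or 12  # map 0 and 12 to twelve strikes
    return " ".join(["BONG"] * strikes)

# bong_tweet(15) -> "BONG BONG BONG"
```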

I assume that this service is automated. What's most amazing is that Big Ben has 10,439 followers as I write. Particularly disturbing to people like me who try to use Twitter for relatively constructive purposes and only manage to gather 150 followers (sniff).

Thursday, October 29, 2009

Coping with distractions

When I was a student back in the seventies there were plenty of distractions during lectures. Some doodled, some wrote notes to each other, some read a book or a newspaper and some even slept. Many actually took notes on the lecture but if that got boring we soon switched off. So there's nothing new with the current debate on digital distractions in class or at conferences; it's just more visible than before when the teacher faces a sea of laptops.

There's a good discussion going on the Learning Circuits blog (New Presenter and Learner Methods and Skills) about what you can do as a teacher in a classroom full of distractions. The use of back channels (instant messaging, Twitter etc) is now widespread at conferences and in class, and the lesson is that if you don't provide an official one the participants will start an unofficial one themselves (or several). However, it can be unnerving for the teacher to see the constant stream of comments roll in via Twitter as you speak. It's hard to concentrate on what you're presenting whilst keeping an eye on all the comments. Then again, the comments are directed towards others in the audience, not at you as presenter. But what do you do when laughter bursts out in the room at a tweet that you haven't seen? Do you immediately realize that they're not laughing at you (quickly check that all clothes are still on, hair in right place etc)? Do you pause to let the laughter die down and continue uneasily, waiting for the next witty remark to turn up in an arena you are unable to participate in? Wait a minute, whose show IS this?

However, back channel comments at least show interest in the subject of the session. What do you do when the audience has virtually left the room; the lights are on but there's no-one home? Many will say that a bored audience will find other things to do, and while that may be true to some extent, is audience boredom only the fault of an uninspiring presenter? Some concepts are tough to explain, some things take time to go through and simply cannot be full of stimulating content. Sometimes you have to concentrate hard and really struggle to come to grips with complicated theories. We tend to zap past channels that are not instantly appealing and risk losing a great opportunity to learn something really new.

I have read several pieces by Howard Rheingold (see several earlier posts on this blog) on how we need to teach the art of attention and how important it is that people learn to switch off the distractors and really concentrate. No significant learning takes place whilst multi-tasking (or pretending to). Read the discussion on Learning Circuits for more on this.

Monday, October 26, 2009

Web addresses soon in Arabic

A report from the BBC reveals that it will soon be possible to use non-Latin characters in web and e-mail addresses. Seemingly the organisation in charge of web addresses and suchlike, the Internet Corporation for Assigned Names and Numbers (ICANN), is due to discuss this breakthrough at its conference in Seoul this week (see notice). It will soon be possible to have web addresses in Arabic, Chinese, Japanese, Hindi or Urdu.

This restriction has always struck me as extremely unfair to the majority of people in the world who do not use Latin characters, and so far it has ensured that if you want to use the net you need to learn our alphabet. Evidently, even if the change is approved it will take some time before the new addresses are up and running, since there has to be some kind of transliteration tool so that our computers can cope with non-ASCII addresses. However, they say that Arabic domain names will be available as soon as next month. This should make the net more accessible to even more people.

Sunday, October 25, 2009

At last - a universal cellphone charger!

A BBC news item today made me jump for joy (well, almost). The International Telecommunications Union has announced that they have approved a new universal cellphone charger that will work with all handsets in the future. I think we have seven or eight different chargers lying around the house plus duplicates at work or in bags. Every time you get a new device you get yet another charger that doesn't work with anything else. ITU, I love you!
Now how about sorting out electricity sockets?

Thursday, October 22, 2009

How free is free?

We all assume that everything on the net is free and that somehow advertising pays for all the services we use. As a result we upload tons of content to servers somewhere out there and believe they're safe there. But what happens when the company providing that service has financial problems and decides to charge for the service or, worse still, decides simply to pull out the plug?

There's a critical article on this theme in Times Higher Education by Tara Brabazon, Beware writers bearing promises of a free internet. In particular Chris Anderson's book "Free" comes under fire as it shows the "freemium" movement to be ultimately highly commercial rather than the philanthropic movement it is sometimes presented as.

"His (Anderson's) “free” is corporatised. The cost of free is permanence, reliability and stability. The old cliché is correct. We get what we pay for: when the price is free, then the “service” can be removed without questions or reprisal."

Brabazon used a web service for storing her audio files that suddenly disappeared because the owners decided that the service wasn't being used enough. Since it was "free" they had no obligation to communicate with the users. In addition, the cost of using many free services is the irritation of having sometimes highly inappropriate ads next to your content; especially sensitive if you're using it for teaching.

We trust companies like Google and Ning, but if times get tough who knows what may happen. Our information is at their mercy. Terms can be changed at the drop of a hat, and it's important that we are aware of this and don't place unlimited trust in companies that, after all, are there to make money. The free services are, of course, mostly there as bait to get you into the premium services. I admit the irony of writing this on a free blog tool!

The article does however point us in the direction of a genuine non-profit archive for digital material, the Internet Archive. This is a massive library of films, photos, audio, texts and an archive of 150 billion web pages from 1996 to the present day. The archive “is free and open for everyone to use ..... to encourage widespread use of texts in new contexts by people who might not have used them before." This is possibly the real meaning of "free".

Wednesday, October 21, 2009

E-book competition hots up

I've written several times about e-book readers, especially Amazon's Kindle, and would really like to try one out. They are just becoming available in Europe so maybe I can do so in the not too distant future. Now there's a tough new rival on the scene. I see lots of articles hailing the Kindle's first real competitor - please welcome ladies and gentlemen, in the blue corner, from Barnes & Noble, we give you the Nook!

Yes, the giant US bookseller Barnes & Noble have launched their own e-book reader, the Nook, linking up with their own Wi-Fi network points and able to download e-books from you know who. One attractive feature of the Nook is that it allows users to lend each other e-books (as you would with hard copies) and you can even lend the e-books to friends who have iPhones, iPods or Blackberries as long as they have the necessary B&N software. Sounds promising though I sadly can't find any mention of sharing with a Kindle.

I'm really waiting for a non-proprietary device that allows me to download books from Amazon, Barnes & Noble or whoever else has what I'm looking for. Let's have freedom of choice in content and software, not yet more gadgets tied to one particular company. Whichever one you choose, there's something good that you can't do. The Nook seems a step in the right direction but a long way from a real breakthrough. I may have to wait a while longer.

Read more at Barnes & Noble, CNN, Read Write Web.

Tuesday, October 20, 2009

Nine till Five

Remember the 1980 comedy with Dolly Parton about life in an office? Thirty years on, we've dumped the typewriters but most of us are still stuck in the office working nine till five, as the song goes. We study more and more on-line and flexible learning has become a relatively accepted part of educational terminology, but what about distance working? We can network with people from all corners of the globe and all knowledge is just a mouse click away, but we still spend hours commuting to get to the place from where we do all that.

There's an article on this theme in Inside Higher Ed, Decentralized Work: The Final Frontier. Many universities that have extensive distance learning opportunities have not developed distance working to the same extent. I must admit I seldom work from home, even though there is rarely any good reason not to. But very few of my colleagues do so either, and it just doesn't seem totally acceptable except in exceptional circumstances. There's no law against it, but the important point is there's no encouragement to do so either.

We're still set in our old industrial ways, and somehow the feeling that if you're at your desk you're being productive is hard to erase. With all the fuss about swine flu, I would guess that home working would be one way round the problem of infection, but I haven't heard of any organisation that has tried this. Of course most people enjoy the social side of the workplace and there's no doubt that all the corridor and coffee room chat is important. However, some days I have more interaction with people in other towns and countries than with colleagues in the same building, so I could probably do most of my work from home without interfering with my social contacts.

As the article puts it:
"Whether you call it teleworking, Web working, telecommuting, distance working, or e-working, the concept is the same: Work isn’t some place you go, it’s something you do. It focuses on the information-age idea of decentralizing the office, as opposed to the industrial-age idea of bringing everyone to one single location."

Many people would probably work more efficiently from home and many would benefit from not having to commute every weekday, but it requires management to lead the way and make it not only possible but accepted. The technology is all there; it's just the mindset that hasn't caught up.

Saturday, October 17, 2009

40 shades of green

A couple of months ago there was a fascinating debate on the net inspired by a session called The VLE is dead at the ALT-C conference in Manchester. The debate was about whether or not universities needed to use learning management systems (or Virtual Learning Environments) like Blackboard or Moodle. Could we not simply let teachers and students use their own blends of social media, so-called personal learning environments, and escape from the central control and uniformity of the VLE? There's always a tension in most organisations between demands for central control and efficiency and demands for decentralisation, freedom of expression and diversity. Creating a balance between these poles is not easy and I find myself swinging between them almost every week.

One side of me is attracted to the idea of the university deciding on one LMS/VLE plus a select few other common tools and providing coordinated practical support for both students and teachers. The majority of faculty are not familiar with the latest social media and simply want to use reliable, easy and standardised tools. Too much choice can cause stress and confusion so a limited selection of tools with practical support appeals to most. There is a widening digital gap and I suspect that many people realize all too well that they missed the boat many years ago and feel they have no chance of ever catching up. As a result, some of them steer clear of IT as much as possible. The last thing they need is to be presented with the Aladdin's Cave of digital delights that is Web 2.0!

On the other hand there are the experienced teachers who are constantly trying new approaches and experimenting with new technology. They're the ones who feel restricted by the constraints of the standard LMS/VLE and advocate a free PLE approach. It's essential that we explore all the new opportunities available on the net but how do we encourage that without alienating the majority who want a stable and secure learning environment? I like the idea of breaking out of the walled garden and creating truly flexible learning environments but the vast majority of staff (and probably students) are not ready for such freedom.

I don't think students would appreciate a situation where every course they take uses a different mix of tools, all with different log-ins (of course!). How does the university provide support for such diversity? How do we link them all with our administrative systems? How much academic freedom can you allow before it becomes unmanageable?

Too much flexibility can have negative effects; what's flexible for the teacher becomes a burden on the student or the administration, and vice versa. As the conference debate showed, there are appealing arguments on both sides of this question and the answer probably lies somewhere in between: standard, supported solutions for the majority, but some kind of flexibility to let the pioneers experiment as well.

Sunday, October 11, 2009

Meeting madness

One of the tasks I hate most at work is arranging a meeting. Whether it is face-to-face or on-line the problem is the same: finding a date and time that suits everyone involved. Most people use the extremely inefficient method of fixing a time through dozens of e-mails between all concerned, often causing confusion and frustration. Even a meeting between five or six people can take several rounds of e-mail negotiation, and when more people are involved it becomes impossible and more dictatorial methods are required.

Now there are some excellent net-based tools to simplify matters, such as Meeting Wizard, Meet-o-matic and Doodle. Problem solved, I thought, and started using them. The only problem is that the e-mails they create generally end up in my colleagues' spam folders or, in some cases, vanish completely in the university's firewall. As a result half of the recipients never even know I'm arranging a meeting and we end up going back to primitive e-mail. If we all had access to each other's calendars it might solve things, but we all use different calendars.
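At heart, what these polling tools do is simple set intersection: everyone ticks the slots they can make and the service shows which slots survive. A toy sketch in Python (the names and slots are invented for illustration, and real services obviously add e-mail invitations, partial matches and so on):

```python
def common_slots(availability):
    """availability maps each person to the set of slots they can attend.
    Returns the slots acceptable to everyone, sorted."""
    if not availability:
        return []
    people = iter(availability.values())
    common = set(next(people))
    for slots in people:
        common &= slots  # keep only slots this person can also make
    return sorted(common)

votes = {
    "Alex":  {"Mon 10:00", "Tue 14:00", "Wed 09:00"},
    "Berit": {"Tue 14:00", "Wed 09:00"},
    "Carl":  {"Mon 10:00", "Tue 14:00"},
}
# common_slots(votes) -> ["Tue 14:00"]
```

The hard part in practice is not the computation but getting everyone's answers in the first place, which is exactly where the spam filters bite.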

Will Google Wave improve this mess? It looks promising, but I'm wary of the extreme hype around it just now. However, e-mail is becoming too unwieldy and we need new solutions fast. I read somewhere recently that close to 90% of all e-mail today is spam. At the very least, the vast majority of e-mail flooding the net is spam, and that's a clear sign that we need new ways to communicate.

I've just read an article related to all this called The end of the e-mail era in Wall Street Journal. Lots about the successors to e-mail but no clear solution to fixing meetings.

Friday, October 9, 2009

Taking the geek out of tech

One of the best things about using Blogger is that, on the whole, what you write is what you get. You don't need to learn any codes to write your blog and as a result it's highly popular. There are now so many similarly user friendly applications that we have grown to expect full transparency. But things are not always so easy.

I enjoyed reading a post on Lisa's Teaching Blog, Four web technologies that shouldn't be geeky anymore, where she lists RSS, wikis, tagging and embedding as four technologies that should be much easier. They are all extremely useful but in most cases remain relatively inaccessible due to what she sees as unnecessary complexities.

RSS, for instance, is probably my most important tool at work. I use Netvibes to gather hundreds of feeds. I find it easy to use but it still involves the process of finding the RSS button on an interesting website (not so easy even on popular sites), copying the link and then pasting it into Netvibes. One click should do it. RSS is one of the most useful web services around, especially for teachers and researchers, but very few that I know use it. It just hasn't been adequately hyped I suppose. The name doesn't help either. "Really Simple Syndication" - yes, quite.
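Under the hood a feed is just XML, which is why one click really should be enough. A minimal sketch of what an aggregator like Netvibes does with a feed once it has the URL, using only Python's standard library (the sample feed below is invented for illustration; a real aggregator would download the XML from the site's feed address):

```python
import xml.etree.ElementTree as ET

def parse_rss(xml_text):
    """Return a list of (title, link) pairs from an RSS 2.0 document."""
    root = ET.fromstring(xml_text)
    items = []
    for item in root.iter("item"):  # each <item> is one post
        title = item.findtext("title", default="")
        link = item.findtext("link", default="")
        items.append((title, link))
    return items

SAMPLE = """<rss version="2.0"><channel>
  <title>Example blog</title>
  <item><title>First post</title><link>http://example.com/1</link></item>
  <item><title>Second post</title><link>http://example.com/2</link></item>
</channel></rss>"""
# parse_rss(SAMPLE) -> [("First post", "http://example.com/1"),
#                       ("Second post", "http://example.com/2")]
```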

Wikis are widely used but I also wonder if they couldn't just design them so we could dispense with the few codes and symbols they use. One teacher I know tried to get students to use MediaWiki but they found it too complicated, lost interest and solved the task using other tools. It's not that complicated but for many people the mere sight of coding turns them off instantly. The easiest wiki tool I've used is PBworks which hasn't so far required me to write any code at all.

Of course I admit that these problems could simply be down to my own reluctance to learn the finer points but the four technologies mentioned here might be much more widespread if they were just a bit more straightforward.

Sunday, October 4, 2009

Book piracy on the rise

Digital content is, of course, simple to copy and it's getting increasingly hard to persuade people to pay for it. The music and film industries try desperately to stop the copying, but unless they can come up with a radically new business model they seem to have a hopeless task on their hands.

Now the publishing industry is also under fire according to a recent article in The New York Times, Will books be napsterized? Until recently there hasn't been much interest in e-books but with more attractive laptops and e-book readers available you can now download many e-books for free. It's not legal of course but just like music file-sharing it's hard to stop.

One possible future model is already employed by some textbook sites; read on-line for free, download a pdf chapter for a small sum or buy the whole book. Even with the increasingly attractive technology on offer today I doubt if many would opt to read War and Peace on a computer screen, even if it was free. But for articles, shorter novels, reference works and so on the free alternative is definitely appealing.

Will this see the end of books? I doubt it, at least not for a long time. Admittedly, if all my books were stored on-line we could free up enormous amounts of space in the house and several metres of Billy bookcases from IKEA would be dumped. But it wouldn't be the same. The space once occupied by our old record and VHS collections has admittedly been liberated, but books have more intrinsic value somehow; so much more than just naked text. Records and video tapes were short-lived media whereas books go back to ancient Egypt. Our bookshelves summarize our lives, and many books are filled with memories and associations that wouldn't be possible if they existed only as digital files. I can't help quickly scanning friends' bookshelves when visiting their homes just to see what subjects we have in common. It wouldn't be the same just scanning their e-book folder.

Photo: www.pixgallery.com © Janne Olander

Wednesday, September 30, 2009

Peer 2 Peer University update

As I wrote a few weeks ago the open learning project, Peer 2 Peer University, has started its first courses. A student on the course Introduction to Cyberpunk Literature has just written his reflections on the first few weeks of the course (Experiencing the Peer 2 Peer University).

Students write their work on blogs (see course blog) and the course material is taken from freely available sources. The transparency of the course seems to have stimulated rather than daunted the students:

"Whilst this did feel a little daunting at first you realise very quickly that everyone is in the same boat and that it is this very transparency that helps to enrich the dialogue between the participants and as an experience, for me personally, it feels far more immersive."

Maybe we shouldn't read too much into these students' experiences since they are willing pioneers with a positive attitude to the P2PU model. The test will be to use the same model on a more representative group of students. I suspect the results there will be mostly positive.

Tuesday, September 29, 2009

Digital Nation

American PBS (Public Broadcasting Service) has produced an impressive website called Digital Nation. The project aims to showcase how the net has become an integral part of our lives and is reshaping the way we interact with each other. Most of the material consists of video interviews with experts, decision makers and members of the public on how they relate to technology and our increasingly net-based society. As the project progresses more films and other material will be added and this will all be the basis of a TV documentary later next year.

The site is divided into five main sections: living faster (daily life in an on-line world), relationships (friendship and socialising), waging war (training, simulation), virtual worlds (gaming, socialising) and learning. Predictably, the obligatory stories about multitasking digital natives appear, but hopefully they may be questioned by later contributions. Comments are of course invited on almost all the content. One quirky initiative asks you to write, in only six words, how the web and digital technology are changing the way you think, work, live, or love. Maybe the Twitter influence?

The idea is to create a digital collage reflecting different perspectives on life in the digital age. It'll be interesting to see how the project develops.

Sunday, September 27, 2009

Opening up

I've just discovered a rich new source of free on-line learning resources, the Open University's OpenLearn site. Not only does the Open University have the best iTunes U content of all (in my opinion anyway!), but OpenLearn adds to this by providing a wealth of course modules and learning objects. At present the debate about Open Educational Resources (OER) is just beginning here in Sweden and there are plenty of concerns about copyright and worries about the risks of making learning resources public.

It's good therefore to be able to point at examples of successful implementation of OER such as Open University and of course the pioneer MIT, whose Open Courseware now encompasses around 2,000 courses freely available on line (80% of total production).

The current status of OER is nicely summarized in an article in Times Higher Education, Get it out in the Open, which includes interviews with staff from both MIT and OU. Their experience points out the following advantages of making learning resources freely available:
  • showcasing the university's expertise and thereby marketing the university to future students
  • stimulating interest in higher education around the world and reaching out to new student groups (70% of visitors to OpenLearn are from outside the UK)
  • stimulating informal learning
  • enabling schools to let pupils test themselves on university level material
  • improving the quality of teaching material by publishing it publicly
  • stimulating the growth of OER at other universities by setting an example to follow
The drawbacks include, in particular, expensive production, with faculty needing extensive support to produce quality material. On the other hand, once produced, much of the material is reusable on related courses.

Friday, September 25, 2009

The dark side of the net

When we only had one or two TV or radio channels we sometimes watched or listened to programmes that didn't really interest us at first. There simply wasn't anything else on. So now and again you might stumble upon something unexpectedly interesting and expand your horizons a bit. No chance of that now. We zap from channel to channel, usually only giving a new programme a few seconds' chance before zapping on. Every opinion and subject is out there, but most of us only check our favourite channels/sites; those that confirm our view of the world.

We generally assume that access to the net ensures free debate and strengthens democracy. Governments try to combat free discussion and political dissent by blocking social networks like Facebook and Twitter, as well as stopping bloggers from publishing inappropriate information. However, there are cases where undemocratic governments actually embrace social networking as a means of combatting dissent.

There's a fascinating TED lecture by Evgeny Morozov (How the net aids dictatorships, don't forget to read the discussion under the film) where he claims that social tools can be used to spread disinformation and also enable authorities to gain access to vast amounts of information that would have been impossible in the past. They may even positively encourage bloggers to write on seemingly important issues in order to give the impression that there is indeed free debate in the country. This seems a much smarter policy than simply cutting access or blocking certain sites. Morozov is not denying the power of the net to strengthen democracy and education. He's just pointing out the reverse side of the coin that seldom appears in public debate.

Just as the net enables global networking and increased access to knowledge it can also lead to passivity. Only a small minority of net users are active in any significant way. One problem on the net is that you can choose what information you want to see. You read the news you want to read, visit sites whose views you agree with and seldom get confronted by opinions that challenge your own. Of course this has always been true to some extent but today you are able to filter out unwanted facts and uncomfortable opinions more effectively than ever before.

I remember before commercial radio started and those in favour of it claimed that with commercial radio we'd get a wider choice of music. Now we've got dozens of commercial channels all of which play "non-stop hits" mixed with phone-in competitions. The only channels that play new music and a wide variety of styles are the state-run channels (at least that is true here in Sweden). Without them we'd just hear the same hits round the clock.

Monday, September 21, 2009

Ivory towers

I noticed a thought-provoking seminar to be held soon at the British Library, Don and dusted: Is the Age of the Scholar over?. The question to be debated is the future of academic scholarship in the face of demands for return on investment and output-driven research. In tough times like these there are highly justified claims that public and private finance be used for practical purposes and that research must lead to concrete results.

What is the difference between the old-fashioned scholar and the 21st century researcher? Universities today are under increasing pressure to deliver tangible results and it is hard to justify research that is purely theoretical and exploratory. Will the increased demands on results lead to the end of traditional academic freedom? Hopefully there will always be room for purely inquisitive research but it still requires financial backing from somewhere. Many of the greatest scientific discoveries have occurred almost by accident when the scientists were actually looking for something quite different.

Many people demand that research should be governed by the needs of society/corporations/customers and in many cases this is fine. However, if customer needs were the only criteria for research and development would we ever have developed personal computers or cellphones? I remember back in the late eighties when a cellphone operator claimed, to great public ridicule, that in the future everyone would have a cellphone. There was very little customer demand for the product but they went ahead anyway and the rest is history.

There has to be money available to finance wild-card research. Much of it may not lead to major breakthroughs but every now and again someone will find a missing link, an exception that will turn previous theories upside down and lead us into completely new avenues. The problem is how to judge which projects are worth investing in and which are pointless. If everyone agrees that the world is flat who on earth would back someone who questions that?

I hope the organisers of this debate will post a report of the discussion.

Friday, September 18, 2009

Are crowds wise?

The wisdom of the crowd is another concept that seems wonderfully simple at first but suffers under closer examination (the other being the net generation). The concept of collective wisdom being more valid than individual wisdom gained global coverage through the work of James Surowiecki in 2004 (The Wisdom of Crowds: Why the Many Are Smarter Than the Few and How Collective Wisdom Shapes Business, Economies, Societies and Nations) and there are many convincing examples of mass collaboration being extremely successful; Wikipedia, Digg, Amazon and many more.

This concept has been a driving force behind the development of Web 2.0; the power of collaboration. Surowiecki noted however that not all crowds are wise and that there are a number of prerequisites for wisdom: the members of the crowd should be independent of each other and represent a diversity of opinions.

I read an article in Read Write Web called The dirty little secret about the "Wisdom of the Crowds" - there is no crowd. It claims that the crowds behind many of the success stories like Wikipedia are actually a small number of dedicated enthusiasts plus a large mass of relatively passive members whose contribution is negligible. Evidently very few actually bother to vote on Amazon and Digg, so the aggregated wisdom represents a much smaller crowd than we previously assumed. I saw an analysis of Wikipedia entries a while ago: although a subject had been edited by hundreds of people, about 90% of the editing had been carried out by two people; the others had been content to edit a sentence or fix a misprint.

The conclusion is that crowds can be intelligent but not always. Crowds are unlikely to come up with a stroke of real genius but are good for brainstorming, editing and revising. It's wise to remember that the crowd is seldom as large as it seems.

Thursday, September 17, 2009

Pay attention

Just about every week I read articles about how disruptive technology can be in the classroom. No, this time not disruptive in the sense of challenging traditional methods and structures. More like disruptive in the sense of disturbing other people; using cellphones in class, checking Facebook or YouTube during a seminar etc.

One such article is in the Times Higher Education (Mind your manners, not the phone, please) reporting a survey of staff and student attitudes to various classroom disturbances. The list from a staff point of view is not surprising, including students texting and talking on cellphones, coming unprepared to class and showing no interest in the proceedings. Interestingly, most of these are very low-tech and have nothing really to do with technology. The basic problem seems to be a lack of respect for fellow students and teachers and an inability to focus attention when necessary. Maybe there are simply too many distractors. However, I think this is a general tendency in society as a whole that is possibly accentuated in the classroom setting. I've been to many meetings and conferences with delegates of all ages busy with totally unrelated activities on their laptops and cellphones.

Of course we have to ensure that what goes on in the classroom is relevant and interactive, but even when it is engaging there are still many who are too distracted by background noise to realize what they're missing. Attention is a vital skill that I think must be taught. We've been given so many exciting tools to use that we have forgotten how to simply concentrate on a task and shut off the distractors for a while. I've heard of several teachers who hold a class discussion around this and agree on implementing "house rules" during class time. Class time can be divided into tech-free periods where listening and participating are required and other periods where all devices are on and the focus is on gathering information and resources.

Howard Rheingold is a great source of inspiration and I can't resist including a new video interview with him on the subject of 21st century literacies. He often writes on the need to teach the skill of attention, of being able to focus on one activity and shut out the distractors. Technology is taking the blame today for a lot of basic human failings. Social media give us enormous opportunities to learn and cooperate but we need to focus more efforts on teaching people how to use them responsibly.

21st century media literacies from JD Lasica on Vimeo.

Sunday, September 13, 2009

What's in a name?

I remember a conference quite a while ago where a manager suggested that we shouldn't see ourselves as mere teachers anymore but as "competence architects". That of course became the subject of many merry comments in the bar that evening but it reflects a modern obsession. We keep inventing new names for occupations, technologies, institutions and behaviour.

There's a nice blog post by Steve Wheeler (Lost in translation - read discussion too) where he discusses the problem of what to call concepts like PLE (Personal Learning Environment) or Web 2.0. Many feel that these names are inaccurate or misleading but the problem is what to call them instead. Once something has been named it's rather difficult to change the name and get everyone to agree.

We're in the midst of a merger with a neighbouring university and will emerge from the process after New Year as a new university - Linnaeus University. That means we have to reorganize absolutely everything and as a result there are countless discussions about what to call our new departments and units.

One such case is the library. Libraries today are increasingly focused on net-based resources and in some cases some of the books and journals are even being moved aside to make room for more flexible learning spaces. However for many people the old concept remains firmly fixed. If we continue to call it a library many people will fail to see how it has changed but if we dream up a new name like learning resource centre we run the risk of getting the response "oh, you mean the library!"

We also had a long discussion about what we mean today by IT. Everyone has a clear picture of what the IT department has done up till now, and it has been fairly limited to networks, servers, hardware and so on. If we widen the scope of IT to include "softer" areas of technology use and web 2.0 (sorry!) we have to think of a new name. But if the new name is seen as pretentious or vague, people will continue to call it IT until convinced otherwise.

With English being the dominant world language, all new technical advances are first given an English name and then the world's other languages have to decide whether to find their own equivalent or just accept yet another anglicism. For example the Danes just say "computer" whilst Swedish uses "dator" and Finnish "tietokone". It's mighty hard to talk about web 2.0 completely in Swedish since no-one has yet thought up Swedish equivalents.

I'm not sure what is easier. Updating people's attitudes to the revised meaning of terms like teacher, IT or library? Or spending years "selling" a new term that few want to buy?

Saturday, September 12, 2009


Sharing photos is one of the most popular social activities on the net and you might wonder if there is a need for another photo sharing site, but I can't help recommending a relatively recent arrival on the scene; Fotopedia. The idea is to create a photo equivalent of Wikipedia allowing photographers to share, tag and collaborate. Superb layout and some breathtaking photos.

The principle is that you can upload photos and create your own albums or add your photos to existing albums on a particular subject. Photos can even be imported from Flickr or Picasa and then linked to Wikipedia articles and Google Maps. The crowdsourcing principle rules here, allowing users to vote on the best and most relevant photos. The more votes a photo collects, the further up the hierarchy it climbs.

“After traveling the world, I wanted to share my photos with others. Flickr and other photo sites give you exposure for only a brief window in time, and adding photos to wikipedia proved too complicated for the average user. This sparked the idea for a ‘wikipedia of photos’ – that combines the permanence and community collaboration of wikipedia with the ease of use of consumer desktop applications.” – Jean-Marie Hullot, one of the founders of Fotopedia.

Friday, September 11, 2009

Don't believe the hype

It's wonderful that the world isn't as simple as it seems sometimes. It's easy to make sweeping and comfy generalisations that seem to explain something but then discover that the truth is frustratingly complex. If all the simple explanations were true we wouldn't have much left to discover and discuss.

In the last few months the whole net generation issue has been turned on its head as we realize that generations can't be categorized in such simplistic terms. It sounded plausible for a while as it was a good way of forcing the establishment to notice what was happening on the net and realize that it was going to radically change the way we run education. However we now see that the net generation is more complicated than that. Many young people do use new technology intuitively but very many do not. The same holds true for all age groups basically; it's mostly down to interest, curiosity and peer influence. Indeed it seems to me that the driving force behind the growth of social media is not teenagers as previously assumed; it's net enthusiasts over 30 and often well over. Some of the most innovative people I know are older than me! Indeed I've read that many young people are abandoning Facebook because it's full of their teachers and parents.

Who decides what tools to use on courses - the students? We hotly debate the pros and cons of different systems but do the students really care which learning management system we use as long as it is well-structured and reliable? If teachers try to use, say, Facebook as a communication tool on a course, isn't there a risk that some students will resent their studies encroaching on their social arena? I read of a teacher who wanted the class to hand in assignments as audio files but met with resistance on the grounds that students were there to learn the subject and not a lot of technology. There have to be convincing reasons for using technology and the learning curve cannot be too demanding. However, the right preparation and motivation can work wonders. One course at my university is held completely in Second Life and the students are all SL beginners, yet it works well thanks to good groundwork at the start.

I love testing new tools and write enthusiastically about many of them but it is easy to get carried away. It's rather sobering to show off a new discovery to colleagues expecting them to share your enthusiasm only to be met with a resounding shrugging of shoulders.

People's attitudes to "technology" vary greatly. To many the word has very negative connotations; something that is unreliable, complicated and to be avoided. Anything we don't really like or are intimidated by is immediately dismissed as "technical". Many people still debate whether we should use "technology" in education at all (aren't whiteboards, overhead projectors, pens and microphones also technology?). I meet people who work successfully with complex Excel spreadsheets or administrative systems (that scare me to death!) but are wary of, say, Skype, wikis and blogs because they are too "technical". Beauty is in the eye of the beholder indeed.

Sunday, September 6, 2009

Cultural updates

Doesn't time fly? For many of us the breakup of the Soviet Union feels like a recent event and the internet is still new technology. An article in Times Higher Education reports on an American college that has written a cultural update for its teaching staff to remind them that teachers' reference points are no longer understood by the students (Beloit College Mindset List). For students born in 1991 the EU has always existed, the iron curtain is a vaguely understood archaism and cellphones, cable TV and the internet have always been around.

Of course that's the whole digital natives phenomenon but this light-hearted list does bring home a few points to me. It's so easy to talk about concepts like "eastern bloc" and not realize that we're talking to people who have no idea what we mean. If we do make such references we have to be prepared to explain the background.

Saturday, September 5, 2009


The homeschooling movement in the US seems to be growing as more schools offer online teaching. There seems to be a long tradition of distrust of state-run institutions and in many states parents can opt to keep their children at home. In Europe this phenomenon has not made much of an impact since keeping children out of school is illegal in many countries, including here in Sweden.

However, I was not aware of an extreme variation on homeschooling called unschooling until I came across an article about it in the Baltimore Sun, From home schooling to unschooling. Homeschooling is still based on a curriculum decided by a school, with most teaching and learning being online. Unschooling, on the other hand, opts out of even that connection with the education system. Here the parents are completely responsible for their children's education. Parents take their children on outdoor excursions, involve the kids in all aspects of housework and gardening and generally encourage the kids to learn what they want at their own pace.

To succeed with unschooling parents have to be highly capable in child psychology, pedagogy and management and, most importantly, should not have regular employment that takes them away from their kids for long. It sounds very idyllic in the article and reminds me of the education principles within various hippy communities in the late sixties. The children, however, will be seriously deprived of learning how to interact with others and will probably not be exposed to opinions and information that their parents do not agree with. The potential for indoctrination is very high and I would guess that one main reason for choosing unschooling is that the parents consider the school system in some way dangerous and do not want their children to be exposed to the "wrong" ideology.

As ever, there are elements of this style of education that are appealing; encouraging curiosity, breaking out of the restraints of the classroom, integrating learning and living. However, looking at the typical daily routine of unschooling as described at the end of the article, I would say it closely resembles a pretty normal Saturday or Sunday routine for many regular families. The key to an all-round education is the combination of learning in different environments (school, home, outdoors) with a wide variety of people (family, friends, class, self study) and with a variety of activities (discussion, reading, instruction, work, experimentation). Cutting off any of these components is deprivation and the unschooling principle seems to me to be lacking in several key learning activities.

Please read the comments on this for more links and discussion .....

Wednesday, September 2, 2009


Intersecting Wires

Despite all the advances in wireless technology I still have masses of tangled wires behind my computers and TV screens both at work and at home. Plus, of course, several drawers full of power cords, adapters and other wires. I am constantly amazed at their ability to get tangled up no matter how carefully I arrange them, especially if they're in a bag. The minute you turn your back they start snuggling up to each other.

I have often joked about having wireless electricity to solve all this and I've now started finding reports of exactly this breakthrough. Below is a TED talk by Eric Giler showing the principle behind WiTricity (technology developed at MIT): electricity is converted to a magnetic field and then back to electricity. Basically, electricity can be transmitted wirelessly over short distances and the potential for this in the home and office is enormous.

One solution is to have a power pad on a desk, plugged into the mains, and you simply lay your cellphone or other device on it and it recharges. Could this be the end of all those infuriatingly incompatible battery charger cords that infest the world? A giant-sized pad could be on your garage floor to recharge your electric car at night. I almost feel moved to burst into song ....

Read more in an article from CNN, A cordless future for electricity.

Saturday, August 29, 2009

Good enough

I've just read a good article in Wired Magazine called The Good Enough Revolution; When Cheap and Simple is Just Fine that had me nodding in agreement most of the time. Many of the most popular applications on the net are successful not because of their high quality but because they're easy to use, always available and most importantly cheap or preferably free. The concept of "perpetual beta" for many applications has become the norm and users are quite willing to put up with shortcomings if they get it all for free.

The music industry is a perfect example of good enough. 30 years ago music lovers dreamt of buying a state-of-the-art hi-fi system with massive speakers, hi-tech amplifier and super-sensitive turntable all stacked up to impress in the corner of the living room. Perfect quality was the objective and buying a hi-fi system was a major project. Today I seldom see such sophisticated systems and mp3 is the choice format, much to the dismay of the music industry. The sound quality is not impressive but it's a convenient format and you can have your entire music collection in your pocket.

It's a similar story in many other areas. We use Skype for communication despite occasional lag, use cloud computing applications like Google Docs that lack all the features of Microsoft Office but do the job well and fly with no-frills airlines despite their indifferent customer care.

... what consumers want from the products and services they buy is fundamentally changing. We now favor flexibility over high fidelity, convenience over features, quick and dirty over slow and polished. Having it here and now is more important than having it perfect. These changes run so deep and wide, they're actually altering what we mean when we describe a product as "high-quality."

Is the current interest in free and open education typified by pioneers like Peer 2 Peer University and University of the People a further example of good enough? I hope not and believe that they are necessary to jolt the mainstream universities into more innovative strategies for expanding the reach of higher education. Is there a risk, however, that we see the growth of a cut-price sector in education with freelance faculty working for low wages and without job security? Quality is essential in education and quality costs.

Tuesday, August 25, 2009


There's nothing really new with multitasking, it's just got a cool name. In black and white days we used to get distracted by phone calls, corridor conversations, smoking breaks or listening to the radio. Today we've got many more distractors but the problem is how well we can concentrate on more than one thing at a time. I admit I like to have several applications running on my computer while I'm working (e-mail, Skype, Twitter, web radio, several web pages ...) and if I'm trying to write something I don't really want to do I've got lots of reasons to avoid doing it. However if I need to really concentrate and think clearly all the distractors have got to be shut down.

Multitasking is often seen as a symbol of being modern and flexible. Interestingly we often see multitaskers as highly efficient whereas in the past they may often have been dismissed as not being able to concentrate on the task in hand. An article in The Huffington Post, Study finds people who multitask often bad at it, describes a new piece of research that suggests that people who like to multitask are actually not very good at it and this leads to errors and carelessness. Indeed the more media they use the worse they perform.

However, the report suggests that further investigation is needed into the reasons for multitasking and whether it is a natural reflection of our personality. Some people have a talent for detailed and thoughtful analysis and to do this they shut off all distractors. Others, the multitaskers, are inquisitive and enjoy experimenting and are seldom content to concentrate on one matter at a time. When is the multitasking approach fruitful and when is it not? For the full study see Cognitive Control in Media Multitaskers by Eyal Ophir, Clifford Nass and Anthony D. Wagner (Proceedings of the National Academy of Sciences of the USA).

Another angle on the subject can be read at CNN.com, Drop that Blackberry! Multitasking may be harmful.

You've got five minutes

When speaking to an audience there's an old adage that it takes much longer to prepare a short talk than a long one. The art of conveying a clear and convincing message in less than 10 minutes takes time to learn but is highly effective. Few of us really listen to anyone effectively for long and so if you really want to get the message across it's best to do so as concisely as possible before your audience's attention starts to wander.

This has been taken to heart by Ignite which is an event concept where invited experts are allowed 5 minutes each to get their message across. It's a bit like the classic talent show where singers or comedians are given max 5 minutes to convince the audience that they are any good. In the case of Ignite evenings it's not amateur hour but a showcase for highly gifted speakers to make their mark. On the website you can sample some of these short talks.

The concept seems to be popular and I wonder if it can spark more focus in schools and universities on presentation skills. We tend to play down this important skill today and I think we could all (students and teachers) benefit from more practice at communicating effectively in a limited timeframe.