As I wrote a few weeks ago, the open learning project, Peer 2 Peer University, has started its first courses. A student on the course Introduction to Cyberpunk Literature has just written his reflections on the first few weeks of the course (Experiencing the Peer 2 Peer University).
Students write their work on blogs (see course blog) and the course material is taken from freely available sources. The transparency of the course seems to have stimulated rather than daunted the students:
"Whilst this did feel a little daunting at first you realise very quickly that everyone is in the same boat and that it is this very transparency that helps to enrich the dialogue between the participants and as an experience, for me personally, it feels far more immersive."
Maybe we shouldn't read too much into these students' experiences since they are willing pioneers with a positive attitude to the P2PU model. The test will be to use the same model on a more representative group of students. I suspect the results there will be mostly positive.
Wednesday, September 30, 2009
Tuesday, September 29, 2009
Digital Nation
American PBS (Public Broadcasting Service) has produced an impressive website called Digital Nation. The project aims to showcase how the net has become an integral part of our lives and is reshaping the way we interact with each other. Most of the material consists of video interviews with experts, decision makers and members of the public on how they relate to technology and our increasingly net-based society. As the project progresses, more films and other material will be added, all of which will form the basis of a TV documentary next year.
The site is divided into five main sections: living faster (daily life in an on-line world), relationships (friendship and socialising), waging war (training, simulation), virtual worlds (gaming, socialising) and learning. Predictably, the obligatory stories about multitasking digital natives appear, but hopefully they will be questioned by later contributions. Comments are of course invited on almost all the content. One quirky initiative asks you to describe, in only six words, how the web and digital technology are changing the way you think, work, live, or love. Maybe the Twitter influence?
The idea is to create a digital collage reflecting different perspectives on life in the digital age. It'll be interesting to see how the project develops.
Sunday, September 27, 2009
Opening up
I've just discovered a rich new source of free on-line learning resources: the Open University's site OpenLearn. Not only does the Open University have the best iTunes U content of all (in my opinion anyway!), but OpenLearn adds to this by providing a wealth of course modules and learning objects. At present the debate about Open Educational Resources (OER) is just beginning here in Sweden and there are plenty of concerns about copyright and worries about the risks of making learning resources public.
It's good therefore to be able to point to examples of successful implementation of OER such as the Open University and of course the pioneer MIT, whose OpenCourseWare now encompasses around 2,000 courses freely available on line (80% of total production).
The current status of OER is nicely summarized in an article in Times Higher Education, Get it out in the Open, which includes interviews with staff from both MIT and OU. Their experience points out the following advantages of making learning resources freely available:
- showcasing the university's expertise and thereby marketing the university to future students
- stimulating interest in higher education around the world and reaching out to new student groups (70% of visitors to OpenLearn are from outside the UK)
- stimulating informal learning
- enabling schools to let pupils test themselves on university level material
- improving the quality of teaching material by publishing it publicly
- stimulating the growth of OER at other universities by setting an example to follow
Friday, September 25, 2009
The dark side of the net
When we only had one or two TV or radio channels we sometimes watched/listened to programmes that didn't really interest us at first. There simply wasn't anything else on. So now and again you might stumble upon something unexpectedly interesting and expand your horizons a bit. No chance of that now. We zap from channel to channel, usually giving a new programme only a few seconds' chance before zapping on. Every opinion and subject is out there but most of us only check our favourite channels/sites; those that confirm our view of the world.
We generally assume that access to the net ensures free debate and strengthens democracy. Governments try to combat free discussion and political dissent by blocking social networks like Facebook and Twitter as well as stopping bloggers from publishing inappropriate information. However, there are cases where undemocratic governments actually embrace social networking as a means of combating dissent.
There's a fascinating TED lecture by Evgeny Morozov (How the net aids dictatorships, don't forget to read the discussion under the film) where he claims that social tools can be used to spread disinformation and also enable authorities to gain access to vast amounts of information that would have been impossible in the past. They may even positively encourage bloggers to write on seemingly important issues in order to give the impression that there is indeed free debate in the country. This seems a much smarter policy than simply cutting access or blocking certain sites. Morozov is not denying the power of the net to strengthen democracy and education. He's just pointing out the reverse side of the coin that seldom appears in public debate.
Just as the net enables global networking and increased access to knowledge it can also lead to passivity. Only a small minority of net users are active in any significant way. One problem on the net is that you can choose what information you want to see. You read the news you want to read, visit sites whose views you agree with and seldom get confronted by opinions that challenge your own. Of course this has always been true to some extent but today you are able to filter out unwanted facts and uncomfortable opinions more effectively than ever before.
I remember before commercial radio started and those in favour of it claimed that with commercial radio we'd get a wider choice of music. Now we've got dozens of commercial channels all of which play "non-stop hits" mixed with phone-in competitions. The only channels that play new music and a wide variety of styles are the state-run channels (at least that is true here in Sweden). Without them we'd just hear the same hits round the clock.
Monday, September 21, 2009
Ivory towers
I noticed a thought-provoking seminar to be held soon at the British Library, Don and dusted: Is the Age of the Scholar over? The question to be debated is the future of academic scholarship in the face of demands for return on investment and output-driven research. In tough times like these there are highly justified claims that public and private finance be used for practical purposes and that research must lead to concrete results.
What is the difference between the old-fashioned scholar and the 21st century researcher? Universities today are under increasing pressure to deliver tangible results and it is hard to justify research that is purely theoretical and exploratory. Will the increased demands on results lead to the end of traditional academic freedom? Hopefully there will always be room for purely inquisitive research but it still requires financial backing from somewhere. Many of the greatest scientific discoveries have occurred almost by accident when the scientists were actually looking for something quite different.
Many people demand that research should be governed by the needs of society/corporations/customers and in many cases this is fine. However, if customer needs were the only criteria for research and development would we ever have developed personal computers or cellphones? I remember back in the late eighties when a cellphone operator claimed, to great public ridicule, that in the future everyone would have a cellphone. There was very little customer demand for the product but they went ahead anyway and the rest is history.
There has to be money available to finance wild-card research. Much of it may not lead to major breakthroughs but every now and again someone will find a missing link, an exception that will turn previous theories upside down and lead us into completely new avenues. The problem is how to judge which projects are worth investing in and which are pointless. If everyone agrees that the world is flat who on earth would back someone who questions that?
I hope the organisers of this debate will post a report of the discussion.
Friday, September 18, 2009
Are crowds wise?
The wisdom of the crowd is another concept that seems wonderfully simple at first but suffers under closer examination (the other being the net generation). The concept of collective wisdom being more valid than individual wisdom gained global coverage through the work of James Surowiecki in 2004 (The Wisdom of Crowds: Why the Many Are Smarter Than the Few and How Collective Wisdom Shapes Business, Economies, Societies and Nations) and there are many convincing examples of mass collaboration being extremely successful; Wikipedia, Digg, Amazon and many more.
This concept has been a driving force behind the development of Web 2.0: the power of collaboration. Surowiecki noted, however, that not all crowds are wise and that there are a number of prerequisites for wisdom: the members of the crowd should be independent of each other and represent a diversity of opinions.
I read an article in Read Write Web called The dirty little secret about the "Wisdom of the Crowds" - there is no crowd. It claims that the crowds behind many of the success stories like Wikipedia are actually a small number of dedicated enthusiasts plus a large mass of relatively passive members whose contribution is negligible. Evidently very few actually bother to vote on Amazon and Digg, so the aggregated wisdom represents a much smaller crowd than we previously assumed. I saw an analysis of Wikipedia entries a while ago: although a subject had been edited by hundreds of people, about 90% of the editing had been carried out by two people; the others had been content to edit a sentence or correct a misprint.
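The concentration described above is easy to check if you have an edit log. The following sketch is purely illustrative (the editor names and edit counts are invented, not taken from any real Wikipedia data) but it shows how a few lines of Python can reveal how small the "crowd" really is:

```python
from collections import Counter

# Hypothetical edit log: one entry per edit, naming the editor.
# Names and numbers are invented for illustration only.
edit_log = (
    ["alice"] * 120 + ["bob"] * 60 +            # two dedicated enthusiasts
    ["carol", "dave", "erin", "frank"] * 5      # occasional contributors
)

counts = Counter(edit_log)
total = sum(counts.values())

# Share of all edits made by the two most active contributors.
top_two = sum(n for _, n in counts.most_common(2))
share = top_two / total
print(f"{len(counts)} editors, top 2 made {share:.0%} of {total} edits")
# → 6 editors, top 2 made 90% of 200 edits
```

In this made-up sample, six people edited the article but two of them account for 90% of the work, exactly the pattern the article describes.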
The conclusion is that crowds can be intelligent but not always. Crowds are unlikely to come up with a stroke of real genius but are good for brainstorming, editing and revising. It's wise to remember that the crowd is seldom as large as it seems.
Thursday, September 17, 2009
Pay attention
Just about every week I read articles about how disruptive technology can be in the classroom. No, this time not disruptive in the sense of challenging traditional methods and structures. More like disruptive in the sense of disturbing other people; using cellphones in class, checking Facebook or YouTube during a seminar etc.
One such article is in the Times Higher Education (Mind your manners, not the phone, please) reporting a survey of staff and student attitudes to various classroom disturbances. The list from a staff point of view is not surprising, including students texting and talking on cellphones, coming unprepared to class and showing no interest in the proceedings. Interestingly, most of these are very low-tech and have nothing really to do with technology. The basic problem here seems to be a lack of respect for fellow students and teachers and an inability to focus attention when necessary. Maybe too many distractions. However, I think this is a general tendency in society as a whole that is possibly accentuated in the classroom setting. I've been to many meetings and conferences with delegates of all ages busy with totally unrelated activities on their laptops and cellphones.
Of course we have to ensure that what goes on in the classroom is relevant and interactive, but even when it is engaging there are still many who are too distracted by background noise to realize what they're missing. Attention is a vital skill that I think must be taught. We've been given so many exciting tools to use that we have forgotten how to simply concentrate on a task and shut off the distractions for a while. I've heard of several teachers who hold a class discussion around this and agree on "house rules" for class time. Class time can be divided into tech-free periods where listening and participating is required and other periods where all devices are on and the focus is on gathering information and resources.
Howard Rheingold is a great source of inspiration and I can't resist including a new video interview with him on the subject of 21st century literacies. He often writes on the need to teach the skill of attention, of being able to focus on one activity and shut out the distractions. Technology is taking the blame today for a lot of basic human failings. Social media give us enormous opportunities to learn and cooperate but we need to focus more effort on teaching people how to use them responsibly.
21st century media literacies from JD Lasica on Vimeo.
Sunday, September 13, 2009
What's in a name?
I remember a conference quite a while ago where a manager suggested that we shouldn't see ourselves as mere teachers anymore but as "competence architects". That of course became the subject of many merry comments in the bar that evening but it reflects a modern obsession. We keep inventing new names for occupations, technologies, institutions and behaviour.
There's a nice blog post by Steve Wheeler (Lost in translation - read discussion too) where he discusses the problem of what to call concepts like PLE (Personal Learning Environment) or Web 2.0. Many feel that these names are inaccurate or misleading but the problem is what to call them instead. Once something has been named it's rather difficult to change the name and get everyone to agree.
We're in the midst of a merger with a neighbouring university and will emerge from the process after New Year as a new university - Linnaeus University. That means we have to reorganize absolutely everything and as a result there are countless discussions about what to call our new departments and units.
One such case is the library. Libraries today are increasingly focused on net-based resources and in some cases some of the books and journals are even being moved aside to make room for more flexible learning spaces. However for many people the old concept remains firmly fixed. If we continue to call it a library many people will fail to see how it has changed but if we dream up a new name like learning resource centre we run the risk of getting the response "oh, you mean the library!"
We also had a long discussion about what we mean today by IT. Everyone has a clear picture of what the IT department has done up till now and that is fairly limited to networks, servers, hardware etc. If we widen the scope of IT to include "softer" areas of technology use and web 2.0 (sorry!) we have to think of a new name. But if the new name is seen as pretentious or vague people will continue to call it IT until convinced otherwise.
With English being the dominant world language, all new technical advances are first given an English name, and the world's other languages then have to decide whether to find their own equivalent or just accept yet another anglicism. For example the Danes just say "computer" whilst Swedish uses "dator" and Finnish "tietokone". It's mighty hard to talk about web 2.0 completely in Swedish since no-one has yet thought up Swedish equivalents.
I'm not sure what is easier. Updating people's attitudes to the revised meaning of terms like teacher, IT or library? Or spending years "selling" a new term that few want to buy?
Saturday, September 12, 2009
Fotopedia
Sharing photos is one of the most popular social activities on the net and you might wonder if there is a need for another photo sharing site, but I can't help recommending a relatively recent arrival on the scene: Fotopedia. The idea is to create a photo equivalent of Wikipedia, allowing photographers to share, tag and collaborate. Superb layout and some breathtaking photos.
The principle is that you can upload photos and create your own albums, or add your photos to existing albums on a particular subject. Photos can even be uploaded from Flickr or Picasa and then be linked to Wikipedia articles and Google Maps. The crowdsourcing principle rules here: users vote on the best and most relevant photos, and the more votes a photo collects, the further up the hierarchy it climbs.
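The vote-and-rank mechanism described above can be sketched in a few lines. This is only an illustration of the general idea, not Fotopedia's actual algorithm (which isn't public here); the photo names and vote counts are invented:

```python
# Hypothetical album: photo name -> accumulated votes.
photos = {
    "sunset_over_kalmar.jpg": 42,
    "library_interior.jpg": 7,
    "autumn_forest.jpg": 19,
}

def cast_vote(votes, photo):
    """Register one user vote for a photo in the album."""
    votes[photo] = votes.get(photo, 0) + 1

cast_vote(photos, "library_interior.jpg")

# Photos with more votes climb toward the top of the album.
ranked = sorted(photos, key=photos.get, reverse=True)
print(ranked)  # most-voted photo first
```

However the real site weights its scores, the effect is the same: each vote nudges a photo up the album's hierarchy.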
“After traveling the world, I wanted to share my photos with others. Flickr and other photo sites give you exposure for only a brief window in time, and adding photos to wikipedia proved too complicated for the average user. This sparked the idea for a ‘wikipedia of photos’ – that combines the permanence and community collaboration of wikipedia with the ease of use of consumer desktop applications.” Jean-Marie Hullot, one of the founders of Fotopedia.
Friday, September 11, 2009
Don't believe the hype
It's wonderful that the world isn't as simple as it seems sometimes. It's easy to make sweeping and comfy generalisations that seem to explain something but then discover that the truth is frustratingly complex. If all the simple explanations were true we wouldn't have much left to discover and discuss.
In the last few months the whole net generation issue has been turned on its head as we realize that generations can't be categorized in such simplistic terms. It sounded plausible for a while as it was a good way of forcing the establishment to notice what was happening on the net and realize that it was going to radically change the way we run education. However we now see that the net generation is more complicated than that. Many young people do use new technology intuitively but very many do not. The same holds true for all age groups basically; it's mostly down to interest, curiosity and peer influence. Indeed it seems to me that the driving force behind the growth of social media is not teenagers as previously assumed; it's net enthusiasts over 30 and often well over. Some of the most innovative people I know are older than me! Indeed I've read that many young people are abandoning Facebook because it's full of their teachers and parents.
Who decides what tools to use on courses - the students? We hotly debate the pros and cons of different systems but do the students really care which learning management system we use as long as it is well-structured and reliable? If teachers try to use, say, Facebook as a communication tool on a course isn't there a risk that some students will resent their studies encroaching on their social arena? I read of a teacher who wanted the class to hand in assignments as audio files but met with resistance on the grounds that students were there to learn the subject and not a lot of technology. There have to be convincing reasons for using technology and the learning curve cannot be too demanding. However, the right preparation and motivation can work wonders. One course at my university is held completely in Second Life and the students are all SL beginners at the start yet it works well thanks to good groundwork at the start.
I love testing new tools and write enthusiastically about many of them but it is easy to get carried away. It's rather sobering to show off a new discovery to colleagues expecting them to share your enthusiasm only to be met with a resounding shrugging of shoulders.
People's attitudes to "technology" vary greatly. To many the word has very negative connotations; something that is unreliable, complicated and to be avoided. Anything we don't really like or are intimidated by is immediately dismissed as "technical". Many people still debate whether we should use "technology" in education at all (aren't whiteboards, overhead projectors, pens and microphones also technology?). I meet people who work successfully with complex Excel spreadsheets or administrative systems (that scare me to death!) but are wary of, say, Skype, wikis and blogs because they are too "technical". Beauty is in the eye of the beholder indeed.
Sunday, September 6, 2009
Cultural updates
Doesn't time fly? For many of us the breakup of the Soviet Union feels like a recent event and the internet still seems like new technology. An article in Times Higher Education reports on an American college that has written a cultural update for its teaching staff, reminding them that teachers' reference points are no longer understood by the students (Beloit College Mindset List). For students born in 1991 the EU has always existed, the iron curtain is a vaguely understood archaism and cellphones, cable TV and the internet have always been around.
Of course that's the whole digital natives phenomenon but this light-hearted list does bring home a few points to me. It's so easy to talk about concepts like "eastern bloc" and not realize that we're talking to people who have no idea what we mean. If we do make such references we have to be prepared to explain the background.
Saturday, September 5, 2009
Unschooling
The homeschooling movement in the US seems to be growing as more schools offer online teaching. There is a long tradition there of distrusting state-run institutions, and in many states parents can opt to keep their children at home. In Europe this phenomenon has not made much of an impact since keeping children out of school is illegal in many countries, including here in Sweden.
However I was not aware of an extreme variation on homeschooling called unschooling until I came across an article about it in the Baltimore Sun, From home schooling to unschooling. Homeschooling is still based on a curriculum decided by a school, with most teaching and learning being online. Unschooling, on the other hand, opts out of even that connection with the education system. Here the parents are completely responsible for their children's education. Parents take their children on outdoor excursions, involve the kids in all aspects of housework and gardening and generally encourage the kids to learn what they want at their own pace.
To succeed with unschooling parents have to be highly capable in child psychology, pedagogy and management and, most importantly, should not have regular employment that takes them away from their kids for long. It sounds very idyllic in the article and reminds me of the education principles within various hippy communities in the late sixties. The children, however, will be seriously deprived of learning how to interact with others and will probably not be exposed to opinions and information that their parents do not agree with. The potential for indoctrination is very high and I would guess that one main reason for choosing unschooling is that the parents consider the school system in some way dangerous and do not want their children to be exposed to the "wrong" ideology.
As ever, there are elements of this style of education that are appealing: encouraging curiosity, breaking out of the restraints of the classroom, integrating learning and living. However, looking at the typical daily routine of unschooling as described at the end of the article, I would say it closely resembles a pretty normal Saturday or Sunday routine for many regular families. The key to an all-round education is the combination of learning in different environments (school, home, outdoors), with a wide variety of people (family, friends, class, self study) and through a variety of activities (discussion, reading, instruction, work, experimentation). Cutting off any of these components is deprivation, and the unschooling principle seems to me to be lacking in several key learning activities.
Please read the comments on this for more links and discussion .....
Wednesday, September 2, 2009
Unplugged
Despite all the advances in wireless technology I still have masses of tangled wires behind my computers and TV screens both at work and at home. Plus, of course, several drawers full of power cords, adapters and other wires. I am constantly amazed at their ability to get tangled up no matter how carefully I arrange them, especially if they're in a bag. The minute you turn your back they start snuggling up to each other.
I have often joked about having wireless electricity to solve all this and I've now started finding reports of exactly this breakthrough. Below is a TED talk by Eric Giler (MIT) demonstrating the principle of WiTricity: electricity is converted to a magnetic field and then back into electricity. In other words, power can be transmitted wirelessly over short distances, and the potential for this in the home and office is enormous.
One solution is to have a power pad on a desk, plugged into the mains, and you simply lay your cellphone or other device on it and it recharges. Could this be the end of all those infuriatingly incompatible battery charger cords that infest the world? A giant-sized pad could be on your garage floor to recharge your electric car at night. I almost feel moved to burst into song ....
Read more in an article from CNN, A cordless future for electricity.