How to switch off in a switched-on world - As I make dinner in my kitchen, my daughter is perusing Facebook, my husband is scrolling through emails on his BlackBerry, and my son is playing a game on his iPod Touch. I am texting friends and checking emails as I season the chicken. My whole family is here, but it’s strangely silent except for the clicks of keyboards. We are all in the same room, yet completely disconnected. I wonder, as I look at their faces, are these gadgets redefining us in ways we don’t realise?
Nicholas Carr, an influential writer and blogger on the social implications of technology, thinks so. His book The Shallows: How The Internet Is Changing The Way We Think, Read And Remember received tremendous media coverage worldwide before its release in the UK. It taps into our deepest fears about the internet: that our brains will be overloaded and our synapses changed for ever, and that ultimately artificial intelligence will overrule our brains as we become like the machines we created. ‘The pace at which technology emerges now doesn’t leave us time to consider its implications,’ he says.
He’s not the only one to think so. Trend forecaster Richard Watson, author of Future Minds, believes we are moving towards a culture of ‘partial stupidity’. Our speed of communication will force people to respond without thinking things through, he says, and we’ll start to place a higher premium on hand-written documents and Wi-Fi-free cafés and hotels. As we reach the brink of ‘peak attention’ – the point at which our brains cannot absorb any more information – we’ll succumb to fatigue and stress.
Technology has always changed us. Television brought us 24-hour news and celebrity culture, and paved the way for the instant-access digital age. The clock ended our reliance on nature’s rhythms and regimented our living for ever. Even the printed page transformed our culture of oral narrative. Carr notes that the philosopher Socrates feared the development of writing would make people forgetful. There were losses, and gains, but we adapted. So, do we really need to worry?
‘Perhaps not since early man first discovered how to use a tool has the human brain been affected so quickly and so dramatically,’ writes neuroscientist Dr Gary Small in his book iBrain: Surviving The Technological Alteration Of The Human Mind. He believes that digital technology is altering how we feel and behave, as well as the core circuitry of our brains.
Small distinguishes between ‘digital natives’ – those who have never known a world without texting, internet and home video games – and ‘digital immigrants’ – those whose neural pathways were shaped long before the advent of modern technology. His research shows that the neural networks in these two groups differ dramatically. In a recent study, the so-called natives, who used the internet regularly, showed greater brain activity in regions responsible for decision-making and complex reasoning, whereas ‘immigrants’ were better at reading facial expressions.
While it’s good news that internet use boosts some brain functions, Small is concerned about the growing body of research that shows an increase in scattered thinking among regular users of digital technology. We are living in a state of continuous partial attention in which we keep track of lots of things but don’t focus on one, as we constantly search for a new contact, new information or a titbit of gossip. In one study from the University of California, office workers were shown to spend only 11 minutes per project. Each time they were distracted from a given task, it took 25 minutes to return to it.
It’s in the economic interest of search engines ‘to drive us to distraction’, Carr points out, because we leave bits of information about ourselves as we jump about. This ‘constant crisis’ puts our brains in a state of heightened stress that continues after we log off, says Small, possibly even reducing short-term memory.
We may have been worried about the culture of distraction since the advent of MTV, but the latest wave of research suggests we aren’t just losing focus, we’re losing different modes of thought.
Carr suggests that search engines may diminish creative thinking because they ‘tend to serve as amplifiers of popularity’. Whether we research a historical topic, medical query or a product, we are following a script that reinforces a consensus about what information is and isn’t important.
In this way, he argues, ‘it’s possible that the internet can shape our thoughts as a society as we move towards a futuristic artificial intelligence’.
It is even possible that our increased use of the internet will affect us spiritually, as we spend less time in slow, deep thought. The act of contemplation not only helps us to combat stress, it helps us to feel connected to the world around us. ‘The easy access to information and open lines of communication are great benefits,’ he says. ‘But if that crowds out contemplative, solitary thought, then we lose what makes us distinctive as individuals and our entire culture reshapes itself to become more utilitarian and focused on efficiency.’
It’s possible that much of our hesitation about the demands of new technology stems from the fact that we are still measuring our new way of thinking against the old. There are merits to each, and it is up to us to decide how we use technology.
‘The great achievements in art and culture have come from deep, solitary thinking, but that doesn’t mean it’s the only way,’ says Carr. ‘People think in many different ways and sometimes it’s great to be inundated with information and juggle lots of things.’
Present studies suggest that internet usage is crowding out other modes of thought and communication. But there is evidence of a movement resisting ‘our culture of distraction’, according to Maggie Jackson, author of Distracted: The Erosion Of Attention And The Coming Dark Age. In response to office workers complaining that they don’t have time to think anymore, particularly in creative professions, some organisations are setting up quiet rooms, where there is no technology and employees can go to think.
Jackson says that formal policies, such as ‘no emails on Friday’ and designated ‘think days’, have failed in workplaces, largely because she thinks the respect for contemplative time is a ‘collective social challenge’ that needs to come from everyone, rather than a top-down mandate. ‘We need to hammer out etiquette and social values to accompany these dramatic technological advances,’ says Jackson. ‘It’s a topic that needs to be addressed in schools, workplaces and homes.’
In the meantime, we can all think about our use of technology and make an effort to schedule time when we are completely ‘off’. Jackson suggests creating a ‘white room’ at home in which technology is prohibited, and to be aware of distractions and multi-tasking. If we don’t, then perhaps we will put ourselves in real danger of losing that private, quiet part of ourselves.
‘We don’t need to be Luddites or refuseniks,’ says Jackson. ‘But we do need to take tech breaks when we step away and remind ourselves what it means to be human.’ (psychologies.co.uk)