For the last few years, I’ve been spoiled. I’ve been surrounded by people who, when asked a question, immediately bring out a digital device and look it up. The conferences that I’ve attended have backchannels as a given. Tweeting, blogging, Wikipedia-ing… these are all just what we do. It’s not all there – it’s still broken. My cohort is still always in search of a power plug and there’s a lag between the time a question is asked and the point at which the iPhone’s slow browser is loaded, the query is entered, and the answer is given. Still, we’re getting there. Or so I thought.
In Italy two weeks ago, I attended Modernity 2.0 (in the lovely Urbino, hosted by the fantastic Fabio Giglietto). There were two audiences in attendance – a young cohort of “internet scholars” and an older cohort deeply invested in sociocybernetics. At one point, after a talk, one of the sociocybernetics scholars (actually, the former President of the sociocybernetics organization… I know… I looked him up) began his question by highlighting that, unlike most of the audience, who seemed more invested in the internet than in scholarly conversation, HE had been paying attention. He was sitting next to me. He looked at me as he said this.
It’s not very often that I feel like I’ve been publicly bitchslapped, but boy did that sting. And then I felt pissy, like a resentful, stubborn child bent on proving him wrong. Somewhere between growing my hair out and becoming an adult, I must have shed some of that spite, because boy was I tempted to bite back. Of course, I haven’t become that much of an adult, because here I am blogging the details of said encounter.
There’s no doubt that I barely understood what the speaker was talking about. But during the talk, I had looked up six different concepts he had introduced (thank you Wikipedia), scanned two of the speaker’s papers to try to grok what on earth he was talking about, and used Babelfish to translate the Italian conversations taking place on Twitter and FriendFeed in an attempt to understand what was being said. Of course, I had also looked up half the people in the room (including the condescending man next to me) and posted a tweet of my own.
But, of course, the attack was not actually about the reality of my internet habits but the perception of them. There’s no doubt that, when given a laptop in a lecture setting, most people surf the web, check email, or play video games. Their attention is lost and they’ve checked out. Of course, there’s an assumption that technology is to blame. The only thing that I really blame said technology for is limiting doodling practice for the potential future artist (and for those of us who still can’t sketch to save our lives). Y’see – I don’t think that people were paying that much attention before. Daydreaming and sketching (aka “taking notes”) are not particularly new practices. Now the daydreamer might just be blogging instead.
My frustration at the anti-computer attitude goes beyond the generational gap of an academic conference. I’ve found that this same attitude tends to be present in many workplace environments. Blackberries and laptops are often frowned upon as distraction devices. As a result, few of my colleagues are in the habit of creating backchannels in business meetings. This drives me absolutely bonkers, especially when we’re talking about conference calls. I desperately, desperately want my colleagues to be on IM or IRC or some channel of real-time conversation during meetings. While I will fully admit that there are times when the only thing I have to contribute to such dialogue is snark, there are many more times when I really want clarifications, a quick question answered, or the ability to ask someone in the room to put the mic closer to the speaker without interrupting the speaker in the process.
I have become a “bad student.” I can no longer wander an art museum without asking a bazillion questions that the docent doesn’t know or won’t answer or desperately wanting access to information that goes beyond what’s on the brochure (like did you know that Rafael died from having too much sex!?!?!). I can’t pay attention in a lecture without looking up relevant content. And, in my world, every meeting and talk is enhanced through a backchannel of communication.
This isn’t simply a generational issue. In some ways, it’s a matter of approach. Every Wednesday, MSR New England has a guest speaker (if you wanna be notified of the talks, drop me an email). None of my colleagues brings a laptop. I do. And occasionally my interns do (although they often feel like they’re misbehaving when they do so they often don’t… I’m more stubborn than they are). My colleagues interrupt the talk with questions. (One admits that he asks questions because he’s more interested in talking to the speaker than listening… he also asks questions to stay awake.) I find the interruptions to the speaker to be weirdly inappropriate. I much much prefer to ask questions to Twitter, Wikipedia, and IRC/IM. Let the speaker do her/his thing… let me talk with the audience who is present and those who are not but might have thoughtful feedback. When I’m inspired, I ask questions. When I’m not, I zone out, computer or not.
My colleagues aren’t that much older than me but they come from a different set of traditions. They aren’t used to speaking to a room full of blue-glow faces. And they think it’s utterly fascinating that I poll my twitterverse about constructs of fairness while hearing a speaker talk about game theory. Am I learning what the speaker wants me to learn? Perhaps not. But I am learning and thinking and engaging.
I’m 31 years old. I’ve been online since I was a teen. I’ve grown up with this medium and I embrace each new device that brings me closer to being a cyborg. I want information at my fingertips now and always. There’s no doubt that I’m not mainstream. But I also feel really badly for the info-driven teens and college students out there being told that learning can only happen when they pay attention to an audio-driven lecture in a classroom setting. I read books during class (blatantly not paying attention). Imagine what would’ve happened had I been welcome to let my mind run wild on the topic at hand.
What will it take for us to see technology as a tool for information enhancement? At the very least, how can we embrace those who learn best when they have an outlet for their questions and thoughts? How I long for being connected to be an acceptable part of engagement.
I should note that the conference in Italy was international (predominantly European, talks in English) and that the man who criticized those of us on computers was not Italian. So my comments should not be taken as a critique of Italians vs. Americans, as this was not the dynamic of the conference or the disagreement at all.
I should also note that the conference was split pretty evenly down the line with two vastly different sets of expectations about norms of a conference. There were two “camps” if you will. The difference was primarily marked by academic discipline, not country.
Ah well – darn – what a nice idea I had, but information from you negates my theory! Thanks – didn’t read what you said carefully enough.
That two camp thing is really interesting.
Speaking as someone who is *right now* going back to re-read a book, this time with the Internet in hand, and has already spent thirty minutes on the references on one page, I’m so with you.
I also get frustrated with the low bandwidth of public talks (as opposed to the high bandwidth of public talks and public conversations). But then, podcasts frustrate me the same way, so maybe I’m just a scanning sort of dork…
The communication classes I TAed this last winter and spring forbade laptop use in the lecture hall (though I allowed it in my sections). This was based on a couple of articles implicating computer use in overall decreases in attention and retention:
I had mixed feelings about this. I’ve been taking notes on a laptop since 2001 and love having a searchable and shareable record of my classes, but I’m also very bad at splitting my attention (despite otherwise showing signs of being part of this so-called digital generation) and having a laptop out just makes it so much easier to distract myself. In fact, I’ve had to discipline my use of communication technologies generally to be productive at all, in the classroom or out. As an erstwhile doodler, I can confidently say that doodling never interfered *that* much with my productivity or attention; in fact, it often enhanced it.
Recent interviews of iPhone users on Stanford campus had undergrads on both sides of the debate, as well. Some clearly disciplined their use, others embraced the technology. Both were aware of the other side and were generally open to what they considered “rude” being acceptable to others. Interpretive flexibility?
Anyway, I’m too tired right now to comment more cogently (or even coherently), but I wanted to throw those links up, in any case. Always the danger of catering to the average — for the average student it might be detrimental, but for certain students it’s clearly not!
Aesthetically, more gadgetry is not necessarily better. I am thinking about the notion of making your life into a work of art. Granted, there is something to say about an appealing aesthetic of clutter. However, the type of person that is becoming common, the one who juggles a laptop along with other devices made with little consideration to design, accompanied by a paper cup of coffee, all emblazoned with commercial logos, is in most instances anything but pleasing to the eye.
It is now a truism, “It is not the information but what you do with it.” What does it really matter if you are able to retrieve an additional tidbit of information on a specific point made by a speaker or not? As you suggest Danah, it probably matters as much as our doodles do. Did people write manifestos for doodling? If we are in the business of thinking and creating, then our concern should be on the quality of our thoughts and creations.
People confuse doing with thinking and creating. By reading some of these comments you would think that tweeting, taking your computer to a lecture, or looking something up on the Internet was a revolutionary act in and of itself. Perhaps, as cyborgs become more common, we will be able to stop being enthralled by their novelty and judge them rather by the quality of their thought and art.
This is a great column, danah — one of my favorites; thanks for your perspective on multi-tasking audiences.
Not that I know you at all personally, but I can just imagine you trying to practice meditation!
Here’s a phrase I found in a book of proverbs you might enjoy:
To waste all day
in the busy town,
Forgetting the treasure
in her own house.
I wonder how much is gained and how much is lost by delaying “knowing?” Is “right now” more important than holding off, delaying gratification?
You’ve given us all much to think about. Thank you for sharing.
I work here at Linden Lab (Second Life) with JoRoan, and yes, we live in the cyber-future. I attend all my meetings in-world (online with all the other things at my disposal, and no one can see or hear me typing in another window unless I choose).
I lose my thread of understanding if I do email or scan IRC. So I knit simple projects during meetings. Which, well, is slightly more obtrusive than doodling in-person, but is much the same effect mentally for me, a non-sketching-enabled programmer.
It is nice to be able to tune out of meetings as needed without the social frowns, though.
Stanford is apparently having a Media Multitasking Workshop today! Dean Eckles is blogging it. (Dogfood, anyone?)
Oh, so recognizable, so very true! And so very sad, in fact.
The gap you spotted so well is often seen as ‘generational’ but I think it’s more an attitudinal one. I know quite a lot of relatively young people who treat ‘all these digital things’ as entirely opposite to the ‘true human relationships’, and who therefore demonstrate a mix of disgust and hostility to any manifestation of the ‘digital’.
Like yourself, my life is quite densely (and increasingly) populated with all kinds of ‘digital layers’, from blogging, to wikiing, to non-stop FriendFeeding, IM-ing, Second Lifing, World-of-Warcrafting, etherpadding, etc. – and give me more, please! But all that adds to the richness of my life and my communication with people; it doesn’t diminish it.
Yet from the outside it is often seen as distraction and escapism from ‘real life’. “You’d better spend more time with your family instead of Second Life!” I hear all too often. “Let’s have a true meeting, without laptops and chats!” How do I explain to them that the meeting will be much more productive and pleasant if it is enhanced with a few more digital channels? How do I explain that I often spend time in SL *with* my son, or raid in WoW *with* my wife, and that it doesn’t prevent us from doing a lot of other ‘real life’ things but rather adds new dimensions to them?
The real problem is that I can’t really explain it. There is a true vicious circle many of these ‘anti-digital’ people fall into: they never try any of those instruments, and therefore tend to buy negative rumors about them, and since these second-hand rumors further reinforce their negative attitude, they never try to try!
I guess it is linked to the pathological fear many people have of finding themselves in the position of n00bs, newbies who are keen to learn the new without prefabricated judgments. Somehow this position of n00bness is severely stigmatized by our culture; most likely, the ‘sociocybernetics professor’ you mentioned in your post perceives himself as such an EXPERT that he cannot even imagine entering any space where he would be lvl 1.
Cherish your n00bness. That’s your chance to learn!
I was presenting this morning at BbWorld09 on “Weaving the Social Web into Learning: Blackboard as a Learning Portal”, and used your post as an example of how faculty need to change. Thanks for sharing – this 59-yr old Net Gen-er agrees with you!
I am so down with keeping my laptop up during meetings. Meetings are so much more meaningful when I can google topics being discussed that I am unfamiliar with. And if it is a meeting that doesn’t really involve me, then I can VPN to my desktop and finish the doc I left behind to attend said meeting.
Either way, my company wins. They are the winners, because I’m more- not less- productive because I have this little black HP appendage.
The comment below was written before I read all the other comments, which needless to say would cast my points in a different light. But, rather than revise, I’ll leave it in “first impression” mode, as I think my points as I expressed them may still be of interest.
“If I only had a dollar, for every song I’ve sung.
And every time I’ve had to play while people sat there drunk.”
-Creedence Clearwater Revival
I think this issue is really about the courtesy an audience owes to a performer. As the quoted lyrics suggest, this is a concept perhaps “more honored in the breach”. However, I think the issue is worth looking at.
A speaker, like a performer in other genres, works hard to create a presentation unified in theme and concept. They work to create a particular experience for their audience. For the audience to blatantly repurpose the presentation shows a certain degree of disregard (I had thought to say “contempt”, but perhaps that’s a bit too harsh) for the hard work and preparation of the presenter.
The whole reason that real music fans prefer a live performance over media is that in a live performance there is opportunity for immediate feedback between performer and audience in real time. If you watch some live performances on YouTube you will see this occur. It is awesome!
Now, let’s just suppose that I were privileged to be in the audience when Avril Lavigne performed her cover of Green Day’s “Basket Case”. I suppose it would be conceivable to whip out my mobile tool of choice and access audio and/or video of the original Green Day performance for instant comparison. Arguably I might “learn more” by such means. (In real life this would never happen, as I have a cell phone which I keep strictly voice only.) But not only would such an action be rude to Avril, it would also diminish my experience of being “in the moment” with her as she performs.
Now back to the instance at hand. A competent speaker will watch the audience and receive cues from their outward and visible signs of the internal structure of their attention. They may use these cues to vary the pace or tone of their presentation in real time – exactly as a musician does at a live gig. When you focus elsewhere, you throw a monkey wrench into that process.
In my personal circumstances, I occasionally invite a friend of the complementary gender out for coffee and conversation. Now, I am one of those people who has somewhat of an inability to be “casual” in conversation. I normally structure sentences, or even entire paragraphs, as I speak. Imagine my dismay as I’m working toward a point and have about 2/3 of the supporting structure in place, and then their cell phone rings.
That is frustrating. Now imagine how much *worse* it is when they whip it out and make a call in the middle of my sentence.
I’m not in principle utterly opposed to the concept of mashup. Collaborative art can be a wonderful thing. But I think someone who seeks to add their own input to an experience designed by somebody else owes the originator the courtesy of experiencing it as designed first.
Now, I should say that all this rests on the assumption that the speaker has in fact been responsible and diligent in preparing a quality presentation. I understand that in academia there is a long tradition of deliberately boring presentations, to which those of junior status are required to pay unwavering attention as an act of masochistic submission. If you were to find yourself on the receiving end of something like this, then by all means use your wired paraphernalia to wiggle out from under. It certainly beats having to sleep with your eyes open. 🙂
But if the person has worked hard to make a good presentation, why not experience it as designed?
Just a thought,
P.S. After reading the comments, I’ll just add one point. I’m firmly in the camp of those who mourn the loss of unmediated communication and human contact. It’s only a little bit of hyperbole to say that we are entering a dystopia of digitally enforced autism. No more touches, no more smiles, no more sighs, no more quirky looks, no more gentle whispers, no more thundering rages, no more humanity. We are throttled down to such of ourselves as can be captured in a bitstream.
I don’t know about you, but I don’t *want* to be reduced to a bitstream.
I understand you sometimes attend the legendary “Burning Man” festival. Do you go wired?
To me this points out a much broader issue than simply using technology at a conference. The idea is that as technology becomes a larger part of all our lives, there need to be ways to define which environments the technology is appropriate and acceptable in.
The idea that simply because it is useful to _you_ is not enough. Obviously that needs to be respected, but still there has to be respect for other people (audience, speaker, performer) in addition to the environment you are in. The fact that the man next to you pointed out your laptop use may not have just been a sign that he didn’t understand how useful it was to you, but also that it was indeed distracting to him for possibly other reasons (eg. typing, prominent digital display, loud laptop fan).
There are any number of disruptive technologies that are inappropriate in certain environments:
1) Mobile phones ringing at the movies, weddings, graduations, churches, etc. despite warnings to turn off your phone.
2) People who hold their digital cameras in the air for half of a concert despite the obvious fact the people behind them can no longer see and paid for the concert too.
3) The fact that digital displays attract your eye particularly when you are in a dark room. So instead of looking at the actual performers that you paid to see, you find yourself looking at someone else’s phone or camera.
4) Hiking in some pristine setting and knowing exactly how many pictures the people next to you have taken because they don’t know how to mute their camera. (Mind you this is marginally better than listening to someone’s old school film camera rewinding the film.)
5) Flash photography in a museum that damages the art.
All these things point to a need for event venues and workplaces and governments to set policies for how technology can be used. Obviously there are limitations in enforcing this, but that doesn’t change the fact that these things need to be addressed. For example, we’ve already seen laws related to phone calls and text messaging while driving, mostly because of how they affect other people.
Also, this points to a need for technology to be better designed to adapt to common environments (as your “cyborg” title suggests). This could be design changes such as a laptop with quieter keys which is potentially smaller and less distracting to those around you. But it could also be smarter technology that understands what is appropriate in a given setting. I could imagine a negotiated protocol between the device that you are using and the room you are in that _suggests_ what features you have available. This could lead to a smart camera that automatically knows that you can’t use a flash in certain rooms in a museum. Or a mobile phone that understands it is in a movie theatre so it shouldn’t ring.
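To make the idea a little more concrete, here is a purely hypothetical sketch of such a negotiation. Everything in it is invented for illustration – the venue names, the policy fields, and the `negotiate` function correspond to no real standard or API – but it shows the key design choice: the venue only *suggests* a policy, and the device merges it with its own settings by always taking the more restrictive option.

```python
# Hypothetical venue policies a room might broadcast to nearby devices.
# Boolean fields mean "feature allowed?"; brightness is a 0.0-1.0 ceiling.
VENUE_POLICIES = {
    "movie_theatre": {"ringer": False, "flash": False, "screen_brightness": 0.2},
    "museum_gallery": {"ringer": False, "flash": False, "screen_brightness": 0.8},
    "conference_hall": {"ringer": False, "flash": True, "screen_brightness": 0.5},
}

def negotiate(device_settings, venue):
    """Return the device's settings adjusted to the venue's suggested policy.

    The venue never overrides the user outright: for each feature we take
    the more restrictive of the device's own setting and the suggestion.
    Unknown venues leave the settings untouched.
    """
    policy = VENUE_POLICIES.get(venue, {})
    adjusted = dict(device_settings)
    for feature, suggestion in policy.items():
        if feature not in adjusted:
            continue  # the device doesn't have this feature at all
        if isinstance(suggestion, bool):
            # Disabled if either the user or the venue says off.
            adjusted[feature] = adjusted[feature] and suggestion
        else:
            # Numeric settings are capped at the venue's suggested level.
            adjusted[feature] = min(adjusted[feature], suggestion)
    return adjusted

phone = {"ringer": True, "flash": True, "screen_brightness": 1.0}
in_theatre = negotiate(phone, "movie_theatre")
```

A real version of this would, of course, raise exactly the thorny question the comment above is about: who gets to decide what a room may suggest to your device, and whether "suggest" ever quietly becomes "enforce".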
Lastly, I don’t think we are ever going to get it 100% right for everyone given that people have different perspectives of what is appropriate. I was reminded of this as I just sent my sister a link that wasn’t safe for work. Still, certainly we can get a lot better at it.
PS. I loved your post for the points it does make. My friend’s grandmother was recently complaining about a woman in a meeting who was texting. I certainly think we need to change our perspective on technology use. And I can say that I’ve distracted other people while I was the person doodling so certainly technology is not the only thing to blame.
Yes, the adoption of new technologies is often justified in terms of how it contributes to productivity. However, ‘doing’ often has little to do with ‘creating’. You want productivity? Ants are productive.
Recent research is beginning to demonstrate that daydreaming is important for creative human thought and problem solving. It seems to me that such research could be used to make an argument for disconnecting from the Internet, scheduling in some unplugged moments. It could also be the case, however, that certain forms of online activity are more conducive to creativity than others. Perhaps doodling and aimless browsing have this in common: they both allow the brain a little freedom to create new connections.
The debate between adopters and anti-adopters will continue, I suppose. However, there is another question: What can you do with your digital device? Everybody can retrieve information (granted some better than others), but what novel thing can you create or think with it?
If doing art was as simple as adopting the use of the paintbrush, then nobody would be looking up Raphael on Wikipedia.
Law professors deal with the issue of technology use in their classrooms all the time. Some find laptops distracting while others do not. I don’t think they are, but I do think that they can lead to some unforeseen risks.
If the writer attended for pleasure, without the funding of another party and the expectation of learning the speaker’s message, then have fun and give only a moderate portion of attention to the speaker. But if attending on someone else’s dime, with the expectation of learning and using all the information in the future, give 100% attention. It’s less about being bored or wanting to free associate in ADD heaven, and more about maintaining integrity in the working relationship. Meeting the performance expectations of the one who pays you, giving them the new information or performance they sent you there to receive, may be mundane but nevertheless shows a more mature level of personal growth.
Reminds me of an interesting, thoughtful essay by N. Katherine Hayles called ‘Hyper and Deep Attention: The Generational Divide in Cognitive Modes’. Profession 2007, pp. 187-199 (13).
How would your approach change if the speaker expected the audience to engage in co-creating their lecture?
You eloquently express the frustration and angst that a growing number of students and faculty feel about this disconnect: http://tinyurl.com/dgh6dy
Here’s what my advisor (who actually introduced me to your scholarship) was thinking when she imposed her own ban: http://tinyurl.com/ml4k33. She’s relaxed that for her grad students (thankfully), but I think she wishes it were different for her undergrads. She’s not the “banning” type.
FWIW, I too find it unnerving to have eyes fixed on me during a paper or presentation, especially since I’m squarely in the “guide-on-the-side” camp even during conferences, so I like and expect a fair amount of give and take – I usually have a “runner” to collect questions/feedback from the more timid in the group, and I also rely heavily on backchannel chatter to inform future directions, etc.
I think this picture speaks a thousand words to the futility of trying to put the genie back in the bottle: http://tinyurl.com/ksulx6.
New evidence suggests that the brain does not take in or retain information in the same way when multitasking… even though participants in studies thought they were retaining, they actually were not… paying full attention turns out to be important after all!
Good article, w/ results of scientific studies, MRI brain-scan stuff:
I think using a privacy filter on my laptop display and turning my display backlight down makes my computer use much less distracting to attendees sitting beside or behind me. It turns my display from being a TV flickering in the room into a private notepad.
I heard a joke once: Multitasking is like sex — people are not as good at it as they think they are.
I think a backchannel is great for a conference presentation type of situation. It enables a conversation where the physical layout discourages or prevents one. But count me as old school when it comes to business meetings or conference calls. I consider meetings and calls to be situations where I have to pay attention closely, and I expect others to do the same. (And if it doesn’t require that, then maybe that meeting is unnecessary.) If you’re replying to someone’s backchannel comment, you’re missing the point I or someone else is making in the actual meeting.
I see people backchanneling every day … on the road, where they are weaving all over, almost colliding with other cars, you know the deal. We had a train conductor backchanneling in LA — that worked out well, yeah.
It’s about context. Sometimes, sometimes, really paying attention to the physical world pays off.
Just passed through your blog following a thread I was tracking on PdF. I find it enthralling.
I can recall in the early 90’s really envying a colleague who sat in conferences (generally near the front) and read a paperback. I thought he wasn’t listening most of the time, but was surprised when on one occasion he stopped reading the novel and commented to the speaker that he was wrong in part of a cross-correlation function that he used. From that moment I was aware of two things: he was much brighter than the average attendee, and he used the novel to help him multi-task.
For what it’s worth, I believe that the anti-computer attitude of your colleague at the conference reflects an arrogance often found in high-ranking professionals, specifically where asymmetric information flow is concerned. I used to lecture at a military college and often used the opening line that ‘sleep is a legitimate form of criticism’.
I first came across backchannel when I visited the EBC at Redmond. It was fascinating that even back then in 2004, the EBC provided laptops for visitors specifically to enable backchannel and enhance the EBC experience. I have found few other facilities with a similar ethos.
Totally agree that it’s an approach thing rather than a generational issue. I regularly check background facts during seminars/conferences and have a habit of doing it in meetings as well (primarily using a Blackberry), and I’m no spring chicken! However, I also believe that the benefits of social media backchannels during conferences are now starting to emerge, if somewhat slowly. Recent workshops and conferences I have attended have used Twitter to track issues and raise questions, and online voting has been used quite successfully.
Finally, paying attention is not necessarily all it’s cracked up to be. Your comment about the value of engagement and the added value of the backchannel makes for a much more personalised learning experience, and I firmly believe we learn more by doing than simply listening and watching.
Thanks for stimulating such an interesting discussion.
Oh, boy… do I know how you feel. I’m also one of those people who perpetually wants to look things up on the web, no matter the context. Indeed, my first response when asked a question is usually some variation on “Just ask Google.” I’ve been lucky enough to have progressive professors who understand the power of having a laptop in the classroom, esp. at the fingertips of dedicated and inquisitive students. But, unfortunately, I think I’ve been spoiled.
When you said “am I learning what the speaker wants me to learn? Perhaps not. But I am learning and thinking and engaging.” it reminded me of something Edward T. Hall wrote in his book Beyond Culture:
“…language is not…a system for transferring thoughts or meaning from one brain to another, but a system for organizing information and for releasing thoughts and responses in other organisms.”
This connects with something one of my mentors said a while back, that we don’t teach subjects, but subjectivities. And I think once we begin to understand this more deeply, we will realize that the tangents students go off on in their heads during a lecture are unique and can be aided incredibly by the interwebz by allowing them to pursue their own lines of inquiry, ultimately enriching the learning experience. /soapbox
Great post, as always.
I sometimes tell people that there are four stages in the adoption of technology in higher education. They are
1. “What is that stupid thing the students are using?”
2. “Ban the device! If students use that, they will lose the ability to do things that students did in my day, and they may use it to cheat! ”
3. “Damn It! If all the students have one, I guess I’ll just have to allow them in class.”
4. “Since my students started using it in class I find that I can teach about more important topics in more interesting ways.”
Currently netbooks and cell phones are in about stage two.
I notice you are only posting positive affirmations of your own perspective here, Danah.
That senior academic took time out to meet with you and contribute. He could have just kept working on his next paper too. If you read back in the literature, people made just the same kind of fuss about wanting to be able to respond to e-mail instantly – and we observed negative results on team work when this happened. Human beings are good at reading attention from gaze – probably initially to avoid being predated, now to avoid being dissed.
Your language is violent – “bitch-slapped” indicates the need for approval you feel and your desire to be approved of in an academic publishing world. The violent affirmations of your followers show the same level of insecurity. Get over it. If you want to change the world and you think your way works best, go right ahead typing thru every meeting you attend. Just don’t expect to get invited back – or to influence folks to adopt the stuff you are proposing.
I have become a “bad student.” I can no longer wander an art museum without asking a bazillion questions that the docent doesn’t know or won’t answer or desperately wanting access to information that goes beyond what’s on the brochure (like did you know that Rafael died from having too much sex!?!?!).
This nicely illustrates the drawbacks of your approach. You had a docent in front of you trying to tell you what is worthwhile about Rafael and his artwork but you’re distracted looking up trivia about his life.
An expert (ideally) is making a case for what is important about a topic. When you google a topic, it’s really easy to end up with a handful of whatever trivia just happened to catch your eye.
The internet makes us more informed, but at the same time it gives us a powerful illusion of being even more informed than we really are.
Hmmm. I’ve been thinking about this for a couple of days. On the one hand, I like to knit while I listen. I can drop the knitting and pick up a pen if I want to make a note. Lots of people find knitting to be at least as ‘rude’ as clattering on a laptop while a talk is going on. OTOH, I certainly can’t listen while I’m surfing, reading wikipedia etc. My approach would be to make notes and look stuff up later. I guess I’m not as good at multi-tasking as I’m supposed to be (as a woman). As for people thinking that it’s rude to a speaker to knit or surf, my reply is that many many speakers over the years have been very rude to me by presenting mumbling, incoherent, disorganised, rambling lectures or talks. If a speaker isn’t engaging an audience, it’s not the audience who need to look at themselves. In summary, the speaker has the power to engage us or not; the audience has the right to not be engaged, or to express their engagement as they choose.
And Matt Schofield, danah’s not posting what we write here, we are. Perhaps her fearsome reputation prevents people from disagreeing with her? 🙂
high school teacher here.
gosh – i wish i could see your confidence to seek out learning with zest in my students. i fear we’ve boxed them up too much for them to feel that freedom in school.
This experience of realizing you “want information at my fingertips now and always” contrasts with Aaron Swartz’s reflections after 30 days offline: “Normally I feel buffeted by events, a thousand tiny distractions nagging at the back of my head at all times. Offline, I felt in control of my own destiny. I felt, yes, serene.”
Are there other examples of people thinking through the self they have created to live in the online?
Like Sherman, I was at THATCamp, and the Twittering there really did add a lot of value to the discussion. Twitter led to discussions, discussions led to tweets, and it really helped us all communicate our ideas better– not to mention that some people who couldn’t make it could actually follow the conference in real-time via Twitter, blog posts, and even a Wiki, all created on-site during the conference.
I’m also on Livejournal, and most of my friends there are grad students and professors. Weirdly, when I talked about how nicely twitter worked to augment conversation, I found I was facing a lot of resistance from my friends– from academics who blog, and even from a couple who work on New Media projects in the humanities.
I know that I can’t listen *AS CLOSELY* when I multitask, and I try to not overdo it, but at the same time, when was the last time you went to a lecture and actually paid attention to EVERY. SINGLE. WORD?
For me, it’s only ever happened a handful of times. Most of the time, lecturers gain and lose your attention many times in the course of a single lecture. Aren’t you even more engaged if you’re using your laptop to find supporting or conflicting data?
I’ve been playing with a pair of myvu “shades” plugged into an iPhone running an IRC client.
Compared to looking down at the screen in my hand, I feel like I’m better able to pay attention to both the IRC channel and what’s in front of me at the same time. For example, I walk into fewer things while maneuvering through the supermarket or walking around town. I can do this because the shades are organized to place the screen in the bottom of my vision (think bifocals), with the straight-ahead portion of my vision free.
Works with real time maps, too — avoids the problem of missing a street sign because you were looking down at your device. Haven’t tried it while attending a talk yet but that would be a good test.
Talking back is still hard. The iPhone is actually much worse than the previous generation of smartphones for this because it is hard to make touch gestures without viewing the screen. My Windows Mobile phone or G1 is much better here.
I may be visiting MSR Cambridge to say hi to Henry Cohn this week. If by some random chance I happen to run into you (I don’t believe we know each other), I’ll be happy to show you.
I’m lurking on you. Sounds weird, I know, coming from a 55-year-old woman. The truth is I write a blog about teens. I live in Silicon Valley and I first saw you at the MacArthur research presentation at Stanford. You are one of the few people I know who has both in-depth computer experience AND a cultural/sociological background/viewpoint about how technology is used by youth. I was glad to see you on a panel this year at Blogher talking about technology use. You bring a vital and different perspective that I need in order to balance my “parenting” cap. I started my blog last year when it became painfully obvious to me that I wasn’t keeping up with my teens’ use of technology, especially social media, and I wanted to understand the why of it. I continue to write my blog for parents to provide a bridge. Many parents are either too scared of social media to let their kids use it (they use it anyway) OR they are completely hands off and end up not keeping vital communication channels open. So, I’m glad to see you’re keeping your blog going – your viewpoint is so completely unique and different. You’ve taught me that young people process technology differently than adults, who don’t integrate as “smoothly”. I seem to back channel pretty good though!
It’s even worse for people like me who excel at multitasking. I am an exceptional musician who can play and sing complex lines at the same time. So, in social situations, I take personal offense when I am judged for not behaving like everyone else. To me, it’s as bad as being insulted for my skin color, because really what the authority figure is expressing is a complete ignorance of what kind of person I am and a reflexive dismissal of my culture. For this reason I limit interaction with others in the real world to the consumer realm. The singularity makes this the longest upcoming decade in history.
Coming into this posting very late. No excuse: I wasn’t twittering or facebooking or.
First: absolutely love your posting and ecstatic that we live in this time.
But: as an old presenter (claim thriller), I have relied on the following techniques for *overpowering* the backgrounding phenom: showing at least 3 slides per minute; making every slide cognitively demanding; making every slide background transparent; talking faster than my slides; never saying anything I’m absolutely sure of (else how to say ‘explore’?); repeatedly begging participants to read my online stuff and THEN welcoming contact from them. I don’t believe in the instant GROK that goes with the just-in-time-informed approach. That wouldn’t make me a superficial person, would it?
Lawrie Hunter, out of the loop in remote Japan
great to see that you have rewritten this piece, Danah. or maybe I just read it differently a year later.
We are adopting your approach to allow participants in our weekly seminars to twitter – I’ve written this up in our blog with a case study contrasting a couple of conferences we were involved in simultaneously http://blog.cambridgenetwork.co.uk/2010/10/citizen-journalists.html
but cut out the word “bitchslapping”. debate among peers SHOULD be fierce and free. violence against women should not be normalized by transposing the language of pimps to academic debate