In mid-April, journalists heard about a student poster at the American Educational Research Association’s annual meeting titled “A Description of Facebook Use and Academic Performance Among Undergraduate and Graduate Students.” The poster suggested that Facebook use might be related to lower academic achievement in college and graduate school. As the media picked this up (most likely without reading more than the abstract), a new story emerged: Facebook is the cause of poor grades in school. Unhappy with how things were panning out, Eszter Hargittai penned a blog post at Crooked Timber to critique the situation: “ZOMG! Facebook use and student grades.”
Fast forward a few weeks… Josh Pasek, eian more, and Eszter Hargittai have just published an article at First Monday on this issue: “Facebook and academic performance: Reconciling a media sensation with data.” In this article, they examine three different datasets that contradict the claims made by the AERA poster and conclude that the AERA findings could not be reproduced.
Indeed, if anything, Facebook use is more common among individuals with higher grades. We also examined how changes in academic performance in the nationally representative sample related to Facebook use and found that Facebook users were no different from non-users.
The samples used in this First Monday article include a large sample of undergraduates at a diverse undergraduate institution, a nationally representative cross-sectional sample of American youth, and a longitudinal panel of American youth. There are also scholars elsewhere who have data that contradict the AERA poster’s claims. Quoting from an email from Sam Gosling (a professor of psych at UT-Austin):
I teach a big intro psych class every year and my co-teacher and I always do a bunch of surveys, questionnaires, etc. and ask the class various questions….in 2007 we asked the class how often they check FB…the options were “never,” “less than once a week,” “once a day,” “2-5 times a day,” and “6 or more times a day”….I knew we had that so I ran a quick correlation between that variable and the overall class score….the correlation was .12, which was not statistically significant, but is in the direction of showing the people who check their FB more often got higher grades…note that was computed over only 149 of the students…I probably do have data on a larger number but that was what matched up in my hasty data merge to see what we’d find.
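For the statistically curious, Gosling’s “not statistically significant” remark is easy to verify. A minimal sketch (assuming a Pearson r of .12 over n = 149 students, as in the email) uses the standard t-test for a correlation coefficient:

```python
import math

def corr_t_stat(r, n):
    """t statistic for testing H0: rho = 0, given a Pearson correlation r over n pairs."""
    return r * math.sqrt(n - 2) / math.sqrt(1 - r * r)

t = corr_t_stat(0.12, 149)
# Approximate two-tailed critical value at alpha = .05 for df = 147.
t_crit = 1.976
print(f"t = {t:.2f}, significant: {t > t_crit}")  # t ≈ 1.47, well below the cutoff
```

Since the observed t falls short of the critical value, a positive correlation of this size in a sample this small is indeed consistent with chance, just as the email says.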
Given the way that these things typically turn out, I doubt that many journalists will be clamoring to proclaim, “We were wrong! Facebook doesn’t cause bad grades!” This is a sad reality of media sensationalism. Unfortunately for all of us, when scholars (or students) disseminate findings based on poor methodology that reinforce myths the media wants to propagate, those findings get picked up even if they are patently untrue and can be disproved through multiple alternative datasets. Even though I doubt this article will make it into mainstream media, I hope that some of you will take the time to make it clear to those around you that the media coverage of this story was patently ridiculous and unfounded. Or at least start by reading the article: “Facebook and academic performance: Reconciling a media sensation with data.”
Note: The author of the AERA poster, Aryn Karpinski, also published a commentary in First Monday this month: “A response to reconciling a media sensation with data,” where she makes it clear that her study was exploratory and that she wanted to present it at AERA to start a conversation with scholars, not to attract media en masse. She then goes on to critique the critique of her work.