Conservatives and News Feeds

Katherine Haenschen

News exposure on Facebook does not occur at random: it comes from users’ friends who share links, from paid promotions by media companies, and from posts by pages liked by users. All of these are examples of what Thorson and Wells (2015) refer to as curated flows. Facebook users sit at the nexus of multiple flows of political and nonpolitical information, which are shaped by the user through personal choice and by algorithms that determine what content to prioritize in the newsfeed (Dylko, 2015). This analysis considers the specific curated flows that result from liking public Facebook pages for national news sources and whether partisanship drives individuals’ choices of which outlets to follow.

Over the last decade, concerns have arisen that individuals are choosing to expose themselves to information that confirms their political biases (e.g., Garrett, 2009a, 2009b; Pariser, 2011; Sunstein, 2007). This phenomenon is referred to as selective exposure (Iyengar & Hahn, 2009; Stroud, 2008, 2010). Evidence that selective exposure determines the national news pages that individuals choose to like on Facebook would add to concerns that digital media are creating “filter bubbles” that in turn have negative consequences for deliberative democracy. Concerns about selective exposure have increased in this era of technological customization, due in part to digital features that enable individuals to easily choose the sources and content to which they are exposed (Dylko et al., 2017). What is yet unknown is how this manifests itself in terms of the Facebook pages for national news sources that individuals choose to like. This analysis fills that gap by pairing participants’ survey responses with their Facebook digital trace data (DTD) to study the actual pages liked by subjects, thus overcoming problems with self-reporting endemic to communication research (Haenschen, 2019). 
DTD refers to traces of user behavior left behind on platforms or websites (Freelon, 2014; Jungherr & Jürgens, 2013). DTD offers information about what people are interested in or paying attention to (Jungherr & Jürgens, 2013), and when paired with survey responses from the same participants, it can provide knowledge about the types of people who leave particular traces (Freelon, 2014). Results show that individuals like Facebook pages for national news sources in a manner consistent with partisan-selective exposure. Furthermore, partisanship predicts the aggregate bias in subjects’ national news pages and all Facebook pages liked. However, several patterns emerge that are unique to Republicans in the sample: they like a much higher share of like-minded sources than Democrats, and as Republicans like more Facebook pages for national news, their aggregate bias increases. Implications of this asymmetric filter bubble are discussed in terms of the potential spread of viral disinformation and our understanding of how people choose to expose themselves to news.

Curated Flows on Facebook

The news—or lack thereof—in users’ Facebook feeds is an example of what Thorson and Wells (2015) refer to as curated flows, in which content is selected or filtered by humans or algorithms before appearing in the newsfeed. In this era of overwhelming media abundance, curated flows help explain how individuals actively choose what content should flow inward to them and what they in turn push outward to their friends and followers. Scholars distinguish between two main forms of curation: user-driven, in which the user herself chooses content, and pre-selected or system-driven customization, in which algorithmic decision-making informed by the user’s prior behavior determines what content appears and in what order (Dylko, 2015; Zuiderveen Borgesius et al., 2016). 
Thorson and Wells (2015) extend this argument by detailing five examples of curation that result in news exposure, including friends sharing links, algorithmic prioritization, and users liking pages that post links to news articles. This analysis concerns itself with the latter—personal curation, or the “intentional customization of one’s media environment” through the act of liking news sources on Facebook (Thorson & Wells, 2015, p. 316).

Several recent studies that combine survey and DTD have explored the individual-level factors associated with exposure to different types of news and political information on Facebook. Wells and Thorson (2017) determine that subjects’ news interest and use of online information customization tools are both predictive of the number of political pages an individual likes, though news interest alone is not predictive of liking any news information pages. Individuals with high news interest who engaged in customization were more likely to like news information pages. Self-reported news consumption was also associated with liking news information pages. An analysis of users’ Facebook pages liked and the advertising categories assigned by Facebook to that user found that evidence of political interest in either measure predicted increased exposure to news and political content on the platform (Thorson, Cotter, Medeiros, & Pak, 2018). Another study found that higher political interest was associated with subjects’ improved ability to self-report whether they like political pages on Facebook (Haenschen, 2019). However, these studies engage with news and political content generally. What remains unknown are the individual-level factors that explain why individuals like particular news sources on Facebook. The literature on selective exposure—which explains patterns in media choice broadly—offers some insight. 
Selective Exposure and News Choice

The theory of selective exposure states that individuals choose news sources that align with their partisan predispositions (Iyengar & Hahn, 2009; Stroud, 2008, 2010). The phenomenon manifests in terms of individuals choosing affirmative sources, rather than necessarily avoiding those that challenge their views (Garrett, 2009a, 2009b). This phenomenon has broad implications for society and may result in a “less tolerant and more fragmented public” (Stroud, 2010, p. 571). While selective exposure has been well documented across different forms of media (e.g., Stroud, 2008), the explosion in digital news outlets has led to concerns that individuals now exist within online “echo chambers” or “filter bubbles,” only exposed to information that affirms their existing biases, often received from like-minded friend networks (e.g., Pariser, 2011; Prior, 2007; Sunstein, 2007). However, empirical research exploring selective exposure finds that while individuals express a predilection for concordant media, they are still exposed to mainstream and cross-cutting content as well (Garrett, 2009b; Zuiderveen Borgesius et al., 2016). Facebook, in particular, has raised concerns about increased selective exposure due to technological features that provide users with the ability to customize or curate their flow of information (Dylko et al., 2017). Indeed, analysis of Facebook pages from 920 news organizations shows that users tend to consume news from a limited number of pages, forming polarized clusters with little overlap among audiences (Schmidt et al., 2017). A study of browsing history found that the news users click on from social media was more segregated by ideology than news accessed by navigating directly to the source, especially for opinion pieces (Flaxman, Goel, & Rao, 2016). 
Facebook’s own data science team finds that users’ predilection toward selective exposure reduces clicks on cross-cutting content shared by friends by 6% to 17%, while the Facebook algorithm depresses it by approximately 5% to 8% (Bakshy, Messing, & Adamic, 2015), suggesting that both user- and system-driven selective exposure are at work. However, while individuals choose media links based on perceived agreement with the source, this effect disappears when social recommendations are added to the manipulation (Messing & Westwood, 2014). Given that this study explores subjects’ selection of news outlets in the form of national news pages they choose to like on Facebook, the following hypothesis is proposed:

H1: Individuals will like national news pages on Facebook in a manner consistent with their political predispositions.

However, individuals like other types of pages on Facebook than solely those for national news outlets, and some people may like more national news pages than others. Thus, the share of concordant national news pages should be considered, in terms of all national news pages and all public pages liked. Again anticipating selective exposure, a second hypothesis is proposed:

H2: Partisanship will predict the aggregate bias of (a) national news pages liked and (b) all pages liked by subjects on Facebook.

Finally, given that not all individuals in this sample or in the real world like pages for national news organizations on Facebook, it stands to reason that the number of national news pages liked by subjects may impact this analysis, though it is difficult to predict in which direction. Liking more national news pages may result in a more balanced information ecosystem, or it may be associated with subjects following an even greater number of concordant sources. 
Similarly, an individual who likes two concordant pages out of 10 total pages liked may have a very different experience of their newsfeed than someone who likes two concordant pages out of 1,000 total pages liked. Thus, a research question is posed:

RQ1: Does the number of (a) national news pages or (b) all pages liked moderate the effect of partisanship on aggregate bias?

Method

To address these hypotheses, data were collected from a paired survey and Facebook app from subjects who consented to providing both forms of information. Subjects were recruited on Amazon Mechanical Turk between December 29, 2016, and January 2, 2017; all were located in the United States and had a prior approval rating of 90% or greater. Once subjects completed a survey asking about a range of political variables, they were directed to a page where they installed an app that collected the names of all public Facebook pages they liked, as well as other information. The app was built by a developer in compliance with IRB guidelines (this study was reviewed and approved by the Princeton University IRB prior to data collection) and MTurk rules that prohibit the collection of personally identifying information. Subjects who provided both forms of data were compensated $2.01 for their time.

Participants

A total of 828 subjects completed the survey and installed the app, of whom 62.4% were female and 37.4% male. Subjects’ self-reported ages ranged from 20 to 69 (M = 35.73, SD = 10.3); 67.15% self-reported their race and ethnicity as Caucasian Non-Hispanic, with 11.96% African American, 7.72% Hispanic or Latino, 6.76% Asian, 1.57% Native American, and 4.83% multiple races or “Other.” In terms of educational achievement, self-reported percentages are as follows: high school diploma or less, 10.0%; some college, 28.02%; associate’s degree, 13.77%; bachelor’s degree, 34.66%; and graduate degree, 13.53%. 
According to the app, subjects had an average of 387.27 Facebook friends (SD = 505.94, range 0-4,969) and liked an average of 280.04 pages on Facebook (SD = 275.03). Of those 828 subjects, 12 did not have any page data; rather than assume they liked no pages, they are dropped from this analysis. Additionally, subjects who self-reported their party ID as “other” were excluded, since the third parties with which they identified span the ideological spectrum, making generalization impossible, and constitute too small a sample for subgroup analysis. Thus, the final sample size is 790 subjects.

Measurements

Partisanship was measured by survey responses to a question asking “Generally speaking, what political party do you identify with?” Subjects selected Democratic (52.15%), Independent (24.94%), or Republican (22.91%). Facebook page variables were calculated using data collected by the app that subjects consented to and installed, which recorded the names of all pages liked by each subject. A total of 141,693 unique pages were liked by subjects; due to coding capacity, only the 25,080 pages liked by two or more subjects were coded for page topic and bias.

Coding proceeded in several waves. First, MTurk workers (excluding those in the underlying study) were hired to code pages as to whether the topic of the page was a political organization, candidate, or elected official; a news source or personality; or neither. A qualification task for MTurkers was created from the most common pages; those who scored 97% or higher were able to work on the main dataset. All codes were reviewed by the researcher; coders who maintained 97% accuracy were able to continue working on the dataset. Only 2.2% of all codes were rejected by the researcher, suggesting a very high accuracy rate and a reliable coding scheme. Facebook pages for blogs, alternative media, and news personalities were also included in the “news” category. 
Combining news and blog sources echoes the method of Wells and Thorson (2017), who found tremendous overlap in subjects who liked both types of pages. Here, their distinct category for “journalist” pages is also collapsed into the news category. These steps resulted in preliminary codes of 566 news pages.

These 566 pages were then coded by student research assistants and the researcher. First, pages were coded to ascertain whether they were for a local or national news entity, not news, or unable to be coded; the latter was chosen when the page was no longer visible. Of the 566 news pages, 51 were unable to be coded, 21 were deemed not news, 197 were for local media, and 297 were national media. Intercoder reliability for this measure was calculated using ReCal OIR (Freelon, 2013), Krippendorff’s α = .756.

Next, national news pages were coded for their partisan bias, with a focus on identifying very strong partisanship. To perform the coding, assistants clicked on the page, looked at the content, and then coded the media bias as liberal, non-biased, conservative, other, or can’t tell. Coders also utilized the website allsides.com to confirm their own coding; only allsides.com ratings in the left-most and right-most categories were considered evidence of bias. Of the 297 national media pages, 115 were coded as non-biased, 104 as liberal, 54 as conservative, and 24 as other. Intercoder reliability was calculated using ReCal OIR (Freelon, 2013), Krippendorff’s α = .449; since intercoder agreement was low, the researcher coded the 49.3% of pages on which at least one student coder disagreed to determine a final measure. Based on this coding, the following measures were calculated; descriptive statistics are reported in Table 11.1.
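The reliability statistics above were computed with ReCal OIR; for readers unfamiliar with the measure, the following is a minimal from-scratch sketch of Krippendorff’s α for nominal codes, run on toy data (not the study’s actual codes):

```python
from collections import Counter
from itertools import permutations

def krippendorff_alpha_nominal(units):
    """Krippendorff's alpha for nominal codes. `units` is a list of the
    code lists assigned to each unit (page); units with fewer than two
    codes carry no reliability information and are skipped."""
    o = Counter()  # coincidence matrix, keyed by ordered code pairs
    for codes in units:
        m = len(codes)
        if m < 2:
            continue
        for a, b in permutations(range(m), 2):
            o[(codes[a], codes[b])] += 1 / (m - 1)
    n_c = Counter()  # marginal totals per code
    for (a, _), w in o.items():
        n_c[a] += w
    n = sum(n_c.values())
    d_o = sum(w for (a, b), w in o.items() if a != b)            # observed disagreement
    d_e = sum(n_c[a] * n_c[b] for a, b in permutations(n_c, 2))  # expected disagreement
    return 1 - (n - 1) * d_o / d_e

# toy example: two coders agree on four of five pages
codes = [["national", "national"], ["local", "local"],
         ["national", "local"], ["local", "local"],
         ["national", "national"]]
alpha = krippendorff_alpha_nominal(codes)  # 0.64 for this toy data
```

The single disagreement across five doubly coded units yields α = .64 here; ReCal OIR reports the same statistic (among others) for full datasets with any number of coders.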
TABLE 11.1 Descriptive Statistics, Pages Liked by Respondents and Aggregate Bias Measures
• Aggregate news page bias: the percentage of conservative national news pages liked (out of each subject’s national news pages liked) was added to the percentage of liberal pages multiplied by -1, creating a final variable with a theoretical range of -1 (entirely left bias) to 1 (entirely right bias).
• Aggregate page bias: similar to the previous variable, except percentages were calculated by dividing by each subject’s total Facebook pages liked.

Results

Before beginning the analysis, it is worth noting that within the sample, liking Facebook pages for national news outlets is somewhat rare; 48.2% of subjects chose to follow no pages for national news, and the median number of pages liked is 1 out of 170 total pages liked, as reported in Table 11.1. Turning to those subjects who do like national news pages to determine whether selective exposure motivates this behavior, the aggregate number of biased pages liked by all subjects and their party affiliation are considered; a frequency table is reported in Table 11.2. Figure 11.1 depicts the percentage of political pages liked by each party group by page bias. Unsurprisingly, Democrats like a higher share of liberal pages than Independents and Republicans; Republicans like a higher share of conservative pages than Independents and Democrats. A chi-square test of independence was performed to determine whether the distribution of biased pages liked by subjects varied by subjects’ party; results were significant, χ² (4, N = 1,913) = 631.68, p < .001. Subsequent pairwise comparisons between Democrats and Independents (χ² (2, N = 1,616) = 84.58, p < .001), Independents and Republicans (χ² (2, N = 683) = 153.83, p < .001), and Democrats and Republicans (χ² (2, N = 1,527) = 642.41, p < .001) were also significant. The results provide full support for H1: individuals like national news pages in a manner consistent with their political predispositions. These results find some notable differences between party groups, as Figure 11.1 depicts. 
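The two aggregate bias measures defined above reduce to simple arithmetic; a minimal sketch follows, using hypothetical counts rather than the study’s data:

```python
def aggregate_bias(n_conservative, n_liberal, n_total):
    """Conservative share minus liberal share of the n_total pages a
    subject likes; ranges from -1 (entirely left bias) to 1 (entirely
    right bias)."""
    if n_total == 0:
        return None  # undefined for subjects who like no pages
    return (n_conservative - n_liberal) / n_total

# hypothetical subject: 3 conservative and 1 liberal page among
# 6 national news pages liked, out of 120 total pages liked
news_bias = aggregate_bias(3, 1, 6)    # aggregate news page bias = 1/3
page_bias = aggregate_bias(3, 1, 120)  # aggregate page bias = 1/60
```

The same pattern of likes thus produces a much smaller aggregate page bias than aggregate news page bias, which is why the two denominators are analyzed separately.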
Republicans in the sample exhibited vastly different curation of national news pages: 58.25% of national news pages liked by Republicans came from conservative media, whereas only 29.6% came from non-biased sources and 12.12% from liberal sources. For Democrats and Independents, at least 40% of their pages liked are liberal sources and non-biased sources, though Democrats like a smaller percentage of conservative sources.

TABLE 11.2 Distribution of National News Pages Liked by All Subjects, By Party Identity

FIGURE 11.1 Percentage of Media Pages Liked by Page Bias and Subject Party ID

Thus, Republicans’ curated flow of content from news pages is remarkably different from that of Democrats and Independents, with the former selecting a majority of conservative outlets at the expense of non-biased or liberal sources. However, this initial finding considers all pages liked by all subjects and does not consider the potential for individual-level variation. Indeed, within the sample of 790 individuals, a plurality (381) liked no national media pages at all; other subjects liked over 30 different national media pages. A subsequent analysis explored what predicted the individual-level bias in subjects’ curation of news pages liked, using a measurement of the overall partisan lean of all national pages liked, ranging from -1 (entirely liberal) to 1 (entirely conservative). For this analysis, Independents were set as the baseline group, and the number of national news pages liked was used as a covariate and interaction term. The first series of linear regressions considered all subjects (models I, II, and III); the second series (models IV, V, and VI) considered only those subjects who liked any national news pages. Results are reported in Table 11.3. Several interesting trends emerge from this analysis. The results affirm expectations of selective exposure, in that coefficients show Republican subjects’ individual-level aggregate news page bias is to the right of center and Democrats’ to the left. Independents are slightly to the left of center (indicated by the significant intercept), and Democrats are further to the left of them in all models with significant main effects and interaction terms. This provides support for H2a, which expects that partisanship will predict the direction of aggregate news bias. 
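The regression specification described above (Independents as the baseline group, with the number of national news pages liked entered as a covariate and interaction term) can be sketched as an ordinary least squares design matrix. The data below are hypothetical toy values, not the study’s, and the study itself uses standard regression software rather than this hand-rolled computation:

```python
import numpy as np

# hypothetical subjects: party ID (Independent is the baseline),
# national news pages liked, and aggregate news page bias
party   = ["D", "D", "I", "I", "R", "R", "R", "D"]
n_pages = np.array([4.0, 2.0, 1.0, 3.0, 2.0, 5.0, 1.0, 6.0])
bias    = np.array([-0.5, -0.3, -0.1, 0.0, 0.4, 0.7, 0.2, -0.6])

dem = np.array([p == "D" for p in party], dtype=float)
rep = np.array([p == "R" for p in party], dtype=float)

# design matrix: intercept, party dummies, pages liked,
# and party x pages interaction terms
X = np.column_stack([np.ones(len(party)), dem, rep, n_pages,
                     dem * n_pages, rep * n_pages])
coef, *_ = np.linalg.lstsq(X, bias, rcond=None)
intercept, b_dem, b_rep, b_pages, b_dem_x, b_rep_x = coef
# Republicans' slope in pages liked is b_pages + b_rep_x; a positive
# b_rep_x mirrors the chapter's finding that Republicans' bias grows
# more conservative as they like more national news pages
```

In this framing, the significant interaction term reported in Table 11.3 corresponds to `b_rep_x`: the additional per-page shift in aggregate bias for Republicans relative to the Independent baseline.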
Considering first models I-III with all subjects, including those who like no pages, results show that as the number of national news pages liked by a subject increases, their cumulative bias of all pages liked moves leftward.

TABLE 11.3 Regression Results, Individual-Level Bias in Subjects’ Aggregate National News Pages Liked

† p < .10; * p < .05; ** p < .01; *** p < .001

This is likely due in part to a higher average number of pages liked by Democrats (M = 3.15; SD = 6.49) than Republicans (M = 1.78; SD = 4.44) and a marginally higher average than Independents (M = 2.10; SD = 4.66). Indeed, an omnibus ANOVA for average pages liked by party was significant (F(2, 787) = 4.63, p < .001); a post-hoc Tukey HSD test found significant differences in the averages between Democrats and Republicans at the p < .05 level and between Democrats and Independents at the p < .10 level. However, the significant interaction term between number of national news pages liked and partisanship shows that as Republicans like more pages, their aggregate news bias becomes more conservative relative to Independents and Democrats. A separate series of regressions was performed with Democrats as the baseline group to enable direct comparison between Democrats and Republicans; there is no such effect for Democrats relative to Independents.

Turning to models IV-VI, which consider only subjects who like any national news pages, the number of pages liked is no longer a significant predictor of the aggregate level of bias in news pages liked, nor does it moderate the effect of Republican identity. Results provide only partial affirmation for RQ1a. However, when considering the aggregate bias of all pages liked, the moderating effect of pages changes for Republicans. Results are reported in Table 11.4. Again, coefficients line up in the expected directions, with Democrats’ aggregate bias to the left of Independents and Republicans, and Republicans’ to the right of Independents, offering support for H2b.

TABLE 11.4 Regression Results, Individual-Level Bias in Subjects’ Aggregate Pages Liked

† p < .10; * p < .05; ** p < .01; *** p < .001

Total number of pages liked on Facebook has no association with aggregate page bias, likely because there was only a marginally significant difference in the distribution of pages liked by party group. Due to heteroskedasticity, a Kruskal-Wallis test was performed to assess differences in total pages liked by party group; it was marginally significant (H(2) = 5.36, p = .07). Independents had the highest median number of pages liked (205), followed by Democrats (172) and Republicans (140). Furthermore, the measures of aggregate bias are relatively small: while values range from -.10 on the left to .08 on the right, the median is 0 and the interquartile range is 0.001. When all pages liked are considered as a covariate and interaction term among all subjects, there is no significant interaction. However, when considering only subjects who like any national news pages, total pages liked moderates aggregate bias for Republicans: Republicans who like more total pages on Facebook have a slightly less conservative aggregate page bias. This suggests that for Republicans, liking more pages dilutes the share of concordant partisan national news in a manner that it does not for Democrats or Independents. This again provides only partial support for RQ1b.

Taken together, the results suggest Republicans have different experiences and make different choices in terms of exposure to news on Facebook than Democrats and Independents. Republicans on the whole tend to like conservative pages on Facebook rather than liberal or centrist outlets, such that as the number of media pages liked increases, Republicans are selecting additional conservative pages to like. However, among Republicans who like any national news pages, liking additional Facebook pages generally appears to dilute the effect of partisan-selective exposure, reducing the aggregate bias in pages liked.

Discussion

The theory of curated flows argues that individuals play an active role in choosing at least some of the content that appears in their newsfeeds. 
This chapter extends that argument by exploring the degree to which selective exposure motivates the way in which partisans curate the news that appears in their feed through liking Facebook pages for concordant national news sources. Overall, liking Facebook pages for news outlets is a somewhat rare behavior within the sample, which echoes prior research finding limited self-induced exposure to news on Facebook (Wells & Thorson, 2017; Flaxman et al., 2016). However, when individuals do choose to like Facebook pages for national news sources, partisanship plays a curating role in choosing outlets. Evidence of selective exposure in news outlet choice is found whether analyzing all pages liked by subjects or just national news pages, and whether among all subjects or just those who like national news pages. Notably, the results show marked differences between self-identified Republicans’ page choices and those of Independents and Democrats, which suggests an asymmetric, partisan component to filter bubbles and may explain some of the spread of viral disinformation during the 2016 election.

Republicans’ curation of national news pages differs markedly from that of Democrats and Independents. Over half of all national news pages liked by Republicans are conservative outlets, whereas less than 30% are non-biased sources. Among Democrats and Independents, approximately 40% of pages liked are from liberal and centrist news outlets alike. Additionally, as Republicans like more national news pages, their aggregate news page bias becomes more conservative. These results may be explained in part by survey research indicating that Republicans have very low trust in mainstream news and perceive most outlets to be biased (Mitchell, Gottfried, Kiley, & Matsa, 2014). Thus, it is self-identified Republicans who appear to curate their Facebook feeds to create their own partisan filter bubble, whereas Democrats’ and Independents’ curation is much more similar. 
Republicans are curating their Facebook newsfeed to encounter primarily concordant media, at the expense of non-biased “mainstream” media or liberal outlets. And while Fox News (26 subjects) was the most popular conservative news source of the 54 conservative pages in the sample, other popular pages included For America (20 subjects in the sample, 7.9 million total fans) and IJR Red (13 subjects in the sample, 8.6 million total fans), both of which are dedicated to curating content for conservative audiences that was originally reported elsewhere.

This partisan filter bubble may also explain in part how so-called fake news was able to spread during the 2016 cycle. Facebook has been credited as a major source of exposure to “fake news,” which itself tended to promote pro-Trump, anti-Clinton messages (Allcott & Gentzkow, 2017; Guess, Nyhan, & Reifler, 2018). One analysis finds that the most popular fake news stories in 2016 generated more engagement on Facebook than major mainstream stories (Silverman, 2016); another shows that self-reported Trump supporters were statistically more likely to be exposed to fake news online (Guess et al., 2018). This analysis contributes to our understanding of this phenomenon by demonstrating how Republicans tend to follow conservative news sources on Facebook at the expense of mainstream and liberal sources, potentially exposing them to more “fake news” shared by these pages. Independents and Democrats, who follow fewer conservative pages, would have been less exposed to such content circulating on conservative pages. Of course, these assumptions hinge on the notion that conservative pages are presenting either different topics or different views than the non-biased or liberal pages in the sample, a topic ripe for thorough content analysis. Yet many indications suggest that this is indeed the case. 
The Wall Street Journal’s Red Feed, Blue Feed feature demonstrates how stories on the same issue—guns, abortion, immigration—are differentially framed and discussed in Facebook posts shared by predominantly liberal or conservative users. An analysis of newspaper endorsements finds that the partisan lean of the editorial board is predictive of bias in economic reporting that caters to the politics of readers (Larcinese, Puglisi, & Snyder, 2011). Even word choice differs markedly between partisans, which is reflected in the language used by news outlets and members of Congress (Gentzkow & Shapiro, 2010). If Democrats and Independents are choosing to follow a similar mix of Facebook pages for national news sources while Republicans largely choose others, and this represents the majority of individuals’ news exposure, that would support concerns that the Internet is indeed creating a “filter bubble,” though specifically for Republicans versus Independents and Democrats (Pariser, 2011; Sunstein, 2007; Zuiderveen Borgesius et al., 2016). Moreover, as selective exposure has been shown to exacerbate polarization over time (Stroud, 2010), the choice of which national news pages to follow on Facebook could contribute to Republicans moving further to the right of Democrats and Independents over time due to the news sources to which they are exposed.

These findings also expand our theoretical understanding of how curated flows influence exposure to news information. The theory of curated flows relies on a uses and gratifications framework to explain why individuals engage in this “active, intentional customization of one’s media environment” (Thorson & Wells, 2015, p. 316). Here, subjects are choosing like-minded sources, likely because it is enjoyable. 
Generally, exposure to concordant individuals and information is considered a pleasant experience, whereas encountering counter-attitudinal information generates mental discomfort (Garrett, 2009b; Huckfeldt, Johnson, & Sprague, 2002). Thus, people are choosing to like national news pages not only because they are news consumers or politically interested (Haenschen, 2019; Wells & Thorson, 2017) but also because they enjoy the content from a partisan perspective.

The pages that were coded as part of this analysis illuminate empirically the degree to which layers of curation are taking place in users’ feeds. As Wells and Thorson (2017) argue, Facebook national news pages are themselves curated flows, in which administrators choose which articles to share. Journalists’ pages are also curated flows, sharing their work and encouraging readers to do so as well. Indeed, a prior study of the content shared by the Facebook pages for The Rachel Maddow Show and The O’Reilly Factor found that both shared content from a relatively small pool of sources with very little overlap between them (Jacobson, Myung, & Johnson, 2016). In this study, many of the most popular liberal and conservative pages, such as Addicting Info or For America, do little to no original reporting, instead repackaging the work of others or sharing links from other like-minded sources. Thus, partisans are curating their newsfeed with not only actual news channels such as Fox News and MSNBC, but also news outlets that are themselves curations of concordant flows of information. As previous studies have demonstrated, this user-generated behavior further influences the algorithmic processes that put other content in users’ feeds (Bakshy et al., 2015; Thorson et al., 2018), compounding the effect of user-generated selective exposure and likely increasing the system-driven component as well.

Limitations

This study has several limitations. 
Data were collected from a convenience sample, in keeping with other studies of this nature (e.g., Wells & Thorson, 2017); nevertheless, the sample cannot be viewed as representative. That said, the sample size of 828 is larger than prior studies that pair a survey and Facebook app and offers a greater range of age and geography. The coding scheme used in the study also presents a limitation, as it may have underestimated the degree of selective exposure. Coding was designed to capture only overt media bias; one could argue that outlets coded as “non-biased,” such as CNN or NPR, actually have a liberal bias on certain issues. Recoding outlets considered as having a liberal lean rather than outright bias would likely result in Democrats having an even higher share of aggregate concordant pages liked. Additionally, 51 pages were unable to be coded due to being unpublished or taken down by Facebook. Many of these, such as “The Kelly File” or “The Rachel Maddow Fan Page,” have an ideological slant; the ability to code and include such pages would likely have increased subjects’ share of aggregate page bias.

Conclusion

The concept of selective exposure was reintroduced a decade ago (Iyengar & Hahn, 2009; Stroud, 2008, 2010) in response to the emergence of cable news channels and political talk radio shows that provided Americans with increased choice in media outlets. Since then, the widespread adoption of the Internet and social media platforms has resulted not only in a dramatic increase in news outlets (Prior, 2007) but also in the ability to selectively expose oneself to content online (Sunstein, 2007; Pariser, 2011). Indeed, platforms such as Facebook, which provide users with technologies to further select and filter the information that reaches them, are associated with greater selective exposure (Dylko et al., 2017). 
The analysis presented here provides empirical evidence for this phenomenon by demonstrating the degree to which Facebook users engage in selective exposure when liking national news pages. The results suggest an asymmetric filter bubble, in which Republicans curate their feeds from a majority of conservative sources, whereas Democrats and Independents choose non-biased and liberal outlets. This user-driven curation in turn influences Facebook's algorithm in terms of which stories to show and in what order, and may over time lead to increased political polarization.

Discussion Questions
References

Allcott, H., & Gentzkow, M. (2017). Social media and fake news in the 2016 election. Journal of Economic Perspectives, 31(2), 211-236. doi:10.1257/jep.31.2.211

Bakshy, E., Messing, S., & Adamic, L. A. (2015). Exposure to ideologically diverse news and opinion on Facebook. Science, 348(6239), 1130-1132. doi:10.1126/science.aaa1160

Dylko, I. B. (2015). How technology encourages political selective exposure. Communication Theory, 26(4), 389-409. doi:10.1111/comt.12089

Dylko, I. B., Dolgov, I., Hoffman, W., Eckhart, N., Molina, M., & Aaziz, O. (2017). The dark side of technology: An experimental investigation of the influence of customizability technology on online political selective exposure. Computers in Human Behavior, 73, 181-190. doi:10.1016/j.chb.2017.03.031

Flaxman, S., Goel, S., & Rao, J. M. (2016). Filter bubbles, echo chambers, and online news consumption. Public Opinion Quarterly, 80(S1), 298-320. doi:10.1093/poq/nfw006

Freelon, D. (2013). ReCal OIR: Ordinal, interval, and ratio intercoder reliability as a web service. International Journal of Internet Science, 8(1), 10-16.

Freelon, D. (2014). On the interpretation of digital trace data in communication and social computing research. Journal of Broadcasting & Electronic Media, 58(1), 59-75. doi:10.1080/08838151.2013.875018

Garrett, R. K. (2009a). Echo chambers online? Politically motivated selective exposure among Internet news users. Journal of Computer-Mediated Communication, 14(2), 265-285. doi:10.1111/j.1083-6101.2009.01440.x

Garrett, R. K. (2009b). Politically motivated reinforcement seeking: Reframing the selective exposure debate. Journal of Communication, 59(4), 676-699. doi:10.1111/j.1460-2466.2009.01452.x

Gentzkow, M., & Shapiro, J. M. (2010). What drives media slant? Evidence from US daily newspapers. Econometrica, 78(1), 35-71. doi:10.3386/w12707

Guess, A., Nyhan, B., & Reifler, J. (2018). Selective exposure to misinformation: Evidence from the consumption of fake news during the 2016 US presidential campaign. Retrieved from www.dartmouth.edu/~nyhan/fake-news-2016.pdf

Haenschen, K. (2019). Self-reported versus digitally recorded: Measuring political activity on Facebook. Social Science Computer Review. doi:10.1177/0894439318813586

Huckfeldt, R., Johnson, P. E., & Sprague, J. (2002). Political environments, political dynamics, and the survival of disagreement. Journal of Politics, 64(1), 1-21. doi:10.1111/1468-2508.00115

Iyengar, S., & Hahn, K. S. (2009). Red media, blue media: Evidence of ideological selectivity in media use. Journal of Communication, 59(1), 19-39. doi:10.1111/j.1460-2466.2008.01402.x

Jacobson, S., Myung, E., & Johnson, S. L. (2016). Open media or echo chamber: The use of links in audience discussions on the Facebook pages of partisan news organizations. Information, Communication & Society, 19(7), 875-891. doi:10.1080/1369118X.2015.1064461

Jungherr, A., & Jürgens, P. (2013). Forecasting the pulse: How deviations from regular patterns in online data can identify offline phenomena. Internet Research, 23(5), 589-607. doi:10.1108/IntR-06-2012-0115

Larcinese, V., Puglisi, R., & Snyder, J. M., Jr. (2011). Partisan bias in economic news: Evidence on the agenda-setting behavior of US newspapers. Journal of Public Economics, 95(9-10), 1178-1189. doi:10.3386/w13378

Messing, S., & Westwood, S. J. (2014). Selective exposure in the age of social media: Endorsements trump partisan source affiliation when selecting news online. Communication Research, 41(8), 1042-1063. doi:10.1177/0093650212466406

Mitchell, A., Gottfried, J., Kiley, J., & Matsa, K. E. (2014, October 21). Political polarization and media habits. Retrieved from www.journalism.org/2014/10/21/political-polarization-media-habits/

Pariser, E. (2011). The filter bubble: How the new personalized web is changing what we read and how we think. New York, NY: Penguin.

Prior, M. (2007). Post-broadcast democracy: How media choice increases inequality in political involvement and polarizes elections. Cambridge: Cambridge University Press.

Schmidt, A. L., Zollo, F., Del Vicario, M., Bessi, A., Scala, A., Caldarelli, G., . . . Quattrociocchi, W. (2017). Anatomy of news consumption on Facebook. Proceedings of the National Academy of Sciences, 114(12), 3035-3039. doi:10.1073/pnas.1617052114

Silverman, C. (2016, November 16). This analysis shows how viral fake election news stories outperformed real news on Facebook. BuzzFeed News. Retrieved from www.buzzfeednews.com/article/craigsilverman/viral-fake-election-news-outperformed-real-news-on-facebook

Stroud, N. J. (2008). Media use and political predispositions: Revisiting the concept of selective exposure. Political Behavior, 30(3), 341-366. doi:10.1007/s11109-007-9050-9

Stroud, N. J. (2010). Polarization and partisan selective exposure. Journal of Communication, 60(3), 556-576. doi:10.1111/j.1460-2466.2010.01497.x

Sunstein, C. R. (2007). Republic.com 2.0. Princeton, NJ: Princeton University Press.

Thorson, K., Cotter, K., Medeiros, M., & Pak, C. (2018, August). Digital traces of political interest and exposure to political content on Facebook. Paper presented at the annual meeting of the American Political Science Association, Boston, MA.

Thorson, K., & Wells, C. (2015). Curated flows: A framework for mapping media exposure in the digital age. Communication Theory, 26(3), 309-328. doi:10.1111/comt.12087

Wells, C., & Thorson, K. (2017). Combining big data and survey techniques to model effects of political content flows in Facebook. Social Science Computer Review, 35(1), 33-52. doi:10.1177/0894439315609528

Zuiderveen Borgesius, F., Trilling, D., Moeller, J., Bodó, B., de Vreese, C. H., & Helberger, N. (2016). Should we worry about filter bubbles? Internet Policy Review, 5(1), 1-16. doi:10.14763/2016.1.401