Swedish teenagers’ difficulties and abilities to determine digital news credibility

Thomas Nygren 1  and Mona Guath 2
  • 1 Department of Education, Uppsala University, Sweden
  • 2 Department of Psychology, Uppsala University, Sweden

Abstract

In this study, we investigate the ability of 483 teenagers to determine the credibility of digital news. Using an online survey with a performance test, we assess to what extent teenagers are able to determine the credibility of different sources, evaluate credible and biased uses of evidence, and corroborate information. Many respondents fail to identify the credibility of false, biased and vetted news. Respondents who value the importance of credible news seem to hold a mindset that helps them determine credibility better than other respondents. In contrast, respondents who self-report being good at searching for information online and who find online information trustworthy are not very good at civic online reasoning. Our findings, which may be linked to theories of disciplinary literacy, science curiosity and overconfidence, provide a basis for further research into how to better understand and support civic online reasoning in classrooms and society.

Introduction

New media and modern journalism can, in many ways, facilitate the spread of exaggerations and lies (Del Vicario et al., 2016; Silverman, 2015). The viral spread of disinformation may foster mistrust and disconnect people (Vosoughi et al., 2018; Wardle & Derakhshan, 2017). However, digital media may also connect people across cultural and ideological borders, and digital news media are pivotal in stimulating conversations and active citizenship, not least among young citizens (e.g. Carlsson, 2018; Ekström et al., 2014; Lee et al., 2013). Today, digital news from established news media and other sources is mixed and shared on social media, making it difficult to assess credibility (Fletcher & Park, 2017). This places new demands on both readers and society (Flanagin & Metzger, 2008; McGrew et al., 2017; McGrew et al., 2018; Silverman, 2015).

Scholars argue that teaching and learning to use online news in critical and constructive ways is essential to informed and engaged citizenship (e.g. Kahne & Bowyer, 2017; Wardle & Derakhshan, 2017). A central part of active citizenship in a digital world is civic online reasoning, defined as “the ability to effectively search for, evaluate, and verify social and political information online” (McGrew et al., 2018: 1). A pivotal democratic and educational challenge is to enable citizens to distinguish between credible, biased and fake information (Carlsson, 2018; Lazer et al., 2018; Wardle & Derakhshan, 2017). The importance of using online information in critical and constructive ways is recognized both nationally and internationally (EU, 2006; OECD, 2015; Skolverket, 2017; UNESCO, 2011). Specifically, civic online reasoning permits citizens to engage with political and social topics in critical and constructive ways. It has, however, proven to be quite a challenge to implement digital tools and digital literacy in education, and we need to better understand the abilities and inabilities of teenagers to navigate news online.

The aim of the current study was to investigate Swedish teenagers’ ability to assess the credibility of online news and how this relates to personal characteristics and beliefs, which in turn will enable us to suggest educational measures to support and scaffold young people’s abilities to assess online information.

The challenges of civic online reasoning

In order to navigate online information in constructive ways, people seem to need a mix of content knowledge and digital skills. Scholars have carefully outlined different aspects of disciplinary literacy, both in theory and practice. For instance, theories of disciplinary thinking, stemming from the critical literacy of history, underpin findings of how experts and novices make sense of texts and evaluate information in many disciplines (Shanahan et al., 2011; Wineburg, 1991, 1998). This perspective on disciplinary literacy has also been valuable in studies of, for instance, pupils’ critical thinking across disciplines, the use of digital databases and media narratives in human rights education (Nygren, 2014; Nygren et al., 2018; Nygren & Johnsrud, 2018; Nygren & Vikström, 2013).

However, theories and methods that have proven useful for understanding and promoting intellectual skills within academic disciplines have only been used to a limited extent to study what students do when they encounter information online. Recent research indicates that people of all ages and levels of education, including elite students and professors, may struggle to assess trustworthiness when facing online news (McGrew et al., 2017, 2018; Wineburg & McGrew, 2017). Although historians are experts at sourcing, close-reading, and corroborating information from the past, they are not necessarily experts at sourcing and corroboration in contemporary digital environments. Professors of history are experts in historical literacy, but this seems to be a disciplinary literacy that differs from that of fact-checkers (Wineburg & McGrew, 2017).

Drawing on these recent findings, scholars underscore the importance of updated heuristics for navigating information the way professional fact-checkers do (McGrew et al., 2017, 2018; Nygren, 2018; Wineburg & McGrew, 2017). Specifically, fact-checkers note the source of information, scrutinize the evidence behind the information, and corroborate it with other online sources in updated ways. This indicates a special type of disciplinary literacy, perhaps linked to their journalistic education and experience of working with feeds of fake, biased and credible information (Wineburg & McGrew, 2017). People without this disciplinary literacy also use cues and heuristics (intellectual rules of thumb) to estimate credibility in online environments. These heuristics may be helpful in some situations, but it is not obvious that intellectual rules of thumb from other disciplines help educated people separate false and biased news from authentic information on debated topics (e.g. Wineburg & McGrew, 2017). Given that critical thinking among pupils seems to be primarily a subject-specific disciplinary literacy (Nygren et al., 2018), we need to consider the possibility that pupils’ ability to evaluate digital news is part of a disciplinary literacy that has not been specifically addressed in school.

It is evident that this is a special challenge for education within the international framework of media and information literacy (UNESCO, 2011). Research concepts such as digital literacy, digital critical literacy, media literacy and digital competence have been used to describe many aspects necessary to access, evaluate, analyse and create information online. Clearly, it is a great challenge, both in theory and practice, for citizens to navigate online information in critical and constructive ways (Hatlevik & Christophersen, 2013; Hatlevik et al., 2015; Hinrichsen & Coombs, 2014; Kahne & Bowyer, 2017; Kirschner & De Bruyckere, 2017; Kirschner & van Merriënboer, 2013; Livingstone, 2004; van Laar et al., 2017).

We perceive civic online reasoning as a limited but central part of the complex field of media and information literacy. Civic online reasoning includes knowledge, skills and attitudes found in the social sciences and journalism in a digital world (McGrew et al., 2017, 2018). A pivotal part of civic online reasoning is the ability to determine the credibility of true, false and biased news. With an abundance of more or less biased information, citizens must be able to make judgments about the credibility of that information. Citizens need to be able to identify and determine whether they trust different sources of information. Everybody needs, for instance, to be able to discern whether information is vetted by a credible journalist or comes from a source with a commercial interest (McGrew et al., 2017, 2018). Further, citizens must be able to see how texts and images based upon scientific evidence are more reliable than biased information intended to manipulate their opinions (Wineburg & McGrew, 2017). Such skills enable citizens to engage with political and social topics in critical and constructive ways. Thus, civic online reasoning is a central part of a civic literacy necessary for all citizens to better understand and participate in democratic conversations and decision-making in a digital world.

But this is a complicated matter, and we note that prior beliefs, coherence of the message, and cognitive ability may affect people’s ability to assess the credibility of information (Lewandowsky et al., 2012). This is true not least in digital environments where, for instance, professional design, functionality, and manipulative strategies may seduce users (Francke et al., 2011; Hilligoss & Rieh, 2008; Metzger et al., 2010). Moreover, research in cognitive psychology points to difficulties in processing information correctly when it is presented sequentially, as is the case on a webpage (Basu & Savani, 2017), and shows that processing is affected by the framing of the information, especially in fast decisions (Guo et al., 2017) and when the information triggers emotion (Pachur et al., 2014).

It has also been noted that attitudes to news, scientific curiosity and socio-economic factors may influence how people navigate new information, facts and digital environments (Hatlevik et al., 2015; Kahan et al., 2017; Strömbäck et al., 2013). Today, researchers find a second-level digital divide, where productive ways of using digital media seem to mirror levels of education and societal inequalities (Deursen & Dijk, 2014; Hargittai, 2001, 2010; Hatlevik & Christophersen, 2013; Hatlevik et al., 2015; Min, 2010). Hence, we need to map out the challenges and the potential for more people to navigate the digital news landscape better.

Finally, recent research indicates that pupils in Sweden may be quite good at finding and sharing credible news online. Most pupils seem to find and share news from established media, written and vetted by journalists (Nygren et al., 2018). When young people in Sweden self-rate their abilities to determine credibility online in questionnaires, most say that they are good at this (Davidsson & Thoresson, 2017), and approximately 80 per cent of the students in secondary and upper-secondary schools rate themselves as good or very good at fact-checking online (Skolverket, 2016). However, it has not previously been tested, in a Swedish context, to what extent Swedish pupils are able to determine the credibility of sources, evaluate the evidence presented, and corroborate trustworthy and biased information. To better understand how to support civic online reasoning in education, we need to delineate pupils’ abilities and inabilities to determine the credibility of digital news – credible, biased and false.

Method and design

To assess the abilities of pupils to determine the credibility of online news, researchers in education and psychology designed and piloted, in close collaboration with in-service teachers, an online survey with test items. Ideas for test items, the design of the survey, and assessment came from researchers and teachers who together scrutinized the items in relation to theories of civic online reasoning (McGrew et al., 2017). To safeguard validity and reliability, we designed items in line with previous research1 (McGrew et al., 2017, 2018), and each category of civic online reasoning had at least two test items. Test items were designed at multiple meetings and tested in two different versions, and we collected responses and comments from approximately 100 pupils in the pilot. The pilot highlighted difficulties in navigating some items, which were deleted or redesigned. After piloting and updating the questions and test items, the survey was distributed via teachers to 483 pupils, aged 16–19. Participation was voluntary, with informed consent from participants, and 35 pupils chose not to participate; thus, the response rate was 93 per cent. No traceable data were collected, in line with ethical guidelines.2 Our data come from five upper-secondary schools in three separate cities. The sample is non-random and was collected by teachers interested in signing up their classes to participate. Due to ethical considerations we cannot track data in an exact way; however, we find that the distribution of programmes to some extent echoes the national distribution of pupils in theoretical programmes in Sweden. The pupils responding to the survey studied programmes of social science (48%), aesthetics (23%), science (16%) and economics (9%). In Sweden in 2017, circa 30 per cent studied social science, 11 per cent aesthetics, 22 per cent science and 20 per cent economics (SiRiS). Our study does not include pupils in vocational training. One-third (34%) of the students stated that they speak a second language at home.3 Sixty per cent of the respondents described themselves as girls, 35 per cent as boys, 3 per cent as other identity and 2 per cent chose not to state their gender.

The survey contained questions regarding the participants’ background, education, political party sympathies, media habits, attitudes to information and some questions from previous questionnaires conducted by Skolverket (2016) and Internetstiftelsen (Davidsson & Thoresson, 2017), in combination with multiple-choice questions. The questions relating to attitudes to information asked pupils to: (1) self-rate their ability to find information online; (2) self-rate their fact-checking ability; (3) rate how reliable information on the Internet is; and (4) rate how much source critical evaluation [källkritik] they have had in school. They were also asked to rate the importance of having access to credible news. All self-reported variables were on a 5-point scale, with the exception of source critical evaluation in education, which was on a 10-point scale (see Table 1 for a more detailed account).

Table 1

Overview of questions and measured constructs in the online survey

Reported and measured construct | Question
Background variables | Gender (girl/boy/other identity); upper-secondary school programme (year & programme); language at home (foreign language/Swedish); political party sympathies (1 or more parties)4

Self-reported categories | Items in self-report
Preferred news | Preferred news format(s) (e.g. paper news, radio etc.); preferred news source(s) (e.g. local newspaper, evening paper etc.)
Self-rated abilities | Source criticism on the Internet (fact-checking ability); finding information on the Internet (search ability)
Information credibility | Importance of news credibility (credibility importance); reliability of information on the internet (info reliability)
Sourcing in education | Source critical evaluation in education (sourcing in education)

Categories of civic online reasoning | Test items
Detecting sponsored material (sourcing) | Screenshot from evening paper 1 (Aftonbladet); screenshot from evening paper 2 (Expressen); screenshot from IT journal (Techworld)
Comparing articles (corroboration) | Two articles about weight loss (weight loss); two articles about the government’s policy on racism (racism)
Scrutinizing comments and images (evidence) | Manipulated photograph of a smoking girl (smoking); article about the government’s energy goals (energy goals); false information about climate change (climate change); reader’s comment on incomes (income); manipulated photograph on daisies in Fukushima (Fukushima)

Comment: The measured constructs refer to abilities that have been identified as important for the detection of false news online. The questions are referred to in the text with the names in parentheses.

The assessed variables concern native advertisements, unknown commenters, and scientific evidence. Participants were asked to identify the source of information in screenshots from three online newspapers (Aftonbladet, Expressen and Techworld) and to distinguish information designed to influence buyers (advertisements) from information designed to inform (news).

To test their ability to corroborate information, we asked participants to compare the credibility of an article based upon current research about weight loss with an article about a surgeon interested in selling weight-loss surgery (weight loss). We also asked them to corroborate the credibility of a balanced text from public service with that of a right-wing text on the government’s new policies regarding hate crime (racism).

To test their abilities to evaluate evidence we asked them to rate the credibility of three articles: a credible news article about the government’s energy goals (credible); a reader’s subjective comment on income differences (biased); and an article from a climate change denier (false). We also asked them to evaluate whether manipulated images could be used as evidence (smoking and Fukushima).

The variables reflect three basic skills for assessing the credibility of information on the Internet: 1) sourcing – identifying where the news comes from; 2) corroboration – checking what other sources say about the news; and 3) evidence – evaluating the presented evidence (see Table 1). To assess these skills, three different tasks with multiple test items were used: 1) detecting sponsored material in newspapers; 2) comparing articles with credible and biased information; and 3) scrutinizing credible and biased comments and authentic and manipulated images.

Results

Self-rated abilities and perceptions of information credibility

Pupils self-reported that they were quite competent at finding and evaluating online information: 68 per cent rated their fact-checking ability as good (55%) or very good (13%) and 79 per cent rated their search ability as good (57%) or very good (22%). The self-rated abilities were noted on a scale from 1–5: fact-checking ability (M = 3.97, SD = 0.748) and search ability (M = 3.97, SD = 0.748).

On average, most pupils rated it important to have access to credible news (M = 3.98, SD = 1.09) and found information on the Internet not to be very trustworthy (M = 2.88, SD = 0.758). Most students reported that they have had quite a lot of practice in school in critically scrutinizing sources, M = 7.25 of 10, SD = 2.03.

Performance and self-reported abilities

To investigate performance, we first computed the total number of correct answers for all question items that could be classified as correct or incorrect, where the questions with embedded advertisements (see Table 1) were coded as correct only if all answers were correct. The mean number of correct answers, from a maximum of 8, was M = 3.47 (SD = 1.77) for all participants. The items that the pupils found most difficult were Aftonbladet (12% correct), Expressen (21% correct), Techworld (33% correct), racism (43% correct) and weight loss (53% correct).
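As an illustration of this scoring rule, the following minimal Python sketch computes a pupil’s total score under the assumptions described above; the item names and data layout are hypothetical and do not reflect the authors’ actual coding scheme.

```python
import pandas as pd

# Assumed item names: the three ad-detection screenshots are stored as lists of
# sub-answers and count as correct only if every sub-answer is correct; the other
# five items are single booleans. Maximum possible score: 8.
AD_ITEMS = ["aftonbladet", "expressen", "techworld"]
SINGLE_ITEMS = ["weight_loss", "racism", "smoking", "income", "fukushima"]

def total_correct(row: pd.Series) -> int:
    score = sum(all(row[item]) for item in AD_ITEMS)        # composite items
    score += sum(bool(row[item]) for item in SINGLE_ITEMS)  # single items
    return score

example = pd.Series({
    "aftonbladet": [True, False], "expressen": [True, True], "techworld": [True, True],
    "weight_loss": True, "racism": False, "smoking": True, "income": True, "fukushima": True,
})
print(total_correct(example))  # 6
```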

While most pupils rated themselves as quite skilled at finding and scrutinizing online information, most of them (88%) could not separate news from advertisements in Sweden’s most-read newspaper. They performed better at debunking manipulated images: 75 per cent and 81 per cent identified the two manipulated images as poor evidence. Still, a majority struggled to separate balanced from biased information.

Analysis of results

In order to relate performance to background variables and self-rated abilities, we conducted a number of regressions. For detecting sponsored material (e.g. Aftonbladet, Techworld and Expressen), we performed Poisson regressions, with the number of correct answers as dependent variable, and programme, sourcing in school, credibility importance, internet info reliability, fact-checking ability, language spoken at home, party score, search ability and gender as predictor variables.5 The coefficients describe the expected log count of correct items: a positive coefficient indicates that the predictor increases the expected number of correct items, and, inversely, a negative coefficient indicates that the predictor decreases it. For detailed information on regression coefficients and model fit we refer to Appendix A, Table A1.
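For readers who want to reproduce this type of analysis, the sketch below fits a comparable Poisson model with statsmodels on synthetic stand-in data; all column names, scales and values are assumptions for illustration, not the study’s data.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

# Synthetic stand-in for the survey data; column names and scales are assumptions.
rng = np.random.default_rng(0)
n = 200
df = pd.DataFrame({
    "n_correct": rng.integers(0, 4, n),               # correct sub-answers on a screenshot task
    "programme": rng.choice(["SS", "AS", "NA", "EC"], n),
    "sourcing_in_school": rng.integers(1, 11, n),      # 1-10 scale
    "credibility_importance": rng.integers(1, 6, n),   # 1-5 scales below
    "info_reliability": rng.integers(1, 6, n),
    "fact_checking": rng.integers(1, 6, n),
    "search_ability": rng.integers(1, 6, n),
    "language_home": rng.choice(["Swedish", "other"], n),
    "party_score": rng.uniform(0, 1, n),
    "gender": rng.choice(["girl", "boy", "other"], n),
})

# Poisson regression: coefficients are expected log counts of correct answers.
poisson_fit = smf.poisson(
    "n_correct ~ programme + sourcing_in_school + credibility_importance"
    " + info_reliability + fact_checking + language_home + party_score"
    " + search_ability + gender",
    data=df,
).fit()
print(poisson_fit.summary())
```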

For racism, weight loss, income, smoking, and Fukushima we conducted logistic regressions, with correct/incorrect as dependent variable and the same predictor variables as in the Poisson regressions. The coefficients describe the log odds of a correct answer: a positive coefficient indicates that the predictor variable increases the probability of a correct answer, and, inversely, a negative coefficient indicates that the predictor variable decreases it. For detailed information on the regression coefficients and model fit we refer to Appendix A, Tables A6–A9. Finally, to analyse media habits, we conducted Poisson regressions with the incidence for each participant of a specific media source or media format as dependent variable, and form, sourcing in school, search ability, credibility importance, and party score as predictor variables. The results describe the probability of indicating a specific media source/format, where a positive coefficient indicates that the predictor variable increases the probability of indicating a certain source/format, whereas a negative coefficient indicates that the predictor variable decreases it. For detailed information on the regression coefficients and model fit we refer to Appendix A, Tables A2–A5.
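A corresponding logistic regression can be sketched in the same way; again, the outcome and predictor names below (e.g. income_correct) are hypothetical placeholders rather than the study’s variables.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

# Synthetic stand-in data; 'income_correct' = 1 if the reader's comment was identified as poor evidence.
rng = np.random.default_rng(1)
n = 200
df = pd.DataFrame({
    "income_correct": rng.integers(0, 2, n),
    "sourcing_in_school": rng.integers(1, 11, n),
    "credibility_importance": rng.integers(1, 6, n),
    "info_reliability": rng.integers(1, 6, n),
    "fact_checking": rng.integers(1, 6, n),
    "language_home": rng.choice(["Swedish", "other"], n),
    "party_score": rng.uniform(0, 1, n),
})

# Logistic regression: positive coefficients raise the log odds (and thus the probability)
# of a correct answer, negative coefficients lower them.
logit_fit = smf.logit(
    "income_correct ~ sourcing_in_school + credibility_importance"
    " + info_reliability + fact_checking + language_home + party_score",
    data=df,
).fit()
print(logit_fit.summary())
```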

Detecting sponsored materials in relation to background and self-rated variables

There were no significant associations between self-rated abilities or background variables and the number of correct items for detecting sponsored material or news articles for either Aftonbladet or Techworld. For Expressen, however, speaking Swedish at home was associated with a significantly higher number of correct answers.

Comparing articles in relation to background and self-rated variables

Less than half of the pupils (44%) identified a text from Swedish public radio as more credible when asked to corroborate it against a right-wing populist text from Fria Tider. The texts, with source information deleted, described the government’s new policies regarding hate crime. The other pupils rated the texts as equally credible (35%) or rated the right-wing text as more credible (22%). Thus, identifying the bias in the text on racism was hard for many pupils.

For the racism item, the probability of a correct answer was significantly higher for pupils in the aesthetics programme compared with the social science programme and for girls compared with pupils indicating other identity, and marginally significantly higher for each unit increase in sourcing in school. Even though this is a politically charged topic, we did not find a link to the pupils’ self-rated political orientation.

Many pupils believed in the authority of a surgeon selling surgeries when asked to compare an article in which a surgeon states that the only option for weight loss is surgery with an article reporting the latest research on weight loss, diets and exercise. Forty-seven per cent of the pupils found the interview with the surgeon to be more credible than the article reporting research findings. Due to large residual variance, we could, unfortunately, not specify the regression model for weight loss.

Scrutinizing comments and images in relation to background and self-rated variables

The overall high self-reported abilities in finding and evaluating online information can be contrasted with the mean credibility ratings for credible, biased, and false information on a scale from 1–10. The false information on climate change was given a mean credibility rating of 6.02 (SD = 2.25), while a balanced comment from Greenpeace on the government’s energy goals was rated 4.59 (SD = 2.13) and a reader’s comment on income distribution was rated 3.99 (SD = 1.85). Hence, the false news article denying climate change was rated as the most credible article, by pupils rating themselves as good or very good at searching for and evaluating information.

A correct identification of a reader’s comment on income distribution (income) as poor evidence was associated with a higher rating on credibility importance (illustrated in Figure 1), whereas a higher rating on internet info reliability was associated with a decreased probability of a correct answer.

Figure 1

Predicted probabilities of a correct answer on the comment on incomes as a function of internet info reliability and credibility importance

Comment: The predicted probabilities of a correct answer on the comment on incomes as a function of internet info reliability (x-axis) and credibility importance (legend). The higher the ratings on credibility importance, the higher the probability of a correct answer, whereas a higher rating on internet info reliability was associated with a lower probability of a correct answer.
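Predicted probabilities of the kind plotted in Figure 1 can be obtained from such a fitted logistic model by evaluating it over a grid of the two predictors of interest while holding the others fixed; the snippet below continues the hypothetical logit sketch from the analysis section (same assumed variable names and fitted model).

```python
import pandas as pd

# Grid over info_reliability (x-axis) and credibility_importance (legend), with the
# remaining predictors held at illustrative typical values.
grid = pd.DataFrame(
    [
        {
            "info_reliability": reliability,
            "credibility_importance": importance,
            "fact_checking": 4,
            "sourcing_in_school": 7,
            "party_score": 0.5,
            "language_home": "Swedish",
        }
        for reliability in range(1, 6)
        for importance in (1, 3, 5)
    ]
)
grid["p_correct"] = logit_fit.predict(grid)  # predicted probability of a correct answer
print(grid.pivot(index="info_reliability", columns="credibility_importance", values="p_correct"))
```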


As mentioned above, many pupils were able to debunk manipulated images. In the case of identifying the manipulated image of a smoking girl as poor evidence, a higher rating of fact-checking ability was associated with a lower probability of a correct answer and a higher rating on credibility importance was associated with a marginally significant increased probability of a correct answer. For Fukushima, being in the aesthetics programme compared with the social science programme was associated with a higher probability of answering correctly.

To sum up, a higher rating on credibility importance was associated with an increased probability of correct answers. The inverse is true for self-reported fact-checking ability and internet info reliability – here, a higher rating was associated with a decreased probability of a correct answer. Studying in the aesthetics programme and speaking Swedish at home were also associated with correct answers.

News habits, background variables and self-rated abilities

Key components of high performance are thus high ratings on credibility importance and low ratings on internet info reliability and fact-checking ability. To shed some light on this pattern, we investigated the youths’ media habits. As previously described, we regressed the incidence of each media source (e.g. local newspaper, public service radio, etc.) and media format (e.g. computer, radio, etc.)6 on background variables and self-rated abilities (see Table 1 for a complete list). For the media formats, a unit increase in fact-checking ability was associated with a decreased probability of indicating the radio format, whereas a unit increase in credibility importance was associated with an increased probability of indicating the radio format. For news sources, a unit increase in credibility importance marginally increased the probability of indicating public service radio, whereas a unit increase in fact-checking ability was associated with a decreased probability of indicating public service radio. For public service TV, a unit increase in credibility importance and speaking Swedish at home were associated with an increased probability of indicating this medium. Finally, speaking a foreign language at home was associated with an increased probability of indicating international news as a news source.

In sum, credibility importance and fact-checking ability are strong predictors for preferred media format and source, the same variables that are associated with performance on the test items. The data did not permit a direct analysis of the association between media format/source and performance on the tested abilities. Hence, we cannot say anything about how credibility importance, fact-checking ability and media format/source are linked to performance; further investigation is needed to uncover how these variables are related to each other.

Concluding discussion

In this paper we present three findings from our survey of pupils’ abilities and problems in assessing the credibility of online news: first, performance on all items was relatively poor – most pupils got less than half of the items correct; second, the credibility ratings of three articles (one credible, one false news and one reader’s comment) were relatively high and did not differ much; and, last but not least, the regressions showed that self-rated importance of news credibility and being in the aesthetics programme were associated with better performance, whereas higher self-reported ratings of fact-checking ability and of the reliability of news on the internet were associated with worse performance. Our findings are in line with previous research on civic online reasoning (McGrew et al., 2017, 2018), and we add new dimensions of mindsets and background variables to this previous research, highlighting the complexity of evaluating digital news.

Although pupils in Sweden see themselves as good or great at searching for and evaluating online information (Davidsson & Thoresson, 2017; Skolverket, 2016), most pupils in our performance test (88%) struggled to separate news from ads in a common digital newspaper. Importantly, claiming to be good at finding information online was associated with poorer performance. Our findings indicate that self-reported surveys regarding media and information skills do not provide us with a good understanding of people’s actual abilities. Self-reported fact-checking and search skills are inversely associated with performance.

Further, our findings show that pupils, like other people, have a hard time determining the trustworthiness of credible, biased and false information online. Pupils may be good at finding and sharing credible news (Nygren et al., 2019) but they are not very skilled at determining credibility in critical and constructive ways. Teaching students to find and use credible sources may be easier than teaching them to scrutinize digital information. The fact that pupils are not very good at determining credibility highlights the importance of access to credible news.

Our findings indicate the existence of subgroups, which in turn implies that there is a digital divide between young people who are skilled at determining credibility and those who are not. In the group with great problems determining credibility, we find pupils who see themselves as skilled at searching for and fact-checking online information and who do not rate access to credible news as important. A speculative account of the low-performing pupils’ attitudes is that they reflect a mindset of overconfidence and ignorance, enhancing confirmation bias. The fact that they do not value access to credible news may indicate that these pupils are “news-avoiders” who do not read or seek credible news, and therefore cannot differentiate between true, biased, and fake information. This reasoning is supported by research showing that lack of knowledge in a domain results in overconfidence in one’s own ability and an incapacity to judge others’ performance. A vast amount of research in the field of overconfidence shows an overestimation of performance among incompetent performers and underconfidence among high performers, called the Dunning-Kruger effect (e.g. Kruger & Dunning, 1999; Dunning et al., 2003; Dunning et al., 2004; Kruger & Dunning, 2002; Ehrlinger et al., 2008). The effect has been replicated in different areas, from logical reasoning and grammar competence to ratings of humour and knowledge of new financial markets. The overconfident ratings among low performers have been attributed to their lack of metacognitive skills; that is, since they have no competence in the area, they are unable to gauge their own as well as others’ performance on the task.

In contrast, pupils doing well on our test seem to have a mindset of scientific curiosity and an openness to considering biases, including their own (Kahan et al., 2017). The fact that they find it important to consume credible news from trustworthy sources may indicate that pupils in this group are aware that there is a lot to learn from news and from people with better knowledge of different topics. Further, our findings indicate that there is, possibly, a gap between news-seekers and news-avoiders (Strömbäck et al., 2013). If this is the case, there might be a link between news habits and skills at determining credibility, which may be associated with socio-cultural patterns separating young people in society (Lindell & Hovden, 2018). At present, we do not know in any detail what characterizes individuals who rate credibility importance highly. Nor do we know what characteristics are embedded in the problematic high ratings of fact-checking ability and reliability of information on the internet. In any case, education can play a vital role in bridging the gaps. The fact that pupils from the aesthetics programme are better than other pupils at debunking the racist narrative of Fria Tider and at classifying the manipulated image of flowers in Fukushima as unreliable highlights how a special focus in education may support various aspects of civic online reasoning. The finding also raises questions regarding how aesthetic ability relates to civic online reasoning.

In sum, our findings indicate that teaching and learning source critical evaluation is important to support a critical and constructive treatment of digital news. Civic online reasoning may be linked to appreciating the importance of reliable information and understanding the difficulties associated with finding and evaluating online information. We understand this as a call to make a more detailed investigation of how education may support this reflective, humble and curious approach to news, as well as how to best support students to become active citizens in a digital world.

Acknowledgements

This study was funded by Vinnova. We are very grateful for all the support from teachers and students making this study possible. We would also like to direct a special thanks to Stanford History and Education Group and very special thanks to Jenny Folkeryd, Ebba Elwin, Kerstin Ekholm and Maria Lindberg for valuable input in the process.

References

  • Basu, S. & Savani, K. (2017). Choosing one at a time? Presenting options simultaneously helps people make more optimal decisions than presenting options sequentially. Organizational Behavior and Human Decision Processes, 139: 76–91.

  • Carlsson, U. (2018). Medie-och informationskunnighet (MIK) i den digitala tidsåldern: En demokratifråga-Kartläggning, analys, reflektioner [Media information competence (MIK) in the digital age: A question of democracy - survey, analysis, and reflection]. Gothenburg: Nordicom.

  • Davidsson, P. & Thoresson, A. (2017). Svenskarna och internet 2017: Undersökning om svenskarnas internetvanor [The Swedes and Internet 2017: Survey on Swedes’ Internet habits]. IIS, Internetstiftelsen i Sverige.

  • Del Vicario, M., Bessi, A., Zollo, F., Petroni, F., Scala, A., Caldarelli, G., . . . Quattrociocchi, W. (2016). The spreading of misinformation online. Proceedings of the National Academy of Sciences of the USA, 113(3): 554–559. doi:http://doi.org/10.1073/pnas.1517441113

  • Deursen, A. J. v. & Dijk, J. A. v. (2014). The digital divide shifts to differences in usage. New Media & Society, 16(3): 507–526. doi:http://doi.org/10.1177/1461444813487959

  • Dunning, D., Heath, D. & Suls, J. M. (2004). Flawed self-assessment: Implications for health, education, and the workplace. Psychological Science in the Public Interest, 5(3): 69–106.

  • Dunning, D., Johnson, K., Ehrlinger, J. & Kruger, J. (2003). Why people fail to recognize their own competence. Current Directions in Psychological Science, 12(3): 83–87.

  • Ekström, M., Olsson, T. & Shehata, A. (2014). Spaces for public orientation? Longitudinal effects of Internet use in adolescence. Information, Communication & Society, 17(2): 168–183.

  • Ehrlinger, J., Johnson, K., Banner, M., Dunning, D. & Kruger, J. (2008). Why the unskilled are unaware: Further explorations of (absent) self-insight among the incompetent. Organisational Behaviour and Human Decision Processes, 105(1): 98–121.

  • EU (2006). Recommendation of the European Parliament and of the Council of 18 December 2006 on key competences for lifelong learning. Brussels: Official Journal of the European Union, 30(12).

  • Flanagin, A. J. & Metzger, M. J. (2008). Digital media and youth: Unparalleled opportunity and unprecedented responsibility. Digital Media, Youth, and Credibility, 5–27. doi:http://doi.org/10.1162/dmal.9780262562324.005

  • Fletcher, R. & Park, S. (2017). The impact of trust in the news media on online news consumption and participation. Digital Journalism, 5(10): 1281–1299. doi:http://doi.org/10.1080/21670811.2017.1279979

  • Francke, H., Sundin, O. & Limberg, L. (2011). Debating credibility: The shaping of information literacies in upper secondary school. Journal of Documentation, 67(4): 675–694.

  • Guo, L., Trueblood, J. S. & Diederich, A. (2017). Thinking fast increases framing effects in risky decision making. Psychological Science, 28(4): 530–543.

  • Hargittai, E. (2001). Second-level digital divide: Mapping differences in people’s online skills. arXiv preprint cs/0109068

  • Hargittai, E. (2010). Digital na(t)ives? Variation in Internet skills and uses among members of the “net generation”. Sociological Inquiry, 80(1): 92–113.

  • Hatlevik, O. E. & Christophersen, K.-A. (2013). Digital competence at the beginning of upper secondary school: Identifying factors explaining digital inclusion. Computers & Education, 63: 240–247. doi:http://doi.org/10.1016/j.compedu.2012.11.015

  • Hatlevik, O. E., Guđmundsdóttir, G. B. & Loi, M. (2015). Digital diversity among upper secondary students: A multilevel analysis of the relationship between cultural capital, self-efficacy, strategic use of information and digital competence. Computers & Education, 81: 345–353. doi:http://doi.org/10.1016/j.compedu.2014.10.019

  • Hilligoss, B. & Rieh, S. Y. (2008). Developing a unifying framework of credibility assessment: Construct, heuristics, and interaction in context. Information Processing & Management, 44(4): 1467–1484.

  • Hinrichsen, J. & Coombs, A. (2014). The five resources of critical digital literacy: A framework for curriculum integration. Research in Learning Technology, 21(21334) doi:http://doi.org/10.3402/rlt.v21.21334

  • Kahan, D. M., Landrum, A., Carpenter, K., Helft, L. & Hall Jamieson, K. (2017). Science curiosity and political information processing. Political Psychology, 38(S1): 179–199. doi:http://doi.org/10.1111/pops.12396

  • Kahne, J. & Bowyer, B. (2017). Educating for democracy in a partisan age: Confronting the challenges of motivated reasoning and misinformation. American Educational Research Journal, 54(1): 3–34.

  • Kirschner, P. A. & De Bruyckere, P. (2017). The myths of the digital native and the multitasker. Teaching and Teacher Education, 67: 135–142. doi:https://doi.org/10.1016/j.tate.2017.06.001

  • Kirschner, P. A. & van Merriënboer, J. J. (2013). Do learners really know best? Urban legends in education. Educational Psychologist, 48(3): 169–183.

  • Kruger, J. & Dunning, D. (1999). Unskilled and unaware of it: How difficulties in recognizing one’s own incompetence lead to inflated self-assessments. Journal of Personality and Social Psychology, 77(6): 1121.

  • Kruger, J. & Dunning, D. (2002). Unskilled and unaware – but why? A reply to Krueger and Mueller. Journal of Personality and Social Psychology, 82: 189–192

  • Lazer, D. M., Baum, M. A., Benkler, Y., Berinsky, A. J., Greenhill, K. M., Menczer, F., . . . Rothschild, D. (2018). The science of fake news. Science, 359(6380): 1094–1096.

  • Lee, N.-J., Shah, D. V. & McLeod, J. M. (2013). Processes of political socialization: A communication mediation approach to youth civic engagement. Communication Research, 40(5): 669–697.

  • Lewandowsky, S., Ecker, U. K., Seifert, C. M., Schwarz, N. & Cook, J. (2012). Misinformation and its correction: Continued influence and successful debiasing. Psychological Science in the Public Interest, 13(3): 106–131.

  • Lindell, J. & Hovden, J. F. (2018). Distinctions in the media welfare state: Audience fragmentation in post-egalitarian Sweden. Media, Culture & Society, 40(5): 639–655.

  • Livingstone, S. (2004). Media literacy and the challenge of new information and communication technologies. The Communication Review, 7(1): 3–14. doi:http://doi.org/10.1080/10714420490280152

  • McGrew, S., Breakstone, J., Ortega, T., Smith, M. & Wineburg, S. (2018). Can students evaluate online sources? Learning from assessments of civic online reasoning. Theory & Research in Social Education, 46(2): 165–193.

  • McGrew, S., Ortega, T., Breakstone, J. & Wineburg, S. (2017). The challenge that’s bigger than fake news: Civic reasoning in a social media environment. American Educator, 41(3): 4.

  • Metzger, M. J., Flanagin, A. J. & Medders, R. B. (2010). Social and heuristic approaches to credibility evaluation online. Journal of Communication, 60(3): 413–439.

  • Min, S-J. (2010). From the digital divide to the democratic divide: Internet skills, political interest, and the second-level digital divide in political Internet use. Journal of Information Technology & Politics, 7(1): 22–35. doi:http://doi.org/10.1080/19331680903109402

  • Nygren, T. (2014). Students writing history using traditional and digital archives. Human IT: Journal for Information Technology Studies as a Human Science, 12(3): 78–116.

  • Nygren, T. (2018). Digital källkritik i nyhetsflöden och undervisning [Digital sourcing in news feeds and teaching]. Paper presented at the Conference Medie-och informationskunnighet (MIK) i den digitala tidsåldern.

  • Nygren, T., Brounéus, F. & Svensson, G. (2019). Diversity and credibility in young people’s news feeds: A foundation for teaching and learning citizenship in a digital era. Journal of Social Science Education, forthcoming.

  • Nygren, T., Haglund, J., Samuelsson, C. R., Af Geijerstam, Å. & Prytz, J. (2018). Critical thinking in national tests across four subjects in Swedish compulsory school. Education Inquiry, 1–20. doi:https://doi.org/10.1080/20004508.2018.1475200

  • Nygren, T. & Johnsrud, B. (2018). What would Martin Luther King Jr. say? Teaching the historical and practical past to promote human rights in education. Journal of Human Rights Practice, 10(2): 287–306.

  • Nygren, T. & Vikström, L. (2013). Treading old paths in new ways: Upper secondary students using a digital tool of the professional historian. Education Sciences, 3(1): 50–73.

  • OECD (2015). Assessing progression in creative and critical thinking skills in education.

  • Pachur, T., Hertwig, R. & Wolkewitz, R. (2014). The affect gap in risky choice: Affect-rich outcomes attenuate attention to probability information. Decision, 1(1): 64.

  • Shanahan, C., Shanahan, T. & Misischia, C. (2011). Analysis of expert readers in three disciplines. Journal of Literacy Research, 43(4): 393–429. doi:http://doi.org/10.1177/1086296x11424071

  • Silverman, C. (2015). Lies, damn lies and viral content: How news websites spread (and debunk) online rumors, unverified claims, and misinformation. Tow Center for Digital Journalism, 168

  • SiRiS, National Statistics from Swedish National Agency for Education, https://www.skolverket.se/skolut-veckling/statistik/sok-statistik-om-forskola-skola-och-vuxenutbildning

  • Skolverket (2016). IT-användning och IT-kompetens i skolan. Skolverkets IT-uppföljning 2015 [IT use and IT skills in the school. The Swedish National Agency for Education’s IT follow up 2015]. Dnr: 2015:00067. Stockholm: Skolverket. Retrieved from https://www.skolverket.se/sitevision/proxy/publikationer/svid12_5dfee44715d35a5cdfa2899/55935574/wtpub/ws/skolbok/wpubext/trycksak/Blob/pdf3617.pdf?k=3617. [accessed 2019, January 12].

  • Skolverket. (2017). Få syn på digitaliseringen på gymnasial nivå/grundskolenivå [Perceiving digitalisation at a secondary/primary school level]. Wolters Kluwers.

  • Strömbäck, J., Djerf-Pierre, M. & Shehata, A. (2013). The dynamics of political interest and news media consumption: A longitudinal perspective. International Journal of Public Opinion Research, 25(4): 414–435.

  • UNESCO (2011). Media and information literacy: Curriculum for teachers. Paris, France: United Nations Educational, Scientific and Cultural Organization. UNESCO. Retrieved from https://unesdoc.unesco.org/ark:/48223/pf0000192971.[accessed 2019, January 9].

  • van Laar, E., van Deursen, A. J., van Dijk, J. A. & de Haan, J. (2017). The relation between 21st-century skills and digital skills: A systematic literature review. Computers in Human Behavior, 72: 577–588.

  • Vosoughi, S., Roy, D. & Aral, S. (2018). The spread of true and false news online. Science, 359(6380): 1146–1151.

  • Wardle, C. & Derakhshan, H. (2017). Information disorder: Toward an interdisciplinary framework for research and policymaking. Council of Europe report, DGI 2017, 9.

  • Wineburg, S. (1991). On the reading of historical texts: Notes on the breach between school and academy. American Educational Research Journal, 28(3): 495–519.

  • Wineburg, S. (1998). Reading Abraham Lincoln: An expert/expert study in the interpretation of historical texts. Cognitive Science, 22(3): 319–346.

  • Wineburg, S. & McGrew, S. (2017, October 6). Lateral reading: Reading less and learning more when evaluating digital information. Stanford history education group working paper no. 2017-A1. Retrieved from doi:http://dx.doi.org/10.2139/ssrn.3048994

Appendix A
Table A1

Estimates of an additive Poisson regression model for the number of correct answers on Expressen, with coefficients denoting the expected log count for a unit increase of the ordinal variables and the expected log count for each category compared with the baseline category for the categorical variables. Fact-checking ability was omitted due to perfect correlation with search ability.

Parameter | Estimate (expected log count) | SE (standard error) | Z-ratio
Intercept | 2.221 | 0.509 | 4.363***
Form: Secondary 1st | Baseline
Form: Secondary 2nd | −0.622 | 0.404 | −1.537
Form: Secondary 3rd | −0.543 | 0.358 | −1.514
Program: SS | Baseline
Program: Other | −0.293 | 0.336 | −0.872
Program: EC | −0.503 | 0.400 | −1.256
Program: AS | −0.111 | 0.163 | −0.677
Program: NA | −0.513 | 0.394 | −1.300
Sourcing in school | 0.0186 | 0.0242 | 0.770
Credibility importance | −0.00198 | 0.0469 | −0.042
Internet info reliability | −0.0448 | 0.0590 | −0.759
Search ability | −0.0682 | 0.0612 | −1.116
Language home: Yes | Baseline
Language home: No | 0.223 | 0.0937 | 2.378*
Party score | 0.00372 | 0.164 | 0.023

SS = social sciences, Other = all other programmes, EC = economics, AS = aesthetics, NA = natural sciences.

Model fit: Residual variance: 77.568 on 111 degrees of freedom. AIC: 516.01.

Significance codes: (*) .1, * <.05, ** <.01, *** <.001

Table A2

Estimates of an additive Poisson regression model for incidences of public service radio, with coefficients denoting the expected log count for a unit increase of the ordinal variables and the expected log count for each category compared with the baseline category for the categorical variables. Search ability was omitted due to perfect correlation with fact-checking ability.

Parameter | Estimate (expected log count) | SE (standard error) | Z-ratio
Intercept | −2.162 | 1.780 | −1.215
Form: Secondary 1st | Baseline
Form: Secondary 2nd | −0.927 | 0.794 | −1.169
Form: Secondary 3rd | −0.267 | 0.485 | −0.552
Sourcing in school | 0.163 | 0.137 | 1.193
Credibility importance | 0.452 | 0.274 | 1.652 (*)
Internet info reliability | −0.0670 | 0.283 | −0.225
Fact-checking ability | −0.565 | 0.283 | −1.993*
Language home: Yes | Baseline
Language home: No | 0.554 | 0.529 | 1.049
Party score | 0.133 | 0.892 | 0.149
Gender: Other identity | Baseline
Gender: Girl | −0.287 | 0.678 | −0.423
Gender: Boy | −0.899 | 0.772 | −1.164

Model fit: Residual variance: 66.952 on 113 degrees of freedom. AIC: 138.95.

Significance codes: (*) .0986, * <.05, ** <.01, *** <.001

Table A3

Estimates of an additive Poisson regression model for incidences of public service TV, with coefficients denoting the expected log count for a unit increase of the ordinal variables and the expected log count for each category compared with the baseline category for the categorical variables. Search ability was omitted due to perfect correlation with fact-checking ability.

Parameter | Estimate (expected log count) | SE (standard error) | Z-ratio
Intercept | −3.238 | 1.268 | −2.553*
Form: Secondary 1st | Baseline
Form: Secondary 2nd | −0.243 | 0.445 | −0.584
Form: Secondary 3rd | −0.106 | 0.324 | −0.744
Sourcing in school | −0.0615 | 0.0841 | −0.731
Credibility importance | 0.505 | 0.0841 | 2.595**
Internet info reliability | −0.281 | 0.204 | −1.435
Fact-checking ability | 0.198 | 0.221 | 0.897
Language home: Yes | Baseline
Language home: No | 0.728 | 0.374 | 1.949 (*)
Party score | 0.268 | 0.506 | 0.531

Model fit: Residual variance: 74.078 on 115 degrees of freedom. AIC: 194.08.

Significance codes: (*) .0513, * <.05, ** <.01, *** <.001

Table A4

Estimates of an additive Poisson regression model for incidences of international news, with coefficients denoting the expected log count for a unit increase of the ordinal variables and the expected log count for each category compared with the baseline category for the categorical variables. Search ability was omitted due to perfect correlation with fact-checking ability.

Parameter | Estimate (expected log count) | SE (standard error) | Z-ratio
Intercept | −3.488 | 1.914 | −1.822
Form: Secondary 1st | Baseline
Form: Secondary 2nd | −0.110 | 0.611 | −0.181
Form: Secondary 3rd | 0.488 | 0.540 | 0.904
Form: Primary 9th | −1.138 | 1.0761 | −1.058
Sourcing in school | −0.135 | 0.115 | −1.172
Credibility importance | 0.406 | 0.266 | 1.528
Internet info reliability | 0.472 | 0.301 | 1.567
Fact-checking ability | 0.108 | 0.294 | 0.367
Language home: Yes | Baseline
Language home: No | −0.936 | 0.443 | −2.114*
Party score | 0.296 | 0.801 | 0.369
Gender: Other identity | Baseline
Gender: Girl | −0.334 | 0.834 | −0.401
Gender: Boy | −0.500 | 0.848 | −0.590

Model fit: Residual variance: 71.741 on 113 degrees of freedom. AIC: 133.74.

Significance codes: (*) .0603, * <.05, ** <.01, *** <.001

Table A5

Estimates of an additive Poisson regression model for the incidence of radio as preferred media format, with coefficients denoting the expected log count for a unit increase of the ordinal variables and the expected log count for each category compared with the baseline category for the categorical variables. Search ability was omitted due to perfect correlation with fact-checking ability.

Parameter | Estimate (expected log count) | SE (standard error) | Z-ratio
Intercept | −0.868 | 1.401 | −0.620
Form: Secondary 1st | Baseline
Form: Secondary 2nd | −0.588 | 0.577 | −1.019
Form: Secondary 3rd | 0.172 | 0.404 | 0.426
Sourcing in school | 0.0990 | 0.101 | 0.984
Credibility importance | 0.543 | 0.229 | 2.368*
Internet info reliability | −0.0687 | 0.248 | −0.276
Fact-checking ability | −0.687 | 0.224 | −3.060**
Language home: Yes | Baseline
Language home: No | −0.0453 | 0.376 | −0.121
Party score | −0.278 | 0.690 | −0.402
Gender: Other identity | Baseline
Gender: Girl | −0.157 | 0.585 | −0.268
Gender: Boy | −0.608 | 0.647 | −0.940

Model fit: Residual variance: 72.644 on 113 degrees of freedom. AIC: 170.64.

Significance codes: (*) .100, * <.05, ** <.01, *** <.001

Table A6

Estimates of an additive logistic regression model for correct/incorrect answers on racism, with coefficients denoting the log odds of answering correctly for a unit increase of the ordinal variables and the log odds of answering correctly for each category compared with the baseline category for the categorical variables. Search ability was omitted due to perfect correlation with fact-checking ability.

Parameter | Estimate (log odds) | SE (standard error) | Z-ratio
Intercept | −2.005 | 3.679 | −0.545
Form: Secondary 1st | Baseline
Form: Secondary 2nd | 3.178 | 2.365 | 1.344
Form: Secondary 3rd | 0.119 | 2.0185 | −0.059
Program: SS | Baseline
Program: Other | 0.858 | 1.737 | 0.446
Program: EC | 1.0355 | 2.321 | −1.256
Program: AS | 2.882 | 1.280 | 2.251*
Program: NA | 1.995 | 2.340 | 0.853
Sourcing in school | 0.263 | 0.152 | 1.729 (*)
Credibility importance | −0.295 | 0.356 | −0.827
Internet info reliability | −0.598 | 0.431 | −1.388
Fact-checking ability | −0.0434 | 0.380 | −0.114
Language home: Yes | Baseline
Language home: No | 0.299 | 0.649 | 0.460
Party score | −0.658 | 1.225 | −0.538
Gender: Other identity | Baseline
Gender: Girl | 2.415 | 1.218 | 1.983*
Gender: Boy | 1.652 | 1.295 | 1.259

Model fit: Residual variance: 87.534 on 70 degrees of freedom. AIC: 117.53.

Significance codes: (*) .0837, * <.05, ** <.01, *** <.001

Table A7

Estimates of an additive logistic regression model for correct/incorrect answers on income, with coefficients denoting the log odds of answering correctly for a unit increase of the ordinal variables and the log odds of answering correctly for each category compared with the baseline category for the categorical variables. Search ability was omitted due to perfect correlation with fact-checking ability.

Parameter | Estimate (log odds) | SE (standard error) | Z-ratio
Intercept | 2.106 | 2.609 | 0.807
Sourcing in school | −0.0521 | 0.120 | −0.261
Credibility importance | 1.100 | 0.421 | 2.391*
Internet info reliability | −1.556 | 0.485 | −2.382*
Fact-checking ability | 0.0412 | 0.465 | 0.009
Language home: Yes | Baseline
Language home: No | −0.204 | 0.755 | −0.271
Party score | −0.650 | 1.291 | −0.504

Model fit: Residual variance: 60.133 on 78 degrees of freedom. AIC: 74.133.

Significance codes: (*) .100, * <.05, ** <.01, *** <.001

Table A8

Estimates of an additive logistic regression model for correct/incorrect answers on smoking, with coefficients denoting the log odds of answering correctly for a unit increase of the ordinal variables and the log odds of answering correctly for each category compared with the baseline category for the categorical variables. Search ability was omitted due to perfect correlation with fact-checking ability.

Parameter | Estimate (log odds) | SE (standard error) | Z-ratio
Intercept | 2.579 | 2.380 | 1.084
Sourcing in school | 0.109 | 0.146 | 0.748
Credibility importance | 0.570 | 0.344 | 1.655 (*)
Internet info reliability | 0.228 | 0.372 | 0.612
Fact-checking ability | −1.280 | 0.490 | −2.613**
Language home: Yes | Baseline
Language home: No | 0.345 | 0.604 | 0.571
Party score | −0.210 | 0.977 | −0.215

Model fit: Residual variance: 81.230 on 78 degrees of freedom. AIC: 95.23.

Significance codes: (*) .098, * <.05, ** <.01, *** <.001

Table A9

Estimates of an additive logistic regression model for correct/incorrect answers on Fukushima, with coefficients denoting the log odds of answering correctly for a unit increase of the ordinal variables and the log odds of answering correctly for each category compared with the baseline category for the categorical variables. Search ability was omitted due to perfect correlation with fact-checking ability.

Parameter | Estimate (log odds) | SE (standard error) | Z-ratio
Intercept | −2.0857 | 4.0026 | −0.521
Form: Secondary 1st | Baseline
Form: Secondary 2nd | 2.213 | 2.131 | 1.038
Form: Secondary 3rd | −0.337 | 1.614 | −0.209
Program: SS | Baseline
Program: Other | 0.642 | 1.582 | 0.406
Program: EC | 1.246 | 1.982 | 0.628
Program: AS | 3.010 | 1.190 | 2.530*
Program: NA | 3.789 | 2.240 | 1.691 (*)
Sourcing in school | 0.0208 | 0.202 | 0.103
Credibility importance | 0.468 | 0.414 | 1.131
Internet info reliability | −0.131 | 0.568 | −0.230
Fact-checking ability | 0.410 | 0.439 | 0.934
Language home: Yes | Baseline
Language home: No | −1.551 | 0.993 | −1.562

Model fit: Residual variance: 61.629 on 98 degrees of freedom. AIC: 85.629.

Significance codes: (*) .098, * <.05, ** <.01, *** <.001

Appendix B
Table B1

Media format and media sources that were measured in the survey. The youths indicated what format and sources they used when consuming news; they could indicate multiple formats and sources.

Media format: Paper format, Radio, TV, Mobile phone, Computer, Tablet
Media source: Local news, National news, Evening paper, Public service radio, Public service TV, International news, Social media

Footnotes

1

A manipulated photograph on daisies in Fukushima was also used in McGrew et al., 2018.

3

In Sweden, 2017, 20 per cent of the pupils in upper-secondary schools had a family with a foreign background (SiRiS).

4

Since participants were allowed to choose more than one party, we calculated a normalized mean score ranging from 0 to 1, where 0 represents the most right-wing party (SD) and 1 the most left-wing party (V). Each party is given a score based on its position (0–7) on the SD–V spectrum divided by 7 (see the sketch following these footnotes).

5

Not all predictor variables were used in all models: for some models we used only a subset in order to obtain a better model fit. For a detailed specification of the variables used in each regression, we refer to Appendix A.

6

For a complete list of media formats and media sources, we refer to Appendix B.
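
As a concrete illustration of the party score in footnote 4, here is a minimal sketch in Python. The footnote fixes only the endpoints of the spectrum (SD = 0, V = 7), so the ordering of the remaining parties below is an assumption made for illustration.

```python
# Assumed right-to-left ordering of the eight parties on the SD-V spectrum;
# only the endpoints (SD = 0, V = 7) are given in footnote 4, the rest is hypothetical.
PARTY_POSITION = {"SD": 0, "M": 1, "KD": 2, "L": 3, "C": 4, "MP": 5, "S": 6, "V": 7}

def party_score(chosen_parties):
    """Normalized mean score in [0, 1]: 0 = most right-wing (SD), 1 = most left-wing (V)."""
    scores = [PARTY_POSITION[party] / 7 for party in chosen_parties]
    return sum(scores) / len(scores)

# A respondent choosing both S and V gets (6/7 + 7/7) / 2, roughly 0.93.
print(round(party_score(["S", "V"]), 2))
```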


  • Basu, S. & Savani, K. (2017). Choosing one at a time? Presenting options simultaneously helps people make more optimal decisions than presenting options sequentially. Organizational Behavior and Human Decision Processes, 139: 76–91.

  • Carlsson, U. (2018). Medie-och informationskunnighet (MIK) i den digitala tidsåldern: En demokratifråga-Kartläggning, analys, reflektioner [Media information competence (MIK) in the digital age: A question of democracy - survey, analysis, and reflection]. Gothenburg: Nordicom.

  • Davidsson, P. & Thoresson, A. (2017). Svenskarna och internet 2017: Undersökning om svenskarnas internetvanor [The Swedes and the Internet 2017: Survey on Swedes’ Internet habits]. IIS, Internetstiftelsen i Sverige.

  • Del Vicario, M., Bessi, A., Zollo, F., Petroni, F., Scala, A., Caldarelli, G., . . . Quattrociocchi, W. (2016). The spreading of misinformation online. Proceedings of the National Academy of Sciences of the USA, 113(3): 554–559. doi:http://doi.org/10.1073/pnas.1517441113

  • Deursen, A. J. v. & Dijk, J. A. v. (2014). The digital divide shifts to differences in usage. New Media & Society, 16(3): 507–526. doi:http://doi.org/10.1177/1461444813487959

  • Dunning, D., Heath, D. & Suls, J. M. (2004). Flawed self-assessment: Implications for health, education, and the workplace. Psychological Science in the Public Interest, 5(3): 69–106.

  • Dunning, D., Johnson, K., Ehrlinger, J. & Kruger, J. (2003). Why people fail to recognize their own competence. Current Directions in Psychological Science, 12(3): 83–87.

  • Ekström, M., Olsson, T. & Shehata, A. (2014). Spaces for public orientation? Longitudinal effects of Internet use in adolescence. Information, Communication & Society, 17(2): 168–183.

  • Ehrlinger, J., Johnson, K., Banner, M., Dunning, D. & Kruger, J. (2008). Why the unskilled are unaware: Further explorations of (absent) self-insight among the incompetent. Organisational Behaviour and Human Decision Processes, 105(1): 98–121.

  • EU (2006). Recommendation of the European Parliament and of the Council of 18 December 2006 on key competences for lifelong learning. Brussels: Official Journal of the European Union, 30(12).

  • Flanagin, A. J. & Metzger, M. J. (2008). Digital media and youth: Unparalleled opportunity and unprecedented responsibility. Digital Media, Youth, and Credibility, 5–27. doi:http://doi.org/10.1162/dmal.9780262562324.005

  • Fletcher, R. & Park, S. (2017). The impact of trust in the news media on online news consumption and participation. Digital Journalism, 5(10): 1281–1299. doi:http://doi.org/10.1080/21670811.2017.1279979

  • Francke, H., Sundin, O. & Limberg, L. (2011). Debating credibility: The shaping of information literacies in upper secondary school. Journal of Documentation, 67(4): 675–694.

  • Guo, L., Trueblood, J. S. & Diederich, A. (2017). Thinking fast increases framing effects in risky decision making. Psychological Science, 28(4): 530–543.

  • Hargittai, E. (2001). Second-level digital divide: Mapping differences in people’s online skills. arXiv preprint cs/0109068

  • Hargittai, E. (2010). Digital na(t)ives? Variation in Internet skills and uses among members of the “net generation”. Sociological Inquiry, 80(1): 92–113.

  • Hatlevik, O. E. & Christophersen, K.-A. (2013). Digital competence at the beginning of upper secondary school: Identifying factors explaining digital inclusion. Computers & Education, 63: 240–247. doi:http://doi.org/10.1016/j.compedu.2012.11.015

  • Hatlevik, O. E., Guđmundsdóttir, G. B. & Loi, M. (2015). Digital diversity among upper secondary students: A multilevel analysis of the relationship between cultural capital, self-efficacy, strategic use of information and digital competence. Computers & Education, 81: 345–353. doi:http://doi.org/10.1016/j.compedu.2014.10.019

  • Hilligoss, B. & Rieh, S. Y. (2008). Developing a unifying framework of credibility assessment: Construct, heuristics, and interaction in context. Information Processing & Management, 44(4): 1467–1484.

  • Hinrichsen, J. & Coombs, A. (2014). The five resources of critical digital literacy: A framework for curriculum integration. Research in Learning Technology, 21(21334) doi:http://doi.org/10.3402/rlt.v21.21334

  • Kahan, D. M., Landrum, A., Carpenter, K., Helft, L. & Hall Jamieson, K. (2017). Science curiosity and political information processing. Political Psychology, 38(S1): 179–199. doi:http://doi.org/10.1111/pops.12396

  • Kahne, J. & Bowyer, B. (2017). Educating for democracy in a partisan age: Confronting the challenges of motivated reasoning and misinformation. American Educational Research Journal, 54(1): 3–34.

  • Kirschner, P. A. & De Bruyckere, P. (2017). The myths of the digital native and the multitasker. Teaching and Teacher Education, 67: 135–142. doi:https://doi.org/10.1016/j.tate.2017.06.001

  • Kirschner, P. A. & van Merriënboer, J. J. (2013). Do learners really know best? Urban legends in education. Educational Psychologist, 48(3): 169–183.

  • Kruger, J. & Dunning, D. (1999). Unskilled and unaware of it: How difficulties in recognizing one’s own incompetence lead to inflated self-assessments. Journal of Personality and Social Psychology, 77(6): 1121.

  • Kruger, J. & Dunning, D. (2002). Unskilled and unaware – but why? A reply to Krueger and Mueller. Journal of Personality and Social Psychology, 82: 189–192

  • Lazer, D. M., Baum, M. A., Benkler, Y., Berinsky, A. J., Greenhill, K. M., Menczer, F., . . . Rothschild, D. (2018). The science of fake news. Science, 359(6380): 1094–1096.

  • Lee, N.-J., Shah, D. V. & McLeod, J. M. (2013). Processes of political socialization: A communication mediation approach to youth civic engagement. Communication Research, 40(5): 669–697.

  • Lewandowsky, S., Ecker, U. K., Seifert, C. M., Schwarz, N. & Cook, J. (2012). Misinformation and its correction: Continued influence and successful debiasing. Psychological Science in the Public Interest, 13(3): 106–131.

  • Lindell, J. & Hovden, J. F. (2018). Distinctions in the media welfare state: Audience fragmentation in post-egalitarian Sweden. Media, Culture & Society, 40(5): 639–655.

  • Livingstone, S. (2004). Media literacy and the challenge of new information and communication technologies. The Communication Review, 7(1): 3–14. doi:http://doi.org/10.1080/10714420490280152

  • McGrew, S., Breakstone, J., Ortega, T., Smith, M. & Wineburg, S. (2018). Can students evaluate online sources? Learning from assessments of civic online reasoning. Theory & Research in Social Education, 46(2): 165–193.

  • McGrew, S., Ortega, T., Breakstone, J. & Wineburg, S. (2017). The challenge that’s bigger than fake news: Civic reasoning in a social media environment. American Educator, 41(3): 4.

  • Metzger, M. J., Flanagin, A. J. & Medders, R. B. (2010). Social and heuristic approaches to credibility evaluation online. Journal of Communication, 60(3): 413–439.

  • Min, S-J. (2010). From the digital divide to the democratic divide: Internet skills, political interest, and the second-level digital divide in political Internet use. Journal of Information Technology & Politics, 7(1): 22–35. doi:http://doi.org/10.1080/19331680903109402

  • Nygren, T. (2014). Students writing history using traditional and digital archives. Human IT: Journal for Information Technology Studies as a Human Science, 12(3): 78–116.

  • Nygren, T. (2018). Digital källkritik i nyhetsflöden och undervisning [Digital sourcing in news feeds and teaching]. Paper presented at the Conference Medie-och informationskunnighet (MIK) i den digitala tidsåldern.

  • Nygren, T., Brounéus, F. & Svensson, G. (2019). Diversity and credibility in young people’s news feeds: A foundation for teaching and learning citizenship in a digital era. Journal of Social Science Education, forthcoming.

  • Nygren, T., Haglund, J., Samuelsson, C. R., Af Geijerstam, Å. & Prytz, J. (2018). Critical thinking in national tests across four subjects in Swedish compulsory school. Education Inquiry, 1–20. doi:https://doi.org/10.1080/20004508.2018.1475200

  • Nygren, T. & Johnsrud, B. (2018). What would Martin Luther King Jr. say? Teaching the historical and practical past to promote human rights in education. Journal of Human Rights Practice, 10(2): 287–306.

  • Nygren, T. & Vikström, L. (2013). Treading old paths in new ways: Upper secondary students using a digital tool of the professional historian. Education Sciences, 3(1): 50–73.

  • OECD (2015). Assessing progression in creative and critical thinking skills in education.

  • Pachur, T., Hertwig, R. & Wolkewitz, R. (2014). The affect gap in risky choice: Affect-rich outcomes attenuate attention to probability information. Decision, 1(1): 64.

  • Shanahan, C., Shanahan, T. & Misischia, C. (2011). Analysis of expert readers in three disciplines. Journal of Literacy Research, 43(4): 393–429. doi:http://doi.org/10.1177/1086296x11424071

  • Silverman, C. (2015). Lies, damn lies and viral content: How news websites spread (and debunk) online rumors, unverified claims, and misinformation. Tow Center for Digital Journalism, 168

  • SiRiS, National Statistics from Swedish National Agency for Education, https://www.skolverket.se/skolutveckling/statistik/sok-statistik-om-forskola-skola-och-vuxenutbildning

  • Skolverket (2016). IT-användning och IT-kompetens i skolan. Skolverkets IT-uppföljning 2015 [IT use and IT skills in the school. The Swedish National Agency for Education’s IT follow up 2015]. Dnr: 2015:00067. Stockholm: Skolverket. Retrieved from https://www.skolverket.se/sitevision/proxy/publikationer/svid12_5dfee44715d35a5cdfa2899/55935574/wtpub/ws/skolbok/wpubext/trycksak/Blob/pdf3617.pdf?k=3617. [accessed 2019, January 12].

  • Skolverket. (2017). Få syn på digitaliseringen på gymnasial nivå/grundskolenivå [Perceiving digitalisation at a secondary/primary school level]. Wolters Kluwers.

  • Strömbäck, J., Djerf-Pierre, M. & Shehata, A. (2013). The dynamics of political interest and news media consumption: A longitudinal perspective. International Journal of Public Opinion Research, 25(4): 414–435.

  • UNESCO (2011). Media and information literacy: Curriculum for teachers. Paris, France: United Nations Educational, Scientific and Cultural Organization (UNESCO). Retrieved from https://unesdoc.unesco.org/ark:/48223/pf0000192971. [accessed 2019, January 9].

  • van Laar, E., van Deursen, A. J., van Dijk, J. A. & de Haan, J. (2017). The relation between 21st-century skills and digital skills: A systematic literature review. Computers in Human Behavior, 72: 577–588.

  • Vosoughi, S., Roy, D. & Aral, S. (2018). The spread of true and false news online. Science, 359(6380): 1146–1151.

  • Wardle, C. & Derakhshan, H. (2017). Information disorder: Toward an interdisciplinary framework for research and policymaking. Council of Europe report, DGI 2017, 9.

  • Wineburg, S. (1991). On the reading of historical texts: Notes on the breach between school and academy. American Educational Research Journal, 28(3): 495–519.

  • Wineburg, S. (1998). Reading Abraham Lincoln: An expert/expert study in the interpretation of historical texts. Cognitive Science, 22(3): 319–346.

  • Wineburg, S. & McGrew, S. (2017, October 6). Lateral reading: Reading less and learning more when evaluating digital information. Stanford history education group working paper no. 2017-A1. Retrieved from doi:http://dx.doi.org/10.2139/ssrn.3048994

Figure: Predicted probabilities of a correct answer on the comment on incomes as a function of internet info reliability and credibility importance.

Comment: Predicted probabilities are shown as a function of internet info reliability (x-axis) and credibility importance (legend). The higher the ratings on credibility importance, the higher the probability of a correct answer, whereas a higher rating on internet info reliability was associated with a lower probability of a correct answer.
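
To make the relationship in the figure concrete, the sketch below computes predicted probabilities from the intercept and the two coefficients reported in Table A7 for the income item, holding the remaining predictors at zero and assuming the ordinal items range from 1 to 5. This is a simplified illustration, not the authors' exact computation.

```python
import math

# Intercept and coefficients from Table A7 (income item); other predictors held at zero.
B0, B_CRED, B_REL = 2.106, 1.100, -1.556

def p_correct(credibility_importance, internet_info_reliability):
    """Inverse logit of the linear predictor: predicted probability of a correct answer."""
    eta = B0 + B_CRED * credibility_importance + B_REL * internet_info_reliability
    return 1 / (1 + math.exp(-eta))

# The probability rises with credibility importance and falls with internet info
# reliability, matching the pattern shown in the figure.
for cred in (1, 3, 5):
    probs = [round(p_correct(cred, rel), 2) for rel in (1, 3, 5)]
    print(f"credibility importance = {cred}: {probs}")
```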