Teenage pupils’ searching for information on the Internet
Introduction. Quantitative studies measuring skills related to the Internet are mainly based on self-reports. The aim of this paper is to study how Swedish teenagers, in their last year of compulsory school, carry out different information searching tasks on the Internet, using a performance test. The test measures teenagers’ searches for information on the Internet from different points of view: medium-related and content-related aspects as well as operational and strategic aspects.
Methods. The test was performed on the Internet in an open environment, without predefined solutions. During the test, some screens were recorded to obtain information about aspects not covered by the given test answers.
Analysis. The test outcome was analysed in relation to curriculum goals, which correspond to the European framework DigComp2.1.
Results. Reading long texts to find information was the easiest part, but many failed to solve assignments because they did not master the digital environment.
Conclusion. The results differed between the participating classes, which points to the importance of covering all aspects of the area when teaching, without forgetting the holistic approach.
Searching for information on the Internet is one of the most common activities in the Swedish school (National Agency for Education, 2016), and the national curriculum requires compulsory schools to ensure that pupils “can use modern technology as a tool in the search for knowledge, communication, creativity and learning” (National Agency for Education, 2011, p. 16). Qualitative studies have contributed to the understanding of the area within a school context (e.g. Enochsson, 2001; 2005; Limberg, 1998; 2014), and over the past 15 years, there has also been an increasing number of quantitative studies aimed at measuring different aspects of competencies related to information technology (e.g. Fraillon, Ainley and Schulz, 2015).
One criticism against previous research is the lack of instruments to measure digital skills in everyday life (Calvani, Fini and Ranieri, 2009). Another criticism is that many studies on digital competence only measure respondents’ self-evaluated skills (Samuelsson & Olsson 2014). One of few answers to these examples of criticism is a performance test developed in a Dutch context by van Deursen, van Dijk and Peters (2011). The test measured a particular form of digital competence through the level of separate Internet skills assessed on the basis of the respondent’s ability to carry out different tasks, in this case searching for, evaluating and using information.
The aim of this paper is to study how Swedish teenagers, in their last year of compulsory school, carry out different information searching tasks on the Internet, by using van Deursen et al.’s (2011) test, revised for a Swedish context.
Researchers agree that information literacy is not only about locating information but also about evaluating it with a critical eye and using it, preferably wisely – however wisely is defined. Focus on how to use information has been important especially in educational settings: Eisenberg and Berkowitz (1992) presented the well-known model ‘Big6’, Kuhlthau (1993) involved the users’ feelings in her model, Bruce (1997) described the seven faces of information literacy and Limberg (1998) pointed at how content affects process. All these ‘pioneers’ talk about meaningfulness and how information literacy is part of lifelong learning (e.g. Limberg, 2013; 2014; Bruce, Hughes and Somerville, 2012).
Different researchers focus on different aspects and use different concepts. Many concepts are, however, similar, involving formulating needs, finding, analysing and evaluating information and using the information (for an explanatory background, see for example Catts, 2012). The European Commission approved an updated digital competence framework in 2017, DigComp2.1, including five competence areas and detailed proficiency levels for each of them (Carretero, Vourikari and Punie, 2017). The competence area Information and data literacy includes all the above-mentioned aspects, which also means all aspects must be coordinated and form one entity (cf. Enochsson, 2005). However, the medium is not particularly stressed in this specific area, since the whole competence framework DigComp2.1 concerns digital environments. This study focuses on searching for information by using a specific digital environment - the Internet, which also adds a specific dimension, since information literacy differs depending on the medium (Forsman, 2014).
Concepts relating to the medium change according to technological developments. In the beginning, digital literacy mostly concerned the practical handling of a computer; the concept of ‘competence’ was widely used. Today there has been a shift towards literacy (e.g. Oxstrand, 2012). As the two fields come closer together, it becomes more and more difficult to separate them. Depending on the researcher’s main field, information literacy is either a part of media literacy or just the opposite. More and more, the two fields are connected in MIL, Media and Information Literacy (Wilson, Grizzle, Tuazon, Akyempong and Cheung, 2011; Forsman, 2014). Writing for UNESCO, Wilson et al. (2011) distinguish between information literacy and media literacy and divide the two into seven and five aspects, respectively. This study is carried out using one specific medium, namely the Internet. The test developed by van Deursen et al. (2011) covers both information literacy aspects described in research as well as related digital competence – in combination.
Included in the concept of information literacy, but also a research field of its own, is critical thinking. The area has evolved along two lines of thinking. One is a stage theory, where the most developed form of critical thinking is ‘good’, ‘fair’ and of higher quality than ‘evil’ critical thinking, which also exists (Elder and Paul, 1996a; 1996b). This theory assumes that there is a universal good, even if this is not explicit in Elder and Paul’s texts. In contrast, Lipman (1991) bases his ideas about critical thinking on Vygotsky’s ideas, which differ from Elder and Paul’s in the conviction that thinking can develop in many ways. What is good or evil is a result of conventions and agreements, which we cannot always be sure we are in agreement on.
The analysis of critical thinking in this study is based on Clinchy’s (1990) theory, which, in line with Lipman and Vygotsky, views thinking – or knowing, as Clinchy prefers to express it – as something that can develop along different routes. Clinchy’s (1990) theories have been used to discern the critical thinking aspect of children’s information seeking on the Internet (Enochsson, 2001). Using a model based on Clinchy’s theory, it was possible to discover how girls revealed their critical thinking in relation to seeking information on the Internet. The model was simplified in Enochsson’s study, using three levels of critical thinking. However, an important result from using the model is that critical thinking cannot be separated from either the context or the other aspects of information literacy.
As mentioned above, concepts are used differently. Following the DigComp2.1 framework (Carretero et al., 2017), which is used for the analyses, information (and data) literacy is used to indicate the holistic approach to searching for information. In this study, only the Internet medium is used, which is not always mentioned explicitly. DigComp2.1 uses competencies when discussing aspects of information and data literacy, while van Deursen et al. (2011) use skills for the same purpose. In this paper, the term therefore varies depending on which concept is used in the cited research.
Overview of research on measurement of teenagers’ information literacy
As described in the theoretical framework, researchers focus on different aspects, and existing studies including performance tests are therefore not easily comparable. Siddiq, Hatlevik, Olsen, Throndsen and Scherer (2016) present a systematic review of 36 tests in the field; the main body of these tests measures information literacy as defined by DigComp (Ferrari, 2013). This earlier version of DigComp does not differ from the more recent version in this respect. The main point from Siddiq et al. (2016) is that many of those tests are not sufficiently described regarding test quality, and there is a need to develop guidelines for reporting to enable comparisons of various outcomes. Another conclusion from this overview is that many tests include practical matters such as searching and finding specific information, but assignments testing problem solving are limited. In this respect, however, van Deursen and van Diepen’s (2013) study with secondary pupils is an exception.
Not all testing aims to estimate a certain level of competence; some checks for correlating variables such as age, gender, socio-economic status, home background or earlier digital experiences (e.g. Fraillon, Ainley, Schulz, Friedman and Gebhardt, 2014; van Deursen and van Dijk, 2009; van Deursen and van Diepen, 2013; Hatlevik, Throndsen, Loi and Gudmundsdottir, 2018; Hatlevik and Gudmundsdottir, 2013). There are also projects with a longitudinal goal, i.e. to see how the competence within a certain population develops over time (e.g. Senkbeil, Ihme and Wittwer, 2013; ACARA, 2012; Zelman, Schmis, Avdeeva, Vasiliev and Froumin, 2011), or that simply validate the instrument for future use (Huggins, Ritzhaupt and Dawson, 2014). In these cases there is no interest in using a theoretical framework with a fixed standard such as a curriculum, since the test persons are compared only to other test persons. In Siddiq et al.’s (2016) review, there are three tests with test persons of about the same age as in the present study which also include curriculum goals in their theoretical framework, but only two of these aim at assessing against curriculum goals (Zelman et al., 2011; ACARA, 2012). ACARA has earlier results to compare with and concludes that there is an improvement in ICT literacy among younger students. Zelman et al. compare two countries with different curriculum goals, and state that what teachers focus on makes a bigger difference than the national goals.
Although the theoretical frameworks of the above-mentioned tests have much in common, the foci of the tests differ, and they all have a wider focus than only information literacy. This means that information literacy in relation to the Internet only constitutes a minor part of the tests.
Instead of testing, which is complicated, self-efficacy has been used as a measure of competence. A few of the tests above compare the test persons’ self-evaluation of their competence, and the outcomes differ. Hatlevik et al. (2018) find a complicated pattern, which differs between the 14 countries in the study. In a study among sixth-graders in Flanders, Aesaert and van Braak (2017) found that the most competent pupils could best judge their ICT competence: the lower they performed, the bigger the gap between estimated competence and real ability. Girls have also been found to have a tendency to underestimate their competence, especially in mathematics and science (Schunk, Meece and Pintrich, 2014), but in Hatlevik et al.’s (2018) study on ICT competence, girls reported higher levels of ICT self-efficacy in 10 of 14 countries.
Gross and Latham (2007; 2012) conducted studies on students in their first year of college regarding information literacy. The students estimated their competency before and after taking a validated, computer-based, multiple choice test of information literacy. Firstly, it was shown that the first-year students had a very low level of proficiency in information literacy.
Secondly, the lower the level of proficiency, the less accurately the students could estimate their competency. The post-test estimation showed that students scoring higher could better calibrate their estimation of competency (Gross and Latham, 2012). A report that has been highlighted in this area in Sweden, based only on self-reports, showed that 87 % of pupils in Swedish lower secondary school claim they are confident in their ability to search for information on the Internet (National Agency for Education, 2016).
The test developed by van Deursen et al. (2011) covers both information literacy aspects described in research as well as related digital competence. In the process of translating the test and revising it for a Swedish context, it was important that the assignments covered the main goals in the curriculum within the area of information literacy and corresponding digital skills for Swedish compulsory school, which was the case. During six months of revision in 2015–2016, the assignments were tested in small groups of teenage pupils, who did not participate in the test later. The difficulty of content and language was also discussed with teachers working with this age group.
Van Deursen et al. (2011) have constructed their assignments around the themes of medium-related and content-related Internet skills respectively. Medium-related skills can be operational or formal, where operational concerns practical matters and formal concerns not getting lost. Content-related skills can be information Internet skills or strategic Internet skills, according to van Deursen et al. (2011). Each assignment is classified as measuring one of these skills, and there are no open-ended answers in their set of assignments, but there are a few in the present study.
In addition to the performance test, some pupils allowed us to record their screens, which in those cases was an active choice from their side. It was not technically possible to record all screens, but the recordings we got were important in understanding some of the problems the pupils encountered, and we got information about aspects not measured by the test itself. The material covers recordings of 35 screens, whole sessions or parts of sessions.
Depending on the school, the test was carried out on iPads or laptops. The assignments were described in a web-based software, where the pupils also gave their answers. The searches were performed on the Internet, without restrictions or predefined solutions. Some questions required written explanations, while others had multiple-choice answers. Pupils who normally used, for example, text-to-speech software were allowed to use that type of resource. The aim was to keep the working environment as similar to the pupils’ normal setting as possible. There was space for the pupils to give additional information of importance for the result. For example, several pupils at one of the schools were newly arrived students in Sweden and did not master the language. They noted this themselves or with the help of the teacher present, and their results were removed from the analysis. After some test rounds, some of the assignments had to be changed, because the information sought had moved.
To solve the assignments, it was necessary for the participants to master the medium, i.e. a digital device connected to the Internet and mainly a web browser. Examples of medium-related items are opening websites by entering a URL or saving content to find it again. These medium-related items could not be fully controlled for in this study. They were partly understood through the recorded screens, and through the fact that some assignments could not be solved if the test person could not carry out the tasks involved. For example, it was easier to answer the final questions in the assignments if a separate window or tab was open in the web browser. The same applied to a few content-related items, such as defining search options, but the recorded screens gave an indication of the success rate of these items. Evaluating sources from certain aspects appeared as both multiple-choice questions and open-ended questions.
At the beginning of the test, the pupils were asked to estimate their information-seeking competence by choosing a number from 1 to 5, where 1 indicated the lowest degree of competence and 5 the highest.
In total, the results from 123 participating pupils from 9 classes in 4 schools in their last year of Swedish compulsory school remained for analysis, out of the about 200 pupils taking the test. The distribution between girls and boys was 42 % and 54 % respectively; the remaining 4 % did not answer the question about gender. At this age, participants are considered capable of deciding upon participation in a research study themselves. Apart from written and oral information to the pupils, their parents and the schools, there were links to written information at the end of the test, and a question on whether they still agreed to letting us use their answers for research purposes. Fifty-one pupils answered negatively or not at all and were removed from the analysis.
Questions guiding the analysis were: are the pupils able to access data, store and retrieve information, navigate between data, organise searches and search strategies, sort in a large body of information, interpret information, test sources’ reliability, and argue from facts, values and perspectives? All of these aspects are included in the DigComp2.1 framework, and they are also requirements in the Swedish curriculum for this age group. The analyses of the performance test results were mainly frequencies and, where possible, t-tests. The open-ended questions, where the participants had written explanations to motivate their answers, were analysed using a theoretical framework of critical thinking, but the responses were few, and this part of the analytical framework did not serve its full purpose.
The self-reported competence
The average self-reported competence was 3.77 on a scale from 1 to 5. The girls’ average was 3.61 and the boys’ 3.90, a statistically significant difference between girls’ and boys’ self-reported competence.
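A difference of this kind is typically checked with a two-sample t-test, as mentioned in the analysis section. A minimal sketch in pure Python, using Welch’s variant (which does not assume equal variances) and invented 1–5 ratings rather than the study’s actual data:

```python
import math
from statistics import mean, variance

def welch_t(a, b):
    """Welch's two-sample t statistic and its degrees of freedom."""
    na, nb = len(a), len(b)
    va, vb = variance(a), variance(b)      # sample variances (n - 1 denominator)
    se2 = va / na + vb / nb                # squared standard error of the mean difference
    t = (mean(a) - mean(b)) / math.sqrt(se2)
    # Welch-Satterthwaite approximation of the degrees of freedom
    df = se2 ** 2 / ((va / na) ** 2 / (na - 1) + (vb / nb) ** 2 / (nb - 1))
    return t, df

# Invented self-ratings, NOT the study's data:
girls = [4, 3, 4, 3, 4, 4, 3, 4]
boys  = [4, 4, 5, 4, 4, 3, 4, 5]
t, df = welch_t(girls, boys)
```

The t statistic would then be compared against the t distribution with `df` degrees of freedom to obtain a p-value, e.g. via `scipy.stats` in a full analysis.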
The performance test
In this paper, reporting on the revised version, I have chosen to organise the results around concepts used in the theoretical framework of DigComp2.1 and the Swedish curriculum. The concepts sometimes overlap and appear in several assignments (Table 1).
Table 1. Assignments, success rates, question types and the concepts measured.

| Assignment | Success rate | Type of Q | Concepts used in DigComp and curriculum |
|---|---|---|---|
| 1. What is the subject of the third hit in the list when searching with a predefined word in the search function of a certain website? Includes being guided through a big website, but without links. | .52 | MC | navigate between data |
| 2. How much subsidy can a secondary school pupil get if studying away from home? Includes calculating the distance and reading a table. Guided with the help of links. | .62 | RW | sort in a large body of information |
| 3. Find a hairdresser a maximum of 1 km from a defined place. | .58 | RW | access data; organise searches and strategies |
| 4a. Find the price of a new passport. | .46 | RW | organise searches |
| 4b. Save the form for legal guardians to fill in to get the passport. | .48 | RW | store and retrieve (sample 88) |
| 5a. Salary/hour for a 15-year-old working in a shop. | .37 | MC | organise searches |
| 5b. How late is a 15-year-old allowed to work in the evening? | .33 | RW | interpret data; sort a large body of information |
| 6a. Is it allowed for two 15-year-olds to share the work in a certain way? | .13 | YN | interpret data; navigate between data (combination) |
| 6b. If the answer is “no” on 6a, give a reason! | .02 | Free | interpret data; navigate between data; sort a large body of information |
| 7a. Where is this newspaper article published? Paste the link. | .56 | RW | organise searches and strategies (sample 43) |
| 7b. Do you think it is true or false? If the answer is “no”, give a reason! | .23 | YN + Free | interpret data; test sources’ reliability (sample 43) |
| 8a. Who is the person on the image? | .07 | RW | access data (sample 43) |
| 8b. Is the image manipulated? | .33 | YN | access data; interpret data (sample 43) |
| 9. Rate these products. Take into account environmental conditions, child labour and quality of the product. Give a reason. | .02–.04 | Free | interpret data; test sources’ reliability; argue from facts, values and perspectives |
The following section reports on a further analysis of data, which goes beyond the plain success rate. The headers correspond to the “Concepts used in DigComp and curriculum” and the numbers in the parentheses relate to the number of the assignment in Table 1.
Access data (3, 8)
All the pupils in the study knew how to open a web browser and how to make simple searches with the help of Google, which is one of the first steps to get access to data. We did not see any other search engine used. All assignments had elements of this aspect, but two of them will be mentioned here. One of the assignments was to find a hairdresser within the distance of 1 kilometre from a certain place (3). Fifty-eight percent of the pupils solved this assignment.
Most pupils used Google Search to solve this assignment. At one school, the map function was used to a higher extent, and the success rate was higher at this school: 67 % compared to 55 % on average at the other schools. In one of the revised assignments, which only 43 of the pupils received, they had to find the name of a person on an image (8a). Three pupils managed to do this, and it was obvious that most of the pupils did not know how to make a picture search. The following question was whether the image was modified (8b). Fourteen out of 43 pupils gave the right answer when choosing between YES and NO, but none of the three who had found the name of the person was among these 14.
Store and retrieve (4b)
To store and retrieve is also a way of accessing data – again. The participants were asked to find a form for legal guardians to fill in when a minor applies for a passport (4a). The first three schools used iPads, and the pupils were asked to save the form locally on the devices (4b). The iPads were brought by the researchers, who checked for the forms manually. From the many questions and frustrated comments, it was obvious that quite a few did not understand how to carry out the task of saving a form locally on an iPad. In the fourth school, where they used laptops, the question was changed to asking how they would save it. Due to data loss on the iPads, the sample is smaller for this assignment. Out of 88 pupils, 48 % either had a reasonable explanation for how to store and retrieve (save as a document, save the link as a favourite etc.), or had actually saved the document, saved the link, or made a screenshot, which in most cases did not cover the whole form.
Navigate between data (1, 6)
To navigate from directions worked quite well, but it was more difficult for the participants to solve an assignment that required going back and forth between web pages (1). From recorded screens, we could see that many did not have more than one window open, although it was stated in the assignment from the start that they would need to go back to the original page. This caused many pupils to fail the assignment. Others who failed had problems understanding what the third hit in the result list was when using the website’s search function; instead, they answered by reading the third suggestion from the search function. This was a quite simple fact question answered by multiple choice, but only 52 % found the right answer. Assignments 6a and 6b required combining information from different web pages, which was of course difficult for those who did not have more than one window open at a time.
Organise searches and strategies (3, 7)
Thirty-seven percent found the recommended salary for a fifteen-year-old working in a shop (5a), and 33 % found the right answer to how late in the evening they were allowed to work according to laws and regulations (5b). This was easier to find for those who knew that trade unions typically deal with this type of question and started there. These two job-related questions were multiple-choice questions, so it was possible to guess the right answer. When they had to give a reason why two teenagers could not work at certain hours (6b), only three pupils gave an answer indicating that they had understood why. Solving the assignment required the ability to carry out several steps and, as mentioned above, to compare information from different places.
Sort a large body of information (2, 5b, 6b)
Sixty-two percent of the pupils found information about what subsidies a pupil can get in upper secondary school if he or she enrols at a school at a certain, quite long, distance from home (2). To solve this, the participants were guided through several steps: “go to that website”, “click on the link about…” etc. This was the most easily solved assignment, although at the end it included reading a long text and interpreting a table.
Interpret data, test sources’ reliability (5b, 6, 7b, 8b, 9)
One of the assignments (7a) was to find where a text about Donald Trump being nominated for the Nobel Peace Prize was published and to paste the link in the answer field. Only 56 % solved the assignment, although it was possible to copy the text and paste it into a search engine. In connection with this, there was a question about whether they thought the information in the text was true or not (7b), and they had to give reasons for a negative answer. Most reasons were connected to their opinions about Trump, while some claimed that the big Swedish newspaper where they found the text was serious and credible. This could be considered a low level of critical thinking. A higher level was shown by those who had checked different sources to have the information confirmed. In addition, the pupil who wrote that it was not stated in the article and was difficult to find showed some critical thinking. This person had at least tried to go beyond what was actually written.
Argue from facts, values and perspectives (9)
The last assignment concerned choosing a brand that cared about the environment and took a stand on child labour (9). The content was changed after the third school, due to changing information. Only very few pupils found a solution; some of those did well, while others did not show that they could use argumentation skills based on facts, values and various perspectives. The arguments were more about personal preferences than facts. An argument that was not considered part of critical thinking was when participants justified their choice of brand by referring to the brands’ own information, which only focused on positive aspects related to sustainable development and fair trade.
The performance test was designed to be as close to everyday assignments as possible by asking for information useful to Swedish teenagers. All in all, the pupils at the four schools did not solve the assignments very well; there are weak points, but there are also aspects where the pupils are doing quite well. A strength shown in the results is finding specific information on a long text-based webpage. This means that when they are directed to the information, even if there is a need to read tables, calculate and read a long text, they find it. Although the final answers of the assignments were in focus, it was clear that not all pupils had the digital competencies needed. One example of this was difficulty navigating back and forth between webpages to find information, but saving and storing information also posed a problem.
Problems with finding an answer that could be considered correct were more about navigating the digital environment than about understanding the words. Lacking digital competence also affects how critical thinking can be shown, since a prerequisite for choosing a way of searching is knowing that there are different options.
Most pupils evaluated their competence as quite good. Compared to a national survey (National Agency for Education, 2016), this could be expected. It could also be expected, according to research, that the pupils would estimate their competence higher than their actual competence (e.g. Kruger and Dunning, 1999; Gross and Latham, 2007; 2012; Aesaert and van Braak, 2017). Boys estimated their competence higher than the girls did, which is a known phenomenon (e.g. Schunk et al., 2014), even if they did not perform better.
There are many factors affecting how well a person succeeds in a test situation. In some of the tests reported above, the researchers chose not to use the Internet because of possible differences in network access (e.g. ACARA, 2012; Fraillon et al., 2014). In this study, the aim was not to focus on individual results, which makes a difference compared to high-stakes tests. We can still point at areas that need more attention in the schools. There are of course limitations within a classroom: the conditions cannot be kept equal, and the sample can only speak to the competence at the specific schools. Data loss is also a problem, but most of the data loss was due to pupils’ inability to handle their device, which is a result in itself.
The performance test should be regarded as one way to study how teenagers search for information on the Internet. These kinds of studies are important, as similar tests have not been widely used before. As both the social context and the technological context can fluctuate, regular updates of the test are necessary, which makes strict comparisons difficult. On the other hand, a starting point in the theoretical framework is that context matters, and it is important to identify the parts while taking a holistic approach. Nevertheless, the study can be seen as a first attempt to find ways to identify strengths and weaknesses in the area of searching for information on the Internet, as a complement to self-reports. For teachers – and librarians – it is important to start from the pupils’ level of knowledge and not take for granted that they have, for example, the digital competence needed. Ever since Prensky coined the expression digital natives in 2001, many teachers have believed that all young persons know how to fully handle digital devices, which has later been problematized and found to be a myth (e.g. Livingstone, Haddon, Görzig and Ólafsson, 2011; Selwyn, 2009). In a project at one of the participating schools in this study, the teachers took the same performance test, and it was apparent that the teachers had the same problems as their pupils (Enochsson & Larsson, unpublished manuscript). One reason for not teaching digital skills could be that the teachers thought their pupils knew everything about handling digital devices and therefore put no focus on teaching them. Another reason could be that the teachers had to define the level of teaching themselves, and since they themselves had low proficiency, owing to gaps in their own education, they did not understand that there was more to know (cf. Kruger and Dunning, 1999).
The National Agency for Education has put a lot of effort into educating teachers in source critique, and yet many pupils evaluated the news about Trump on the basis that a serious newspaper had published it – and stopped there. In the first rounds of the test, there was an assignment where the pupils were to discuss why two researchers could claim totally opposite standpoints. This assignment was removed from the test due to other difficulties, but we could see that instead of discussing the content, many pupils started arguing about where the articles were published and which publisher was the most credible – Swedish national TV or a serious newspaper. Talking too much about source critique on its own seems to create other problems. Critical thinking could not be studied as intended here, partly because it hardly appeared in the answers; an exclusive focus on source criticism may be one reason. This is not to say that source criticism is unimportant, rather that it must not stop there. There are many more aspects of critical thinking that could be discussed in this context, such as the choice of search engine.
As in Zelman et al.’s (2011) study, what the pupils actually learn seems to depend more on how teachers interpret the curriculum than on the intentions of the curriculum itself. The results differed in some aspects between the participating classes, and one conclusion is that it is important that all aspects of information literacy are covered in teaching. As found earlier, all aspects are related to the context and thus also to each other, which means a holistic approach is needed (e.g. Enochsson, 2005). How teachers can learn to work in this way is a subject that will need a separate article.
This project has been collaborating with a project developing a portable lab, which means that we aim to collect more data than what is written as answers to the questions in the assignments. The portable lab will make it easy to collect data showing in more detail the routes a person takes to find an answer. This has been done in many qualitative studies, but in this way we will also be able to see how common it is to search in one way or another.
This study was partly funded by The Internet Fund in Sweden (grant number IFv2015-0027) and by the Faculty Board of Teacher Education at Karlstad University. The author would like to thank the doctoral students Lennart Karlsson and Zeeshan Afzal, who made the data collection possible. The author would also like to thank professor Annica Löfdahl Hultman, lecturer Ann Scott and Lennart Karlsson for their constructive feedback on the paper.
About the author
Ann-Britt Enochsson is Associate Professor in Educational Science at Karlstad University, Department of Educational Studies, 651 88 Karlstad, Sweden. She received her PhD from Karlstad University, and her research interests concern digital technology in educational settings from preschool to teacher education. She can be contacted at email@example.com.
- ACARA. (2012). National assessment program: ICT literacy years 6 & 10 report 2011. Sydney, Australia: Australian Curriculum, Assessment and Reporting Authority. Retrieved from http://www.nap.edu.au/verve/_resources/nap_ictl_2011_public_report_final.pdf
- Aesaert, K. & van Braak, J. (2017). Measuring ICT competences in a valid way: a study on the accuracy of ICT self-efficacy. Paper presented at ECER 2017, 22 August, 2017 in Copenhagen.
- Bruce, C. (1997). The seven faces of information literacy. Adelaide, Australia: Auslib Press.
- Bruce, C., Hughes, H. & Somerville, M.M. (2012). Supporting informed learners in the twenty-first century. Library Trends, 60(3), 522–545. doi:10.1353/lib.2012.0009
- Calvani, A., Fini, A. & Ranieri, M. (2009). Assessing digital competence in secondary education: issues, models and instruments. In M. Leaning (Ed.), Issues in information and media literacy: education, practice and pedagogy (pp. 153–172). Santa Rosa, CA: Informing Science Press.
- Carretero, S., Vuorikari, R. & Punie, Y. (2017). DigComp 2.1. The digital competence framework for citizens: with eight proficiency levels and examples of use. Luxembourg: Publications Office of the European Union.
- Catts, R. (2012). Indicators of adult information literacy. Journal of Information Literacy, 6(2). doi:10.11645/6.2.1746
- Clinchy, B.M. (1990). Issues of gender in teaching and learning. Journal on Excellence in College Teaching, 1, 52–67.
- Eisenberg, M. & Berkowitz, R. (1992). Information problem-solving: the big six skills approach. School Library Media Activities Monthly, 8(5), 27-29, 37, 42.
- Elder, L. & Paul, R. (1996a). Critical thinking: a stage theory of critical thinking. Part II. Journal of Developmental Education, 20(2), 34–35.
- Elder, L. & Paul, R. (1996b). Critical thinking: a stage theory of critical thinking. Part I. Journal of Developmental Education, 20(2), 34–35.
- Enochsson, A. (2001). Meningen med webben: en studie om internetsökning utifrån erfarenheter i en fjärdeklass [The use of the web]. Doctoral dissertation, Institutionen för utbildningsvetenskap, Karlstads universitet. Karlstad, Sweden.
- Enochsson, A. (2005). The development of children’s web searching skills - a non-linear model. Information Research, 11(1). Retrieved from http://www.informationr.net/ir/11-1/paper240.html
- Enochsson, A-B. & Larsson, J. (unpublished manuscript). Utvärdering av Arvika Kommuns 1- 1-projekt [Evaluation of a 1-to-1 project in Arvika].
- Ferrari, A. (2013). DigComp: a framework for developing and understanding digital competence in Europe (JRC Scientific and Policy Reports). Luxembourg: Publications Office of the European Union.
- Forsman, M. (2014). Medie- och informationskunnighet i Sverige: en kartläggning av aktörer [Media and information literacy in Sweden: a mapping of actors]. Retrieved from http://www.statensmedierad.se/download/18.6e2654261506810579b2ec6a/1452243731603/MIK-kartlaggning-Sverige-2014.pdf
- Fraillon, J., Ainley, J., Schulz, W., Friedman, T. & Gebhardt, E. (2014). Preparing for life in a digital age. The IEA international computer and information literacy study, international report. Springer International Publishing, IEA.
- Fraillon, J., Ainley, J., & Schulz, W. (2015). Preparing for life in a digital age. The IEA international computer and information literacy study international report. Switzerland: Springer International Publishing.
- Gross, M. & Latham, D. (2007). Attaining information literacy: an investigation of the relationship between skill level, self-estimates of skill, and library anxiety. Library & Information Science Research, 29, 332-353.
- Gross, M. & Latham, D. (2012). What’s skill got to do with it? Information literacy skills and self-views of ability among first year college students. Journal of the American Society for Information Science &Technology, 63(3), 574-583.
- Hatlevik, O.E. & Gudmundsdottir, G. (2013). An emerging digital divide in urban school children's information literacy: challenging equity in the Norwegian school system. First Monday, 18(4). doi: 10.5210/fm.v18i4.4232.
- Hatlevik, O., Throndsen, I., Loi, M. & Gudmundsdottir, G. (2018). Students’ ICT self-efficacy and computer and information literacy: determinants and relationships. Computers & Education, 118, 107–119. doi:10.1016/j.compedu.2017.11.011
- Huggins, A., Ritzhaupt, A. & Dawson, A. (2014). Measuring information and communication technology literacy using a performance assessment: validation of the student tool for technology literacy (ST 2L). Computers & Education, 77, 1–12. doi:10.1016/j.compedu.2014.04.005
- Kruger, J. & Dunning, D. (1999). Unskilled and unaware of it: how difficulties in recognizing one’s own incompetence lead to inflated self-assessments. Journal of Personality and Social Psychology, 77(6), 1121–1134. doi:10.1037/0022-3514.77.6.1121
- Kuhlthau, C. (1993). Seeking meaning: a process approach to library and information services. Norwood, NJ: Ablex.
- Limberg, L. (1998). Att söka information för att lära: en studie av samspel mellan informationssökning och lärande. Gothenburg, Sweden: Gothenburg University.
- Limberg, L. (2013). Informationskompetens i undervisningspraktiker. In U. Carlsson (Ed.), Medie- och informationskunnighet i nätverkssamhället. Skolan och demokratin. (pp. 67–76). Gothenburg, Sweden: Nordicom.
- Limberg, L. (2014). Informationsaktiviteter och lärande i skola och bibliotek [Information activities and learning in school and library]. In J. Rivano Eckerdal & O. Sundin (Eds.), Medie- och informationskunnighet: en forskningsantologi (pp. 27–38). Stockholm: Svensk biblioteksförening.
- Lipman, M.S. (1991). Thinking in education. Cambridge: Cambridge University Press.
- Livingstone, S., Haddon, L., Görzig, A. & Ólafsson, K. (2011). EU kids online II: final report 2011. London: EU Kids Online, London School of Economics & Political Science.
- National Agency for Education (NAE) (2011). Curriculum for the compulsory school, preschool class and the recreation centre 2011. Stockholm: National Agency for Education.
- National Agency for Education (2016). IT-användning och IT-kompetens i skolan: Skolverkets IT-uppföljning 2015 [ICT use and ICT competence in school]. Stockholm: National Agency for Education.
- Oxstrand, B. (2013). Från Media Literacy till Mediekunnighet. Doctoral dissertation, Department of Journalism, Media and Communication. Gothenburg University, Gothenburg, Sweden.
- Perry Jr, W.G. (1970). Intellectual and ethical development in the college years. New York, NY: Holt, Rinehart and Winston.
- Samuelsson, U. & Olsson, T. (2014). Digital inequality in primary and secondary education: findings from a systematic literature review. In M. Stocchetti (Ed.), Media and education in the digital age (pp. 41–62). Bern: Peter Lang Publishing Group.
- Schunk, D.H., Meece, J.L. & Pintrich, P.R. (2014). Motivation in education: theory, research, and applications (4th ed.). Harlow, UK: Pearson.
- Selwyn, N. (2009). The digital native: myth and reality. Aslib Proceedings, 61(4), 364–379. doi:10.1108/00012530910973776
- Senkbeil, M., Ihme, J.M. & Wittwer, J. (2013). The test of technological and information literacy (TILT) in the national educational panel study: development, empirical testing, and evidence for validity. Journal for Educational Research Online, 5, 139– 161.
- Siddiq, F., Hatlevik, O., Olsen, R., Throndsen, I. & Scherer, R. (2016). Taking a future perspective by learning from the past: a systematic review of assessment instruments that aim to measure primary and secondary school students' ICT literacy. Educational Research Review, 19, 58–84.
- van Deursen, A. & van Diepen, S. (2013). Information and strategic internet skills of secondary students: a performance test. Computers & Education, 63, 218–226.
- van Deursen, A. & van Dijk, J. (2009). Using the Internet: skill related problems in users' online behavior. Interacting with Computers, 21(1–2), 393–402.
- van Deursen, A.J.A.M., van Dijk, J.A.G.M. & Peters, O. (2011). Rethinking Internet skills: the contribution of gender, age, education, Internet experience, and hours online to medium- and content-related Internet skills. Poetics, 39(2), 125–144. doi:10.1016/j.poetic.2011.02.001
- Wilson, C., Grizzle, A., Tuazon, R., Akyempong, K. & Cheung, C.-K. (2011). Media and information literacy: curriculum for teachers. Retrieved from http://unesdoc.unesco.org/images/0019/001929/192971e.pdf.
- Zelman, M., Avdeeva, S., Shmis, T., Vasiliev, K. & Froumin, I. (2011). International comparison of information literacy in digital environments. Paper presented at the International Association for Educational Assessment (IAEA) conference, Manila. Retrieved from http://www.iaea.info/documents/paper_30e43f54.pdf