NOU 2023: 19

Learning: Lost in the Shuffle? – Use of pupil and student data to enhance learning


Part 2
The Expert Group’s assessments

6 Assessing the potential and challenges related to learning analytics

Figure 6.1

A key task of the mandate given to the Expert Group is to provide advice on good and sound practice for learning analytics. This involves describing the potential of learning analytics, the challenges to be resolved, and how learning analytics affect learning for pupils and students. In this section, we will explain our assessment of the value of learning analytics in pedagogical work, as well as a few of the special educational and ethical challenges that learning analytics entail. The purpose is to assess how learning analytics can contribute to the realisation of the fundamental values and principles on which education is based, and the extent to which learning analytics may challenge these principles.

An assessment of the educational value of learning analytics and the ethical and pedagogical challenges they may present is intricately linked to the rights regarding participation and privacy. The educational value of the information is crucial with respect to, among other things, the legal aspects of collecting and processing personal data. There is a close connection between the educational value of this data and the extent to which learning analytics meet the requirement in the data protection legislation that the processing of personal data must be necessary to fulfil its purpose. We will therefore also assess participation in learning analytics and the need to regulate learning analytics to a greater extent than we do today.

Difficult to separate learning analytics from the general digitalisation of education

It is difficult to separate learning analytics from the general digitalisation of education. Digitalisation often encompasses the same central elements that are part of learning analytics. Examples include the measurement, collection, analysis and reporting of data. We have found it extremely challenging and at times impossible to separate the issue of learning analytics from the broader issues of digitalisation and the use of digital learning resources in education.

Although we view learning analytics as part of a broader discussion on digitalisation, our aim in this section is to try to answer questions that directly involve the use of pupil and student data to enhance learning in various ways. We have focused our attention on when and how learning analytics can lead to better learning for pupils and students, and when the disadvantages of learning analytics outweigh the advantages. We are also concerned with situations where learning analytics reinforce existing opportunities and challenges related to digitalisation.

The assessment of learning analytics is closely linked to other forms of learning analyses

Teachers and instructors have always measured, collected, analysed and reported information about pupils and students with the objective of understanding and promoting learning. This means that the value of learning analytics largely coincides with the value of well-known didactic processes related to teaching and learning. It may therefore be difficult to identify how learning analytics provide added value.

Learning analytics introduce new methods, sources, data and systems for understanding and promoting learning. Teachers and instructors continue, as before, to collect and analyse information about the academic development, performance, behaviour and working methods of their pupils and students. The major difference is that in a digitalised education, all of these areas have been given parallel digital information sources, analytic methods and presentations. Our aim is to shed light on when these aspects entail new opportunities and disadvantages.

7 How can learning analytics enhance learning and improve teaching?

In this chapter, we will explain our assessment of the pedagogical value of learning analytics for primary and secondary education and training, higher education and tertiary vocational education. This is predicated upon the knowledge base and discussions in the Expert Group’s interim report, the status description in chapter 3, and on the input we have received during our work. We will compare the value of learning analytics in pedagogical work with the areas of education where we believe learning analytics would have the greatest added value.

We will begin by pointing out eight general requirements that must be met in order for this added value to be realised, and to ensure that learning analytics can appropriately enhance learning and improve teaching and instruction (the list is not exhaustive):

  1. the data on pupil and student learning must be relevant and of good quality

  2. there must be good management systems for privacy and data security that are adapted to the use of artificial intelligence

  3. information on data processing must be clearly communicated to students, pupils and parents

  4. information from the analytics must be presented in a comprehensible manner

  5. information from learning analytics must be viewed in connection with other relevant information about learning and teaching

  6. teachers and instructors must have sufficient and relevant competence

  7. it must be possible to adapt the use of learning analytics to the unique nature of the subject, professional judgement and local conditions

  8. sufficient time, resources and capacity must be provided in order to follow up data gained from the analyses

In addition to these broad requirements, we will explain certain relevant prerequisites in each of the subchapters below.

7.1 The value of learning analytics in primary and secondary education and training

The value of learning analytics is intricately linked to how learning analytics can contribute to the realisation of the values, goals and principles of primary and secondary education and training. It is thus necessary to take a closer look at how learning analytics relate to the principles expressed in laws and regulations, white papers and other governing documents.

The broad part of the National Curriculum for primary and secondary education and training is a key source, as it elaborates on the fundamental values of the education and describes the basic perspectives that characterise pedagogical practices. The National Curriculum has status as a regulation and is therefore binding for schools. Fundamental values of teaching and instruction reflect the universal values of society based on human rights and emphasise that the best interests of the pupil must always be a vital consideration (Norwegian Ministry of Education and Research, 2017). Consideration of the best interests of the pupil must therefore also be a guiding principle when performing learning analytics.

7.1.1 Insight into own learning

Teaching and instruction must give pupils a strong foundation for understanding themselves and for making good decisions in life. The National Curriculum emphasises that schools have the important task of providing pupils with knowledge of and insight into their own learning processes: “Schools must help pupils to reflect on their own learning, understand their own learning processes and independently acquire knowledge” (Norwegian Ministry of Education and Research, 2017, Chapter 2.4).

Through learning analytics, it is possible to obtain more knowledge of pupils’ academic work during the learning process, and to convey this insight to the pupils as an integral part of the instruction. Learning analytics that provide pupils with greater insight based on relevant and qualitatively good data, where they also receive the necessary assistance to interpret and understand the data, can support schools in their formative task that involves having pupils reflect on their own learning and understand their own learning processes. Vestfold and Telemark County authority (2022) describes this as follows: “Learning data can be visualised and presented in ways that can potentially make pupils more capable of understanding what they need to work on to increase their learning outcomes and desire to learn – an increased understanding of their own learning” (p. 3).

Learning analytics can help pupils become more aware of where they stand in their own learning and help them reflect on what could be good choices for further development. These are important elements for strengthening pupils’ self-regulation1. A better understanding of their own learning processes with the aid of learning analytics can better equip pupils to make decisions on issues related to their instruction. Learning analytics can thereby provide better conditions for pupil participation and involvement.

Preconditions for learning analytics to contribute to insight into personal learning

One of the necessary conditions for learning analytics to help enable pupils to achieve a better understanding of their own learning is for teachers to include pupils when interpreting and understanding the information from the analytics. Pupils must be given assistance to understand and make use of the information from the analytics and not be left to interpret analyses on their own. The type of guidance given to pupils must be based on their age group and context. For the youngest pupils, it will be essential for the school to cooperate with the parents to determine how pupils can understand and make use of information from learning analytics and other feedback.

7.1.2 Formative assessments in school subjects

Section 3-3 of the regulations to the Education Act states that the purpose of assessment in a school subject is to continually enhance learning, contribute to the desire to learn, and provide information on academic performance both during the school year and upon conclusion of instruction in the subject. Section 3-10 on formative assessments emphasises that the assessment must be an integral part of the instruction and that it should be used to enhance learning, adapt instruction and increase competence in school subjects. Pupils shall

  • participate in the assessment of their own work and reflect on their own learning and academic development

  • understand what they are to learn and what is expected of them

  • be informed of their proficiencies

  • receive advice on how to continue working to improve their competence

Systematic use of data from learning activities can provide information on pupils’ academic development throughout the year and over time. This information may be useful for giving pupils feedback on their academic progression and learning processes.

Learning analytics generally provide good support for teachers’ feedback to their pupils. However, relevant feedback from the learning resource and directly to the pupil may also be highly valuable. Learning analytics enable pupils to receive more immediate feedback on their own work than a teacher would normally have the capacity to manage in a classroom or with a large group of pupils.

Preconditions for learning analytics to support formative assessments

In order for learning analytics to support formative assessments, more complex information from the analyses would generally have to be conveyed to the pupils through feedback from their teacher. Concrete information about pupils’ assignment answers may be suitable for direct feedback from the learning resource to the pupil. Pupils must also be permitted to make mistakes during their learning processes without this data being stored and used as part of the formative assessment. They should also be made aware when data is being collected for the purpose of formative assessment, and whether the data will be incorporated into the final assessment.

7.1.3 Adapted and inclusive instruction

According to the National Curriculum, schools must facilitate equal opportunities for learning and development for all pupils, regardless of their abilities, through differentiated instruction (Norwegian Ministry of Education and Research, 2017). Schools must make plans to ensure that “instruction is experienced as both manageable and sufficiently challenging” (Norwegian Ministry of Education and Research, 2017). The National Curriculum emphasises that differentiated instruction should primarily involve variations in the materials and adaptations for diversity in the community. It is valuable for children from different backgrounds and with different abilities to learn together at school (Meld. St. 8 (2022–2023)). According to the National Curriculum, teachers must also reflect on how their pupils learn and how they themselves can best lead and support pupils’ learning development and formative development.

Differentiated and inclusive instruction requires knowledge of how pupils learn and what they are able to do. Learning analytics may be one source of such information. With the aid of digital tools, it may be easier to gain an overview of how pupils solve academic problems, how they work, what they are able to achieve, and what they are struggling with in various parts of a school subject. Teachers can gain a better understanding of pupils’ work and learning processes through access to various types of information. Data in the form of test scores can provide ongoing information about how pupils solve specific problems, and it can chart academic progression. Furthermore, activity data on pupil navigation through a learning resource may indicate the components, modules and tasks that work best for different pupils. In this way, systematic use of digital traces from learning processes could provide insight into pupils’ misconceptions and identify academic areas where they require more adapted instruction. The teacher can then use this information to adapt the instruction and to identify pupils who need additional help. Learning analytics can help uncover challenges at an early stage. The analytics can provide a basis for implementing measures quickly, which would then contribute to achieving the goal of early intervention with a greater probability of completing an upper secondary education.
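As a purely hypothetical illustration of the kind of processing described above, the following minimal Python sketch aggregates topic-tagged task results to flag areas where a pupil may need adapted instruction. The data format, function name and the 0.5 threshold are assumptions made for the example, not features of any particular learning resource.

```python
# Minimal, hypothetical sketch: aggregate a pupil's task results by topic
# and flag topics where the share of correct answers is low.
from collections import defaultdict

def weak_topics(task_results, threshold=0.5):
    """task_results: list of (topic, correct) pairs for one pupil.
    Returns {topic: share_correct} for topics below the threshold."""
    totals = defaultdict(int)
    correct = defaultdict(int)
    for topic, is_correct in task_results:
        totals[topic] += 1
        correct[topic] += int(is_correct)
    return {
        topic: correct[topic] / totals[topic]
        for topic in totals
        if correct[topic] / totals[topic] < threshold
    }

# Example: flags "fractions" (1 of 3 correct) but not "geometry".
print(weak_topics([("fractions", True), ("fractions", False),
                   ("fractions", False), ("geometry", True)]))
```

In practice, any such flag would only be one input among several for the teacher, alongside what they already know about the pupil.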

Adaptive teaching aids and learning resources have often been highlighted as a means of differentiating instruction. One of the purposes of adaptivity is to offer pupils assignments that are tailored to their levels and preferences. It may thus be easier and quicker to identify pupils who need additional support and follow-up as the means to achieve this lie in the resource. Having all pupils work on individually adapted assignments in the same classroom can also help promote inclusion (Statped – National Support System for Special Needs Education, 2022). We will elaborate on the value of adaptivity in Chapter 7.3.

Preconditions for learning analytics to contribute to differentiated and inclusive instruction

In order for learning analytics to provide a basis for differentiated and inclusive instruction, teachers must have access to a variety of digital teaching aids that can encompass all pupils, and they must be given sufficient time and guidance to learn how to use them.

7.1.4 Quality development

According to section 13-3(e) of the Education Act, municipalities and county authorities must ensure that schools regularly determine the extent to which the organisation, adaptation and implementation of the instruction contribute towards achieving the objectives set out in the National Curriculum.

Data from learning situations with digital resources is a relevant basis for assessing and evaluating a school’s practices and for supporting decisions. Resources that analyse and compile data from several sources can provide insight into teaching and instruction, as well as learning that takes place in the classroom, for a specific year group, at a school or across schools. The Norwegian Government’s digitalisation strategy for schools also states that data from pupils’ learning situations can be aggregated and used analytically at a broad level as support for decisions, e.g. on purchases or for knowledge development (Norwegian Ministry of Education and Research, 2023). In upper secondary education, it may be relevant to assess how learning analytics can contribute to knowledge of conditions that may increase the probability of completing studies, and for implementing measures to prevent dropout.

Learning analytics are thus well-suited for assisting the work on quality development in school, which is consistent with findings from the Committee for Quality Development in Schools regarding school owners’ need for support and information for quality development (NOU 2023: 1). If analytics are based on data generated in learning situations, the contribution will also be time-efficient compared to data generated through reports. Data from pupils’ learning can also provide an insight into their participation over time, and in this way play a role in schools’ long-term work to prevent dropout.

Learning analytics as a basis for quality development at an organisational level is discussed in more detail in Chapter 7.4. This form of learning analytics is also relevant for the ongoing work of the Committee for Quality Development in Schools.

Preconditions for learning analytics to contribute to quality development work

In order for learning analytics to provide a valuable and meaningful contribution to quality development work, the data must contain information that can be analysed at an organisational level. This is particularly important for compilations and comparisons across subjects, year groups, schools and municipalities. It is also essential to be able to link the various data sources.

7.1.5 Professional practice

Information from learning analytics is primarily used to give teachers insight into pupils’ learning activities and learning environments, and as support for pedagogical decisions. It can therefore support professional practices as described in the National Curriculum.

Chapter 3.5 of the core curriculum notes that the teaching profession must regularly assess its pedagogical practices to best ensure the needs of the pupils: “Teachers must carefully consider what, how and why pupils learn, and how they best can lead and support the pupils’ education and all-round development” (Norwegian Ministry of Education and Research, 2017). At the same time, it is emphasised that complex pedagogical questions rarely have definite answers. Learning analytics provide information about learning based on data from a number of actions by pupils. This can reduce a complex phenomenon to a manageable selection of variables that can help form the basis for pedagogical decisions.

Teachers and the professional community at a school have a responsibility to assess their pedagogical practices in light of research and evidence-based knowledge and use relevant information about how the instruction is working. If a school has access to high-quality learning analytics, it would be natural for this to be incorporated as one of the tools for further developing the school.

Preconditions for learning analytics to contribute to insight into professional practice

Information from learning analytics must be viewed in connection with other information teachers have about their pupils. Practising professional judgement also involves making decisions on when and how learning analytics should be performed.

7.2 The value of learning analytics in higher education and tertiary vocational education

The Expert Group will assess the value of learning analytics in higher education and tertiary vocational education based on how they can help realise fundamental principles expressed in laws and regulations, white papers and other governing documents. Section 1-1(a) of the Universities and University Colleges Act describes the purpose of such education as “offering higher education at a high international level”. As stipulated in section 1-5, first paragraph, institutions have a responsibility for ensuring that teaching is “conducted in accordance with recognised scientific, artistic, educational and ethical principles”. Apart from this, institutions are given substantial freedom and responsibility to shape their own academic foundations and set of values within the legal framework (section 1-5, second paragraph of the Universities and University Colleges Act). In tertiary vocational education, there is a broad set of values stated in section 4 of the Vocational Education Act: “Tertiary vocational education must be based on knowledge and experience from one or more occupational fields and be consistent with relevant pedagogical, ethical, artistic and scientific principles”.

7.2.1 Active student learning

A fundamental principle in higher education and tertiary vocational education is that students should engage in their studies as responsible participants in their own learning (Meld. St. 9 (2016–2017); Meld. St. 16 (2016–2017)). Section 2-2, fifth paragraph of the Academic Supervision Regulations emphasises that higher education institutions must facilitate opportunities for students to take an active role in their learning processes. The Report to the Storting (white paper) on vocational college education states the following: “An attractive vocational college will have engaged students who are involved in its direction and can influence its development” (Meld. St. 9 (2016–2017), p. 7).

Students must be able to plan, implement and monitor their own learning and assess the extent to which they must change something to achieve their objective. This is key to what is known as self-regulation. In higher education and tertiary vocational education, learning analytics directed at the students could be valuable for enhancing active student learning and facilitating self-regulation. If students have access to data and analyses of their own learning, they may have a better understanding of their own learning processes, which in turn would provide a good basis for taking steps to make changes as needed. Learning analytics can also give students information on learning activities they have completed, how they have spent their time, and what results they have achieved. Certain tools also have a functionality that provides students with notifications and reminders to help them structure their work. Student organisations point out that students must be given control over these functionalities to prevent them from contributing to greater stress. The Expert Group has received input suggesting that students may experience more stress if they, for instance, receive notifications on learning platforms late at night, during weekends or on public holidays.

Preconditions for learning analytics to contribute to active student learning

In order for learning analytics to strengthen active student learning, students must be given sufficient instructions for understanding and interpreting information coming from the analytics. Higher education institutions must involve students to find out what type of information students need from the learning analytics to help them in their learning processes. They must also ensure that students have some control over functionalities that are directed at them in the form of notifications and reminders. It is important that learning analytics are not performed in ways that could blur the lines between student life and private life.

7.2.2 Student follow-up

Education should be based on knowledge of how students learn best (Meld. St. 16 (2016–2017)). Although research is lacking on what is needed to ensure that students achieve the best possible learning outcomes, certain factors appear to be more important than others: “The most important factors determining a student’s success are student engagement, the amount of time they spend on their studies, and how they use that time” (Meld. St. 16 (2016–2017), p. 16).

One essential question in higher education and tertiary vocational education is to what extent the use of data that directly identifies individual students has pedagogical value for student follow-up. The answer to this question is crucial for determining whether learning analytics will constitute a proportionate intrusion on privacy.

Good follow-up of students requires information on their study activities, and here learning analytics can contribute. In certain areas, the data from students’ learning situations would be sufficient to provide valuable information and to follow up individual students or groups of students. This may, for instance, apply to studies where much of the instruction takes place on digital platforms, or to subject areas where there are high-quality digital resources suitable for learning analyses.

Information from learning analytics may be included as a basis for providing feedback to students and for adapting instruction. However, it is essential that information coming directly from learning analytics systems does not become the only form of student feedback, or replace feedback that has traditionally been the responsibility of the teacher or instructor. In dialogue with the Expert Group, the National Union of Students in Norway has expressed concern that increased collection of data from students may lead to a less authentic dialogue between students and teachers. Learning analytics must not undermine the existing dialogue between students and teachers, regardless of how good the analyses are. Universities Norway (2023) has also emphasised that learning analytics neither can nor should replace student participation and involvement.

Data collected on student learning often says something about how students manage resources, their use of time, ongoing assignments and learning outcomes measured by digital tools. To learn more about larger issues – such as how students experience their instruction, what type of alternative methods of teaching they may envision, or what they believe would be a positive development of comprehensive academic offerings – it would be necessary to engage in a dialogue with them. It is important to keep in mind that data from students’ use of digital resources provides information on how they relate to the various digital resources. For most students, this is only a limited part of their study programme, which emphasises the importance of engaging in a real dialogue with students and ensuring student democracy.

Preconditions for learning analytics to contribute to student follow-up

In order for learning analytics to strengthen student follow-up, teachers and instructors must use analyses to follow up the same students from whom the data has been collected instead of making adjustments for the next class. It is also essential for teachers, instructors and students to collaborate on interpreting student data and on the implications for further instruction and learning.

7.2.3 Inclusive education and lifelong learning

Inclusion and equal access are important principles in both higher education and tertiary vocational education. These principles are based on the UN sustainable development goal of an inclusive and equitable quality education, and that everyone should have opportunities for lifelong learning.2 Universities, university colleges and tertiary vocational colleges must therefore facilitate good access to study programmes, also for students from diverse backgrounds (Meld. St. 9 (2016–2017); Meld. St. 16 (2016–2017)). A central goal in the long-term plan for higher education is to strengthen access to flexible educational programmes and to use digital teaching methods as a means of achieving this (Meld. St. 5 (2022–2023)).

Learning analytics can contribute to inclusion by providing higher education institutions with a better foundation for adapting study programmes and instruction for students with different abilities. In other countries, there are examples of learning analytics being used in higher education specifically to reduce the performance gap between majority and minority students (Johnson, 2018).

In tertiary vocational education, there are many part-time mature students and online students with obligations that make it challenging for them to complete their studies. Learning analytics can provide greater knowledge of the type of adaptations for learning that work best through access to data from students’ learning situations. This can give higher education institutions a foundation for adapting studies in ways that enable more students to succeed, and to increase the access to study programmes for other groups in society.

Preconditions for learning analytics to contribute to inclusive education and lifelong learning

In order for learning analytics to become inclusive, there must be digital learning resources and data sources that are suitable and accessible for students with different backgrounds, and it must be possible to collate data from these sources. It is also necessary for higher education institutions to have a conscious approach to how learning analytics can promote inclusion, and to understand how it may also be exclusionary. There is a risk of discriminatory outcomes with the use of artificial intelligence if we are not cautious (Costanza-Chock, 2020; Selwyn, 2022). Measures for reducing such a risk are essential for ensuring that learning analytics help to promote inclusion.

7.2.4 Quality development

The requirements for quality development in the Universities and University Colleges Act are intended to help ensure that society has confidence in the quality of Norwegian higher education. According to section 1-6 of the Act, higher education institutions must have mechanisms to “ensure and further develop the quality of education”. Vocational colleges are required to have systems in place for quality assurance, according to section 4-3 of the Vocational Education Act. Section 4-1 of the Vocational Education Academic Supervision Regulations stipulates that vocational colleges must systematically collect information from students to assess whether each individual study programme has achieved its objectives for quality.

Learning analytics can contribute to the work on quality development by compiling and analysing data obtained from students’ learning activities and other relevant sources. These regulations stipulate that the quality work at the education institutions must be continuous. Learning analytics can be a good tool for succeeding in these efforts. If higher education institutions use learning analytics that provide ongoing information about student learning and activities, the analyses could comprise a continually updated basis for improving the quality of study programmes. In addition, learning analytics can help facilitate the testing of new forms of instruction and assessment.

Learning analytics as a basis for quality development at an organisational level is discussed in more detail in Chapter 7.4.

Preconditions for learning analytics to contribute to quality development work

In order for learning analytics to make a positive contribution to the work on quality development, it must be possible to compile data from a variety of sources. Access to relevant databases will probably vary between higher education institutions and different subject areas, but in many cases, learning analytics would be able to contribute information that can be a starting point for improving a study or course design.

7.2.5 Standard time for completion of studies

One ambition of higher education is for students to complete their education as efficiently as possible (Meld. St. 16 (2016–2017)). Deviating from the standard completion time for a study programme can be challenging for certain students, and higher education institutions should therefore attempt to identify potential dropouts as early as possible.

It would be relevant to see whether learning analytics could be incorporated as a mechanism for obtaining more knowledge of conditions that increase the probability of completing studies, and for implementing measures to prevent dropout. Artificial intelligence would be a suitable tool for locating and recognising patterns in data. Data analyses of student activities to predict potential dropout have been used extensively by higher education institutions in other countries. The Expert Group notes that analyses of student data to prevent dropout are less common in Norway, although we are aware of a few examples, such as at BI Norwegian Business School (2022).
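As a purely illustrative sketch of the kind of pattern recognition referred to above, the following Python example fits a simple logistic regression to invented activity data and scores dropout risk. The features, data and names are assumptions made for illustration; this does not describe the approach used at any institution, and such scores could only ever be a prompt for human follow-up, for the reasons discussed below.

```python
# Illustrative sketch only: score dropout risk from a few activity features
# using historical, labelled data. All numbers here are invented.
import numpy as np
from sklearn.linear_model import LogisticRegression

# Each row: [logins_per_week, assignments_submitted, forum_posts]
X_train = np.array([[5, 4, 3], [1, 0, 0], [4, 3, 1],
                    [0, 1, 0], [6, 5, 2], [1, 1, 0]])
y_train = np.array([0, 1, 0, 1, 0, 1])  # 1 = dropped out (historical label)

model = LogisticRegression().fit(X_train, y_train)

# Estimated dropout probabilities for current students; a high value would
# only be a signal for a human conversation, not an automatic intervention.
current = np.array([[2, 1, 0], [5, 4, 2]])
print(model.predict_proba(current)[:, 1])
```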

It may be difficult to obtain good information about student participation and student work on the basis of data from learning activities. In many cases, students who are collaborating or using analogue learning resources may appear to be neither active nor well-integrated based on data from the learning activities, even if the opposite is true. For instance, it will often appear that a student has not been very active on the learning platform if they have systematically been looking at resources together with another student while only the other student is logged in. It can feel stigmatising if an intervention is implemented for students who are active in ways that have not been registered by the system. It would also be unfortunate if students were to begin adjusting their behaviour solely to satisfy the system and to avoid interventions.

Preconditions for learning analytics to contribute to completing studies within the standard time

In order for learning analytics to help increase the probability of completion, higher education institutions must make a thorough assessment of the correlation between the data they have obtained on the students and the students’ social and academic integration.

7.3 The value of adaptivity

Giving recommendations on the use of adaptive teaching aids and exams is part of the Expert Group’s mandate. In our first interim report, we gave an account of the relationship between adaptivity and learning analytics. There we concluded that these two areas are interconnected to a large extent, but that adaptive systems do not necessarily fall within the definition of learning analytics. Learning analytics are to an increasing extent based on data from adaptive systems, particularly those used in primary and secondary education. It is therefore useful to view learning analytics and adaptivity in context.

Because adaptive teaching aids are so far mostly used in primary and secondary education, we will mainly refer to adaptivity in a school context. Nevertheless, several of our assessments will also be relevant for higher education and tertiary vocational education.

7.3.1 Adaptive systems for instruction

In this context, adaptivity means that the content of a digital system is adapted to the person using the system. The antithesis of adaptive tests and teaching aids is linear tests and materials, where all pupils follow the same sequence of tasks. An adaptive system can also adjust the tempo and presentation of the content and offer individualised feedback. Since these adjustments are part of a digital system, they are made with the aid of algorithms. Algorithms that control adaptations in tests and learning systems may be complex, e.g. when they involve machine learning, but in most cases they are quite simple. What is typical for adaptive teaching aids and tests is that they are divided into smaller parts, which in turn consist of several problems. Together, these parts cover the material to be learned or measured.
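To illustrate how simple such algorithms can be, the following hypothetical Python sketch adjusts the difficulty of the next task based on whether the previous answer was correct. The levels and the rule are invented for the example; real adaptive systems use their own, often more elaborate, models.

```python
# Hypothetical sketch of a simple adaptive rule: raise the difficulty after a
# correct answer, lower it after an incorrect one, within fixed bounds.
def next_difficulty(current_level, answered_correctly, min_level=1, max_level=5):
    """Return the difficulty level for the next task."""
    if answered_correctly:
        return min(current_level + 1, max_level)
    return max(current_level - 1, min_level)

# Example: a correct answer at level 3 moves the pupil to level 4,
# while an incorrect answer at level 1 keeps the pupil at the lowest level.
print(next_difficulty(3, True))   # 4
print(next_difficulty(1, False))  # 1
```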

The knowledge base on the use of adaptive systems is limited, but a growing number of studies have focused their attention on how they affect learning (Egelandsdal et al., 2019; Moltudal et al., 2020).

7.3.2 Differentiated instruction

Differentiated instruction involves tailoring instruction to ensure that all pupils receive the best possible outcome of the instruction (Norwegian Ministry of Education and Research, 2017, Chapter 3.2). Adaptivity is often related to this principle because the primary goal of adaptivity is differentiated instruction. The desired result of adaptive tests and teaching aids is to give pupils assignments at the correct level, so that pupils are either able to learn more (teaching aids) or demonstrate what they know (tests). At the same time, there are some issues concerning the adaptivity of digital tools in terms of differentiated instruction. This is partly due to the fact that differentiations in digital systems are based on algorithms processed by computers, while the idea of differentiated instruction is primarily that the teacher – and the teacher’s relationship with the pupil – is key to this task. In some cases, the objective of algorithms is to do something that the teacher also does, e.g. tell the pupil which problems to solve and in what order. Many suppliers stress that adaptive systems can free up time for teachers. Teachers also use this as an argument for utilising such tools (Baker et al., 2017).

Preconditions for adaptive systems to contribute to differentiated instruction

In order for adaptivity to have a positive impact on differentiated instruction, teachers must be in close proximity to their pupils when they are working with adaptive teaching aids. Studies indicate that pupils learn very little when teachers mostly let them work on their own with such systems. It is therefore recommended that these teaching aids are included as part of a more comprehensive instruction (McTigue et al., 2020; McTigue and Uppstad, 2019). According to Statped – National Support System for Special Needs Education (2022), this is unfortunately not currently the case: “The intention of the developer is usually for these teaching aids to be used as part of the instruction. In practice, many pupils sat alone in front of a screen, and the teaching aids were used for most of the instruction without teacher follow-up” (p. 2).

7.3.3 Motivation

According to the Core Curriculum, schools should stimulate motivation, and throughout their education, pupils should be given challenges that promote the desire to learn (Norwegian Ministry of Education and Research, 2017, Chapter 3). Curriculum texts regarding assessments for school subjects emphasise the same.

Adaptive systems have the capacity to strengthen motivation through both the adaptations themselves and through certain elements related to the adaptivity, such as advanced reward systems. Good adaptivity will ensure that pupils receive assignments that they have the ability to complete, and the pupils are then motivated through their accomplishments. Skolelederforbundet (2022) [the Norwegian School Heads Association] also emphasises mastery and motivation as a benefit of adaptivity and learning analytics: “Learning analytics can help give certain pupils entirely differentiated assignments to strike the proximal development zone with greater accuracy. Similarly, customised and entirely differentiated instruction can create motivation and a strong belief among the pupils in their own abilities” (p. 1).

Mastery is a key part of motivation theory, where it is essential that the task is neither too easy nor too difficult. But although mastery is essential, it is not always sufficient to promote motivation. One example of this is an article with a literature review of studies comparing linear and adaptive tests, where the adaptive tests showed no significant effect on motivation (Akhtar et al., 2022). In contrast, one study found positive effects on motivation when the adaptivity was reduced to a lower level, where the pupil was given slightly easier tasks than those provided through standard adaptive systems. The same study indicates the importance of giving pupils sufficient information on how the adaptations are made, since adaptivity also implies that pupils with a higher level of performance are given tasks that are more difficult than the ones they would receive in linear systems. These pupils may therefore find the adaptive system more difficult than the one they are accustomed to.

Preconditions for adaptive systems to contribute to motivation

In order for adaptive systems to stimulate motivation and mastery, it is necessary to know precisely how difficult a task is, or to have other precise information as a basis for the adaptations. This part of the development work would be costly and demand a high level of competence. When the foundation for adaptivity is poor, there would naturally be significant limitations on the adaptivity itself. This challenge also applies to linear tools. However, the scope of the challenges is broader for the adaptive tools because one would have to have several functioning pathways in an adaptive learning aid or test – not just one.

7.4 The value of data-supported quality development

In its work, the Expert Group has primarily focused on learning analytics that aim to enhance learning for certain individuals or groups, and where the measures are implemented close to the learning situations. Another form of learning analytics, aimed at the organisational level, is known as institutional analytics. In this subchapter, we refer to the use of this form of learning analytics as data-supported quality development. The aim is to support decisions on quality related to the conditions and adaptations for learning and the design of the learning environment. This form of learning analytics uses aggregate data from learning situations, often in combination with several relevant sources. The information from such learning analytics is used by the management and owner level at schools and education institutions, as well as by local and national authorities.

Data-supported quality development will be relevant for all education institutions, regardless of the form and level. In this subchapter, the Expert Group has primarily chosen to discuss data-supported quality development in higher education and tertiary vocational education. The reason is partly that these sectors believe this potential is relevant: “Learning analytics can contribute to a more holistic and coherent design of courses and study programmes” (Sikt, 2022, p. 2). Another reason is the ongoing work of the Committee for Quality Development in Schools, whose mandate is to survey needs and propose changes to tools and data sources that will facilitate quality development in primary and secondary education.

7.4.1 Better quality of education

In the new long-term plan for research and higher education, high quality and accessibility is one of the three goals that will help realise the thematic initiatives for the next decade (Meld. St. 5 (2022–2023)). The action plan for digital transformation includes learning analytics as a possible measure for contributing to better education quality (Norwegian Directorate for Higher Education and Skills, 2022).

The quality report for higher education indicates a number of definitions and interpretations of quality in education (Meld. St. 16 (2016–2017)). However, it includes the following broad ambitions as a basis for the understanding of high-quality education: “Students shall achieve the best possible learning outcomes and personal development, have access to relevant study programmes to sufficiently prepare them for active participation in a democratic and diverse society and for a future professional career, and complete their education as efficiently as possible” (p. 15). These ambitions are also relevant for high quality in tertiary vocational education (Meld. St. 9 (2016–2017)).

There is a broad range of data sources that can form the basis for work on quality development. Examples include:

  • aggregate data from learning situations: activity data from students’ use of teaching aids, platforms and other digital solutions

  • data from the administrative systems of study programmes and registers which contain, among other things, data on the results of completed coursework requirements and final assessments, such as the Common Student System (FS)3

  • data from the Student Survey4 and other surveys

7.4.2 Data as a resource

More data is now being produced weekly than data produced in the last millennium. This vast body of data affects the way we do everything from research and product and process innovation to the way we develop organisations, design business models, and how we interact with each other. (Meld. St. 27 (2015–2016), p. 101)

Data as a basis for value creation has been a topic of organisational literature for quite some time, often under headings such as “the data-driven organisation” (Andersen et al., 2018). At the national level, data has been highlighted as the starting point and foundation for the development of the modern society in the white paper Data as a resource – The data-driven economy and innovation (Meld. St. 22 (2020–2021)). The white paper outlines the Norwegian data policy: “The Government’s principles for data policy should underpin the efficient sharing and use of data within safe and responsible parameters and should ensure that value created from data benefits the private sector, the public sector and society” (p. 8). A separate chapter has also been devoted to data as a resource in the digitalisation strategy for the public sector, which states that data can be better utilised as a resource by the public sector, and that this opens up entirely new methods of solving problems (Ministry of Local Government and Modernisation, 2019).

The Digital Agenda for Norway addresses the large amount of data produced by the education sector, and the importance of utilising this data to improve quality (Meld. St. 27 (2015–2016)). The white paper highlights the link between data-supported quality development and the good use of technology in instruction. Data from digital resources such as learning platforms, teaching aids, tests and other systems are relevant sources for such quality development.

In the Ministry of Education and Research’s Strategy for digital transformation in the higher education sector, utilising data on the knowledge sector is one of the six strategic focus areas (Norwegian Ministry of Education and Research, 2021b). The strategy points to several challenges in this context: the knowledge and value creation potential of data from the knowledge sector has not been sufficiently utilised, the culture of data sharing is poorly developed, and there is a lack of common standards for metadata. There are also many data owners in an ambiguous landscape, and the benefits may often appear in places other than where the efforts have been made. One of the strategy’s focus areas has therefore been to establish systems and infrastructure for the capture, sharing, storage and reuse of data on the knowledge sector. It is both desirable and necessary for enterprises to share and exchange data internally and between themselves, as the combination of several datasets will often provide the basis for entirely new and much broader insights than a single dataset can provide.

7.4.3 The use of aggregate data

In contrast to learning analytics that occur in close proximity to learning situations, data-supported quality development is based on aggregate data that is not directly identifiable. This means that the risk to privacy is generally low for this form of learning analytics. The risk to privacy is also lower because measures are directed at groups or systems for education and not at the individual level.

One example of quality development work with the aid of aggregate data could be exploring various questions about what promotes good learning. This may involve questions about the type of learning platform functionalities, collaboration solutions and learning resources that could contribute to a higher quality of education. Similarly, questions about which learning activities and assessment methods to use, which types of assignments and coursework to give, and in what order, would be relevant to explore on the basis of aggregate data. In its input report to the Expert Group, BI Norwegian Business School explained how it uses data analyses of the use of video instruction to obtain a stronger foundation for decisions on the continued use of video at the institution (BI Norwegian Business School, 2023).
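As a hypothetical illustration of this kind of exploration, the Python sketch below averages already aggregated, non-identifiable completion data per course module. The record format and module names are assumptions made for the example and do not describe BI’s analysis or any specific system.

```python
# Hypothetical sketch of quality work with aggregate, non-identifiable data:
# average completion rates per course module as one input to course design.
from collections import defaultdict
from statistics import mean

# Each record is already aggregated/anonymised: (module, completed_fraction)
records = [
    ("module_1_video", 0.82), ("module_1_video", 0.78),
    ("module_2_text", 0.64), ("module_2_text", 0.61),
]

by_module = defaultdict(list)
for module, completed in records:
    by_module[module].append(completed)

for module, values in sorted(by_module.items()):
    print(module, round(mean(values), 2))
```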

Preconditions for data to contribute to better education quality

A good use of data within and across higher education institutions requires an appropriate infrastructure and holistic system architecture (Ministry of Local Government and Modernisation, 2019). This is referred to as a “shared digital foundation” in both the Ministry of Education and Research’s Strategy for digital transformation in the higher education sector, and in the action plan for this strategy (Norwegian Ministry of Education and Research, 2021b; Norwegian Directorate for Higher Education and Skills, 2022). The action plan gives the modernisation of a shared student system a key role in the development of the digital foundation. Data from different systems must be open, i.e. accessible in a form and format that makes it possible to share the data with others and to collate them.

7.5 Summary of the Expert Group’s assessments

The research summarised by the Expert Group in the first interim report indicates a number of potential benefits of learning analytics for pupils and students, for teachers and instructors, and for those who have a responsibility to ensure that schools and education institutions offer high-quality education. One important pedagogical value of learning analytics involves having a clear and systematised insight into the academic development of pupils and students. It is difficult to determine the greatest value of learning analytics in a Norwegian context, but we would like to emphasise three areas that we believe are particularly relevant.

Firstly, we believe that learning analytics have a strong potential to enable pupils and students to gain greater insight into their own learning during their learning processes. As mentioned, this requires the collection of relevant data from the pupils’ and students’ work in their subjects, and that the data analyses are effectively communicated to pupils and students. Secondly, we have assessed that teachers and instructors would have much better opportunities to adapt their instruction if they have sufficient and relevant information with which to assess their own teaching. Learning analytics can assist with this. The third area where we believe learning analytics could, over time, have a significant potential for improving instruction and promoting learning is the work on quality development in schools and education institutions.

8 What pedagogical and ethical challenges are associated with learning analytics?

In the Expert Group’s first interim report, we explained the four primary dilemmas related to learning analytics:

  • teachers’ and instructors’ need for information about pupils and students to support learning, balanced against the protection of pupils’ and students’ data

  • how learning analytics affect the balance between learning through interaction and learning as an individualised process

  • the balance of centralised support and autonomy when drawing conclusions on learning analytics

  • the balance between the competence of teachers and instructors required by learning analytics and the actual competence of the education sector

Based on these discussions, the status description in chapter 3 and the input we have received during our work, we have identified a few areas where learning analytics either augment existing challenges or introduce new ones. In this chapter, we will point out specific pedagogical and ethical challenges that we believe must be addressed, as well as our assessment of these.

8.1 Restriction of content and working methods in instruction

One important pedagogical challenge discussed in the first interim report is the risk that learning analytics may contribute to a restriction of the content and working methods in education. To summarise, this involves concerns that learning analytics may lead to an increased use of individual work methods and less emphasis on the more exploratory and reflective parts of study subjects. The input we have received also points out that learning analytics will appear less relevant when there is a need to apply competence to more complex problem-solving tasks (Norwegian Association of Graduate Teachers, 2023).

Save the Children Norway (2023) has also questioned whether the extensive use of learning analytics in schools “may not be in line with what ‘Fagfornyelsen’ [the Curriculum Renewal] would entail – that pupils would have a greater opportunity to explore topics in more depth, experience a higher degree of participation, have a more practical approach to subjects, and work across subjects on topics” (p. 5). They also believe there is a risk that learning analyses may break with some of the values and principles in the core curriculum and with many of the competence aims stated in the subject curricula.

8.1.1 Restrictions to subjects and competence

The Expert Group notes that there is widespread concern that learning analytics could draw more attention to “what can be counted or measured”:

There is a major concern that learning analytics are being developed on a foundation that is too narrow to be measured or adapted to pupils’ learning. We should reflect on how the concept of competence in the LK20 National Curriculum will be addressed in learning analytics. How, for instance, will problem solving, critical thinking and other aspects be reflected in learning analytics? (Skolelederforbundet, 2022, p. 2) [Norwegian School Heads Association]

In Norwegian schools, competence is defined as “the ability to acquire and apply knowledge and skills in order to master challenges and solve problems in both known and unknown contexts and situations” (Norwegian Ministry of Education and Research, 2017, Chapter 2.2). Furthermore, it is emphasised that competence entails both understanding and the ability for reflection and critical thinking. Our experience with current tools and practices indicates that it is unlikely that information from learning analytics will be able to cover the breadth of the Norwegian concept of competence in the near future. We believe it is more likely that the analyses will provide information about pupils’ abilities to acquire and apply their skills and knowledge than about their abilities for reflection and critical thinking. At the same time, rapid developments in the field of artificial intelligence have made it more difficult to predict the potential areas of use for information from learning analytics in the near future. Certain skills and areas of knowledge would also be more suitable for learning analytics than others: “Also in individual subjects, we see that learning analytics may be sensible for certain partial subjects and not for others (e.g. in mathematics, where computational problems are fine, but it would be less appropriate for more complex problem solving)” (Norwegian Association of Graduate Teachers, 2022, p. 3).

Offering learning analytics is significantly easier in academic areas with content that can be divided into clear and measurable areas of knowledge and skills, and where an algorithm can determine whether an answer fits, than in more open academic areas that are better suited to exploration. In mathematics, several learning resources currently plan to use learning analytics, for instance within the four basic arithmetic operations, algebra and geometry. We have yet to see a meaningful interpretation of data associated with more inquiry-based areas of the subject, such as creative solutions to open-ended problems. This may also be related to the type of data that primarily forms the basis for learning analytics in today’s resources. Many digital teaching aids and resources currently contain a large number of closed problems. These are problems where all questions are followed by a limited number of pre-defined answers, such as multiple choice, or assignments where different words, numbers or images must be placed in a certain sequence. The Expert Group does not believe that closed problems are problematic in themselves, but it does see the need to continue to explore other methods of designing assignments and measuring competence. Using technology to practise simple skills and knowledge may also have an educational value, but this should not be given too much room in instruction.
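The following minimal sketch illustrates why closed problems lend themselves to automatic analysis: a pre-defined answer key makes scoring a simple comparison, whereas an open, exploratory response has no key to compare against. The function and the answers are invented for illustration.

```python
# Minimal illustration: scoring a closed (multiple-choice) item is trivial,
# which is why such data dominates today's learning analytics.
def score_closed_item(pupil_answer, answer_key):
    """Return 1 if the chosen alternative matches the key, otherwise 0."""
    return int(pupil_answer == answer_key)

print(score_closed_item("B", "B"))  # 1 - easily logged and aggregated
print(score_closed_item("C", "B"))  # 0

# An open task ("design a tiling pattern and explain its symmetry") has no
# such key, which is why current analytics rarely cover inquiry-based work.
```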

There is a market-driven development of learning technology in Norwegian education, something the Norwegian Privacy Commission emphasises in its report (NOU 2022: 11). This means that developers and suppliers will largely prioritise areas of a subject where it is both academically and technically simple to develop digital resources. The Norwegian Association of Graduate Teachers (2023) also points out that the freely accessible learning arena NDLA5 limits the selection of good learning resources in upper secondary instruction for market-related reasons. The Expert Group therefore believes that we need national schemes that stimulate the development and purchasing of digital learning resources that reflect the breadth of school subjects throughout the education pathway.

Furthermore, the Expert Group is concerned that current digital learning resources are nearly always offered in Norwegian Bokmål. The National Parents’ Committee for Primary and Secondary Education (2022b) shares the same concerns:

Many pupils in Norwegian schools are multilingual or have various challenges that entitle them to special education instruction. The situation regarding physical and digital teaching aids is challenging, as publishers and business models do not take such things into account. Nor do they consider statutory rights regarding Sámi, Bokmål, Nynorsk and universal design. What impact will language and linguistic variations have on our pupils with the utilisation of DLA (learning analytics)? (p. 2)

Save the Children Norway (2023) has also expressed concerns about this: “We are concerned that DLA (learning analytics) do not capture pupil diversity in a positive sense, and we question whether DLA may possibly reinforce differences among the pupils inappropriately. We would particularly like to emphasise Sámi pupils’ right to instruction in the Sámi language” (p. 14).

It has been well documented that the scope of and access to teaching aids and learning resources available in Sámi and Nynorsk are not good enough (Prop. 57 L (2022–2023)). It is therefore reasonable to assume that there is no real access to learning analytics in Sámi or Nynorsk. This is regrettable, given the requirements set out in the regulations and the rights of the pupils. Sections 6-2 and 6-3 of the current Education Act stipulate that:

In the Sámi districts, everyone of primary and secondary school age has the right to learn the Sámi language and the right to be taught in Sámi. Outside the Sámi districts, at least ten pupils in one municipality who wish to learn the Sámi language and be taught in Sámi have the right to such instruction as long as there are at least six pupils left in the group. […] Outside Sámi districts, Sámi pupils of compulsory school age have the right to instruction in Sámi. […] Sámi pupils in upper secondary schools have the right to instruction in Sámi.

Given these rights, there must be a sufficient amount of varied teaching aids and learning resources in the Sámi language. In its investigation of Sámi pupils’ right to learn Sámi and be taught in Sámi, the Office of the Auditor General concluded that the lack of Sámi teaching materials weakens the instruction offered to these pupils (Dokument 3:5 (2019–2020)).

With respect to Nynorsk, section 9 of the Education Act states that teaching aids must be available in both Bokmål and Nynorsk at the same time and for the same price. This is referred to as the parallelism requirement. In the consultation on the new Education Act, many consultation bodies argued that digital resources other than teaching aids, such as writing programs, should also be covered by the parallelism requirement. In the legislative proposal for the new Education Act, the Ministry proposes to maintain that the parallelism requirement should only cover that which can be defined as a teaching aid. However, it also proposes a new rule that writing programs must support both Bokmål and Nynorsk (Prop. 57 L (2022–2023)).

The Expert Group believes that special attention must be paid to the development of learning resources in both Sámi and Nynorsk to ensure that schools are able to fulfil the statutory rights of the pupils.

8.1.2 Less varied and more individualised working methods

Throughout their education, pupils and students will encounter varied and inquiry-based working methods. Exploratory or inquiry-based learning and competence have also been given significant emphasis in many subjects in the National Curriculum LK20/LK20S. The Expert Group finds it difficult to see how learning analytics can strengthen an inquiry-based approach to learning with the resources and digital teaching practices we have today. It is essential that the scope for varied and inquiry-based working methods is not reduced in practice by letting the possibility of collecting data for learning analytics determine the types of learning activities pupils are offered and participate in. The Norwegian Association of Graduate Teachers (2023) has posed a question on this particular point: “How can we facilitate a sensible use of learning analytics without simultaneously implying, for instance, guidelines suggesting that we choose digital rather than analogue teaching aids?” (p. 2).

We have seen such unintended changes in a Norwegian context before with respect to digitalisation. The increase in individual work in Norwegian classrooms (Gilje et al., 2020) is not the result of intended change, but rather of one-to-one device access making it more “natural” for each pupil to log in with their own username and work individually. Varied working methods also mean that much learning must take place through interaction with others. The Norwegian Association of Graduate Teachers (2023) asks: “Will the opportunity for digital learning analyses lead to an increased use of digital, and especially adaptive teaching aids, and what may this increased use of adaptivity do to the learning community of the classroom?” (p. 3). If learning analytics are to counteract the predominance of individual working methods, of which we have seen signs in the fully digital classroom, this would place substantial demands on both digital teaching practices and learning resources (Blikstad-Balas and Klette, 2020).

The opportunity to choose freely between different academic resources is currently not particularly widespread in primary and secondary education, the vocational college sector, or the university and university college sector. In primary and secondary schools, it is becoming increasingly common to purchase licenses for entire “package solutions” in which a supplier delivers resources for all relevant subjects (Rambøll, 2023). The Expert Group believes that this is a solution which, at best, safeguards comprehensiveness at the expense of flexibility. We believe that teachers and instructors should be able to choose freely from a broader range of resources in order to increase the opportunities for local adaptations and variation. This would require teachers and instructors to have the time, competence and capacity to learn how to use them. It would also require that sufficient information about how a digital resource has been developed is made available to them.

8.2 Links between learning analytics and the National Curriculum

The Expert Group has received several suggestions indicating that there must be a clear link between learning analytics and the National Curriculum if learning analytics are to have real value in primary and secondary education. This was also clearly demonstrated in the assessment of barriers to learning analytics in primary and secondary education (see Chapter 3.4.4). In order for digital teaching materials and analyses to function adequately as a basis for teachers’ decision making, teachers must know which parts of the National Curriculum the different resources are intended to help develop. A broad range of resources is available in today’s market. However, it is not always clear how well the various resources harmonise with the National Curriculum or the values of Norwegian schools. Even in Norwegian-produced digital teaching aids tailored to specific levels and subjects, it can be challenging to know which parts of the National Curriculum the resources are meant to contribute to. There are many connections within the National Curriculum, both between the core curriculum and the subject curricula, and between the subject curricula themselves. Linking digital resources to the National Curriculum is therefore a complex process.

We fully understand why teachers and school leaders wish to know which parts of the National Curriculum a resource with a functionality for learning analytics is aimed at before they begin using it. It remains unanswered exactly who will do the work of ensuring coherence between the National Curriculum and the digital resources, and whether it is at all possible to link all digital resources to competence aims. We also note the latitude teachers and local actors have to interpret these links in the National Curriculum and to decide how the schools wish to put this into practice.

8.2.1 Link to competence aims

It has long been common for teaching aids in Norwegian schools to signal the subject, year and academic topics they profess to cover (Askeland et al., 2013; Tønnessen, 2013). Several teaching aids developed after the introduction of the National Curriculum for Knowledge Promotion in Primary and Secondary Education and Training (Kunnskapsløftet – LK06/LK06S) have also included the various competence aims in the National Curriculum and linked these, for example, to different chapters in the teaching aids. Textbooks have therefore long been viewed as an interpretation of the “curriculum’s perspective on subjects and knowledge” (Tønnessen, 2013, p. 149).

When applying for a grant from the Norwegian Directorate for Education and Training to develop teaching aids, there is a requirement that the teaching aids be developed for use in instruction and cover all or parts of the competence aims stated in the National Curriculum. This is because teaching aids are defined in the Education Act as materials that cover significant parts of the subject curriculum. In this sense, it is reasonable to expect that digital teaching aids in a given subject also give an indication of the competence in the subject that the supplier believes the digital resource will help to develop. The Expert Group believes it is important for school owners and schools to be given information that provides a basis for selecting and using teaching aids. At the same time, we also see certain issues in linking parts of digital teaching aids – specifically where there is a functionality for learning analytics – to specific parts of subject curricula or competence aims.

First, given the broad concept of competence stated in the National Curriculum, no single method for measuring and developing pupils’ knowledge and skills can be assumed to be sufficient. A supplier will therefore seldom deliver more than a few concrete suggestions for how one or more competence aims can be measured or worked on. Competence aims should function as targets for a competence that pupils should have the opportunity to develop over time, and pupils develop different knowledge and skills along the way towards reaching these objectives. They are therefore not things that can simply be ticked off a list as each is completed. We believe there is a risk that the sector will continue to view them in this way if individual modules of resources with a functionality for learning analytics are directly linked to specific competence aims. In addition, the competence aims for a subject are related to each other and to the introductory parts of the curriculum. It is thus beneficial for pupils’ learning to work in larger contexts rather than on one individual competence aim after another.

Second, there would be a need to break up the competence aims into smaller and more fragmented units in order to create a structure suited to smaller modules of the teaching aid. This would narrow the understanding of the curriculum and could lead to splitting instruction in the subject into more parts than is desirable.

The Expert Group believes that instead of linking parts of the teaching materials and analyses to specific competence aims, suppliers should describe the areas of knowledge and skills in a subject that the analytics in question could help develop. We believe it is essential to develop ways of linking learning analytics to content that promote rather than narrow teachers’ didactic reflection and their work on the what, how and why of learning. As with all other learning resources brought into the classroom, teachers must always determine which specific competence aims the learning resource is relevant for. This way of linking learning analytics to content would also be more flexible with respect to future changes to the National Curriculum for primary and secondary education.

The Expert Group specifies that this challenge applies to how the link between the content of the teaching aid and the competence aims is presented to the user, and not to the use of metadata that supports the technological functionality of the solution.
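
To illustrate the kind of description we have in mind, the sketch below (written in Python) shows how a supplier could declare, in machine-readable form, which areas of knowledge and skills a module addresses instead of tying it to specific competence aims. This is a minimal, hypothetical example; all field names and values are assumptions made for illustration and do not refer to any existing Norwegian metadata standard or product.

# Hypothetical, illustrative metadata for one module of a digital teaching aid.
# Every field name is an assumption made for this example, not an existing standard.
module_description = {
    "module_id": "fractions-unit-3",
    "subject": "mathematics",
    "year_levels": [5, 6, 7],
    # Areas of knowledge and skills the analytics can plausibly inform ...
    "knowledge_and_skill_areas": [
        "comparing and ordering fractions",
        "converting between fractions and decimals",
    ],
    # ... and aspects of competence the module does not measure, which are left
    # to the teacher's professional judgement.
    "not_measured": [
        "reasoning and argumentation",
        "exploratory problem solving",
    ],
    # Whether the data is intended as support for adapting instruction or as
    # documentation of competence (cf. Chapter 8.4).
    "intended_use": "decision_support",
}

A description of this kind leaves the link to specific competence aims to the teacher’s professional judgement, while still giving schools and school owners a realistic picture of what the module does and does not cover.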

8.2.2 Commercial considerations

Furthermore, there is the question of whether commercial providers should be permitted to define which parts of their own services they believe correspond to different parts of the National Curriculum. Although we fully understand that the sector would prefer such overviews, we believe there is reason to warn against giving commercial actors even greater power to define such aspects for Norwegian schools than they already have. Suppliers are focused not only on pupils’ learning, but also on whether developing various types of technology is actually profitable in a market under pressure. We believe there is a real danger that the monopoly tendencies we have already seen in digital instruction will intensify if all suppliers must link all content to the National Curriculum. It can quickly become worthwhile to link as much content as possible to give the impression of offering a “complete package solution” – a tendency that is already evident among the major suppliers, several of whom have promised to give teachers “everything they need for the new curricula”, and the like. For a smaller supplier who may deliver a good product that is only relevant for limited areas of a school subject, this type of linkage could be a disadvantage. The Expert Group believes that suppliers should be required to provide a realistic description of the relevance of their resources to pupils’ development of knowledge and skills in a subject.

8.3 Academic freedom and decisions on learning analytics

Systematic learning analytics require adaptations to be made at a more central level than by individual teachers. At the same time, it must be possible to adapt the use of learning analytics to the nature of the subject, professional judgement and local conditions. This is where difficulties may arise. Since academic and pedagogical freedom is regulated differently and the sectors are structured differently, the assessments will also differ between primary and secondary education and higher education.

8.3.1 Academic freedom in higher education

In higher education, academic freedom remains a strong principle. Academic freedom implies freedom in the teaching role, but with corresponding responsibilities. Section 1-5, fourth paragraph of the Universities and University Colleges Act stipulates that the teacher or instructor has an independent academic responsibility for the content and structure of the instruction within the frameworks set by the institution. The long-term plan for research and higher education specifies that academic freedom does not pose an obstacle to developing measures at the authority level to realise education policy objectives.

NOU 2020: 3 Ny lov om universiteter og høyskoler [NOU 2020: 3 The new Act relating to Universities and University Colleges] summarises a few restrictions on teachers’ academic freedom as follows: “Teaching must, however, be designed to lead students towards the applicable exam or degree. A study plan may also place restrictions on the free choice in the instruction. Teachers will regardless have freedom with respect to the presentation of materials and perspectives” (p. 128).

Deciding which learning analytics tools should be made available to teachers would therefore lie within the institution’s frameworks. Today, most institutions have also determined which digital learning platforms teachers must use in their instruction, and these often include some functionality for learning analytics. Apart from requirements to use certain learning platforms, the Expert Group is not aware of institutions requiring teachers to use these platforms for learning analytics or to adopt learning analytics functionality in other systems. This indicates a practice whereby teachers and instructors make independent decisions within the frameworks set by the institutions.

The Expert Group believes that the way in which each teacher or instructor performs learning analytics within these frameworks falls under their academic freedom and responsibilities. This entails assessing which academic presentations and working methods are most appropriate for a given course. However, relevant competence is required to assess the learning analytics. Universities Norway (2023) sees the need for collaboration in the sector on teaching resources and on information about the possibilities, limitations and risks of learning analytics. The organisation points out that if common guidelines are to be drawn up, the sector must be involved, e.g. through Universities Norway’s bodies.

8.3.2 Methodological freedom in primary and secondary education

The current competence-based National Curriculum has not set guidelines for specific working or teaching methods (Norwegian Ministry of Education and Research, 2017). The way in which teachers plan and conduct their instruction is therefore a professional responsibility.

NOU 2015: 8, which is the foundation for the revision of the National Curriculum in 2020, specifies that the freedom to choose methods must be based on well-founded and research-based decisions:

[…] teachers have a professional responsibility to choose subject content, ways of working and organisation that are based on research relevant for pupils’ learning and adapted to the particular group of pupils. This means that teachers’ professional autonomy involves a responsibility for making well-reasoned and research-based choices of methods and approaches in their teaching. (p. 78)

The Expert Group notes that the teaching professions are adamant that learning analytics must fall under professional autonomy: “Teachers must themselves be able to choose the situations in which they want to incorporate learning analytics as part of the planning work for the instruction, based on an assessment of subjects and learning situations to which this would provide real added value, and for which groups of pupils” (Norwegian Association of Graduate Teachers, 2023, p. 2).

We emphasise the assessment from the first interim report that the decision to use resources with a functionality for learning analytics should be made as close as possible to where the instruction takes place. At the same time, we realise that there is a high demand among teachers for guidance and support in making these pedagogical decisions. Many teachers have expressed uncertainty about how learning analytics work, when and how they would be appropriate for teaching and learning, and what is needed to ensure that learning analytics are responsible. The Expert Group believes there is a strong need for a better overview of the resources with a functionality for learning analytics that are available for primary and secondary school instruction, and for good support structures for assessing the relevance and quality of the resources. We emphasise that it is essential to have national frameworks that ensure that all learning analytics in Norwegian schools are performed in accordance with legal and ethical frameworks that protect the privacy of pupils to the greatest extent possible. In addition, teachers must have the necessary competence to make good pedagogical decisions on learning analytics.

8.4 When decision support becomes the decision

In the first interim report, we asked whether information from learning analytics could be perceived as more authoritative than other information that teachers have about pupils’ and students’ learning. We are concerned that digital presentations and visualisations based on automatic calculations may have too great an influence on pedagogical decisions. This concern was also expressed in the input from the sector: “Experience has shown that teachers become more passive and place too much trust in data/results obtained from digital learning” (Møre og Romsdal County authority, 2022, p. 1).

The Norwegian Data Protection Authority (2022b) has also noted the risk that learning analytics can become an automatic decision-making system, even if it is not intended as such:

Another risk is that the system in practice could be used as an automatic decision-making system, even if it is not intended as such: In other words, some teachers may accept recommendations from the system without making independent assessments. This may be due to a heavy workload, insufficient knowledge of the algorithm, insufficient insight into how the system works, etc. We can also imagine a situation where recommendations from the system are perceived to be so good that teachers do not feel they can disprove the system. (Chapter 4)

For learning analytics, this issue is particularly relevant in the delineation between decisions on adapting instruction and decisions on pupil and student learning. Below, we describe two examples to illustrate how information from learning analytics may function as a decision rather than as decision support.

8.4.1 Decision support becomes a measurement of competence

Let us take a hypothetical example: A class in year 9 uses a fully digital learning resource for the subject of Norwegian, which includes a multiple-choice test for the genre of short stories. If the teacher uses the results systematically and looks at how continued instruction and the follow-up of an individual pupil should be adapted, this would be in accordance with good learning analytics used for decision support. For the teacher, the test may have contributed important information about what pupils have learned about the genre of short stories, as well as what the teacher needs to spend more time on before the pupils can write or analyse their own short stories. Here, learning analytics have supported the teacher in their future work. Learning analytics alone have not “decided” anything, but rather provided information about future decisions. Yet we can also imagine another possible use of the same data.

If the teacher, months later, when setting a half-year assessment or coursework grade in the subject of Norwegian, goes back and uses the results of the test for a specific pupil as documentation of the pupil’s competence in the short story genre, the analysis is instead used as a decision regarding the pupil’s competence at a specific point in time. In this case, the test score does not inform future work. Instead, it has become a measurement of what the pupil knows or does not know about short stories.

The Expert Group believes there is reason for concern that information from learning analytics can become a decision rather than support for the teacher. A challenge here is that a number of digital resources today offer results and overviews of pupils’ academic answers without making clear whether these are intended as a basis for adapting the instruction or as a measurement of what the pupil has done. The problem is reinforced by several suppliers suggesting that they can provide an “overview” of what pupils know, understand, need to practice, or similar formulations.

The Expert Group also believes it is important for pupils and students to be aware of whether the data collected on their learning is meant to provide insight into their learning processes and their academic progression along the way, or whether the data will be used to document their competence. In order to provide this information, suppliers must state which of the two objectives the resource will help to achieve.

8.4.2 Narrow analyses are broadly interpreted

Another example of how learning analytics may be given too much emphasis: if a pupil repeatedly displays an average achievement of objectives in various parts of a digital teaching aid in a subject, there is a risk that the pupil will not be assessed as anything other than average in this subject. This is in itself problematic, as no teaching aid can measure all aspects of a subject. Even if the digital teaching aid were meant to cover all aspects of the subject curriculum, it would still be problematic to lean on information from the learning analytics for decision making, as pupils have the right to demonstrate their competence in several different ways6. When learning analytics form the basis for an assessment, the principles of fairness and accuracy in the General Data Protection Regulation must be complied with to ensure that pupils’ and students’ overall competence is assessed correctly.

In order for learning analytics to function as intended, teachers and instructors must have sufficient prerequisites to assess the information from the analyses and draw independent conclusions based on their professional and academic judgement. Among other things, this requires suppliers to make information available about how the factors being measured form part of a greater whole. The Union of Education Norway (2022) believes it is especially important for teachers to have the prerequisites for assessing adaptive teaching aids: “It is […] essential that teachers are enabled to assess functionality and the database/data sources for each adaptive teaching aid in order to determine what the teaching aid says about a pupil’s/student’s academic level” (p. 1).

8.5 Data that is irrelevant, misleading or difficult to interpret

Through the use of digital learning resources, data is collected on pupils and students that may be irrelevant, misleading or difficult to interpret. If learning analytics are to be accurate and relevant, it is necessary to have data that provides information about learning and analyses that draw the correct conclusions. In our discussions in the first interim report, we mentioned examples of mechanisms that challenge these conditions, such as collaboration using shared logins, as well as pupils and students who manipulate the systems.

In this context, we are concerned with the quality of the data, and not necessarily the quantity. It is important to emphasise that although the amount of data may be relevant for performing good learning analytics – especially where machine learning is involved – more data does not in itself imply a higher quality analysis. The supplier Neddy (2023) mentions this in its input to the Expert Group:

Yes, learning analytics require a continuous collection of activity data, but do good learning analytics really require large amounts of data? We believe that this discussion must primarily focus on what can be described as high-quality activity data, i.e. more accurate data, and whether this would increase the quality of the analysis. What if we primarily think qualitatively and not quantitatively here, before concluding that more data equals better insight? (p. 16).

In the following, we have highlighted certain aspects that give cause for concern about the quality of data and analyses. All aspects are also scenarios that indicate a risk of failing to comply with the principles of fairness and accuracy in the data protection legislation.

8.5.1 Data as a basis for uncertain inferences about learning

One example of data that is difficult to interpret is data on time expenditure, which may tell us that a student has spent an unusual amount of time on an assignment. This may imply that the student found the assignment difficult. However, there may also be other explanations. For instance, the student may have taken a break and done something else while the time was automatically being measured, or the student may have found the assignment so interesting that they decided to look up other resources to learn more about it. Pupils have also expressed this in their discussions with the Expert Group: “It’s okay that you can see what we’ve answered, but not how much time we spent on it. We may make mistakes, take breaks, and so on. So that would be completely wrong” (pupil in year 9).

Similarly, it is difficult to determine what the number of attempts a pupil needed to solve a problem correctly says about the pupil’s learning. Pupils the Expert Group has spoken with explain that they may have understood and used a calculation strategy correctly but still repeatedly got a wrong answer because of a small calculation error. This is a good example of something any mathematics teacher would notice immediately, but that a machine would not necessarily identify. It illustrates the necessity of having a teacher close at hand when pupils are working in such systems.

The problem here is not that the data provides incorrect information, but rather that it is difficult to arrive at good conclusions about learning based on the data, unless such a relationship has been demonstrated through research and testing. The Expert Group believes that there is a need for a thorough pedagogical discussion on which data should be included in learning analytics, and what conclusions could be drawn on the basis of the analytics in each case. This requires a high degree of transparency from suppliers and developers of resources with functionalities for learning analytics. Teachers and instructors must take an active role in the use of learning analytics to reduce the risk of incorrect conclusions. We emphasise that responsible learning analytics would primarily involve a teacher’s professional assessment of the conclusions before they have pedagogical consequences.

8.5.2 Data as a basis for incorrect conclusions

Examples of misleading data include several pupils working together to solve problems under the same login, or pupils using problem-solving strategies that result in misleading data. The Norwegian Association of Graduate Teachers (2023) mentions this in its input to the Expert Group: “Pupils log in with each other’s usernames or they help each other, which leads to incorrect data. Some of the assignments are designed such that pupils can randomly press a key until they get it right” (p. 2).

We have also received input from parents who say that pupils use strategies for assignments that do not reflect their academic competence. For instance, they may identify a pattern in how the correct answers are formulated, or they may intentionally give wrong answers to get easier problems. We are also aware of instances where pupils using resources with strong reward systems begin collecting as many points as they can rather than solving the assignment problems.

Adaptive teaching aids adjust according to pupils’ earlier answers. If a pupil has received help or has collaborated with others when working with the teaching aid, future assignments may have an artificially high level of difficulty. The level will gradually readjust itself to the pupil’s level if the pupil continues working alone. However, the pattern of the pupil’s answers may be interpreted incorrectly by the algorithms of the teaching aid. This is especially applicable to younger pupils who receive help from their parents at home. The Norwegian Union of School Employees (2023) asks: “And when the pupils work together or get help at home, what answers do they then get from the adaptive assignments?” (p. 1).

To avoid wrong conclusions, it is important that those who follow up the results of learning analytics know what type of data was used and the context in which it was collected. The Expert Group believes there is reason to be critical of the results of the analytics if adaptive teaching aids are used for collaborative assignments with a shared login. We believe that adaptive models must take into account that pupils may work together and receive help. The system must therefore quickly “reset”, and sequences with strong performances must be valued and viewed in the context of the pupil’s progression as a whole. For example, achievement peaks could show what a pupil is capable of when receiving help, as opposed to what they can achieve on their own. In this way, such achievement peaks can provide information about a pupil’s potential.
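
To make this concrete, the sketch below (in Python) illustrates, under simple assumptions of our own, how an adaptive model could give less weight to answer streaks that are likely to reflect help or collaboration, “reset” quickly when the pupil again works alone, and still record achievement peaks as information about potential. The function names, parameters and thresholds are hypothetical and do not describe any existing teaching aid.

# Minimal, hypothetical sketch of an adaptive level estimate that discounts
# suspected assisted work. Nothing here mirrors a specific product; the
# parameters and the streak heuristic are illustrative assumptions only.

def update_level(level: float, correct: bool, suspected_assisted: bool,
                 up: float = 0.05, down: float = 0.15) -> float:
    """Move the estimated level up slowly and down quickly ("quick reset")."""
    if correct:
        # Give less weight to correct answers that may reflect help at home
        # or collaboration under a shared login.
        weight = up * (0.3 if suspected_assisted else 1.0)
        return min(1.0, level + weight * (1.0 - level))
    return max(0.0, level - down * level)


def detect_peak(recent_answers: list[bool], threshold: int = 5) -> bool:
    """Flag an unusually long streak of correct answers as a possible
    achievement peak: information about potential, not a new baseline."""
    streak = 0
    for answer in reversed(recent_answers):
        if not answer:
            break
        streak += 1
    return streak >= threshold


# Example: a run of correct answers given with help should not lock the pupil
# into an artificially high level once they continue working alone.
level = 0.4
history: list[bool] = []
for correct, assisted in [(True, True)] * 6 + [(False, False)] * 3:
    history.append(correct)
    level = update_level(level, correct, suspected_assisted=assisted)
    if detect_peak(history):
        print("possible achievement peak: shows potential when receiving help")
print("estimated level after working alone again:", round(level, 2))

The point of the sketch is the asymmetry: assisted performances carry less weight in the running estimate, while long streaks are flagged separately as potential rather than silently raising the difficulty level.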

8.5.3 Does the data really provide information about learning?

The last category of data we must address is irrelevant data. Large amounts of data with uncertain pedagogical value are collected from pupils’ and students’ learning situations for learning analytics. The supplier Neddy (2023) has posed some good questions on the matter:

Today, we see that information about when and for how long an activity is performed is used as an indication of learning outcome. Furthermore, information that is easy to manipulate is used as a basis for personal adaptations. Why is this information currently being used for learning analytics? Is it tradition that is holding us back in this field? Perhaps such data points are not good data points for learning analytics (p. 20).

Time data on when and for how long a pupil or student has been working is easy to dismiss as irrelevant for learning analytics – and one could ask why it is collected at all. Many pupils have also expressed that they feel uncomfortable knowing that teachers have access to information about what time of day they do their schoolwork: “I don’t think it’s okay for teachers to see when you’re working. The most important thing is to hand in your homework, so it doesn’t really matter when you do it” (pupil, year 7).

Pupils and students have also expressed concerns that access to time data may in some places lead to an unwritten “absence limit” on learning platforms, which could conceivably affect their relationship with a teacher or instructor – and, in the worst case, also play a role in the academic assessment of the pupil or student.

The Expert Group believes that data on what time of day a pupil or student does their work has limited pedagogical value and should therefore not be collected for the purpose of learning analytics. When information about pupils and students is not relevant for pedagogical or administrative reasons, and pupils also express concerns about this information being available, processing the data is also highly problematic from a privacy perspective.

8.6 Monitoring and increased stress

The Expert Group devoted considerable space in the first interim report to the dilemma between the need for data and the need for data protection. One of the challenges brought up in this discussion was the discomfort many pupils and students experience when an education institution collects, shares and analyses data from their activities on digital devices. Such information flows may affect the relationship and trust between a teacher or instructor and a pupil or student. Many pupils and students have expressed that continuous data collection gives them a feeling of being under constant evaluation. This leads to greater pressure and stress for many young people. Save the Children Norway (2023) has called for a public debate on this issue:

Save the Children Norway calls for a public debate on the use of personal data for learning and development in schools, which is not only linked to sectoral goals for learning outcomes, but also raises more ethical questions about children’s autonomy, their right to privacy, and their right to speak, write and play freely without having their activities registered, shared and used, even if this could be beneficial for pupils’ learning and at the aggregate level for the education sector or other public services (p. 4).

8.6.1 Fear of trial and error

Pupils and students should feel secure in being able to try things, even when they are uncertain as to whether they will succeed. They should feel willing to tackle challenging tasks as this can enhance learning – even if they make a few mistakes along the way (Kapur, 2008). When traces of the learning process are saved, this may make some pupils unwilling to try and fail, and they may feel uneasy knowing the teacher has access to this information:

I think it’s difficult knowing the teacher will see everything. We feel insecure and afraid of making mistakes. You should have the chance to make mistakes without someone seeing them. You may understand it later, and then maybe the teacher won’t notice. It’s like if you were to hand in a draft every time you write something. Then the teacher can see everything you’ve done wrong before you figure it out. What could happen is that teachers don’t focus on your end product, but instead on the mistakes you make along the way (pupil, year 9).

In the interim report, we discussed the risk that the willingness of pupils and students to try and fail may be affected by having data from digital learning activities saved and used for learning analytics. The Expert Group notes that there are concerns in the sector that learning analytics may lead to changes in pupils’ behaviour for this reason. The Norwegian Union of School Employees (2023) gives examples of pupils who have written down vocabulary words from a language-learning app instead of using the app itself to practice – because they did not want the teacher to see when and how often they practiced for a vocabulary test. In a privacy context, this type of altered behaviour is referred to as a chilling effect:

The fear of being monitored, or that someone will use information about us for purposes we are unaware of, may lead to changes in our behaviour. When we lose control over who knows what about us, we are forced to consider the uncertainty factor. As a result, we may begin to think through and re-evaluate what we write, who we are in contact with and what we do. This self-regulation due to fear of surveillance is called a “chilling effect”. (Norwegian Data Protection Authority, 2020a, Chapter 7)

In order to counteract the risk that pupils will change their behaviour when their personal data is used for learning analytics, national authorities must stimulate privacy-promoting functionality in the resources. Furthermore, such use must be part of an assessment practice that contributes to a safe and inclusive learning environment. We emphasise that pupils and students must have the opportunity to try and fail during their learning processes without schools and education institutions saving this data and using it for learning analytics. It is crucial that schools, vocational colleges and higher education institutions are aware of this responsibility.

8.6.2 The experience of constant assessment

Viken Youth Council (2023) is concerned that learning analytics may contribute to increased pressure on young people if they feel they are in a situation of constant assessment that does not give the full picture of their competence. The Norwegian Union of School Employees (2023) shares its experiences: “When they [pupils] never know exactly when they are being assessed, or if they feel that they are being assessed all the time, this creates an unnecessary level of stress” (p. 1). Save the Children Norway (2023) formulates this as follows:

There must be an evaluation of which parts of the pupils’ work can be assessed and which parts do not need to be assessed. With the introduction of digital teaching aids and tablets in Norwegian schools, pupils may feel as though they are always being assessed. Pupils need a break from these continuous assessments. It is important for all pupils to have the courage to express their opinions and participate in the exchange of opinions, and to explore various dilemmas in a secure community with their teachers and fellow pupils (p. 13).

The National Curriculum points out that “schools and teachers must balance the need for good information about pupils’ learning and the unwanted consequences of various assessment situations” (Norwegian Ministry of Education and Research, 2017, Chapter 3.2). According to the National Curriculum, the unfortunate use of assessment may prevent the development of a good learning environment and may negatively affect the self-confidence of certain pupils. Concerns about systems “remembering all mistakes” were well articulated by young people at an input meeting the Expert Group held with Sentralt ungdomsråd i Oslo [Umbrella Youth Council of Oslo]: “If you say something stupid during a class, the teacher will forget about it. It’s a lot worse if these mistakes are saved so that the teacher never forgets stupid mistakes”.

This comment contains an important point about the difference between humans and machines. A teacher can systematically disregard mistakes that are of no importance, interpret context (e.g. understanding that a pupil is having a particularly difficult time just before a test), and distinguish between significant and insignificant indications of what a pupil is capable of. This ability is also part of the reason for the system of coursework grades in primary and secondary education and training, where subject teachers are given substantial autonomy precisely so that they can emphasise what they believe is significant information about pupils’ competence. It is essential that teachers take a wise approach to how they weigh information from learning analytics, and that there are always people involved who can safeguard all pedagogical aspects of the instruction. This approach must also be made clear to pupils, so that they do not feel that “everything they do” can be used against them when grades are determined.

Nor should students in higher education and tertiary vocational education have to worry about whether data collected today – especially data obtained from their activities on learning platforms – will affect them negatively during their education or in the future. In the vocational college sector, student associations have expressed concerns about the close connections between study programmes and industry. The National Union of Students in Higher Vocational Education and Training in Norway (2022) points out that students may find it invasive when analytics from their use of the learning platform are followed up by the education institution. Students fear that information about their learning activities and behaviour may be passed on to the labour market and affect their future recruitment and employment relationships.

The Expert Group believes that a thorough assessment of principle is needed to determine what types of data can be used for learning analytics. More knowledge is also needed on how continuous data collection and analytics may impact pupils and students. Here, we see a clear need for more practice-based research on the extent of learning analytics and the actual impact that the collection, analysis and further use of data have on pupils’ and students’ experience of teaching and instruction, and on their learning outcomes.

8.7 Random and planless use of data

One of the main challenges noted by the Expert Group is that large amounts of data are currently being collected from digital learning resources for undetermined educational purposes. Collecting data for learning analytics without a plan for how the information will be used to follow up pupils and students is questionable for pedagogical, ethical and privacy reasons. Much of the input we have received stresses that there is little systematic learning analytics in today’s primary and secondary education, higher education or tertiary vocational education. In the survey of learning analytics in primary and secondary education and higher education, we have seen that current practice appears to be ad hoc and unmethodical (Rambøll, 2023). Nor have surveys identified municipalities that are working systematically with the functionality of learning analytics (FIKS, 2023).

8.7.1 Run by enthusiasts

Random and planless practices with learning analytics may be due to the problem that their use relies entirely on so-called digital enthusiasts, something we also problematised in the first interim report. Primary and secondary education has long relied on teachers with a particularly strong interest in digital competence and specific digital tools (Egeberg et al., 2017; Gudmundsdottir and Hatlevik, 2018). How teachers incorporate digital technology in their instruction, and to what extent, has in many cases been up to each individual teacher. This is unfortunate, as it leads to inequalities in instruction, with varying opportunities for pupils to develop their digital competence.

There is a similar issue with respect to learning analytics. As Rambøll (2023) summarises, learning analytics appear to be “entirely dependent on teachers and school leaders with a special interest in these tools” (p. 14). Several informants have stated that they must use their leisure time to find and learn about relevant resources, and to develop teaching plans in which the resources can best be employed. We emphasise that good, systematic learning analytics require sufficient competence, time and capacity to become familiar with the use of these resources.

8.7.2 Lack of knowledge

One of the reasons for the random practices with learning analytics may be a lack of knowledge in the sector. The Expert Group has noticed that the term ‘learning analytics’, and what it entails, is unfamiliar to many: “Currently, the use of learning analytics is not particularly systematic or evidence based. Many may not know what it is” (Skolelederforbundet [Norwegian School Heads Association], 2022, p. 1).

As Universities Norway (2023) points out, the low prevalence of learning analytics may be due both to a lack of knowledge and to limited access to tools and opportunities. It is understandable that little knowledge results in little use, and this is not necessarily a problem in itself. The issue is that the existing use is not systematic, which makes it difficult to build a good evidence and knowledge base on learning analytics. Another challenge is that resources with a functionality for learning analytics are already collecting large amounts of data that are not used for good pedagogical purposes. The Expert Group believes there is a strong need for a clear and explicit framework to determine what types of information can be collected, and for what purposes. We also believe that teachers and instructors need greater competence in learning analytics to ensure that they are sufficiently able to make pedagogical decisions based on them.

8.8 Summary of the Expert Group’s assessments

Research and input from various actors indicate several barriers to good learning analytics. We are especially concerned when pupils and students say that learning analytics make them feel as though they are constantly being monitored and that they do not have enough freedom to make mistakes. We also believe there is a danger that the resources used to collect and analyse data could themselves lead to more individual work, and that this could narrow down the subject content and working methods.

Another issue is the interpretation of the analytics. It is highly problematic that much of the data collected for learning analytics has an ambiguous pedagogical value and can be misleading or difficult to interpret. This problem is exacerbated by the fact that analytics intended as decision support for teachers and instructors are in many cases perceived as authoritative and become decisions instead. We have assessed that although research does indicate promising pedagogical advantages of learning analytics, today’s schools and education institutions are still far from a practice in which the advantages outweigh the disadvantages.

9 Participation in learning analytics

Pupils and students have the right to participate in matters that concern them. This is emphasised in several current regulations and is considered a key value of the Norwegian education system. Learning analytics use digital traces from pupils and students as a source and may provide important insight into their learning. Both aspects of learning analytics concern pupils and students to a large extent.

On the one hand, learning analytics can create better conditions for the participation of pupils and students, partly by increasing their insight into their own learning, which can make them better equipped to answer questions about learning and teaching practices. On the other hand, learning analytics can also make it more challenging for pupils and students to participate and contribute if they are not given the opportunity to gain such insight and an understanding of how this concerns them. In this chapter, the Expert Group will describe the right to and experience of participation in education, and we will cite the input we have received from pupils, parents, students and their representatives. We will also assess how participation should be safeguarded in learning analytics.

9.1 Pupil participation

Pupils have the right of participation according to several current regulations, such as the Norwegian Constitution, the Education Act and the Independent Schools Act. A pupil’s capacity to safeguard their own interests and to participate develops with age. This applies both to their own learning and to privacy issues (UN Committee on the Rights of the Child, 2021). The Education Act and its regulations determine frameworks and provide guidelines for pupil participation that apply to all teaching and instruction. According to the National Curriculum, pupils shall have the opportunity to participate in decisions regarding their own learning, and to participate actively in assessing their own work, their own competence, and their academic and social development. Schools shall promote a commitment to democratic values, and pupils shall experience democracy in practice through their schooling. This means that they must have influence and be able to have a say in matters that concern them through various forms of participation and contribution.

Knowledge of pupil participation is limited. Utvalget for kvalitetsutvikling i skolen [The Committee for Quality Development in Schools] writes in its knowledge base:

Despite several decades with a greater focus on the participation of children and young people, there is limited research on pupil participation in general, and on pupil democracy and pupils’ individual participation in Norwegian schools. For instance, a recently published research summary on youth participation in Norwegian municipalities indicates that there is little peer-reviewed research on this topic, and that young people are rarely given opportunities to voice their opinions. (NOU 2023: 1, p. 156)

A report prepared on behalf of the Norwegian Association of Local and Regional Authorities (KS) and the School Student Union of Norway states that those who have successfully facilitated pupil participation understand that pupil participation is about more than simply making decisions (Faannessen et al., 2022). They recognise that participation is a key part of pupils’ learning and understanding of democracy, and they know that secure relationships are a prerequisite. By facilitating processes to involve pupils and to develop competence in the area of pupil participation among teachers, leaders and school owners, they are also creating a broader and deeper insight into how their own practices must be adjusted according to their developmental needs.

Experience of participation

The Pupil Survey7 is an annual web-based survey on pupils’ school and learning environments. It is conducted at all primary and lower secondary schools in years 7 and 10, and in the first year of upper secondary school. The purpose of the survey is to give pupils the chance to express their opinions on learning and well-being at their schools. The results are used by schools, school owners and national education authorities to improve the schools. The results of the 2022 Pupil Survey show that year 7 pupils express, to a much larger extent than year 10 pupils, that they are able to help decide how they will work on different subjects, that their teachers facilitate pupil council work, and that the school listens to pupils’ suggestions.8 Pupils in the first year of upper secondary school express a higher degree of co-determination on the same questions than year 10 pupils in lower secondary school. Pupils in vocational education programmes experience a higher degree of participation than upper secondary school pupils in higher education preparatory studies. Although the Pupil Survey provides an indication of the types of opportunities for participation pupils feel they have in Norwegian schools, it does not provide detailed insight into what these opportunities imply in practice.

9.2 Student participation

Student bodies have a right under section 4-1 of the Universities and University Colleges Act to be heard in all matters concerning the students. Student bodies at vocational colleges have a corresponding right according to section 14 of the Vocational Education Act. Student participation is rooted in democratic principles, pedagogical considerations and agreements on study programmes. This is required to ensure that students can become active participants in their own learning. This is also emphasised in the white paper on vocational college education, which states:

Strengthening the position of student democracies at vocational colleges will enable students to influence the education environment and the academic content of their study programmes, and to contribute to improving vocational colleges even further. An attractive vocational college means more engaged students who will help determine its direction and development. (Meld. St. 9 (2016–2017), p. 7)

Experience of participation

The Student Survey (Studiebarometeret9) is a national survey that is sent out to more than 70,000 students each autumn. The survey asks students for their opinions on the quality of their study programmes at Norwegian universities and university colleges. The aim of the Student Survey is to improve the work on quality development in higher education, and to provide useful information about study programme quality. There are similar surveys for vocational college students.10

In the 2022 Student Survey, just under 40 per cent of students responded that they had the opportunity to offer input on the content and structure of their study programme to a high degree, while just over 30 per cent responded that they had this opportunity to a low degree (Hauge et al., 2023). It is worth noting that there are substantial differences between the education institutions. Among vocational college students, approx. 60 per cent answered that they had the opportunity to offer input to a high degree (Øygarden and Stensby, 2022). A similar share completely agreed that their vocational college facilitated participation through student representatives and local student councils. The Student Survey does not provide detailed insight into the type of participation this entails or what forms of participation the students actually utilise.

9.3 Input on participation in learning analytics

In conversations the Expert Group has had with the School Student Union of Norway, its representatives have said that pupils should be given the same information about themselves that the teacher has. In addition, they emphasise that the pupil council should have access to aggregate data on their school if they are to contribute to better pupil participation. They believe that parents should also receive sufficient information, but that they should have less access at the upper secondary level than at the primary and lower secondary level.

The Youth Panel in Møre og Romsdal (2023) believes it is important for young people to use existing meeting places to have their voices heard on the issue of learning analytics. Such places include the School Student Union of Norway and its national conference, Elevtinget, as well as pupil councils and both municipal and county authorities. The panel emphasises that pupil surveys and evaluations should be developed that can provide a broader understanding of pupils’ opinions. It also believes there is a need for a national youth council to ensure pupil participation.

Vestland ungdomsutval (2023) [Western Norway Youth Council] notes that teachers should discuss learning analytics with their pupils in each class: “In this way, teachers will know which learning tools to use that pupils find useful, as well as what information to use and how it should be used for their pupils” (p. 1).

In its input report to the Expert Group, Universities Norway (2023) has specifically expressed concerns about student participation in learning analytics:

Learning analytics can be a useful tool and a supplement to quality assurance work. However, it neither can nor should replace student participation. Students must also be included when assessing the types of learning analytics that are needed and when, and therefore what type of data should be collected (p. 1).

The National Union of Students in Higher Vocational Education and Training in Norway (2022) emphasises that it is important for these tools to be used on the students’ terms, and that good student participation in the processes associated with learning analytics is actively facilitated.

9.3.1 Discussions with pupils on learning analytics

Through its discussions with pupils, the Expert Group has received good input, suggestions and ideas on learning analytics. We have spoken with pupils from several classes in different year levels at three schools. See more information about how the Expert Group has involved children and young people in Chapter 2.4.2.

To summarise, some of the pupils say they find it stressful to know that there is a machine that saves their answers, and that the teachers can see what they have been working on. Others think it is a good idea for the teacher to follow their work progress and contribute to better learning by adapting their instruction and providing pupils with the help they need. It is worth noting that the pupils have different perspectives on giving teachers and schools insight into how they are working on different school subjects. This emphasises the need to engage in dialogue with each individual class and each individual pupil.

The quotes from pupils in years 7 and 9 in Box 9.1 provide examples of both the stressful and the motivating aspects of using digital teaching aids, as described by the pupils themselves.

Textbox 9.1 Responses by pupils in years 7 and 9 regarding stressful or motivating aspects of digital teaching aids

“I think you feel more stressed. You don’t want to make a mistake because the teachers will see it.”
“It’s okay that they see what we answer. If I make some mistakes, I can just ask them anyway. That motivates me to work even harder.”
“It’s not good to feel afraid of what you’ve done, but it’s good that a teacher can see what you’re able to do. Then you can improve and become better at what you’re working on.”
“It’s good that they can see whether we’re on the right track and give us feedback on that.”
“It’s a positive thing that the teacher can see what we’re answering if it can be used along the way to get help.”
“It’s a form of stalking, since they know exactly what I’m doing all the time. You become more alert when you know that a teacher can see what you’re doing. You can also become very cautious. Sometimes you feel afraid of doing anything.”
“The teacher can adapt their teaching to each individual pupil when they see what we can and can’t do.”
“I don’t think about it that much. It’s not that important. You may have written something wrong and had the wrong idea, but you learn from it.”

Many pupils have expressed that teachers should not have access to information about what the pupils do in their leisure time. Several have stated that they are uncertain as to which school employees can see what the pupils are doing when using the tools. Box 9.2 displays some of the responses we received to our question of what type of data on the pupils the school should potentially not have access to.

Textbox 9.2 Responses from pupils in years 7 and 9 about which data on the pupils the school should not have access to

“That which has nothing to do with school. That’s what comes to mind. […] That which has to do with school is collected to help us.”
“Assignments where you share things about yourself. For example, that you take the bus from this or that station. It’s not okay for them to save that and share it.”
“It is very important that they [the teachers] can’t see personal things. We do a lot of personal things on the pc too. No one should ever see my chat log.”
“I wouldn’t have liked it if they [the teachers] had known which YouTube videos I watch or what skin I wear in Minecraft.”
“What happens if my mother uses the pc while I’m at the gym and I was in the middle of an assignment? Would something be registered about me that isn’t me?”
“It’s how I answer and get to the answer that is what is important to know, not everything else.”

Nearly all of the pupils say that the feedback they receive from their teacher has a far greater impact than feedback they are given by a machine through various digital teaching aids and apps. A few mentioned some positive aspects of automatic feedback. See the pupils’ responses in Box 9.3.

Textbox 9.3 Responses from pupils in years 7 and 9 about automatic feedback

“It’s real feedback [if the teacher says so]. You have a relationship with the teacher, and you don’t know who has given you feedback on the application.”
“They [the apps] are just programmed to tell you. Teachers only say something if they want to. Teachers actually mean what they say. Machines just say the same thing over and over, so what teachers say means more to you.”
“You can’t talk to a program. It’s better to get it from the teacher. I would rather raise my hand and say that I needed help.”
“I would rather get it from a program because then you avoid favouritism and that it becomes unfair. It is never completely objective when a teacher gives you feedback. Some teachers really like a certain person and will give that pupil a better assessment. So it’s better with machines.”
“Even if a program nags at you each day, it doesn’t matter as much as when a teacher does it.”
“I think it’s very similar. I feel that you get feedback from the teacher and that the same thing happens with a machine.”

Most of the pupils are sceptical about allowing parents access to everything the pupils do in digital teaching aids. Many believe that the pupils themselves should be able to show their parents what they do, and that parent-teacher conferences are a suitable place for them to get information about how the pupil is doing. Other pupils believe that it is fine for parents to have access, saying that parents often want to know how things are going and to help their own children with their schoolwork. See the pupils’ responses in Box 9.4.

Textbox 9.4 Responses from pupils in years 7 and 9 about what parents should have access to

“I think that the parent-teacher conference is the place where the parents get the info. I don’t want to be monitored by anyone else.”
“They shouldn’t see the feedback we get, even if we’re their kids. I’m the one who does or doesn’t do something, and I have to accept the consequences if it’s poor. It’s my private life and it would be troublesome if they were to know everything.”
“Parents should see more. They can see more of what we are working on, but not how we’ve done it.”
“I want to decide what mum and dad should see. Maybe I’m the one who should have access?”

The pupils are generally positive about the idea that teachers can see how they work and what they are working on. However, they feel differently about developers having access to data on how pupils are working. Some pupils are nonetheless positive about having developers use the data to further develop their own solutions. See the pupils’ responses in Box 9.5.

Textbox 9.5 Responses from pupils in years 7 and 9 about what developers and suppliers should have access to

“Those who are developing should not be able to know what we’re doing. We don’t know them, so I don’t think they should have access.”
“It’s uncomfortable to know that they know a lot. You lose all control.”
“They can actually have access to whatever they want, but they should not sell it to others.”
“It’s important to think about who actually needs my data and what it should be used for. I don’t actually know who shouldn’t have access to it, but it’s intended for the school, so should anyone else have access to it at all?”
“I wouldn’t care if my English text from year eight was saved and used by someone for something that could be good for someone else.”

The pupils also express that they feel motivated by being able to combine digital and analogue resources. None of the pupils want a school that only uses analogue teaching aids. Nor do they want a school where digital feedback replaces an ongoing dialogue with teachers.

9.4 The Expert Group’s assessments

Participation in learning analytics requires pupils and students to understand and have good insight into what type of data about them is collected, as well as the type of information they can receive from learning analytics. The Expert Group believes it is important for pupils and students to receive adapted and comprehensible information, so that they can consider the issues regarding learning analytics. Through the dialogue and input, we have found that there are significant differences in the type of information that pupils, parents and students receive from the various schools and education institutions.

The Expert Group believes that national authorities must work to ensure that all school owners and education institutions give pupils, parents and students the information they need, and the opportunity to participate and be involved in decision making. We also believe that schools and education institutions must actively facilitate good student participation when learning analytics are performed. In order for students and pupils to feel confident that their data is collected in a responsible manner, it is essential that they are clearly informed of their rights in connection with data collection in a manner that is adapted to their age and level of maturity. Facilitating the opportunity, to a greater extent than is currently the case, for pupils and students to check whether their data is used for learning analytics would also help increase participation. Furthermore, it is important that participatory bodies for pupils and students are included in decisions on learning analytics.

10 The need to regulate learning analytics

The overall aim of regulating learning analytics is to help ensure that the value of learning analytics is achieved, while simultaneously reducing privacy risks. In this chapter, the Expert Group will first assess the privacy risks of learning analytics, adaptivity and the use of artificial intelligence. We will then discuss the existing legal bases in the regulations for processing personal data for learning analytics. As an extension of this, we will assess the need for regulatory changes.

10.1 Privacy risks related to learning analytics, adaptivity and artificial intelligence

The risk associated with privacy entails the danger that the rights and freedoms of pupils and students are not sufficiently safeguarded. The General Data Protection Regulation requires national supervisory authorities to create a list of categories of processing of personal data which, by definition, involve a high risk. On the list that the Norwegian Data Protection Authority (2019) has compiled of such processing methods, we find this example: “The processing of personal data to evaluate learning, mastery and well-being in schools or kindergartens. This includes all levels of education, from primary and lower secondary school to upper secondary school and higher education.” The reason this type of processing is associated with high risk is that it involves children and young people who are considered vulnerable, and because continual evaluation entails invasive, systematic monitoring. The Norwegian Data Protection Authority has thereby assessed that learning analytics and adaptivity that involve processing personal data entail a high risk.

Artificial intelligence challenges the rights of pupils and students in new ways and is becoming increasingly widespread in study programmes. Therefore, the Expert Group will discuss a few risk scenarios that may arise with learning analytics and adaptivity that include artificial intelligence. Our starting point will be Article 5 of the General Data Protection Regulation and its fundamental principles11, four of which are particularly challenged by learning analytics and adaptivity with artificial intelligence. These are the principles of fairness, transparency, data minimisation and accuracy.

10.1.1 Fairness

For the processing of personal data in learning analytics and adaptivity to be fair, it must be predictable and comprehensible to pupils, teachers and parents, and it must not be done in a concealed, manipulative or discriminatory manner. One characteristic of artificial intelligence is that it is an innovative and complex technology whose scope we cannot readily understand.

One risk scenario worth pointing out is that the data controller does not decide the extent to which a legal basis applies and may therefore be in danger of unjustly depriving students and pupils of their right of co-determination. Co-determination in this context may be placed on a scale from deciding whether learning analytics will be performed to having a say on when and how the learning analytics will be performed. Another risk scenario related to the principle of fairness is that algorithms that are not monitored may develop biases in the database and lead to discrimination. Ensuring that a machine learning model behaves fairly and does not discriminate is a challenging task. A third risk scenario is when learning analytics and adaptivity lead to unreasonably differential treatment. We could say that the entire purpose of learning analytics is to support a form of differential treatment: to assist the teacher in assessing how their instruction can be adapted to different pupils and students, and groups of these. The question is therefore not whether the model is treating people differently, but whether it does so correctly, reasonably, and without discrimination. A fourth risk scenario is when learning analytics may lead to a form of chilling effect and stress for pupils and students. By chilling effect, we mean that our awareness of the fact that the things we say and write are being registered and analysed causes us to change our behaviour. As a result, we may begin to focus more on what we write, what we do and who we are in contact with – and we begin to curb our behaviour.
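As a purely illustrative aid to the point about unmonitored algorithms and discrimination above, the sketch below (in Python, with entirely hypothetical field names and an arbitrary threshold) shows one simple way a data controller could monitor whether an adaptive system flags pupils for extra support at notably different rates across groups. It is a minimal sketch of one possible check, not a recommended fairness standard.

    # Minimal sketch: compare how often an adaptive system flags pupils across
    # hypothetical groups. Field names ("group", "flagged") are illustrative only.
    from collections import defaultdict

    def flag_rates_by_group(records):
        """Return the share of flagged pupils per group."""
        counts = defaultdict(lambda: [0, 0])  # group -> [flagged, total]
        for record in records:
            counts[record["group"]][0] += 1 if record["flagged"] else 0
            counts[record["group"]][1] += 1
        return {group: flagged / total for group, (flagged, total) in counts.items()}

    def max_rate_gap(rates):
        """Largest difference in flag rates between any two groups."""
        values = list(rates.values())
        return max(values) - min(values) if values else 0.0

    # Illustrative use: a large gap is a signal to investigate, not proof of bias.
    records = [
        {"group": "A", "flagged": True}, {"group": "A", "flagged": False},
        {"group": "B", "flagged": True}, {"group": "B", "flagged": True},
    ]
    rates = flag_rates_by_group(records)
    if max_rate_gap(rates) > 0.2:  # threshold chosen arbitrarily for the sketch
        print("Review the model: flag rates differ notably between groups", rates)

Such a check only detects differences at the group level; whether a difference amounts to unreasonable differential treatment remains a pedagogical and legal judgement.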

10.1.2 Transparency

For the processing of personal data in learning analytics and adaptivity to be in line with the principle of transparency, both the supplier and the data controller must be able to explain the data flow – how the data is used in the solution – and the algorithms in a comprehensible manner. One characteristic of machine learning is that the algorithms are dynamic and programmed to weigh responses in different ways. It is also characteristic of the business models involved that they offer little transparency.

A risk scenario that becomes apparent with regard to transparency is that the data controller knows too little about what the learning analytics entail to configure them correctly while also fulfilling their duties to pupils and students, such as providing them with information about the processing. Another aspect of this risk scenario is that students, pupils and parents know too little about what learning analytics involve to be able to safeguard their rights. In general, this could increase the risk of mistrust in learning analytics as a method, which could then lead to more opposition.

10.1.3 Data minimisation

For the processing of personal data in learning analytics and adaptivity to be in line with the principle of data minimisation, it must be possible to limit the amount of collected personal data to that which is necessary for achieving its purpose. One characteristic of machine learning is that it requires large amounts of information to train the algorithms. This is generally counter to the data minimisation principle. In addition, there is a continuing need for the data, which can raise the threshold for initiating deletion or anonymisation.

One risk scenario that stands out with respect to data minimisation is the need for a large amount of data, which could result in the collection of a great deal of information on pupils’ and students’ activities, regardless of relevance and necessity. Another risk scenario is that personal data is not deleted after the purpose of the processing has been achieved, because this data is needed to train the model.
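As a hedged illustration of how a supplier or controller could operationalise data minimisation before activity data are stored, the sketch below applies an explicit allowlist of fields judged necessary for the stated purpose. The field names and the allowlist itself are hypothetical; in practice the list would follow from a concrete assessment of purpose and necessity.

    # Minimal sketch: keep only the fields assessed as necessary for the stated
    # purpose before an activity event is stored. All field names are hypothetical.
    ALLOWED_FIELDS = {"pupil_id", "task_id", "answer_correct", "timestamp"}

    def minimise(event: dict) -> dict:
        """Drop every field that is not on the purpose-specific allowlist."""
        return {key: value for key, value in event.items() if key in ALLOWED_FIELDS}

    raw_event = {
        "pupil_id": "p-123",
        "task_id": "t-42",
        "answer_correct": True,
        "timestamp": "2023-03-01T10:15:00",
        "browser_version": "x.y",      # not needed for the pedagogical purpose
        "ip_address": "203.0.113.7",   # not needed for the pedagogical purpose
    }
    stored_event = minimise(raw_event)  # only the four allowed fields remain

The point of the sketch is that minimisation is a design decision taken before collection, not a clean-up exercise afterwards.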

10.1.4 Accuracy

For the processing of personal data in learning analytics and adaptivity to be in line with the principle of accuracy, personal data that is processed must be accurate and give the correct impression of the person from whom the data is collected. One characteristic of machine learning is that the algorithms are meant to improve themselves over time. This creates a need for continuous “training” based on information from a large number of people.

A risk scenario that stands out with respect to accuracy is that the database develops biases that result in discriminatory algorithms. Another risk scenario is when the information being measured about the pupils and students constitutes an inaccurate and deficient database. This may occur if the information does not reflect what the pupil or student has actually done, or when an adaptive learning resource adapts to the pupil or student incorrectly. This may cause the analyses to present an inaccurate picture of the actual situation, which may influence decision making. A third risk scenario is when a teacher or instructor misinterprets the analysis provided by the system due to poor insight into what the analysis is based on, and because the teacher or instructor does not have sufficient competence to interpret the data.
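As an illustration of one small technical measure against an inaccurate or deficient database, the sketch below shows a basic validation pass that flags activity records with missing identifiers or implausible time values before they enter an analysis. The field names and rules are hypothetical, and such checks supplement rather than replace the teacher’s or instructor’s own interpretation.

    # Minimal sketch: flag activity records that are unlikely to reflect what the
    # pupil actually did, so they can be reviewed before analysis.
    def validate(record: dict) -> list:
        """Return a list of data-quality problems found in a single record."""
        problems = []
        if not record.get("pupil_id"):
            problems.append("missing pupil_id")
        seconds = record.get("time_spent_seconds")
        if seconds is None or seconds <= 0:
            problems.append("implausible time_spent_seconds")
        elif seconds > 8 * 3600:
            problems.append("session longer than a school day")
        return problems

    records = [
        {"pupil_id": "p-1", "time_spent_seconds": 340},
        {"pupil_id": "", "time_spent_seconds": -5},
    ]
    for record in records:
        issues = validate(record)
        if issues:
            print("review before analysis:", issues)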

Textbox 10.1 Privacy consequences the Norwegian Data Protection Authority has identified for learning analytics

In the report from the Norwegian Data Protection Authority’s sandbox, the Norwegian Data Protection Authority described several privacy consequences of learning analytics:

  • chilling effect

  • risk of inaccurate information

  • risk that the technology causes pupils unwanted stress

  • special categories of personal data that require a special legal basis

  • processing of third-party information

  • danger of decision support systems becoming decision systems

(Norwegian Data Protection Authority, 2022b)

10.2 Introduction on the legal basis and necessary processing of personal data in learning analytics

In section 5.2, the Expert Group described the relevant data protection legislation for learning analytics. In order for the processing of personal data to be lawful, the processing must have a legal basis in Article 6(1) of the GDPR. Some of the legal bases presuppose the existence of provisions in national legislation that can serve as a legal basis. Before the Expert Group assesses the extent to which existing provisions in Norwegian legislation are suitable as a supplementary basis for processing personal data in learning analytics, we will review which legal bases in the data protection legislation are likely to be used for the purpose of processing personal data in this context.

10.2.1 Legal obligation or task carried out in the public interest

The two bases the Expert Group considers relevant in the GDPR are Article 6(1)(c), processing which is “necessary for compliance with a legal obligation”, and (e), processing which is “necessary for the performance of a task carried out in the public interest or in the exercise of official authority vested in the controller”.

As the Expert Group explained in Chapter 5, a basis in legislation will probably not fall under the legal obligation category if the basis entails a considerable degree of free choice as to how the processing of personal data is to be performed in order to fulfil this obligation. For instance, provisions that exclusively authorise someone to do something will not be covered by the legal obligation category in Article 6(1)(c) (Kotschy, 2020). In such cases, the processing will instead fall under the category “task carried out in the public interest”. If the legal basis in the legislation is found in the legal obligation category, the data subjects (pupils and students) will also lose the right to object to the processing pursuant to Article 21. This speaks in favour of exercising caution regarding the use of legal obligation as a legal basis.

Several of the provisions that constitute the relevant legal bases for learning analytics are formulated as obligations in relation to schools or educational institutions. However, these are mainly obligations that largely facilitate the exercise of professional judgment in terms of the manner in which these obligations are to be fulfilled, e.g., in connection with formative assessments and differentiated instruction. This indicates that the processing of personal data linked to these provisions is less compatible with the GDPR’s legal obligation. In addition, the Expert Group finds that learning analytics is not a required tool to achieve the obligations in the legislation. It is possible to fulfil the requirements of the legislation without learning analytics.

In terms of provisions on quality development and quality assurance work, there are other arguments as to whether the provisions fall under the legal obligation category. If learning analytics were essential or had greater value than other methods for quality development, the processing of personal data could fall under the legal obligation category. As of today, however, we do not find examples of learning analytics that are of such crucial importance to the work on quality development that it could entail a legal obligation. The provisions on quality development in primary and secondary education and training involve, among other things, facilitating local adaptation in terms of how quality development occurs (Prop. 57 L (2022–2023), section 56.5.2). This possibility of adapting the quality assurance work to local conditions will in any case indicate that the provision should not be considered a legal obligation.

The Expert Group’s assessment is that Article 6(1)(e) of the GDPR – “processing is necessary for the performance of a task carried out in the public interest or in the exercise of official authority […]” – is the relevant basis for learning analytics.

10.2.2 On the necessity of processing personal data in learning analytics

For the processing to be lawful, it is not sufficient that the processing is related to a “task carried out in the public interest or in the exercise of official authority”. According to the GDPR, it must be necessary. The Regulation does not provide further guidance on what is required for the criterion of necessity to be met. The general linguistic understanding of the processing being necessary would indicate that the purpose cannot be achieved absent the processing in question. There must be an objective connection between the collected data and the manner in which it is processed on the one hand, and the fulfilment of the purpose on the other. As the Norwegian Ministry of Justice and Public Security has noted in other contexts, it is not sufficient that the data are useful; they must be of significance to the work, either on their own or when compiled (Prop. 59 L (2017–2018), section 4.1.3.2).

The ECJ has examined necessity under Article 6(1)(e) and whether the processing in question goes beyond what is necessary to achieve the objective thereof (C-439/19 Latvijas Republikas Saeima [Grand Chamber], 2021, paragraph 109). The Court also refers to recital 39, where it is stated that the data “should be processed only if the purpose of the processing could not reasonably be fulfilled by other means”.

Currently, it is fully feasible to meet the requirements for differentiated instruction, formative assessment, the educational institutions’ tasks related to educational provision and the requirements for quality assurance work, without learning analytics. However, the Expert Group believes that learning analytics has the potential to improve and enhance the pedagogical follow-up of pupils and students and the work on quality in a manner that will contribute to achieving the objectives of primary and secondary education and training, tertiary vocational education and higher education.

There are several sources of information on pupils and students that, jointly, form the basis for pedagogical decisions and which are used in the work on quality development. The Expert Group does not believe that learning analytics alone will be able to solve the pedagogical tasks referred to in the legislation, but envisages that it will make a valuable contribution in some cases. If learning analytics is to serve as one such source of information, this will require that the learning analytics offers sufficient pedagogical value and that the disadvantages to data protection are sufficiently limited. It is the pedagogical value related to having to process personal data that must be justifiable. The same applies when learning analytics occurs in connection with the further development of subjects and teaching plans, as the value of the processing in relation to achieving the aim must be justifiable.

The Expert Group will highlight some aspects that characterise learning analytics, which we believe will increasingly necessitate the processing of personal data in this context:

  • the processing gives pupils and students better insight into their own learning

  • the processing provides teachers and instructors with a better basis for adapting instruction and making pedagogical decisions

  • the processing gives teachers and instructors a better basis for providing feedback to pupils and students

  • the processing offers a better basis for quality assurance work in schools, in the municipality, the county authority or at the educational institution.

10.3 Legal basis in the legislation governing the processing of personal data in learning analytics for primary and secondary education and training

The fact that pupils and teachers can utilise learning analytics in line with the purposes as defined in the Education Act, Regulations to the Education Act and the National Curriculum is a prerequisite for both the success and lawfulness of learning analytics (Vestfold and Telemark County Authority, 2022).

The Expert Group finds that there are primarily three provisions that are relevant to discuss as legal bases for the processing of personal data in learning analytics in schools. These are the provision on differentiated instruction in section 1-3 of the Education Act, the provisions on formative assessment in the Regulations to the Education Act and the provision on quality development in section 13-3e of the Education Act. A key point is that the processing of personal data in learning analytics will largely not, in itself, constitute differentiated instruction, formative assessment or work on quality development. However, learning analytics can constitute one of several sources of information relied upon to better succeed with differentiated instruction, formative assessment or work on quality development. The Expert Group also emphasises that the broad statutory purpose in section 1-1 of the Education Act must be viewed in the context of the specific legal bases.

10.3.1 General provisions on the processing of personal data

Provisions on the processing of data

In 2021, a general provision on the processing of personal data was adopted in section 15-10 of the Education Act. The first paragraph of the provision grants municipalities, county authorities and educational institutions the right to process personal data “including personal data as mentioned in articles 9 and 10 of the GDPR, when this is necessary to perform tasks pursuant to the Act”.

Section 15-10, first paragraph does not constitute an independent supplementary legal basis and the provision does not extend the right to process personal data (Prop. 145 L (2020–2021), section 2.4.2.5). For the specific processing of personal data, it is the provisions of the Act and Regulations pursuant to the Act that constitute supplementary processing grounds and which govern rights and obligations. In the preparatory works, the Norwegian Ministry of Education and Research writes that the question of what personal data will be necessary for the school owner or others to process must be assessed based on the purpose of the individual obligation or task (Prop. 145 L (2020–2021), section 2.4.2.2).

During the consultation process, it was asked whether section 15-10 precludes the use of article 6(1)(e) “public interest”, as the wording does not mention that it can be used as a legal basis for processing. In its proposal, the Norwegian Ministry denied that the provision precludes the use of Article 6(1)(e) as a legal basis (Prop. 145 L (2020–2021), section 2.4.2.5).

Access control

Section 22A-2 of the Regulations to the Education Act stipulates requirements for access control when processing personal data that are based on a legal basis in the legislation. According to the provision, municipalities and county authorities have a duty to ensure that persons employed by the undertaking only have access to data that are necessary for the purposes stipulated in section 15-10.

10.3.2 Provisions on differentiated instruction

According to section 1-3 of the Education Act, the training must be “differentiated according to the abilities and aptitudes of the individual pupil, apprentice, candidate for certificate of practice and training candidate”. In the preparatory works it is stated that the organisation of the school, the pedagogical methods and the progress of the instruction shall all be adapted according to the prerequisites and abilities of the pupils (Proposition to the Odelsting Ot.prp. nr. 40 (2007–2008), section 3.2). Differentiated instruction is a basic and overriding principle in the instruction. The provisions on differentiated instruction require schools to differentiate the instruction according to the pupils’ prerequisites, but do not enshrine an individual right for the individual pupil. In the preparatory works to the provisions, the data protection consequences of processing personal data for the purpose of providing differentiated instruction have not been considered.

As part of differentiated instruction, there is a duty to provide intensive instruction (early intervention) to pupils in years 1 to 4. Section 1-4 of the Act imposes a duty on school owners to provide suitable intensive instruction in reading, writing and mathematics so that the pupils achieve the expected progress. The provision does not express how the early intervention is to be carried out. The Norwegian Ministry’s statement in the preparatory works to the provision is illustrative of this:

The Norwegian Ministry specifies that there is no definitive answer as to how intensive instruction shall be carried out. Different methods and pedagogical approaches may be suitable. It will be a pedagogical and didactic task to decide which measures are necessary and expedient in relation to the individual pupil. How the intensive instruction should be arranged must be assessed in the light of, among other things, the needs of the pupil and other measures in the instruction. (Prop. 52 L (2017–2018), section 3.5.1)

The duty to provide differentiated instruction is fulfilled through teachers’ choices of methods and tasks. However, the provision on differentiated instruction and early intervention does not state what methods are suitable. The provision on differentiated instruction establishes a broad objective of educational practice.

The Expert Group finds that differentiated instruction is a relevant legal basis for processing personal data in learning analytics that is necessary to “perform a task carried out in the public interest” in Article 6(1)(e) of the GDPR. The provision on differentiated instruction falls under the category in Article 6(1)(e), among other things, because the provision establishes a general and broad principle, which does not clarify specific guidelines as to how the instruction should be differentiated in relation to the pupils. The provision does not provide further guidance on what renders learning analytics necessary processing in order to carry out differentiated instruction.

However, using differentiated instruction as a legal basis for processing personal data in learning analytics entails a challenge in that the provisions on differentiated instruction make it difficult for pupils to predict what personal data will be processed and how. In addition, the provision does not contain any information on how the principles of purpose limitation, data minimisation and accuracy will be safeguarded in the processing. When processing personal data about children, it is important that the legal basis is clearly formulated, due to their vulnerable position.

The Expert Group finds that the provision is not suitable for specifying that the processing of personal data in learning analytics can be used in connection with differentiated instruction. Codifying one of the methods teachers can use to differentiate instruction would be contrary to the general scheme of the statutory provision. Moreover, such a specification could give the impression that learning analytics plays a particularly important role in differentiated instruction, which we believe is not currently the case.

10.3.3 Provisions on the right to assessment

The right to assessment is laid down in Regulations pursuant to the Education Act. Section 3-2 grants pupils the right to “formative assessment, final assessment and documentation of the instruction”. According to section 3-3, the purpose of the assessment is to enhance learning and express the pupil’s competence. Pursuant to section 3-3, the basis for assessment is the pupil’s participation and activity in light of the objectives in the National Curriculum.

According to section 3-10 of the Regulations pursuant to the Education Act, “[a]ll assessment that takes place before the end of the instruction” is formative assessment. According to the Regulations, the formative assessment must be an “integral part” of the instruction. In its circular, the Norwegian Directorate for Education and Training elaborates that the wording “integral part” entails that “pupils and apprentices must be assessed, or assess their own work, where this is natural in the instruction, without this necessarily being planned or occurring at fixed and agreed intervals” (Norwegian Directorate for Education and Training, 2021a). In other words, formative assessment refers to a set of different methods and ways of working to aid pupils in the learning process.

In the description of formative assessment, there are several elements that fit what the Expert Group identifies as important characteristics of learning analytics. The description of formative assessment in section 3-10 of the Regulations is also recognisable in terms of the purpose of learning analytics. The pupil shall participate in the assessment of their own work and reflect on their own learning and academic development. They shall also understand what they need to learn, what they master, what is expected of them and how they can further work to enhance their competence.

The Expert Group finds that the processing of personal data in learning analytics in connection with formative assessment will fall under the legal basis of “performing a task carried out in the public interest” in Article 6(1)(e). Learning analytics can be one of several ways of carrying out formative assessments. Furthermore, we have found that formative assessment will not cover all relevant contexts where learning analytics takes place. Examples could be that information from learning analytics is used as a basis for didactic assessments, e.g., selecting relevant and motivating working methods and teaching aids, or other matters not directly related to the purpose of formative assessment.

Because formative assessment is to be an integral part of instruction, it can be difficult to separate formative assessment from other pedagogical activities. Avoiding such a distinction is also part of the purpose of the emphasis in the Regulations that assessment must be an integral part of the learning activities. Linking learning analytics specifically to formative assessment may also send unintended signals that the formative assessment should to a greater extent be based on learning analytics, even if this is not the intention. This can have unfortunate consequences, as it is precisely the breadth and variety in how pupils show and develop their competence that is important.

Another challenge in using formative assessment as a legal basis for learning analytics is that formative assessment is closely linked to the final assessment. According to section 3-15, second paragraph of the Regulations, “competence that the pupil has shown during the instruction” must be part of the assessment when the mark awarded for classwork is determined. It will then be the teacher’s task to decide whether the competence the pupil has shown provides relevant information about the pupil’s competence at the end of the instruction. Based on current practice, it would be unfortunate to signal that learning analytics provides sufficient breadth in the assessment of pupils’ competence that it can be used as a basis for determining the mark awarded for classwork.

The Expert Group believes that the provision on formative assessment is a relevant legal basis for some forms of learning analytics. It is a strength that the provision indicates how the pupil should be involved in the formative assessment and what the pupil should understand. This procedure will help ensure that the data that are processed in learning analytics through formative assessments are accurate and adequate. Nevertheless, the provision on formative assessment will not make it predictable how personal data may be processed in connection with learning analytics. The Expert Group finds that the provision on formative assessment is not a suitable place to specify the legal basis for processing personal data in learning analytics. Such a specification may indicate that learning analytics, compared to other methods, plays a particularly important role in formative assessments, which the Expert Group does not believe is currently the case. The Expert Group therefore believes that specifying the provision on formative assessment for the processing of personal data in learning analytics could send the wrong signal.

10.3.4 Provisions on quality development

Section 13-3e of the Education Act contains a duty on the part of the municipality and the county authority to work on quality development. Private schools have a corresponding duty to work on quality development under section 5-2b, for which the school board is responsible. The purpose of the quality development work is for the school owners and the schools to use knowledge about the learning environment and learning outcomes at the schools to assess how the education can be improved (Prop. 81 L (2019–2020), section 12.4).

The provision in the section 13-3e, second paragraph of the Education Act reads as follows: “The municipality and the county authority shall ensure that the schools regularly assess the extent to which the organisation, differentiation and implementation of the instruction contribute to achieving the objectives set out in the Curriculum for Knowledge Promotion in Primary and Secondary Education and Training. The pupils shall be involved in this assessment”.

Information from learning analytics may represent one of several sources for the quality development work. Neither the provision on quality development nor the related preparatory works state what personal data are to be processed in connection with quality development, or how.

To more closely examine what the work on quality development entails, we can look to discussions about school self-evaluation. Previous provisions on quality development used the term school self-evaluation, but the removal of this term was not intended to alter the content of the provision. The Education Act Committee noted that the content of school self-evaluation is unclear (NOU 2019: 23). The Expert Group on school performance indicators noted a need to prepare a harmonised interpretation of the specific academic content in section 13-3e (Expert Group on performance indicators, 2021). School self-evaluation was one of the topics in the Norwegian Directorate for Education and Training’s joint national inspection for 2014–2017, and what was examined in the inspection expresses how national authorities have understood the term (NOU 2023: 1). In the inspection, the Directorate investigated whether schools have a broad and comprehensive assessment of goal attainment for pupils, assess whether changes can contribute to greater goal attainment by pupils, conduct regular assessments and ensure broad participation (Norwegian Directorate for Education and Training, 2019).

The Committee for Quality Development in Schools believes that the work on quality development must be based on a broad platform of knowledge that builds on the experiences of the professional community, relevant research and the local context (NOU 2023: 1). Information from learning analytics will represent one of several sources of information for quality development. The fact that the provision’s professional content has not been determined makes it difficult to clarify the degree to which the provision is suitable as a legal basis for learning analytics. Furthermore, the Expert Group believes that the legal basis for learning analytics in quality development primarily falls under “performing a task carried out in the public interest” in Article 6(1)(e) of the GDPR.

In general, learning analytics used for quality development could have less intrusive data protection consequences than the processing of personal data that form the basis for decisions concerning the direct follow-up of a pupil. For quality development, it will also often be sufficient to process de-identified information.
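To illustrate what processing de-identified information at an aggregated level could look like in practice, the sketch below (with hypothetical field names and an illustrative suppression threshold) groups pupil-level results into class-level averages and withholds groups that are too small to be reported without a re-identification risk.

    # Minimal sketch: aggregate pupil-level results to class level and suppress
    # small groups before the data are used for quality development.
    from collections import defaultdict

    MIN_GROUP_SIZE = 5  # illustrative suppression threshold, not a legal norm

    def class_level_averages(results):
        """results: iterable of (class_id, score) pairs. Returns class_id -> mean
        score, omitting classes with fewer than MIN_GROUP_SIZE pupils."""
        grouped = defaultdict(list)
        for class_id, score in results:
            grouped[class_id].append(score)
        return {
            class_id: sum(scores) / len(scores)
            for class_id, scores in grouped.items()
            if len(scores) >= MIN_GROUP_SIZE
        }

    results = [("9A", 72), ("9A", 65), ("9A", 80), ("9A", 58), ("9A", 91),
               ("9B", 70), ("9B", 66)]    # class 9B is too small and is suppressed
    print(class_level_averages(results))  # {'9A': 73.2}

Aggregation and suppression of this kind reduce, but do not eliminate, data protection risks, and the underlying pupil-level data must still have a lawful basis.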

The Expert Group does not consider it suitable to specify that learning analytics may be an instrument in section 13-3e when carrying out quality development. Such a specification would be contrary to the general scheme of the statutory provision. The statutory provision only stipulates that schools shall work on quality, but not how this is to be done.

10.3.5 Need for legislative amendments

There are broad provisions in the Education Act which enunciate the objectives of education, differentiated instruction, assessment, pupils’ benefit from the instruction and the school owner’s responsibility for quality development, and which refer to contexts where learning analytics may be relevant. In Chapter 10.3.1, the Expert Group referred to the general provision in section 15-10 of the Education Act which emphasises that the school owner may process personal data, including those mentioned in Articles 9 and 10 of the GDPR, when this is necessary to carry out a task in the Act.

Determining that learning analytics may be in accordance with the overall purposes of the Act is not necessarily challenging. However, determining accordance is not sufficient. As shown in Chapter 5.2.1, the GDPR and the ECHR set requirements for how the legal basis for processing personal data is designed. The legal basis “must lay down clear and precise rules governing the scope and application of the measure in question” (C-439/19 Latvijas Republikas Saeima [Grand Chamber], 2021, paragraph 105).

The Expert Group believes that it would be beneficial if the legislation clarified that learning analytics may contribute to carrying out statutory duties imposed on the schools. The Expert Group has shown that learning analytics can have pedagogical value and be a relevant method for differentiating instruction, providing formative assessment and for the work on quality development.

A key argument in favour of specifying the legal basis in the Education Act for learning analytics is to address the pupils’ need for predictability regarding how their personal data are processed when the processing may have significant consequences for their privacy. A clearer basis may contribute to making it easier for school owners to fulfil their responsibilities when processing personal data in learning analytics. The need for a clearer legal basis is particularly pressing because it concerns the processing of data in ways that pupils find invasive. In addition, special categories of personal data described in Article 9 of the GDPR may be processed, e.g., health data related to learning difficulties or the like. In order to prevent disproportionate interference with children’s privacy, the legislation must reflect the challenges raised by learning analytics.

A clearer basis in the legislation could help prevent personal data from being used for purposes other than those for which they were collected. A more precise legal text can offer pupils and parents a clearer sense of the boundaries of their own rights. There is, however, a risk that a clarification of the legal basis for learning analytics will remain a symbolic provision devoid of real content, one which fails to establish guidelines for how the processing of personal data should take place. Another risk in clarifying the basis for processing personal data for learning analytics is that it could be perceived as an opportunity to carry out more extensive learning analytics without ensuring corresponding data protection requirements.

In connection with the consultation process prior to the adoption of section 15-10 of the Education Act, the Norwegian Data Protection Authority stated that the legal basis in the Education Act “should set out fixed, objective, statutory criteria for the processing if it is to comply with the data protection legislation’s requirements for a legal basis” (Norwegian Data Protection Authority, 2021a, p. 1). It is difficult to answer how the legal basis in the legislation can be designed to ensure that the processing of pupils’ personal data in learning analytics occurs in a manner that safeguards their privacy.

An important question is how the legal basis can contribute to ensuring that the processing of personal data is necessary for the purpose. A general starting point is that the school owner, as controller, must be able to demonstrate that processing personal data in learning analytics has a pedagogical value that outweighs the data protection consequences it entails. However, it is not only the necessity of the processing that makes it lawful; the legal basis should also make it easier for the school owner, as controller, to observe the principles of accuracy, purpose limitation and data minimisation.

The Expert Group believes that there is a clear need to elucidate the legal basis for learning analytics. The basis must contribute to ensuring that the processing of the pupils’ personal data in learning analytics occurs in a more predictable manner.

10.4 Legal basis in the legislation governing the processing of personal data in learning analytics for universities and university colleges

The possible legal bases for processing personal data in learning analytics are found in various parts of the legislative framework governing higher education. The Expert Group considers that there are mainly provisions in three areas that are relevant as a legal basis for the processing of personal data in learning analytics in higher education.

Firstly, there is a general provision on the processing of personal data in section 4-15 of the Universities and University Colleges Act. Secondly, sections 1-3 and 1-5 contain provisions on the institutions’ tasks and professional responsibilities. Thirdly, there are the provisions on the institutions’ quality assurance work in section 1-6 of the Universities and University Colleges Act, in addition to the provisions on quality assurance work in the Regulations on the quality of programmes of study and the Academic Supervision Regulations.

The Expert Group will discuss whether the provisions in the respective areas are suitable as a legal basis for processing personal data in learning analytics. This involves, among other things, the extent to which the provisions are suitable to fulfil the requirement of a clear and distinct legal basis in Article 6(3) of the GDPR, in light of statutory requirements in the Constitution of Norway and the ECHR.

10.4.1 Provision on the processing of personal data in course management systems

Section 4-15 of the Universities and University Colleges Act has the heading “Obtaining and processing personal data in course management systems”. The first paragraph of the provision stipulates that the educational institutions may process personal data regarding the students “when the purpose of the processing is to safeguard the rights of the data subject, or to fulfil the institution’s tasks and duties under the Universities and University Colleges Act.” The wording in the text of the Act indicates that the provision does not constitute an independent legal basis, but merely expresses what is already stipulated in Article 6(1), cf. Article 6(3), i.e., that personal data may be processed when this is necessary for compliance with a legal obligation or for the performance of a task carried out in the public interest pursuant to acts or regulations. An unresolved question is whether the provision is limited to the collection and processing of personal data that exclusively takes place in course management systems. In the preparatory works, the Norwegian Ministry states that the term “course management systems” is used as “a common name in the bill to include all course management systems in the sector” (Prop. 64 L (2017–2018), section 7.2.2).

The provision’s second and third paragraphs list the types of data the institutions can process when necessary in relation to the purpose in the first paragraph. The second paragraph lists data that will typically be necessary to process upon admission to a study programme, such as names and marks from upper secondary education, universities and university colleges, retrieved from public authorities. The third paragraph mentions “information about health, social issues and other sensitive information which the student him/herself has given to the institution”.

During the consultation round for the current section 4-15, both the University of Oslo (UiO) and the Norwegian University of Science and Technology (NTNU) expressed in their consultation responses that the processing of personal data regarding students will increasingly take place in contexts other than in the traditional course management settings. NTNU first states that the proposal for the provision in section 4-15 is unlikely to meet the needs of the educational institutions, before summarising the uncertainty surrounding the processing basis for, among other things, learning analytics as follows: “Current areas where the legal basis may be uncertain are the processing of personal data in connection with learning analytics, development and quality improvement in education, access control, exchange of information between institution and place of professional practice, as well as storage of answers in the plagiarism control system” (Prop. 64 L (2017–2018), section 7.2.3).

When the Norwegian Ministry discussed the consultation responses, it stated that the provision will not cover all the institutions’ needs for processing personal data, and also noted that this was not the purpose of the bill (Prop. 64 L (2017–2018), section 7.2.4). In the same place, the Norwegian Ministry pointed out that “broad ‘all-encompassing’ authorisations cannot be granted without further investigation of consequences and needs”. Nor has it been considered in the preparatory works whether data in course management systems can be used in learning analytics. Based on the provision’s wording and statements in the preparatory works, the Expert Group believes that the provision in its current form in section 4-15 does not clarify the extent to which personal data can be processed in learning analytics.

10.4.2 Provisions on the institution’s tasks, as well as academic freedom and responsibility for instruction

Section 1-3 of the Universities and University Colleges Act sets out the institutions’ duties at an overarching level. According to section 1-3 letter (a), universities and university colleges shall “provide higher education based on the foremost within research, academic and artistic development work, and experience-based knowledge”. In the preparatory works, the Norwegian Ministry writes that the provision on the institutions’ tasks notes what is needed to achieve the ambitions set out in the statutory objective of the Act, and that the provision will not involve a regulation of specific rights or duties (Prop. 111 L (2020–2021), sections 2.3.5.1 and 2.3.5.2).

Pursuant to section 1-5, first paragraph, the institutions have academic freedom and responsibility for, among other things, instruction being conducted in accordance with “recognised […] pedagogical and ethical principles”. This also means that the institutions are responsible for the academic follow-up of the students. The Universities and University Colleges Act Committee noted that it is important that the legislation “does not set limitations for new and varied ways of teaching and learning” (NOU 2020: 3, p. 194). The Expert Group believes that the regulations should also ensure that learning analytics safeguards student privacy, which presupposes a clear legal basis.

The Norwegian Ministry considered proposing a provision that specifies that the institutions are responsible for good academic follow-up of the students, but concluded that such a specification was not necessary as long as the responsibility is already stated in section 1-5, first paragraph (Prop. 74 L (2021–2022), section 7.4). The extent to which learning analytics can be carried out in the academic follow-up of students has not been a topic in the preparatory works. In the Expert Group’s view, section 1-5 on academic freedom and responsibility constitutes a legal basis for “performing a task carried out in the public interest” in Article 6(1)(e) of the GDPR.

The provision in section 1-5 stipulating that the institutions must ensure that instruction takes place in line with “recognised […] ethical and pedagogical principles”, does not make it particularly predictable how the personal data of students are to be processed in learning analytics. It would not be appropriate for the provision containing the institutions’ overarching task to list only one of the means institutions can use to ensure that instruction takes place in accordance with recognised ethical and pedagogical principles. Therefore, the Expert Group does not find that section 1-5 is a suitable place in the legislation to specify that the institutions may process personal data in learning analytics.

The Expert Group finds that section 1-5 is a relevant legal basis for processing personal data in learning analytics. At the same time, the Expert Group believes that the provision does not provide a clear enough basis for processing personal data in learning analytics.

10.4.3 Provisions on quality assurance work

In section 1-6, universities and university colleges are required to have a “satisfactory internal system for quality assurance that will ensure and further develop the quality of the education”.

According to section 2-1, third paragraph, NOKUT shall, in consultation with the sector, issue regulations on criteria for the quality assurance work of the institutions. The further details on the content of the quality assurance work are regulated in section 2-1 of the Regulations on the quality of programmes of study:

Universities and university colleges shall manage the responsibility for the quality of education by way of systematic quality assurance work that ensures and contributes to developing the quality of the study programmes. Furthermore, the institutions shall facilitate ongoing development of the quality of education, be able to detect declines in quality in the study programmes and ensure satisfactory documentation of the quality assurance work. The institutions shall ensure the quality of all matters that impact the quality of studies, from information to potential applicants to completed education.

In addition, section 4-1 of the Academic Supervision Regulations contains requirements for systematic quality assurance work:

1) The institution’s quality assurance work must be based on a strategy and cover all substantial areas of significance for the quality of students’ learning outcomes […]. 4) The institution shall systematically collect information from relevant sources in order to be able to assess the quality of all study programmes. 5) Knowledge obtained from the quality assurance work shall be used to develop the quality of the study programmes and detect any declines in quality. Declines in quality must be rectified within a reasonable period of time. 6) Results from the quality assurance work shall be included in the platform of knowledge for assessment and strategic development of the institution’s overall study portfolio.

According to the guide that NOKUT has prepared, the above-mentioned requirement also entails an assessment of whether students actually achieve the desired learning outcomes (NOKUT, 2022). NOKUT also notes that the institutions shall collect information about all matters of relevance to the quality of studies. At the same time, it specifies that the institutions should not collect more information than is necessary to inform and assess the quality of studies.

Information from learning analytics can be one of several relevant sources in the quality assurance work of higher education institutions. The statutory tasks of the institutions presuppose that they process personal data. However, how such processing should take place in order to comply with the data protection legislation has not been addressed. The Expert Group believes that the legislation should clarify that processing personal data in learning analytics is relevant to the quality assurance work. For quality assurance purposes, it will primarily be relevant to base decisions on data at an aggregated level, which helps to minimise the data protection risks involved.

The Expert Group believes the legal basis for processing personal data in learning analytics for quality assurance work should be clarified. The Expert Group also believes that it is relevant to consider the provisions on quality assurance work as a legal basis for the “performance of a task carried out in the public interest” in Article 6(1)(e) of the GDPR.

10.4.4 Need for legislative amendments

In the sections above, the Expert Group has shown that the relevant provisions in the current legislation governing higher education do not provide a clear legal basis for processing personal data in learning analytics. The discussions in the proposition prior to the adoption of section 4-15 show that it is uncertain in what situations the institutions are permitted to process personal data in learning analytics. In its comments to the Expert Group, Sikt notes that the unresolved legal basis for learning analytics in the university and university college sector has been an obstacle (Sikt, 2022). The Expert Group believes that the provision in section 1-5 on the institutions’ responsibility for instruction and the provisions on quality assurance work contain relevant legal bases for processing personal data in learning analytics.

The Expert Group believes that there is a need to clarify what is required for institutions to process personal data in learning analytics, both in pedagogical practice and in quality assurance work.

10.5 Legal basis in the legislation governing the processing of personal data in learning analytics for vocational colleges

Several of the provisions that are relevant for assessing whether the vocational college legislation provides a suitable legal basis are modelled on similar provisions in the Universities and University Colleges Act. In the following sections, the Expert Group assesses whether provisions in the vocational college legislation are suitable to fulfil the requirement of a legal basis in Article 6(3) of the GDPR.

10.5.1 Provisions on the processing of personal data and the content of vocational college education

Section 4 of the Vocational Education Regulations is entitled “Collection and processing of personal data by the vocational colleges”. The first paragraph of the provision contains a right to process personal data regarding students “when the purpose of the processing is to safeguard the data subject’s rights, or to fulfil the school’s obligations pursuant to the Vocational Education Act”. The consultation paper on which section 4 is based notes that processing personal data is necessary in course management systems and during the admissions process for the vocational colleges (Norwegian Ministry of Education and Research, 2019). The second paragraph of section 4 of the Regulations expressly states which personal data may be processed to achieve the purposes stated in the first paragraph. Among other things, it mentions: “name, national identity number […] work experience and marks from upper secondary education, vocational colleges, universities and university colleges and trade and journeyman’s certificates retrieved from public authorities”. The provision does not address the right to process other personal data, such as data from students’ learning activities.

According to section 4 of the Vocational Education Act, vocational college education shall be “based on knowledge and experience from one or more professional fields and be in accordance with relevant pedagogical, ethical, artistic and scientific principles”. The Vocational Education Academic Supervision Regulations contain requirements for the content and form of vocational college education. Section 2-1 sets out a broad requirement that the education should, among other things, have “instruction, learning and assessment forms that are suitable for the students to achieve the learning outcomes”. The Expert Group finds that the provision is a relevant basis for processing vocational college students’ personal data in learning analytics.

The provision does not describe in more detail which types of processing of personal data may be relevant to suitable forms of instruction and learning. It thus makes it difficult for students to predict how their personal data will be processed in instruction and learning situations. The Expert Group believes that the provision does not provide a clear legal basis for processing students’ personal data in learning analytics. The Expert Group nevertheless finds that it would not be appropriate to specify in section 2-1 of the Vocational Education Academic Supervision Regulations that learning analytics may be among the instruction and learning methods. Learning analytics is just one of several instruments that can help ensure that students achieve the learning outcomes. Codifying this one instrument could give the impression that learning analytics takes priority in the choice of instruction and learning methods, which is not the intention.

10.5.2 Provisions on quality assurance work

Section 5 of the Vocational Education Act addresses accreditation and quality assurance. The fifth paragraph of the provision stipulates that vocational colleges shall have “satisfactory internal quality assurance systems”. According to section 5, sixth paragraph (d), the Norwegian Ministry may issue regulations on “requirements for quality assurance systems and quality assurance work”.

The requirements for the vocational colleges’ systematic quality assurance work are specified in section 4-1 of the Vocational Education Academic Supervision Regulations. In order to assess whether each individual programme of education meets the quality requirements, the vocational colleges must, according to section 4-1, third paragraph, “systematically collect […] information from students, employees, representatives from the professional field and any other relevant sources”. However, NOKUT’s guidance on the provision specifies that the institution should not collect more information than is necessary to inform and assess the quality of studies (NOKUT, 2020). The consultation paper on the provision does not discuss what this information may entail or how it may be processed (NOKUT, 2019). The absence of data protection discussions makes it unclear to what extent the provision may constitute a legal basis for processing personal data in learning analytics for quality assurance work.

10.5.3 Need for legislative amendments

The provisions that are relevant for processing personal data in learning analytics at vocational colleges regulate such processing at different levels of detail. The Expert Group believes that this creates uncertainty in situations where neither the legislation nor the consultation process has assessed the framework for and consequences of processing personal data.

The Expert Group believes that the legal basis for learning analytics in vocational colleges should be clarified. This applies both to the vocational colleges’ pedagogical practice and to their quality assurance work. Clarifying the legal basis is particularly important because a large proportion of students in the vocational college sector are online students, for whom analyses of student activity are especially relevant. For all vocational college students, and online students in particular, it is important that the legal basis in the legislation increases predictability as to how personal data are processed in learning analytics.

10.6 Summary of the Expert Group’s assessments of legal bases

The Expert Group finds that the current provisions on processing personal data in learning analytics are unclear. A general challenge is that the provisions setting out the tasks for which learning analytics may be relevant largely fail to ensure predictability for pupils and students as to how their personal data are processed. Moreover, the data protection consequences of learning analytics are hardly mentioned in the preparatory works to these provisions. For primary and secondary education and training, higher education and tertiary vocational education alike, there is a clear need to clarify the legislation so that it indicates what is required to process personal data in learning analytics.

For primary and secondary education and training, the Expert Group’s assessment is that the provisions setting out the relevant tasks of differentiated instruction, formative assessment and quality development can serve as legal bases for processing personal data in learning analytics. The Expert Group believes that the basis for processing personal data in learning analytics should be clarified, but does not find that the provisions setting out these tasks are suitable places to include specifications about learning analytics.

Regarding higher education, the Expert Group finds that the provisions on the institutions’ tasks and quality assurance work could be relevant legal bases for processing personal data in learning analytics. The Expert Group believes that the grounds for processing personal data in learning analytics must be clarified, but does not consider the provisions that set out the institutions’ tasks and responsibilities a suitable place to include such a specification.

The Expert Group finds that the provisions on vocational colleges’ instruction and learning methods and the provisions on quality assurance work are relevant legal bases for processing personal data in learning analytics in tertiary vocational education. The Expert Group believes that the processing basis must be clarified, but nevertheless finds that the provision on the vocational colleges’ instruction and learning methods is not a suitable place to include such a specification.

Footnotes

1. Self-regulation involves the ability to plan, implement and monitor one’s own learning and assess the extent to which one must change something to achieve a goal (Hopfenbeck, 2011; Pintrich, 2002; Winne, 2015).

2. https://www.regjeringen.no/no/tema/utenrikssaker/utviklingssamarbeid/bkm_agenda2030/id2510974/

3. https://www.fellesstudentsystem.no/

4. https://www.studiebarometeret.no/

5. https://ndla.no/

6. Cf. Regulations to the Education Act, section 3-15, second paragraph.

7. https://www.udir.no/tall-og-forskning/brukerundersokelser/elevundersokelsen/

8. https://www.udir.no/tall-og-forskning/statistikk/elevundersokelsen/

9. https://www.studiebarometeret.no/no/artikkel/2

10. https://www.studiebarometeret.no/no/artikkel/6

11. https://www.datatilsynet.no/rettigheter-og-plikter/personvernprinsippene/
