
Linhas Críticas

Print version ISSN 1516-4896 | On-line version ISSN 1981-0431

Linhas Críticas, vol. 27, Brasília, 2021. Epub May 26, 2021

https://doi.org/10.26512/lc.v27.2021.36840 

Article

Educational Panel: making educational data more easily accessible for municipal and state-level decision-making

1PhD in Philosophy of Education from the University of Toronto (2011). Researcher in Educational Information and Assessment/Permanent Staff at the Anísio Teixeira National Institute for Educational Studies and Research (Inep). Acted as Coordinator for Saeb's Federal Articulation and Information Dissemination (2017). Member of GERAJU - Generations and Youth Research Group in the Comparative Education Line (ECOE) of the University of Brasília.

2Master's degree in Public Administration from the University of Brasília (2017). Researcher in Educational Information and Assessment/Permanent Staff at the Anísio Teixeira National Institute for Educational Studies and Research (Inep). Acted as Coordinator for Saeb's Federal Articulation and Information Dissemination (2016-2017).


Abstract

This article analyzes the possibilities and limitations that an interactive platform such as the Educational Panel offers towards meeting the goals of the System of Basic Education Assessment (Saeb). It is divided into three parts: 1) a description of how the Panel is organized and the data it presents; 2) its adherence to the goals of Saeb and some of its potential as a management tool; and 3) challenges and possibilities that the reformulation of Saeb presents to the platform. In conclusion, we argue that the Panel contributes to operationalizing the goals of Saeb, but some gaps require attention, especially given the successive changes in the assessment and the new scenario imposed by covid-19.

Keywords: Large-scale assessment; System of Basic Education Assessment (Saeb); Educational Panel; Educational management


Introduction

Although it is already part of common sense, the understanding that human action in the daily transformation of the world happens by virtue of multiple determinations and is the result of an active (and perhaps unique) combination of historical moments, of economic, social, political, cultural and psychological situations, of social groups and of individuals reveals the complexity associated with the analysis of any social experience. In this sense, any reflection about the process of creation and development of the System of Basic Education Assessment (Saeb) does not exhaust the possibilities of analysis and, because of that, is always necessary. (Pestana, 2016, p. 72)[1]

In recent years, there has been an increase in the use of indicators in the public policy cycle in Brazil, a tendency aligned with changes in public administration towards tools for better planning and management of the actions of the State (Jannuzzi, 2005). In the educational field, the use of such evidence may contribute to better-grounded policies and decision-making and may generate better practices and results (Bauer & Sousa, 2015; Basso, 2017; Leão & Souza, 2020). However, this is not a linear and univocal trajectory, but rather a complex and multidimensional one: the process of formulating educational policy in general, and assessment policy in particular, is a mosaic of complex, diversely situated factors, permeated by the different visions, interpretations and actions of the various agents of educational policy, as well as by political pressures, contextual and historical contingencies, and tensions among the different arenas of society (Ball, 1998; Mainardes, 2006; Campbell & Levin, 2009; Castro, 2016; Pertile & Mori, 2020; Aguiar & Tuttman, 2020). In this sense, there are inherent limitations to the process of assessment, and it is therefore important that the use of such indicators be constantly grounded in properties such as relevance to the sociopolitical agenda, validity of the representation of the concept, reliability of the measurements, sensitivity to the actions foreseen, methodological transparency, communicability to the public, periodic updates and historical comparability, among others (Jannuzzi, 2005; Willms et al., 2012; Bauer & Sousa, 2015).

These requisites, especially attention to a multidimensional view of education with equity, plurality, scientific rigor and transparency as its pillars, guided the proposal of the Educational Panel[2], a Business Intelligence (BI) platform launched by the Anísio Teixeira National Institute for Educational Studies and Research (Inep) in 2015, in whose conception and development we took part as permanent staff in the sector responsible for articulating and disseminating information regarding the System of Basic Education Assessment (Saeb) (Basso, 2017; Macedo, 2011). The platform was created to answer a historic demand from schools, academia and society at large: that the results of large-scale assessments, Saeb in particular, be presented beyond mere performance on cognitive tests, and that they take municipal and state educational administrations as their main target audience, since the goal of Saeb is to subsidize public policy (Araújo, 2016; Bonamino, 2016; Castro, 2016; Freitas, 2016; Horta Neto et al., 2016; Pestana, 2016; Soares, 2016; Leão & Souza, 2020). The Educational Panel was therefore created to gather in a single site, and in a friendlier manner, information previously spread across different sites and formats, such as statistical synopses [3] and microdata. [4]

For assessment to make sense and fulfill its function, it needs to include feedback tools for society, and these tools must be intrinsically connected to the stated goals of the assessment (Bauer & Sousa, 2015; Pestana, 2016; Leão & Souza, 2020; Lustosa, 2020; Pertile & Mori, 2020). The idea for the Educational Panel arose from the challenges of making known the results of Saeb, a system originally designed to assess three main dimensions: educational indicators, school indicators and managerial indicators (Pestana, 2016). Over the years, the organization of Saeb underwent structural and methodological changes that made cognitive tests the main focus of analysis, at the expense of analyses of the social, school and pedagogical contexts (Bauer & Sousa, 2015; Bonamino, 2016; Pestana, 2016; Waiselfisz & Horta Neto, 2016). The challenge that has persisted over the years is: how can the data gathered by this assessment be presented in ways that are informative, clear, coherent, contextualized and fair?

This article analyzes the possibilities and limitations of an interactive platform such as the Educational Panel for attaining the goals of the System of Basic Education Assessment (Saeb). The analysis is divided into three parts: 1) the organization of the Panel and the data it presents; 2) its adherence to the goals of Saeb and some of its potential as a management tool at the municipal and state level; and 3) some of the challenges and possibilities that the reformulation of Saeb, undertaken to adjust to the new National Common Core (BNCC) and to the National Policy of Basic Education Assessment, presents to the platform, especially with regard to a conception of assessment that includes feedback to society. In conclusion, we argue that the Panel contributes to presenting the data collected by Saeb in a manner that is more contextualized, user-friendly and coherent with the goals of the assessment. However, some gaps require attention, especially as regards the non-cognitive aspects of the survey. Unless there is a very specific and directed effort, such gaps run the risk of widening, especially with the successive changes in the assessment and the new educational scenario imposed by covid-19.

Platform Design

The Educational Panel was created at Inep, in the Department of Basic Education Assessment (Daeb), in partnership with the Department of Educational Statistics (Deed) and the Department of Technology and Dissemination of Educational Information (DTDIE), in dialogue with representatives officially appointed by the department of education in each of the 27 Brazilian states (Basso, 2017). Its initial goal was to disseminate, in a contextualized manner, the results of the 2014 National Literacy Assessment (ANA) (Brazil, 2015a, pp. 67-69). In the following year, its scope was expanded to include the results of the different grades assessed by Saeb 2015 (Brazil, 2018, p. 86), at the end of the initial cycle (grade 5), middle cycle (grade 9) and high school (grades 12 or 13). From that year onwards, the Panel was formally institutionalized in article 16 of Inep Document nº 410/2016 (Brazil, 2016), which established the strategy for that year's ANA; the provision was replicated in article 22 of Inep Document nº 447/2017 (Brazil, 2017a) and in Document nº 366/2019 (Brazil, 2019), which regulated Saeb in 2017 and 2019 respectively.

The platform is organized into two main access options (Municipal and State), both with the same structure: three tabs (trajectory, context and learning), each presenting information on the municipal and state school boards present in the selected location. Thus, the Municipal Panel presents information about the municipal school board (RM) next to that of the state school board present in the municipality in question (REM), while the State Panel presents the information of each state board (RE) together with that of all the municipal boards present in that state (RME).
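To make this structure concrete, the sketch below models the two access options and their board pairings. It is an illustrative data model only, not Inep's implementation: all identifiers are ours, while the board acronyms (RM, REM, RE, RME) follow the definitions above.

```python
from dataclasses import dataclass
from enum import Enum

class Tab(Enum):
    TRAJECTORY = "Trajectory"  # School Census data on enrolment and flow
    CONTEXT = "Context"        # the six Inep indicators (INSE, ICGE, IED, ...)
    LEARNING = "Learning"      # participation and cognitive-test results

@dataclass
class PanelView:
    """One Panel page: a location and the pair of boards shown side by side."""
    location: str    # a municipality or a state
    primary: str     # the board selected for viewing (RM or RE)
    companion: str   # the counterpart displayed for comparison (REM or RME)
    tabs: tuple = (Tab.TRAJECTORY, Tab.CONTEXT, Tab.LEARNING)

def municipal_panel(municipality: str) -> PanelView:
    # Municipal board (RM) shown next to the state board present in
    # that municipality (REM).
    return PanelView(municipality, "RM", "REM")

def state_panel(state: str) -> PanelView:
    # State board (RE) shown next to the aggregate of the municipal
    # boards present in that state (RME).
    return PanelView(state, "RE", "RME")
```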

The first tab, "Trajectory", presents in table format the following data collected by the School Census: enrolment, average number of students per class, students enrolled in special education, full-time enrolment, rates of satisfactory and unsatisfactory performance, as well as drop-out rates and age/grade distortion. The information presented in this first tab is extremely important for understanding the quantitative aspect and the trajectory of students of a given board. As argued by Soares (2016, p. 143):

A regular school trajectory is the first evidence of a full offer of the right to education. By school trajectory we understand access, permanence, promotion and conclusion of the different stages in which schooling is organized. The quality of this dimension is measured by its regularity, that is, these different stages must be concluded at the expected age. Therefore, a trajectory that is complete but irregular or longer than necessary shows an inadequate offer of the right to education.

Each of the tables presented in this section brings data corresponding to three years: the year of the latest edition of Saeb and the two years prior to it. This tab therefore provides the board's administrative staff and the community at large with an overview of the evolution of these data from one edition of Saeb to the next, not only in the grades that participated in the cognitive tests, but in all the grades of each stage (initial cycle, middle cycle and high school).

The second tab, "Context", displays six indicators created by Inep, listed below with references to their respective technical specifications:

(1) Indicator of Socioeconomic Level (INSE), which measures the socioeconomic level of the group of students within schools. This indicator is based on the level of schooling of the students' parents and on the goods and services to which the students' families have access (Brazil, 2015b);

(2) Indicator of School Management Complexity (ICGE), which categorizes the complexity of school management according to the size of the school, the number of shifts it operates, and the number and complexity of the modalities and stages it offers (Brazil, 2014a);

(3) Indicator of Teaching Effort (IED), which considers information on teaching shifts, the number of schools and stages in which teachers teach, and the number of students taught, in order to identify aspects of the teaching workload that contribute to overload and burnout in the exercise of the teaching profession (Brazil, 2014b);

(4) Indicator of Adequacy of Academic Training (IAFD), which considers whether teachers' initial training corresponds to the subjects and stages they teach, according to current legal guidelines (Brazil, 2014c);

(5) Indicator of Teacher Regularity (IRD), which measures the permanence of teachers in schools over a five-year interval (Brazil, 2015c);

(6) Indicator of Development of Basic Education (IDEB), which is based on the results of the cognitive tests applied by Inep and on students' average approval rates in each cycle.

Each of these indicators is displayed in the Panel in table and graph formats, comparing the municipal and state boards in a given location. The indicators aim to present aspects of the educational scenario of each school and each school board beyond the results of the cognitive tests. We understand that this broader, integrated view of each context may provide a fairer and more complete picture of the quality of the educational offer, in which student performance in cognitive tests is only one of many dimensions (Willms et al., 2012; Bauer & Sousa, 2015; Araújo, 2016; Bonamino, 2016; Castro, 2016; Freitas, 2016; Pestana, 2016; Soares, 2016; Leão & Souza, 2020; Lustosa, 2020).
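As a rough illustration of how the Context tab pairs the two boards, the sketch below renders one such comparison as plain text. The indicator values are placeholders and the rendering logic is ours; the Panel itself presents these data in interactive tables and graphs.

```python
# Acronyms follow the Inep indicators listed above; the rendering logic and
# the sample values are illustrative only.
CONTEXT_INDICATORS = {
    "INSE": "Socioeconomic Level",
    "ICGE": "School Management Complexity",
    "IED": "Teaching Effort",
    "IAFD": "Adequacy of Academic Training",
    "IRD": "Teacher Regularity",
    "IDEB": "Development of Basic Education",
}

def context_table(municipal: dict, state: dict) -> str:
    """Render one indicator per row, municipal and state boards side by side."""
    rows = [f"{'Indicator':42} {'Municipal':>10} {'State':>10}"]
    for code, name in CONTEXT_INDICATORS.items():
        label = f"{code} - {name}"
        rows.append(f"{label:42.42} {municipal.get(code, '-'):>10} "
                    f"{state.get(code, '-'):>10}")
    return "\n".join(rows)

# Placeholder values, not real results:
print(context_table({"INSE": 4.8, "IDEB": 5.6}, {"INSE": 5.1, "IDEB": 5.9}))
```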

The third and last tab, "Learning", presents information about the participation and performance of the boards' student population in the cognitive tests, both as average results and as the percentage distribution of students across the levels of the proficiency scale, with their respective pedagogical interpretations. In addition, from 2016 onwards, this tab also includes menus for comparing averages and for historical series: the first presents the averages of that board in that specific edition next to those of the other boards in that location (municipal, state, federal and the national average), while the "historical series" menu presents the board's information in every edition of Saeb since 2013, in terms of averages as well as of its distribution on the proficiency scale.
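The percentage distribution shown in this tab reduces to a simple computation over the number of students at each level of the proficiency scale. A minimal sketch follows, with made-up counts, since the levels and their cut-off points vary by test.

```python
def proficiency_distribution(counts: list) -> list:
    """Percentage of students at each level of the proficiency scale."""
    total = sum(counts)
    return [round(100 * c / total, 1) for c in counts] if total else []

# Illustrative counts for four levels (not real Saeb figures):
# 120 + 340 + 280 + 60 students -> [15.0, 42.5, 35.0, 7.5] percent
print(proficiency_distribution([120, 340, 280, 60]))
```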

The aim of this section was to present the Educational Panel and the data it contains. In the next sections, we consider how these elements relate to the goals of Saeb and list a few points of attention on the horizon of its imminent reform, in light of the new National Common Core and the National Policy of Assessment.

The goals of Saeb and some conceptual aspects of the Panel

Article 2 of Inep/Mec Document nº 366, published on 29 April 2019 (Brazil, 2019), establishes the guidelines of Saeb for that year:

Article 2: SAEB is a system of external large-scale assessment, consisting of a set of instruments, conducted periodically by INEP since the 1990s, and that has as its objectives, in the scope of Basic Education:

I. To produce educational indicators for Brazil, its Regions and Units of the Federation, and, when possible, for Municipalities and Schools, aiming at maintaining the comparability of the data, thus allowing the development of historical series;

II. To assess quality, equity and efficiency of the education offered in the country by its various governmental spheres;

III. To subsidize the formulation, monitoring and improvement of public policies in education based on evidence, aiming at the social and economic development of Brazil;

IV. To develop technical and scientific competence in the area of educational assessment, through exchange among institutions involved in teaching and research.

Having these four objectives of Saeb as points of reference, it is possible to identify the central role that the Educational Panel plays in achieving them, considering the multiplicity of instruments, actors and spheres encompassed by Saeb.

Thus, as regards the first objective, in the previous section we presented the indicators currently available in the Panel for municipalities and states, in particular with respect to the historical series. In a way, the Panel is inspired by the School Bulletin, another feedback instrument produced by Inep since 2009, whose focus is each school participating in Saeb. Our goal in conceiving the Panel was to provide municipal and state educational administrative staff with the type of information presented to the school principal in the Bulletin, and more. Therefore, while the School Bulletin presents the number of students who participated in the tests at the end of each cycle, the Panel offers a wider panorama of the enrolment situation and trajectory over three consecutive years for all the grades in a given cycle. In this way, the Panel aims to give visibility specifically to educational data pertaining to the sphere of influence of the municipal and state educational authorities, going beyond cognitive test results (for example, by bringing data about teaching effort, school complexity, average number of students per class and number of students in special education).

Likewise, the context tab replicates and expands what is presented in the School Bulletin. While the latter has presented, since 2015, the Indicator of Socioeconomic Level (INSE) and the Indicator of Adequacy of Academic Training (IAFD), the Panel brings, beyond these two, four other indicators produced by Inep, in an effort to make available to state and municipal educational staff a wider variety of fundamental data to support their decisions in ascertaining and improving school quality, which, as widely indicated in the literature, and in keeping with Saeb's second objective, goes beyond student performance in cognitive tests (Willms et al., 2012; Bauer & Sousa, 2015; Araújo, 2016; Freitas, 2016; Pestana, 2016; Soares, 2016; Santos & Pinto, 2016; Santos et al., 2017). The concepts adopted in this second objective (quality, equity and efficiency) are polysemic and subject to different interpretations, appropriations and interests (Cury, 2014; Araújo, 2016; Santos et al., 2017; Santos & Ferreira, 2020; Silva, 2020; Pertile & Mori, 2020; Silva et al., 2020). There is an extensive list of criticisms in the literature concerning a view of assessment restricted to test results and fundamentally associated with curricular narrowing (Freitas, 2016; Fernandes, 2016; Silva, 2020). Assessing educational quality goes beyond the limitations of large-scale assessment; hence our concern, in conceiving the Panel, with making available first and foremost the information regarding trajectory and context, and not simply or mainly the information on performance and learning. It remains imperative to analyze more deeply the social inequities involved in the process of learning, beyond what cognitive tests are capable of showing (Willms et al., 2012; Bonamino, 2016).

With respect to Saeb's third objective, and bearing in mind this broader perception that considers the context of schools and school boards, the theoretical framework that guided the development of the Educational Panel also had as its premise the importance of data and evidence for public policy at its different levels and spheres, from a multidimensional view of education in which equity occupies a central role and which encompasses a complex web of subjects and agents: state, municipal and school educational staff and authorities, teachers, students and the whole school community (Ball, 1998; Mainardes, 2006; Silva, 2020). Here, the Panel plays an important role (though far from an exhaustive one) in contributing as subsidy to public educational policies "based on evidence" and "aiming at the social and economic development of Brazil".

Data collection through independent external assessment with scientific rigor is a type of action that aims at transparency and effectiveness, allows the State and its agents at different levels to plan their actions, and also guides investments and identifies educational demands (Pestana, 2016). To explain how such evidence may be used in the formulation of policies, the specialized literature offers various conceptions. The process of public policy formulation can be understood as the moment in which means are defined in order to meet perceived needs (Howlett et al., 2013). It can also be understood as a process of generating a set of plausible options for solving problems. Wiseman (2010) presents three theoretical approaches: the technical-functional perspective, the sociopolitical perspective, and the institutional or organizational perspective. The technical-functional perspective is characterized by a direct approach to technical and functionally efficient decision-making. In this approach, evidence is used "to find the most effective and successful ways to address important educational issues and problems, the goal most often being increased student learning and effective classroom teaching at the least possible expense" (Wiseman, 2010, p. 4). The sociopolitical perspective is more complex. In this approach, social and political agendas affect the decisions and the manner in which educational problems are addressed, and evidence is a way to promote these agendas. Finally, the institutional or organizational perspective "suggests that rationally legitimized models for policymaking exist and become slowly institutionalized as part of many organizational systems" (Wiseman, 2010, p. 4).

In this sense, the data collected and presented by means of platforms such as the Educational Panel may have different usages and approaches. Weiss (1998) presents some of these usages, specifically those related to assessment data, which in her view can be divided into four categories: the instrumental use; the conceptual use; the use as a means of persuasion; and the enlightenment use. The instrumental use is understood as a means to decision-making: the process of assessment is able to produce findings that may influence the scope of actions under examination and lead to decisions based on these findings. For example, a municipal or state authority can consider the data the Panel presents about Teaching Effort and, based on that information, stipulate a maximum number of shifts or of students per teacher in her school board. The second use is conceptual, when the results of assessment may change the understanding of the nature and function of the program; Faria and Filgueiras (2007) categorize this usage as the educational function of assessment. The third use is as an instrument of persuasion, seen when assessment is used to "legitimate positions" or "gain adherents" (Weiss, 1998, p. 24). In this scenario, the program manager does not seek to become aware of deficiencies; rather, she uses assessment to validate her opinions and gather support (Faria & Filgueiras, 2007). The fourth use is enlightenment, which functions as influence on institutions and agents not directly related to the program or the policy: the results generated through assessment impact school boards, alter policy paradigms as well as the governmental agenda, and influence beliefs and the organization of institutions (Weiss, 1998, p. 24; Faria, 2005).

The Educational Panel has as its mission to present the information collected by Saeb in a more contextualized and useful manner to municipal and state educational administrations and to the educational community at large. Consonant with Saeb's third objective, therefore, the platform aims at fostering an exchange with the community so as to allow for continuous improvement based on the needs of users and on best practices found in academia, in the school community and in public management. The Panel also aligns itself with one of the institutional aims of Inep, whose premise is to develop a system of assessment and statistics and to provide subsidies for public administration in planning policies aimed at providing high-quality education (Brazil, 2017b).

It is always important, however, to highlight that "high-quality education" is a complex concept, subject to different interpretations and consisting, like Saeb, of many dimensions, agents and spheres (Cury, 2014; Bauer & Sousa, 2015; Araújo, 2016; Santos & Ferreira, 2020; Pertile & Mori, 2020; Silva, 2020; Silva et al., 2020). Therefore, in order to encompass the greatest possible number of dimensions and offer evidence about basic education in a more plural, significant and fairer manner, much remains to be done as regards Saeb's fourth objective of developing "technical and scientific competence" and "exchange between research institutions" in all spheres: university and school communities as well as federal, state, municipal and school administrative bodies (Castro, 2016; Soares, 2016; Waiselfisz & Horta Neto, 2016; Fini & Santos, 2020). As stated in article 3 of Inep Document nº 366/2019 (Brazil, 2019), and reaffirmed in article 7 of Mec Document nº 458/2020 (Brazil, 2020) and in article 4 of Inep Document nº 10/2021 (Brazil, 2021):

Considering the quality of Basic Education as a multidimensional attribute, Saeb adopts as its reference seven dimensions of quality of Basic Education, which are interrelated so as to promote regular learning trajectories aiming at a holistic education of Brazilian students:

I – School Availability;

II – Teaching and Learning;

III - Funding;

IV – Educational professionals;

V - Management;

VI - Equity; and

VII - Citizenship, Human Rights and Values.

In order for Saeb to encompass all these dimensions, it needs to develop strong bonds with the federative units and the academic and school communities, as noted in the aforementioned documents. In our view, these bonds are made stronger inasmuch as they recover some important aspects of Saeb's original design, while also taking in what is most current in the educational field. In the next section, we explore some of the limitations and possibilities that the new definitions for Saeb present to the Panel.

Current challenges and possibilities

The very conception of Saeb (...) leads to exhaustion, because of the use of incomprehensible scales. I believe that 90% of Saeb's potential is not employed. I think it should go back to its origins, because there were very promising proposals that were abandoned along the way. Today we have resources that we did not have back then. The [state and municipal] departments have gone digital. (Waiselfisz & Horta Neto, 2016, p. 192)

Historically, the Panel has expanded its scope, although not at the same pace as the transformations Saeb underwent in the last few years. Initially, the Panel presented only the Reading, Writing and Math results for the 3rd grade, in both the Municipal and the State versions. From 2016 onwards, with the inclusion of the other grades assessed by Saeb's cognitive tests, the Panel started to offer information about trajectory and context for the three cycles (initial, middle and high school). From that moment onwards, both panels also started to present the Portuguese and Math results for the 5th and 9th grades. The results for the 3rd and 4th years of high school, in turn, were presented only in the State Panel, since this stage does not fall under municipal jurisdiction and, up until that edition, in 2015, only a sample of students at this stage participated in Saeb. In 2017, Saeb became mandatory for all students in the last year of high school enrolled in public schools, but the results were included in the Municipal Panel only in 2020, when the results of Saeb 2019 were made public. That edition of Saeb, in turn, encompassed the following populations: daycare and preschool (sample); 2nd grade (sample); 5th and 9th grades (census); 3rd and 4th years of high school (census).

Nowadays, Saeb is part of the National Policy for Basic Education Assessment, under Ministry of Education Document nº 458, published on 5 May 2020 (Brazil, 2020), which states in article 8:

Saeb will take place annually, on a census basis, with the goal of verifying the competences and abilities expected from basic education, in accordance with the National Common Core (BNCC) and the corresponding national curriculum guidelines.

The current reformulation of Saeb, as established in this document, involves not only the broadening of the target population and the reformulation of the cognitive tests' matrices, scales and items to conform to the new curricular guidelines, but also the long-expected reformulation of the contextual questionnaires administered to students, teachers, school principals and, for the first time, municipal managers. In a move to reverse the tendency indicated by Bonamino (2016), the contextual surveys underwent considerable changes in the 2019 edition, both in content and in format, being administered electronically for the first time to school principals and municipal managers. [5] Up to the moment of writing this article, the results of the survey are available only in microdata format [6] which, although open to the public, requires specific statistical software, and knowledge thereof, in order to be read and interpreted.
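For readers unfamiliar with what "specific statistical software" entails in practice, the fragment below sketches one common workflow for opening such microdata with Python's pandas library. All file and column names are hypothetical stand-ins; each Saeb edition publishes its own file layout and data dictionary, which must be consulted before any real analysis.

```python
import pandas as pd

# Hypothetical file for the principals' questionnaire; the actual name and
# layout come from the edition's data dictionary.
df = pd.read_csv(
    "TS_DIRETOR.csv",
    sep=";",              # Inep microdata is typically semicolon-separated
    encoding="latin-1",   # and Latin-1 encoded
    low_memory=False,
)

# Tabulate the answers to one questionnaire item per state. The column
# names below are illustrative, not the actual dictionary codes.
summary = (
    df.groupby("SG_UF")["TX_RESP_Q001"]
      .value_counts(normalize=True)
      .rename("share")
      .reset_index()
)
print(summary.head())
```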

A fundamental question to be considered, therefore, is how to make all this information available in a manner that is accessible and significant to the public, in particular to municipal and state educational staff, given that the structure in which the Panel was designed does not accommodate the recent changes in Saeb. Some of these changes, such as the definition of the seven dimensions of Saeb, the inclusion of early childhood education in the scope of the assessment, the cognitive tests for 2nd graders, the inclusion of Natural and Social Sciences in the tests for 9th graders, and the reformulation of the contextual surveys, were established for Saeb 2019 (Brazil, 2019) but are still not reflected in Saeb's feedback instruments. Other changes are announced for the future and remain uncertain, such as the inclusion of every grade in the scope of the assessment, as well as the use of Saeb as an alternative means of university entrance, currently known as "Enem seriado" (Brazil, 2021, art. 5º). Besides the four objectives of Saeb mentioned in the previous section, Ministry of Education Document nº 458/2020 (Brazil, 2020) also includes the following, restated in article 3 of Inep Document nº 10/2021 (Brazil, 2021):

Article 6. I – to build a culture of assessment, offering to society, in a transparent manner, information about the teaching-learning process in each school, comparable at the national level, annually, and giving results in sufficient time to allow for pedagogical intervention on the part of teachers and the other members of the school community.

A permanent challenge is how to offer feedback that is significant, timely and coherent with Saeb's objectives (Castro, 2016; Gomes, 2016; Soares, 2016; Pestana, 2016; Lustosa, 2020). To address this challenge, it is important to define who the target audience is, both for the platform and for the assessment itself. The lack of clarity in the objectives of Saeb leads to divergence concerning its usages and available tools, sometimes favoring managerial use over pedagogical use, and vice-versa. In this sense, it is useful to retrieve aspects of Saeb's original design, which carried the idea of "communicating vessels" (Waiselfisz & Horta Neto, 2016), with greater cooperation among the federative units and greater attention to what quality consists of, which goes beyond student performance in cognitive tests. [7] As Freitas points out:

We need a broader view of the national system of assessment. The agents of educational assessment are divided between those who act in the federal level, conducting external large-scale assessment; those who are inside the school, that is, the school community itself; and those who are inside the classrooms, conducting the assessment of learning directly with the student. (Freitas et al., 2009, as quoted by Freitas, 2016, p. 134)

The current scenario of the National Policy for Basic Education Assessment includes not only Saeb, but also the National Exam for Certification of Skills for Adults and Young Adults (Encceja) and the National High School Exam (Enem). While exams focus on individual cognitive test results in order to provide the test-taker with a certificate for this or that end, the goal of Saeb as a systemic assessment implies a greater concern with measuring and improving what is offered by the educational system as a whole; it is therefore not reducible to the average result, or the sum of results, obtained by individuals in cognitive tests. Although, in theory, there may be correlated aspects between these two aims, this is only one of many possible communicating vessels, and may be the one in which the cost-benefit ratio (whether in pedagogical or in budgetary terms) is the least beneficial to all people involved in education: students, teachers, school principals, municipal, state and federal administrative staff, researchers and the public at large.

Aligned with Saeb's original design and with extensive literature in the field, we understand that over the last few years, and especially in this moment of social, political and health crisis, Saeb needs to evolve towards offering feedback that is more focused on the resources and conditions offered by the educational system, and less on specific cognitive results obtained by individual schools or students. Beyond the appeal made by the National Council of Education and by civil society for Saeb 2021 to be sample-based on account of the pandemic, the thesis that a well-designed sample is capable of delivering the main benefits of a large-scale assessment of this magnitude, while avoiding many of its main negative points, is recurrent in the literature (Waiselfisz & Horta Neto, 2016; Freitas, 2016; Lustosa, 2020).

With the development of digital platforms, propelled even further by the pandemic and the forced acceleration of remote teaching and learning at all educational levels, the possibilities for significant feedback are even greater, often delivered by the private sector, which also conditions their use to its own aims and goals (Soares, 2016; Pertile & Mori, 2020; Rodrigues, 2020). It falls to Inep, as part of its institutional mission, to accompany these advancements so as to democratize access to high-quality educational information and provide subsidy to decision-making, taking into consideration a wider range of aspects of the right to education, in which cognitive test results are only one among many facets of something as complex and multifaceted as basic education.

Final Considerations

With a national system of assessment that is becoming bigger and more complex, and in which test results generate immediate consequences for individuals, schools and school boards, it is important that every assessment initiative be coherent, articulated and technically grounded, and that it present in a clearer and more transparent manner why, what for, for whom, what and how to assess. (Pestana, 2016, p. 81)

The Educational Panel was created in 2015 as an initiative of Inep's technical team, of which we were part, to offer a wider range of elements of analysis to state and municipal educational administrative staff. Even with its limitations, this contextualized feedback, with information aggregated by area, was an effort to give the educational community a clearer and fairer view of the organization and scope of the teaching and learning conditions of each school board.

The platform's aim, to the present day, is to make aggregated information obtained through Saeb available and easily accessible to educational administrative staff, the academic community and the public at large. If well presented and well articulated with the various school boards, these data can be more than simple information about the educational scenario; they can ultimately serve as a lever for change in the school boards (Kellaghan et al., 2011), especially in such a difficult and transformative moment as the one brought about by the covid-19 pandemic.

However, with the changes that Saeb has been undergoing in the last few years, it is necessary to redesign the Educational Panel and the feedback tools of Saeb as a whole, in order to keep up with its various recent changes and with those brought about by the National Policy of Basic Education Assessment. Beyond the initiated or announced expansion of the target population, it is necessary to consider how to offer the public more information about the other dimensions of education listed in Ministry of Education Document nº 458/2020 (Brazil, 2020).

This future agenda is a challenge for Inep and for the educational community in general. Only with data that are more encompassing, easy to access and use, and that contextualize the school boards in ways that are fair, clear and timely, including information about learning, inequity and operations, will it be possible to formulate public policies that promote continuous improvement in basic education across the country.

REFERENCES

Aguiar, M. S., & Tuttman, M. T. (2020). Políticas educacionais no Brasil e a Base Nacional Comum Curricular: disputas de projetos. Em Aberto, 33(107), 69-94. https://doi.org/10.24109/2176-6673.emaberto.33i107.4533

Araújo, I. A. (2016). Avaliação em larga escala e qualidade: dos enquadres regulatórios aos caminhos alternativos. Linhas Críticas, 22(48), 462-479. https://doi.org/10.26512/lc.v22i48.4920

Ball, S. J. (1998). Big policies/small world: an introduction to international perspectives in education policy. Comparative Education, 34(2), 119-130. https://doi.org/10.1080/03050069828225

Basso, F. V. (2017). Uso dos resultados do Saeb/Prova Brasil na formulação de políticas educacionais estaduais [Dissertação de mestrado, Universidade de Brasília]. Repositório institucional da UnB. http://repositorio.unb.br/handle/10482/31697

Bauer, A., & Sousa, S. Z. (2015). Indicadores para avaliação de programas educacionais: desafios metodológicos. Ensaio: avaliação de políticas públicas em educação, 23(86), 259-284. https://doi.org/10.1590/S0104-40362015000100010

Bonamino, A. (2016). A evolução do Saeb: desafios para o futuro. Em Aberto, 29(96), 113-126. https://doi.org/10.24109/2176-6673.emaberto.29i96.%25p

Brasil. (2014a). Nota Técnica 40, de 17 de dezembro de 2014 (Indicador de Complexidade de Gestão). Instituto Nacional de Estudos e Pesquisas Educacionais Anísio Teixeira. https://download.inep.gov.br/informacoes_estatisticas/indicadores_educacionais/2014/escola_complexidade_gestao/nota_tecnica_indicador_escola_complexidade_gestao.pdf

Brasil. (2014b). Nota Técnica 39, de 17 de dezembro de 2014 (Indicador de Esforço Docente). Instituto Nacional de Estudos e Pesquisas Educacionais Anísio Teixeira. https://download.inep.gov.br/informacoes_estatisticas/indicadores_educacionais/2014/docente_esforco/nota_tecnica_indicador_docente_esforco.pdf

Brasil. (2014c). Nota Técnica 20, de 21 de novembro de 2014 (Indicador de Formação Docente). Instituto Nacional de Estudos e Pesquisas Educacionais Anísio Teixeira. https://download.inep.gov.br/informacoes_estatisticas/indicadores_educacionais/2014/docente_formacao_legal/nota_tecnica_indicador_docente_formacao_legal.pdf

Brasil. (2015a). Avaliação Nacional da Alfabetização: Relatório 2013-2014: volume 1: da concepção à realização. Instituto Nacional de Estudos e Pesquisas Educacionais Anísio Teixeira. http://inep.gov.br/documents/186968/484421/Relat%C3%B3rio+ANA+2013-2014+-+Da+concep%C3%A7%C3%A3o+%C3%A0+realiza%C3%A7%C3%A3o/8570af6a-c76e-432a-846f-e69bbb79e4b2?version=1.2

Brasil. (2015b). Nota Técnica: Indicador de Nível Socioeconômico das Escolas de Educação Básica. Instituto Nacional de Estudos e Pesquisas Educacionais Anísio Teixeira. https://download.inep.gov.br/informacoes_estatisticas/indicadores_educacionais/2015/nota_tecnica/nota_tecnica_inep_inse_2015.pdf

Brasil. (2015c). Nota Técnica 11, de 25 de junho de 2015 (Indicador de Regularidade Docente). Instituto Nacional de Estudos e Pesquisas Educacionais Anísio Teixeira. https://download.inep.gov.br/informacoes_estatisticas/indicadores_educacionais/2014/docente_regularidade_vinculo/nota_tecnica_indicador_regularidade_2015.pdf

Brasil. (2016). Portaria nº 410, de 22 de julho de 2016 (Estabelece a estratégia para a realização da Avaliação Nacional da Alfabetização - ANA, no ano de 2016). Instituto Nacional de Estudos e Pesquisas Educacionais Anísio Teixeira. https://download.inep.gov.br/educacao_basica/prova_brasil_saeb/legislacao/2016/portaria_n410_22072016_ana.pdf

Brasil. (2017a). Portaria nº 447, de 24 de maio de 2017 (Estabelece diretrizes para o planejamento e a operacionalização do Sistema de Avaliação da Educação Básica (SAEB) no ano de 2017). Instituto Nacional de Estudos e Pesquisas Educacionais Anísio Teixeira. https://download.inep.gov.br/educacao_basica/saeb/2017/legislacao/portaria_n447_24052017.pdf

Brasil. (2017b). Portaria nº 986, de 21 de dezembro de 2017 (Aprova o Regimento Interno do Instituto Nacional de Estudos e Pesquisas Educacionais Anísio Teixeira - Inep). Instituto Nacional de Estudos e Pesquisas Educacionais Anísio Teixeira. http://inep.gov.br/web/guest/sobre-o-inep/institucional

Brasil. (2018). Relatório SAEB (ANEB e ANRESC) 2005-2015: panorama da década. Instituto Nacional de Estudos e Pesquisas Educacionais Anísio Teixeira. https://download.inep.gov.br/educacao_basica/saeb/2018/documentos/livro_saeb_2005_2015_completo.pdf

Brasil. (2019). Portaria nº 366, de 29 de abril de 2019 (Estabelece as diretrizes de realização do Sistema de Avaliação da Educação Básica (SAEB) no ano de 2019). Instituto Nacional de Estudos e Pesquisas Educacionais Anísio Teixeira. https://download.inep.gov.br/educacao_basica/saeb/2019/legislacao/portaria_n366_29042019.pdf

Brasil. (2020). Portaria nº 458, de 5 de maio de 2020 (Institui normas complementares necessárias ao cumprimento da Política Nacional de Avaliação da Educação Básica). Ministério da Educação. https://download.inep.gov.br/educacao_basica/saeb/2020/legislacao/portaria_n458_05052020.pdf

Brasil. (2021). Portaria nº 10, de 8 de janeiro de 2021 (Estabelece parâmetros e fixa diretrizes gerais para implementação do Sistema de Avaliação da Educação Básica - Saeb, no âmbito da Política Nacional de Avaliação da Educação Básica). Instituto Nacional de Estudos e Pesquisas Educacionais Anísio Teixeira. https://www.in.gov.br/en/web/dou/-/portaria-n-10-de-8-de-janeiro-de-2021-298322305

Campbell, C., & Levin, B. (2009). Using data to support educational improvement. Educational Assessment, Evaluation and Accountability, 21(1), 47-65. https://link.springer.com/article/10.1007/s11092-008-9063-x

Castro, M. H. G. (2016). O Saeb e a agenda de reformas educacionais: 1995 a 2002. Em Aberto, 29(96), 85-98. https://doi.org/10.24109/2176-6673.emaberto.29i96.%25p

Cury, C. R. J. (2014). A qualidade da educação brasileira como direito. Educação & Sociedade, 35(129), 1053-1066. https://doi.org/10.1590/ES0101-73302014143981

Faria, C. A. P. (2005). A política da avaliação de políticas públicas. Revista Brasileira de Ciências Sociais, 20(59), 97-109. https://doi.org/10.1590/S0102-69092005000300007

Faria, C. A. P., & Filgueiras, C. A. C. (2007). As políticas dos sistemas de avaliação da educação básica do Chile e do Brasil. Em Hochman, G., Arretche, M., & Marques, E. (Orgs.), Políticas públicas no Brasil (pp. 327-368). Fiocruz.

Fernandes, R. (2016). A universalização da avaliação e a criação do Ideb: pressupostos e perspectivas. Em Aberto, 29(96), 99-112. https://doi.org/10.24109/2176-6673.emaberto.29i96.%25p

Fini, M. I., & Santos, A. V. F. (2020). Currículo comum, avaliações externas e qualidade da educação. Em Aberto, 33(107), 191-202. https://doi.org/10.24109/2176-6673.emaberto.33i107.4535

Freitas, L. C. (2016). A importância da avaliação: em defesa de uma responsabilização participativa. Em Aberto, 29(96), 127-140. https://doi.org/10.24109/2176-6673.emaberto.29i96.%25p

Gomes, C. A. (2016). O tema da avaliação educacional na Constituição de 1988 e na Lei de Diretrizes e Bases da Educação de 1996. Em Aberto, 29(96), 53-70. https://doi.org/10.24109/2176-6673.emaberto.29i96.%25p

Horta Neto, J. L., Junqueira, R. D., & Oliveira, A. S. (2016). Do Saeb ao Sinaeb: prolongamentos críticos da avaliação da educação básica. Em Aberto, 29(96), 21-40. https://doi.org/10.24109/2176-6673.emaberto.29i96.%25p

Howlett, M., Ramesh, M., & Perl, A. (2013). Política Pública: seus ciclos e subsistemas: uma abordagem integral. Campus.

Jannuzzi, P. M. (2005). Indicadores para diagnóstico, monitoramento e avaliação de programas sociais no Brasil. Revista do Serviço Público, 56(2), 137-160. https://doi.org/10.21874/rsp.v56i2.222

Kellaghan, T., Greaney, V., & Murray, T. S. (2011). Pesquisas do Banco Mundial sobre avaliações de desempenho educacional, v. 5: O uso dos resultados da avaliação do desempenho educacional. World Bank. https://openknowledge.worldbank.org/bitstream/handle/10986/2667/501710PUB00POR00Box0361492B0PUBLIC0.pdf?sequence=5&isAllowed=y

Leão, B. L. F., & Souza, A. S. (2020). Sistemas municipais de avaliação da educação (2014-2019): o que as pesquisas revelam? Linhas Críticas, 26, 1-19. https://periodicos.unb.br/index.php/linhascriticas/article/view/33369

Lustosa, L. A. (2020). Um estudo sobre o plano amostral do Saeb [Dissertação de mestrado, Universidade Federal de Juiz de Fora]. Repositório Institucional da UFJF - Caed. http://mestrado.caedufjf.net/um-estudo-sobre-o-plano-amostral-do-saeb/

Macedo, E. P. N. (2011). Philosophy of the many: high school philosophy and a politics of difference [PhD Thesis, University of Toronto]. TSpace Repository - School of Graduate Studies. https://tspace.library.utoronto.ca/handle/1807/31847

Mainardes, J. (2006). Abordagem do ciclo de políticas: uma contribuição para a análise de políticas educacionais. Educação & Sociedade, 27(94), 47-69. https://doi.org/10.1590/S0101-73302006000100003

Pertile, E. B., & Mori, N. N. R. (2020). Avaliação: a relação entre significado, concepção e procedimentos. Linhas Críticas, 26, 1-15. https://periodicos.unb.br/index.php/linhascriticas/article/view/34246/28161

Pestana, M. I. (2016). Trajetória do SAEB: criação, amadurecimento e desafios. Em Aberto, 29(96), 71-84. http://rbep.inep.gov.br/ojs3/index.php/emaberto/article/view/3152/2887

Rodrigues, E. S. J. (2020). Estudos de plataforma: dimensões e problemas do fenômeno no campo da educação. Linhas Críticas, 26, 1-12. https://periodicos.unb.br/index.php/linhascriticas/article/view/28150

Santos, A. A., Horta Neto, J. L., & Junqueira, R. D. (2017). Sistema Nacional de Avaliação da Educação Básica (Sinaeb): proposta para atender ao disposto no Plano Nacional de Educação. Instituto Nacional de Estudos e Pesquisas Educacionais Anísio Teixeira. https://anped.org.br/sites/default/files/images/sistema_nacional_de_avaliacao_da_educacao_basica_sinaeb_-_proposta_para_atender_ao_disposto_no_plano_nacional_de_educacao_1.pdf

Santos, A. V. F., & Ferreira, M. S. (2020). Currículo nacional comum: uma questão de qualidade? Em Aberto, 33(107), 27-42. https://doi.org/10.24109/2176-6673.emaberto.33i107.4528

Santos, J. R. S., & Pinto, V. F. F. (2016). Por um alargamento da qualidade educacional: um olhar retrospectivo para as avaliações em larga escala no Brasil. Em Aberto, 29(96), 209-216. https://doi.org/10.24109/2176-6673.emaberto.29i96.%25p

Silva, F. T. (2020). O nacional e o comum no ensino médio: autonomia docente na organização do trabalho pedagógico. Em Aberto, 33(107), 155-172. https://doi.org/10.24109/2176-6673.emaberto.33i107.4489

Silva, S. S., Pires, E. D. P. B., & Ferraz, M. O. M. (2020). Reflexos da política de gestão gerencial sobre o trabalho do coordenador pedagógico. Linhas Críticas, 26, 1-14. https://periodicos.unb.br/index.php/linhascriticas/article/view/3176

Soares, J. F. (2016). O direito à educação no contexto da avaliação educacional. Em Aberto, 29(96), 141-152. https://doi.org/10.24109/2176-6673.emaberto.29i96.%25p

Waiselfisz, J. J., & Horta Neto, J. L. (2016). As origens do Saeb. Em Aberto, 29(96), 177-193. https://doi.org/10.24109/2176-6673.emaberto.29i96.2705

Weiss, C. H. (1998). Have we learned anything new about the use of evaluation? The American Journal of Evaluation, 19(1), 21-33. https://doi.org/10.1177/109821409801900103

Willms, J. D., Tramonte, L., Duarte, J., & Bos, S. (2012). Assessing Educational Equality and Equity with Large-Scale Assessment Data: Brazil as a Case Study. Inter-American Development Bank. https://publications.iadb.org/publications/english/document/Assesing-Educational-Equality-and-Equity-with-Large-Scale-Assessment-Data-Brazil-as-a-Case-Study.pdf

Wiseman, A. W. (2010). The uses of evidence for educational policymaking: global contexts and international trends. Review of Research in Education, 34(1), 1-24. https://doi.org/10.3102/0091732X09350472

Wu, X., Ramesh, M., Howlett, M., & Fritzen, S. (2014). Guia de políticas públicas: gerenciando processos. Tradução de Ricardo Avelar de Souza. Enap. http://repositorio.enap.gov.br/handle/1/2555

[1] The translation from Portuguese to English of the original quotations used in this text was made by the authors.

[7] Proposals that include the other dimensions of Saeb, in partnership with states and municipalities, are offered, for instance, by Santos et al. (2017) and Lustosa (2020).

Received: March 08, 2021; Accepted: May 24, 2021

Creative Commons License: This work is licensed under a Creative Commons Attribution 4.0 International License.