Abstract:
Introduction
Concerns about the risk of bias (RoB) of meta-analyses (MAs) have grown in parallel with the exponential increase in the number of publications in science. However, this has not been properly assessed in Education. The aims were to evaluate the RoB of MAs in Education and to identify potential predictors of a lower RoB.

Methods
Systematic review. The selection criteria were all MAs of experimental design evaluating the effectiveness of educational interventions on any academic outcome, published in English or Spanish from 1 January 2009 (the year the first PRISMA guideline was published); MAs with other designs, those evaluating other outcomes, and those whose full text was not accessible were excluded. A systematic search was performed in four databases (ERIC, Web of Science, Scopus, and PubMed) up to March 2022. A preregistered protocol was used to extract data on study characteristics, PRISMA compliance, and RoB, which was based on the AMSTAR 2 instrument and dichotomized as low vs. high RoB. Study selection and data extraction were conducted independently by two researchers, and disagreements were resolved by consensus or by a third researcher. Statistical analysis: a flow diagram was drawn and descriptive tables were tabulated. As a measure of association, odds ratios (OR) and their 95% confidence intervals (CI) were calculated by logistic regression analysis with the dichotomized RoB as the dependent variable.

Results
A total of 69 meta-analyses were identified. Almost 90% (n = 62) of them were rated with a critically low overall confidence level, and about 70% (n = 49) had a high RoB.
Factors related to a low RoB were adherence to the PRISMA guideline (OR = 5.5; 95% CI: 1.8-16.6), more recent publication (OR = 7.4; 95% CI: 1.5-35.3), a higher number of authors (OR = 1.4; 95% CI: 1.1-1.9), a corresponding author from a European country (OR = 3.7; 95% CI: 1.1-12.8), and publication in the health education area (OR = 13.4; 95% CI: 3.6-49.6).

Conclusions
Our study raises concerns regarding the methodological quality of published MAs in Education. The use of instruments such as AMSTAR 2 and PRISMA 2020 may improve the quality of future MAs in Education.

Meta-analyses and systematic reviews are research designs developed to identify, evaluate, and summarize the findings of all relevant individual studies on a given research question, making the best available evidence more accessible to readers and decision-makers. It is crucial to systematically assess the methodological weaknesses that could diminish confidence in their results (risk of bias) and to identify related factors. We assessed the risk of bias in meta-analyses of experimental interventions developed in Education to improve academic performance. Our results suggest that the overall confidence in the results of most of the included studies was low or critically low, and several predictors of a lower risk of bias were identified (e.g., more recent publication, a higher number of authors, a European corresponding author, and publication in health sciences education). Authors, editors, and users should be aware of the importance of improving the methodological quality of these designs.
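To illustrate the type of association measure reported above: for a single binary predictor, the odds ratio estimated by logistic regression coincides with the cross-product ratio of the corresponding 2x2 table, and a Wald 95% confidence interval can be obtained on the log-odds scale. The sketch below uses hypothetical counts, not the study's data, purely to show the arithmetic.

```python
import math

def odds_ratio_ci(a, b, c, d, z=1.96):
    """Odds ratio and Wald 95% CI from a 2x2 table:
                 low RoB   high RoB
    exposed        a          b
    unexposed      c          d
    """
    or_ = (a * d) / (b * c)                      # cross-product odds ratio
    se = math.sqrt(1/a + 1/b + 1/c + 1/d)        # SE of log(OR)
    lo = math.exp(math.log(or_) - z * se)        # lower 95% bound
    hi = math.exp(math.log(or_) + z * se)        # upper 95% bound
    return or_, lo, hi

# Hypothetical counts (NOT the study's data): e.g., PRISMA-adherent
# vs. non-adherent MAs cross-tabulated against low vs. high RoB.
or_, lo, hi = odds_ratio_ci(12, 8, 8, 41)
```

A confidence interval whose lower bound exceeds 1, as in the associations reported in the Results, indicates a statistically significant association with a lower RoB at the 5% level.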