Title: Assessment of German and Austrian students' Educational Research Literacy: validation of a competency test based on cross-national comparisons
Source document: Studia paedagogica, 2021, vol. 26, no. 4, pp. [27]–45
Extent: [27]–45
ISSN: 1803-7437 (print), 2336-4521 (online)
Permanent link (DOI): https://doi.org/10.5817/SP2021-4-2
Permanent link (handle): https://hdl.handle.net/11222.digilib/144767
Type: Article
Language: English
License: Undetermined license
Note: These citations are generated automatically. They may not fully conform to citation rules.
Abstract
Educational Research Literacy (ERL) is the ability to access, comprehend, and consider scientific information and to apply the resulting conclusions to problems connected with educational decisions. It is crucial for the process of data-based decision making and, corresponding to its consecutive phases, is defined as a conglomeration of different facets of competence, including information literacy, statistical literacy, and evidence-based reasoning. However, engagement with research in educational contexts appears to face some difficulties. This is all the more remarkable as the state of knowledge about actual teacher competency levels remains unsatisfactory, even though test instruments for assessing research literacy have been developed in recent years. This paper addresses the question of whether such a test, developed in the specific context of German study programs in (teacher) education, can be applied to other national contexts, in this case to Austrian teacher education. An investigation of construct validity that considers the psychometric structure and group differences at the item level is necessary to ensure the fairness of cross-national comparisons. Based on multidimensional item response theory models, samples from Germany (n = 1360 students, 6 universities) and Austria (n = 295 students, 2 universities) are investigated in terms of measurement invariance between the two countries. A comparable psychometric structure and at least partial measurement invariance, with no particular advantage for either sample, could be demonstrated. This indicates that the presented test instrument can be validly applied to assess the research literacy of teacher training students in both countries.
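The invariance analysis summarized above rests on comparing item parameters across the two national samples. As a rough, self-contained illustration of that idea (screening items for differential item functioning, DIF), the Python sketch below estimates crude Rasch-style item difficulties per group and flags items whose difficulties diverge. It is only a sketch under stated assumptions, not the authors' analysis: the study fitted multidimensional IRT models with dedicated software (see TAM and ConQuest in the references), and the proportion-based estimates, the 0.5-logit threshold, and the simulated data here are all illustrative.

```python
import numpy as np

def item_difficulties(responses: np.ndarray) -> np.ndarray:
    """Crude Rasch-style item difficulties from a binary persons-x-items
    matrix: centered negative log-odds of the proportion correct."""
    p = responses.mean(axis=0).clip(0.01, 0.99)  # guard against logit(0) or logit(1)
    b = -np.log(p / (1 - p))                     # harder item -> larger difficulty
    return b - b.mean()                          # fix the scale: difficulties sum to 0

def flag_dif(resp_a: np.ndarray, resp_b: np.ndarray, threshold: float = 0.5):
    """Flag items whose difficulty differs between two groups by more than
    `threshold` logits: a rough screen for differential item functioning
    (DIF), i.e. a violation of measurement invariance. The 0.5-logit
    cutoff is an illustrative assumption, not a published criterion."""
    diff = item_difficulties(resp_a) - item_difficulties(resp_b)
    return np.flatnonzero(np.abs(diff) > threshold), diff

# Toy data standing in for the German (A, n = 1360) and Austrian
# (B, n = 295) samples on a 10-item test; real data would be the
# scored test responses.
rng = np.random.default_rng(0)
resp_a = (rng.random((1360, 10)) < 0.6).astype(int)
resp_b = (rng.random((295, 10)) < 0.6).astype(int)
resp_b[:, 3] = (rng.random(295) < 0.3).astype(int)  # induce DIF on item 3

flagged, diff = flag_dif(resp_a, resp_b)
print("items showing possible DIF:", flagged)          # expected: [3]
print("difficulty differences (A - B):", np.round(diff, 2))
```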
References
[1] Adams, R. J., & Wu, M. (2002). PISA 2000 Technical Report. Organisation for Economic Co-operation and Development. https://www.oecd.org/pisa/data/33688233.pdf
[2] Altrichter, H., Brüsemeister, T., & Heinrich, M. (2005). Merkmale und Fragen einer Governance-Reform am Beispiel des österreichischen Schulwesens. Österreichische Zeitschrift für Soziologie, 30(4), 6–28. https://doi.org/10.1007/s11614-006-0063-0 | DOI 10.1007/s11614-006-0063-0
[3] Bach, A., Wurster, S., Thillmann, K., Pant, H. A., & Thiel, F. (2014). Vergleichsarbeiten und schulische Personalentwicklung—Ausmaß und Voraussetzungen der Datennutzung. Zeitschrift für Erziehungswissenschaft, 17(1), 61–84. https://doi.org/10.1007/s11618-014-0486-5 | DOI 10.1007/s11618-014-0486-5
[4] Ben-Zvi, D., & Garfield, J. (2004). The challenge of developing statistical literacy, reasoning and thinking (1st ed.). Springer. https://doi.org/10.1007/1-4020-2278-6 | DOI 10.1007/1-4020-2278-6
[5] Blixrud, J. C. (2003). Project SAILS: Standardized assessment of information literacy skills. In G. J. Barret (Ed.), ARL: A bimonthly report on research library issues and actions from ARL, CNI, and SPARC. Special double issue on new measures. Number 230-231, October-December 2003 (pp. 18–19). Association of Research Libraries. https://files.eric.ed.gov/fulltext/ED498293.pdf
[6] Blömeke, S., & Zlatkin-Troitschanskaia, O. (2013). Kompetenzmodellierung und Kompetenzerfassung im Hochschulsektor: Ziele, theoretischer Rahmen, Design und Herausforderungen des BMBF-Forschungsprogramms KoKoHs (KoKoHs Working Papers No. 1). Kompetenzmodellierung und Kompetenzerfassung im Hochschulsektor. https://www.kompetenzen-im-hochschulsektor.de/files/2018/05/KoKoHs_WP1_Bloemeke_Zlatkin-Troitschanskaia_2013_.pdf
[7] Bond, T. G., & Fox, C. M. (2007). Applying the Rasch model: Fundamental measurement in the human sciences (2nd ed.). Erlbaum.
[8] Bos, W., Postlethwaite, T. N., & Gebauer, M. M. (2010). Potenziale, Grenzen und Perspektiven internationaler Schulleistungsforschung. In R. Tippelt & B. Schmidt (Eds.), Handbuch Bildungsforschung (3rd ed., pp. 275–295). Springer.
[9] Böttcher-Oschmann, F., Groß Ophoff, J., & Thiel, F. (2019). Validierung eines Fragenbogens zur Erfassung studentischer Forschungskompetenzen über Selbsteinschätzungen – Ein Instrument zur Evaluation forschungsorientierter Lehr-Lernarrangements. Unterrichtswissenschaft, 47(4), 495–521. https://doi.org/10.1007/s42010-019-00053-8 | DOI 10.1007/s42010-019-00053-8
[10] Braun, E., Gusy, B., Leidner, B., & Hannover, B. (2008). Das Berliner Evaluationsinstrument für selbsteingeschätzte, studentische Kompetenzen (BEvaKomp). Diagnostica, 54(1), 30–42. https://doi.org/10.1026/0012-1924.54.1.30 | DOI 10.1026/0012-1924.54.1.30
[11] Brown, C., Poortman, C., Gray, H., Groß Ophoff, J., & Wharf, M. (2021). Facilitating collaborative reflective inquiry amongst teachers: What do we currently know? International Journal of Educational Research, 105, 101695. https://doi.org/10.1016/j.ijer.2020.101695 | DOI 10.1016/j.ijer.2020.101695
[12] Brown, C., Schildkamp, K., & Hubers, M. D. (2017). Combining the best of two worlds: A conceptual proposal for evidence-informed school improvement. Educational Research, 59(2), 154–172. https://doi.org/10.1080/00131881.2017.1304327 | DOI 10.1080/00131881.2017.1304327
[13] Bundesministerium für Unterricht, Kunst und Kultur, & Bundesministerium für Wissenschaft und Forschung [Federal Ministry for Education, the Arts and Culture & Federal Ministry of Science and Research]. (2010). LehrerInnenbildung NEU – Die Zukunft der pädagogischen Berufe: Die Empfehlungen der ExpertInnengruppe [Teacher Education NEW – The future of pedagogical professions]. https://www.qsr.or.at/dokumente/1870-20140529-092820-Empfehlungen_der_ExpertInnengruppe_Endbericht_092010_2_Auflage.pdf
[14] Cain, T. (2015). Teachers' engagement with research texts: Beyond instrumental, conceptual or strategic use. Journal of Education for Teaching, 41(5), 478–492. https://doi.org/10.1080/02607476.2015.1105536 | DOI 10.1080/02607476.2015.1105536
[15] Cramer, C., Harant, M., Merk, S., Drahmann, M., & Emmerich, M. (2019). Meta-Reflexivität und Professionalität im Lehrerinnen- und Lehrerberuf. Zeitschrift für Pädagogik, 65(3), 401–423.
[16] Cronbach, L. J., & Meehl, P. E. (1955). Construct validity in psychological tests. Psychological Bulletin, 52(4), 281–302. https://doi.org/10.1037/h0040957 | DOI 10.1037/h0040957
[17] Davidov, E., Meuleman, B., Cieciuch, J., Schmidt, P., & Billiet, J. (2014). Measurement equivalence in cross-national research. Annual Review of Sociology, 40, 55–75. https://doi.org/10.1146/annurev-soc-071913-043137 | DOI 10.1146/annurev-soc-071913-043137
[18] Davies, P. (1999). What is evidence-based education? British Journal of Educational Studies, 47(2), 108–121. https://doi.org/10.1111/1467-8527.00106 | DOI 10.1111/1467-8527.00106
[19] DeLuca, C., & Johnson, S. (2017). Developing assessment capable teachers in this age of accountability. Assessment in Education: Principles, Policy & Practice, 24(2), 121–126. https://doi.org/10.1080/0969594X.2017.1297010 | DOI 10.1080/0969594X.2017.1297010
[20] DeLuca, C., LaPointe-McEwan, D., & Luhanga, U. (2016). Teacher assessment literacy: A review of international standards and measures. Educational Assessment, Evaluation and Accountability, 28(3), 251–272. https://doi.org/10.1007/s11092-015-9233-6 | DOI 10.1007/s11092-015-9233-6
[21] de Olano, D. (2010). Gewinner, Verlierer und Exoten - PISA in sieben weiteren Staaten. In P. Knodel, K. Martens, D. de Olano, & M. Popp (Eds.), Das PISA-Echo. Internationale Reaktionen auf die Bildungsstudie (pp. 251–300). Campus.
[22] Dietrich, H., Zhang, Y., Klopp, E., Brünken, R., Krause, U.-M., Spinath, F. M., Stark, R., & Spinath, B. (2015). Scientific competencies in the social sciences. Psychology Learning & Teaching, 14(2), 115–130. https://doi.org/10.1177/1475725715592287 | DOI 10.1177/1475725715592287
[23] Dunn, K. E., Skutnik, A., Patti, C., & Sohn, B. (2019). Disdain to acceptance: Future teachers' conceptual change related to data-driven decision making. Action in Teacher Education, 41(3), 193–211. https://doi.org/10.1080/01626620.2019.1582116 | DOI 10.1080/01626620.2019.1582116
[24] Ebbeler, J., Poortman, C. L., Schildkamp, K., & Pieters, J. M. (2017). The effects of a data use intervention on educators' satisfaction and data literacy. Educational Assessment, Evaluation and Accountability, 29(1), 83–105. https://doi.org/10.1007/s11092-016-9251-z | DOI 10.1007/s11092-016-9251-z
[25] Federal Ministry of Education and Research Germany. (2015). Bericht der Bundesregierung über die Umsetzung des Bologna-Prozesses 2012–2015 in Deutschland. https://www.kmk.org/fileadmin/veroeffentlichungen_beschluesse/2015/2015_02_12-NationalerBericht_Umsetzung_BolognaProzess.pdf
[26] Förster, M., Zlatkin-Troitschanskaia, O., Brückner, S., Happ, R., Hambleton, R. K., Walstad, W. B., Asano, T., & Yamaoka, M. (2015). Validating test score interpretations by cross-national comparison: Comparing the results of students from Japan and Germany on an American test of economic knowledge in higher education. Zeitschrift Für Psychologie, 223(1), 14–23. https://doi.org/10.1027/2151-2604/a000195 | DOI 10.1027/2151-2604/a000195
[27] Fullan, M. (2005). The meaning of educational change: A quarter of a century of learning. In A. Lieberman (Ed.), The roots of educational change: International handbook of educational change (1st ed., pp. 202–216). Springer. https://doi.org/10.1007/1-4020-4451-8_12 | DOI 10.1007/1-4020-4451-8_12
[28] Gess, C., Wessels, I., & Blömeke, S. (2017). Domain-specificity of research competencies in the social sciences: Evidence from differential item functioning. Journal for Educational Research Online, 9(2), 11–36. https://doi.org/10.25656/01:14895 | DOI 10.25656/01:14895
[29] Gierl, M. J. (2005). Using dimensionality-based DIF analyses to identify and interpret constructs that elicit group differences. Educational Measurement: Issues and Practice, 24(1), 3–14. https://doi.org/10.1111/j.1745-3992.2005.00002.x | DOI 10.1111/j.1745-3992.2005.00002.x
[30] Gonon, P. (2011). Die Bedeutung des internationalen Arguments in der Lehrerbildung. Beiträge zur Lehrerbildung, 29(1), 20–26. https://doi.org/10.25656/01:13763 | DOI 10.25656/01:13763
[31] Green, S. B., & Yang, Y. (2009). Reliability of summed item scores using structural equation modeling: An alternative to coefficient alpha. Psychometrika, 74(1), 155–167. https://doi.org/10.1007/s11336-008-9099-3 | DOI 10.1007/s11336-008-9099-3
[32] Gregoire, M. (2003). Is it a challenge or a threat? A dual-process model of teachers' cognition and appraisal processes during conceptual change. Educational Psychology Review, 15(2), 147–179. https://doi.org/10.1023/A:1023477131081 | DOI 10.1023/A:1023477131081
[33] Grisay, A., de Jong, J. H. A. L., Gebhardt, E., Berezner, A., & Halleux-Monseur, B. (2007). Translation equivalence across PISA countries. Journal of Applied Measurement, 8(3), 249–266.
[34] Groß Ophoff, J., & Cramer, C. (in press). The engagement of teachers and school leaders with data, evidence and research in Germany. In C. Brown & J. R. Malin (Eds.), The Emerald international handbook of evidence-informed practice in education: Learning from international contexts. Emerald.
[35] Groß Ophoff, J., Schladitz, S., & Wirtz, M. (2017). Differences in research literacy in educational science depending on study program and university. In J. Domenech i de Soria, M. C. Vincent-Vela, E. de la Poza, & D. Blazquez (Eds.), Proceedings of the HEAd'17. 3rd international conference on higher education advances (pp. 1193–1202). Congress UPV. http://dx.doi.org/10.4995/HEAD17.2017.6713 | DOI 10.4995/HEAD17.2017.6713
[36] Groß Ophoff, J., Wolf, R., Schladitz, S., & Wirtz, M. (2017). Assessment of educational research literacy in higher education: Construct validation of the factorial structure of an assessment instrument comparing different treatments of omitted responses. Journal for Educational Research Online, 9(2), 37–68. https://doi.org/10.25656/01:14896 | DOI 10.25656/01:14896
[37] Gustafsson, J.-E. (2001). Measurement from a hierarchical point of view. In H. I. Braun, D. N. Jackson, & D. E. Wiley (Eds.), The role of constructs in psychological and educational measurement (1st ed., pp. 87–111). Routledge.
[38] Haberfellner, C. (2016). Der Nutzen von Forschungskompetenz im Lehramt: Eine Einschätzung aus der Sicht von Studierenden der Pädagogischen Hochschulen in Österreich. Klinkhardt.
[39] Halonen, J. S. (2008). Measure for measure: The challenge of assessing critical thinking. In D. S. Dunn, J. S. Halonen, & R. A. Smith (Eds.), Teaching critical thinking in psychology: A handbook of best practices (1st ed., pp. 61–76). Wiley-Blackwell.
[40] Hamilton, V. M., & Reeves, T. D. (2021). Relationships between course taking and teacher self-efficacy and anxiety for data-driven decision making. The Teacher Educator, 1–19. https://doi.org/10.1080/08878730.2021.1965682 | DOI 10.1080/08878730.2021.1965682
[41] Hartig, J., & Höhler, J. (2009). Multidimensional IRT models for the assessment of competencies. Studies in Educational Evaluation, 35(2-3), 57–63. https://doi.org/10.1016/j.stueduc.2009.10.002 | DOI 10.1016/j.stueduc.2009.10.002
[42] Healey, M. (2005). Linking research and teaching: Exploring disciplinary spaces and the role of inquiry-based learning. In R. Barnett (Ed.), Reshaping the University: New Relationships between Research, Scholarship and Teaching (1st ed., pp. 67–78). Open University Press.
[43] Helsper, W. (2016). Lehrerprofessionalität – der strukturtheoretische Ansatz. In M. Rothland (Ed.), Beruf Lehrer/Lehrerin: Ein Studienbuch (pp. 103–125). Waxmann.
[44] Hofmann, F., Hagenauer, G., & Martinek, D. (2020). Entwicklung und Struktur der Lehrerinnen- und Lehrerbildung in Österreich. In C. Cramer, J. König, M. Rothland, & S. Blömeke (Eds.), Handbuch Lehrerinnen- und Lehrerbildung (pp. 227–236). Klinkhardt.
[45] Holland, P. W., & Wainer, H. (1993). Differential item functioning (1st ed.). Erlbaum.
[46] Jesacher-Roessler, L., & Kemethofer, D. (in press). Evidence-informed practice in Austrian education. In C. Brown & J. R. Malin (Eds.), The Emerald international handbook of evidence-informed practice in education. Emerald.
[47] Katenbrink, N., & Goldmann, D. (2020). Varianten Forschenden Lernens—Ein konzeptbasierter Typisierungsvorschlag. In M. Basten, C. Mertens, A. Schöning, & E. Wolf (Eds.), Forschendes Lernen in der Lehrer/innenbildung: Implikationen für Wissenschaft und Praxis (pp. 195–202). Waxmann.
[48] Kiefer, T., Robitzsch, A., & Wu, M. (2013). TAM (Test analysis modules). http://www.edmeasurementsurveys.com/TAM/Tutorials/
[49] Kittel, D., Rollett, W., & Groß Ophoff, J. (2017). Profitieren berufstätige Lehrkräfte durch ein berufsbegleitendes weiterbildendes Studium in ihren Forschungskompetenzen? Bildung & Erziehung, 70(4), 437–452.
[50] Kuhn, D., Iordanou, K., Pease, M., & Wirkala, C. (2008). Beyond control of variables: What needs to develop to achieve skilled scientific thinking? Cognitive Development, 23(4), 435–451. https://doi.org/10.1016/j.cogdev.2008.09.006 | DOI 10.1016/j.cogdev.2008.09.006
[51] Larcher, S., & Oelkers, J. (2004). Deutsche Lehrerbildung im internationalen Vergleich. In S. Blömeke, P. Reinhold, G. Tulodziecki, & J. Wildt (Eds.), Handbuch Lehrerbildung (pp. 128–150). Klinkhardt.
[52] Lowman, R. L., & Williams, R. E. (1987). Validity of self-ratings of abilities and competencies. Journal of Vocational Behavior, 31(1), 1–13. https://doi.org/10.1016/0001-8791(87)90030-3 | DOI 10.1016/0001-8791(87)90030-3
[53] Mandinach, E. B. (2012). A perfect time for data use: Using data-driven decision making to inform practice. Educational Psychologist, 47(2), 71–85. https://doi.org/10.1080/00461520.2012.667064 | DOI 10.1080/00461520.2012.667064
[54] Mandinach, E. B., Honey, M., Light, D., & Brunner, C. (2008). A conceptual framework for data-driven decision making. In M. Honey (Ed.), Data-Driven School Improvement: Linking Data and Learning (pp. 13–31). Teachers College Press.
[55] Marsh, J. A. (2012). Interventions promoting educators' use of data: Research insights and gaps. Teachers College Record, 114(11), 1–48. | DOI 10.1177/016146811211401106
[56] Meissner, D., Vogel, E., & Horn, K.-P. (2012). Lehrerausbildung in Baden-Württemberg seit 1945. Angleichungs- und Abgrenzungsprozesse. In C. Cramer (Ed.), Lehrerausbildung in Baden-Württemberg. Historische Entwicklungslinien und aktuelle Herausforderungen (pp. 33–62). IKS Garamond.
[57] Mintrop, R., & Zumpe, E. (2019). Solving real-life problems of practice and education leaders' school improvement mind-set. American Journal of Education, 125(3), 295–344. https://doi.org/10.1086/702733 | DOI 10.1086/702733
[58] Münchow, H., Richter, T., von der Mühlen, S., & Schmid, S. (2019). The ability to evaluate arguments in scientific texts: Measurement, cognitive processes, nomological network, and relevance for academic success at the university. The British Journal of Educational Psychology, 89(3), 501–523. https://doi.org/10.1111/bjep.12298 | DOI 10.1111/bjep.12298
[59] Ntuli, E., & Kyei-Blankson, L. (2016). Improving K-12 online learning: Information literacy skills for teacher candidates. International Journal of Information and Communication Technology Education, 12(3), 38–50. | DOI 10.4018/IJICTE.2016070104
[60] Prenger, R., & Schildkamp, K. (2018). Data-based decision making for teacher and student learning: A psychological perspective on the role of the teacher. Educational Psychology, 38(6), 734–752. https://doi.org/10.1080/01443410.2018.1426834 | DOI 10.1080/01443410.2018.1426834
[61] Prenzel, M., Walter, O., & Frey, A. (2007). PISA misst Kompetenzen. Psychologische Rundschau, 58(2), 128–136. https://doi.org/10.1026/0033-3042.58.2.128 | DOI 10.1026/0033-3042.58.2.128
[62] Reeves, T. D., & Honig, S. L. (2015). A classroom data literacy intervention for pre-service teachers. Teaching and Teacher Education, 50, 90–101. https://doi.org/10.1016/j.tate.2015.05.007 | DOI 10.1016/j.tate.2015.05.007
[63] Reise, S. P., Moore, T. M., & Haviland, M. G. (2010). Bifactor models and rotations: Exploring the extent to which multidimensional data yield univocal scale scores. Journal of Personality Assessment, 92(6), 544–559. https://doi.org/10.1080/00223891.2010.496477 | DOI 10.1080/00223891.2010.496477
[64] Richter, D., Böhme, K., Becker, M., Pant, H. A., & Stanat, P. (2014). Überzeugungen von Lehrkräften zu den Funktionen von Vergleichsarbeiten. Zusammenhänge zu Veränderungen im Unterricht und den Kompetenzen von Schülerinnen und Schülern. Zeitschrift für Pädagogik, 60(2), 225–244. https://doi.org/10.25656/01:12846 | DOI 10.25656/01:12846
[65] Rost, D. H. (2013). Interpretation und Bewertung pädagogisch-psychologischer Studien (3rd, completely revised ed.) [Interpretation and evaluation of educational-psychological studies]. Klinkhardt.
[66] Rueß, J., Gess, C., & Deicke, W. (2016). Forschendes Lernen und forschungsbezogene Lehre – empirisch gestützte Systematisierung des Forschungsbezugs hochschulischer Lehre. Zeitschrift für Hochschulentwicklung, 11(2), 23–44. | DOI 10.3217/zfhe-11-02/02
[67] Schermelleh-Engel, K., Moosbrugger, H., & Müller, H. (2003). Evaluating the fit of structural equation models: Test of significance and descriptive goodness-of-fit measures. Methods of Psychological Research – Online, 8(2), 23–74.
[68] Schildkamp, K., & Kuiper, W. (2010). Data-informed curriculum reform: Which data, what purposes, and promoting and hindering factors. Teaching and Teacher Education, 26(3), 482–496. https://doi.org/10.1016/j.tate.2009.06.007 | DOI 10.1016/j.tate.2009.06.007
[69] Schildkamp, K., & Lai, M. K. (2013). Conclusions and a data use framework. In K. Schildkamp, M. K. Lai, & L. Earl (Eds.), Data-based decision making in education (pp. 177–191). Springer.
[70] Schildkamp, K., & Poortman, C. (2015). Factors influencing the functioning of data teams. Teachers College Record, 117(4), 1–42. | DOI 10.1177/016146811511700403
[71] Schladitz, S., Groß Ophoff, J., & Wirtz, M. (2015). Konstruktvalidierung eines Tests zur Messung bildungswissenschaftlicher Forschungskompetenz. Zeitschrift für Pädagogik, 61, 167–184. https://doi.org/10.25656/01:15509 | DOI 10.25656/01:15509
[72] Schratz, M., Wiesner, C., Rößler, L., Schildkamp, K., George, A. C., Hofbauer, C., & Pant, H. A. (2018). Möglichkeiten und Grenzen evidenzorientierter Schulentwicklung. Nationaler Bildungsbericht Österreich, 2, 403–454. http://doi.org/10.17888/nbb2018-1.4 | DOI 10.17888/nbb2018-1.4
[73] Shank, G., & Brown, L. (2007). Exploring educational research literacy. Routledge.
[74] Standing Conference of the Ministers of Education and Cultural Affairs of the Länder in the Federal Republic of Germany. (2019). The education system in the Federal Republic of Germany 2016/2017. A description of the responsibilities, structures and developments in education policy for the exchange of information in Europe. https://www.kmk.org/fileadmin/Dateien/pdf/Eurydice/Bildungswesen-engl-pdfs/dossier_en_ebook.pdf
[75] Standing Conference. (2020). Ländervereinbarung über die gemeinsame Grundstruktur des Schulwesens und die gesamtstaatliche Verantwortung der Länder in zentralen bildungspolitischen Fragen (Beschluss der Kultusministerkonferenz vom 15.10.2020). https://www.kmk.org/fileadmin/Dateien/veroeffentlichungen_beschluesse/2020/2020_10_15-Laendervereinbarung.pdf
[76] Standing Conference. (2004). Standards für die Lehrerbildung: Bildungswissenschaften. Beschluss der Kultusministerkonferenz vom 16.12.2004. http://www.kmk.org/fileadmin/veroeffentlichungen_beschluesse/2004/2004_12_16-Standards-Lehrerbildung.pdf
[77] Stelter, A., & Miethe, I. (2019). Forschungsmethoden im Lehramtsstudium – aktueller Stand und Konsequenzen. Erziehungswissenschaft, 30(58), 25–33. https://doi.org/10.3224/ezw.v30i1.03 | DOI 10.3224/ezw.v30i1.03
[78] Stout, W. (1987). A nonparametric approach for assessing latent trait unidimensionality. Psychometrika, 52(4), 589–617. https://doi.org/10.1007/BF02294821 | DOI 10.1007/BF02294821
[79] Tay, L., Meade, A. W., & Cao, M. (2015). An overview and practical guide to IRT measurement equivalence analysis. Organizational Research Methods, 18(1), 3–46. https://doi.org/10.1177/1094428114553062 | DOI 10.1177/1094428114553062
[80] Ulrich, I., & Gröschner, A. (2020). Praxissemester im Lehramtsstudium in Deutschland: Wirkungen auf Studierende. Springer VS.
[81] van Geel, M., Keuning, T., Visscher, A. J., & Fox, J.-P. (2016). Assessing the effects of a schoolwide data-based decision-making intervention on student achievement growth in primary schools. American Educational Research Journal, 53(2), 360–394. https://doi.org/10.3102/0002831216637346 | DOI 10.3102/0002831216637346
[82] van Geel, M., Keuning, T., Visscher, A., & Fox, J.-P. (2017). Changes in educators' data literacy during a data-based decision making intervention. Teaching and Teacher Education, 64, 187–198. https://doi.org/10.1016/j.tate.2017.02.015 | DOI 10.1016/j.tate.2017.02.015
[83] van Ophuysen, S., Behrmann, L., Bloh, B., Homt, M., & Schmidt, J. (2017). Die universitäre Vorbereitung angehender Lehrkräfte auf Forschendes Lernen im schulischen Berufsalltag. Journal for Educational Research Online, 9(2), 276–305. https://doi.org/10.25656/01:14952 | DOI 10.25656/01:14952
[84] Watson, J. M., & Callingham, R. A. (2003). Statistical literacy: A complex hierarchical construct. Statistics Education Research Journal, 2(2), 3–46.
[85] Wessels, I., Rueß, J., Jenßen, L., Gess, C., & Deicke, W. (2018). Beyond cognition: Experts' views on affective-motivational research dispositions in the social sciences. Frontiers in Psychology, 9, 1300. https://doi.org/10.3389/fpsyg.2018.01300 | DOI 10.3389/fpsyg.2018.01300
[86] Wiesner, C., & Schreiner, C. (2019). Implementation, Transfer, Progression und Transformation: Vom Wandel von Routinen zur Entwicklung von Identität. Von Interventionen zu Innovationen, die bewegen. Bausteine für ein Modell zur Schulentwicklung durch Evidenz(en). Praxistransfer Schul- und Unterrichtsentwicklung, 79–140.
[87] Wilson, M., & Scalise, K. (2006). Assessment to improve learning in higher education: The BEAR Assessment System. Higher Education, 52(4), 635–663. https://doi.org/10.1007/s10734-004-7263-y | DOI 10.1007/s10734-004-7263-y
[88] Wirtz, M. A., & Böcker, M. (2017). Differential item functioning (DIF). In M. A. Wirtz (Ed.), Dorsch – Lexikon der Psychologie (18. Aufl.). Hogrefe.
[89] Wu, M. L., Adams, R. J., Wilson, M. R., & Haldane, S. A. (2007). ACER ConQuest Version 2: Generalised item response modelling software. Australian Council for Educational Research.
[90] Zeuch, N., Förster, N., & Souvignier, E. (2017). Assessing teachers' competencies to read and interpret graphs from learning progress assessment: Results from tests and interviews. Learning Disabilities Research & Practice, 32(1), 61–70. https://doi.org/10.1111/ldrp.12126 | DOI 10.1111/ldrp.12126
[91] Zlatkin-Troitschanskaia, O., Pant, H. A., Kuhn, C., Toepper, M., & Lautenbach, C. (2016). Messung akademisch vermittelter Kompetenzen von Studierenden und Hochschulabsolventen. Ein Überblick zum nationalen und internationalen Forschungsstand (Vol. 1). Springer.