2004
Volume 1, Issue 1 • E-ISSN: 3051-1208

Abstract

This article critically examines how two originally distinct concepts – openness, rooted in transparency and democratic access, and innovation, tied to economic valorisation and competitiveness – have been aligned through policy reforms, evaluative regimes, and digital infrastructures. Drawing on neo-institutionalist theory and comparative empirical cases from Europe, China, Latin America, and Africa, it interrogates the institutional and epistemic consequences of this convergence. The analysis demonstrates how open science and innovation have been reframed as instruments of performative governance, narrowing epistemic diversity and reshaping legitimacy in the name of accountability and impact. While alternative practices and platforms offer civic and pluralist counter-models, the managerial paradigm remains structurally dominant. The article explores the entanglement of openness and innovation with broader transformations in knowledge capitalism and calls for reflexivity in knowledge governance.

doi: 10.5117/EJEP2025.1.004.RUAN
2025-12-01
