SN 3 EP 1: The Future Of Research Assessment And Incentives In Africa With Dr. Rhoda Wanyenze (MakSPH)

18 October 2024

SUMMARY KEYWORDS

African scholarly conversations, research assessment, societal impact, public health research, research translation, policy alignment, research metrics, open science, knowledge transfer, research incentives, research quality, research impact, research ecosystem, research collaboration, research policy

 

EPISODE SUMMARY

In this episode, Dr. Rhoda Wanyenze, Professor and Dean of the School of Public Health at Makerere University, discusses the evolving landscape of research assessment in Africa. The conversation highlights the need to rethink how research outputs are evaluated, focusing on societal impact rather than traditional academic metrics like publications and citations. Dr. Wanyenze emphasizes the importance of aligning research with real-world needs, fostering collaborations, and incentivizing meaningful knowledge translation to drive policy change and improve service delivery.

 

HERE ARE THE KEY THINGS TO LOOK OUT FOR:

Shift from Traditional Research Metrics:

The episode emphasizes the need to move beyond traditional metrics like publication counts and impact factors. Dr. Wanyenze advocates for research to be evaluated based on its societal impact, such as policy changes and service improvements, rather than purely academic outputs.

Incentivizing Meaningful Impact: 

Dr. Wanyenze and Joy discuss how academic institutions can better incentivize researchers by rewarding real-world contributions instead of focusing solely on publishing. They explore creating systems that value practical outcomes, such as influencing policies or improving public services.

Role of Open Science: 

The conversation highlights the importance of Open Science in making research more accessible and collaborative. Dr. Wanyenze explains how adopting open practices and frameworks like DORA can support research that has a broader societal impact and foster a more inclusive research ecosystem.

 

EPISODE TRANSCRIPT

Welcome to Mazungumzo – African Scholarly Conversations, where we are joined by an expansive list of African policymakers, science communication specialists, innovators, and tertiary institution leaders who contribute to this realm of science communication.

I’m your host Joy Owango, the Executive Director of Training Centre in Communication (TCC Africa), a capacity-building trust based at the University of Nairobi, Chiromo Campus in Nairobi, Kenya.

Our guest is a Professor and Dean of the School of Public Health at Makerere University, a position she has held for the past six and a half years. She has extensive experience in research, capacity building, program management, and policy development in various public health areas. She also has vast governance experience and has served on several boards for organizations in Uganda and globally. In addition to this, Prof. Wanyenze is an Executive Board Member of the Declaration on Research Assessment (DORA). In this episode, Prof. Rhoda Wanyenze shares her perspective on an important issue – refocusing how we assess and incentivize research to prioritize societal impact.

A warm welcome to the programme, Daktari.

Dr. Rhoda Wanyenze:

Thank you, Joy. I really appreciate the opportunity to engage with you and greetings to our listeners.

Joy Owango:

Before we dive into the main topic, please tell our listeners about your background and what has inspired your career path in public health research.

Dr. Rhoda Wanyenze:

I have been in the health field for the last 30 years, since I completed my training as a medical doctor and then moved on to public health work, especially with programs and policy, working with programs to support and strengthen systems for the Ministry of Health alongside several other stakeholders. It was a career that started with clinical work, then moved into programs and policy, and finally I ended up in research and academia, where I have been for more than 20 years, doing research, initially with research institutions and eventually moving into Makerere University as a professor. So, it's been quite some time, and my inspiration in this journey, I would say, is that I love research. I really love research. I have this curious mind where I'm always looking out for something that needs to be done differently. So, since I interfaced with research teams about 24 years ago, I have not looked back. I continued to do research even before I joined the academic space, and my shift from policy and research institutions to academia was largely driven by my desire to contribute to training the next generation of health professionals, public health leaders, and researchers. When engaging with a number of students coming out of universities, I felt perhaps there are certain things we need to strengthen in terms of how they are prepared to align with the field out there – for example, leadership skills, and being able to engage with multiple stakeholders and communities to deliver services. And I just felt, rather than stay out there and complain about some of these issues, why not step in and be part of preparing them?
So, that was really my biggest inspiration. I have since worked with a lot of research groups, partnerships with universities in Uganda and other institutions, partnerships with universities in the north, especially in the USA and Europe, and quite a number of partnerships with African institutions, where I have led a number of networks. One of the most recent networks I've worked in, in Africa, is the Partnership to Enhance Analytical Capacity and Data Use in Africa. We closed off the work we were doing in that network in December of last year, although we are hoping to continue it sometime this year; we were a partnership of 12 countries in Eastern and Southern Africa working with Ministries of Health to enhance the use of data in HIV, TB and malaria. I've also led a network assessing the response to COVID-19 across five countries in Eastern and Western Africa, among other work across infectious diseases, sexual and reproductive health, and health systems. It's been a rich and rewarding career of working with so many people in Uganda and across the African continent.

 

Joy Owango:

Fantastic. From your experience across Africa in academia and research implementation and now policy, what are the major limitations in how research outputs and real-world impacts are typically measured and incentivized within current academic assessment models?

 

Dr. Rhoda Wanyenze:

So, before I get to the measures and how this is done, I would like to speak briefly about the issues surrounding alignment of research with real-world needs, and making sure that we are able to translate findings and create impact within communities. It may seem obvious that translation is something that everybody will do and focus on, because every researcher you engage with is talking about translation. It's a buzzword. We also engage at the institutional level, and everybody is talking about translation. And so, you assume it's something that we will do, that it's easy to do, and you'll just do it. And then when you ask, what is translation, everybody's just talking about, I disseminated my work, and the dissemination was attended by so many people. And when you ask, what happened after that? There's no follow-up, and the engagement is often happening at the tail end, when the evidence is available already, and its alignment is sometimes questionable. So, this is an area that perhaps needs a lot more thought in terms of how it's structured and integrated within the research ecosystem in academic institutions, let alone how it's measured. There are quite a number of issues here as well. In Africa, we have so little evidence in some spaces, but so much evidence in others, and sometimes the evidence conflicts, and there are tons of things to read. So, it needs a lot of time and a lot of space to analyze before a policymaker, for example, or someone in a program, can decide that this is worth applying, and how it's going to be applied. There is quite a lot we need to think through in this space, and then sometimes we do research that's not even aligned to what the policymakers need. It's not as responsive, or maybe, in terms of timing, it's not coming out quickly enough.
Sometimes you're doing a study that's rightfully going to take a very long time before the evidence comes out, but the policymaker wants the answer today. So, there are quite a number of issues there that we need to think through, but there is also an emerging issue of people trusting the evidence. I'm sure you've seen, especially on social media, people countering even established evidence. People are talking about vaccinations and how they are intended to hurt Africans, and a number of other things. So, there's a growing distrust in evidence, and because of this, we don't want to take it for granted that as long as we generate it, people will use it. We have to do it in a manner that engages right from the start. People want you to hear what they are anxious about and understand the questions they have. Sometimes you're going to test an intervention, but they'll tell you, this intervention doesn't work for us. I've had an experience where we were going to test an intervention that was so intensive, and when we engaged with the users, they said, even if you show that it works, we will never implement it. It will be too expensive. So, they gave us what they thought might be manageable: can you adjust some of these pieces? And we actually were able to do that. And then they said, could you also track the cost, so that when you're done, and if it works, we know how much it costs? Those are the kinds of engagement you need to do before you even crystallize the questions you're going to answer and the methodologies you're going to use. But you also get people to have confidence in you, and when they know that you value them and have integrated their thoughts, they identify with what you're doing. They own it, and it makes the process much easier – rather than coming in at the tail end to do the translation. But at the same time, we have to recognize that this is an expensive process in terms of time, and sometimes it costs money.
These are some of the areas where scientists need to think intentionally, budget for them, and even engage with funders so that they know certain things might not move too fast, because we have to engage the users before we move another step. But that said, there are quite a number of issues around measurement. What do we measure? What are the incentives anyway? How do I benefit? We've all heard "you publish or you perish", because everybody values publications, and so everybody is driven to publish as many articles as possible, sometimes even fragmenting what could have been one paper into two, because you need papers to be able to survive. So, this measurement focuses largely on the number of publications, and then for some institutions, it's not even just the numbers, it's which journal did you publish in? Then it's the journal impact factor that is drummed up. And sometimes someone gets their paper rejected so many times, simply because they want to start with the highest impact factor and then come down. Sometimes I've engaged with colleagues and said, your work is so important and people need to hear it; get it out to the journal that will review it the fastest so that people can engage with it. Then you hear it's about the journal impact factor. And then you're saying, no, a journal that targets the audience that needs to hear about your paper is the journal you want to go for. Is it a journal that will be accessible to clinicians if you've done clinical work? Is it a journal that's going to go into issues around policy and analysis? But often people don't think about that. They're just thinking, what's the impact factor? And it doesn't matter how long it takes for that paper to come out. Moving away from that, we now have people who say citations, because this is more objective; it shows people are engaging with your work, and so people want to see citations.
This is a big one. So, you have institutions that have now moved to integrate various citation metrics, like the h-index, so in your CV, they will not just look at the journals, they are also going to look at the citation indices they've integrated. This can be a challenge, and often it's done without looking at discipline-specific differences. In certain disciplines, there are a lot of journals that churn out a lot of papers, so there are many more people citing your work. In other disciplines, you have less of that happening. So you're going to get somebody, maybe in the clinical sciences, where we tend to have a lot more papers and citations, and you're comparing them with someone in philosophy, or even engineering. Sometimes we don't look at those discipline-specific differences, and neither do we look beyond a journal impact factor. Fine, this is a high-impact journal, but you could have a paper there that is cited just five times, when somebody could be in a lower-impact-factor journal, but their work is very well received, with so many people engaging with it and citing it more. But at the same time, we know that people don't always cite your paper because it's good; sometimes it's because they are criticizing it, because maybe it's something they don't agree with. So, what do these numbers mean?
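[Editor's note: for readers unfamiliar with the h-index mentioned above, it is the largest number h such that a researcher has h papers each cited at least h times. A minimal illustrative sketch, not part of the conversation – the sample citation counts below are invented – shows how two very different citation profiles can collapse to the same number, which is part of the limitation being discussed:]

```python
def h_index(citations):
    """Return the h-index: the largest h such that the author has
    h papers each cited at least h times."""
    counts = sorted(citations, reverse=True)  # most-cited papers first
    h = 0
    for rank, cites in enumerate(counts, start=1):
        if cites >= rank:  # this paper still clears the h threshold
            h = rank
        else:
            break
    return h

# Two hypothetical authors with very different profiles, same h-index:
print(h_index([100, 50, 6, 5, 1]))  # 4 - two highly cited papers
print(h_index([10, 8, 5, 4, 3]))    # 4 - a steady, modest record
```

The single number hides whether citations come from a few landmark papers or a steady stream of modestly cited ones, illustrating why the metric varies in meaning across disciplines and careers.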

Joy Owango:

They are not entirely objective at the end of the day.

Dr. Rhoda Wanyenze:

Quite often, many people gravitate to these numbers because they say they are objective, but actually they might not be as objective, depending on how you use them, right? That can be quite challenging, and I think we need to rethink some of these things.

Joy Owango:

So basically, what you’re saying is that there’s a need for cultural evolution in academia. We need a paradigm shift, yeah, and if that is the case, how can the research ecosystem better incentivize impactful knowledge translation alongside publishing requirements?

Dr. Rhoda Wanyenze:

I think we actually need that shift, and we needed it like yesterday, and we need to start thinking about the things that are really most valuable and meaningful about research. Thinking about the impact, and about what actions are taken out there, is so important, but we are not tracking this in the first place. Most academic and research institutions don't intentionally think through this. They don't have the tools and have not yet moved towards tracking some of these processes – engaging stakeholders, for example, as in the example I gave you around getting to hear what the user would like to know and what will work for them before you implement. They are not tracking those kinds of processes; they don't really matter at the moment. And because we don't track, we also don't consider it when we are promoting you. I did my work, and it has led to a shift in policy, and that policy might have really helped to improve service delivery or the lives of people, but that is not tracked, and that is not considered if I'm going to be promoted. So, we need to shift and begin to track those kinds of things. We also need to be thinking about the quality of the work that we do: how well conceptualized it is, how responsive it is, whether it engages the stakeholders, and whether the findings can shift the space in thinking and action. So, this needs to be done, but at the same time, we need to take it beyond just tracking translation, to also think about the processes, the partnerships, the collaborative work we need to do across disciplines and institutions that enhances the quality and robustness of the work that we do, and the level of innovation. All these things are very critical, but they are not tracked, and they are quite often in the pathway towards translation.
We need to be thinking through tracking the processes, but also tracking the outcomes, and then moving into the actions that have been taken – eventually, how the tools that we develop, or the evidence that we generate, have been used to impact people out there. We need to think through this a bit more critically, with new mechanisms of tracking, and hopefully going beyond just numbers, because even as we think about new measures of tracking translation, you see examples out there where people are gravitating, again, to a lot of numbers that might not tell us the full story in terms of the contribution of the work that we do. So, I think we need to think through these processes, and we need to think beyond institutions, because there's peer pressure. Some institutions would wish to shift, but others are not shifting because, for example, if they don't shift, they are ranked favorably by some of the university ranking institutions out there. So you will have an institution that is reforming perhaps not seen as favorably as one that's not reforming. I think we need a broader engagement beyond individual institutions, bringing this conversation to the national and regional levels, and considering a coalition of reforms addressing the external as well as the internal environment within these institutions.

 

Joy Owango:

I agree with you. There's a lot of discourse that needs to happen. But then it comes down to a very uncomfortable topic, and that is: how can universities and research institutions better incentivize and support researchers in translating their findings into products, services or policies that drive positive impact? If you look at a university, we know they are relying on the traditional ways of incentivization, which means you publish in particular journals with a certain high impact factor, which is a bit restrictive. So now, when we are looking at the Open Science landscape, what would you tell a university to look at in order to better incentivize and support their researchers, especially if they are already adopting open science practices, as many are with open access publishing? They are already using one of the pillars through publishing. So how do we talk to a vice chancellor and tell them: this is a better way to incentivize your researchers; look at these other options beyond just the traditional ones? What would you recommend, or what would you propose that would work for them?

Dr. Rhoda Wanyenze:

I would shift this conversation beyond just measurements and incentives to thinking through a broader supportive ecosystem for open science, because it goes beyond measurements and rewards to thinking about infrastructure, systems, and culture, because some of these things have been normalized. They have become the right thing to do. They have become the measure of quality. They have become the culture. And if you want to shift from this, you're actually viewed as watering down the quality of the work that the institution is doing. You are actually seen as somebody who is spoiling the science.

So, I think this conversation needs to be thought through in a broad, multi-pronged, systems approach, and in terms of engagement. Before I even go to what they should do specific to incentives, I would talk to them about engaging the key actors within the institution, so that we begin to reconceptualize the value of research. What is it, and are we actually achieving it? If not, what needs to shift? Whether it's open science or moving to a different form of measurement, it should not water down the quality. In fact, it should increase the quality, because the science is more open: we all know what you're doing, you can share your tools, you can share your analytic code, and if somebody else wants to relook at what you've done, they are able to see whether they can replicate it. At the same time, it's a process that would make data more available to people, even within the same institution, so that we can maximize the benefits from this data. We keep investing a lot in collecting data when we already have a lot of data on the same issues, even within the same institution, but nobody knows who has what data, and then they go and collect the same. In an institution like mine, Makerere, where we have thousands of staff, it's very difficult to know who has done what and which data exist. So, there are lots of benefits I would wish to bring to them, but at the same time, they will ask: what system or infrastructure do I need to actually open up some of these spaces? How do I make the shifts? What options are available to me if I want to change measurements, if I want to change what I incentivize? So, I think they would also need to look at some of the tools available out there. For example, DORA has quite a number of tools that institutions can adapt and utilize to build their own context-specific interventions, using some of the available guidance.
So, this is something that I would probably think through: how do I go about it? It's a complex shift. Which tools are available to use? At the same time, universities need to speak to the external environment and how institutions are perceived. They can also advocate for change, because in some countries, you find that universities are rewarded based on scientific outputs, meaning publications and such, so if my budget is going to go down because I am shifting emphasis away from certain things, that's a challenge. So, you might want to engage with the education and science sectors, so that you have that shift in the external environment. The more complex challenge for universities would be the university rankings. They are also making reforms, moving away from only publications to also look at citations and other things, with the limitations that come with some of those. But I know that some of these rankings have now integrated knowledge transfer and some processes of knowledge transfer, so you can see that they are also beginning to make reforms that move towards quality and impact of research. Certainly the rankings drive the behavior of universities, and that's something you would want to engage with that vice chancellor on: what am I up against? What should I expect, and how will I respond to it if it happens? So, it has to be a fairly comprehensive discussion to think through, but ultimately, they would need policies, strategies and tools to support the implementation. They would need their own institutional measurements, derived from some of the guidance that might exist out there, but ultimately customized to their context, depending on what the key stakeholders within that university might want.

 

Joy Owango:

You've been talking about different types of metrics and indicators that we need to look at; you even highlighted the issue of citations, which also has its limitations. So, what metrics or indicators would be most effective in measuring the success of these knowledge translation efforts, and how can they be standardized across different fields of research? What do you think is missing in terms of the metrics that should potentially be added as a form of assessment?

 

Dr. Rhoda Wanyenze:

I think what is missing is the process. Some of these things take a lot of investment in time and money, as I noted earlier, before you realize them. But if you just want to measure my output by a paper, or the grant that I got, without looking at what it did and what difference it made, it's a challenge. So we need to be looking at the process. If I'm going to do knowledge transfer, I'll need partnerships. So if somebody, for example, was able to look at the process, they would look at the engagement and the partnerships: whether I've partnered with a sector out there, like health or education, whether I've partnered with the private sector, and whether that is recognized and acknowledged as an important part of the process. And then there are quality issues along the way. I think that's really important, and I think that is missing. There is a growing number of examples of suggested indicators for measuring knowledge transfer, right from the resources. So, I think it's important that we look at the qualitative as well, and I think the qualitative is what is missing: being able to describe exactly the contribution you have made. If I have done an analysis of data, how has that analysis led to change? It's very difficult to do that. If I could use an example of work we have been doing here in my school most recently: we analyzed data, and we did this as Makerere University in partnership with FHI 360, looking at equity in family planning: who is being left behind? What is the problem? We picked up quite a number of groups that are being left behind: younger people, presumed not to be sexually active but who actually are, especially the much younger ones. And then we looked at some populations.
Typically, we used to think rural populations are disadvantaged, but we are now thinking the peri-urban poor are becoming a very disadvantaged population. So, long story short, we came up with quite a number of groups that we thought needed more attention. And we engaged a lot. We worked directly with the Ministry of Health on this, including the technical working group. So, for the first time, they were able to say: we are going to include equity as a goal in our implementation plan for family planning. And out of that, we've been able to move forward and ask: you've included equity, but how exactly is it going to be done? And so, we've moved to having a Uganda-specific definition of what equity means. How will Uganda measure equity? We've moved forward to developing a roadmap for how Uganda is going to do this and eventually evaluate the impact of that shift in policy. It's taken a lot of work, a lot of meetings, a lot of staff, but when I look back, it won't count in my promotion.

Joy Owango:

That is true, but then there’s quite a bit that you are involved in.

Dr. Rhoda Wanyenze:

For me, even before the papers are published – and by the way, the papers came much later, because we were focused elsewhere and the papers were delayed – it might appear like my productivity is low, but my impact is huge. How do you track that? So those are the processes, Joy, that I think are very important. That's where the qualitative would indicate: this national plan has been developed or adjusted because of what I did. There is a whole new structure that has been set up to address the issues that we found, and maybe a few years down the road, we could say it has led to certain groups that were being left behind now being addressed better in terms of service delivery. That is huge impact, but nobody tracks it. I might forget it after a few years, and I will not be rewarded for it. So those are the bits that are missing, that I think we need to work on in a very intentional way. Actually, Joy, because of that, earlier this year I was engaging with my colleagues in my school and saying: we are going to develop a tool to track the value of our research. We've now presented it, we are discussing it, and we are continuing to engage with it. And we thought, let's start writing this down, because we are cheating ourselves. We don't get enough credit for what we do because we are not even tracking it. So, we are now beginning to document, and we are going to see if we can step a few years back and ask people, for each of the research projects they've worked on: what difference did it make? Did it lead to a new curriculum, or improved teaching for students? Did it lead to a change in how programs are delivered? Did it lead to a change in policy, or a component of a policy, and if so, which one, and what changed? So that we can begin to document those things. I believe those are the things that are missing quite often, because that is different from when I manufacture a vaccine and have a patent; it has gone commercial and I'm even getting money.
When it comes to the innovation value chain up to commercialization, that becomes so much easier to track, but when it comes to shifts in policy and programs and impacts on the community, sometimes it’s very difficult, and I think we need new tools to be able to look at that.

Joy Owango:

In essence, what you're saying is that in the process of assessment, institutions need to look beyond the quantitative aspect. They need to look at the qualitative aspect, because there's so much that is done within the qualitative aspect, and it tends to come in much later in terms of publications and citations, and that's not enough to measure the impact and the involvement of the researcher within the institution. That leads me to the next question: what can Africa learn from the Agreement on Reforming Research Assessment set by the European Union and the Coalition for Advancing Research Assessment (CoARA)?

 

Dr. Rhoda Wanyenze:

These aspects are actually really emphasized in that coalition, in that agreement. I think that is important for us, because you see that they are reorienting research and taking it back to the quality and impact of the research, and these are the very things we've been talking about that are missing. But also, because they are doing this as a coalition, they are actually addressing the complexities that I raised earlier. They are advocating for change. This change is not just internal to one or two institutions; it's a multitude of institutions that sign on to be part of this. So, it gives them the power to shift the ecosystem externally and to advocate for change among other stakeholders and actors in this broader ecosystem, because it goes much bigger than the institution-level environment when we are dealing with science and innovation. I think there's a lot we can learn, and perhaps we can also begin to ask how strong our own coalitions and networks are, and how we can come up with an Africa-specific and responsive mechanism that also captures our indigenous knowledge, because we've talked a lot about indigenous knowledge and decolonization of knowledge, some of which has not gone as far as it ought to. This would also be the opportunity for us to look at the types of knowledge that exist within the African context, and how they can be harnessed, tracked and recognized. There are a lot of lessons we can pick from this agreement, CoARA, but also a lot of lessons we can pick from DORA, and at the same time, we can come up with our own context-specific thinking about how things work for Africa.

Joy Owango:

You just mentioned DORA, and it is actually the next question. You've highlighted the various ways in which we need to improve our assessments and provide better incentives within institutions, and initiatives like the Declaration on Research Assessment (DORA) aim to reform these flawed assessment systems. So how specifically do you see DORA helping to shift incentives towards translating research into meaningful policies, programs and societal benefits, especially within the African context and its complexities?

 

Dr. Rhoda Wanyenze:

I think there are a lot of opportunities for us to learn from the work that DORA has done over more than a decade. DORA has been deeply committed to supporting the development of new policies and practices for responsible research assessment. It has developed several criteria and standards that institutions can use as alternatives when assessing staff for hiring, promotion, and tenure within academic institutions, and these are available on the DORA website, so they would be very useful. Reformscape, for instance, is a collection of some of these criteria and standards that one can access and adapt, because DORA recognizes that every institution has its own unique context; the materials don't have to be used as they are. In fact, because of that, DORA also proposes mechanisms for how institutions can evaluate their ecosystems, engage multiple stakeholders, and advance these reforms. There are several such tools, like the SPACE rubric available on the DORA website, which gives institutions guidance on how to go about making these reforms. Most recently, DORA put out a document on the responsible use of quantitative indicators. It came out maybe three or four weeks ago, and it is an analysis of the pros and cons of the metrics commonly used in measurement. You will find a lot of what we've been talking about discussed there: journal impact factors, when and how they can be used, and how they differ across disciplines; citations; the h-index; and other metrics like field-normalized citation indices and altmetrics. The document is very easy to read, and it gives the reader, whether individuals or institutions, an opportunity to decide objectively which of these indicators, or which combination of them, to use.
It also emphasizes the value of qualitative narrative CVs as an emerging approach that could supplement these metrics. So it's an interesting document, and it can be accessed freely. DORA has thousands of institutions that have subscribed as signatories, but unfortunately Africa is among the least represented. When I look at the recent data from the analysis of the signatories, South America has more than 1,100 signatories, 36% of the total; Europe has more than a thousand as well, at 33%; and North America is closing in on 20%. But Africa has only 55 signatories, representing just 1.7% of the total, and these come from only a handful of countries, just four.

Joy Owango:

Why is that?

 

Dr. Rhoda Wanyenze:

I think it's a combination of issues. It might be an awareness issue, and maybe we need to do more outreach so that institutions know these declarations exist, and that they don't just exist for you to sign off on: they also come with freely available tools and documents that can be used. When some universities are grappling with how to carry out these reforms, they might not even know where to look for such tools. So perhaps we need to do a bit more outreach and help these universities appreciate that the tools exist. But it is also partly because conversations about this needed shift have not been widespread. Perhaps we need to talk more about reorienting research towards quality and impact, which will go hand in hand with improving the measurements we use. So, I think we need more conversations around this.

 

Joy Owango:

I feel that most African universities are still reliant on traditional methods of assessment and competitiveness, especially when it comes to rankings. It's a discussion I have with a lot of Vice Chancellors; you see them holding on to the era when those methods worked for them, without considering that there's a cultural shift in academia and a need for better assessment and for adopting different practices, like open science. So I think this is going to be something of a generational change, as a new cadre of Vice Chancellors comes into play. But most importantly, as you said, there's a need for increased awareness of this. I've really enjoyed this conversation. Thank you so much, Professor Wanyenze; this has been very enlightening. Is there anything else you'd like to add before we wind up?

Dr. Rhoda Wanyenze:

I've also enjoyed discussing this with you, Joy. I think it's such an important subject, because we invest a lot of money in research. Part of the reason some people don't like to fund academic institutions is that they keep asking, what's the value of funding you? They don't see the value of the research and the work that we do. I think this would be a great opportunity for us to begin to show what we can do, so that we can attract more support, and we need to be looking at it from that angle as well as we continue to engage. So, I'm looking forward to continuing to speak about this, and hopefully to that small but sure cultural shift.

Joy Owango:

Thank you so much for joining us.

Outro:

Thanks for joining us on today’s episode of Mazungumzo podcast. Be sure to subscribe and follow us on all our channels for more updates and candid stories by researchers, policymakers, higher education leaders, and innovators on your journeys. See you in our next episode.

Listen to the full episode and explore more episodes from the #Mazungumzo- African Scholarly Conversations podcast on the following platforms:

Buzzsprout: https://www.buzzsprout.com/2140692/episodes/15941652

Spotify: https://open.spotify.com/episode/1mFaKTymqRH7XBeN4sEaOv?si=fb40028bb4404fe1

Apple Podcasts: https://podcasts.apple.com/us/podcast/sn-3-ep-1-the-future-of-research-assessment/id1652483621?i=1000673415972

Afripods: https://afripods.africa/podcast/426e65f3-2c86-4c95-99af-a7ac9de09584

TuneIn:  http://tun.in/probT


ABOUT TCC AFRICA

The Training Centre in Communication (TCC Africa) is the first award-winning African-based training centre to teach effective communication skills to scientists.

GET IN TOUCH

University of Nairobi, School of Biological Sciences, Chiromo Campus, Gecaga Institute Building.

+254 020 808 6820
+254 020 2697401
+254 733 792316

info@tcc-africa.org
