When the Financial Times quietly posted a survey seeking input into the composition of its coveted list of scholarly journals in business, it set off a frenzy. Journal editors and other interested parties launched campaigns seeking to stuff the virtual ballot box, and thousands of “voters” obliged.
This response was not surprising. Inclusion in the FT50 list has material consequences for journals (which benefit from prestige, visibility, and subscriptions), schools (whose FT ranking depends in part on how productive their faculty is in FT50 journals), and scholars (some of whom receive direct pecuniary rewards for publishing in sanctioned journals). A lot is at stake in decisions about what journals “count.”
But what most respondents to the survey seemed to overlook was the underlying intention of the FT’s assessment, stated plainly in the first sentence of the survey: “The Financial Times is planning to update its analysis of business schools’ research, with a particular focus on assessing the broader impact on society.” That is, the FT was joining RRBM, AACSB and EFMD in aiming to better measure the societal impacts of academic research on the world beyond business schools. In some sense, this is the scholarly analogue of the Environmental, Social and Governance (ESG) movement in the corporate world, recently endorsed by the Business Roundtable and the World Economic Forum. ESG metrics are designed to incentivize corporations to make the world a better place. The FT, b-school funders, students, and many of us believe that business scholars, currently focused on accumulating a roster of A publications or hits on Google Scholar, should be redirected by better measures of the societal impact of research, with the aim of making the world a better place.
This is a bold ambition. Assessing “productivity” based on counts of articles published in an approved roster of journals, and “impact” based on citations, has been bad for science. As Aguinis and friends (2014) point out, something has gone wrong when the answer to “What is good for the advancement of our knowledge?” is different from “What is good for the advancement of a scholar’s career?” If the FT’s efforts were ultimately successful, we would see an alignment between what is best for scholarly careers AND what is best for advancing knowledge AND what is best for serving the needs of society.
This moment is a great opportunity to re-think impact and, ideally, re-shape the ecosystem of business research to better serve society. And we hope to host the dialogue right here.
Scholars have long lamented the pathologies of our simplistic and inward-focused metrics of impact. When resources are not constrained, we have many credible methods for measuring the societal impact of research. The Responsible Research Awards in different disciplines are based on nominations, careful reading of submissions, and judgement by expert panels. The new AACSB standards related to societal impact are assessed through schools’ self-reporting relative to their missions, evaluated by a panel of deans from peer business schools. The societal impact of research in UK universities is assessed (and financially rewarded) through a rigorous multi-year process in which each university creates a set of case studies grounded in research with purported societal impact, which is then evaluated by a large panel of outside experts. The UK Research Excellence Framework (REF) is often viewed as the gold standard for assessing impact, but it is also very resource intensive. Like the other approaches, the REF relies heavily on expert judgement and qualitative approaches that are difficult to replicate at scale.
But today we have far better tools that should enable us to assess societal impact on a global scale. We believe that creative, credible archival measures are possible. Natural language processing, academic citations, social media references, trade publications, policy papers, and general access to big data are opening new possibilities for developing useful proxies.
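To make this concrete, here is a minimal, purely illustrative sketch of how archival signals might be combined into a single proxy score. The source categories, field-baseline normalization, and weights are all assumptions for discussion, not a proposed standard or an existing RRBM/FT method.

```python
from dataclasses import dataclass

# Hypothetical archival signals for a single published article. The categories
# and weights below are illustrative assumptions, not an endorsed standard.
@dataclass
class ArticleSignals:
    academic_citations: int      # e.g., counts from a citation index
    policy_mentions: int         # references in policy papers or reports
    trade_press_mentions: int    # coverage in practitioner/trade outlets
    social_media_mentions: int   # links or discussions on social platforms


def societal_impact_proxy(signals: ArticleSignals,
                          field_baseline: ArticleSignals) -> float:
    """Combine field-normalized signals into one illustrative proxy score.

    Each count is divided by a field-level baseline (e.g., median counts for
    the article's discipline and publication year) to dampen subject and field
    effects, then weighted toward sources closer to practice and policy.
    """
    weights = {
        "academic_citations": 0.20,
        "policy_mentions": 0.40,
        "trade_press_mentions": 0.25,
        "social_media_mentions": 0.15,
    }
    score = 0.0
    for name, weight in weights.items():
        observed = getattr(signals, name)
        baseline = max(getattr(field_baseline, name), 1)  # avoid divide-by-zero
        score += weight * (observed / baseline)
    return score


# Example: an article with modest citations but strong policy uptake.
article = ArticleSignals(academic_citations=12, policy_mentions=5,
                         trade_press_mentions=8, social_media_mentions=40)
baseline = ArticleSignals(academic_citations=20, policy_mentions=1,
                          trade_press_mentions=3, social_media_mentions=25)
print(f"Proxy score: {societal_impact_proxy(article, baseline):.2f}")
```

The point of the sketch is the design choice, not the numbers: any credible archival measure would need field-and-year normalization (as the REF and altmetrics literature caution) and an explicit, debatable weighting of sources outside academia.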
The movement for Responsible Research in Business and Management (RRBM) has been at the forefront of changing the ecosystem of business research. AACSB’s revised accreditation standards include a new focus on societal impact. Three academic disciplines have established annual Responsible Research Awards for published articles and books. Several journals across different fields have published special issues on responsible research. The annual RRBM Summit continued despite the pandemic, and we launched a webinar series to engage our community more broadly. In the past five years, we have grown our community of supporters, showcased many exemplars and fostered interest in generating credible and useful research.
Our biggest challenge is to develop alternative approaches to assessing the societal impact of research that can be used by scholars, by schools, by journals, by accreditation processes, by other business school stakeholders, and/or by the Financial Times. We do not expect to find a solution that is both scalable and as credible as professional judgement by an expert panel. But we are convinced that we need something better than current practice.
As active members of the RRBM community, we want to engage you in crowd-sourcing ideas and research methods to address this challenge. Please comment on this blog and help us develop ideas for new, credible measures of societal impact of research. If you had almost unlimited access to big data, how would you uncover the threads from business school research to societal impact? Recognizing that corporations are unlikely to use full academic referencing in their 10-K reports and press releases, how can we tap the diffusion of credible research? If we can create a better archival measure of the societal impact of research, we will be able to create a better world.
Bill Glick and Jerry Davis are two founding members of RRBM.
I encourage everyone to sign up for the upcoming panel discussion “IFSAM executive committee response to FT survey on management journals”. Please see description and registration information at: https://www.ifsam.org/blog/2020/12/22/webinar-on-jan-29-2021/
Thank you, Xavier Castaner and IFSAM, for organising such an interesting panel discussion. We touched briefly on altmetrics, and I would recommend the work by my colleague Mike Thelwall on the use of alternative indicators and webometrics; see for example https://www.scholarlyassessmentreports.org/articles/10.29024/sar.10/. As with citation data, there are subject and field effects which need to be carefully considered.