The NIRF (National Institutional Ranking Framework) Report of 2019 was recently released with much fanfare.
The report can be downloaded here: NIRF REPORT 2019.
At the outset, the report is well designed and has clearly been prepared with good intent and purpose. Its biggest benefit for the general public is that, instead of depending on hearsay and unreliable private ranking systems, parents and students now have a ranking system which is authentic and accountable.
When I talked to Engineering and Medical College professors whom I personally know, the unanimous opinion was that it is a reliable report. Coming from professors, who are discerning by nature, this is quite encouraging and heartening. In a way, this report is the grand finale of the norms laid out by AICTE to ensure the essential requirements of quality engineering institutions.
Engineering colleges eagerly awaited this report, which was to be expected, since a decent rank makes a good marketing point.
The impressive thing about this report is that the ranking parameters have been clearly mentioned right on the cover page of the report.
Let us now delve right into the report.
The report itself consists of 66 pages. Pages 1-17 cover the background and the process followed, while pages 18-66 list the ranked engineering, management, pharmacy, medical, law and architecture institutions.
What we have, finally, is pages 1-17, which cover the important aspects of ranking institutions.
What is to be lauded is the compactness of the entire exercise. The report is fully focused on rankings and nothing else.
Let us look at the important aspects of the ranking parameters, process, calendar and a few other points.
Ranking parameters are grouped under five headings:
Teaching, Learning and Resources
It has to be acknowledged that all parameters have been drilled down to numbers. This shows that grey areas have been avoided, so that the possibility of corruption and bias is eliminated.
Under this category, the number of students, student-faculty ratio, and faculty with PhDs and experience are the scoring points. All are measured in numbers, and the scoring is pretty straightforward.
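Since every parameter is reduced to a number, the overall rank boils down to a weighted sum of parameter-group scores. The sketch below illustrates that arithmetic; the five group names come from the report, but the weights and sub-scores shown here are illustrative placeholders, not the official NIRF values.

```python
# Sketch of a NIRF-style composite score. The five parameter groups come
# from the report; the weights and example scores below are illustrative
# placeholders, not the official values.

WEIGHTS = {
    "TLR": 0.30,  # Teaching, Learning and Resources (illustrative weight)
    "RP":  0.30,  # Research and Professional Practice
    "GO":  0.20,  # Graduation Outcomes
    "OI":  0.10,  # Outreach and Inclusivity
    "PR":  0.10,  # Perception
}

def composite_score(scores: dict) -> float:
    """Combine per-group scores (each out of 100) into one overall number."""
    return sum(WEIGHTS[k] * scores[k] for k in WEIGHTS)

# A hypothetical institution's per-group scores:
example = {"TLR": 72.0, "RP": 55.0, "GO": 80.0, "OI": 60.0, "PR": 40.0}
print(round(composite_score(example), 2))  # → 64.1
```

Institutions are then ordered by this single composite number, which is what makes the exercise transparent: every point of the final rank can be traced back to a measured quantity.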
Research and Professional Practice
Number of publications, quality of publications, patents published and granted, footprint of projects, and professional practice.
This basically addresses enhancing the quality of faculty contribution. The key point to note here is that most faculty hesitate to even attempt writing papers, which is the foundation for getting patents. Institutions should strive to increase the number of papers published significantly and create a strong foundation for publishing and getting patents granted. Essentially, what we are talking about is nudging faculty to be creators rather than merely teachers and consumers. The report refers to the list compiled by the Financial Times. This list is well chosen, and a perusal of it shows that each establishment is strong in its respective core subject. One note of advice: faculty should possess command over the English language to impress the editorial boards of the 50 publications.
Selection of topics for papers is also very important. No purpose is served by choosing topics which involve extensive research, are of no use to the public or industry, and are expensive to investigate. Begin by selecting problems which industry is grappling to solve, and publish the papers in relevant journals.
Involve the students and award credits for the work they do in publishing papers. This will create a win-win situation and aid in getting better ranks.
Graduation Outcomes
Placement and higher studies, performance in university examinations, median salary, and the number of Ph.D. students graduated determine the Graduation Outcomes rank.
Outreach and Inclusivity
Region and country diversity, women, economically and socially challenged students, and physically challenged students are included in this category.
Getting students from other countries is a challenge for any university, and marketing to the Indian community in selected countries is a good strategy to begin with.
In total, 3000 institutions submitted their request for a ranking.
This is a heartening sign, and it shows unambiguously that the managements of universities are willing to compete on a level field and face the challenges.
Institutions such as the National Board of Accreditation and INFLIBNET have put thought and wisdom into making the entire exercise simple, straightforward and easy to adopt.
Normalisation of data has been adopted from this year onwards, to bring parity to the rankings. The database used is the common database, AISHE (All India Survey of Higher Education).
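To see what normalisation buys us, consider one common approach: min-max scaling, which maps every institution's raw metric onto a shared 0-100 scale so that metrics with different ranges can be compared fairly. The report does not spell out NIRF's exact formula, so the sketch below is only an illustration of the general idea, with made-up numbers.

```python
# Min-max scaling: one common way to normalise raw metrics onto a
# comparable 0-100 scale. The exact formula NIRF uses is not spelled
# out in the report, so this is only an illustration.

def min_max_normalise(values):
    lo, hi = min(values), max(values)
    if hi == lo:                      # all institutions identical
        return [100.0 for _ in values]
    return [100.0 * (v - lo) / (hi - lo) for v in values]

# Raw faculty-with-PhD percentages for four hypothetical institutions:
raw = [35.0, 60.0, 82.0, 47.0]
print([round(x, 1) for x in min_max_normalise(raw)])  # → [0.0, 53.2, 100.0, 25.5]
```

After scaling, the best institution on a metric scores 100 and the weakest scores 0, regardless of whether the raw metric ranged over percentages, salaries or paper counts, which is exactly the parity the exercise is after.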
I have a few suggestions for the NIRF board to consider. (If any of these are adopted, kindly credit me for the same.) Faculty, kindly note: giving credit to the idea creator is the first step towards creating an army of student creators.
Integrity of Data
It is common knowledge that fudging of data is a common phenomenon all over the world, and there is no guarantee that our universities will not do it. How does one discourage fudging in this scenario? The implementation committee seems to have recognised this and has taken some steps. In my tenure as a STEP in-charge, I was surprised to see AICTE officials take videos of the incubatees as evidence. Video recording, wherever and whenever possible, offers a foolproof way to create evidence which no one can deny.
Next, cases where data is found unreliable can be penalised in the subsequent year, to ensure that colleges think twice before fudging data.
A gradual climb in rank should be rewarded with additional ranking points.
The other aspect is the number of startups. The startup certificate is clear proof that the unit is indeed a startup. So, startup certificate verification should be imposed if the MHRD is serious about energising engineering and medical institutions.
Kindly also consider that a lack of thrust areas can attract a penalty. The logic here is that without thrust areas, no company or educational institution can go far. This is the foundation on which the ranking system has to be built.
The NIRF exercise starts in September, and all data has to be submitted by December. All data is submitted online, which reduces errors to a great extent.
A clear listing of citation databases has been given, and it is a helpful guideline for getting peer recognition and getting started.
Perception input is voluntary and is an open and transparent system. It is online and a prescribed format is provided.
One of the missing categories is polytechnic colleges. It would have been apt to include the top 15 colleges in India, just as with Law and Architecture.
One of the major takeaways from the report is the clear indication that the higher the number of papers published, the higher the rank of the department or branch. The report also notes that this indicator does not hold at the college level. This obviously means that some departments are laggards and are pulling the ranking down.
Colleges themselves have to investigate laggard departments, identify the reasons why some departments do well, and take adequate measures to improve.
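A first pass at that investigation can be purely mechanical: compare each department's paper count against the college-wide average and flag the ones falling below it. The sketch below shows the idea; the department names and counts are made up for illustration.

```python
# Flag laggard departments by comparing each department's paper count
# with the college average. Names and counts are hypothetical.

papers = {"CSE": 48, "ECE": 35, "Mechanical": 12, "Civil": 9}

average = sum(papers.values()) / len(papers)
laggards = sorted(d for d, n in papers.items() if n < average)

print(average, laggards)  # → 26.0 ['Civil', 'Mechanical']
```

The departments flagged this way are the starting point for the qualitative work: finding out why the strong departments publish more, and transplanting those practices.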
The number of startups is a big challenge colleges will face in the coming years. The Government of India is providing support, but colleges have to look at innovative ways to speed up the process. The best approach is to start a TBI (Technology Business Incubator), scout for IoT-based startups in their respective cities, and expose students to the workings of IoT projects in the practical world. The higher the number of startups based on cloud computing, IoT, AI and ML, the greater and speedier the chances of success. ATL (Atal Tinkering Lab) is another option to consider.
This is a brief overview of the NIRF report. One exercise I will take up in the near future is analysing the digital strength of colleges. This is the foundation for the future.
Happy ranking for 2020.