India Rankings 2016
National Institutional Ranking Framework

Engineering | Management | Pharmacy | Universities

Ranking Parameters: Teaching, Learning & Resources | Research, Professional Practice & Collaborative Performance | Graduation Outcomes | Outreach & Inclusivity | Perception

Contents

Preface
1. Background
2. NIRF Parameters for Ranking of Institutions
3. Metric to Compute Ranking Scores
4. Participation: A Matter of Choice
5. Methodology
   5.1. Source of Data: Individual Institutions
   5.2. Data Collection and Data Capturing
        5.2.1. Online Data Capturing Platform
        5.2.2. Data Submission Utility
        5.2.3. Publications, Citations and Collaboration: Web of Science, Scopus and Indian Citation Index
        5.2.4. Perception
   5.3. Data Mining and Data Verification
   5.4. Interpretation and Minor Deviations
   5.5. Ranking Threshold
6. Analysis of Ranking Results
   6.1. Engineering (Category A - Research & Teaching)
   6.2. Management (Category A - Research & Teaching)
   6.3. Pharmacy (Category A - Research & Teaching)
   6.4. Universities
7. Computation of Scores and Rankings: Top 25 in Each Category
   7.1. India Rankings 2016: Engineering Category "A"
   7.2. India Rankings 2016: Management Category "A"
   7.3. India Rankings 2016: Pharmacy Category "A"
   7.4. India Rankings 2016: Universities Category "A"
8. Top Ten Institutions in Each Category
   8.1. Engineering (Category A - Research & Teaching)
   8.2. Management (Category A - Research & Teaching)
   8.3. Pharmacy (Category A - Research & Teaching)
   8.4. Universities
9. Collaborators

Preface

Lessons from the First National Rankings Effort

This has been a huge learning opportunity for those of us who have been involved in this effort.

The first surprise was the sheer scale of the response from educational institutions. That institutions were so keen to position themselves amongst their peers was very pleasant to learn. This speaks volumes for institutional aspirations to move towards quality, and augurs well for the future evolution of higher educational institutions.

In this respect, it is relevant to mention that the response from universities and from engineering, management and pharmacy institutions was very enthusiastic, even overwhelming, and therefore particularly gratifying. The response was uniformly spread across Government-funded and privately-funded or self-financing institutions, which makes the ranking effort truly representative and worthwhile.

However, despite wide publicity, the response from architecture and general degree colleges fell short of being truly representative. Hence, for this year, it was decided not to rank institutions in these two categories. We apologise to institutions in these two categories that took the trouble to register and populate the data, but it became clear to us that there were significant pockets of relative vacuum; in the absence of representative participation, the ranking exercise would have been meaningless.

On the downside, the associated lesson has been that educational institutions need to be more careful about their data. Many institutions failed to put their best foot forward when it came to supplying quality data; some were distinctly casual in supplying the data sought. The reliability of an exercise like this depends entirely on the reliability of the data, and this is clearly an area in which further work is needed. Institutions need to appreciate that supplying honest and reliable data matters for their own image: for how they want to be seen by their students and the world at large.

In fact, a major fraction of the ranking effort went into improving data quality, and a large number of our volunteers were deeply involved here. We used an outlier-based approach to detect suspect data points, and repeatedly contacted the institutions concerned to examine the suspected figures and supply corrected values. While some amount of automation was used for the task, significant human intervention was needed, for which our officials, volunteers and staff (both at NBA and INFLIBNET) spent many sleepless nights. We are grateful to the many professional volunteers who selflessly gave their time to this task.
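The report does not specify the outlier test that was used. Purely as an illustrative sketch, with invented data and a hypothetical flag_outliers helper, a simple interquartile-range fence of the kind alluded to above might look like this:

```python
import statistics

def flag_outliers(values, k=3.0):
    """Flag submitted figures lying far outside the peer distribution.

    Uses a simple interquartile-range (IQR) fence: values beyond
    k * IQR from the quartiles are returned for manual follow-up
    with the institution concerned. Illustrative only; not the
    actual NIRF data-verification procedure.
    """
    q1, _, q3 = statistics.quantiles(values, n=4)
    iqr = q3 - q1
    lo, hi = q1 - k * iqr, q3 + k * iqr
    return [v for v in values if v < lo or v > hi]

# Example: faculty counts reported by peer institutions (invented data).
reported = [112, 98, 140, 105, 9000, 121, 87]
print(flag_outliers(reported))  # -> [9000]: query this institution
```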

In fact, the major lesson from this exercise for the future is to give deeper attention to this problem. Going forward, we need to find ways for much of the relevant data to be populated directly from a few independent sources. A major effort is needed to create such sources of independent data, and to do away with parameters for which such sources cannot be reliably identified. To put this in perspective, the data on research publications and citations was taken directly from resources published by Elsevier (Scopus), Thomson Reuters (Web of Science) and the Indian Citation Index, and we had few problems with data from these sources. We have taken every care to be objectively neutral in our work. It nevertheless became necessary at times to effect data corrections where common reasoning dictated that there was an obvious error due to carelessness; the number of such interventions was very small, and in all cases they had little possibility of affecting the overall ranking. In spite of taking all care, we are aware of the many pitfalls and deficiencies of a first attempt, and of possible imperfections in the results. We seek the indulgence of all stakeholders and hope to work with them in developing future versions of this national exercise.

Surendra Prasad
Chairman, NBA


"The Ranking framework will empower a larger number of Indian Institutions to participate in the global rankings, and create a significant impact internationally too. I see this as a sensitization process and an empowering tool, and not a tool for protection."

Smriti Zubin Irani, HRD Minister

1. Background

The process of framing the National Institutional Ranking Framework (NIRF) began on October 9, 2014, with the constitution of a 16-member Core Committee under the chairmanship of Secretary (HE), Ministry of Human Resource Development. The Committee's terms of reference were to suggest a reliable, transparent and authentic national framework for performance measurement and ranking of higher education institutions, and to recommend institutional mechanisms, processes and timelines for its implementation. The framework was given final shape after intense discussions and deliberations in a series of meetings of the Committee, and through exchanges with peers and stakeholders in many online discussions.

The National Institutional Ranking Framework (NIRF) for the engineering and management categories was unveiled by Smt. Smriti Irani, Honourable Minister for Human Resource Development, on 29th September, 2015. Soon after, the frameworks for ranking pharmacy and architecture institutions, as well as colleges and universities, were released. The NIRF envisaged separate rankings for different categories of institutions within their own respective peer groups. Further, within each discipline, there was provision for separate rankings in two categories: institutions engaged in both research and teaching (Category A), and those engaged mainly in teaching (Category B).

The final framework identified nearly 22 parameters under five major heads, several of which are similar to those employed globally, such as excellence in teaching, learning and research. A few, however, are India-centric, reflecting the aspirations of the rising numbers of young people in a vast nation. Country-specific parameters relevant to the Indian situation include regional and international diversity, outreach, gender equity and inclusion of disadvantaged sections of society.

"I don't see this becoming encompassing and exhaustive in next two years, but a beginning to that end. Our framework is a moderated version of QS and is self-reporting. Though all central universities will be part of it, I hope good institutions from private as well as other sectors will aspire to be part of rankings,"

Vinay Sheel Oberoi, Secretary (HE), MHRD


"The NIRF has been drafted to provide an Indian context to educational aspirations and needs."

Smriti Zubin Irani, HRD Minister

2. NIRF Parameters for Ranking of Institutions

The NIRF provides for ranking of institutions under five broad generic parameters, namely: i) Teaching, Learning and Resources; ii) Research, Consulting and Collaborative Performance; iii) Graduation Outcome; iv) Outreach and Inclusivity; and v) Perception.

Teaching, Learning and Resources (TLR)
- Faculty Student Ratio - Permanent Faculty (FSR)
- Faculty Student Ratio - Visiting Faculty (FSR)
- Metric for Faculty with Ph.D. and Experience (FQE)
- Metric for Library, Studio & Laboratory Facilities (LL)
- Metric for Sports and Extra Curricular Facilities (SEC)
- Metric for Teaching and Innovation (TI)

Research Productivity, Impact and IPR (RPII)
- Combined Metric for Publications (PU)
- Combined Metric for Citations (CI)
- Intellectual Property Right and Patents (IPR)
- % of Collaborative Publications and Patents (CP)
- Footprint of Projects and Professional Practice (FPPP)

Graduation Outcome (GO)
- Performance in University Examinations (PUE)
- Performance in Public Examinations (PPE)
- Performance in Placement, Higher Studies and Entrepreneurship (PHE)
- Mean Salary for Employment (MS)

Outreach and Inclusivity (OI)
- Outreach Footprint (Continuing Education, Services) (CES)
- Percentage of Students from Other States / Countries (RD)
- Percentage of Women Students and Faculty (WS)
- Percentage of Economically and Socially Disadvantaged Students (ESDS)
- Facilities for Physically Challenged / Differently Abled Persons

Perception (PR)
- Process for Peer Rating in Category (PR)
- Applications to Seat Ratio (SR)

3. Metrics to Compute Ranking Scores

The discipline-specific ranking frameworks for all six categories of institutions mentioned above are available on the NIRF Web site. These documents identify the relevant data required to measure the performance score under each sub-head mentioned above, and propose a suitable metric to compute a score for that sub-head. The sub-head scores are then added to obtain a score for each individual parameter, and the overall score is computed from the weights allotted to the different parameters.
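As a concrete illustration of this aggregation, the sketch below adds sub-head scores into parameter scores and then forms the weighted overall score. All numbers here (sub-head scores and weights) are invented placeholders for illustration, not the actual NIRF values:

```python
# Illustrative NIRF-style score aggregation. Sub-head scores and
# parameter weights below are invented placeholders.
sub_head_scores = {
    "TLR":  {"FSR": 18.0, "FQE": 12.5, "LL": 7.0},
    "RPII": {"PU": 20.0, "CI": 15.0, "IPR": 5.0},
    "GO":   {"PUE": 22.0, "PHE": 18.0},
    "OI":   {"CES": 6.0, "WS": 8.0},
    "PR":   {"PR": 40.0},
}
weights = {"TLR": 0.30, "RPII": 0.30, "GO": 0.15, "OI": 0.15, "PR": 0.10}

# Each parameter's score is the sum of its sub-head scores ...
parameter_scores = {p: sum(s.values()) for p, s in sub_head_scores.items()}

# ... and the overall score is the weighted sum over parameters.
overall = sum(weights[p] * parameter_scores[p] for p in parameter_scores)
print(parameter_scores)
print(round(overall, 2))
```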

4. Participation: A Matter of Choice

Institutions desirous of participating in the ranking exercise using the framework defined by the NIRF were invited to register on the NIRF Web portal and submit their online applications for ranking in one or more disciplines, along with the relevant data in the given format, by 31st December, 2015. The last date for submission of data was later extended to 15th January, 2016.

Category-wise Distribution of Institutions Registered for Ranking under NIRF

Discipline      Total   Cat. A   Cat. B   Cat. A & B   CFTIs & CFU   Govt. Inst.   Semi-Govt. Inst.*   Others**
Engineering      1438      314     1116            8            53           168                  13       1204
Management        609      142      459            8            14            27                  11        557
Pharmacy          454      173      275            6            --            34                   1        419
Architecture       28       10       17            1             1             4                   4         19
College           803        -        -            -            --            --                 744         59
University        233        -        -            -            46            --                 141         46

* Semi-Govt. Inst.: Deemed, State and 12B Universities.
** Others: Private and Deemed Private institutions.

Region-wise Distribution of Institutions Registered for Ranking under NIRF

Discipline / Region    North   South   East   West   North-east   Total
Engineering              252     737    126    301           22    1438
Management               150     257     35    162            5     609
Pharmacy                  84     194     28    145            3     454
University                86      58     23     50           16     233

"India needs national rankings and national data collection effort. We want more universities from India to provide data, and we really encourage data sharing."

Phil Baty, THE Rankings

