There is a lot of data available in the form of business school brochures, Web sites, articles in various publications and books, and you need to start somewhere in order to make sense of the information overload. This is where the rankings of business schools published by the renowned publications mentioned in Chapters 2 and 3 come in. These rankings serve as a guide to how the various programmes stack up against each other on various parameters. You can even build your own ranking by combining the figures for the parameters that matter to you from surveys conducted by BusinessWeek, U.S. News & World Report and others. You will also be able to see rankings of programmes all over the world, such as those in the US, Europe, Canada and Asia.

However, it is sometimes inappropriate to use the rankings directly to arrive at a conclusion, say, between US and European programmes, which are of different durations and involve different costs. Another factor usually considered in the unofficial ranking of programmes across the world is the prestige, or brand value, of the programme as perceived by people. We advise you to talk to a lot of people (friends, colleagues, current students, even bosses) to get their perceptions of the top programmes. This is all the more important for international students planning to return to their country in the future: some programmes have higher brand equity than others and, being better recognized, would stand them in good stead back home.
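The idea of combining the parameters that matter to you into a ranking of your own can be sketched in a few lines of Python. The school names, scores and weights below are purely hypothetical illustrations, not figures from any published survey; the approach, not the numbers, is the point.

```python
# Hypothetical per-parameter scores (higher is better), normalized to 0-100.
# These are invented for illustration only.
scores = {
    "School A": {"placement": 90, "brand": 95, "cost": 60},
    "School B": {"placement": 85, "brand": 80, "cost": 85},
    "School C": {"placement": 70, "brand": 65, "cost": 95},
}

# Weights reflecting *your* priorities; they should sum to 1.
weights = {"placement": 0.5, "brand": 0.3, "cost": 0.2}

def weighted_score(params):
    """Combine a school's parameter scores using your personal weights."""
    return sum(weights[k] * v for k, v in params.items())

# Order the schools by weighted score, best first.
ranking = sorted(scores, key=lambda s: weighted_score(scores[s]), reverse=True)
for rank, school in enumerate(ranking, start=1):
    print(rank, school, round(weighted_score(scores[school]), 1))
```

Changing the weights changes the ranking, which is exactly why two people with different priorities can legitimately arrive at different "best" schools from the same data.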
A number of rankings for business schools are published every year. While these rankings influence the overall opinion about the programmes, one must recognize that they are not the final word. It is wise to take into account what these rankings have to say, and also use your own research and judgment to plan your strategy for applying.
Rankings from publications such as U.S. News & World Report, BusinessWeek, the Financial Times, The Wall Street Journal and Forbes are the most widely used ones.
Any ranking is determined by criteria that are ultimately driven by human judgment: whom to survey, which criteria to include, and how much weightage to accord each of these. Rankings change from time to time, and the most commonly cited lists are general in nature, that is, they do not rank schools separately on different criteria. While this presents an overall picture of the school in question, it may not help answer the specific questions you have. It is better, therefore, to supplement overall rankings with functional rankings (such as those in BusinessWeek and U.S. News & World Report) and with in-depth research of the programme to see how well it suits your needs.
Popular rankings from the sources mentioned above have been inconsistent over time, and even more so across sources. Consider the top five schools in these rankings:
U.S. News & World Report Ranking (2008)
BusinessWeek Ranking (2006)
Financial Times Ranking (2007)
Wall Street Journal Ranking (2006)
Forbes Ranking (2005)
A total of 13 schools have been ranked in the top 5 by one or another of these business school rankings! This is only one pointer to the degree of variation in the ranking process.
U.S. News & World Report and BusinessWeek, the most well-known rankings, have been published for more than ten years. The rankings from the Financial Times, Forbes and The Wall Street Journal are more recent. U.S. News & World Report's ranking is generally considered the best because its system is the most transparent and its results consistently come closest to people's perception of relative prestige. Three schools have been ranked at Number 1 by U.S. News & World Report in the last ten years: Stanford Graduate School of Business, Harvard Business School and the Sloan School of Management at MIT.
Yet none of these three schools has ever been ranked Number 1 by BusinessWeek. Instead, the only schools to achieve the top position in its ranking are the Wharton School of the University of Pennsylvania and the Kellogg School of Management at Northwestern University. While the Financial Times has only been publishing rankings for the past three years, it uses relatively transparent criteria and is respected as a ranking (with some caveats). The only schools to hold the Number 1 position in its ranking have been Harvard and Wharton.
It would not be wise to rely too much on the numbers published by one particular source. The ultimate purpose of a ranking is to help you target the right B-schools and the right kind of MBA programme for you, not to decide for you which B-school is the best and which is not.
The best strategy is to supplement rankings with your own research about the schools. Remember, a better-ranked school need not necessarily get you a better job or a higher salary.
The key reason why prospective applicants need to be a little wary of rankings is the sheer range of criteria that determine why one school should be at the top and another at 27th.
However, there are some common determinants of rankings, such as GMAT scores, earning potential, recruiters’ views, and yield rates, which act as great aids in directing us to our dream schools.
For instance, the Financial Times combines around 20 factors in ranking the world's top 100 programmes, notably students' career paths, diversity of experience and the research quality of the school. While these serve as good indicators, especially when comparing schools across continents, there are certain inherent risks as well.
For example, salary comparisons across regions are standardized by considering the relative purchasing power of the various currencies. However, the sizeable differences in tax structures in these locations are not considered. Similarly, other subjective factors like alumni satisfaction levels are considered. Conversely, some other factors like diversity indicators (women and international students, percentage of alumni working abroad, etc.), qualifications of faculty (doctorate/research publications) appear to be too objective and may not contribute significantly to the quality of education, image or recruitment profile of the school. The weightage accorded to the factors is also unscientific and open to debate.
BusinessWeek’s biennial ranking of top B-schools is eagerly anticipated because its methodology is simpler, with a clear focus on the two critical target groups: the students and the recruiters. A third measure, used to round off the surveys, is an index of each school's intellectual capital, evaluated by the number of publications by its faculty in leading journals and magazines. However, low participation by a school in the survey can significantly affect its ranking; conversely, using current students to evaluate their own school intrinsically makes the responses subjective.
U.S. News & World Report’s annual rankings offer a more scientific method of ranking—a fact borne out by the degree of consistency in the rankings themselves over the past few years. The publication essentially looks at three main factors—reputation, placements and key admissions criteria for the schools. These further consist of sub-factors like average salaries of the graduates, percentage placed within three months of graduation, student GPAs, GMAT scores, and yield. Perhaps the major scope for potential bias lies in the reputation factor.
The Wall Street Journal is a relative newcomer to the rankings business; its first ranking was published in April 2001. However, it has yet to gain real credibility among prospective applicants, primarily because of the large year-on-year variances in schools' rankings. More importantly, results such as No. 10 for Stanford and No. 13 for Harvard, while Carnegie Mellon's Tepper School of Business swept in at No. 2, do much to further the sceptics' point of view.
Forbes uses a focussed mechanism for ranking schools: return on investment. It compiles data on graduates' salaries over the five years since they left B-school. This is then compared against their investment, that is, the cost of tuition plus the opportunity cost of the pre-MBA salary foregone (with assumed increments). The discounted return is then used to rank the schools. This is a good rule of thumb, but by this logic it would almost always make sense to go to a European school with a one-year programme rather than a Harvard or Stanford.
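A Forbes-style "discounted return" comparison can be sketched as follows. All salary, tuition, growth and discount figures below are made-up assumptions for illustration; Forbes' actual methodology and parameters may differ. The sketch does, however, show why a shorter programme tends to win on this metric: it cuts both tuition and foregone salary roughly in half.

```python
def discounted_roi(pre_mba_salary, post_mba_salary, tuition_per_year,
                   years_in_program, discount_rate=0.05, horizon=5):
    """Net present value of doing the MBA versus staying in the old job
    (assumed 4% annual raises), over `horizon` years after graduation.
    All rates are illustrative assumptions, not Forbes' actual figures."""
    npv = 0.0
    t = 0
    # Cost while in school: tuition plus the salary foregone.
    for year in range(years_in_program):
        foregone = pre_mba_salary * (1.04 ** year)
        npv -= (tuition_per_year + foregone) / (1 + discount_rate) ** t
        t += 1
    # Gain after graduation: MBA salary minus what the old job would pay.
    for year in range(horizon):
        old_path = pre_mba_salary * (1.04 ** (years_in_program + year))
        gain = post_mba_salary * (1.04 ** year) - old_path
        npv += gain / (1 + discount_rate) ** t
        t += 1
    return npv

# A one-year European programme vs a two-year US programme (made-up numbers):
one_year = discounted_roi(60_000, 110_000, 50_000, years_in_program=1)
two_year = discounted_roi(60_000, 120_000, 55_000, years_in_program=2)
print(round(one_year), round(two_year))
```

Even with a lower post-MBA salary, the one-year programme comes out well ahead on this measure, which is precisely the bias the text points out.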
This section enables an objective evaluation of certain aspects, creating a sort of ranking from the data available across various sources to arrive at a list that speaks for itself. Chapter 3 discusses something similar, but from a subjective point of view: it helps you research business schools based on your own priorities and specific needs.
There are a few ways to temper the wide variations that exist in the popular rankings. We shall describe two of them here.
The most commonly used is to average the ranks of business schools published by the various publications. Tables 4.1(a), (b) and (c) illustrate this method.
One of the most apparent advantages of this method is that it evens out the variations and actually produces a consolidated ranking. Another is that if you include newer rankings, such as those of Forbes or The Wall Street Journal, and construct a more exhaustive table, the results for the top 10 schools will not be too different from what they are right now.
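The averaging method amounts to a few lines of arithmetic. The ranks below are hypothetical placeholders rather than the actual published figures; with real data, each school's list would hold its rank in each publication.

```python
# Hypothetical ranks for each school across three publications.
ranks = {
    "School A": [1, 3, 2],
    "School B": [2, 1, 5],
    "School C": [4, 2, 1],
}

# Average each school's ranks, then sort: a lower average rank is better.
average_rank = {school: sum(r) / len(r) for school, r in ranks.items()}
consolidated = sorted(average_rank, key=average_rank.get)
print(consolidated)
```

Note how School B, despite a Number 1 finish in one publication, drops behind School C once its weak fifth place is averaged in; this is the "evening out" the method provides.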
The other method is to identify how many of the major rankings have included a particular programme in the top 5 or top 10 or top 15. For example, if we take BusinessWeek, U.S. News & World Report, Financial Times and The Wall Street Journal, the results are as follows:
The following schools feature in the top 5 of three or more of these rankings:
The following schools feature in the top 10 of three or more rankings:
This method gives a fairly good clustering of B-schools into different tiers, which can then be used as a yardstick to measure the quality of a programme.
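This second method, counting how many major rankings place a school in the top N, can be sketched as below. The top-10 sets are invented for illustration and deliberately abbreviated; they are not the actual published lists.

```python
from collections import Counter

# Invented, abbreviated top-10 lists for four publications.
top10_lists = {
    "BusinessWeek": {"Wharton", "Kellogg", "Harvard", "Chicago"},
    "U.S. News": {"Harvard", "Stanford", "Wharton", "MIT"},
    "Financial Times": {"Harvard", "Wharton", "Stanford", "Columbia"},
    "WSJ": {"Tepper", "Michigan", "Harvard", "Dartmouth"},
}

# Count how many rankings include each school in their top 10.
appearances = Counter(s for lst in top10_lists.values() for s in lst)

# Tier 1: schools in the top 10 of three or more rankings.
tier1 = sorted(s for s, n in appearances.items() if n >= 3)
print(tier1)
```

Raising or lowering the threshold (top 5, top 15) produces the successive tiers described above.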
To summarize, do not depend solely on the rankings generated by well-known publications; do your own research and determine the best school for yourself by following one of the methods suggested above or some variation of them.
Table 4.1(a) ‘Selectivity’ and ‘Yield’ Figures for Some US B-Schools
School | Yield | Selectivity
---|---|---
Carnegie Mellon | 45% | 28%
Chicago | 67% | 23%
Columbia | 71% | 15%
Cornell (Johnson) | 47% | 36%
Dartmouth (Tuck) | 59% | 19%
Duke (Fuqua) | 45% | 37%
Emory (Goizueta) | 39% | 37%
Georgetown (McDonough) | 40% | 41%
Harvard | 87% | 13%
Indiana (Kelley) | 48% | 33%
Michigan | 61% | 35%
MIT (Sloan) | 66% | 20%
Northwestern (Kellogg) | 57% | 23%
NYU (Stern) | 48% | 22%
Ohio State (Fisher) | 44% | 53%
Pennsylvania (Wharton) | 68% | 16%
Purdue (Krannert) | 34% | 44%
Stanford | 78% | 10%
Texas-Austin (McCombs) | 45% | 43%
UC Berkeley (Haas) | 47% | 17%
UCLA (Anderson) | 45% | 25%
UNC (Kenan-Flagler) | 40% | 47%
USC (Marshall) | 44% | 36%
Virginia (Darden) | 39% | 38%
Washington U.-St. Louis (Olin) | 38% | 54%
Yale | 42% | 25%
Table 4.1(b) ‘Selectivity’ and ‘Yield’ Figures for Some European B-Schools
School | Yield | Selectivity
---|---|---
INSEAD | 72% | n/a
IMD | 84% | 19%
Oxford (Said) | 57% | 42%
London | 53% | 26%
SDA Bocconi | 87% | 27%
IESE | 67% | 40%
HEC-Paris | 65% | 22%
University of Cambridge (Judge) | 55% | 40%
Cranfield School of Management | 60% | 62%
Table 4.1(c) ‘Selectivity’ and ‘Yield’ Figures for Some Canadian B-Schools
School | Yield | Selectivity
---|---|---
Queens | 35% | 47%
Rotman | 48% | 50%
York (Schulich) | 50% | 54%
Western Ontario (Ivey) | 54% | 60%
Source: From various B-school Web sites