2012 MFA Rankings: The Methodology

Seth Abramson
From the September/October 2011 issue of
Poets & Writers Magazine

Note: The following article refers to the rankings that appeared in the September/October 2011 issue of Poets & Writers Magazine. Our most recent coverage of MFA programs is available in the September/October 2012 issue on newsstands now.


The 2012 Poets & Writers Magazine rankings comprise individual rankings for full-residency MFA, low-residency MFA, and doctoral creative writing programs. Full-residency MFA programs are assessed on the basis of eighteen measures, half of which are ordered rankings and half of which are unranked categories of important program features. While none of the nine full-residency MFA rankings–based measures can be classified as scientific, all nine are predicated upon sufficient hard data to be substantially probative. A scientific ranking of MFA or doctoral creative writing programs is not presently possible, as more than half of the nation's full- and low-residency graduate creative writing programs have thus far declined to make public the necessary data (see “Full-Residency Rankings: Ranking Questionnaires and Program Response Rates”).

Four of the nine full-residency MFA rankings are based on a survey of a large sample of current MFA applicants. The most important of the four is the ranking upon which the ordering of the programs in the table is based, a ranking that reflects individual fiction and poetry applicants’ varying esteem for the nation's 151 full-residency MFA programs, two creative writing MPW programs, and four MFA-comparable creative writing MA programs (for a discussion of the inclusion of these latter six programs, see “Non-MFA Master’s Degree Programs”). The remaining three poll-based genre rankings are essentially subsets of this first ranking to the extent that they offer data relating to various elements of the overall cohort polled: fiction applicants, poetry applicants, and nonfiction applicants (the last of whom were polled for the nonfiction-genre rankings only). For the genre rankings, programs are ordered, as with the overall rankings, on the basis of the number of votes received by each full-residency program in that category. Polled respondents cast a vote by stating a present or future intent to apply to the program in question. The top fifty overall vote-getters (with two programs tied for fiftieth), along with the next twenty-five vote-getters (denominated “Honorable Mention” programs), are listed in the rankings table, also published in the September/October 2011 print edition of Poets & Writers Magazine. The remaining eighty-two full-residency programs are listed in "The Additional Rankings of Full-Residency MFA Programs."

For the genre rankings, programs ranking in the top fifty in poetry and fiction are noted in both the print and online rankings tables, as are programs ranking in the top twenty in nonfiction.

The five hard data–based rankings are as follows: funding, selectivity, fellowship placement, job placement, and student-faculty ratio. These rankings are scientific to the extent that they rank programs on the basis of quantitative data publicly released by the programs themselves, and unscientific to the extent that not every program has released data for every category of assessment. The rankings therefore constitute an ordering of all publicly known data rather than an ordering of all extant data. A full complement of funding and admissions data is available for well over half of the nation's full-residency MFA programs; the remaining programs are primarily smaller, newer, lightly advertised, or nondomestic programs, or else programs with a primarily regional applicant base. As all of these programs have websites, however, and as all of these programs exert exclusive dominion over their online presence, the absence of any specific funding or selectivity data in these programs' online promotional materials is taken, by the rankings, as an indication that these programs fully fund less than 33 percent of their students and do not have an acceptance rate low enough for inclusion in the top fifty in this category (currently, a program's yield-exclusive acceptance rate would need to be less than 9.4 percent for it to be included in the top fifty for selectivity). The rankings are based in part on the presumption that it would be counterintuitive for a program providing full funding to a substantial percentage of its student body to not indicate as much in its promotional materials. Program websites are regularly reviewed to determine whether a program has added information to its online profile; program administrators can also e-mail Poets & Writers Magazine directly to draw attention to any substantive website changes.

Based on the data presently available, it is not anticipated that any of those programs without a full complement of funding and admissions data available in some form online would have ranked in the top fifty in the funding category. These programs, given the incompleteness of their promotional materials, are also much less likely to attract sufficient applications to be eligible for the selectivity rankings; a program must receive at least eighty applications annually to be considered eligible for this ranking. As to the two placement rankings, these do not rely on programs' promotional materials or their willingness to release internal data to individual applicants or groups of applicants, so all programs nationally, both full- and low-residency, were equally eligible for a top fifty ranking in fellowship placement and job placement. Data sufficient to calculate student-faculty ratios for virtually every MFA program in the United States were also readily available.
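The selectivity-eligibility test described above can be sketched in a few lines of code. The eighty-application floor and the 9.4 percent yield-exclusive cutoff are taken from the text; the sample program figures below are hypothetical.

```python
# Sketch of the selectivity-eligibility test described above.
# A "yield-exclusive" acceptance rate counts every offer of admission made,
# not just those offers that yielded matriculants.
# The 80-application floor and 9.4 percent cutoff come from the article;
# the sample figures below are hypothetical.

def selectivity_profile(applications: int, acceptances: int):
    eligible = applications >= 80  # minimum annual applications for the ranking
    rate = acceptances / applications * 100  # yield-exclusive acceptance rate, in percent
    in_top_fifty_range = eligible and rate < 9.4
    return eligible, round(rate, 1), in_top_fifty_range

# A hypothetical program: 500 applicants, 40 offers of admission.
print(selectivity_profile(500, 40))  # (True, 8.0, True)
```

A program with fewer than eighty applicants is excluded outright, no matter how low its acceptance rate.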

The overlap between those programs ranked in the top fifty overall and those programs ranked in the top fifty in the other seven categories subject to ranking is significant. One hundred percent of the overall top fifty programs ranked in the top fifty in one or both of the fiction and poetry genres; ninety percent ranked in the top fifty in both poetry and fiction. In nonfiction, fifteen of the top twenty programs (75 percent) also ranked in the overall top fifty.

Thirty-eight (75 percent) of the overall top fifty ranked in the top fifty in funding, with another six (12 percent) receiving an Honorable Mention. In all, 87 percent of the top fifty full-residency programs ranked in the top fifty for funding or received an Honorable Mention in this measure of program quality. Forty-six (90 percent) of the top fifty programs ranked in the top fifty in selectivity, with thirty-seven (73 percent) ranking in the top fifty in fellowship placement (and another 4 percent earning Honorable Mention) and thirty-three (65 percent) ranking in the top fifty in job placement. Thirty-one (61 percent) of the top fifty programs ranked in the top fifty in student-faculty ratio, with another six (12 percent) earning an Honorable Mention in the category.

In view of the above, ordering programs on the basis of their overall vote totals also had the effect of placing a special emphasis, in the rankings, on those programs that placed highest in the five hard data rankings.

By definition, statistics-based quality assessment schemes of any kind—whether in the form of rankings or otherwise, whether in the field of education or any other field—perform poorly when it comes to assessing unquantifiable program features, or features that can only be assessed subjectively by an individual experiencing them firsthand. That such features are most assuredly a vital element of every graduate creative writing program does not and should not preclude the possibility of statistics-based quality assessment schemes operating alongside the primary mechanism programs have to introduce applicants to unique curricular features: their own websites and promotional material. Programs themselves bear primary responsibility for educating prospective applicants regarding program offerings, and no ranking system of any kind could or should usurp this responsibility—especially as no applicant applies to a program without first researching it. Indeed, programs at all times have a captive audience of hundreds if not thousands for their online promotional materials. Some programs may worry that neither these rankings nor any ranking system devisable could adequately capture each graduate creative writing program’s most distinctive elements; these fears are understandable, but misplaced. Those applicants polled for this ranking had access to individual program websites at all times before, during, and after their formation of a list of where they intended to apply.

If an unquantifiable program feature of which a program is singularly proud does not lead to that program excelling in these rankings, it is neither because these rankings did not consider that feature nor because the applicants polled for these rankings did not. Instead, it is because that feature is less popular among applicants than it is among those program faculty and administrators who originally brainstormed and designed it. By polling current applicants rather than individual program faculties and administrators, the rankings ensure that the class of persons with the most say in these rankings is one that is likely to have recently accessed the very program websites that (presumably) prominently advertise those features of which programs are most proud. Faculty and administrators at one program are highly unlikely to be trawling the program websites of other programs on a regular basis; consequently, they are far less likely to be aware of competitors’ idiosyncratic, online-advertised features.

The methodology for these rankings was devised with the following goals in mind: (1) To better inform applicants of their program options by offering the nation’s first complete listing of graduate creative writing programs; (2) to more widely disseminate hard data regarding objective measures of program quality, which often play a role in applicants’ application and matriculation decisions; (3) to be responsive to, and reflective of, the opinions, interests, concerns, and values of current applicants to graduate creative writing programs; (4) to emphasize the importance of inputs as well as outputs in assessing the overall quality of individual graduate creative writing programs; and (5) to enhance the transparency of the admissions process for present and future applicants to graduate creative writing programs.

Polling Locus
In the twelve months between April 16, 2010, and April 15, 2011, 640 full-residency MFA applicants were surveyed on the highest-trafficked MFA-related website on the Internet, The Creative Writing MFA Blog. Founded on August 21, 2005, this website recorded approximately 635,000 page-loads during the polling period from approximately 395,000 unique visitors, comprising roughly 252,000 first-time visitors and 142,000 returning visitors.

The Creative Writing MFA Blog is a free, public, lightly moderated discussion blog whose only requirement for viewing is access to a computer; active participation on the board requires a Google account. The site is run by American novelist Tom Kealey and a team of more than seventeen designated moderators, approximately five of whom are active at any one time. The author of this article was a moderator at The Creative Writing MFA Blog for a portion of the above-referenced polling period. Kealey himself was not an active moderator during this period. The Creative Writing MFA Blog has no stated agenda other than to provide accurate and timely information about MFA programs to current and prospective applicants. A sister website, The Creative Writing PhD Blog, caters specifically to doctoral creative writing applicants; as this website is only prominently linked to from The Creative Writing MFA Blog itself, it is here considered an adjunct of the latter website (and indeed, the web address for the Creative Writing PhD Blog is merely the web address of the Creative Writing MFA Blog with a “2” appended at the end).

Tom Kealey, the proprietor of The Creative Writing MFA Blog, is also the author of the top-selling MFA-related book in the United States, per Amazon sales statistics recorded during the polling period. This book, The Creative Writing MFA Handbook (Continuum, 2008), prominently features the web addresses for The Creative Writing MFA Blog and The Suburban Ecstasies. Consequently, even those who conducted their MFA research via print publications were arguably likely to come across the web address for The Creative Writing MFA Blog during the course of their reading. Indeed, as Kealey's book is the only print publication on the American or international market that profiles individual full-residency MFA programs in detail, it has become nearly ubiquitous in the MFA applicant community. As nearly all applicants to creative writing doctoral programs were once MFA applicants, a similarly high degree of familiarity with The Creative Writing MFA Blog may be presumed among this sub-community as well.

Individual users on The Creative Writing MFA Blog were distinguished by their user accounts, and substantial additional measures were taken to prevent duplicate submissions. During the polling period the number of individual accounts active on The Creative Writing MFA Blog was between 1,000 and 1,250, which suggests that the present polling's 640-person cohort represents between one-half and two-thirds of all active patrons on the site during the twelve-month polling period in question. The presence of an unknown number of nonposting members on the site helps explain the high unique visitor count cited above, as well as the fact that even the most respected stat-counter services will sometimes read returning users as first-time users, depending upon an individual user's privacy settings with respect to IP-recognition “cookies.”

Polled applicants were asked to list the programs to which they had applied or intended to apply, and were permitted to adjust these lists during the polling period. Fewer than 10 percent of poll respondents elected to do so.

Period of Data Collection
Only recently collected data is of use to applicants. The full-residency rankings therefore use no polling data more than twelve months old; the newer, smaller-applicant-pool low-residency and doctoral rankings use a four-year data set (with breakouts provided, for the low-residency rankings, for each year of data collection).

By way of comparison, the 2012 U.S. News & World Report higher education rankings use four-year-old (2007) polling data for rankings in the following ten fields of study: Audiology; Clinical Psychology; Fine Arts; Library and Information Studies; Occupational Therapy; Pharmacological Sciences; Physical Therapy; Public Affairs; Social Work; and Speech-Language Pathology. The magazine uses three-year-old (2008) polling data for rankings in these seven fields of study: Criminology; Economics; English; History; Political Science; Psychology; and Sociology. And the magazine uses two-year-old (2009) polling data for rankings in these seven fields of study: Biological Sciences; Chemistry; Computer Science; Earth Science; Mathematics; Physics; and Statistics.

Of the thirty-five fields of study ranked by U.S. News & World Report, only 31 percent were ranked using polling data as fresh as that used by Poets & Writers Magazine for the full-residency creative writing MFA rankings. Specifically, the former magazine’s rankings of degrees in Business, Education, Engineering, Healthcare Management, Law, Medicine, Nursing, Physician Assistance, Public Health, Rehabilitation Counseling, and Veterinary Medicine used polling data only a year old.

Ranking Questionnaires and Program Response Rates
Graduate degree programs in creative writing respond to rankings-related inquiries at a lower rate than do programs in almost any other field of study in the United States. This is one of several reasons these rankings do not seek to poll the opinions of program faculty and administrators as to the quality of peer programs.

The following response rates were reported for questionnaires sent pursuant to the 2012 U.S. News & World Report rankings (in each instance, the field of study is followed by the response rate from all programs polled in the field): Engineering (98 percent); Business (91 percent); Criminology (90 percent); Education (90 percent); Medicine (84 percent); Healthcare Management (76 percent); Statistics (67 percent); Law (66 percent); Public Health (61 percent); Audiology (57 percent); Library and Information Studies (56 percent); Pharmacological Sciences (56 percent); Social Work (56 percent); Occupational Therapy (53 percent); Veterinary Medicine (48 percent); Nursing (47 percent); Computer Science (46 percent); Physician Assistance (45 percent); Sociology (43 percent); Speech-Language Pathology (42 percent); Public Affairs (40 percent); Rehabilitation Counseling (40 percent); Fine Arts (39 percent); Political Science (37 percent); Economics (34 percent); Mathematics (34 percent); Physical Therapy (33 percent); English (31 percent); Physics (31 percent); Earth Sciences (29 percent); Clinical Psychology (28 percent); Chemistry (25 percent); Psychology (25 percent); History (23 percent); and Biological Sciences (15 percent). Respondent institutions in each of these academic fields were aware that their questionnaire responses would not be kept confidential, and that their participation in surveys sponsored by U.S. News & World Report would result in publication of a substantial stock of program-specific data regarding each university polled.

Every two years, the Association of Writers and Writing Programs (AWP) sends a questionnaire to programs in the field of creative writing—a field whose administrators and faculty are no longer polled by U.S. News & World Report—in much the same way U.S. News & World Report does for the thirty-five fields of study listed above. A crucial difference between the two questionnaires, however, is that the AWP questionnaire guarantees anonymity to its respondents; AWP releases no program-specific data or polling results pursuant to its biennial questionnaire. It is worth noting, too, that AWP estimates (as of its 2009 Annual Report) that 34 percent of programs in the field of creative writing have declined to become members of AWP. These programs are not subject to AWP questionnaires.

According to AWP’s publicly released summary of the program questionnaire it distributed in 2007, between 40 and 60 percent of AWP’s member programs declined to answer AWP’s queries regarding internal admissions and funding data. Specifically, 47 percent of programs declined to reveal how many assistantships they offered annually to incoming students, 61 percent declined to reveal the stipend offered to teaching assistants, 56 percent declined to reveal whether they offered a full-tuition waiver to teaching assistants, 49 percent declined to reveal how many scholarships were offered to incoming students, 55 percent declined to reveal their annual number of applicants, and 52 percent declined to reveal the size of their annual matriculating class. Compounding the incompleteness of the AWP survey was the fact that the Association did not distinguish between low-residency and full-residency programs. Given that low-residency programs do not offer teaching assistantships (as low-residency students are only on campus during brief residencies), this omission was a critical one.

As the polling cohort reached by AWP represents only 66 percent of programs in the field of creative writing, the response rates above must be modified to calculate the percent-of-total-field response rate for AWP’s questionnaire. To the six questions cited above, an average of 46.66 percent of AWP members agreed to respond; on average, then, the response rate for all creative writing programs nationally, at least as to critical funding- and admissions-data queries necessary for the tabulation of national rankings in these categories of assessment, is approximately 30 percent. This would rank the field of creative writing thirtieth out of thirty-six recognized fields of academic study in institutional transparency.
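The percent-of-total-field arithmetic above can be reproduced directly. The six decline rates are those reported from AWP's 2007 questionnaire, and the 66 percent membership figure is the one derived from AWP's 2009 Annual Report.

```python
# Reproducing the response-rate arithmetic described above.
# Decline rates for the six funding- and admissions-data questions
# on the 2007 AWP questionnaire, in percent of member programs:
declined = [47, 61, 56, 49, 55, 52]
responded = [100 - d for d in declined]  # [53, 39, 44, 51, 45, 48]

avg_member_response = sum(responded) / len(responded)  # about 46.67 percent
awp_membership = 0.66  # share of all creative writing programs belonging to AWP

# Response rate across the whole field, members and nonmembers alike:
field_wide_response = avg_member_response * awp_membership  # about 30.8 percent
print(round(avg_member_response, 2), round(field_wide_response, 1))
```

Scaling the members-only average by the 66 percent membership share yields the roughly 30 percent field-wide figure cited in the text.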

It is reasonable to assume that the AWP questionnaire would have received fewer responses than it did had it required of respondents, as did the U.S. News & World Report questionnaire, public disclosure of all individual and aggregate responses. Consequently, it is fair to conclude that, were creative writing programs queried as other programs of study are annually queried, their combined response rate would rank last among the fields of study available to graduate students in the United States. That the AWP questionnaire is proffered biennially by a nonprofit trade organization to its own constituents, whereas the U.S. News & World Report questionnaire is sent annually, as a for-profit enterprise, to unaffiliated institutions by an independent media outlet, suggests that even the conservative calculations above underestimate the opacity of the field of creative writing to inquiries from independent researchers.

Avoidance of Respondent Bias
The most extensive ranking system in the United States, the higher education rankings published annually by U.S. News & World Report, publishes assessments almost exclusively by individuals with no firsthand experience attending or teaching in the programs they are being asked to assess. For the magazine’s much-lauded law school rankings, for instance, judges, lawyers, and law firm hiring coordinators are asked to assess the academic quality of programs others have attended, and that they have encountered only to the same extent an MFA applicant encounters the graduates of individual creative writing programs in the course of his or her in-genre reading (or, alternately, in a social or professional context). In fact, all of the program quality rankings published by U.S. News & World Report use the same basic methodology, as stated in the 2012 edition of the magazine’s graduate school rankings: “Rankings…are based on the results of surveys sent to academics…[t]he individuals rated the quality of the program at each institution from marginal (1) to outstanding (5). Individuals who were unfamiliar with a particular school’s programs were asked to select ‘don’t know.’” Note that this last provision merely ensures that poll respondents have some basic familiarity with the programs they are rating; it does not ask or encourage respondents to submit an institutional self-assessment.

As is the case with the methodology described above, national educational-institution assessment schemes have historically sought out unbiased observers to assess accredited degree programs, with self-reporting of interested observers implicitly or explicitly disallowed. The present rankings improve on this model by polling individuals who not only are in a position to gauge the professional performance of individual programs’ graduates and professors, but who also have access to—and a natural interest in—a large stock of hard data regarding the programs they are being asked to evaluate.

Polling Cohort Demographics
Online polling conducted in 2010 using a Google-sponsored polling application suggests that the online MFA applicant community, including the community at The Creative Writing MFA Blog, subscribes to the current conventional wisdom (as first laid out in the 2005 edition of Tom Kealey's Creative Writing MFA Handbook) regarding the most important considerations in applying to and matriculating at an MFA program. Asked, "If you are a current creative writing MFA applicant, which of the following program features are among your top five reasons for choosing to apply to a particular program?", and given the alphabetically listed options "Alumni,” “Cost of Living,” “Curriculum,” “Duration,” “Faculty,” “Funding,” “Internship Opportunities,” “Location,” “Postgraduate Placement,” “Reputation,” “Selectivity,” “Student-to-Faculty Ratio,” “Size,” “Teaching Opportunities,” and “Other,” 909 poll respondents provided the following responses:

1. Funding (68 percent)                
2. Reputation (61 percent)
3. Location (59 percent)
4. Faculty (50 percent)
5. Teaching Opportunities (41 percent)
6. Curriculum (28 percent)
7. Cost of Living (23 percent)
8. Alumni (21 percent)
9. Duration (19 percent)
10. Size (13 percent)
11. Selectivity (13 percent)
12. Postgraduate Placement (11 percent)
13. Student-to-Faculty Ratio (10 percent)
14. Internship Opportunities (6 percent)
15. Other (5 percent)

There is substantial similarity between these poll results and the results of a 2009 poll that asked applicants, "Which of these is most important to your decision about where to apply?", with the listed options being "Location," "Funding," "Faculty," "Reputation," "Selectivity," "Curriculum," or "None of the Above" (respondents were permitted to select more than one answer). The top four answers, out of hundreds of poll responses, were identical to the top four responses in 2010:

1. Funding (56 percent)
2. Reputation (45 percent)
3. Location (32 percent)
4. Faculty (18 percent)

When applicants were asked, in 2010, “If you are a current creative writing MFA applicant, how old will you be when you begin your program, assuming you’re admitted this year?”, 1,929 poll respondents provided the following responses:

1. 23 or 24 (18 percent)
2. 25 or 26 (16 percent)
3. 21 or 22 (13 percent)
4. 27 or 28 (11 percent)
5. Older than 40 (10 percent)
6. 29 or 30 (8 percent)
7. 31 or 32 (6 percent)
8. 33 or 34 (5 percent)
9. 35 or 36 (4 percent)
10. 37 or 38 (2 percent)
11. 39 or 40 (2 percent)

These results are consistent with earlier online polling results, from 2009, suggesting that the median age of a creative writing MFA applicant is between twenty-six and twenty-seven.

Asked, “As part of your research into MFA programs, how many current or former MFA students or faculty have you spoken to?”, 686 poll respondents provided the following responses:

1. 1 to 2 (34 percent)
2. 3 to 5 (27 percent)
3. 0 (25 percent)
4. 6 to 10 (7 percent)
5. 11 or more (4 percent)

Asked, “Have you received advice from an undergraduate creative writing faculty member in applying to MFA programs?”, 860 poll respondents provided the following responses:

1. Yes (59 percent)
2. No (30 percent)
3. Not Yet, But I Plan To (10 percent)                 

The application lists of a random sampling of three hundred 2010–2011 MFA applicants were analyzed to determine the frequency of different list sizes. The results were as follows (the first number is the number of programs on an applicant’s application list, while the second is the number of such lists in the analyzed sample; the third figure is the percentage of the total sample with an application list of the stated size):

1: 10 (3 percent)
2: 6 (2 percent)
3: 10 (3 percent)
4: 18 (6 percent)
5: 23 (8 percent)
6: 30 (10 percent)
7: 26 (9 percent)
8: 31 (10 percent)
9: 31 (10 percent)
10: 29 (10 percent)
11: 24 (8 percent)
12: 15 (5 percent)
13: 14 (5 percent)
14: 14 (5 percent)
15: 7 (2 percent)
16: 4 (1 percent)
17: 2 (1 percent)
18: 4 (1 percent)
19: 0 (0 percent)
20: 0 (0 percent)
21: 1 (0 percent)
22: 1 (0 percent)
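The frequency table above can be summarized programmatically. The counts are those listed; the median and mean below are derived figures, not ones stated in the article.

```python
# Frequency table from above: {application-list size: number of applicants}.
counts = {1: 10, 2: 6, 3: 10, 4: 18, 5: 23, 6: 30, 7: 26, 8: 31,
          9: 31, 10: 29, 11: 24, 12: 15, 13: 14, 14: 14, 15: 7,
          16: 4, 17: 2, 18: 4, 21: 1, 22: 1}

total = sum(counts.values())  # 300 applicants sampled
mean = sum(size * n for size, n in counts.items()) / total

# Weighted median: the list size at which the cumulative count
# first reaches half the sample.
cumulative, median = 0, None
for size in sorted(counts):
    cumulative += counts[size]
    if cumulative >= total / 2:
        median = size
        break

print(total, round(mean, 1), median)  # 300 8.5 8
```

On these counts, the typical 2010–2011 applicant in the sample applied to roughly eight or nine programs.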

Asked, in 2010, "Why do you want to get a graduate creative writing degree?", and given the options "Credential," "Employability," "Time to Write," "Mentoring," "Networking," "Community," "Validation," "Avoid Work," and "None of the Above," with the option to select more than one answer, the top three answers, among hundreds of responses, were as follows:

1. Time to Write (55 percent)
2. Employability (43 percent)
3. Mentoring (36 percent)