2012 MFA Rankings: The Methodology

by
Seth Abramson
September/October 2011

Note: The following article refers to the rankings that appeared in the September/October 2011 issue of Poets & Writers Magazine. Our most recent coverage of MFA programs is available in the September/October 2012 issue on newsstands now.


FULL-RESIDENCY RANKINGS
Introduction

The 2012 Poets & Writers Magazine rankings comprise individual rankings for full-residency MFA, low-residency MFA, and doctoral creative writing programs. Full-residency MFA programs are assessed on the basis of eighteen measures, half of which are ordered rankings and half of which are unranked categories of important program features. While none of the nine full-residency MFA rankings-based measures can be classified as scientific, all nine are predicated upon sufficient hard data to be substantially probative. A scientific ranking of MFA or doctoral creative writing programs is not presently possible, as more than half of the nation's full- and low-residency graduate creative writing programs have thus far declined to make public the necessary data (see “Full-Residency Rankings: Ranking Questionnaires and Program Response Rates”).

Four of the nine full-residency MFA rankings are based on a survey of a large sample of current MFA applicants. The most important of the four is the ranking upon which the ordering of the programs in the table is based, a ranking that reflects individual fiction and poetry applicants’ varying esteem for the nation's 151 full-residency MFA programs, two creative writing MPW programs, and four MFA-comparable creative writing MA programs (for a discussion of the inclusion of these latter six programs, see “Non-MFA Master’s Degree Programs”). The remaining three poll-based genre rankings are essentially subsets of this first ranking to the extent that they offer data relating to various elements of the overall cohort polled: fiction applicants, poetry applicants, and nonfiction applicants (the last of which were polled for the nonfiction-genre rankings only). For the genre rankings, programs are ordered, as with the overall rankings, on the basis of the number of votes received by each full-residency program in that category. Polled respondents cast a vote by stating a present or future intent to apply to the program in question. The top fifty overall vote-getters (with two programs tied for fiftieth), along with the next twenty-five vote-getters (denominated “Honorable Mention” programs), are listed in the rankings table, also published in the September/October 2011 print edition of Poets & Writers Magazine. The remaining eighty-two full-residency programs are listed in "The Additional Rankings of Full-Residency MFA Programs."

For the genre rankings, programs ranking in the top fifty in poetry and fiction are noted in both the print and online rankings tables, as are programs ranking in the top twenty in nonfiction.

The five hard data–based rankings are as follows: funding, selectivity, fellowship placement, job placement, and student-faculty ratio. These rankings are scientific to the extent that they rank programs on the basis of quantitative data publicly released by the programs themselves, and unscientific to the extent that not every program has released data for every category of assessment. The rankings therefore constitute an ordering of all publicly known data rather than an ordering of all extant data. A full complement of funding and admissions data is available for well over half of the nation's full-residency MFA programs; the remaining programs are primarily smaller, newer, lightly advertised, or nondomestic programs, or else programs with a primarily regional applicant base. As all of these programs have websites, however, and as all of these programs exert exclusive dominion over their online presence, the absence of any specific funding or selectivity data in these programs' online promotional materials is taken, by the rankings, as an indication that these programs fully fund less than 33 percent of their students and do not have an acceptance rate low enough for inclusion in the top fifty in this category (currently, a program's yield-exclusive acceptance rate would need to be less than 9.4 percent for it to be included in the top fifty for selectivity). The rankings are based in part on the presumption that it would be counterintuitive for a program providing full funding to a substantial percentage of its student body to not indicate as much in its promotional materials. Program websites are regularly reviewed to determine whether a program has added information to its online profile; program administrators can also e-mail Poets & Writers Magazine directly to draw attention to any substantive website changes.

Based on the data presently available, it is not anticipated that any of those programs without a full complement of funding and admissions data available in some form online would have ranked in the top fifty in the funding category. These programs, given the incompleteness of their promotional materials, are also much less likely to attract sufficient applications to be eligible for the selectivity rankings; a program must receive at least eighty applications annually to be considered eligible for this ranking. As to the two placement rankings, these do not rely on programs' promotional materials or their willingness to release internal data to individual applicants or groups of applicants, so all programs nationally, both full- and low-residency, were equally eligible for a top fifty ranking in fellowship placement and job placement. Data sufficient to calculate student-faculty ratios for virtually every MFA program in the United States were also readily available.

The overlap between those programs ranked in the top fifty overall and those programs ranked in the top fifty in the other seven categories subject to ranking is significant. One hundred percent of the overall top fifty programs ranked in the top fifty in one or both of the fiction and poetry genres; ninety percent ranked in the top fifty in both poetry and fiction. In nonfiction, fifteen of the top twenty programs (75 percent) also ranked in the overall top fifty.

Thirty-eight (75 percent) of the overall top fifty ranked in the top fifty in funding, with another six (12 percent) receiving an Honorable Mention. In all, 87 percent of the top fifty full-residency programs ranked in the top fifty for funding or received an Honorable Mention in this measure of program quality. Forty-six (90 percent) of the top fifty programs ranked in the top fifty in selectivity, with thirty-seven (73 percent) ranking in the top fifty in fellowship placement (and another 4 percent earning Honorable Mention) and thirty-three (65 percent) ranking in the top fifty in job placement. Thirty-one (61 percent) of the top fifty programs ranked in the top fifty in student-faculty ratio, with another six (12 percent) earning an Honorable Mention in the category.

In view of the above, ordering programs on the basis of their overall vote totals also had the effect of placing a special emphasis, in the rankings, on those programs that placed highest in the five hard data rankings.

By definition, statistics-based quality assessment schemes of any kind—whether in the form of rankings or otherwise, whether in the field of education or any other field—perform poorly when it comes to assessing unquantifiable program features and/or features that can only be assessed subjectively by an individual experiencing them firsthand. That such features are most assuredly a vital element of every graduate creative writing program does not and should not preclude the possibility of statistics-based quality assessment schemes operating alongside the primary mechanism programs have to introduce applicants to unique curricular features: their own websites and promotional material. Ultimately, programs bear primary responsibility for educating prospective applicants regarding program offerings, and no ranking system of any kind could or should usurp this responsibility—especially as no applicant applies to a program without first researching it. Indeed, programs at all times have a captive audience of hundreds if not thousands for their online promotional materials. Some programs may worry that neither these rankings nor any ranking system devisable could adequately capture each graduate creative writing program’s most distinctive elements; these fears are understandable, but misplaced. Those applicants polled for this ranking had access to individual program websites at all times before, during, and after their formation of a list of where they intended to apply.

If an unquantifiable program feature of which a program is singularly proud does not lead to that program excelling in these rankings, it is neither because these rankings did not consider that feature nor because the applicants polled for these rankings did not. Instead, it is because that feature is less popular among applicants than it is among those program faculty and administrators who originally brainstormed and designed it. By polling current applicants rather than individual program faculties and administrators, the rankings ensure that the class of persons with the most say in these rankings is one that is likely to have recently accessed the very program websites that (presumably) prominently advertise those features of which programs are most proud. Faculty and administrators at one program are highly unlikely to be trawling the program websites of other programs on a regular basis; consequently, they are far less likely to be aware of competitors’ idiosyncratic, online-advertised features.

The methodology for these rankings was devised with the following goals in mind: (1) To better inform applicants of their program options by offering the nation’s first complete listing of graduate creative writing programs; (2) to more widely disseminate hard data regarding objective measures of program quality, which often play a role in applicants’ application and matriculation decisions; (3) to be responsive to, and reflective of, the opinions, interests, concerns, and values of current applicants to graduate creative writing programs; (4) to emphasize the importance of inputs as well as outputs in assessing the overall quality of individual graduate creative writing programs; and (5) to enhance the transparency of the admissions process for present and future applicants to graduate creative writing programs.

Polling Locus
In the twelve months between April 16, 2010, and April 15, 2011, 640 full-residency MFA applicants were surveyed on the highest-trafficked MFA-related website on the Internet, The Creative Writing MFA Blog. Founded on August 21, 2005, this website received approximately 395,000 unique visitors during the polling period (approximately 252,000 first-time visitors and 142,000 returning visitors), along with approximately 635,000 page-loads.

The Creative Writing MFA Blog is a free, public, lightly-moderated discussion blog whose only requirement for viewing is access to a computer; active participation on the board requires a Google account. The site is run by American novelist Tom Kealey and a team of more than seventeen designated moderators, approximately five of whom are active at any one time. The author of this article was a moderator at The Creative Writing MFA Blog for a portion of the above-referenced polling period. Kealey himself was not an active moderator during this period. The Creative Writing MFA Blog has no stated agenda other than to provide accurate and timely information about MFA programs to current and prospective applicants. A sister website, The Creative Writing PhD Blog, caters specifically to doctoral creative writing applicants; as this website is only prominently linked to from The Creative Writing MFA Blog itself, it is here considered an adjunct of the latter website (and indeed, the web address for the Creative Writing PhD Blog is merely the web address of the Creative Writing MFA Blog with a “2” appended at the end).

Tom Kealey, the proprietor of The Creative Writing MFA Blog, is also the author of the top-selling MFA-related book in the United States, per Amazon sales statistics recorded during the polling period. This book, The Creative Writing MFA Handbook (Continuum, 2008), prominently features the web addresses for The Creative Writing MFA Blog and The Suburban Ecstasies. Consequently, even those who conducted their MFA research via print publications were arguably likely to come across the web address for The Creative Writing MFA Blog during the course of their reading. Indeed, as Kealey's book is the only print publication on the American or international market that profiles individual full-residency MFA programs in detail, it has become nearly ubiquitous in the MFA applicant community. As nearly all applicants to creative writing doctoral programs were once MFA applicants, a similarly high degree of familiarity with The Creative Writing MFA Blog may be presumed among this sub-community as well.

Individual users on The Creative Writing MFA Blog were distinguished by their user accounts, and substantial additional measures were taken to prevent duplicate submissions. During the polling period the number of individual accounts active on The Creative Writing MFA Blog was between 1,000 and 1,250, which suggests that the present polling's 640-person cohort represents between one-half and two-thirds of all active patrons on the site during the twelve-month polling period in question. The presence of an unknown number of nonposting members on the site helps explain the high unique visitor count cited above, as well as the fact that even the most respected stat-counter services will sometimes read returning users as first-time users, depending upon an individual user's privacy settings with respect to IP-recognition “cookies.”

Polled applicants were asked to list the programs to which they had applied or intended to apply, and were permitted to adjust these lists during the polling period. Fewer than 10 percent of poll respondents elected to do so.
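
Because respondents could revise their lists mid-cycle, a last-submission-wins reconciliation of the kind sketched below is implied. This is a minimal illustration only, assuming each respondent is keyed to a unique account name; the account names, program lists, and tallying step are hypothetical, not the rankings' actual code.

    from collections import Counter

    # Each submission pairs an account with its application list; a later
    # submission from the same account replaces the earlier one (fewer than
    # 10 percent of respondents revised their lists).
    submissions = [
        ("applicant_a", ["Iowa", "Michigan", "Cornell"]),
        ("applicant_b", ["Brown", "Syracuse"]),
        ("applicant_a", ["Iowa", "Michigan", "Virginia"]),  # a revised list
    ]

    final_lists = {}
    for account, programs in submissions:
        final_lists[account] = programs  # last submission wins

    # One vote per program per respondent.
    votes = Counter(p for programs in final_lists.values() for p in programs)
    print(votes.most_common())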

Period of Data Collection
Only recently collected data is of use to applicants. The full-residency rankings therefore use no polling data more than twelve months old, while the newer, smaller-applicant-pool low-residency and doctoral rankings use a four-year data-set (with breakouts provided, as to the low-residency rankings, for each year of data collection).

By way of comparison, the 2012 U.S. News & World Report higher education rankings use four-year-old (2007) polling data for rankings in the following ten fields of study: Audiology; Clinical Psychology; Fine Arts; Library and Information Studies; Occupational Therapy; Pharmacological Sciences; Physical Therapy; Public Affairs; Social Work; and Speech-Language Pathology. The magazine uses three-year-old (2008) polling data for rankings in these seven fields of study: Criminology; Economics; English; History; Political Science; Psychology; and Sociology. And the magazine uses two-year-old (2009) polling data for rankings in these seven fields of study: Biological Sciences; Chemistry; Computer Science; Earth Science; Mathematics; Physics; and Statistics.

Of the thirty-five fields of study ranked by U.S. News & World Report, only 31 percent were ranked using polling data as fresh as that used by Poets & Writers Magazine for the full-residency creative writing MFA rankings. Specifically, the former magazine’s rankings of degrees in Business, Education, Engineering, Healthcare Management, Law, Medicine, Nursing, Physician Assistance, Public Health, Rehabilitation Counseling, and Veterinary Medicine used polling data only a year old.

Ranking Questionnaires and Program Response Rates
Graduate degree programs in creative writing respond to rankings-related inquiries at a lower rate than do programs in almost any other field of study in the United States. This is one of several reasons these rankings do not seek to poll the opinions of program faculty and administrators as to the quality of peer programs.

The following response rates were reported for questionnaires sent pursuant to the 2012 U.S. News & World Report rankings (in each instance, the field of study is followed by the response rate from all programs polled in the field): Engineering (98 percent); Business (91 percent); Criminology (90 percent); Education (90 percent); Medicine (84 percent); Healthcare Management (76 percent); Statistics (67 percent); Law (66 percent); Public Health (61 percent); Audiology (57 percent); Library and Information Studies (56 percent); Pharmacological Sciences (56 percent); Social Work (56 percent); Occupational Therapy (53 percent); Veterinary Medicine (48 percent); Nursing (47 percent); Computer Science (46 percent); Physician Assistance (45 percent); Sociology (43 percent); Speech-Language Pathology (42 percent); Public Affairs (40 percent); Rehabilitation Counseling (40 percent); Fine Arts (39 percent); Political Science (37 percent); Economics (34 percent); Mathematics (34 percent); Physical Therapy (33 percent); English (31 percent); Physics (31 percent); Earth Sciences (29 percent); Clinical Psychology (28 percent); Chemistry (25 percent); Psychology (25 percent); History (23 percent); and Biological Sciences (15 percent). Respondent institutions in each of these academic fields were aware that their questionnaire responses would not be kept confidential, and that their participation in surveys sponsored by U.S. News & World Report would result in publication of a substantial stock of program-specific data regarding each university polled.

Every two years, the Association of Writers and Writing Programs (AWP) sends a questionnaire to programs in the field of creative writing—a field whose administrators and faculty are no longer polled by U.S. News & World Report—in much the same way U.S. News & World Report does for the thirty-five fields of study listed above. A crucial difference between the two questionnaires, however, is that the AWP questionnaire guarantees anonymity to its respondents; AWP releases no program-specific data or polling results pursuant to its biennial questionnaire. It is worth noting, too, that AWP estimates (as of its 2009 Annual Report) that 34 percent of programs in the field of creative writing have declined to become members of AWP. These programs are not subject to AWP questionnaires.

According to AWP’s publicly-released summary of the program questionnaire it distributed in 2007, between 40 and 60 percent of AWP’s member programs declined to answer AWP’s queries regarding internal admissions and funding data. Specifically, 47 percent of programs declined to reveal how many assistantships they offered annually to incoming students, 61 percent declined to reveal the stipend offered to teaching assistants, 56 percent declined to reveal whether they offered a full-tuition waiver to teaching assistants, 49 percent declined to reveal how many scholarships were offered to incoming students, 55 percent declined to reveal their annual number of applicants, and 52 percent declined to reveal the size of their annual matriculating class. Compounding the incompleteness of the AWP survey was the fact that the Association did not distinguish between low-residency and full-residency programs. Given that low-residency programs do not offer teaching assistantships (as low-residency students are only on campus during brief residencies), this omission was a critical one.

As the polling cohort reached by AWP represents only 66 percent of programs in the field of creative writing, the response rates above must be modified to calculate the percent-of-total-field response rate for AWP's questionnaire. An average of 46.66 percent of AWP member programs responded to the six questions cited above; on average, then, the response rate for all creative writing programs nationally, at least as to the critical funding- and admissions-data queries necessary for the tabulation of national rankings in these categories of assessment, is approximately 30 percent. This would rank the field of creative writing thirtieth out of thirty-six recognized fields of academic study in institutional transparency.
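
The 30 percent figure can be reproduced as follows; the sketch below simply averages the six member response rates implied above and scales the result by AWP's 66 percent membership share.

    # Response rates implied by the six disclosure questions cited above
    # (each is 100 minus the "declined to reveal" percentage).
    declined = [47, 61, 56, 49, 55, 52]
    responded = [100 - d for d in declined]            # [53, 39, 44, 51, 45, 48]
    avg_member_rate = sum(responded) / len(responded)  # 46.66...
    field_rate = avg_member_rate * 0.66                # ~30.8 percent of the field
    print(round(avg_member_rate, 2), round(field_rate, 1))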

It is reasonable to assume that the AWP questionnaire would have received fewer responses than it did had it required of respondents, as did the U.S. News & World Report questionnaire, public disclosure of all individual and aggregate responses. Consequently, it is fair to conclude that, were creative writing programs queried as other programs of study are annually queried, their combined response rate would rank last of any field of study available to graduate students in the United States. That the AWP questionnaire is proffered biennially by a nonprofit trade organization to its own constituents, whereas the U.S. News & World Report questionnaire is an annual for-profit enterprise sent to unaffiliated institutions by an independent media outlet, suggests that even the conservative calculations above underestimate the opacity of the field of creative writing to inquiries from independent researchers.

Avoidance of Respondent Bias
The most extensive ranking system in the United States, the higher education rankings published annually by U.S. News & World Report, publishes assessments almost exclusively by individuals with no firsthand experience attending or teaching in the programs they are being asked to assess. For the magazine’s much-lauded law school rankings, for instance, judges, lawyers, and law firm hiring coordinators are asked to assess the academic quality of programs others have attended, and that they have encountered only to the same extent an MFA applicant encounters the graduates of individual creative writing programs in the course of his or her in-genre reading (or, alternately, in a social or professional context). In fact, all of the program quality rankings published by U.S. News & World Report use the same basic methodology, as stated in the 2012 edition of the magazine’s graduate school rankings: “Rankings…are based on the results of surveys sent to academics…[t]he individuals rated the quality of the program at each institution from marginal (1) to outstanding (5). Individuals who were unfamiliar with a particular school’s programs were asked to select ‘don’t know.’” Note that this last provision merely ensures that poll respondents have some basic familiarity with the programs they are rating; it does not ask or encourage respondents to submit an institutional self-assessment.

As is the case with the methodology described above, national educational-institution assessment schemes have historically sought out unbiased observers to assess accredited degree programs, with self-reporting of interested observers implicitly or explicitly disallowed. The present rankings improve on this model by polling individuals who not only are in a position to gauge the professional performance of individual programs’ graduates and professors, but who also have access to—and a natural interest in—a large stock of hard data regarding the programs they are being asked to evaluate.

Polling Cohort Demographics
Online polling conducted in 2010 using a Google-sponsored polling application suggests that the online MFA applicant community, including the community at The Creative Writing MFA Blog, subscribes to the current conventional wisdom (as first laid out in the 2005 edition of Tom Kealey's Creative Writing MFA Handbook) regarding the most important considerations in applying to and matriculating at an MFA program. Asked, "If you are a current creative writing MFA applicant, which of the following program features are among your top five reasons for choosing to apply to a particular program?", and given the alphabetically-listed options "Alumni,” “Cost of Living,” “Curriculum,” “Duration,” “Faculty,” “Funding,” “Internship Opportunities,” “Location,” “Postgraduate Placement,” “Reputation,” “Selectivity,” “Size,” “Student-to-Faculty Ratio,” “Teaching Opportunities,” and “Other,” 909 poll respondents provided the following responses:

1. Funding (68 percent)                
2. Reputation (61 percent)
3. Location (59 percent)
4. Faculty (50 percent)
5. Teaching Opportunities (41 percent)
6. Curriculum (28 percent)
7. Cost of Living (23 percent)
8. Alumni (21 percent)
9. Duration (19 percent)
10. Size (13 percent)
11. Selectivity (13 percent)
12. Postgraduate Placement (11 percent)
13. Student-to-Faculty Ratio (10 percent)
14. Internship Opportunities (6 percent)
15. Other (5 percent)

There is substantial similarity between these poll results and the results of a 2009 poll that asked applicants, "Which of these is most important to your decision about where to apply?", with the listed options being "Location," "Funding," "Faculty," "Reputation," "Selectivity," "Curriculum," or "None of the Above" (respondents were permitted to select more than one answer). The top four answers, out of hundreds of poll responses, were identical to the top four responses in 2010:

1. Funding (56 percent)
2. Reputation (45 percent)
3. Location (32 percent)
4. Faculty (18 percent)

When applicants were asked, in 2010, “If you are a current creative writing MFA applicant, how old will you be when you begin your program, assuming you’re admitted this year?”, 1,929 poll respondents provided the following responses:

1. 23 or 24 (18 percent)
2. 25 or 26 (16 percent)
3. 21 or 22 (13 percent)
4. 27 or 28 (11 percent)
5. Older than 40 (10 percent)
6. 29 or 30 (8 percent)
7. 31 or 32 (6 percent)
8. 33 or 34 (5 percent)
9. 35 or 36 (4 percent)
10. 37 or 38 (2 percent)
11. 39 or 40 (2 percent)

These results are consistent with earlier online polling results, from 2009, suggesting that the median age of a creative writing MFA applicant is between twenty-six and twenty-seven.

Asked, “As part of your research into MFA programs, how many current or former MFA students or faculty have you spoken to?”, 686 poll respondents provided the following responses:

1. 1 to 2 (34 percent)
2. 3 to 5 (27 percent)
3. 0 (25 percent)
4. 6 to 10 (7 percent)
5. 11 or more (4 percent)

Asked, “Have you received advice from an undergraduate creative writing faculty member in applying to MFA programs?”, 860 poll respondents provided the following responses:

1. Yes (59 percent)
2. No (30 percent)
3. Not Yet, But I Plan To (10 percent)                 

The application lists of a random sampling of three hundred 2010–2011 MFA applicants were analyzed to determine the frequency of different list sizes. The results were as follows (the first number is the number of programs on an applicant’s application list, while the second is the number of such lists in the analyzed sample; the third figure is the percentage of the total sample with an application list of the stated size):

1: 10 (3 percent)
2: 6 (2 percent)
3: 10 (3 percent)
4: 18 (6 percent)
5: 23 (8 percent)
6: 30 (10 percent)
7: 26 (9 percent)
8: 31 (10 percent)
9: 31 (10 percent)
10: 29 (10 percent)
11: 24 (8 percent)
12: 15 (5 percent)
13: 14 (5 percent)
14: 14 (5 percent)
15: 7 (2 percent)
16: 4 (1 percent)
17: 2 (1 percent)
18: 4 (1 percent)
19: 0 (0 percent)
20: 0 (0 percent)
21: 1 (0 percent)
22: 1 (0 percent)
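
For readers who wish to summarize the distribution above, the following sketch reproduces it and computes the mean and median list size (roughly 8.5 and 8 programs, respectively); the counts are taken directly from the table.

    from statistics import mean, median

    # list size -> number of applicants in the 300-person sample
    counts = {1: 10, 2: 6, 3: 10, 4: 18, 5: 23, 6: 30, 7: 26, 8: 31, 9: 31,
              10: 29, 11: 24, 12: 15, 13: 14, 14: 14, 15: 7, 16: 4, 17: 2,
              18: 4, 21: 1, 22: 1}
    sizes = [size for size, n in counts.items() for _ in range(n)]
    assert len(sizes) == 300
    print(round(mean(sizes), 1))  # mean list size: ~8.5 programs
    print(median(sizes))          # median list size: 8 programs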

Asked, in 2010, "Why do you want to get a graduate creative writing degree?", and given the options "Credential," "Employability," "Time to Write," "Mentoring," "Networking," "Community," "Validation," "Avoid Work," and "None of the Above," with the option to select more than one answer, the top three answers, among hundreds of responses, were as follows:

1. Time to Write (55 percent)
2. Employability (43 percent)
3. Mentoring (36 percent)

The Poets & Writers Magazine rankings do not use the above polling data to create a weighting system for the overall rankings. There is a presumption that applicants' own application lists best reflect the extent to which they take into account funding, location, reputation, selectivity, faculty, curriculum, and other applicant-specific factors in choosing which programs to apply to and attend.

Were the above data used to create a weighting system for the rankings, many of the nation's most prominent and popular programs would drop from the top fifty rankings altogether. The result would be a series of rankings that poorly reflected the present national consensus on program quality. For instance, under the rankings' current methodology a popular but largely unfunded MFA program in a major urban center might still appear in the top fifty rankings because even a low standing in the funding, selectivity, student-faculty, fellowship placement, and job placement categories can be counterbalanced by a program's popularity due to location and/or faculty. The popularity of a program's location and faculty is best reflected by privileging applicants' application lists rather than a confluence of these lists and publicly-accessible hard data. To redesign the overall rankings as something other than a direct reflection of current applicant mores would be to ensure that virtually no not-fully-funded and/or big-city programs (with only a handful of exceptions) would appear in the overall top fifty rankings, including the many not-fully-funded programs whose appeal lies in large part in the composition of their faculty rosters.

While it’s fair to assume that program popularity going forward may be directly affected by a high or low standing in the funding, selectivity, fellowship-placement, and job-placement rankings, the pace of this trend is arrested, rather than hastened, by the current ranking methodology. If the top fifty programs were ordered according to a weighted ranking system that calculated hard data regarding program funding, selectivity, student-faculty ratio, fellowship placement, and job placement, most large urban programs would drop from the top fifty rankings immediately. The present methodology registers the relative decline or stagnation in the popularity of certain such programs while allowing these programs to improve their funding, selectivity, student-faculty ratio, and placement statistics before losing their positions, by applicant consensus, in the top fifty altogether.

Genre of Poll Respondents
Asked in 2010, using a Google-sponsored polling application, “What is your primary genre?”, 701 poll respondents from The Creative Writing MFA Blog provided the following responses:

1. Fiction (53 percent)
2. Poetry (28 percent)
3. Nonfiction (15 percent)
4. Other (2 percent)

Isolating only the 578 poetry and fiction respondents to the above poll question, the results are as follows:

1. Fiction (65 percent)
2. Poetry (35 percent)

This suggests that the potential polling cohort at The Creative Writing MFA Blog is similar in its constitution, in terms of genre affiliation, to the national MFA-applicant cohort. Hard data from twenty MFA programs with available admissions data for both genres (constituting a total of twenty-four data-sets ranging in age from the 2008–2009 admissions cycle to the 2010–2011 admissions cycle) generates a total data-set of 12,368 applicants, 8,730 of these being fiction applicants (70 percent) and 3,638 poetry applicants (30 percent).

This hard data also bears a close statistical resemblance to the actual polling cohort for the 2012 full-residency MFA rankings: during the time period discussed above, 635 MFA applicants with a declared genre were polled for these rankings, including 398 in fiction (63 percent) and 237 in poetry (37 percent). This said, the disparity between the genre demographics of the actual polling cohort and applicants’ genre demographics as recorded by MFA programs suggests that poets were marginally more likely to participate in the present rankings’ program-popularity polling than were fiction writers. Specifically, the actual polling cohort contains 10 percent fewer fiction writers than a fully representative sample would have included. While some substantial portion of this disparity is explained by the margin of error presumptively generated by Google-sponsored polling apparatuses and the relative paucity of two-genre statistical breakouts (only 10 percent of all full-residency MFA programs have such data available from any of the past three admissions cycles), it is nevertheless clear that the present polling under-samples fiction writers by approximately 5 percent. Analysis of existing polling data reveals, however, that such under-sampling has a negligible effect on the final overall rankings. Even those few programs that are substantially more popular among fiction writers than poets could expect to see at most only a one- to three-spot increase in their overall ranking were the actual polling cohort for the rankings fully representative of the national applicant pool in terms of genre affiliation.
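
The genre shares discussed above can be recomputed directly from the cited counts; the sketch below shows that the polled cohort is roughly 63 percent fiction against roughly 71 percent in the program-reported data, a relative shortfall on the order of the "10 percent fewer fiction writers" figure.

    # Genre shares: polled cohort vs. program-reported hard data.
    polled = {"fiction": 398, "poetry": 237}        # 635 polled applicants
    reported = {"fiction": 8730, "poetry": 3638}    # 12,368 reported applicants

    for label, cohort in (("polled", polled), ("reported", reported)):
        total = sum(cohort.values())
        shares = ", ".join(f"{g}: {n / total:.1%}" for g, n in cohort.items())
        print(label, shares)

    # Relative under-representation of fiction writers:
    print(1 - (398 / 635) / (8730 / 12368))  # ~0.11, i.e., roughly 10 percent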

Of course, no full-residency graduate creative writing program advertises its fiction, poetry, or nonfiction tracks as being of any more or less importance to its reputation than any other genre track, so an equally valid methodology for the fiction- and poetry-focused overall rankings would be to use a polling demographic equally divided between fiction writers and poets. The present methodology essentially charts a middle course between this option and a fully national-pool-representative polling cohort.

Poll respondents for the 2012 Poets & Writers Magazine rankings were self-selected, and it is the particular and express design of the rankings’ methodology that the polling cohort for these rankings be self-selected. Just as a poll aimed at determining popular car manufacturers might use a self-selecting polling cohort to compile only the responses of the best-researched car buyers—for instance, those who had spent time on websites that allow consumers to compare various available car brands and styles—these rankings do not intend to sample a generic cohort of MFA applicants. Instead, it is the aim of the rankings to primarily if not exclusively catalogue application decisions made by the best-researched MFA applicants (see “Full-Residency Rankings: Polling Locus”).

National Full-Residency Applicant Pool Size
The frequency with which each full-residency MFA program appeared on polled fiction and poetry applicants' application lists was determined by dividing the number of votes for a particular program in both fiction and poetry by the total number of applicants in these two genres polled during the 2010–2011 admissions cycle (640). Because recent applicant-pool hard data is available for thirty-nine full-residency MFA programs, a function of these two data-points was used to estimate the total national full-residency applicant pool in fiction and poetry for the 2010–2011 admission cycle. While such an extrapolation presumes that the users of The Creative Writing MFA Blog were and are demographically similar to those individuals who did not use The Creative Writing MFA Blog to research programs during the polling period (and that those who cast votes on The Creative Writing MFA Blog were demographically similar to those who were patrons but did not), this sampling was used because (1) demographic data for all full- and low-residency applicants is not known or knowable, and (2) there is no particular reason to suspect dramatic demographic differences between the various subgroups cited above, as The Creative Writing MFA Blog is a public website easily accessible by networked computer. Likewise, because user accounts allow website patrons to manage the amount of personal information they release to the public, there is no particular reason for any subset of applicants to feel chilled from casting a vote for whichever programs they favor. While the general tenor of discourse on The Creative Writing MFA Blog is consistent with the applicant-demographic polling described above—for instance, the blog constitutes a community that generally favors more selective over less selective programs, higher-ranked programs over lower-ranked ones, programs with better student-faculty ratios over those with worse, programs in cities and towns popular among younger Americans versus those in less talked-about locales, funded over unfunded programs, and programs with a longer duration over those with a shorter one—these attitudes are consistent with that present conventional wisdom expounded upon at length in most media accounts of the creative writing MFA, as well as the sort of advice about important program features that college creative writing professors give to their MFA-aspirant students every day. There appears to be nothing remarkable about the demographics of those who patronize a free, public, lightly-moderated website like The Creative Writing MFA Blog.

In an open letter published online in 2009, the Association of Writers and Writing Programs (AWP) estimated that there are "more than 13,000 applicants to [full- and low-residency] MFA programs each year." Data collected for the 2012 Poets & Writers Magazine rankings indicate that this estimate is almost certainly incorrect. While no one knows for certain the total number of applicants annually to full-residency programs in the United States, based on the available data the present median estimate for this figure is 2,797. The mean estimate is 3,253; subtracting the two lowest and two highest outliers from the thirty-nine program-based data-points available results in an “adjusted mean” of 3,042. Similar calculations conducted during the 2009–2010 admissions cycle produced similar results: a median of 3,116; a mean of 3,478; and an adjusted mean (dropping the one lowest and one highest outlier from a data-set including sixty-three program-based data-points) of 3,276. These numbers suggest that the 2011 Poets & Writers Magazine rankings polled more than 15 percent of the annual applicant pool to full-residency MFA programs, and that the 2012 Poets & Writers Magazine rankings have now polled more than 20 percent of this population.

As noted in the introductory article for the 2011 Poets & Writers Magazine rankings ("The MFA Revolution: Ranking the Nation's Best Programs," Poets & Writers Magazine, September/October 2010), while the Poets & Writers Magazine rankings are not scientific, they are probative. Whereas scientific rankings (which require demographic data that is, in this case, unavailable both to independent researchers and national trade organizations) traditionally poll, at the state level, well less than a hundredth of one percent of their target population, and national polls typically sample well less than a thousandth of one percent, the sample size for the 2012 Poets & Writers Magazine rankings, in a nod to the necessarily unscientific nature of the polling, is between 2,000 and 20,000 times larger as a percentage of population.
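
The "2,000 to 20,000 times larger" figure is straightforward arithmetic on the sampling fractions just described, as the sketch below shows.

    # The 2012 rankings sampled roughly 20 percent of the estimated applicant
    # pool, versus ceilings of 0.01 percent (state-level polls) and 0.001
    # percent (national polls) for scientific polling.
    pw_fraction = 0.20
    print(round(pw_fraction / 0.0001))   # 2,000x the state-level fraction
    print(round(pw_fraction / 0.00001))  # 20,000x the national fraction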

To arrive at this year’s national applicant-pool estimates, the following equation was used:

(640 divided by the number of fiction and poetry votes received by a program in 2010–2011 applicant polling) multiplied by (the number of fiction and poetry applicants reported by that program during the 2010–2011 admissions cycle)

Using the equation above, it was determined that twenty-six of the thirty-nine program-data-based estimates for the annual full-residency applicant pool (67 percent) fell within approximately 1,000 applicants of the above-cited median of 2,797, and thirty-two estimates (82 percent) fell within 1,500.
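
Expressed as code, the extrapolation looks as follows; this is a sketch, and the function name and the illustrative figures in the example are hypothetical.

    POLLED_COHORT = 640  # fiction and poetry applicants polled in 2010-2011

    def estimate_national_pool(votes, reported_applicants):
        # Scale one program's reported applicant count by the inverse of its
        # share of the polled cohort.
        return (POLLED_COHORT / votes) * reported_applicants

    # A program named by 80 of the 640 polled applicants that reported 350
    # fiction and poetry applicants would imply a national pool of 2,800.
    print(estimate_national_pool(80, 350))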

The popularity of programs whose extrapolated national-applicant-pool estimates significantly exceed the adjusted mean may well be undertabulated (ranked lower than what they would have been had the entire national applicant pool been polled) by the polling done for the 2012 Poets & Writers Magazine rankings; conversely, programs whose extrapolated applicant-pool estimates fall significantly below the adjusted mean may well be overtabulated (ranked higher than what they would have been had the entire national applicant pool been polled). These undertabulations and overtabulations are not random; they reflect the fact that those applicants less likely to have been exposed to the present conventional wisdom regarding MFA applications on sites like The Creative Writing MFA Blog are consequently more likely to apply to short-duration, poorly-funded programs in high cost-of-living urban areas. The current conventional wisdom among the online-researching MFA applicant community is that it is advisable to apply to longer-duration, well-funded programs in lower cost-of-living areas. To the extent the polling conducted at The Creative Writing MFA Blog favors better-funded programs or programs that boast some of the qualities referenced earlier in this section, this statistical phenomenon is a conscious mirror-imaging of the general preferences of the best-researched MFA applicants, not an inadvertent byproduct of the rankings' methodology.

Of the ten programs listed below with the highest upward deviation from the adjusted mean—that is, the programs most likely to have been undertabulated by this ranking—not one is fully funded. Only one fully funds even half of its students. Only two of the ten are three years in duration; one, in fact, is only a single year in duration. More than half are located in one of four high-cost-of-living locales: Boston, New York City, San Francisco, and Washington, D.C.

Of the twenty programs with the largest downward deviation from the adjusted mean—the programs most likely to have been overtabulated by this ranking—eighteen (90 percent) are fully funded. Of the remaining two programs, one is 70 percent fully funded, and one is an Honorable Mention selection in the funding category and has of late widely advertised its ongoing push to better fund matriculants. Of these twenty programs, nearly two-thirds (65 percent) are three years in duration, and two others (10 percent) offer a substantial percentage of each graduating class third-year funding opportunities. Fifteen of the programs (75 percent) are at public universities.

Below are national applicant-pool estimates, derived from the polling data, for the thirty-nine programs with available annual admissions statistics for 2011, ranked from the lowest estimates for the national full-residency applicant pool in fiction and poetry (programs more likely to be overtabulated) to the highest (programs more likely to be undertabulated). To reiterate, the number in parentheses represents an estimate of the total full-residency MFA applicant pool for 2010–2011 based on the percentage of the 640 MFA applicants polled for the Poets & Writers Magazine rankings who voted for that school and the actual number of applications the school reportedly received.

McNeese State University in Lake Charles, Louisiana (1,176)
Virginia Polytechnic Institute in Blacksburg, Virginia (1,580)
Purdue University in West Lafayette, Indiana (1,697)
University of Minnesota in Minneapolis (1,758) *
University of South Carolina in Columbia (1,768)
University of California in San Diego (1,824)
Louisiana State University in Baton Rouge (1,874)
University of Miami in Florida (1,920)
University of Colorado in Boulder (1,996) *
Indiana University in Bloomington (2,116)
Cornell University in Ithaca, New York (2,201)
Arizona State University in Tempe (2,352)
Johns Hopkins University in Baltimore, Maryland (2,369)
University of Iowa in Iowa City (2,424)
University of Florida in Gainesville (2,600)
Syracuse University in New York (2,601)
University of Nevada in Las Vegas (2,618)
University of Wisconsin in Madison (2,640)
Hollins University in Roanoke, Virginia (2,685) *
University of Mississippi in Oxford (2,797)
Brown University in Providence, Rhode Island (2,800)
University of Virginia in Charlottesville (2,869)                                                                      
Pennsylvania State University in University Park (2,891) *
University of Michigan in Ann Arbor (3,104)
University of Texas in Austin (3,220)
University of Wyoming in Laramie (3,303) *
University of Oregon in Eugene (3,469)
University of California in Irvine (3,491)
Georgia College & State University in Milledgeville (3,576)
New York University in New York City (4,106)
University of Maryland in College Park (4,185)
Colorado State University in Fort Collins (4,221)
University of Montana in Missoula (4,292) *
Hunter College in New York City (4,364) *
George Mason University in Fairfax, Virginia (5,007) *
Brooklyn College in New York City (5,493)
Boston University in Massachusetts (5,847)
North Carolina State University in Raleigh (7,200)                                  
San Francisco State University in California (10,435)

* = Publicly-released applicant-pool data included three genres. A two-genre estimate—using the national-average 6:3:2 distribution of fiction, poetry, and nonfiction applications—has been used to generate this extrapolated figure.
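
For the starred programs, the two-genre figure can be derived as in the sketch below, which simply scales the reported three-genre total by 9/11 per the 6:3:2 distribution (the example total is hypothetical).

    # 6:3:2 fiction:poetry:nonfiction means fiction and poetry together
    # account for 9 of every 11 applicants.
    def two_genre_estimate(three_genre_total):
        return three_genre_total * 9 / 11

    print(round(two_genre_estimate(1100)))  # e.g., 1,100 reported -> 900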

The variation in the figures above reflects the differing practices of applicants who conduct substantial research into programs via online MFA-applicant communities and those who do not. The list reflects that, for example, Boston University in Massachusetts is probably more popular among the total national applicant pool than it is among the 640 users polled on The Creative Writing MFA Blog. That the Iowa Writers' Workshop, whose reputation and name-recognition in the field of graduate creative writing is the most likely of any program to be equivalent across all applicant groups, is only 350 or so applicants off the median national applicant-pool estimate of 2,797 suggests that the Writers' Workshop was one of the most "neutrally-tabulated" programs in these rankings—as no obvious reason exists for individual groups of applicants to be more or less familiar with the much-lauded seventy-five-year-old program. Other programs within one standard deviation of the median include the well-known, longstanding programs at Brown University (0 percent off the median), University of Virginia (3 percent), Hollins University (4 percent), Syracuse University (7 percent), and University of Michigan (11 percent).

The mean for the above estimates is 3,253, and the “adjusted mean” (calculated by dropping the two lowest and two highest outliers) is 3,042.
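
These summary statistics, and the dispersion counts cited earlier, can be recomputed from the thirty-nine listed estimates; in the sketch below a band of 1,040 stands in for the "approximately 1,000" framing used above.

    from statistics import mean, median

    estimates = [1176, 1580, 1697, 1758, 1768, 1824, 1874, 1920, 1996, 2116,
                 2201, 2352, 2369, 2424, 2600, 2601, 2618, 2640, 2685, 2797,
                 2800, 2869, 2891, 3104, 3220, 3303, 3469, 3491, 3576, 4106,
                 4185, 4221, 4292, 4364, 5007, 5493, 5847, 7200, 10435]

    print(median(estimates))                     # 2797
    print(round(mean(estimates)))                # 3253
    print(round(mean(sorted(estimates)[2:-2])))  # adjusted mean: 3042
    print(sum(abs(e - 2797) <= 1040 for e in estimates))  # 26 within ~1,000
    print(sum(abs(e - 2797) <= 1500 for e in estimates))  # 32 within 1,500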

As the annual applicant-pool estimates provided above relate only to fiction and poetry applications, the traditional 6:3:2 genre ratio (see “Genre Rankings: Cohort”; approximately 18.1 percent of all MFA applicants nationally are nonfiction applicants, though among the applicant community at The Creative Writing MFA Blog this figure is 15.3 percent) can be used to estimate the median and mean number of nonfiction applicants per annum: 622 (the median; in 2010 this was 567), 723 (the mean; in 2010 this was 632), and 676 (the adjusted mean; in 2010 this was 596). These figures are derived directly from the median, mean, and adjusted mean calculations for full-residency fiction and poetry programs recited above.
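
Under the 6:3:2 distribution, nonfiction applicants number two for every nine fiction-and-poetry applicants, so each figure above is simply the corresponding two-genre estimate multiplied by 2/9:

    # Nonfiction estimates from the two-genre (fiction and poetry) pools.
    for fic_poe_pool in (2797, 3253, 3042):  # median, mean, adjusted mean
        print(round(fic_poe_pool * 2 / 9))   # 622, 723, 676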

The above nonfiction applicant-pool estimates cross-check, broadly speaking, with estimates extrapolated from programs with known recent admissions data for nonfiction: University of Iowa (whose 2011 admissions figures produce an estimate of 278 annual applications in nonfiction nationwide); University of Wyoming in Laramie (2010 figures: 253); Sarah Lawrence College in Bronxville, New York (2006: 561); and Rutgers University in Camden, New Jersey (2010: 842). Further confirmation is provided by programs whose three-genre applicant-pool sizes are known but for which a breakdown by genre is unavailable. The nonfiction applicant pool for these programs can be estimated using the national-average 6:3:2 ratio, and this estimate can then be used to estimate the national nonfiction applicant pool in the manner described above for poetry and fiction programs. The result is a series of national nonfiction applicant-pool estimates from the following programs’ 2011 admissions data: Hollins University in Roanoke, Virginia (127); University of Arizona in Tucson (130); University of North Carolina in Wilmington (208); University of Wyoming in Laramie (217); George Mason University in Fairfax, Virginia (260); Pennsylvania State University in University Park (270); University of Notre Dame in South Bend, Indiana (283); University of Alabama in Tuscaloosa (291); University of Minnesota in Minneapolis (296); American University in Washington, D.C. (314); Hunter College in New York City (441); University of Colorado in Boulder (477); Ohio State University in Columbus (527); Columbia University in New York City (652); and University of Montana in Missoula (676). The median for these estimates is 291, while the mean is 345.

One issue with this second method of extrapolation is that highly-ranked nonfiction programs are both more likely to have nonfiction-genre statistical “breakouts” available, and more likely to receive a larger percentage of nonfiction applicants than the usual 6:3:2 ratio would suggest. Some of the national nonfiction applicant-pool estimates above are therefore slightly lower than they should be, and this may explain the difference in the two sets of medians and means derived from the two methods of extrapolation described above.

While these estimates cannot fix with certainty the annual nonfiction applicant pool, the fact that all but one of the estimates above fall between 125 and 725 (the outlier, derived from Rutgers University's 2010 data, is 842), with a clear majority falling between 250 and 550, suggests that the correct figure is well under a thousand.

Added to the median and mean data for fiction and poetry, the program-derived nonfiction figures (a median of 291 and a mean of 345) suggest an annual three-genre applicant pool, across all full-residency programs in the United States, of 3,088 (median) or 3,597 (mean). A more cautious approach would be to propose a range: The annual three-genre full-residency applicant pool is likely between 3,000 and 3,750, a drop of several hundred from last year’s estimate of 3,500 to 4,000.

Data Sources
For those program measures not subject to applicant polling, such as rankings and recitations of admissions, curricular, placement, student-faculty ratio, and funding data, only data publicly released by the programs—either to individual applicants, to groups of applicants, in a program's promotional literature, or via a program website—have been included in the rankings table. All data were updated regularly to reflect programs' most recent public disclosures. As previously stated, programs can alert Poets & Writers Magazine to important website updates by e-mailing the magazine directly.

Many of the nation's full- and low-residency MFA programs decline to publicly release internal data (see “Ranking Questionnaires and Program Response Rates”). Programs unable or unwilling to release data regarding their funding and admissions processes are necessarily disadvantaged by a ranking system that relies on transparency. Yet no program that fails to release this data for applicants' consideration can avoid being judged, by applicants and other observers, through the lens of such nondisclosures. As research for these rankings is based entirely on publicly-available, publicly-verifiable data, the accuracy of the data upon which the rankings are based can be readily confirmed by any party.

LOW-RESIDENCY RANKINGS
Structure

Low-residency programs were measured in nine categories, seven of which are rankings—five employing unscientific but probative polling of the sort described above, and two based upon publicly-available hard data. Low-residency programs have not been assessed with respect to their funding packages because these programs, presuming that their applicants will continue in their present employment during the course of their studies, generally offer little or no financial aid to incoming students.

Cohort
Over the course of four successive application cycles, a total of 230 low-residency applicants were polled as to their program preferences, with these preferences exhibited in the form of application lists. The locus for this polling was the Poets & Writers online discussion board, the Speakeasy Message Forum, widely considered the highest-trafficked low-residency community on the Internet. The relatively small cohort used for this polling is explained by the following: (1) The annual applicant pool for low-residency programs is approximately one-eighth the size of the full-residency applicant pool (see “Full-Residency Rankings: National Full-Residency Applicant Pool Size” and “National Low-Residency Applicant Pool Size”); (2) low-residency applicants do not congregate online in the same way or in the same numbers that full-residency applicants do; and (3) low-residency programs are subject to a "bunching" phenomenon not evident with full-residency programs, with only ten programs nationally appearing on even 10 percent of poll respondents' application lists, and only four appearing on 20 percent or more. For this reason only the top ten low-residency programs have been included in the rankings (also available in the September/October 2011 print edition of Poets & Writers Magazine); below this level it is difficult to draw distinctions between programs, as none received a significant number of votes over the four years polling was conducted.

One explanation for the bunching phenomenon described above may be that low-residency programs are less susceptible to comparison than full-residency programs, as many of the major considerations for full-residency applicants, including location, funding, cohort quality, class size, program duration, student-faculty ratio, job placement, and cost of living, are not major considerations for low-residency applicants due to the structure and mission of low-residency programs. Generally speaking, low-residency programs are assessed on the basis of their faculty and pedagogy, neither of which is conducive to quantification and ranking. That three programs have such a clear advantage in the rankings over the other forty-six operating in the United States, Canada, the United Kingdom, and China is a function of both the relatively recent development of the low-residency model (with older programs tending to be more highly regarded, though none dates to before 1976) and the consensus that appears to have existed for years that three programs in particular are strongest in terms of faculty, selectivity, and placement. It is worth noting, too, that a significant number of the world's forty-nine low-residency MFA programs were founded within the last eight to ten years (see “The Full-Residency Rankings Chart: Additional Program Measures: Dates of Establishment,” below); applicant familiarity with these programs may still be relatively low.

The four-year low-residency polling described above has been further broken down into year-by-year poll results. The cohort for the 2010–2011 annual ranking was thirty-five, for the 2009–2010 annual ranking eighty-eight, for the 2008–2009 ranking fifty-five, and for the 2007–2008 ranking fifty-two. If and when individual Speakeasy account-holders applied to programs in more than one admissions cycle, their application lists from each cycle were treated as separate slates of votes; repeat applicants accounted for less than 10 percent of the polling cohort, however. Full-residency applicants on The Creative Writing MFA Blog who applied to one or more low-residency programs as part of their overall slate of target programs (see "Structure" and "Cohort") were also included in the low-residency voting; due to the exceedingly small number of such votes, these entries were manually compared both to one another and to existing low-residency application lists to ensure duplicate lists were avoided.

While polls with larger cohorts are, all other things being equal, more reliable than those with smaller ones, the fact that the annual applicant pool for low-residency programs is likely between 350 and 400 (see “National Low-Residency Applicant Pool Size”) suggests that the total polling cohort for the 2012 low-residency MFA rankings likely represents approximately 50 percent of a single-year national applicant pool for this sort of degree program. Moreover, as is the case with the full-residency rankings, cross-checking applicant vote totals across a period of four years reveals substantial consistency in the results and quickly unearths any significant anomalies or outliers. Of the ten low-residency programs listed in this year's print rankings, eight (80 percent) ranked in the top ten in all four years of polling, while another was in the top ten for three of the four application cycles studied. All of the programs in the top ten achieved at least an Honorable Mention (a ranking between eleven and fifteen) for all four of the years in which low-residency applicants were polled. In fact, only one current top ten low-residency program has ever ranked lower than twelfth in any individual admissions cycle since 2006 (and even this occurred in only one admissions cycle).

An n.d. notation signifies that a program has not released the requisite data. An em-dash (—) indicates that the program is unranked in that category. Only nine of the nation’s forty-nine low-residency MFA programs achieved a positive score in the national placement rankings, which considered placement data for full- and low-residency programs in a single assessment (and which then, as to the low-residency programs, combined the data for fellowship and job placement into a single table): University of Alaska in Anchorage; Bennington College in Vermont; Goddard College in Plainfield, Vermont; Murray State University in Kentucky; Queens University in Charlotte, North Carolina; University of Southern Maine in Portland; Spalding University in Louisville, Kentucky; Vermont College of Fine Arts in Montpelier; and Warren Wilson College in Swannanoa, North Carolina.

In order to better acknowledge the achievement, in the placement category, of these nine low-residency programs relative to their low-residency peers, and in recognition of the fact that low-residency graduates are substantially less likely to seek postgraduate fellowships or even postgraduate university teaching positions (largely because they do not give up their present employment when they matriculate), the national placement data collected for these rankings have been reconstituted as a low-residency-only ranking.

Due to the still relatively small number of low-residency programs in the United States and abroad, only programs receiving top ten placement in any category of assessment have received a special notation in either the print or online editions of the rankings.

National Low-Residency Applicant Pool Size
A realistic estimate of the annual number of low-residency MFA applicants is four hundred. This estimate is based in part on the fact that the five most-applied-to low-residency programs received an average of 144 total applications during the 2009–2010 admissions cycle; in contrast, the five most-applied-to full-residency programs received an average of 1,137 applications in fiction and poetry alone during the same admissions cycle. If this comparison is any guide, approximately eight times as many individuals apply to full-residency programs as to low-residency programs each year, suggesting a mean low-residency applicant pool, per year, of between 350 and 400. This figure can then be cross-checked using the number of votes for Warren Wilson College in the present low-residency rankings (95), the total number of low-residency votes cast for the rankings (230), and Warren Wilson's publicly released annual applicant pool size (200). Using these figures one would expect an annual national low-residency applicant pool of 484. Of the three low-residency programs most highly visible nationally, and therefore the most likely to be neutrally tabulated by these rankings, comparable data is available not only for Warren Wilson College but also for Bennington College (whose data suggest an estimated 487 annual low-residency applicants).
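
For readers who wish to retrace this cross-check, the arithmetic is a simple proportion; the following minimal sketch (in Python, using only the figures cited above) is illustrative, not part of the rankings' actual tabulation.

    # Cross-check of the estimated national low-residency applicant pool,
    # using the figures cited in this section.
    warren_wilson_votes = 95    # Warren Wilson votes in the low-residency poll
    total_lowres_votes = 230    # all low-residency votes cast
    warren_wilson_pool = 200    # Warren Wilson's publicly released annual pool

    # If the poll is representative, a program's share of poll votes should
    # approximate its share of the national applicant pool.
    sample_share = warren_wilson_votes / total_lowres_votes
    estimated_national_pool = warren_wilson_pool / sample_share
    print(round(estimated_national_pool))  # prints 484, the figure given above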

In view of the above, the four-year, 230-person sample used for this year's low-residency rankings likely represents approximately one-half of an annual applicant cohort for this type of residency program.

Added to the adjusted mean for annual full-residency poetry, fiction, and nonfiction applicants, the estimate for the annual number of low-residency applicants suggests a total annual applicant pool to creative writing MFA programs—across all genres and types of residency, and gauging discrete applicants only—of somewhere between 3,500 and 4,250. As noted above, this estimate is significantly different from the estimate provided to the public by AWP in 2009 (“[there are] more than 13,000 applicants to MFA programs each year”).

GENRE RANKINGS
Cohort

Between April 16, 2010, and April 15, 2011, 398 fiction applicants were polled for the fiction-genre rankings, 237 poetry applicants were polled for the poetry-genre rankings, and 62 nonfiction applicants were polled for the nonfiction-genre rankings (the total polling cohort for the nonfiction rankings is 163, however, representing a three-year, three-admissions-cycle cohort). The reason for the disparity between the total number of fiction and poetry applicants in the genre-specific polls (635) and the total number of votes in the overall fiction and poetry poll (640) is that five applicants, or less than 1 percent of the cohort polled in fiction and poetry, did not specify their genre—though it was clear from their application lists that the genre in which they applied could not have been nonfiction (because the majority of MFA programs do not offer nonfiction tracks, an applicant specifying that he or she has applied in only one genre, but whose application list includes certain programs, can be precluded from consideration as a nonfiction applicant). One consequence of this less-than-1 percent nongenre-reporting population is that it is statistically possible for programs to be tied in the overall rankings even when, by virtue of their rankings in the two major genres, it would seem to be a statistical impossibility.

The cohort sizes used in this polling are roughly consistent with the national distribution of MFA applicants by genre, as revealed by those few programs that both (1) accept applicants in all three genres, and (2) release their internal admissions data for all three genres. The national distribution of fiction, poetry, and nonfiction applicants is approximately six to three to two, respectively.

Due to the still relatively small number of nonfiction programs in the United States and abroad, only programs receiving top twenty placement in the genre have received a special notation in either the print or online editions of the rankings. No Honorable Mentions have been awarded, for the following reasons: (1) the relatively small number of votes for programs ranked beyond twentieth in the genre, all of which appeared on fewer than 10 percent of nonfiction applicants' application lists; (2) a bunching phenomenon in the nonfiction rankings, such that any presumptive Honorable Mention section of the nonfiction rankings (programs ranked between twenty-one and twenty-five) would run the risk of being almost half the size of the rankings proper; and (3) there would be little statistical distinction (only one vote) between the presumptive Honorable Mention programs and the six programs ranked behind them—a much smaller disparity, out of a total polling cohort of 163, than the three-vote difference between the top fifty and Honorable Mention sections in the 640-cohort full-residency rankings.

Programs without a nonfiction track are designated, in the top fifty rankings, with n/a (not applicable).

INTERNATIONAL PROGRAMS
Special Note on International Programs

The Poets & Writers Magazine MFA rankings have always considered, and will continue to consider, international MFA programs. However, international programs are unlikely to fare well in the overall rankings for several reasons: (1) nearly all non-U.S./non-Canadian graduate creative writing programs are (by U.S. accreditation standards) nonterminal (that is, they confer M.Phil, M.St., or MA degrees, as opposed to the terminal MFA degrees considered by the Poets & Writers rankings); (2) non-U.S./non-Canadian applicants are less likely to frequent a U.S./Canadian-focused MFA website like The Creative Writing MFA Blog, and therefore non-U.S./non-Canadian programs are less likely to appear on the application lists of those polled for these rankings (and Canadian applicants applying to Canadian programs may be less likely to patronize The Creative Writing MFA Blog than American applicants applying to American programs); (3) unlike U.S. and Canadian MFA programs, overseas programs are rarely fully funded for nondomestic students (U.S./Canadian MFA programs less frequently distinguish between domestic and international applicants with respect to funding eligibility), and therefore are less likely to be popular amongst the U.S. and Canadian applicants who frequent The Creative Writing MFA Blog; and (4) due to the exceedingly small number of non-U.S. terminal-degree MFA programs now in operation (94 percent of all creative writing MFA programs now extant are located in the United States, and more than half of those in operation outside the United States were founded within the last forty-eight months), programs in Canada and elsewhere simply have fewer entrants into the international MFA system with which to achieve a top fifty standing in a polling-based overall ranking (as compared with the hard-data categories in this assessment, such as funding, selectivity, and postgraduate fellowship or job placement).

NON-MFA MASTER'S DEGREE PROGRAMS
Special Note on MA, MPW, M.Phil, and M.St. Programs

Over the past five years, the present MFA rankings project has catalogued several thousand MFA applicants' application lists. One abiding trend is that a small number of nonterminal master's degree programs in creative writing frequently appear on application lists otherwise composed entirely of terminal-degree MFA programs. As only four MA programs presently answer to this description, these four MA programs have been included in the full-residency rankings for terminal-degree creative writing programs. Applicant mores seem to indicate that these programs are now considered on par with MFA programs in overall quality, if not in the critical feature of terminality. Many of those who attend MA programs in creative writing subsequently apply to terminal-degree MFA programs upon graduation.

Two additional programs are included in these rankings despite being MPW (Master of Professional Writing) programs. The two reasons for these inclusions are (1) the MPW is arguably a terminal degree, though there is little evidence yet on the question of whether it enjoys the same regard in the field of creative writing (as to terminality) as the MFA, and (2) as with the four MA programs referenced above, MPW programs on occasion appear on the application lists of applicants who are otherwise applying only to MFA programs. Two other types of nonterminal creative writing master’s degrees, the M.Phil and the Master of Studies (M.St.), are unknown in the United States but are offered at several universities in the United Kingdom.

Since polling of MFA applicants began on December 15, 2006, the present MFA rankings project (which, as noted throughout this article, focuses its attention almost exclusively on applicants whose application lists are dominated by terminal MFA programs in creative writing) has registered the following polling results for nonterminal master’s degree programs (results through July 25, 2011):

Top Twenty Non-MFA Creative Writing Master’s (MA, MPW, M.Phil, or M.St.) Degrees, 2006–2011
1. University of California in Davis (63)
2. University of Southern California in Los Angeles (10)
3. Western Washington University in Bellingham (9)
4t. Eastern Michigan University in Ypsilanti (4)
4t. Miami University in Oxford, Ohio (4)                
4t. University of Tennessee in Knoxville (4)
7t. Kansas State University in Manhattan, Kansas (2)
7t. Ohio University in Athens (2)
7t. University of East Anglia in Norwich, England (2)
7t. University of Louisville in Kentucky (2)
11t. California State University in Sonoma (1)
11t. College of Charleston in South Carolina (1)
11t. Concordia University in Montreal, Canada (1)
11t. Goldsmiths, University of London in England (1)
11t. Oxford University in England (1)
11t. Royal Holloway, University of London in England (1)
11t. University of Exeter in England (1)
11t. University of Kent in Canterbury, England (1)
11t. University of Southampton in England (1)
11t. University of Sussex in Brighton and Hove, England (1)

[N = 85].

The low “N” of this ranking, which corresponds to the size of the polling cohort, coupled with the extremely long duration of the polling period and the general unlikelihood of finding committed MA applicants in an online community designed for MFA applicants, means that these results should be read and disseminated with extreme caution; the statistical confidence for this listing is still relatively low, though a renewed effort to capture and catalog the application preferences of nonterminal creative writing master’s candidates is presently under way. In the meantime, no substantive conclusions ought to be drawn about trends in nonterminal creative writing master’s degree applications on the basis of this polling.

THE FULL-RESIDENCY RANKINGS CHART: ADDITIONAL PROGRAM MEASURES
Dates of Establishment

Listing the dates of establishment for the nation’s top full-residency MFA programs offers critical historical context for the rankings themselves, the institutions assessed by the rankings, and the very degree that is the focus of both the rankings and the institutions whose attributes the rankings catalogue. This column of data does not apply to nonterminal, academic master’s programs in creative writing (with or without the option of a creative thesis), which are different in form and function from their terminal-degree, art-school MFA peers.

Previous ranking methodologies have leaned heavily on the somewhat tendentious factor of program visibility. When programs are assessed by individuals already within the system, the natural result is that older programs—whatever their selectivity, financial resources, faculty resources, curriculum, pedagogy, or student outcomes—move to the top of the pack due to their profile advantage. Yet applicants report only limited interest in programs’ historical pedigrees, as pedigree itself is often considered a suspect quantity in the national literary arts community. By publishing, for the first time, the dates of establishment of the nation’s top seventy-five full-residency MFA programs, these rankings permit applicants and other consumers of these data both to disassociate historical pedigree from the distinct question of program quality and to better understand the historical context in which the creative writing MFA has achieved such cultural prominence.

Creative writing as an academic discipline originated in the late nineteenth century, yet by January of 1964 there was still only one MFA-conferring graduate creative writing program in the world. In fact, though the first MFAs in any field were granted in the 1920s, and the MFA-conferring Iowa Writers’ Workshop was founded in 1936, the MFA as a degree would have no abiding place in the national literary arts community until the 1980s. The 1940s, 1950s, and much of the 1960s were marked by attempts to find alternative models to the one provided by the Iowa Writers’ Workshop: first, in the degree-granting, relatively nonselective, grade-free creative writing program at Black Mountain College, which was founded in the 1930s but had its heyday in the late 1940s and early to mid-1950s; second, in the undergraduate-only creative writing program at Stanford University (founded in 1947 by Wallace Stegner) and other undergraduate programs modeled closely upon this one; and third, in nonterminal MA programs in creative writing founded at a number of institutions, including Johns Hopkins University in Baltimore (1946), University of Denver in Colorado (1947), Cornell University in Ithaca, New York (1948), Indiana University in Bloomington (1948), University of Florida in Gainesville (1948), and Hollins University in Roanoke, Virginia (1960). Some of these latter programs required academic theses of their students rather than creative ones.

Ultimately, the Iowa Writers’ Workshop MFA model became ascendant—after certain of its elements had been experimented with by the types of degree programs listed above—because of a grass-roots campaign by working creative writers (among both faculties and student bodies at various institutions) to gain greater administrative, pedagogical, and creative autonomy from the Academy to which they had previously been attached. Most of the early MFA programs appear to have been founded only after years—in some cases several decades—of struggle between creative writers and university bureaucrats, with the two primary bases for the latter’s objection to the MFA being that it cost much more than the MA to administer (due to the need for greater faculty resources, and the necessity of awarding tuition remission-eligible assistantships to many terminal-degree candidates) and permitted universities less immediate oversight over their resident literary artists. Far from a “cash cow” warmly embraced by U.S. universities, the creative writing MFA was for decades rejected by America’s universities as too exotic, too expensive, and too distant from the traditional academic functions of an American English department.

At the beginning of the 1980s there were still fewer than two dozen creative writing MFA programs in the world. It was not until the turn of the century that the rate of MFA-program creation significantly increased, as indicated by the table below, which catalogues MFA programs’ dates of establishment by decade (as of the writing of this article, well over 80 percent of all MFA programs worldwide had had their dates of establishment tabulated):

MFA Programs Founded, by Decade
1920s: 0
1930s: 1
1940s: 0
1950s: 0
1960s: 11
1970s: 9
1980s: 23
1990s: 29
2000s: 78
2010s: 15 *

* = In the first eighteen months of the decade. At this pace, the 2010s could see the creation of approximately a hundred new MFA programs.

Of the thirty-six programs whose dates of establishment are not considered by this table, preliminary research suggests that approximately twenty-five were founded in the 2000s, six in the 1990s, and five in the 1980s. The conclusion to be drawn from the above data is that the “MFA boom” occurred sometime in the mid-1990s and has not yet abated. Of all the full- and low-residency MFA programs now in existence worldwide, over 75 percent were founded between the early 1990s and the present day.

Funding
Nothing in these rankings' funding assessments is intended to impugn the motives or character of professors, administrators, or staff at any of the nation's graduate creative writing programs. The presumption of these rankings is that all of these groups have militated, and continue to militate, with varying degrees of success, for more funding for their students—and that, given the choice, every program would choose to be fully funded. Still, there is no question that some programs require virtually no financial outlay by admitted students, while others are institutionally structured to induce students to take out substantial student loans. The rankings take this into account, as funding is an important factor for current MFA applicants deciding where to apply.

Program funding packages were calculated on the basis of annual cost-of-living-adjusted stipend values for programs with full tuition waivers, and on the basis of annual cost-of-living-adjusted stipend values less annual tuition for programs offering only partial tuition waivers. Programs were further divided into categories on the basis of the percentage of each incoming class offered full funding. "Full funding" is defined as the equivalent of a full tuition waiver and an annual stipend of at least $8,000/academic year. No program offering full funding to less than 100 percent of its incoming class is ranked ahead of any program fully funded for all students. Likewise, no nonfully funded program is ranked ahead of any program in a higher "coverage" bracket. The six coverage brackets acknowledged by the rankings are as follows: "All” (100 percent fully funded); “Nearly All” (90 to 99 percent fully funded); “Most” (60 to 89 percent); “Some” (30 to 59 percent); “Few” (16 to 29 percent); and “Very Few” (0 to 15 percent). All of these percentages refer to the percentage of each annual incoming class that receives a full funding package.
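
As an illustration only (the rankings were not compiled programmatically, so this helper is purely hypothetical), the coverage brackets above map to percentages as follows:

    def coverage_bracket(pct_fully_funded):
        # Map the percentage of an incoming class receiving full funding
        # to the coverage bracket used by the rankings.
        if pct_fully_funded == 100:
            return "All"
        if pct_fully_funded >= 90:
            return "Nearly All"
        if pct_fully_funded >= 60:
            return "Most"
        if pct_fully_funded >= 30:
            return "Some"
        if pct_fully_funded >= 16:
            return "Few"
        return "Very Few"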

It’s interesting to note that no program fully funding less than 40 percent of its admitted students received a top fifty ranking in the funding category in the 2012 Poets & Writers Magazine MFA rankings. Programs whose coverage bracket is sufficiently high to receive a national ranking, and whose stipend is sufficiently high to meet the definition of full funding, but whose specific annual stipends were unknown at the time the rankings were compiled, were ranked last within their respective coverage brackets. Top fifty programs awarded an Honorable Mention in funding are indicated with a star in the print edition of the rankings. In the online-only third and fourth tiers of the overall rankings, the designation "HM" is used instead.

Programs that fully fund 33 percent or more of their admitted students were considered eligible for package averaging. If and when programs meeting this criterion were revealed to offer funding packages of differing value to different students, the total stipend value of all full-funding packages was divided by the number of such packages to determine average annual stipend value. Because some programs do not advertise special funding offerings available only to select students, not every program benefited from this feature of the rankings. Consistent with the structure and conceit of these rankings, programs exhibiting maximum transparency with respect to their promotional materials were most likely to receive a comprehensive assessment of their total funding package.

The funding rankings take into account duration of funding, as programs were ranked for this measure by multiplying average annual package value by the duration of each program in years. The varying amount of tuition charged at individual programs was disregarded, as students receiving full funding do not, by definition, pay tuition.
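
Taken together, the rules above amount to a two-level ordering: coverage bracket first, then total package value (cost-of-living-adjusted average annual stipend multiplied by program duration in years). A minimal sketch, reusing the hypothetical coverage_bracket helper above and assuming the dictionary keys shown:

    BRACKET_ORDER = ["All", "Nearly All", "Most", "Some", "Few", "Very Few"]

    def funding_sort_key(program):
        bracket = coverage_bracket(program["pct_fully_funded"])
        package = program["avg_annual_stipend_col_adjusted"] * program["years"]
        # Sort ascending: lower bracket index ranks first; larger total
        # packages rank first within a bracket.
        return (BRACKET_ORDER.index(bracket), -package)

    # e.g., programs.sort(key=funding_sort_key)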

Applicants should be aware that many programs deduct administrative fees—almost always less than $1,000, and usually less than $500—from their annual stipends. These fees were not considered by the funding rankings. Moreover, some programs offer health insurance to all admitted students and some do not. Programs that offer health insurance to all admitted students include, but are not limited to, the following (programs are listed in order of funding rank): University of Texas in Austin [Michener Center]; Cornell University in Ithaca, New York; University of Michigan in Ann Arbor; Louisiana State University in Baton Rouge; Ohio State University in Columbus; University of Alabama in Tuscaloosa; Virginia Polytechnic Institute (Virginia Tech) in Blacksburg; Washington University in Saint Louis, Missouri; Arizona State University in Tempe; Iowa State University in Ames; Purdue University in West Lafayette, Indiana; University of Minnesota in Minneapolis; McNeese State University in Lake Charles, Louisiana; Pennsylvania State University in University Park; University of Iowa in Iowa City; University of Wyoming in Laramie; Vanderbilt University in Nashville; University of Wisconsin in Madison; University of Texas in Austin [English Department]; University of Virginia in Charlottesville; University of California in Irvine; University of Oregon in Eugene; University of Central Florida in Orlando; University of New Mexico in Albuquerque; Rutgers University in Camden, New Jersey; and Oklahoma State University in Stillwater.

Selectivity
As fewer than five full- or low-residency programs nationally publicly release yield data—the percentage of those offered admission to a program who accept their offers and matriculate—the acceptance rate figures used for the national selectivity ranking are necessarily yield-exclusive. Most have been calculated using the simplest and most straightforward method: Taking the size of a program's annual matriculating cohort in all genres and dividing it by the program's total number of annual applications across all genres. Forty of the top fifty programs in selectivity (74 percent, as there are fifty-four programs in the top fifty listing for selectivity) had available admissions data from the 2010–2011 admissions cycle, ten of the top fifty programs in this category (19 percent) most recently released admissions data during the 2009–2010 admissions cycle, and four programs (7 percent) most recently released admissions data during the 2008–2009 admissions cycle or earlier.
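
Expressed as code, the calculation described above is a single division; the example figures below are invented for illustration.

    def acceptance_rate(annual_matriculants, annual_applications):
        # Yield-exclusive acceptance rate: the size of a program's annual
        # matriculating cohort in all genres divided by its total number
        # of annual applications across all genres.
        return annual_matriculants / annual_applications

    # A hypothetical program matriculating 12 students from 400 applications:
    # acceptance_rate(12, 400) returns 0.03, i.e., 3 percent.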

The relative paucity of data available for the selectivity rankings—acceptance rates are available for only 112 of the nation’s 200 MFA programs (56 percent)—is attributable to programs' continued reluctance to release the sort of internal admissions and funding data regularly released by colleges, universities, and most professional degree programs. Hundreds of interviews with MFA applicants between 2006 and 2011 suggest that a program's acceptance rate is one of the top five pieces of information applicants request when researching a graduate creative writing program. Fortunately, all of the top fifty MFA programs have made their annual acceptance rates public either directly or indirectly, as have nineteen of the twenty-five Honorable Mention programs (76 percent).

In order to avoid artificially privileging small programs with an unknown but likely modest annual yield—programs with small applicant pools but also small incoming cohorts, and consequently, in some instances, extremely low yield-exclusive acceptance rates—only programs receiving more than eighty applications annually were eligible for the top fifty in selectivity. Of the fifty-seven full-residency programs with unknown admissions data, the overall polling done for these rankings suggests that no more than ten would even be eligible for inclusion in the top fifty for selectivity on the basis of their applicant-pool size. Whether these programs' annual incoming cohorts are also sufficiently small—and thus the programs, statistically, sufficiently selective—to make any of these programs entrants into the top fifty for selectivity is unknown. The likelihood is that three or fewer programs that would otherwise appear in the top fifty for selectivity are ineligible for that ranking solely because they have thus far declined to publicly release their admissions data.

Of programs with fewer than eighty applications whose admissions data are known, the ten most selective programs are as follows: Florida International University in Miami, #1; University of Kansas in Lawrence, #2; Northern Michigan University in Marquette, #3; Old Dominion University in Norfolk, Virginia, #4; West Virginia University in Morgantown, #5; University of Missouri in Saint Louis, #6; Temple University in Philadelphia, #7; Savannah College of Art & Design in Georgia, #8; Otis College of Art & Design in Los Angeles, #9; and University of Central Florida in Orlando, #10. These program rankings are provisional; admissions data for low-volume MFA programs cannot be considered as probative as data for programs with larger applicant pools.

The small number of low-residency programs with publicly accessible acceptance rates makes crafting a selectivity ranking for such programs difficult. Of the eighteen programs (37 percent of all low-residency programs) with available data, two-thirds have available admissions data only from the 2007–2008 admissions cycle or earlier. Fortunately, the programs ranked first, second, and third in this measure (and in the other hard-data and polling measures) have all released data from one of their past three admissions cycles. The applicant-pool-size cutoff for inclusion in the low-residency selectivity rankings is set at forty annual applicants.

Fellowship Placement
Programs' postgraduate fellowship placement records were assessed by determining how many individual "placement events" a given program's current students or (much more commonly) graduates achieved during the decade from 2001 to 2011. Only a limited number of fellowships and residencies are available to MFA graduates while in-program or immediately postgraduation, and fewer still are specifically targeted at current MFA students and/or recent MFA graduates. Most of these make publicly available the names and biographical data of their fellows and residents. The focus for this year's fellowship placement rankings was on sixteen of the fellowships and residencies in this group—generally speaking, the nation's sixteen most prestigious mid-MFA/post-MFA fellowships and residencies.

The fellowships and residencies surveyed for this measure were the following: The Stegner Fellowship at Stanford University in Palo Alto, California; the Wisconsin Creative Writing Institute Fellowship at the University of Wisconsin in Madison; the Hodder Fellowship at Princeton University in New Jersey; the Fine Arts Work Center Fellowship at the Fine Arts Work Center in Provincetown, Massachusetts; the Emory University Creative Writing Fellowship in Atlanta; the Stadler Fellowship at Bucknell University in Lewisburg, Pennsylvania; the Axton Fellowship at University of Louisville in Kentucky; the Olive B. O'Connor Fellowship at Colgate University in Hamilton, New York; the Bennett Fellowship/Writer-in-Residence at Phillips Exeter Academy in Exeter, New Hampshire; the James Merrill Writer-in-Residence at the James Merrill House in Stonington, Connecticut; the Amy Clampitt Residency Award at the Amy Clampitt House in Lenox, Massachusetts; the Bread Loaf Writers’ Conference Fellowships (“Waiterships”) at Middlebury College in Vermont; the Gettysburg Emerging Writer Lectureship at Gettysburg College in Pennsylvania; the Ruth Lilly Poetry Fellowships granted by the Poetry Foundation in Chicago; the Sewanee Writers’ Conference Fellowships at the University of the South in Sewanee, Tennessee (including Borchardt, Dakin, Elkin, Justice, McCorkle, Nemerov, Ralston, Sewanee, Taylor, Van Duyn, Wall, and Williams Scholars/Fellows); and the Steinbeck Fellowship for Fiction-Writers at the Center for Steinbeck Studies at San Jose State University in California.

These sixteen fellowships and residencies played host to 625 placement events between 2001 and 2011. As the placement rankings acknowledge placement events rather than placed fellows or residents, it is possible for a single fellow or resident to be the subject of more than one placement event.

As simply ranking programs by the number of their students or graduates subject to placement events between 2001 and 2011 would unfairly favor larger programs (which naturally have more graduates on the fellowship market annually), programs have instead been ranked on the basis of a placement score, calculated as follows: A program's total number of placement events between 2001 and 2011 was divided by the size of the program's annual incoming cohort. The resulting top fifty size-adjusted scores ranged from 2.80 to 0.13. In several instances, programs received identical placement scores, resulting in rankings "ties"; where possible, ties were broken by privileging the program with the higher number of total placement events. Programs founded during the assessment period had their scores pro-rated on the basis of how many years (out of the last ten) they had graduates on the postgraduate fellowship market.
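
A minimal sketch of this score, under one plausible reading of the pro-rating rule (field names are hypothetical; the actual tabulation was not necessarily performed in code):

    def placement_score(placement_events, annual_cohort, years_on_market=10):
        # Size-adjusted placement score: total 2001-2011 placement events
        # divided by the annual incoming cohort, pro-rated upward for
        # programs whose graduates were on the market fewer than ten years.
        return (placement_events / annual_cohort) * (10 / years_on_market)

    # Ties in score are broken, where possible, by total placement events:
    # programs.sort(key=lambda p: (placement_score(p["events"], p["cohort"],
    #                                              p["years_on_market"]),
    #                              p["events"]), reverse=True)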

Because fellowships and residencies draw no distinction between full- and low-residency programs, this is the only measure in which full- and low-residency programs were ranked together. This said, the low-residency programs were subsequently granted their own numerical ranking, in recognition of the fact that these programs are hampered by the decreased likelihood that their graduates will seek fellowships or residencies in the first instance (as low-residency students, by definition, already have full- or part-time employment).

These fellowship placement rankings should be used with caution. As selection for a fellowship or residency is often the result of one or more individuals being the "consensus pick(s)" of a panel of judges—and as consensus in art has not always, historically, favored innovation—it is possible for fellows and residents to in some instances be amongst the most talented, but not necessarily the most innovative, of their graduating year's national cohort. This is by no means to impugn, or remark upon, the writing of any particular fellow or resident, or the selections of any particular fellowship or residency. Instead, the emphasis in this caveat is on causation: Applicants should presume neither that a program with a high placement ranking can ensure them a fellowship or residency, nor that a program with a high placement ranking necessarily hosts the strongest student cohort if innovation, rather than technical mastery, is the particular interest of the applicant. On these points the rankings make no specific claim other than to note these important distinctions.

Job Placement
Between 2008 and 2011, the most popular online discussion board for creative writing job-seekers pursuing full-time employment at the university level, The Academic Jobs Wiki, listed 207 full-time positions available for poets and fiction writers. Data on the individuals ultimately hired for these openings was available for 145 of these 207 listings (70 percent). Searches conducted over consecutive years due to budget freezes were only counted as a single listing. Based on research into the educational credentials of the individuals ultimately hired for these positions, a ranking was created to indicate which graduate creative writing programs’ alumni enjoyed the most success on the academic job market over these three hiring cycles. The following figures may be of academic interest to those tracking employment opportunities for creative writers in higher education:

2008–2009 Hiring Season: Sixty-eight positions available (hire information available for 62 percent); male/female split for those positions with available data for final hires was 57 percent female, 43 percent male.

2009–2010 Hiring Season: Eighty-four positions available, ten of which were holdovers from the previous year (i.e., cancelled or frozen searches from the previous year); hire information available for 52 percent; male/female split for those positions with available data for final hires was 57 percent female, 43 percent male.

2010–2011 Hiring Season: Seventy-five positions available, eight of which were holdovers from the previous year (i.e., cancelled or frozen searches from the previous year); hire information available for 79 percent; male/female split for those positions with available data for final hires was 51 percent male, 49 percent female.

Given that the nation’s two hundred full- and low-residency MFA programs, and thirty-two doctoral programs in creative writing, graduate more than two thousand poets and two thousand fiction writers every year, along with between five hundred and a thousand nonfiction writers (some of whom have qualifications and prior publications in fiction and/or poetry), the data above suggests that each year full-time teaching positions at the university level are available for, on average, well less than 1 percent of graduate creative writing program alumni. Even if graduates were only required to compete for employment against those in their own annual cohort, and even assuming only between 10 and 20 percent of nonfiction program graduates can or do compete for positions advertised for poetry and/or fiction, this figure would be less than 2 percent. Realistically, however, each year’s graduate creative writing program alumni are competing against an ever-increasing stock of unemployed, underemployed, and employed-but-still-job-hunting alumni from previous years.

While polling suggests that only about half of the nation’s creative writing program graduates wish to teach, even this statistic cannot bring an individual degree-holding poet or writer’s employment chances (all things being equal) higher than, at best, 4 percent. Consequently, those graduate creative writing programs with the best track records in terms of job placement—the top ten programs in this measure achieved full-time job-placement rates, during the period assessed, of between 8 and 20 percent—are offering to students significant value-added as they pursue postgraduate employment. Whether higher job placement rates at certain schools are due to stronger alumni networks, better career placement services, better teaching, or simply better students is unclear, though there appears to be a high correlation between a program’s standing in this measure and its standing in other indicia of average cohort quality.

Student-Faculty Ratio
Using data on individual programs’ total student-body sizes, along with recitations of full-time core faculty in fiction, nonfiction, and poetry from the Poets & Writers online MFA database, student-faculty ratios were calculated for the 122 full-residency MFA programs (81 percent of all such programs) with both sets of data available. Tiebreakers in student-faculty ratio were awarded (where necessary and where possible) to the program with the higher number of total core faculty members. Note that this ranking, unlike others in the 2012 Poets & Writers Magazine MFA rankings, takes into account any and all fiction, poetry, and nonfiction faculty and students at individual programs, not merely faculty and students in the former two genres. Along with the top fifty programs in this measure, ten Honorable Mention distinctions were awarded, owing to the large number of programs eligible for (and assessed for) this measure.
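
Illustratively (the field names below are hypothetical), the ratio and its tiebreaker can be expressed as a single sort key:

    def student_faculty_key(program):
        # Lower student-faculty ratio ranks first; where ratios tie, the
        # program with the larger core faculty is ranked ahead.
        ratio = program["total_students"] / program["core_faculty"]
        return (ratio, -program["core_faculty"])

    # e.g., programs.sort(key=student_faculty_key)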

Student-faculty ratio is treated, here, as a presumptively rankable program quality. Just as a large percentage of applicants report that they prefer, all things being equal, a more selective program, or a better-funded program, or a program that performs better at placing its graduates in fellowships and full-time jobs post-graduation, generally speaking creative writing graduate students prefer a better student-faculty ratio to a worse one—the better to have immediate and meaningful access to those charged with instructing, mentoring, and advising them. The fifty-nine non-top fifty, non-Honorable Mention full-residency MFA programs (as to this measure) are ranked as follows (with ratio in parentheses after each school):

Non-Top 50/Non-Honorable Mention Programs in Student-Faculty Ratio, with Rank and Ratio

64. The New School in New York City (5.14)
65t. Adelphi University in New York City (5.14)
65t. California Institute of the Arts in Valencia (5.14)
67. Florida State University in Tallahassee (5.46)
68. Texas State University in San Marcos (5.50)
69. University of California in Irvine (5.50)
70t. Saint Mary’s College of California in Moraga (5.63)
70t. University of Colorado in Boulder (5.63)
70t. University of New Hampshire in Durham (5.63)
73t. University of Arkansas in Fayetteville (5.71)
73t. University of Maryland in College Park (5.71)
75. Georgia State University in Atlanta (5.83)
76. University of Houston in Texas (6.00)
77. University of Massachusetts in Boston (6.00)
78t. Southern Connecticut State University in New Haven (6.00)
78t. University of Guelph-Humber in Toronto, Canada (6.00)
80. University of Texas-Pan American in Edinburg (6.00)
81. Northwestern University in Chicago (6.36)
82. Ohio State University in Columbus (6.43)
83. Portland State University in Oregon (6.50)
84. Sarah Lawrence College in Bronxville, New York (6.52)
85. California College of the Arts in Oakland (6.67)
86. University of Alabama in Tuscaloosa (6.88)
87t. Queens College, CUNY (7.00)
87t. University of South Florida in Tampa (7.00)
89. Florida International University in Miami (7.14)
90t. University of Baltimore in Maryland (7.20)
90t. University of Nevada at Las Vegas (7.20)
92. University of British Columbia in Vancouver, Canada (7.50)
93. Florida Atlantic University in Boca Raton (7.50)
94. Chatham University in Pittsburgh (8.00)
95t. Roosevelt University in Chicago (8.00)
95t. University of Memphis in Tennessee (8.00)
97. Western Michigan University in Kalamazoo (8.33)
98. Oklahoma State University in Stillwater (8.33)
99. Columbia College in Chicago (8.38)
100t. American University in Washington, D.C. (8.57)
100t. Eastern Washington University in Cheney (8.57)
100t. San Diego State University in California (8.57)
100t. University of Massachusetts in Amherst (8.57)
100t. University of Utah in Salt Lake City (8.57)
105. George Mason University in Fairfax, Virginia (9.09)
106. Chapman University in Orange, California (9.17)
107. University of Missouri at Saint Louis (9.33)
108. Northeast Ohio MFA (Consortium) (10.00)
109. California State University in Fresno (10.00)
110. Minnesota State University in Mankato (10.00)
111. Lindenwood University in Saint Charles, Missouri (10.00)
112. McNeese State University in Lake Charles, Louisiana (10.00)
113. Emerson College in Boston, Massachusetts (10.42)
114. University of Iowa in Iowa City (10.78)
115. New York University in New York City (12.50)
116. San Jose State University in California (12.50)
117. City College of New York, CUNY (13.33)
118. University of San Francisco in California (14.00)
119. Hamline University in Saint Paul, Minnesota (17.14)
120. San Francisco State University in California (18.90)
121. Mills College in Oakland, California (23.75)
122. Columbia University in New York City (28.33)                 

Program Duration
A program's duration is measured by the average length of its curriculum in years. Some programs allow students to petition to extend their stay; because such petitions are granted on a case-by-case basis, and because the granting of such petitions often results in only an additional unfunded, non-tuition-remitted year of study, individual programs' program-extension policies have not been considered in calculating program duration. Nationally, only one full-residency MFA program is known to be one year in duration, and only two programs are four years in duration. The remaining 148 full-residency programs are either two or three years in duration, with programs ranked in the top fifty significantly more likely to be three years in duration than are programs not ranked in the top fifty. Few programs that fund less than 33 percent of their incoming students are more than two years in duration. In fact, only six non-fully-funded top fifty programs are three years in duration, and all of these are either ranked in the top fifty for funding or else were awarded an Honorable Mention in the category. Fully funded programs are slightly more likely to be three years in duration than two; 59 percent of the thirty-four fully funded programs in the top fifty are three years in duration or longer, and an additional three (9 percent) are known to offer substantial numbers of graduates third-year funding in some form or another. Low-residency programs are almost exclusively two years in duration, and yet these programs cannot be measured by their duration because, unlike full-residency programs, they are generally highly flexible with respect to the duration of their students' courses of study.

Assessments of program duration do not consider the availability of postgraduate fellowships, or automatic postgraduate placements, unless these opportunities are guaranteed to all rising third-years in good standing in the program. As applicable, nonguaranteed postgraduate funding opportunities are formally acknowledged in program funding calculations (see “The Full-Residency Rankings Chart: Additional Program Measures: Funding [15]") provided the average number of opportunities available each year is known. All this said, at least one program, Cornell University in Ithaca, New York, so consistently offers a third-year lectureship to all or nearly all of its graduating students that it may nominally be considered a three-year program (some students even receive a fourth-year lectureship as well).

Program Size
In the rankings, the size of a program's annual incoming cohort is expressed using the usual abbreviations for magnitude: XS (Extra-Small, an average total of two to nine students, per matriculating class, in the two major genres, fiction and poetry, combined); S (Small, ten to nineteen students); M (Medium, twenty to thirty-one students); L (Large, thirty-two to forty-nine students); and XL (Extra-Large, fifty or more students per year). Because many programs do not include their matriculating class size on their websites, in some instances this data has been extrapolated from other available information. One program, the University of Wisconsin in Madison, was by necessity granted a special dispensation in several categories, as it is the only MFA program in the United States or abroad to admit fiction and poetry students in alternating years. This required two methodological accommodations: (1) using statistical extrapolation for the overall and genre rankings (the program’s previous-year percentage of all votes cast in the “off-year” genre—that is, the percentage of all fiction votes cast for the 2009–2010 application cycle that the University of Wisconsin’s fiction program received—is multiplied by the number of voters in that genre in the current year; this is then added to the actual number of votes the program received in the “on-year” genre); and (2) averaging the class-size figures for the program. Because the program accepts six poets and six fiction writers every two years, the program is treated as having an average annual matriculating class size of six.
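
The vote extrapolation described in item (1) can be written out as follows (a sketch with invented example numbers; only the formula itself comes from the text above):

    def extrapolated_votes(prev_share_off_genre, current_voters_off_genre,
                           on_year_votes):
        # Previous cycle's share of all votes cast in the "off-year" genre,
        # projected onto this cycle's voters in that genre, plus the actual
        # votes received in the "on-year" genre.
        return prev_share_off_genre * current_voters_off_genre + on_year_votes

    # If a program received 3 percent of last cycle's fiction votes, 400
    # fiction applicants voted this cycle, and the program drew 25 actual
    # poetry votes: extrapolated_votes(0.03, 400, 25) returns 37.0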

Full Funding
Full funding is defined as the equivalent of a full tuition waiver and a minimum $8,000/academic year stipend. Where the tuition waiver offered is less than 100 percent, the program's stipend value is reduced by the amount an admitted student is asked to pay in tuition annually. All stipend values are adjusted for cost of living. Cost of living assessments were made using the website Sperling's Best Places (www.bestplaces.net/COL/default.aspx [16]). Healthcare costs, administrative fees, and student relocation costs were not estimated or considered, nor was the cost of tuition—as students receiving full funding, by the definition of the term used in the rankings, do not pay tuition.

In setting the stipend value for full funding at a cost-of-living-adjusted $8,000/academic year, the rankings make no assertion as to whether this should be considered a living wage. A single individual earning this amount per annum is eligible for the Earned Income Tax Credit when filing federal taxes; however, because the "$8,000/academic year" standard translates to $8,000/nine months, this rate of income accrual in fact extrapolates to an annual income of $10,667. This is still below $13,440—the amount, in 2009 dollars, at which a single individual is phased out completely from the Earned Income Tax Credit.

The rankings acknowledge that MFA students receiving the minimum full-funding stipend may still find themselves borrowing a de minimis amount (defined as less than $3,000/academic year) to help defray the costs of program attendance. For the purposes of this article, the de minimis borrowing level has been set at the rate of borrowing that puts an applicant out of range of pro-rated EITC coverage yet results in less than $10,000 in total federal debt during a three-year MFA program. Of the nation's thirty-nine fully funded full-residency programs, only two are known to offer cost-of-living-adjusted stipends of less than $10,000/academic year.

Cost of Living
The cost of living in the various programs' listed host locations was determined using Sperling's Best Places (www.bestplaces.net/COL/default.aspx [16]). All cost-of-living data were then compared to a randomly selected national-average-range constant, in this case Ann Arbor, Michigan.

Notations used for cost of living are as follows: Very Low (16 percent or more below Ann Arbor, Michigan, the national-average-range constant for the 2012 rankings); Low (between 6 and 15 percent below); Average (between 5 percent below and 5 percent above); High (between 6 and 15 percent above); and Very High (16 percent or more above). While some students may choose to live outside the boundaries of their program's host location, commuting to an MFA program rather than living near campus carries hidden costs of its own, both pecuniary and nonpecuniary. For this reason, only a program's host location was assessed for this measure. Cost-of-living adjustments were also used to determine the package value at individual programs for the Funding and Full Funding categories (see “Funding [15]” and “Full Funding”).
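
A sketch of the banding above (the handling of fractional percentages at band boundaries is an assumption, as the article gives whole-number bands):

    def cost_of_living_band(pct_vs_constant):
        # pct_vs_constant: percent difference from the Ann Arbor constant,
        # negative when the host location is cheaper.
        if pct_vs_constant <= -16:
            return "Very Low"
        if pct_vs_constant <= -6:
            return "Low"
        if pct_vs_constant <= 5:
            return "Average"
        if pct_vs_constant <= 15:
            return "High"
        return "Very High"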

Teaching Load
While individual applicants' interest in teaching composition, rhetoric, literature, or creative writing to undergraduates will vary, generally speaking the most popular teaching load is a 1/1 (one course to be taught in the fall semester, one in the spring semester). The teaching loads of individual programs have not been ranked per se, yet this 1/1 standard has been used to determine whether a given program's teaching load is "light," "average," or "heavy." That is, because the 1/1 load is the most popular amongst applicants—though it is not the most common teaching load at MFA programs—average annual teaching loads of 0/0, 0/1, 1/0, and 1/1 have been denominated "light" by the rankings. An average annual teaching load of 2/1 or 1/2 (the most common teaching load) is termed "average," while an average annual teaching load of 2/2 is considered "heavy." Note that the term “load” is not used here pejoratively; some applicants will wish to teach more rather than less, even as other applicants prefer to do no teaching at all. At present the rankings take no position whatsoever on the academic or professional value of teaching a large or small number of undergraduate sections per academic year.
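
Because the categories are an enumerated mapping rather than a formula, they can be captured directly (illustrative only; loads other than those the article names are not classified here):

    # Average annual teaching loads, as (fall, spring) course counts, mapped
    # to the categories used by the rankings.
    LOAD_CATEGORIES = {
        (0, 0): "light", (0, 1): "light", (1, 0): "light", (1, 1): "light",
        (2, 1): "average", (1, 2): "average",
        (2, 2): "heavy",
    }

    # e.g., LOAD_CATEGORIES[(1, 2)] returns "average"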

The term "average" is used here in two different senses: First, to denote a category of teaching load; second, to indicate that all programs are assessed by their "average" teaching load. Because many programs offer different teaching-load packages to different students, and/or increase or decrease teaching load over the duration of the program, the average (mean) number of courses taught per year per student in each program is used. In some instances, students may request and/or get assigned—once admitted to a program—a larger and therefore better-compensated teaching load. Such additional teaching sections are by no means guaranteed, however, and therefore are not noted in or considered by the rankings chart.

Some programs fund a small enough percentage of admittees through teaching assistantships that to assign such programs an "average teaching load" would be to wrongly imply that admitted students are likely to receive an assistantship. For this reason, programs that offer assistantships to less than one-third of their incoming cohort received an asterisk in the "teaching load" column. Programs eligible for a "teaching load" designation, but which do not publicly disclose the teaching load they assign their teaching assistants, are indicated with a "Not Available" ("N/A") notation.

CGSR Compliance
The Council of Graduate Schools Resolution, also known as the “April 15th Resolution,” states that graduate programs that are signatories to the Resolution shall keep funded offers of admission open through April 15 of each application cycle. Colleges and universities that adhere to the Resolution represent that all of their constituent programs and departments adhere to the terms of the Resolution, which include mailing a copy of the Resolution with all acceptances. Under the terms of the Resolution programs may neither rescind nor threaten to rescind offers of admission to which any funding whatsoever is attached prior to April 15, nor may they explicitly or implicitly indicate to such accepted candidates, in writing or in person or via telephone, that there is any deadline for their matriculation decision other than April 15. Historically, MFA applicants have reported widespread noncompliance with the Resolution, which is problematic for applicants because CGSR-violative programs often require final matriculation decisions from applicants well before they have heard admissions responses from the other programs to which they applied. Applicants increasingly see such acceptances as excessively restrictive of their options and opportunities.

At present, only three CGSR signatories are believed to be noncompliant with the contract they and more than a hundred other universities signed and published for prospective applicants. This said, the CGSR Compliance category does not distinguish between programs known to have already violated the Resolution and those nonsignatories that simply could do so without running afoul of their host universities’ administrative policies. Therefore, while applicants should exercise due diligence and caution in applying to programs that are not CGSR compliant, they should also not presume violations will occur. The best policy is to contact nonsignatory programs directly and inquire regarding their CGSR-related policies; needless to say, some programs will welcome such queries more than others, as of late the question of the CGSR’s viability for creative writing MFA programs has been hotly contested by certain nonsignatory programs.

Any signatory to the CGSR found to be in violation of that contract will be listed as noncompliant, whether or not the program’s host college or university continues to be a CGSR signatory. Compliance inquiries are initiated on the basis of applicant self-reporting; since 2006, fully 100 percent of applicant complaints regarding programs’ CGSR-related policies have been found, following an investigation, to be meritorious. Indeed, in all but one instance the offending program ultimately confessed to the violation.

GRE Required
This category indicates whether or not a program requires applicants to submit Graduate Record Examination (GRE) General Test scores as part of their applications. Generally, programs that offer a substantial portion of incoming students some form of financial aid require these scores, and so applicants are advised to take this test prior to applying in order to avoid artificially limiting their application options. In most instances, student scores are only lightly scrutinized (or simply ignored altogether) by the programs themselves, and instead reviewed—where they are reviewed—by individual universities’ Graduate Colleges, which often have minimum GRE-score requirements (typically very generous ones). Creative writing MFA applicants should not avoid the GRE General Test for fear of the Quantitative portion of the exam; even those programs that do give minor weight to standardized test scores in their admissions processes generally look only at applicants’ Verbal and Analytical Writing scores. At present no programs require the GRE Subject Test in English Literature, though two programs (Johns Hopkins University in Baltimore and Boston University in Massachusetts) strongly suggest that applicants sit for and submit their scores from this exam. Applicants should also be aware that certain university-wide fellowships and grants require the submission of GRE scores. Applicants who do not submit such scores with their applications cannot be considered for these forms of financial aid.

Language Required
This category indicates whether or not a program requires applicants to exhibit proficiency in a foreign language prior to graduation. Some programs with a foreign-language requirement allow applicants to place out of this requirement through the submission and application of prior foreign-language course credits at the college level; other programs require that applicants take an exam (often a reading-knowledge-only translation exam) to show proficiency, regardless of their prior foreign-language experience. At present only a small minority of programs—six of the seventy-five listed in the print edition of the rankings, or 8 percent—have a foreign-language requirement as part of their curriculum. However, the category is presented here due to applicants’ great interest in, and sometimes anxiety about, such prerequisites for graduation.

Cross-genre Study
Certain MFA programs require that individuals who apply and are admitted in a particular genre take only workshops in this “declared” genre while in-program. Other programs permit, or even require, matriculated students to take out-of-genre workshops—and among this group are two further subcategories of programs, those that permit students to take as many out-of-genre workshops as they wish, and those that permit or require only a limited number of out-of-genre workshops.

The past five years of public online discussions among MFA applicants suggest that the availability of cross-genre study has become one of the top three concerns for applicants seeking additional curricular information about the programs to which they wish to apply. Many applicants already write in more than one genre and wish to have their multifaceted talents as literary artists shepherded, rather than impeded, by the programs on their chosen application list; other students are merely curious about genres other than their own, and view their in-program time as a rare opportunity to experiment with modes of literary art beyond those with which they are already conversant. A smaller—but growing—subset of the applicant pool comprises self-styled “literary artists” rather than simply “poets” or “writers,” and these individuals already incorporate so many different aesthetic traditions into their work that to be limited to either “poetry workshops” or “prose workshops” would (in their view) be a betrayal of their artistic vision. Because the availability of cross-genre study is such a prominent concern among the applicant class, it is listed as a separate category here. All data for this category were taken directly from program websites; any program that permits or requires students to take out-of-genre workshops, in whatever number, is listed in this column as a YES. Programs that explicitly prohibit such study are indicated with a NO. Because the tradition among MFA programs has been to disallow cross-genre study, programs whose websites were silent on the question were also treated as, and are listed in the rankings as, a NO for this measure.

Application Fee
The application fee column lists each program’s application fee for the most recent application cycle. These data are taken from program websites.

The relevance of these data has increased in recent years, as three distinct but related phenomena have been observed in the MFA admissions system over the past five admissions cycles. First, acceptance rates at the nation’s top programs are steadily declining. Second, applicants are responding to this trend by applying to more programs each year: the conventional wisdom in 2005 was that the average applicant should apply to eight to ten programs, whereas applicants are now regularly advised to apply to between twelve and fifteen, and more if financially feasible (see “Full-Residency Rankings: Polling Cohort Demographics [17]” for more information on contemporary applicants’ application-list mores). Third, the amount of money the average applicant has available to pay application fees has either remained steady or declined, with the vast majority of applicants reporting that they have less than $1,000 available for all MFA-application-related costs.

Given the cost of the Graduate Record Examination (GRE) General Test ($160), the cost per GRE “score report” sent to an individual program ($23), and the cost per transcript ordered from an alma mater (as high as $50 per transcript at some institutions, and rarely if ever less than $5; some applicants, particularly nontraditional and international applicants, must order transcripts from multiple alma maters), applicants are increasingly unable to afford to apply to programs with high application fees. And because of the importance of applicant polling to the current national assessment scheme, programs with higher application fees are likely to receive fewer applications per annum and thus rank lower in the overall rankings relative to their peers. The rankings’ recitation of application fees for the top fifty and Honorable Mention programs in the full-residency rankings is consequently intended to benefit programs as much as applicants; most programs regularly revisit their administrative fee schedules, and the following data (along with the worked cost example after the listing) may be of assistance when they do:

Application Fees, with a Breakdown of the Number of Top Fifty and Honorable Mention Programs Assessing Each Fee

$0: 2
$5: 0
$10: 0
$15: 0
$20: 0
$25: 0
$30: 3
$35: 3
$40: 6
$45: 1
$50: 16
$55: 8
$60: 8
$65: 9
$70: 5
$75: 7
$80: 2
$85: 1
$90: 1
$95: 1
$100: 0
$105: 0
$110: 0
$115: 0
$120: 0
$125: 2
$130: 0
$135: 0
$140: 0
$145: 0
$150: 1

De minimis fee distinctions—$1 or less—are ignored for the purposes of the above listing. As noted in the print edition of the rankings, certain programs (nine of the seventy-five that appear in the print listings, or 12 percent) offer reduced rates for early, online-applying, or online-paying applicants. Two programs assess no application fee unless and until an applicant is admitted.
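To make the arithmetic behind these concerns concrete, below is a minimal sketch, in Python, of the per-cycle cost calculation described above. The GRE and score-report figures ($160 and $23) are those cited in this section; the flat $10 transcript cost and the hypothetical twelve-program application list at the modal $50 fee are illustrative assumptions, not data from any actual applicant.

# A minimal sketch of the per-cycle application-cost arithmetic.
# Fixed fees are those cited in this section; the rest are assumptions.

GRE_TEST_FEE = 160      # GRE General Test, one sitting
SCORE_REPORT_FEE = 23   # per program sent an official GRE score report
TRANSCRIPT_FEE = 10     # assumed per-program transcript cost ($5-$50 in practice)

def total_application_cost(program_fees):
    """Estimate an applicant's out-of-pocket cost for one admissions cycle."""
    n = len(program_fees)
    return (GRE_TEST_FEE
            + n * SCORE_REPORT_FEE
            + n * TRANSCRIPT_FEE
            + sum(program_fees))

# Twelve applications, each at the modal $50 fee from the listing above:
print(total_application_cost([50] * 12))  # 1156

Even under these conservative assumptions, a twelve-program application list costs $1,156, more than the sub-$1,000 budget most polled applicants report; this illustrates why programs with higher fees may receive fewer applications.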

THE LOW-RESIDENCY RANKING CHART: ADDITIONAL PROGRAM MEASURES
Genre Availability

Genre Availability measures the number of genres in which a given low-residency program offers either a formal track or an informal concentration or focus. Because many programs occasionally offer courses in genres for which they offer no track or concentration, it can be difficult to assess which genres a program supports strongly enough to attract creative portfolios in them. The aim of the Genre Availability category is to include only those genres in which a program is likely to attract an appreciable number of applicants—as opposed to a genre offered only as an occasional brief-residency course lacking substantial nonresidency, academic-year faculty support.

Residency
Residency measures the number of days per year a low-residency student in the program is required to be on campus. While at present there is no conventional wisdom as to the value of a long residency versus a shorter one, low-residency programs' residencies are typically the only opportunity for students to interact in person with their professors and to meet their classmates; consequently, many applicants to low-residency programs may prefer longer residencies.

DOCTORAL RANKINGS
Introduction

The 2012 Poets & Writers Magazine creative writing doctoral rankings rank, in six measures, all thirty-two creative writing doctoral programs in the United States. Two additional columns are included as well, indicating each program’s CGSR compliance status (see “The Full-Residency Rankings Chart: Additional Program Measures: CGSR Compliance [18]”) and whether or not the program in question also offers a terminal degree in creative writing at the Master’s level (an MFA or an MPW). This latter category is discussed in more detail below.

All six of the ordered rankings in the creative writing doctoral chart are, as is the case with the ordered rankings in the full-residency MFA chart, unscientific, though all are predicated upon sufficient hard data to be substantially probative. Three of the six unscientific rankings are poll-based: Votes, Funding, and Selectivity. The methodologies for these rankings are consistent in all respects with their MFA-ranking counterparts (see “Full-Residency Rankings: Introduction [19],” “The Full-Residency Rankings Chart: Additional Program Measures: Funding [15],” and “The Full-Residency Rankings Chart: Additional Program Measures: Selectivity [20],” respectively), with the following exceptions: (1) votes for the doctoral rankings were tabulated in a manner and at a polling locus consistent with the low-residency MFA rankings (see “Low-Residency Rankings: Cohort [21]”) between April 16, 2007, and April 15, 2010; and (2) for the 2010–2011 admissions cycle, polling for the doctoral rankings used a Google-sponsored polling application on The Creative Writing PhD Blog, a sub-site of, and linked to only from, The Creative Writing MFA Blog. The 2010–2011 results were then cross-checked against informal polling occurring concurrently on the discussion board of The Creative Writing MFA Blog and at the Poets & Writers Speakeasy.

A different polling methodology was used for the doctoral rankings than for the full- and low-residency MFA rankings in 2010–2011 because the annual national applicant pool for creative writing doctoral programs is both minuscule and decentralized. Aggregating sufficient polling responses for a reliable tabulation has been the work of several years, and the “N” for these rankings remains slightly lower, relative to the annual national cohort of applicants for this type of degree program, than for either of the other two rankings discussed in this methodology article (see “National Doctoral Applicant Pool Size”).

As indicated in the Introduction to this article, an across-the-board scientific ranking of doctoral creative writing programs is not presently possible, as more than half of the nation's eligible graduate creative writing programs have thus far declined to make public the necessary data (see “Full-Residency Rankings: Ranking Questionnaires and Program Response Rates [12]”).

Over the last two years, the domestic and international listing of creative writing doctoral programs located in the right-hand sidebar of The Suburban Ecstasies—the largest such listing available—has been, according to a Google-sponsored internal stat-counting application, the third-most-visited page on that website. This underscores both the growing popularity of doctoral creative writing programs (prompted, most likely, by the growing number of full-residency MFA graduates seeking further creative writing study) and the fact that the vast majority of such programs are not in the United States, and thus are more difficult for American applicants to discover or investigate. According to The Suburban Ecstasies, there are currently ninety-six doctoral creative writing programs in the world, only a third of which (thirty-two) are in the United States. Nevertheless, four years of applicant polling on U.S.-based websites suggests that American applicants are highly unlikely to apply to overseas creative writing doctoral programs, the most frequently cited reason being cost: substantial financial aid for overseas doctoral applicants is rarely available at creative writing doctoral programs in the United Kingdom, the Republic of Ireland, Australia, or the Philippines.

Because no non-U.S. creative writing doctoral program received more than one vote from applicants in the polling conducted for the Poets & Writers Magazine doctoral rankings between 2007 and 2010, poll respondents for the 2010–2011 admissions cycle were not explicitly provided a listing of overseas creative writing doctoral programs on which to indicate the programs to which they had submitted applications. Instead, an “Other” category was provided, along with a full listing of U.S. creative writing doctoral programs. According to the polling, only 6 of 467 doctoral applications (1.3 percent) were submitted to an unlisted program, and not all of these were necessarily overseas creative writing doctoral programs. (Worth noting from these data, too, is that the average creative writing doctoral applicant applied to just under five programs, a substantially smaller figure than for full-residency MFA applicants; see “Full-Residency Rankings: Polling Cohort Demographics [17].”) Nevertheless, for the 2011–2012 doctoral-applicant polling, an Other (UK) option will be provided, along with an Other (US) option, as more than 70 percent of non-U.S. creative writing doctoral programs are in the United Kingdom.

Throughout the creative writing doctoral program rankings chart, all special notations used—for instance, to denote Honorable Mention and unranked programs—are identical in appearance and purpose to those used in the full- and low-residency MFA rankings (see, generally, “Full-Residency Rankings [22]” and “Low-Residency Rankings [23]").

National Doctoral Applicant Pool Size
The frequency with which each doctoral program appeared on polled fiction and poetry doctoral applicants' application lists may be determined by dividing the number of votes for a particular program in both fiction and poetry by the total number of applicants in these two genres polled during the past four admissions cycles (145). Because recent applicant-pool hard data are available for five creative writing doctoral programs (five other data sets are available, but these combine master’s and doctoral applications and so cannot readily be used for this purpose), it is possible to use a function of these two data points to estimate the size of the national creative writing doctoral pool in fiction and poetry for the 2010–2011 admissions cycle. Such an extrapolation presumes that the users of The Creative Writing MFA Blog and The Creative Writing PhD Blog were and are demographically similar to those individuals who did not use these websites to research programs during the polling period (and that those who cast votes on these websites were demographically similar to those who patronized the sites but did not vote). This unscientific sampling is necessary because (1) demographic data for all creative writing doctoral applicants is not known or knowable, and (2) there is no particular reason to suspect dramatic demographic differences between the various subgroups cited above, as The Creative Writing MFA Blog and The Creative Writing PhD Blog are public websites easily accessible from any networked computer. Likewise, because user accounts allow website patrons to manage the amount of personal information they release to the public, there is no particular reason for any subset of applicants to feel chilled from casting a vote for whichever programs they favored.

While the general tenor of discourse on The Creative Writing MFA Blog and The Creative Writing PhD Blog is consistent with the polling described above (see “Full-Residency Rankings: Polling Cohort Demographics [17]”)—for instance, these are communities that generally favor more selective programs over less selective ones, higher-ranked programs over lower-ranked ones, programs with better student-to-faculty ratios over those with worse, programs in cities and towns popular among younger Americans over those in less talked-about locales, funded programs over unfunded ones, and programs of longer duration over shorter ones—these attitudes are consistent with the present conventional wisdom expounded at length in most recent media accounts of graduate creative writing programs, as well as with the sort of advice about important program features that college and graduate creative writing professors give their students every day. There appears to be nothing remarkable about the demographics of those who patronize free, public, lightly moderated websites like The Creative Writing MFA Blog and The Creative Writing PhD Blog.

To arrive at national applicant-pool estimates the following equation was used (for more detail on this equation, see “Full-Residency Rankings: National Full-Residency Applicant Pool Size [24]”):

(145 divided by the number of fiction and poetry votes received by a program in 2006–2011 applicant polling) multiplied by (the number of fiction and poetry applicants reported by that program during the 2010–2011 admissions cycle)

Using the equation above, the following national doctoral-applicant pool size extrapolations were made:

University of Cincinnati in Ohio (149)
Texas Tech University in Lubbock (250)
University of Nebraska in Lincoln (307)
University of Denver in Colorado (453)
University of Southern California in Los Angeles (659)

The most substantial outlier here, the extrapolation attributed to the University of Southern California in Los Angeles, may be partly explained by the presence of MPW applicants as well as doctoral applicants in the admissions hard data used for the extrapolation. None of the other four programs offers a terminal master’s degree in creative writing.
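For readers who wish to reproduce the arithmetic, below is a minimal sketch, in Python, of the extrapolation equation above. The constant 145 is the polled fiction and poetry cohort cited in this section; the vote and applicant counts in the example are hypothetical placeholders, not the actual figures behind the five published estimates.

# A minimal sketch of the pool-size extrapolation described above.
# TOTAL_POLLED is the 145 fiction and poetry doctoral applicants polled
# over the past four admissions cycles; the example inputs are hypothetical.

TOTAL_POLLED = 145

def estimate_national_pool(program_votes, reported_applicants):
    """(145 / a program's fiction and poetry poll votes) multiplied by the
    fiction and poetry applicants the program reported for 2010-2011."""
    return TOTAL_POLLED / program_votes * reported_applicants

# e.g., a hypothetical program that drew 20 poll votes and reported 40 applicants:
print(round(estimate_national_pool(20, 40)))  # 290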

While the data above are too sparse to permit precise estimates of the annual national applicant pool for creative writing doctoral programs, they do suggest that the total figure is likely well under one thousand, meaning that the 2012 Poets & Writers Magazine creative writing doctoral program rankings have surveyed, for their overall popularity measure, more than 15 percent of an annual cohort of such applicants. This compares favorably to the corresponding percentages for full- and low-residency programs (see “Full-Residency Rankings: Introduction” and “Low-Residency Rankings,” above).

THE DOCTORAL RANKINGS CHART: ADDITIONAL PROGRAM MEASURES
Creative Writing Job Placement

The methodology for this measure was identical to that employed for full- and low-residency MFA programs (see “The Full-Residency Rankings Chart: Additional Program Measures: Job Placement [25]”) with the exception that job placement “events” involving graduates of creative writing doctoral programs were used to create a separate and distinct job placement ranking for such programs. Necessarily, the “N” for this measure was much lower than for full-residency programs, though also much higher than for low-residency programs, whose graduates usually do not enter the academic job market upon graduation due to their in-program employment status. In all, 33 of the 145 hiring events analyzed between 2008 and 2011 (23 percent) involved individuals with creative writing doctoral degrees.

Departmental Job Placement and Departmental Reputation
In 2010 the National Research Council (NRC), a federally funded entity, released its first comprehensive analysis in fifteen years of the nation’s doctoral programs in the humanities, social sciences, and sciences. NRC analyses are generally considered the “gold standard” for doctoral rankings in the United States, though the NRC has never assessed creative writing doctoral programs in its more than a quarter century of providing roughly once-per-decade assessments of the nation’s doctoral degrees. Data collection for the 2010 report began in June 2006, with the final release of the report delayed several times between 2008 and 2010.

All of the 2010 NRC data are available on www.phds.org [26]. On this site, prospective doctoral-program applicants can not only view NRC data but also use an online interface to rank colleges and universities’ constituent graduate departments by various measures. Three measures available to prospective doctoral applicants in all fields are “Academic Quality: NRC Survey-Based Quality Score” (a statistical rating of departments’ academic quality based on a questionnaire sent to individuals working in the same field as the department being assessed), “Academic Quality: NRC Regression-Based Quality Score” (a statistical rating of departments’ academic quality based on a multi-variable regression analysis conducted by professional statisticians), and “Placement Rate” (described by the NRC as a measure calculating the percentage of “doctoral recipients [who] have a job or a postdoctoral position…at graduation”). Note that, as to this last measure, no distinction was made by the NRC between full-time academic employment for doctoral-program graduates and employment of any other kind. In contrast, the “Creative Writing Job Placement” measure in the 2012 Poets & Writers Magazine doctoral-program rankings considers only graduates’ success in securing full-time employment in academic positions in higher education.

For the Departmental Job–Placement category, the NRC “placement rate” rankings were reconfigured to include only programs that offer a creative writing doctoral degree in fiction and poetry. Prospective applicants must nevertheless remember that these are departmental, not program-specific, data: the rankings correspond to the performance of entire English departments, not merely the creative writing students within them. These data are provided because job-placement data specifically relating to creative writing doctorate-holders are unavailable for nearly all creative writing doctoral programs in the United States and abroad. Consequently, NRC placement-rate data specific to individual English departments are the best available alternative for applicants to creative writing doctoral programs interested in prospective programs’ postgraduate job-placement records.

For the Departmental–Reputation category, the two NRC “Quality Measures” (one survey-based, one regression-based) were averaged—with each score given equal weight—to produce a single score for each of the thirty-two creative writing doctoral programs in the United States. These scores were then ordered automatically by the www.phds.org [26] user interface. The purpose of this category is to give prospective doctoral-program applicants some insight into the academic pedigree of the universities to which they plan to apply. The value of a creative writing doctoral degree on the academic job market is as yet unproven, due to the relatively low number of such graduates on the market (see “The Doctoral Rankings Chart: Additional Program Measures: Creative Writing Job Placement”); however, because university hiring committees are often organized by department, not by subject area, many graduates of creative writing doctoral programs will ultimately have their job applications considered by English department faculty familiar with the reputations of various English departments but not with those departments’ graduate creative writing tracks. Consequently, measuring the respective pedigrees of the thirty-two English departments with such tracks is a useful exercise for creative writing doctoral-program applicants.
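The averaging and ordering steps are simple enough to state in a few lines. Below is a minimal sketch, in Python, with hypothetical department names and scores; the actual NRC scores are available at www.phds.org.

# A minimal sketch of the Departmental-Reputation computation: the NRC
# survey-based and regression-based quality scores are averaged with
# equal weight, then departments are ordered by the result.
# All names and scores below are hypothetical placeholders.

nrc_scores = {
    "Department A": (4.1, 3.7),  # (survey-based, regression-based)
    "Department B": (3.9, 4.3),
    "Department C": (3.2, 3.4),
}

reputation = {dept: (survey + regression) / 2
              for dept, (survey, regression) in nrc_scores.items()}

for dept, score in sorted(reputation.items(), key=lambda kv: kv[1], reverse=True):
    print(dept, round(score, 2))
# Department B 4.1
# Department A 3.9
# Department C 3.3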

More information about the NRC rankings (formally known as the United States National Research Council Assessment of Research Doctorate Programs) can be found at the Council’s website, sites.nationalacademies.org/PGA/Resdoc [27]. Available for free on that website is the Council’s methodology document, A Guide to the Methodology of the National Research Council Assessment of Doctorate Programs, published in July 2009.

Offers MFA/MPW
Creative writing doctoral program applicants are divided on whether it is preferable for a creative writing doctoral program to also offer a terminal master’s degree in creative writing, and for this reason data on such “bi-terminality” at certain doctoral programs are provided in the 2012 Poets & Writers Magazine doctoral rankings. Some applicants report preferring English departments that offer two different terminal degrees in creative writing, both because this suggests a high level of support for creative writing on the part of the host institution and because it necessarily means a larger and more vibrant in-program creative writing community than would otherwise be the case. Other applicants believe that asking creative writing doctoral students to workshop alongside individuals who have not yet earned a terminal degree in creative writing dilutes cohort quality and unsuitably alters a program’s workshop dynamic. Conversely, some MPW and MFA applicants may be interested in these data because programs that offer doctoral degrees in creative writing alongside master’s degrees in creative writing permit generally younger, less experienced students to workshop with individuals who are already performing at a high level within their genre. On the value of “bi-terminal” English departments the rankings take no position; these data are provided solely to offer prospective master’s and doctoral applicants in creative writing more information about prospective programs than they would otherwise have.

Seth Abramson is the author of Northerners (Western Michigan University Press, 2011), winner of the 2010 Green Rose Prize from New Issues Poetry & Prose, and The Suburban Ecstasies (Ghost Road Press, 2009). A contributing author to The Creative Writing MFA Handbook (Continuum, 2008) and a 2008 recipient of the J. Howard and Barbara M. J. Wood Prize from Poetry, his work has recently appeared in Best New Poets 2008, American Poetry Review, Boston Review, New York Quarterly, and elsewhere. Currently a doctoral candidate in English at the University of Wisconsin in Madison, he is a graduate of Harvard Law School and the Iowa Writers' Workshop.


Source URL: https://www.pw.org/content/2012_mfa_rankings_the_methodology

Links
[1] https://www.pw.org/content/2012_mfa_rankings_the_methodology
[2] https://www.pw.org/content/septemberoctober_2011
[3] http://www.pw.org/content/2012_mfa_rankings_the_top_fifty
[4] http://www.pw.org/content/2012_mfa_rankings_the_lowresidency_top_ten
[5] http://www.pw.org/content/2012_creative_writing_doctoral_program_rankings_the_top_fifteen
[6] http://www.pw.org/content/2012_mfa_rankings_the_methodology?article_page=3#Nonmfa
[7] http://www.pw.org/content/septemberoctober_2011
[8] http://www.pw.org/content/2012_mfa_rankings_full_residency_programs
[9] http://creative-writing-mfa-handbook.blogspot.com/
[10] http://sethabramson.blogspot.com/
[11] http://www.pw.org/content/2012_mfa_rankings_the_methodology#PollingLocus
[12] http://www.pw.org/content/2012_mfa_rankings_the_methodology#RankingQuestionnaires
[13] http://www.pw.org/speakeasy/
[14] http://www.pw.org/content/2012_mfa_rankings_the_methodology?article_page=3#LowResPoolSize
[15] http://www.pw.org/content/2012_mfa_rankings_the_methodology?article_page=3#Funding
[16] http://www.bestplaces.net/COL/default.aspx
[17] http://www.pw.org/content/2012_mfa_rankings_the_methodology#pollingcohort
[18] http://www.pw.org/content/2012_mfa_rankings_the_methodology?article_page=4#CGSR
[19] http://www.pw.org/content/2012_mfa_rankings_the_methodology
[20] http://www.pw.org/content/2012_mfa_rankings_the_methodology?article_page=3#selectivity
[21] http://www.pw.org/content/2012_mfa_rankings_the_methodology?article_page=2#Cohort
[22] http://www.pw.org/content/2012_mfa_rankings_the_methodology#Full-ResidencyRankings
[23] http://www.pw.org/content/2012_mfa_rankings_the_methodology?article_page=2#Low-ResidencyRankings
[24] http://www.pw.org/content/2012_mfa_rankings_the_methodology?article_page=2#FullPoolSize
[25] http://www.pw.org/content/2012_mfa_rankings_the_methodology?article_page=4#job
[26] http://www.phds.org/
[27] http://sites.nationalacademies.org/PGA/Resdoc/