
2011 Poets & Writers Magazine Ranking of MFA Programs: A Guide to the Methodology

by Seth Abramson
September/October 2010

FULL-RESIDENCY RANKINGS

Introduction
The 2011 Poets & Writers Magazine MFA rankings comprise individual rankings for both full-residency and low-residency programs. The full-residency programs are assessed on the basis of sixteen measures, half of which are ordered rankings and half of which are unranked recitations of important program features. All eight of the full-residency rankings-based measures are unscientific, though all eight are predicated upon sufficient hard data to be substantially probative. A scientific ranking of MFA programs is not presently possible, as more than half of the nation's full- and low-residency programs have thus far declined to make public the necessary data (see below).

Four of the eight full-residency rankings are based upon unscientific polling of a large sample of current MFA applicants. These rankings are discussed in significant detail throughout this article. The most important of the four is the ranking upon which the ordering of the programs in the chart is based, a ranking predicated upon individual fiction and poetry applicants' varying esteem for the nation's 148 full-residency MFA programs. The remaining three poll-based "genre" rankings are essentially subsets of this first ranking, offering data on various elements of the overall cohort polled: fiction applicants, poetry applicants, and nonfiction applicants. Programs are ordered, as with the "overall" rankings, on the basis of the number of votes received by each MFA program in that category. Polled respondents cast a "vote" by stating a present or future intent to apply to the program in question. The top fifty "overall" vote-getters are listed in the rankings chart, also published in the September/October 2010 print edition of Poets & Writers Magazine (with two programs tied for fiftieth); the remaining 97 MFA programs are listed in "The Additional Rankings of Full-Residency MFA Programs."
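To make the vote-count mechanism concrete, the following is a minimal sketch in Python of the ordering just described; the application lists shown are invented for illustration and do not come from the actual polling data.

from collections import Counter

# Each polled applicant's application list is treated as a slate of votes:
# one vote per program listed. The lists below are hypothetical.
application_lists = [
    ["Iowa", "Michigan", "Cornell"],
    ["Iowa", "Virginia"],
    ["Michigan", "Iowa"],
]

votes = Counter(program for lst in application_lists for program in lst)

# Programs are ordered by total votes received, as in the "overall" ranking.
for rank, (program, count) in enumerate(votes.most_common(), start=1):
    print(rank, program, count)   # Iowa 3, Michigan 2, Cornell 1, Virginia 1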

As to the genre rankings, programs ranking in the top fifty in poetry and fiction are noted in both the print and online rankings charts, as are programs ranking in the top twenty in nonfiction.

The four hard data-based rankings are as follows: total funding, annual funding, selectivity, and placement. These rankings are scientific to the extent that they rank programs on the basis of quantitative data publicly released by the programs themselves, though they are unscientific to the extent that not every program has released data for every category of assessment. The rankings therefore constitute an ordering of all publicly known data rather than an ordering of all extant data. A full complement of funding and admissions data is available for approximately half of the nation's full-residency MFA programs; the remaining programs are primarily smaller, newer, lightly advertised, or nondomestic programs, or else programs with a primarily regional applicant base. As all of these programs have Web sites, however, and as all of these programs exert exclusive dominion over their online presence, the absence of any specific funding or selectivity data in these programs' online promotional materials is taken, by the rankings, as an indication that these programs fully fund less than 33% of their students and do not have an acceptance rate low enough for inclusion in the top 50 in this category (currently, a program's yield-exclusive acceptance rate would need to be less than 11.7% for it to be included in the selectivity ranking). The rankings are based in part on the presumption that it would be counterintuitive for a program providing full funding to a substantial percentage of its student body to not indicate as much in its promotional materials. Program Web sites are regularly reviewed to determine whether a program has added information to its online profile; program administrators can also e-mail the author of this methodology article to draw attention to any substantive Web site changes.

Based on the data presently available, it is not anticipated that any of those programs without a full complement of funding and admissions data available in some form online would have ranked in the top 50 in either of the two funding categories. These programs, given the incompleteness of their promotional materials, are also much less likely to attract sufficient applications to be eligible for the selectivity rankings; a program must receive at least 100 applications annually to be considered eligible for the ranking in this category. As to the placement rankings, these do not rely on programs' promotional materials or their willingness to release internal data to individual applicants or groups of applicants, so all programs nationally, both full- and low-residency, were equally eligible for a top 50 ranking.

The overlap between those programs ranked in the top 50 overall and those programs ranked in the top 50 in the other seven categories subject to ranking is significant. Ninety-eight percent of the overall top 50 programs ranked in the top 50 in one or both of the fiction and poetry genres—and the one top 50 program that failed to achieve this status missed the cut by one vote. Forty-four of the overall top 50 (86%) ranked in the top 50 in both poetry and fiction. In nonfiction, 20 of the top 30 nonfiction programs (67%) also ranked in the overall top 50.

Thirty-two (63%) of the overall top 50 ranked in the top 50 in funding, with another seven (14%) receiving an Honorable Mention (see below for definitions). In all, 77% of the top 50 full-residency programs ranked in the top 50 for funding or received an Honorable Mention in this measure of program quality. Forty-six (90%) of the top 50 programs ranked in the top 50 in selectivity, with 36 (71%) ranking in the top 50 in placement. Of the 29% of the top 50 MFA programs that did not rank in the top 50 for placement, nearly two-thirds were hampered by the fact that they were founded in the midst of the twelve-year assessment period for this measure. Programs disadvantaged in this way include the programs at University of Wyoming in Laramie, University of Mississippi in Oxford, University of Illinois in Urbana-Champaign, University of Nevada in Las Vegas, Vanderbilt University in Nashville, Tennessee, Louisiana State University in Baton Rouge, The New School in New York City, Virginia Polytechnic Institute [Virginia Tech] in Blacksburg, and Purdue University in West Lafayette, Indiana.

In view of the above, ordering programs on the basis of their overall vote totals also had the effect of placing a special emphasis, in the rankings, on those programs that placed highest in the four hard data rankings.

In reading the rankings and this methodology article, several principles should be kept in mind:

(1) MFA programs are not for everyone, and many poets and writers will find their energies better spent elsewhere as they attempt to explore and augment their existing talents;

(2) no poet or writer should feel that they must attend an MFA program, whether such a concern is related to employment, networking, or personal artistic improvement and achievement;

(3) MFA students must remain on guard against sacrificing their unique aesthetic, political, and cultural perspectives on the altar of consensus, as MFA programs are ideally venues for an exchange of diverse opinions, not hothouses for groupthink or aesthetic dogmatism;

(4) an MFA in no way guarantees postgraduate employment, as the MFA is a nonprofessional, largely unmarketable degree whose value lies in the time it gives one to write, not in any perceived (and illusory) advantage it may offer in the networking, publishing, or employment arenas;

(5) in view of the preceding, it is unwise to go into any debt for an MFA degree;

(6) holding an MFA degree does not, in itself, make one more or less likely to be a successful poet or writer, nor should those with MFA degrees consider themselves in any respect better equipped, purely on the basis of their degree, for the myriad challenges of a writing life;

(7) the MFA, as an art-school degree, is not time-sensitive, and many poets and writers will find the experience of an MFA more rewarding if they have first pursued, for several years, other avenues of self-discovery and civic engagement; and

(8) the MFA rankings are not intended to increase applicant anxiety, reduce applicants' application and matriculation decisions to a numbers game, or define prestige as a function of pedigree rather than program factors that genuinely enrich the lives of real poets and writers (e.g., funding, a strong cohort, strong teaching, a vibrant and welcoming location and community); their aim, instead, is to maximize the information at applicants' fingertips.

The hope is that these rankings will better position applicants to make an important life choice, a choice that must finally be made using the rankings as only a secondary resource. Specifically, the goals of these rankings and their methodology are the following: less overall student debt among MFA graduates; more transparency in the promotional materials and public disclosures of existing MFA programs; and greater access, for applicants, to the wealth of conventional wisdom in the MFA applicant community about which programs and which program features are most conducive to a memorable and valuable MFA experience. Ideally, the MFA offers aspiring poets and writers several years of funded time to write in a mutually inspiring community; to the extent some may see in the MFA unresolved dangers for the future of American poetry and fiction, these rankings are as committed, in their own way, to the avoidance of these dangers as are those who have argued passionately for the abolition of the MFA degree altogether. A better-funded and more transparent national MFA system will be of greater benefit to artists in the long run than the wholesale termination and dismantling of the system.

Cohort
In the nine months between July 15, 2009, and April 15, 2010, 527 full-residency MFA applicants were polled on the highest-trafficked MFA-related Web site on the Internet, The MFA Blog. Founded on August 21, 2005, this Web site received 410,000 unique visitors during the polling period, including 706,000 page-loads, 276,000 first-time visitors, and 134,000 returning visitors. (The site's StatCounter.com stat-counter was operational as of August 17, 2009; consequently, the actual Web-traffic during the polling period was higher than is listed here.)

The MFA Blog is a free, public, moderated discussion blog whose only requirement for viewing is access to a computer; active participation on the board requires a Google account. The site is run by American novelist Tom Kealey and a team of more than twenty designated moderators, approximately five of whom are active at any one time. The author of this article was a moderator at The MFA Blog for a portion of the polling period. Kealey himself was not an active moderator during this period. The Web site has no stated agenda other than to provide accurate and timely information about MFA programs to current and prospective applicants.

Online polling conducted in 2009 using a Google-sponsored polling application suggests that the online MFA applicant community, including the community at The MFA Blog, subscribes to the current conventional wisdom (as first laid out in the 2005 edition of Kealey's Creative Writing MFA Handbook) regarding the most important considerations in applying to and matriculating at an MFA program. Specifically, polling of more than 250 current applicants to MFA programs revealed the following:

· Asked, "Which of these is most important to your decision about where to apply?", and given the options "Location," "Funding," "Faculty," "Reputation," "Selectivity," "Curriculum," or "None of the Above," with the option to select more than one answer, the top four answers were as follows: Funding, 56%; Reputation, 45%; Location, 32%; Faculty, 18%; and

· Asked, "Why do you want to get a graduate creative writing degree?", and given the options "Credential," "Employability," "Time to Write," "Mentoring," "Networking," "Community," "Validation," "Avoid Work," and "None of the Above," with the option to select more than one answer, the top three answers were as follows: Time to Write, 55%; Employability, 43%; and Mentoring, 36%.

The Poets & Writers Magazine rankings have not, to date, used the above polling data to create a weighting system for the overall rankings. There is a presumption that applicants' own application lists best reflect the extent to which they take into account funding, location, reputation, selectivity, faculty, curriculum, and other applicant-specific factors in choosing which programs to apply to and attend.

Were the above polling used to create a weighting system for the rankings, many of the nation's most prominent and popular programs would drop from the top 50 rankings altogether. The result would be a series of rankings that poorly reflected the present national consensus on program quality. For instance, under the rankings' current methodology a popular but largely unfunded MFA program in a major urban center might yet appear in the top 50 because even a low standing in the funding, selectivity, and placement categories can be counterbalanced by a program's popularity due to location. The popularity of a program's location is best reflected by privileging applicants' application lists rather than a confluence of these lists and scientifically gathered, publicly accessible hard data. To redesign the overall rankings as something other than a direct reflection of current applicant mores would be to ensure that no non-fully-funded and/or big-city program (with only one or two exceptions) would appear in the overall top 50.

While current trends suggest that program popularity going forward will be directly affected by a high or low standing in the funding, placement, and selectivity categories, the pace of this trend is arrested, rather than hastened, by the current ranking methodology. Whereas a weighted ranking system focusing on hard funding, selectivity, and placement data would remove most large-cohort urban programs from the national rankings immediately, the present methodology both registers the relative decline or stagnation in the popularity of such programs and ensures that these programs have sufficient time to improve their funding, selectivity, and placement statistics before they are removed, by applicant consensus, from the top 50 altogether.

Polling Locus
Spring 2010 Google Web searches for the individual terms "creative + writing + MFA," "CW + MFA," "poetry + MFA," "fiction + MFA," "MFA + questions," "creative + writing + MFA + blog," and "MFA + blog" returned The MFA Blog as the top worldwide hit in each instance. Several other contemporaneous searches returned the site among the top five worldwide hits: "MFA + program"; "MFA + applicant"; "MFA + application"; "nonfiction + MFA"; "MFA + resource"; and "MFA + response + times." Given the visibility of the site for online-researching MFA applicants, the extended duration of the polling period, and the regularity with which the polling question regarding applicants' application lists was posed, a correlation is presumed between that group of MFA applicants who used online research tools during the 2009–10 application cycle and that group of applicants at least casually conversant with The MFA Blog. Tom Kealey, the proprietor of The MFA Blog, is also the author of the top-selling MFA-related book in the United States, per Amazon sales statistics recorded during the polling period. This book, The Creative Writing MFA Handbook, prominently features the Web address for The MFA Blog. Consequently, even those who conducted their MFA research via print publications were arguably likely to come across the Web address for The MFA Blog during the course of their reading. Indeed, as Kealey's book is the only print publication on the American or international market that profiles individual full-residency MFA programs in detail, it has become nearly ubiquitous in the MFA applicant community.

Individual users on The MFA Blog were distinguished by their user accounts, and substantial additional measures were taken to prevent duplicate submissions. During the polling period the number of individual accounts active on The MFA Blog was between 1,000 and 1,500, which suggests that the present polling's 527-person cohort represents between one-half and one-third of all active patrons on the site during the nine-month period in question. The presence of an unknown number of nonposting members on the site helps explain the high unique visitor count cited above, as well as the fact that even the most respected stat-counter services will sometimes read returning users as first-time users, depending upon an individual user's privacy settings with respect to IP-recognition cookies.

Polled applicants were asked to list the programs to which they had applied or intended to apply, and were permitted to adjust these lists during the polling period. Fewer than 10% of poll respondents elected to do so.

National Full-Residency Applicant Pool Size
The frequency with which each full-residency MFA program appeared on polled fiction and poetry applicants' application lists may be determined by dividing the number of votes for a particular program in both fiction and poetry by the total number of applicants in these two genres polled during the 2009–10 admissions cycle. Because recent applicant-pool hard data is available for 63 full-residency MFA programs, it is possible to use a function of these two data-points to estimate the total number of applications received in fiction and poetry by full-residency MFA programs. While such an extrapolation presumes that the users of The MFA Blog were and are demographically similar to those individuals who did not use The MFA Blog to research programs during the polling period (and that those who cast votes on The MFA Blog were demographically similar to those patrons who did not), such unscientific sampling is necessary because (1) demographic data for all full- and low-residency applicants is not known or knowable, and (2) there is no particular reason to suspect dramatic demographic differences between the various subgroups cited above, as The MFA Blog is a public Web site easily accessible by networked computer. Likewise, because user accounts allow Web site patrons to manage the amount of personal information they release to the public, there is no particular reason for any subset of applicants to feel chilled from casting a vote for whichever programs they favored. While the general tenor of discourse on The MFA Blog is consistent with the polling described above—for instance, it is a community that generally favors funded over unfunded programs—these attitudes are consistent with the present conventional wisdom expounded upon at length in most recent media accounts of the creative writing MFA. There appears to be nothing remarkable about the demographics of those who patronize a free, public, lightly moderated Web site like The MFA Blog.

In a document released in 2009, the Association of Writers and Writing Programs (AWP) estimated that there are "more than 13,000 applicants to [full- and low-residency] MFA programs each year." Data collected for the 2011 Poets & Writers Magazine rankings indicate that this estimate is likely incorrect. While no one knows for certain the total number of applicants annually to full-residency programs in the United States, based on the available data the present median estimate for the annual applicant pool for full-residency programs is 3,116 applicants. The mean estimate is 3,478; subtracting two substantial outliers from the 63 program-based data-points available results in an adjusted mean of 3,276. Similar calculations, using data collected in 2008 and 2009, produced similar results, with program-data-based estimates ranging from 2,500 to 4,000 annual applicants to full-residency MFA programs. These numbers suggest that the 2011 Poets & Writers Magazine rankings polled more than 15% of the annual applicant pool to full-residency MFA programs.

As noted in the introductory article for the print edition of the 2011 rankings, while the rankings are not scientific, they are probative. Whereas scientific rankings (which require demographic data that is, in this case, unavailable both to independent researchers and to national trade organizations) traditionally poll, at the state level, well under a hundredth of one percent of their target population, and national polls typically sample well under a thousandth of one percent, the sample here, as a percentage of the target population, is between 1,500 and 15,000 times larger, a concession to the necessarily unscientific nature of the polling.

To arrive at national applicant-pool estimates the following equation was used:

(527 / number of fiction and poetry votes received by a program in 2009–10 applicant polling) x (number of fiction and poetry applicants reported by that program during the most recent admissions cycle for which data is available)

Using the equation above, it was determined that 36 of the 63 program-data-based estimates for the annual full-residency applicant pool (57.1%) fell within 1,000 applicants of the adjusted mean of 3,276, and 51 estimates (81.0%) fell within 1,500.
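For readers who wish to check the arithmetic, here is a minimal sketch in Python of the extrapolation and the deviation test described above; the constants come from this article, while the example program's vote and application counts are hypothetical.

POLL_COHORT = 527      # fiction and poetry applicants polled in 2009-10
ADJUSTED_MEAN = 3276   # adjusted mean estimate of the national pool

def national_pool_estimate(votes, reported_applications):
    # (527 / votes received in the polling) x (fiction and poetry
    # applications the program reported in its most recent cycle)
    return (POLL_COHORT / votes) * reported_applications

# Hypothetical program: 40 of the 527 respondents listed it, and it
# reported 240 fiction and poetry applications.
estimate = national_pool_estimate(40, 240)
print(round(estimate))                          # 3162
print(abs(estimate - ADJUSTED_MEAN) <= 1000)    # True: within 1,000 of the mean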

The popularity of programs whose extrapolated national-applicant-pool estimates significantly exceed the adjusted mean may well be under-tabulated (ranked lower than what they would have been had the entire national applicant pool been polled) by the polling done for the 2011 Poets & Writers Magazine rankings; conversely, programs whose extrapolated applicant-pool estimates fall significantly below the adjusted mean may well be over-tabulated (ranked higher than what they would have been had the entire national applicant pool been polled). These under-tabulations and over-tabulations are not random; they reflect the fact that those applicants less likely to have been exposed to the present conventional wisdom regarding MFA applications on sites like the MFA Blog are consequently more likely to apply to short-duration, poorly-funded programs in high cost-of-living urban areas. The current conventional wisdom among the online-researching MFA applicant community is that it is advisable to apply to longer-duration, well-funded programs in lower cost-of-living areas. To the extent the polling conducted at The MFA Blog favors better-funded programs, this bias is a conscious mirror-imaging of the bias of the most well-researched MFA applicants, and not an inadvertent byproduct of the rankings' methodology.

Of the 17 programs listed below with the highest upward deviation from the adjusted mean—the programs most likely to have been under-tabulated by this ranking in comparison to known applicant-pool figures—not one is fully-funded. Not one is half fully-funded. Not one is three years in duration; one, in fact, is only a single year in duration. Thirteen of the seventeen (76.5%) are located in one of six high-cost-of-living locales: Boston, Chicago, Los Angeles, New York City, San Francisco, and Washington, D.C. The remaining four have other, distinct reasons for possible under-tabulation, including, variously, noncompliance with the CGSR Resolution, a significant consideration amongst MFA applicants in The MFA Blog community; significantly higher placement in the mid-1990s rankings of creative writing programs no longer used by patrons of The MFA Blog; and an appeal and notoriety based in part on factors other than the quality of the university's MFA program. For instance, a program with an extremely popular creative writing doctorate program might receive disproportionately more word-of-mouth among those who do not research programs via an MFA-focused community online.

Of the 17 programs with the largest downward deviation from the adjusted mean—the programs most likely to have been over-tabulated by this ranking in comparison to known applicant-pool figures—thirteen (76.5%) are fully funded. Of the remaining four programs, one was advertised as fully funded at the time the polling for these rankings was conducted, one fully funds all admittees as to tuition but does not offer assistantships to all accepted students, and one ranks among the better-funded larger programs in the United States—the conventional wisdom among online-researching applicants being that it is advisable to apply to at least one slightly larger, lower-selectivity program. Of these 17 programs, more than half are three years in duration, with one program (the most presumptively over-tabulated program) being four years in length. Other than Tucson, Arizona (pop. 542,000), the largest host locale amongst these 17 programs is Greensboro, North Carolina (pop. 258,000).

Below are national applicant-pool estimates, derived from the polling data, for the 63 programs with available annual admissions statistics, ranked from the lowest estimates for the national full-residency applicant pool in fiction and poetry (programs more likely to be over-tabulated) to the highest (programs more likely to be under-tabulated). To reiterate, the number in parentheses represents an estimate of the total full-residency MFA applicant pool for 2009–10 based on the percentage of the 527 polled who voted for that school and the actual number of applications that school reportedly received.

University of Alabama in Tuscaloosa (1,347)

University of Notre Dame in Indiana (1,561) *

Virginia Polytechnic Institute [Virginia Tech] in Blacksburg (1,735)

University of Illinois in Urbana-Champaign (1,824)

University of Arizona in Tucson (1,829) *

Purdue University in West Lafayette, Indiana (1,957)

University of Maryland in College Park (1,991)

University of North Carolina in Greensboro (2,012)

University of Wyoming in Laramie (2,029) *

University of Florida in Gainesville (2,076)

University of Mississippi in Oxford (2,147)

Syracuse University in New York (2,178)

Bowling Green State University in Ohio (2,245)

Louisiana State University in Baton Rouge (2,272)

Indiana University in Bloomington (2,314)

Ohio State University in Columbus (2,436) *

Southern Illinois University in Carbondale (2,441)

University of California in Irvine (2,489)

Arizona State University in Tempe (2,531)

Johns Hopkins University in Baltimore (2,577)

University of Wisconsin in Madison (2,636)

Hollins University in Roanoke, Virginia (2,729)

University of Arkansas in Fayetteville (2,792)

Portland State University in Oregon (2,812)

University of Idaho in Moscow, Idaho (2,863)

University of Minnesota in Minneapolis (2,907)

George Mason University in Fairfax, Virginia (2,920) *

University of Montana in Missoula (2,929) *

University of Michigan in Ann Arbor (2,976)

University of Houston in Texas (3,024)

North Carolina State University in Raleigh (3,088)

New York University in New York City (3,116) [median]

University of Iowa in Iowa City (3,138)

Rutgers University in Newark, New Jersey (3,281)

University of Texas in Austin (3,294)

University of Virginia in Charlottesville (3,398)

Vanderbilt University in Nashville (3,404)

Texas State University in San Marcos (3,426)

Brown University in Providence, Rhode Island (3,477)

Washington University in Saint Louis, Missouri (3,529)

University of Nevada in Las Vegas (3,556)

California College of the Arts in San Francisco, California (3,570)

Pennsylvania State University in University Park (3,606)

University of Massachusetts in Amherst (3,667)

Cornell University in Ithaca, New York (3,699)

University of Oregon in Eugene (3,800)

University of North Carolina in Wilmington (3,904)

CalArts in Valencia, California (3,919)

American University in Washington, D.C. (4,287)

Hunter College in New York City (4,304) *

Boston University in Massachusetts (4,367)

Columbia University in New York City (4,385) *

Brooklyn College in New York (4,417)

University of Washington in Seattle (4,546)

University of San Francisco in California (5,217) *

School of the Art Institute of Chicago in Illinois (5,950)

Sarah Lawrence College in Bronxville, New York (5,961)

The New School in New York City (5,969) *

Florida State University in Tallahassee (6,337)

Emerson College in Boston, Massachusetts (6,610)

San Francisco State University in California (7,572)

Saint Mary's College of California in Moraga (8,400)

Rutgers University in Camden, New Jersey (10,894)

* = Publicly-released applicant-pool data included three genres. A two-genre estimate—using the national-average 6:3:2 distribution of fiction, poetry, and nonfiction applications—has been used to generate this extrapolated figure.

The variation in the figures above reflects the differing practices of applicants who conduct substantial research into programs via online MFA-applicant communities and those who do not. The list reflects that, for example, Rutgers University in Camden, New Jersey, is probably more popular among the total national applicant pool than it is among the 527 users polled on The MFA Blog. That the Iowa Writers' Workshop, whose reputation and name-recognition in the field of graduate creative writing is the most likely of any program to be equivalent across all applicant groups, is a mere 22 applicants off the median estimate of 3,116 suggests that the Writers' Workshop was the most "neutrally tabulated" program in these rankings, as no obvious reason exists for individual groups of applicants to be more or less familiar with the much-lauded 75-year-old program.

Several other credibly funded programs with long-standing national reputations in print, online, and through word of mouth are likewise exceedingly close to the median estimate of the national applicant pool cited above, including New York University in New York City (0% off the median), University of Houston in Texas (2.95%), University of Michigan in Ann Arbor (4.49%), University of Texas in Austin (5.71%), University of Montana in Missoula (6.00%), and University of Virginia in Charlottesville (9.05%).

As the annual applicant-pool estimates provided above relate only to fiction and poetry applications, the traditional 6:3:2 genre ratio (see "Genre Rankings: Cohort," below) can be used to estimate the median and mean number of nonfiction applicants per annum: 567 (median), 632 (mean), and 596 (adjusted mean). These figures are derived directly from the median, mean, and adjusted mean calculations for full-residency fiction and poetry programs (see above). These estimates cross-check, broadly speaking, with estimates extrapolated from programs with known nonfiction admissions data: University of Iowa (whose admissions figures produce an estimate of 248 annual applications in nonfiction nationwide); University of Wyoming in Laramie (253); Sarah Lawrence College in Bronxville, New York (561); and Rutgers University in Camden, New Jersey (842). Further confirmation is provided by programs whose three-genre applicant pools are known but for which a breakdown by genre is unavailable. The nonfiction applicant pool for these programs can be estimated using the 6:3:2 ratio. The result is a series of estimates from the following programs: University of Pittsburgh in Pennsylvania (whose admissions figures produce an estimate of 101 annual applications in nonfiction nationwide); University of Arizona in Tucson (119); Eastern Washington University in Cheney (154); University of Notre Dame in Indiana (196); George Mason University in Fairfax, Virginia (259); University of Alabama in Tuscaloosa (345); Ohio State University in Columbus (463); University of San Francisco in California (556); Hunter College [CUNY] in New York City (575); and Columbia University in New York City (688).

While these estimates cannot fix with certainty the annual nonfiction applicant pool, that every estimate above is between 100 and 900, with a clear majority falling between 200 and 600, suggests that the correct figure is well under a thousand. Further, the outlying estimates here (101 and 119) are from programs ranked much higher in nonfiction than in any other genre; the above-referenced 6:3:2 genre ratio may therefore underestimate these programs' actual number of nonfiction applicants, thereby artificially decreasing their national applicant pool projections.

Added to the adjusted mean data for fiction and poetry, these nonfiction figures suggest an annual three-genre applicant pool, across all full-residency programs in the United States, of 3,872. A more cautious approach would be to propose a range: The annual three-genre full-residency applicant pool is likely between 3,500 and 4,000.
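The nonfiction figures above can be reproduced with a short calculation. The sketch below rests on a labeled assumption, as the article does not spell out the arithmetic: that each published nonfiction estimate is the 2/11 nonfiction share of a 6+3+2 genre split applied to the corresponding fiction-and-poetry figure. That assumption reproduces 567, 632, 596, and 3,872 exactly.

# Assumption (not stated explicitly above): nonfiction estimates are taken
# as 2/11 of each fiction-and-poetry figure, per the 6:3:2 genre split.
FICTION_POETRY = {"median": 3116, "mean": 3478, "adjusted mean": 3276}
NONFICTION_SHARE = 2 / 11

for label, pool in FICTION_POETRY.items():
    print(label, round(pool * NONFICTION_SHARE))
# median 567, mean 632, adjusted mean 596

# Three-genre total: adjusted mean fiction/poetry pool plus nonfiction.
print(3276 + 596)   # 3872 -> "likely between 3,500 and 4,000"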

Data Sources
For those program measures not subject to applicant polling, such as rankings and recitations of admissions, curricular, and funding data, only data publicly released by the programs—either to individual applicants, to groups of applicants, in a program's promotional literature, or via a program Web site—have been included in the rankings chart. All data were updated regularly to reflect programs' most recent public disclosures.

Many of the nation's full- and low-residency MFA programs decline to publicly release internal data. In 2007, between 40% and 60% of the nation's MFA programs declined to answer questions on an AWP questionnaire seeking admissions and funding data from member programs. Specifically, 47% of programs declined to reveal how many assistantships they offered annually to incoming students; 61% declined to reveal the stipend offered to teaching assistants; 56% declined to reveal whether they offered a full tuition waiver to teaching assistants; 49% declined to reveal how many scholarships were offered to incoming students; 55% declined to reveal their annual number of applicants; and 52% declined to reveal the size of their annual matriculating class. Compounding the incompleteness of the AWP survey was the fact that the Association did not distinguish between low-residency and full-residency programs. Given that low-residency programs do not offer teaching assistantships (as low-residency students are only on campus during brief residencies), this omission was a critical one. Likewise, because AWP surveys are only sent to AWP members, and AWP has previously indicated in public disclosures that 33% of U.S. creative writing programs are not AWP members, the 2007 survey's polling cohort (142 MFA programs) was missing as many as 71 potential respondents.

Programs unable or unwilling to release data regarding their funding and admissions processes are necessarily disadvantaged by a ranking system that promotes and rewards transparency. Yet no program that fails to release this data for applicants' consideration can avoid being judged, by applicants and other observers, through the lens of such nondisclosures. As research for these rankings is based entirely on publicly-available, publicly-verifiable data, (1) the accuracy of the data upon which the rankings are based can be readily confirmed by any party, and (2) programs can easily optimize their involvement in the rankings by ensuring their applicants have access to all of the data prospective students generally require in making application and matriculation decisions.

Programs were not contacted directly for these rankings for a variety of reasons: (1) As indicated above, past attempts by AWP, the national trade organization for creative writing programs, to secure even bare-majority participation by its member programs via a nationwide data-disclosure project were unsuccessful (and AWP member programs presumably owe more, not less, institutional fealty to AWP than to any independent nonprofit or freelance journalist); (2) the human resources required to track down internal admissions data for nearly two hundred MFA programs, many of which do not wish to release such data, would likely be prohibitive for any independent nonprofit organization or freelance investigative journalist; (3) to the extent the present rankings seek to actively promote program transparency, it would be counterintuitive for the rankings to reward programs willing to selectively leak data to members of the media through private channels, but not, via publicly-accessible channels, to the public-at-large; (4) unless 100% compliance with a nationwide data-disclosure project could be ensured, any attempt to reach programs individually—rather than place the responsibility for disclosure of admissions, curricular, and funding data on the programs themselves—will necessarily favor those programs researchers are able to successfully contact. This places the onus for proof of "equivalent due diligence" (as to each program) on researchers rather than where it belongs, on the programs themselves. The programs, not their assessors, are the "bearers of least burden" with respect to due diligence in the release of these data, as they only stand to benefit from increased transparency and are entirely in control of their internal data and program Web sites at all times.

LOW-RESIDENCY RANKINGS


Structure

Low-residency programs were measured in eight categories, six of which are rankings—four employing unscientific but probative polling of the sort described above, and two based upon publicly available hard data. Low-residency programs have not been assessed with respect to their funding packages because these programs generally offer little or no financial aid to incoming students; low-residency programs presume that their applicants will continue in their present employment during the course of their studies.

Cohort
Over the course of three successive application cycles, a total of 195 low-residency applicants were polled as to their program preferences, with these preferences exhibited in the form of application lists. The locus for this polling was the Poets & Writers online discussion board, The Speakeasy, widely considered the highest-trafficked low-residency community on the Internet. The relatively small cohort used for this polling is explained by the following: (1) The annual applicant pool for low-residency programs is approximately one-eighth the size of the full-residency applicant pool (see below); (2) low-residency applicants do not congregate online in the same way or in the same numbers that full-residency applicants do; and (3) low-residency programs are subject to a "bunching" phenomenon not evident with full-residency programs, with only eight programs nationally appearing on even 10% of poll respondents' application lists, and only three appearing on 20% or more. For this reason only the top ten low-residency programs have been included in the rankings (also available in the September/October 2010 print edition of Poets & Writers Magazine); below this level it is difficult to draw distinctions between programs, as none received a significant number of votes over the three years polling was conducted.

One explanation for the bunching phenomenon described above may be that low-residency programs are less susceptible to comparison than full-residency programs, as many of the major considerations for full-residency applicants, including location, funding, cohort quality, class size, duration, and cost of living, are not major considerations for low-residency applicants due to the structure and mission of low-residency programs. Generally speaking, low-residency programs are assessed on the basis of their faculty and pedagogy, neither of which is conducive to quantification and ranking. That three programs have such a clear advantage in the rankings over the other 43 operating in the United States, Canada, the United Kingdom, and China is a function of both the relatively recent development of the low-residency model (with older programs tending to be more highly regarded, though none dates to before 1976) and the consensus that appears to have existed for years that three programs in particular are strongest in terms of faculty, selectivity, and placement. It is worth noting, too, that a significant number of the world's 46 low-residency MFA programs were founded within the last eight to ten years; applicant familiarity with these programs may still be relatively low.

The three-year low-residency polling described above has been further broken down into year-by-year poll results. The cohort for the 2009–10 annual ranking was 88, for the 2008–09 ranking 55, and for the 2007–08 ranking 52. If and when individual account-users applied to programs in more than one admissions cycle, their application lists from each cycle were treated as separate slates of votes; repeat applicants accounted for less than 10% of the polling cohort, however. Full-residency applicants on The MFA Blog who applied to one or more low-residency programs as part of their overall slate of target programs (see "Structure" and "Cohort" under the header "Full-Residency Rankings," above) were also included in the low-residency voting; due to the exceedingly small number of such votes, these entries were manually compared both to one another and to existing low-residency application lists to ensure duplicate lists were avoided.

While polls with larger cohorts are, all other things being equal, more reliable than those with smaller ones, the fact that the annual applicant pool for low-residency programs is likely between 400 and 500 (see below) suggests that even the 2007–08 single-year low-residency rankings polled a substantial percentage of all applicants nationally during that application cycle. Moreover, as is the case with the full-residency rankings, cross-checking applicant vote totals across a period of three years reveals substantial consistency in the results and quickly unearths any significant anomalies or outliers. Of the ten low-residency programs listed in this year's print rankings, eight (80%) ranked in the top 10 in all three years of polling, while another was in the top 10 for two of the three application cycles studied. All of the programs in the top 10 achieved at least an Honorable Mention (a ranking between 11 and 15) for all three of the years in which low-residency applicants were polled.

An "N/A" notation signifies that a program has not released the requisite data. An asterisk indicates that the program is unranked in that category. Only five low-residency programs achieved a positive score in the national placement ranking, which considered placement data for full- and low-residency programs in a single assessment: Vermont College of Fine Arts in Montpelier (#17 nationally); Warren Wilson College in Swannanoa, North Carolina (#38); Bennington College in Vermont (#41); University of Alaska in Anchorage (#46); and Queens University of Charlotte, North Carolina (#53). In order to better acknowledge the achievement, in the placement category, of these five low-residency programs relative to their low-residency peers, and in recognition of the fact that low-residency graduates are substantially less likely to seek postgraduate fellowships (largely because they do not give up their present employment when they matriculate), the rankings above have been re-constituted as low-residency-only: Vermont College of Fine Arts, #1; Warren Wilson College, #2; Bennington College, #3; University of Alaska, Anchorage, #4; and Queens University of Charlotte, #5.

Due to the still relatively small number of low-residency programs in the United States and abroad, only programs receiving top 10 placement in any category of assessment have received a special notation in either the print or online editions of the rankings.

National Low-Residency Applicant Pool
A realistic estimate of the annual number of low-residency MFA applicants is 400. This estimate is based in part on the fact that the five most-applied-to low-residency programs receive an average of 144 total applications per year; in contrast, the five most-applied-to full-residency programs receive an average of 1,137 applications per year in fiction and poetry alone. If this comparison is any guide, approximately eight times as many individuals apply to full-residency programs as to low-residency programs each year, suggesting a mean low-residency applicant pool, per year, of just over 400. This figure can then be cross-checked using the number of votes for Warren Wilson College in the present low-residency rankings (79), the total number of low-residency votes cast for the rankings (195), and Warren Wilson's publicly released annual applicant-pool size (200). Using these figures one would expect an annual national low-residency applicant pool of 494. The only other low-residency programs for which all these data are available and may be considered reliable are Bennington College (whose data suggest an estimated 488 annual low-residency applicants) and Lesley College (598).
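Both cross-checks reduce to one-line calculations, sketched here in Python with the figures quoted in the paragraph above.

FULL_RES_TOP5_AVG = 1137   # avg. fiction and poetry applications, top 5 full-residency
LOW_RES_TOP5_AVG = 144     # avg. total applications, top 5 low-residency
ADJUSTED_MEAN_FULL = 3276  # adjusted mean full-residency pool (fiction and poetry)

ratio = FULL_RES_TOP5_AVG / LOW_RES_TOP5_AVG   # ~7.9, i.e., roughly eight to one
print(round(ADJUSTED_MEAN_FULL / ratio))       # 415: "just over 400"

# Second check: Warren Wilson received 79 of the 195 low-residency votes
# and publicly reports roughly 200 applicants per year.
print(round(200 * (195 / 79)))                 # 494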

In view of the above, the three-year, 195-person sample used for this year's low-residency rankings likely represents between one-half and one-third of an annual applicant cohort for this type of residency program.

Added to the adjusted mean for annual poetry, fiction, and nonfiction applicants, the estimate for the annual number of low-residency applicants suggests a total annual applicant pool to creative writing MFA programs—across all genres and types of residency, and gauging discrete applicants only—of somewhere between 4,000 and 5,000.

GENRE RANKINGS


Cohort

Between July 15, 2009, and April 15, 2010, 346 fiction applicants were polled for the fiction-genre rankings, 141 poetry applicants were polled for the poetry-genre rankings, and 101 nonfiction applicants were polled for the nonfiction-genre rankings. The reason for the disparity between the total number of fiction and poetry applicants in the genre-specific polls (487) and the total number of votes in the overall fiction and poetry poll (527) is that 40 applicants, or 7.6% of the cohort polled in fiction and poetry, did not specify their genre—though it was clear from their application lists that the genre in which they applied could not have been nonfiction (because the majority of MFA programs do not offer nonfiction tracks, an applicant who specifies that he or she has applied in only one genre, but who lists certain programs on his or her application list, can be precluded from consideration as a nonfiction applicant). One consequence of this 7.6% nongenre-reporting population is that certain programs are tied in the overall rankings even though, by virtue of their rankings in the two major genres, this would seem to be a statistical impossibility.

The cohort sizes used in this polling are roughly consistent with the national distribution of MFA applicants by genre, as revealed by those few programs which both (1) accept applicants in all three genres, and (2) release their internal admissions data for all three genres. The national distribution of fiction, poetry, and nonfiction applicants is approximately 6 to 3 to 2, respectively.

Due to the still relatively small number of nonfiction programs in the United States and abroad, only programs receiving top 20 placement in the genre have received a special notation in either the print or online editions of the rankings. No Honorable Mentions have been awarded, for the following reasons: (1) the relatively small number of votes for programs ranked beyond twentieth in the genre, all of which appeared on fewer than 10% of nonfiction applicants' application lists; (2) a bunching phenomenon in the nonfiction rankings, such that any presumptive Honorable Mention section of the nonfiction rankings (programs ranked between 21 and 25) would include nine programs, making the Honorable Mention section nearly half the size of the rankings proper; and (3) there would be little statistical distinction, that is, two votes or fewer, between the nine presumptive Honorable Mention programs and the six programs ranked behind them—a smaller disparity, out of a cohort of 101, than the three-vote difference between the top 50 and Honorable Mention sections in the 527-cohort full-residency rankings.

Programs without a nonfiction program are designated, in the top 50 rankings, with an em-dash (—).

INTERNATIONAL PROGRAMS

Special Note on International Programs
The Poets & Writers Magazine MFA rankings have always considered, and will continue to consider, international MFA programs. However, international programs are unlikely to fare well in the overall rankings for several reasons: (1) nearly all non-U.S./non-Canadian graduate creative writing programs are (by U.S. accreditation standards) non-terminal (that is, they are M.Phil, M.St., or MA degrees, as opposed to the terminal MFA degrees considered by the Poets & Writers rankings); (2) non-U.S./non-Canadian applicants are less likely to frequent a U.S./Canadian-focused MFA website like The MFA Blog, and therefore non-U.S./non-Canadian programs are less likely to appear on the application lists of those polled for these rankings (and Canadian applicants applying to Canadian programs may be less likely to patronize The MFA Blog than American applicants applying to American programs); (3) unlike U.S. and Canadian MFA programs, overseas programs are rarely fully funded for non-domestic students (U.S./Canadian MFA programs do not distinguish between domestic and international applicants with respect to funding eligibility), and therefore are less likely to be popular amongst the U.S. and Canadian applicants that frequent The MFA Blog; and (4) due to the exceedingly small number of non-U.S. terminal-degree MFA programs now in operation, programs in Canada and elsewhere simply have fewer entrants into the international MFA system with which to achieve a top 50 ranking in any of the "ranked" categories in this assessment (for instance, in funding, selectivity, and postgraduate placement).

Every publicly-advertised terminal-degree MFA program in the world presently appears somewhere in the Poets & Writers Magazine MFA rankings. Even so, applicants seeking to attend MFA programs outside the United States should use these rankings with caution, as they may only imperfectly capture the level of esteem in which non-U.S. programs are held by non-U.S. MFA applicants. 

NON-TERMINAL MASTER'S DEGREE (MA) PROGRAMS


Special Note on MA Programs

Over the past four years, the present MFA rankings project has catalogued several thousand MFA applicants' application lists. One abiding trend is that only a small number of non-terminal Master's (MA) degree programs in creative writing frequently appear on application lists otherwise composed entirely of terminal-degree MFA programs. As only two MA programs presently answer to this description, these two programs have been included in the full-residency rankings for terminal-degree creative writing programs. Applicant mores seem to indicate that these programs are now considered on par with MFA programs in overall quality—if not in the critical feature of "terminality." Many of those who attend MA programs in creative writing subsequently apply to terminal-degree MFA programs upon graduation.

A third program is included in these rankings despite being an MPW (Master of Professional Writing) program. The two reasons for this inclusion are (1) the MPW is arguably a terminal degree, though there is little evidence yet on the question of whether it enjoys the same regard in the field of creative writing (as to "terminality") as the MFA, and (2) as with the two MA programs referenced above, this MPW program often appears on the application lists of applicants who are otherwise applying only to MFA programs. In contrast, the overwhelming majority of MA programs appear only on the application lists of MA-only applicants.

Apart from these three ranked non-MFA programs, the non-terminal creative writing Master's programs most commonly appearing on MA-only application lists are Eastern Michigan University in Ypsilanti (three appearances), University of Tennessee in Knoxville (three appearances), and University of Louisville in Kentucky (two appearances). Eleven MA programs appeared only once on application lists during the course of this year's polling of MFA applicants. Because MA applicants do not commonly frequent The MFA Blog, the MA cohort involved in this polling is too small for any substantive conclusions to be drawn about trends in MA-only applications.

THE FULL-RESIDENCY RANKINGS CHART: ADDITIONAL PROGRAM MEASURES
Funding

Nothing in these rankings' funding assessments is intended to impugn the motives or character of professors, administrators, or staff at any of the nation's graduate creative writing programs. The presumption of these rankings is that all of these groups have militated, and continue to militate, with varying degrees of success, for more funding for their students—and that, given the choice, every program would choose to be fully funded. Still, there is no question that some programs require virtually no financial outlay by admitted students, while others are institutionally structured to induce students to take out substantial student loans. The rankings must and do take this into account, just as they take into account that one has more time to focus on one's in-genre writing and reading if one is fully funded—or asked merely to work a low-impact part-time teaching job—than if one is forced to work a full-time nonacademic job to make ends meet while in-program. The rankings' strong emphasis on funding accurately reflects what conventional wisdom amongst the MFA applicant class now holds.

Program funding packages were calculated on the basis of annual cost-of-living-adjusted stipend values for programs with full tuition waivers, and on the basis of annual cost-of-living-adjusted stipend values less annual tuition for programs offering only partial tuition waivers. Programs were further divided into categories on the basis of the percentage of each incoming class offered full funding. "Full funding" is defined as the equivalent of a full tuition waiver and an annual stipend of at least $8,000/academic year. No program offering full funding to less than 100% of its incoming class is ranked ahead of any program fully funded for all students. Likewise, no non-fully funded program is ranked ahead of any program in a higher "coverage" bracket. The four coverage brackets acknowledged by the rankings are as follows: "100% fully funded"; "75%+ fully funded"; "40% to 75% fully funded"; and "33% to 39% fully funded." All of these refer to the percentage of each annual incoming class that receives a full funding package.

No program fully funding less than a third of its admitted students received a top 50 ranking in the funding category. Programs whose coverage bracket is sufficiently high to receive a national ranking, and whose stipend is sufficiently high to meet the definition of "full funding," but whose specific annual stipends were unknown at the time the rankings were compiled, were ranked last within their respective coverage brackets. Top 50 programs awarded an Honorable Mention in funding are indicated with a plus-sign (+) in the print edition of the rankings. In the online-only Honorable Mention, second, and third tiers of the overall rankings, the designation "HM" is used instead.

Programs that fully fund 33% or more of their admitted students were also considered eligible for "package averaging." If and when programs meeting this criterion were revealed to offer funding packages of differing value to different students, the total stipend value of all full-funding packages was divided by the number of such packages to determine average annual stipend value. Because some programs do not advertise special funding offerings available only to select students, not every program benefited from this feature of the rankings. Consistent with the structure and conceit of these rankings, programs exhibiting maximum transparency with respect to their promotional materials are most likely to receive a comprehensive assessment of their program's total funding package.

The "total funding" rankings take into account duration of funding, as programs were ranked for this measure by multiplying annual package value by the duration of each program in years. The varying amount of tuition charged at individual programs was disregarded, as students receiving full funding do not, by definition, pay tuition. For the annual funding ranking, only annual package value was considered.

Applicants should be aware that many programs deduct administrative fees—almost always less than $1,000, and usually less than $500—from their annual stipends. Moreover, some programs offer health insurance to all admitted students and some do not. Programs that offer health insurance to all admitted students include but are not limited to the following (listed in order of "total funding" rank): University of Alabama in Tuscaloosa; Cornell University in Ithaca, New York; Louisiana State University in Baton Rouge; Ohio State University in Columbus; Virginia Polytechnic Institute (Virginia Tech) in Blacksburg; Arizona State University in Tempe; Purdue University in West Lafayette, Indiana; Washington University in Saint Louis, Missouri; Iowa State University in Ames; University of Michigan in Ann Arbor; University of Minnesota in Minneapolis; Vanderbilt University in Nashville, Tennessee; Pennsylvania State University in University Park; University of Wyoming in Laramie; University of Iowa in Iowa City; University of Virginia in Charlottesville; University of Wisconsin in Madison; University of Oregon in Eugene; University of Nevada in Las Vegas; University of New Mexico in Albuquerque; and Rutgers University in Camden, New Jersey.

Selectivity
As fewer than five full- or low-residency programs nationwide publicly release "yield" data—the percentage of those offered admission to a program who accept their offers and matriculate—the acceptance-rate figures used for the national selectivity ranking are necessarily yield-exclusive. Most have been calculated using the simplest and most straightforward method: taking the size of a program's annual matriculating cohort in all genres and dividing it by the program's total number of annual applications across all genres. Thirty-three of the top 50 programs in selectivity (66%) had available admissions data from the 2009–10 admissions cycle, nine (18%) most recently released admissions data during the 2008–09 admissions cycle, and eight (16%) most recently released admissions data during the 2007–08 admissions cycle.
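The calculation reduces to a single division; a sketch with hypothetical figures:

    # Yield-exclusive acceptance rate: annual matriculating cohort in all
    # genres divided by total annual applications across all genres.
    matriculants = 22
    applications = 840
    acceptance_rate = matriculants / applications  # ~0.026, i.e., 2.6%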

The relative paucity of data available for the selectivity rankings is attributable to programs' continued reluctance to release the sort of internal admissions and funding data regularly released by colleges, universities, and most professional degree programs. Hundreds of interviews with MFA applicants between 2006 and 2010 suggest that a program's acceptance rate is one of the top five pieces of information applicants request when researching a graduate creative writing program. Fortunately, only one of the top 50 MFA programs has not yet made its annual acceptance rate public, either directly or indirectly. More than 30 programs not ranked in the top 50 nationally have publicly released their annual acceptance rates.

In order to avoid artificially privileging small programs with an unknown but likely modest annual "yield"—programs with small applicant pools but also small incoming cohorts, and consequently, in some instances, extremely low yield-exclusive acceptance rates—only programs receiving more than 100 applications annually are eligible for the top 50 in selectivity. Of the approximately 68 full-residency programs and 31 low-residency programs with unknown admissions data, the "overall" polling done for these rankings suggests that no more than ten to fifteen would be eligible for inclusion in the top 50 for selectivity on the basis of applicant-pool size. Whether these ten to fifteen programs' annual incoming cohorts are sufficiently small—and thus the programs, statistically, sufficiently selective—to make any of these programs entrants into the top 50 for selectivity is unknown. The likelihood is that fewer than five programs that would otherwise appear in the top 50 for selectivity are ineligible for that ranking solely because they have thus far declined to release their admissions data to applicants.

Of programs with fewer than 100 applications whose admissions data are known, the eight most selective programs are as follows: Florida International University in Miami, #1; University of Massachusetts in Boston, #2; University of California in San Diego, #3; University of Kansas in Lawrence, #4; Northern Michigan University in Marquette, #5; Old Dominion University in Norfolk, Virginia, #6; West Virginia University in Morgantown, #7; and Georgia College & State University in Milledgeville, #8. These program rankings are provisional; admissions data for low-volume MFA programs cannot be considered as probative as data for programs with larger applicant pools.

The small number of low-residency programs with publicly accessible acceptance rates makes crafting a selectivity ranking for such programs difficult. Of the 15 programs (32.6% of all low-residency programs) with available data, all but five have data only from the 2007–08 admissions cycle or earlier. Fortunately, the programs ranked first, second, and third in this measure have all released data from one of their past two admissions cycles. The applicant-pool-size cutoff for inclusion in the low-residency selectivity rankings is set at 50 annual applicants.

Placement
Programs' placement records were assessed by determining how many individual "placement events" a given program's graduates achieved during the twelve-year period from 1998 to 2010. Only a limited number of fellowships and residencies are available to MFA graduates immediately post-graduation, and fewer still are specifically targeted at recent MFA graduates. Most of these make publicly available the names and biographical data of their fellows and residents. The focus for this year's rankings was on eleven of the fellowships and residencies in this group—generally speaking, the nation's eleven most prestigious post-MFA fellowships and residencies.

The fellowships and residencies surveyed for this measure were the following: the Stegner Fellowship at Stanford University in Palo Alto, California; the Wisconsin Institute for Creative Writing Fellowship at the University of Wisconsin in Madison; the Hodder Fellowship at Princeton University in New Jersey; the Fine Arts Work Center Fellowship at the Fine Arts Work Center in Provincetown, Massachusetts; the Emory University Creative Writing Fellowship in Atlanta, Georgia; the Stadler Fellowship at Bucknell University in Lewisburg, Pennsylvania; the Axton Fellowship at the University of Louisville in Kentucky; the Olive B. O'Connor Fellowship at Colgate University in Hamilton, New York; the Bennett Fellowship/Writer-in-Residence at Phillips Exeter Academy in Exeter, New Hampshire; the James Merrill Writer-in-Residence at the James Merrill House in Stonington, Connecticut; and the Amy Clampitt Residency Award at the Amy Clampitt House in Lenox, Massachusetts.

These eleven fellowships and residencies played host to more than 300 "placement events" between 1998 and 2010. As the placement rankings acknowledge placement events rather than placed fellows or residents, it is possible for a single fellow or resident to be the subject of more than one placement event.

As simply ranking programs by the number of their graduates subject to placement events between 1998 and 2010 would unfairly favor larger programs, programs have instead been ranked on the basis of a placement "score," calculated as follows: a program's total number of placement events between 1998 and 2010 was divided by the size of the program's annual incoming cohort. The resulting top 50 size-adjusted scores ranged from 1.30 to 0.03. Three programs—University of Utah in Salt Lake City; Florida International University in Miami; and University of Alaska in Fairbanks—each experienced a single placement event during the twelve-year period surveyed but could not be ranked because their annual incoming cohort sizes have not been publicized. None of these programs was eligible for a placement ranking higher than #42, however, based on the number of placement events to which their graduates were subject during the survey period. In several instances, programs identical both in size and in number of placement events received identical scores, resulting in ranking "ties."
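The score itself is a single division; a sketch with hypothetical figures (the actual top score, per the above, was 1.30):

    # Size-adjusted placement score: placement events, 1998-2010,
    # divided by the size of the annual incoming cohort.
    placement_events = 13
    cohort_size = 10
    placement_score = placement_events / cohort_size  # 1.3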

Because fellowships and residencies draw no distinction between full- and low-residency programs, this is the only measure in which full- and low-residency programs were ranked together. This said, the low-residency programs were subsequently granted their own numerical ranking, in recognition of the fact that these programs are hampered by the decreased likelihood that their graduates will seek fellowships or residencies in the first instance (as, by definition, low-residency students already have full- or part-time employment). It must also be said that, because a program needed to have begun graduating students in 1998 to be eligible for the full survey period, only programs founded in 1996 or earlier enjoyed the full benefit of the twelve-year assessment window, and many if not most low-residency programs were founded after 1996. Likewise, several top 50 programs that might have ranked (or ranked higher) in the top 50 in placement did not do so for the same reason. Top 50 programs founded in the midst of the assessment period include University of Wisconsin in Madison (first class graduated, 2004); University of New Hampshire in Durham (first class graduated as terminal-degree MFA recipients, 2005); and University of Wyoming in Laramie (2007).

These placement rankings should be used with caution. As selection for a fellowship or residency is often the result of one or more individuals being the "consensus pick(s)" of a panel of judges—and as consensus in art has not always, historically, favored innovation—it is possible for fellows and residents, in some instances, to be amongst the most talented, but not necessarily the most innovative, of their graduating year's national cohort. This is by no means to impugn, or remark upon, the writing of any particular fellow or resident, or the selections of any particular fellowship or residency. Instead, the emphasis in this caveat is on causation: applicants should not presume either that a program with a high placement ranking can ensure them a fellowship or residency, or that a program with a high placement ranking necessarily hosts the strongest student cohort if innovation, rather than technical mastery, is the particular interest of the applicant. On these points the rankings make no specific claim other than to note these important distinctions.

Program Duration
A program's duration is measured by the average length of its curriculum in years. Some programs allow students to petition to extend their stay; because such petitions are granted on a case-by-case basis, and because the granting of such petitions often results in only an additional unfunded year of study, individual programs' program-extension policies have not been considered in calculating program duration. Of the 148 full-residency MFA programs assessed, only two are known to be one year in duration, and only two are four years in duration. The remaining 144 full-residency programs are either two or three years in duration, with programs ranked in the top 50 significantly more likely to be three years in duration than programs not ranked in the top 50. Very few programs that fund less than 33% of their incoming students are more than two years in duration. In fact, only five non-fully-funded top 50 programs are three years in duration, and all of these are either ranked in the top 50 for funding or were awarded an Honorable Mention in the category (two of the five fully fund more than 75% of their admittees). Fully funded programs are slightly more likely to be three years in duration than two: 52.9% of the 34 fully funded programs are three years in duration, and an additional three (8.8%) are known to offer substantial numbers of graduates third-year funding in some form. Low-residency programs are almost universally two years in duration, yet these programs cannot be measured by their duration because, unlike full-residency programs, they are generally highly flexible with respect to the duration of their students' courses of study.

Assessments of program duration do not consider the availability of postgraduate fellowships, or automatic postgraduate placements, at individual programs. Because such opportunities are not guaranteed as to any individual student, they cannot be formally acknowledged in the rankings except, as applicable, with respect to a program's funding (see "Funding" section, above). This said, at least one program, Cornell University in Ithaca, New York, so consistently offers a third-year lectureship to all of its admitted students that it may nominally be considered a three-year program.

Program Size
In the rankings, the size of a program's annual incoming cohort is expressed using the usual abbreviations for magnitude: XS (Extra-Small, an average of one to four students admitted per year in each of the two major genres, fiction and poetry); S (Small, five to nine students); M (Medium, ten to fifteen students); L (Large, sixteen to twenty-four students); and XL (Extra-Large, twenty-five or more students per major genre per year). Because many programs do not include their matriculating class size on their Web sites, in some instances these data have been extrapolated from other available information. One program, the University of Wisconsin in Madison, was by necessity granted a special dispensation in several categories, as it is the only MFA program in the United States or abroad to admit fiction and poetry students in alternating years. This required two methodological accommodations: (1) using sampling and extrapolation for the overall and genre rankings by determining the average genre bias of the program's applicant pool across a four-year period, and then using this year's single-genre vote totals to create a reasonable extrapolation for the off-year genre; and (2) averaging the class-size figures for the program. Because the program accepts six poets and six fiction writers every two years, it is treated as having an average annual matriculating class size of six.
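For illustration, the size notation reduces to a simple threshold check, with boundaries as defined above (the function name is hypothetical):

    # Size notation, per major genre per year, as defined above.
    def size_category(admits_per_genre):
        if admits_per_genre <= 4:
            return "XS"
        if admits_per_genre <= 9:
            return "S"
        if admits_per_genre <= 15:
            return "M"
        if admits_per_genre <= 24:
            return "L"
        return "XL"

    print(size_category(6))  # S (e.g., the averaged Madison cohort)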

Full Funding
Full funding is defined as the equivalent of a full tuition waiver and a minimum $8,000/academic year stipend. Where the tuition waiver offered is less than 100%, the program's stipend value is reduced by the amount an admitted student is asked to pay in tuition annually. All stipend values are adjusted for cost of living. Cost of living assessments were made using the Web site Sperling's Best Places (http://www.bestplaces.net/COL/default.aspx [7]). Healthcare costs, administrative fees, and student relocation costs were not estimated or considered, nor was the cost of tuition—as students receiving "full funding," by the definition of the term used in the rankings, do not pay tuition.

In setting the stipend value for "full funding" at a cost-of-living-adjusted $8,000/academic year, the rankings make no assertion as to whether this should be considered a living wage. A single individual earning this amount per annum is eligible for the Earned Income Tax Credit when filing federal taxes; however, because the "$8,000/academic year" standard translates to $8,000 over nine months, this rate of income accrual in fact extrapolates to an annual income of $10,667 ($8,000 ÷ 9 months × 12 months). This is still below $13,440, the amount, in 2009 dollars, at which a single individual is phased out completely from the Earned Income Tax Credit.

The rankings acknowledge that MFA students receiving the minimum full-funding stipend may still find themselves borrowing a de minimis amount (defined as less than $3,000/academic year) to help defray the costs of program attendance. For the purposes of this article, the de minimis borrowing level has been set at the rate of borrowing that puts an applicant out of range of pro-rated EITC coverage yet results in less than $10,000 in total federal debt during a three-year MFA program (at $3,000 per academic year, three years of borrowing totals $9,000). Of the nation's 34 fully funded full-residency programs, only two are known to offer cost-of-living-adjusted stipends of less than $10,000/academic year, and both of those offer stipends above $9,000/academic year.

Cost of Living
The cost of living in the various programs' listed host locations was determined using Sperling's Best Places (http://www.bestplaces.net/COL/default.aspx [7]). All cost-of-living data were then compared to a randomly selected national-average-range constant, in this case Providence, Rhode Island.

Notations used for cost of living are as follows: Very Low (16% or more below Providence, Rhode Island, the national-average-range constant); Low (between 6% and 15% below); Average (between 5% below and 5% above); High (between 6% and 15% above); and Very High (16% or more above). While some students may choose to live outside the boundaries of their program's host location, commuting to an MFA program rather than living near campus carries hidden costs of its own, both pecuniary and non-pecuniary. For this reason, only a program's host location was assessed for this measure. Cost-of-living adjustments were also used to determine the package value at individual programs for the total funding, annual funding, and full funding categories (see above).
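A sketch of these thresholds, expressed as percentage deviation from the Providence constant (negative values below, positive above; the function name is hypothetical):

    # Cost-of-living notation relative to the Providence, RI constant.
    def col_notation(pct_vs_providence):
        if pct_vs_providence <= -16:
            return "Very Low"
        if pct_vs_providence <= -6:
            return "Low"
        if pct_vs_providence <= 5:
            return "Average"
        if pct_vs_providence <= 15:
            return "High"
        return "Very High"

    print(col_notation(-8))  # Low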

Teaching Load
While individual applicants' interest in teaching composition, rhetoric, literature, or creative writing to undergraduates will vary, generally speaking the most popular teaching load is a "1/1" (one course to be taught in the fall semester, one in the spring semester). The teaching loads of individual programs have not been ranked per se, yet this 1/1 standard has been used to determine whether a given program's teaching load is "light," "average," or "heavy." That is, because the 1/1 load is the most popular amongst applicants—though it is not the most common teaching load at MFA programs—average annual teaching loads of 0/0, 0/1, 1/0, and 1/1 have been denominated "light" by the rankings. An average annual teaching load of 2/1 or 1/2 (the most common teaching load) is termed "average," while an average annual teaching load of 2/2 is considered "heavy."
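Expressed in code, with the annual load taken as the sum of fall and spring courses (a sketch; the function name is hypothetical):

    # Teaching-load designation per the 1/1 standard described above.
    def load_designation(courses_per_year):
        if courses_per_year <= 2:
            return "light"    # 0/0, 0/1, 1/0, or 1/1
        if courses_per_year <= 3:
            return "average"  # 2/1 or 1/2, the most common load
        return "heavy"        # 2/2

    print(load_designation(3))  # average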

The term "average" is used here in two different senses: First, to denote a category of teaching load; second, to indicate that all programs are assessed by their "average" teaching load. Because many programs offer different teaching-load packages to different students, and/or increase or decrease teaching load over the duration of the program, the average (mean) number of courses taught per year per student in each program is used. In some instances, a plus-sign (+) is used to indicate that students may request and/or get assigned—once admitted to a program—a larger and therefore better-compensated teaching load. Such additional teaching sections are not guaranteed, however.

Some programs fund a small enough percentage of admittees through teaching assistantships that to assign such programs an "average teaching load" would be to wrongly imply that admitted students are likely to receive an assistantship. For this reason, programs that offer assistantships to less than one-third of their incoming cohort received an asterisk in the "teaching load" column. Programs eligible for a "teaching load" designation, but which do not publicly disclose the teaching load they assign their teaching assistants, are indicated with a "Not Available" ("N/A") notation.

CGSR Compliance
This category is discussed in detail in the rankings' introductory article ("The MFA Revolution: Ranking the Nation's Best Programs," Poets & Writers Magazine, September/October 2010). At present, only one CGSR signatory—a program not ranked in the top 50—is believed to be non-compliant with the contract it and more than a hundred other universities signed and published for prospective applicants.

Technically, nonsignatories to the CGSR are not obligated to be compliant with it. They are termed here "non-compliant" in order to alert prospective applicants to the fact that these programs are not bound by (and in nearly all instances are known not to follow) CGSR guidelines. Any signatory to the CGSR conclusively found to be in violation of that contract will likewise be listed as "non-compliant," whether or not the program continues to be a CGSR signatory. Compliance inquiries are conducted via e-mail exchanges with program administrators, and are initiated on the basis of applicant self-reporting.

Trend
The trend category assesses a program's movement in the rankings since the rankings published during the 2009–10 admissions cycle ("The Top 50 MFA Programs," Poets & Writers Magazine, November/December 2009). Programs that moved five spots or fewer in the rankings between 2009 and 2010 are denoted with a horizontal arrow. Programs that lost more than five spots are indicated by a downward arrow, and programs that gained more than five spots are indicated by an upward arrow.
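The rule is easily stated as a comparison of year-over-year ranks (a sketch; the labels stand in for the printed arrow glyphs):

    # Trend arrows: movement of more than five spots in either direction.
    def trend_arrow(rank_2009, rank_2010):
        delta = rank_2009 - rank_2010  # positive = program moved up
        if delta > 5:
            return "up arrow"
        if delta < -5:
            return "down arrow"
        return "horizontal arrow"      # moved five spots or fewer

    print(trend_arrow(30, 20))  # up arrow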

Studio/Academic Orientation
A program is considered "studio-oriented" if more than two-thirds of its credit requirements can be met through workshops, thesis hours, internships, or independent study, and "academics-oriented" if not. Academics-oriented programs in which students take only two courses per semester, one of which is a workshop, are indicated with an "A" and an asterisk (A*).

THE LOW-RESIDENCY RANKING CHART: ADDITIONAL PROGRAM MEASURES

Genre Availability
"Genre Availability" measures the number of genres in which a given low-residency program offers either a formal track or an informal "concentration" or "focus." As many programs occasionally offer courses in genres for which they do not offer a track or concentration, it can be difficult to assess which genres are sufficiently supported at a given program that the program is likely to receive creative portfolios in that genre. The aim of the "Genre Availability" category is to include only those genres in which a program is likely to attract an appreciable number of applicants—as opposed to a genre offered only as an occasional brief-residency course lacking substantial non-residency, academic-year faculty support.

Residency
"Residency" measures the number of days per year a low-residency student in the program is required to be on campus. While at present there is no conventional wisdom as to the value of a long residency versus a shorter one, low-residency programs' residencies are typically the only opportunity for students to interact in person with their professors and to meet their classmates; consequently, many applicants to low-residency programs may prefer longer residencies.

Seth Abramson is the author of Northerners (New Issues Press, 2011), winner of the 2010 Green Rose Prize, and The Suburban Ecstasies (Ghost Road Press, 2009). A contributing author to The Creative Writing MFA Handbook (Continuum, 2008) and a 2008 recipient of the J. Howard and Barbara M. J. Wood Prize from Poetry, his work has recently appeared in Best New Poets 2008, Conjunctions, and elsewhere. Currently a doctoral candidate in English at the University of Wisconsin in Madison, he is a graduate of Harvard Law School and the Iowa Writers' Workshop.


Source URL: https://www.pw.org/content/2011_poets_amp_writers_magazine_ranking_of_mfa_programs

Links
[1] https://www.pw.org/content/2011_poets_amp_writers_magazine_ranking_of_mfa_programs
[2] https://www.pw.org/content/septemberoctober_2010
[3] https://www.pw.org/content/2011_mfa_rankings_the_top_fifty_0
[4] https://www.pw.org/content/2011_mfa_rankings_the_additional_rankings_of_fullresidency_mfa_programs
[5] https://www.pw.org/speakeasy
[6] https://www.pw.org/content/2011_mfa_rankings_the_top_ten_lowresidency_programs
[7] http://www.bestplaces.net/COL/default.aspx