Published on Poets & Writers (https://www.pw.org)


2013 MFA Index: Further Reading

by Seth Abramson
September/October 2012
8.17.12

Note: The following article explains the approach used to compile the 2013 MFA Index, published in the September/October 2012 print edition of Poets & Writers Magazine.

The 2013 MFA Index comprises two tables: one of 85 full-residency MFA programs and one of 29 low-residency MFA programs. Full-residency MFA programs are assessed on the basis of twenty measures: eight listings of survey results, and twelve listings of other important program features. While the eight surveys detailed in the full-residency table cannot be classified as scientific, all are predicated upon sufficient hard data to be substantially probative. A scientific survey of full- or low-residency creative writing MFA programs, or doctoral creative writing programs, is not presently possible, as more than half of the nation's full- and low-residency graduate creative writing programs do not publicly release the data necessary for such an assessment. 

Three of the eight full-residency MFA surveys are based on a survey of a large sample of current MFA applicants. In each survey, programs are ordered on the basis of the number of times they appeared on applicants’ application lists; the resulting numeric ordering therefore assesses only the popularity of individual programs among a large sample of well-researched aspiring poets and writers, and is not an assessment of overall program quality. While prospective MFA students have a wide range of resources at their disposal in selecting where to apply—including not only quantitative data but also programs’ promotional materials; first-hand interviews with program staff, faculty, and current and former students; large online communities of fellow applicants, many of which are devoted to MFA research, discussion, and application advising; their own past experiences in undergraduate and non-institutional workshop settings; the literary oeuvres of current faculty members and recent alumni; previous program assessments in various online and print media, including first-hand accounts from, and interviews with, current and former faculty and graduates; and in some instances program visits or prior first-hand experience workshopping with selected faculty members—there is necessarily some information that is impossible for prospective MFA students to access unless and until they become matriculants. 
Once a student has matriculated, however, they may develop subjective attachments to their alma mater, which makes objective assessment of their own experiences, as opposed to the provision of definitionally “self-serving” survey responses, difficult or even impossible. For this reason, and for others discussed in more detail below, current MFA students are not asked to self-report on their programs, nor are they considered to have a broader, more accurate, or more timely knowledge of programs they do not attend than those unmatriculated applicants who are currently researching program options.

In the 2013 MFA Index, programs appearing in the top half of each applicant popularity survey are noted by their numeric placement in the survey results. Programs not appearing in the top half of the 2012 popularity survey or the four-year applicant popularity survey (taken between April 16, 2008 and April 15, 2012) may still appear in the full-residency table, provided they appear in the top half of three of the seven survey areas described below.

The five hard data–based survey columns in the full-residency table are as follows: selectivity, funding, student-faculty ratio, fellowship placement, and job placement. These categories appear in the table in the chronological order in which applicants to graduate creative writing programs encounter them: a program’s selectivity determines, all things being equal, an individual applicant’s likelihood of securing admission; the funding category indicates, all things being equal, what aid package will be attached to that program admission if and when it comes; student-faculty ratio gestures toward, all things being equal, a student’s ability to access individual faculty members while in-program; fellowship placement generally measures the success of individual programs in placing recent graduates in post-MFA fellowships; and job placement generally measures the success of individual programs in placing graduates who have already achieved some measure of fellowship or publishing success in full-time creative writing teaching positions in higher education.

These survey results are scientific to the extent that they order programs on the basis of quantitative data publicly released by the programs themselves, and unscientific to the extent that not every program has released data for every category of assessment. These five columns therefore constitute an ordering of all publicly known data rather than an ordering of all extant data. A full complement of funding and admissions data is available for well over half of the nation's full-residency MFA programs; the remaining programs are generally smaller, newer, lightly advertised, nondomestic, or regional in terms of their applicant base. As all of these programs have websites, however, and as all enjoy exclusive dominion over their online presence, the absence of any specific funding or selectivity data in these programs' online promotional materials is taken as an indication that these programs fully fund less than a third of their students and do not have an acceptance rate low enough for inclusion in the top half of programs in this category (currently, a program's yield-exclusive acceptance rate would need to be less than 11.1 percent for it to be included in the top half of the selectivity category). The full-residency table is based in part on the presumption that it would be counterintuitive for a program providing full funding to a substantial percentage of its student body to not indicate as much in its promotional materials. Program websites are regularly reviewed to determine whether a program has added information to its online profile.
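The selectivity cutoff mentioned above can be illustrated with a short calculation. The sketch below is illustrative only: the program names and admissions figures are invented; only the 11.1 percent top-half threshold comes from the article.

```python
def yield_exclusive_acceptance_rate(acceptances, applications):
    """Acceptance rate ignoring yield (i.e., how many admittees enroll)."""
    return acceptances / applications

TOP_HALF_THRESHOLD = 0.111  # per the article: below 11.1 percent

# Hypothetical programs; names and figures are invented for illustration.
programs = {
    "Program A": (25, 400),  # 25 offers out of 400 applications
    "Program B": (40, 300),  # 40 offers out of 300 applications
}

for name, (accepted, applied) in programs.items():
    rate = yield_exclusive_acceptance_rate(accepted, applied)
    print(f"{name}: {rate:.1%}, top half: {rate < TOP_HALF_THRESHOLD}")
```

Under these invented figures, only the first program would fall below the threshold and qualify for the top half of the selectivity category.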

Based on the data presently available, it is not anticipated that any of those programs without a full complement of funding and admissions data available in some form online would have appeared in the top half of full-residency programs in the funding category. These programs, given the incompleteness of their promotional materials, are also much less likely to attract sufficient applications to be eligible for the selectivity listing; a program must receive at least fifty applications annually to be considered eligible for this listing. As to the two placement-related surveys, these do not rely on programs' promotional materials or on their willingness to release internal data to individual applicants or groups of applicants, so all programs nationally, both full- and low-residency, are equally eligible for a top-half placement in the fellowship and job placement categories. Data sufficient to calculate student-faculty ratios for virtually every MFA program in the United States were also readily available.

The overlap between those programs appearing in the top half of the two fiction/poetry surveys—the third applicant survey being of nonfiction applicants only—and those programs appearing in the top half of the five hard-data survey results is significant. Of the 71 full-residency programs that appear in the top half of both the one-year and four-year survey results, 60 (85 percent) are also in the top half in selectivity, and two (3 percent) have no selectivity data available; 60 (85 percent) appear in the top half in the funding category; 47 (66 percent) appear in the top half in student-faculty ratio; 56 (79 percent) appear in the top half in fellowship placement; and 42 (59 percent) appear in the top half in job placement. Of the 32 programs listed in the chart as placing in the top half of the nonfiction survey, 28 (88 percent) also appear in the top half of both the one-year and four-year (fiction/poetry) surveys, with another three (9 percent) appearing in the top half of one of the two. In a sign that applicant mores remain fairly consistent over time, of the 60 programs appearing in the top half of programs in the four-year applicant survey, 58 (97 percent) also appear in the top half of programs in the one-year survey.
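The overlap figures above are simple shares of the relevant pools (71 programs for the fiction/poetry comparisons, 32 for the nonfiction survey, 60 for the four-year survey). As a quick arithmetic check, using only the counts reported in this section:

```python
def share(count, pool):
    """Percentage of a pool, rounded to the nearest whole percent."""
    return round(100 * count / pool)

POOL = 71  # programs in the top half of both popularity surveys

print(share(60, POOL))  # selectivity and funding overlap
print(share(47, POOL))  # student-faculty ratio overlap
print(share(56, POOL))  # fellowship placement overlap
print(share(42, POOL))  # job placement overlap
print(share(28, 32))    # nonfiction programs also in both popularity surveys
print(share(58, 60))    # four-year top-half programs also in one-year top half
```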

The approach to compiling the 2013 MFA Index was devised with the following goals in mind: (1) to better inform applicants of their program options by offering the nation’s first complete listing of graduate creative writing programs; (2) to more widely disseminate hard data regarding objective but non-exhaustive/non-conclusory measures of program quality, which often play a role in applicants’ application and matriculation decisions; (3) to be responsive to, and reflective of, the opinions, interests, concerns, and values of current applicants to graduate creative writing programs; (4) to emphasize the importance of both inputs and outputs in assessing individual graduate creative writing programs; and (5) to enhance the transparency of the admissions process for present and future applicants to graduate creative writing programs.

Popularity Surveys

In the twelve months between April 16, 2011, and April 15, 2012, 540 full-residency MFA applicants were surveyed on The MFA Draft 2012 Group, hosted by Facebook.com. Created by a group of then-current MFA applicants in the early spring of 2011, the group at its peak had nearly 1,500 members during the survey period, and featured dozens of new MFA-related conversation threads—some garnering up to a thousand individual responses—each day. The MFA Draft 2012 Group was created and is moderated by the applicants themselves; admission of new members was and is determined entirely by applicants; and decisions regarding the production of personal or program information for purposes of research were likewise made entirely by applicants. The site was promoted via several methods: word of mouth; links and references in the MFA Draft 2011 Group, the predecessor to the MFA Draft 2012 Group (and there is also, now, a well-attended MFA Draft 2013 Group); and links and references on The Creative Writing MFA Blog founded by novelist Tom Kealey. The four-year survey is a compilation of the data from this one-year survey and the past three years of Poets & Writers Magazine program assessments.

Individual users in the MFA Draft 2012 Group were distinguished by their proper names, as reflected by the full names (first and last) associated with their Facebook accounts. Internet research was conducted to verify applicants’ identities as and when authenticity was in question. The document in which Draft 2012 applicants voluntarily compiled their application decisions was part of an application that tracks all changes to uploaded documents by editors’ full names/Facebook accounts. This application ensured that any and all changes to the document were traceable to specific users. Users were permitted to amend their application lists in real time; though fewer than 15 percent of users elected to make amendments to their lists once posted, all changes to applicants’ lists were accounted for by both the one-year and four-year applicant surveys appearing in the program tables. Substantial additional security measures were taken to ensure the authenticity of compiled application lists.

As noted, surveyed applicants voluntarily listed the programs to which they had applied or intended to apply, and were permitted to adjust these lists at any point during the survey period.

Only recently collected data is of use to applicants. The one-year full-residency survey uses no survey data predating April 16, 2011; the four-year full-residency survey uses no survey data predating April 16, 2008; the low-residency survey uses no data predating April 16, 2007; and the creative writing doctoral survey uses no data predating April 16, 2007. The longer survey periods for low-residency MFA and creative writing doctoral programs were necessitated by the significantly smaller applicant pools for these degrees.

Eligibility for Inclusion in the 2013 MFA Index

Eligibility for the 2013 MFA Index was determined by how programs placed in seven survey areas: popularity among applicants, selectivity, funding, student-faculty ratio, fellowship placement, job placement, and location. Programs are categorized by the number of survey areas in which they place in the top half of all programs a) eligible for the category, and b) with data publicly available. The number of programs considered eligible for each category of assessment varies by the category; in some instances (such as fellowship placement and job placement) newer programs cannot yet be added to the pool of eligible programs because they have yet to graduate any poets or writers, whereas in other instances (such as selectivity, funding, and student-faculty ratio) certain programs may not yet have released the data necessary for them to be assessed in these areas. Popularity and location are the only categories in which all presently extant programs (or, in the case of the four-year applicant survey, all programs extant for a majority of the four-year survey period) were automatically members of the pool of eligible programs. Otherwise, the pool of eligible programs was 124 for the selectivity survey, 150 for the funding survey, 130 for the student-faculty ratio survey, 130 for the fellowship placement survey, and 90 for the job placement survey. The reason for the significantly lower number of eligible programs in the job placement category is that MFA graduates are generally not competitive for the sort of hires catalogued by the job placement assessment—full-time, non-adjunct positions teaching creative writing at the college/university level—until they have been away from their alma maters for at least five years, during which time they have continued to publish in their genre (and, in many instances, to gain teaching experience as adjuncts) in order to make themselves more marketable as full-time hires.

Given that the job placement survey tracks hires made between 2009 and 2012, only programs known to have graduated at least two classes of poets and writers by 2004 are considered to have had a full opportunity to place in the job placement category, with the result being that the number of programs founded prior to fall 2001 was used as the “eligible pool” figure for this measure (but note that, in order not to disadvantage programs founded after this time, such programs were nevertheless included in the survey if they successfully placed a graduate in an eligible position). As 90 full-residency programs were founded prior to fall 2001, the size of the pool for the job placement measure was set at 90. In future years the size of the “eligible pool,” consistent with the above-stated methodology, will increase.

MFA Program Response Rates

Graduate degree programs in creative writing respond to assessment-related inquiries at a lower rate than do programs in almost any other field of study in the United States. The following response rates were reported for questionnaires sent pursuant to the 2012 U.S. News & World Report program assessments (in each instance, the field of study is followed by the response rate from all programs surveyed in the field): Engineering (98 percent); Business (91 percent); Criminology (90 percent); Education (90 percent); Medicine (84 percent); Healthcare Management (76 percent); Statistics (67 percent); Law (66 percent); Public Health (61 percent); Audiology (57 percent); Library and Information Studies (56 percent); Pharmacological Sciences (56 percent); Social Work (56 percent); Occupational Therapy (53 percent); Veterinary Medicine (48 percent); Nursing (47 percent); Computer Science (46 percent); Physician Assistance (45 percent); Sociology (43 percent); Speech-Language Pathology (42 percent); Public Affairs (40 percent); Rehabilitation Counseling (40 percent); Fine Arts (39 percent); Political Science (37 percent); Economics (34 percent); Mathematics (34 percent); Physical Therapy (33 percent); English (31 percent); Physics (31 percent); Earth Sciences (29 percent); Clinical Psychology (28 percent); Chemistry (25 percent); Psychology (25 percent); History (23 percent); and Biological Sciences (15 percent). Respondent institutions in each of these academic fields were aware that their questionnaire responses would not be kept confidential, and that their participation in surveys sponsored by U.S. News & World Report would result in publication of a substantial stock of program-specific data regarding each university queried.

Every two years, the Association of Writers & Writing Programs (AWP) sends a questionnaire to programs in the field of creative writing—a field whose administrators and faculty are no longer surveyed (and whose programs are no longer assessed) by U.S. News & World Report—in much the same way U.S. News & World Report does for the thirty-five fields of study listed above. A crucial difference between the two questionnaires, however, is that the AWP questionnaire guarantees anonymity to its respondents; AWP releases no program-specific data or survey results pursuant to its biennial questionnaire. It is worth noting, too, that AWP estimates (as of its 2009 Annual Report) that 34 percent of programs in the field of creative writing have declined to become members of AWP. These programs are not subject to AWP questionnaires.

According to AWP’s publicly released summary of the program questionnaire it distributed in 2007, between 40 and 60 percent of AWP’s member programs declined to answer AWP’s queries regarding internal admissions and funding data. Specifically, 47 percent of programs declined to reveal how many assistantships they offered annually to incoming students, 61 percent declined to reveal the stipend offered to teaching assistants, 56 percent declined to reveal whether they offered a full-tuition waiver to teaching assistants, 49 percent declined to reveal how many scholarships were offered to incoming students, 55 percent declined to reveal their annual number of applicants, and 52 percent declined to reveal the size of their annual matriculating class. 

Avoidance of Respondent Bias

The higher education surveys published annually by U.S. News & World Report constitute the most extensive program assessment system in the United States, and they produce assessments drawn almost exclusively from individuals with no firsthand experience attending or teaching in the programs they are asked to assess. For the magazine’s much-lauded law school assessment, for instance, judges, lawyers, and law firm hiring coordinators are asked to assess the academic quality of programs others have attended, and that they have encountered only to the same extent an MFA applicant encounters the graduates of individual creative writing programs in the course of his or her in-genre reading (or, alternately, in a social or professional context). In fact, all of the program assessments published by U.S. News & World Report use the same basic methodology, as stated in the 2011 edition of the magazine containing its graduate school program assessments: “[These assessments] are based on the results of surveys sent to academics…[t]he individuals rated the quality of the program at each institution from marginal (1) to outstanding (5). Individuals who were unfamiliar with a particular school’s programs were asked to select ‘don’t know.’” This last provision merely ensures that survey respondents have some basic familiarity with the programs they are assessing; it does not ask or encourage respondents to submit an institutional (or personal) self-assessment.

As is the case with the methodology described above, national educational-institution assessment schemes have historically sought out unbiased observers to assess accredited degree programs, with self-reporting of interested observers implicitly or explicitly disallowed. The Poets & Writers Magazine 2013 MFA Index improves on this model by surveying individuals who not only are in a position to gauge the professional performance of individual programs’ graduates and professors (i.e., by reading their published work), but who also have access to—and a natural interest in—a large stock of hard data regarding the programs they are being asked to consider.

Overall assessments of program quality—in any field of study—are impossible, as such determinations differ depending upon the student, a large slate of unquantifiable program features, the period of time in which that student matriculates (as faculty turnover ensures program characters change over time), and a host of chance-based factors that no methodology ever devised could hope to or even wish to encapsulate. While the data provided in the Poets & Writers Magazine 2013 MFA Index should prove invaluable to applicants—especially given the historic opacity of graduate creative writing programs—it is no substitute for an individual applicant’s subtle, many-factored assessment of which program is best for him or her.

Survey Cohort Demographics

Online surveys conducted in 2010 using a Google-sponsored survey application suggest that the online MFA applicant community, including the community at The Creative Writing MFA Blog and the MFA Draft 2012 Facebook Group, subscribes to the current conventional wisdom (as first laid out in the 2005 edition of Tom Kealey's Creative Writing MFA Handbook) regarding the most important considerations in applying to and matriculating at an MFA program. Asked, "If you are a current creative writing MFA applicant, which of the following program features are among your top five reasons for choosing to apply to a particular program?", and given the alphabetically listed options "Alumni,” “Cost of Living,” “Curriculum,” “Duration,” “Faculty,” “Funding,” “Internship Opportunities,” “Location,” “Postgraduate Placement,” “Reputation,” “Selectivity,” “Size,” “Student-to-Faculty Ratio,” “Teaching Opportunities,” and “Other,” 909 survey respondents provided the following responses:

1. Funding (68 percent)                
2. Reputation (61 percent)
3. Location (59 percent)
4. Faculty (50 percent)
5. Teaching Opportunities (41 percent)
6. Curriculum (28 percent)
7. Cost of Living (23 percent)
8. Alumni (21 percent)
9. Duration (19 percent)
10. Size (13 percent)
11. Selectivity (13 percent)
12. Postgraduate Placement (11 percent)
13. Student-to-Faculty Ratio (10 percent)
14. Internship Opportunities (6 percent)
15. Other (5 percent)

There is substantial similarity between these survey results and the results of a 2009 survey that asked applicants, "Which of these is most important to your decision about where to apply?" with the listed options being "Location," "Funding," "Faculty," "Reputation," "Selectivity," "Curriculum," or "None of the Above" (respondents were permitted to select more than one answer). The top four answers, out of hundreds of survey responses, were identical to the top four responses in 2010:

1. Funding (56 percent)
2. Reputation (45 percent)
3. Location (32 percent)
4. Faculty (18 percent)

These survey responses also closely correspond to the responses provided by MFA faculty members when Poets & Writers Magazine asked them, in 2011, to provide in narrative form some of those program features they considered most important for prospective students to consider. Drawing on more than forty responses, the top priorities of MFA faculty members, as reported by the faculty members themselves, were as follows (the number in parentheses is the percentage of surveyed faculty members who cited a specific program feature):

1. Funding (71 percent)
2. Quality/Aesthetics of Faculty Work (69 percent)
3. Student Self-Reporting (60 percent)
4. Program Atmosphere (50 percent)
5. Faculty Accessibility (41 percent)
6. Teaching Opportunities (38 percent)
7. Location (36 percent)
8. Editing Opportunities (31 percent)

The remaining priorities were all cited by fewer than 30 percent of faculty respondents. Of the eight program features most frequently cited by faculty members as being appropriate bases to evaluate a graduate creative writing program, five—Funding, Quality/Aesthetics of Faculty Work, Teaching Opportunities, Location, and Editing Opportunities—are as well-known and accessible to, and as easily understood by, MFA program applicants as by current or former MFA students or their professors. The remaining three depend upon a phenomenon not yet common in the field of creative writing: For applicants to take into account Student Self-Reporting, Program Atmosphere, and Faculty Accessibility, programs would first need to routinely make current students and faculty available to non-admitted or pre-admission MFA applicants. As of the time of the writing of this article, none of the 224 creative writing MFA programs worldwide are known to regularly offer this courtesy to non-admitted or pre-admission applicants. It is worth noting, too, that of the next thirteen program features cited as important by current MFA faculty members, nine—Workshop Format, Curricular Flexibility, Alumni Publishing Success, Presence of a Reading Series, Curricular Intensity, Program Size, Cost of Living, Internship Opportunities, and Program Duration—are known or knowable to applicants at the time they make their application decisions, and two others are within the unique power of applicants to access (Program Visit and Alumni Self-Reporting). As of the writing of this article, fewer than 5 percent of creative writing MFA programs worldwide were known to offer funds to applicants for pre- (or even post-) admission program visits.

When applicants were asked, in 2010, “If you are a current creative writing MFA applicant, how old will you be when you begin your program, assuming you’re admitted this year?” 1,929 survey respondents provided the following responses on The Creative Writing MFA Blog:

1. 23 or 24 (18 percent)
2. 25 or 26 (16 percent)
3. 21 or 22 (13 percent)
4. 27 or 28 (11 percent)
5. Older than 40 (10 percent)
6. 29 or 30 (8 percent)
7. 31 or 32 (6 percent)
8. 33 or 34 (5 percent)
9. 35 or 36 (4 percent)
10. 37 or 38 (2 percent)
11. 39 or 40 (2 percent)

These results are consistent with earlier online survey results, from 2009, suggesting that the median age of a creative writing MFA applicant is between twenty-six and twenty-seven.

Asked, “As part of your research into MFA programs, how many current or former MFA students or faculty have you spoken to?” 686 survey respondents provided the following responses:

1. 1 to 2 (34 percent)
2. 3 to 5 (27 percent)
3. 0 (25 percent)
4. 6 to 10 (7 percent)
5. 11 or more (4 percent)

Asked, “Have you received advice from an undergraduate creative writing faculty member in applying to MFA programs?” 860 survey respondents provided the following responses:

1. Yes (59 percent)
2. No (30 percent)
3. Not Yet, But I Plan To (10 percent)                

In 2011, the application lists of a random sampling of three hundred 2010–2011 MFA applicants were analyzed to determine the frequency of different list sizes. The results were as follows (the first number is the number of programs on an applicant’s application list, while the second is the number of such lists in the analyzed sample; the third figure is the percentage of the total sample with an application list of the stated size):

1: 10 (3 percent)
2: 6 (2 percent)
3: 10 (3 percent)
4: 18 (6 percent)
5: 23 (8 percent)
6: 30 (10 percent)
7: 26 (9 percent)
8: 31 (10 percent)
9: 31 (10 percent)
10: 29 (10 percent)
11: 24 (8 percent)
12: 15 (5 percent)
13: 14 (5 percent)
14: 14 (5 percent)
15: 7 (2 percent)
16: 4 (1 percent)
17: 2 (1 percent)
18: 4 (1 percent)
19: 0 (0 percent)
20: 0 (0 percent)
21: 1 (0 percent)
22: 1 (0 percent)
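The distribution above also implies a median application-list size. A short sketch, using only the sampled counts reported above (list size mapped to the number of applicants with a list of that size):

```python
# List size -> number of applicants in the 300-person sample with that size.
list_size_counts = {
    1: 10, 2: 6, 3: 10, 4: 18, 5: 23, 6: 30, 7: 26, 8: 31,
    9: 31, 10: 29, 11: 24, 12: 15, 13: 14, 14: 14, 15: 7,
    16: 4, 17: 2, 18: 4, 19: 0, 20: 0, 21: 1, 22: 1,
}

total = sum(list_size_counts.values())
midpoint = total / 2

# Walk the distribution in order of list size until half the sample is covered.
running = 0
for size, count in sorted(list_size_counts.items()):
    running += count
    if running >= midpoint:
        median_size = size
        break

print(total, median_size)  # 300 applicants sampled; median list size of 8
```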

Asked, on The Creative Writing MFA Blog in 2010, "Why do you want to get a graduate creative writing degree?" and given the options "Credential," "Employability," "Time to Write," "Mentoring," "Networking," "Community," "Validation," "Avoid Work," and "None of the Above," with the option to select more than one answer, the top three answers, among hundreds of responses, were as follows:

1. Time to Write (55 percent)
2. Employability (43 percent)
3. Mentoring (36 percent)

The Poets & Writers Magazine 2013 MFA Index does not use the above survey data to create a weighting system for the columns of information it provides. There is a presumption, instead, that applicants' own application lists best reflect the extent to which they take into account funding, location, reputation, selectivity, faculty, curriculum, and other applicant-specific factors in choosing which programs to apply to and attend.

Were the above data used to create a weighting system for the data presented in the Poets & Writers Magazine 2013 MFA Index, or were the applicant survey to be removed from the table altogether, many of the nation's most prominent and popular programs would disappear from the table entirely, as programs widely admired by applicants (and working poets and novelists) do not always perform superlatively in hard-data measures. A program assessment missing critical application-trend data would constitute a poor reflection of the present national consensus on which programs are most popular among applicants and working authors alike. For instance, under the applicant survey’s current methodology a popular but largely unfunded MFA program in a major urban center might still appear in the top half of the one-year and four-year surveys, because even a relatively low standing in the funding, selectivity, student-faculty ratio, fellowship placement, and job placement categories can be counterbalanced by a program's popularity due to location, faculty, and/or other unquantifiable factors. The popularity of a program's location and faculty is best reflected by privileging applicants' application lists rather than a confluence of these lists and publicly accessible hard data.

Genre of Survey Respondents

Asked in 2010, using a Google-sponsored survey application, “What is your primary genre?” 701 respondents from The Creative Writing MFA Blog provided the following responses:

1. Fiction (53 percent)
2. Poetry (28 percent)
3. Nonfiction (15 percent)
4. Other (2 percent)

Isolating only the 578 poetry and fiction respondents to the above survey question, the results are as follows:

1. Fiction (65 percent)

2. Poetry (35 percent)

This suggests that the potential survey cohort at The Creative Writing MFA Blog is similar in its constitution, in terms of genre affiliation, to the national MFA-applicant cohort. Hard data from twenty MFA programs with available admissions data for both genres (constituting a total of twenty-four data-sets ranging in age from the 2008–2009 admissions cycle to the 2010–2011 admissions cycle) generates a total data-set of 12,368 applicants, 8,730 of these being fiction applicants (70 percent) and 3,638 poetry applicants (30 percent). The genre breakdown for the one-year applicant survey published in the MFA Index in 2012 is identical to this figure: As between poets and fiction-writers, 70 percent of the members of the surveyed cohort were fiction-writers and 30 percent were poets. For the 2011 applicant survey published in the summer of 2011 by Poets & Writers Magazine, the genre breakdown was 63 percent fiction-writers and 37 percent poets. Some of the deviations between the two surveys are best explained by the fact that the present (2012-published) survey more closely approximates the actual genre demographics of the national full-residency applicant pool.

Applicant survey respondents for the Poets & Writers Magazine 2013 MFA Index were self-selected. Just as a survey aimed at determining popular car manufacturers might use a self-selecting cohort to compile only the responses of the best-researched car buyers—for instance, those who had spent time on websites that allow consumers to compare various available car brands and styles—the one-year and four-year applicant popularity surveys do not intend to sample a generic cohort of MFA applicants. Instead, their aim is to primarily, if not exclusively, catalogue application decisions made by the best-researched MFA applicants, a class of applicants considerably more likely to be found in a massive, real-time applicant community in which scores of data points regarding individual programs are researched, shared, and discussed daily.

National Full-Residency Applicant Pool Size

The median estimate for the national full-residency fiction/poetry applicant pool (as calculated in 2011) is 2,797, the mean estimate is 3,253, and the adjusted mean is 3,042. The same series of calculations produced a median estimate, for the national nonfiction applicant pool, of 291, and a mean estimate of 345. The total size of the national full-residency applicant pool, across all three of the “major” genres of study, is therefore likely between 3,000 and 4,000. The four-year, 2,215-respondent applicant survey that appears in the Poets & Writers Magazine 2013 MFA Index consequently surveys the equivalent of 55 to 74 percent of an annual national applicant pool in the field of creative writing; the one-year surveys published annually by Poets & Writers Magazine survey between 13 and 23 percent of the three-genre national applicant pool for that admissions cycle.
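The coverage figures above follow from simple division; as a check, here is a minimal sketch (the pool estimates and respondent count are the article's, while the function name is ours):

```python
def survey_coverage(respondents, pool_low, pool_high):
    """Share of one annual national applicant pool represented by a
    survey cohort, given low and high estimates of that pool's size."""
    return respondents / pool_high, respondents / pool_low

# 2,215 four-year respondents against a 3,000-4,000 annual pool:
low_share, high_share = survey_coverage(2215, 3000, 4000)
# 2215/4000 is roughly 0.55 and 2215/3000 roughly 0.74,
# matching the 55-to-74-percent range reported above.
```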

Data Sources

For those program measures not subject to applicant surveys, such as recitations and ordered listings of admissions, curricular, placement, student-faculty ratio, and funding data, only data publicly released by the programs—either to individual applicants, to groups of applicants, to Poets & Writers Magazine directly, in a program's promotional literature, or via a program website—have been included in the index. All data were updated regularly to reflect programs' most recent public disclosures.

Many of the nation's full- and low-residency MFA programs decline to publicly release internal data. Programs unable or unwilling to release data regarding their funding and admissions processes are necessarily disadvantaged by an approach that relies on transparency. Yet no program that fails to release this data for applicants' consideration can avoid being judged, by applicants and other observers, through the lens of such nondisclosures.

The Nonfiction Survey

Because fewer than half (47 percent) of full-residency MFA programs offer a dedicated nonfiction or creative nonfiction track—defined as a curricular track which permits a master’s thesis in the genre—nonfiction and creative nonfiction applicants have been surveyed separately from poetry and fiction applicants. These survey responses do not factor, in any sense, into either the one-year or four-year popularity surveys published in the Poets & Writers Magazine 2013 MFA Index.

For the nonfiction/creative nonfiction survey, the designation “n/a” indicates that a given program does not offer a nonfiction track.

LOW-RESIDENCY SURVEY

Structure

Low-residency programs were assessed in eleven categories, nine of which are either applicant surveys or ordered listings of hard data—six employing unscientific but probative surveying of the sort described above, and three based upon publicly available hard data. Low-residency programs have not been assessed with respect to their funding packages because these programs generally offer little or no financial aid to incoming students; low-residency programs presume that their students will continue in their present employment during the course of their graduate studies.

Cohort

Over the course of five successive application cycles, a total of 280 low-residency applicants were surveyed as to their program preferences, with these preferences exhibited in the form of application lists. The locus for this surveying was (between April 16, 2007 and April 15, 2011) the Poets & Writers Magazine online discussion board, the Speakeasy Message Forum, widely considered the highest-trafficked low-residency community on the Internet; from April 16, 2011 to April 15, 2012, the survey locus was the MFA Draft 2012 Facebook Group. Three factors account for the relatively small cohort used for this surveying: (1) the annual applicant pool for low-residency programs is approximately one-eighth the size of the full-residency applicant pool; (2) low-residency applicants do not congregate online in the same way or in the same numbers that full-residency applicants do; and (3) low-residency programs are subject to a "bunching" phenomenon not evident among full-residency programs, with only ten programs nationally appearing on even 10 percent of survey respondents' application lists, and only three appearing on 20 percent or more.

One explanation for the bunching phenomenon may be that low-residency programs are less susceptible to comparison than full-residency programs, as many of the major considerations for full-residency applicants, including location, funding, cohort quality, class size, program duration, student-faculty ratio, job placement, and cost of living, are not major considerations for low-residency applicants due to the structure and mission of low-residency programs. Generally speaking, low-residency programs are assessed on the basis of their faculty and pedagogy, neither of which is conducive to quantification. It is worth noting, too, that a significant number of the world's fifty-seven low-residency MFA programs were founded within the last eight to ten years; applicant familiarity with these programs may still be relatively low.

The five-year low-residency surveying described above has been further broken down into year-by-year survey results. The survey cohort for the 2011-2012 annual survey was forty-six, for the 2010–2011 survey thirty-six, for the 2009–2010 survey eighty-nine, for the 2008–2009 survey fifty-six, and for the 2007–2008 survey fifty-three. If and when individual Speakeasy account-holders applied to programs in more than one admissions cycle, their application lists from each cycle were treated as separate survey responses; repeat applicants accounted for less than 10 percent of the survey cohort, however. Full-residency applicants on The Creative Writing MFA Blog who applied to one or more low-residency programs as part of their overall slate of target programs were also included in the low-residency survey; due to the exceedingly small number of such survey responses, these entries were manually compared both to one another and to existing low-residency application lists to ensure duplicate lists were avoided.

While surveys with larger cohorts are, all other things being equal, more reliable than those with smaller ones, the fact that the annual applicant pool for low-residency programs is likely between 350 and 400 suggests that the total survey cohort for the Poets & Writers Magazine 2013 MFA Index of low-residency programs likely represents approximately 50 percent of a single-year national applicant pool for this sort of degree program. Moreover, as is the case with the full-residency program table, crosschecking applicant survey responses across a period of five years reveals substantial consistency in the responses and quickly unearths any significant anomalies or outliers. Of the ten most popular low-residency programs listed in this year's index, eight (80 percent) were among the ten most popular programs—according to applicants—in all five years of surveys, while the other two programs were among the fifteen most popular low-residency programs in all five of the application cycles studied (and in both cases missed the ten-most-popular grouping in only a single admissions cycle).

An “n.d.” notation signifies that a program has not released the requisite data. Two dashes (--) indicate that the program did not place in that category. Only fourteen of the nation’s fifty-seven low-residency MFA programs earned a positive score in either of the two placement surveys, which considered placement data for full- and low-residency programs in a single assessment. In order to better acknowledge the achievement, in the placement categories, of these fourteen low-residency programs relative to their low-residency peers, and in recognition of the fact that low-residency graduates are substantially less likely to seek postgraduate fellowships or even postgraduate university teaching positions (largely because they do not give up their present employment when they matriculate), the national placement data collected for the low-residency table have been reconstituted as an ordered, low-residency-only listing. This applies equally to both the one-year and five-year applicant popularity surveys and to the surveys of selectivity and fellowship placement.
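The reconstitution described above amounts to filtering the combined full- and low-residency listing while preserving its order. A minimal sketch (the function name and placeholder program names are ours):

```python
def low_residency_listing(combined_ranking, low_residency_programs):
    """Reduce a combined full/low-residency ordered listing to an
    ordered, low-residency-only listing, preserving relative order."""
    return [p for p in combined_ranking if p in low_residency_programs]

# With placeholder names, a combined ranking of five programs:
combined = ["Full A", "Low X", "Full B", "Low Y", "Full C"]
low_only = low_residency_listing(combined, {"Low X", "Low Y"})
# -> ["Low X", "Low Y"]
```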

Low-Residency Applicant Pool Size

A realistic estimate for the annual number of low-residency MFA applicants is four hundred. Added to the adjusted mean for annual full-residency poetry, fiction, and nonfiction applicants, the estimate for the annual number of low-residency applicants suggests a total annual applicant pool to creative writing MFA programs—across all genres and types of residency, and gauging discrete applicants only—of somewhere between 3,500 and 4,250.

INTERNATIONAL PROGRAMS

Special Note on International Programs

The Poets & Writers Magazine full- and low-residency program tables have always considered, and will continue to consider, international MFA programs. However, international programs are unlikely to fare as well as they otherwise might in the surveys for several reasons: (1) nearly all non-U.S./non-Canadian graduate creative writing programs are (by U.S. accreditation standards) nonterminal (that is, they are M.Phil, M.St., or MA degrees, as opposed to the terminal MFA degrees considered by the Poets & Writers Magazine charts); (2) non-U.S./non-Canadian applicants are less likely to frequent U.S./Canadian-based MFA-related websites like The MFA Draft 2012 Facebook Group and The Creative Writing MFA Blog, and therefore non-U.S./non-Canadian programs are less likely to appear on the application lists of those surveyed for the Poets & Writers Magazine tables (and Canadian applicants applying to Canadian programs may be less likely to patronize the aforementioned websites than American applicants applying to American programs); (3) unlike U.S. and Canadian MFA programs, overseas programs are rarely fully funded for nondomestic students (U.S./Canadian MFA programs less frequently distinguish between domestic and international applicants with respect to funding eligibility), and therefore are less likely to be popular amongst the U.S. and Canadian applicants that frequent The MFA Draft 2012 Facebook Group and/or The Creative Writing MFA Blog; and (4) due to the exceedingly small number of non-U.S. terminal-degree MFA programs now in operation (well over 90 percent of all creative writing MFA programs now extant are located in the United States, and more than half of those in operation outside the United States were founded within the last five years), programs in Canada and elsewhere simply have fewer entrants into the international MFA system with which to achieve a relatively high placement in the applicant popularity surveys.

The 2013 MFA Index: Full-Residency Programs Categories

Funding

Nothing in the MFA Index funding assessments is intended to impugn the motives or character of professors, administrators, or staff at any of the nation's graduate creative writing programs. The presumption of the funding listing is that all of these groups have militated, and continue to militate, with varying degrees of success, for more funding for their students—and that, given the choice, every program would choose to be fully funded. Still, there is no question that some programs require virtually no financial outlay by admitted students, and others are expensive. The Poets & Writers Magazine 2013 MFA Index takes this into account, as funding is an important factor among the current MFA applicant pool when deciding where to apply—and is also rated the number one consideration by MFA faculties themselves.

Program funding packages were calculated on the basis of annual cost-of-living-adjusted stipend values for programs with full tuition waivers, and on the basis of annual cost-of-living-adjusted stipend values less annual tuition for programs offering only partial tuition waivers. Programs were further divided into categories on the basis of the percentage of each incoming class offered full funding. “Full funding” is defined as the equivalent of a full tuition waiver and an annual stipend of at least $8,000/academic year. No program offering full funding to less than 100 percent of its incoming class placed ahead of any program fully funded for all students. Likewise, no non-fully-funded program placed, in the numeric ordering of programs, ahead of any program in a higher “coverage” bracket. The five coverage brackets acknowledged by the hard-data funding assessment are as follows: “All” (100 percent fully funded); “Most” (60 to 99 percent); “Some” (30 to 59 percent); “Few” (16 to 29 percent); and “Very Few” (0 to 15 percent). All of these percentages refer to the percentage of each annual incoming class that receives a full funding package.

Programs that fully fund 33 percent or more of their admitted students were considered eligible for “package averaging.” If and when programs meeting this criterion were revealed to offer funding packages of differing value to different students, the total stipend value of all full-funding packages was divided by the number of such packages to determine average annual stipend value.

The funding category does take into account duration of funding, as programs’ funding packages were assessed for this category by multiplying average annual package value by the duration of each program in years. Other than for the deduction of outstanding tuition costs (as described above), the varying amount of tuition charged at individual programs was disregarded, as students receiving full funding do not, by definition, pay tuition.
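The funding arithmetic of the preceding paragraphs can be sketched as follows. The bracket boundaries are those given above; the function names, the form of the cost-of-living adjustment, and the example values are our assumptions, not the index's actual internal calculation:

```python
def package_value(annual_stipend, col_index, duration_years,
                  full_tuition_waiver, annual_tuition=0):
    """Cost-of-living-adjusted stipend value over the life of the program.
    Outstanding tuition is deducted only where the waiver is partial;
    fully funded students by definition pay no tuition."""
    annual = annual_stipend / col_index  # assumed form of the COL adjustment
    if not full_tuition_waiver:
        annual -= annual_tuition
    return annual * duration_years

def coverage_bracket(pct_of_class_fully_funded):
    """The five 'coverage' brackets named above; programs in a higher
    bracket always precede programs in a lower one."""
    if pct_of_class_fully_funded == 100:
        return "All"
    if pct_of_class_fully_funded >= 60:
        return "Most"
    if pct_of_class_fully_funded >= 30:
        return "Some"
    if pct_of_class_fully_funded >= 16:
        return "Few"
    return "Very Few"

# A hypothetical fully funded three-year program with a $20,000 stipend
# in an average-cost city: 20000 / 1.0 * 3 = 60000 total package value.
value = package_value(20000, 1.0, 3, full_tuition_waiver=True)
```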

Applicants should be aware that many programs deduct administrative fees—almost always less than $1,000, and usually less than $500—from their annual stipends. These fees were not considered in the funding listing. Moreover, some programs offer health insurance to all admitted students and some do not. Programs that offer health insurance to all admitted students include, but are not limited to, the following (programs are listed in order of their appearance in the numeric funding ordering): University of Texas in Austin [Michener Center]; Cornell University in Ithaca, New York; University of Michigan in Ann Arbor; Louisiana State University in Baton Rouge; Ohio State University in Columbus; University of Alabama in Tuscaloosa; Virginia Polytechnic Institute (Virginia Tech) in Blacksburg; Washington University in Saint Louis, Missouri; Arizona State University in Tempe; Iowa State University in Ames; Purdue University in West Lafayette, Indiana; University of Minnesota in Minneapolis; McNeese State University in Lake Charles, Louisiana; Pennsylvania State University in University Park; University of Iowa in Iowa City; University of Wyoming in Laramie; Vanderbilt University in Nashville; University of Wisconsin in Madison; University of Texas in Austin [English Department]; University of Virginia in Charlottesville; University of California in Irvine; University of Oregon in Eugene; University of Central Florida in Orlando; University of New Mexico in Albuquerque; Rutgers University in Camden, New Jersey; and Oklahoma State University in Stillwater.

Selectivity

As fewer than five full- or low-residency programs nationally publicly release “yield” data—the percentage of those offered admission to a program who accept their offers and matriculate—the acceptance rate figures used for the index’s selectivity listing are necessarily yield-exclusive. Most have been calculated using the simplest and most straightforward method: taking the size of a program's annual matriculating cohort in all genres and dividing it by the program's total number of annual applications across all genres. Of the 92 full-residency programs with both an annual applicant pool of more than fifty and a known acceptance rate, thirty-two (35 percent) had available admissions data from the 2011–2012 admissions cycle; twenty-eight (30 percent) most recently released data from the 2010–2011 admissions cycle; fourteen (15 percent) from the 2009–2010 admissions cycle; five (5 percent) from the 2008–2009 admissions cycle; and five (5 percent) from the 2007–2008 admissions cycle. In total, 74 programs (more than 80 percent of those with available admissions data and a sufficiently large annual applicant pool) had data available from 2010 or later.
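The yield-exclusive calculation described above amounts to a single division. A minimal sketch, with hypothetical figures not drawn from any particular program:

```python
def yield_exclusive_acceptance_rate(matriculants, applications):
    """Annual matriculating cohort (all genres) divided by total annual
    applications (all genres). Yield-exclusive: admitted applicants who
    decline their offers are invisible to this figure, which therefore
    understates the share of applicants actually offered admission."""
    return matriculants / applications

# A hypothetical program matriculating 12 students from 600 applications:
rate = yield_exclusive_acceptance_rate(12, 600)  # 0.02, i.e., 2 percent
```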

The relative paucity of data available for the selectivity listing—acceptance rates are available for 121 of the 224 MFA programs worldwide (54 percent; however, dozens of these data-unavailable programs were too new to have produced reliable admissions trends yet)—is partly attributable to programs' continued reluctance to release the sort of internal admissions and funding data regularly released by colleges, universities, and most professional degree programs. Hundreds of interviews with MFA applicants between 2006 and 2012 suggest that a program's acceptance rate is one of the five pieces of information applicants most frequently seek out when researching a graduate creative writing program.

In order to avoid artificially privileging smaller or regional programs with an unknown but possibly modest annual yield—that is, programs with small applicant pools but also small incoming cohorts, and consequently, in some instances, extremely low yield-exclusive acceptance rates—only programs receiving more than fifty applications annually were eligible for the selectivity listing. Of the sixty-five full-residency programs with unknown admissions data, no more than ten would likely even be eligible for inclusion in the selectivity listing on the basis of their applicant-pool size. Whether these programs' annual incoming cohorts are also sufficiently small—and thus the programs, statistically, sufficiently selective—to make any of these programs entrants into the top half of all programs in the selectivity category is unknown. The likelihood is that three or fewer programs that would otherwise appear in the top half of all programs for selectivity are ineligible for the selectivity listing solely because they have thus far declined to publicly release their admissions data.

Of programs with fewer than fifty applications whose admissions data are known, the ten most selective programs (from most to least selective) are as follows: Northern Michigan University in Marquette; Old Dominion University in Norfolk, Virginia; Temple University in Philadelphia; Savannah College of Art & Design in Georgia; Otis College of Art & Design in Los Angeles; University of Missouri in Kansas City; University of Central Florida in Orlando; Butler University in Indianapolis; Chapman University in Orange, California; and Sewanee: University of the South in Tennessee.

The small number of low-residency programs with publicly accessible acceptance rates makes crafting a selectivity listing for such programs difficult. Of the 19 programs (33 percent of all low-residency programs) with available data, many have available admissions data only from the 2007–2008 admissions cycle or earlier. Fortunately, of the fourteen programs in this class most popular among applicants, nine (64 percent) have available admissions data. Moreover, the three most popular programs (in the view of applicants) have all released data from one of their past three admissions cycles.

The applicant-pool-size cutoff for inclusion in the low-residency selectivity listing is set at forty annual applicants.

Student-Faculty Ratio

Using data on individual programs’ total student-body sizes, along with recitations of full-time core faculty in fiction, nonfiction, and poetry from program websites and the Poets & Writers Magazine online MFA database [3], student-faculty ratios were calculated for the 122 full-residency MFA programs (75 percent of all such programs) with both sets of data available. Tiebreakers in student-faculty ratio were awarded (where necessary and where possible) to the program with the higher number of total core faculty members. Note that this listing takes into account any and all fiction, poetry, and nonfiction faculty and students at individual programs, not merely faculty and students in fiction and poetry.
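The ordering rule above, including the tiebreaker, can be sketched as a sort key (program names and figures below are invented for illustration):

```python
def ratio_key(total_students, core_faculty):
    """Sort key: lower student-faculty ratio first; ties broken in
    favor of the program with more total core faculty (negated so
    that a larger faculty sorts earlier)."""
    return (total_students / core_faculty, -core_faculty)

# Hypothetical programs as (total students, core faculty):
programs = {"A": (60, 10), "B": (12, 2), "C": (30, 6)}
ordered = sorted(programs, key=lambda p: ratio_key(*programs[p]))
# A and B are tied at 6:1; A wins the tiebreak with 10 core faculty,
# so the ordering is C (5:1), then A, then B.
```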

Just as a large percentage of applicants report that they prefer, all things being equal, a more selective program, or a better-funded program, or a program that performs better at placing its graduates in fellowships and full-time jobs post-graduation, generally speaking creative writing graduate students prefer a lower student-faculty ratio to a higher one—the better to have immediate and meaningful access to those charged with instructing, mentoring, and advising them.

Fellowship Placement

Programs' postgraduate fellowship placement records were assessed by determining how many individual "placement events" a given program's current students or (much more commonly) graduates achieved during the past decade (2002 to 2012). Only a limited number of fellowships and residencies are available to MFA graduates while in-program or immediately postgraduation, and fewer still are specifically targeted at current MFA students and/or recent MFA graduates. Most of these make publicly available the names and biographical data of their fellows and residents. The focus for this year's fellowship placement listing was on forty-two of the fellowships and residencies in this group—generally speaking, the nation's most prestigious post-MFA fellowships and residencies.

The fellowships and residencies surveyed for this measure were the following: The Akademie Schloss Solitude in Stuttgart, Germany; the Amy Clampitt Residency Award at the Amy Clampitt House in Lenox, Massachusetts; the Axton Fellowship at University of Louisville in Kentucky; the Bard Fiction Prize and Residency at Bard College in Annandale-on-Hudson, New York; the Bennett Fellowship/Writer-in-Residence at Phillips Exeter Academy in Exeter, New Hampshire; Bread Loaf Writers’ Conference Fellowships at Middlebury College in Vermont; the Charles Pick Fellowships at the University of East Anglia in Norwich, United Kingdom; the Daehler Fellowship/Writer-in-Residence at Colorado College in Colorado Springs, Colorado; the David T.K. Wong Fiction Fellowships at the University of East Anglia in Norwich, United Kingdom; the Dobie Paisano Fellowship Program at the University of Texas in Austin; the Elma Stuckey Liberal Arts and Sciences Emerging Poet-in-Residence Program; the Emory Creative Writing Fellowship in Atlanta, Georgia; the Eva Jane Romaine Coombe Writer-in-Residence Program at Seven Hills School in Cincinnati, Ohio; the Fine Arts Work Center Fellowships in Provincetown, Massachusetts; the Gaius Charles Bolin Fellowship in English at Williams College, Williamstown, Massachusetts; the Gettysburg Emerging Writer Lectureship; the Herbert Martin Fellowship in Creative Writing and Diversity at Dayton University in Ohio; the Hodder Fellowship at Princeton University in New Jersey; the HUB-BUB Artist-in-Residence Program in Spartanburg, South Carolina; the Hugo House Writer-in-Residence at the Richard Hugo House in Seattle, Washington; the James Merrill Writer-in-Residence at the James Merrill House in Stonington, Connecticut; the Jenny McKean Moore Writer-in-Residence at George Washington University, Washington, D.C.; the Kelly Writers House ArtsEdge Residency at the University of Pennsylvania in Philadelphia; the Kenan Visiting Writer Lectureship at University of North Carolina at Chapel Hill; the Kenyon Review Fellowships at Kenyon College in Gambier, Ohio; the Kimmel Harding Nelson Center Writer-in-Residence Fellowships in Nebraska City, Nebraska *; the Madeleine P. Plonsker Emerging Writer’s Residency Prize at Lake Forest College in Illinois; McKnight Artist Fellowships at The Loft Literary Center in Minneapolis, Minnesota; the Milton Center Image Fellowships at Seattle Pacific University in Washington; the Moseley Fellowship in Creative Writing at Pomona College in Claremont, California; New York Foundation for the Arts Artist Fellowships; the Olive B. O’Connor Fellowship at Colgate University in Hamilton, New York; the Philip Roth Residency at Bucknell University; Ruth Lilly Poetry Fellowships from the Poetry Foundation in Chicago, Illinois; the Writer-in-Residence Residency at the Saint Albans School in Washington, D.C.; the Southern Review Resident Scholar Program at Louisiana State University in Baton Rouge; the Stadler Fellowship at Bucknell University in Lewisburg, Pennsylvania; the Steinbeck Fellowship at the Center for Steinbeck Studies at San Jose State University in California; the Stegner Fellowships at Stanford University in Palo Alto, California; the Tickner Fellowship at The Gilman School in Baltimore, Maryland; Studio Center Fellowships from the Vermont Studio Center in Johnson *; and the Wisconsin Creative Writing Institute Fellowships at the University of Wisconsin in Madison.

* = Due to their brief duration and implicit emphasis on poets and writers already working full-time outside of the academy, these placements were assessed as to low-residency program graduates only.

These forty-two fellowships and residencies played host to 900 placement events between 2002 and 2012. As the fellowship placement listing acknowledges placement “events” rather than placed fellows or residents, it is possible for a single fellow or resident to be the subject of more than one placement event.

As simply ordering programs by the number of their students or graduates subject to placement events between 2002 and 2012 would unfairly favor larger programs (which naturally have more graduates on the fellowship market annually), programs have instead been ordered on the basis of a placement score, calculated as follows: A program's total number of placement events between 2002 and 2012 was divided by the size of the program's annual incoming cohort. The resulting scores ranged from 4.40 to 0.03. In several instances, programs identical both in size and in their number of placement events received identical placement scores; where possible, these ties were broken by privileging the program with the higher number of total placement events. Programs founded during the assessment period had their scores pro-rated on the basis of how many years (out of the last ten) they had had a full class of graduated students on the postgraduate fellowship market.
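The placement-score ordering described above can be sketched as follows. The pro-rating rule is one plausible reading of the article (scale the score up when a program has had fewer than ten years of graduates on the market); the function names and example figures are ours:

```python
def placement_score(events, cohort_size, years_on_market=10):
    """Placement events per seat in the annual incoming cohort,
    pro-rated for programs founded during the 2002-2012 period
    (assumed reading: scale by 10 / years_on_market)."""
    score = events / cohort_size
    if years_on_market < 10:
        score *= 10 / years_on_market
    return score

def ordering_key(events, cohort_size, years_on_market=10):
    """Sort key: higher placement score first; ties between programs
    identical in size and event count are broken, where possible, by
    total placement events."""
    return (-placement_score(events, cohort_size, years_on_market), -events)

# A program with 44 events and a 10-student annual cohort scores 4.4,
# the top of the 4.40-to-0.03 range reported above.
top = placement_score(44, 10)
```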

Because fellowships and residencies draw no distinction between full- and low-residency programs, this is the only measure in which full- and low-residency programs were combined in a single assessment. This said, low-residency programs were subsequently granted their own numeric ordering, in recognition of the fact that these programs are hampered by the decreased likelihood that their graduates will seek fellowships or residencies in the first instance (as by definition low-residency students already have full- or part-time employment).

This fellowship placement listing should be used with caution. As selection for a fellowship or residency is often the result of one or more individuals being the "consensus pick(s)" of a panel of judges—and as consensus in art has not always, historically, favored innovation—it is possible for selected fellows and residents to in some instances be amongst the most talented, but not necessarily the most innovative, of their graduating year's national cohort. This is by no means to impugn, or remark upon, the writing of any particular fellow or resident, or on the selections of any particular fellowship or residency. Instead, the emphasis in this caveat is on causation: Applicants should presume neither that a program with a high standing in the placement category can ensure them a fellowship or residency, nor that a program with such a standing necessarily hosts the strongest student cohort if innovation, rather than technical mastery, is the particular interest of the applicant.

Job Placement

Between 2008 and 2012, the most popular online discussion board for creative writing job-seekers pursuing full-time employment at the university level, The Academic Jobs Wiki, listed 353 full-time positions available for poets and fiction writers. Data on the individuals ultimately hired for these openings were available for 220 of these 353 listings (62 percent). Based on research into the educational credentials of the individuals ultimately hired for these positions, an ordered listing was created to indicate which graduate creative writing programs’ alumni enjoyed the most success on the academic job market over these four hiring cycles. The following figures may be of academic interest to those tracking employment opportunities for creative writers in higher education:

2008–2009 Hiring Season: Sixty-eight positions available (hire information available for 62 percent); male/female split for those positions with available data for final hires was 57 percent female, 43 percent male.

2009–2010 Hiring Season: Eighty-four positions available, ten of which were holdovers from the previous year (i.e., cancelled or frozen searches from the previous year); hire information available for 52 percent; male/female split for those positions with available data for final hires was 57 percent female, 43 percent male.

2010–2011 Hiring Season: Seventy-five positions available, eight of which were holdovers from the previous year (i.e., cancelled or frozen searches from the previous year); hire information available for 79 percent; male/female split for those positions with available data for final hires was 51 percent male, 49 percent female.

2011–2012 Hiring Season: One hundred forty-six positions available, one of which was a holdover from the previous year (i.e., a cancelled or frozen search from the previous year); hire information available for 36 percent; male/female split for those positions with available data for final hires was 51 percent male, 49 percent female.

While the number of available creative writing positions in higher education appears to be increasing, given that the world’s 224 full- and low-residency MFA programs, and 33 doctoral programs in creative writing, graduate more than 2,000 poets and 2,000 fiction-writers every year, along with between 500 and 1,000 nonfiction writers (some of whom have qualifications and prior publications in fiction and/or poetry), the data above suggest that each year full-time teaching positions at the university level are available for, on average, well under 1 percent of graduate creative writing program alumni. Even if graduates were only required to compete for employment against those in their own annual cohort, and even assuming only between 10 and 20 percent of nonfiction program graduates can or do compete for positions advertised for poetry and/or fiction, this figure would be less than 4 percent. Realistically, however, each year’s graduate creative writing program alumni are competing against an ever-increasing stock of unemployed, underemployed, and employed-but-still-job-hunting alumni from previous years.

While surveys of MFA applicants suggest that only about half of the nation’s creative writing program graduates wish to teach, even this statistic—if it is used to amend the figures provided above—cannot bring an individual degree-holding poet or writer’s employment chances (all things being equal) higher than, at best, 8 percent. Consequently, those graduate creative writing programs with the best track records in terms of job placement—the ten highest-placing programs in this measure achieved full-time job-placement rates, during the period assessed, of between 10 and 25 percent—offer students significant added value as they pursue postgraduate employment. Whether higher job placement rates at certain schools are due to stronger alumni networks, better career placement services, better teaching, or simply more talented and/or better-published graduates is unclear, though there appears to be a high correlation between a program’s standing in this measure and its standing in other cohort-assessment indicia.

Program Duration

A program's duration is measured by the average length of its curriculum in years. Some programs allow students to petition to extend their stay; because such petitions are granted on a case-by-case basis, and because the granting of such petitions often results in only an additional unfunded, non-tuition-remitted year of study, individual programs' program-extension policies have not been considered in calculating program duration. Nationally, only one full-residency MFA program is known to be one year in duration, and only two programs are four years in duration. The remaining 164 full-residency programs are either two or three years in duration, with a notable trend being the increasing popularity of three-year programs among applicants.

Generally speaking, few programs that fund less than 33 percent of their incoming students are more than two years in duration. In contrast, fully funded programs are slightly more likely to be three years in duration as opposed to two. Low-residency programs are much more commonly two years in duration rather than three, and yet these programs cannot be measured by their duration because, unlike full-residency programs, they are generally highly flexible with respect to the duration of their students' courses of study.

Assessments of program duration do not consider the availability of postgraduate fellowships, or automatic postgraduate placements, unless these opportunities are guaranteed to all rising third-years in good standing. Where the average number of opportunities available each year is known, nonguaranteed postgraduate funding opportunities are formally acknowledged in the program-funding assessments. That said, at least two programs, Cornell University in Ithaca, New York, and the University of Michigan in Ann Arbor, so consistently offer a postgraduate lectureship to all or nearly all of their graduating students that Cornell may nominally be considered a three-year program (some students receive a fourth-year lectureship as well), and the three-year program at Michigan a four-year program. Another program, the University of Iowa in Iowa City, is known to offer postgraduate fellowships or lectureships to as many as 33 percent of its graduates. In the index, these three programs have had a “+” appended to their program duration to indicate the frequent availability of program-sponsored postgraduate fellowship and employment opportunities.

Program Size

In the MFA Index, the size of a program's annual incoming cohort is expressed using the familiar apparel-size abbreviations: XS (Extra-Small, an average total of two to nine students per matriculating class, across all genres combined); S (Small, ten to nineteen students); M (Medium, twenty to thirty students); L (Large, thirty-one to forty-nine students); and XL (Extra-Large, fifty or more students per year). Because many programs do not publish their matriculating class size on their websites, in some instances this figure has been extrapolated from other available information. One program, the University of Wisconsin in Madison, by necessity received a special dispensation in several categories, as it is the only MFA program in the United States or abroad to admit fiction and poetry students in alternating years. This required two methodological accommodations: (1) statistical extrapolation for the one-year and four-year applicant surveys: the program’s previous-year share of survey responses in the “off-year” genre (that is, the percentage of all poetry-applicant responses compiled for the 2010–2011 application cycle that the University of Wisconsin’s poetry program received) is multiplied by the number of respondents in that genre in the current year, and the result is added to the actual number of applicant responses attributable to the program in the “on-year” genre; and (2) averaging the program’s class-size figures. Because the program accepts six poets and six fiction writers every two years, it is treated as having an average annual matriculating class size of six.
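The Wisconsin accommodation reduces to a simple formula. In the sketch below, every numeric input is hypothetical, invented solely to illustrate the calculation described above:

```python
def off_genre_estimate(prev_year_program_share, current_year_genre_respondents):
    """Extrapolate survey responses for the 'off-year' genre: the program's
    share of all responses in that genre last cycle, multiplied by this
    cycle's respondent count in that genre."""
    return prev_year_program_share * current_year_genre_respondents

# HYPOTHETICAL figures: last cycle the program drew 4 percent of all
# poetry-applicant responses; this cycle there are 600 poetry respondents,
# and the "on-year" fiction cohort drew 35 actual responses.
estimated_poetry = off_genre_estimate(0.04, 600)  # 24.0 extrapolated responses
combined_responses = estimated_poetry + 35        # 59.0 total for the survey

# Class size: six poets and six fiction writers admitted every two years.
average_annual_class_size = (6 + 6) / 2           # 6.0
```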

Students Fully Funded

Full funding is defined as the equivalent of a full tuition waiver and a minimum $8,000/academic year stipend. Where the tuition waiver offered is less than 100 percent, the program's stipend value is reduced by the amount an admitted student is asked to pay in tuition annually. All stipend values are adjusted for cost of living. Cost of living assessments were made using the website Sperling's Best Places (www.bestplaces.net/COL/default.aspx [4]). Healthcare costs, administrative fees, and student relocation costs were not estimated or considered, nor was the cost of tuition—as students receiving full funding do not pay tuition.

In setting the stipend value for full funding at a cost-of-living-adjusted $8,000/academic year, the funding listing makes no assertion as to whether this should be considered a living wage. A single individual earning this amount per annum is eligible for the Earned Income Tax Credit when filing federal taxes; however, because the "$8,000/academic year" standard translates to $8,000/nine months, this rate of income accrual in fact extrapolates to an annual income of $10,667. This is still below $13,440—the amount, in 2009 dollars, at which a single individual is phased out completely from the Earned Income Tax Credit.
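The annualization described above is straightforward arithmetic: a nine-month academic-year stipend is scaled to twelve months and compared against the 2009 EITC phase-out threshold.

```python
ACADEMIC_YEAR_MONTHS = 9
EITC_PHASEOUT_2009 = 13440  # full phase-out for a single filer, 2009 dollars

def annualized(stipend_per_academic_year):
    """Extrapolate a nine-month academic-year stipend to a twelve-month income."""
    return stipend_per_academic_year / ACADEMIC_YEAR_MONTHS * 12

income = annualized(8000)
print(round(income))                # 10667
print(income < EITC_PHASEOUT_2009)  # True: still within EITC eligibility
```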

The MFA Index acknowledges that MFA students receiving the minimum full-funding stipend may still find themselves borrowing a de minimis amount (defined as less than $3,000/academic year) to help defray the costs of program attendance. For the purposes of this article, the de minimis borrowing level has been set at that rate of borrowing that both puts an applicant out of range of pro-rated EITC coverage and yet results in less than $10,000 in total federal debt during a three-year MFA program. Of the nation's 41 fully funded full-residency programs, only two are known to offer cost-of-living-adjusted stipends of less than $10,000/academic year.

Cost of Living

The cost of living in the various programs' listed host locations was determined using Sperling's Best Places (www.bestplaces.net/COL/default.aspx [4]). All cost-of-living data were then compared to a randomly selected national-average-range constant, in this case Ann Arbor, Michigan.

Notations used for cost of living are as follows: Very Very Low (25 percent or more below Ann Arbor, Michigan, the national-average-range constant for the 2013 MFA Index); Very Low (between 16 and 24 percent below); Low (between 6 and 15 percent below); Average (between 5 percent below and 5 percent above); High (between 6 and 15 percent above); Very High (between 16 and 24 percent above); and Very Very High (25 percent or more above the cost of living in Ann Arbor). While some students may choose to live outside the boundaries of their program's host location, commuting to an MFA program rather than living near campus carries hidden costs of its own, both pecuniary and nonpecuniary. For this reason, only a program's host location was assessed for this measure. Cost-of-living adjustments were also used to determine the package value at individual programs for the funding and “full funding” categories (see “Full-Residency Program Profiles: Additional Program Measures: Funding” and “Full-Residency Program Profiles: Additional Program Measures: Full Funding”).
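Under these definitions the notation reduces to a banding function over the percentage difference from the Ann Arbor baseline. The sketch below assumes the bands are symmetric and non-overlapping, with Very High covering 16 to 24 percent above the baseline, mirroring Very Low:

```python
def col_notation(pct_diff):
    """Map a cost-of-living difference (in percent relative to the Ann Arbor
    baseline; negative means cheaper) to the index's notation. Band edges
    assume symmetric, non-overlapping ranges."""
    if pct_diff <= -25:
        return "Very Very Low"
    if pct_diff <= -16:
        return "Very Low"
    if pct_diff <= -6:
        return "Low"
    if pct_diff <= 5:
        return "Average"
    if pct_diff <= 15:
        return "High"
    if pct_diff <= 24:
        return "Very High"
    return "Very Very High"
```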

Teaching Load

While individual applicants' interest in teaching composition, rhetoric, literature, or creative writing to undergraduates will vary, generally speaking the most popular teaching load is a 1/1 (one course to be taught in the fall semester, one in the spring semester). The teaching loads of individual programs have not been ordered in a vertical hierarchy per se, yet this 1/1 standard has been used to determine whether a given program's teaching load is considered Low (“L”), Average (“A”), or High (“H”). That is, because the 1/1 load is the most popular amongst applicants—though it is not the most common teaching load at MFA programs—average annual teaching loads of 0/0, 0/1, 1/0, and 1/1 have been denominated “Low.” An average annual teaching load of 2/1 or 1/2 (the most common teaching load) is termed "Average," while an average annual teaching load of 2/2 is considered "High." Note that the term “load” is not used here pejoratively; some applicants will wish to teach more rather than less, even as other applicants prefer to do no teaching whatsoever. At present the MFA Index takes no position whatsoever on the academic or professional value of teaching a large or small number of undergraduate sections per academic year, nor on the effect such teaching may or may not have on individual students’ personal writing practices.

The term “average” is used here in two different senses: First, to denote a category of teaching load; second, to indicate that all programs are assessed by their “average” teaching load. Because many programs offer different teaching-load packages to different students, and/or increase or decrease teaching load over the duration of the program, the average (mean) number of courses taught per year per student in each program is used. In some instances, students may request and/or get assigned—once admitted to a program—a larger and therefore better-compensated teaching load. Such additional teaching sections are by no means guaranteed, however, and therefore are not noted in or considered by the MFA Index.

Some programs fund a small enough percentage of admittees through teaching assistantships that to assign such programs an "average teaching load" would be to wrongly imply that admitted students are likely to receive an assistantship. For this reason, programs that offer assistantships to less than one-third of their incoming cohort received an “--” in the “teaching load” column. Programs eligible for a “teaching load” designation, but which do not publicly disclose the teaching load they assign their teaching assistants, are indicated with a “no data available” (“n.d.”) notation.
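The L/A/H designation described above can be expressed as a small classifier. The string notation and the handling of loads the article does not name (for example 2/0) are assumptions; here any load with more than one course in a semester and a three-course annual total is treated as Average, and heavier totals as High:

```python
def load_designation(load, assistantship_share=1.0, disclosed=True):
    """Classify an average annual teaching load such as '1/1' or '2/1'.

    Programs offering assistantships to fewer than one-third of the incoming
    cohort receive '--'; eligible programs that do not disclose their load
    receive 'n.d.' (no data available).
    """
    if assistantship_share < 1 / 3:
        return "--"
    if not disclosed:
        return "n.d."
    fall, spring = (int(n) for n in load.split("/"))
    if max(fall, spring) <= 1:   # 0/0, 0/1, 1/0, 1/1
        return "L"
    if fall + spring == 3:       # 2/1 or 1/2, the most common load
        return "A"
    return "H"                   # 2/2 and heavier
```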

CGSR Compliance

The Council of Graduate Schools Resolution, also known as the “April 15th Resolution,” states that graduate programs that are signatories to the Resolution shall keep funded offers of admission open through April 15 of each application cycle. Colleges and universities that adhere to the Resolution represent that all of their constituent programs and departments adhere to the terms of the Resolution, which include mailing a copy of the Resolution with all acceptances. Under the terms of the Resolution, programs may neither rescind nor threaten to rescind offers of admission to which any funding whatsoever is attached prior to April 15, nor may they explicitly or implicitly indicate to such accepted candidates, in writing or in person or via telephone, that there is any deadline for their matriculation decision other than April 15. Historically, MFA applicants have reported widespread noncompliance with the Resolution, which is problematic for applicants because CGSR-violative programs often require final matriculation decisions from applicants well before they have heard admissions responses from the other programs to which they applied. Applicants increasingly see such acceptances as excessively restrictive of their options and opportunities.

At present, only two CGSR signatories are believed to be noncompliant with the contract they and more than a hundred other universities signed and published for prospective applicants. This said, the CGSR Compliance category does not distinguish between programs known to have already violated the Resolution and those nonsignatories that simply could do so without running afoul of their host universities’ administrative policies. Therefore, while applicants should exercise due diligence and caution in applying to programs that are not CGSR compliant, they should also not presume violations will occur. The best policy is to contact nonsignatory programs directly and inquire regarding their CGSR-related policies; needless to say, some programs will welcome such queries more than others, as of late the question of the CGSR’s viability for creative writing MFA programs has been hotly contested by certain nonsignatory programs.

Any signatory to the CGSR found to be in violation of that contract will be listed as noncompliant, whether or not the program’s host college or university continues to be a CGSR signatory. Compliance inquiries are initiated on the basis of applicant self-reporting; since 2006, fully 100 percent of applicant complaints regarding programs’ CGSR-related policies have been found, following an investigation, to be meritorious. Indeed, in all but one instance the offending program ultimately confessed to the violation.

GRE Required

This category indicates whether or not a program requires applicants to submit Graduate Record Examination (GRE) General Test scores as part of their applications. Generally, programs that offer a substantial portion of incoming students some form of financial aid require these scores, and so applicants are advised to take this test prior to applying in order to avoid artificially limiting their application options. In most instances, student scores are only lightly scrutinized (or simply ignored altogether) by the programs themselves, and instead reviewed—where they are reviewed—by individual universities’ Graduate Colleges, which often have minimum GRE-score requirements (typically very generous ones). Creative writing MFA applicants should not avoid the GRE General Test for fear of the Mathematics portion of the exam; even those programs that do give minor weight to standardized test scores in their admissions processes generally look only at applicants’ Verbal and Analytical Writing scores. At present no programs require the GRE Subject Test in English Literature, though two programs (Johns Hopkins University in Baltimore and Boston University in Massachusetts) strongly suggest that applicants sit for and submit their scores from this exam. Applicants should also be aware that certain university-wide fellowships and grants require the submission of GRE scores. Applicants who do not submit such scores with their applications cannot be considered for these forms of financial aid.

Language Required

This category indicates whether or not a program requires applicants to exhibit proficiency in a foreign language prior to graduation. Some programs with a foreign-language requirement allow applicants to place out of this requirement through the submission and application of prior foreign-language course credits at the college level; other programs require that applicants take an exam (often a reading-knowledge-only translation exam) to show proficiency, regardless of their prior foreign-language experience. At present only a small minority of programs—nine of the 85 listed in the index, or 11 percent—have a foreign-language requirement as part of their curriculum. However, the category is presented here due to applicants’ great interest in, and sometimes anxiety about, such prerequisites for graduation.

Cross-Genre

Certain MFA programs require that individuals who apply and are admitted in a particular genre take only workshops in this “declared” genre while in-program. Other programs permit, or even require, matriculated students to take out-of-genre workshops—and among this latter group are two further subcategories of programs, those that permit students to take as many out-of-genre workshops as they wish, and those that permit or require only a limited number of out-of-genre workshops.

The past five years of online, public discussions amongst MFA applicants suggest that the availability of cross-genre study has become one of the top concerns for applicants seeking additional curricular information about the programs to which they wish to apply. Many applicants already write in more than one genre, and hope to have their multifaceted talents as literary artists shepherded, rather than impeded, by the curricula of programs on their chosen application list; other students are merely curious about genres other than their own, and view their in-program time as a rare opportunity to experiment with modes of literary art other than those with which they are already conversant. A smaller, but growing, subset of the applicant pool consists of self-styled “literary artists” rather than simply “poets” or “writers,” and these individuals already incorporate so many different aesthetic traditions into their work that to be limited to either “poetry workshops” or “prose workshops” would (in their view) be a betrayal of their artistic vision. Because the availability of cross-genre study is such a prominent concern amongst the applicant class, it is listed as a separate category here. All data for this category were taken directly from program websites; any program that permits or requires students to take out-of-genre workshops, in whatever number, is listed in this column as a “yes” (“Y”). Programs that explicitly prohibit such study are indicated with a “no” (“N”). Because the tradition among MFA programs has been to disallow cross-genre study, programs whose websites were silent on the question were also treated as, and are listed in the index as, a “no” for this measure.

Dates of Establishment

Listing the dates of establishment for the nation’s full-residency MFA programs offers critical historical context for the full-residency program index, the institutions profiled and assessed in the chart, and the very degree that is the focus of both. This column of data does not apply to nonterminal, academic master’s programs in creative writing (with or without the option of a creative thesis), which differ in form and function from their longer, generally better-funded, more studio-oriented, terminal, art-degree MFA peers.

Previous survey methodologies used in assessing terminal-degree creative writing programs have leaned heavily on the somewhat tendentious factor of program visibility. When programs are assessed by individuals already within the system, the natural result is that older programs, whatever their selectivity, financial resources, faculty resources, curriculum, pedagogy, or student outcomes, move into positions of prominence due to their profile advantage. Yet applicants report only limited interest in programs’ historical pedigrees, as pedigree itself is often considered a suspect quantity in the national literary arts community. By publishing, for the first time, the dates of establishment of 84 of the nation’s 167 full-residency MFA programs, the 2013 MFA Index permits applicants and other consumers of these data to disassociate historical pedigree from the distinct question of program quality while better understanding the historical context in which the creative writing MFA has achieved such cultural prominence.

Creative writing as an academic discipline originated in the late nineteenth century, yet by January of 1964 there was still only one MFA-conferring graduate creative writing program in the world. In fact, though the first MFAs in any field were granted in the 1920s, and the MFA-conferring Iowa Writers’ Workshop was founded in 1936, the MFA as a degree would have no abiding place in the national literary arts community until the 1980s. The 1940s, 1950s, and much of the 1960s were marked by attempts to find alternative models to the one provided by the Iowa Writers’ Workshop: first, in the degree-granting, relatively nonselective, grade-free creative writing program at Black Mountain College, which was founded in the 1930s but had its heyday in the late 1940s and early to mid-1950s; second, in the undergraduate-only creative writing program at Stanford University (founded in 1947 by Wallace Stegner) and other undergraduate programs modeled closely upon this one; third, in institutional but non-degree-granting programs like the Writers’ Program at the University of California in Los Angeles, founded in 1964; fourth, in non-institutional workshops such as the Black Arts Movement’s Umbra Workshop, founded in Manhattan’s Lower East Side in 1962; and fifth, in nonterminal MA programs in creative writing founded at a number of institutions, including Johns Hopkins University in Baltimore (1946), University of Denver in Colorado (1947), Cornell University in Ithaca, New York (1948), Indiana University in Bloomington (1948), University of Florida in Gainesville (1948), and Hollins University in Roanoke, Virginia (1960). Some of these latter programs required academic theses of their students rather than creative ones.

Ultimately, certain elements of the Iowa Writers’ Workshop MFA model became ascendant—after these and other elements had been experimented with by the types of degree programs listed above—because of a grass-roots campaign by working creative writers (among both faculties and student bodies at various institutions) to gain greater administrative, pedagogical, and creative autonomy from the academy to which they had previously been attached. Most of the early MFA programs appear to have been founded only after years—in some cases several decades—of struggle between creative writers and university bureaucrats, with the two primary bases for the latter’s objection to the MFA being that it cost much more than the MA to administer (due to the need for greater faculty resources, and the necessity of awarding tuition remission-eligible assistantships to many terminal-degree candidates) and permitted universities less immediate oversight over their resident literary artists. Far from a “cash cow” warmly embraced by U.S. universities, the creative writing MFA was for decades rejected by America’s universities—and often their English department faculties—as too exotic, too expensive, and too distant from the traditional academic functions of an American English department.

At the beginning of the 1980s there were still fewer than two dozen creative writing MFA programs in the world. It was not until the turn of the century that the rate of MFA-program creation significantly increased, as indicated by the table below, which catalogues MFA programs’ dates of establishment by decade:

MFA Programs Founded, by Decade

1920s: 0
1930s: 1
1940s: 0
1950s: 0
1960s: 11
1970s: 11
1980s: 27
1990s: 41
2000s: 94
2010s: 132 *

* = This is prorated from the number of programs founded in the first thirty months of the decade.
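The prorated 2010s figure follows from simple linear extrapolation: the count from the decade's first thirty months is scaled up to the full 120 months. (The projection of 132 implies 33 foundings in those first thirty months.)

```python
def prorate_decade(founded_so_far, months_elapsed):
    """Linearly project a partial-decade founding count to a full 120-month decade."""
    return round(founded_so_far * 120 / months_elapsed)

# 33 programs founded in the first 30 months of the 2010s projects to 132.
print(prorate_decade(33, 30))  # 132
```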

Location Assessments

While not listed in the 2013 MFA Index, location was one of the seven categories used to determine inclusion in the index. Programs located in or within thirty miles of a location that appeared in any one of the following eight national media assessments of the best places for individuals (particularly students and young professionals) to live and work were included. Those assessments were Bloomberg Businessweek (“America’s 50 Best Cities,” 2011); U.S. News & World Report (“10 Great College Towns,” 2011); Parents & Colleges (“Top 10 Best College Towns,” 2011); Travel + Leisure (“America’s Coolest College Towns,” 2009); the American Institute for Economic Research (“Best College Towns and Cities,” 2011); StudentUniverse (“Top 10 Cities to Visit in Europe,” 2011); MoneySense (“Canada’s Best Places to Live,” 2012); and ELM (“2011 Top Cities to Live and Work Abroad in Asia,” 2011).

Some of the criteria used by the listed media organizations included: the density of student-centered restaurants, bars, museums, pedestrian malls, bicycle paths and lanes, parks, hiking trails, sporting events, theaters, and concert venues; the total number of restaurants, bars, and museums per capita; the total number of colleges, libraries, and professional sports teams; income, poverty, unemployment, crime, and foreclosure rates; the percentage of the local population with bachelor’s degrees; park acres per 1,000 residents; and air quality.

Application Fee

The application fee column lists each program’s application fee for the most recent application cycle.

The relevance of these data has increased in recent years, as related phenomena have been observed in the MFA admissions system over the past six admissions cycles: acceptance rates at the nation’s most competitive programs are steadily declining, and applicants are responding to this trend by applying to a larger and larger number of programs each year. (The conventional wisdom in 2005 was that the average applicant should apply to eight to ten programs; now, applicants are regularly advised to apply to between twelve and fifteen programs, and more if financially feasible.)

Given the cost of the Graduate Record Examination (GRE) General Test ($160), the cost per GRE “score report” to be sent to individual programs ($23), and the cost per transcript ordered from an alma mater (costs as high as $50 per transcript at some institutions, and rarely if ever less than $5 per transcript; some applicants, particularly nontraditional and international applicants, must order transcripts from multiple alma maters), applicants are increasingly unable to afford to apply to programs with high application fees. And because of the importance of applicant surveys to the MFA Index, programs with higher application fees are likely to receive fewer applications per annum and thus place lower in the one-year and four-year applicant popularity surveys than they otherwise would.
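These per-application costs compound quickly. A sketch of the total outlay, using a hypothetical $60 application fee and $15 per-transcript cost (the GRE and score-report figures are those cited above):

```python
GRE_TEST = 160      # one-time GRE General Test fee
SCORE_REPORT = 23   # per-program GRE score report
TRANSCRIPT = 15     # HYPOTHETICAL per-program transcript cost
APP_FEE = 60        # HYPOTHETICAL per-program application fee

def total_cost(n_programs):
    """One-time GRE fee plus per-program application fees, score reports,
    and transcripts."""
    return GRE_TEST + n_programs * (APP_FEE + SCORE_REPORT + TRANSCRIPT)

for n in (8, 12, 15):
    print(n, total_cost(n))  # e.g. 12 programs cost $1,336 under these assumptions
```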

LOW-RESIDENCY PROGRAM PROFILES: ADDITIONAL PROGRAM MEASURES

Genre Availability for Study

Genre Availability for Study measures the number of genres in which a given low-residency program offers either a formal track or an informal concentration or focus. As many programs occasionally offer courses in genres for which they do not offer a track or concentration, it can be difficult to assess which genres are sufficiently supported at a given program that the program is likely to receive creative portfolios in that genre. The aim of the Genre Availability for Study category is to include only those genres in which a program is likely to attract an appreciable number of applicants—as opposed to a genre offered only as an occasional brief-residency course lacking substantial nonresidency, academic-year faculty support.

Residency

Residency measures the number of days per year a low-residency student in the program is required to be on campus. While at present there is no conventional wisdom as to the value of a long residency versus a shorter one, low-residency programs' residencies are typically the only opportunity for students to interact in person with their professors and to meet their classmates; consequently, many applicants to low-residency programs may prefer longer residencies.


Seth Abramson is the author of Northerners (Western Michigan University Press, 2011), winner of the 2010 Green Rose Prize from New Issues Poetry & Prose, and The Suburban Ecstasies (Ghost Road Press, 2009). A contributing author to The Creative Writing MFA Handbook (Continuum, 2008) and a 2008 recipient of the J. Howard and Barbara M. J. Wood Prize from Poetry, his work has recently appeared in Best New Poets 2008, American Poetry Review, Boston Review, New York Quarterly, and elsewhere. Currently a doctoral candidate in English at the University of Wisconsin in Madison, he is a graduate of Harvard Law School and the Iowa Writers' Workshop.


Source URL:https://www.pw.org/content/2013_mfa_index_further_reading

Links
[1] https://www.pw.org/content/2013_mfa_index_further_reading [2] https://www.pw.org/content/septemberoctober_2012 [3] http://www.pw.org/mfa [4] http://www.bestplaces.net/COL/default.aspx