Note: The following article explains the approach used to compile the 2013 MFA Index, published in the September/October 2012 print edition of Poets & Writers Magazine.
The 2013 MFA Index comprises two tables: one of 85 full-residency MFA programs and one of 29 low-residency MFA programs. Full-residency MFA programs are assessed on the basis of twenty measures: eight listings of survey results, and twelve listings of other important program features. While the eight surveys detailed in the full-residency table cannot be classified as scientific, all are predicated upon sufficient hard data to be substantially probative. A scientific survey of full- or low-residency creative writing MFA programs, or doctoral creative writing programs, is not presently possible, as more than half of the nation's full- and low-residency graduate creative writing programs do not publicly release the data necessary for such an assessment.
Three of the eight full-residency MFA surveys are based on a survey of a large sample of current MFA applicants. In each survey, programs are ordered on the basis of the number of times they appeared on applicants’ application lists; the resulting numeric ordering therefore assesses only the popularity of individual programs among a large sample of well-researched aspiring poets and writers, and is not an assessment of overall program quality. While prospective MFA students have a wide range of resources at their disposal in selecting where to apply—including not only quantitative data but also programs’ promotional materials; first-hand interviews with program staff, faculty, and current and former students; large online communities of fellow applicants, many of which are devoted to MFA research, discussion, and application advising; their own past experiences in undergraduate and non-institutional workshop settings; the literary oeuvres of current faculty members and recent alumni; previous program assessments in various online and print media, including first-hand accounts from, and interviews with, current and former faculty and graduates; and in some instances program visits or prior first-hand experience workshopping with selected faculty members—there is necessarily some information that is impossible for prospective MFA students to access unless and until they become matriculants. 
Once students have matriculated, however, they may develop subjective attachments to their alma mater, which makes objective assessment of their own experiences, as opposed to the provision of definitionally “self-serving” survey responses, difficult or even impossible. For this and related reasons, current MFA students are not asked to self-report on their programs, nor are they considered to have a broader, more accurate, or more timely knowledge of programs they do not attend than do those unmatriculated applicants who are currently researching program options.
In the 2013 MFA Index, programs appearing in the top half of each applicant popularity survey are noted by their numeric placement in the survey results. Programs not appearing in the top half of the 2012 popularity survey or the four-year applicant popularity survey (taken between April 16, 2008 and April 15, 2012) may still appear in the full-residency table, provided they appear in the top half of three of the seven survey areas described below.
The five hard data–based survey columns in the full-residency table are as follows: selectivity, funding, student-faculty ratio, fellowship placement, and job placement. These categories appear in the table in the chronological order in which applicants to graduate creative writing programs encounter them: a program’s selectivity determines, all things being equal, an individual applicant’s likelihood of securing admission; the funding category indicates, all things being equal, what aid package will be attached to that program admission if and when it comes; student-faculty ratio gestures toward, all things being equal, a student’s ability to access individual faculty members while in-program; fellowship placement generally measures the success of individual programs in placing recent graduates in post-MFA fellowships; and job placement generally measures the success of individual programs in placing graduates who have already achieved some measure of fellowship/publishing success in full-time creative writing teaching positions in higher education.
These survey results are scientific to the extent that they order programs on the basis of quantitative data publicly released by the programs themselves, and unscientific to the extent that not every program has released data for every category of assessment. These five columns therefore constitute an ordering of all publicly known data rather than an ordering of all extant data. A full complement of funding and admissions data is available for well over half of the nation's full-residency MFA programs; the remaining programs are generally smaller, newer, lightly advertised, nondomestic, or regional in terms of their applicant base. As all of these programs have websites, however, and as all enjoy exclusive dominion over their online presence, the absence of any specific funding or selectivity data in these programs' online promotional materials is taken as an indication that these programs fully fund less than a third of their students and do not have an acceptance rate low enough for inclusion in the top half of programs in this category (currently, a program's yield-exclusive acceptance rate would need to be less than 11.1 percent for it to be included in the top half of the selectivity category). The full-residency table is based in part on the presumption that it would be counterintuitive for a program providing full funding to a substantial percentage of its student body to not indicate as much in its promotional materials. Program websites are regularly reviewed to determine whether a program has added information to its online profile.
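The 11.1 percent cutoff can be illustrated with a short sketch. The function and figures below are illustrative only, not part of the Index's actual tooling; the hypothetical program's numbers are invented:

```python
# Illustrative sketch: a program's yield-exclusive acceptance rate is
# offers extended divided by applications received, ignoring how many
# admitted students ultimately choose to enroll (the "yield").
def yield_exclusive_acceptance_rate(offers, applications):
    """Acceptance rate that ignores yield (matriculation) decisions."""
    return offers / applications

# Hypothetical program: 40 offers on 500 applications is an 8.0 percent
# rate, below the 11.1 percent cutoff cited for the top half of the
# selectivity category.
rate = yield_exclusive_acceptance_rate(40, 500)
print(f"{rate:.1%}")       # 8.0%
print(rate < 0.111)        # True
```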
Based on the data presently available, it is not anticipated that any of those programs without a full complement of funding and admissions data available in some form online would have appeared in the top half of full-residency programs in the funding category. These programs, given the incompleteness of their promotional materials, are also much less likely to attract sufficient applications to be eligible for the selectivity listing; a program must receive at least fifty applications annually to be considered eligible for this listing. As to the two placement-related surveys, these do not rely on programs' promotional materials or on their willingness to release internal data to individual applicants or groups of applicants, so all programs nationally, both full- and low-residency, are equally eligible for a top-half placement in the fellowship and job placement categories. Data sufficient to calculate student-faculty ratios for virtually every MFA program in the United States were also readily available.
The overlap between those programs appearing in the top half of the two fiction/poetry surveys—the third applicant survey being of nonfiction applicants only—and those programs appearing in the top half of the five hard-data survey results is significant. Of the 71 full-residency programs that appear in the top half of both the one-year and four-year survey results, 60 (85 percent) are also in the top half in selectivity, and two (3 percent) have no selectivity data available; 60 (85 percent) appear in the top half in the funding category; 47 (66 percent) appear in the top half in student-faculty ratio; 56 (79 percent) appear in the top half in fellowship placement; and 42 (59 percent) appear in the top half in job placement. Of the 32 programs listed in the chart as placing in the top half of the nonfiction survey, 28 (88 percent) also appear in the top half of both the one-year and four-year (fiction/poetry) surveys, with another three (9 percent) appearing in the top half of one of the two. In a sign that applicant mores remain fairly consistent over time, of the 60 programs appearing in the top half of programs in the four-year applicant survey, 58 (97 percent) also appear in the top half of programs in the one-year survey.
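Each percentage above is simply a count divided by the size of its base group (71, 32, or 60 programs), rounded to the nearest whole percent. A quick sketch, illustrative only, reproduces several of the cited figures:

```python
# Sketch of the arithmetic behind the overlap figures: each percentage
# is count / base, expressed as a whole-number percent.
def overlap_pct(count, base):
    """Percentage of the base group, rounded to the nearest percent."""
    return round(100 * count / base)

print(overlap_pct(60, 71))  # 85 (selectivity and funding overlap)
print(overlap_pct(56, 71))  # 79 (fellowship placement overlap)
print(overlap_pct(28, 32))  # 88 (nonfiction overlap)
print(overlap_pct(58, 60))  # 97 (four-year vs. one-year consistency)
```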
The approach to compiling the 2013 MFA Index was devised with the following goals in mind: (1) To better inform applicants of their program options by offering the nation’s first complete listing of graduate creative writing programs; (2) to more widely disseminate hard data regarding objective but non-exhaustive/non-conclusory measures of program quality, which often play a role in applicants’ application and matriculation decisions; (3) to be responsive to, and reflective of, the opinions, interests, concerns, and values of current applicants to graduate creative writing programs; (4) to emphasize the importance of both inputs and outputs in assessing individual graduate creative writing programs; and (5) to enhance the transparency of the admissions process for present and future applicants to graduate creative writing programs.
In the twelve months between April 16, 2011, and April 15, 2012, 540 full-residency MFA applicants were surveyed on The MFA Draft 2012 Group, hosted by Facebook.com. Created by a group of then-current MFA applicants in the early spring of 2011, the group had nearly 1,500 members at its peak during the survey period, and featured dozens of new MFA-related conversation threads—some garnering up to a thousand individual responses—each day. The MFA Draft 2012 Group was created and is moderated by the applicants themselves; admission of new members was and is determined entirely by applicants; and decisions regarding the production of personal or program information for purposes of research were likewise made entirely by applicants. The site was promoted via several methods: word of mouth; links and references in the MFA Draft 2011 Group, the predecessor to the MFA Draft 2012 Group (there is also, now, a well-attended MFA Draft 2013 Group); and links and references on The Creative Writing MFA Blog founded by novelist Tom Kealey. The four-year survey is a compilation of the data from this one-year survey and the past three years of Poets & Writers Magazine program assessments.
Individual users in the MFA Draft 2012 Group were distinguished by their proper names, as reflected by the full names (first and last) associated with their Facebook accounts. Internet research was conducted to verify applicants’ identities as and when authenticity was in question. The document in which Draft 2012 applicants voluntarily compiled their application decisions was part of an application that tracks all changes to uploaded documents by editors’ full names/Facebook accounts. This application ensured that any and all changes to the document were traceable to specific users. Users were permitted to amend their application lists in real time; though fewer than 15 percent of users elected to amend their lists once posted, all changes to applicants’ lists were accounted for by both the one-year and four-year applicant surveys appearing in the program tables. Substantial additional security measures were taken to ensure the authenticity of compiled application lists.
As noted, surveyed applicants voluntarily listed the programs to which they had applied or intended to apply, and were permitted to adjust these lists at any point during the survey period.
Only recently collected data are of use to applicants. The one-year full-residency survey uses no survey data predating April 16, 2011; the four-year full-residency survey uses no survey data predating April 16, 2008; the low-residency survey uses no data predating April 16, 2007; and the creative writing doctoral survey uses no data predating April 16, 2007. The longer survey periods for low-residency MFA and creative writing doctoral programs were necessitated by the significantly smaller applicant pools for these degrees.
Eligibility for Inclusion in the 2013 MFA Index
Eligibility for the 2013 MFA Index was determined by how programs placed in seven survey areas: popularity among applicants, selectivity, funding, student-faculty ratio, fellowship placement, job placement, and location. Programs are categorized by the number of survey areas in which they place in the top half of all programs a) eligible for the category, and b) with data publicly available. The number of programs considered eligible for each category of assessment varies by the category; in some instances (such as fellowship placement and job placement) newer programs cannot yet be added to the pool of eligible programs because they have yet to graduate any poets or writers, whereas in other instances (such as selectivity, funding, and student-faculty ratio) certain programs may not yet have released the data necessary for them to be assessed in these areas. Popularity and location are the only categories in which all presently extant programs (or in the case of the four-year applicant survey, all programs extant for a majority of the four-year survey period) were automatically members of the pool of eligible programs. Otherwise, the pool of eligible programs was 124 for the selectivity survey, 150 for the funding survey, 130 for the student-faculty ratio survey, 130 for the fellowship placement survey, and 90 for the job placement survey. The reason for the significantly lower number of eligible programs in the job placement category is that MFA graduates are generally not competitive for the sort of hires catalogued by the job placement assessment—full-time, non-adjunct positions teaching creative writing at the college/university level—until they have been away from their alma maters for at least five years, during which time they have continued to publish in their genre (and in many instances gained teaching experience as adjuncts) in order to make themselves more marketable as full-time hires.
Given that the job placement survey tracks hires made between 2009 and 2012, only programs known to have graduated at least two classes of poets and writers by 2004 are considered to have had a full opportunity to place in the job placement category, with the result being that the number of programs founded prior to fall 2001 was used as the “eligible pool” figure for this measure (but note that, in order not to disadvantage programs founded after this time, such programs were nevertheless included in the survey if they successfully placed a graduate in an eligible position). As 90 full-residency programs were founded prior to fall 2001, the size of the pool for the job placement measure was set at 90. In future years the size of the “eligible pool,” consistent with the above-stated methodology, will increase.
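The categorization rule described above (counting the survey areas in which a program places in the top half of eligible programs with data available) can be sketched in miniature. The program and its placements below are hypothetical, invented for illustration:

```python
# Hypothetical sketch of the seven-area categorization rule: a program
# is scored by how many survey areas place it in the top half of
# eligible programs. None marks an area with no publicly available data,
# which cannot count toward (or against) the program's total.
AREAS = ["popularity", "selectivity", "funding", "student-faculty ratio",
         "fellowship placement", "job placement", "location"]

def top_half_count(placements):
    """placements maps area -> True (top half), False, or None (no data)."""
    return sum(1 for area in AREAS if placements.get(area) is True)

# Invented example: top half in five of seven areas, no funding data.
example = {"popularity": True, "selectivity": True, "funding": None,
           "student-faculty ratio": True, "fellowship placement": False,
           "job placement": True, "location": True}
print(top_half_count(example))  # 5
```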
MFA Program Response Rates
Graduate degree programs in creative writing respond to assessment-related inquiries at a lower rate than do programs in almost any other field of study in the United States. The following response rates were reported for questionnaires sent pursuant to the 2012 U.S. News & World Report program assessments (in each instance, the field of study is followed by the response rate from all programs surveyed in the field): Engineering (98 percent); Business (91 percent); Criminology (90 percent); Education (90 percent); Medicine (84 percent); Healthcare Management (76 percent); Statistics (67 percent); Law (66 percent); Public Health (61 percent); Audiology (57 percent); Library and Information Studies (56 percent); Pharmacological Sciences (56 percent); Social Work (56 percent); Occupational Therapy (53 percent); Veterinary Medicine (48 percent); Nursing (47 percent); Computer Science (46 percent); Physician Assistance (45 percent); Sociology (43 percent); Speech-Language Pathology (42 percent); Public Affairs (40 percent); Rehabilitation Counseling (40 percent); Fine Arts (39 percent); Political Science (37 percent); Economics (34 percent); Mathematics (34 percent); Physical Therapy (33 percent); English (31 percent); Physics (31 percent); Earth Sciences (29 percent); Clinical Psychology (28 percent); Chemistry (25 percent); Psychology (25 percent); History (23 percent); and Biological Sciences (15 percent). Respondent institutions in each of these academic fields were aware that their questionnaire responses would not be kept confidential, and that their participation in surveys sponsored by U.S. News & World Report would result in publication of a substantial stock of program-specific data regarding each university queried.
Every two years, the Association of Writers & Writing Programs (AWP) sends a questionnaire to programs in the field of creative writing—a field whose administrators and faculty are no longer surveyed (and whose programs are no longer assessed) by U.S. News & World Report—in much the same way U.S. News & World Report does for the thirty-five fields of study listed above. A crucial difference between the two questionnaires, however, is that the AWP questionnaire guarantees anonymity to its respondents; AWP releases no program-specific data or survey results pursuant to its biennial questionnaire. It is worth noting, too, that AWP estimates (as of its 2009 Annual Report) that 34 percent of programs in the field of creative writing have declined to become members of AWP. These programs are not subject to AWP questionnaires.
According to AWP’s publicly released summary of the program questionnaire it distributed in 2007, between 40 and 60 percent of AWP’s member programs declined to answer AWP’s queries regarding internal admissions and funding data. Specifically, 47 percent of programs declined to reveal how many assistantships they offered annually to incoming students, 61 percent declined to reveal the stipend offered to teaching assistants, 56 percent declined to reveal whether they offered a full-tuition waiver to teaching assistants, 49 percent declined to reveal how many scholarships were offered to incoming students, 55 percent declined to reveal their annual number of applicants, and 52 percent declined to reveal the size of their annual matriculating class.