Avoidance of Respondent Bias
The most extensive program assessment system in the United States, the higher education surveys published annually by U.S. News & World Report, produces assessments almost exclusively from individuals with no firsthand experience attending or teaching in the programs they are asked to assess. For the magazine’s much-lauded law school assessment, for instance, judges, lawyers, and law firm hiring coordinators are asked to assess the academic quality of programs others have attended, programs they have encountered only to the same extent an MFA applicant encounters the graduates of individual creative writing programs in the course of his or her in-genre reading (or, alternately, in a social or professional context). In fact, all of the program assessments published by U.S. News & World Report use the same basic methodology, as stated in the 2011 edition of the magazine containing its graduate school program assessments: “[These assessments] are based on the results of surveys sent to academics…[t]he individuals rated the quality of the program at each institution from marginal (1) to outstanding (5). Individuals who were unfamiliar with a particular school’s programs were asked to select ‘don’t know.’” This last provision merely ensures that survey respondents have some basic familiarity with the programs they are assessing; it does not ask or encourage respondents to submit an institutional (or personal) self-assessment.
As is the case with the methodology described above, national educational-institution assessment schemes have historically sought out unbiased observers to assess accredited degree programs, with self-reporting by interested observers implicitly or explicitly disallowed. The Poets & Writers Magazine 2014 MFA Index improves on this model by surveying individuals who not only are in a position to gauge the professional performance of individual programs’ graduates and professors (i.e., by reading their published work) but who also have access to—and a natural interest in—a large stock of hard data regarding the programs they are being asked to consider.
The 2014 MFA Index makes a further improvement on the U.S. News & World Report methodology by eschewing overall program-quality assessments altogether, and by stating explicitly that neither its constituent surveys nor its hard-data listings in any sense constitute an overall assessment of program quality. Overall assessments of program quality—in any field of study—are impossible, as such determinations differ depending upon the student, a large slate of unquantifiable program features, the period in which that student matriculates (faculty turnover ensures that a program’s character changes over time), and a host of chance-based factors that no methodology ever devised could hope, or even wish, to encapsulate. While the data provided in the 2014 MFA Index should prove invaluable to applicants—especially given the historic opacity of graduate creative writing programs—they are no substitute for an individual applicant’s subtle, many-factored assessment of which program is best for him or her. The 2014 MFA Index should be used as one tool among many.
Survey Cohort Demographics
Online surveys conducted in 2010 using a Google-sponsored survey application suggest that the online MFA applicant community, including the community at The Creative Writing MFA Blog and the MFA Draft 2012 Facebook Group, subscribes to the current conventional wisdom (as first laid out in the 2005 edition of Tom Kealey's Creative Writing MFA Handbook) regarding the most important considerations in applying to and matriculating at an MFA program. When asked, “If you are a current creative writing MFA applicant, which of the following program features are among your top five reasons for choosing to apply to a particular program?” and given the alphabetically listed options “Alumni,” “Cost of Living,” “Curriculum,” “Duration,” “Faculty,” “Funding,” “Internship Opportunities,” “Location,” “Postgraduate Placement,” “Reputation,” “Selectivity,” “Student-to-Faculty Ratio,” “Size,” “Teaching Opportunities,” and “Other,” 909 survey respondents provided the following responses:
1. Funding (68 percent)
2. Reputation (61 percent)
3. Location (59 percent)
4. Faculty (50 percent)
5. Teaching Opportunities (41 percent)
6. Curriculum (28 percent)
7. Cost of Living (23 percent)
8. Alumni (21 percent)
9. Duration (19 percent)
10. Size (13 percent)
11. Selectivity (13 percent)
12. Postgraduate Placement (11 percent)
13. Student-to-Faculty Ratio (10 percent)
14. Internship Opportunities (6 percent)
15. Other (5 percent)
There is substantial similarity between these survey results and the results of a 2009 survey that asked applicants, “Which of these is most important to your decision about where to apply?” with the listed options being “Location,” “Funding,” “Faculty,” “Reputation,” “Selectivity,” “Curriculum,” and “None of the Above” (respondents were permitted to select more than one answer). The top four answers, out of hundreds of survey responses, were identical to the top four responses in 2010:
1. Funding (56 percent)
2. Reputation (45 percent)
3. Location (32 percent)
4. Faculty (18 percent)
These survey responses also closely correspond to the responses provided by MFA faculty members when they were asked by Poets & Writers Magazine, in 2011, to describe in narrative form the program features they considered most important for prospective students to consider. Drawing on more than forty responses, the top priorities of MFA faculty members, as reported by the faculty members themselves, were as follows (the number in parentheses is the percentage of surveyed faculty members who cited a given program feature):
1. Funding (71 percent)
2. Quality/Aesthetics of Faculty Work (69 percent)
3. Student Self-Reporting (60 percent)
4. Program Atmosphere (50 percent)
5. Faculty Accessibility (41 percent)
6. Teaching Opportunities (38 percent)
7. Location (36 percent)
8. Editing Opportunities (31 percent)
The remaining priorities were all cited by fewer than 30 percent of faculty respondents. Of the eight program features most frequently cited by faculty members as appropriate bases on which to evaluate a graduate creative writing program, five—Funding, Quality/Aesthetics of Faculty Work, Teaching Opportunities, Location, and Editing Opportunities—are as well known and accessible to, and as easily understood by, MFA program applicants as by current or former MFA students or their professors. The remaining three depend upon a phenomenon not yet common in the field of creative writing: For applicants to take into account Student Self-Reporting, Program Atmosphere, and Faculty Accessibility, programs would first need to routinely make current students and faculty available to non-admitted or pre-admission MFA applicants. As of this writing, only one of the 171 full-residency creative writing MFA programs worldwide is known to regularly offer this courtesy to non-admitted or pre-admission applicants. It is worth noting, too, that of the next thirteen program features cited as important by current MFA faculty members, nine—Workshop Format, Curricular Flexibility, Alumni Publishing Success, Presence of a Reading Series, Curricular Intensity, Program Size, Cost of Living, Internship Opportunities, and Program Duration—are known or knowable to applicants at the time they make their application decisions, and two others (Program Visit and Alumni Self-Reporting) are within applicants’ unique power to access. As of this writing, fewer than 5 percent of creative writing MFA programs worldwide are known to offer funds to applicants for pre- (or even post-) admission program visits.
When applicants were asked, in 2010, “If you are a current creative writing MFA applicant, how old will you be when you begin your program, assuming you’re admitted this year?” 1,929 survey respondents provided the following responses on The Creative Writing MFA Blog:
1. 23 or 24 (18 percent)
2. 25 or 26 (16 percent)
3. 21 or 22 (13 percent)
4. 27 or 28 (11 percent)
5. Older than 40 (10 percent)
6. 29 or 30 (8 percent)
7. 31 or 32 (6 percent)
8. 33 or 34 (5 percent)
9. 35 or 36 (4 percent)
10. 37 or 38 (2 percent)
11. 39 or 40 (2 percent)
These results are consistent with earlier online survey results, from 2009, suggesting that the median age of a creative writing MFA applicant is between twenty-six and twenty-seven.
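That estimate can be sanity-checked against the 2010 age distribution above. A minimal sketch of the arithmetic (reported percentages as listed, which sum to 95 rather than 100 because of rounding):

```python
# Reported 2010 age-bracket shares (percent), ordered youngest to oldest.
shares = [("21-22", 13), ("23-24", 18), ("25-26", 16), ("27-28", 11),
          ("29-30", 8), ("31-32", 6), ("33-34", 5), ("35-36", 4),
          ("37-38", 2), ("39-40", 2), ("over 40", 10)]

# Cumulative share of respondents aged 26 or younger: the first three brackets.
under_27 = sum(pct for _, pct in shares[:3])
print(under_27)  # 47 -- just under half, placing the median near age 26-27
```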
Asked, “As part of your research into MFA programs, how many current or former MFA students or faculty have you spoken to?” 686 survey respondents provided the following responses:
1. 1 to 2 (34 percent)
2. 3 to 5 (27 percent)
3. 0 (25 percent)
4. 6 to 10 (7 percent)
5. 11 or more (4 percent)
Asked, “Have you received advice from an undergraduate creative writing faculty member in applying to MFA programs?” 860 survey respondents provided the following responses:
1. Yes (59 percent)
2. No (30 percent)
3. Not Yet, But I Plan To (10 percent)
In 2011, the application lists of a random sampling of three hundred 2010–2011 MFA applicants were analyzed to determine the frequency of different list sizes. The results were as follows (the first number is the number of programs on an applicant’s application list, while the second is the number of such lists in the analyzed sample; the third figure is the percentage of the total sample with an application list of the stated size):
1: 10 (3 percent)
2: 6 (2 percent)
3: 10 (3 percent)
4: 18 (6 percent)
5: 23 (8 percent)
6: 30 (10 percent)
7: 26 (9 percent)
8: 31 (10 percent)
9: 31 (10 percent)
10: 29 (10 percent)
11: 24 (8 percent)
12: 15 (5 percent)
13: 14 (5 percent)
14: 14 (5 percent)
15: 7 (2 percent)
16: 4 (1 percent)
17: 2 (1 percent)
18: 4 (1 percent)
19: 0 (0 percent)
20: 0 (0 percent)
21: 1 (0 percent)
22: 1 (0 percent)
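The list-size distribution above lends itself to a quick summary. The sketch below computes the sample size, median, and mean list size from the reported counts (counts as listed; the table's percentages are independently rounded):

```python
# (list size, number of applicants) pairs from the 2011 sample of application lists.
counts = [(1, 10), (2, 6), (3, 10), (4, 18), (5, 23), (6, 30), (7, 26),
          (8, 31), (9, 31), (10, 29), (11, 24), (12, 15), (13, 14),
          (14, 14), (15, 7), (16, 4), (17, 2), (18, 4), (21, 1), (22, 1)]

total = sum(n for _, n in counts)                 # size of the analyzed sample
mean = sum(size * n for size, n in counts) / total

# Walk the cumulative distribution to the midpoint to locate the median.
midpoint, cumulative, median = total / 2, 0, None
for size, n in counts:
    cumulative += n
    if cumulative >= midpoint:
        median = size
        break

print(total, median, round(mean, 1))  # 300 8 8.5
```

In other words, the typical applicant in the sample applied to eight or nine programs.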
Asked, on The Creative Writing MFA Blog in 2010, “Why do you want to get a graduate creative writing degree?” and given the options “Credential,” “Employability,” “Time to Write,” “Mentoring,” “Networking,” “Community,” “Validation,” “Avoid Work,” and “None of the Above,” with the option to select more than one answer, the top three answers, among hundreds of responses, were as follows:
1. Time to Write (55 percent)
2. Employability (43 percent)
3. Mentoring (36 percent)
The Poets & Writers Magazine 2014 MFA Index does not use the above survey data to create a weighting system for the columns of information it provides. There is a presumption, instead, that applicants' own application lists best reflect the extent to which they take into account funding, location, reputation, selectivity, faculty, curriculum, and other applicant-specific factors in choosing which programs to apply to and attend.
Were the above survey data used to create a weighting system for the data presented in the 2014 MFA Index, or were the applicant survey removed from the MFA Index altogether, many of the nation’s most prominent and popular programs would disappear from the table, as programs widely admired by applicants (and by working poets and novelists) do not always perform superlatively in hard-data measures. A program assessment missing critical application-trend data would poorly reflect the present national consensus on which programs are most popular among applicants and working authors alike. For instance, under the applicant survey’s current methodology a popular but largely unfunded MFA program in a major urban center might still appear in the top half of the one-year and five-year surveys, because even a relatively low standing in the funding, selectivity, student-faculty, fellowship-placement, and job-placement categories can be counterbalanced by a program’s popularity due to its location, faculty, and/or other unquantifiable factors. The popularity of a program’s location and faculty is best reflected by privileging applicants’ application lists rather than a confluence of those lists and publicly accessible hard data. To redesign the 2014 MFA Index to deprivilege current applicant mores would ensure that virtually no non-fully-funded and/or big-city programs (with only a handful of exceptions) would appear in the table, nor would many (if any) non-fully-funded programs whose appeal lies largely in the composition of their faculty rosters.
While it’s fair to assume that program popularity going forward may be directly affected by a higher or lower relative placement in the funding, selectivity, student-faculty ratio, fellowship-placement, and job-placement categories, the current program assessment arrests, rather than hastens, the pace of this trend. The present methodology registers the relative decline or stagnation in the popularity of certain programs while allowing those programs to improve their funding, selectivity, student-faculty ratio, and placement statistics before losing their positions in the 2014 MFA Index altogether (in part as a result of applicant consensus).
Genre of Survey Respondents
Asked in 2010, using a Google-sponsored survey application, “What is your primary genre?” 701 respondents from The Creative Writing MFA Blog provided the following responses:
1. Fiction (53 percent)
2. Poetry (28 percent)
3. Nonfiction (15 percent)
4. Other (2 percent)
Isolating only the 578 poetry and fiction respondents to the above survey question, the results are as follows:
1. Fiction (65 percent)
2. Poetry (35 percent)
This suggests that the potential survey cohort at The Creative Writing MFA Blog is similar in its constitution, in terms of genre affiliation, to the national MFA-applicant cohort. Hard data from twenty MFA programs with available admissions data for both genres (constituting a total of twenty-four data-sets ranging in age from the 2008–2009 admissions cycle to the 2010–2011 admissions cycle) generate a total data-set of 12,368 applicants: 8,730 fiction applicants (70 percent) and 3,638 poetry applicants (30 percent). The genre breakdown for the one-year applicant survey published in the 2014 MFA Index is nearly identical: Between poets and fiction writers, 67 percent of the surveyed cohort were fiction writers and 33 percent were poets. For the 2013 applicant survey published in September 2012 by Poets & Writers Magazine, the breakdown was 70 percent fiction writers and 30 percent poets; for the 2011 applicant survey, it was 63 percent fiction writers and 37 percent poets.
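The roughly 70/30 split cited above follows directly from the pooled admissions figures. A minimal sketch of the arithmetic:

```python
# Pooled two-genre admissions data from the twenty reporting programs.
fiction, poetry = 8_730, 3_638
total = fiction + poetry  # 12,368 applicants across the twenty-four data-sets

# Genre shares, to one decimal place (the article rounds these to 70/30).
print(f"{100 * fiction / total:.1f} / {100 * poetry / total:.1f}")  # 70.6 / 29.4
```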
Applicant survey respondents for the 2014 MFA Index were self-selected, and it is the particular and express design of the survey methodology that this cohort be self-selected. Just as a survey aimed at identifying popular car manufacturers might use a self-selecting cohort to compile only the responses of the best-researched car buyers—for instance, those who had spent time on websites that allow consumers to compare various available car brands and styles—the one-year and five-year applicant popularity surveys do not intend to sample a generic cohort of MFA applicants. Instead, their aim is to catalogue, primarily if not exclusively, the application decisions of the best-researched MFA applicants, a class of applicants considerably more likely to be found in a massive, real-time applicant community in which scores of data points regarding individual programs are researched, shared, and discussed daily.
National Full-Residency Applicant Pool Size
The median estimate for the national full-residency fiction/poetry applicant pool (as calculated in 2011) is 2,797, the mean estimate is 3,253, and the adjusted mean is 3,042. The same series of calculations produced a median estimate, for the national nonfiction applicant pool, of 291, and a mean estimate of 345. The total size of the national full-residency applicant pool, across all three of the “major” genres of study, is therefore likely between 3,000 and 4,000. The two-genre, five-year, 2,519-respondent applicant survey that appears in the 2014 MFA Index consequently surveys the equivalent of 77 percent to 90 percent of an annual national two-genre applicant pool in the field of creative writing; the one-year surveys published annually by Poets & Writers Magazine survey between 13 percent and 23 percent of the three-genre national applicant pool for each admissions cycle.
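The 77-to-90-percent coverage figure follows from dividing the five-year survey cohort by the two bounding estimates of the annual two-genre pool. A minimal sketch:

```python
respondents = 2_519                  # five-year, two-genre applicant survey cohort
median_est, mean_est = 2_797, 3_253  # estimates of the annual two-genre pool

low = 100 * respondents / mean_est    # conservative bound: the larger pool estimate
high = 100 * respondents / median_est # upper bound: the smaller pool estimate
print(f"{low:.0f} to {high:.0f} percent")  # 77 to 90 percent
```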
For those program measures not subject to applicant surveys, such as recitations and ordered listings of admissions, curricular, placement, student-faculty ratio, and funding data, only data publicly released by the programs—whether to individual applicants, to groups of applicants, in a program’s promotional literature, or via a program website—have been included in the 2014 MFA Index. All data were updated regularly to reflect programs’ most recent public disclosures.
Many of the nation’s full- and low-residency MFA programs decline to publicly release internal data. Programs unable or unwilling to release data regarding their funding and admissions processes are necessarily disadvantaged by a program assessment that relies on transparency. Yet no program that fails to release such data for applicants’ consideration can avoid being judged, by applicants and other observers, through the lens of those nondisclosures. Because research for these surveys and listings is based entirely on publicly available, publicly verifiable data, the accuracy of the data that make up the MFA Index can be readily confirmed by any party.
The Nonfiction Survey
Because fewer than half (47 percent) of full-residency MFA programs offer a dedicated nonfiction or creative nonfiction track—defined as a curricular track that permits a master’s thesis in the genre—nonfiction and creative nonfiction applicants have been surveyed separately from poetry and fiction applicants. These survey responses do not factor, in any sense, into either the one-year or five-year popularity surveys published in the 2014 MFA Index.
For the nonfiction/creative nonfiction survey, the designation “n/a” indicates that a given program does not offer a nonfiction track or concentration.
Low-residency programs were assessed in twelve categories, nine of which are either applicant surveys or ordered listings of hard data: six employ the unscientific but probative surveying described above, and three are based upon publicly available hard data. Low-residency programs have not been assessed with respect to their funding packages because these programs, presuming that their students will continue in their present employment during the course of their graduate studies, generally offer little or no financial aid to incoming students.
Over the course of six successive application cycles, a total of 304 low-residency applicants were surveyed as to their program preferences, exhibited in the form of application lists. Between April 16, 2007, and April 15, 2011, the locus for this surveying was the Poets & Writers Magazine online discussion board, the Speakeasy Message Forum, widely considered the highest-trafficked low-residency community on the Internet; from April 16, 2011, to April 15, 2013, the survey locus was the MFA Draft 2012 and MFA Draft 2013 Facebook Groups described in detail above. Three factors account for the relatively small cohort: (1) the annual applicant pool for low-residency programs is approximately one-eighth the size of the full-residency applicant pool; (2) low-residency applicants do not congregate online in the same way, or in the same numbers, that full-residency applicants do; and (3) low-residency programs are subject to a “bunching” phenomenon not evident among full-residency programs, with only nine of fifty-three eligible programs nationally appearing on even 10 percent of survey respondents’ application lists, and only three appearing on 20 percent or more.