2011 Poets & Writers Magazine Ranking of MFA Programs: A Guide to the Methodology

Seth Abramson

Special Note on International Programs
The Poets & Writers Magazine MFA rankings have always considered, and will continue to consider, international MFA programs. However, international programs are unlikely to fare well in the overall rankings for several reasons: (1) nearly all non-U.S./non-Canadian graduate creative writing programs are (by U.S. accreditation standards) non-terminal (that is, they are M.Phil, M.St., or MA degrees, as opposed to the terminal MFA degrees considered by the Poets & Writers rankings); (2) non-U.S./non-Canadian applicants are less likely to frequent a U.S./Canadian-focused MFA website like The MFA Blog, and therefore non-U.S./non-Canadian programs are less likely to appear on the application lists of those polled for these rankings (and Canadian applicants applying to Canadian programs may be less likely to patronize The MFA Blog than American applicants applying to American programs); (3) unlike U.S. and Canadian MFA programs, overseas programs are rarely fully funded for non-domestic students (U.S./Canadian MFA programs do not distinguish between domestic and international applicants with respect to funding eligibility), and therefore are less likely to be popular amongst the U.S. and Canadian applicants that frequent The MFA Blog; and (4) due to the exceedingly small number of non-U.S. terminal-degree MFA programs now in operation, programs in Canada and elsewhere simply have fewer entrants into the international MFA system with which to achieve a top 50 ranking in any of the "ranked" categories in this assessment (for instance, in funding, selectivity, and postgraduate placement).

Every publicly-advertised terminal-degree MFA program in the world presently appears somewhere in the Poets & Writers Magazine MFA rankings. Even so, applicants seeking to attend MFA programs outside the United States should use these rankings with caution, as they may only imperfectly capture the level of esteem in which non-U.S. programs are held by non-U.S. MFA applicants. 


Special Note on MA Programs

Over the past four years, the present MFA rankings project has catalogued several thousand MFA applicants' application lists. One abiding trend is that only a small number of non-terminal Master's (MA) degree programs in creative writing can frequently be found on application lists otherwise comprised entirely of terminal-degree MFA programs. As only two MA programs presently answer to this description, these two programs have been included in the full-residency rankings for terminal-degree creative writing programs. Applicant mores seem to indicate that these programs are now considered on par with MFA programs in overall quality, if not in the critical feature of "terminality." Many of those who attend MA programs in creative writing subsequently apply to terminal-degree MFA programs upon graduation.

A third program is included in these rankings despite being an MPW (Master of Professional Writing). The two reasons for this inclusion are (1) the MPW is arguably a terminal degree, though there is little evidence yet on the question of whether or not it enjoys the same regard in the field of creative writing (as to "terminality") as the MFA, and (2) as with the two MA programs referenced above, this MPW program often appears on the application lists of applicants who are otherwise applying only to MFA programs. In contrast, the overwhelming majority of MA programs only appear on the application lists of MA-only applicants.

Apart from these three ranked non-MFA programs, the non-terminal creative writing Master's programs most commonly appearing on MA-only application lists are Eastern Michigan University in Ypsilanti (three appearances), University of Tennessee in Knoxville (three appearances), and University of Louisville in Kentucky (two appearances). Eleven MA programs appeared only once on application lists during the course of this year's polling of MFA applicants. Because MA applicants do not commonly frequent The MFA Blog, the MA cohort involved in this polling is too small for any substantive conclusions to be drawn about trends in MA-only applications.


Nothing in these rankings' funding assessments is intended to impugn the motives or character of professors, administrators, or staff at any of the nation's graduate creative writing programs. The presumption of these rankings is that all of these groups have militated, and continue to militate, with varying degrees of success, for more funding for their students, and that, given the choice, every program would choose to be fully funded. Still, there is no question that some programs require virtually no financial outlay by admitted students, and others are institutionally structured to induce students to take out substantial student loans. The rankings must and do take this into account, just as they take into account that one has more time to focus on one's in-genre writing and reading if one is fully funded (or asked merely to work a low-impact part-time teaching job) than if one is forced to work a full-time non-academic job to make ends meet while in-program. The rankings' strong emphasis on funding is an accurate reflection of what conventional wisdom amongst the MFA applicant class now holds.

Program funding packages were calculated on the basis of annual cost-of-living-adjusted stipend values for programs with full tuition waivers, and on the basis of annual cost-of-living-adjusted stipend values less annual tuition for programs offering only partial tuition waivers. Programs were further divided into categories on the basis of the percentage of each incoming class offered full funding. "Full funding" is defined as the equivalent of a full tuition waiver and an annual stipend of at least $8,000/academic year. No program offering full funding to less than 100% of its incoming class is ranked ahead of any program fully funded for all students. Likewise, no non-fully funded program is ranked ahead of any program in a higher "coverage" bracket. The four coverage brackets acknowledged by the rankings are as follows: "100% fully funded"; "75%+ fully funded"; "40% to 75% fully funded"; and "33% to 39% fully funded." All of these refer to the percentage of each annual incoming class that receives a full funding package.
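The bracket-then-stipend ordering described here reduces to a simple two-level sort. The following is a minimal sketch of that sorting rule only; the program names and dollar figures are hypothetical, not drawn from the rankings data:

```python
# Funding-sort sketch: programs are grouped by coverage bracket first,
# then ordered within each bracket by cost-of-living-adjusted package
# value. All names and figures below are hypothetical.

BRACKETS = ["100%", "75%+", "40-75%", "33-39%"]  # highest coverage first

programs = [
    # (name, coverage bracket, COL-adjusted annual stipend, tuition owed
    #  after any partial waiver; $0 means a full tuition waiver)
    ("Program A", "100%",   18_000, 0),
    ("Program B", "100%",   14_500, 0),
    ("Program C", "75%+",   22_000, 0),
    ("Program D", "40-75%", 16_000, 4_000),
]

def package_value(stipend: int, tuition_owed: int) -> int:
    """Net annual value: stipend, less any tuition not waived."""
    return stipend - tuition_owed

# Bracket takes precedence: no program in a lower coverage bracket can
# outrank any program in a higher one, regardless of stipend size.
ranked = sorted(
    programs,
    key=lambda p: (BRACKETS.index(p[1]), -package_value(p[2], p[3])),
)

for rank, (name, bracket, stipend, tuition) in enumerate(ranked, 1):
    print(rank, name, bracket, package_value(stipend, tuition))
```

Note that the hypothetical Program C's larger stipend cannot lift it above the 100% bracket programs: the coverage bracket is decided first, and package value orders programs only within a bracket.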

No program fully funding less than a third of its admitted students received a top 50 ranking in the funding category. Programs whose coverage bracket is sufficiently high to receive a national ranking, and whose stipend is sufficiently high to meet the definition of "full funding," but whose specific annual stipends were unknown at the time the rankings were compiled, were ranked last within their respective coverage brackets. Top 50 programs awarded an Honorable Mention in funding are indicated with a plus-sign (+) in the print edition of the rankings. In the online-only Honorable Mention, second, and third tiers of the overall rankings, the designation "HM" is used instead.

Programs that fully fund 33% or more of their admitted students were also considered eligible for "package averaging." If and when programs meeting this criterion were revealed to offer funding packages of differing value to different students, the total stipend value of all full-funding packages was divided by the number of such packages to determine average annual stipend value. Because some programs do not advertise special funding offerings available only to select students, not every program benefited from this feature of the rankings. Consistent with the structure and conceit of these rankings, programs exhibiting maximum transparency in their promotional materials are most likely to receive a comprehensive assessment of their total funding package.

The "total funding" rankings take into account duration of funding, as programs were ranked for this measure by multiplying annual package value by the duration of each program in years. The varying amount of tuition charged at individual programs was disregarded, as students receiving full funding do not, by definition, pay tuition. For the annual funding ranking, only annual package value was considered.

Applicants should be aware that many programs deduct administrative fees (almost always less than $1,000, and usually less than $500) from their annual stipends. Moreover, some programs offer health insurance to all admitted students and some do not. Programs that offer health insurance to all admitted students include but are not limited to the following (listed in order of "total funding" rank): University of Alabama in Tuscaloosa; Cornell University in Ithaca, New York; Louisiana State University in Baton Rouge; Ohio State University in Columbus; Virginia Polytechnic Institute (Virginia Tech) in Blacksburg; Arizona State University in Tempe; Purdue University in West Lafayette, Indiana; Washington University in Saint Louis, Missouri; Iowa State University in Ames; University of Michigan in Ann Arbor; University of Minnesota in Minneapolis; Vanderbilt University in Nashville, Tennessee; Pennsylvania State University in University Park; University of Wyoming in Laramie; University of Iowa in Iowa City; University of Virginia in Charlottesville; University of Wisconsin in Madison; University of Oregon in Eugene; University of Nevada in Las Vegas; University of New Mexico in Albuquerque; and Rutgers University in Camden, New Jersey.

As fewer than five full- or low-residency programs nationally publicly release "yield" datathe percentage of those offered admission to a program who accept their offers and matriculatethe acceptance rate figures used for the national selectivity ranking are necessarily yield-exclusive. Most have been calculated using the simplest and most straightforward method: Taking the size of a program's annual matriculating cohort in all genres and dividing it by the program's total number of annual applications across all genres. Thirty-three of the top 50 programs in selectivity (66%) had available admissions data from the 2009–10 admissions cycle, nine of the top 50 programs in this category (18%) most recently released admissions data during the 2008–09 admissions cycle, and eight programs (16%) most recently released admissions data during the 2007–08 admissions cycle.
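The yield-exclusive calculation described above reduces to a single division. A short sketch follows, with hypothetical figures, and with the 100-application eligibility cutoff (discussed later in this methodology) applied as a guard:

```python
def acceptance_rate(matriculating_cohort: int, total_applications: int) -> float:
    """Yield-exclusive acceptance rate: the annual matriculating cohort
    across all genres divided by total annual applications across all
    genres. Programs with 100 or fewer annual applications are
    ineligible for the top-50 selectivity ranking."""
    if total_applications <= 100:
        raise ValueError("ineligible: applicant pool of 100 or fewer")
    return matriculating_cohort / total_applications

# Hypothetical program: 22 matriculants drawn from 1,100 applications.
print(f"{acceptance_rate(22, 1_100):.1%}")  # prints 2.0%
```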

The relative paucity of data available for the selectivity rankings is attributable to programs' continued reticence in releasing the sort of internal admissions and funding data regularly released by colleges, universities, and most professional degree programs. Hundreds of interviews with MFA applicants between 2006 and 2010 suggest that a program's acceptance rate is one of the top five pieces of information applicants request when researching a graduate creative writing program. Fortunately, only one of the top 50 MFA programs has not yet made its annual acceptance rate public either directly or indirectly. More than 30 programs not ranked in the top 50 nationally have publicly released their annual acceptance rate.

In order to avoid artificially privileging small programs with an unknown but likely modest annual "yield" (programs with small applicant pools but also small incoming cohorts, and consequently, in some instances, extremely low yield-exclusive acceptance rates), only programs receiving more than 100 applications annually are eligible for the top 50 in selectivity. Of the approximately 68 full-residency programs and 31 low-residency programs with unknown admissions data, the "overall" polling done for these rankings suggests that no more than ten to fifteen would be eligible for inclusion in the top 50 for selectivity on the basis of applicant-pool size. Whether these ten to fifteen programs' annual incoming cohorts are sufficiently small (and thus the programs, statistically, sufficiently selective) to make any of these programs entrants into the top 50 for selectivity is unknown. The likelihood is that fewer than five programs that would otherwise appear in the top 50 for selectivity are ineligible for that ranking solely because they have thus far declined to release their admissions data to applicants.

Of programs with fewer than 100 applications whose admissions data are known, the eight most selective programs are as follows: Florida International University in Miami, #1; University of Massachusetts in Boston, #2; University of California in San Diego, #3; University of Kansas in Lawrence, #4; Northern Michigan University in Marquette, #5; Old Dominion University in Norfolk, Virginia, #6; West Virginia University in Morgantown, #7; and Georgia College & State University in Milledgeville, #8. These program rankings are provisional; admissions data for low-volume MFA programs cannot be considered as probative as data for programs with larger applicant pools.

The small number of low-residency programs with publicly-accessible acceptance rates makes crafting a selectivity ranking for such programs difficult. Of the 15 programs (32.6% of all low-residency programs) with available data, all but five have available data only from the 2007–08 admissions cycle or earlier. Fortunately, the programs ranked first, second, and third in this measure have all released data from one of their past two admissions cycles. The applicant-pool-size cutoff for inclusion in the low-residency selectivity rankings is set at 50 annual applicants.

Programs' placement records were assessed by determining how many individual "placement events" a given program's graduates achieved during the twelve-year period from 1998 to 2010. Only a limited number of fellowships and residencies are available to MFA graduates immediately post-graduation, and fewer still are specifically targeted at recent MFA graduates. Most of these make publicly available the names and biographical data of their fellows and residents. The focus for this year's rankings was on eleven of the fellowships and residencies in this group: generally speaking, the nation's eleven most prestigious post-MFA fellowships and residencies.

The fellowships and residencies surveyed for this measure were the following: The Stegner Fellowship at Stanford University in Palo Alto, California; the Wisconsin Creative Writing Institute Fellowship at the University of Wisconsin in Madison; the Hodder Fellowship at Princeton University in New Jersey; the Fine Arts Work Center Fellowship at the Fine Arts Work Center in Provincetown, Massachusetts; the Emory University Creative Writing Fellowship in Atlanta, Georgia; the Stadler Fellowship at Bucknell University in Lewisburg, Pennsylvania; the Axton Fellowship at University of Louisville in Kentucky; the Olive B. O'Connor Fellowship at Colgate University in Hamilton, New York; the Bennett Fellowship/Writer-in-Residence at Phillips Exeter Academy in Exeter, New Hampshire; the James Merrill Writer-in-Residence at the James Merrill House in Stonington, Connecticut; and the Amy Clampitt Residency Award at the Amy Clampitt House in Lenox, Massachusetts.

These eleven fellowships and residencies played host to more than 300 "placement events" between 1998 and 2010. As the placement rankings acknowledge placement events rather than placed fellows or residents, it is possible for a single fellow or resident to be the subject of more than one placement event.

As simply ranking programs by the number of their graduates subject to placement events between 1998 and 2010 would unfairly favor larger programs, programs have instead been ranked on the basis of a placement "score," calculated as follows: A program's total number of placement events between 1998 and 2010 was divided by the size of the program's annual incoming cohort. The resulting top 50 size-adjusted scores ranged from 1.30 to 0.03. Three programs (University of Utah in Salt Lake City; Florida International University in Miami; and University of Alaska in Fairbanks) experienced a single placement event during the twelve-year period surveyed but could not be ranked because their annual incoming cohort sizes have not been publicized. None of these programs were eligible for a placement ranking higher than #42, however, based on the number of placement events to which their graduates were subject during the survey period. In several instances, programs identical both in size and in their number of placement events received scores resulting in rankings "ties."
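The size-adjusted score works out to simple division, with ties arising exactly where the article notes. A sketch using hypothetical programs (the 1.30 top score mirrors the range reported above):

```python
# Placement-score sketch: total placement events (1998-2010) divided by
# the size of the annual incoming cohort. Hypothetical data throughout.

def placement_score(events: int, cohort_size: int) -> float:
    return events / cohort_size

programs = {
    "Program X": (26, 20),  # 26 events / cohort of 20 = score of 1.30
    "Program Y": (12, 25),
    "Program Z": (12, 25),  # same size and event count as Y: a "tie"
}

scores = {name: placement_score(*data) for name, data in programs.items()}

# Programs identical in size and placement events score identically,
# producing the rankings "ties" mentioned in the methodology.
for name, score in sorted(scores.items(), key=lambda kv: -kv[1]):
    print(name, round(score, 2))
```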

Because fellowships and residencies draw no distinction between full- and low-residency programs, this is the only measure in which full- and low-residency programs were ranked together. This said, the low-residency programs were subsequently granted their own numerical ranking, in recognition of the fact that these programs are hampered by the decreased likelihood that their graduates will seek fellowships or residencies in the first instance (as by definition low-residency students already have full- or part-time employment). It must also be said that, because a program needed to have begun graduating students in 1998 to be eligible for the full survey period, only programs founded in 1996 or earlier enjoyed the full benefit of the twelve-year assessment window, and many if not most low-residency programs were founded after 1996. Likewise, several top 50 programs that might have ranked (or ranked higher) in the top 50 in placement did not do so for the same reason. Top 50 programs founded in the midst of the assessment period include University of Wisconsin in Madison (first class graduated, 2004); University of New Hampshire in Durham (first class graduated as terminal-degree MFA recipients, 2005); and University of Wyoming in Laramie (2007).

These placement rankings should be used with caution. As selection for a fellowship or residency is often the result of one or more individuals being the "consensus pick(s)" of a panel of judges, and as consensus in art has not always, historically, favored innovation, it is possible for fellows and residents to be, in some instances, amongst the most talented, but not necessarily the most innovative, of their graduating year's national cohort. This is by no means to impugn, or remark upon, the writing of any particular fellow or resident, or on the selections of any particular fellowship or residency. Instead, the emphasis in this caveat is on causation: Applicants should presume neither that a program with a high placement ranking can ensure them a fellowship or residency, nor that a program with a high placement ranking necessarily hosts the strongest student cohort if innovation, rather than technical mastery, is the particular interest of the applicant. On these points the rankings make no specific claim other than to note these important distinctions.


Stop Publishing These Rankings

Dear Poets & Writers,

I can't believe Poets & Writers is going to keep publishing these ridiculous rankings. First of all, doesn't it occur to anyone that the values of a group of people who frequent an MFA blog might not be the same as the values of the general MFA community? There's no way to tell how good a program is going to be by staring at a hundred program websites and comparing their funding packages, which is what a group of people answering polls on a blog are doing. You shouldn't apply to an MFA program in order to become a person funded by an MFA program; you should apply to an MFA program to become a better writer. And this emphasis on "time to write" is flawed. I'm from Hartford, CT. You want time to write, move to Hartford. You can rent a one-bedroom for $250 a month, and write all the time. Good teachers. A good community. These are what a person should look for in an MFA program, and Seth Abramson is never going to point you toward that. Please stop legitimizing his preposterous internet fetish. Let's go back to when we admitted that this was something you couldn't rank.

Samuel Amadon


Hi Samuel,
You're absolutely right in thinking that the values of the (total) annual national applicant pool are not those of the nation's largest (or, really, any) online community of MFA applicants; the article above (pp. 1-2) emphasizes this point several times and in several different ways. The goal of the polling, which is only one portion of the ranking system as you know, is to measure only the attitudes of those who pool their resources and knowledge when applying to MFA programs by participating in a community of fellow applicants -- those less likely to do so are also less likely to enjoy positive outcomes with respect to the first of the primary goals of the rankings (pg. 1, above: "Specifically, the goals of these rankings and their methodology are the following: Less overall student debt among MFA graduates, more transparency in the promotional materials and public disclosures of existing MFA programs, and greater access, for applicants, to the wealth of conventional wisdom in the MFA applicant community about which programs and which program features are most conducive to a memorable and valuable MFA experience"). You're also absolutely right to say that polling can never offer a complete picture of program quality--that's why the article above says (pg. 1) that the matriculation decision "will finally be made, and must be made, using the rankings as only a secondary resource," why it does not attempt to measure "faculty and community" (two unmeasurables both you and the article agree are not quantifiable) directly but uses applicants' application decisions as an indirect reflection of word-of-mouth about both, and why a good portion of the rankings are assessments of publicly-announced, hard-data program features like funding, selectivity, and postgraduate placement. 
The first measure is aimed at helping applicants avoid unnecessary, crippling debt, which was rampant among applicants before programs' funding information received national release via a single ranking methodology, and the second two hard-data measures aim at helping applicants gauge prospective cohort quality (an imperfect science, one reason the rankings are often cited as "unscientific" in the article above; still, "cohort quality" being one vital element of "community," this does strike at the heart of what you've termed the key to the MFA application/matriculation decision). I know you went to Columbia, as I'm familiar with and enjoy your work, and I think the key for you, as for anyone, is to simply ask whether you enjoyed your experience there and found it, on balance, worthwhile--if so, and I've no reason to think or guess otherwise, the rankings are admittedly of no relevance, as they're not aimed at/toward current students or graduates but only future applicants whose MFA years may still lie ahead. The hope is that future applicants to Columbia (or anywhere else) will be able to use the rankings to get hard data on funding, selectivity, and postgraduate placement, even if they decide the polling portion of the rankings is not helpful to them--though as the article above details (pg. 1) the correlation between what the hard data tells us about program features that affect real lives, and what applicants are saying about where they want to apply, is intimately linked. This suggests that applicants are now able and inclined to use information to make application and matriculation decisions, rather than rumor and guesswork. I can't imagine willingly going back to a time when such an important decision was made without the benefit of even the "secondary resource" of information. 
If you (I mean the generic "you" here) didn't decide where to attend college without the benefit of information, why apply to an MFA that way, especially when it's an unmarketable degree that it's financially dangerous to go into debt for, unlike the B.A.? The response to the rankings among applicants has been overwhelming--more than 98% positive. Those who are not applicants may tend to misunderstand the rankings because, at base, the rankings are not geared toward meeting the needs or interests of those who are not applicants (i.e., whose futures in no way depend on or involve an MFA-related decision). It is much easier to dismiss all the research and information contained in the rankings when one does not need that research or information; those who do need it are saying, en masse, that it is enormously profitable for them to have it, and that's why it keeps getting national release. Again, read pg. 1 above if you have any additional questions about the underlying principles behind, and/or the aim of, the rankings. It's spelled out fairly explicitly there. In any case, I'm glad you wrote in, because these are important questions and concerns. And (side note) congratulations on your recent book! Best wishes, Seth

Samuel,

If you believe that the best writers always make the best teachers; that the aesthetics of a writer determine his or her in-class pedagogy; that an artist of one aesthetic inclination is temperamentally incapable of working productively with an aspiring artist of an entirely different bent; that applicants can conclusively determine, through sheer force of will, which poets and writers (all of whom are individuals they've never met) will be most helpful to their future development as artists... in that case, yes, David's argument might have some purchase. But we'd have to assume that you also cared little about accruing crippling student debt or attending a program with a strong cohort of artists, weren't at all interested in how large, how long, how student-teaching-intensive, how studio-intensive, and how focused on faculty teaching (cf. student-to-faculty ratio) your prospective program would be, and had time to research 200+ programs in grave detail rather than relying on massive online communities where others charitably contribute, for free, such intelligence. Granted, I don't know of any MFA applicant who fits this description--and I've had contact with literally thousands since 2006--but if I do come across any I will pass along the link. The point is, the rankings are the product of a community, and implicitly promote that community; David's comments mention some undoubtedly important considerations in choosing an MFA--and I endorse such considerations wholeheartedly--but nowhere can one find better discussions of such considerations than the polling locus used by the P&W rankings. It's not a coincidence. In any case, hopefully at some point in the future there'll be a possibility of discussing this more responsibly and decently (cf. "this guy"); David knows, I think, that the views he's attributed to me are not mine, and that I've said, from the start, and quite publicly, and repeatedly, that it would be foolish for any person to make an application or matriculation decision purely or largely on the basis of rankings. The difference between me and David is that I think artists are fiercely independent-minded enough to actually do this; meanwhile, David's concern on this score has somehow morphed into A) a categorical opposition to rankings (don't misunderstand his comments; at the time AWP vehemently opposed the very methodologies David's now implicitly endorsing, i.e. those of USNWR and The Atlantic), and B) a brand of advice -- as mystical as it is misleading -- which endows MFA applicants with powers of perception and prediction not even the best artists among us could possibly lay claim to. Be well, Seth


It's unfortunate that P&W would continue to back flawed methodology and a writer whose logorrhea is well-documented (just check out Abramson's responses to pithy statements in this comment section). The first word in the title of the magazine is "Poets" (which Abramson professes to be!) but this ranking and the sheer amount of insecure writing done to back up the ranking (funding...funding...funding...) is so far removed from anything poetic, or even useful to a writer or human being. Yes, this article, this ranking, this comment thread will get the clicks and eyeballs that P&W wants (and probably needs), but is it worth it?

Stovedore,

I'm sorry you feel that way, and sorry also for my long-windedness. This is a complicated issue, and I'll admit that I balk when folks approach it only superficially. Any good faith discussion of the subject would need to be more exhaustive than the sort of pith that finds favor in our drive-by online exchanges -- all too many of which, like your own note, are peppered with irrelevant personal attacks. (These don't help a single applicant.) If you're curious about my poetry, I hope you'll check it out; it's readily available -- and I can assure you, from personal experience, that there's more than enough time and space in the world for both writing poetry and providing a public service for young, under-resourced applicants to MFA programs. Cheers,

MFA rankings

According to the magazine, somewhere on the website there's a complete listing of all MFA programs, domestic and international. Can't find it. What's the URL? Thx.


Hi Sputnik,

These two links -- (http://www.pw.org/content/2011_mfa_rankings_the_top_fifty_0) and (http://www.pw.org/content/2011_mfa_rankings_the_additional_rankings_of_fullresidency_mfa_programs) -- together constitute the largest and most complete listing of domestic and international full-residency MFA programs (as opposed to M.St., M.A., or M.Phil programs) available online or in print. In fact, every full-residency MFA program domestically or internationally that advertises itself is believed to be contained somewhere on these two lists.



Seth, I first found your rankings last year, and couldn't wait for this year's. They are an integral component of my MFA quest. Thank you so much for the time and effort put into the database. Having so much useful information gathered in one place is an inestimable help.


There is no doubt that you do great work with Full Residency MFA Programs. What about Low Residency? Don't they deserve some attention as well?


See here:


At that link, there's also a link to a listing of the additional 36 low-res programs in the U.S. and abroad. And if you read the methodology article (see sidebar) it covers low-res programs as well (there's a separate section). Plus these programs are mentioned in my articles in the print edition of the magazine. Hope you find them helpful! Best,


Doing a Classical Argument on this subject

I am an undergrad, a really low undergrad (sophomore), and I want to get an MFA. Your rankings have helped me make a decision about where to apply and to know that I need an MA in something else. I am thinking of being an editor if I can't make it writing, because let's face it, few can. For an unbiased classical argument, I need as many facts as I can get, and your article helps. How much influence do you have on the rankings? This is a real help, and as someone who lives below the poverty level currently, I appreciate your rankings. Perhaps the naysayers have money that they can throw around, but I struggle, and your rankings have helped me decide. I also look for faculty and community. Those are my top criteria. But for anyone to say that they may dismiss P&W because of these rankings is missing the entire point of P&W. It is a side endeavor. And they should know poor folks like myself rely on such thoroughness.

MFA & PhD versus MA & PhD

Seth - My son is a college junior with a postgraduate goal of getting his PhD in Creative Writing. I have to admit I am somewhat confused about the benefits of an MFA along with a PhD. From what I've seen, both of these are thought of as terminal degrees. So my question is - is it advantageous to get the MFA over the MA if the intention is to get your PhD? Also, there are only about 35 colleges in the U.S. that offer a PhD with a Creative Dissertation. Are there any rankings of these schools?

Hi CWD,


Unfortunately no ranking of CW Ph.D. programs has been possible thus far due to a lack of data, but I'm hoping that will change soon. Suffice it to say that you can expect the programs at University of Southern California, University of Houston, Florida State University, University of Denver, and University of Illinois at Chicago to be in the top 10, and likely also (though with less definite assurance) University of Georgia, University of Missouri, and University of Utah. CW Ph.D. programs are slightly more likely to accept applicants with MFA degrees, I feel, so in that sense an MFA may be preferable to an M.A., but generally you're absolutely right--both are terminal degrees, and one doesn't need more than one terminal degree technically (though with today's CW job market it really couldn't hurt), so one could certainly get an M.A. if one wanted to go on and get a CW Ph.D. (or, as more and more folks are doing, get a terminal CW MFA and then a terminal non-CW English Lit Ph.D.). The question I'd ask, though, is this: Why get an M.A. over an MFA? Why not get the terminal degree instead, in the event something unexpected happens (for instance one hits one's own personal comfort "limit" as to student loan debt, one suddenly can't move from one's current location for personal/family reasons, etcetera)--that way, one would already have a terminal degree, whereas if all you're holding is an M.A. when additional schooling becomes impossible you now have zero terminal degrees. Also, graduate school admissions in CW work almost entirely off one's portfolio, and the MFA gives one more time, generally, to work on one's thesis (and thus, by extension, one's CW Ph.D. portfolio) than an M.A. does. So one's chances of ending up in a top CW Ph.D. program are better, for that reason also, following an MFA. I think the reason many MFA grads get a CW Ph.D. 
is not because of some added practical value--there's no proof yet it really affects one's job prospects, and there are no signs the CW Ph.D. is becoming the new CW terminal degree as some say (there's been almost no growth in the number of such programs in the past decade, whereas there have been maybe 40 new MFA programs over that time)--but because it gives one more teaching experience, more time to write and publish, more time in a supportive community of fellow artists, and so on. And yes, in a "tie-breaker" employment-related situation it might break a tie between two job candidates. Hope this helps, and best of luck to your son! --S.