We need to re-think the Ph.D...

A few months ago, I wrote a post about the differences between academic and industry interviews (including some advice for the latter). Since then, I've been involved in more interviews, and I've come to the conclusion that the academic world I came from is largely failing its Ph.D. students.

One of my frustrations with academic research was a lack of appreciation for details. The 'point' of everything seems to be the results themselves - which generate publications - rather than the process of getting to those results. Papers in genomics often illustrate this: to give just a few examples, many publications choose arbitrary analysis thresholds without explanation, or never justify why one statistical test or normalization method was used over another. I suppose you can argue that it doesn't really matter: you're probably going to accept a false-discovery rate of 10%+ anyway, and ultimately, nothing's going to happen if you have a bunch of false positives and negatives in your data.
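To make the threshold point concrete, here is a minimal sketch of the Benjamini-Hochberg FDR procedure, the standard way such cutoffs are applied. The `benjamini_hochberg` function and the p-values below are my own toy illustration, not taken from any particular paper; the point is that shifting the FDR level from 5% to 10% triples the 'discoveries' in this small example, which is exactly the kind of choice that deserves a sentence of justification.

```python
# Toy illustration (not from any specific paper): Benjamini-Hochberg
# FDR control over a handful of made-up p-values, showing how sensitive
# the number of "discoveries" is to the chosen FDR threshold.

def benjamini_hochberg(pvals, fdr=0.10):
    """Return indices of hypotheses rejected at the given FDR level."""
    m = len(pvals)
    # Rank p-values from smallest to largest, remembering positions.
    order = sorted(range(m), key=lambda i: pvals[i])
    rejected = []
    for rank, i in enumerate(order, start=1):
        # BH criterion: reject everything up to the largest rank k
        # with p_(k) <= (k / m) * fdr.
        if pvals[i] <= rank / m * fdr:
            rejected = order[:rank]
    return sorted(rejected)

pvals = [0.001, 0.008, 0.039, 0.041, 0.042, 0.060, 0.074, 0.205, 0.212, 0.360]
for q in (0.05, 0.10):
    hits = benjamini_hochberg(pvals, fdr=q)
    print(f"FDR {q:.0%}: {len(hits)} discoveries -> {hits}")
# FDR 5%: 2 discoveries -> [0, 1]
# FDR 10%: 6 discoveries -> [0, 1, 2, 3, 4, 5]
```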

Things are quite different on the industry side. The results that you generate are directly responsible for your livelihood, as well as that of your co-workers. The point isn't to meet some minimum publishable standard of analysis, but to convince yourself and others that your results are legitimate. It's no surprise, then, that good companies want you to prove that you are an expert in the methods required to generate the results you've touted on your resume.

Which brings me to the title of this post: shockingly few Ph.D.s actually understand the details of the methods underlying their work. They can probably cite every paper published in their chosen discipline, but when pressed they'll admit that they analyzed their data a certain way because a postdoc told them to, or that they performed such-and-such normalization/analysis/test because that's what another paper did.

I completely understand - I spent 6 years in grad school and another 5-and-a-half as a postdoc. I've actually seen PIs tell students to skip the details of their analyses during lab meetings because they're not interested; they only want to see results. Furthermore, I've never seen a lab where members are actively encouraged to spend time at work improving their skills rather than performing experiments and analyzing data as quickly as possible [1].

As we all know, 85% of Ph.D.s will not be going into academia, and I expect that this percentage will only grow as academic jobs continue to become less and less attractive. So regardless of the underlying factors (publish-or-perish, etc.), by focusing on results rather than skills, academia is leaving most of its trainees ill-prepared for the job market in which they will find themselves [2].

If you think that I'm blowing things out of proportion, then consider the following observations: most industry job postings require candidates to have 2-5 years of postdoctoral and/or industry experience beyond the Ph.D. in order to apply for a scientist position (rather than a technician position). Also, my own employer interviews many, many candidates to fill each position, and a very common complaint is that candidates fail to show that they understand the methodology upon which their work is based.

The saddest aspect of all of this is that I've been hearing versions of these complaints since I was an undergrad: most university grads aren't going on to become professors, so why are we training all of them as if they were? My fear is that we're just going to accumulate more and more unemployed Ph.D.s until the system breaks under their weight.

[1] 1) I assume that PIs generally believe you'll learn by doing, but there's a surprising amount you can accomplish by jury-rigging random code off the internet while learning very little of substance. 2) Labs that encourage such 'personal development' must exist, but have any biologists ever seen anyone give a weekly update at lab meeting about how they made their code more elegant or efficient, or how they generalized it and shared it on the lab server? This should be part of the culture.

[2] There's a stronger case to be made here: I honestly think that academic labs are underperforming because their members aren't learning the most efficient ways to accomplish their objectives. There's a total lack of knowledge-sharing among members of many labs, and a lot of reinventing the wheel ad nauseam.