Photo by Tim Sackton, licensed under CC BY SA.

I came to academia not to stay in academia. Having worked in the development sector for many years, I embarked on a PhD with the goal of satisfying my intellectual curiosity about what I did in practice. I always assumed that I would return to the development sector. During my final year of writing up, though, I found myself applying for many academic jobs. My plan not to stay in academia had changed as I progressed through my training, a reflection of the doxic expectations to proceed on to an academic career. I received my PhD in 2016 and found myself unemployed: having received my degree from a nonelite university, I began to understand the bleak chances of landing an academic job.

As someone who studies bureaucratic practices, I would argue that the process of looking for a job deserves attention in its own right. Contemporary academic hiring practices seem out of sync with the actual processes they coordinate, whether because of an unwillingness to respond to the reality of the academic job market or because of the slow, creaking apparatus of bureaucracy under the heavy weight of tradition.

The application process for a faculty position, which entails sending an electronic package of documents to the hiring department, is fairly standard. Yet the range of specific documents solicited and the degree of customization expected of those documents is staggering. Take, for example, the request to supply a sample syllabus. Even if one already has two or three syllabi in hand from courses one has previously taught, these would not suffice: positions vary in their topical focus, teaching level, and teaching expectations, requiring further iterations. After my first round of applications in 2015–2016, I was the proud owner of eight syllabi.

Constantly asking my advisors to write recommendation letters was another part of the process. Each application asked for a minimum of two letters, while many asked for three; in each case, they had to be submitted at the same time as the application, or at best a week or two later. My advisors wrote all of the letters I asked them for, despite their already considerable workload of teaching, research, and advising.

Why, in the existing labor market, does the application process proceed in this way? Would it not be kinder to everyone involved to first ask each applicant for a cover letter and CV—nothing more—and only later, after shortlisting, to ask for other documents? After all, how carefully can search committees possibly scrutinize the array of documents they currently receive when they are confronted with hundreds of applications? Shortcuts are already being employed. Standardizing and scaling back expectations for initial applications would significantly reduce the burden that the current system imposes on everyone involved and would reflect a more realistic assessment of the job market today. It is heartening to see that a few universities, especially in Europe, have moved in this direction, but change on this front has been slow.

More care could also be taken in communicating with unsuccessful applicants. During my first round of applications, I was surprised by how many departments—especially in the United States—never even wrote to indicate that I had not been hired. Others provided generic responses, which generally took the form of “you were great, we were impressed, but there were over a thousand applicants to choose from.” It is, of course, easy to understand that search committees faced with so many expressions of interest are hard-pressed to provide individual feedback on all of them. Yet the boilerplate responses leave those of us on the outside wondering what, if anything, we could have done differently.

After my first round of applications, I stopped applying for academic jobs in the United States, a decision that coincided with the U.S. presidential elections. While the idea of living in a country with a president like Donald Trump was daunting, this was a decision I made first and foremost in light of the lack of transparency and outdated hiring processes I encountered. At this point, I considered returning to the development sector. But I came to realize that the critical gaze I had learned to train on existing institutions now made the prospect of working within them unsettling. My skepticism of quick fixes and my commitment to an immersive understanding, gained over time, made me an uneasy fit for the development sector, in which short-term funding commitments incentivize project designs that can be implemented quickly and for which change can be measured in a year or two.

While working with a civil-society organization in Tajikistan last year, for example, I critiqued a proposal written by a colleague for relying (unrealistically, I thought) on training of migrant workers while they were in migration. She took my point but asked me: what else can we do? This gave me pause. I knew that the organization’s work was structured by geopolitics, economies of aid, and the discursive tropes of underdevelopment. What could be done was not an entirely open question. So was an imperfect solution better than doing nothing at all?

If anthropology graduate programs are to train students for the private or public sector, they will need to place questions like these front and center. They will need to address the perception on the part of employers that a PhD pegs you as a researcher or makes you overqualified for positions that might otherwise be a good fit. Even then, anthropology alone cannot change the fact that we live and work in an increasingly postemployment economy. Unless there are major transformations in the distributive function of the market and the creation of jobs that do not now exist, this generation of scholars will likely continue to be the “renter generation,” resigned either to move wherever a job presents itself or to do without.