On the Fetid Ritual of Citation: A Polemic Against Academic Necrophilia
Obedience as Scholarship: The Fetishisation of Citation and the Death of Thought
I. Introduction: The Cult of Freshness
In the Cathedral of Academia, recency is holiness. Age is sin. A paper written last year may be gospel; a paper written a decade ago, unless continually exhumed and redressed in new language, is heresy. The altar is strewn not with the bones of wisdom but with the glossy remains of recent publications, perfumed with keywords and formatted according to the latest citation style. At the heart of this grotesque mass lies a dogma so pervasive it is rarely questioned: that relevance is a function of time. That knowledge, like milk, sours with age. That the only worthwhile thinking is thinking that happened recently, in the last three to five years, preferably with a DOI and a peer-reviewed pedigree. And like all dogmas, it is protected not by reason but by ritual.
PhD students, those poor devils still naïve enough to believe their thoughts matter, are the first to learn the chant: “Cite recent literature.” It is intoned not as advice but as decree. “Situate your work within the current conversation,” they are told, as if knowledge were a dinner party where to quote a thinker from the 1990s is to rudely interrupt. Professors, whose own careers were launched quoting minds long dead, now wield the recency cudgel with priestly zeal, policing bibliographies like TSA agents rifling through bags for contraband thought. The implication is clear: unless you can draw a bloodline between your argument and a paper published during the last election cycle, your thesis lacks scholarly legitimacy. Never mind whether that recent paper says anything of value. Never mind whether it’s a rephrasing of a rephrasing of something more intelligent and more dangerous that was written thirty years ago. It is recent. And therefore it must be cited.
This is not scholarship. It is academic compliance masquerading as intellectual rigour. It is not an engagement with the world of ideas; it is a performance of belonging, a symbolic act designed to appease the gatekeepers and pay tribute to the bureaucratic machinery of publication. The demand to cite recent texts is not about building knowledge. It is about keeping the gears turning, the citation indices inflated, the illusion of productivity intact. It is about ensuring that the system feeds itself—more papers, more journals, more conferences, more metrics—while the core of thought remains untouched, unsullied, untroubled by originality.
What has been institutionalised is not knowledge, but cowardice. Intellectual cowardice. The kind that prefers small, incremental adjustments to safe, already-accepted models. The kind that fears the mess of real argument, of real challenge. The kind that reads Wittgenstein through a 2021 interpretive lens written by a tenure-track sociolinguist with a fetish for the word “liminality.” The kind that insists we cite a 2022 rehash of a 1974 model because the 1974 model, though better, is simply too old to be respectable. The kind that treats foundational work as archaeological artefact: interesting, perhaps, but irrelevant to the living rituals of modern discourse.
This essay is an indictment. Not just of the rule, but of the ecosystem that produces and protects it. Not just of the citation window, but of the entire performative economy it supports. What follows is a dissection—not gentle, not polite—of the mechanisms by which academia has surrendered its mandate to interrogate truth and has instead canonised the recent, the timid, the predictable. It is a polemic against a system that punishes curiosity and rewards conformity. Against a culture that treats footnotes as currency and thought as dangerous expenditure.
Let the archaeologists dig. Let the faithful pray to their fresh-cited idols. I will be here, in the dust and the wreckage, digging up the bones they buried too quickly, and asking the only question worth asking: does it still burn?
II. Taxonomies of the Bureaucratic Mind
The citation window rule is not merely a policy. It is a symptom, and like all symptoms, it points to a deeper pathology: the bureaucratisation of the mind. Scholarship, once a vocation for the eccentric, the heretical, and the dangerously curious, has been embalmed in process. Ideas no longer emerge from solitude or insight; they emerge from rubrics. They are not tested by fire but by formatting. We have replaced the academy with an administrative machine, and like all machines, it must be fed—not with thought, but with measurable output.
To understand the citation window, one must first understand the worldview of the bureaucrat. For the bureaucrat, value is what can be counted. Insight, being unquantifiable, is therefore irrelevant. Bureaucracy loathes the exceptional, for the exceptional cannot be indexed. It fears what cannot be reduced to a spreadsheet. Thus, the scholar is not judged by the depth of their insight or the originality of their voice, but by the statistical noise of their references: number of sources, recency of sources, alignment with “current discourse.” Quality has been replaced by timestamp. Thought has been replaced by metrics. Relevance is not determined by argumentative force but by date of publication. This is not academia. It is Kafka rewritten by a committee of librarians.
The modern academic must learn the taxonomy of compliance: how to cite, when to cite, how many times to cite, and whom to cite to signal proper ideological hygiene. The citation window—three to five years, a number pulled from the same part of the collective body as the notion of “impact factor”—is presented as neutral. It is not. It is profoundly ideological. It privileges the recent over the rigorous, the fashionable over the foundational. It assumes that knowledge is an assembly line, and that the latest product must always be better than the last. It assumes that novelty is improvement. It is the academic equivalent of believing that last year’s washing machine renders Newton obsolete.
But ideas are not appliances. They do not expire. They do not become less true with age. And yet the modern scholar must behave as if they do. They must cite not to clarify, not to connect, but to survive. Because the system does not reward risk. It punishes it. The scholar who dares to cite an older, sharper, more enduring work risks being seen as out of touch, ungrounded, unserious. The demand is not for foundation, but for compliance. You must be able to show, like a nervous schoolboy holding up his homework, that you have read the same papers as everyone else, and read them recently. That you have paid your dues to the gods of recency.
So who benefits? Who tightens the screws? Follow the incentives. Publishers, whose profit models depend on the churn of new material, need a citation economy that rewards volume. Journals need fresh references to remain “impactful.” University departments need metrics to justify funding, and government research councils need evidence that the taxpayer is getting measurable “output.” It is a vast, incestuous economy of performance, in which knowledge is not a cathedral but a cubicle. In which every idea is formatted, footnoted, and indexed until it no longer threatens to mean anything.
This system does not accumulate knowledge. It accumulates paperwork. It does not refine ideas—it sediments them into sludge. Knowledge is no longer a mountain climbed, a peak reached by those willing to breathe thin air. It is a palimpsest, overwritten again and again with soft echoes and bureaucratic refinements until the original thought is illegible beneath the weight of its commentaries. The citation window does not preserve truth; it dilutes it. It produces the illusion of progress through referential motion, like a bureaucracy that moves papers from one desk to another and calls it productivity.
The scholar, trapped in this labyrinth, becomes a scribe—not of revelation, but of etiquette. A manager of signals. A curator of safe references. They are no longer expected to think. They are expected to participate in a mimetic loop where each paper cites the last, each thought nods politely to its predecessor, and nothing ever quite says what it means. There is no flame. There is only the ash of consensus.
This is the taxonomy of the bureaucratic mind: classify, timestamp, index, comply. In such a system, the thinker is not a risk-taker. They are a clerk. Their pen is not a sword, but a rubber stamp.
III. Necrophilia in the Library: On the Fetish of Citation
There is a kind of intellectual necrophilia that passes for scholarship. It is not reverence for the dead, nor dialogue with the past. It is not the exhumation of the profound or the retrieval of buried genius. It is the grotesque fondling of academic cadavers—ritualistic, compulsive, unthinking. Citation, in its current form, is no longer communion. It is fetish. No longer honouring the ancestors, but polishing their bones to perform belonging. The library is no longer a sanctuary of the living dead; it is a morgue with a PR department.
This is not about citing Plato. Plato, at least, still speaks. He still offends. He still asks you to think. No, the citation fetish is far more insidious: it is the demand to cite corpses of thought—papers that never lived, ideas that never breathed. Dozens of thinly written journal articles, intellectually sterile, barely distinguishable from each other, now paraded in literature reviews like wax dolls dressed for inspection. Their function is not to inform but to signal: “I know the ritual.” “I am safe.” “I have paid tribute to the gods of recent publication.”
The modern PhD student learns early the art of academic embalming. Not how to think, but how to arrange the bodies. “Cite this paper,” they are told, “even if it says nothing new.” “Include this one—it’s recent.” “Pad this section out with three more sources, just to be safe.” And so the corpse-pile grows. The thesis swells not with meaning, but with residue. With the dead weight of fashionable irrelevance.
This is mandatory padding—academic bulimia. Stuffing the bibliography to the edge of rupture, then vomiting the same names in slightly different order across each paragraph, each footnote, each sanctified section. One can almost hear the creak of coffin lids as the same dozen authors rise again, summoned not for insight but for optics. It is not what they say that matters—it is when they said it, and how many others have said it since.
What is truly obscene is that this necrophilic ritual is praised as rigour. “A well-referenced paper,” they call it. As though reference were virtue. As though to grope more dead ideas is to get closer to truth. This is not rigour. This is theatre. It is an act of scholarly coprophagia, where the remains of the remains are recycled into new papers that reek of nothing but their predecessors. Each new paper says, in essence, “I have read the same things you have. I am part of the club.”
Examine any ten recent doctoral theses. You will find in them the same litany: papers with almost identical abstracts, each claiming to “build upon” previous work, each carefully annotated to show that the author has read the last five years of nothing. Studies that say “this confirms what we already suspected” are cited more than those that dare to overturn assumptions. Papers whose novelty lies in a shifted verb tense. Journal articles where the conclusion is a caution, the methodology a repetition, the insight a whisper into the void.
Here is a real example, and it is not anomalous: A doctoral thesis in sociology, discussing digital identity, cited twenty-nine papers from the past three years that all defined “digital identity” in nearly the same way. Not one engaged with the philosophical genealogy of the term. Not one traced it back to selfhood, to Hegel, to the roots of the dialectic. Instead, the thesis recited the current formulae like mantras—“self-presentation online,” “virtual persona,” “fluid digital subjectivity.” The bibliography glittered. The mind behind it withered.
Another: a doctoral candidate in economics builds a model of transaction costs based on five papers published between 2018 and 2022. All are iterations of earlier models. None depart from Coase. But Coase is not cited. He is “implied.” Instead, the student cites a chain of reinterpretations, each more tepid than the last, until the original idea is as thin as broth. But this is rewarded. The thesis passes. The supervisors nod. It has all the right names in all the right places.
This is the heart of the fetish: citation as ornament. As makeup. As performance of seriousness. No longer is the scholar a reader. They are a curator of names. A decorator of intellectual mannequins. Each paper, each dissertation, becomes a haunted house—full of references, echoing with the hollow groans of texts never truly read, never really questioned, merely displayed like scalps on a belt.
To cite without meaning is to masturbate with corpses. It is a ritual stripped of soul. And yet it is this necromantic pageantry that passes for contribution. It is this embalmed performance that earns degrees, that fills CVs, that sustains the great pyramid of peer-reviewed irrelevance.
Let us speak plainly: the dead are not to blame. They have already done their work. It is we who have failed, who have turned citation into a religion of the recent, who have confused repetition for respect. If we are to think again—truly think—we must burn the fetish. We must cite less and read more. We must exhume not the fresh but the worthy. And we must remember that knowledge is not a graveyard. It is a battlefield.
IV. Peer Review as Ritual Castration
The peer review system, that sanctified barrier between “acceptable” and “unpublishable,” is less a gateway to truth than a padded cell for dangerous ideas. It is not a crucible but a confessional. You enter not to test your thesis in fire but to surrender it to the clerisy for inspection. The peer is not a colleague in the Platonic sense, nor a fellow traveller in search of truth. The peer is a priest. The review is a ritual. The submission is a supplication. And the approval—if granted—is not a recognition of value, but an indulgence. You have confessed. You have recanted. You have cited the sacred texts. You may now proceed.
In this theatre of the pious, originality is not merely discouraged—it is punished. The peer review process functions as an instrument of orthodoxy, a mechanism by which deviation is located, flagged, and neutralised. To stray from the accepted canon is to risk excommunication. To propose a novel framework is to invite a demand for “more citations” that ground it in the already-approved. To dissent methodologically is to be labelled “unrigorous.” To break form is to be accused of sloppiness. To challenge the prevailing assumptions is to be deemed “unhelpful to the field.”
The greatest crime in peer review is not error—it is heresy.
The aspiring author learns early that the game is not to think, but to guess what their reviewers already believe. To simulate their vocabulary. To dress their argument in the right garments—terminology, references, scope—so as to avoid offence. Reviewers do not read looking for merit. They read looking for threat. For deviation. For indicators that the paper may disrupt, may expose, may undermine the intellectual scaffolding on which their own reputations precariously rest. And so the paper is marked not for what it says, but for how safely it says it.
There is a pathology in the very structure of the process: anonymity, asymmetry, hierarchy disguised as collegiality. The reviewer is invisible, unaccountable, and often less informed than the author. The author is vulnerable, revealed, and at the mercy of moods, grudges, and interpretative quirks. This is not an exchange. It is a trial in absentia.
And what is the cost? What is lost in this ritual?
Ideas that burn too hot are extinguished before publication. Dissenting methodologies—those that step outside the comfort zones of the familiar—are dismissed as “unfamiliar,” “idiosyncratic,” “not in line with current practice.” Iconoclasts are told their work is “premature,” “underdeveloped,” or—most damningly—“unhelpful to the discourse.” The boldest questions never make it to print. The strongest critiques are edited into diplomatic vagueness. The sharpest minds learn to sand down their edges, to retreat into the passive voice, to wrap their thesis in so much methodological padding that by the time the paper is published, it resembles not a declaration but a weather report.
Peer review has become a prophylactic against risk. It does not elevate truth. It contains it. It reduces the potential for intellectual damage by reducing the ambition of the thought. It trains scholars to think like bureaucrats: cautious, polite, suspicious of anything that might not land well in a departmental meeting or might trigger a review committee. It breeds predictability. It rewards those who learn to navigate its invisible rules, who can quote the canon while pretending to critique it, who can call for reform while never proposing anything too precise.
Innovation, when it appears, does so not because of peer review but in spite of it. The breakthroughs of thought come from the margins—from thinkers who never received a grant, never held a chair, who were ignored, rejected, even mocked. The history of intellectual progress is the history of men and women who were not “constructively critiqued” into mediocrity, but who insisted on their vision until the system, reluctantly, followed.
Yet peer review now functions not to select the worthy but to sanitise the field. It is less a guardrail than a censorship board. Less a community of scholars than a cartel of mutually assured compliance. It does not ask, “Is this true?” It asks, “Is this familiar?” It does not reward clarity. It rewards camouflage. It does not sharpen the sword. It blunts it, lest anyone get cut.
So let us stop calling it what it is not. It is not peer review. It is peer appeasement. It is ritual castration—voluntary, systemic, revered. The young scholar enters with fire and exits with footnotes. The iconoclast submits and emerges a eunuch.
This is the cost of safety. This is the price of approval. And it is paid with every thought that dares not speak, with every idea that dies on the altar of reviewer comments, with every paper that becomes a palimpsest of compromise.
If there is to be thought again—genuine, sharp, unafraid—it will not emerge from within this machine. It will come from those who walk past the temple. Who publish elsewhere. Who write, not to please, but to challenge. Who remember that knowledge is not a community garden—it is a field of battle. And truth, when it finally arrives, rarely comes peer-reviewed.
V. Canonising the Petty: How Scholarship Rewards the Trivial
Modern scholarship does not reward brilliance. It rewards manageability. The academic machine—slow, paunchy, and risk-averse—prefers its intellectual offerings digestible, decorous, and above all, predictable. It canonises the trivial, not because it is meaningful, but because it is palatable. The path to publication is not paved with insight, but with caution. It is the mind that dares the least that is most likely to be cited, funded, invited, published. The result is a vast, heaving archive of intellectual wallpaper: bland, sterile, and designed to offend no one.
We are told that novelty is the golden metric of worth. “Your paper must make a novel contribution,” they say, wagging their institutional fingers with the gravity of high priests. But what constitutes novelty in the modern context? Not a paradigm shift, not a conceptual rupture, not the unearthing of a contradiction so profound it demands we rethink the field. No, novelty has been reduced to syntactic rearrangement. To micro-variation. A known theory applied to a slightly different dataset. A popular framework reworded with updated terminology. A tired methodology used on a fashionable subject. It is not thought, but the appearance of thought. Not creation, but formatting.
This charade is not accidental. It is structural. The incentive system is designed to elevate the harmless. To reward the paper that cites the right names, hints at the right conclusions, and adds a new brushstroke to the already-accepted canvas without ever questioning the painting itself. The scholar who dares to colour outside the frame—who questions the assumptions of the field, who points to the emperor’s nudity—is labelled “unpublishable,” “disruptive,” “lacking in scholarly rigour.” Meanwhile, the scholar who changes the title, flips the variables, and fills ten pages with what amounts to an academic synonym exercise is celebrated for contributing “novel insight.”
This is the canonisation of the petty. The veneration of the footnote. The glorification of micro-theorists who use ten thousand words to say that something “may suggest” what someone else already concluded last year. It is the peer-reviewed equivalent of rearranging deck chairs on the Titanic—and being granted tenure for it. Journals now overflow with studies that “confirm,” “extend,” “suggest further work,” or “highlight complexity.” Almost none conclude anything. Almost none threaten anything. Almost none matter.
And why would they? The publishing economy doesn’t value meaning. It values output. Metrics, citations, downloads, altmetrics, h-indices, impact factors—none of these measure insight. They measure volume. Noise. Visibility. They reward papers that can be cited frequently, even if cited meaninglessly. Citation bait is now an art form: choose a hot topic, adopt the prevailing jargon, and throw in just enough ambiguity to allow other scholars to reference you without having to actually engage with you. You don’t have to be right. You just have to be usable.
The effect is an intellectual inflation so profound it makes fiat currency blush. Each published paper adds more to the glut, not the good. The economy of scholarship does not grow in value—it dilutes. Just as hyperinflation renders money meaningless, hyperproduction renders thought empty. In such an economy, the truly radical, the profoundly challenging, the brutally honest voice is not only ignored—it is excluded. The system has no use for ideas that are not easily absorbed, cited, and forgotten.
Thus, we are left with the illusion of progress: a glut of publications, an endless churn of conferences, a cascade of titles that differ in phrasing but not in substance. A thousand papers on the same topic, all nodding politely at each other, all terrified of drawing blood. Academia has become a bureaucracy of the mind, where significance is not measured in clarity or challenge, but in compliance and output.
And the true tragedy? Many of these minds are capable of more. Behind the feigned humility and the hedged conclusions are thinkers who once believed in the fire of truth. But the system trains them out of it. It teaches them to produce, not to think. To build careers, not canons. To add to the literature without ever risking themselves in the process.
So here we are. Standing on a mountain of paper, built with the trivial and canonised by the safe. An economy that multiplies words but starves meaning. A church of scholarship that worships novelty but crucifies dissent. The age of thought has not ended—it has been outsourced. And in its place, we have become scribes of the inconsequential. Curators of polite irrelevance. Priests of the petty.
VI. The Tyranny of Citation Indices and the Illusion of Contribution
There was a time when knowledge was pursued for its own sake. A time when to write was to think, not to measure. But that age has collapsed under the weight of metrics, and in its place we have built a panopticon of intellectual compliance—a regime that tracks, tallies, and tabulates every utterance. Citation indices, those vacuous arithmetic gods, now rule over the academy with all the subtlety of a spreadsheet and all the mercy of a machine.
The scholar is no longer judged by depth or daring, but by digits. The h-index—a crude amalgam of volume and popularity—has become the intellectual Fitbit of academia. A scholar with an h-index of forty is presumed to think better than one with fifteen, as if thought were measured in clicks, as if citations were endorsements, and not the result of the same self-replicating loop that rewards the tame, the useful, and the easily referable. We have replaced philosophy with accounting, and the thinker with the analyst. Epistemology has been converted into statistics—truth transformed into data, insight into performance.
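Lest the deity seem mysterious, here is the whole of it. A minimal sketch of the h-index computation in Python; the definition is Hirsch’s, and the citation counts below are invented purely for illustration:

```python
def h_index(citations):
    """Return the largest h such that at least h papers
    have been cited at least h times each."""
    counts = sorted(citations, reverse=True)
    h = 0
    for rank, cites in enumerate(counts, start=1):
        if cites >= rank:
            h = rank   # this paper still clears the bar
        else:
            break      # every later paper is cited even less
    return h

# Invented citation counts for a hypothetical CV:
print(h_index([120, 80, 40, 12, 9, 9, 3, 1, 0]))  # prints 6
```

Observe what the number cannot see. One epochal paper, however widely cited, leaves h at 1 until the rest of the CV catches up; forty interchangeable papers, each cited forty times, yield a triumphant h of 40. The arithmetic is the ideology, and the number is the performance.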
And what a performance it is. A carnival of citation, where everyone is both juggler and clown. Papers are written not to express thought, but to accrue references. Research is framed not around questions that matter, but around what can be published quickly and cited easily. It is a marketplace of intellectual branding, and citations are the currency. You cite me, I cite you. We form citation syndicates, reciprocal clubs of mutual inflation. We game the metrics like Wall Street insiders gaming a fragile economy—except here, the currency is reputation, and the cost is integrity.
This is the economics of citation: incestuous, transactional, and cynical. Self-citation is no longer a vice; it is a tactic. Strategic referencing has become a skill taught to PhD students, as essential as methodology. Some journals quietly encourage authors to cite articles from the same journal, thereby boosting the impact factor like a snake feeding on its own tail. There are citation farms, citation rings, and citation manipulation tools—entire ecosystems built not around knowledge, but around signalling visibility. The more you are seen, the more you are believed to matter. The more you are indexed, the more your institution smiles.
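The trick is visible in the arithmetic itself. The standard two-year impact factor of a journal in year $y$ is

\[
\mathrm{IF}_y = \frac{C_y(y-1) + C_y(y-2)}{N_{y-1} + N_{y-2}}
\]

where $C_y(t)$ is the number of citations received in year $y$ by items the journal published in year $t$, and $N_t$ is the number of citable items it published in year $t$ (notation mine, for illustration). Every self-citation a journal coaxes out of its authors lands in the numerator; the denominator does not move. The snake feeds, and the number grows.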
Because this is not merely a personal game—it is institutional. Funding agencies don’t fund thinkers. They fund numbers. They fund graphs. They want evidence of “impact,” and impact means citations. Promotions are awarded based not on the weight of one’s work, but the numerical evidence of relevance. Departments are ranked by aggregate citation metrics, faculty by Google Scholar profiles, universities by the airbrushed illusions of productivity. The result is a self-sustaining loop of performative publishing, in which everyone acts as if knowledge is advancing—when in fact, only the numbers are.
The scholar becomes a brand. Their writing, content. Their citations, likes. Their journals, distribution channels. The university becomes a PR firm, churning out press releases masquerading as discoveries. The very act of thinking becomes subservient to the requirement of scoring. And the most tragic irony of all? The most visible researchers often produce the least original work. They have mastered the art of academic marketing—writing papers designed not to think but to echo, not to question but to fit.
There is no room in this regime for the obscure genius, the difficult voice, the thinker who does not play the game. Those who cite little, who write slowly, who wrestle with problems too complex to package into neat abstracts—they vanish from view. The system has no metric for them. And what cannot be measured, cannot be valued.
And so the academy collapses inward. Not with a bang, but with a click. Truth, once the goal, becomes secondary to performance. Contribution becomes illusion. Insight becomes appearance. We have built a world in which the measure of your thinking is how often your name appears, not whether your thought matters. A world in which the citation index is god, the h-index is scripture, and every footnote is an act of devotion.
This is not scholarship. It is scorekeeping. It is intellectual karaoke—everyone singing the same tune, louder and faster, hoping to be heard above the noise. The tragedy is not that it exists. The tragedy is that it works.
VII. Obedient Minds, Cowardly Pens: The Death of Dangerous Thought
The modern academic is a domesticated creature. Once a figure of defiance—lean, alert, armed with dialectic and disdain—the scholar has become a docile functionary. No longer Socrates in the square but a mid-level analyst of thought, a bureaucrat of nuance, hunched behind institutional firewalls and ethics committees, filing theoretical paperwork that no one will read and even fewer will question. The intellectual ecosystem does not reward courage. It selects against it. It trims the deviant, starves the dangerous, and nurtures only those minds meek enough to cite properly and smile politely.
This is not a coincidence. It is a feature, not a flaw. The system trains scholars to become clerks. The thesis becomes a form; the paper, an application; the argument, a compliance checklist. Students learn quickly that controversy is costly, disruption unpublishable. Ask the wrong question and your supervisor will gently steer you toward “something more feasible.” Challenge a sacred framework and you’ll be told your “literature review needs further development.” Passion is pathologised as “lack of objectivity.” Originality is “methodologically unsound.” The herd instinct is institutionalised, stamped into every peer review comment, every funding application, every nervously hedged sentence. And so the mind that once sought to dismantle, to interrogate, to dare—learns instead to mitigate, to frame, to yield.
It is not thinking that is demanded now, but temperament. And the temperament of the successful academic is indistinguishable from that of a senior HR manager: conflict-averse, process-driven, fluent in jargon, terrified of offence. They do not write to pierce or provoke; they write to be seen as responsible. They pepper their prose with apologies. They decorate the trivial with footnotes and the obvious with diagrams. They call nothing by its name. Their work is a kind of affective paperwork—well-formatted, theoretically grounded, utterly dead.
Gone is the philosophical courage of the outsider, the gadfly, the thinker who writes with blood. In their place: the paper-pusher of paradigms. The scholar as human resources specialist, intellectually neutered and institutionally beloved. The sort of person who writes an article titled “Re-theorizing Subaltern Frictions in Post-Post-Structuralist Hermeneutics” and includes a disclaimer in the abstract apologising for the lack of indigenous co-authorship. The sort of person who, faced with a profound epistemological contradiction, offers a mediating framework instead of a critique—because critique, in this age, is violence.
What we have lost is not knowledge, but nerve. It is not that scholars are stupid. It is that they are scared. Scared of saying too much, of offending the orthodoxy, of stepping beyond the boundary of approved ambiguity. They seek tenure, not truth. They chase citations, not confrontation. They write in the language of permission, always anticipating review, always hedging, always watching their own tail. The most pressing question in their mind is not “Is it true?” but “Will it be well received?”
This cowardice has consequences. Dangerous thought—the kind that shatters categories, that offends and remakes, that leaves nothing standing—cannot survive in such a climate. It is stillborn, aborted by feedback, killed by committee. The system has antibodies against it. Daring is reframed as ego. Precision as aggression. Urgency as unprofessionalism. And so, slowly, one generation trains the next to think smaller, to write longer, to pad every claim, to fear every footnote.
This is not merely the death of dangerous thought. It is the murder of the intellectual vocation. The scholar has become the sycophant of process, the servant of ever-narrowing norms. They no longer carry fire—they carry forms. Their pens are cowardly because their minds have been taught to obey. And obedience, once learned, is hard to unlearn. It lingers in every paragraph. It whispers, “Don’t say that. Don’t go there. Cite more. Soften that claim.” It ends with silence.
So what remains? Journals filled with sanctioned mediocrity. Conferences that celebrate the incremental and the inert. Departments that train minds not to leap, but to step carefully along pre-approved lines. The university becomes a filing cabinet. And the scholar? A clerk of ideas, content to alphabetise, to reframe, to apologise, and—always—to obey.
If there is to be rebellion, it must begin with refusal. Refusal to comply. Refusal to cite the banal. Refusal to apologise for thinking. The dangerous mind must write again—not for acceptance, but for truth. Not for tenure, but for fire. Let the forms burn. Let the clerks tremble. Let the pens bleed again.
VIII. The Myth of Critical Thinking in a Compliance Economy
There is no lie more sanctimoniously repeated in the halls of modern academia than the hollow catechism of “critical thinking.” It adorns course outlines, institutional manifestos, and PowerPoint slides with the piety of a slogan. Administrators peddle it like incense at the altar of relevance; lecturers chant it as the sacred function of higher learning. But scratch the surface and you’ll find a grotesque inversion: the institutions that trumpet critical thinking most fervently are the first to punish it when it manifests. They demand it only in form, never in substance. They praise it in the abstract but smother it in practice.
Because what they call “critical” is not critical at all. It is compliance disguised as inquiry. The student is trained not to challenge, but to mimic the process of challenge. To gesture at dissent while remaining squarely within the fence. “Critically analyse” becomes a ritual incantation: it means criticise the obvious, flag the expected problems, reaffirm the accepted contradictions, and most importantly, never pierce the ideological membrane of the discipline itself. It is a dance of signals—show awareness of multiple perspectives, so long as all those perspectives are already canonised.
In this theatre, true dissent is not encouraged. It is pre-emptively neutralised. Even when methodologically sound, even when backed by rigorous reasoning, dissent that questions the ideological foundations of a field is marked for rejection. The peer reviewer demands more literature to ground the critique—which is to say, more references from the very worldview being critiqued. The journal editor finds it “out of scope.” The supervisor labels it “premature.” The real message is always the same: stay within the sandbox. Dig as deeply as you want, as long as you never tunnel out.
It begins early. The undergraduate student, eager and unformed, is taught that a good essay is one that follows a structure, cites the prescribed texts, and “engages with” the dominant views—usually by echoing them with slightly different emphasis. Original thought is viewed with suspicion; creativity is indulgence unless wrapped in footnotes. The marking rubric rewards the use of key terms, not key ideas. It rewards synthesis, not challenge. It rewards fluency in a language of borrowed caution, not the articulation of risk. The student who asks too sharp a question, or worse, answers it, is warned to “narrow the scope.”
By the time they reach a PhD, the lesson is etched into their bones. The defence of a doctoral thesis is not a crucible of ideas—it is a tribunal of bureaucratic etiquette. You must show you’ve read the sacred texts. You must nod to the elders. You must anticipate every critique and defuse it in advance. You must flag every possible risk in your methodology, until what remains is a neutered argument protected by a wall of disclaimers. You are not there to claim anything. You are there to demonstrate that you know how not to offend.
What we call rigour is, more often than not, a euphemism for caution. For conformity. The entire educational pipeline trains thinkers to be risk-averse. To look both ways before crossing every intellectual street. To soften every claim. To insert “may” and “appears to suggest” and “could be interpreted as” until the thought itself is barely visible beneath the hedging. And the ones who do this best—who learn the choreography of institutional cowardice—are the ones rewarded. With grants. With fellowships. With publication. With tenure.
This is not a university. It is a monastery of manners. A training ground for rhetorical eunuchs. It is a compliance economy, where critical thinking is not thinking critically but performing the appearance of reflection while never daring to challenge the core dogmas. Like a courtroom drama in which every objection is scripted and the verdict is already written. The heretic is never allowed to speak without a footnote. The iconoclast must first prove that their icon-breaking is consistent with existing literature.
In such a climate, the most radical voices are not silenced by violence—they are simply never heard. They are filtered out before they reach the microphone. Because true critical thinking, the kind that cuts, that wounds, that dismantles assumptions, is too dangerous for the house of glass that the academy has become. And so we raise generations of students who can quote Foucault but never risk surveillance. Who can deconstruct power but never defy it. Who write essays on liberation while editing their tone for “professionalism.”
And we call this thinking.
It is not. It is theatre. It is choreography. It is the simulation of danger in a sandbox lined with pillows. If there is to be any return to real thought, we must begin by burning this fiction to the ground. We must stop praising the dance and start asking whether the dancer is moving toward truth—or merely avoiding offence. The myth must die before thought can live again.
IX. The Cult of the Present: Academia’s Temporal Chauvinism
The academy claims to worship history, but only in its museums. When it comes to ideas, it kneels at the altar of the recent. The citation rules reveal the truth with a bureaucrat’s bluntness: cite within five years. As if thought expires like dairy. As if the most pressing philosophical, economic, or political insight is not the one that lasted centuries but the one published after the last Olympics. This is the dogma of presentism—the belief that proximity to the now is proof of value. It is not just intellectually bankrupt. It is a kind of chronological supremacism, a fetish of the contemporary masquerading as rigour.
This is not scholarship. It is a superstition. It presumes that age invalidates relevance, that intellectual sediment is inherently unclean. That Darwin is a curiosity, but the latest synthetic summary of Darwin is somehow superior because it was typed on a newer keyboard. That Hume can be mentioned in passing, but only if followed by three recent reinterpretations to show one is “up to date.” That ideas published when the internet was slower are less evolved, less reliable, less serious. And yet, ironically, most of what passes for contemporary work is a mild remix of precisely those older insights—flattened, diffused, and wrapped in disclaimers to suit modern sensitivities.
The arrogance embedded in this is staggering. It presumes that our intellectual moment, our generation, is the apex. That we, because we are newer, are smarter. That the person writing in 2022, tethered to the same human limitations, biases, and bureaucracies as their predecessors, is more insightful by sheer accident of birth. The same universities that demand you memorise Plato will penalise you for quoting him without referencing a 2019 journal article that explains him through a “neo-structuralist postcolonial lens.” It is not the old that is the problem—it is the unmediated old. Foundational work must be filtered through fresher lenses, pre-approved and properly timestamped.
This reverence for recency is not just misguided. It is cowardice in disguise. The modern academic clings to the new not because it is more compelling, but because it is safer. The foundational texts are dangerous—they have teeth. They ask real questions. They propose universals. They do not apologise. Engaging with them demands commitment, confrontation, and clarity. It demands judgement. Better, then, to rehash the recent: to stay within the safe playground of sanctioned discourse, where disagreement can be handled with polite citations and any real rupture can be delayed for a future literature review.
And so we have the bizarre phenomenon of an academy obsessed with “progress,” while doing everything in its power to avoid the discomfort that real progress entails. It buries itself in temporal chauvinism, the belief that today’s terminology is somehow more precise, that our jargon is evolution, not entropy. Every seminar on “decolonising the syllabus” begins by adding a few newer names but never questions the tyranny of the calendar itself. Every update to a reading list is less about truth and more about optics. We are not reading better. We are reading newer. And calling it growth.
There is a vast difference between development and change. Development builds upon foundations. Change merely replaces. The former requires memory. The latter forgets by design. Presentism severs the root system of thought. It leaves the young scholar adrift in a sea of citation flotsam, mistaking proximity for profundity. They do not know where ideas come from. They know only who wrote about them last. They learn to skim, not to trace. To patchwork, not to synthesise. To look busy in the wreckage of tradition, rather than rebuild anything meaningful from its ruins.
To prefer the recent is not to think forward. It is to regress inward. It is to become addicted to the dopamine of publication dates, to mistake temporal nearness for conceptual clarity. It is to participate in a ritual of perpetual amnesia—one that flatters our moment while hollowing it from within. It is to speak of “interdisciplinarity” while refusing to speak across time. It is to lose all sense of lineage, of debt, of conversation.
This is not progress. This is looping. This is intellectual treadmill work. And the scholar becomes a hamster—well-read, well-meaning, spinning in place. If academia is to mean anything again, it must shatter its obsession with the new. It must stop measuring ideas by their freshness and start judging them by their fire. Not “when was it written?” but “does it still burn?” If not, then no timestamp will save it. If yes, then no date should dare disqualify it.
Until then, we remain prisoners of our own calendar, worshipping the now, ignorant of the shoulders we no longer bother to stand on.
X. Citation as Currency: The Closed Economy of Academic Vanity
In the glittering casino of academia, citations are chips. They do not measure value—they signify status. Not the status of the idea, but of the individual. Not the merit of the argument, but its performative success within the closed economy of institutional narcissism. In this economy, knowledge is incidental. What matters is visibility. What matters is citation count. Citations are not acknowledgements; they are transactions. They are the currency of a reputational marketplace in which names are traded like stock symbols and meaning is merely a collateralised afterthought.
This is not intellectual exchange. It is a cartel economy. Citations do not flow naturally—they are directed, managed, coerced. Scholars cite their friends, their supervisors, their department heads. They cite their own previous work at a rate that would embarrass even the most prolific egotist. And when they don’t know who to cite, they cite whoever the last paper cited. A cascade of references built not on need or insight, but on mimicry and allegiance. The result is a self-reinforcing loop, a hall of mirrors in which everyone is referencing everyone else, and no one is reading anything with intent.
These are the citation cartels—closed circles of mutual flattery, where scholars elevate one another through reciprocal mention, not because the work demands it, but because the metric does. The journals support it. The editors benefit from it. The departments incentivise it. Everyone plays along, because not playing is professional suicide. To refuse the citation game is to starve in a feast of empty names. And so we have citation farms: fields of articles grown solely to be cited. Not to be read. Not to be challenged. Just to exist, to accumulate references, to be mulch for the next crop of derivative drivel.
It gets worse. The publishers—those sanctimonious mediators of thought—have learned how to extract rent from this ecosystem. They understand that prestige is a brand, not a quality. Journals cite themselves to inflate their impact factors. Editors encourage references to articles from their own issues. Conferences award best-paper prizes to submissions rich in canonical names and on-trend jargon, because trendy equals citable, and citable means the next funding cycle will be fruitful. Workshops exist not to explore but to accumulate—one more line on the CV, one more chance to perform inclusion, one more opportunity to be seen citing the right people.
This is not an economy of knowledge. It is a Ponzi scheme. The early entrants—those who built their reputations when scholarship still permitted thought—now rest on pyramids of passive income. Each new paper that cites them extends their academic lifespan. They do not need to write anymore. Their name alone is enough to sustain a whole industry of derivative papers, each referencing them to gain legitimacy. The newcomers, by contrast, must do all the work. They must publish constantly. Cite voraciously. Network relentlessly. Play the metrics like piano scales. All for the chance to enter the loop, to be among the cited rather than the citing.
But like all Ponzi schemes, the system cannot scale indefinitely. There is too much content, too little attention. The citation economy is bloated, and its currency is hyperinflated. A scholar may have hundreds of citations and say nothing. Another may have written one great thing and vanish in the noise. The system does not reward the meaningful. It rewards the repeatable. The game is not to think—it is to circulate. To attach one’s name to enough places that the illusion of contribution becomes indistinguishable from contribution itself.
And so the scholar becomes not a thinker, not a teacher, but a brand. A producer of citeable material. A player in the reputational casino, hoping that the right reference in the right journal will elevate their index score and, by extension, their value. But value to what? To whom? No one knows. It is a game of numbers chasing numbers, a treadmill that feeds itself with the husks of discarded ideas and calls it prestige.
The tragedy is not the corruption. The tragedy is the obedience. Most know this is farce. Most know that their article will be read by four people, three of whom are co-authors. Most know that citations have become a game of stacking straw. But they do it anyway. Because the system is closed, the incentives perverse, the illusion total.
This is not the pursuit of truth. This is vanity economics. Citation as currency, scholarship as performance, publication as prestige laundering. The academy has become its own central bank—printing citations, inflating metrics, and pretending the whole thing means something. But all it means is that we have built a currency of ideas backed by nothing but collective pretence. It is the intellectual equivalent of fiat fraud. And the only thing more hollow than the currency is the silence that follows when you dare to say so.
XI. Historical Amnesia: The Lost Art of Lineage
The academy, once the steward of memory, now suffers from a deliberate and cultivated amnesia. It has severed its own roots and calls this pruning “progress.” Its students, armed with highlighters and citation managers, roam the ruins of thought with no sense of lineage, no awareness of inheritance. They know the derivatives but not the origins. They quote the interpreters but never meet the authors. They walk through the house of knowledge as though it were a rental, not a home built over centuries. And when asked to trace an idea to its source, they cite an article from 2019 that references a summary of a paraphrase of something once alive.
This is not an accident. It is a systemic lobotomy. The very structure of academic instruction now discourages primary engagement. Syllabi are loaded with secondary commentary, tertiary synthesis, and meta-level overviews. The original thinkers—those unruly, dangerous minds who wrote without footnotes and often without apology—are filtered out in favour of the safe, the recent, the peer-reviewed. Students are not taught to read Descartes. They are taught to read someone else's reading of Descartes, as if thought were something that must be intermediated, like a drug passed through layers of handlers. By the time it reaches the classroom, it is diluted, digested, domesticated.
In philosophy, the decay is especially grotesque. Ask a postgraduate to explain Nietzsche and you will receive a synthesis lifted from recent journal articles that treat Zarathustra as a symptom rather than a scream. Heidegger is taught not through his own language, but through the antiseptic gloss of some timid translator who replaces “Being” with “subjectivity” and footnotes the urgency out of existence. Marx is reduced to “Marxian themes,” and Plato appears only when someone needs to contrast “rationalism” with “materialism.” The names are cited like tombstones. The voices never heard.
In science, the past is regarded as quaint. Newton is mentioned only to be eclipsed. His original Principia is never read, because “we’ve moved past that.” Einstein is distilled into anecdote and formula. Even Darwin is seldom read firsthand—his Descent of Man ignored for contemporary evolutionary psychology papers that regurgitate his insights with less courage and more PowerPoint slides. The foundational works are treated like myth: vaguely respected but irrelevant. Students are given results, not revolutions. They are told that knowledge grows like a tree—but they are never shown the roots.
Economics, perhaps more than any other discipline, has perfected this erasure. Smith, Ricardo, and Malthus are relics. Keynes is summarised in two paragraphs before the class moves to dynamic stochastic general equilibrium models. Hayek is misunderstood through editorial cartoons. Students spend hours on models built atop assumptions they’re never allowed to question because the source of those assumptions lies beyond the five-year citation window. They are taught to tweak models, not to ask what a model is. They graduate fluent in calculus and illiterate in origins.
The result is a kind of orphanhood. A generation of scholars and students raised without ancestors, without tradition, without the humility that comes from recognising one’s place in a long and bloody history of thought. Instead of standing on the shoulders of giants, they scroll past them. Instead of dialogue across time, they engage in semantic one-night stands with whatever paper was published most recently on JSTOR. The past is dead, and the academy is the killer.
This is epistemological patricide. The slow, silent murder of the fathers and mothers of knowledge. And like all good patricides, it is followed by denial. The academy pretends it has no ancestors. It celebrates each new paper as if it were a spontaneous birth, immaculate and unburdened. No context, no history, no debt. Just fresh data and fashionable framing. It rewards novelty, but it defines novelty as forgetting what came before. The system does not merely allow ignorance—it incentivises it.
To read the ancients, to engage with the origins, is now a luxury. An eccentricity. Something for the eccentric autodidact or the half-mad emeritus professor. The mainstream, the career-minded, the pragmatic—they read what their supervisors assign, cite what their reviewers expect, and forget what their predecessors knew. And the academy applauds them for it.
This is not a loss. It is a crime. Knowledge is not an accumulation of papers. It is a dialogue—a bloody, fragile, contradictory conversation across centuries. When that lineage is broken, thinking becomes mimicry. Insight becomes formatting. And the scholar becomes a creature of the now, speaking a language with no past and therefore no future.
Until the dead are allowed to speak again, the living will have nothing to say.
XII. What Was, What Is, and What Could Have Been
Imagine, for a moment, a world in which thought is not measured by metadata. A world where the scholar is not a clerk of compliance but a cartographer of intellectual risk. A world in which the question, “Does it burn?” outweighs “Was it cited recently?” In this lost, forbidden vision of scholarship, the mind is not tethered to citation quotas, not drugged by metrics, not lacerated by methodological etiquette. It moves freely—backward, forward, and sideways—guided not by the tyranny of recency or the bureaucracy of approval, but by the only things that have ever mattered: utility, insight, and truth.
What would this world look like?
It would be full of dangerous minds. Thinkers who do not apologise for quoting a text from 1844 if it tells more truth than a dozen polished PDFs from 2023. Writers who don’t contort their arguments to please reviewers, but whose language slices through orthodoxy with a clarity unblunted by institutional politeness. Students who read original works not because it is quaint or nostalgic, but because they understand that foundational texts are not fossils, but flares. In this world, citations would not be footnotes to prove social compliance—they would be weapons, bridges, provocations. Not ornaments. Not offerings. Not paperwork.
There would be no citation window. There would be no “three-to-five year” ritual of bureaucratic purity. Instead, relevance would be determined by resonance. A paper from 1921 would be as welcome as one from last week, if it said something that mattered. The scholar would not be asked to pad their work with contextless noise. They would be asked to say what they mean and mean it fully. Originality would be measured by boldness, not novelty-by-narrowness. There would be no pretence of contribution through volume. A single argument, if it cut deep enough, would be worth more than an entire career of publishing the safe.
There would be no economy of citation metrics, no h-index fetish, no ghoulish counting of footnotes. A thinker’s worth would not be measured by how often their name appeared, but by the wake their ideas left in the minds of others. Institutions would promote not those who managed to insert themselves into the citation cartels, but those whose work made others uncomfortable in ways they couldn’t quite name. Conferences would exist to debate, to challenge, to offend—rather than to exchange polished banalities over finger food and PowerPoint.
And above all, there would be risk. Risk would return as the central virtue of the academic spirit. Not recklessness, not ideological posturing, but genuine, skin-in-the-game thought. The kind of thought that gambles reputation for truth. The kind of writing that offends not because it is crude, but because it is precise. The kind of scholarship that doesn’t flinch when it strikes a nerve.
The return to the dangerous mind is not a nostalgic fantasy. It is a necessity. The university without risk is a factory of euphemism. A department without offence is a morgue. The thesis without danger is a lie. We must stop producing scholars who learn to whisper in fifty-page increments. We must produce thinkers who dare to shout, to cut, to provoke not for the sake of provocation, but for the sake of saying something real.
The enemy is not recency. The enemy is obedience. The enemy is the system that tells young minds to weigh their sentences for career impact rather than truth. The system that breeds scholars who behave like middle managers, citing each other in a circle of soft compliance. The system that has gutted the academy of its soul and left it with only branding.
But this need not be the end.
What was—a tradition of dangerous, singular, unsanitised minds—can still be remembered. What is—a maze of footnotes, metrics, and citation fealty—can still be escaped. And what could have been—an academy alive with fire, fearless in its thinking—can still be built.
It will not come from within. No bureaucracy ever dismantled itself. It will come from those who walk out. From those who write not for tenure but for posterity. From those who publish on the margins, who risk silence rather than self-mutilation. From those who remember that truth is not a brand, and that the mind, when free, is a dangerous thing. And that the only worthy scholarship is the kind that leaves scars.
XIII. Conclusion: The Academy as Mausoleum
Modern academia is not a laboratory of ideas. It is a mausoleum. Cold, sterile, reverent. Rows of citations laid out like tombstones. The illusion of motion preserved by the ceremonial shuffling of names, dates, and footnotes. No voices echo here. No arguments clash. No blood stains the walls. It is a house of the dead—ritually maintained, immaculately indexed, and utterly lifeless.
The scholar, once a fire-bearer, is now a janitor. They do not ignite. They tidy. They do not interrogate. They format. Armed with LaTeX templates and citation management software, they move from paper to paper sweeping up fragments of older papers, arranging them just so, lighting a candle before the impact factor and muttering the creed of methodological alignment. They call it contribution. But there is no contribution in rearranging the dead.
This is not a call for revolution. Revolutions have aims, leaders, manifestos. This is a call for apostasy. For abandonment. For a turning away from the clerisy of the compliant and the priesthood of peer review. Let the temples of polite irrelevance stand. Let the citation cartels pat each other on the back until their wrists ache. There is no salvaging a machine that rewards obedience and calls it thinking.
Instead, walk away. Think dangerously. Read the old without apology. Write the new without permission. Burn the literature review if it doesn’t serve the flame. Reject the sanctity of the recent. Refuse the tyranny of metrics. Cite less. Think more. Write not for their approval but for the possibility that somewhere, someone will read your work and feel the jolt that once made thinking sacred.
We do not need more papers. We need more fire. We need fewer scholars and more heretics. Fewer citations and more screams. Thought was never meant to be a ledger. It was meant to be a wound. So cut deep. Let it bleed. And leave the mausoleum behind.