Archive of ‘post-modernism’ category

In Defence of Hegel

Never thought I’d write a post like this. But it took a politician to cast aspersions against a University of Sydney philosopher of the continental persuasion in our recent election campaign to get me to take real notice of Hegel. And I’m pleasantly surprised by what I’ve seen.

I can generally sum up my feelings towards Hegel to date as being a combination of incredulity and abject dismissal. From what I understand of his philosophy – largely gleaned from Bertrand Russell’s adroitly droll treatment in A History of Western Philosophy – Hegel’s main themes are reasonably interesting, if unremarkable. His notion that carving the world up into discrete chunks for contemplation mars the unity of all things – an idea I am more familiar with from Taoism or Madhyamaka – is one to which I am very sympathetic. His other big idea about the intrinsic progressiveness of history is also of interest, even if his argument as to why it is so is almost certainly wrong.


Prescriptive Art

Prescriptivism is a new art movement that embodies the notion that art can not only act as a mirror giving us a critical view of the world as it is, but can act as a window through to a better world.

Under the near-ubiquitous influence of postmodernism, art has become largely critical. For the past several decades contemporary art has implicitly sought to challenge objectivity and encourage us to reflect on the way we subjectively project meaning onto the world. Furthermore, it has sought to reveal the forces that shape the way we see the world, challenging us to divorce ourselves from the powers that be which seek to steer our worldview to their ends.

Critical art is descriptive: it seeks to hold up a mirror to ourselves and the world, revealing the way they really are – and ultimately presenting a sceptical argument stressing there is no way they really are.

One of the main tools of critical art is subversion. It seeks to disrupt our ways of seeing, and make us aware of the way we invest objects and scenes with meaning. It challenges us to question the way we construct reality.

Critical art is important, but it is not the only role that art can play. Art can also be prescriptive. Art – as it was not so long ago in our history – can be a window through to a better world.

Prescriptive art seeks to open that window, to open our imaginations not to the way the world is, but to the way it could be; the way the world should be.

Prescriptive art is not wedded to postmodernism. It is not implicitly sceptical of reality or of perception. Yet it isn’t naive about objectivity or the difficulty or even impossibility of seeing the world as it really is. Prescriptive art is an orthogonal movement to postmodernism, and one which is both challenging to the postmodern critical paradigm and complementary to it.

Postmodernism reveals the world in an unflattering light, demanding that it should be changed, yet offering no alternative world in its place. Prescriptive art explores the bounds of the alternatives and encourages us to think actively about how we would shape the world to conform to how we believe it should be.

Prescriptive art is inherently optimistic, but not naively so. It acknowledges the difficulty and risks involved with change. But it insists that if change is to happen, we need to employ our imagination in order to direct that change.

If you wish to engage with prescriptive art, follow this simple maxim: paint* the world as it should be.

*By “paint” I mean any form of artistic practice or expression.

Reality and its Depictions

It’s of interest to me that filmmakers, largely of the Hollywood persuasion, are inclined to modify reality so that it conforms to our expectations of reality rather than, well, real reality.

In the pseudo-reality of the blockbuster, grenades disgorge great plumes of flame and cause provocateurs to hurtle through the air, slowly. In reality grenades evince a short, sharp BANG and emit a cloud of smoke along with a supersonic compression wave that crushes rather than pushes. And that’s not to mention the shrapnel. They rarely produce flame, or drama. Only noise and tragedy.

What’s interesting about this is that if a blockbuster offered an accurate representation of a grenade, the audience would quite likely be thrown into confusion, jolting them out of the fantasy. “What was the puff and bang? It couldn’t have been a grenade.”

You can almost hear the effects department advising the director: “Grenades don’t look like grenades on film. You gotta use pyrotechnics.”

And it’s not just that fireballs are more dramatic than real grenade explosions. I fully appreciate artistic licence. But artistic licence is intended to remove the undramatic elements of reality and replace them with dramatic alternatives. However, grenades are, in my opinion, intrinsically dramatic, at least as dramatic as a fireball. It’s just that puff-and-bang is not what people expect when a grenade goes off on screen. They do expect a fireball.


The Poverty of Postmodernism

You may not realise it, but you’ve probably been poisoned by postmodernism. No-one who lived through the 1970s would have escaped untainted. And just about anyone who underwent schooling or a university education in the 1980s or 1990s received a crippling dose. I was entirely oblivious to my own indoctrination during my undergraduate degree in the early ’90s until only a few years ago.

You can blame postmodernism for the banalities of political correctness.

You can blame it for making contemporary art ugly and incomprehensible.

You can blame it for moral relativism, and the inability to criticise individuals from other cultures when they do plainly heinous things.

You can blame it for rampant individualism and greed.

You can also blame it for words like ‘deconstruction,’ ‘hermeneutics,’ and my favourite, ‘subversion.’ You can even blame it for the identity crisis afflicting the political Left.

The good news is that postmodernism is philosophically defunct. Deep exhale. We can all let it go now. Let it sink to the bottom of the Swamp of Bankrupt Ideas. And we can move on to firmer conceptual territory, in doing so discovering the world is, in fact, more (and less) explicable than we probably think, and intractable problems – like multiculturalism, for one – are more solvable than we realise.


The Revolution is Dead (For Now)

There aren’t any revolutionaries any more. The closest contemporary figure I can muster from the cloudy reaches of my imagination who might qualify as a revolutionary is Julian Assange. Certainly he’s an original thinker, far more so than most people these days.

But even Assange’s revolution is incremental, if profound. He seeks to change the landscape of democracy without necessarily wiping the slate clean entirely. His is not a prescriptive vision of a better world, but a solution to the ills of this one, underpinned by a conviction about the particular nature of corruption – or, as he calls it, ‘conspiracy.’

So where are the true revolutionaries? Where are the visionaries with a compelling view of a better world, one for which we ought to fight to bring into reality? Who’s thinking beyond the contingencies of this world to the possibilities of the next?

There was a time, not so long ago, when revolution was in common parlance and bold visions of a new world were talked about openly, debated, fought over and striven for. Only 40 years ago there was talk of building nothing less than a new civilisation.

What happened?


Can There Be a Science of Morality?

Can we have a science of morality? This question has been thrown around quite a bit of late, especially fuelled by the spirited ejaculations of one Sam Harris. Harris firmly believes there are no barriers to a science of human values, but I fear things aren’t that simple, and I’m not alone in this concern.

While a ‘science of morality’ is a laudable notion in a loose sense, such a science would, by necessity, look nothing like what Harris has in mind. Harris is seeking not only a science of morality, but a science of human values. He wants a “universal conception of human values” that can be checked, verified and proven using the tools of empirical science.

But that’s just not going to work. Science doesn’t do that kind of thing. At least not without assistance from other disciplines, like philosophy. And if we try to force science alone into providing us with values, there is no shortage of traps that will inevitably spring up.


American History Through the Conservative Lens

Noted American historian Eric Foner – noted as much for his scholarship as for his vilification by radical conservatives – has written a wonderful analysis of the new social studies curriculum recently approved by the Texas Board of Education. Foner leans left himself, but he’s an esteemed historian and expert on American history, unlike the members of the Board of Education.

His summary of the new curriculum shows it to be an exemplar of the conservative worldview. It stresses the values of individual enterprise, self-discipline, group conformity, religious obedience and in-group favouritism; and explicitly dodges issues of racism, secularisation, criticisms of capitalism and any values that promote communitarianism, pluralism of values or multiculturalism.

While I acknowledge the value of some aspects of conservatism and I’m critical of some aspects of (particularly post-modern) liberalism, the history class is not the place to have these issues fight their battle – not that there’s even a fight going on here now that the Board has had its say along party lines.

History is a tricky subject to arbitrate; there’s arguably an infinite amount that can be said of history, not just covering the facts of events but interpreting their significance. The crucial thing, though, is to present the facts about historical events as impartially as possible, while acknowledging the various frames and influences that bear on their interpretation, and then to engage in debate over their significance and the lessons learned. We should no more view history through the lens of conservatism than through the lens of radical liberalism.

He Said, She Said: Gender and Pronouns in Academic Papers

The Feminist Philosophers blog has an interesting post on gender discrimination in philosophy. It raises some important issues and, helpfully, cites some empirical research to support its points. This kind of stuff is crucial for philosophers – and academics of all stripes – to keep in mind. No-one likes being told they’re biased; better to detect and deal with your own biases on your own terms.

However, one aspect that isn’t mentioned in that piece is the use of personal pronouns in academic papers. It has become the fashion over the past couple of decades to frown on the exclusive use of “he” in academic papers. However, it’s not that “she” has replaced “he” in its entirety. Instead, we now have an interesting, and complex, mix.

To see what that mix might look like, I conducted an entirely unscientific experiment on my own repository of over 200 downloaded academic papers, covering topics including animal behaviour, economics, evolutionary biology, evolutionary psychology, political science, political psychology and a number of disciplines of philosophy, namely ethics and philosophy of mind.

I conducted a comprehensive search within these papers for the words “he” and “she”, and the results are quite surprising, even if you did expect to see an imbalance:


All papers

“he” – 4,413 instances in 163 documents

“she” – 997 instances in 101 documents

Philosophy only

“he” – 252 instances in 18 documents

“she” – 114 instances in 9 documents

So, of the overall number of personal pronouns used in all my saved academic papers (5,410), 81.6% were “he” and only 18.4% were “she”. That’s some bias.

In philosophy it’s a little more balanced. Of the overall personal pronouns (366), 68.9% were “he” and 31.1% were “she”. However, I should add that a few of my stored papers concern the knowledge argument, a thought experiment that heavily involves a fictitious neuroscientist named “Mary”, so frequent references to her might account for the inflated “she” figure.
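For the curious, a count like this is easy to reproduce. Here’s a rough sketch in Python – assuming the papers have already been extracted to plain text, which my PDFs obviously weren’t – showing the kind of tally involved. One wrinkle worth noting: a naive substring search would count the “he” inside every “she” (and “the”), so whole-word matching matters.

```python
import re

# Whole-word, case-insensitive patterns: \b stops "she", "the" and
# "here" from being miscounted as instances of "he".
HE = re.compile(r"\bhe\b", re.IGNORECASE)
SHE = re.compile(r"\bshe\b", re.IGNORECASE)

def pronoun_counts(texts):
    """Return (he_total, he_docs, she_total, she_docs) for a list of
    document strings: total instances, and documents containing each."""
    he_total = she_total = he_docs = she_docs = 0
    for text in texts:
        he = len(HE.findall(text))
        she = len(SHE.findall(text))
        he_total += he
        she_total += she
        he_docs += 1 if he else 0
        she_docs += 1 if she else 0
    return he_total, he_docs, she_total, she_docs

# Toy corpus standing in for the real repository of papers:
papers = ["He said she said. He left.", "She argued the point."]
print(pronoun_counts(papers))  # (2, 1, 2, 2)
```

The same function run over the real corpus would yield the totals and document counts reported above.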

I should also add that while many of my stored papers were authored in the last decade, I have a fair number authored prior to 1970, and one would imagine there’d be less awareness of personal pronouns in those authors’ minds.

What does all this mean? Well, I’d like to know. One experiment I haven’t seen done is whether a balanced use of personal pronouns has any impact at all on gender perceptions or gender equality in academia.

I suspect the reason that balanced usage of personal pronouns became an issue at all was because of a social constructionist notion that our language shapes the world around us. Thus, usage of the word “he” to the exclusion of “she” actively contributed to making our world more male-dominated.

As it happens, social constructionism is a thesis to which I do not subscribe, and I suspect many others in academia also hold reservations about the theory. However, I’d be very interested to see some experimental results testing the hypothesis that usage of personal pronouns influences the way we perceive gender equality in academia.

In the meantime, I suggest we might hedge our bets: male lead authors could always use “she” where a pronoun is required that isn’t referring to a specific individual; and female lead authors could always use “he”.

Then, we’ll have parity on the day we have the same number of male as female academics publishing papers. And until that day, if social constructionism is correct, we’ll be influencing social reality in such a way as to encourage more women in academia. And if social constructionism isn’t correct, at least we have a simple model that doesn’t rely on our poor randomisation abilities.

Breaking News: The Author Lives

Reports of the death of the author have been greatly exaggerated. (How suitable to quote one of the all-time greatest authors in this context.) It still boggles my mind (which, I admit, is easily boggled) that anyone could have subscribed to the notion that a piece of work is only loosely associated with its author. But that’s what arch-post-modernist Roland Barthes posited.

Who wrote these? Who cares? Barthes doesn't!

But then again, it stands to reason that Barthes would suggest such a theory. He was a strong proponent of language being a fickle and malleable medium, and one that cannot be unshackled from experience or culture, whether that be the culture of the author or the reader. He was also writing in a charged time, with structuralism giving way to post-structuralism courtesy of Derrida’s deconstructionism.

Oops. Look at what I just did… I just interpreted Barthes’ work by considering the context in which the author wrote it. Silly me.

But turning post-modernism on itself isn’t the only way to reveal its deep banalities – which disciples of po-mo somehow manage to hurl about while keeping a straight face. Cold hard science also weighs in to deconstruct deconstructionism.

Apparently not only does the author matter, but their genes as well. This is not necessarily new stuff. Just two posts back I linked to a fascinating book that takes a Darwinian approach to literature. But what is interesting is the angle this research takes in suggesting that literature “could continually condition society so that we fight against base impulses and work in a cooperative way.”

While I don’t doubt that literature does serve to encourage deep-rooted notions of egalitarianism, I do wonder whether this interpretation might be arse-backwards. Is it literature that encouraged the spread of altruistic genes? Or was it the presence of altruistic genes that encouraged the authoring of literature with pro-social themes?

I suspect it’s the latter. But I only think that because of my personal beliefs about evolutionary psychology and social science. (Barthes apologists can ignore that last sentence.)