Where I live, near Sydney’s Eastern beaches, we have something we call mud. But it is not proper mud. Sure, it marks clothes, necessitating a wash, but is otherwise unremarkable.
I am currently in North Devon in England, where it has drizzled on every one of the six days since I arrived here.
Here they have PROPER mud.
It is sticky yet somehow also slippery. It makes squelchy noises when you tread in it. I went jogging beside a canal in Stratford-upon-Avon and, after the path ended, there was only grass interspersed with boggy bits of naked mud. I did my best to skirt around them, but that wasn’t possible in all cases and once or twice I had to actually tread in the mud. It looked very precarious, especially with the freezing canal only a metre away, ready to gobble me up if I slipped. As it happened, I only slipped once, falling away from the canal rather than towards it. My glove is now coloured by a souvenir of that special mud.
Sydney mud has too much sand in it.
English mud seems much more fertile. Left undisturbed, mud is quickly recolonised by vegetation. Wherever I see mud I also see lush, fertile grass or close-packed, flourishing vegetable crops just nearby.
In that sense, mud is great – it is life.
But it is also death. The Great War, at least on the Western Front, seemed to be all about mud. It contaminated water to help spread disease, afflicted soldiers with Trench Foot, and it trapped horses, gun carriages and even soldiers, who could find themselves stuck in a position that was exposed to opposition gunfire. The bodies of soldiers killed by machine-gun fire, where they weren’t suspended on rolls of barbed wire, were soon partly or wholly swallowed up by mud. It was part of the horror.
There is something extraordinary about European mud. It is mythical. That’s why they write songs about it.
Perhaps it’s not just European mud. The song I had in mind was ‘Mud, Mud, Glorious Mud’, which I find is called ‘the Hippopotamus Song’ and is by Flanders and Swann (I had always assumed it was supposed to be a song for pigs). But a bit of web-searching informs me there are also lots of songs about American mud, although mostly in the South.
As the spine-chilling Maori Haka says, ‘It is death it is death it is life it is life…’ (‘Ka mate Ka mate Ka ora Ka ora’). One can imagine that life on Earth first originated in mud. There is so much richness in good mud, it would be difficult for life not to arise.
I realise that may not be scientific. I think a key reason why mud is so fertile is that it’s mostly organic, made up of lots of decayed organisms, animal and vegetable. So maybe they didn’t have mud on Earth before life arose, what with there being no decayed organisms around. Who knows, there was nobody there to write it down.
But even if life didn’t originate in mud, I bet that, once originated, it did an awful lot of evolving in there. We humans evolved in Africa, so I bet they have really impressive mud there. I’ve only been to North Africa, which is sandy, like Sydney, so I have never encountered real, proper African mud, of the sort that Joseph Conrad describes in Heart of Darkness.
If you are fortunate enough to live in a place where they have proper mud, I urge you to give it a thought next time you get caught in or covered by it, rather than just cursing as you head for the washing machine. Mud is magical stuff. No wonder little children love playing in it.
Woolacombe, North Devon, December 2017
It’s a busy time, the end of choir practice. It’s 9:05pm and I haven’t had my dinner. I need to put away chairs, don my cycling safety gear, unlock the bike and whiz back home to look for something to eat. Busy, busy, busy. So one is distracted, right?
And I found myself singing that famous line from the Hallelujah chorus
‘and She shall reign for ever and ever’
Why were you singing that, Andrew? I hear you ask.
Well the chorus had been the last thing we were practising and, you have to admit (if you’ve ever heard it) that it’s very catchy. No wonder that GF Handel was the Andrew Lloyd Webber of his day.
No, not that, you respond – I mean, why ‘She’? Don’t you know that the official lyric says ‘He’?
Well actually yes, I do know that, which is why I was a little surprised to find that my subconscious mind, after deciding to make me sing that song, had also decided to make me sing ‘She’. I don’t know why it did. I think it may be because there’s a lovely alliteration in ‘She shall….’ that you don’t get with ‘He’.
But then I thought to myself, as I strode in a purposeful and manly manner towards my bicycle, why not She? Where does it say that God has a sex, and that it is masculine?
Now I know what you’re thinking: the Bible and the Quran are both full of He this, He that, Father this and Lord the other. That’s true, but you ask any theologically sophisticated Christian or Muslim whether God has gonads and I’m pretty sure they’ll say ‘Of course not!’ God is much too big and impressive, not to mention invulnerable, to have a collection of soft, funny-looking, easily damaged organs dangling annoyingly between his legs.
I think there are two reasons why male pronouns and nouns are used to refer to God in the scriptures of Middle-Eastern religions (Christianity, Islam, Judaism), both of which are to do with cultural traditions and have no theological basis.
The first is that ancient Middle-Eastern cultures, like almost all other old cultures, including English and American, are patriarchal and use masculine pronouns in all cases except where the person being referred to is definitely female. All sorts of interesting reasons for this can be discussed but, whatever the reason, we cannot doubt that that is the practice. In a sense, ‘He’ is just the way of saying ‘She or He’ in that language tradition. In the modern, progressive parts of the world, we are working to undo those traditions, because of their toxic effect on sexual equality. But that’s a modern phenomenon that occurred centuries after the King James Bible, let alone the original versions written in the period 950 BCE – 150 CE (the seventh century CE for the Quran).
The second reason is more specific. In those patriarchal cultures, it was assumed that a figure of authority must be male. Yahweh / Allah was the ultimate Boss, so It was described as male, as the notion of a female boss would have just been too incomprehensible – and unacceptable – to consumers of the stories.
Neither of these reasons retains any validity in modern, Western society, so there is no reason to perpetuate the implication of masculinity that was adopted at the time of writing. In fact, there are good reasons to actively overturn that implication, as just another undesirable plank in the ugly edifice of male dominance.
There is one other reason that was suggested to me by a Roman Catholic friend, and it is more concrete. It is that Jesus was a man. Let’s accept for now the biblical narrative that there was a single man called Jesus of Nazareth, on whom the gospel stories are based, and whose body housed the incarnate spirit of God. Then the worldly container for the spirit of God did indeed have an XY chromosomal pattern, testicles and a penis. But why should that make us think that the immaterial spirit that pre-existed that body, and survived it, also has those things? We are told that Jesus had a beard. Does that mean that the spirit also has a beard?
If God’s plan was to incarnate as a human and preach an important message, It had three options for a body in which to incarnate: as a man, as a woman, or as a human of indeterminate sex. In Palestine in 30 CE, only one of them had any chance of success. Nobody would have taken a woman seriously, and someone of indeterminate sex would likely have been put to death for a perceived infraction of God’s laws. So the choice of Christ (the part of God’s spirit that is said to have incarnated as Jesus) to incarnate as a man was simply an expedient, and says nothing about the sex of Christ.
Christians pray to Christ – the spirit – rather than to Jesus, even though they may say Jesus because it sounds more friendly. Jesus was the incarnated man, and he only existed for about thirty years. It is Christ that the religion says is eternally in heaven, and to whom a Christian prays. And there is nothing to credibly suggest that Christ has a sex.
Are there any other reasons why God should have a sex?
I can’t think of any that aren’t completely silly. One that immediately comes to mind is that God is The Boss, and bosses are more often than not male (although personally I have been fortunate to have had at least as many female as male bosses in my work career, and there is no doubt about who wields the power in the reasonably-happy home I inhabit). We’ve already dealt with that.
Another is that God is portrayed as a Father. But again, the intent of this metaphor (metaphor because It’s not really a father – there is no divine sperm involved) is to convey that God has the same loving, guiding, protective relationship to us that a parent typically has to their child. The scripture writers just wrote Father rather than Mother or Parent because of the language conventions mentioned above.
Any more reasons? No, I’m afraid I can’t think of any.
On the other side, there are excellent theological reasons against attributing a sex to God.
According to 1 John 4:8, God is Love. Does love have a specific sex? No.
According to John 1:1 God is The Word. Do words have a sex? No.
According to the influential theologian Paul Tillich, God is the Undifferentiated Ground of Being. Do Grounds of Being have a sex (provided we don’t differentiate them!)? No.
According to St Thomas Aquinas, God is Pure Actuality. If we distill Actuality until it is pure, does it acquire a sex? No.
According to St Augustine, God is Goodness Itself. Does Goodness have a sex? No.
I can tell you don’t want me to go on, so I won’t.
Right, now that we’re all agreed that God has no sex, what are we going to do about the fact that nearly all the words written and spoken about God attribute masculinity to It?
This is my plan. Please listen carefully.
From now on, whether you believe in God or not, in every reference you make to God that is in a context where use of a sexed pronoun is natural, I want you to use the female form.
As you are all intelligent and attentive readers, you naturally understand that this is not because I think God has a sex and that sex is female. Rather it is that, even if this idea went viral, it would have no hope of balancing out the enormous number of references to God as male that are out there. So we’ll keep on at this until God references achieve sexual parity, and then we’ll think about what to do next. This is not, as Alan Jones or Donald Trump might claim, ‘playing with words’ or ‘political correctness gone mad’. It’s just using sensible language that recognises that women and men are equally human and equally capable of anything except for a very few sex-specific activities such as fertilising an ovum or gestating a baby human. It’s a step that subverts the subtle message that only a man can be a person of power and wisdom. It’s a small but meaningful step in the project of gradually dismantling millennia of male dominance and oppression. And who better to lead such a step than the religions that have historically been – and unfortunately in some cases still are – platforms for those that seek to perpetuate that dominance.
So, if you please, it’ll be:
‘Our Mother who art in Heaven….’, in The Dame’s Prayer.
‘And She shall reign for ever and ever….’
Jesus is the Son of Woman (note the preservation of the word Son for Jesus, on account of the real-life testicles on the body used for Christ’s incarnation).
‘And She looked down on Her creation, and saw that it was good’.
The hymns will need reworking too:
‘Hail Redeemer Queen divine’
‘Queen of Queens, and Dame of Dames’
Everything, except specific references to the body of Jesus of Nazareth, has to go, and be replaced by its feminine equivalent.
What nice, friendly, inclusive places churches will become when this is adopted. I would happily visit them and sing along to ‘God rest ye merry (gentle)women’ in a spirit of ecumenical solidarity.
I don’t want to pick unfairly on Middle-Eastern religions, even though, being by far the most powerful ones, they can take it. So let’s pause to consider the others.
Non-Middle-Eastern religions seem to generally be less patriarchal than the Middle-Eastern ones. There are powerful goddesses in Indian, Egyptian, Native American, Norse, Greek and Roman religions. But in all cases the boss of the gods is male. Apparently there were, through the twentieth century, groups of scholars who believed that ancient religions such as druidism worshipped an Earth Mother type deity as their main focus, but these beliefs have fallen into disfavour in academia, and start to look more like the wishful thinking of survivors of the sixties’ Peace and Love generation than historically accurate accounts. The only well-known religions – ancient or modern – in which the most powerful being is female are neopagan religions such as Wicca. Well good for them, I say. But they are a very small minority, and the male dominance of the other religions I mentioned at least lets the Middle-Eastern triumvirate that currently dominates the world off the hook a little.
But Andrew, you protest, you are not a practising Christian, or a Muslim, or a Jew, so why should you care what words they use to talk about their gods?
You make a fair point, dear reader. The religions towards which I feel the greatest affinity are Buddhism and Vedanta, neither of which have any connection with the Middle East. But although I am not a Christian, Christianity has a major effect on my daily life and the lives of those around me, through the enormous influence that Christian power-brokers have on our laws and social customs. So it is in my interest, and in the interest of anybody that wishes for a kinder society, for the average Christian, as well as the power hierarchies of the various Christian sects, to become more consultative and compassionate. I think the religion becoming less male-dominated and male-oriented would help in moving along the road towards that goal.
And the same applies to Judaism and Islam. While their influences are minor where I live, there are parts of the world where their influence is intense. The people living in those regions would greatly benefit from those religions shedding some of their patriarchal orientation, and where better to start than by ceasing to pretend that God is a bloke?
Bondi Junction, December 2017
There are two lovely trees in my garden. It seems to be my life’s remaining mission to try to save these trees from the attempts of the local possums to kill them. They do this by climbing into the trees and systematically stripping off buds and young leaves, leaving branch after branch a grim, ravaged skeleton.
The war has been running for several years now. It is not always the same possum, but it is usually only one at a time. Possums are territorial and defend their territory against interloper possums with vigour. I don’t know what has led to the two or three changes of the possum guard we have had since the war began. Perhaps the incumbent died. Or perhaps they were ousted by a more powerful interloper. While the actors may change, the role – the archetype – remains constant. ‘The possum’ is the unitary force of darkness that seeks to turn my garden to a barren, Mordorish wasteland, regardless of which particular possum is playing the role in any given year.
There are now two trees that the possum does its best to kill, but at first it was only one tree – an Albizia or ‘Silk Tree’ – a deciduous tree with a spreading canopy of lovely, silky-soft fern-like fronds.
They remind me of the Truffula tufts in Dr Seuss’s story of The Lorax, who tried to save the Truffula trees from the greedy depredations of the Once-ler. The Albizia was planted about twelve years ago to replace a tree that had died; it grew quickly and flourished for several years, delighting us with its beauty and its cool shade in summer. But then the possum arrived!
But did the possum arrive, like the greedy, destructive Once-ler in Seuss’s book? Was it a possum explorer that ‘discovered’ this ‘uninhabited’ territory in the same way that the plaque below Captain Cook’s statue in Hyde Park claims that he ‘discovered’ this territory of Australia? Did the possum claim the territory for his possum king, thereby instantly erasing the rights of any existing residents, just as Cook did? Or was the possum there all along, never bothering about the Albizia until one day it munched some as an experiment and discovered how delicious it was? Were we ever peaceful cohabitants? I don’t know. But whatever the genesis, the war started a few years back when I realised one day that it was December and the Albizia had no leaves. It had always been late waking from its winter slumber, so I had never been concerned when it lagged other deciduous plants. But December? I thought maybe it had some disease, and took some of the skeletal fronds – stripped of greenery – to a garden shop, where they told me it was either possums or rats. I did the experiment they suggested (leave cloths around the base of the tree overnight, then examine them for droppings the next day) and confirmed that it was a possum – they have larger droppings than rats.
I was devastated. They say you don’t realise how much you love until the subject of your love is threatened. Well, I realised.
What followed was a series of all sorts of strategies to protect the tree. I won’t bore you with the details, but the list includes spraying repellent on the leaves, hanging camphor balls from branches, installing a motion-activated ultrasonic noise-maker, winding fairy lights over the branches, putting various types of plastic spikes on branches and the nearby fence, leaving a powerful, timer-activated floodlight pointed at the tree and even shrouding possum-accessible parts of the tree in clear, flexible perspex that a possum cannot grip. At times the tree has looked more like a missile bunker than a beautiful piece of nature.
The most successful strategies have been the floodlight and the perspex sheeting. But neither seems sustainable to me. Although it’s a very energy-efficient LED, the floodlight draws 50 watts for eight to ten hours a night, depending on the time of year – almost half a kilowatt hour per day. That may not sound like much to you but to a radical greenie like me that feels like treason to my most dearly-held principles. As for the perspex sheeting, the trouble is that it acts as a sail in the wind, putting enormous strains on the connection points and the branches when we get our ‘Southerly Busters’ that bring blasts of welcome cool air from the Southern Ocean at the end of some scorching summer days. After particularly windy days there is often repair work to be done, and I feel bad that the loud noise the sails make may disturb our neighbours.
Further, every now and then, even those Best Practice strategies fail. A new possum takes up residence that is less afraid of light, or the possum works out a sneaky way around the perspex barricades. I am always in search of a solution that doesn’t have such loopholes, or the high maintenance of the sails or the energy consumption of the light.
In war the most important weapon is information. My primary source of information is an infrared camera that I mount on a tripod and which wirelessly transmits to a base station that records video. When I suspect the possum is breaching the defences, I deploy the camera overnight and review the footage in the morning to see if it has broken through and if so how. I then use that information to work out how to plug the gap.
Reviewing the video in the morning is an angst-ridden experience. You watch hours of nothing happening, at 32 times fast-forward speed, then suddenly the possum creeps into view, its eyes glowing in the infrared like a demon’s. It contemplates the tree, tries this approach – blocked, then that – blocked again. On a good day it goes away defeated. But on a bad day it tries something new and by some unbelievable feat of gymnastic agility manages to get a claw hold on some part of the tree’s wood and wrestle its way up into the canopy. Once there it proceeds to massacre the tree at its leisure. It’s like watching CCTV of a bully beating up a dear friend, with no help in sight, and you, the viewer, helpless to intervene because it has all already happened. Words cannot do justice to the sick feeling I get in my stomach when that happens. But I have to force myself to watch the torture, second by miserable second, because only by doing so can I hope to learn how to prevent its recurrence.
Which brings me to the topic of this essay – obsession. My family and friends chuckle about me and my war. I can well understand that it appears as a monomaniacal obsession – the sort of thing that arty fiction is written about. But what sort of obsession is it? There are two great works of fiction that in my mind compete to represent my battle: Moby Dick and The Fourth Wish.
You probably know the story of Moby Dick, even if you haven’t read all of it (I never made it past about page 100). The story is of Captain Ahab who, having had his leg bitten off by a huge white sperm whale, spends the rest of his life pursuing the whale around the seas, obsessed with obtaining his revenge, and is (spoiler alert!) eventually killed by the whale in a final battle. Ahab’s obsession is beyond all reason. It consumes him, when he could have led a perfectly enjoyable and prosperous life as a ship’s captain. Is The Possum my Moby Dick?
The Fourth Wish is quite different. It is a three-part TV drama from 1974, remade as a movie in 1976, about a single father whose school-aged son is dying of leukaemia. The father asks his son to make three wishes, and then does all he can to make the wishes come true before the child dies. Fulfilling those wishes for his son is the father’s obsession (the father is played by the late John Meillon, who became much more famous subsequently for his role as Walter in the Crocodile Dundee movies). He is an inarticulate, emotionally repressed, not terribly capable person. But he rises to the occasion in his desperate quest to make his son’s limited remaining life memorable and fulfilling. It is the first television or movie drama I can ever remember having been moved by – I would have been eleven at the time. Meillon does a marvellous job of conveying the father’s tremendous sadness.
My story is of the tree and the possum and me. If we focus on the enmity with the possum, it parallels Moby Dick. In Moby Dick there is no counterpart to the tree – no character that needs protecting. On the other hand if we focus on the nurturing side, it is like The Fourth Wish. In that case there is no counterpart to the possum, unless we anthropomorphise the leukaemia and cast it in that enemy role.
Having written all that, and read it over, I think it is more Fourth Wish than Moby Dick. Ahab’s obsession was founded in hatred and revenge. I don’t hate the possum, even though it is my enemy. I expect I would kill it if I could, but humanely, and as a regrettable necessity, certainly not as revenge. The creature is only trying to live. If it would agree to go easy on the Albizia I would readily forgive its past savageries. I’m sure we could become great friends. But alas, it will not. It is not the possum’s fault that it doesn’t have the foresight to spread its foraging between many different trees in order that all of them may flourish. I feel great sorrow when I see the tree’s ravaged limbs. I so want to do something to help it, yet I feel as though I am up against implacable odds.
The story of the second tree is similar to that of the first. A red gum, it was planted about three years ago as a mature sapling, and very soon grew and flourished. It has almost tripled in height. But a few months ago I noticed that its canopy of leaves – formerly rich and luxuriant – was looking thin and sickly. Going up closer to look, I saw that many branches had had most of their leaves bitten off – the tell-tale chewed stubs bearing testament to what had happened. When I deployed the camera overnight, my fears were confirmed – it was the possum. From then on I knew that I had two patients to protect from the predator, where before there had been only one.
There is another story that this saga reminds me of – Bram Stoker’s Dracula.
In that novel, Lucy Westenra, a vivacious young friend of the narrator’s fiancée, is vulnerable to the hypnotic spell of Count Dracula, who is able to lure her in the middle of the night, in a sleepwalking trance, out of her heavily protected house to the wild lands that surround it, where he feasts on her blood. Each time that happens she loses masses of blood and gets weaker, and her friends don’t know how to help her. They save Lucy’s life several times after she has suffered otherwise fatal blood loss, by giving her transfusions of their own blood. Each time they do this, they take further measures to try to keep Dracula away and Lucy safely in her room. But repeatedly, after Lucy has recovered for a few days and is starting to look healthy again, Dracula finds a new way to lure her out, and she is found near death’s door again. Finally, this happens one too many times, and she dies (sort of, but to say any more would be a spoiler).
It may seem a little melodramatic to cast the possum as Dracula, and my beloved trees as a Victorian heroine, but when I see those eyes suddenly appear out of the dark, glowing like demonic coals in the infra-red image of the CCTV, it doesn’t feel far-fetched at all. By trying every strategy I have, I manage to keep the demon at bay for a few days, so the tree can recover and grow a few new leaves (so it can breathe! – just like we need blood to carry our oxygen). Then, one night, the possum gets into the tree, savages it and I find it in the morning at death’s door again.
Here’s a picture of the possum, demonic coal-like eyes and all, posing grimly atop the almost-cadaver of its victim.
The war continues. I have bought another floodlight. Now I am using 90 watts when both lights are on – one for each tree. I will have to buy carbon credits to offset the electricity. I have also devised new defences involving longer spikes arrayed along the top of the fence, in an attempt to deny the possum a launching pad. If they turn out to be effective, maybe I can turn the lights back off so that I won’t be single-handedly responsible for pushing global warming beyond the point of no return. Time will tell.
Bondi Junction, October 2017
I try hard to be open-minded. I think I succeed at that reasonably well, but I still regularly get surprised at the discovery of a prejudice I didn’t know I had.
I don’t know whether it’s possible to rid one’s self of all prejudice – I suspect it’s not. If so, the best I can aim for is to be on the alert for prejudices, try to rid myself of them when I discover them, and try to always remember that any opinion I have – regardless of how carefully thought out it may seem – may be inextricably tied up with some prejudice I don’t yet realise I have.
The Wikipedia article on Cognitive Biases has a very long list of them. With so many opportunities to go wrong, it’s hard to imagine one can escape all of them.
There’s a popular phrase: ‘It’s good to have an open mind, but not so open that your brain falls out’. I don’t like that phrase at all. It is most commonly used by bigots in an attempt to defend their bigotry while at the same time appearing rational. Nobody’s brain has ever fallen out from being open-minded, either literally or metaphorically. However, the rankest nonsense phrases often have a grain of truth in them, and there is a grain of truth even in that one. It is that, in order to achieve anything with our thoughts, we need a framework within which they can operate, and that framework will be made of rules and suppositions that are accepted without evidence. I agree that we need such a framework, but what is crucial is that we acknowledge the existence of the framework, that it has no supporting evidence, and that we hence have no basis on which to claim it is better than any other framework. That doesn’t mean we should refuse to act on conclusions drawn within our framework. But it does mean that (in my opinion, which was derived within my mental framework!) it is a good idea to regularly examine and challenge our framework, and consider alternatives. Sometimes that may lead to a radical change in worldview, which opens up whole new vistas.
It may lead to a Christian becoming a Buddhist, or vice versa. It may lead to a Socialist becoming a Libertarian, or vice versa. It may even (heaven forfend!) lead to personnel exchanges between Platonism and Existentialism. I may have my own preferences about which of those and other sets of ideas people align themselves with but, regardless of the outcome of any migrations of beliefs, I see it as good that people regularly examine their beliefs, so that belief migration becomes a commonplace possibility. If we know what our prejudices are, we have the power to change them. But we cannot change a prejudice we don’t even know we have.
Here are two of my prejudices. The first is that it is preferable for there to be less suffering in the world. I know it’s a prejudice. I know I can’t prove it. But I’m going to hang onto it, for now at least.
The second prejudice is that if I have observed two phenomena to occur in close conjunction many, many times then, in the absence of strong reasons to the contrary, I should expect them to continue to occur in conjunction in future. Every self-supporting person on Earth has this prejudice. But nobody even realised it was a prejudice until David Hume pointed it out in the eighteenth century – his famous ‘Problem of Induction’. If you don’t believe me, think of how you use language. You speak English to somebody – say it’s Bertha – expecting them to understand it, because they have understood English when you spoke it to them in the past. But why should the fact that Bertha has always understood spoken English in the past indicate anything at all about whether she will understand it in the future? You might object that you know that Bertha learnt English as a child, so you know she knows English. But then you are relying on the association between the events ‘X has learned English’ and ‘X understands English’, which has been reliably observed in the past – but why should that tell us anything about whether it will be observed in the future? Whatever objection is raised, I (or rather David Hume) can find an answer to it. But I’m still going to hang on to this prejudice.
Prejudice in Music
I had been thinking over this in the context of musical styles. It’s hard to think of any other human activity the study of whose history is so riddled with the use of the word ‘shocking’. The most casual observer probably knows how Rap was considered shocking when it emerged in the eighties, ditto Punk in the seventies, how Rock n Roll was considered shocking when it emerged in the fifties, and how Jazz was considered shocking in the early twentieth century.
But the history of people being shocked by music goes back much farther than that. The history of classical music in particular is regularly punctuated by shocks when some innovator broke hallowed rules. Working back in time we have Schoenberg, Stravinsky, Debussy, Wagner, Beethoven, Haydn and Monteverdi as major disruptors of established musical conventions.
The following story from a radio music presenter made a big impression on me. They told of how they had been working in the archives of a classical music operation, listening to, classifying and cataloguing recordings. After doing this for a few weeks they walked past a studio where music was playing over the loudspeaker. Appalled at the terrible, disorganised racket they were hearing, they asked somebody what the noise was. It was JS Bach! [For the non-classical music buff, JS Bach was a genius who lived from 1685 to 1750, in the ‘Baroque’ period, and is as revered a part of the musical establishment as it is possible to be.] The reason it sounded so terrible and formless was that the music the presenter had been listening to non-stop for the previous few weeks was all pre-Baroque, and hence operated within a framework of rules and norms that Bach’s music ‘broke’. If they had heard it a few weeks earlier they would likely have thought ‘how lovely!’ or maybe even ‘that’s a bit old-fashioned!’
I want to pick on Schoenberg, because on the face of it he might seem to go as far as one can go in breaking rules. The Austrian composer Arnold Schoenberg rebelled against the tyranny of tunes having to be in a musical key, like G major or A minor. Although only classical pieces tend to state their key, with names like ‘String Quartet in E flat Major’, nearly all pieces have one. Perhaps the most famous song of all, Lennon and McCartney’s ‘Yesterday’, could have been called ‘Sad Song in F major’. Key changes do occur within a piece, but they have a big effect, because we become attached to the key in which the tune is set. That’s why key changes are often used towards the end of a song to build up the levels of excitement and energy towards a final climax.
Schoenberg’s project was to refuse to use any key at all, not even for one phrase at a time. To do that he invented his ‘Twelve-tone system’, in which every one of the twelve notes of the chromatic scale must be used exactly once, in a fixed ordering called a ‘tone row’, before any of them recurs. By giving every one of the twelve possible notes equal status, he prevented any note gaining prominence as the ‘Tonic’, the home note of a key. Unlike another famous Austrian, Schoenberg was very anti-racist: he wanted the black piano keys to get as much opportunity as the white piano keys in his pieces (note that’s a different use of the word ‘key’).
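The rule lends itself to a quick sketch in code. Here is a minimal Python illustration (the function name `random_tone_row` is mine, not Schoenberg’s): a tone row is simply a permutation of the twelve pitch classes, so every note appears exactly once before any repeats.

```python
import random

# The twelve pitch classes of the chromatic scale (black and white keys alike).
PITCH_CLASSES = ["C", "C#", "D", "D#", "E", "F",
                 "F#", "G", "G#", "A", "A#", "B"]

def random_tone_row(seed=None):
    """Return a tone row: a permutation using each pitch class exactly once."""
    rng = random.Random(seed)
    row = PITCH_CLASSES[:]
    rng.shuffle(row)
    return row

row = random_tone_row(seed=1)
```

Because the row is a permutation, no pitch class can recur often enough to dominate as a tonic – which is exactly the levelling effect described above.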
Here, from YouTube, is a Schoenberg piano piece using his twelve-tone system, for you to enjoy.
As I was musing over whether Schoenberg had achieved the ultimate in open-mindedness, I suddenly spotted a hidden prejudice. Sure, he had proclaimed equality between all twelve notes. But a note is defined by a frequency – vibrations per second. So the number of possible notes is infinite, not only twelve per octave: between any two different notes there are infinitely many frequencies. In Western music, which is descended from Ancient Greek music, the smallest interval between two notes is a semitone, which means the ratio of the two frequencies is 2^(1/12), or about 1.06.
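In equal temperament each semitone multiplies the frequency by this same factor, so twelve semitone steps double the frequency exactly – one octave. A quick check in Python (using 440 Hz concert A as the reference, my choice of example):

```python
# One semitone in twelve-tone equal temperament: twelve of these steps
# multiply the frequency by exactly 2 (one octave).
SEMITONE_RATIO = 2 ** (1 / 12)

a4 = 440.0                       # concert A
a5 = a4 * SEMITONE_RATIO ** 12   # twelve semitones up: one octave, ~880 Hz

print(round(SEMITONE_RATIO, 4))  # 1.0595
```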
In classical Indian music, the twelve Western notes are used, plus ten others, called ‘nadas’, giving twenty-two altogether, so that the average gap between adjacent permissible notes is just over a quarter-tone. A piece containing nadas sounds, to a Western ear not trained to understand those extra notes, as if it were being performed on an out-of-tune instrument.
Here is a scale that goes up an octave in twenty-four quarter-tone steps (not exactly the same as an Indian scale, but closer to that than to a Western scale), then walks back down again. What does it sound like to you?
For comparison, here is the same thing using just the twelve notes in the Western scale.
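Both scales are built from equal steps, so they can be sketched with one Python function (the middle-C base frequency of 261.63 Hz is my illustrative choice; 24 steps per octave gives the quarter-tone scale, 12 the chromatic one):

```python
def scale_up_and_down(base_hz, steps_per_octave):
    """Frequencies for an equal-step scale up one octave, then back down."""
    step = 2 ** (1 / steps_per_octave)
    up = [base_hz * step ** i for i in range(steps_per_octave + 1)]
    return up + up[-2::-1]  # walk back down without repeating the top note

quarter_tone = scale_up_and_down(261.63, 24)  # 24 quarter-tone steps
chromatic = scale_up_and_down(261.63, 12)     # the 12 Western semitones
```

In both cases the top of the scale is exactly double the base frequency; only the number of stops along the way differs.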
Try to sing or hum along, first to the chromatic scale, and then to the quarter-tone scale. I’m a reasonably accurate singer and can do the first, but can’t even get started on the second.
But even writing music that includes nadas or quarter-tones still involves a prejudice against the in-between notes. It’s just a smaller prejudice than the one Westerners like me have. I expect a piece involving eighth-tone intervals would sound just as weird to an Indian ear as one using quarter-tones does to us.
If we want to write music that is free from all prejudice, we need to go beyond Schoenberg, beyond Indian music, beyond even eighth tones, and write music in which each note can be any frequency at all, without limiting the choice to notes that are certain multiples and ratios of others.
I wrote a piece of such music. To be precise, I programmed a computer to randomly generate a series of frequencies and note-lengths and produce notes using those. I then produced another version of it, in which each note was rounded to the nearest semitone, so that only the twelve notes in the Western scale were used.
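The procedure just described can be sketched roughly as follows. This is a Python sketch under my own assumptions about ranges and durations – the original program’s details aren’t given – but it shows both steps: free random frequencies, then the same notes rounded to the nearest semitone.

```python
import math
import random

def random_piece(n_notes, low_hz=110.0, high_hz=1760.0, seed=0):
    """Generate (frequency, duration) pairs with no scale constraint at all."""
    rng = random.Random(seed)
    return [(rng.uniform(low_hz, high_hz), rng.uniform(0.2, 1.0))
            for _ in range(n_notes)]

def snap_to_semitone(freq_hz, ref_hz=440.0):
    """Round a frequency to the nearest 12-tone equal-temperament note."""
    semitones = round(12 * math.log2(freq_hz / ref_hz))
    return ref_hz * 2 ** (semitones / 12)

free_version = random_piece(20)
western_version = [(snap_to_semitone(f), d) for f, d in free_version]
```

The second version uses only the twelve Western notes per octave; the first is free to land anywhere between them.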
Can you tell which is which? They are both weird. Both break most of the rules we are used to. But one is a bit weirder, a bit more free, than the other.
Click here to find out which one is which.
The above is a long way from JS Bach, but is it free from any form of musical prejudice (aka structure)? No. For a start I have constrained the notes to be within the audible frequency range, even though it is entirely conceivable that notes that we cannot consciously hear may still have an effect on our body and thereby alter the sensory experience. I have also constrained the notes to not be very short or very long, in order not to frighten or bore the listener. The volume is also constant, rather than varying between notes, or even within notes. The shape of each sound wave is a perfect sine curve, whereas the wave shape could be allowed to change between and within notes too. That would not change the ‘tune’ but it would change the texture (‘timbre’ in musician-speak). I expect there are other prejudices in there that I have not yet realised.
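For instance, the constant-volume, pure-sine constraint mentioned above amounts to rendering every note the same way – something like this Python sketch (the sample rate and amplitude are my illustrative choices):

```python
import math

def sine_note(freq_hz, duration_s, sample_rate=44100, amplitude=0.5):
    """Render one note as raw samples: a pure sine wave at constant volume."""
    n_samples = int(duration_s * sample_rate)
    return [amplitude * math.sin(2 * math.pi * freq_hz * i / sample_rate)
            for i in range(n_samples)]

samples = sine_note(440.0, 0.5)  # half a second of concert A
```

Letting the amplitude vary over time, or mixing in other wave shapes, would change the timbre without changing the ‘tune’ – the further freedoms described above.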
I like most of my prejudices. I prefer Bach to Schoenberg most days of the week. But it’s good to challenge oneself every now and again with a bit of Schoenberg (or its equivalent), and then to occasionally challenge the Schoenberg with something even more radical.
Bondi Junction, July 2017
I was jogging on the beach, trying to think of something else, because the last couple of days had been rather upsetting. I settled on thinking about an essay I am trying to write about The End of The World. Very soon I found that I had the REM song It’s the End of the World as We Know It running through my head on repeat.
After a while I noticed somebody running along next to the concrete promenade, where the sand is softest because it is furthest from the water and almost never gets wet from the sea. The sand was pretty soft where I was, about halfway between the promenade and the water. But maybe it was softer over near that other guy. In any case, we’d had heaps of rain recently, so if water makes sand pack together harder, presumably where I was would be just as water-hardened as next to the promenade.
But then maybe seawater has a different effect. Perhaps it makes the sand stick together better than rainwater does. If so then the sand next to the promenade really would be softer, unless the sea ever gets up to there.
That led to me wondering about whether, in the wildest sorts of weather, the sea ever came all the way up to the concrete wall below the promenade (about fifty metres from the high tide mark).
Thinking of stormy weather made me think of the scene in the movie The French Lieutenant’s Woman where the female lead stands at the end of a long jetty in a storm, only a metre or two above the rough sea – a precarious position, deeply evocative.
That led me to wonder whether it is sexist to refer to the character as somebody’s ‘woman’, thereby seeming to suggest ownership. That led to my thinking about the reverse phrase ‘somebody’s man’, which led me to think of the Tammy Wynette song Stand by Your Man.
And without any conscious decision to do so, there I was, jogging along the beach, mentally humming Stand by Your Man instead of It’s the End of the World as We Know It.
Bondi Junction, April 2017
Featured image is from the 1981 movie The French Lieutenant’s Woman, showing the jetty called ‘The Cobb’ at Lyme Regis, UK.
One day the sun will grow so large that it will first desiccate, then bake, then engulf and vaporise the Earth and everything on it. No life will survive that. Perhaps some people will have escaped to habitable places in other solar systems, but it’s hard to imagine it would be many, given the enormous energy likely to be involved in any interstellar travel. I expect ordinary people will be unable to escape.
Even escapees will be wiped out eventually, as the universe, many billions of years from now, slides inexorably into heat death. No life will survive that.
So there it is: the end of the world is a matter of when, not if. We are powerless to prevent it.
That background makes it a bit confusing to work out what moral obligation we have to take actions that prevent a near-term end of the world, and to avoid actions that would hasten it.
If we are talking about preventing the end of the world in our lifetime, the question is easier to resolve, because that end would affect people who are alive now, and most people recognise that they have at least some obligation of care to other people who cohabit the world with them.
But that obligation is less widely accepted when it comes to future generations, and the farther away those generations are, the fewer people tend to feel an obligation towards them. Politicians sometimes talk about intergenerational equity and caring for the future of our children, maybe even our grandchildren. But it’s a rare politician that argues for a policy on the basis of its effect on our great-great-great-great-great-great-great-great-children.
At some stage, life on Earth will come to an end, and it seems likely that that end, unless it occurs in the blink of an eye – which it is hard to imagine happening – will be accompanied by tremendous suffering. If that is inevitable then how can we work out whether it matters whether it occurs sooner or later?
We cannot solve this by reason alone. As David Hume so acutely observed: “’Tis not contrary to reason to prefer the destruction of the whole world to the scratching of my finger.” Of course he wasn’t saying he did prefer the destruction of the world. He was saying that one must look to one’s emotions to find an answer.
Looking to my own emotions, I confess that I am more alarmed at the prospect of the world ending in a catastrophe in 100 years than in 100 million years, despite the fact that I will not be here to see either.
I use the word ‘catastrophe’ rather than ‘cataclysm’ because I think the end would be lingering and painful. We should be so lucky as to be extinguished in the blink of an eye. Philosophers who have nothing better to do with their time make up thought experiments involving a button you could press to instantaneously end the world, and ask under what circumstances you would press it. But there are no such buttons, nor are there ever likely to be, so we need to contend with the end being a drawn-out, painful process. I suspect widespread famine would be a major part of it. That would lead to outbreaks of uncontrolled violence as people compete for the dwindling resources of food and water. Disease would spread to accompany the famine – perhaps providing a more merciful end for some. We see this sort of catastrophe already in some parts of the Earth, and we will see it more often as climate change becomes more severe.
Is a ‘soft landing’ possible? What if, realising that the world will become uninhabitable within 200 years, we were to decide that we were morally obliged to not have children, in order not to inflict on those new people the pain of experiencing the world’s slow death? What would a world with no new children be like? Most of us, including me, feel that it would be very sad. I know of two novels that explore this: ‘The Children of Men’ by PD James, and ‘I Who Have Never Known Men’ by Jacqueline Harpman. In the first, for some unknown reason, humans cease to be able to conceive. The novel is set about twenty-six years after the last baby was born. In the second novel, a group of female prisoners escape from their underground dungeon to find the Earth deserted. They wander for many years in vain search of other survivors and after a while start to die of old age, with no replacement.
Both novels are confronting, bleak and sad. The James also has a thriller element to it (which I won’t spoil for you), but the basic premise is still bleak.
It would be very hard for us now to decide ‘No more babies’. Imagine us all gradually dying one by one, deprived of that feeling of continuity – the circle of life – that one gets from seeing younger generations. But what if society had the time to work up to that over several generations? What if, realising that all life would cease within ten generations, society worked to change its culture in order to equip people to feel more positive about non-procreation and less reliant on younger generations? It would be a very difficult psychological shift to accomplish. It would have to counteract the powerful impulse embedded in our psyche by evolution – to perpetuate the species. But who knows what techniques of psychological manipulation humans may have managed to invent in a thousand or more years’ time? Maybe they could condition future humans to find fulfilment in bringing their species in for a soft landing – for instance in working as a childless carer for old people until one becomes too old to work. Things could be set up so that the last remaining people have all the food, water, clothes, medicine, shelter, power and entertainment they need to survive solo (we would also need to train people to be comfortable with isolation, which we current humans are definitely not). They might also be provided with pills to give a painless end to life once they near the point where they can no longer feed themselves. That is not how it happens in ‘The Children of Men’. But that book is set in 2021, not 3021, and with no notice for society to prepare for the landing (for some reason fertility just suddenly ceases in 1994).
If a soft landing were possible then, while an end of the world may be inevitable, its accompaniment by great suffering would not be. It would then become easy to argue for doing what we can to delay the end of the world: doing so simply prevents a great suffering.
What if it’s not possible, so that the great suffering is a matter of ‘when’ rather than ‘if’? What if the amount of suffering accompanying the end of the world will be roughly the same regardless of whether it occurs in 200 years or 200 million years? Are we morally obliged to do what we can to defer it beyond 200 years? I pick 200 years, by the way, because that should be long enough to be fairly certain that nobody currently alive will be around to experience it.
It seems to me that the main difference between the two end dates is all the currently-unconceived humans who would experience life in the intervening 199,999,800 years. Is it a good thing or a bad thing that such lives should come to pass? There is very little moral guidance on this. Even religions have little to say, with only a very few (albeit big, powerful ones) forbidding contraception.
A group with a decisive opinion that is the direct opposite of the anti-contraceptionists is the anti-natalists, led by the prominent South African philosopher David Benatar. Benatar argues that, since all life contains some suffering, it is immoral to create any new life. He does not accept that suffering may be offset by pleasure at other times in a life: even a few moments of mild pain in an otherwise long, happy life makes the creation of that life a moral mistake, in Benatar’s book. Less extreme anti-natalists argue that procreating is OK if we think the new life will have more pleasure than suffering but that, since we can’t be sure, we are obliged not to procreate. A more folksy version of this is the comment uttered at many a late-night D&M discussion, that ‘this is no world to bring an innocent child into’.
Not many people are anti-natalists. Most people, despite the exaggerated doom and gloom on the news – terrorist this and serial-killer that – (of course no mention of the real dangers like climate change, malaria, poverty, road carnage and plutocratic hijack of our democracies) see life as a generally pleasant experience and look positively on conferring it on new humans. But that tends to be a very personal feeling, in which the moral dimension cannot be disentangled from the powerful personal urge to procreate.
For those of us who are neither anti-natalists nor anti-contraceptionists, the question of those lives in the intervening 199,999,800 years remains a mystery to be explored. Is it important that they come to pass? Is it good that they do so?
Lest you decide I sound like a homicidal maniac and ring Homeland Security to have me ‘dealt with’, let me state here that I feel that it is better to do what we can to delay the end of the world. That’s a major factor in why I think action on climate change is the most important issue facing humanity today. But I won’t go into the reasons why in this essay, because this topic will be discussed at my upcoming philosophy club meeting and I want to avoid spoilers. In any case, I’m more interested in what other people think about this.
The dilemma posed by this essay was first raised by Oliver Kirk.
Bondi Junction, April 2017
- What, if any, obligations do we have to unborn generations? Do they include an obligation to ensure their existence?
- Does the nature or strength of the obligation change with the remoteness of the future generation?
- If we accept that the end of humanity will occur, and will be accompanied by great suffering, are we obliged to do what we can to delay it for as many centuries or millennia as possible (taking as agreed that we are obliged to delay it beyond the lifespan of anybody currently alive)?
- If we do feel obliged to delay, does that imply an obligation to maximise the population of the Earth, subject to being able to maintain adequate living standards?
- How do I feel about the fact that a time will come when there is no more life? Does it strip life of meaning? Or does it enhance meaning? Or neither?
- How would I feel about a world in which human reproduction became impossible?
- Do I feel differently about the world ending in 200 years from how I feel about it ending in 200 million years?
- What implications do our opinions on the above have for the stance we should take on current future-oriented issues like climate change, balancing government budgets, infrastructure building, asteroid mapping and solar flare prediction?
PD James: ‘The Children of Men’
Jacqueline Harpman: ‘I Who Have Never Known Men’ (‘Moi qui n’ai pas connu les hommes’)
Peter Singer: ‘Practical Ethics’. Discussion of obligations to future generations on pp. 108–118 of the Third Edition (2011, Cambridge University Press).