Queen Elizabeth II’s Platinum Jubilee

Queen Elizabeth II was born Elizabeth Alexandra Mary on 21 April 1926 and is the present Queen of the United Kingdom and fourteen other Commonwealth realms. She was born in Mayfair, London, the first child of the Duke and Duchess of York. In 1936 the duke acceded to the throne as King George VI upon the abdication of his brother, King Edward VIII, making Elizabeth the heir presumptive; the duchess was Queen consort of the United Kingdom and the Dominions of the British Commonwealth from 11 December 1936 to 6 February 1952 and, after her husband’s death, was known as ‘Queen Elizabeth the Queen Mother’ to avoid confusion with her daughter. Elizabeth was educated privately at home and began to undertake public duties during the Second World War, serving in the Auxiliary Territorial Service (ATS). In November 1947 she married Philip Mountbatten, a former prince of Greece and Denmark, and their marriage lasted 73 years until Philip’s death at Windsor Castle on the morning of 9 April 2021, at the age of 99, just two months before his 100th birthday. They had four children: Charles, Prince of Wales; Anne, Princess Royal; Prince Andrew, Duke of York; and Prince Edward, Earl of Wessex. When her father died in February 1952, Elizabeth, then 25 years old, became Queen regnant of seven independent Commonwealth countries as well as Head of the Commonwealth. Significant events of her reign include her coronation in 1953 and the celebrations of her Silver, Golden and Diamond Jubilees. To celebrate her Platinum Jubilee this year there will be an extra bank holiday, and the usual Spring bank holiday has been moved from the end of May to the start of June to create a four-day Jubilee bank holiday weekend from Thursday 2 June to Sunday 5 June.

Her Majesty the Queen during her visit in 2015 to HMS Ocean in Devonport at a ceremony to rededicate the ship.

Elizabeth Alexandra Mary was born on 21 April 1926, during the reign of her paternal grandfather, King George V. Her father, the Duke of York (later King George VI), was the second son of the King. Her mother, the Duchess of York, was the youngest daughter of the Scottish aristocrat Claude Bowes-Lyon, 14th Earl of Strathmore and Kinghorne. She was named Elizabeth after her mother, Alexandra after her paternal great-grandmother, who had died six months earlier, and Mary after her paternal grandmother. Called ‘Lilibet’ by her close family, based on what she called herself at first, she was cherished by her grandfather, George V, whom she affectionately called ‘Grandpa England’, and during his serious illness in 1929 her regular visits were credited in the popular press and by later biographers with raising his spirits and aiding his recovery. Elizabeth’s only sibling, Princess Margaret, was born in 1930. The two princesses were educated at home under the supervision of their mother and their governess, with lessons concentrating on history, language, literature and music. During her grandfather’s reign Elizabeth was third in the line of succession to the British throne, behind her uncle Edward and her father, so although her birth generated public interest she was not expected to become queen, as Edward was still young and likely to marry and have children of his own, who would have preceded Elizabeth in the line of succession. When her grandfather died in 1936 and her uncle succeeded as Edward VIII, she became second in line to the throne, after her father. Later that year Edward abdicated, following his proposed marriage to the divorced socialite Wallis Simpson, which had provoked a constitutional crisis. As a result Elizabeth’s father became king, taking the regnal name George VI, and since Elizabeth had no brothers she became heir presumptive. Elizabeth’s parents toured Australia and New Zealand in 1927, and in 1939 they toured Canada and the United States, but Elizabeth remained in Britain, since her father thought her too young to undertake public tours. She ‘looked tearful’ as her parents departed; they corresponded regularly, and she and her parents made the first royal transatlantic telephone call on 18 May.

HRH Princess Elizabeth in ATS uniform, April 1945.

In September 1939, Britain entered the Second World War. Lord Hailsham suggested that Princesses Elizabeth and Margaret should be evacuated to Canada to avoid the frequent aerial bombings of London by the Luftwaffe, but this was rejected by their mother, who declared, “The children won’t go without me. I won’t leave without the King. And the King will never leave.” In 1940, the 14-year-old Elizabeth made her first radio broadcast during the BBC’s Children’s Hour, addressing other children who had been evacuated from the cities. She stated: “We are trying to do all we can to help our gallant sailors, soldiers and airmen, and we are trying, too, to bear our own share of the danger and sadness of war. We know, every one of us, that in the end all will be well”. In 1943, Elizabeth undertook her first solo public appearance on a visit to the Grenadier Guards, of which she had been appointed colonel the previous year. As she approached her 18th birthday, Parliament changed the law so that she could act as one of five Counsellors of State, the senior members of the British royal family to whom the monarch can delegate (and from whom the monarch can revoke) royal functions through letters patent under the Great Seal, to prevent delay or difficulty in the dispatch of public business in the case of the monarch’s illness (except total incapacity) or intended or actual absence from the United Kingdom. This was done to provide for the event of her father’s incapacity or absence abroad, such as his visit to Italy in July 1944. In February 1945, she was appointed as an honorary second subaltern in the ATS; she trained as a driver and mechanic, and five months later was given the rank of honorary junior commander. Then on Victory in Europe (VE) Day, Elizabeth and Margaret mingled incognito with the celebrating crowds in the streets of London. Elizabeth said later in a rare interview, “We asked my parents if we could go out and see for ourselves. I remember we were terrified of being recognised… I remember lines of unknown people linking arms and walking down Whitehall, all of us just swept along on a tide of happiness and relief”. She went on her first overseas tour in 1947, accompanying her parents through southern Africa. In a broadcast to the British Commonwealth on her 21st birthday, she made the following pledge: “I declare before you all that my whole life, whether it be long or short, shall be devoted to your service and the service of our great imperial family to which we all belong.”

Posing for photographs at Buckingham Palace with new husband Philip after their wedding, in 1947.

Elizabeth met her future husband, Prince Philip of Greece and Denmark, in 1934 and again in 1937. After another meeting at the Royal Naval College in Dartmouth in July 1939, Elizabeth, though only thirteen years old, fell in love with Philip, and they began to exchange letters. She was 21 when their engagement was officially announced on 9 July 1947. The engagement was not without controversy, however, as Philip had no financial standing, was foreign-born (though a British subject who had served in the Royal Navy throughout the Second World War) and had sisters who had married German noblemen with Nazi links. Some biographies reported that Elizabeth’s mother initially had reservations about the union and teased Philip, but in later life the Queen Mother told a biographer that Philip was “an English gentleman”. Before the marriage, Philip renounced his Greek and Danish titles, officially converted from Greek Orthodoxy to Anglicanism and adopted the name of Lieutenant Philip Mountbatten, taking the surname of his mother’s British family. Just before the wedding, he was created Duke of Edinburgh and granted the style ‘His Royal Highness’. Elizabeth and Philip were married on 20 November 1947 at Westminster Abbey and received 2,500 wedding gifts from around the world. Elizabeth gave birth to her first child, Prince Charles, on 14 November 1948; one month earlier, the King had issued letters patent allowing her children to use the style and title of a royal prince or princess, to which they would not otherwise have been entitled, as their father was no longer a royal prince. Their second child, Princess Anne, was born in 1950. Following their wedding, the couple leased Windlesham Moor near Windsor Castle until July 1949, when they took up residence at Clarence House in London. At various times between 1949 and 1951, the Duke of Edinburgh was stationed in the British Crown Colony of Malta as a serving Royal Navy officer, and he and Elizabeth lived intermittently in Malta for several months at a time in the rented home of Philip’s uncle, Lord Mountbatten, whilst their children remained in Britain.

Coronation portrait of Elizabeth II with husband Philip in 1953.

King George VI’s health declined during 1951, and Elizabeth frequently stood in for him at public events. In October 1951, when she toured Canada and visited President Harry S. Truman in Washington, D.C., her private secretary carried a draft accession declaration in case the King died whilst she was away. In early 1952, Elizabeth and Philip set out for a tour of Australia and New Zealand by way of Kenya, and on 6 February 1952 they had just returned to their Kenyan home after a night spent at the Treetops hotel when word arrived of the death of the King and, consequently, of Elizabeth’s immediate accession to the throne. Philip broke the news to the new queen. She chose to retain Elizabeth as her regnal name and was therefore called Elizabeth II, which offended many Scots, as she was the first Elizabeth to rule in Scotland. She was proclaimed queen throughout her realms, the royal party hastily returned to the United Kingdom, and she and the Duke of Edinburgh moved into Buckingham Palace. As a result of her accession, it seemed probable the royal house would bear the Duke of Edinburgh’s name, in line with the custom of a wife taking her husband’s surname on marriage. The Duke’s uncle, Lord Mountbatten, advocated the name ‘House of Mountbatten’, and Philip suggested ‘House of Edinburgh’, after his ducal title. However, the British Prime Minister, Winston Churchill, and Elizabeth’s grandmother, Queen Mary, favoured the retention of the House of Windsor, and so on 9 April 1952 Elizabeth issued a declaration that ‘Windsor’ would continue to be the name of the royal house. The Duke complained, “I am the only man in the country not allowed to give his name to his own children”. In 1960, some years after the death of Queen Mary in 1953 and the resignation of Churchill in 1955, the surname Mountbatten-Windsor was adopted for Philip and Elizabeth’s male-line descendants who do not carry royal titles. Despite the death of Queen Mary on 24 March 1953, the planned coronation on 2 June that year went ahead, as Mary had asked before she died. The ceremony in Westminster Abbey, with the exception of the anointing and communion, was televised for the first time. At her instruction, Elizabeth’s coronation gown was embroidered with the floral emblems of Commonwealth countries. Elizabeth gave birth to her third child, Prince Andrew, in 1960, the first birth to a reigning British monarch since 1857. Her fourth child, Prince Edward, was born in 1964.

In 1977, Elizabeth marked the Silver Jubilee of her accession. Parties and events took place throughout the Commonwealth, many coinciding with her associated national and Commonwealth tours, and the celebrations re-affirmed the Queen’s popularity. But it was in a speech on 24 November 1992, to mark the Ruby Jubilee of her accession, that Elizabeth called 1992 her ‘annus horribilis’, or horrible year. Republican feeling in Britain had risen because of press estimates of the Queen’s private wealth, which were contradicted by the Palace, and reports of affairs and strained marriages amongst her extended family. In March of that year her second son, Prince Andrew, and his wife Sarah had separated, and in April her daughter, Princess Anne, had divorced Captain Mark Phillips. Then in November a large fire broke out at Windsor Castle, one of her official residences. The monarchy came under increased criticism and public scrutiny. In an unusually personal speech, the Queen said that any institution must expect criticism, but suggested it be done with “a touch of humour, gentleness and understanding”. On the eve of the new millennium, the Queen and the Duke of Edinburgh boarded a vessel from Southwark, bound for the Millennium Dome. Before passing under Tower Bridge, the Queen lit the National Millennium Beacon in the Pool of London using a laser torch, and shortly before midnight she officially opened the Dome. During the singing of Auld Lang Syne, the Queen held hands with the Duke and the British Prime Minister, Tony Blair. In 2002 the Queen marked her Golden Jubilee, the 50th anniversary of her accession. Her sister and mother had died in February and March respectively, and the media speculated on whether the Jubilee would be a success or a failure. She again undertook an extensive tour of her realms, beginning in Jamaica in February, where she called the farewell banquet “memorable” after a power cut plunged the King’s House, the official residence of the governor-general, into darkness. As in 1977, there were a great many street parties and commemorative events, and monuments were named to honour the occasion. A million people attended each day of the three-day main Jubilee celebration in London, and the enthusiasm shown for the Queen by the public was greater than many journalists had anticipated.

Visiting Birmingham in July 2012 as part of the Diamond Jubilee tour.

The Queen’s Diamond Jubilee in 2012 marked sixty years on the throne, and celebrations were held throughout her realms, the wider Commonwealth and beyond. She and her husband undertook an extensive tour of the United Kingdom, whilst her children and grandchildren embarked on royal tours of other Commonwealth states on her behalf. On 4 June, Jubilee beacons were lit around the world. During a tour of Manchester as part of her Jubilee celebrations, the Queen made a surprise appearance at a wedding party at Manchester Town Hall, which made international headlines. In November, the Queen and her husband celebrated their Sapphire (65th) wedding anniversary, and on 18 December she became the first British sovereign to attend a peacetime cabinet meeting since George III in 1781. The Queen, who had opened the 1976 Summer Olympics in Montreal, also opened the 2012 Summer Olympics and Paralympics in London, making her the first head of state to open two Olympic Games in two different countries. For the London Olympics, she played herself in a short film shown as part of the opening ceremony, alongside actor Daniel Craig as James Bond, and on 4 April 2013 she received an honorary BAFTA for her patronage of the film industry and was called “the most memorable Bond girl yet” at the award ceremony.

Official opening of the Borders Railway in 2015.

The Queen became the longest-reigning British monarch on 9 September 2015, the day she opened the Borders Railway, and in her speech that day she said she had never aspired to achieve that milestone. She had already surpassed her great-great-grandmother, Queen Victoria, on 21 December 2007 to become the longest-lived British monarch, and in 2015 she also became the longest-reigning queen regnant and the longest-reigning female head of state in the world. She became the oldest current monarch after King Abdullah of Saudi Arabia died on 23 January 2015, the longest-reigning current monarch and the longest-serving current head of state following the death of King Bhumibol of Thailand on 13 October 2016, and the oldest current head of state on the resignation of Robert Mugabe on 21 November 2017. On 6 February 2017 she became the first British monarch to commemorate a Sapphire Jubilee, and on 20 November that year she was the first British monarch to celebrate a Platinum wedding anniversary. Philip had retired from his official duties as the Queen’s consort in August 2017.

A virtual meeting in 2021 with Dame Cindy Kiro during the Covid-19 pandemic.

On 19 March 2020, as the Covid-19 pandemic hit the United Kingdom, the Queen moved to Windsor Castle and remained there in isolation as a precaution. All public engagements were cancelled and Windsor Castle followed a strict sanitary protocol. On 5 April, in a televised broadcast watched by an estimated 24 million viewers in the UK, she asked people to “take comfort that while we may have more still to endure, better days will return; we will be with our friends again; we will be with our families again; we will meet again.” And on 8 May, the 75th anniversary of VE Day, in a TV broadcast at 9:00pm (the exact time at which her father George VI had broadcast to the nation on the same day in 1945), she asked people to “never give up, never despair”. In October, she visited the UK’s Defence Science and Technology Laboratory in Wiltshire, her first public engagement since the start of the pandemic. On 4 November, she appeared masked in public for the first time, during a private pilgrimage to the Tomb of the Unknown Warrior at Westminster Abbey to mark the centenary of the warrior’s burial. Prince Philip died on 9 April 2021 after 73 years of marriage, making Elizabeth the first British monarch to reign as a widow or widower since Queen Victoria. She was reportedly at her husband’s bedside when he died, and remarked in private that his death had “left a huge void”. Due to the restrictions in place in England at the time, the Queen sat alone at Philip’s funeral service, which evoked sympathy from people around the world. In her Christmas broadcast that year, she paid a personal tribute to her “beloved Philip”, saying, “That mischievous, inquiring twinkle was as bright at the end as when I first set eyes on him”. The Queen’s Platinum Jubilee began on 6 February 2022, marking 70 years since she acceded to the throne on her father’s death. On the eve of that date she held a reception at Sandringham House for pensioners, local Women’s Institute members and charity volunteers. In her Accession Day message, Elizabeth renewed the commitment to a lifetime of public service which she had originally made in 1947. The Queen does not intend to abdicate, although Prince Charles has begun to take on more of her duties as she grows older and carries out fewer public engagements. The popularity of the monarchy remains high during the Jubilee, as a poll conducted in March 2022 revealed.

Personal flag of Elizabeth II.

With all that is going on at this time to mark the Queen’s Platinum Jubilee, I wanted to find some particular words to write, perhaps about all that has happened to us all in the last seventy years, and I have found the following. It was written by a good friend and I copy it with grateful thanks.
“The wisdom that I learn as getting old is to be simple: accept mistakes if I do something wrong, although it gives me severe pain at the moment, spend time reflecting on it, and ask for forgiveness, instead of giving hundreds of psychological excuses to myself and others which sometimes numbs our moral sense. I rather found this gives me much freedom not lingering around the memories from the past but letting me fully live in the present moment. The advancement of psychology seems to help our understanding of human behaviours better, but the basics do not age.”

I end this week with a favourite of mine.
“We are all visitors to this time, this place. We are just passing through. Our
purpose here is to observe, to learn, to grow, to love… and then we return home.” ~ Australian Aboriginal Proverb


This Human Life

I find it fascinating (yes, a favourite word of mine) to consider ourselves and how we as humans have grown in knowledge and understanding for as long as we have existed. But in relative terms, that isn’t very long at all. I remember being told that attempting to imagine the Universe as a whole is simply impossible for us to do. I tend to agree, as I saw an item on the Internet the other day where someone was asking how big the largest thing in the Universe is, as well as how small the smallest is. The answer given was that the biggest thing in the universe which scientists have discovered so far is a supercluster of galaxies called the ‘Hercules-Corona Borealis Great Wall’ and the smallest thing in the universe is a particle called a ‘quark’. In fact the Hercules-Corona Borealis Great Wall is considered to be so wide that light itself would take about 10 billion years to cross the structure, and the amazing thing is that our universe itself is only 13.8 billion years old. A quark, meanwhile, is a type of elementary particle and a fundamental constituent of matter; quarks are amongst the smallest particles in the universe and carry only fractional electric charges. Scientists have a good idea of how quarks make up things called hadrons, which in particle physics are composite, subatomic particles (such as protons or neutrons) made of two or more quarks held together by the strong interaction. They are rather like molecules, which are held together by the electric force, except that the properties of individual quarks have been difficult to ‘tease out’ because they cannot be observed outside of their respective hadrons! Whew – too much for me, I guess that’s one for someone doing a science degree. When I was at school I wasn’t by any means brilliant, but as I have said before I had what was called ‘an enquiring mind’. I still do. In the past couple of years I have had the great opportunity of chatting to medical students at the Leicester University Medical School, and one day when I attended a session with them I saw a sign outside one building which really made me chuckle. It is this one.

It’s not rocket science…

It just goes to show that there are many things which we could never imagine ourselves doing, yet years later we find that we have achieved them. It is also, to me at least, of great importance that as we grow, we share. There are, sadly, many people in the world today who want to be great but at the expense, as well as to the detriment, of others. Such people think nothing of taking things from others, whether it be money, knowledge or skills, but they will give absolutely nothing back of themselves. Or, if they see someone else doing well, they attempt to take credit for the other person’s achievements. That, in my humble opinion, is very wrong. My father was an excellent teacher, and he went on to be the deputy headmaster of an infant/junior school. My mother worked in a few different places; when she left school she was insistent on not working in a local factory in London as many of her schoolmates did, and she wanted, and did, work in the offices of W.H. Smith, where she met my dad. When circumstances moved the family from London to Peterborough, mum looked after us children and later worked at a local solicitor’s, then the local town hall. They weren’t highly skilled jobs that she had, but they were vital ones nevertheless. Then, when retirement came along, mum and dad had several years together travelling; they particularly liked Jersey and Guernsey, and having been there myself, I can understand why. Sadly my dad got cancer, as he smoked a fair bit, just as so many people did in those days. It is all part of life, and I think we should do our best to learn from each and every experience, the good and the not so good. To me it isn’t ‘bad’, it is just what it is. I have mentioned how being left-handed meant that my writing was not good. But I found that by angling the paper, or whatever it was that I was writing on, to around forty-five degrees, I could see the words I was writing without smudging what I had written. At work I found others who also wrote left-handed; one man even wrote in such a way that it looked like he was writing backwards, but it most definitely worked for him. I have said previously how computers were of benefit to me in all sorts of ways, and still are. I was learning, learning all the time about new things, seeing others with new ideas and at times seeing how they could be adapted in new or simply different ways. It was, and is, to me all part of life, how things should be. But then I saw another question that had been posted on the Internet, which was “What is the purpose of learning how the Universe works?”. This really caught my attention and piqued my interest, as I have always wanted, and been encouraged, to learn, to develop and to understand new things. The following is what my research has found.

This is part of what is called ‘disinterested inquiry’, and the synonym for this is of course ‘science’. Learning is a virtue, and virtue is its own reward. That is its purpose. At one time, astronomy was concerned purely with ‘where’ the stars were, and not ‘what’ they were. In the early 1900s, George Ellery Hale, Director of Mount Wilson Observatory in California, U.S.A., insisted on installing a physics laboratory in the facility, and the publication from the observatory was titled ‘Astro-Physics’, though it was soon changed to ‘Astrophysics’. Albert Einstein formulated and published his General Theory of Relativity in 1916, and working at Mount Wilson in the 1920s, Edwin Hubble discovered that the Milky Way galaxy was not the whole universe and, in 1929, that the universe was expanding, as Einstein’s theory allowed. As a matter of interest, the Mount Wilson Observatory really is an important astronomical facility in Southern California, with historic 60-inch (1,524mm) and 100-inch (2,540mm) telescopes, and 60-foot (18.3m) and 150-foot (45.7m) solar towers. Also located there is the newer Centre for High Angular Resolution Astronomy (CHARA) array, an ‘optical interferometer’, and it is there that the technique of interferometry is used; this uses the interference of superimposed waves, typically electromagnetic waves, to extract information. It is an important investigative technique in the fields of astronomy, fibre optics, engineering, metrology, oceanography, seismology and many other sciences too numerous to mention. It is even used in the making of holograms. The array consists of six 1-metre (40-inch) telescopes operating as an astronomical interferometer. Construction was completed in 2003 and the array is owned and run by Georgia State University (GSU). It does important interferometric stellar research. The summit of Mount Wilson is at 5,710 feet (1,740m) and whilst not the tallest peak in its vicinity, it is high enough in elevation that snow can sometimes interrupt astronomical activities on the mountain. All of the mountains south of the summit are far shorter, leading to unobstructed views across the Los Angeles Basin, Orange County, San Diego and the Pacific Ocean. At such an elevation the horizon over the ocean extends 92 miles (148km). Mount Wilson is also heavily used for relay broadcasting of both radio and television for the Greater Los Angeles Area. But back to the history lesson. In the early twentieth century a plethora of physicists were figuring out atoms and quantum mechanics, considering what the smallest things in the universe were. Discoveries were made, and when more is known, things happen. The Apollo Moon Project was pure applied astronomy, going to the Moon to obtain samples of rock. As a result of the need for navigation, the small, compact computer which could operate continuously in real time was invented. You are reading this on a descendant of that development, but consider for a moment, if you will, the Apollo 11 mission in 1969, which was the first to land men on the Moon. On board that spacecraft was a computer called the Apollo Guidance Computer (AGC), which had a Random Access Memory (RAM) of just four kilobytes (4KB) alongside around 72KB of fixed, read-only ‘core rope’ memory; it had no hard disk at all. That 4KB of RAM amounted to just 2,048 ‘words’ of memory which could be used to store temporary results during flight. Since then, the most obvious advances have been in computing and electronics, especially in reducing size.
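Out of curiosity, the arithmetic behind those memory figures is easy to check. Here is a minimal sketch in Python, assuming the commonly quoted specification of 16-bit words, 2,048 words of erasable memory and 36,864 words of fixed ‘core rope’ memory (those word counts are my assumption, not figures given in the article I read):

    # Back-of-envelope check of the Apollo Guidance Computer memory sizes.
    # Assumed figures: 16-bit (2-byte) words, 2,048 erasable words and
    # 36,864 fixed "core rope" words.
    WORD_BYTES = 2
    erasable_kb = 2_048 * WORD_BYTES / 1024
    fixed_kb = 36_864 * WORD_BYTES / 1024
    print(f"Erasable (RAM): {erasable_kb:.0f}KB")  # prints 4KB
    print(f"Fixed (ROM): {fixed_kb:.0f}KB")        # prints 72KB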
My very first computer was a Sinclair ZX81, which I purchased in 1981; it had just 1K of RAM and no hard disk. To use it, I had to tune one of the channels on my portable television to its output, and programs were either manually typed in each time or saved onto a cassette tape and then reloaded from that. I spent hours and hours copying programs from computer magazines, being careful to put in all the letters, numbers and other symbols exactly as they were printed, only to find that the program wouldn’t run because of a typing error in the magazine, which I (and many others!) only found out about in the following week’s edition! But it passed the time and I learned a great deal about computer programming. Then Sir Clive Sinclair sold his business to Amstrad, and bigger, better computers were made with larger memory, built-in storage and finally disk drives. Manufacturers realised there was a gap in the market and made ‘home’ computers, although they were poor compared to the ones we have nowadays. For a number of years I soldiered on with my Amstrad/Sinclair computer, but eventually I bought a more modern computer. Sadly none of the programs I had would work on that new one, although many years later some very clever folk found a way of making those old programs run properly on modern machines by writing ‘emulators’ for the old hardware!

By now mobile telephones were getting popular, and they were getting smaller too. I bought what was an ‘electronic diary’, a separate, hand-held computer that worked very well. I was still working in a sales office of British Telecom, and the engineers were going around fitting, installing and repairing telephone equipment. A few years later I learned that these engineers had been issued with the very same electronic diaries as part of their work! I wish I had known… However, technology continued to move on, and after a while I learned about a combined unit which had a diary, a calendar, email and a camera all built into one! Over the past few years I have upgraded that unit, moving over to one made by Apple, and I am happy with it. But it is a very far cry from that first computer the astronauts used on Apollo 11. We have had bigger and better spacecraft, the space shuttle as well as craft launched out into deep space that take many years even to get as far as Jupiter or Saturn. But research is continuous, and at the other end of the ‘size’ spectrum, research with the giant synchrotrons at CERN in Europe goes on, and they continually learn more about the basic building blocks of the universe. At one point CERN was generating so much data that it was difficult to manage, so in 1989 an enterprising physicist there, Tim Berners-Lee, proposed to his supervisor what he called ‘a distributed information system’, and it was up and running by the end of 1990. Today we call it the World Wide Web. The point of continuous research is that, as some economists estimated around 1990, as much as 80% of the wealth of nations comes from the support of science research, any science. Knowing how the universe works gives us an idea about what to actually expect from life, our existence and reality. Also, knowing what to expect lets you know how to plan and how to prepare; it lets you focus on what matters, and ignore or dismiss that which does not. Knowing helps us to grow, as the more you know, the stronger you are when faced with claims and efforts to compel using false thoughts and ideas. To my mind, knowledge is not the opposite of ignorance, as no-one can know it all; in fact we simply don’t know what we don’t know. But knowledge is certainly the path to getting there, as well as to recognising when you have arrived. So the purpose of learning how the universe works is the joy of inquiry, and the benefits are the wealth of nations. That is surely a good thing. The only other thing we should also remember, in my view, is our faith, no matter what our race, colour or creed may be.

This week… a personal tale.
I expect my dad told my mum this and it must have amused both of them, as you will see when you read on!

I was quite young when I began singing in our local church choir, and on Sundays I would sit in the stalls with the other choir members. I would listen to the vicar preach the sermon, but being young I didn’t always understand exactly what was being said. I did my best. At school we were taught basic reading, writing and arithmetic, and at home we listened to the radio. One of my elder brothers had a small, battery-powered transistor radio, and this fascinated me, so I tried to learn all about these little radios along with the transistors that were in them. So it was a surprise to me one Sunday when our elderly vicar said what sounded to me like “The changes and chances of this transistory life”. Because I knew transistors had only recently been invented, there was something not quite right, so I asked (as was my way) my dear dad, who quietly and kindly pointed out that what the vicar had actually said was “The changes and chances of this transitory life”, and it was nothing to do with transistors! Dad also got me to look up the word ‘transitory’ in the dictionary, learning as I did that it meant “not permanent; tending to pass away”. I learned…


Superstitions

So a week ago, Friday the 13th came and went around the world. To many it would have been an ordinary day, whilst others may have almost feared it. I have heard about some folk not wanting to even get out of bed, for fear of something ‘bad’ happening to them. Many of us have our own ways, our own peculiarities, perhaps eccentricities, even foibles. Incidentally, the latter word can also mean the part of a sword blade from the middle to the point.

In fact a superstition is defined as any belief or practice considered to be irrational or supernatural, attributed to fate or magic, perceived supernatural influence or fear of that which is unknown. It is commonly applied to beliefs and practices surrounding luck, amulets, astrology, fortune-telling, spirits and certain paranormal entities, more particularly the belief that future events can be foretold by specific and apparently unrelated prior events. Equally, the word ‘superstition’ is often used to refer to a religion not practised by the majority of a given society, regardless of whether the prevailing religion contains alleged superstitions or not. The Oxford English Dictionary (OED) defines superstition as ‘a religious belief or practice considered to be irrational, unfounded, or based on fear or ignorance; excessively credulous belief in and reverence for the supernatural’, as well as ‘a widely held but irrational belief in supernatural influences, especially as leading to good or bad luck, or a practice based on such a belief’. The Oxford Advanced Learner’s Dictionary defines superstition as ‘the belief that particular events happen in a way that cannot be explained by reason or science; the belief that particular events bring good or bad luck’. According to Merriam-Webster’s dictionary, it is ‘a belief or practice resulting from ignorance, fear of the unknown, trust in magic or chance, or a false conception of causation’. Meanwhile, the Cambridge Dictionary denotes superstition as ‘a belief that is connected with old ideas about magic etc., without grounding in human reason or scientific knowledge’. The dictionary cites the Cambridge English Corpus to show the term in context, where ‘superstition’ might be used to label controversial beliefs, the practices of confessional opponents or the beliefs of the ignorant masses. Different authors have attempted to categorise different superstitions, and one even gave time a category, noting the observances of various ones such as dog days, Egyptian days (which, in Europe during the Middle Ages, were certain days of the year held to be unlucky), year prognoses and lunar timings, as well as signs that might carry significance, like particular animal behaviours (the call of birds, the neighing of horses or the sighting of comets) and dreams. But identifying something as a ‘superstition’ is considered by many to be somewhat pejorative, or seen with contempt, and such items are commonly referred to as folklore. Webster’s ‘The Encyclopaedia of Superstitions’ points out that whilst many superstitions are related to religion, people have always judged one another’s beliefs subjectively, and people of one belief are likely to call people of another belief superstitious. Constantine regarded paganism as a superstition, whilst on the other hand Tacitus regarded Christianity as a pernicious superstition. Both Paul the Apostle and Martin Luther (10 November 1483 – 18 February 1546) perceived anything that was not centred on Christ to be superstitious. Whilst the formation of the Latin word ‘superstitio’ is clear, from the verb ‘superstare’, i.e. to stand over, stand upon, or survive, its original intended sense is less clear.
It can be interpreted as ‘standing over a thing in amazement or awe’, but other possibilities have been suggested, for example the sense of excess, such as over-scrupulousness or over-ceremoniousness in the performing of religious rites, or else the survival of old, irrational religious habits. The earliest known use of the word as a noun is found in written works by Plautus, Ennius and later by Pliny, with the meaning of ‘art of divination’. From its use in the Classical Latin of Livy and Ovid it is used in the pejorative sense that it holds today in relation to an excessive fear of the gods or unreasonable religious belief, as opposed to the proper and reasonable awe of the gods. However Cicero derived the term from ‘superstitiosi’, literally ‘those who are left over’, meaning survivors or descendants, connecting it to excessive anxiety of parents in hoping that their children would survive them to perform their necessary funerary rites.

Greek and Roman polytheists (those with a belief in multiple deities, usually assembled into a pantheon of gods and goddesses along with their own religious sects and rituals) modelled their relations with the gods on political and social terms, and scorned the man who constantly trembled with fear at the thought of the gods, as a slave feared a cruel and capricious master. Such fear of the gods was what the Romans considered to be the meaning of superstition. The current Catechism of the Catholic Church considers superstition sinful in the sense that it denotes ‘a perverse excess of religion’, a demonstrated lack of trust in divine providence and a violation of the first of the Ten Commandments. The Catechism is therefore a defence against the accusation that Catholic doctrine is superstitious. In 1948 the behavioural psychologist B. F. Skinner published an article in which he described his pigeons exhibiting what appeared to be superstitious behaviour. One pigeon was making turns in its cage, another would swing its head in a pendulum motion, whilst others displayed a variety of different behaviours. He believed these behaviours were all done ritualistically in an attempt to receive food from a dispenser, even though the dispenser had already been programmed to release food at set time intervals regardless of the pigeons’ actions; he concluded that the pigeons were trying to influence their feeding schedule by performing these actions, and he extended this as a proposition regarding the nature of superstitious behaviour in humans. That was his considered opinion. But some people seem to believe that superstitions influence events by changing the likelihood of currently possible outcomes rather than by creating new possible outcomes. In sporting events, for example, a lucky ritual or object is thought to increase the chance that an athlete will perform at the peak of their ability, rather than increasing their overall ability at that sport. People tend to attribute events to supernatural causes most often under two circumstances. In the first instance, they are more likely to attribute an event to a superstitious cause if it is unlikely than if it is likely; in other words, the more surprising the event, the more likely it is to evoke a supernatural explanation. This is believed to stem from ‘effectance’ motivation, a basic desire to exert control over one’s environment. When no natural cause can explain a situation, attributing an event to a superstitious cause may give people some sense of control and ability to predict what will happen in their environment. In the second, people are more likely to attribute an event to a superstitious cause if it is negative than if it is positive. This is called ‘negative agency bias’; for example, in American baseball, Boston Red Sox fans attributed the failure of their team to win the World Series for 86 years to the ‘Curse of the Bambino’, an alleged curse placed on the team for trading the baseball player Babe Ruth to the New York Yankees so that the team owner could fund a Broadway musical. When the Red Sox finally won the World Series in 2004, however, the team’s success was attributed to the team’s skill and the rebuilding effort of the new owner and general manager. As you might expect, people are more likely to perceive their computer as acting according to its own intentions when it malfunctions than when it functions properly.
However, according to various analysts who study consumer behaviour, superstitions are employed as a heuristic tool, and as a result they can influence a variety of consumer behaviours. These analysts say that, after taking into account a set of antecedents, trait superstitions are predictive of a wide variety of consumer beliefs, like beliefs in astrology or in common negative superstitions, for example the fear of black cats. Additionally, a general proneness to be superstitious may lead to an enduring temperament to gamble, participate in promotional games, invest in stocks, forward superstitious e-mails, keep good-luck charms and exhibit sports fan regalia. But superstition can also be found in politics: the Ancient Greek historian Polybius, in his work ‘The Histories’, used the word ‘superstition’ to explain that in Ancient Rome such beliefs maintained the cohesion of the empire, operating as a means of controlling the masses, in particular to achieve both political and mundane ends.

Boston Red Sox.

In the Classical era, the existence of gods was actively debated amongst both philosophers and theologians, and consequently opposition to superstition arose. The poem ‘De Rerum Natura’, written by the Roman poet and philosopher Lucretius, further developed the opposition to superstition, and Cicero’s work ‘De Natura Deorum’ also had a great influence on the development of the modern concept of superstition, as well as on the word itself. Whereas Cicero distinguished ‘superstitio’ and ‘religio’, Lucretius used only the word ‘religio’; for Cicero, ‘superstitio’ meant excessive fear of the gods, and he believed that only superstition, and not religion, should be abolished. In fact the Roman Empire also made laws condemning those who excited excessive religious fear in others. During the Middle Ages, the idea of God’s influence on the world’s events went mostly undisputed, and trials by ordeal were quite frequent; King Frederick II (1194–1250) was the first king to explicitly outlaw trials by ordeal, as they were considered irrational. The rediscovery of lost classical works and scientific advancement led to a steadily increasing disbelief in superstition, and a new, more rationalistic view began to take hold; opposition to superstition was also central to the Age of Enlightenment. Most superstitions arose over the course of many centuries and were rooted in regional and historical circumstances, such as religious beliefs or the natural environment. For instance, geckos were at one time believed to be of medicinal value in many Asian countries, whilst in China (and now in other countries too) the practice of Feng Shui holds that placement affects fortune, for example that a room in the northwest corner of a house may have very bad energy. Similarly, the number 8 is thought to be a lucky number in China, so much so that it is more common than any other number in the Chinese housing market. Equally, there are certain phrases, and in particular plays, which are considered to bring bad luck; for example it is said that a coven of witches objected to William Shakespeare using real incantations, so they put a curse on that well-known Scottish play. Legend has it the play’s first performance (around 1606) was riddled with disaster, and that the actor playing Lady Macbeth died suddenly, so Shakespeare himself had to take on the part. Prior to a performance, some actors will say “break a leg” in the hope that this will ward off any unlucky events. But there are some customs which at first seem to be without much foundation yet turn out to have quite reasonable origins. At one time, many children were forced to use their right hands for writing, mainly because of a prejudice against the awkwardness of left-handed writing and the prevalence of ‘right-handed’ utensils. Happily, left-handedness is more accepted nowadays, which is all to the good for me personally! And many years ago, the task of shaking hands when greeting someone was done with the right hand because a great many swordsmen wore their sword on the left side of the waist, being right-handed, so that it was easy to draw; by shaking hands with the right hand a person showed openness and trust towards the one they were greeting, not hostility. It is fascinating how these actions have their historical connections, rather than being simply thought of as superstition.

This week… an interesting tale.
The following is an actual question given on a University of Washington chemistry mid-term paper. The answer given by one student was considered so ‘profound’ that the professor shared it with colleagues via the Internet, which is of course why we now have the pleasure of enjoying it as well.

Bonus Question:
Is Hell exothermic (gives off heat) or endothermic (absorbs heat)?
Most of the students wrote proofs of their beliefs using Boyle’s Law (gas cools down when it expands and heats up when it is compressed) or some variant. One student, however, wrote the following:

“First, we need to know how the mass of Hell is changing in time. So we need to know the rate that souls are moving into Hell and the rate they are leaving. I think that we can safely assume that once a soul gets to Hell, it will not leave. Therefore, no souls are leaving.

As for how many souls are entering Hell, let’s look at the different religions that exist in the world today. Most of these religions state that if you are not a member of their religion, you will go to Hell. Since there is more than one of these religions and since people do not belong to more than one religion, we can project that all souls go to Hell.

With birth and death rates as they are, we can expect the number of souls in Hell to increase exponentially. Now, we look at the rate of change of the volume in Hell because Boyle’s Law states that in order for the temperature and pressure in Hell to stay the same, the volume of Hell has to expand proportionately as souls are added.

This gives two possibilities:
1) If Hell is expanding at a slower rate than the rate at which souls enter Hell, then the temperature and pressure in Hell will increase until all Hell breaks loose.
2) If Hell is expanding at a rate faster than the increase of souls in Hell, then the temperature and pressure will drop until Hell freezes over. So which is it?

If we accept the postulate given to me by Teresa during my Freshman year, “…that it will be a cold day in Hell before I sleep with you”, and take into account the fact that I still have not succeeded in having an affair with her, then #2 above cannot be true, and thus I am sure that Hell is exothermic and will not freeze over.”

This student received the only “A”.


The Gold Standard

As part of my research on gold for last week’s blog post, I saw an item on the Internet about how gold had only stopped being used as a gold standard in recent years, so I decided to do a bit of research, because what the writer of the article had said simply didn’t seem right to me. I learned that at the time of London’s first Olympics in 1908, the amount of money in circulation in the UK was tied to the amount of gold in the economy. The gold standard had prevailed for most of the previous two centuries and was to continue until World War I began in 1914. But the UK was not the only country whose monetary system was based on gold. From 1880 to 1914, almost all of the world’s leading economies had followed suit, with each country fixing the price of gold in its local currency. In the UK, the price of one troy ounce of gold was £4 5s 0d (£4.25); in the US it was fixed at $20.67. This implied a fixed exchange rate between the pound sterling and the dollar ($4.87 per £1), and likewise between all the other currencies on the gold standard. To enhance the credibility of the arrangements, authorities guaranteed that paper money was fully convertible into gold, and anyone could request to convert their pounds into the equivalent value of gold. This limited the ability of governments to print money, and the gold standard also stopped countries from deliberately devaluing their own currency in order to improve the competitiveness of their exports or pay off their debts. As a result, membership of the gold standard was seen as a commitment to sound government finance. By constraining the growth in the money supply, the gold standard was also believed to contribute to stable prices. Over long periods this was generally the case, as price levels in the UK were much the same in 1914 as they were in 1880. However, the gold standard’s inflexibility had major disadvantages. Changes in the world’s money supply depended not on economic conditions, but on the amount of new gold that was mined. This meant that on the one hand, monetary policy could not be used to respond to recessions and booms, but on the other, significant rises in gold production would lead to faster money supply growth and ultimately inflation, regardless of a country’s underlying economic conditions. World War I saw the end of the gold standard as governments suspended the convertibility of their currencies into gold in order to freely finance rapidly escalating military expenditure. It was briefly reintroduced in some countries after the War, including the UK from 1925 to 1931, but fell apart again during the Great Depression. After World War II, a form of gold standard under the Bretton Woods system, which involved the dollar being fixed to gold and other currencies being fixed to the dollar, was in operation until 1971. So technically, a ‘gold standard’ is a monetary system in which the standard economic unit of account is based on a fixed quantity of gold. This was the basis for the international monetary system from the 1870s to the early 1920s, from the late 1920s to 1932, and again from 1944 until 1971, when the United States unilaterally terminated the convertibility of the US dollar to gold for foreign central banks, effectively ending the Bretton Woods system, though many states still hold substantial gold reserves.
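Incidentally, those fixed gold prices are what pinned the exchange rates together, and the arithmetic can be checked in a couple of lines. A minimal sketch in Python, using only the two prices quoted above (the small shortfall against the $4.87 parity usually quoted comes from the sterling price being rounded to £4.25):

    # Implied sterling-dollar exchange rate under the classical gold standard.
    uk_gold_price_gbp = 4.25    # pounds per troy ounce (£4 5s 0d)
    us_gold_price_usd = 20.67   # dollars per troy ounce
    parity = us_gold_price_usd / uk_gold_price_gbp
    print(f"Implied rate: ${parity:.2f} per £1")  # prints ~$4.86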
In fact it seems that historically, the silver standard and bimetallism have been more common than the gold standard, and the shift to an international monetary system based on gold reflected accident, network externalities and ‘path dependence’ (a concept in economics and the social sciences referring to processes where past events or decisions constrain later events or decisions). Great Britain accidentally adopted a ‘de facto’ gold standard in 1717 when Sir Isaac Newton, who was then Master of the Royal Mint, set the exchange rate of silver to gold too low, thus causing silver coins to go out of circulation. As Great Britain became the world’s leading financial and commercial power in the 19th century, other states increasingly adopted Britain’s monetary system. The gold standard was largely abandoned during the Great Depression before being reinstated in a limited form as part of the post-World War II Bretton Woods system. It was abandoned due to its propensity for volatility, as well as the constraints it imposed on governments: by retaining a fixed exchange rate, governments were hamstrung in engaging in expansionary policies to, for example, reduce unemployment during economic recessions. There is a consensus amongst economists that a return to the gold standard would not be beneficial, and most economic historians reject the idea that the gold standard ‘was effective in stabilising prices and moderating business-cycle fluctuations during the nineteenth century’. So it was that Britain slipped into a ‘gold specie standard’ in 1717 by over-valuing gold at 15.2 times its weight in silver, ‘specie’ meaning money in the form of coins rather than notes. Britain was unique among nations in using gold in conjunction with clipped, underweight silver shillings, a situation addressed only before the end of the 18th century by the acceptance of gold proxies such as token silver coins and banknotes. From the more widespread acceptance of paper money in the 19th century emerged the gold bullion standard, a system in which gold coins do not circulate but authorities such as central banks agree to exchange circulating currency for gold bullion at a fixed price. First emerging in the late 18th century to regulate exchange between London and Edinburgh, such a standard became the predominant means of implementing the gold standard internationally in the 1870s. Restricting the free circulation of gold under the Classical Gold Standard period, from the 1870s to 1914, was also needed in countries which decided to implement the gold standard whilst guaranteeing the exchangeability of huge amounts of legacy silver coins into gold at the fixed rate (rather than valuing publicly-held silver at its depreciated value).

Here in the United Kingdom the English pound sterling, introduced around the year 800 CE, was initially a silver standard unit worth 20 shillings or 240 silver pennies. The penny initially contained 1.35g of fine silver, reduced by 1601 to 0.464g, hence giving way to the shilling (12 pennies) of 5.57g fine silver. The problem of clipped, underweight silver pennies and shillings was a persistent, unresolved issue from the late 17th century to the early 19th century. In 1717 the value of the gold guinea (of 7.6885g fine gold) was fixed at 21 shillings, resulting in a gold-silver ratio of 15.2, higher than prevailing ratios in Continental Europe. Great Britain was therefore ‘de jure’ under a bimetallic standard, with gold serving as the cheaper and more reliable currency compared to clipped silver; full-weight silver coins did not circulate but went to Europe, where 21 shillings’ worth of silver fetched more than a guinea in gold. Several factors helped extend the British gold standard into the 19th century, namely the Brazilian Gold Rush of the 18th century supplying significant quantities of gold to Portugal and Britain, with Portuguese gold coins also legal tender in Britain, and ongoing trade deficits with China (which sold to Europe but had little use for European goods) draining silver from the economies of most of Europe. Combined with greater confidence in banknotes issued by the Bank of England, this opened the way for gold as well as banknotes to become acceptable currency in lieu of silver. A further factor was the acceptability of token or subsidiary silver coins as substitutes for gold before the end of the 18th century; initially issued by the Bank of England and other private companies, permanent issuance of subsidiary coinage from the Royal Mint commenced after the Great Recoinage of 1816.
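Newton’s 1717 valuation can be verified from the fine-metal contents given above; every figure in this short Python check comes from the post itself:

    # Gold-silver ratio implied by fixing the 7.6885g gold guinea at
    # 21 shillings, each shilling containing 5.57g of fine silver.
    guinea_gold_g = 7.6885
    shilling_silver_g = 5.57
    ratio = (21 * shilling_silver_g) / guinea_gold_g
    print(f"Gold-silver ratio: {ratio:.1f}")  # prints 15.2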

The British gold sovereign or £1 coin was the pre-eminent circulating gold coin during the classical gold standard period.

Following the Napoleonic Wars, Britain legally moved from the bimetallic to the gold standard in the 19th century in several steps: the 21-shilling guinea was discontinued in favour of the 20-shilling gold sovereign, or £1 coin. From the second half of the 19th century Britain then introduced its gold standard to Australia, New Zealand and the British West Indies in the form of circulating gold sovereigns, as well as banknotes convertible at par into sovereigns or Bank of England banknotes. The classical gold standard of the late 19th century was not merely a superficial switch from circulating silver to circulating gold. The bulk of silver currency was actually replaced by banknotes and token currency whose gold value was guaranteed by gold bullion and other reserve assets held inside central banks. In turn, the gold exchange standard was just one step away from modern fiat currency, with banknotes issued by central banks whose value is secured by the bank’s reserve assets, but whose exchange value is determined by the monetary policy of the central bank and its objectives on purchasing power, in lieu of a fixed equivalence to gold. The final chapter of the classical gold standard, ending in 1914, saw the gold exchange standard extended to many Asian countries by fixing the value of local currencies to gold or to the gold standard currency of a Western colonial power. The Netherlands East Indies guilder was the first Asian currency pegged to gold, in 1875, via a gold exchange standard which maintained its parity with the gold Dutch guilder. International monetary conferences were convened before 1890, with various countries actually pledging to maintain the ‘limping’ standard of freely circulating legacy silver coins in order to prevent the further deterioration of the gold-silver ratio, which reached 20 in the 1880s. However, after 1890 the decline in the price of silver could not be prevented further, and the gold-silver ratio rose sharply above 30. In 1893 the Indian rupee of 10.69g fine silver was fixed at 16 British pence (or £1 = 15 rupees; a gold-silver ratio of 21.9), with legacy silver rupees remaining legal tender. Broadly similar gold standards were implemented in Japan in 1897, in the Philippines in 1903 and in Mexico in 1905, when the previous yen or peso of 24.26g silver was redefined to approximately 0.75g gold, or half a United States dollar (a ratio of 32.3). Japan gained the needed gold reserves after the Sino-Japanese War of 1894–1895; for Japan, moving to gold was considered vital for gaining access to Western capital markets. Governments with insufficient tax revenue had suspended convertibility repeatedly in the 19th century; however, the real test came with the onset of World War I. The gold specie standard came to an end in the United Kingdom and the rest of the British Empire with the outbreak of that war. A run on sterling caused Britain to impose exchange controls that fatally weakened the standard; convertibility was not legally suspended, but gold prices no longer played the role that they did before. In financing the war whilst abandoning gold, many of the belligerents suffered drastic inflation: price levels doubled in the United States and Britain, tripled in France and quadrupled in Italy. Exchange rates changed less, even though European inflation rates were more severe than America’s, and this meant that the cost of American goods decreased relative to those in Europe. Between August 1914 and the spring of 1915, the dollar value of U.S.
exports tripled and its trade surplus exceeded $1 billion for the first time. Ultimately, the system could not deal quickly enough with the large deficits and surpluses. This was previously attributed to downward wage rigidity brought about by the advent of unionised labour, but is now considered as an inherent fault of the system that arose under the pressures of war and rapid technological change. In any event, prices had not reached equilibrium by the time of the Great Depression which served to kill off the system completely.
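
The gold–silver ratios quoted above can be checked with a little arithmetic. Here is a short sketch in Python; the sovereign’s fine gold content of roughly 7.322 g is an assumed standard figure, not something stated in the text.

```python
# Back-of-envelope check of the gold-silver ratios quoted above.
# Assumed figure (not in the text): a gold sovereign (£1) contained
# about 7.322 g of fine gold.

SOVEREIGN_GOLD_G = 7.322          # fine gold per pound sterling
PENCE_PER_POUND = 240             # pre-decimal: 240 pence = £1

# 1893: Indian rupee of 10.69 g fine silver fixed at 16 pence (£1 = 15 rupees)
rupees_per_pound = PENCE_PER_POUND / 16          # = 15
silver_per_pound_g = rupees_per_pound * 10.69    # silver pegged to £1 of gold
print(silver_per_pound_g / SOVEREIGN_GOLD_G)     # ~21.9, the quoted ratio

# 1897/1905: yen or peso of 24.26 g silver redefined to ~0.75 g gold
print(24.26 / 0.75)                              # ~32.3, the quoted ratio
```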

The gold specie standard ended in the United Kingdom and the rest of the British Empire at the outbreak of World War I, when Treasury notes replaced the circulation of gold sovereigns and gold half sovereigns. Legally, though, the gold specie standard was not abolished; its end was instead effected by the Bank of England through appeals to patriotism, urging citizens not to redeem paper money for gold specie. It was only in 1925, when Britain returned to the gold standard in conjunction with Australia and South Africa, that the gold specie standard was officially ended. The British Gold Standard Act 1925 introduced the gold bullion standard and simultaneously repealed the gold specie standard, and the new standard ended the circulation of gold specie coins. Instead, the law compelled the authorities to sell gold bullion on demand at a fixed price, but ‘only in the form of bars containing approximately four hundred troy ounces (12 kg) of fine gold’. Many other countries followed Britain in returning to the gold standard, leading to a period of relative stability but also deflation. This state of affairs lasted until the Great Depression (1929–1939) forced countries off the gold standard.

In the summer of 1931, a Central European banking crisis led Germany and Austria to suspend gold convertibility and impose exchange controls, after a run on Austria’s largest commercial bank had caused it to fail. The run spread to Germany, where the central bank also collapsed. International financial assistance was too late: in July 1931 Germany adopted exchange controls, followed by Austria in October. The Austrian and German experiences, as well as British budgetary and political difficulties, were among the factors that destroyed confidence in sterling from mid-July 1931. Runs ensued and the Bank of England lost much of its reserves; loans of £50 million from the American and French central banks were insufficient and were exhausted in a matter of weeks, due to large gold outflows across the Atlantic. On 19 September 1931, speculative attacks on the pound led the Bank of England to abandon the gold standard, ‘ostensibly temporarily’; the decision was taken abruptly and unilaterally. The pound thus left the gold standard in 1931, and a number of currencies of countries that had historically performed a large amount of their trade in sterling were pegged to sterling instead of to gold. The ostensibly temporary departure had unexpectedly positive effects on the economy, leading to greater acceptance of departing from the gold standard: the British could now use monetary policy to stimulate the economy. Australia and New Zealand had already left the standard, and Canada quickly followed suit. The interwar partially-backed gold standard was inherently unstable because of the conflict between the expansion of liabilities to foreign central banks and the resulting deterioration in the Bank of England’s reserve ratio. France was then attempting to make Paris a world-class financial centre, and it too received large gold flows. Upon taking office in March 1933, U.S. President Franklin D. Roosevelt took the United States off the gold standard; indeed, by the end of 1932 it had already been abandoned as a global monetary system. Finally, Czechoslovakia, Belgium, France, the Netherlands and Switzerland abandoned the gold standard in the mid-1930s. So it ended many years ago.
Much has been written subsequently about the gold standard, but one economist seems to have summed it up by saying “We don’t have the gold standard. It’s not because we don’t know about the gold standard, it’s because we do.”

This week…

Togetherness.
Let there be spaces in your togetherness, and
Let the winds of the heavens dance between you.
Love one another but make not a bond of love;
Let it rather be a moving sea between the shores of your souls.
Fill each other’s cup but drink not from one cup.
Give one another of your bread but eat not from the same loaf.
Sing and dance together and be joyous,
But let each one of you be alone, even as the strings of a lute are alone
Though they quiver with the same music.
Give your hearts, but not into each other’s keeping.
For only the hand of Life can contain your hearts.
And stand together, yet not too near together.
For the pillars of the temple stand apart,
And the oak tree and the cypress grow not in each other’s shadow.

~ Khalil Gibran (6 January 1883 – 10 April 1931)

Click: Return to top of page or Index page

Gold

This substance is a chemical element with the symbol ‘Au’, from the Latin ‘aurum’. It is a bright, slightly orange-yellow, dense, soft, malleable and ductile metal in its pure form. It is also one of the least reactive chemical elements and is solid under standard conditions. Gold often occurs in its elemental or native form, as nuggets or grains in rocks, veins and alluvial deposits. It occurs in a solid solution series with the native element silver (as electrum), naturally alloyed with other metals like copper and palladium, and as mineral inclusions such as within pyrite. It occurs less commonly in minerals as gold compounds, often with tellurium (gold tellurides). Gold is resistant to most acids, though it does dissolve in aqua regia, a mixture of nitric acid and hydrochloric acid, forming a soluble tetrachloroaurate anion. It is insoluble in nitric acid, which dissolves silver and base metals, a property long used to refine gold and to confirm the presence of gold in metallic substances, giving rise to the term ‘acid test’. Gold also dissolves in alkaline solutions of cyanide, which are used in mining and electroplating, and it dissolves in mercury, forming amalgam alloys; as the gold acts simply as a solute, this is not a chemical reaction. A relatively rare element, gold is classed as a precious metal and has been used for coinage, jewellery and other arts throughout recorded history. In the past, a gold standard was often implemented as a monetary policy, but gold coins ceased to be minted as a circulating currency in the 1930s, and the world gold standard was abandoned for a fiat currency system after 1971. In 2017 the world’s largest gold producer by far was China, with 440 tonnes per year, and as of 2020 a total of around 201,296 tonnes of gold exists above ground. This is equal to a cube with each side measuring roughly 21.7 metres (71 ft). The world consumption of new gold produced is about 50% in jewellery, 40% in investments and 10% in industry. Gold’s high malleability, ductility, resistance to corrosion and to most other chemical reactions, and conductivity of electricity have led to its continued use in corrosion-resistant electrical connectors in all types of computerised devices, its chief industrial use. It is also used in infrared shielding, coloured glass production, gold-leafing and tooth restoration. Certain gold salts are still used as anti-inflammatories in medicine. A gold nugget of 5 mm (0.20 in) in size can be hammered into a gold foil of about 0.5 square metres (5.4 sq ft) in area. Gold can be drawn into a wire of single-atom width, and then stretched considerably before it breaks; such nanowires distort via formation, reorientation and migration of dislocations and crystal twins without noticeable hardening. Gold leaf can be beaten thin enough to become semi-transparent, and the transmitted light appears greenish blue because gold strongly reflects yellow and red. Such semi-transparent sheets also strongly reflect infrared light, making them useful as infrared (radiant heat) shields in the visors of heat-resistant suits and in sun-visors. Whilst most metals are grey or silvery white, gold is slightly reddish-yellow, the colour determined by the frequency of plasma oscillations among the metal’s valence electrons: in the ultraviolet range for most metals, but in the visible range for gold due to relativistic effects affecting the orbitals around gold atoms. Similar effects impart a golden hue to metallic caesium.
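
As a quick sanity check on the cube figure quoted above, dividing the above-ground stock by gold’s density and taking a cube root reproduces the stated side length. The density of roughly 19,300 kg per cubic metre is a standard figure, not given in the text:

```python
# Check: does 201,296 tonnes of gold really fit in a ~21.7 m cube?
# Assumes gold's density of ~19,300 kg per cubic metre (standard figure,
# not stated in the text above).

tonnes_above_ground = 201_296
density_kg_m3 = 19_300

volume_m3 = tonnes_above_ground * 1000 / density_kg_m3   # ~10,430 m^3
side_m = volume_m3 ** (1 / 3)
print(f"{side_m:.1f} m per side")    # ~21.8 m, close to the quoted 21.7 m
```
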
Common coloured gold alloys include the distinctive eighteen-carat rose gold, created by the addition of copper. Alloys containing palladium or nickel are also important in commercial jewellery, as these produce white gold alloys. Fourteen-carat gold–copper alloy is nearly identical in colour to certain bronze alloys, and both may be used to produce police and other badges. Fourteen- and eighteen-carat gold alloys with silver alone appear greenish-yellow and are referred to as green gold, whilst blue gold can be made by alloying with iron, and purple gold by alloying with aluminium. Although less common, the addition of manganese, indium and other elements can produce more unusual gold colours for various applications. The possible production of gold from a more common element such as lead has long been a subject of human enquiry, and the ancient and medieval discipline of alchemy often focussed on it; however, the transmutation of the chemical elements did not become possible until the understanding of nuclear physics in the 20th century. Gold can be manufactured in a nuclear reactor, but doing so is highly impractical and would cost far more than the value of the gold produced. Medicinal applications of gold and its complexes have a long history dating back thousands of years: several gold complexes have been applied to treat rheumatoid arthritis, and some of its compounds have been investigated as possible anti-cancer drugs.

Gold is thought to have been produced in supernova nucleosynthesis and in the collision of neutron stars, and therefore to have been present in the dust from which the Solar System formed. Because the Earth was molten when it was formed, almost all of the gold present in the early Earth probably sank into the planetary core. Therefore, most of the gold in the Earth’s crust and mantle is, according to one theory, thought to have been delivered to Earth later by asteroid impacts about 4 billion years ago. The gold which is reachable by humans has therefore been associated with a particular asteroid impact, and the asteroid that formed the Vredefort crater around two billion years ago is often credited with seeding the Witwatersrand basin in South Africa with the richest gold deposits on Earth. However, this scenario is now questioned, as these gold-bearing rocks were laid down between 700 and 950 million years before the Vredefort impact. These gold-bearing rocks had also been covered by a thick layer of Ventersdorp lavas and the Transvaal Supergroup of rocks before the meteor struck, so the gold did not actually arrive in the asteroid or meteorite. What the Vredefort impact achieved, however, was to distort the Witwatersrand basin in such a way that the gold-bearing rocks were brought to the present erosion surface in Johannesburg, just inside the rim of the original 300 km (190 mile) diameter crater caused by the meteor strike. It was the discovery of the deposit in 1886 that launched the Witwatersrand Gold Rush, and some 22% of all the gold ascertained to exist today on Earth has been extracted from these rocks. Besides that, much of the rest of the gold on Earth is thought to have been incorporated into the planet from its very beginning, as planetesimals formed the planet’s mantle early in Earth’s history. In 2017 an international group of scientists established that gold ‘came to the Earth’s surface from the deepest regions of our planet’, the mantle, evidenced by their findings at the Deseado Massif in the Argentinian region of Patagonia. Perhaps surprisingly, the world’s oceans also contain gold: measured concentrations suggest they hold around 15,000 tonnes. A number of people have claimed to be able to economically recover gold from sea water, but they were either mistaken or acting with intent to deceive. One man ran a gold-from-seawater swindle in the United States in the 1890s, as did an English fraudster in the early 1900s. Another man did research on the extraction of gold from sea water in an effort to help pay Germany’s reparations following World War I, and based on the published values of gold in seawater a commercially successful extraction seemed possible. But after analysis of 4,000 water samples, it became clear that extraction would not be possible and he ended the project.
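
That 15,000-tonne figure implies a vanishingly small concentration, which goes some way to explaining why the seawater schemes failed. A rough check, assuming a standard estimate of total ocean volume (about 1.332 billion cubic kilometres, not a figure from the text):

```python
# Why gold-from-seawater schemes failed: the implied concentration.
# Assumes total ocean volume of ~1.332e9 cubic kilometres (standard
# estimate, not stated in the text above).

gold_tonnes = 15_000
ocean_volume_litres = 1.332e9 * 1e12      # km^3 -> litres

grams_per_litre = gold_tonnes * 1e6 / ocean_volume_litres
print(f"{grams_per_litre:.2e} g/L")       # ~1e-11 g/L: ~0.01 ng per litre
```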

Grave offerings on display in the Varna museum, Bulgaria, thought to be the oldest golden artefacts in the world (4600 BC – 4200 BC).

The earliest recorded metal employed by humans appears to be gold. Small amounts of natural gold have been found in Spanish caves used during the late Palaeolithic period, c. 40,000 BC. The oldest gold artefacts in the world are from Bulgaria and date back to around 4,600 BC to 4,200 BC, such as those found in the Varna Necropolis near Lake Varna and the Black Sea coast, thought to be the earliest ‘well-dated’ finding of gold artefacts in history. Such items probably made their first appearance in Ancient Egypt at the very beginning of the pre-dynastic period, at the end of the fifth millennium BC and the start of the fourth, and smelting was developed during the course of the 4th millennium. The oldest known map of a gold mine was drawn in the 19th Dynasty of Ancient Egypt (1320–1200 BC), and the first written reference to gold was recorded in the 12th Dynasty around 1900 BC. Egyptian hieroglyphs from as early as 2600 BC describe gold, and one of the earliest known maps, the Turin Papyrus Map, shows the plan of a gold mine in Nubia together with indications of the local geology. Large mines were also present across the Red Sea in what is now Saudi Arabia. Gold is mentioned frequently in the Old Testament of the Bible, starting with Genesis. In the New Testament it is included with the gifts of the Magi in the first chapters of Matthew, and the book of Revelation describes the city of New Jerusalem as having streets ‘made of pure gold, clear as crystal’. Exploitation of gold in the south-east corner of the Black Sea is said to date from the time of King Midas, and this gold was important in the establishment of what is probably the world’s earliest coinage, in Lydia around 610 BC. The legend of the Golden Fleece, dating from the eighth century BC, may refer to the use of fleeces to trap gold dust from deposits in the ancient world. In Roman metallurgy, new methods for extracting gold on a large scale were developed from 25 BC onwards. The European exploration of the Americas was fuelled in no small part by reports of the gold ornaments displayed in great profusion by Native American peoples. The Aztecs regarded gold as the product of the gods, calling it literally ‘god excrement’, but after Moctezuma II was killed, much of this gold was shipped to Spain. However, for the indigenous peoples of North America gold was considered useless, and they saw much greater value in minerals directly related to their use, such as obsidian, flint and slate. Gold has played a role in western culture as a cause of desire and of corruption, for example in children’s fables, where Rumpelstiltskin spins straw into gold for the miller’s daughter in return for her first child when she becomes queen, and in the stealing of the hen that lays golden eggs in Jack and the Beanstalk. The top prize at the Olympic Games and many other sports competitions is the gold medal. The main goal of alchemists was to produce gold from other substances such as lead, perhaps by interaction with a mythical substance called the philosopher’s stone. Trying to produce gold led the alchemists to systematically find out what can be done with substances, and this laid the foundation for today’s chemistry.

Minoan jewellery from 2300–2100 BC in the Metropolitan Museum of Art, New York City.

Apart from chemistry, gold is mentioned in a variety of expressions, most often associated with intrinsic worth. As already mentioned, great achievements are frequently rewarded with gold, in the form of medals as well as trophies and other decorations. Winners of athletic events and other graded competitions are usually awarded a gold medal. Many awards, such as the Nobel Prize, are made from gold, and other award statues and prizes are depicted in gold or are gold-plated, such as the Academy Awards, the Golden Globe Awards, the Emmy Awards and the British Academy of Film and Television Awards (BAFTA). Gold is associated with the wisdom of ageing and fruition, hence the fiftieth wedding anniversary is golden. A person’s most valued or most successful latter years are sometimes considered their ‘golden years’, and the height of a civilisation is referred to as a golden age. In some religions gold has been associated both with holiness and with evil: in the Bible’s Book of Exodus the Golden Calf is a symbol of idolatry, whilst in the Book of Genesis Abraham was said to be rich in gold and silver, and Moses was instructed to cover the Mercy Seat of the Ark of the Covenant with pure gold. In Islam, gold (along with silk) is often cited as being forbidden for men to wear. Wedding rings are typically made of gold, as it is long lasting and unaffected by the passage of time, and may aid in the ring symbolism of eternal vows before God and the perfection the marriage signifies. In August 2020, Israeli archaeologists discovered a trove of early Islamic gold coins near the central city of Yavneh, Israel; analysis of the extremely rare collection of 425 gold coins indicated that they were from the late 9th century.

Gold coins from the Scandinavian Monetary Union: the coin on the left is Swedish, the one on the right Danish.

Gold has been widely used throughout the world as money, for efficient indirect exchange as opposed to bartering, and to store wealth in hoards. For exchange purposes, mints produce standardised gold bullion coins, bars and other units of fixed weight and purity. The first known coins containing gold were struck in Lydia, Asia Minor, around 600 BC. The gold talent in use during the periods of Grecian history both before and during the time of Homer weighed between 8.42 and 8.75 grams. From an earlier preference for silver, European economies re-established the minting of gold as coinage during the thirteenth and fourteenth centuries. In preparation for World War I the warring nations moved to fractional gold standards, inflating their currencies to finance the war effort. Post-war, the victorious countries, most notably Britain, gradually restored gold convertibility, but international flows of gold via bills of exchange remained embargoed and international shipments were made exclusively for bilateral trades or to pay war reparations. After World War II, gold was replaced by a system of nominally convertible currencies related by fixed exchange rates following the Bretton Woods system of monetary management, which established the rules for commercial and financial relations among the United States, Canada, Australia, Japan and the Western European countries after the 1944 Bretton Woods Agreement. This system was the first example of a fully negotiated monetary order intended to govern monetary relations among independent states. It required countries to guarantee convertibility of their currencies into U.S. dollars to within 1% of fixed parity rates, with the dollar convertible to gold bullion for foreign governments and central banks at 35 US dollars per troy ounce of fine gold, or 0.88867 gram fine gold per dollar. It also envisioned greater cooperation among countries in order to prevent future competitive devaluations, and thus established the International Monetary Fund (IMF) to monitor exchange rates and lend reserve currencies to nations with balance of payments deficits. I must admit to smiling when I first came across the name ‘Bretton Woods system’, as I was brought up in Peterborough and a township of that fine city is named Bretton, where you will find the Bretton Woods Community School! But I am certain they played no part in establishing this monetary system. But back to the story. Gold standards and direct convertibility of currencies to gold have since been abandoned by world governments, led in 1971 by the United States’ refusal to redeem its dollars in gold. Fiat currency now fills most monetary roles. Switzerland was the last country to tie its currency to gold; gold backed 40% of the currency’s value until the Swiss joined the IMF in 1999. Central banks continue to keep a portion of their liquid reserves as gold in some form, and metals exchanges such as the London Bullion Market Association still clear transactions denominated in gold, including future delivery contracts. Today, gold mining output is declining, so with the sharp growth of economies in the 20th century along with increasing foreign exchange, the world’s gold reserves and their trading market have become a small fraction of all markets, and fixed exchange rates of currencies to gold have been replaced by floating prices for gold. Though the gold stock grows by only 1 or 2% per year, very little metal is irretrievably consumed.
Inventory above ground would satisfy many decades of industrial and even artisan uses at current prices. The gold proportion, or fineness, of alloys is measured by carat, with pure gold (commercially termed ‘fine’ gold) designated as 24 carat. English gold coins intended for circulation from 1526 into the 1930s were typically struck in a standard 22-carat alloy called crown gold, for hardness, whilst American gold coins for circulation after 1837 contained an alloy of 0.900 fine gold, or 21.6 carat. Only 10% of the world consumption of new gold produced goes to industry, but by far the most important industrial use for new gold is in the fabrication of corrosion-free electrical connectors in computers and other electrical devices. Gold is a valuable commodity.
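
Carat and millesimal fineness are simple proportions of 24, and the Bretton Woods parity quoted earlier follows from the weight of a troy ounce. A small sketch; the troy ounce figure of 31.1035 g is a standard value assumed here, not one given in the text:

```python
# Carat vs fineness, plus a check of the Bretton Woods parity quoted above.

def carat_to_fineness(carat: float) -> float:
    """Fineness is the proportion of gold by mass; 24 carat = pure gold."""
    return carat / 24

print(carat_to_fineness(22))      # 0.9166... ('crown gold')
print(0.900 * 24)                 # 21.6 carat, the US coin standard quoted

# Bretton Woods: $35 per troy ounce of fine gold.
TROY_OUNCE_G = 31.1035            # standard figure, not in the text above
print(TROY_OUNCE_G / 35)          # ~0.88867 g fine gold per dollar, as quoted
```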

A mirror for the James Webb Space Telescope, coated in gold to reflect infrared light.

This week…
You might already know that the collective noun for crows is a murder and for lapwings it is a deceit. You might even be aware that for hawks it is an aerie. But an ambush is the collective term for both tigers and widows!

Click: Return to top of page or Index page

Managing Change

We don’t realise it at the time but when we are very young many, perhaps all of us, consider that we are the centre of the universe and that everything revolves around us. We demand attention, we want everything immediately, so we shout and scream when that does not occur. I think that in the majority of cases, as we grow, we learn that we aren’t the only one around and as a result we cannot have just what we want exactly when we want it. That is especially true if we have siblings, particularly ones older than ourselves. Gender can also play its part! However, there are some who get spoiled, and that becomes even more apparent as time passes when they begin to interact with others. Also, the longer it takes for them to realise how life really is, the harder it can become for them to change their ways. Sadly, some never do. What seems to make it worse though is when the selfish person tries to turn things round and make others feel like it is their fault and that they should be the ones to change their ways! But sadly that is the classic behaviour of a narcissist. Some may give in to them and accept that way of life, but it can be damaging and ruin their lives. There are even those who have turned to extreme violence as they could see no other way for their situation to end. Happily there are a number of groups specifically designed to help folk in these circumstances through such things as counselling; in fact simply talking to someone who will listen and empathise, letting the person see that their problems can be overcome, makes a real difference. They know they are not alone. I remember a lovely tv advert from years ago which featured a child who said “and when I grow up, my mummy says I’m going to be a proper little madam!”. We must surely all have met or seen people just like that, who become extremely selfish in their ways, with absolutely no thought or consideration for others. It can be difficult to cope with such people, and there are those who, despite knowing it probably isn’t in their best interests, stay with and almost ‘accept’ such folk. It may be that there is a fear of the unknown in some, as we know the old saying ‘better the devil you know than the devil you don’t’, but sometimes it can become just too much. It can be a change in your own capabilities, or perhaps in those of your partner, that causes us to recognise the need to change. I know of someone who married a big, strong man and they lived together for many years, had children and grandchildren; all seemed well. Then the man became ill and was no longer big and strong, ultimately being forced to give up work. The two finally separated and divorced, as his wife could not accept the change which had occurred in her husband. I have managed many changes in my own life, from coping with a muscular disability since birth, then epilepsy and later a heart problem. So I take a few tablets every day, I am very well cared for and, more especially, I am alive to tell others that it is eminently possible to manage change. It is easier with a positive attitude, recognising and being thankful for the not so good as well as the good because, as has been said many times, ‘falling down is often easy – it is the getting up again that can be difficult’. It is also fascinating to me how attitudes have changed over the years regarding such things as disability.
I worked in an office for very many years and, as I have said before, my work all too often involved filling in forms. Quite a few of my work colleagues were absolutely delighted when computers were installed – it meant I was able to type, as being left-handed my writing was, and still is, nowhere near the best! Also, I can only use my left hand to type, but I could and still do so fairly quickly. Not only that, but modern programs tend to include an auto-correct feature, though that can be a hindrance rather than a help at times! So before sending out each weekly blog post, I read through it carefully as if I were a stranger. I think perhaps what helps me there comes from my few years spent in a telephone directory compilation team, where we used to hand-write entries on computer cards and include simple computer code so that the computers could recognise the difference between certain letters and numbers – now that was a challenge. Most especially, we would check the results printed by the computer every week, and when it came to the final checks before the directories went to final print once a year, only a very limited number of changes were allowed on the final draft! Years later I was chatting to a former work colleague who admitted they had had no knowledge of my physical disabilities. Happily the years have passed and attitudes have changed, so others have learned to accept me for exactly who and what I am today.

I have said before that change is all around us, every second of the day. As I was growing up at home I would see that my mother was worried about this or that, and I would politely ask her what she was worried about. Quite often it was about something in the future over which she had no control, so I would ask why she was fretting about such things. To me, such worry is like spending life in an empty room sitting in a rocking chair, going back and forth: there is motion to be sure, but no achievement. With each of the generations seeds are planted in all things, then they grow and many bear fruit which feed others. Thunderstorms occur, lightning may strike trees and create a fire which can burn parts of a forest, but when that happens seeds fall and new trees slowly grow. It is a cycle of life which continues. I learned a little while ago of a man who was having difficulty organising people to get to a particular place on time; I believe it was getting equipment to a concert, something like that. He had been taught all about geography and map reading, with coordinates, northings and eastings, but to this man it was all so very complicated. So he talked to a friend and they came up with the idea of dividing the whole world up into individual three-metre squares, so that each one had a simple three-word name. As a result, we now have What3words, which is officially described as a ‘proprietary geocode system’ designed to identify any location with a resolution of about 3 metres (9.8 ft). It is owned by What3words Limited, based in London. The system encodes geographic coordinates into three permanently fixed dictionary words. For example, the front door of 10 Downing Street, London is identified by the code ///indoor.myself.rather, which can be made into a weblink by altering the code slightly to w3w.co/indoor.myself.rather. So the three words do not change, just the prefix. This has proven extremely useful for finding folk who are perhaps halfway up a mountain; in fact, because the English version covers the world’s oceans as well, emergency services can use it to find anyone, anywhere. It can also, I am sure, be really useful for simple things like meeting a friend at the entrance to a stadium, or finding a caravan on a large campsite. The important point is that What3words differs from most location encoding systems in that it uses words rather than strings of numbers or letters, and the pattern of this mapping is not obvious; the algorithm mapping locations to words is proprietary and protected by copyright. The company has a website, apps for Apple iOS and Android, and an application programming interface (API) which easily converts between postal addresses, What3words addresses and latitude/longitude coordinates. The system divides the world into a grid of 57 trillion 3-by-3-metre squares, each of which has a three-word address, and the addresses are available in around fifty languages. Translations are not direct, as direct translations to some languages could produce more than three words; rather, territories are localised with linguistic sensitivities and nuances in mind. Each What3words language uses a list of 25,000 words (40,000 in English, as it covers sea as well as land). The lists are manually checked to remove homophones and offensive words. The company states that densely populated areas have strings of short words due to more frequent usage, whilst less populated areas such as the North Atlantic use more complex words.
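
To make the grid idea concrete, here is a minimal sketch in Python of how a scheme of this general kind could work. To be clear, this is not the What3words algorithm, which is proprietary and deliberately non-obvious; the tiny word list and the simple row-by-row numbering below are invented purely for illustration.

```python
# Illustrative sketch of a grid-to-three-words scheme.
# NOT the real What3words mapping: theirs is proprietary and deliberately
# non-obvious. The word list and numbering here are invented for clarity.

WORDS = ["apple", "brick", "cloud", "daisy", "eagle", "flame", "grape",
         "house", "ivory", "jolly"]   # real lists hold 25,000-40,000 words;
                                      # 40,000^3 = 6.4e13, enough for 57 trillion squares

CELL_M = 3.0                          # each square is roughly 3 m by 3 m
POLE_TO_POLE_M = 20_000_000.0         # metres from pole to pole, approximately
EQUATOR_M = 40_075_000.0              # equatorial circumference in metres

def cell_index(lat: float, lon: float) -> int:
    """Number the 3 m squares row by row on a crude equirectangular grid."""
    rows = int(POLE_TO_POLE_M / CELL_M)
    cols = int(EQUATOR_M / CELL_M)
    row = int((lat + 90.0) / 180.0 * (rows - 1))
    col = int((lon + 180.0) / 360.0 * (cols - 1))
    return row * cols + col

def index_to_words(n: int) -> str:
    """Write the cell number in base len(WORDS): three 'digits' = three words."""
    base = len(WORDS)
    n %= base ** 3                    # the toy list only covers 1,000 cells
    digits = ((n // base ** 2) % base, (n // base) % base, n % base)
    return ".".join(WORDS[d] for d in digits)

# The front door of 10 Downing Street (illustrative output only):
print(index_to_words(cell_index(51.5034, -0.1276)))
```
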
Sometimes the simplest of things can be the best of ideas. As many of you know, some years ago I was able to go on a superb cruise holiday, and as part of that cruise aboard the P&O ship ‘Arcadia’ the position, course and speed were given each day. I have now converted all of the daily latitude and longitude details into What3words, and one example is w3w.co/encodes.mumbled.spaniel, which links directly to a three-metre square on board a cruise ship in the bay close to Akaroa, New Zealand. The link opens a web page with various options, including different views and sharing options. Alternatively, if I arranged to meet someone in Birmingham, perhaps by an entrance to the Symphony Hall, I would share the link w3w.co/orange.over.rises. It can also be useful for finding a car in a car park, maybe with the three words pull.bids.push, which can be seen on a web page as w3w.co/pull.bids.push. It would even work in a multi-storey one; I would just need to remember which level I was parked on! I could say the three words to a car satnav, as a few have this feature now, or over the phone, or text the link to a friend. I think that I will try to use this facility. I don’t often advertise, but on this occasion I think this is worth sharing. Having said that, you could already be using What3words. But just in case not…

As I have already mentioned, we live in a constantly changing world and it is, generally, our choice as to how we manage that change. But sometimes that adjustment is forced upon us by a change in outside circumstances. I know of one particular man here in England who got married and had children, but then their circumstances changed. Years later the man married again, and this time he and his new wife had several children in a fairly short space of time. Meanwhile around them the world was still turning, and the government of the time decided to ask its residents whether the country should stay in the European Union or leave it. As a result, a referendum was put to the people of the country. It was called ‘Brexit’, a portmanteau of ‘British exit’, and resulted in the withdrawal of the United Kingdom from the European Union at 23:00 GMT on 31 January 2020. It meant that at that time, the UK was the only sovereign country to have left the EU. The UK had been a member state of the union and its predecessor the European Communities (EC) since 1 January 1973. Following Brexit, EU law and the Court of Justice of the European Union no longer had primacy over British laws, except in certain areas relating to Northern Ireland. As many expected, not everyone was happy with the decision to leave the EU, but the decision was reached by a majority vote. However, there are still people who continue to moan and complain that it was the wrong choice. I believe that the man who I mentioned earlier has stated how wrong he thought it was and he continues to moan, but I hope he will be educating all his children that despite all our hopes, dreams and wishes, our lives may not always work out quite as we would have wished. I am reminded of the sorry tale of the young man who, having recently passed his driving test, went on a drinking spree to celebrate but then, whilst drunk, drove his father’s car at excessive speed; he survived the subsequent crash, but killed his best friend. Despite our best efforts, we make mistakes and must live with the consequences, no matter what they may be. I am presently living in a care home as I do my best to recover from medical issues; in my case my heart has a damaged mitral valve which I have had since birth, I have a muscular weakness on my right side and I also have epilepsy. It has meant that I am unable to do certain things, but I have learned to adapt, more especially to accept the changing circumstances as I have grown older. I am in a place where a few folk have dementia and I see how they live from day to day; they are fed, they are cared for and most especially they are treated properly and with respect. I am allowed to do as much as I can for myself, to manage as best I can, but if I need help I have learned to politely ask for it. To ask for and accept help has perhaps been the hardest thing for me to do, as over the years I have learned to be as independent as possible. We see and learn change all the time in lives, even species dying out, but there is definitely an innate willingness in so many of us to survive, to continue, to learn and to better ourselves. Yes, change continues, and who knows what will occur on Earth in years to come. So I do my best to learn from the past, live in the present and look to the future with a smile. Which reminds me of a lovely quote, which I have included below.

A quote by Srinivas Arka.

This week…
The other day I happened to watch a clip from the tv series ‘Yes, Prime Minister’, which to me was extremely entertaining. The prime minister had come up with what he thought was “a brilliant idea, a real vote-winner”, as it would allow parents to choose for themselves which school they could send their children to. But Sir Humphrey Appleby was utterly appalled at the idea. In his eyes, choosing a school was a job for civil servants, as it was beyond the capability of parents! The Prime Minister then enquired who chose the school that he, Sir Humphrey, went to and with a knowing smile, Sir Humphrey replied “Oh, my parents, naturally…”.

Click: Return to top of page or Index page

Welcome To Earth Day

In previous weeks I have researched and written quite a bit on the history of things many and various, so this week I thought about bringing in a more ‘modern’ touch. I hope you like it. First though, to set the scene, we should perhaps consider ourselves and our lovely Earth. Today, we know from radiometric dating that Earth is about 4.5 billion years old. Had naturalists in the 1700s and 1800s known Earth’s true age, any early ideas about evolution might have been taken more seriously. We know that life began at least 3.5 billion years ago, because that is the age of the oldest rocks with fossil evidence of life here on Earth. It is the third planet from the Sun and the only astronomical object known to harbour life, at least life as we know it: carbon-based life forms. Whilst large amounts of water can be found throughout the Solar System, only Earth sustains liquid surface water. About 71% of Earth’s surface is made up of the ocean, dwarfing Earth’s polar ice, lakes and rivers. I could go on about its chronology, including its formation, geological history, origins of life and evolution, but not this time! Instead, here is some detail on what is known as Earth Day. So far as I can tell, Earth Day was first celebrated in 1970, when a United States senator from Wisconsin organised a national demonstration to raise awareness about environmental issues. Rallies took place across that country and, by the end of the year, the U.S. government had created its Environmental Protection Agency. Since then, Earth Day has become an annual event around the world on April 22nd to demonstrate support for environmental protection, and includes a wide range of events coordinated globally by earthday.org (formerly the Earth Day Network). It now involves one billion people in more than 193 countries, and the official theme for 2022 is ‘Invest In Our Planet’, with details on the website http://www.earthday.org. In 1969 at a UNESCO conference in San Francisco, peace activist John McConnell proposed a day to honour the Earth and the concept of peace, first to be observed on March 21, 1970, the first day of spring in the northern hemisphere. This day was later sanctioned in a proclamation written by McConnell and signed by the then Secretary General U Thant at the United Nations. A month later, United States Senator Gaylord Nelson proposed the idea of holding a nationwide environmental teach-in on April 22, 1970. He hired a young activist to be the National Coordinator, and the two of them renamed the event ‘Earth Day’. The event grew beyond the original idea of a teach-in to include the entire United States, with more than 20 million people pouring onto the streets. Key non-environmentally focused partners played major roles, and without them it is likely that the first Earth Day would not have succeeded. Nelson was later awarded the Presidential Medal of Freedom in recognition of his work. The first Earth Day was focused on the United States, but in 1990 Denis Hayes, the original national coordinator in 1970, put it on the international stage and organised events in 141 nations. On Earth Day 2016, the landmark Paris Agreement was signed by the United States, the United Kingdom, China, and 120 other countries. This signing satisfied a key requirement for the entry into force of the historic draft climate protection treaty adopted by consensus of the 195 nations present at the 2015 United Nations Climate Change Conference in Paris.
Since then, numerous communities have continued to engage in Earth Day Week actions, an entire week of activities focused on the environmental issues that the world faces. On Earth Day 2020, over 100 million people around the world observed its 50th anniversary in what has been referred to as the largest online mass mobilisation in history.

But perhaps what really energised the birth of Earth Day was when, on January 28, 1969, an oil well drilled from Union Oil’s Platform A off the coast of Santa Barbara, California, blew out. More than three million gallons of oil spewed out, killing more than 10,000 seabirds, dolphins, seals and sea lions. As a direct reaction to this disaster, activists mobilised to create good environmental regulation, environmental education, and Earth Day itself. A number of proponents of Earth Day were in the front lines of fighting this disaster, and Denis Hayes, organiser of the first Earth Day, said that Senator Gaylord Nelson of Wisconsin was inspired to create Earth Day upon seeing the Santa Barbara Channel’s 800 square-mile oil slick from an aircraft. On the first anniversary of the oil blowout, January 28, 1970, Environmental Rights Day was created, and the Declaration of Environmental Rights was read. It had been written by Rod Nash during a boat trip across the Santa Barbara Channel whilst carrying a copy of Thomas Jefferson’s Declaration of Independence. The organisers of Environmental Rights Day had been working closely over a period of several months with a Republican Congressman on the creation of the National Environmental Policy Act, the first of many new laws on environmental protection sparked by the national outcry about the blowout and subsequent oil spill, and on the Declaration of Environmental Rights.

President Richard Nixon and First Lady Pat Nixon plant a tree on the White House South Lawn to recognise the first Earth Day.

In the winter of 1969–1970, a group of students met at Columbia University to hear Denis Hayes talk about his plans for Earth Day. The 1970s were a period of substantial environmental legislation in the U.S.A., including the Clean Air Act, Clean Water Act, Endangered Species Act, Marine Mammal Protection Act, Superfund, Toxic Substances Control Act and the Resource Conservation and Recovery Act, and the decade saw the creation of the Environmental Protection Agency and the banning of DDT and of lead in petrol. By Earth Day 1980, Jimmy Carter was president and the principal Washington, DC event was a festival held in Lafayette Park, across from the White House. It has been said that by mobilising two hundred million people in a hundred and forty-one countries and lifting the status of environmental issues onto the world stage, Earth Day activities in the early 1990s gave a huge boost to recycling efforts worldwide and helped pave the way for the 1992 United Nations Earth Summit in Rio de Janeiro. Unlike the first Earth Day in 1970, this anniversary was mounted with stronger marketing tools, greater access to television and radio, and multimillion-dollar budgets.

The official logo of the Mount Everest Earth Day 20 International Peace Climb.

The Earth Day 20 Foundation highlighted its April 22 activities with a live satellite phone call to members of the historic Earth Day 20 International Peace Climb, who called from their base camp on Mount Everest to pledge their support for world peace and attention to environmental issues. The climb was led by Jim Whittaker, the first American to summit Mount Everest many years earlier, and marked the first time in history that mountaineers from the United States, the Soviet Union and China had roped together to climb a mountain, let alone Mount Everest. The group also collected more than two tons of rubbish left behind on Mount Everest by previous climbing expeditions, which was transported down the mountain by support groups along the way. Warner Bros. Records released an Earth Day-themed single in 1990 entitled ‘Tomorrow’s World’, featuring vocals from various artists; it reached number seventy-four on the ‘Hot Country Songs’ chart dated May 5, 1990. As the millennium approached, another campaign began, this time focusing on global warming and pushing for cleaner energy. The April 22 Earth Day in 2000 combined the big-picture feistiness of the first Earth Day with the international grassroots activism of Earth Day 1990. For 2000, Earth Day had the internet to help link activists around the world, and by the time the day came around some five thousand environmental groups worldwide were on board, reaching out to hundreds of millions of people in a record one hundred and eighty-four countries. Events varied, with a ‘talking drum’ chain travelling from village to village in Gabon, Africa, whilst hundreds of thousands of people gathered on the National Mall in Washington, D.C., USA. Google’s first Earth Day doodle was in 2001, and the theme for Earth Day 2003 was the Water for Life Campaign. That year, Earth Day Network developed a water quality project called ‘What’s in Your Water?’. Other water-related events were held on every continent, such as water workshops, exhibitions and concerts in many countries, while educational curricula, teacher’s guides, water testing kits and posters focused on water. Many other organisations also focused on environmental justice, creating events concentrating on low-income communities. These events also worked on building support among low-income communities through clean-ups, park revitalisation and town halls, focussing on integrating the environmental movement with community and social justice causes. Since then, Earth Day has been celebrated throughout the world in many and various ways. Over the following years, themes included registering voters, major tree planting and healthy environments for children. Earth Day 2006 focused on science and faith and expanded into Europe, with events and speeches held in most of the EU countries. Key events included the ‘Festival on Climate Change’ in Utrecht, the Netherlands, which focused on how to break away from oil dependence and included Earth Day founder Denis Hayes and members of the Dutch and E.U. parliaments, local authorities and media representatives. In the first of two years of Earth Day events in Ukraine, Denis Hayes also attended and spoke at the ‘Chernobyl 20 Remembrance for the Future’ conference in Ukraine.
That year also saw events in China organised between Earth Day Network and Global Village Beijing, educating communities about energy savings, along with the first-ever coordinated Earth Day events in Moscow, Russia, a scientific panel and a religious response panel on climate change throughout the U.S., and a ‘Conserve Your Energy’ event in Philadelphia. Thousands of Earth Day projects have been held across the globe, ranging from energy efficiency events, protests, letter writing campaigns, civic and environmental education training, and urban and rural clean-ups to water projects, with a particular focus on building a broader and more diverse environmental movement.

On Earth Day 2010, its fortieth anniversary, an estimated one billion people around the world took part. This included action on climate change and other environmental issues through climate rallies, and by engaging civic leaders in plans to build a greener economy. Through a Global Day of Conversation, more than 200 elected officials in more than 39 countries took part in active dialogues with their constituents about their efforts to create sustainable green economies and reduce their carbon footprints. Students around the world participated in school events featuring community clean-ups, solar energy systems, school gardens and environmental curricula. Earth Day Network announced a partnership with Twentieth Century Fox Home Entertainment’s Avatar Home Tree Initiative to plant one million trees in 15 countries by the end of the year. Also, as part of a nationwide commemoration of the fortieth anniversary in Morocco, the government announced a unique National Charter for the Environment and Sustainable Development, the first commitment of its kind in Africa and the Arab world, which would inform new environmental laws for the country; the Kingdom of Morocco also pledged to plant one million trees. Since then, the work has continued with each Earth Day. The Earth Day Network completed a project to plant over 1.1 million trees, and across the globe more than 100 million ‘Billion Acts of Green’ were registered. In September 2011, at the Clinton Global Initiative, former U.S. President Bill Clinton recognised this project as an exemplary approach to addressing global challenges. The goal of Earth Day 2014 was to dramatically personalise the massive challenges surrounding global climate change and weave that into both Earth Day 2014 and the five-year countdown to Earth Day 2020, the 50th anniversary. It was an opportunity to unite people worldwide in a common cause and call for action. Earth Day has in fact become very much a global event recognised by many nations, so it was no accident that at the United Nations, world leaders from 175 nations broke a record when they selected Earth Day 2016 to sign the Paris Agreement, the most significant climate accord in the history of the climate movement. Then in 2020, marches and gatherings were cancelled due to the COVID pandemic, but a three-day livestream event was still organised, including speakers from all corners of the environmental movement such as Pope Francis, mayors from around the world, Ministers of the Environment from multiple countries and many more. Earth Day 2020 was a major topic across media platforms, including leading magazines and environmental publications. On January 5, 2020, Earth Day’s 50th anniversary year began with a full page in the Sunday New York Times, referencing a similar black and white advertisement that had appeared in the Times 50 years earlier, on the first Sunday of 1970. Through social media, Earth Day participants joined digital events and shared their support. Through Instagram, HRH The Prince of Wales reminded followers that nature is vital to human health and wellbeing, saying “For fifty years, since the very first Earth Day, I have dedicated a large part of my life to championing more balanced sustainable approaches whether in farming, forestry, fisheries, urban planning or corporate social responsibility. But as we look to shape the next fifty years, I very much need your help.
To reflect and inspire the world to action, while aiming for a green recovery, I would ask you to join me by sharing your vision for a more sustainable future (socially, environmentally and economically) using the hashtag ReimagineReset.”

Sure We Can volunteers clean McKibbin Street, New York for Earth Day 2021.

Earth Day continues around the world, perhaps in ways unnoticed by many. For example, there is a service in Brooklyn, New York called ‘Sure We Can’ which provides container-deposit redemption services to that area. Any person can come to Sure We Can during business hours and redeem New York State accepted bottles and cans. Additionally, the organisation serves as a community hub for the canner community that redeems there and for local environmental causes that promote the organisation’s dedication to sustainability. The facility is designed with canners (the people who collect cans and bottles from the streets) in mind, and aims to be a welcoming place for people to redeem their cans and bottles. In 2019 the centre was processing 10 million cans and bottles for redemption a year and serving a community of over 400 canners, and Sure We Can estimate that they distribute $700,000 per year to canners; the average canner who visits Sure We Can earns $1,000 per year. Long may such initiatives continue, as large or not so large, they all make a vital difference. The Earth Day 2022 theme is ‘Invest in Our Planet’ and features primary programmes including The Great Global Cleanup, Sustainable Fashion, Climate and Environmental Literacy, the Canopy Project, Food and Environment, and the Global Earth Challenge. Earth Day is now observed in 192 countries, and it is surely up to us all to do our part in sustaining this Earth.

This week…
Our British Saint’s Days are St David’s Day (March 1st), St Patrick’s Day (March 17th), St George’s Day (April 23rd) and St Andrew’s Day (November 30th). I was born on St. Patrick’s Day but my parents decided to give me the forenames Andrew David. Apparently a friend suggested they ought to include Patrick, but it was realised I’d then need George to complete the set and that was too much!

Click: Return to top of page or Index page

Easter

Easter is a Christian festival as well as a cultural holiday commemorating the resurrection of Jesus from the dead, as described in the New Testament of the Bible, occurring on the third day after his burial following his crucifixion by the Romans at Calvary c. 30 AD. It is the culmination of the Passion of Jesus, preceded by Lent, a forty-day period of fasting, prayer and penance. Christians refer to the week before Easter as ‘Holy Week’, which in Western Christianity contains the days of the Easter Triduum, the period of three days that begins with the liturgy on the evening of Maundy Thursday, reaches its high point in the Easter Vigil and closes with evening prayer on Easter Sunday. It is a moveable observance recalling the Passion, crucifixion, death, burial and resurrection of Jesus as portrayed in the canonical gospels. In Eastern Christianity, the same days and events are commemorated with the names of days all starting with ‘Holy’ or ‘Holy and Great’, and Easter itself might be called ‘Great and Holy Pascha’, ‘Easter Sunday’, ‘Pascha’ or ‘Sunday of Pascha’. In Western Christianity, Eastertide, or the Easter Season, begins on Easter Sunday and lasts seven weeks, ending with the coming of the 50th day, Pentecost Sunday. In Eastern Christianity the Paschal season ends with Pentecost as well, but the leave-taking of the Great Feast of Pascha is on the 39th day, the day before the Feast of the Ascension. Easter and its related holidays are movable feasts, not falling on a fixed date but computed based on a lunisolar calendar, the solar year plus the Moon phase, similar to the Hebrew calendar. The First Council of Nicaea, a council of Christian bishops, was convened in the Bithynian city of Nicaea (now Iznik, Turkey) by the Roman Emperor Constantine in 325 AD and was the first effort to attain consensus in the church through an assembly representing all of Christendom. Its main accomplishments were settlement of the Christological issue of the divine nature of God the Son and his relationship to God the Father, the construction of the first part of the Nicene Creed, mandating uniform observance of the date of Easter, and promulgation of early canon law. No details for the computation were specified; these were worked out in practice, a process that took centuries and generated a number of controversies. The date has come to be the first Sunday after the ecclesiastical full moon that occurs on or soonest after 21 March. Even if calculated on the basis of the more accurate Gregorian calendar, the date of that full moon sometimes differs from that of the astronomical first full moon after the March equinox. Easter is linked to the Jewish Passover by its name (pesach and pascha are the basis of the term), by its origin (according to the synoptic gospels, both the crucifixion and the resurrection took place during the Passover), by much of its symbolism, and by its position in the calendar. In most European languages the feast is called by the words for passover in those languages, and in older English versions of the Bible the term Easter was used to translate passover. Easter customs vary across the Christian world and include sunrise services, midnight vigils, exclamations and exchanges of Paschal greetings, and one I had never heard of before, called ‘clipping the church’. I have learned that this is an ancient custom traditionally held only in England, on Easter Monday, Shrove Tuesday or a date relevant to the saint associated with the church.
The word ‘clipping’ is Anglo-Saxon in origin, derived from the word ‘clyppan’, meaning ‘embrace’ or ‘clasp’. So ‘clipping the church’ involves either the church congregation or local children holding hands in an inward-facing ring around the church, which can then be reversed to an outward-facing ring if a prayer for the wider world beyond the parish is said. Once the circle is completed, onlookers will often cheer and sometimes hymns are sung. Often there is dancing, and after the ceremony a sermon is delivered in the church, sometimes followed by refreshments. Christians adopted this tradition to show their love for their church and the surrounding people, but currently there are only a few churches left in England that hold this ceremony, and all of these appear to honour it on a different day. Other customs include the decoration and the communal breaking of Easter eggs, a symbol of the empty tomb. The Easter lily, a symbol of the resurrection in Western Christianity, traditionally decorates the chancel area of churches on Easter Day and for the rest of Eastertide. Additional customs that have become associated with Easter and are observed by both Christians and some non-Christians include Easter parades, communal dancing (in Eastern Europe), the Easter Bunny and egg hunting. There are also traditional Easter foods that vary by region and culture.

The modern English term ‘Easter’, cognate with modern Dutch ‘ooster’ and German ‘Ostern’, developed from an Old English word that usually appears in the form ‘Ēastrun’, but also as ‘Ēostre’. Bede provides the only documentary source for the etymology of the word, in his eighth-century ‘The Reckoning of Time’. He wrote that ‘Ēosturmōnaþ’ (Old English ‘Month of Ēostre’, translated in Bede’s time as ‘Paschal month’) was an English month, corresponding to April, which he says “was once called after a goddess of theirs named Ēostre, in whose honour feasts were celebrated in that month”. In Latin and Greek, the Christian celebration was, and still is, called ‘Pascha’, a word derived from Aramaic and cognate with the Hebrew ‘pesach’. The word originally denoted the Jewish festival known in English as Passover, commemorating the Jewish exodus from slavery in Egypt. The supernatural resurrection of Jesus from the dead, which Easter celebrates, is one of the chief tenets of the Christian faith. The resurrection established Jesus as the Son of God and is cited as proof that God will righteously judge the world; for those who trust in Jesus’s death and resurrection, “death is swallowed up in victory.” Any person who chooses to follow Jesus receives “a new birth into a living hope through the resurrection of Jesus Christ from the dead”. Through faith in the working of God, those who follow Jesus are spiritually resurrected with Him so that they may walk in a new way of life and receive eternal salvation, being resurrected to dwell in the Kingdom of Heaven. Easter is linked to the Passover and the exodus from Egypt, as recorded in the Old Testament of the Bible, through the Last Supper and the sufferings and subsequent crucifixion that preceded the resurrection. According to the three synoptic gospels, Jesus gave the Passover meal a new meaning, as in the upper room during the Last Supper he prepared himself and his disciples for his death. He identified the bread and cup of wine as his body, soon to be sacrificed, and his blood, soon to be shed. Paul the apostle states, “Get rid of the old yeast that you may be a new batch without yeast, as you really are. For Christ, our Passover lamb, has been sacrificed”. This refers to the Passover requirement to have no yeast in the house and to the allegory of Jesus as the Paschal lamb.

The first Christians were certainly aware of the Hebrew calendar, and Jewish Christians, the first to celebrate the resurrection of Jesus, timed the observance in relation to Passover. Direct evidence for a more fully formed Christian festival of Pascha (Easter) begins to appear in the mid-2nd century; perhaps the earliest surviving primary source referring to Easter is a mid-2nd-century Paschal homily attributed to Melito of Sardis (the bishop of Sardis, near Smyrna in western Anatolia and a great authority in early Christianity), which characterises the celebration as a well-established one. Evidence for another kind of annually recurring Christian festival, the days commemorating the martyrs, began to appear at about the same time. While martyrs’ days (usually the individual dates of martyrdom) were celebrated on fixed dates in the local solar calendar, the date of Easter was fixed by means of the local Jewish lunisolar calendar. This is consistent with the celebration of Easter having entered Christianity during its earliest, Jewish period, but it does not leave the question free of doubt.

A stained-glass window depicting the Passover Lamb.

Easter and the holidays related to it are moveable feasts, in that they do not fall on a fixed date in either the Gregorian or Julian calendars (both of which follow the cycle of the sun and the seasons). Instead, the date for Easter is determined by what is known as a lunisolar calendar, similar to the Hebrew calendar. In AD 325 the First Council of Nicaea established two rules, independence from the Jewish calendar and worldwide uniformity, which were the only rules for Easter explicitly laid down by the council. As noted above, no details for the computation were specified; these were worked out in practice, a process that took centuries and generated a number of controversies. In particular, the Council did not decree that Easter must fall on Sunday, but this was already the practice almost everywhere. In Western Christianity, using the Gregorian calendar, Easter always falls on a Sunday between 22 March and 25 April, within about seven days after the astronomical full moon. The following day, Easter Monday, is a legal holiday in many countries with predominantly Christian traditions. Eastern Orthodox Christians base Paschal date calculations on the Julian calendar. Because of the thirteen-day difference between the calendars from 1900 to 2099, 21 March in the Julian calendar corresponds, during the 21st century, to 3 April in the Gregorian calendar. Since the Julian calendar is no longer used as the civil calendar of the countries where Eastern Christian traditions predominate, Easter there varies between 4 April and 8 May in the Gregorian calendar. Also, because the Julian ‘full moon’ is always several days after the astronomical full moon, the eastern Easter is often later, relative to the visible Moon’s phases, than western Easter. Amongst the Oriental Orthodox, some churches have changed from the Julian to the Gregorian calendar and the date for Easter, as for other fixed and moveable feasts, is the same as in the Western church. The Gregorian calculation of Easter was actually based on a method devised by Aloysius Lilius, a doctor from the Calabria region in Italy, using the phases of the Moon, and it has been adopted by almost all Western Christians and by Western countries which celebrate national holidays at Easter. For the British Empire and colonies, a determination of the date of Easter Sunday using Golden Numbers and Sunday Letters was defined by the 1750 Calendar (New Style) Act with its annexe. This was designed to match the Gregorian calculation exactly.
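Out of curiosity I looked up how the Western date is actually worked out in practice. The standard published method is the anonymous Gregorian computus, often called the Meeus/Jones/Butcher algorithm, and the following is a minimal sketch of it in Python. The function name and the test year are my own choices, but the arithmetic follows the published algorithm, traditional single-letter variables and all:

    def gregorian_easter(year):
        """Month and day of Western Easter Sunday (Meeus/Jones/Butcher computus)."""
        a = year % 19                        # position in the 19-year lunar cycle
        b, c = divmod(year, 100)             # century and year within the century
        d, e = divmod(b, 4)                  # century leap-year corrections
        f = (b + 8) // 25
        g = (b - f + 1) // 3
        h = (19 * a + b - d - g + 15) % 30   # days from 21 March to the ecclesiastical full moon
        i, k = divmod(c, 4)
        l = (32 + 2 * e + 2 * i - h - k) % 7 # days on to the following Sunday
        m = (a + 11 * h + 22 * l) // 451
        month, day = divmod(h + l - 7 * m + 114, 31)
        return month, day + 1

    print(gregorian_easter(2022))            # (4, 17), i.e. Sunday 17 April 2022

Running it for this Platinum Jubilee year gives 17 April, which duly falls inside the 22 March to 25 April window mentioned above.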

Receiving the Holy Light at Easter.
St. George Greek Orthodox Church, Adelaide, Australia.

The above image shows the congregation lighting their candles from the new flame, just as the priest has retrieved it from the altar. Note that the picture is illuminated by flash; all electric lighting is off and only the oil lamps in front of the Iconostasis remain lit. In the 20th century, some individuals and institutions proposed changing the method of calculating the date of Easter, the most prominent proposal being the Sunday after the second Saturday in April. Despite having some support, proposals to reform the date have not been implemented. A congress of Eastern Orthodox bishops, which included representatives mostly from the Patriarchates of Constantinople and Serbia, met in Constantinople in 1923, where the bishops agreed to the Revised Julian calendar. The original form of this calendar would have determined Easter using precise astronomical calculations based on the meridian of Jerusalem; however, all the Eastern Orthodox countries that subsequently adopted the Revised Julian calendar adopted only that part of it that applied to festivals falling on fixed dates in the Julian calendar. The revised Easter computation that had been part of the original 1923 agreement was never permanently implemented in any Orthodox diocese. Here in the United Kingdom, the Easter Act 1928 set out legislation to change the date of Easter to be the first Sunday after the second Saturday in April (or, in other words, the Sunday in the period from 9 to 15 April). However, the legislation has not been implemented, although it remains on the statute book and could be implemented subject to approval by the various Christian churches. At a summit in Aleppo, Syria in 1997, the World Council of Churches (WCC) proposed a reform that would have replaced the present divergent practices with a calculation based on modern scientific knowledge, taking into account the actual astronomical instances of the spring equinox and full moon as seen from the meridian of Jerusalem, while also following the tradition of Easter being on the Sunday following the full moon. The recommended changes would have sidestepped the calendar issues and eliminated the difference in date between the Eastern and Western churches. The reform was proposed for implementation starting in 2001, but despite repeated calls it was not ultimately adopted by any member body. In January 2016, Christian churches again considered agreeing on a common, universal date for Easter, whilst also simplifying the calculation of that date, with either the second or third Sunday in April being popular choices. So far, no date has been agreed.
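The 1928 rule, by contrast, is simple enough to check in a couple of lines. Here is a small Python sketch of that hypothetical calculation; the function name is my own and, as noted above, the Act itself has never been brought into force:

    from datetime import date, timedelta

    def easter_act_1928(year):
        """The Sunday from 9 to 15 April, i.e. the first Sunday after the
        second Saturday in April, as the unimplemented Easter Act 1928
        would have fixed Easter."""
        d = date(year, 4, 9)                 # earliest possible day
        # date.weekday() counts Monday as 0 and Sunday as 6
        return d + timedelta(days=(6 - d.weekday()) % 7)

    print(easter_act_1928(2022))             # 2022-04-10

For 2022 that gives Sunday 10 April, a week earlier than the actual Easter Sunday of 17 April.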

Easter is seen by many as a time of new life and rebirth, and as one might expect, the egg is one such symbol. In Christianity it became associated with Jesus’s crucifixion and resurrection, and the custom of the Easter egg originated in the early Christian community of Mesopotamia, who stained eggs red in memory of the blood of Christ, shed at his crucifixion. As such, for Christians, the Easter egg is a symbol of the empty tomb. The oldest tradition is to use dyed chicken eggs. In the Eastern Orthodox Church, Easter eggs are blessed by a priest, both in families’ baskets together with other foods forbidden during Great Lent, and alone for distribution in the church or elsewhere.

Traditional red Easter eggs for blessing by a priest.

Easter eggs are a widely popular symbol of new life among the Eastern Orthodox and in the folk traditions of many Slavic countries. I have learned of a decorating process known as ‘pisanka’, a common name for an egg (usually that of a chicken, although goose or duck eggs are also used) richly ornamented using various techniques. The word ‘pisanka’ is derived from the verb ‘pisać’, which in contemporary Polish means exclusively ‘to write’ yet in old Polish also meant ‘to paint’. Originating as a pagan tradition, pisanki were absorbed by Christianity to become the traditional Easter egg, and they are now considered to symbolise the revival of nature and the hope that Christians gain from faith in the resurrection of Jesus Christ. The celebrated House of Fabergé workshops created exquisitely jewelled Easter eggs for the Russian Imperial family from 1885 to 1916. A modern custom in the Western world is to substitute decorated chocolate eggs filled with sweets. As many people give these up as their Lenten sacrifice, they can then be enjoyed at Easter after the preceding forty days of abstinence.

Easter eggs, a symbol of the empty tomb.

Having manufactured its first Easter egg in 1875, the British chocolate company Cadbury sponsors the annual Easter egg hunt which takes place in over two hundred and fifty National Trust locations here in the United Kingdom. On Easter Monday, the President of the United States holds an annual Easter egg roll on the White House lawn for young children. In some traditions children put out empty baskets for the Easter bunny to fill whilst they sleep; they wake to find their baskets filled with chocolate eggs and other treats. Many children around the world follow the tradition of colouring hard-boiled eggs and giving baskets of sweets. One fascinating fact to me, though, is that since the rabbit is considered a pest in Australia, the Easter Bilby is used as an alternative. Bilbies are native Australian marsupials and an endangered species, so to raise money and increase awareness of conservation efforts, bilby-shaped chocolates and related merchandise are sold in many stores throughout Australia as an alternative to Easter bunnies. But this time should surely be remembered as a new beginning, as it has been for centuries throughout the world. Happy Easter!

This week…
Not everyone has a home computer these days, but more and more folk find them useful as part of doing research on a range of subjects. Happily most public libraries allow folk free access to the ones they have, but time is strictly limited and use must be allocated. Sadly I can never get into my local library, as every time I phone up they tell me they are fully ‘booked’…

Click: Return to top of page or Index page

Time Team

Many years ago I was looking through TV channels and chanced upon a show called ‘Time Team’. The name was intriguing, so I sat down and watched. It fascinated me. I continued watching and I am glad I did. But sadly, after quite a few years, the TV series ended, so I was delighted to see a mention of it again recently. As is my way, I did some research online and found that a fair bit had been written, especially recently, and the following is what I found. I discovered some excellent images and some information on digs that were done last year, as well as work expected to begin quite soon this year. In fact the story of ‘Time Team’ is a well-rehearsed one, but what I didn’t know was that it started as ‘Timesigns’, a four-part series which first aired in 1991 and explored the archaeology around Roadford Lake. Also known as the Roadford Reservoir, this is a man-made reservoir fed by the River Wolf, located to the north-east of Broadwoodwidger in West Devon, eight miles (13 km) east of Launceston. I do like the delightful village names we have in this country! The village is quite small and according to the 2001 census it had a population of just 548. The reservoir, meanwhile, is the largest area of fresh water in the southwest of England. Exploring the archaeology of this area came about after Tim Taylor approached Mick Aston to present the series, and as a result, along with Phil Harding, three members of the future Time Team core were now in place. Yet despite bringing the past to life using the ingredients of excavation, landscape survey and reconstructions, including Phil felling a tree with a flint axe, Timesigns was a very different beast. The four-part series is still available to watch online at https://www.channel4.com/programmes/timesigns and watching it now provides a lesson in just how revolutionary the Time Team format actually was. Timesigns was slower paced, with Mick talking directly to the camera in a style more akin to a history documentary or Open University broadcast, and there was a focus on interesting, previously discovered artefacts. It did, however, include Phil Harding in woodland, seeking out raw materials for a reconstructed axe, allowing the audience to witness the hands-on practical process. Viewers were placed at the heart of the action, something that would later become a hallmark of Time Team. Whilst filming Timesigns, Tim and Mick often discussed other ways to bring archaeology to a television audience, and what later proved to be something of a providential conversation took place in a Little Chef on the Okehampton bypass. Mick mentioned that he had recently missed a train and, having a couple of hours to kill, decided to explore. During that time he deduced the town’s medieval layout and, struck by how much could be learned in a few hours, Tim wondered what could be achieved in a few days. When he took this idea to various studios though, no-one wanted to know. Still, it was not the first time that a chance conversation with Mick had started someone thinking about television archaeology. A few years earlier Tony Robinson had joined a trip which Mick was leading, as part of some education work for Bristol University, to Santorini, a Greek island in the southern Aegean Sea about 200 km (120 miles) southeast of the mainland. Mick’s aptitude for breathing life into the past convinced Tony that archaeology had untapped television potential, but when he returned to Britain Tony found the studios unwilling to take the idea further.
The breakthrough came when Timesigns proved an unexpected hit. Suddenly Channel 4 was receptive to the idea of a major archaeology programme; Tim Taylor devised the name ‘Time Team’ and in 1992 a pilot episode was filmed in Dorchester-on-Thames. Never screened and reputedly lost in the Channel 4 vaults, this pilot captured a show that was radically different to Timesigns. It was initially conceived as a quiz show in a similar vein to ‘Challenge Anneka’, where the team would be called on to solve archaeological mysteries whilst racing against the clock. Envelopes hidden at strategic points would set challenges along the lines of ‘find the medieval high street in two hours’. Judged a misfire by Channel 4, it could have been the end. Thankfully, the format was instead radically overhauled, although shades of the quiz-show concept did survive in early episodes. The onscreen introduction of all the team members and their specialist skills was a hangover from a time when participants would have varied from week to week, rather than coalescing into a core group. Meanwhile, Tony’s role transformed from quiz master to translator of all things archaeological for a general audience, and the final piece of the jigsaw fell into place during the fledgling Time Team’s first episode. This was filmed at Athelney, site of Alfred the Great’s apocryphal burnt cakes; the site was scheduled, precluding excavation, so John Gater, the programme’s ‘geophysics’ wizard, surveyed the field. Despite the Ancient Monuments Laboratory having drawn a blank the year before, John’s state-of-the-art kit revealed the monastic complex in startling clarity. Best of all, the cameras were rolling to capture the archaeologists’ euphoria as the geophysical plot emerged from a bulky printer in the back of the survey vehicle.

Mick Aston at work.

As well as an arresting demonstration of the power of teamwork, Athelney showed how geophysics could be at the heart of the programme. As Mick Aston observed, “the geophys and Time Team have always gone hand in hand. It is the programme really. Geophysics gives you that instant picture you can then evaluate”. John kept on top of technical advances, and the results of his survey of Brancaster Roman fort provide one of the really outstanding moments of the later series, with the breathtaking 3D model it produced of the buried structures persuading English Heritage to commission a complete survey on the spot. The original team brought an impressive breadth of skills to the programme. Victor Ambrus’ peerless ability to bring the past to life on the fly was well displayed after his artwork caught Tim Taylor’s eye in an edition of Reader’s Digest, and the late Robin Bush brought a degree of historical expertise that would be missed almost as much as the man himself following his departure in 2003. Despite their varied talents and backgrounds it quickly became apparent that the team had a natural chemistry, and Time Team became well known for its members’ individual ways and styles, including Mick’s famous striped jumper. Requested by a commissioning editor to wear more colourful clothing, Mick turned up in the most garish garment he could find as a joke, only to be told it was perfect. Far from a media concoction, the unique individuals on Time Team were filmed going about their work with an honesty and integrity that has seen the series heralded as Britain’s first reality television show. There can be little doubt that part of the show’s early success stems from the audience warming to the group’s genuine passion for teasing out the past. Rather than targeting the palaces and castles of the rich and famous, each episode sought to solve simple, local questions. This was highlighted by having a member of the public read out a letter of invitation at the beginning, posing the question they wanted answered. The message was simple: this is local archaeology, it is ‘your’ archaeology. It worked well, especially as the director of the first few seasons followed the digs as they evolved; his technique meant that viewers were often placed on the edge of a trench when discoveries happened and were made privy to key discussions. However, some archaeologists were initially, quite fairly, a bit sceptical. One aspect that some treated with suspicion was the three-day deadline. Research digs usually ran for weeks if not months, and it was questioned whether anything approaching responsible archaeology could be achieved in such a short space of time. It was certainly not ideally suited to showcasing all of the techniques available to modern archaeologists: much money would be spent on scientific dating, with the results only coming back in time for a line of dialogue to be dubbed on months after filming had concluded. Coincidentally, though, digging within a tight timeframe mirrored changes occurring within the profession. With units obliged to cut evaluation trenches to meet the deadlines of multi-million-pound construction projects, the 1990s saw a surge in short-term excavation projects, and this led to an appreciation of just how much information could be quickly gleaned from comparatively modest trenching. The thrill of time running out also engaged viewers, and Time Team’s popularity was rewarded with ever longer series.
Season one, aired in 1994, had four episodes, while season two followed with five, and season three then boasted six.

Some members of the Time Team.

Seasons nine to twelve have often been seen as Time Team’s ‘golden age’. Screening thirteen episodes a year, as well as live digs and specials, the programme seemed to be ever-present. Its stars were household names and at its zenith Time Team had regular audiences of over three million viewers. Now that the format was safely established, the programme was increasingly able to capitalise on its fame and access big-name sites, even Buckingham Palace. Whilst the allure of such sites created a powerful television spectacle, it also marked a move away from the programme’s humble local-archaeology origins. Even after its star began to wane, Time Team remained popular, and an audience study in 2006 indicated that twenty million people watched at least one show that year. However, it was season nineteen that changed everything, as in 2011 the production centre for the programme moved from London to Cardiff. Very much a political gesture aimed at building up regional television, the series was picked because it seemed a safe pair of hands. Sadly it cost the show almost all of its behind-the-scenes staff; expertise honed over fifteen years was lost at a stroke, to be replaced by crew and production staff who knew neither each other nor archaeology. Despite some great new people who learnt fast, expecting them to produce the same calibre of product immediately was just too big a demand. Time Team’s cost also made it vulnerable. Towards the end of its run an average episode would cost around £200,000, a budget more on the scale of a small drama show in the eyes of television insiders. Yet over twenty years Channel 4 had in fact pumped £4 million directly into British archaeology, and it is to the Channel’s credit that it did this despite much of that outlay being channelled into post-excavation work that never appeared on-screen. The money was well spent: today only five Time Team sites remain unpublished, a record that shames many UK units and academics.

The Time Team in 2012.

Time Team’s legacy leaves much to celebrate. It brought the money and expertise to investigate sites that would otherwise never have been touched. The Isle of Mull episode in season seventeen is a great example of what could be discovered. With only some strange earthworks exciting the curiosity of local amateur archaeologists to go on, the programme was flexible enough to take a gamble, and the result was a previously unknown 5th-century monastic enclosure linked to St Columba. It enabled a local group to secure Heritage Lottery Fund money to dig the site. Time Team excavations at Binchester’s Roman fort also helped kickstart a major research project. I was saddened when the series ended, but in 2021 there was excellent news when, thanks to overwhelming support from their fans, the Time Team returned for two brand-new digs in September that year, with the episodes due to be released this year on the YouTube channel ‘Time Team Official’. This will give viewers the chance to engage as the shows are researched and developed, see live blogs during filming, watch virtual-reality landscape data at home and join in Q&As with the team. Carenza Lewis, Stewart Ainsworth, Helen Geake and geophys genius John Gater will all be returning. They are joined by new faces representing the breadth of experts practising archaeology today. Sir Tony Robinson, who is an honorary patron, says: “I was delighted to hear about the plans for the next chapter in Time Team’s story. It’s an opportunity to find new voices and should help launch a new generation of archaeologists. While I won’t be involved in the new sites, I was delighted to accept the role of honorary patron of the Time Team project. It makes me chief super-fan and supporter. All armoury in our shared desire to inspire and stimulate interest in archaeology at all levels.” Like Tony, I too am a great fan of Time Team and feel sure that this bodes well, as there is now a Time Team website at http://www.timeteamdigital.com.

This week…

A Turkish proverb.

Click: Return to top of page or Index page

All Fools’ Day

More commonly known as April Fools’ Day, this is observed on April 1st each year and has been celebrated for several centuries by many different cultures, though its exact origins remain a mystery. Traditions include playing hoaxes or practical jokes on others, often ending the event by calling out “April Fool!” to the recipient so they realise they’ve been caught out by the prank. Whilst its exact history is shrouded in mystery, the embrace of April Fools’ Day jokes by the media and major brands has ensured the unofficial holiday’s long life. Mass media can be involved in these pranks, which may then be revealed as such on the following day. The day itself is not a public holiday in any country, although in Odessa in Ukraine the first of April is an official city holiday. The custom of setting aside a day for playing harmless pranks upon one’s neighbour has become a relatively common one in the world, and a disputed association between 1 April and foolishness appears in Geoffrey Chaucer’s ‘The Canterbury Tales’ (1392), in the ‘Nun’s Priest’s Tale’, where a vain cockerel is tricked by a fox with the words ‘Since March began thirty days and two’, i.e. 32 days since March began, which is April 1st. In 1508, the French poet Eloy d’Amerval referred to a ‘poisson d’avril’, possibly the first reference to the celebration in France. Prompted by the Protestant Reformation, the Council of Trent, an ecumenical council of the Catholic Church, issued condemnations of what it defined to be heresies committed by proponents of Protestantism, and also issued key statements and clarifications of the Church’s doctrine and teachings, including scripture, the Biblical canon, sacred tradition, original sin, the sacraments, Mass and the veneration of saints. The Council met for twenty-five sessions between 13 December 1545 and 4 December 1563. Pope Paul III, who convoked, or called together, the Council, oversaw the first eight sessions between 1545 and 1547, whilst the twelfth to sixteenth sessions, held between 1551 and 1552, were overseen by Pope Julius III and the final seventeenth to twenty-fifth sessions by Pope Pius IV between 1562 and 1563. In France, the use of January 1st as New Year’s Day was then officially adopted in 1564 by the Edict of Roussillon; the switch from the Julian to the Gregorian calendar followed in 1582. In the Julian calendar, as in the Hindu calendar, the new year began with the spring equinox around April 1st. So people who were slow to get the news of these changes, or simply failed to realise them and continued to celebrate the start of the new year during the last week of March and into April, became the butt of jokes and hoaxes and were therefore called ‘April fools’. These pranks included having paper fish placed on their backs and being referred to as ‘poisson d’avril’ (April fish), said to symbolise a young, easily caught fish or a gullible person. In 1686, a writer named John Aubrey referred to the celebration as ‘Fooles holy day’, the first British reference. On 1 April 1698, several people were tricked into going to the Tower of London to “see the Lions washed”.

An 1857 ticket to “Washing the Lions” at the Tower of London. No such event was ever held.

A study in the 1950s by two folklorists found that in the UK, and in countries whose traditions derive from here, the joking ceases at midday. This continues to be the practice, and a person playing a prank after noon is considered to be the ‘April fool’ themselves. Meanwhile in Scotland, April Fools’ Day was originally called ‘Huntigowk Day’. The name is a corruption of ‘hunt the gowk’, ‘gowk’ being Scots for a cuckoo or a foolish person. Alternative terms in Gaelic would be ‘Là na Gocaireachd’, ‘gowking day’, or ‘Là Ruith na Cuthaige’, ‘the day of running the cuckoo’. The traditional prank is to ask someone to deliver a sealed message that supposedly requests help of some sort. In fact, the message reads “Dinna laugh, dinna smile. Hunt the gowk another mile.” The recipient, upon reading it, will explain they can only help if they first contact another person, and they send the victim to this next person with an identical message, with the same result. In England a ‘fool’ is known by a few different names around the country, including ‘noodle’, ‘gob’, ‘gobby’ or ‘noddy’.

Big Ben going digital…

On April Fools’ Day 1980, the BBC announced that Big Ben’s clock face was going digital and that whoever got in touch first could win the clock hands. Over in Ireland, it was traditional to entrust a victim with an “important letter” to be given to a named person. That person would read the letter, then ask the victim to take it to someone else, and so on. The letter when opened contained the words “send the fool further”. A day of pranks is also a centuries-long tradition in Poland, signified by ‘prima aprilis’, ‘First April’ in Latin. It is a day when many pranks are played and hoaxes, sometimes very sophisticated ones, are prepared by people as well as by the media (which often cooperate to make the ‘information’ more credible) and even public institutions. Serious activities are usually avoided, and generally every word said on April 1st could be untrue. The conviction for this is so strong that the Polish anti-Turkish alliance with Leopold I, which was signed on 1 April 1683, was backdated to 31 March. For some in Poland, though, ‘prima aprilis’ also ends at noon of 1 April, and such jokes after that hour are considered inappropriate and not classy. Over in the Nordic countries, Danes, Finns, Icelanders, Norwegians and Swedes all celebrate April Fools’ Day: it is ‘aprilsnar’ in Danish, ‘aprillipäivä’ in Finnish and ‘aprilskämt’ in Swedish. In these countries, most news media outlets will publish exactly one false story on 1 April; for newspapers this will typically be a first-page article, but not the top headline. In Italy, France, Belgium and the French-speaking areas of Switzerland and Canada, the April 1st tradition is similarly known as April fish, being ‘poisson d’avril’ in French, ‘April vis’ in Dutch and ‘pesce d’aprile’ in Italian. Possible pranks include attempting to attach a paper fish to the victim’s back without being noticed, and this fish features prominently on many late 19th- to early 20th-century French April Fools’ Day postcards. Many newspapers also spread a false story on April Fish Day, and a subtle reference to a fish is sometimes given as a clue that it is an April Fools’ prank. In Germany, as in the UK, an April Fool prank is sometimes later revealed by shouting “April fool!” at the recipient, who becomes the April fool. Over in Ukraine, April Fools’ Day is widely celebrated in Odessa, where it has the special local name ‘Humorina’. It seems that this holiday arose in 1973, and there an April Fool prank is revealed by saying “Pervoye Aprelya, nikomu ne veryu”, which means “April the First, I trust nobody”, to the recipient. The festival includes a large parade in the city centre, free concerts, street fairs and performances. Festival participants dress up in a variety of costumes and walk around the city fooling around and pranking passersby. One of the traditions on April Fools’ Day is to dress up the main city monument in funny clothes. Humorina even has its own logo, a cheerful sailor in a lifebelt, created by the artist Arkady Tsykun. During the festival, special souvenirs bearing the logo are printed and sold everywhere. Quite why or how this began I cannot determine, but since 2010 the April Fools’ Day celebrations have included an International Clown Festival and the two are celebrated as one. In 2019, the festival was dedicated to the 100th anniversary of the Odessa Film Studio and all events were held with an emphasis on cinema.

An April Fools’ Day prank in the Public Garden in Boston, Massachusetts.
The sign reads “No Photography Of The Ducklings Permitted”

As well as people playing pranks on one another on April Fools’ Day, elaborate pranks have appeared on radio and television stations, in newspapers and on websites, as well as those performed by large corporations. In one famous prank in 1957, the BBC broadcast a film in their ‘Panorama’ current affairs series purporting to show Swiss farmers picking freshly-grown spaghetti, in what they called the Swiss spaghetti harvest. The BBC was soon flooded with requests to purchase a spaghetti plant, forcing them to declare the film a hoax on the news the next day. With the advent of the Internet and readily available global news services, April Fools’ pranks can catch and embarrass a wider audience than ever before. But the practice of April Fool pranks and hoaxes is somewhat controversial. The mixed opinions of critics are epitomised in the reception to the 1957 BBC ‘spaghetti tree hoax’: newspapers were split over whether it was a great joke or a terrible hoax on the public. The positive view is that April Fools’ can be good for one’s health because it encourages ‘jokes, hoaxes, pranks, and belly laughs’ and brings all the benefits of laughter, including stress relief and reducing strain on the heart. There are many ‘best of’ April Fools’ Day lists compiled to showcase the best examples of how the day is celebrated, and various April Fools’ campaigns have been praised for their innovation, creativity, writing and general effort. The negative view, however, describes April Fools’ hoaxes as ‘creepy and manipulative, rude and a little bit nasty’, as well as being based on deceit and ‘Schadenfreude’, the experience of pleasure, joy or self-satisfaction that comes from learning of or witnessing the troubles, failures or humiliation of another. When genuine news or a genuinely important order or warning is issued on April Fools’ Day, there is a risk that it will be misinterpreted as a joke and ignored. For example, when Google (known to play elaborate April Fools’ Day hoaxes) announced in 2004 the launch of Gmail with one-gigabyte inboxes, at a time when competing webmail services offered four megabytes or less, many dismissed it as an outright joke. On the other hand, sometimes stories intended as jokes are taken seriously. Either way, there can be adverse effects such as confusion, misinformation, wasted resources (especially when the hoax concerns people in danger) and even legal or commercial consequences. In Thailand, the police even warned ahead of April Fools’ Day in 2021 that posting or sharing fake news online could lead to a maximum of five years’ imprisonment. Other examples of genuine news on April 1st mistaken for a hoax include the warnings about the Aleutian Island earthquake’s tsunami in Hawaii and Alaska in 1946 that killed 165 people, the news on April 1st that the comedian Mitch Hedberg had died on 29 March 2005, the announcement in 2009 that the long-running US soap opera ‘Guiding Light’ was being cancelled, and the news in 2011 that the US basketball player Isaiah Thomas had declared for the NBA draft, doubted probably because of his age. As well as April 1st being recognised as April Fools’ Day, there are a few other notable days, particularly the first of each month when, in English-speaking countries (mainly Britain, Ireland, Australia, New Zealand and South Africa), it is a custom to say “a pinch and a punch for the first of the month” or a similar alternative, though this is typically said by children.
In some places the victim might respond with “a flick and a kick for being so quick”, but that I haven’t heard said for many a long year. I do still say (or share in text messages etc) “White rabbits” as this is meant to bring good luck and to prevent the recipient saying ‘pinch, punch, first of the month’ to you! I do wonder sometimes how one of my older brothers managed at school on this particular day though, as April 1st is his birthday – perhaps he managed to keep it quiet somehow…

This week…
There are so many words in English that seem to have fallen out of use and I am starting to find a few. We know that when a word is used to lay emphasis on a noun, it is called an emphatic adjective. Examples are found in “The very idea of living on the moon is impractical” and “They are the only people who helped me”, where ‘very’ and ‘only’ provide the emphasis. But there are also ‘phatic’ expressions, which denote or relate to language used for general purposes of social interaction, rather than to convey information or ask questions. Utterances such as “hello, how are you?” and “nice morning, isn’t it?” are examples of phatic expressions.

Click: Return to top of page or Index page