So a week ago, Friday the 13th came and went around the world. To many it would have been an ordinary day, whilst others may have almost feared it. I have heard about some folk not wanting to even get out of bed, for fear of something ‘bad’ happening to them. Many of us have our own ways, our own peculiarities, perhaps eccentricities, even foibles. Incidentally, the latter word can also mean the part of a sword blade from the middle to the point.

In fact a superstition is defined as any belief or practice considered to be irrational or supernatural, attributed to fate or magic, perceived supernatural influence or fear of that which is unknown. It is commonly applied to beliefs and practices surrounding luck, amulets, astrology, fortune-telling, spirits and certain paranormal entities, more particularly the belief that future events can be foretold by specific and apparently unrelated prior events. Equally, the word ‘superstition’ is often used to refer to a religion not practised by the majority of a given society, regardless of whether the prevailing religion contains alleged superstitions or not. The Oxford English Dictionary (OED) defines superstition as ‘a religious belief or practice considered to be irrational, unfounded, or based on fear or ignorance; excessively credulous belief in and reverence for the supernatural’, as well as ‘a widely held but irrational belief in supernatural influences, especially as leading to good or bad luck, or a practice based on such a belief’. The Oxford Advanced Learner’s Dictionary defines superstition as ‘the belief that particular events happen in a way that cannot be explained by reason or science; the belief that particular events bring good or bad luck’. According to Merriam-Webster’s dictionary, it is ‘a belief or practice resulting from ignorance, fear of the unknown, trust in magic or chance, or a false conception of causation’. Meanwhile, the Cambridge Dictionary denotes superstition as ‘a belief that is connected with old ideas about magic etc., without grounding in human reason or scientific knowledge’. The dictionary also cites the Cambridge English Corpus for contexts in which the term ‘superstition’ is used to label controversial beliefs, the practices of confessional opponents or the beliefs of the ignorant masses as superstitious. 
Different authors have attempted to categorise different superstitions and one even gave time a category, noting the observance of various times such as dog days, Egyptian days (which, in Europe during the Middle Ages, were certain days of the year held to be unlucky), year prognoses and lunar timings, as well as signs thought to carry significance, like particular animal behaviours, such as the call of birds, the neighing of horses or the sighting of comets, as well as dreams. But identifying something as a ‘superstition’ is considered by many to be somewhat pejorative, or a mark of contempt, and such items are commonly referred to instead as folklore. Webster’s ‘The Encyclopaedia of Superstitions’ points out that whilst many superstitions are related to religion, people carry their own subjective perceptions of one another, and people of one belief are likely to call people of another belief superstitious. Constantine regarded paganism as a superstition, whilst on the other hand Tacitus regarded Christianity as a pernicious superstition. Both Paul the Apostle and Martin Luther (10 November 1483 – 18 February 1546) perceived anything that was not centred on Christ to be superstitious. Whilst the formation of the Latin word ‘superstitio’ is clear, from the verb ‘super-stare’, i.e. to stand over, stand upon, to survive, its original intended sense is less clear. It can be interpreted as ‘standing over a thing in amazement or awe’, but other possibilities have been suggested, for example the sense of excess, such as over-scrupulousness or over-ceremoniousness in the performing of religious rites, or else the survival of old, irrational religious habits. The earliest known use of the word as a noun is found in the written works of Plautus, Ennius and later Pliny, with the meaning of ‘art of divination’. 
From its use in the Classical Latin of Livy and Ovid, the word carries the pejorative sense that it holds today, denoting an excessive fear of the gods or unreasonable religious belief, as opposed to the proper and reasonable awe of the gods. However, Cicero derived the term from ‘superstitiosi’, literally ‘those who are left over’, meaning survivors or descendants, connecting it to the excessive anxiety of parents hoping that their children would survive them to perform their necessary funerary rites.

Greek and Roman polytheists (those with a belief in multiple deities, usually assembled into a pantheon of gods and goddesses along with their own religious sects and rituals) modelled their relations with the gods on political and social terms and scorned the man who constantly trembled with fear at the thought of the gods, as a slave feared a cruel and capricious master. Such fear of the gods was what the Romans considered to be the meaning of superstition. The current Catechism of the Catholic Church considers superstition sinful in the sense that it denotes ‘a perverse excess of religion’, demonstrating a lack of trust in divine providence and a violation of the first of the Ten Commandments. The Catechism is therefore a defence against the accusation that Catholic doctrine is superstitious. In 1948 a behavioural psychologist published an article in which he described his pigeons exhibiting what appeared to be superstitious behaviour. One pigeon was making turns in its cage, another would swing its head in a pendulum motion, whilst others displayed a variety of different behaviours. He believed these behaviours were all performed ritualistically in an attempt to receive food from a dispenser, even though the dispenser had been programmed to release food at set time intervals regardless of the pigeons’ actions; in other words, the pigeons appeared to be trying to influence their feeding schedule by performing these actions. He then extended this as a proposition regarding the nature of superstitious behaviour in humans. That was his considered opinion. But some people seem to believe that superstitions influence events by changing the likelihood of currently possible outcomes rather than by creating new possible outcomes. In sporting events, for example, a lucky ritual or object is thought to increase the chance that an athlete will perform at the peak of their ability, rather than increasing their overall ability at that sport. 
There are some people who tend to attribute events to supernatural causes, most often under two circumstances. In the first instance, they are more likely to attribute an event to a superstitious cause if it is unlikely than if it is likely. In other words, the more surprising the event, the more likely it is to evoke a supernatural explanation. This is believed to stem from an ‘effectance’ motivation – a basic desire to exert control over one’s environment. When no natural cause can explain a situation, attributing an event to a superstitious cause may give people some sense of control and ability to predict what will happen in their environment. In the second, people are more likely to attribute an event to a superstitious cause if it is negative rather than positive. This is called ‘negative agency bias’. In American baseball, for example, Boston Red Sox fans attributed the failure of their team to win the World Series for 86 years to the ‘Curse of the Bambino’, an alleged curse placed on the team for trading a professional baseball player named Babe Ruth to the New York Yankees so that the team owner could fund a Broadway musical. When the Red Sox finally won the World Series in 2004, however, the team’s success was attributed to the team’s skill and the rebuilding effort of the new owner and general manager. As you might expect, people are more likely to perceive their computer as acting according to its own intentions when it malfunctions than when it functions properly. However, according to various analysts who study consumer behaviour, superstitions are employed as a heuristic tool, and as a result they can influence a variety of consumer behaviours. These analysts say that, after taking into account a set of antecedents, trait superstitions are predictive of a wide variety of consumer beliefs, like beliefs in astrology or in common negative superstitions, for example the fear of black cats. 
Additionally, a general proneness to be superstitious may lead to an enduring temperament to gamble, participate in promotional games, invest in stocks, forward superstitious emails, keep good-luck charms and exhibit sports fan regalia. But superstition can also be found in politics: the Ancient Greek historian Polybius, in his work “The Histories”, used the word ‘superstition’ to explain that in Ancient Rome such beliefs maintained the cohesion of the Roman state, operating as a means of controlling the masses, in particular to achieve both political and mundane ends.

Boston Red Sox.

In the Classical era, the existence of gods was actively debated amongst both philosophers and theologians and consequently opposition to superstition arose. The poem ‘De Rerum Natura’, written by the Roman poet and philosopher Lucretius, further developed the opposition to superstition. Cicero’s work ‘De Natura Deorum’ also had a great influence on the development of the modern concept of superstition, as well as on the word itself. Whereas Cicero distinguished ‘superstitio’ from ‘religio’, Lucretius used only the word ‘religio’; for Cicero, ‘superstitio’ meant excessive fear of the gods, and he believed that only superstition, and not religion, should be abolished. In fact the Roman Empire also made laws condemning those who excited excessive religious fear in others. During the Middle Ages, the idea of God’s influence on the world’s events went mostly undisputed. Trials by ordeal were quite frequent, although King Frederick II (1194 – 1250 AD) was the first king to explicitly outlaw them, as they were considered to be irrational. The rediscovery of lost classical works and scientific advancement led to a steadily increasing disbelief in superstition, and a new, more rationalistic view began to take hold. Indeed, opposition to superstition was central to the Age of Enlightenment. Most superstitions arose over the course of many centuries and were rooted in regional and historical circumstances, such as religious beliefs or the natural environment. For instance, geckos were at one time believed to be of medicinal value in many Asian countries, whilst in China (and now in other countries too) Feng Shui embodies the belief that certain placements bring good or bad fortune, for example that a room in the northwest corner of a house may have very bad energy. Similarly, the number 8 is thought to be a lucky number in China, so that it is more common than any other number in the Chinese housing market. 
Equally there are certain phrases, and in particular plays, which are considered to bring bad luck; for example, it is said that a coven of witches objected to William Shakespeare using real incantations, so they put a curse on that well-known Scottish play. Legend has it the play’s first performance (around 1606) was riddled with disaster. The actor playing Lady Macbeth died suddenly, so Shakespeare himself had to take on the part. Prior to a performance, some actors will say “break a leg” in the hope that this will ward off any unlucky events. But there are some quite reasonable actions which at first seem to be without much foundation. At one time, many children were forced to use their right hands for writing, mainly owing to prejudice against the awkwardness of left-handed writing and the prevalence of ‘right-handed’ utensils. Happily, left-handedness is more accepted nowadays, which is all to the good for me personally! Many years ago, when greeting someone, shaking hands was done with the right hand because back then a great many swordsmen wore their sword on the left side of the waist, being right-handed, so that it was easy to draw. By shaking hands with the right hand they showed openness and trust towards the person they were greeting, not hostility. It is fascinating how these actions have their historical connections, rather than being simply thought of as superstition.

This week… an interesting tale.
The following is an actual question given on a University of Washington chemistry mid-term paper. The answer given by one student was considered so ‘profound’ that the professor shared it with colleagues via the Internet, which is of course why we now have the pleasure of enjoying it as well.

Bonus Question:
Is Hell exothermic (gives off heat) or endothermic (absorbs heat)?
Most of the students wrote proofs of their beliefs using Boyle’s Law (gas cools down when it expands and heats up when it is compressed) or some variant. One student, however, wrote the following:

“First, we need to know how the mass of Hell is changing in time. So we need to know the rate that souls are moving into Hell and the rate they are leaving. I think that we can safely assume that once a soul gets to Hell, it will not leave. Therefore, no souls are leaving.

As for how many souls are entering Hell, let’s look at the different religions that exist in the world today. Most of these religions state that if you are not a member of their religion, you will go to Hell. Since there is more than one of these religions and since people do not belong to more than one religion, we can project that all souls go to Hell.

With birth and death rates as they are, we can expect the number of souls in Hell to increase exponentially. Now, we look at the rate of change of the volume in Hell because Boyle’s Law states that in order for the temperature and pressure in Hell to stay the same, the volume of Hell has to expand proportionately as souls are added.

This gives two possibilities:
1) If Hell is expanding at a slower rate than the rate at which souls enter Hell, then the temperature and pressure in Hell will increase until all Hell breaks loose.
2) If Hell is expanding at a rate faster than the increase of souls in Hell, then the temperature and pressure will drop until Hell freezes over. So which is it?

If we accept the postulate given to me by Teresa during my Freshman year, “…that it will be a cold day in Hell before I sleep with you”, and take into account the fact that I still have not succeeded in having an affair with her, then #2 above cannot be true, and thus I am sure that Hell is exothermic and will not freeze over.”

This student received the only “A”.


The Gold Standard

As part of my research on gold for last week’s blog post, I saw an item on the Internet claiming that gold had only stopped being used as a monetary standard in recent years, so I decided to do just a bit of research, because what the writer of the article had said simply didn’t seem right to me. I learned that at the time of London’s first Olympics in 1908, the amount of money in circulation in the UK was tied to the amount of gold in the economy. The gold standard had prevailed for most of the previous two centuries and was to continue until World War I began in 1914. But the UK was not the only country whose monetary system was based on gold. From 1880 to 1914, almost all of the world’s leading economies had followed suit, with each country fixing the price of gold in their local currency. In the UK, the price of one troy ounce of gold was £4 5s 0d (£4.25); in the US it was fixed at $20.67. This implied a fixed exchange rate between the pound sterling and the dollar ($4.87 per £1), and similar fixed rates between all the other countries on the gold standard. To enhance the credibility of the arrangements, the authorities guaranteed that paper money was fully convertible into gold, and anyone could request to convert their pounds into the equivalent value of gold. Crucially, this limited the ability of governments to print money, and the gold standard stopped countries from deliberately devaluing their own currency in order to improve the competitiveness of their exports or pay off their debts. As a result, membership of the gold standard was seen as a commitment to sound government finance. By constraining the growth in money supply, the gold standard was also believed to contribute to stable prices. Over long periods this was generally the case, as price levels in the UK were much the same in 1914 as they were in 1880. However, the gold standard’s inflexibility had major disadvantages. 
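Out of curiosity, the implied exchange rate is just the ratio of the two fixed gold prices. A quick sketch, using the rounded figures above (so the result lands a whisker under the usual $4.87, which was derived from slightly more precise historical gold prices):

```python
# Gold-standard parity: each currency fixes a price for gold, so the
# exchange rate between any two currencies is the ratio of those prices.
GOLD_PRICE_GBP = 4.25   # pounds per troy ounce of gold (£4 5s 0d)
GOLD_PRICE_USD = 20.67  # dollars per troy ounce of gold

implied_rate = GOLD_PRICE_USD / GOLD_PRICE_GBP  # dollars per pound
print(f"${implied_rate:.2f} per £1")  # → $4.86 per £1, close to the quoted $4.87
```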
Changes in the world’s money supply were dependent not on economic conditions, but on the amount of new gold that was mined. This meant that on the one hand, monetary policy could not be used to respond to recessions and booms, but on the other, significant rises in gold production would lead to faster money supply growth and ultimately inflation, regardless of a country’s underlying economic conditions. World War I saw the end of the gold standard as governments suspended the convertibility of their currencies into gold in order to freely finance rapidly escalating military expenditure. It was briefly reintroduced in some countries after the War, including the UK from 1925 to 1931, but fell apart again during the Great Depression. After World War II, a form of gold standard under the Bretton Woods system, which involved the dollar being fixed to gold and other currencies being fixed to the dollar, was in operation until 1971. So technically, a ‘gold standard’ is a monetary system in which the standard economic unit of account is based on a fixed quantity of gold. This was the basis for the international monetary system from the 1870s to the early 1920s, from the late 1920s to 1932, and from 1944 until 1971, when the United States unilaterally terminated the convertibility of the US dollar to gold for foreign central banks, effectively ending the Bretton Woods system, though many states still hold substantial gold reserves. In fact it seems that historically, the silver standard and bimetallism have been more common than the gold standard, and the shift to an international monetary system based on gold reflected accident, network externalities and ‘path dependence’ (a concept in economics and the social sciences, referring to processes where past events or decisions constrain later events or decisions). 
Great Britain accidentally adopted a ‘de facto’ gold standard in 1717 when Sir Isaac Newton, who was then Master of the Royal Mint, set the exchange rate of silver to gold far too low, thus causing silver coins to go out of circulation. As Great Britain became the world’s leading financial and commercial power in the 19th century, other states increasingly adopted Britain’s monetary system. The gold standard was largely abandoned during the Great Depression before being reinstated in a limited form as part of the post-World War II Bretton Woods system. The gold standard was abandoned due to its propensity for volatility, as well as the constraints it imposed on governments: by retaining a fixed exchange rate, governments were hamstrung in engaging in expansionary policies to, for example, reduce unemployment during economic recessions. There is a consensus among economists that a return to the gold standard would not be beneficial, and most economic historians reject the idea that the gold standard ‘was effective in stabilising prices and moderating business-cycle fluctuations during the nineteenth century.’ So it was that Britain slipped into a ‘gold specie standard’ in 1717 by over-valuing gold at 15.2 times its weight in silver, ‘specie’ meaning money in the form of coins rather than notes. Britain was unique among nations in using gold in conjunction with clipped, underweight silver shillings, a situation addressed only before the end of the 18th century by the acceptance of gold proxies like token silver coins and banknotes. From the more widespread acceptance of paper money in the 19th century emerged the gold bullion standard, a system where gold coins do not circulate, but authorities like central banks agree to exchange circulating currency for gold bullion at a fixed price. 
A standard of this kind first emerged in the late 18th century to regulate exchange between London and Edinburgh, and it became the predominant means of implementing the gold standard internationally in the 1870s. Restricting the free circulation of gold under the Classical Gold Standard period from the 1870s to 1914 was also needed in countries which decided to implement the gold standard while guaranteeing the exchangeability of huge amounts of legacy silver coins into gold at the fixed rate (rather than valuing publicly-held silver at its depreciated value).

Here in the United Kingdom the English pound sterling, introduced around the year 800 CE, was initially a silver standard unit worth 20 shillings or 240 silver pennies. The penny initially contained 1.35 g fine silver, reduced by 1601 to 0.464 g, giving a shilling (12 pennies) of 5.57 g fine silver. The problem of clipped, underweight silver pennies and shillings was a persistent, unresolved issue from the late 17th century to the early 19th century. In 1717 the value of the gold guinea (of 7.6885 g fine gold) was fixed at 21 shillings, resulting in a gold-silver ratio of 15.2, higher than prevailing ratios in Continental Europe. Great Britain was therefore ‘de jure’ under a bimetallic standard, with gold serving as the cheaper and more reliable currency compared to clipped silver; full-weight silver coins did not circulate but went to Europe, where 21 shillings fetched over a guinea in gold. Several factors helped extend the British gold standard into the 19th century. The Brazilian Gold Rush of the 18th century supplied significant quantities of gold to Portugal and Britain, with Portuguese gold coins also legal tender in Britain. Ongoing trade deficits with China (which sold to Europe but had little use for European goods) drained silver from the economies of most of Europe; combined with greater confidence in banknotes issued by the Bank of England, this opened the way for gold, as well as banknotes, becoming acceptable currency in lieu of silver. There was, in addition, the acceptability of token or subsidiary silver coins as substitutes for gold before the end of the 18th century. These were initially issued by the Bank of England and other private companies, with permanent issuance of subsidiary coinage from the Royal Mint commencing after the Great Recoinage of 1816.
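As a quick check, the 15.2 gold-silver ratio follows directly from the coin contents quoted above:

```python
# Gold-silver ratio implied by fixing the guinea at 21 shillings (1717).
GUINEA_GOLD_G = 7.6885    # grams of fine gold in one guinea
SHILLING_SILVER_G = 5.57  # grams of fine silver in one shilling
GUINEA_IN_SHILLINGS = 21

silver_per_guinea = GUINEA_IN_SHILLINGS * SHILLING_SILVER_G  # 116.97 g of silver
ratio = silver_per_guinea / GUINEA_GOLD_G
print(f"gold-silver ratio: {ratio:.1f}")  # → gold-silver ratio: 15.2
```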

The British gold sovereign or £1 coin was the pre-eminent circulating gold coin during the classical gold standard period.

Following the Napoleonic Wars, Britain legally moved from the bimetallic to the gold standard in the 19th century in several steps, when the 21-shilling guinea was discontinued in favour of the 20-shilling gold sovereign, or £1 coin. From the second half of the 19th century Britain then introduced its gold standard to Australia, New Zealand and the British West Indies in the form of circulating gold sovereigns, as well as banknotes that were convertible at par into sovereigns or Bank of England banknotes. The classical gold standard of the late 19th century was not merely a superficial switch from circulating silver to circulating gold. The bulk of silver currency was actually replaced by banknotes and token currency whose gold value was guaranteed by gold bullion and other reserve assets held inside central banks. In turn, the gold exchange standard was just one step away from modern fiat currency, with banknotes issued by central banks whose value is secured by the bank’s reserve assets, but whose exchange value is determined by the monetary policy of the central bank and its objectives on purchasing power, in lieu of a fixed equivalence to gold. The final chapter of the classical gold standard, ending in 1914, saw the gold exchange standard extended to many Asian countries by fixing the value of local currencies to gold or to the gold standard currency of a Western colonial power. The Netherlands East Indies guilder was the first Asian currency pegged to gold, in 1875, via a gold exchange standard which maintained its parity with the gold Dutch guilder. International monetary conferences were called before 1890, with various countries actually pledging to maintain the ‘limping’ standard of freely circulating legacy silver coins in order to prevent the further deterioration of the gold–silver ratio, which reached 20 in the 1880s. After 1890, however, the decline in the price of silver could not be prevented and the gold–silver ratio rose sharply above 30. 
In 1893 the Indian rupee of 10.69 g fine silver was fixed at 16 British pence (or £1 = 15 rupees; gold-silver ratio 21.9), with legacy silver rupees remaining legal tender. Broadly similar gold standards were implemented in Japan in 1897, in the Philippines in 1903 and in Mexico in 1905, when the previous yen or peso of 24.26 g silver was redefined to approximately 0.75 g gold, or half a United States dollar (ratio 32.3). Japan gained the needed gold reserves after the Sino-Japanese War of 1894–1895; for Japan, moving to gold was considered vital for gaining access to Western capital markets. Governments with insufficient tax revenue had suspended convertibility repeatedly in the 19th century. However, the real test came with the onset of World War I. The gold specie standard came to an end in the United Kingdom and the rest of the British Empire with the outbreak of that war. A run on sterling caused Britain to impose exchange controls that fatally weakened the standard; convertibility was not legally suspended, but gold prices no longer played the role that they did before. In financing the war, as well as abandoning gold, many of the belligerents suffered drastic inflations. Price levels doubled in the United States and Britain, tripled in France and quadrupled in Italy. Exchange rates changed less, even though European inflation rates were more severe than America’s, and this meant that the cost of American goods decreased relative to those in Europe. Between August 1914 and spring 1915, the dollar value of U.S. exports tripled and its trade surplus exceeded $1 billion for the first time. Ultimately, the system could not deal quickly enough with the large deficits and surpluses. This was previously attributed to downward wage rigidity brought about by the advent of unionised labour, but is now considered an inherent fault of the system that arose under the pressures of war and rapid technological change. 
In any event, prices had not reached equilibrium by the time of the Great Depression, which served to kill off the system completely.
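For what it’s worth, the Asian pegs mentioned earlier can be checked with the same sort of arithmetic. The sovereign’s fine gold content of about 7.322 g is an assumed standard figure not stated in the text:

```python
SOVEREIGN_GOLD_G = 7.322  # grams of fine gold in a £1 sovereign (assumed figure)

# Indian rupee, 1893: 10.69 g fine silver, fixed at £1 = 15 rupees.
silver_per_pound = 15 * 10.69  # 160.35 g of silver exchanges for £1
rupee_ratio = silver_per_pound / SOVEREIGN_GOLD_G
print(f"rupee gold-silver ratio: {rupee_ratio:.1f}")  # → 21.9, as quoted

# Japanese yen / Mexican peso: 24.26 g silver redefined as 0.75 g gold.
yen_ratio = 24.26 / 0.75
print(f"yen gold-silver ratio: {yen_ratio:.1f}")      # → 32.3, as quoted
```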

The gold specie standard ended in the United Kingdom and the rest of the British Empire at the outbreak of World War I, when Treasury notes replaced the circulation of gold sovereigns and gold half sovereigns. Legally, however, the gold specie standard was not abolished; its end was effected by the Bank of England through appeals to patriotism, urging citizens not to redeem paper money for gold specie. It was only in 1925, when Britain returned to the gold standard in conjunction with Australia and South Africa, that the gold specie standard was officially ended. The British Gold Standard Act 1925 introduced the gold bullion standard and simultaneously repealed the gold specie standard; the new standard ended the circulation of gold specie coins. Instead, the law compelled the authorities to sell gold bullion on demand at a fixed price, but ‘only in the form of bars containing approximately four hundred troy ounces (12 kg) of fine gold’. The pound left the gold standard in 1931, and a number of currencies of countries that historically had performed a large amount of their trade in sterling were pegged to sterling instead of to gold. The Bank of England took the decision to leave the gold standard abruptly and unilaterally. Many other countries had followed Britain in returning to the gold standard, leading to a period of relative stability but also deflation. This state of affairs lasted until the Great Depression (1929–1939) forced countries off the gold standard. In the summer of 1931, a Central European banking crisis led Germany and Austria to suspend gold convertibility and impose exchange controls, as a run on Austria’s largest commercial bank had caused it to fail. The run spread to Germany, where the central bank also collapsed. International financial assistance was too late and in July 1931 Germany adopted exchange controls, followed by Austria in October. 
The Austrian and German experiences, as well as British budgetary and political difficulties, were among the factors that destroyed confidence in sterling in mid-July 1931. Runs ensued and the Bank of England lost much of its reserves. Loans of £50 million from the American and French central banks were insufficient and were exhausted in a matter of weeks, due to large gold outflows across the Atlantic. On September 19, 1931, speculative attacks on the pound led the Bank of England to abandon the gold standard, ‘ostensibly temporarily’. However, the ostensibly temporary departure had unexpectedly positive effects on the economy, leading to greater acceptance of leaving the gold standard: the British could now use monetary policy to stimulate the economy. Australia and New Zealand had already left the standard and Canada quickly followed suit. The interwar partially-backed gold standard was inherently unstable because of the conflict between the expansion of liabilities to foreign central banks and the resulting deterioration in the Bank of England’s reserve ratio. France was then attempting to make Paris a world-class financial centre, and it received large gold flows as well. By the end of 1932, the gold standard had been abandoned as a global monetary system; upon taking office in March 1933, U.S. President Franklin D. Roosevelt took the United States off it, and finally Czechoslovakia, Belgium, France, the Netherlands and Switzerland abandoned the gold standard in the mid-1930s. So it was ended many years ago. Much has been written subsequently about the gold standard, but one economist seems to have summed it up by saying “We don’t have the gold standard. It’s not because we don’t know about the gold standard, it’s because we do.”

This week…

Let there be spaces in your togetherness, and
Let the winds of the heavens dance between you.
Love one another but make not a bond of love;
Let it rather be a moving sea between the shores of your souls.
Fill each other’s cup but drink not from one cup.
Give one another of your bread but eat not from the same loaf.
Sing and dance together and be joyous,
But let each one of you be alone, even as the strings of a lute are alone
Though they quiver with the same music.
Give your hearts, but not into each other’s keeping.
For only the hand of Life can contain your hearts.
And stand together, yet not too near together.
For the pillars of the temple stand apart,
And the oak tree and the cypress grow not in each other’s shadow.

~ Khalil Gibran (06 January 1883 – 10 April 1931)



This substance is a chemical element with the symbol “Au”, from the Latin ‘aurum’. It is a bright, slightly orange-yellow, dense, soft, malleable and ductile metal in its pure form. It is also one of the least reactive chemical elements and is solid under standard conditions. Gold often occurs in its elemental or native form, as nuggets or grains in rocks, veins and alluvial deposits. It occurs in a solid solution series with the native element silver (as electrum), naturally alloyed with other metals like copper and palladium, and as mineral inclusions such as within pyrite. It occurs less commonly in minerals as gold compounds, often with tellurium (gold tellurides). Gold is resistant to most acids, though it does dissolve in aqua regia, a mixture of nitric acid and hydrochloric acid, forming a soluble tetrachloroaurate anion. It is, however, insoluble in nitric acid, which dissolves silver and base metals, a property long used to refine gold and confirm the presence of gold in metallic substances, giving rise to the term ‘acid test’. Gold dissolves in alkaline solutions of cyanide, which are used in mining and electroplating. It dissolves in mercury, forming amalgam alloys, though as the gold acts simply as a solute this is not a chemical reaction. A relatively rare element, gold is classed as a precious metal that has been used for coinage, jewellery and other arts throughout recorded history. In the past, a gold standard was often implemented as a monetary policy, but gold coins ceased to be minted as a circulating currency in the 1930s, and the world gold standard was abandoned for a fiat currency system after 1971. In 2017 the world’s largest gold producer by far was China, with 440 tonnes per year, and as of 2020 a total of around 201,296 tonnes of gold exists above ground. This is equal to a cube with each side measuring roughly 21.7 metres (71 ft). The world consumption of new gold produced is about 50% in jewellery, 40% in investments and 10% in industry. 
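Out of interest, the quoted cube can be sanity-checked from the density of gold, about 19.3 tonnes per cubic metre (an assumed figure not given in the text); the result lands within about 1% of the quoted 21.7 metres:

```python
GOLD_DENSITY_T_PER_M3 = 19.3   # gold's density, 19.3 g/cm3 (assumed figure)
ABOVE_GROUND_TONNES = 201_296  # total above-ground gold quoted for 2020

volume_m3 = ABOVE_GROUND_TONNES / GOLD_DENSITY_T_PER_M3  # roughly 10,430 m3
side_m = volume_m3 ** (1 / 3)                            # cube root gives the side
print(f"cube side: {side_m:.1f} m")  # → about 21.8 m, close to the quoted 21.7 m
```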
The high malleability, ductility, resistance to corrosion and most other chemical reactions, and electrical conductivity of gold have led to its continued use in corrosion-resistant electrical connectors in all types of computerised devices, its chief industrial use. It is also used in infra-red shielding, coloured glass production, gold-leafing and tooth restoration. Certain gold salts are still used as anti-inflammatories in medicine. A gold nugget of 5mm (0.20in) in size can be hammered into a gold foil of about 0.5 square metres (5.4 square feet) in area. Gold can be drawn into a wire of single-atom width, and then stretched considerably before it breaks. Such nanowires distort via formation, reorientation and migration of dislocations and crystal twins without noticeable hardening. Gold leaf can be beaten thin enough to become semi-transparent, and the transmitted light appears greenish blue because gold strongly reflects yellow and red. Such semi-transparent sheets also strongly reflect infra-red light, making them useful as infra-red (radiant heat) shields in visors of heat-resistant suits, and in sun-visors. Whilst most metals are grey or silvery white, gold is slightly reddish-yellow, the colour determined by the frequency of plasma oscillations among the metal’s valence electrons, in the ultraviolet range for most metals but in the visible range for gold due to relativistic effects affecting the orbitals around gold atoms. Similar effects impart a golden hue to metallic caesium. Common coloured gold alloys include the distinctive eighteen-carat rose gold, which is created by the addition of copper. Alloys containing palladium or nickel are also important in commercial jewellery, as these produce white gold alloys. Fourteen-carat gold-copper alloy is nearly identical in colour to certain bronze alloys, and both may be used to produce police and other badges. 
Fourteen and eighteen-carat gold alloys with silver alone appear greenish-yellow and are referred to as green gold, whilst blue gold can be made by alloying it with iron and purple gold can be made by alloying with aluminium. Although less common, the addition of manganese, indium and other elements can produce more unusual gold colours for various applications. The possible production of gold from a more common element such as lead has long been a subject of human enquiry, and the ancient and medieval discipline of alchemy often focussed on it. However, the transmutation of the chemical elements did not become possible until the understanding of nuclear physics in the 20th century. It can be manufactured in a nuclear reactor, but doing so is highly impractical and would cost far more than the value of the gold that is produced. Medicinal applications of gold and its complexes have a long history dating back thousands of years and several gold complexes have been applied to treat rheumatoid arthritis. Also some of its compounds have been investigated as possible anti-cancer drugs.

Gold is thought to have been produced in supernova nucleosynthesis and from the collision of neutron stars, and therefore to have been present in the dust from which the Solar System was formed. Because the Earth was molten when it was formed, almost all of the gold present in the early Earth probably sank into the planetary core. Therefore, most of the gold that is in the Earth’s crust and mantle is, according to one theory, thought to have been delivered to Earth later by asteroid impacts about 4 billion years ago. The gold which is reachable by humans has therefore been associated with a particular asteroid impact, and the asteroid that formed the Vredefort crater around two billion years ago is often credited with seeding the Witwatersrand basin in South Africa with the richest gold deposits on earth. However, this scenario is now questioned, as these gold-bearing rocks were laid down between 700 and 950 million years before the Vredefort impact. These gold-bearing rocks had also been covered by a thick layer of Ventersdorp lavas and the Transvaal Supergroup of rocks before the meteor struck, and thus the gold did not actually arrive with the asteroid/meteorite. What the Vredefort impact achieved, however, was to distort the Witwatersrand basin in such a way that the gold-bearing rocks were brought to the present erosion surface in Johannesburg, just inside the rim of the original 300km (190 mile) diameter crater caused by the meteor strike. It was the discovery of the deposit in 1886 that launched the Witwatersrand Gold Rush. Some 22% of all the gold that is ascertained to exist today on Earth has been extracted from these rocks. Much of the rest of the gold on Earth is thought to have been incorporated into the planet from its very beginning, as planetesimals formed the planet’s mantle early in Earth’s creation. 
In 2017 an international group of scientists established that gold ‘came to the Earth’s surface from the deepest regions of our planet’, the mantle, and this is said to have been evidenced by their findings at the Deseado Massif in the Argentinian region of Patagonia. Perhaps surprisingly, the world’s oceans also contain gold, and measured concentrations suggest that they hold around 15,000 tonnes. A number of people have claimed to be able to economically recover gold from sea water, but they were either mistaken or engaged in intentional deception. One man ran a gold-from-seawater swindle in the United States in the 1890s, as did an English fraudster in the early 1900s. Another man did research on the extraction of gold from sea water in an effort to help pay Germany’s reparations following World War I, and based on the published values of gold in seawater a commercially successful extraction seemed possible. But after analysis of 4,000 water samples, it became clear that extraction would not be possible and he ended the project.

Grave offerings on display in the Varna museum, Bulgaria, thought to be the oldest golden artefacts in the world (4600 BC – 4200 BC).

The earliest recorded metal employed by humans appears to be gold. Small amounts of natural gold have been found in Spanish caves used during the late Palaeolithic period, c. 40,000 BC. The oldest gold artefacts in the world are from Bulgaria and date back to around 4,600 BC to 4,200 BC, such as those found in the Varna Necropolis near Lake Varna and the Black Sea coast, thought to be the earliest ‘well-dated’ finding of gold artefacts in history. Such items probably made their first appearance in Ancient Egypt at the very beginning of the pre-dynastic period, at the end of the fifth millennium BC and the start of the fourth, and smelting was developed during the course of the 4th millennium. The oldest known map of a gold mine was drawn in the 19th Dynasty of Ancient Egypt (1320–1200 BC), and the first written reference to gold was recorded in the 12th Dynasty around 1900 BC. Egyptian hieroglyphs from as early as 2600 BC describe gold, and one of the earliest known maps, the Turin Papyrus Map, shows the plan of a gold mine in Nubia together with indications of the local geology. Large mines were also present across the Red Sea in what is now Saudi Arabia. Gold is mentioned frequently in the Old Testament of the Bible, starting with Genesis. In the New Testament it is included with the gifts of the Magi in the first chapters of Matthew. The book of Revelation describes the city of New Jerusalem as having streets ‘made of pure gold, clear as crystal’. Exploitation of gold in the south-east corner of the Black Sea is said to date from the time of King Midas, and this gold was important in the establishment of what is probably the world’s earliest coinage, in Lydia around 610 BC. The legend of the Golden Fleece, dating from the eighth century BC, may refer to the use of fleeces to trap gold dust from deposits in the ancient world. In Roman metallurgy, new methods for extracting gold on a large scale were developed from 25 BC onwards. 
The European exploration of the Americas was fuelled in no small part by reports of the gold ornaments displayed in great profusion by Native American peoples. The Aztecs regarded gold as the product of the gods, calling it literally ‘god excrement’, but after Moctezuma II was killed, much of this gold was shipped to Spain. However, for the indigenous peoples of North America gold was considered useless, and they saw much greater value in other minerals which were directly related to their use, such as obsidian, flint and slate. Gold has played a role in western culture as a cause of desire and of corruption, for example in children’s fables, where Rumpelstiltskin spins straw into gold for the miller’s daughter in return for her child when she becomes a princess, and in the stealing of the hen that lays golden eggs in Jack and the Beanstalk. The top prize at the Olympic Games and many other sports competitions is the gold medal. The main goal of alchemists has been to produce gold from other substances such as lead, perhaps by the interaction with a mythical substance called the philosopher’s stone. Trying to produce gold led the alchemists to systematically find out what can be done with substances, and this laid the foundation for today’s chemistry.

Minoan jewellery from 2300–2100 BC in the Metropolitan Museum of Art, New York City.

Apart from chemistry, gold is mentioned in a variety of expressions, most often associated with intrinsic worth. As already mentioned, great achievements are frequently rewarded with gold in the form of medals as well as trophies and other decorations. Winners of athletic events and other graded competitions are usually awarded a gold medal. Many awards such as the Nobel Prize are made from gold. Other award statues and prizes are depicted in gold or are gold-plated, such as the Academy Awards, the Golden Globe Awards, the Emmy Awards and the British Academy of Film and Television Awards (BAFTA). Gold is associated with the wisdom of ageing and fruition, hence the fiftieth wedding anniversary is golden. A person’s most valued or most successful latter years are sometimes considered their ‘golden years’, and the height of a civilisation is referred to as a golden age. In some religions gold has been associated both with holiness and with evil; for example, in the Bible’s Book of Exodus the Golden Calf is a symbol of idolatry, whilst in the Book of Genesis Abraham was said to be rich in gold and silver, and Moses was instructed to cover the Mercy Seat of the Ark of the Covenant with pure gold. In Islam, gold, along with silk, is often cited as being forbidden for men to wear. Wedding rings are typically made of gold as it is long-lasting and unaffected by the passage of time, and it may aid in the ring’s symbolism of eternal vows before God and the perfection the marriage signifies. In August 2020, Israeli archaeologists discovered a trove of early Islamic gold coins near the central city of Yavneh, Israel, and analysis of the extremely rare collection of 425 gold coins indicated that they were from the late 9th century.

Golden coins from the Scandinavian Monetary Union. The coin on the left is Swedish and the one on the right is Danish.

Gold has been widely used throughout the world as money, for efficient indirect exchange as opposed to bartering, and to store wealth in hoards. For exchange purposes, mints produce standardised gold bullion coins, bars and other units of fixed weight and purity. The first known coins containing gold were struck in Lydia, Asia Minor, around 600 BC. The talent coin of gold in use during the periods of Grecian history both before and during the time of the life of Homer weighed between 8.42 and 8.75 grams. From an earlier preference in using silver, European economies re-established the minting of gold as coinage during the thirteenth and fourteenth centuries. In preparation for World War I the warring nations moved to fractional gold standards, inflating their currencies to finance the war effort. Post-war, the victorious countries, most notably Britain, gradually restored gold-convertibility, but international flows of gold via bills of exchange remained embargoed and international shipments were made exclusively for bilateral trades or to pay war reparations. After World War II, gold was replaced by a system of nominally convertible currencies related by fixed exchange rates following the Bretton Woods system of monetary management, which established the rules for commercial and financial relations among the United States, Canada, Australia, Japan and the countries of Western Europe after the 1944 Bretton Woods Agreement. This system was the first example of a fully negotiated monetary order intended to govern monetary relations among independent states. It required countries to guarantee convertibility of their currencies into U.S. dollars to within 1% of fixed parity rates, with the dollar convertible to gold bullion for foreign governments and central banks at 35 US dollars per troy ounce of fine gold, or 0.88867 gram fine gold per dollar. 
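Incidentally, that last figure is easy to verify, since a troy ounce is defined as exactly 31.1034768 grams. A quick sketch in Python (the variable names are my own):

```python
# Verify the Bretton Woods parity of 35 US dollars per troy ounce.
TROY_OUNCE_GRAMS = 31.1034768            # exact definition of the troy ounce
grams_of_gold_per_dollar = TROY_OUNCE_GRAMS / 35
print(round(grams_of_gold_per_dollar, 5))   # prints 0.88867, as quoted
```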
It also envisioned greater cooperation among countries in order to prevent future competitive devaluations, and thus established the International Monetary Fund (IMF) to monitor exchange rates and lend reserve currencies to nations with balance of payments deficits. I must admit to smiling when I first came across the name ‘Bretton Woods system’, as I was brought up in Peterborough and a township of that fine city is named Bretton, where you will find the Bretton Woods Community School! But I am certain they played no part in establishing this monetary system. But back to the story. Gold standards and direct convertibility of currencies to gold have been abandoned by world governments, led in 1971 by the United States’ refusal to redeem its dollars in gold. Fiat currency now fills most monetary roles. Switzerland was the last country to tie its currency to gold; gold backed 40% of its value until Switzerland joined the IMF in 1999. Central banks continue to keep a portion of their liquid reserves as gold in some form, and metals exchanges such as the London Bullion Market Association still clear transactions denominated in gold, including futures contracts. Today, gold mining output is declining, so with the sharp growth of economies in the 20th century along with increasing foreign exchange, the world’s gold reserves and their trading market have become a small fraction of all markets, and fixed exchange rates of currencies to gold have been replaced by floating prices for gold. Though the gold stock grows by only 1 or 2% per year, very little metal is irretrievably consumed. Inventory above ground would satisfy many decades of industrial and even artisan uses at current prices. The gold proportion or fineness of alloys is measured by carat, with pure gold (commercially termed ‘fine’ gold) designated as 24 carat. 
English gold coins intended for circulation from 1526 into the 1930s were typically struck in a standard 22-carat alloy called crown gold, for hardness, whilst American gold coins for circulation after 1837 contained an alloy of 0.900 fine gold, or 21.6 carat. Only 10% of the world consumption of new gold produced goes to industry, but by far the most important industrial use for new gold is in the fabrication of corrosion-free electrical connectors in computers and other electrical devices. Gold is a valuable commodity.
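The carat and fineness figures above are two ways of writing the same ratio: carat is simply the gold fraction of an alloy scaled so that pure gold is 24. A small illustrative sketch in Python (the function names are my own):

```python
# Carat expresses the gold fraction of an alloy on a scale of 24,
# whilst fineness expresses it as a decimal fraction.
def fineness_to_carat(fineness):
    return fineness * 24

def carat_to_fineness(carat):
    return carat / 24

print(round(fineness_to_carat(0.900), 1))   # American coin gold -> 21.6 carat
print(round(carat_to_fineness(22), 4))      # crown gold -> 0.9167 fine
print(round(carat_to_fineness(24), 1))      # pure 'fine' gold -> 1.0
```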

A mirror for the James Webb Space Telescope, coated in gold to reflect infrared light.

This week…
You might already know that the collective noun for crows is a murder and for lapwings it is a deceit. You might even be aware that for hawks it is an aerie. But an ambush is the collective term for both tigers and widows!

Click: Return to top of page or Index page

Managing Change

We don’t realise it at the time, but when we are very young many, perhaps all of us, consider that we are the centre of the universe and that everything revolves around us. We demand attention, we want everything immediately. So we shout and scream when that does not occur. I think that in the majority of cases, as we grow we learn that we aren’t the only one around, and as a result we cannot have just what we want exactly when we want it. That is very true if we have siblings, especially ones older than ourselves. Gender can also play its part! However, there are some who get spoiled, and that becomes even more apparent as time passes when they begin to interact with others. Also, the longer it takes for them to realise how life really is, the harder it can become for them to change their ways. Sadly, some never do. What seems to make it worse though is when the selfish person tries to turn things round and make others feel like it is their fault and that they should be the ones to change their ways! But sadly that is the classic behaviour of a narcissist. Some may give in to them and accept that way of life, but it can be damaging and ruin their lives. There are even those who have turned to extreme violence as they could see no other way for their situation to end. Happily, there are a number of groups specifically designed to help folk in these circumstances through such things as counselling; in fact, simply talking to someone who will listen and empathise, letting the person see that their problems can be overcome, makes a real difference. They know they are not alone. I remember a lovely tv advert from years ago which featured a child who said “and when I grow up, my mummy says I’m going to be a proper little madam!”. We must surely have all met or seen people just like that, who become extremely selfish in their ways, with absolutely no thought or consideration for others. 
So it can be difficult to cope with such people, and there are those who, despite knowing it probably isn’t in their best interests, stay with and almost ‘accept’ such folk. It may be that there is a fear of the unknown in some, as we know the old saying ‘better the devil you know than the devil you don’t’, but sometimes it can become just too much. It can be a change in your own capabilities, or perhaps in those of your partner, that causes us to recognise the need to change. I know of someone who married a big, strong man and they lived together for many years, had children and grandchildren, and all seemed well. Then the man became ill and was no longer big and strong, ultimately being forced into giving up work. The two finally separated and divorced, as his wife could not accept the change which had occurred in her husband. I have managed many changes in my own life, from coping with a muscular disability since birth, then epilepsy and later a heart problem. So I take a few tablets every day, I am very well cared for and, more especially, I am alive to tell others that it is eminently possible to manage change. It is easier with a positive attitude, recognising and being thankful for the not so good as well as the good because, as has been said many times, ‘falling down is often easy – it is the getting up again that can be difficult’. It is also fascinating to me how attitudes have changed over the years regarding such things as disability. I worked in an office for very many years, and I have said before that my work all too often involved filling in forms. Quite a few of my work colleagues were absolutely delighted when computers were installed – it meant I was able to type, as being left-handed my writing was and still is nowhere near the best! Also, I can only use my left hand to type, but I could and still do so fairly quickly. Not only that, but modern programs tend to include an auto-correct feature, though that can be a hindrance rather than a help at times! 
So before sending out each weekly blog post, I read through it carefully as if I were a stranger. I think perhaps what helps me there comes from my few years spent in a telephone directory compilation team, where we used to hand-write entries on computer cards and include simple computer code so that the computers could recognise the difference between certain letters and numbers – now that was a challenge. Most especially, we would check the results printed by the computer every week and when it came to the final checks before the directories went to final print once a year, only a very limited number of changes were allowed on the final draft! But years later I was chatting to a former work colleague who admitted they had no knowledge of my physical disabilities. Happily the years have passed and attitudes have changed, so others have learned to accept me for exactly who and what I am today.

I have said before that change is all around us, every second of the day. As I was growing up at home I would see that my mother was worried about this or that, and I would politely ask her what she was worried about. Quite often it was about something in the future over which she had no control, so I would ask why she was fretting about such things. To me, such worry is like spending life in an empty room sitting in a rocking chair, going back and forth. There is action to be sure, but no achievement. With each of the generations seeds are planted in all things, then they grow and many bear fruit which feed others. Thunderstorms occur, lightning may strike trees and create a fire which can burn parts of a forest, but when that happens seeds fall and new trees slowly grow. It is a cycle of life which continues. I learned a little while ago of a man who was having difficulty organising people to get to a particular place on time, I believe it was getting equipment to a concert, something like that. He had been taught all about geography and map reading, with coordinates, Northings and Eastings, but to this man it was all so very complicated. So he talked to a friend and they came up with the idea of dividing the whole world up into individual three-metre squares, so that each one had a simple three-word name. As a result, we now have What3words, which is described officially as a ‘proprietary geocode system’ designed to identify any location with a resolution of about 3 metres (9.8ft). It is owned by What3words Limited, based in London. The system encodes geographic coordinates into three permanently fixed dictionary words. For example, the front door of 10 Downing Street, London is identified by the code ///indoor.myself.rather, and this can be made into a weblink by slightly altering the prefix. So the three words do not change, just the prefix. 
This has proven to be extremely useful in finding folk who are perhaps halfway up a mountain; in fact, because the English version covers the world’s oceans as well, emergency services can use it to find anyone, anywhere. Even for a simple thing like meeting a friend at the entrance to a stadium, or finding a caravan on a large campsite, it can, I am sure, be really useful. The important point is that What3words differs from most location encoding systems in that it uses words rather than strings of numbers or letters, and the pattern of this mapping is not obvious; the algorithm mapping locations to words is proprietary and protected by copyright. The company has a website, apps for Apple iOS and Android, and an application programming interface (API) which easily converts between postal addresses, What3words addresses and latitude/longitude coordinates. The system divides the world into a grid of 57 trillion 3-by-3-metre squares, each of which has a three-word address, and the addresses are available in around fifty languages. Translations are not direct, as direct translations to some languages could produce more than three words. Rather, territories are localised considering linguistic sensitivities and nuances. Each What3words language uses a list of 25,000 words (40,000 in English, as it covers sea as well as land). The lists are manually checked to remove homophones and offensive words. The company states that densely populated areas have strings of short words due to more frequent usage, whilst less populated areas such as the North Atlantic use more complex words. Sometimes the simplest of things can be the best of ideas. As many of you know, some years ago I was able to go on a superb cruise holiday and as part of that cruise aboard the P&O ship ‘Arcadia’, each day the position, course and speed was given. 
I have now converted all of the daily latitude and longitude details into What3words, and one example links directly to a three-metre square on board a cruise ship in the bay close to Akaroa, New Zealand. The link opens a web page with various options, including different views and sharing options. Alternatively, if I arrange to meet someone in Birmingham, perhaps by an entrance to the Symphony Hall, I would share the corresponding link. It can also be useful for finding a car in a car park, maybe with the three words pull.bids.push, which can be viewed on a web page in the same way. It would even work in a multi-storey one, I would just need to remember which level I was parked on! I could say the three words to a car Satnav, as a few have this feature now, or over the phone, or text the link to a friend. I think that I will try and use this facility. I don’t often advertise, but on this occasion I think this is worth sharing. Having said that, you could already be using What3words. But just in case not…
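For the technically curious, the real What3words algorithm is proprietary and deliberately non-obvious, but the arithmetic that makes three words sufficient is simple: the English list of 40,000 words gives 40,000³, i.e. 64 trillion, ordered triples, comfortably more than the 57 trillion squares. Purely as an illustration, with a made-up four-word list standing in for the real one, a toy encoder might look like this:

```python
# Toy illustration only -- NOT the real What3words algorithm, which is
# proprietary. This just shows how a square's index can be written as
# three 'digits' in base len(wordlist), one word per digit.
WORDLIST = ["apple", "brook", "cedar", "delta"]   # stand-in word list

def cell_to_words(cell_index, wordlist=WORDLIST):
    """Convert a grid square's index into a three-word address."""
    n = len(wordlist)
    assert 0 <= cell_index < n ** 3, "index outside the addressable grid"
    first, remainder = divmod(cell_index, n * n)
    second, third = divmod(remainder, n)
    return (wordlist[first], wordlist[second], wordlist[third])

def words_to_cell(words, wordlist=WORDLIST):
    """Convert a three-word address back into the grid square's index."""
    n = len(wordlist)
    a, b, c = (wordlist.index(word) for word in words)
    return a * n * n + b * n + c

address = cell_to_words(27)
print(address)                   # ('brook', 'cedar', 'delta')
print(words_to_cell(address))    # 27
```

Scaled up to a 40,000-word list, the same base-conversion idea covers every square on Earth; the company says the real mapping also places similar-sounding addresses far apart, which helps mistakes get noticed.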

As I have already mentioned, we live in a constantly changing world and it is, generally, our choice as to how we manage that change. But then sometimes that adjustment is forced upon us by a change in outside circumstances. I know of one particular man here in England who got married and had children, but then their circumstances changed. Years later the man married again and this time he and his new wife had several children in a fairly short space of time. Meanwhile around them the world was still turning, and the government of the time decided to ask the country’s residents whether the United Kingdom should stay in the European Union or leave it. As a result, a referendum was put to the people of the country. The outcome became known as ‘Brexit’, a portmanteau of ‘British exit’, and resulted in the withdrawal of the United Kingdom from the European Union at 23:00 GMT on 31 January 2020. It meant that at that time, the UK was the only sovereign country to have left the EU. The UK had been a member state of the union and its predecessor the European Communities (EC) since 1 January 1973. Following Brexit, EU law and the Court of Justice of the European Union no longer had primacy over British laws, except in certain areas relating to Northern Ireland. As many expected, not everyone was happy with the decision to leave the EU, but the decision was reached by a majority vote. However, there are still people who continue to moan and complain that it was the wrong choice. I believe that the man who I mentioned earlier has stated how wrong he thought it was and he continues to moan, but I hope he will be educating all his children that despite all our hopes, dreams and wishes, our lives may not always work out quite as we would have wished. 
I am reminded of the sorry tale about the young man who, having recently passed his driving test, went on a drinking spree to celebrate but then, whilst drunk, drove his father’s car at excessive speed, and whilst he survived the subsequent crash he killed his best friend. Despite our best efforts, we make mistakes and must live with the consequences, no matter what they may be. I am presently living in a Care Home as I do my best to recover from medical issues; in my case my heart has a damaged mitral valve which I have had since birth, I have a muscular weakness on my right side and I also have epilepsy. It has meant that I am unable to do certain things, but I have learned to adapt, and more especially to accept the changing circumstances as I have grown older. I am in a place where a few folk have dementia and I see how they live from day to day; they are fed, they are cared for and most especially they are treated properly and with respect. I am allowed to do as much as I can for myself, to manage as best I can, but if I need help I have learned to politely ask for it. To ask for and accept help has perhaps been the hardest thing for me to do, as over the years I have learned to be independent as far as possible. We see and learn about change all the time, with lives and even species dying out, but there is definitely an innate willingness in so many of us to survive, to continue, to learn and to better ourselves. Yes, change continues, and who knows what will occur on Earth in years to come. So I do my best to learn from the past, live in the present and look to the future with a smile. Which reminds me of a lovely quote, which I have included below.

A quote by Srinivas Arka.

This week…
The other day I happened to watch a clip from the tv series ‘Yes, Prime Minister’, which to me was extremely entertaining. The prime minister had come up with what he thought was “a brilliant idea, a real vote-winner”, as it would allow parents to choose for themselves which school they could send their children to. But Sir Humphrey Appleby was utterly appalled at the idea. In his eyes, choosing a school was a job for civil servants, as it was beyond the capability of parents! The Prime Minister then enquired who chose the school that he, Sir Humphrey, went to and with a knowing smile, Sir Humphrey replied “Oh, my parents, naturally…”.

Click: Return to top of page or Index page

Welcome To Earth Day

In previous weeks I have researched and written quite a bit on the history of things many and various. So this week I thought about bringing in a more ‘modern’ touch. I hope you like it. But of course, to set the scene, we should perhaps first consider ourselves and our lovely Earth. Today, we know from radiometric dating that Earth is about 4.5 billion years old. Had naturalists in the 1700s and 1800s known Earth’s true age, any early ideas about evolution might have been taken more seriously. We know that life began at least 3.5 billion years ago, because that is the age of the oldest rocks with fossil evidence of life here on Earth. It is the third planet from the Sun and the only astronomical object known to harbour life, at least as we know it as carbon-based life forms. Whilst large amounts of water can be found throughout the Solar System, only Earth sustains liquid surface water. About 71% of Earth’s surface is made up of the ocean, dwarfing Earth’s polar ice, lakes, and rivers. I could go on about its chronology, including its formation, geological history, origins of life and evolution, but not this time! Instead, here is some detail on what is known as Earth Day. So far as I can tell, Earth Day was first celebrated in 1970, when a United States senator from Wisconsin organised a national demonstration to raise awareness about environmental issues. Rallies took place across that country and, by the end of the year, the U.S. government had created its Environmental Protection Agency. Since then, Earth Day has become an annual event around the world on April 22nd to demonstrate support for environmental protection, and it includes a wide range of events coordinated globally by the organisation formerly known as the Earth Day Network. 
It now includes one billion people in more than a hundred and ninety-three countries, and the official theme for 2022 is ‘Invest In Our Planet’, with details on the organisation’s website. In 1969, at a UNESCO conference in San Francisco, peace activist John McConnell proposed a day to honour the Earth and the concept of peace, first to be observed on March 21, 1970, the first day of spring in the northern hemisphere. The day was later sanctioned in a proclamation written by McConnell and signed by the then Secretary General U Thant at the United Nations. A month later, United States Senator Gaylord Nelson proposed the idea of holding a nationwide environmental teach-in on April 22, 1970. He hired a young activist to be the National Coordinator, and the two of them renamed the event ‘Earth Day’. The event grew beyond the original idea of a teach-in to include the entire United States, with more than 20 million people pouring onto the streets. Key non-environmentally focused partners played major roles, and without them it is likely that the first Earth Day would not have succeeded. Nelson was later awarded the Presidential Medal of Freedom in recognition of his work. The first Earth Day was focused on the United States, but in 1990 Denis Hayes, the original national coordinator in 1970, put it on the international stage and organised events in 141 nations. On Earth Day 2016, the landmark Paris Agreement was signed by the United States, the United Kingdom, China, and 120 other countries. This signing satisfied a key requirement for the entry into force of the historic draft climate protection treaty adopted by consensus of the 195 nations present at the 2015 United Nations Climate Change Conference in Paris. Since then, numerous communities have continued to engage in Earth Day Week actions, an entire week of activities focused on the environmental issues that the world faces. 
On Earth Day 2020, over 100 million people around the world observed its 50th anniversary in what has been referred to as the largest online mass mobilisation in history.

But perhaps what really energised the birth of Earth Day was when, on January 28, 1969, an oil well drilled from Union Oil’s Platform A off the coast of Santa Barbara, California, blew out. More than three million gallons of oil spewed, killing more than 10,000 seabirds, dolphins, seals and sea lions. As a direct reaction to this disaster, activists mobilised to create good environmental regulation, environmental education, and Earth Day itself. A number of proponents of Earth Day were in the front lines of fighting this disaster, but Denis Hayes, organiser of the first Earth Day, said that Senator Gaylord Nelson of Wisconsin was inspired to create Earth Day upon seeing the Santa Barbara Channel’s 800 square-mile oil slick from an aircraft. On the first anniversary of the oil blowout, January 28, 1970, Environmental Rights Day was created, and the Declaration of Environmental Rights was read. It had been written by Rod Nash during a boat trip across the Santa Barbara Channel whilst carrying a copy of Thomas Jefferson’s Declaration of Independence. The organisers of Environmental Rights Day had been working closely for several months with a Republican Congressman on the creation of the National Environmental Policy Act, the first of many new laws on environmental protection sparked by the national outcry over the blowout and subsequent oil spill, and on the Declaration of Environmental Rights.

President Richard Nixon and First Lady Pat Nixon plant a tree on the White House South Lawn to recognise the first Earth Day.

In the winter of 1969–1970, a group of students met at Columbia University to hear Denis Hayes talk about his plans for Earth Day. The 1970s were a period of substantial environmental legislation in the U.S.A., including the Clean Air Act, Clean Water Act, Endangered Species Act, Marine Mammal Protection Act, Superfund, Toxic Substances Control Act, and the Resource Conservation and Recovery Act. The decade also saw the creation of the Environmental Protection Agency and the banning of DDT and of lead in petrol. Jimmy Carter was president and the principal Washington, DC event was a festival held in Lafayette Park, across from the White House. It has been said that by mobilising two hundred million people in a hundred and forty-one countries and lifting the status of environmental issues onto the world stage, Earth Day activities in the early 1990s gave a huge boost to recycling efforts worldwide and helped pave the way for the 1992 United Nations Earth Summit in Rio de Janeiro. Unlike the first Earth Day in 1970, this anniversary was waged with stronger marketing tools, greater access to television and radio, and multimillion-dollar budgets.

The official logo of the Mount Everest Earth Day 20 International Peace Climb.

The Earth Day 20 Foundation highlighted its April 22 activities with a live satellite phone call to members of the historic Earth Day 20 International Peace Climb, who called from their base camp on Mount Everest to pledge their support for world peace and attention to environmental issues. The climb was led by Jim Whittaker, the first American to summit Mount Everest many years earlier, and marked the first time in history that mountaineers from the United States, the Soviet Union and China had roped together to climb a mountain, let alone Mount Everest. The group also collected more than two tons of rubbish left behind on Mount Everest by previous climbing expeditions, which was transported down the mountain by support groups along the way. Warner Bros. Records released an Earth Day-themed single in 1990 entitled ‘Tomorrow’s World’, a song featuring vocals from various artists; it reached number seventy-four on the ‘Hot Country Songs’ chart dated May 5, 1990. As the millennium approached, another campaign began, this time focusing on global warming and pushing for cleaner energy. The April 22 Earth Day in 2000 combined the big-picture feistiness of the first Earth Day with the international grassroots activism of Earth Day 1990. For 2000, Earth Day had the internet to help link activists around the world and by the time the day came around, some five thousand environmental groups worldwide were on board, reaching out to hundreds of millions of people in a record one hundred and eighty-four countries. Events varied, with a ‘talking drum’ chain travelling from village to village in Gabon, Africa, whilst hundreds of thousands of people gathered on the National Mall in Washington, D.C., USA. Google’s first Earth Day doodle appeared in 2001 and the theme for Earth Day 2003 was the Water for Life Campaign. That year, Earth Day Network developed a water quality project called “What’s in Your Water?”. 
Other water-related events were held on every continent, such as water workshops, exhibitions, concerts and more in many countries, while educational curricula, teacher’s guides, water testing kits and posters all focused on water. Many other organisations also focused on environmental justice and created events concentrating on low-income communities. These events also worked on building support among low-income communities through clean-ups, park revitalisation and town halls focussing on integrating the environmental movement with community and social justice causes. Since then, Earth Day has been celebrated throughout the world in many and various ways. Over the following years, activities included registering voters, major tree planting and creating healthy environments for children. Earth Day 2006 focused on science and faith and expanded into Europe, with events and speeches held in most of the EU countries. Key events included the ‘Festival on Climate Change’ in Utrecht, the Netherlands, which focused on how to break away from oil dependence and included Earth Day founder Denis Hayes, members of the Dutch and E.U. parliaments, local authorities and media representatives. In the first of two years of Earth Day events in Ukraine, Denis Hayes also attended and spoke at the ‘Chernobyl 20 Remembrance for the Future’ conference. That year also saw events in China organised between Earth Day Network and Global Village Beijing, educating communities about energy savings, along with the first-ever coordinated Earth Day events in Moscow, Russia, a scientific panel and a religious response panel on climate change throughout the U.S., and a ‘Conserve Your Energy’ event in Philadelphia. 
Thousands of Earth Day projects have been held across the globe that ranged from energy efficiency events, protests, letter writing campaigns, civic and environmental education trainings, urban and rural cleanups and water projects with a particular focus on building a broader and more diverse environmental movement.

On Earth Day 2010, its fortieth anniversary, an estimated one billion people around the world took part. This included action on climate change and other environmental issues through climate rallies and by engaging civic leaders in plans to build a greener economy. Through a Global Day of Conversation, more than 200 elected officials in more than 39 countries took part in active dialogues with their constituents about their efforts to create sustainable green economies and reduce their carbon footprints. Students around the world participated in school events, featuring community clean-ups, solar energy systems, school gardens and environmental curricula. Earth Day Network announced a partnership with Twentieth Century Fox Home Entertainment’s Avatar Home Tree Initiative to plant one million trees in 15 countries by the end of the year. Also, as part of a nationwide commemoration of the fortieth anniversary in Morocco, the government announced a unique National Charter for the Environment and Sustainable Development, the first commitment of its kind in Africa and the Arab world, which would inform new environmental laws for the country. The Kingdom of Morocco also pledged to plant one million trees. Since then, work has continued with each Earth Day. The Earth Day Network completed a project to plant over 1.1 million trees, and across the globe more than 100 million ‘Billion Acts of Green’ were registered. In September 2011, at the Clinton Global Initiative, former U.S. President Bill Clinton recognised this project as an exemplary approach to addressing global challenges. The goal of Earth Day 2014 was to dramatically personalise the massive challenges surrounding global climate change and weave that into both Earth Day 2014 and the five-year countdown to Earth Day 2020, the 50th anniversary. It was an opportunity to unite people worldwide into a common cause and call for action. 
Earth Day has in fact become very much a global event recognised by many nations, so it was no accident that at the United Nations, world leaders from 175 nations broke a record when they selected Earth Day 2016 to sign the Paris Agreement, the most significant climate accord in the history of the climate movement. Then in 2020, marches and gatherings were cancelled due to the COVID pandemic, but a three-day livestream event was still organised, including speakers from all corners of the environmental movement such as Pope Francis, mayors from around the world, Ministers of the Environment from multiple countries and many more. Earth Day 2020 was a major topic across media platforms, including leading magazines and environmental publications. On January 5, 2020, Earth Day’s 50th anniversary year began with a full page in the Sunday New York Times, referencing a similar black and white advertisement that had appeared in the Times 50 years earlier, on the first Sunday of 1970. Through social media, Earth Day participants joined digital events and shared their support. Through Instagram, HRH The Prince of Wales reminded followers that nature is vital to human health and wellbeing, saying “For fifty years, since the very first Earth Day, I have dedicated a large part of my life to championing more balanced sustainable approaches whether in farming, forestry, fisheries, urban planning or corporate social responsibility. But as we look to shape the next fifty years, I very much need your help. To reflect and inspire the world to action, while aiming for a green recovery, I would ask you to join me by sharing your vision for a more sustainable future (socially, environmentally and economically) using the hashtag ReimagineReset.”

Sure We Can volunteers clean McKibbin Street, New York for Earth Day 2021.

Earth Day continues around the world, perhaps in ways unnoticed by many. For example, there is a service in Brooklyn, New York called ‘Sure We Can’ which provides container-deposit redemption services to that area. Any person can come to Sure We Can during business hours and redeem New York State accepted bottles and cans. Additionally, the organisation serves as a community hub for the canner community that redeems there and for local environmental causes that promote the organisation’s dedication to sustainability. The facility is designed with canners (the people who collect cans and bottles from the streets) in mind, aiming to provide a welcoming place where people can redeem their cans and bottles. In 2019, the centre processed 10 million cans and bottles for redemption and served a community of over 400 canners, and Sure We Can estimate that they distribute $700,000 per year to canners. The average canner who visits Sure We Can earns $1,000 per year. Long may such initiatives continue, as large or not so large, they all make a vital difference. The Earth Day 2022 theme is ‘Invest in Our Planet’ and features six primary programmes, these being The Great Global Cleanup, Sustainable Fashion, Climate and Environmental Literacy, the Canopy Project, Food and Environment, and the Global Earth Challenge. Earth Day is now observed in 192 countries and it is surely up to us all to do our part in sustaining this Earth.

This week…
Our British Saint’s Days are St David’s Day (March 1st), St Patrick’s Day (March 17th), St George’s Day (April 23rd) and St Andrew’s Day (November 30th). I was born on St. Patrick’s Day but my parents decided to give me the forenames Andrew David. Apparently a friend suggested they ought to include Patrick, but it was realised I’d then need George to complete the set and that was too much!



Easter is a Christian festival as well as a cultural holiday, commemorating the resurrection of Jesus from the dead as described in the New Testament of the Bible, said to have occurred on the third day after his burial following his crucifixion by the Romans at Calvary c. 30 AD. It is the culmination of the Passion of Jesus, preceded by Lent, a forty-day period of fasting, prayer and penance. Christians refer to the week before Easter as ‘Holy Week’, which in Western Christianity contains the days of the Easter Triduum, the period of three days that begins with the liturgy on the evening of Maundy Thursday, reaches its high point in the Easter Vigil and closes with evening prayer on Easter Sunday. It is a moveable observance recalling the Passion, crucifixion, death, burial and resurrection of Jesus as portrayed in the canonical gospels. In Eastern Christianity, the same days and events are commemorated with the names of days all starting with “Holy” or “Holy and Great”, and Easter itself might be called “Great and Holy Pascha”, “Easter Sunday”, “Pascha” or “Sunday of Pascha”. In Western Christianity Eastertide, or the Easter Season, begins on Easter Sunday and lasts seven weeks, ending with the coming of the 50th day, Pentecost Sunday. In Eastern Christianity the Paschal season ends with Pentecost as well, but the leave-taking of the Great Feast of Pascha is on the 39th day, the day before the Feast of the Ascension. Easter and its related holidays are movable feasts, not falling on a fixed date but computed based on a lunisolar calendar (the solar year plus the Moon phase), similar to the Hebrew calendar. The First Council of Nicaea, a council of Christian bishops, was convened in the Bithynian city of Nicaea (now Iznik, Turkey) by the Roman Emperor Constantine in AD 325 and was the first effort to attain consensus in the church through an assembly representing all of Christendom. 
Its main accomplishments were the settlement of the Christological issue of the divine nature of God the Son and his relationship to God the Father, the construction of the first part of the Nicene Creed, the mandating of uniform observance of the date of Easter, and the promulgation of early canon law. No details for the computation were specified; these were worked out in practice, a process that took centuries and generated a number of controversies. The date has come to be the first Sunday after the ecclesiastical full moon that occurs on or soonest after March 21st. Even if calculated on the basis of the more accurate Gregorian calendar, the date of that full moon sometimes differs from that of the astronomical first full moon after the March equinox. Easter is linked to the Jewish Passover by its name (the Hebrew pesach and Greek pascha being the basis of the term), by its origin (according to the synoptic gospels, both the crucifixion and the resurrection took place during the Passover), by much of its symbolism, and by its position in the calendar. In most European languages the feast is called by the words for passover in those languages, and in the older English versions of the Bible ‘Easter’ was the term used to translate passover. Easter customs vary across the Christian world and include sunrise services, midnight vigils, exclamations and exchanges of Paschal greetings, and one I had never heard of before, called ‘clipping the church’. I have learned that this is an ancient custom traditionally held only in England on Easter Monday, Shrove Tuesday or a date relevant to the saint associated with the church. The word ‘clipping’ is Anglo-Saxon in origin, derived from the word ‘clyppan’, meaning ‘embrace’ or ‘clasp’. So ‘clipping the church’ involves either the church congregation or local children holding hands in an inward-facing ring around the church, which can then be reversed to an outward-facing ring if a prayer for the wider world beyond the parish is said. 
Once the circle is completed, onlookers will often cheer and sometimes hymns are sung. Often there is dancing and after the ceremony a sermon is delivered in the church, then there are sometimes refreshments. Christians adopted this tradition to show their love for their church and the surrounding people, but currently there are only a few churches left in England that hold this ceremony, and all of these appear to honour it on a different day. Other customs include the decoration and the communal breaking of Easter eggs, a symbol of the empty tomb. The Easter lily, a symbol of the resurrection in Western Christianity, traditionally decorates the chancel area of churches Easter Day and for the rest of Eastertide. Additional customs that have become associated with Easter and are observed by both Christians and some non-Christians include Easter parades, communal dancing (in Eastern Europe), the Easter Bunny and egg hunting. There are also traditional Easter foods that vary by region and culture.

The modern English term ‘Easter’, with modern Dutch ‘ooster’ and German ‘Ostern’, developed from an Old English word that usually appears in the form ‘Ēastrun’, but also as ‘Ēostre’. Bede provides the only documentary source for the etymology of the word, in his eighth-century ‘The Reckoning of Time’. He wrote that ‘Ēosturmōnaþ’ (Old English ‘Month of Ēostre’, translated in Bede’s time as ‘Paschal month’) was an English month, corresponding to April, which he says “was once called after a goddess of theirs named Ēostre, in whose honour feasts were celebrated in that month”. In Latin and Greek, the Christian celebration was, and still is, called ‘Pascha’, a word derived, via Aramaic, from the Hebrew. The word originally denoted the Jewish festival known in English as Passover, commemorating the Jewish exodus from slavery in Egypt. The supernatural resurrection of Jesus from the dead, which Easter celebrates, is one of the chief tenets of the Christian faith. The resurrection established Jesus as the Son of God and is cited as proof that God will righteously judge the world; for those who trust in Jesus’s death and resurrection, “death is swallowed up in victory.” Any person who chooses to follow Jesus receives “a new birth into a living hope through the resurrection of Jesus Christ from the dead”. Through faith in the working of God, those who follow Jesus are spiritually resurrected with Him so that they may walk in a new way of life and receive eternal salvation, being resurrected to dwell in the Kingdom of Heaven. Easter is linked to the Passover and the exodus from Egypt as recorded in the Old Testament of the Bible, through the Last Supper, the sufferings and the subsequent crucifixion that preceded the resurrection. According to the three Synoptic gospels, Jesus gave the Passover meal a new meaning, as in the upper room during the Last Supper he prepared himself and his disciples for his death. 
He identified the bread and cup of wine as his body, soon to be sacrificed, and his blood, soon to be shed. Paul the apostle states, “Get rid of the old yeast that you may be a new batch without yeast, as you really are. For Christ, our Passover lamb, has been sacrificed”. This refers to the Passover requirement to have no yeast in the house and to the allegory of Jesus as the Paschal lamb.

In early Christianity, the first Christians were certainly aware of the Hebrew calendar. Jewish Christians, the first to celebrate the resurrection of Jesus, timed the observance in relation to Passover. Direct evidence for a more fully formed Christian festival of Pascha (Easter) begins to appear in the mid-2nd century but perhaps the earliest surviving primary source referring to Easter is a mid-2nd-century Paschal homily attributed to Melito of Sardis (the bishop of Sardis, near Smyrna in western Anatolia and a great authority in early Christianity) which characterises the celebration as a well-established one. Evidence for another kind of annually recurring Christian festival, those commemorating the martyrs, began to appear at about the same time. While martyrs’ days (usually the individual dates of martyrdom) were celebrated on fixed dates in the local solar calendar, the date of Easter was fixed by means of the local Jewish lunisolar calendar. This is consistent with the celebration of Easter having entered Christianity during its earliest, Jewish period, but does not leave the question free of doubt.

A stained-glass window depicting the Passover Lamb.

Easter and the holidays that are related to it are moveable feasts, in that they do not fall on a fixed date in either the Gregorian or Julian calendars (both of which follow the cycle of the sun and the seasons). Instead, the date for Easter is determined by what is known as a lunisolar calendar, similar to the Hebrew calendar. In AD 325 the First Council of Nicaea established two rules, the independence of the Jewish calendar and worldwide uniformity, which were the only rules for Easter explicitly laid down by the council. No details for the computation were specified; these were worked out in practice, a process that took centuries and generated a number of controversies. In particular, the Council did not decree that Easter must fall on Sunday, but this was already the practice almost everywhere. In Western Christianity, using the Gregorian calendar, Easter always falls on a Sunday between 22 March and 25 April, within about seven days after the astronomical full moon. The following day, Easter Monday, is a legal holiday in many countries with predominantly Christian traditions. Eastern Orthodox Christians base Paschal date calculations on the Julian calendar. Because of the thirteen-day difference between the calendars between 1900 and 2099, 21 March in the Julian calendar corresponds, during the 21st century, to 3 April in the Gregorian calendar. Since the Julian calendar is no longer used as the civil calendar of the countries where Eastern Christian traditions predominate, Easter varies between 4 April and 8 May in the Gregorian calendar. Also, because the Julian ‘full moon’ is always several days after the astronomical full moon, the eastern Easter is often later, relative to the visible Moon’s phases, than western Easter. Amongst the Oriental Orthodox, some churches have changed from the Julian to the Gregorian calendar and the date for Easter, as for other fixed and moveable feasts, is the same as in the Western church. 
The Gregorian calculation of Easter was actually based on a method devised by a doctor from the Calabria region in Italy using the phases of the Moon and has been adopted by almost all Western Christians and by Western countries which celebrate national holidays at Easter. For the British Empire and colonies, a determination of the date of Easter Sunday using Golden Numbers and Sunday Letters was defined by the 1750 Calendar (New Style) Act with its annexe. This was designed to match exactly the Gregorian calculation.
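For the mathematically curious, the whole Gregorian calculation can be condensed into a short piece of integer arithmetic. This is only an illustrative sketch of my own, using the widely published anonymous Gregorian computus (often called the ‘Meeus/Jones/Butcher’ algorithm), which gives the same dates as the Golden Number and Sunday Letter tables:

```python
# Anonymous Gregorian computus ('Meeus/Jones/Butcher' algorithm).
# It encodes the Golden Number, epact and Sunday-letter tables as arithmetic.
def easter_date(year):
    a = year % 19                          # position in the 19-year Metonic cycle
    b, c = divmod(year, 100)               # century, and year within the century
    d, e = divmod(b, 4)                    # century leap-year corrections
    f = (b + 8) // 25
    g = (b - f + 1) // 3                   # lunar (epact) correction
    h = (19 * a + b - d - g + 15) % 30     # locates the ecclesiastical full moon
    i, k = divmod(c, 4)
    l = (32 + 2 * e + 2 * i - h - k) % 7   # days to the following Sunday
    m = (a + 11 * h + 22 * l) // 451       # rare correction for late full moons
    n = h + l - 7 * m + 114
    return n // 31, n % 31 + 1             # (month, day): month 3 = March, 4 = April

print(easter_date(2016))   # (3, 27) – Easter Sunday 2016 fell on 27 March
print(easter_date(2022))   # (4, 17) – and on 17 April in 2022
```

Despite its opaque appearance, every step mirrors the tabular method: the ecclesiastical full moon is found from the year’s place in the 19-year lunar cycle, then the date is pushed forward to the following Sunday, always landing between 22 March and 25 April.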

Receiving the Holy Light at Easter.
St. George Greek Orthodox Church, Adelaide, Australia.

The above image shows the congregation lighting their candles from the new flame, just as the priest has retrieved it from the altar – note that the picture is illuminated by flash, as all electric lighting is off and only the oil lamps in front of the Iconostasis remain lit. In the 20th century, some individuals and institutions proposed changing the method of calculating the date for Easter, the most prominent proposal being the Sunday after the second Saturday in April. Despite having some support, proposals to reform the date have not been implemented. An Orthodox congress of Eastern Orthodox bishops, which included representatives mostly from the Patriarch of Constantinople and the Serbian Patriarch, met in Constantinople in 1923, where the bishops agreed to the revised Julian calendar. The original form of this calendar would have determined Easter using precise astronomical calculations based on the meridian of Jerusalem; however, all the Eastern Orthodox countries that subsequently adopted the Revised Julian calendar adopted only that part of it that applied to festivals falling on fixed dates in the Julian calendar. The revised Easter computation that had been part of the original 1923 agreement was never permanently implemented in any Orthodox diocese. Here in the United Kingdom, the Easter Act of 1928 set out legislation to change the date of Easter to be the first Sunday after the second Saturday in April (or, in other words, the Sunday in the period from 9 to 15 April). However, the legislation has not been implemented, although it remains on the Statute book and could be implemented subject to approval by the various Christian churches. 
At a summit in Aleppo, Syria in 1997 the World Council of Churches (WCC) proposed a reform in the calculation of Easter which would have replaced the present divergent practices of calculating Easter with modern scientific knowledge taking into account actual astronomical instances of the spring equinox and full moon based on the meridian of Jerusalem, while also following the tradition of Easter being on the Sunday following the full moon. The recommended World Council of Churches changes would have sidestepped the calendar issues and eliminated the difference in date between the Eastern and Western churches. The reform was proposed for implementation starting in 2001, and despite repeated calls for reform, it was not ultimately adopted by any member body. In January 2016, Christian churches again considered agreeing on a common, universal date for Easter, whilst also simplifying the calculation of that date, with either the second or third Sunday in April being popular choices. So far, no date has yet been agreed.

Easter is seen by many as the state of new life, of rebirth and as one might expect, the egg is one such symbol. In Christianity it became associated with Jesus’s crucifixion and resurrection and the custom of the Easter egg originated in the early Christian community of Mesopotamia, who stained eggs red in memory of the blood of Christ, shed at his crucifixion. As such, for Christians, the Easter egg is a symbol of the empty tomb. The oldest tradition is to use dyed chicken eggs. In the Eastern Orthodox Church, Easter eggs are blessed by a priest both in families’ baskets together with other foods forbidden during Great Lent and alone for distribution or in church or elsewhere.

Traditional red Easter eggs for blessing by a priest.

Easter eggs are a widely popular symbol of new life among the Eastern Orthodox and in the folk traditions of many Slavic countries. I have learned of a decorating tradition known as ‘pisanka’, a common name for an egg (usually that of a chicken, although goose or duck eggs are also used) richly ornamented using various techniques. The word ‘pisanka’ is derived from the verb ‘pisać’, which in contemporary Polish means exclusively ‘to write’ yet in old Polish also meant ‘to paint’. Originating as a pagan tradition, pisanki were absorbed by Christianity to become the traditional Easter egg, and are now considered to symbolise the revival of nature and the hope that Christians gain from faith in the resurrection of Jesus Christ. The celebrated House of Fabergé workshops created exquisitely jewelled Easter eggs for the Russian Imperial family from 1885 to 1916. A modern custom in the Western world is to substitute decorated chocolate eggs, often filled with sweets. As many people give up sweets as their Lenten sacrifice, individuals enjoy them at Easter after having abstained from them during the preceding forty days of Lent.

Easter eggs, a symbol of the empty tomb.

The British chocolate company Cadbury, which manufactured its first Easter egg in 1875, sponsors the annual Easter egg hunt which takes place in over two hundred and fifty National Trust locations here in the United Kingdom. On Easter Monday, the President of the United States holds an annual Easter egg roll on the White House lawn for young children. In some traditions children put out empty baskets for the Easter bunny to fill whilst they sleep, then wake to find their baskets filled with chocolate eggs and other treats. Many children around the world follow the tradition of colouring hard-boiled eggs and giving baskets of sweets. One fascinating fact to me though is that since the rabbit is considered a pest in Australia, the Easter Bilby is used as an alternative. Bilbies are native Australian marsupials and an endangered species, so to raise money and increase awareness of conservation efforts, Bilby-shaped chocolates and related merchandise are sold within many stores throughout Australia as an alternative to Easter bunnies. But this time should surely be remembered as a new beginning, as it has been for centuries throughout the world. Happy Easter!

This week…
Not everyone has a home computer these days, but more and more folk find them useful as part of doing research on a range of subjects. Happily most public libraries allow folk free access to the ones they have, but time is strictly limited and use must be allocated. Sadly I can never get in to my local library, as every time I phone up they tell me they are fully ‘booked’…


Time Team

Many years ago I was looking through tv channels and chanced upon a show called ‘Time Team’. The name was intriguing, so I sat down and watched. It fascinated me. I continued watching and I am glad I did. But sadly, after quite a few years, the tv series ended, so I was delighted to see a mention of it again recently. As is my way, I did some research online and found that a fair bit had been written, especially recently, and the following is what I found. I discovered some excellent images and some information on digs that were done last year, as well as work expected quite soon this year, I hope. In fact ‘Time Team’ is a well-rehearsed story, but what I didn’t know was that it started as ‘Timesigns’, a four-part series which first aired in 1991 and explored the archaeology around Roadford Lake. Roadford Lake, also known as the Roadford Reservoir, is a man-made reservoir fed by the River Wolf, located to the north-east of Broadwoodwidger in West Devon, eight miles (13 km) east of Launceston. I do like the delightful village names we have in this country! This place is quite small and according to the 2001 census it had a population of just 548. Also, the reservoir is the largest area of fresh water in the southwest of England. Exploring the archaeology of this area came about after Tim Taylor approached Mick Aston to present the series and as a result, along with Phil Harding, three members of the future Time Team core were now in place. Yet despite bringing the past to life using the ingredients of excavation, landscape survey and reconstructions, including Phil felling a tree with a flint axe, Timesigns was a very different beast. In fact the four-part series is still available to watch online, and watching it now provides a lesson in just how revolutionary the Time Team format actually was. 
That is because Timesigns was slower paced, with Mick talking directly to the camera in a style more akin to a history documentary or Open University broadcast, and there was also a focus on interesting, previously discovered artefacts. It included Phil Harding in woodland, seeking out raw materials for a reconstructed axe, which allowed the audience to witness the hands-on practical process. It meant that viewers were placed at the heart of the action, and this would later become a hallmark of Time Team. Whilst filming Timesigns, Tim and Mick often discussed other ways to bring archaeology to a television audience, and what later proved to be something of a providential conversation took place in a Little Chef on the Okehampton bypass, where Mick mentioned that he had recently missed a train and, having a couple of hours to kill, decided to explore. During that time he deduced the town’s medieval layout and, struck by how much could be learned in a few hours, Tim wondered what could then be achieved in a few days. When he took this idea to various studios though, no-one wanted to know. Still, it was not the first time that a chance conversation with Mick had started someone thinking about television archaeology, as a few years earlier Tony Robinson had joined a trip which Mick was leading to Santorini, a Greek island in the southern Aegean Sea about 200 km (120 miles) southeast of the mainland, as part of some education work for Bristol University. Mick’s aptitude for breathing life into the past convinced Tony that archaeology had untapped television potential, but when he returned to Britain Tony found the studios unwilling to take the idea further. The breakthrough came when Timesigns proved an unexpected hit. Suddenly Channel 4 was receptive to the idea of a major archaeology programme, Tim Taylor devised the name ‘Time Team’ and in 1992 a pilot episode was filmed in Dorchester-on-Thames. 
Never screened and reputedly lost in the Channel 4 vaults, this pilot captured a show that was radically different from Timesigns. It was initially conceived as a quiz show in a similar vein to ‘Challenge Anneka’, where the team would be called on to solve archaeological mysteries whilst racing against the clock. Envelopes hidden at strategic points would set challenges along the lines of ‘find the medieval high street in two hours’. Judged a misfire by Channel 4, it could have been the end. Thankfully, the format was instead radically overhauled, although shades of the quiz-show concept did survive in early episodes. The onscreen introduction of all the team members and their specialist skills was a hangover from a time when participants would have varied from week to week, rather than coalescing into a core group. Meanwhile, Tony’s role transformed from quiz master to translator of all things archaeological for a general audience, and the final piece of the jigsaw fell into place during the fledgling Time Team’s first episode. This was filmed at Athelney, site of Alfred the Great’s apocryphal burnt cakes; the site was scheduled, precluding excavation. John Gater, the programme’s ‘geophysics’ wizard, surveyed the field. Despite the Ancient Monuments Laboratory having drawn a blank the year before, John’s state-of-the-art kit revealed the monastic complex in startling clarity. Best of all, the cameras were rolling to capture the archaeologists’ euphoria as the geophysical plot emerged from a bulky printer in the back of the survey vehicle.

Mick Aston at work.

As well as being an arresting demonstration of the power of teamwork, Athelney showed how geophysics could be at the heart of the programme. As Mick Aston observed, “the geophys and Time Team have always gone hand in hand. It is the programme really. Geophysics gives you that instant picture you can then evaluate”. John kept on top of technical advances, and the results of his survey of Brancaster Roman fort provide one of the really outstanding moments in the later series, with the breathtaking 3D model it produced of the buried structures persuading English Heritage to commission a complete survey on the spot. The original team brought an impressive breadth of skills to the programme. Victor Ambrus, whose artwork had caught Tim Taylor’s eye in an edition of Reader’s Digest, had a peerless ability to bring the past to life on the fly, and the late Robin Bush brought a degree of historical expertise that would be missed almost as much as the man himself following his departure in 2003. Despite their varied talents and backgrounds, it quickly became apparent that the team had a natural chemistry. Time Team became well known for its members’ individual ways and styles, including Mick’s famous striped jumper. Requested by a commissioning editor to wear more colourful clothing, Mick turned up in the most garish garment he could find as a joke, only to be told it was perfect. Far from a media concoction, the unique individuals on Time Team were filmed going about their work with an honesty and integrity that has seen the series heralded as Britain’s first reality television show. There can be little doubt that part of the show’s early success stems from the audience warming to the group’s genuine passion for teasing out the past. Rather than targeting the palaces and castles of the rich and famous, each episode sought to solve simple, local questions.
This was really highlighted by having a member of the public read out a letter of invitation at the beginning, posing the question they wanted answered. The message was simple: this is local archaeology, it is ‘your’ archaeology. It worked well, especially as the director of the first few seasons followed the digs as they evolved; his technique meant that viewers were often placed on the edge of a trench when discoveries happened, making them privy to key discussions. However, some archaeologists were initially, quite fairly, a bit sceptical. One aspect that some treated with suspicion was the three-day deadline. Research digs usually ran for weeks if not months, and it was questioned whether anything approaching responsible archaeology could be achieved in such a short space of time. It was certainly not ideally suited to showcasing all of the techniques available to modern archaeologists. Much money would be spent on scientific dating, with the results only coming back in time for a line of dialogue to be dubbed on months after filming had concluded. Coincidentally, though, digging within a tight timeframe mirrored changes occurring within the profession. The 1990s saw a surge in short-term excavation projects, as units were obliged to cut evaluation trenches to meet the deadlines of multi-million-pound construction projects. It led to an appreciation of just how much information could be quickly gleaned from comparatively modest trenching. The thrill of time running out also engaged viewers, and Time Team’s popularity was rewarded with increasingly longer series. Season one, aired in 1994, had four episodes; season two followed with five, and season three boasted six.

Some members of the Time Team.

Seasons nine to twelve have often been seen as Time Team’s ‘golden’ age. Screening thirteen episodes a year, as well as live digs and specials, the programme seemed to be ever-present. Its stars were household names and at its zenith Time Team had regular audiences of over three million viewers. Now that the format was safely established, the programme was increasingly able to capitalise on its fame and access big-name sites, even Buckingham Palace. Whilst the allure of such sites created a powerful television spectacle, it also marked a move away from the programme’s humble local-archaeology origins. Even after its star began to wane, Time Team remained popular, and an audience study in 2006 indicated that twenty million people watched at least one show that year. However, it was season nineteen that changed everything, as in 2011 the production centre for the programme moved from London to Cardiff. Very much a political gesture aimed at building up regional television, the series was picked because it seemed a safe pair of hands. Sadly it cost the show almost all of its behind-the-scenes staff; expertise honed over fifteen years was lost at a stroke, to be replaced by crew and production staff who knew neither each other nor archaeology. Despite some great new people who learnt fast, expecting them to produce the same calibre of product immediately was just too big a demand. Time Team’s cost also made it vulnerable. Towards the end of its run an average episode would cost around £200,000, a budget more on the scale of a small drama show in the eyes of television insiders, but over twenty years Channel 4 had in fact pumped £4 million directly into British archaeology. It is to the Channel’s credit that it did this despite much of that outlay being channelled into post-excavation work that never appeared on-screen. The money was well spent, and today only five Time Team sites remain unpublished, a record that shames many UK units and academics.

The Time Team in 2012.

Back then, Time Team’s legacy left much to celebrate. It brought the money and expertise to investigate sites that would otherwise never have been touched. The Isle of Mull episode in season seventeen is a great example of what could be discovered. With only some strange earthworks exciting the curiosity of local amateur archaeologists to go on, the programme was flexible enough to take a gamble, and the result was a previously unknown 5th-century monastic enclosure linked to St Columba. It enabled a local group to secure Heritage Lottery Fund money to dig the site. Time Team excavations at Binchester’s Roman fort also helped kickstart a major research project. I was saddened when the series ended, but in 2021 there was excellent news when, thanks to overwhelming public support, the Time Team returned for two brand-new digs in September that year, with the episodes due to be released this year on the YouTube channel ‘Time Team Official’. This will give viewers the chance to engage as the shows are researched and developed, see live blogs during filming, watch virtual-reality landscape data at home and join in Q&As with the team. Carenza Lewis, Stewart Ainsworth, Helen Geake and geophys genius John Gater will all be returning. They are joined by new faces representing the breadth of experts practising archaeology today. Sir Tony Robinson, who is an honorary patron, says: “I was delighted to hear about the plans for the next chapter in Time Team’s story. It’s an opportunity to find new voices and should help launch a new generation of archaeologists. While I won’t be involved in the new sites, I was delighted to accept the role of honorary patron of the Time Team project. It makes me chief super-fan and supporter. All armoury in our shared desire to inspire and stimulate interest in archaeology at all levels.” Like Tony, I too am a great fan of Time Team and feel sure that this bodes well, as there is now a dedicated Time Team website.

This week…

A Turkish proverb.

Click: Return to top of page or Index page

All Fools’ Day

More commonly known as April Fools’ Day, this is celebrated on April 1st each year and has been marked for several centuries by many different cultures, though its exact origins remain a mystery. Traditions include playing hoaxes or practical jokes on others, often ending the event by calling out “April Fool!” to the recipient so they realise they’ve been caught out by the prank. Whilst its exact history is shrouded in mystery, the embrace of April Fools’ Day jokes by the media and major brands has ensured the unofficial holiday’s long life. Mass media can be involved in these pranks, which may then be revealed as such on the following day. The day itself is not a public holiday in any country, although in Odessa, Ukraine, the first of April is an official city holiday. The custom of setting aside a day for playing harmless pranks upon one’s neighbour has become a relatively common one in the world, and a disputed association between 1 April and foolishness appears in Geoffrey Chaucer’s ‘The Canterbury Tales’ (1392), in the ‘Nun’s Priest’s Tale’, where a vain person is tricked by a fox with the words ‘Since March began thirty days and two’, i.e. 32 days since March began, which is April 1st. In 1508, French poet Eloy d’Amerval referred to a ‘poisson d’avril’, possibly the first reference to the celebration in France. Prompted by the Protestant Reformation, the Council of Trent, an ecumenical council of the Catholic Church, issued condemnations of what it defined to be heresies committed by proponents of Protestantism and also issued key statements and clarifications of the Church’s doctrine and teachings, including scripture, the Biblical canon, sacred tradition, original sin, the sacraments, the Mass and the veneration of saints.
The Council met for twenty-five sessions between 13 December 1545 and 4 December 1563. Pope Paul III, who convoked (called together) the Council, oversaw the first eight sessions between 1545 and 1547, whilst the twelfth to sixteenth sessions, held between 1551 and 1552, were overseen by Pope Julius III, and the final seventeenth to twenty-fifth sessions by Pope Pius IV between 1562 and 1563. In France, the use of January 1st as New Year’s Day was not adopted officially until the Edict of Roussillon in 1564; the country’s switch from the Julian to the Gregorian calendar followed later, in 1582. In the Julian calendar, as in the Hindu calendar, the new year began with the spring equinox around April 1st. So people who were slow to get the news of the change of New Year’s Day, or who simply failed to realise it and continued to celebrate the start of the new year during the last week of March and into April, became the butt of jokes and hoaxes and were therefore called “April fools”. These pranks included having paper fish placed on their backs and being referred to as “poisson d’avril” (April fish), said to symbolise a young, easily caught fish or a gullible person. In 1686, a writer named John Aubrey referred to the celebration as ‘Fooles holy day’, the first British reference. On 1 April 1698, several people were tricked into going to the Tower of London to “see the Lions washed”.

An 1857 ticket to “Washing the Lions” at the Tower of London. No such event was ever held.

A study in the 1950s by two folklorists found that in the UK, and in countries whose traditions derived from Britain, the joking ceased at midday, and this continues to be the practice: after noon it is no longer acceptable to play pranks, and a person playing a prank after midday is considered to be the ‘April fool’ themselves. Meanwhile in Scotland, April Fools’ Day was originally called ‘Huntigowk Day’. The name is actually a corruption of ‘hunt the gowk’, a ‘gowk’ being Scots for a cuckoo or a foolish person. Alternative terms in Gaelic would be ‘Là na Gocaireachd’, ‘gowking day’, or ‘Là Ruith na Cuthaige’, ‘the day of running the cuckoo’. The traditional prank is to ask someone to deliver a sealed message that supposedly requests help of some sort. In fact, the message reads “Dinna laugh, dinna smile. Hunt the gowk another mile.” The recipient, upon reading it, will explain they can only help if they first contact another person, and they send the victim to this next person with an identical message, with the same result. In England a ‘fool’ is known by a few different names around the country, including ‘noodle’, ‘gob’, ‘gobby’ and ‘noddy’.

Big Ben going digital…

On April Fools’ Day 1980, the BBC announced that Big Ben’s clock face was going digital and that whoever got in touch first could win the clock hands. Over in Ireland, it was traditional to entrust a victim with an “important letter” to be given to a named person. That person would read the letter, then ask the victim to take it to someone else, and so on. When finally opened, the letter contained the words “send the fool further”. A day of pranks is also a centuries-long tradition in Poland, signified by ‘prima aprilis’, ‘First April’ in Latin. It is a day when many pranks are played and hoaxes, sometimes very sophisticated ones, are prepared by people as well as the media (which often cooperate to make the ‘information’ more credible) and even public institutions. Serious activities are usually avoided, and generally every word said on April 1st could be untrue. The conviction for this is so strong that the Polish anti-Turkish alliance with Leopold I, signed on 1 April 1683, was backdated to 31 March. But for some in Poland ‘prima aprilis’ also ends at noon of 1 April, and such jokes after that hour are considered inappropriate and not classy. Over in the Nordic countries, Danes, Finns, Icelanders, Norwegians and Swedes celebrate April Fools’ Day: it is ‘aprilsnar’ in Danish, ‘aprillipäivä’ in Finnish and ‘aprilskämt’ in Swedish. In these countries, most news media outlets will publish exactly one false story on 1 April; for newspapers this will typically be a first-page article but not the top headline. In Italy, France, Belgium and the French-speaking areas of Switzerland and Canada, the April 1st tradition is similarly known as April fish: ‘poisson d’avril’ in French, ‘aprilvis’ in Dutch and ‘pesce d’aprile’ in Italian. Possible pranks include attempting to attach a paper fish to the victim’s back without being noticed. This fish features prominently on many late 19th- to early 20th-century French April Fools’ Day postcards.
Many newspapers also spread a false story on April Fish Day, and a subtle reference to a fish is sometimes given as a clue that it is an April Fools’ prank. In Germany, as in the UK, an April Fool prank is sometimes later revealed by shouting “April fool!” at the recipient, who becomes the April fool. Over in Ukraine, April Fools’ Day is widely celebrated in Odessa and has the special local name ‘Humorina’. It seems that this holiday arose in 1973, and an April Fool prank is revealed by saying “Pervoye Aprelya, nikomu ne veryu”, which means “April the First, I trust nobody”, to the recipient. The festival includes a large parade in the city centre, free concerts, street fairs and performances. Festival participants dress up in a variety of costumes and walk around the city fooling around and pranking passers-by. One of the traditions on April Fools’ Day is to dress up the main city monument in funny clothes. Humorina even has its own logo, a cheerful sailor in a lifebelt, created by the artist Arkady Tsykun. During the festival, special souvenirs bearing the logo are printed and sold everywhere. Quite why or how this began I cannot determine, but since 2010 April Fools’ Day celebrations have included an International Clown Festival, and both are celebrated as one. In 2019, the festival was dedicated to the 100th anniversary of the Odessa Film Studio and all events were held with an emphasis on cinema.

An April Fools’ Day prank in the Public Garden in Boston, Massachusetts.
The sign reads “No Photography Of The Ducklings Permitted”

As well as people playing pranks on one another on April Fools’ Day, elaborate pranks have appeared on radio and television stations, in newspapers and on websites, as well as those performed by large corporations. In one famous prank in 1957, the BBC broadcast a film in their ‘Panorama’ current affairs series purporting to show Swiss farmers picking freshly grown spaghetti, in what they called the Swiss spaghetti harvest. The BBC was soon flooded with requests to purchase a spaghetti plant, forcing them to declare the film a hoax on the news the next day. With the advent of the Internet and readily available global news services, April Fools’ pranks can catch and embarrass a wider audience than ever before. But the practice of April Fool pranks and hoaxes is somewhat controversial. The mixed opinions of critics are epitomised in the reception of the 1957 BBC ‘spaghetti tree hoax’: newspapers were split over whether it was a great joke or a terrible hoax on the public. The positive view is that April Fools’ can be good for one’s health because it encourages ‘jokes, hoaxes, pranks, and belly laughs’ and brings all the benefits of laughter, including stress relief and reducing strain on the heart. There are many ‘best of’ April Fools’ Day lists compiled to showcase the best examples of how the day is celebrated, and various April Fools’ campaigns have been praised for their innovation, creativity, writing and general effort. However, the negative view describes April Fools’ hoaxes as ‘creepy and manipulative, rude and a little bit nasty’, as well as based on deceit and on ‘Schadenfreude’, the pleasure, joy or self-satisfaction that comes from learning of or witnessing the troubles, failures or humiliation of another.
When genuine news or a genuinely important order or warning is issued on April Fools’ Day, there is a risk that it will be misinterpreted as a joke and ignored. For example, when Google (known to play elaborate April Fools’ Day hoaxes) announced in 2004 the launch of Gmail with one-gigabyte inboxes, in an era when competing webmail services offered four megabytes or less, many dismissed it as an outright joke. On the other hand, sometimes stories intended as jokes are taken seriously. Either way, there can be adverse effects such as confusion, misinformation, wasted resources (especially when the hoax concerns people in danger) and even legal or commercial consequences. In Thailand, the police even warned ahead of April Fools’ Day in 2021 that posting or sharing fake news online could lead to a maximum of five years’ imprisonment. Other examples of genuine news on April 1st mistaken for a hoax include warnings about the tsunami caused by the Aleutian Islands earthquake, which killed 165 people in Hawaii and Alaska in 1946; news on April 1st that the comedian Mitch Hedberg had died on 29 March 2005; an announcement in 2009 that the long-running American soap opera ‘Guiding Light’ was being cancelled; and news in 2011 that the American basketball player Isaiah Thomas had declared for the NBA draft, probably because of his age. As well as April 1st being recognised as April Fools’ Day, there are a few other notable days, particularly the first of each month when, in English-speaking countries (mainly Britain, Ireland, Australia, New Zealand and South Africa), it is a custom to say “a pinch and a punch for the first of the month” or a similar alternative, though this is typically said by children. In some places the victim might respond with “a flick and a kick for being so quick”, but that I haven’t heard said for many a long year.
I do still say (or share in text messages etc) “White rabbits” as this is meant to bring good luck and to prevent the recipient saying ‘pinch, punch, first of the month’ to you! I do wonder sometimes how one of my older brothers managed at school on this particular day though, as April 1st is his birthday – perhaps he managed to keep it quiet somehow…

This week…
There are so many words in English that seem to have fallen out of use and I am starting to find a few. We know that when a word is used to lay emphasis on a noun, it is called an emphatic adjective. Examples are found in “The very idea of living on the moon is impractical” and “They are the only people who helped me”, where ‘very’ and ‘only’ provide the emphasis. But there are also ‘phatic’ expressions, these being ones denoting or relating to language used for general purposes of social interaction, rather than to convey information or ask questions. Utterances such as “hello, how are you?” and “nice morning, isn’t it?” are examples of phatic expressions.

Click: Return to top of page or Index page

This Earth

This Earth has been in existence for quite a long while, and I do wonder how many folk consider that, and also how much this amazing planet has changed over time. We as humans haven’t been here all that long, and it is generally believed that as a species Homo sapiens evolved in Africa during a time of dramatic climate change some 300,000 years ago. Like other early humans living around that time, we gathered and hunted for food, evolving behaviours that helped us to respond to the challenges of survival in unstable environments. To begin with, we certainly had a few ideas about ourselves and the Earth itself that have since been proven wrong. There have been a number of such misconceptions, a few of these being as follows. Ancient Greek and Roman sculptures were originally painted with bright colours; they only appear white today because the original pigments have deteriorated, and some well-preserved statues still bear traces of their original colouration. Also, the tomb of Tutankhamun is not inscribed with a curse on those who disturb it; this was an invention of 20th-century tabloid journalists. The ancient Greeks did not use the word ‘idiot’ to disparage people who did not take part in civic life or who did not vote. An idiot was simply a private citizen as opposed to a government official. Later, the word came to mean any sort of non-expert or layman, then someone uneducated or ignorant, and much later to mean stupid or mentally deficient.

Oath of the Horatii by Jacques-Louis David in 1784.

According to ancient Roman legend, the Horatii were triplet warriors who lived during the reign of Tullus Hostilius (r. 672–640 BC). Accounts of his death vary: in the mythological version of events he had angered Jupiter, who then killed him with a bolt of lightning, whilst non-mythological sources describe him dying of a plague after ruling for 32 years. There is also no evidence of the use of the Roman salute by ancient Romans (as depicted in the above painting) for greeting or any other purpose. The idea that the salute was popular in ancient times originated from the painting, but it then inspired later salutes, most notably the Nazi salute. Another idea was that Julius Caesar was born via Caesarean section, but at the time of his birth such a procedure would have been fatal to the mother, and Caesar’s mother was still alive when he was 45 years old. The name ‘caesarean’ probably comes from the Latin verb ‘caedere’, meaning ‘to cut’. Then there is the myth of the Earth being flat. In fact the earliest clear documentation of the idea of a spherical Earth comes from the ancient Greeks in the 5th century BC. The belief was already widespread in Greece by the time of Eratosthenes of Cyrene (c. 276–194 BC), a man of learning who was a mathematician, geographer, poet, astronomer and music theorist; he also became the chief librarian at the Library of Alexandria and introduced some of the terminology still in use today. As a result, most European and Middle Eastern scholars accepted that the Earth was spherical, and belief in a flat Earth amongst educated Europeans was almost nonexistent from the Late Middle Ages onward, although fanciful depictions appear in some art. However, by the 1490s there was still an issue as to the size of the Earth, and in particular the position of the east coast of Asia.
Historical estimates from Ptolemy, also a mathematician, astronomer, astrologer, geographer and music theorist, placed the coast of Asia about 180° east of the Canary Islands. Columbus adopted an earlier (and rejected) distance of 225°, added 28° (based on Marco Polo’s travels), and then placed Japan a further 30° east. Starting from Cape St Vincent in Portugal, Columbus thus made Eurasia stretch 283° to the east, leaving the Atlantic as only 77° wide. Since he planned to leave from the Canaries, 9° further west, his trip to Japan would only have to cover 68° of longitude. Columbus also mistakenly assumed that the mile referred to in the Arabic estimate of 56⅔ miles for the size of a degree was the Italian mile of 1,480 metres, which is actually much shorter. His estimate for the size of the degree, and hence for the circumference of the Earth, was therefore about 25% too small. The combined effect of these mistakes was that Columbus estimated the distance to Japan to be only about 5,000 km, reaching only to the eastern edge of the Caribbean, whilst the true figure is about 20,000 km. The Spanish scholars may not have known the exact distance to the east coast of Asia, but they believed that it was significantly further than Columbus’s projection, and this was the basis of the criticism in Spain and Portugal, whether academic or among mariners, of the proposed voyage. The disputed point was not the shape of the Earth, nor the idea that going west would eventually lead to Japan and China, but the ability of European ships to sail that far across open seas. The small ships of the day simply could not carry enough food and water to reach Japan; Columbus’s three ships varied in length between 20.5 and 23.5 metres (67 to 77 feet) and carried about 90 men.
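Out of curiosity, the arithmetic above can be checked with a few lines of Python. The degree totals and mile lengths are the figures quoted above; the cos(28°) adjustment for sailing along a parallel rather than the equator is my own assumption about how the roughly 5,000 km figure arises, so treat that last line as a sketch rather than a definitive reconstruction.

```python
import math

# Columbus's longitude bookkeeping, using the figures quoted above
eurasia_span = 225 + 28 + 30        # rejected Ptolemaic base + Marco Polo + Japan offset
atlantic_width = 360 - eurasia_span # longitude left over for the ocean
voyage = atlantic_width - 9         # departing from the Canaries, 9 degrees further west

# Degree length: the Arabic 56 2/3-mile degree read as Italian miles of 1,480 m
degree_km = (56 + 2/3) * 1.480      # ~83.9 km per degree
circumference_km = degree_km * 360  # ~30,200 km
error = 1 - circumference_km / 40_075  # vs the true circumference: ~25% too small

# Distance to Japan, assuming a course along roughly the 28N parallel
distance_km = voyage * degree_km * math.cos(math.radians(28))

print(eurasia_span, atlantic_width, voyage)      # 283 77 68
print(round(circumference_km), round(error, 2))  # 30192 0.25
print(round(distance_km))                        # about 5,000 km
```

The same sums show why the scholars objected: with a true degree of about 111 km, those 68° represent well over 19,000 km of open water.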
In fact the ships barely reached the eastern Caribbean islands, and already the crews were mutinous, not because of some fear of ‘sailing off the edge’, but because they were running out of food and water with no chance of any new supplies within sailing distance. They were on the edge of starvation. What saved Columbus was the unknown existence of the Americas precisely at the point he thought he would reach Japan. His ability to resupply with food and water from the Caribbean islands allowed him to return safely to Europe; otherwise his crews would have died and the ships foundered. Since the early 20th century, quite a number of books and articles have documented the flat Earth error as one of a number of widespread misconceptions in popular views of the Middle Ages, and although the misconception has been frequently refuted in historical scholarship since at least 1920, it persisted in popular culture and in some school textbooks into the 21st century. An American schoolbook by Emma Miller Bolenius published in 1919 has this introduction to the suggested reading for Columbus Day, October 12th: “When Columbus lived, people thought that the Earth was flat. They believed the Atlantic Ocean to be filled with monsters large enough to devour their ships, and with fearful waterfalls over which their frail vessels would plunge to destruction. Columbus had to fight these foolish beliefs in order to get men to sail with him. He felt sure the Earth was round”.

The semi-circular shadow of Earth on the Moon during a partial lunar eclipse.

Pythagoras in the 6th century BC and Parmenides in the 5th century BC stated that the Earth was spherical, and this view spread rapidly in the Greek world. Around 330 BC Aristotle maintained, on the basis of physical theory and observational evidence, that the Earth was indeed spherical and reported an estimate of its circumference; the value was first properly determined around 240 BC by Eratosthenes. By the 2nd century AD, Ptolemy had derived his maps from a globe and developed the system of latitude, longitude and climes. His Almagest was a Greek-language mathematical and astronomical treatise on the apparent motions of the stars and the planetary paths. One of the most influential scientific texts in history, it canonised a geocentric model of the Universe that was accepted for more than 1,200 years from its origin in Hellenistic Alexandria, in the medieval Byzantine and Islamic worlds, and in Western Europe through the Middle Ages and early Renaissance until Copernicus. It is also a key source of information about ancient Greek astronomy. The work was originally written in Greek and was only translated into Latin in the 11th century, from Arabic translations. It is fascinating to consider that in the first century BC, Lucretius opposed the concept of a spherical Earth because he considered that an infinite universe had no centre towards which heavy bodies would tend. Thus he thought the idea of animals walking around topsy-turvy under the Earth was absurd. By the 1st century AD, Pliny the Elder was in a position to claim that everyone agreed on the spherical shape of Earth, though disputes continued regarding the nature of the antipodes, and how it was possible for the oceans to keep to a curved shape.

Thorntonbank Wind Farm near the Belgian coast.

In the above image of Thorntonbank Wind Farm, the lower parts of the more distant towers are increasingly hidden by the horizon, demonstrating the curvature of the Earth. But even in the modern era, pseudoscientific belief in a flat Earth has persisted, originating with the English writer Samuel Rowbotham in his 1849 pamphlet ‘Zetetic Astronomy’. Lady Elizabeth Blount established the Universal Zetetic Society in 1893, which published journals. There were other flat-Earthers in the 19th and early 20th centuries, and in 1956 Samuel Shenton set up the International Flat Earth Research Society (IFERS), better known as the Flat Earth Society, from Dover, England, as a direct descendant of the Universal Zetetic Society. In the era of the Internet, communications technology and social media such as Facebook, YouTube and Twitter have made it easy for individuals, famous or not, to spread disinformation and attract others to erroneous ideas, including that of the flat Earth. I still smile at the advert I once saw which read “Join the Flat Earth Society – branches all around the world”. To maintain belief in the face of the overwhelming, publicly available empirical evidence accumulated in the Space Age, modern flat-Earthers must generally embrace some form of conspiracy theory, out of the necessity of explaining why major institutions such as governments, media outlets, schools, scientists and airlines all assert that the world is a sphere. They tend not to trust observations they have not made themselves, and often distrust or disagree with each other. As so many do over so many things. I think that what can also be difficult to comprehend or imagine is the sheer size of our Earth, our solar system, the Milky Way and beyond. Science has enabled us to see, through microscopes and the like, things which are so tiny that we need devices to perceive them.
We do now have telescopes, but even those often use infra-red (which our eyes cannot see naturally) to ‘see’ what is a great distance from our planet. On one of the websites I look at there are often questions raised which are good ones, but equally there are a few which show that the writer seems to have no concept of how large the Universe really is. As an example, one question recently shared was “If telescopes can see billions of light years away, what stops us from seeing detailed images of planet surfaces to check for plants or other life?”. The answer given was that the Andromeda Galaxy is actually about 2.5 million light-years from Earth, but even when we use the Hubble telescope to see the surface of the planet Mars, which is only about 0.000042 light-years away, the sharpest image of the surface of Mars is very blurry, like the one below. This is because of apparent, or angular, size: what matters to a telescope is not distance alone but how large an object appears in the sky. The Andromeda Galaxy, although incredibly distant, is so vast that it appears several times wider than the full Moon, whereas Mars, though vastly closer, appears as only a tiny dot. That is why distant galaxies can be seen well with a telescope while the surfaces of nearby planets remain blurred.
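A quick back-of-the-envelope check illustrates this, using the small-angle rule (apparent size in radians ≈ diameter ÷ distance). The figures below are my own rough, rounded assumptions for illustration, not precise astronomical data – a minimal sketch in Python:

```python
import math

LY_KM = 9.46e12  # kilometres in one light-year

def angular_size_arcsec(diameter_km, distance_km):
    # Small-angle approximation: apparent size = diameter / distance (radians),
    # converted here to arcseconds.
    return math.degrees(diameter_km / distance_km) * 3600

# Rough illustrative figures (assumptions, not precise data):
andromeda = angular_size_arcsec(140_000 * LY_KM, 2.5e6 * LY_KM)  # galaxy ~140,000 ly across
mars = angular_size_arcsec(6_779, 0.000042 * LY_KM)              # Mars ~6,779 km across

print(f"Andromeda: about {andromeda / 3600:.1f} degrees across")
print(f"Mars:      about {mars:.1f} arcseconds across")
```

On these figures Andromeda spans roughly three degrees of sky, several times the width of the full Moon, whilst Mars at that distance covers only a few arcseconds, which is why a telescope resolves the galaxy so much better than the planet.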

The Andromeda Galaxy.
The surface of Mars.

So far as this Earth is concerned, whilst we have generally explored almost the entire continental surface, with the exception of Antarctica that is, there are substantial parts of the ocean that remain unexplored and not fully studied. Even the latest technological advances for mapping the seafloor are limited in what they can do in the oceans. I have mentioned before a computer app, freely available, which also utilises a website called What3Words. It divides the world into individual three-metre squares and gives each one a unique three-word address, in order for people to be easily found in emergencies. It also gives people without a formal address access to one for the first time, whether for a permanent dwelling or somewhere halfway up the side of a mountain, for example. I think this is especially useful for emergency services to locate people, even in the sea, as the UK version includes that. Whether we think this is a good thing or not, it means that everywhere in the world now has an address, even a tent in the middle of a field or a ditch on the North York Moors! One example, in this case the entrance to Peterborough railway station, is a link which when clicked opens a web page showing a map of Peterborough, with the square allocated by what3words to the railway station entrance. There are no duplications. This program is available on Apple and, I believe, Google devices; I think it may be of use to folk on such things as countryside walks or simply for meeting up with friends.
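The arithmetic behind the scheme is quite pleasing: the Earth’s surface, divided into three-metre squares, needs tens of trillions of labels, and an ordered triple of words from a modest dictionary provides just enough. The figures below are rough assumptions for illustration, not official what3words data:

```python
# Back-of-the-envelope check (rough assumed figures, not official data)
EARTH_SURFACE_M2 = 5.1e14   # ~510 million square kilometres, in square metres
SQUARE_AREA_M2 = 3 * 3      # each square is 3 m x 3 m
WORD_LIST_SIZE = 40_000     # assumed size of the English word list

squares_needed = EARTH_SURFACE_M2 / SQUARE_AREA_M2   # roughly 57 trillion squares
addresses = WORD_LIST_SIZE ** 3                      # ordered triples of words

print(f"Squares to label:     {squares_needed:.2e}")
print(f"Three-word addresses: {addresses:.2e}")
print("Enough combinations?", addresses > squares_needed)
```

Because word order matters, 40,000 words yield 40,000³, about 64 trillion addresses, comfortably more than the roughly 57 trillion squares to be labelled.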

In previous blog posts I have written a little about this Earth, its language and transportation by road and rail as well as aviation. Each has its individual benefits and we have certainly come a long way in these things. Sadly however, so many advances have come as a result of wars, with either individuals or groups for some reason wanting to better one another. As a result I still struggle to comprehend this human need. Still, it goes on around us and I expect will continue to do so for years to come, long after I am gone. Just as those before us lived for a time and passed away, so shall we. I remember when I was quite young talking to our local vicar after he had spoken about heaven and earth, and saying to him how I thought that Heaven must be a very big place, thinking of all the many people who had died over time. I remember the vicar smiling gently and telling me that I was applying Earthly values to Heavenly things. I didn’t understand him at the time, but I learned in time that he was right. It took a long time to realise just how vastly, hugely enormous the Universe is; we simply cannot imagine it. But it exists, at least I believe it does! So when I learn of how certain people on this beautiful Earth are behaving, I think on how their lives will end, new ones will spring up, things will change and I hope, in years to come, we may yet learn to all live peacefully together. You will forgive me if I do not hold my breath on that one though! This world turns, the seasons change, no matter what our individual thoughts or our beliefs may be. There is good in the world, we must believe in it, do all we can, openly and honestly, and be thankful for what we have. We still have a few million years left!

This week…a tale from a few years ago.
I had bought an old Land Rover Series 3 which was quite good, but I found it needed a bit of repair on the steering mechanism. It meant that as I was driving along, rather than steering straight I was constantly correcting it, so the vehicle would seem to almost ‘wander’ from side to side a little! I was driving home one day and was stopped by a local policeman, who stood by the driver’s open window in such a way that he could smell my breath – Land Rovers sit quite high up on the road. He asked me if I had been drinking, or had I only just bought the vehicle. He already knew the answers to both questions, but was checking with me! I assured him I had not been drinking but had recently purchased the vehicle; he agreed, and even recommended a local garage which specialised in Land Rover repairs. I was advised to get the steering problem attended to as soon as possible, and when I took it to this garage, the staff there were sure they knew who this policeman was, as he himself was the proud owner of a Land Rover and was a regular customer of theirs!

Click: Return to top of page or Index page

Human Aviation

We have for so many years been fascinated by watching birds fly and have tried to do so ourselves. There are a few myths and legends of flight and my research has found some entertaining ones – these are just a few of them. According to Greek legend, Bellerophon the Valiant, the son of the King of Corinth, captured Pegasus the winged horse, who took him into battle against the triple-headed monster Chimera. In another Ancient Greek legend, King Minos imprisoned an engineer named Daedalus, and with his son Icarus they made wings of wax and feathers. Daedalus flew successfully from Crete to Naples, but Icarus tried to fly too high and flew too near to the sun, so the wings of wax melted and Icarus fell to his death in the ocean. It is also said that King Kaj Kaoos of Persia attached eagles to his throne and flew around his kingdom, whilst Alexander the Great harnessed four great mythical winged animals called griffins to a basket and flew around his realm. But in fact I understand it was around 400 BC that the Chinese first made kites that could fly in the air, and this started us thinking about flying. To begin with, kites were used by the Chinese in religious ceremonies and they built many colourful ones for fun; later, more sophisticated kites were used to test weather conditions. Kites have been as important to the invention of flight as anything, for they were the forerunners of balloons and gliders. People have tested their ability to fly by attaching feathers or lightweight wood to their arms, but the results were often disastrous, as the muscles of human arms are simply not like the wings of birds and do not have the required strength. But an ancient Greek engineer named Hero of Alexandria worked with air pressure and steam to create sources of power, and one of the devices he developed was the ‘aeolipile’, which used jets of steam to create rotary motion.
Hero mounted a sphere on top of a water kettle; a fire below the kettle turned the water into steam, and the gas then travelled through pipes to the sphere. Two L-shaped tubes on opposite sides of the sphere allowed the gas to escape, which gave a thrust to the sphere that caused it to rotate. Leonardo da Vinci made the first real studies of flight in the 1480s and produced over 100 drawings that illustrated his theories on flight, though his ornithopter flying machine was never actually built. It was a design he created to show how man could fly, and the modern-day helicopter owes something to his related concepts. The two brothers Joseph-Michel and Jacques-Etienne Montgolfier were the inventors of the first hot-air balloon; they used the smoke from a fire to blow hot air into a silk bag which was attached to a basket. The hot air then rose and allowed the balloon to become lighter than air. In 1783 the first passengers in the colourful balloon were a sheep, a rooster and a duck. It climbed to a height of about 6,000 feet and travelled more than a mile, and after this first success the brothers began to send men up in balloons. The first manned flight was on November 21, 1783; the passengers were Jean-Francois Pilatre de Rozier and Francois Laurent. Meanwhile, George Cayley worked to discover a way that man could fly. He designed many different versions of gliders that used the movements of the body for control, and a young boy, whose name is not known, was the first to fly one of them. Over fifty years Cayley made improvements to his gliders, changing the shape of the wings so that the air would flow over them correctly. He also designed a tail for the gliders to help with stability. He tried a biplane design to add strength to the glider, and recognised that there would be a need for power if the flight was to stay in the air for a long time.
Cayley also wrote ‘On Aerial Navigation’, which showed that a fixed-wing aircraft with a power system for propulsion and a tail to assist in control would be the best way to allow man to fly. A German engineer, Otto Lilienthal, studied aerodynamics and worked to design a glider that would fly. He was the first person to design a glider that could carry a person and was able to fly long distances. Fascinated by the idea of flight, and based on his studies of birds and how they flew, he wrote a book on aerodynamics that was published in 1889, and this text was used by the Wright Brothers as the basis for their designs. Around the same time, Samuel Langley, an astronomer, realised that power was needed to help man fly. He built a model of an aircraft which he called an ‘aerodrome’ that included a steam-powered engine, and in 1896 his model flew for three-quarters of a mile before running out of fuel. Langley then received a $50,000 grant to build a full-sized ‘aerodrome’, but it was too heavy to fly and it crashed. He was of course very disappointed at this and gave up trying to fly. His major contributions to flight involved attempts at adding a power plant to a glider; he was well known too as the Secretary of the Smithsonian Institution in Washington, DC in the U.S.A.

A Wright Brothers Unpowered Aircraft.

Orville and Wilbur Wright were very deliberate in their quest for flight. First, they read about all the early developments of flight. They decided to make “a small contribution” to the study of flight control by twisting their wings in flight. Then they began to test their ideas with a kite. They learned how the wind would help with the flight and how it could affect the surfaces once up in the air, and using a methodical approach concentrating on the controllability of the aircraft, the brothers built and tested a series of kite and glider designs from 1898 to 1902 before attempting to build a proper powered design. The gliders worked, but not as well as the Wrights had expected based on the experiments and writings of their predecessors. Their first full-size glider, launched in 1900, had only about half the lift they anticipated. Their second glider, built the following year, performed even more poorly, but rather than giving up, the Wrights constructed their own wind tunnel and created a number of sophisticated devices to measure lift and drag on the 200 wing designs they tested. As a result, the Wrights corrected earlier mistakes in their calculations and, with much testing and calculating, produced a third glider with a higher aspect ratio and true three-axis control. They flew it successfully hundreds of times in 1902, and it performed far better than the previous models. The next step was to test the shapes of gliders, much as George Cayley had done when testing the many different shapes that would fly. Finally, with a perfected glider shape, they turned their attention to creating a propulsion system that would provide the thrust needed to fly. The early engine that they designed generated almost 12 horsepower – about the same power as two hand-propelled lawn mower engines! The “Flyer” lifted from level ground to the north of Big Kill Devil Hill, North Carolina, at 10:35 a.m. on December 17, 1903.
Orville piloted the plane, which weighed about six hundred pounds. The first heavier-than-air flight travelled 120 feet in 12 seconds. The two brothers took turns flying that day, with the fourth and last flight covering 850 feet in 59 seconds, but the Flyer was unstable and very hard to control. The brothers returned to Dayton, Ohio, where they worked for two more years perfecting their design and finally, on October 5, 1905, Wilbur piloted the Flyer III for 39 minutes and about 24 miles in circles around Huffman Prairie. He flew the first practical aircraft until it ran out of fuel. By using a rigorous system of experimentation, involving wind-tunnel testing of airfoils and flight testing of full-size prototypes, the Wrights not only built a working aircraft but also helped advance the science of aeronautical engineering. The brothers appear to have been the first to make serious, studied attempts to solve both the power and control problems at the same time. These problems proved difficult, but they never lost interest and eventually solved them. Then, almost as an afterthought, they designed and built a low-powered internal combustion engine. They also designed and carved wooden propellers that were more efficient than any before, enabling them to gain adequate performance from their low engine power. Whilst many aviation pioneers appeared to leave safety largely to chance, the Wrights’ design was greatly influenced by the need to teach themselves to fly without unreasonable risk to life and limb, by surviving crashes! This emphasis, as well as low engine power, was the reason for the low flying speed and for taking off into a headwind. Performance, rather than safety, was the reason for the rear-heavy design, because the wing designs made the aircraft less affected by crosswinds and easier to fly.
Since then, many new aeroplanes along with different engines have been developed to help transport people, luggage, cargo, military personnel and weapons around the globe, but their advances were all based on these first flights by the Wright Brothers.

The Wright Flyer, the first sustained flight with a powered, controlled aircraft.

In fact the history of aviation extends for more than two thousand years, from the earliest forms such as kites, and even attempts at tower jumping, all the way through to supersonic flight by powered, heavier-than-air jets. The discovery of hydrogen gas in the 18th century led to the invention of the hydrogen balloon at almost exactly the same time that the Montgolfier brothers rediscovered the hot-air balloon and began manned flights. Various theories in mechanics developed by physicists during the same period, notably fluid dynamics and Newton’s Laws of Motion, laid the foundation of modern aerodynamics. Balloons, both free-flying and tethered, began to be used for military purposes from the end of the 18th century, with the French government establishing Balloon Companies during the Revolution. Experiments with gliders provided the groundwork for heavier-than-air craft, and by the early 20th century advances in engine technology and aerodynamics made controlled, powered flight possible for the first time. The modern aeroplane with its characteristic tail was established by 1909, and from then on its history became tied to the development of more and more powerful engines. The first great ships of the air were the rigid dirigible balloons pioneered by Ferdinand von Zeppelin, a name which soon became synonymous with airships and which dominated long-distance flight until the 1930s, when large flying boats became popular. The ‘pioneer’ era from 1903 to 1914 also saw the development of practical aeroplanes and airships and their early application, alongside balloons and kites, for private, sport and military use. Eventually though, flight became an established technology and over a period of a few years more controls were added, bringing recognition of powered flight as something other than the preserve of dreamers and eccentrics.
Such things as ailerons, radio-telephones and guns were included, and it was not long before aircraft were shooting at each other, though the lack of any sort of steady point for the gun was a problem. The French solved this problem when, in late 1914, Roland Garros attached a fixed machine gun to the front of his aircraft. Aviators were styled as modern-day knights, doing individual combat with their enemies. Several pilots became famous for their air-to-air combat, the best known being Manfred von Richthofen, better known as the ‘Red Baron’, who shot down eighty planes in air-to-air combat using several different aircraft, the most celebrated of which was a red triplane, that being one fitted with three wings. France, Britain, Germany and Italy were the leading manufacturers of fighter planes that saw action during the war, and in the years between the two World Wars there were great advancements in aircraft technology. Aircraft evolved from low-powered biplanes and triplanes made from wood and fabric to sleek, high-powered monoplanes made of aluminium, based primarily on the founding work of Hugo Junkers during World War I and its adoption by other designers. As a result, the age of the great rigid airships came and went. The first successful flying machines that used rotary wings appeared in the form of the autogyro, first flown in 1923. In that design, the rotor is not powered but is spun like a windmill by its passage through the air, whilst a separate power plant is used to propel the aircraft forwards. Helicopters were developed, and in the 1930s development of the jet engine began in Germany and in Britain; both countries would go on to fly jet aircraft by the end of World War II. This era saw a great increase in the pace of development and production, not only of aircraft but also of the associated flight-based weapon delivery systems. Air combat tactics and doctrines took advantage of these advances.
Large-scale strategic bombing campaigns were launched, fighter escorts were introduced, and more flexible aircraft and weapons allowed precise attacks on small targets using various types of attack aircraft. New technologies like radar also allowed more coordinated and controlled deployment of air defence.

Messerschmitt Me262, the first operational jet fighter.

The first jet aircraft to fly was the German Heinkel He178 in 1939, followed by the Me262, which first flew with jet power in July 1942 and went on to become the world’s first operational jet fighter. British developments like the Gloster Meteor followed, but these saw only brief use in World War II. Jet and rocket aircraft had only limited impact due to their late introduction, fuel shortages, a lack of experienced pilots and the declining war industry of Germany. In the latter part of the 20th century, the advent of digital electronics produced great advances in flight instrumentation and “fly-by-wire” systems, with the 21st century bringing the large-scale use of pilotless drones for military, civilian and leisure use, and inherently unstable aircraft such as ‘flying wings’ becoming possible through the use of digital controls.

The de Havilland Comet, the world’s first jet airliner, which also saw service in the Royal Air Force.

Also after World War II, commercial aviation grew rapidly, using mostly ex-military aircraft to transport people and cargo. By 1952, the British Overseas Airways Corporation (BOAC) had introduced the Comet into scheduled service. Whilst a technical achievement, the plane suffered a series of highly public failures, as the shape of its windows led to cracks due to metal fatigue. The fatigue was caused by cycles of pressurisation and depressurisation of the cabin, and eventually led to catastrophic failure of the plane’s fuselage. By the time the problems were overcome, by making the windows oval rather than square, other jet airliner designs had already taken to the skies. Much more could be written here about the changes, including the ‘jet age’, supersonic flight, even getting into space, but I think that will be for another time. Suffice to say that 21st-century aviation has seen increasing interest in fuel savings and fuel diversification, as well as low-cost airlines and facilities. Also, much of the developing world that did not have good access to air transport has been steadily adding aircraft and facilities, though severe congestion remains a problem in many up-and-coming nations. But we continue to strive, to develop. On 19 April 2021, the National Aeronautics and Space Administration (NASA) successfully flew an unmanned helicopter on Mars, making it humanity’s first controlled powered flight on another planet. ‘Ingenuity’ rose to a height of 3 metres and hovered in a stable holding position for 30 seconds, after a vertical take-off that was filmed by its accompanying rover, ‘Perseverance’. Then on 22 April 2021, ‘Ingenuity’ made a second, more complex flight, which was also observed by ‘Perseverance’. As an homage to all of its aerial predecessors, the ‘Ingenuity’ helicopter carries with it a very small, postage-stamp-sized fragment from the wing of the 1903 Wright Flyer.

This week…
It just goes to show how some historical events aren’t remembered. Back in 2016, on the television quiz show ‘Pointless’, a relatively young contestant chose to answer the question “Who was assassinated by Lee Harvey Oswald in Dallas?”. They answered, somewhat hesitantly, “J.R.?”, meaning J.R. Ewing from the American television soap opera ‘Dallas’, which aired from 1978 to 1991. The correct answer was John F. Kennedy, the then President of the United States, who was assassinated on November 22, 1963.

Click: Return to top of page or Index page