Human Evolution

Last week I wrote about how the Sun, along with the planets, was thought to have been formed. This time I will say a bit more about the colonisation of the land, a bit about extinctions, and then talk about our human evolution and its history.

An artist’s conception of Devonian flora.

I said last week that the Huronian ice age might have been caused by the increased oxygen concentration in the atmosphere, which caused the decrease of methane (CH4) in the atmosphere. Methane is a strong greenhouse gas, but with oxygen it reacts to form CO2, a less effective greenhouse gas. Oxygen accumulation from photosynthesis resulted in the formation of an ozone layer that absorbed much of the Sun’s ultraviolet radiation, meaning unicellular organisms that reached land were less likely to die, and as a result prokaryotes began to multiply and became better adapted to survival out of the water. These microscopic single-celled organisms have no distinct nucleus with a membrane, and include bacteria. They colonised the land, then along came the eukaryotes, organisms consisting of a cell or cells in which the genetic material is DNA in the form of chromosomes contained within a distinct nucleus. For a long time, the land remained barren of multicellular organisms. The supercontinent Pannotia formed around 600Ma (that is, 600 million years ago) and then broke apart a short 50 million years later. Fish, the earliest vertebrates, evolved in the oceans around 530Ma. A major extinction event occurred near the end of the Cambrian period, which ended 488Ma. Several hundred million years ago plants, probably resembling algae, and fungi started growing at the edges of the water, and then out of it. The oldest fossils of land fungi and plants date to around 480 to 460Ma, though molecular evidence suggests the fungi may have colonised the land as early as 1,000Ma and the plants 700Ma. Initially remaining close to the water’s edge, mutations and variations resulted in further colonisation of this new environment. The timing of the first animals to leave the oceans is not precisely known, but the oldest clear evidence is of arthropods on land around 450Ma, perhaps thriving and becoming better adapted due to the vast food source provided by the terrestrial plants. 
There is also unconfirmed evidence that arthropods may have appeared on land as early as 530Ma. The first of five great mass extinctions was the Ordovician-Silurian extinction, and its possible cause was the intense glaciation of Gondwana; some 60% of marine invertebrates became extinct. The second mass extinction was the Late Devonian extinction, probably caused by the evolution of trees, which could have led to the depletion of greenhouse gases like CO2, or to eutrophication, the process by which a body of water, or parts of it, becomes progressively enriched with minerals and nutrients. It has also been defined as a “nutrient-induced increase in phytoplankton productivity”. Around 70% of all species became extinct. The third mass extinction was the Permian-Triassic, or Great Dying, event, possibly caused by some combination of the Siberian Traps volcanic event, an asteroid impact, methane hydrate gasification, sea level fluctuations and a major anoxic event. Either the Wilkes Land Crater in Antarctica or the Bedout structure off the northwest coast of Australia may indicate an impact connection with the Permian-Triassic extinction, but it remains uncertain whether these or other proposed Permian-Triassic boundary craters are real impact craters, or even contemporaneous with the extinction event. This was by far the deadliest extinction ever, with about 57% of all families and 83% of all genera killed. The fourth mass extinction was the Triassic-Jurassic extinction event, in which many small creatures became extinct, probably due to new competition from the dinosaurs, who would be the dominant terrestrial vertebrates throughout most of the Mesozoic era. The dinosaurs had split off from their reptilian ancestors around 230Ma, after yet another severe extinction event of the period. 
The Triassic-Jurassic extinction event at 200Ma spared many of the dinosaurs, and they soon became dominant among the vertebrates. Though some mammalian lines began to separate during this period, existing mammals were probably small animals resembling shrews. The boundary between avian and non-avian dinosaurs is not clear, but Archaeopteryx, traditionally considered one of the first birds, lived around 150Ma. The earliest evidence for flowering plants comes some 20 million years later, during the Cretaceous period around 132Ma. Then came the fifth and most recent mass extinction, the K-T extinction. Around 66Ma, a 10-kilometre (6.2 mile) asteroid struck Earth just off the Yucatan Peninsula, at what was then the southwestern tip of Laurasia and where the Chicxulub crater in Mexico is today. This ejected vast quantities of particulate matter and vapour into the air that occluded sunlight, inhibiting photosynthesis. Some 75% of all species, including the non-avian dinosaurs, became extinct, marking the end of the Cretaceous period and the Mesozoic era.

The Chicxulub crater in the Yucatan, Mexico.

A small African ape living around 6Ma (6 million years ago) was the last animal whose descendants would include both modern humans and their closest relatives, the chimpanzees, and only two branches of its family tree have surviving descendants. Very soon after the split, for reasons that are still unclear, apes in one branch developed the ability to walk upright. Brain size increased rapidly, and by 2Ma the first animals classified in the genus Homo had appeared. Of course, the line between different species or even genera is somewhat arbitrary, as organisms continuously change over generations. Around the same time, the other branch split into the ancestors of the common chimpanzee and the ancestors of the bonobo as evolution continued simultaneously in all life forms. The ability to control fire probably began in Homo erectus, at least 790,000 years ago but perhaps as early as 1.5Ma; it is possible that the controlled use of fire may even predate Homo erectus and have begun early in the Lower Palaeolithic. It is more difficult to establish the origin of language, and it is unclear whether Homo erectus could speak or whether that capability only emerged with Homo sapiens. As brain size increased, babies were born earlier, before their heads grew too large to pass through the pelvis. As a result, they exhibited more plasticity and thus possessed an increased capacity to learn, but required a longer period of dependence. Social skills became more complex, language became more sophisticated and tools became more elaborate. This contributed to further cooperation and intellectual development. Modern humans are believed to have originated around 200,000 years ago or earlier in Africa; the oldest fossils date back to around 160,000 years ago. The first humans to show signs of spirituality were the Neanderthals, usually classified as a separate species with no surviving descendants. 
They buried their dead, often with no sign of food or tools. But evidence of more sophisticated beliefs, such as the early Cro-Magnon cave paintings, probably with magical or religious significance, did not appear until around 32,000 years ago. Cro-Magnons also left behind stone figurines such as the Venus of Willendorf, probably also signifying religious belief. By 11,000 years ago, Homo sapiens had reached the southern tip of South America, the last of the uninhabited continents, except for Antarctica, which remained undiscovered until 1820AD. Tool use and communication continued to improve, and interpersonal relationships became more intricate. Throughout more than 90% of its history, Homo sapiens lived in small bands as nomadic hunter-gatherers. As language became more complex, the ability to remember and to communicate information improved, so ideas could be exchanged quickly and passed down the generations. Cultural evolution quickly outpaced biological evolution, and history proper began. It seems that between 8,500BC and 7,000BC, humans in the Fertile Crescent area of the Middle East began the systematic husbandry of plants and animals, and so true agriculture began. This spread to neighbouring regions, and developed independently elsewhere, until most Homo sapiens lived sedentary lives in permanent settlements as farmers. In those societies which did adopt agriculture, the relative stability and increased productivity provided by farming allowed the population to expand. Not all societies abandoned nomadism, especially those in isolated areas of the globe that were poor in domesticable plant species, such as Australia. Agriculture had a major impact; humans began to affect the environment as never before. Surplus food allowed a priestly or governing class to arise, followed by an increasing division of labour, which led to Earth’s first civilisation at Sumer in the Middle East, between 4,000BC and 3,000BC. 
Additional civilisations quickly arose in ancient Egypt, at the Indus River valley and in China. The invention of writing enabled complex societies to arise: record-keeping and libraries served as a storehouse of knowledge and increased the cultural transmission of information. Humans no longer had to spend all their time working for survival, enabling the first specialised occupations, like craftsmen, merchants and priests. Curiosity and education drove the pursuit of knowledge and wisdom, and various disciplines, including science, albeit in a primitive form, arose. This in turn led to the emergence of increasingly larger and more complex civilisations, such as the first empires, which at times traded with one another, or fought for territory and resources. By around 500BC there were advanced civilisations in the Middle East, Iran, India, China, and Greece, at times expanding, other times entering into decline. In 221BC, China became a single polity, that is, an identifiable political entity, a group of people with a collective identity who are organised by some form of institutionalised social relations and have the capacity to mobilise resources. It would grow to spread its culture throughout East Asia, and it has remained the most populous nation in the world. During this period, the famous Hindu texts known as the Vedas came into existence in the Indus Valley civilisation, which developed in warfare, the arts, science, mathematics and architecture. The fundamentals of Western civilisation were largely shaped in Ancient Greece, with the world’s first democratic government and major advances in philosophy as well as science. Ancient Rome made great advances in law, government, and engineering. The Roman Empire was Christianised by Emperor Constantine in the early 4th century, and it declined by the end of the 5th. From the 7th century, the Christianisation of Europe followed. 
In 610AD Islam was founded and quickly became the dominant religion in Western Asia. The ‘House of Wisdom’ was established in Abbasid-era Baghdad, Iraq. It is considered to have been a major intellectual centre during the Islamic Golden Age, when Muslim scholars in Baghdad and Cairo flourished from the ninth to the thirteenth centuries, until the Mongol sack of Baghdad in 1258AD. Meanwhile, in 1054AD the Great Schism between the Roman Catholic Church and the Eastern Orthodox Church led to prominent cultural differences between Western and Eastern Europe. In the 14th century, the Renaissance began in Italy, with advances in religion, art, and science. At that time the Christian Church as a political entity lost much of its power. In 1492AD, Christopher Columbus reached the Americas, initiating great changes to the New World. European civilisation began to change from 1500AD, leading to both the Scientific and Industrial revolutions. The European continent began to exert political and cultural dominance over human societies around the world, a time known as the Colonial era. Then in the 18th century a cultural movement known as the Age of Enlightenment further shaped the mentality of Europe and contributed to its secularisation. From 1914 to 1918 and 1939 to 1945, nations around the world were embroiled in world wars. Following World War I, the League of Nations was a first step in establishing international institutions to settle disputes peacefully. After failing to prevent World War II, mankind’s bloodiest conflict, it was replaced by the United Nations, and after that war many new states were formed, declaring or being granted independence in a period of decolonisation. The democratic capitalist United States and the socialist Soviet Union became the world’s dominant superpowers for a time, and they held an ideological, often violent rivalry known as the Cold War until the dissolution of the latter. 
In 1992, several European nations signed the treaty establishing the European Union, and as transportation and communication have improved, both the economies and political affairs of nations around the world have become increasingly intertwined. This globalisation has, however, produced both conflict and cooperation. As we continue in this beautiful world though, we are at present having to cope with a world-wide pandemic for which no cure has yet been found. We are researching and looking for vaccines that, it is said, will at least reduce the adverse effects of Covid-19, though many do not believe that these same vaccines are what we need. As a result, a great many deaths are still being reported in countries right across our world. Some say it is a man-made virus, others suggest conspiracy theories, but I feel sure that just as other viruses have been beaten in the past, this one will also be. In the meantime we should surely behave responsibly and work together to help reduce the spread of this virus, no matter what our thoughts, ideas or beliefs may be, so that in years to come others may look back and learn, in order for all life on Earth to continue.

This week, a simple quote…

“The purpose of life is a life of purpose.”
~ Robert Byrne (22 May 1930 – 06 December 2016)


Looking Back

I try not to spend too much time looking back on my life, but there are times when it is good to do so, and I have been reminded of a blog post I sent out early last year. A few years ago now a good friend sent me an article about a daughter learning about Darwin’s theory of evolution, and her mother then telling her about the Sanskrit avatars, which give their version of the beginning of life here on Earth. I appreciated that, but to me there are other peoples, for example the Aborigines and the American Indians, who all have their own traditions. Whatever way is right, however things occurred, I really do believe that this world, along with the rest of the Universe, didn’t just happen by accident. With lingering discrepancies about the true age of the universe, scientists have taken a fresh look at the observable, expanding universe and have now estimated that it is almost 14 billion years old, plus or minus 40 million years. Considering that as well as our Sun, our star, there are around 100,000 million stars in the Milky Way alone, our Earth is a bit small! As well as that, there are an estimated 500 billion galaxies. With all the fighting and killing that we humans have done in the (relatively speaking) extremely short time that we have been around, it is perhaps a good thing that the nearest star system to our Sun is Alpha Centauri, which is 4.3 light-years away. That’s about 25 trillion miles (40 trillion kilometres) away from Earth – nearly 300,000 times the distance from the Earth to the Sun. In time, about 5 billion years from now, our Sun will run out of hydrogen. Our star is currently in the most stable phase of its life cycle and has been since the birth of our solar system, about 4.5 billion years ago; once all the hydrogen gets used up, the Sun will grow out of this stable phase. But what about our Earth? 
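Those distances can be checked with a couple of lines of arithmetic. This is just a quick sketch, taking rounded standard values for the length of a light-year and the mean Earth-Sun distance in miles:

```python
LIGHT_YEAR_MILES = 5.879e12   # miles in one light-year (rounded)
EARTH_SUN_MILES = 9.296e7     # mean Earth-Sun distance in miles (rounded)

# Alpha Centauri sits about 4.3 light-years away
distance_miles = 4.3 * LIGHT_YEAR_MILES

print(f"{distance_miles / 1e12:.1f} trillion miles")      # about 25 trillion miles
print(f"{distance_miles / EARTH_SUN_MILES:,.0f} AU")      # roughly 272,000 Earth-Sun distances
```

The second figure is where the "nearly 300,000 times the distance from the Earth to the Sun" comes from.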
In my blog post last week I wrote about what might happen if we were to take an imaginary quick jaunt through our solar system in the potential ‘last days’ of the Sun. There has been much speculation; films and TV series have all given a view on how things were or might be. The film ‘2001: A Space Odyssey’ is just one example. Other films and series, like Star Trek, have imagined beings from other worlds colonising Earth, and some have considered what life would be like if another race were to change it completely here. A favourite of mine, Stargate, started out as a film and then became a series, in which the Egyptian ruler, Ra, travelled via a star-gate to a far-distant world where earth-like creatures lived. We can speculate and wonder; it is a thing that we humans can do. Though if you know of the late, great Douglas Adams and his tales, we should not panic. Just remember that in his writings, at one point the Earth was destroyed to make way for a hyperspace bypass, and that just before that happened, the dolphins left Earth, sending a message as they went: “So long, and thanks for all the fish”. But I digress. I cannot possibly detail the full history of our Earth here, but I can perhaps highlight a few areas, and I shall do my best.

Many attempts have been made over the years to comprehend the main events of Earth’s past, characterised by constant change and biological evolution. There is now a geological time scale, as defined by international convention, which depicts the large spans of time from the beginning of the Earth to the present, and its divisions chronicle some definitive events of Earth’s history. Earth formed around 4.54 billion years ago, approximately one-third the age of the Universe, by accretion from the solar nebula – accretion being growth or increase by the gradual accumulation of additional layers or matter. Volcanic outgassing probably created the primordial atmosphere and then the ocean, but the early atmosphere contained almost no oxygen. Much of the Earth was molten because of frequent collisions with other bodies, which led to extreme volcanism. Whilst the Earth was in its earliest stage, a giant impact with a planet-sized body named Theia is thought to have formed the Moon. Over time, the Earth cooled, causing the formation of a solid crust and allowing liquid water on the surface. The Hadean eon represents the time before a reliable fossil record of life; it began with the formation of the planet and ended 4 billion years ago. The following Archean and Proterozoic eons produced the beginnings of life on Earth and its earliest evolution. The succeeding eon is divided into three eras: the first brought forth arthropods, fishes, and the first life on land; the next spanned the rise, reign, and climactic extinction of the non-avian dinosaurs; and the following one saw the rise of mammals. Recognisable humans emerged at most 2 million years ago, a vanishingly small period on the geological scale. The earliest undisputed evidence of life on Earth dates from at least 3.5 billion years ago, after a geological crust had started to solidify. 
There are microbial mat fossils found in 3.48 billion-year-old sandstone discovered in Western Australia, and other early physical evidence of a biogenic substance is graphite found in 3.7 billion-year-old rocks discovered in southwestern Greenland. Photosynthetic organisms appeared between 3.2 and 2.4 billion years ago and began enriching the atmosphere with oxygen. Life remained mostly small and microscopic until about 580 million years ago, when complex multicellular life arose. This developed over time and culminated in the Cambrian Explosion about 541 million years ago. This sudden diversification of life forms produced most of the major phyla known today, and divided the Proterozoic eon from the Cambrian period of the Paleozoic era. It is estimated that 99 percent of all species that ever lived on Earth, over five billion, have become extinct. Estimates of the number of Earth’s current species range from 10 million to 14 million, of which about 1.2 million are documented, but over 86 percent have not been described. However, it was recently claimed that 1 trillion species currently live on Earth, with only one-thousandth of one percent described. The Earth’s crust has constantly changed since its formation, as has life since its first appearance. Species continue to evolve, taking on new forms, splitting into daughter species, or going extinct in the face of ever-changing physical environments. The process of plate tectonics continues to shape the Earth’s continents and oceans and the life which they harbour.

So the history of Earth is divided into four great eons, starting with the formation of the planet. Each eon saw the most significant changes in Earth’s composition, climate and life. Each eon is subsequently divided into eras, which in turn are divided into periods, which are further divided into epochs. In the Hadean eon, the Earth was formed out of debris around the solar protoplanetary disk. There was no life; temperatures were extremely hot, with frequent volcanic activity and hellish-looking environments, hence the eon’s name, which comes from Hades. Possible early oceans or bodies of liquid water appeared, and the Moon was formed around this time, probably due to a collision of a protoplanet into the Earth. In the next eon came the first forms of life, with some continents existing and an atmosphere composed of volcanic and greenhouse gases. Following this came early life of a more complex form, including some forms of multicellular organisms. Bacteria began producing oxygen, shaping the third and current of Earth’s atmospheres. Plants, later animals and possibly earlier forms of fungi formed around that time. The early and late phases of this eon may have undergone a few ‘Snowball Earth’ periods, in which all of the planet suffered below-zero temperatures, and a few early continents may have existed in this eon. Finally, complex life, including vertebrates, began to dominate the Earth’s oceans in a process known as the Cambrian Explosion. Supercontinents formed but later broke apart into the current continents. Gradually life expanded onto land, and more familiar forms of plants, animals and fungi began to appear, including insects and reptiles. Several mass extinctions occurred, in the wake of which birds, the descendants of dinosaurs, and more recently mammals emerged. Modern animals, including humans, evolved in the most recent phases of this eon.

An artist’s rendering of a protoplanetary disk.

The standard model for the formation of our Solar System, including the Earth, is the solar nebula hypothesis. In this model, the Solar System formed from a large, rotating cloud of interstellar dust and gas, composed of hydrogen and helium created shortly after the Big Bang some 13.8 billion years ago, together with heavier elements ejected by supernovae. About 4.5 billion years ago, the nebula began a contraction that may have been triggered by a shock wave from a nearby supernova, which would have also set the nebula rotating. As the cloud’s rotation accelerated, its angular momentum, gravity and inertia flattened it into a protoplanetary disk perpendicular to its axis of rotation. Small perturbations due to collisions and the angular momentum of other large debris created the means by which kilometre-sized protoplanets began to form, orbiting the nebular centre. The centre of the nebula, not having much angular momentum, collapsed rapidly, the compression heating it until the nuclear fusion of hydrogen into helium began. After more contraction, a ‘T Tauri’ star ignited and evolved into the Sun. Meanwhile, in the outer part of the nebula, gravity caused matter to condense around density perturbations and dust particles, and the rest of the protoplanetary disk began separating into rings. In a process known as runaway accretion, successively larger fragments of dust and debris clumped together to form planets. Earth formed in this manner about 4.54 billion years ago and was largely completed within 10 to 20 million years. The solar wind of the newly formed T Tauri star cleared out most of the material in the disk that had not already condensed into larger bodies. The same process is expected to produce accretion disks around virtually all newly forming stars in the universe, some of which yield planets. The proto-Earth then grew until its interior was hot enough to melt the heavy metals, and, having higher densities than the silicates, these metals sank. 
This so-called ‘iron catastrophe’ resulted in the separation of a primitive mantle and a metallic core only 10 million years after the Earth began to form, producing the layered structure of Earth and setting up the formation of its magnetic field.

The Earth is often described as having had three atmospheres. The first atmosphere, captured from the solar nebula, was composed of the lighter elements, mostly hydrogen and helium. A combination of the solar wind and Earth’s heat would have driven off this atmosphere, as a result of which the atmosphere was depleted of these elements compared to cosmic abundances. After the impact which created the Moon, the molten Earth released volatile gases, and later more gases were released by volcanoes, completing a second atmosphere rich in greenhouse gases but poor in oxygen; this new atmosphere probably contained water vapour, carbon dioxide, nitrogen, and smaller amounts of other gases. Finally, the third atmosphere, rich in oxygen, emerged when bacteria began to produce oxygen. Water must have been supplied by meteorites from the outer asteroid belt; some large planetary embryos and comets may also have contributed. Though most comets today lie in orbits farther away from the Sun than Neptune, some computer simulations show that they were originally far more common in the inner parts of the Solar System. As the Earth cooled, clouds formed, rain created the oceans, and recent evidence suggests the oceans may have begun forming quite early; at the start of the Archean eon, they already covered much of the Earth. This early formation has been difficult to explain because of a problem known as the ‘faint young Sun’ paradox. Stars are known to get brighter as they age, and at the time of its formation the Sun would have been emitting only 70% of its current power; in other words, it was about 30% dimmer then than it is now. Many models indicate that the Earth would have been covered in ice, and a likely solution is that there was enough carbon dioxide and methane to produce a greenhouse effect. 
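To put rough numbers on that paradox, astronomers often use a simple approximation (due to D. O. Gough) for the Sun’s past brightness relative to today. The sketch below assumes that formula and a solar age of about 4.57 billion years, so treat it as illustrative rather than exact:

```python
def relative_solar_luminosity(age_gyr, sun_age_gyr=4.57):
    """Gough's approximation: the Sun's luminosity at a given age
    (in billions of years), as a fraction of today's luminosity."""
    return 1.0 / (1.0 + 0.4 * (1.0 - age_gyr / sun_age_gyr))

# The newly formed Sun (age ~0) comes out at roughly 71% of today's
# output, matching the 'only 70% of its current power' figure above.
print(round(relative_solar_luminosity(0.0), 2))
```

By construction, the formula gives exactly 1.0 at the Sun’s present age, and the fraction rises smoothly in between.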
The carbon dioxide would have been produced by volcanoes and the methane by early microbes, whilst another greenhouse gas, ammonia, would have been ejected by volcanoes but quickly destroyed by ultraviolet radiation. One of the reasons for interest in the early atmosphere and ocean is that they form the conditions under which life first arose. There are many models, but little consensus, on how life emerged from non-living chemicals; chemical systems created in the laboratory fall well short of the minimum complexity of a living organism. The first step in the emergence of life may have been chemical reactions that produced many of the simpler organic compounds, including nucleobases and amino acids, that are the building blocks of life. An experiment in 1953 by Stanley Miller and Harold Urey showed that such molecules could form in an atmosphere of water, methane, ammonia and hydrogen with the aid of sparks to mimic the effect of lightning. Although the atmospheric composition was probably different from that used by Miller and Urey, later experiments with more realistic compositions also managed to synthesise organic molecules. Additional complexity could have been reached from at least three possible starting points: self-replication, an organism’s ability to produce offspring that are similar to itself; metabolism, its ability to feed and repair itself; and external cell membranes, which allow food to enter and waste products to leave, but exclude unwanted substances. The earliest cells absorbed energy and food from the surrounding environment. They used fermentation, the breakdown of more complex compounds into less complex compounds with less energy, and used the energy so liberated to grow and reproduce. Fermentation can only occur in an anaerobic (oxygen-free) environment. The evolution of photosynthesis made it possible for cells to derive energy from the Sun. Most of the life that covers the surface of the Earth depends directly or indirectly on photosynthesis. 
The most common form, oxygenic photosynthesis, turns carbon dioxide, water, and sunlight into food. It captures the energy of sunlight in energy-rich molecules, which then provide the energy to make sugars. To supply the electrons in the circuit, hydrogen is stripped from water, leaving oxygen as a waste product. Some organisms, including purple bacteria and green sulphur bacteria, use an anoxygenic form of photosynthesis, with alternatives to the hydrogen stripped from water as electron donors, such as hydrogen sulphide, sulphur and iron. Such organisms are restricted to otherwise inhospitable environments like hot springs and hydrothermal vents. The simpler anoxygenic form arose not long after the appearance of life. At first, the released oxygen was bound up with limestone, iron and other minerals. The oxidised iron appears as red layers in geological strata which are called banded iron formations. When most of the exposed readily reacting minerals were oxidised, oxygen finally began to accumulate in the atmosphere. Though each cell only produced a minute amount of oxygen, the combined metabolism of many cells over a vast time transformed Earth’s atmosphere to its current state; this was Earth’s third atmosphere. Some oxygen was stimulated by solar ultraviolet radiation to form ozone, which collected in a layer near the upper part of the atmosphere. The ozone layer absorbed, and still absorbs, a significant amount of the ultraviolet radiation that once had passed through the atmosphere. It allowed cells to colonise the surface of the ocean and, eventually, the land. Without the ozone layer, ultraviolet radiation bombarding land and sea would have caused unsustainable levels of mutation in exposed cells. Photosynthesis had another major impact: oxygen was toxic, and much life on Earth probably died out as its levels rose, in what is known as the oxygen catastrophe. 
Resistant forms survived and thrived, and some developed the ability to use oxygen to increase their metabolism and obtain more energy from the same food. The Sun’s natural evolution made it progressively more luminous during the Archean and Proterozoic eons; the Sun’s luminosity increases by about 6% every billion years. As a result, the Earth began to receive more heat from the Sun in the Proterozoic eon. However, the Earth did not get warmer. Instead, geological records suggest that it cooled dramatically during the early Proterozoic. Glacial deposits have been found in South Africa in rocks that, based on palaeomagnetic evidence, must have been located near the equator at the time. Thus, this glaciation, known as the Huronian glaciation, may have been global. Some scientists suggest it was so severe that the Earth was frozen over from the poles to the equator, a hypothesis called Snowball Earth. The Huronian ice age might have been caused by the increased oxygen concentration in the atmosphere, which caused the decrease of methane (CH4) in the atmosphere. Methane is a strong greenhouse gas, but with oxygen it reacts to form CO2, a less effective greenhouse gas. When free oxygen became available in the atmosphere, the concentration of methane could have decreased dramatically, enough to counter the effect of the increasing heat flow from the Sun. However, the term Snowball Earth is more commonly used to describe later extreme ice ages during the Cryogenian period. There were four such periods, each lasting about 10 million years, between 750 and 580 million years ago, when the Earth is thought to have been covered with ice apart from the highest mountains, and average temperatures were about −50°C (−58°F). The snowball may have been partly due to the location of the supercontinent straddling the Equator. Carbon dioxide dissolves in rain to form carbonic acid, which weathers rocks; the products are then washed out to sea, extracting the greenhouse gas from the atmosphere. 
When the continents are near the poles, the advance of ice covers the rocks, slowing the reduction in carbon dioxide, but in the Cryogenian the weathering of that supercontinent was able to continue unchecked until the ice advanced to the tropics. The process may have finally been reversed by the emission of carbon dioxide from volcanoes or the destabilisation of methane gas.
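That steadily brightening Sun can be put into rough numbers. This is only a back-of-envelope sketch, assuming the roughly 6% per billion years figure compounds steadily:

```python
# Back-of-envelope: if the Sun brightens ~6% per billion years (compounded),
# how bright was it at the time of the Huronian glaciation, ~2.4 Gyr ago?
def relative_luminosity(gyr_from_now, rate_per_gyr=0.06):
    """Solar luminosity relative to today; negative values look into the past."""
    return (1 + rate_per_gyr) ** gyr_from_now

print(f"Huronian-era Sun: {relative_luminosity(-2.4):.2f}x today's brightness")
```

By this rough estimate the Sun was only about 87% as bright as it is now, which is why losing the methane greenhouse mattered so much.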

Astronaut Bruce McCandless II outside of the Space Shuttle Challenger in 1984.

I think that sets the basic scene for the Earth itself, but there is still much to write about in terms of the colonisation of land, extinctions and human evolution. But I think this is more than enough for now. Change has continued at a rapid pace: alongside technological developments such as nuclear weapons, computers, genetic engineering and nanotechnology, there has been economic globalisation, spurred by advances in communication and transportation technology, which has influenced everyday life in many parts of the world. Cultural and institutional forms such as democracy, capitalism and environmentalism have increased in influence. Major concerns and problems such as disease, war, poverty and violent radicalism, along with more recent, human-caused climate change, have risen as the world population increases. In 1957, the Soviet Union launched the first artificial satellite into orbit and, soon afterwards, Yuri Gagarin became the first human in space. The American Neil Armstrong was the first to set foot on another astronomical object, the Moon. Unmanned probes have been sent to all the known planets in the Solar System, with some, such as the two Voyager spacecraft, having left the Solar System. Five space agencies, representing over fifteen countries, have worked together to build the International Space Station. Aboard it, there has been a continuous human presence in space since 2000. The World Wide Web became a part of everyday life in the 1990s, and since then has become an indispensable source of information in the developed world. I have no doubt that there will be much more to find, learn, discover and develop.

This week, as we begin a new year…
When attempting to remember the order of planets in our Solar System, I have found they can be remembered by:
‘My Very Educated Mother Just Served Us Nachos’

Mercury
Venus
Earth
Mars
Jupiter
Saturn
Uranus
Neptune

(Pluto was discovered in 1930 and described as a planet located beyond Neptune but, following improvements in technology, it was reclassified as a ‘dwarf planet’ in 2006.)
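The mnemonic can even be checked mechanically; a trivial sketch, just for fun:

```python
# Each word's initial letter matches the corresponding planet,
# in order of distance from the Sun.
mnemonic = "My Very Educated Mother Just Served Us Nachos"
planets = ["Mercury", "Venus", "Earth", "Mars",
           "Jupiter", "Saturn", "Uranus", "Neptune"]

for word, planet in zip(mnemonic.split(), planets):
    print(f"{word:>8} -> {planet}")
```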


A Year Ends, A New Year Begins

We are approaching the end of what for many of us has been the year 2021 but for some, the number will be different because according to tradition, the Hebrew calendar started at the time of Creation, placed at 3761 BCE. So for our current 2021/2022, the Hebrew year is 5782. Right over on the other side of this amazing world it will soon be New Year’s Day, whilst others have a few hours to wait. I will admit to finding it strange a few years ago on my lovely holiday as I crossed the International Dateline a couple of times when I ‘skipped over’ some days, whilst others were counted twice. Whatever our circumstances it has been a very trying and troubling year for so many on this world, this Earth. In years to come I wonder what we will reflect on, those who are doing so. We have both seen and experienced change, of that we may be sure. I have no doubt that there will be further change too, in the years ahead. On earlier blog posts I have said a bit about my young days, growing up in Whittlesey, where almost everyone seemed to know almost everyone else! We moved up from London, at first it was Mum, Dad, my two elder brothers and myself. Dad had managed to get a teaching job which included living in the school house and the school right next door made things easy for me! That school building is now the St Jude’s church. So that move was a really big change for us, though perhaps not quite as much for me because I was less than a year old then! A while later Nan and Pop, who were my paternal grandparents, decided to move from London to Whittlesey and retire there. So we stayed and I grew up in Whittlesey. Older brothers were growing up, one moved away having joined the army, whilst the other settled for a while but job opportunities moved him away too and this meant that I did not see too much of them or their respective offspring as they grew up. 
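The arithmetic behind that Hebrew year is simple enough to sketch, assuming the traditional 3761 BCE epoch and remembering that the Hebrew year changes at Rosh Hashanah in the autumn, not on 1 January, which is why 2021/2022 spans 5782:

```python
# The Hebrew year count is traditionally reckoned from 3761 BCE. The year
# changes at Rosh Hashanah (autumn), so one Gregorian year straddles
# two Hebrew years.
def hebrew_year(gregorian_year, after_rosh_hashanah=True):
    return gregorian_year + (3761 if after_rosh_hashanah else 3760)

print(hebrew_year(2021))                             # late 2021 -> 5782
print(hebrew_year(2022, after_rosh_hashanah=False))  # early 2022 -> 5782
```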
Then it became clear that whilst I was learning much where I worked, it seemed that – well, let us just say that my face just didn’t fit! As a result, when the opportunity came for me to move on to a higher grade job with the same company but in Leicester, I went. In truth it was the right thing for me as I met a lovely female and we were married for a while. Further job opportunities gave me greater experience, I moved a few times around the Midlands before finally settling in Leicester. Naturally I talked with parents regarding my first move away from Peterborough but Dad urged me to take the opportunity as he felt it would be good for me. It was then that I learned all about how my grandparents moving to Whittlesey had stopped Dad from doing what he had considered at one time, which was a teaching job right away from Peterborough. I might have found myself being brought up in Swindon or somewhere! But it was not meant to be. There are those, like many of the people I was at school with, who are still happily settled in Whittlesey and looking on Facebook I recognise names, but not faces! As I know I have said before, others moved to places far and wide like the U.S.A, Canada, Australia and to New Zealand. Equally, some of the people I have worked with moved over to England for various reasons, one lad from South Africa had moved here for job reasons but I learned that his claim to fame was as an extra in a crowd scene for a film – ‘Zulu’, I think it was. However, for some the moves were through political turmoil, with folk finding that they and their families were not welcome where they were, due to their race. That to me is absolutely awful, we are all human beings but we do not seem to be able to live peacefully together. Perhaps that day will come, but sadly I do not see it occurring for a while yet, when some people wish to be so selfish. Yet their lives too will end, in time.

So it got me thinking about early man. Many of you will have seen the film “2001 – A Space Odyssey”, where apes fight and learn the rudimentary use of bones as tools. The various ages of man have come and gone, our Earth has also changed, but back then early man had no idea of what our world was like. That really is, I think, where the first ‘conspiracy theories’ started. Imagine being told that we were the centre of the universe, or that the Earth was flat, because people had to make sense of their world. I am reminded of the poster advertising the “Flat Earth Society – members all around the globe”. Some years ago I bought a computer program called ‘Civilization’, where a player ‘created’ a new civilisation of their own. ‘Sid Meier’s Civilization’ is a 1991 turn-based strategy video game developed and published by MicroProse; it was originally developed for MS-DOS on a standard personal computer but has undergone numerous revisions for various platforms. The player is tasked with leading an entire human civilisation over the course of several millennia by controlling various areas such as urban development, exploration, government, trade, research, and military. The player can control individual units and advance the exploration, conquest and settlement of the game’s world. The player can also make such decisions as setting forms of government, tax rates and research priorities. The player’s civilisation is in competition with other computer-controlled civilisations, with which the player can enter diplomatic relationships that can either end in alliances or lead to war. The game has sold 1.5 million copies since its release, and is considered one of the most influential computer games in history due to its establishment of the 4X genre. In addition to its commercial and critical success, the game has been deemed quite valuable due to its presentation of historical relationships. 
A multiplayer remake, ‘Sid Meier’s CivNet’, was released for the personal computer in 1995, and ‘Civilization’ was followed by several sequels starting with ‘Civilization II’, with similar or modified scenarios. I know, I had a copy and played the game for hours!

A world map screenshot from the Amiga version of ‘Civilization’.

In this game, the player takes on the role of the ruler of a civilisation. They start with one, occasionally two, settler units and they attempt to build an empire in competition with two to seven other civilisations. The game requires a fair amount of micromanagement, although less than other simulation games. Along with the larger tasks of exploration, diplomacy and warfare, the player has to make decisions about where to build new cities, which improvements or units to build in each city, which advances in knowledge should be sought (and at what rate), and how to transform the land surrounding the cities for maximum benefit. From time to time the player’s towns may be harassed by barbarians, units with no specific nationality and no named leader. These threats only come from huts, unclaimed land or sea, so that over time and turns of exploration, there are fewer and fewer places from which barbarians will emanate. Before the game begins, the player chooses which historical or current civilisation to play. In contrast to later games in the ‘Civilization’ series, this is largely a cosmetic choice, affecting titles, city names, musical heralds, and colour. The choice does affect their starting position on the “Play on Earth” map, and thus different resources in one’s initial cities, but has no effect on starting position when starting a random world game or a customised world game. The player’s choice of civilisation also prevents the computer from being able to play as that civilisation or the other civilisation of the same colour, and since computer-controlled opponents display certain traits of their civilisations this affects gameplay as well. For example, the Aztecs are fiercely expansionist and generally extremely wealthy. Other civilisations include the Americans, the Mongols and the Romans. Each civilisation is led by a famous historical figure, such as Mahatma Gandhi for India. The scope of this Civilization game is larger than most others. 
That is because it begins in 4000BC, before the Bronze Age and can last through to AD 2100 on the easiest setting with Space Age and ‘future technologies’. At the start of the game there are no cities anywhere in the world and the player controls one or two settler units, which can be used to found new cities in appropriate sites. Those cities may build other settler units, which can go out and found new cities, thus expanding the empire. Settlers can also alter terrain, build improvements such as mines and irrigation, build roads to connect cities, and later in the game they can construct railroads which offer unlimited movement. As time advances, new technologies are developed. These technologies are the primary way in which the game changes and grows. At the start, players choose from advances such as pottery, the wheel and the alphabet, leading to, near the end of the game, nuclear fission and spaceflight. Players can gain a large advantage if their civilisation is the first to learn a particular technology (the secrets of flight, for example) and put it to use in a military or other context. Most advances give access to new units, city improvements or derivative technologies, for example the chariot unit becomes available after the wheel is developed, and the granary building becomes available to build after pottery is developed. The whole system of advancements from beginning to end is called the technology tree and this concept has been adopted in many other strategy games. Since only one technology may be researched at any given time, the order in which they are chosen makes a considerable difference in the outcome of the game and generally reflects the player’s preferred style of gameplay. Players can also build Wonders of the World in each of the epochs of the game, subject only to obtaining the prerequisite knowledge. 
These wonders are important achievements of society, science, culture and defence, ranging from the Pyramids and the Great Wall in the Ancient age to the Copernicus Observatory and Magellan’s Expedition in the middle period right up to the Apollo programme, the United Nations and the Manhattan Project in the modern era. Each Wonder can only be built once in the world, and requires a lot of resources to build, far more than most other city buildings or units. Wonders provide unique benefits to the controlling civilisation, for example Magellan’s Expedition increases the movement rate of naval units. Wonders typically affect either the city in which they are built, for example the Colossus, every city on the continent, such as J.S. Bach’s Cathedral, or the civilisation as a whole, like Darwin’s Voyage. However, some wonders are made obsolete by new technologies. The game can be won by conquering all other civilisations or by winning the Space Race, reaching the star system of Alpha Centauri. The game has developed quite a bit over the years though, as I have an excellent version on my MacBook Pro which is much improved from the MS-DOS version that I used to play!
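The technology tree described above is, at heart, a simple prerequisite graph: an advance becomes available once everything it depends on is known. A minimal sketch, with purely illustrative entries rather than the game’s actual tree:

```python
# A toy prerequisite graph in the spirit of the game's technology tree.
# These entries are illustrative only, not the game's real full tree.
TECH_TREE = {
    "Pottery": [],
    "Alphabet": [],
    "The Wheel": [],
    "Writing": ["Alphabet"],
    "The Chariot": ["The Wheel"],  # e.g. the chariot follows the wheel
}

def researchable(known):
    """Advances not yet known whose prerequisites are all satisfied."""
    return {tech for tech, prereqs in TECH_TREE.items()
            if tech not in known and all(p in known for p in prereqs)}

print(sorted(researchable({"Alphabet"})))
```

Because only one advance can be researched at a time, the order in which a player walks this graph is what gives each game its character.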

As I have said, it is a cleverly thought-out game, because it mirrors the real world so well. I really do wonder what will happen to us, to this Earth, in the future. There has been much speculation as to whether we will manage to travel to distant stars, to different planets, and have interaction with other forms of life. As I said in a blog post earlier this year, life on Earth is based on carbon, perhaps because (so I have learned) each carbon atom can form bonds with up to four other atoms simultaneously. That is a bit technical for me, but it seems that because of this, carbon is well-suited to form the long chains of molecules which then serve as the basis for life as we know it, such as proteins and DNA. In fact, research by some earth scientists at Rice University suggests that virtually all of Earth’s life-giving carbon could have come from a collision about 4.4 billion years ago between this Earth and an embryonic planet similar to Mercury. Science fiction has long imagined alien worlds inhabited by other life, but based on other elements. One example is the rock-eating Horta, a silicon-based life form featured in the original Star Trek series. Also in that series, Mr Spock has green blood because the oxygen-carrying agent in Vulcan blood includes copper, rather than iron, as is the case in humans. For us here, carbon is the backbone of each and every known biological molecule. But life here has taken a finite amount of time to evolve, so who is to say that a life-form on another planet light-years from us has developed to the same level? Don’t be downhearted, but it is a fact, so far as our science will tell us, that stars like our Sun burn for about nine or ten billion years. In fact our Sun is about halfway through its life, so it still has about five billion years to go. After that, the sun will run out of energy and drastically alter the whole of the solar system: oceans will be baked dry and entire planets will be consumed. 
And worlds that have been icy for so long will finally enjoy their day in the sun. Our star is powered by nuclear fusion, turning hydrogen into helium in a process that converts mass into energy. Once the fuel supply is gone, the sun will start growing dramatically. Its outer layers will expand until they engulf much of the solar system, as it becomes what astronomers call a red giant. The life cycle of the sun takes it from the life-giving star that we know today into a swelling red giant and, eventually, a planetary nebula surrounding a tiny white dwarf. Once the sun enters the red giant phase though, the solar system’s denouement is still a subject of debate among scientists. Exactly how far the dying sun will expand, and how conditions will change, aren’t yet clear. But a few things seem likely. The slow death will kill off life on Earth, but it may also create habitable worlds in what are presently the coldest reaches of the solar system. Any humans left around might find refuge on Pluto and other distant dwarf planets out in the Kuiper Belt, a region past Neptune packed with icy space rocks. As our sun expands, these worlds will suddenly find themselves with the conditions necessary for the evolution of life. One scientist believes these may be the ‘delayed gratification habitable worlds’: late in the life of the sun, in the red giant phase, the Kuiper Belt may be something of a metaphorical Miami Beach!
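One way to see why the distant Kuiper Belt warms up is that the habitable zone’s distance scales roughly with the square root of the star’s luminosity. A toy estimate, assuming today’s zone sits near 1 AU and Pluto orbits at roughly 40 AU on average:

```python
import math

# Toy estimate: habitable-zone distance scales roughly as the square root
# of stellar luminosity, taking today's zone to sit near 1 AU.
def habitable_zone_au(luminosity_in_suns):
    return math.sqrt(luminosity_in_suns)

# Inverting: for the zone to reach Pluto at ~40 AU, the luminosity must
# be about 40^2 times today's value.
print(f"Zone reaches ~40 AU at about {40**2} times today's solar luminosity")
```

Red giants really can reach luminosities of a few thousand Suns, so a balmy Pluto is less far-fetched than it first sounds.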

We can take an imaginary quick jaunt through our solar system in the potential ‘last days’ of the sun. Throughout solar system history, the innermost planet has been baked by the sun. But even today, Mercury still clings to some icy patches. As our star ages, it will vaporise those remaining volatile areas before eventually eliminating the entire planet in a slow-motion version of Star Wars’ Death Star. Venus though is sometimes called “Earth’s twin” because the neighbouring worlds are so similar in size and composition. But the hellish surface of Venus shares little in common with Earth’s Goldilocks-type perfect conditions. As the sun expands, it will burn up the atmosphere on Venus before it too is consumed by the sun. Whilst the sun may have 5 billion years left before it runs out of fuel, life on Earth will likely be wiped out a long time before that happens. That’s because the sun is actually already growing brighter. In fact by some estimates, it could be as little as a billion years before the sun’s radiation becomes too much for life here on Earth to handle. That might sound like quite a long time, but in comparison life has already existed on this planet for well over 3 billion years and when the sun does turn into a red giant, the Earth will also be vaporised, perhaps just a few million years after Mercury and Venus have been consumed. All the rocks and fossils and remains of the creatures that have lived here will be gobbled up by the sun’s growing orb, wiping out any lingering trace of humanity’s existence on Earth. But not all scientists agree with this interpretation. Some suspect the sun will stop growing just before fully engulfing our planet. Other scientists have suggested schemes for moving Earth deeper into the solar system by slowly increasing its orbit. Thankfully, this debate is still purely academic for all of us alive today. Even our young sun’s radiation was too much for Mars to hold onto an atmosphere capable of protecting complex life. 
However, recent evidence has shown that Mars may still have water lurking just beneath its surface. Mars may escape the sun’s actual reach as it is at the borderline, but that water will likely all be gone by the time the red giant star takes over the inner solar system. Now we look at the gas giant planets. As our red giant sun engulfs the inner planets, some of their material will likely get thrown deeper into the solar system, to be assimilated into the bodies of the gas giants.

Saturn as viewed from a side never visible from Earth.
(Credit: NASA/JPL-Caltech/Jan Regan)

Here, the ringed planet shows a side never visible from Earth. Cassini took 96 backlit photos for this mosaic on April 13, 2017. Because the sun shines through the rings, the thinnest parts glow brightest, and the thicker rings are dark. However, the approaching boundary of our star will also vaporise Saturn’s beloved rings, which are made of ice. The same fate likely awaits today’s icy ocean worlds, like Jupiter’s moon Europa as well as Saturn’s Enceladus, whose thick blankets of ice would be lost to the void. Once our sun has become a red giant, Pluto and its cousins in the Kuiper Belt along with Neptune’s moon Triton may be the most valuable real estate in the solar system. Today, these worlds hold abundant water ice and complex organic materials. Some of them could even hold oceans beneath their icy surfaces — or at least did in the distant past. But surface temperatures on dwarf planets like Pluto commonly sit at an inhospitable hundreds of degrees below freezing. However, by the time Earth is a cinder the average temperatures on Pluto will be similar to Earth’s average temperatures now.

Pluto as imaged by the New Horizons mission.
(Credit: NASA/JHU-APL/SwRI)

It has been said that when the sun becomes a red giant, the temperatures on Pluto’s surface will be about the same as the average temperatures on Earth’s surface now. Earth will be toast, but Pluto will be balmy and brimming with the same sorts of complex organic compounds that existed when life first evolved on our own planet. Pluto will then perhaps have a thick atmosphere and a liquid-water surface. Collectively, the worlds in this new habitable zone, from comet-like space rocks to dwarf planets like Eris and Sedna, will have three times as much surface area as all four of the inner solar system planets combined. This might seem like an academic discussion only relevant to our distant descendants if they’re lucky enough to survive billions of years from now. However, as has been pointed out by a few astronomers, there are around a billion red giant stars in the Milky Way galaxy today. That is a lot of places for living beings to evolve and then perish as their stars consume them. Who knows what will be in the future, but it is fun to speculate!

This week, as we come to the end of 2021 I am reminded of something I shared here in November 2020 and it feels appropriate to repeat.

“We are all visitors to this time, this place. We are just passing through. Our
purpose here is to observe, to learn, to grow, to love… and then we return
home.” ~ Australian Aboriginal Proverb


Nadolig Llawen a Blwyddyn Newydd Dda!

Or to those who do not speak Welsh, “Merry Christmas and a Happy New Year!” In the last couple of weeks I have written about Christmas. As I said last week, it is generally celebrated on December 25 each year and is a sacred religious holiday as well as being a worldwide cultural and commercial phenomenon. For two millennia, people around the world have been observing it with traditions and practices that are both religious and secular in nature. So we have been remembering and celebrating each year for a very long time. I recently watched an episode of ‘Star Trek: The Next Generation’, with the starship Enterprise and its captain, Jean-Luc Picard, in command. It showed how a planet which was not as yet advanced enough to be a member of the United Federation of Planets was monitored and the inhabitants of that planet were secretly watched, to see how they were progressing. This wasn’t in any way to interfere with them, but just to observe. However, the watchers were discovered and the captain of the Enterprise was therefore seen as some omnipotent super-being, a god who could restore life to the dead. Picard had the difficult job of showing how he and everyone else had a finite life, that they could be hurt, injured and would eventually die. But the planet’s inhabitants really did take some convincing. Picard pointed out that these inhabitants had begun life living in caves, then gradually they progressed to building structures, but that their cave-dwelling ancestors would have seen them as people to be worshipped because of their skills. I think the writers of that Star Trek episode did very well; if we were to go back two thousand years and use the skills we have learned over that time, what would the people of that time think of us? So no matter what our beliefs may be, here we are. Humans in the 21st century. Here on Earth, we are the most abundant and widespread species of primate, characterised by bipedalism along with large, complex brains. 
This has enabled the development of advanced tools, culture and language. We are highly social and tend to live in complex social structures composed of many cooperating and competing groups, from families and kinship networks to political states. Our social interactions have led to a wide variety of values, social norms and rituals which bolster human society. Curiosity and a human desire to understand and influence the environment and to explain and manipulate phenomena have motivated our development of science, philosophy, mythology as well as religion and other fields of study. Although some scientists equate humans with all members of the genus Homo, in common usage it generally refers to Homo sapiens which emerged around 300,000 years ago in Africa, evolving from Homo heidelbergensis and migrating out of Africa, gradually replacing local populations of archaic humans. For most of history, all humans were nomadic hunter-gatherers but the Neolithic Revolution which began in South-west Asia around 13,000 years ago saw the emergence of agriculture and permanent human settlement. As populations became larger and denser, forms of governance developed within and between communities and a number of civilisations have risen and fallen. Humans have continued to expand, with a global population of over 7.9 billion in December 2021. Genes as well as the environment influence human biological variation in visible characteristics, physiology, disease susceptibility, mental abilities, body size and life span. Though humans vary in many ways, genetically two humans on average are over 99% similar. Generally, men have greater body strength and women have a higher body fat percentage. We are omnivorous, capable of consuming a wide variety of plant and animal material, and have become capable of using fire and other forms of heat to both prepare and cook food. We can survive for up to eight weeks without food, and three or four days without water. 
We are generally diurnal, sleeping on average seven to nine hours per day. It is quite usual for both the mother and the father to provide care for their children, who are helpless at birth. Over the ages humans have grown and developed, we presently have a large and highly developed prefrontal cortex, the region of the brain associated with higher cognition. We are intelligent, capable of episodic memory, flexible facial expressions, self-awareness and a theory of mind which is fully capable of introspection, private thought, imagination, volition and forming views on our existence. This has enabled many great technological advancements and complex tool development to be possible through reason and the transmission of knowledge to future generations. Language, art and trade are defining characteristics of us humans. Long-distance trade routes have led to cultural explosions and resource distribution that gave an advantage over other species.

Interestingly, the word ‘human’ is a Middle English word from the Old French ‘humain’ and ultimately from the Latin ‘hūmānus’, the adjectival form of ‘homō’, or ‘man’ in the sense of humankind. The native English term ‘man’ can refer to the species generally (as a synonym for humanity) as well as to human males. It may also refer to individuals of either sex, though this latter form is less common in contemporary English. Until about 12,000 years ago, all humans lived as hunter-gatherers. Then came the Neolithic Revolution, the invention of agriculture, which first took place in South-west Asia and spread through large parts of the ‘Old World’, consisting of Africa, Europe and Asia. This was before contact with the Americas, which became known as the New World. Agriculture also arose independently about 6,000 years ago in such places as Papua New Guinea and some regions of Africa. Access to food surplus led to the formation of permanent human settlements, the domestication of animals and the use of metal tools for the first time in history. Agriculture and sedentary lifestyles led to the emergence of early civilisations. Then an urban revolution took place in the fourth millennium BCE with the development of city states, particularly the Sumerian cities located in Mesopotamia, and it was in these cities that the earliest known form of writing, cuneiform script, appeared around 3000 BCE. Other major civilisations developing around this time were Ancient Egypt and the Indus Valley. They eventually traded with each other and invented technology such as wheels, ploughs and sails.

Agriculture and domestication of animals
led to stable human settlements.

This is getting to be more of a history lesson than I’d realised! But bear with me please. Astronomy and mathematics were also developed and the Great Pyramid of Giza was built. But there is evidence of a severe drought lasting about a hundred years that may have caused the decline of these civilisations, with new ones appearing in the aftermath. Babylonians came to dominate Mesopotamia while others, such as the Minoans and the Shang dynasty, rose to prominence in new areas. The Bronze Age suddenly collapsed about 1200 BCE, resulting in the disappearance of a number of civilisations and the beginning of the Greek Dark Ages. During this period iron started replacing bronze, leading to the Iron Age. In the 5th century BCE, history started being recorded as a discipline, providing a much clearer picture of life at that time. Between the 8th and 6th centuries BCE Europe entered the Classical Antiquity age, a period when Ancient Greece and Ancient Rome flourished, and around this time other civilisations also came to prominence. The Mayan civilisation started to build cities and create complex calendars, whilst in Africa the kingdom of Aksum overtook the declining kingdom of Kush and facilitated trade between India and the Mediterranean. In West Asia, the Achaemenid Empire’s system of centralised governance became the precursor to many later empires, while the Gupta Empire in India and the Han Dynasty in China have been described as ‘golden ages’ in their respective regions. Following the fall of the Western Roman Empire in 476 CE, Europe entered the Middle Ages and it was during this period that Christianity and the Church would become the source of centralised authority and education. In the Middle East, Islam became the prominent religion and expanded into North Africa. It led to an Islamic Golden Age, inspiring achievements in architecture, the revival of old advances in science and technology and the formation of a distinct way of life. 
The Christian and Islamic worlds would eventually clash, with the kingdom of England, the kingdom of France and the Holy Roman Empire declaring a series of ‘holy wars’ to regain control of the Holy Land from Muslims. In the Americas, complex societies would arise starting around 800CE, whilst further south the Aztecs and Incas would become the dominant powers. The Mongol Empire would conquer much of Eurasia in the 13th and 14th centuries and over this same time period the Mali Empire in Africa grew to the largest empire on the continent, stretching from Senegambia to the Ivory Coast. Oceania would see the rise of the Tu’i Tonga empire which expanded across many islands in the South Pacific. It was throughout the early modern period (1500–1800) that the Ottomans controlled the lands around the Mediterranean Basin, whilst Japan entered the Edo period, the Qing dynasty rose in China and the Mughal empire ruled much of India. Europe underwent the Renaissance, starting in the 15th century and the Age of Discovery began with the exploring and the colonising of new regions. This included the British Empire, which expanded to become the world’s largest empire and the colonisation of the Americas. This great expansion led to the Atlantic slave trade and the genocide of Native American peoples. The period also marked the start of the Scientific revolution, with great advances in mathematics, mechanics, astronomy and physiology. The late modern period, 1800 to the present, saw the Industrial and Technological revolutions bring such discoveries as transport, energy development and imaging technology. The United States of America underwent great change, going from a small group of colonies to one of the global super-powers. The Napoleonic Wars had raged right through Europe in the early 1800s, Spain lost most of its New World colonies and Europeans continued expansion into Oceania as well as Africa where European control went from 10% to almost 90% in less than fifty years. 
A tenuous balance of power among European nations collapsed in 1914 with the outbreak of the First World War, one of the deadliest conflicts in history. In the 1930s a worldwide economic crisis led to the rise of authoritarian regimes and a Second World War, involving almost all of the world’s countries. Following its conclusion in 1945, the Cold War between the USSR and the USA saw a struggle for global influence, which included a nuclear arms race as well as a space race. What I believe is now the Information Age sees the world becoming increasingly globalised and interconnected.

Early human settlements were dependent on proximity to water and, depending on the lifestyle, other natural resources used for subsistence, such as populations of animal prey for hunting and arable land for growing crops and grazing livestock. Modern humans, however, have a great capacity for altering their habitats by means of technology, irrigation, urban planning, construction, deforestation and desertification. It has been said that human settlements continue to be vulnerable to natural disasters, especially those placed in hazardous locations and built with low-quality construction! Grouping and deliberate habitat alteration is often done with the goals of providing protection, accumulating comforts or material wealth, expanding the available food, improving aesthetics, increasing knowledge or enhancing the exchange of resources. It is also said that humans are one of the most adaptable species, despite having a relatively narrow tolerance for many of the earth’s extreme environments. Through invention, humans have been able to extend their tolerance to a wide variety of temperatures, humidities and altitudes. As a result, we are a cosmopolitan species found in almost all regions of the world, including tropical rainforests, arid deserts, extremely cold arctic regions and heavily polluted cities. Most other species are confined to a few geographical areas by their limited adaptability. The human population is not, however, uniformly distributed on the Earth’s surface; population density varies from one region to another, and there are large areas that are almost completely uninhabited, such as Antarctica and the vast swathes of ocean. Most humans live in Asia (61%); the remainder live in the Americas (14%), Africa (14%), Europe (11%) and Oceania (0.5%). Within the last century, humans have explored challenging environments such as Antarctica, the deep sea and outer space. 
But human habitation within these hostile environments is restrictive and expensive, typically limited in duration, and restricted to scientific, military or industrial expeditions. We have briefly visited the Moon and have made our presence felt on other celestial bodies through robotic spacecraft. In addition, since 2000 there has been a continuous human presence in space through the habitation of the International Space Station. Estimates of the population at the time agriculture emerged in around 10,000 BC have ranged between 1 million and 15 million. Around 50–60 million people lived in the combined eastern and western Roman Empire in the 4th century AD. Bubonic plagues, first recorded in the 6th century AD, reduced the population by 50%, with the Black Death killing 75–200 million people in Eurasia and North Africa alone. The human population is believed to have reached one billion in 1800 and then increased exponentially, reaching two billion in 1930 and three billion in 1960, four in 1975, five in 1987 and six billion in 1999. It passed seven billion in 2011 and in 2020 there were 7.8 billion of us. In 2018, 4.2 billion humans (55%) lived in urban areas, up from 751 million in 1950. The most urbanised regions are Northern America (82%), Latin America (81%), Europe (74%) and Oceania (68%), while Africa and Asia are home to nearly 90% of the world’s 3.4 billion rural population. Problems for humans living in cities include various forms of pollution and crime, especially in inner-city and suburban slums. We have had a dramatic effect on the environment, as we are apex predators, rarely preyed upon by other species. Human population growth, industrialisation, land development, overconsumption and the combustion of fossil fuels have led to environmental destruction and pollution that significantly contributes to the ongoing mass extinction of other forms of life. 
We are the main contributor to global climate change, which may accelerate the Holocene extinction, otherwise referred to as the sixth mass extinction or Anthropocene extinction. This is an ongoing extinction event of species during the present Holocene epoch, with the more recent period sometimes called the Anthropocene as a result of human activity. The most popular theory is that human overhunting of species added to existing stress conditions, as the extinctions coincide with human emergence. Although there is debate regarding how much human predation affected their decline, certain population declines have been directly correlated with human activity, such as the extinction events of New Zealand and Hawaii. Aside from human activity, climate change may have been a driving factor in the megafaunal extinctions, especially at the end of the Pleistocene. Ecologically, humanity has been described as an unprecedented ‘global super-predator’ that consistently preys on the adults of other apex predators and has worldwide effects on food webs. There have been extinctions of species on every land mass and in every ocean. Overall, the Holocene extinction can be linked to the human impact on the environment, and it continues into the 21st century, with meat consumption, human population growth and increasing per-capita consumption considered primary drivers of this decline.

Earth as seen from Space.

The above image shows the Earth as seen from space in 2016 and the extent of human occupation of the planet. The bright lights signify both the most densely inhabited areas and those financially capable of illuminating them. There is relatively little variation between human geographical populations, and most of the variation that occurs is at the individual level. Much of human variation is continuous, often with no clear points of demarcation. Genetic data show that no matter how population groups are defined, two people from the same population group are almost as different from each other as two people from any two different population groups. Dark-skinned populations found in Africa, Australia and South Asia are not closely related to each other. As for our culture, the most widely spoken languages are English, Mandarin Chinese, Hindi, Spanish, Standard Arabic, Bengali, French, Russian, Portuguese and Urdu. The most practised religions are Christianity, Islam, Hinduism, Buddhism, various folk religions, Sikhism and Judaism, with many people remaining unaffiliated. Language is our principal form of communication and is unique to humans, although many other species have their own forms of communication. Unlike the limited systems of other animals, human language is open, as an infinite number of meanings can be produced by combining a limited number of symbols. Human language also has the capacity of displacement, using words to represent things and happenings that are not presently or locally occurring but reside in the shared imagination of others. Language differs from other forms of communication in that the same meanings can be conveyed through different media: audibly in speech, visually by sign language or writing, and through tactile media such as Braille. Language is central to the communication between humans, and to the sense of identity that unites nations, cultures and ethnic groups. 
There are approximately six thousand different languages currently in use, including sign languages, and many thousands more that are extinct. But unlike speaking, reading and writing do not come naturally to us and must be taught. Despite this, forms of visual storytelling existed before the invention of writing, with 30,000-year-old paintings on walls inside some caves portraying a series of dramatic scenes. One of the oldest surviving works of literature is the Epic of Gilgamesh, first engraved on ancient Babylonian tablets about 4,000 years ago. Beyond simply passing down knowledge, the use and sharing of imaginative fiction through stories might also have helped develop the human capabilities for communication and increased the likelihood of securing a mate. Storytelling may also be used as a way to provide the audience with moral lessons and to encourage cooperation. We are often the subject of the arts, and while most art focuses on individual humans or a small group, in literature the genre of science fiction is known for tackling issues related to humanity as a whole, for example topics such as human evolution or the future of civilisation. This aspect is definitely something I have seen quite clearly in episodes of Star Trek. I feel that we have learned much, yet there is so much more for us to hopefully learn and share with others in a good and positive way that will be of benefit to us all.

This week, I have read…
We all know that Santa has a sleigh, on which he puts all the presents which must be delivered. He has his reindeer, all ready to take him around the world. For Santa, time is special and so that he can get all that must be done in good time he takes with him a Workshop elf. This elf makes sure the sleigh is in good working order and that the presents are packed correctly. The elf is also an engineer and will do repairs if needed, especially with tower blocks going higher and higher as well as aerials, satellite dishes and the like. It’s a hard life, but as you open your presents and thank Santa, please spare a thought for the Workshop elf…

Click: Return to top of page or Index page

Christmas 2021

Christmas is celebrated on December 25 each year and is a sacred religious holiday as well as being a worldwide cultural and commercial phenomenon. For two millennia, people around the world have been observing it with traditions and practices that are both religious and secular in nature. Christians celebrate Christmas Day as the anniversary of the birth of Jesus of Nazareth, a spiritual leader whose teachings form the basis of their religion. Popular customs include exchanging gifts, decorating Christmas trees, attending church, sharing meals with family and friends and, of course, waiting for Santa Claus to arrive. But the middle of winter has long been a time of celebration around the world. Centuries before the arrival of the man called Jesus, early Europeans celebrated light and birth in the darkest days of winter. Many peoples rejoiced during the winter solstice, when the worst of the winter was behind them and they could look forward to longer days and extended hours of sunlight. In Scandinavia, the Norse celebrated Yuletide from December 21, the winter solstice, through to January. In recognition of the return of the sun, fathers and sons would bring home large logs, which they would set on fire. The people would feast until the log burned out, which could take as many as 12 days. The Norse believed that each spark from the fire represented a new pig or calf that would be born during the coming year.

1848 image of Queen Victoria, Prince Albert and their children.

The end of December was a perfect time for celebration in most areas of Europe. At that time of year, most cattle were slaughtered so they would not have to be fed during the winter. In fact for many, it was the only time of year when they had a supply of fresh meat. As well as that, most wine and beer made during the year was fully fermented and ready for drinking. In Germany, people honoured the pagan god Oden during the mid-winter holiday. Germans were terrified of Oden, as they believed he made nocturnal flights through the sky to observe his people, and then decide who would prosper or perish. Because of his presence, many people chose to stay inside. In Rome, where winters were not as harsh as those in the far north, Saturnalia—a holiday in honour of Saturn, the god of agriculture—was celebrated. Beginning in the week leading up to the winter solstice and continuing for a full month, this was a time when food and drink were plentiful and the normal Roman social order was turned upside down. For a month, enslaved people were given temporary freedom and treated as equals. Businesses and schools were closed so that everyone could participate in the holiday’s festivities. Also around the time of the winter solstice, Romans observed Juvenalia, a feast honouring the children of Rome. In addition, members of the upper classes often celebrated the birthday of Mithra, the god of the unconquerable sun, on December 25. It was believed that Mithra, an infant god, was born of a rock. For some Romans, Mithra’s birthday was the most sacred day of the year. However, in the early years of Christianity, Easter was the main holiday and the birth of Jesus was not celebrated. In the fourth century, church officials decided to institute the birth of Jesus as a holiday. Unfortunately, the Bible does not mention a date for his birth, a fact Puritans later pointed out in order to deny the legitimacy of the celebration. 
Some evidence suggests that his birth may have occurred in the spring; one argument put forward asks why shepherds would be herding in the middle of winter. Pope Julius I chose December 25, and it is commonly believed that the church chose this date in an effort to adopt and absorb the traditions of the pagan Saturnalia festival. First called the Feast of the Nativity, the custom spread to Egypt by 432 AD and to England by the end of the sixth century. By holding Christmas at the same time as traditional winter solstice festivals, church leaders increased the chances that Christmas would be popularly embraced, but gave up the ability to dictate how it was celebrated. By the Middle Ages Christianity had, for the most part, replaced pagan religion. However at Christmas, believers attended church, then celebrated raucously in a drunken, carnival-like atmosphere similar to today’s Mardi Gras. Each year, a beggar or student would be crowned the “lord of misrule” and eager celebrants played the part of his subjects. The poor would go to the houses of the rich and demand their best food and drink, and if owners failed to comply, their visitors would most likely terrorise them with mischief. Christmas became the time of year when the upper classes could repay their real or imagined “debt” to society by entertaining less fortunate citizens. In the early 17th century a wave of religious reform changed the way Christmas was celebrated in Europe. When Oliver Cromwell and his Puritan forces took over England in 1645, they vowed to rid England of decadence and, as part of their effort, cancelled Christmas. But by popular demand, Charles II was restored to the throne and with him came the return of the popular holiday. The Pilgrims, English separatists who came to America in 1620, were even more orthodox in their Puritan beliefs than Cromwell. As a result, Christmas was not a holiday in early America. 
From 1659 to 1681, the celebration of Christmas was actually outlawed in Boston; Ebenezer Scrooge had nothing on the 17th-century Puritans, who banned the public celebration of Christmas in the Massachusetts Bay Colony for an entire generation, fining anyone exhibiting the Christmas spirit five shillings. By contrast, in the Jamestown settlement, Captain John Smith reported that Christmas was enjoyed by all and passed without incident. It seems though that in America, after the American Revolution, English customs fell out of favour, including Christmas. In fact Christmas wasn’t declared a federal holiday until June 26, 1870. So it seems that it wasn’t until the 19th century that Americans began to embrace Christmas. There are even those who have said that Americans re-invented Christmas, changing it from a raucous carnival holiday into a family-centred day of peace and nostalgia. I am not sure I can agree with that! But what was it about the 1800s that piqued American interest in the holiday? The early 19th century was a period of class conflict and turmoil there. During this time, unemployment was high and gang rioting by the disenchanted classes often occurred during the Christmas season. In 1828, the New York city council instituted the city’s first police force in response to a Christmas riot. This encouraged quite a few members of the upper classes to begin to change the way Christmas was celebrated in America. In 1819, best-selling author Washington Irving wrote The Sketchbook of Geoffrey Crayon, a series of stories about the celebration of Christmas in an English manor house. The sketches featured a squire who invited the peasants into his home for the holiday. In contrast to the problems faced in American society, this showed how the two groups mingled effortlessly. In Irving’s mind, Christmas should be a peaceful, warm-hearted holiday bringing groups together across lines of wealth or social status. 
Irving’s fictitious celebrants enjoyed ‘ancient customs’, including the crowning of a Lord of Misrule. Irving’s book, however, was not based on any holiday celebration he had attended, and in fact many historians say that Irving’s account actually “invented” tradition by implying that it described the true customs of the season. Also around this time, English author Charles Dickens created the classic holiday tale A Christmas Carol. The story’s message, the importance of charity and good will towards all humankind, struck a powerful chord in England as well as the United States and showed members of Victorian society the benefits of celebrating the holiday. The family was also becoming less disciplined and more sensitive to the emotional needs of children during the early 1800s. Christmas provided families with a day when they could lavish attention and gifts on their children without appearing to ‘spoil’ them. As people began to embrace Christmas as a perfect family holiday, old customs were unearthed. They looked toward recent immigrants and Catholic and Episcopalian churches to see how the day should be celebrated, and in time a Christmas tradition was built which included pieces of many other customs, including decorating trees and exchanging gifts. But although most families quickly bought into the idea that they were celebrating Christmas as it had been done for centuries, some Americans believed they had re-invented a holiday to fill the cultural needs of a growing nation.

In my research I have found a few questions regarding the legend of Santa Claus, which can be traced back to a monk named St. Nicholas. Born in Turkey around 280 AD, St. Nicholas gave away all of his inherited wealth and travelled the countryside helping the poor and sick, becoming known as the protector of children and sailors. The modern character of Santa is based on traditions surrounding St. Nicholas, with Santa generally depicted as a portly, jolly, white-bearded man, often wearing spectacles, a red coat with white fur collar and cuffs, white-fur-cuffed red trousers, a red hat with white fur, and a black leather belt and boots, carrying a bag full of gifts for children. He is commonly portrayed as laughing in a way that sounds like “ho ho ho”. This image became popular in the 19th century, due largely to the influence of the poem ‘A Visit From St. Nicholas’, also known as The Night Before Christmas or ’Twas The Night Before Christmas after its first line; it was first published anonymously in 1823 and later attributed to Clement Clarke Moore, who claimed authorship in 1837. The story is that on the night of Christmas Eve, a family is settling down to sleep when the father is disturbed by noises on the lawn outside. Looking out of the window, he sees Saint Nicholas on a sleigh which is pulled by eight reindeer. After landing his sleigh on the roof, Saint Nicholas enters the house down the chimney, carrying a sack of toys. The father watches his visitor fill the stockings which are hanging by the fireplace and laughs to himself. They share a conspiratorial moment before Saint Nicholas bounds up the chimney again. As he flies away, he wishes a “Happy Christmas to all, and to all a good night.” So St. Nicholas became known by various names such as Santa Claus, Saint Nick, Kris Kringle or simply ‘Santa’. He is said to bring gifts of toys and sweets on Christmas Eve to well-behaved children and either coal or nothing to naughty children. 
He is said to accomplish this with the aid of Christmas elves who make the toys in his workshop at the North Pole, distributing the gifts around the world on his sleigh which is pulled through the air by flying reindeer. Christmas traditions around the world are quite diverse, but they share key traits that often involve themes of light, evergreens and hope. Probably the most celebrated holiday in the world, our modern Christmas is a product of hundreds of years of both secular and religious traditions from around the globe, many of them centred on the winter solstice. Most people in Scandinavian countries honour St. Lucia (also known as St. Lucy) each year on December 13. The celebration of St. Lucia Day began in Sweden, but had spread to Denmark and Finland by the mid-19th century. In these countries, the holiday is considered the start of the Christmas season and is sometimes referred to as “little Yule.” Traditionally, the oldest daughter in each family rises early, dressed in a long, white gown with a red sash, and wearing a crown made of twigs with nine lighted candles. She wakes each of her family members and for the day, she is called “Lussi” or “Lussibruden” (Lucy bride). The family then eats breakfast in a room lighted with candles. Any shooting or fishing done on St. Lucia Day was done by torchlight, and people brightly illuminated their homes. At night, men, women and children would carry burning torches in a parade. The night would end when everyone threw their torches onto a large pile of straw, creating a huge bonfire. In Finland today, one girl is chosen to serve as the national Lucia and she is honoured in a parade in which she is surrounded by torchbearers. Light is a main theme of St. Lucia Day as her name, which is derived from the Latin word lux, means light. Her feast day is celebrated near the shortest day of the year, when the sun’s light again begins to strengthen. 
Lucia lived in Syracuse during the fourth century, when persecution of Christians was common. Unfortunately, most of her story has been lost over the years, but according to one common legend Lucia lost her eyes while being tortured under the Emperor Diocletian for her Christian beliefs. Others say she may have plucked her own eyes out in protest at the poor treatment of Christians, and it is for that reason that St. Lucia is the patron saint of the blind. In Finland, many Finns visit the sauna on Christmas Eve and families gather to listen to the national “Peace of Christmas” radio broadcast. It is also the custom there to visit the gravesites of departed family members. In Norway, the birthplace of the Yule log, I have learned that the ancient Norse used the Yule log in their celebration of the return of the sun at the winter solstice. “Yule” came from the Norse word ‘hweol’, meaning wheel. The Norse believed that the sun was a great wheel of fire that rolled towards and then away from the earth. If you ever wonder why the family fireplace is such a central part of the typical Christmas scene, it is because this tradition dates back to the Norse Yule log. It is probably also responsible for the popularity of log-shaped cheeses, cakes and desserts during the holidays. But the tradition of decorating Christmas trees comes from Germany, where decorating evergreen trees had always been a part of the German winter solstice tradition. The first Christmas trees explicitly decorated and named after the Christian holiday appeared in Strasbourg (part of Alsace) at the beginning of the 17th century. After 1750, Christmas trees began showing up in other parts of Germany, and even more so after 1771, when Johann Wolfgang von Goethe visited Strasbourg and promptly included a Christmas tree in his novel, The Sorrows of Young Werther. But over in Mexico, papier-mâché sculptures called piñatas are filled with sweets and coins and hung from the ceiling. 
Children then take turns hitting the piñata until it breaks, sending a shower of treats to the floor, and race to gather as many of them as they can. In 1828, the American minister to Mexico, Joel R. Poinsett, brought a red-and-green plant from Mexico to America. As its colouring seemed perfect for the holiday, the plants, called poinsettias after Poinsett, began appearing in greenhouses as early as 1830. In 1870, New York stores began to sell them at Christmas and by 1900 they were a universal symbol of the holiday. It may come as no surprise that a manger scene is the primary decoration in Central American, South American and most southern European nations, as St. Francis of Assisi created the first living nativity in 1224 to help explain the birth of Jesus to his followers. Further north, most Canadian Christmas traditions are very similar to those practised in the United States. In the far north of the country, indigenous Inuit celebrate a winter festival called Sinck Tuck, which features parties with dancing and the exchanging of gifts.

Over in France, Christmas is called Noel. This comes from the French phrase les bonnes nouvelles, which means “the good news” and refers to the gospel. In southern France, some people burn a log in their homes from Christmas Eve until New Year’s Day. This stems from an ancient tradition in which farmers would use part of the log to ensure good luck for the next year’s harvest. Equally, Italians call Christmas ‘il Natale’, meaning “the birthday”, whilst in Greece many people believe in the ‘kallikantzeri’, goblins that appear and cause mischief during the 12 days of Christmas. Gifts are usually exchanged on January 1, St. Basil’s Day. But down in Australia, the holiday comes in the middle of summer and it’s not unusual for some parts of Australia to hit 100 degrees Fahrenheit on Christmas Day. During the warm and sunny Australian Christmas season, beach time and outdoor barbecues are common. Traditional Christmas Day celebrations include family gatherings, exchanging gifts and either a hot meal with ham, turkey, pork or seafood, or barbecues. Here in Britain I have learned that the tradition of exchanging Christmas cards can be traced back to England. An Englishman named John Calcott Horsley helped to popularise the tradition of sending Christmas greeting cards when, beginning in the late 1830s, he produced small cards featuring festive scenes and a pre-written holiday greeting. Our Post Office dates back to 1660, when it was established by Charles II as the General Post Office (GPO), and it soon grew into an organisation integral to the infrastructure of England during the seventeenth century. With this postal network in place, the exchange of these cards made them nearly overnight sensations. Celtic and Teutonic peoples had long considered mistletoe to have magic powers, and it was said to have the ability to heal wounds and increase fertility. 
The Celts hung mistletoe in their homes in order to bring themselves good luck and ward off evil spirits, and in the Victorian era the English would hang sprigs of mistletoe from ceilings and in doorways during the holidays. If someone was found standing under the mistletoe, they would be kissed by someone else in the room, a behaviour not usually demonstrated in Victorian society. A favourite food at this time of year is Christmas pudding, also known as ‘figgy pudding’ or plum pudding, an English dish dating back to the Middle Ages. Suet, flour, sugar, raisins, nuts and spices are tied loosely in cloth and boiled until the ingredients are “plum”, meaning they have enlarged enough to fill the cloth. It is then unwrapped, sliced like cake and topped with cream. Also, ‘carolling’ began in England, when wandering musicians would travel from town to town visiting castles and the homes of the rich. In return for their performance, the musicians hoped to receive a hot meal or money. In most countries nowadays I think children hang stockings on their bedpost or near a fireplace on Christmas Eve, hoping that they will be filled with treats while they sleep. In Scandinavia, similar-minded children leave their shoes on the hearth. But the best one has to be in Ukraine, where families prepare a traditional twelve-course meal and the family’s youngest child keeps watch through the window for the evening star to appear, a signal that the feast can begin. I do wonder if the older children actually sit and wait…

This week, a Fascinating Fact…
We know the word ‘emphatic’, but there is also ‘phatic’. A phatic expression denotes or relates to language used for general purposes of social interaction, rather than to convey information or ask questions. Utterances such as “hello, how are you?” and “nice morning, isn’t it?” are phatic expressions.

Click: Return to top of page or Index page

Christmas Approaches…

In a couple of days’ time it will be December 12th. Some of you may be reading this on that very date, but for many it will still be a couple of days away. Already the shops will be getting a little busier, although in no way do I think they will be as busy as a few years ago, due to the changes in our lifestyles over the last few years. I see mentions on Facebook of folk who have already put their Christmas decorations up, and the same with trees. I mention December 12th as it was my dad’s birthday and, although he sadly passed away some years ago now, I still follow our family tradition of putting up decorations, cards and the like starting on that day. I found it interesting to research the history of Christmas trees, which goes back to the symbolic use of evergreens in ancient Egypt and Rome. Long before the advent of Christianity, plants and trees that remained green all year had a special meaning for people in the winter. Just as people today decorate their homes during the festive season with trees such as pine, spruce and fir, ancient peoples hung evergreen boughs over their doors and windows. In many countries it was believed that evergreens would keep away witches, ghosts, evil spirits and illness. Here in the Northern hemisphere, the shortest day and longest night of the year falls on December 21 or December 22 and is called the winter solstice. Many ancient people believed that the sun was a god and that winter came every year because the sun god had become sick and weak. They celebrated the solstice because it meant that at last the sun god would begin to get well. Evergreen boughs reminded them of all the green plants that would grow again when the sun god was strong and summer would return. The ancient Egyptians worshipped a god called Ra, who had the head of a hawk and wore the sun as a blazing disk in his crown. 
At the solstice, when Ra began to recover from his illness, the Egyptians filled their homes with green palm rushes, which symbolised for them the triumph of life over death. Early Romans marked the solstice with a feast called Saturnalia in honour of Saturn, the god of agriculture. The Romans knew that the solstice meant that soon farms and orchards would be green and fruitful once more, so to mark the occasion they decorated their homes and temples with evergreen boughs. In Northern Europe the Druids, the priests of the ancient Celts, also decorated their temples with evergreen boughs as a symbol of everlasting life, whilst the Vikings in Scandinavia thought that evergreens were the special plant of the sun god, Balder. But Germany is credited with starting the Christmas tree tradition as we now know it, back in the 16th century when devout Christians brought decorated trees into their homes. Some built Christmas pyramids of wood and decorated them with evergreens and candles if wood was scarce. It is a widely held belief that Martin Luther, the 16th-century Protestant reformer, first added lighted candles to a tree. Walking toward his home one winter evening, composing a sermon, he was awed by the brilliance of stars twinkling through the evergreens. To recapture the scene for his family, he erected a tree in the main room and wired its branches with lighted candles. Most 19th-century Americans found Christmas trees an oddity though. The first record of one on display comes from the 1830s, among the German settlers of Pennsylvania, although trees had been a tradition in many German homes much earlier. The Pennsylvania German settlements had community trees as early as 1747, but as late as the 1840s Christmas trees were still seen as pagan symbols and not accepted by most Americans. It is not surprising that, like many other festive Christmas customs, the tree was adopted so late in America. To the New England Puritans, Christmas was sacred. 
The second governor of the pilgrims, William Bradford, wrote that he tried hard to stamp out “pagan mockery” of the observance, penalising any frivolity. Also, Oliver Cromwell preached against “the heathen traditions” of Christmas carols, decorated trees, and any joyful expression that desecrated “that sacred event.” In 1659, the General Court of Massachusetts enacted a law making any observance of December 25 (other than a church service) a penal offence, in addition people were fined for hanging decorations. That stern solemnity continued until the 19th century, when the influx of German and Irish immigrants undermined the Puritan legacy.

An illustration from a December 1848 edition of the Illustrated London News shows Queen Victoria and her family surrounding a Christmas tree.
Bettmann Archive/Getty Images

In 1848 the popular royals, Queen Victoria and Prince Albert, were sketched in the Illustrated London News standing with their children around a Christmas tree. Unlike the previous royal family, Victoria was very popular with her subjects, and what was done at court immediately became fashionable, not only in Britain but with fashion-conscious East Coast American society. The Christmas tree had arrived. By the 1890s Christmas ornaments were arriving from Germany and Christmas tree popularity was on the rise around the U.S.A. It was noted that Europeans used small trees about four feet in height, while Americans liked their Christmas trees to reach from floor to ceiling. The early 20th century saw Americans decorating their trees mainly with homemade ornaments, while the German-American community continued to use apples, nuts, and marzipan biscuits. Popcorn joined in after being dyed bright colours and interlaced with berries and nuts. Electricity brought about Christmas lights, making it possible for Christmas trees to glow for days on end. With this, Christmas trees began to appear in town squares across the country, and having a Christmas tree in the home became a tradition around the world, though their history varies from country to country. Here are just a few examples.

Down in Brazil, although Christmas falls during the summer there, they sometimes decorate pine trees with little pieces of cotton that represent falling snow, whilst in China, of the small percentage of Chinese who do celebrate Christmas, most erect artificial trees decorated with spangles and paper chains, flowers, and lanterns. Christmas trees there are called “trees of light”. In Canada, the German settlers who migrated there from the United States in the 1700s brought with them many of the things associated with Christmas that we cherish today, for example Advent calendars, gingerbread houses, biscuits and of course Christmas trees. When Prince Albert put up a Christmas tree at Windsor Castle in 1848, the Christmas tree became a tradition throughout the United Kingdom, the United States, and Canada. Over in Germany, besides the Martin Luther legend, another says that in the early 16th century, people in Germany combined two customs that had been practised in different countries around the globe. The Paradise tree (a fir tree decorated with apples) represented the Tree of Knowledge in the Garden of Eden. The Christmas Light, a small, pyramid-like frame, usually decorated with glass balls, tinsel and a candle on top, was a symbol of the birth of Christ as the Light of the World. Changing the tree’s apples to tinsel balls and biscuits and combining this new tree with the light placed on top, the Germans created the tree that many of us know today. I understand that a modern Tannenbaum is traditionally decorated in secret with lights, tinsel and ornaments by parents, then lit and revealed on Christmas Eve with sweets, nuts and gifts under its branches. I’ve learned that down in Guatemala the Christmas tree has joined the “Nacimiento” (Nativity scene) as a popular ornament, it is thought because of the large German population there.
Gifts are left under the tree on Christmas morning for the children, but for some reason parents and adults do not exchange gifts until New Year’s Day. Here in Britain, the Norway spruce is the traditional species used to decorate homes. This was in fact a native species in the British Isles before the last Ice Age and was reintroduced here before the 1500s, but since December 1947 a Christmas tree has been an annual gift to the people of Britain from Norway as a token of gratitude for British support to Norway during the Second World War. The first tree was cut down by Mons Urangsvåg in 1942 during a raid on the Norwegian island called Hisøy, which is located on the west coast between Bergen and Haugesund. After it was cut down, the tree was transported to England, where the Norwegian King was in exile, and given to him as a gift. It is possible to visit the island of Hisøy, but only by boat, and from the old tree stump a new tree has since grown. A Christmas tree has been given to the people of Britain by Norway every year since then and has provided a central focus for the Trafalgar Square traditional carol-singing programme, which is performed by different groups raising money for voluntary or charitable organisations. It is prominently displayed from the beginning of December until 6 January the following year, the Twelfth Night of Christmas, when it is taken down for recycling. The tree is chipped and composted to make mulch. It is typically a fifty- to sixty-year-old Norway spruce, generally over twenty metres tall, and is cut in Norway some time in November during a ceremony attended by the British Ambassador to Norway, the Mayor of Oslo and the Lord Mayor of Westminster. After the tree is cut, it is shipped to the UK, and at one time it was brought over to Felixstowe free of charge by a cargo ship of the Fred Olsen Line.
Then from around 2007 it was brought into Immingham by the DFDS Tor Line, but since 2018 it has been the responsibility of Radius Group to transport, guard and erect the tree in Trafalgar Square. The tree is decorated in a traditional Norwegian style and adorned with 500 white lights and in 2008 the tree began using low-wattage halogen bulbs which used just 3.5kW of power.

Different countries have slightly different traditions when it comes to Christmas trees. In Ireland, they are bought at any time in December and decorated with coloured lights, tinsel, and baubles. Some people favour the angel on top of the tree, others the star. The house is decorated with garlands, candles, holly, and ivy whilst wreaths and mistletoe are hung on the door. In Italy, the presepio (manger or crib) represents in miniature the Holy Family in the stable and is the centre of Christmas for families. Guests kneel before it and musicians sing before it. The presepio figures are usually hand-carved and very detailed in features and dress. The scene is often set out in the shape of a triangle. It provides the base of a pyramid-like structure called the ceppo, this being a wooden frame arranged to make a pyramid several feet high. Several tiers of thin shelves are then supported by this frame. It is entirely decorated with coloured paper, gilt pine cones, and miniature coloured pennants. Small candles are fastened to the tapering sides and a star or small doll is hung at the apex of the triangular sides, whilst the shelves above the manger scene have small gifts of fruit, sweets and presents. It has been said that the ceppo is done in an old Tree of Light tradition which became the Christmas tree in other countries. Some houses even have a ceppo for each child in the family. In Japan, for most of the Japanese who celebrate Christmas it’s purely a secular holiday devoted to the love of their children. Christmas trees are decorated with small toys, dolls, paper ornaments, gold paper fans and lanterns, and wind chimes. Miniature candles are also put among the tree branches and one of the most popular ornaments is the origami swan. Japanese children have exchanged thousands of folded paper “birds of peace” with young people all over the world as a pledge that war must not happen again. 
Across in Mexico, the principal holiday adornment is el Nacimiento, or Nativity scene. However, a decorated Christmas tree may be incorporated in the Nacimiento or set up elsewhere in the home. As purchase of a natural pine represents a luxury commodity to most Mexican families, the typical arbolito (little tree) is often an artificial one, a bare branch cut from a copal tree (Bursera microphylla) or some type of shrub collected from the countryside. Up in Norway itself, nowadays Norwegians often take a trip to the woods to select a Christmas tree, a trip that their grandfathers probably did not make. The Christmas tree was not introduced into Norway from Germany until the latter half of the 19th century and to the country districts it came even later. Therefore when Christmas Eve arrives, there is the decorating of the tree, usually done by the parents behind the closed doors of the living room, while the children wait with excitement outside. There is a Norwegian ritual known as “circling the Christmas tree” which follows, where everyone joins hands to form a ring around the tree and then walk around it singing carols. After that, gifts are distributed. Across in the Philippines, fresh pine trees are too expensive for many Filipinos so handmade trees in an array of colours and sizes are often used. Star lanterns appear everywhere in December. They are made from bamboo sticks, covered with brightly coloured rice paper or cellophane, and usually feature a tassel on each point. There is usually one in every window, each representing the Star of Bethlehem. But it seems that over in Saudi Arabia the Europeans, Americans, Indians, Filipinos, as well as other Christians living there have to celebrate Christmas privately in their homes. Christmas lights are generally not tolerated and as a result most families place their Christmas trees somewhere rather inconspicuous. 
However, Christmas is a summer holiday in South Africa; whilst Christmas trees are not common there, windows are often draped with sparkling cotton wool and tinsel. In Spain, a popular Christmas custom in Catalonia is a ‘lucky strike’ game, where a tree trunk is filled with goodies and children hit the trunk, trying to knock out the hazelnuts, almonds, toffee, and other treats. Up in Sweden, most people buy Christmas trees well before Christmas Eve, but it’s not common to take the tree inside and decorate it until just a few days before. Evergreen trees are decorated with stars, sunbursts, and snowflakes made from straw. Other decorations include colourful wooden animals and straw centrepieces. I found it fascinating, though, to learn that in Ukraine, Christmas is celebrated on December 25th by Catholics and on January 7th by Orthodox Christians, yet it is the most popular holiday there. So as a result, during the whole of their Christmas season, which of course includes New Year’s Day, people decorate fir trees and have parties.

I have found even more fascinating facts about this festive time.

  • In the U.S.A, the Rockefeller Center tree is located at Rockefeller Center, west of Fifth Avenue from 47th through 51st Streets in New York City and dates back to the Depression era.
  • The first tree at Rockefeller Center was placed in 1931 and was a small unadorned tree placed by construction workers at the centre of the construction site. Two years later, another tree was placed there, this time with lights.
  • The tallest tree displayed at Rockefeller Center arrived in 1948 and was a Norway Spruce that measured 100 feet tall and hailed from Killingworth, Connecticut.
  • Between 1887 and 1933 a fishing schooner called the Christmas Ship would tie up at the Clark Street bridge and sell spruce trees from Michigan to the people of Chicago.
  • In 1912, the first community Christmas tree in the United States was erected in New York City.
  • In 1923, President Calvin Coolidge started the National Christmas Tree Lighting Ceremony now held every year on the White House lawn.
  • In 1963, the National Christmas Tree was not lit until December 22nd because of a national 30-day period of mourning following the assassination of President Kennedy.
  • Since 1966, the National Christmas Tree Association has given a Christmas tree to the President and first family.
  • In 1979, the National Christmas Tree was not lit except for the top ornament, in honour of the American hostages in Iran.
  • Christmas trees generally take six to eight years to mature.
  • The tallest living Christmas tree is believed to be the 122-foot, 91-year-old Douglas fir in the town of Woodinville, Washington.
  • Most Christmas trees are cut weeks before they get to a retail outlet.
  • In the past, other types of trees such as cherry and hawthorns were used as Christmas trees.
  • It is said that Thomas Edison’s assistants came up with the idea of electric lights for Christmas trees.
  • Teddy Roosevelt banned the Christmas tree from the White House for environmental reasons.
  • At one time, tinsel was banned because it contained lead. Now it is made of plastic.
  • In the first week, a tree in your home will consume as much as a quart of water per day.
  • You should never burn your Christmas tree in the fireplace, as it can contribute to a build-up of creosote.

This week…
I watched a video recently showing where a cat had somehow managed to get its head stuck inside a tin can and could not get out. A man freed the cat, but found it was not wearing a collar so was saying to people nearby how he thought the cat was probably wild. My immediate thought was “wild – I expect it was absolutely furious!”

Click: Return to top of page or Index page

The Past Is History

As many will know, I am something of a Star Trek fan. I’m not quite so keen on the later series, but Star Trek TOS and TNG I do enjoy. The early DS9 (Deep Space Nine) episodes are good, but the later ones… well, I’m not as struck on those. However, each to their own. Some of the story lines are in fact quite good, as you can just take them as a story or you can see a hidden meaning behind them. For example, in a very early DS9 episode a wormhole is discovered near the space station, in which exist beings with no concept of ‘time’ as we perceive it. Time has to be explained to them: how events don’t all happen at once, but occur in sequence, and actions then have consequences. Like in a game of cricket, where a ball is bowled and the batter may strike it with the bat. The ball may then spin off in a different direction. It may be caught by another player, or not. In another instance the ball may be bowled and missed completely. Each time the ball is bowled it is an event in time, and that time is always going forward, never backward. Life is a series of consequences, with one event leading to another. I have mentioned in a previous blog post what occurred with my maternal grandfather, who was on board a particular ship, the H.M.S. Tipperary, which was sunk during World War I. He was in the North Sea for hours, but thankfully another ship came along and rescued him. Except that ship, the HMS Dublin, had at one point seen enemy ships; according to the captain’s report, in a few seconds the enemy was lost in the fog and his ship was turned with the object of chasing and shadowing them, but the existing weather conditions made this impossible. Course was therefore shaped for a position where it was hoped to meet with and join up with the 2nd Light Cruiser Squadron. The Commander-in-Chief was informed of sighting the enemy. Commodore, 2nd Light Cruiser Squadron, was asked for course and speed of Squadron.
At 6.30 a.m. they passed a lot of oil fuel and rescued a man on a piece of wood who turned out to be George T. A. Parkyn, Stoker 1st Class of H.M.S. “Tipperary”, who had been in the water for about five hours, and stated his ship had been sunk by shell fire at night. He was my maternal grandfather. Here are extracts from the Battle of Jutland Official Despatches. So George was in the North Sea all that time and had it not been for the fog and the decision for H.M.S. Dublin to turn away and attempt to join up with the 2nd Light Cruiser Squadron, she would not have seen George or picked him up. A definite tale of consequences, as without that occurring, and so much more, my father wouldn’t have been born in 1919, my parents would never have met and I, along with quite a few others, would not be here now. It is a fascinating world we live in!

Report by Admiral Jellicoe
George Parkyn’s Rescue from H.M.S. Tipperary

There will be countless stories like this one, of that I am sure. If we also look at the lives of people and the changes brought about by their achievements, it is amazing quite what a difference they have made to our lives today. I believe an example of this is Louis Pasteur (27 December 1822 – 28 September 1895), a French chemist and microbiologist renowned for his discoveries of the principles of vaccination, microbial fermentation and pasteurisation. He was born in Dole, a subprefecture in the Jura department in the Bourgogne-Franche-Comté area of eastern France, to a Catholic family; he was the third child of Jean-Joseph Pasteur, a poor tanner, and Jeanne-Etiennette Roqui. The family moved to Marnoz in 1826 and then to Arbois in 1827. Pasteur entered primary school in 1831 and was an average student in his early years, not particularly academic, as his interests were fishing and sketching. He drew many pastels as well as portraits of his parents, friends and neighbours. Louis Pasteur attended secondary school at the Collège d’Arbois and in October 1838 he left for Paris to join the Pension Barbet (which I believe may have been a college), but he became homesick, returning home in November. In 1839 he entered the Collège Royal at Besançon to study philosophy and earned his Bachelor of Letters degree in 1840. He was appointed a tutor at the Besançon college whilst continuing a science degree course with special mathematics. He failed his first examination in 1841, but passed a general science degree at Dijon, where he earned his Bachelor of Science in Mathematics degree in 1842, though with only a mediocre grade in chemistry. Later in 1842, Pasteur took the entrance test for the École Normale Supérieure. He passed the first set of tests, but because his ranking was low, Pasteur decided not to continue and to try again the next year.
He went back to the Pension Barbet to prepare for the test; he also attended classes at the Lycée Saint-Louis and lectures of Jean-Baptiste Dumas at the Sorbonne. In 1843 he passed the test with a high ranking and so entered the École Normale Supérieure. In 1845 he received the licencié ès sciences degree and in 1846 he was appointed professor of physics at the Collège de Tournon (now called the Lycée Gabriel-Faure) in Ardèche. But the chemist Antoine Jérôme Balard wanted him back at the École Normale Supérieure as a graduate laboratory assistant, so he joined Balard and simultaneously started his research in crystallography. In 1847 he submitted his two theses, one in chemistry and the other in physics. After serving briefly as professor of physics at the Dijon Lycée, in 1848 he became professor of chemistry at the University of Strasbourg, where in 1849 he met and courted Marie Laurent, daughter of the university’s rector. They were married on 29 May 1849 and together had five children, only two of whom survived to adulthood. The other three died of typhoid. But his research in chemistry led to remarkable breakthroughs in the overall understanding of the causes and prevention of diseases, which laid down the foundations of hygiene, public health and much of modern medicine. His works are credited with saving millions of lives through the development of vaccines for rabies and anthrax. He is regarded as one of the founders of modern bacteriology and has been honoured as one of the “fathers of bacteriology and microbiology”. Louis Pasteur was responsible for disproving the doctrine of spontaneous generation. Under the auspices of the French Academy of Sciences, his experiment demonstrated that in sterilised and sealed flasks nothing ever developed, whilst in sterilised but open flasks microorganisms could grow. For this experiment, in 1862 the academy awarded him the Alhumbert Prize of 2,500 francs.
He is also regarded as one of the fathers of the germ theory of diseases, a minor medical concept at the time. His many experiments showed that diseases could be prevented by killing or stopping germs, thereby directly supporting the germ theory and its application in clinical medicine. He is best known to the general public for his invention of the technique of treating milk and wine to stop bacterial contamination, the process we call pasteurisation. Louis Pasteur also made significant discoveries in chemistry, most notably on the molecular basis for the asymmetry of certain crystals. Early in his career, his investigation of tartaric acid resulted in the first resolution of what are now called ‘optical isomers’ in chemistry. His work led the way to the current understanding of a fundamental principle in the structure of organic compounds. Pasteur was motivated to investigate fermentation while working at Lille. In 1856 a local wine manufacturer, M. Bigot, whose son was one of Pasteur’s students, sought his advice on the problems of making alcohol from beetroot, which kept turning sour. According to his son-in-law, René Vallery-Radot, in August 1857 Pasteur sent a paper about lactic acid fermentation to the Société des Sciences de Lille, but the paper was read three months later and a memoir was subsequently published on 30 November 1857. In the memoir, he developed his ideas, stating that: “I intend to establish that, just as there is an alcoholic ferment, the yeast of beer, which is found everywhere that sugar is decomposed into alcohol and carbonic acid, so also there is a particular ferment, a lactic yeast, always present when sugar becomes lactic acid.” Pasteur also wrote about alcoholic fermentation, which was published in full form in 1858. Jöns Jacob Berzelius and Justus von Liebig had proposed the theory that fermentation was caused by decomposition.
Pasteur demonstrated that this theory was incorrect and that yeast was responsible for fermentation to produce alcohol from sugar. He also demonstrated that when a different microorganism contaminated the wine, lactic acid was produced, making the wine sour. In 1861, Pasteur observed that less sugar fermented per part of yeast when the yeast was exposed to air. This lower rate of fermentation under aerobic conditions became known as the Pasteur effect. Pasteur’s research also showed that the growth of micro-organisms was responsible for spoiling beverages such as beer, wine and milk. With this established, he invented a process in which liquids such as milk were heated to a temperature between 60 and 100°C, killing most bacteria and moulds already present within them. Pasteur and Claude Bernard completed tests on blood and urine on 20 April 1862. Pasteur patented the process, to fight the “diseases” of wine, in 1865. The method became known as pasteurisation and was soon applied to beer and milk. Beverage contamination led Pasteur to the idea that micro-organisms infecting animals and humans cause disease. He proposed preventing the entry of micro-organisms into the human body, leading Joseph Lister to develop antiseptic methods in surgery. In 1866, Pasteur published ‘Études sur le Vin’, about the diseases of wine, and he published ‘Études sur la Bière’ in 1876, concerning the diseases of beer. In the early 19th century, Agostino Bassi had shown that muscardine was caused by a fungus that infected silkworms. Since 1853, two diseases called pébrine and flacherie had been infecting great numbers of silkworms in southern France, and by 1865 they were causing huge losses to farmers. In 1865, Pasteur went to Alès and worked for five years, until 1870. Silkworms with pébrine were covered in corpuscles. In the first three years, Pasteur thought that the corpuscles were a symptom of the disease.
In 1870, he concluded that the corpuscles were the cause of pébrine (it is now known that the cause is microsporidia, a group of spore-forming unicellular parasites). Pasteur also showed that the disease was hereditary, and he developed a system to prevent pébrine. Pasteur’s first work on vaccine development was on chicken cholera. Then in the 1870s, he applied his immunisation method to anthrax, which affected cattle, and aroused interest in combating other diseases. Pasteur cultivated bacteria from the blood of animals infected with anthrax. When he inoculated animals with the bacteria, anthrax occurred, proving that the bacterium was the cause of the disease. Many cattle were dying of anthrax in “cursed fields”, and Pasteur was told that sheep which had died from anthrax had been buried in those fields. Pasteur thought that earthworms might have brought the bacteria to the surface. He found anthrax bacteria in earthworms’ excrement, showing that he was correct, so he told the farmers not to bury dead animals in the fields. Pasteur had been trying to develop the anthrax vaccine since 1877, soon after Robert Koch’s discovery of the bacterium. Pasteur had quite a few disagreements with other scientists on the subject of vaccines. The notion of a weak form of a disease causing immunity to the virulent version was not new, as this had been known for a long time for smallpox. Inoculation with smallpox variolation was known to result in a much less severe disease, and greatly reduced mortality, in comparison with the naturally acquired disease. Edward Jenner had also studied vaccination using cowpox vaccinia to give cross-immunity to smallpox in the late 1790s, and by the early 1800s vaccination had spread to most of Europe. The difference between smallpox vaccination and anthrax or chicken cholera vaccination was that the latter two disease organisms had been artificially weakened, so a naturally weak form of the disease organism did not need to be found.
This discovery revolutionised work in infectious diseases, and Pasteur gave these artificially weakened diseases the generic name of “vaccines”, in honour of Jenner’s discovery. Pasteur produced the first vaccine for rabies by growing the virus in rabbits, and then weakening it by drying the affected nerve tissue. The rabies vaccine was initially created by Emile Roux, a French doctor and a colleague of Pasteur, who had produced a killed vaccine using this method. The vaccine had also been tested in 50 dogs before its first human trial. Because of his study of germs, Pasteur encouraged doctors to sanitise their hands and equipment before surgery. Prior to this, few doctors or their assistants practised these procedures.

Louis Pasteur married Marie Laurent in 1849. She was the daughter of the rector of the University of Strasbourg, and was Pasteur’s scientific assistant. They had five children together, only two of whom survived to adulthood. His grandson, Louis Pasteur Vallery-Radot, wrote that Pasteur had kept from his Catholic background only a spiritualism without religious practice. However, Catholic observers often said that Pasteur remained an ardent Christian throughout his whole life, and his son-in-law wrote, in a biography of him: “Absolute faith in God and in Eternity, and a conviction that the power for good given to us in this world will be continued beyond it, were feelings which pervaded his whole life; the virtues of the gospel had ever been present to him. Full of respect for the form of religion which had been that of his forefathers, he came simply to it and naturally for spiritual help in these last weeks of his life”. The Literary Digest of 18 October 1902 gives this statement from Pasteur, that whilst he worked, he prayed: “Posterity will one day laugh at the foolishness of modern materialistic philosophers. The more I study nature, the more I stand amazed at the work of the Creator. I pray while I am engaged at my work in the laboratory”. Maurice Vallery-Radot, grandson of the brother of the son-in-law of Pasteur and an outspoken Catholic, also holds that Pasteur remained fundamentally Catholic. According to Pasteur Vallery-Radot and Maurice Vallery-Radot, the following well-known quotation attributed to Pasteur is apocryphal: “The more I know, the more nearly is my faith that of the Breton peasant. Could I but know all I would have the faith of a Breton peasant’s wife”. According to Maurice Vallery-Radot, the false quotation appeared for the first time shortly after the death of Pasteur. However, despite his belief in God, it has been said that his views were those of a free-thinker rather than a Catholic, a spiritual more than a religious man.
He was also against mixing science with religion. In 1868, Pasteur suffered a severe stroke that paralysed the left side of his body, but he recovered; a further stroke in 1894 severely impaired his health. Failing to fully recover, he died on 28 September 1895, near Paris. He was given a state funeral and was buried in the Cathedral of Notre Dame, but his remains were reinterred in the Pasteur Institute in Paris, in a vault covered in depictions of his accomplishments in Byzantine mosaics.

I try to keep up with the events happening in our world, but there is so much news easily shared nowadays it is so easy to overlook a great deal. People, families, move to other countries and with the Internet we keep in touch with them but we can perhaps overlook things. Happily I know some folk who make a point of reminding us of people and events. On television we can watch the quiz shows that ask when certain things occurred and unless you have a particular interest in the subject it is easy to forget what happened, when notable figures were alive and what they did to and for this world. I have said before about when I was at school I took little interest in history, but now I begin to realise the importance of sharing the honest truth of what happened in the past so that we can build a better future for us, for every one and every thing around us.

This week…
The other morning I looked out of the window at the snow and was reminded of how Good King Wenceslas likes his pizza – deep pan, crisp and even. 😊

Click: Return to top of page or Index page

A Quick Reminder

Times change. They just do. But so should we. Or at the very least, learn and adapt. Consider some of the words spoken by both the bride and groom at a Church of England marriage service, part of the sacred vows they make: ‘For better, for worse, for richer, for poorer, in sickness and in health’. I am sure that in many other cultures and ceremonies, similar words are uttered. They should all perhaps remind us that change is constant, and that despite whatever any of us may encounter in our lives we should all try our best to live a good, honest life. I heard those words so many times, along with others, as I sang in various choirs for quite a number of years, but I didn’t really realise their full meaning or significance at the time due to my young age. Only later did I begin to understand, especially as I met more and more people of different races and cultures, as well as seeing how some people treated this lovely world. To my mind it was yet more to learn from. Over the many years that I was employed, technology changed and I learned to use new things, new skills. As I grew older and became a teacher of certain skills, as part of my training I learned that we do not all learn in the same way; some comprehend or become more skilful far more quickly than others. After a while we manage the things we do routinely at a much faster rate; we become more adept. Likewise with our senses, we see and hear a great many things, but more often than not we learn to ignore the sounds that we are used to hearing and the sights that do not seem to change. I know I do. People I see on a regular basis hardly change, but I have been back to see folk who haven’t seen me for a number of years and they do not recognise me, nor I them. Time passes; my life is very different from what it was just two years ago, when I was living on my own in a flat.
I had a routine which I had been following for many years, but that was all changing. To be fair though, it needed to. When I was a lad way back in the 1950s and 1960s, children went to either a secondary modern or a grammar school, depending on whether they passed the ‘Eleven Plus’ exam. I didn’t, and neither did my two elder brothers, so for us it was the good old secondary modern route. We lived in Whittlesey, a small town some seven miles east of Peterborough. Things were different then: we had a doctor who went out on his rounds, it seemed like we all knew each other, and people like me just couldn’t get up to mischief because my dad was a teacher. Or if we did, everyone knew about it! The father of another of my schoolmates was the bank manager, the police lived locally and many of the families were often related. We didn’t have mobile phones and we could be out quite late at night, but we did have a help system, which came in the form of carrying around four old pennies to use in a telephone box to phone home. That was our ‘In Case of Emergency’ (ICE) system! Folk moving from other areas weren’t always welcomed, even from other parts of the British Isles. Fun was made of some people over their accents, or they were simply not understood. At my school there was one person who had poor eyesight and so needed glasses which clearly had very thick lenses, and that did make the person look odd. In my case, I learned to minimise my own physical disability as far as possible, but I learned much later that it was the equivalent of having had a stroke, where my right side was much weaker than my left. Others have had to cope with disability and in years gone by, sadly, they were somewhat ‘looked down on’ by some, but happily not all. Moreover, much more is accepted now than in years past. As I say, times change. I have detailed in previous blogs that we were on the edge of the flat Fen country. 
Many of the folk stayed living in Whittlesey and got jobs in the town; they often met and married the folk they were at school with, then settled down in the town. My eldest brother joined the army and when he left them he was sent to a training place in Leicester, I think to acclimatise to civilian life and work. Whilst there he met and married, settled down and moved around as jobs required. My other brother met, married, divorced and when work required it he moved away. Since then he has remarried; they had twins who themselves married and have children. I know of people I used to work with who for many and various reasons have left the Peterborough area; some reside in Canada, some in the U.S.A., whilst others have gone further afield to Australia and New Zealand. As for me, after my initial upbringing in Whittlesey and subsequent move with my parents to Peterborough, I spent the first nineteen years of my working life there. It may have only been seven miles, moving from Whittlesey to Peterborough, but we thought it was worth it, as my parents had heard of, then seen, a lovely bungalow which had a superb large garden. Though it did mean that I lost contact with just about all of the people I knew back in Whittlesey and only now, through the Internet, am I having some contact with a few of them again. It is easy to look back at that and wonder quite how different life for us all might have been, but changes and chances in this transitory life do occur!

During my early years of work I learned much, not just about the jobs there in British Telecom but about people in general. There are some good, lovely folk around, but equally there are some very unkind, even spiteful people. Selfish ones, too. I still smile at the old attitudes and behaviours back then because in 1969, when I joined, it was still a part of the Civil Service. Three months later it was changed to a Corporation, but was largely still run in the civil service manner. Managers were called ‘sir’, never by their first names. It was all hierarchical. On my grade of job I was only allowed to use a chair without arms, as chairs with arms were for higher grades! In Peterborough we had people from different countries living there; I saw quite a number of Polish people, probably due to World War II. We also had an Italian community, but not a great deal from anywhere else. One lady was clearly African, her skin was indeed black and I found her to be a quiet, friendly, helpful person. Until then I had never seen anyone from another country with such dark skin. In school I had learned to minimise my own disabilities as far as possible, as by then I had more than accepted how and who I was. Some tasks proved to be difficult; in fact the change of duty from one department to another meant having to use a date stamp repeatedly with my weak right hand. That aggravated the epilepsy and resulted in my first epileptic fit whilst on holiday a short while later. Thankfully I was given appropriate medication following several checks and tests at different hospitals, whilst at work I was eventually moved to a different job which did not require use of that date stamp! Promotion moved me to another duty; management had already asked me if I planned to make the company my career, and after a few more years I was moved into the Sales department. 
I have written before about my time there and it was all good experience, but when the opportunity came for me to move on with further promotion, I took it. Almost every cloud has that proverbial silver lining and so I moved to Leicester and beyond. My thirst for knowledge has remained throughout my life; the only thing I have had to do is to learn the skill of not being ‘side-tracked’, spending my time more on things which would be of value to me and which I could perhaps then help others to do. There are those whose culinary skills I find utterly amazing, as I know of people who seem to throw items into a pan then mix, stir or whatever, cook for however long seems necessary without a glance at a timepiece, and the food is cooked perfectly! Still, we cannot all be the first violin in the orchestra. It took me a while to get used to ‘work’, so very, very different from school. I learned to plan and to look forward to holidays! Almost all our family holidays were to Devon and Cornwall, as we liked the area and had relatives living in Plymouth. But time passed, my parents retired, I was then living in a flat on my own and after a while was able to spend time in Jersey and Guernsey. Some of the local alcoholic beverages there were quite strong. Mum and dad had a few good holidays together, but sadly dad’s health failed and he passed away because of cancer. I am sure it wasn’t just him smoking, that it was the poor air in London. We know how it affected my mother, who also smoked a little but gave that up. Secondary smoke inhalation from sitting on the back seat of the car as we travelled wouldn’t have helped my asthma, although I was doing a great deal of singing and that probably minimised the effects of the smoke to a degree at least. So it has meant that whenever I see a new doctor or need to give details of my medical condition, they are always pleased to learn that I have never actually smoked directly. 
I am residing in a Care Home now, time will tell for how long but I have a good routine here which seems to work. I send greetings every morning to a few friends, something I started a while ago when I realised that this pandemic had isolated so many of us. Quite a few have families it is true, but it isn’t like it was when our parents and perhaps grandparents were alive. Back then they often lived either together or maybe in the same street, certainly in the same town. So they had regular contact. With there being large factories involving manual labour, many worked together and consequently met and married. It is why when we perhaps look at details of those who lost their lives in the two World Wars of the last century, so many had complete families wiped out. I do think that we can all too easily forget the numbers of human lives lost and whilst in this century there have been no worldwide wars, the loss of life attributed to the pandemic has been similar. I am also concerned that some appear to give the reason for someone passing away as being simply due to the pandemic, when other health conditions could have been contributory but no mention is made of those. The ‘bottom line’ though is that it is still a loss of a human life. I know some who seemingly turn a blind eye to the fact that whilst prevention may not be completely achieved it may be at the very least minimised by simple, basic rules. At one time car drivers never wore seat belts, whilst cyclists never wore helmets. Nowadays we take far more care. Times change and we do change with them. At the beginning of the twentieth century there was the influenza pandemic of 1918-1919, also known as the Spanish flu, which lasted between one and two years.

Emergency hospital, Camp Funston, Fort Riley, Kansas, 1918.
Image courtesy of the National Museum of Health and Medicine, Armed Forces Institute of Pathology, Washington, D.C.

That pandemic occurred in three waves, though not simultaneously around the globe. In the Northern Hemisphere, the first wave originated in the spring of 1918, during World War I. Although it remains uncertain where exactly the virus first emerged, the earliest cases in the United States were detected in March among military personnel stationed at Camp Funston in Fort Riley, Kansas. Movement of troops probably helped spread the virus throughout the U.S. and Europe during the late spring. By summer the virus had reached parts of Russia, Africa, Asia, and New Zealand. This first wave was comparatively mild and had begun to die down in some areas, but a second, more lethal wave began about August or September 1918. During this wave, pneumonia often developed quickly, with patients usually dying just two days after experiencing the first symptoms of the flu. As social distancing measures were enforced, the second wave began to die down toward the end of November. But once those measures were relaxed, a third wave began in the winter and early spring of 1919. Though not as deadly as the second wave, the third wave still claimed a large number of lives. By the summer the virus had run its course in many parts of the world, but some historians suggest that there was a fourth wave in the winter of 1920, though it was far less virulent. The Spanish flu was the most severe pandemic of the 20th century and, in terms of total numbers of deaths, among the most devastating in human history. Outbreaks occurred in every inhabited part of the world, including islands in the South Pacific. The second and third waves claimed the most lives, with about half the deaths occurring among 20- to 40-year-olds, an unusual mortality age pattern for influenza. India is believed to have suffered at least 12.5 million deaths during the pandemic, and in the United States about 550,000 people died. Some scholars think the total number could have been even higher. 
Sadly there is always going to be loss of life, whether from natural causes, from earthquakes and similar disasters like Aberfan, or from disagreements, wars and folk just not following what some see as simple precautions. Technology has enabled us to do much more than in the past, but we are human and we still make mistakes. We try to cope with events, with disasters, adjusting and adapting as necessary. After major events, when routines are disrupted, it can be difficult for a time. As a simple example, after World War II it took quite a while for food stocks to be back to normal, so rationing of some items continued for several years. In our family our meals were organised, with things like fish on Fridays and a roast dinner on Sundays. Certain foods were available only at certain times of the year and were looked forward to. I wonder if we may find ourselves going back to those ways at times in the future. I do believe that one thing is certain though, which is that times will change and will always continue to do so.

This week, Language.
There are times when just a few words put together can express a thought very easily, when a statement is very clear in its meaning, but more often than not the opposite is true – especially without a little bit of thought on phraseology!
As an example, I give you the following:
“Don’t let worries kill you – let the church help”

Click: Return to top of page or Index page

Our World And Beyond

I do believe there are times when we simply forget the scale of this, our amazing planet, and how much it has changed in just the last twenty, the last two hundred, the last two thousand years and beyond. At times it may seem to be such a big place and yet at others so small, and it does make me wonder when I learn of those who show signs of narcissism, a personality disorder characterised by a sense of grandiosity, the need for attention and admiration, superficial interpersonal relationships and a lack of empathy. I try to keep an open mind on how people behave, but even I have seen how some folk always focus on themselves, some who never seem to accept reality and seem to almost live in an ‘artificial’ world of their own. Where I am right now, living and recovering in a Care Home after my heart problems and Covid-19, I see others who have dementia in varying stages, but they are not encouraging others to behave as they are doing, believing in things which cannot be. Sadly however there are some who are trying to do just that, trying to persuade folk that what they are saying is the truth. But perhaps they forget how the technology of today enables us to record and recall scenes that at one time would have been simply spoken about. It is a truism that no two people can stand side by side, see exactly the same event and then describe that event to others in precisely the same manner. One may embellish the scene, another may focus more on one aspect than the other. I remember the tale of two men, Fred and George, who were standing near to a church, just after a wedding. The church bells were ringing and the following conversation ensued:

Fred: “The bells sound nice, don’t they.”
George: “What?”
Fred (shouting): “I said, THE BELLS SOUND NICE!”
George (shouting): “I CAN’T HEAR YOU, IT’S THESE BLASTED BELLS!”
I am sure they both enjoyed the wedding…

In earlier posts I have written about the communication of information. As we humans explored this lovely Earth and began to share its treasures, many of the peoples we encountered were fearful of the ‘strangers’ that they met. To illustrate this I recall one very good episode of Star Trek TNG, in which the captain, Jean-Luc Picard, was seen by the inhabitants of a world the crew encountered as some sort of god. They expected him to bring people who had died back to life, and it wasn’t until Picard himself was injured that these inhabitants realised just how mortal they all really were. Picard also got one inhabitant to consider their life and how they lived. He got them to realise how their lives had changed over a period of time and how they might be treated if they were to meet their ancestors of long ago. Right now in the 21st century on Earth we have changed so much from our ancestors. Were we capable of going back in time a couple of thousand years, we could use just basic skills to heal, to create, to manufacture items, all of which might be seen as ‘magic’, certainly beyond belief to the people of that time. But in time we explored, we learned of new materials, developed and enhanced crops, improved growing techniques, created dams and irrigation, along with many things medical. Sadly much came about as a result of wars, improving weapons, and many lives were lost. As I did some research into my family’s history I learned that not all that long ago it was quite usual for many children to be born to a family; this was because of what one might call ‘natural’ wastage, as it was expected that some offspring would die from tuberculosis, cholera, polio and the like. But when knowledge was passed on from one generation to another, and once reading and writing were taught and shared, more and more was known. 
At one time it could take some time to share messages and information, but gradually postal services emerged, telegraph then telephone and here we are now with the Internet, which so many of us now access for information. What distresses me though is how so many people will simply accept what they are told, even when with just a little bit of research, information may be either proved or disproved. It is a fact that some countries are ruled by dictators, whilst others are governed in a more democratic fashion but even now there are those who will not accept what simply ‘is’. We know that humans live and die. My ‘family tree’ is quite interesting and with help I have researched much of it. There are now many other people who have researched their families and yes, the Internet is extremely useful in that regard.

R.M.S. Ortona.

As an example, one person whose details I wish to share is my maternal grandfather, George. I never met him personally, as sadly he passed away a good few years before I was born, but I have learned much about him, thanks to the Internet! George was born in Truro in 1884 and he was christened there early the following year. Then in 1889 a brother named Samuel was born. Truro was quite a prosperous mining area, but at the time of the Census in 1891 the family had moved over to Bury St Edmunds in Suffolk, most likely for the chalk mining. At some point things changed though, as in June 1900 George was registered as a ship’s boy aged 15 on board the R.M.S. Ortona (image above) in Tilbury Docks, London. I found his contract ended less than four months later. Over the next twelve months or so he crewed on other ships, these being the Orizaba and the Wakanui, doing trips between England and Australia/New Zealand. Then in July 1902 he joined the London and South Western Railway, though I don’t know in what job. But a year later he left them without giving any notice and three days later George signed on with the Royal Navy. He was now aged 18 and a Stoker on HMS Nelson, and at that time his character was marked as ‘very good’. Around eighteen months later he transferred to HMS Mars at the same rank and character; however, some six months later something must have occurred, as he spent ten days in the cells! But his character was still marked as ‘good’. A few days after his release and return to HMS Mars, he was transferred as a Stoker to HMS Victory II. Two months later he was transferred to HMS Fox, but nine months later George was promoted to Stoker 1st Class, a grade which he retained over the next four years. But from what I can find out, his weight increased and as a result his term of duty ended, though his character was still marked as very good. 1910 then found him in Cardiff, where he married Gertrude and they settled there. 
A census the following year showed him, his wife and a son, Charles. George was still a stoker, but at an iron works, and in the next few years they had two more children, John and Harry, who were both born in Pontypridd. When World War One came, George was back in the Royal Navy as a Stoker 1st Class, first on HMS Victory II and then HMS Tipperary. However, on 31 May 1916 George was one of the very few survivors of HMS Tipperary, which was sunk at the Battle of Jutland. Rescued, he joined HMS Victory II as a Stoker 1st Class with his character marked as ‘VG Super’. Three months later he transferred as Stoker 1st Class to HMS Renown with his character down as ‘VG Sat’, but ten months later the records showed him as an Acting Leading Stoker on HMS Renown, character ‘VG Super’. In January 1918 he was a Leading Stoker on HMS Renown, character still ‘VG Super’, and he remained in that position and character until his term of duty ended in April 1919, again due to obesity. George, Gertrude and family settled in London where, two years later, my mother (also named Gertrude) was born. A further son, Ronald, was born seven years later. If we consider that George would have been away from home for months on end, with little contact from the family, to me there must have been a great deal to catch up on when he returned, and he would have had to cope with life after the war once his time in the Royal Navy ended. So far I have not been able to determine what work he did, but my dear mother, who herself sadly passed away a few years ago aged ninety-five, often said how her father would stand at the kitchen sink, staring wistfully into the distance. She knew that he missed the sea. George passed away in 1938, aged 54. I myself was fortunate enough to do a ‘round the world’ cruise a few years ago which I thoroughly enjoyed, and that has given me at least some idea why George was so happy at sea, as I certainly was.

Royal Navy Archives – George T.A. Parkyn, Battle of Jutland, 1916.

That world cruise proved to me that this planet, with its diverse people and places, has been through many changes and has a fascinating history. I once made the real mistake in school of asking the history teacher why we needed to know about the Tudors. I was simply told to be quiet, which was a shame as I genuinely wanted to know. But as I mentioned in an earlier blog this year, I asked a similar question of my maths teacher and was told ‘one day you will need this’, and he was correct, as I did! I am also finding that history is far more interesting now than it was back then. So it proves to me that the more we learn, the more we find that there is to learn! Obviously I have not been to school for many years, and I do know what my dear dad meant shortly before he retired from teaching. He taught at an infant/junior school, where he was deputy head, but he was beginning to find that the Education Authority was putting constraints on what he had to teach. I think that at my old school in Whittlesey they were teaching to quite a strict curriculum in order for the students to achieve excellent grades. One of the television programmes I presently watch is Richard Osman’s House of Games on BBC2 each weekday evening. I find it fascinating as they have a mixture of games. In one game, each of the four contestants has to write down an answer on a tablet computer, and the person with the numerical value closest to the actual answer then gets a point. If they get the value very close or exactly right, they get two points! They might be having to estimate a particular year, for example when Julius Caesar died, or how far it is from our Earth to the Sun. Quite a few do know, but others do not. Another game is ‘Put Your Finger On It’, where all of the contestants are asked to mark on a map where a place is located. So often the results are wildly inaccurate, but a few are really quite good. 
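As a light aside, that ‘closest numerical answer’ rule can be sketched in a few lines of code. This is purely an illustrative sketch of the scoring idea described above, not the show’s actual rules: the contestant names are made up, and I have simplified the two-point bonus to an exact answer only.

```python
def score_round(true_value, guesses):
    """Score one 'closest answer' round.

    `guesses` maps a contestant's name to their numerical guess.
    The contestant nearest the true value gets 1 point, or 2 points
    for an exactly right answer (a simplified version of the bonus).
    """
    # Find the contestant whose guess is closest to the true value.
    winner = min(guesses, key=lambda name: abs(guesses[name] - true_value))
    points = 2 if guesses[winner] == true_value else 1
    return {winner: points}

# Example: guessing the year Julius Caesar died (44 BC, written as -44).
print(score_round(-44, {"Ann": -50, "Bob": -44, "Cat": -40}))
# Bob guessed exactly, so he scores the two-point bonus.
```

Ties go to whichever contestant happens to come first, which is one of several details a real implementation would need to settle.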
So I think we can and do forget how relatively large our Earth is, and where we might find places or recall when certain events occurred. Some we know quite easily, like the Battle of Hastings (1066), the Great Plague (1665) and the Great Fire of London (1666), as these are often ‘standard’ questions in school exams. But others, like ‘what year did India gain its independence from Britain?’ (1947), are not so readily recalled. It shows that no matter what age we may be, we are never too old to learn. I am also reminded of how I was taught to pass a driving test, which I did at the second attempt, and only then did I actually begin to learn to drive motor vehicles. I have said before that as a young child I made the mistake of telling my parents that I was bored, and so they soon found work for me to do in the form of washing a drainpipe. That taught me a valuable lesson. But back then I had no concept of time, of my existence, and of my interaction with other things and other people. I remember reading the tale of the child who went to his first day at school and was exhausted at the end of it. The following morning his father woke him to get him up and ready for school, and the child told his father, “But I did that yesterday!”. It took some convincing for the child to understand that he would be doing this for many more years yet, as he had much to learn! The truth is that we, like so many living things here, have at least the capacity to learn so much in our lives. Some invent completely new things, some learn existing things, whilst others learn and then develop anew from what others have made, thus developing in ways never previously considered. A prime example of this may be seen in the ‘Back To The Future’ film series, where the two main characters find themselves in the past after using a time machine. They attempt to return to their ‘present’ time, but the time machine uses a fuel which has yet to be invented. 
So they have to adapt, using what was available in that earlier time period. But things change. It really is the one constant in our glorious universe. Let’s face it, once upon a time our ancestors thought the Earth was flat. Imagine going back in time some 2,000 years and strapping a blood pressure cuff to the upper arm of a human of that time. Consider how computer games have developed over the last forty years. My very first was ping-pong, where the machine was plugged into a black & white television and tuned to the appropriate channel. We can think back how, over many centuries, explorers went out and discovered other countries; some places were conquered, whilst in others the natives, fearful of people with a different colour skin, would kill them. Some even used them as food. In fact, when one group learned that what we called ‘civilisation’ included war, where we simply killed thousands of others for no apparent reason, this group considered that we were in fact the barbarians! But we can also take a far, far broader view of our existence. I watched a short clip of film recently where a scientist tries to answer the question ‘how many galaxies of stars are there?’. She does this very well in my view, far better than I could, so I ask you to watch this short YouTube video. It is safe.

https://www.youtube.com/watch?v=qlEOo9ANNos

It gives us at least an idea just how massive our whole Universe is. I also then consider how relatively short a time it is that we humans have existed and I wonder what will occur for us in the next few thousand years. Time enough I am sure to hopefully learn more as we sit back and drink tea!

A reminder…
This Sunday, 21st November, is known informally by many as Stir-Up Sunday and has become associated with the custom of making Christmas puddings on that day. It gets its name from the beginning of the collect for the day in Anglican churches for the last Sunday before the season of Advent in their Book of Common Prayer, which begins with the words, “Stir up, we beseech thee, O Lord, the wills of thy faithful people; that they, plenteously bringing forth the fruit of good works, may of thee be plenteously rewarded…”. Once the pudding mixture was made, each member of the family would stir it in turn and say a prayer as they did so. The Christmas pudding is one of the essential British Christmas traditions and is said to have been introduced to Britain by Prince Albert, husband of Queen Victoria (though apparently the meat-less version was introduced from Germany by George I in 1714). Most recipes for Christmas pudding require it to be cooked well in advance of Christmas and then reheated on Christmas Day, so the collect of the day served as a useful reminder.

Click: Return to top of page or Index page

The Isle Of Ely

Back in January of this year I wrote a blog post entitled Transport History, in which I mentioned that one of the cars my Dad owned during his life was a Ford Anglia. That car had on it a stick-on panel displaying the ‘Isle of Ely’ badge, of which we were proud. The Isle of Ely is a historic region around the cathedral city of Ely in Cambridgeshire, and between 1889 and 1965 it formed an administrative county of England. It is so called because it was only accessible by boat until the waterlogged Fens were drained in the 17th century, something which I have detailed in my Whittlesey And The Fens blog. Still susceptible to flooding today, it was these watery surrounds that gave Ely its original name, the ‘Isle of Eels’, a translation of the Anglo-Saxon word ‘Eilig’, a reference to the creatures that were often caught in the local rivers for food. This etymology was first recorded by the Venerable Bede.

The Isle of Ely 1648 by J Blaeu

Until the 17th century, the area was an island surrounded by a large area of fenland, a type of swamp. It was coveted as an area easy to defend, and was controlled in the very early medieval period by the Gyrwas, an Anglo-Saxon tribe. Upon their marriage in 652, Tondbert, a prince of the Gyrwas, presented Æthelthryth (who became St. Æthelthryth), the daughter of King Anna of the East Angles, with the Isle of Ely. She afterwards founded a monastery at Ely, which was destroyed by Viking raiders in 870, but which was rebuilt and became a famous abbey and shrine. Beginning in 1626, the Fens were drained using a network of canals designed by Dutch experts, but many Fenlanders were opposed to the draining as it deprived some of them of their traditional livelihood. Acts of vandalism on dykes, ditches and sluices were common, but the draining was complete by the end of the century. The area’s natural defences led to it playing a role in the military history of England. Following the Norman Conquest, the Isle became a refuge for Anglo-Saxon forces under Earl Morcar, Bishop Aethelwine of Durham and Hereward the Wake in 1071. The area was taken by William the Conqueror, but only after a prolonged struggle. In 1139 civil war broke out between the forces of King Stephen and the Empress Matilda. Bishop Nigel of Ely, a supporter of Matilda, unsuccessfully tried to hold the Isle, and then in 1143 Geoffrey de Mandeville rebelled against Stephen and made his base in the Isle. Geoffrey was mortally wounded at Burwell in 1144. Then in 1216, during the First Barons’ War, the Isle was unsuccessfully defended against the army of King John. Ely took part in the Peasants’ Revolt of 1381. During the English Civil War the Isle of Ely was held for the Parliamentarians, and troops from the garrison at Wisbech Castle were used in the siege of Crowland; parts of the Fens were also flooded to prevent Royalist forces entering Norfolk from Lincolnshire. 
The Horseshoe sluice on the river at Wisbech and the nearby castle and town defences were upgraded and cannon brought from Ely.

Chatteris Plaque on Leonard Childs Bridge

From 1109 until 1837, the Isle was under the jurisdiction of the Bishop of Ely, who appointed a Chief Justice of Ely and exercised temporal powers within the Liberty of Ely. This temporal jurisdiction originated in a charter granted by King Edgar in 970 and confirmed by Edward the Confessor and Henry I to the abbot of Ely. The latter monarch established Ely as the seat of a bishop in 1109, creating the Isle of Ely a county palatine. In England, Wales and Ireland a county palatine or palatinate was an area ruled by a hereditary nobleman enjoying special authority and autonomy from the rest of a kingdom. The name derives from the Latin adjective palātīnus, ‘relating to the palace’, from the noun palātium, ‘palace’. It thus implies the exercise of a quasi-royal prerogative within a county, that is to say a jurisdiction ruled by an earl, the English equivalent of a count. A duchy palatine is similar, but is ruled over by a duke, a nobleman of higher precedence than an earl or count. The nobleman swore allegiance to the king yet had the power to rule the county largely independently of the king. It should therefore be distinguished from the feudal barony, held from the king, which possessed no such independent authority. Rulers of counties palatine created their own feudal baronies, to be held directly from them ‘in capite’, such as the Barony of Halton. In old English law, tenure in capite (from the Latin caput, ‘head’) was a tenure, abolished by the Tenures Abolition Act 1660, by which either a person or land was held immediately of the king, or of his crown, by knight-service. A holder of land in capite is termed a tenant-in-chief. County palatine jurisdictions were created in England under the rule of the Norman dynasty, but in continental Europe they have an earlier date. 
In general, when a palatine-type autonomy was granted to a lord by the sovereign, it was in a district on the periphery of the kingdom, at a time when the district was at risk from disloyal armed insurgents who could retreat beyond the borders and re-enter. For the English sovereign in Norman times this applied to northern England, Wales and Ireland. As the authority granted was hereditary, some counties palatine legally survived well past the end of the feudal period. It was an act of parliament in 1535/6 which ended the palatine status of the Isle, with all justices of the peace to be appointed by ‘letters patent’ issued under the great seal, and warrants to be issued in the king’s name. However, the bishop retained exclusive jurisdiction in civil and criminal matters. A chief bailiff was appointed for life by the bishop and performed the functions of high sheriff within the liberty, also heading the government of the city of Ely. Interestingly, I learned about letters patent, a type of legal instrument in the form of a published written order issued by a monarch, president or other head of state, generally granting an office, right, monopoly, title or status to a person or corporation. They can be used for the creation of government offices, or for granting city status or a coat of arms. They are also issued for the appointment of representatives of the Crown, such as governors and governors-general of Commonwealth realms, as well as for appointing a Royal Commission. In the United Kingdom they are also issued for the creation of peers of the realm. In addition to all this, a particular form of letters patent has evolved into the modern intellectual property patent (referred to as a utility patent or design patent in United States patent law), granting exclusive rights in an invention (or a design in the case of a design patent).
In this case it is essential that the written grant should be in the form of a public document, so that other inventors can consult it both to avoid infringement and to understand how to put it into practical use. In the Holy Roman Empire, Austrian Empire and Austria-Hungary, an imperial patent was also the highest form of generally binding legal regulation, for example a Patent of Toleration. I did a bit of research, finding that the more I learn, the more there is to learn! For example, I read that the Patent of Toleration (in German ‘Toleranzpatent’) was an edict of toleration issued on 13 October 1781 by the Habsburg emperor Joseph II. Part of the Josephinist reforms, the Patent extended religious freedoms to non-Catholic Christians living in the crown lands of the Habsburg monarchy, including Lutherans, Calvinists and the Eastern Orthodox. More specifically, these members of minority faiths were now legally permitted to hold “private religious exercises” in clandestine churches. The Patent guaranteed the practice of religion by the Evangelical Lutheran and the Reformed Church in Austria. Nevertheless, worship was heavily regulated, wedding ceremonies remained reserved for the Catholic Church, and the Unity of the Brethren was still suppressed. Similar to the articular churches admitted 100 years before, Protestants were only allowed to erect ‘houses of prayer’, which were not to resemble church buildings in any way. In many Habsburg areas, especially in the hereditary lands of Upper Austria, Styria and Carinthia, Protestant parishes quickly developed, relying strongly on their traditions. The Patent also regulated mixed-faith marriages, foreshadowing the Marriage Patent of 1783, which sought to bring marriages under civil rather than canon law. In marriages between the religions, if the father was Catholic all children were required to be raised as Catholics, whilst if the mother was Catholic only the daughters had to be raised as such.
The Patent was followed by the Edict of Tolerance for Jews in 1782. The edict extended to Jews the freedom to pursue all branches of commerce, but also imposed new requirements. Jews were required to create German-language primary schools or send their children to Christian schools (Jewish schools had previously taught children to read and write Hebrew, in addition to mathematics). The Patent also permitted Jews to attend state secondary schools. A series of laws issued soon after the Edict of Toleration abolished the autonomy of the Jewish communities, which had previously run their own court, charity, internal taxation and school systems. These laws required Jews to acquire family names, made Jews subject to military conscription and required candidates for the rabbinate to have a secular education. Constraints on the construction of churches were abolished after the revolutions of 1848, but the Protestant Church did not receive an equivalent legal status until Emperor Franz Joseph I of Austria issued the Protestant Patent in 1861.

But back to Ely. In July 1643, Oliver Cromwell was made governor of the Isle. The Liberty of Ely Act 1837 ended the bishop’s secular powers in the Isle and the area was declared a division of Cambridgeshire, with the right to appoint justices revested in the crown. Following the 1837 Act the Isle maintained separate Quarter Sessions and formed its own constabulary. Under the Local Government Bill of 1888, which proposed the introduction of elected county councils, the Isle was to form part of Cambridgeshire, but following the intervention of the local member of parliament, Charles Selwyn, the Isle of Ely was constituted a separate administrative county in 1889. In 1894 the county was divided into county districts, the rural districts being Ely, North Witchford, Thorney, Whittlesey and Wisbech. The urban districts were Ely, March and Whittlesey, with Wisbech being the only municipal borough. Whittlesey Rural District consisted of only one parish, and this was added to Whittlesey Urban District in 1926. However, the county was small in terms of both area and population, and its abolition was proposed by the Local Government Boundary Commission in 1947, but its report was not acted upon and the administrative county survived until 1965. Then, following the recommendations of the Local Government Commission for England, on 1 April 1965 the bulk of the area was merged to form Cambridgeshire and Isle of Ely, with the Thorney Rural District going to Huntingdon and Peterborough.

From a parliamentary standpoint, the Isle of Ely parliamentary constituency was created as a two-member seat in the First and Second Protectorate Parliaments from 1654 to 1659. The constituency was then re-created with a single seat in 1918, but in the boundary changes of 1983 it was replaced by the new constituency of North East Cambridgeshire. Original historical documents relating to the Isle of Ely are held by Cambridgeshire Archives and Local Studies at the County Record Office in Ely.
On 1 May 1931, the Isle of Ely County Council was granted a coat of arms. Prior to this, the council had been using the arms of the Diocese of Ely, these being ‘Gules, three ducal coronets, two and one or’. In the new grant, silver and blue waves were added to the episcopal arms to suggest that the county was an “isle”. The crest above the shield was a human hand grasping a trident around which an eel was entwined, referring to the popular derivation of “Ely”. On the wrist of the hand was a ‘Wake knot’, representing Hereward the Wake. This Wake knot, or ‘Ormond knot’, is an English heraldic knot used historically as a heraldic badge by the Wake family, the lords of the manor of Bourne, Lincolnshire, and also by the Butler family, Earls of Ormond, of Irish heritage. I have learned one fascinating item relating to the Wake name, and it in fact concerns knots. When I was a lad, I was taught how to tie just a few different knots, one of which was the Reef Knot, used to join two lines of the same diameter together. Then there is the Sheet Bend, used to tie lines of unequal thickness together. But for a stronger join there is the Carrick Bend, also known as the Sailor’s Breastplate, which is a knot for joining two lines of very heavy rope or cable that are too large and stiff to be easily formed into other common bends. It will not jam, even after carrying a significant load or being soaked with water. As with many other members of the basket weave family, the aesthetically pleasing interwoven and symmetrical shape of the Carrick Bend has also made it popular for decorative purposes. The Wake knot, however, may be used to join a rope and a strap.

A Wake knot.

In addition to all this, I have learned that the Isle of Ely became a marquessate, the territorial lordship or possessions of a marquess. The title Marquess of the Isle of Ely was created in the Peerage of Great Britain for Prince Frederick. The title of Duke of Edinburgh was first created on 26 July 1726 by King George I, who bestowed it on his grandson Prince Frederick, who became Prince of Wales the following year. The subsidiary titles of the dukedom were Baron of Snowdon, in the County of Caernarvon; Viscount of Launceston, in the County of Cornwall; Earl of Eltham, in the County of Kent; and Marquess of the Isle of Ely. The marquessate was apparently erroneously gazetted as Marquess of the Isle of Wight, although Marquess of the Isle of Ely was the intended title, and in later editions of the London Gazette the Duke is referred to as the Marquess of the Isle of Ely. Upon Frederick’s death, the titles were inherited by his son Prince George, and when he became George III in 1760 they merged into the Crown and ceased to exist. To me, the Isle of Ely is a lovely part of East Anglia with a fascinating history.

This week…
I am told that in Croatia there is a Museum Of Broken Relationships, at present located in Zagreb. But I think at least part of it should be in Split…