Some 20th Century Changes

After my research for last week’s blog post on workhouses, I felt that there had to be a bit more to the story, so I continued looking and this led on to law in general. Now that is an absolutely huge subject that I cannot possibly hope to encompass in my blogs, but I can perhaps highlight a few things we have either forgotten or simply were never told about, as the law can be a fascinating insight into the priorities of a particular period in time and quite rightly it is constantly changing. For example, in 1313 MPs were banned from wearing armour or carrying weapons in Parliament, a law which still stands today. Others, such as the monarch’s guards, are still permitted to carry weapons, just not MPs themselves. It’s easy enough to see the chain of events that prompts laws to be written or changed. For example, rule 264 of the Highway Code states that “You should always drive in the left-hand lane when the road ahead is clear. If you are overtaking a number of slow-moving vehicles, you should return to the left-hand lane as soon as you are safely past.” Middle-lane hogging is when vehicles remain in the middle lane longer than necessary, even when there aren’t any vehicles in the inside lane to overtake, and since around 2013 it has been actively penalised as careless driving. So in a hundred years’ time this rule might be seen as a historical curiosity, although the change was heartily welcomed by many drivers. For rather obvious reasons, then, some laws are still standing whilst others have been dropped as they are inappropriate in the modern day. But go back to the late 19th century and there were the Locomotive Acts, or Red Flag Acts, a series of Acts of Parliament which regulated the use of mechanically propelled vehicles on British public highways. The first three, the Locomotives on Highways Act 1861, the Locomotive Act 1865 and the Highways and Locomotives (Amendment) Act 1878, contained restrictive measures on the manning and speed of operation of road vehicles. They also formalised many important road concepts like vehicle registration, registration plates, speed limits, maximum vehicle weight over structures such as bridges, and the organisation of highway authorities. The most draconian restrictions and speed limits were imposed by the 1865 Act, also known as the ‘Red Flag’ Act, which required “all road locomotives, including automobiles, to travel at a maximum of 4mph (6.4km/h) in the country and 2mph (3.2km/h) in the city, as well as requiring a man carrying a red flag to walk in front of road vehicles hauling multiple wagons”. However, the 1896 Act removed some restrictions of the 1865 Act and also raised the speed limit to 14mph (23km/h). But first, let us go back to earlier times, for example the First Act of Supremacy 1534. Over the course of the 1520s and 1530s, Henry VIII passed a series of laws that changed life in England entirely, and the most significant of these was this First Act of Supremacy, which declared that Henry VIII was the Supreme Head of the Church of England instead of the Pope, effectively severing the link between the Church of England and the Roman Catholic Church and providing the cornerstone for the English Reformation. This change was so far-ranging that it is difficult to cover every effect that it had. It meant that England (and ultimately, Britain) would be a Protestant country rather than a Catholic one, with consequences for her allies and her sense of connection to the other countries of Europe.
It gave Henry VIII additional licence to continue plundering and shutting down monasteries, which had been huge centres of power in England, with really significant consequences in that their role in alleviating poverty and providing healthcare and education was lost. It led to centuries of internal and external conflicts between the Church of England and other faiths, some of which are still ongoing today. Another great change came with the union of England and Scotland. In fact, until 1707 there was no such thing as the United Kingdom. There was England, and there was Scotland, two countries which had shared a monarch since 1603 but which were otherwise legally separate. By 1707 the situation was becoming increasingly difficult, and union seemed to solve both sides’ fears that they were dangerously exposed to exploitation by the other. England and Scotland had been at each other’s throats since a time before the nations of ‘England’ and ‘Scotland’ even formally existed. The Acts of Union did not bring that to an end right away, but ultimately these ancient enemies became one of the most enduring political unions that has ever existed. That isn’t to say it has gone entirely uncontested: in 2014 a referendum was held on Scottish independence, in which 55% of voters opted to remain in the union. We should also recall 1807, when the Slave Trade Act was introduced. In fact Britain had played a pivotal role in the international slave trade, even though slavery had been effectively outlawed in England itself since 1102, but with the establishment of British colonies overseas, slaves were used as agricultural labour across the British Empire. It has been estimated that British ships carried more than three million slaves from Africa to the Americas, second only to the five million transported by the Portuguese. The Quakers, or the Religious Society of Friends to give them their proper name, were a nonviolent, pacifist religious movement founded in the mid-17th century who were opposed to slavery from the start of their movement. They pioneered the Abolitionist movement, despite being a marginalised group in their own right; as non-Anglicans, they were not permitted to stand for Parliament. They founded a group to bring non-Quakers on board so as to have greater political influence, as well as working to raise public awareness of the horrors of the slave trade through the publication of books and pamphlets. The effect of the Slave Trade Act, once passed, was rapid. The Royal Navy, which was the leading power at sea at the time, patrolled the coast of West Africa and between 1808 and 1860 freed 150,000 captured slaves. Then, in 1833, slavery was finally banned throughout the British Empire. In the first few decades of the Industrial Revolution, conditions in British factories were frequently abysmal. Working days of 15 hours were usual, and this included weekends. Apprentices were not supposed to work for more than 12 hours a day, and factory owners were not supposed to employ children under the age of 9, but a parent’s word was considered sufficient to prove a child’s age and even these paltry rules were seldom enforced. Yet the wages that factories offered were still so much better than those available in agricultural labour that there was no shortage of workers willing to put up with these miserable conditions, at least until they had earned enough money to seek out an alternative.
It was a similar social movement to the one that had brought an end to slavery that fought child labour in factories, and it was also believed that reducing working hours for children would have a knock-on effect, leading to reduced working hours for adults as well. The Factory Act of 1833, among a host of changes, banned children under 9 from working in textile mills, banned children under 18 from working at night, and required children between 9 and 13 to hold a schoolmaster’s certificate showing they had received two hours’ education per day before they were permitted to work. So the Factory Act not only improved factory conditions, but also began to pave the way towards education for all. Then, just two years later, a law against cruelty to animals followed. Until 1835, there had been no laws in Britain to prevent cruelty to animals, except one in 1822, which exclusively concerned cattle. Animals were property, and could be treated in whatever way the property-owner wished. It was actually back in 1824 that a group of reformers founded the Society for the Prevention of Cruelty to Animals, which we know today as the RSPCA. Several of those reformers had also been involved in the abolition of the slave trade, such as the MP William Wilberforce. Their initial focus was on working animals such as pit ponies, which worked in mines, but that soon expanded. The Cruelty to Animals Act of 1835, for which the charity’s members lobbied, outlawed bear-baiting and cockfighting, as well as paving the way for further legislation on matters such as the creation of veterinary hospitals and improvements in how animals were transported.

The RSPCA began by championing the rights of the humble pit pony.

Prior to the Married Women’s Property Act 1870, when a woman married a man, she ceased to exist as a separate legal being. All of her property prior to marriage, whether accumulated through wages, inheritance, gifts or anything else, became his, and any property she came to possess during marriage was entirely under his control, not hers. There were a handful of exceptions, such as money held in trust, but this option was out of reach of all but the very wealthy. Given the difficulty of seeking a divorce at this time, this effectively meant that a man could do whatever he wished with his wife’s money, including leaving her destitute, and she would have very little legal recourse. But the Act changed this. It gave a woman the right to control money she earned whilst married, as well as keeping inherited property, and made both spouses liable to maintain their children from their separate property, something that was important in relation to custody rights on divorce. The Act was not retrospective, so women who had married and whose property had come into the ownership of their husbands were not given it back, which limited its immediate effect. But ultimately, it was a key stage on the long road to equality between men and women in Britain. It is clear that 1870 was a big year in British politics so far as education was concerned. This is because before then, the government had provided some funding for schools but this was piecemeal and there were plenty of areas where there were simply no school places to be found. This was complicated by the fact that many schools were run by religious denominations, as there was conflict over whether the government should fund schools run by particular religious groups. As has been seen, under the Factory Act 1833 there were some requirements that children should be educated, but these were frequently ignored. Previously, industrialists had seen education as undesirable, at least when focusing on their ‘bottom line’, as hours when children were in education represented hours when they were not able to work. There were some factory jobs that only children could perform, for instance because of their size. But as automation advanced, it increasingly became the case that a lack of educated workers was holding back industrial production, so industrialists became a driving force in pushing through comprehensive education. The Education Act of 1870 didn’t provide free education for all, but it did ensure that schools would be built and funded wherever they were needed, so that no child would miss out on an education simply because they didn’t live near a school. We take it for granted now, but free education for all was not achieved until 1944. Passed at the end of the First World War, the Representation of the People Act 1918 is chiefly remembered as the act that gave women the right to vote, but in fact it went further than that. Only 60% of men in Britain had the right to vote prior to 1918, as voting rights were restricted to men who owned a certain amount of property. Elections had been postponed until the end of the First World War and now, in an atmosphere of revolution, Britain was facing millions of soldiers who had fought for their country returning home and being unable to vote. This was clearly unacceptable. As a result, the law was changed so that men aged over 21, or men who had turned 19 whilst fighting in the First World War, were given the vote.
But it was also evident that women had contributed hugely to the war effort, and so they too were given the vote under restricted circumstances. The vote was granted to women over 30 who owned property, were graduates voting in a university constituency or who were either a member or married to a member of the Local Government Register. The belief was that this set of limitations would mean that mostly married women would be voting, and therefore that they would mostly vote the same way as their husbands, so it wouldn’t make too much difference. Women were only granted equal suffrage with men in 1928. Then in 1946 came the National Health Service Act. I personally think that we should be proud of our free health service, especially after I learned what residents of some other countries have to do in order to obtain medical care. In 1942, economist William Beveridge had published a report on how to defeat the five great evils of society, these being squalor, ignorance, want, idleness, and disease. Ignorance, for instance, was to be defeated through the 1944 Education Act, which made education free for all children up to the age of 15. But arguably the most revolutionary outcome of the Beveridge Report was his recommendation to defeat disease through the creation of the National Health Service. This was the principle that healthcare should be free at the point of service, paid for by a system of National Insurance so that everyone paid according to what they could afford. One of the principles behind this was that if healthcare were free, people would take better care of their health, thereby improving the health of the country overall. Or to put it another way, someone with an infectious disease would get it treated for free and then get back to work, rather than hoping it would go away, infecting others and leading to lots of lost working hours. It is an idea that was, and remains, hugely popular with the public.

Education.

As I said last week, there had been several new laws associated with the gradual closure of workhouses, and by the beginning of the 20th century some infirmaries were even able to operate as private hospitals. A Royal Commission of 1905 reported that workhouses were unsuited to deal with the different categories of resident they had traditionally housed, and it was recommended that specialised institutions for each class of pauper should be established in which they could be treated appropriately by properly trained staff. The ‘deterrent’ workhouses were in future to be reserved for those considered as incorrigibles, such as drunkards, idlers and tramps. In Britain during the early 1900s, average life span was considered to be about 47 for a man and 50 for a woman. By the end of the century, it was about 75 and 80. Life was also greatly improved by new inventions. In fact, even during the depression of the 1930s things improved for most people who had a job. Of course there had also been the First World War, in which so many people lost their lives. So far as the United Kingdom and the Colonies are concerned, during that war there were about 888,000 military deaths (from all causes) and about 17,000 civilian deaths due to military action and crimes against humanity. There were also around 1,675,000 military wounded. Then during the Second World War, again just in the United Kingdom (including Crown Colonies), there were almost 384,000 military deaths (from all causes) and some 67,200 civilian deaths due to military action and crimes against humanity, as well as almost 376,000 military wounded. On 24 January 1918 it was reported in the Daily Telegraph that the Local Government Committee on the Poor Law had presented to the Ministry of Reconstruction a report recommending abolition of the workhouses and the transfer of their duties to other organisations. That same year, free primary education for all children was provided in the UK. Then a few years later the Local Government Act of 1929 gave local authorities the power to take over workhouse infirmaries as municipal hospitals, although outside London few did so. The workhouse system was officially abolished in the UK by the same Act on 1 April 1930, but many workhouses, renamed Public Assistance Institutions, continued under the control of local county councils. At the outbreak of the Second World War in 1939 almost 100,000 people were accommodated in the former workhouses, 5,629 of whom were children. Then the 1948 National Assistance Act abolished the last vestiges of the Poor Law, and with it the workhouses. Many of the buildings were converted into retirement homes run by the local authorities, so by 1960 slightly more than half of local authority accommodation for the elderly was provided in former workhouses. Under the Local Government Act 1929, the boards of guardians, who had been the authorities for poor relief since the Poor Law Amendment Act 1834, were abolished and their powers transferred to county and county borough councils. The basic responsibilities of the statutory public assistance committees set up under the Act included the provision of both ‘indoor’ and ‘outdoor’ relief.
Those unable to work on account of age or infirmity were housed in Public Assistance (formerly Poor Law) Institutions and provided with the necessary medical attention, the committee being empowered to act, in respect of the sick poor, under the terms of the Mental Deficiency Acts 1913-27, the Maternity and Child Welfare Act 1918 and the Blind Persons Act 1920, in a separate capacity from other county council committees set up under those Acts. Outdoor relief for the able-bodied unemployed took the form of ‘transitional payments’ by the Treasury, which were not conditional on previous national insurance contributions, but subject to assessment of need by the Public Assistance Committee. The Unemployment Act 1934 transferred the responsibility for ‘transitional payments’ to a national Unemployment Assistance Board (re-named ‘Assistance Board’ when its scope was widened under the Old Age Pensions and Widows Pensions Act, 1940). Payment was still dependent on a ‘means test’ conducted by visiting government officials and, at the request of the government, East Sussex County Council, in common with other rural counties, agreed that officers of its Public Assistance Department should act in this capacity for the administrative county, excepting the Borough of Hove, for a period of eighteen months after the Act came into effect. Other duties of the Public Assistance Committee included the apprenticing and boarding-out of children under its care, arranging for the emigration of suitable persons, and maintaining a register of all persons in receipt of relief. Under the National Health Service Act 1946, Public Assistance hospitals were then transferred to the new regional hospital boards, and certain personal health services to the new Health Committee. The National Insurance Act 1946 introduced a new system of contributory unemployment insurance, national health insurance and contributory pension schemes, under the control of the Ministry of Pensions and National Insurance. Payment of ‘supplementary benefits’ to those not adequately covered by the National Insurance Scheme was made the responsibility of the National Assistance Board under the National Assistance Act 1948 and thus the old Poor Law concept of relief was finally superseded. Under the same Act, responsibility for the residential care of the aged and infirm was laid upon a new statutory committee of the county council, the Welfare Services Committee and the Public Assistance Committee was dissolved. Our world is constantly changing!

This week, an amusing image for a change…

Invisible tape.


Workhouses

Whilst researching for this blog post, I learned that a man I was once at school with had written about workhouses in the town I grew up in, so I have included his findings, with thanks. From Tudor times until the Poor Law Amendment Act of 1834, care of the poor was in fact the concern of individual parishes. Over in Whittlesey there are records of meetings of the charity governors between 1737 and 1825, and it is known that a workhouse was in existence in the old Tavern Street (later Broad Street) in 1804, this building being virtually a hospital for the aged of the town. Before the inception of the Whittlesey Union, the parishes of Whittlesey levied a rate and doled it out as outdoor relief to people in their own homes, but by 1832 there was quite high unemployment among farm workers, especially in the winter, so the rate levied in Whittlesey was very high. At that time the workhouse housed thirty people, mainly the old and orphans, but sometimes able-bodied men were taken in during the winter. Then the 1834 Poor Law Act was passed in order to build more workhouses and to make it more difficult for the poor to obtain cash handouts, so a new building was started on what was at that time Bassenhally field. The new workhouse had accommodation for sixty inmates and was also a lodging house for vagrants who wandered from one workhouse to another. The inmates received three meat dinners a week and the children received no education. Then in 1851 the workhouse was extended to accommodate one hundred and fifty inmates, and in 1874 a further extension was added at a cost of £8,000. This workhouse, also now known as ‘the spike’ because of its clock tower, housed over two hundred people. Whilst they were staying there, men were employed on a farm or in sack making. Outdoor relief was still available to some people, but able-bodied men had to enter the workhouse with their families in order to obtain relief. Men, women and children were segregated, although parents had access to their children for one hour per day, whilst single unemployed women were forced into the workhouse to obtain relief. Some people stayed in the workhouse for the rest of their lives, and indeed the copy of the workhouse register in the local museum shows that in the majority of cases the reason people left was death. On Sunday mornings, inmates attended the local St Mary’s church. Husbands and wives were allowed to meet on Sundays but were segregated in the church, the women sitting in front of the pulpit and the men along the wall on the other side of the north aisle. In the 1920s the main function of the workhouse seems to have been the care of the sick, the infirm and elderly, women with young children, and orphans. Local people were cared for in the main building, but overnight accommodation was also provided in a separate building for tramps and vagrants, who were expected to work, chopping wood or picking oakum, the chopped wood being sold to the townsfolk. Then in 1930 the board of guardians was disbanded. At the end of the 1930s the building was used by Coates school whilst its own premises were undergoing repairs, and shortly afterwards the workhouse building was demolished. The need for poor law institutions disappeared with the introduction of the National Assistance Act in 1948, which founded the National Assistance Board, responsible for public assistance. The Board provided means-tested supplements for those not adequately covered by national insurance.
Then in the early 1950s, the Sir Harry Smith school was built on the site. As a result, my old secondary school is on the site of what was at one time a workhouse where children were not taught!

Whittlesey Workhouse cellar, unearthed beneath the car park of Sir Harry Smith School during renovation work in 2011.

Following the Black Death, a devastating pandemic that killed about one-third of England’s population between 1346 and 1352, the Statute of Cambridge in 1388 was an attempt to address the labour shortage. This new law fixed wages and restricted the movement of labourers, as it was anticipated that if they were allowed to leave their parishes for higher-paid work elsewhere then wages would inevitably rise. According to a historian, the fear of social disorder following the plague ultimately resulted in the state, and not ‘personal Christian charity’, becoming responsible for the support of the poor. The resulting laws against vagrancy were the origins of state-funded relief for the poor. Then from the 16th century onwards a distinction was legally enshrined between those who were willing to work but could not, and those who were able to work but would not, between the genuinely unemployed and the idler. Supporting the destitute was a problem exacerbated by King Henry VIII’s Dissolution of the Monasteries, which began in 1536. They had been a significant source of charitable relief and provided a good deal of direct and indirect employment. The Poor Relief Act of 1576 went on to establish the principle that if the able-bodied poor needed support, they had to work for it. Then the Act for the Relief of the Poor in 1601 made parishes legally responsible for the care of those within their boundaries who, through either age or infirmity, were unable to work. The Act essentially classified the poor into one of three groups. It proposed that the able-bodied be offered work in a ‘house of correction’, the precursor of the workhouse, where the ‘persistent idler’ was to be punished. It also proposed the construction of housing for the impotent poor, the old and the infirm, although most assistance was granted through a form of poor relief known as ‘outdoor relief’. This was in the form of money, food, or other necessities given to those living in their own homes, funded by a local tax on the property of the wealthiest in the parish. In Britain, a workhouse was a total institution where those unable to support themselves financially were offered accommodation and employment. In Scotland, they were usually known as poorhouses. The earliest known use of the term ‘workhouse’ is from 1631, in an account by the mayor of Abingdon, reporting that “we have erected with’n our borough a workhouse to set poorer people to work”. However, mass unemployment following the end of the Napoleonic Wars in 1815, the introduction of new technology to replace agricultural workers in particular, and a series of bad harvests meant that by the early 1830s the established system of poor relief was proving to be unsustainable. The New Poor Law of 1834 attempted to reverse the economic trend by discouraging the provision of relief to anyone who refused to enter a workhouse, and some Poor Law authorities hoped to run workhouses at a profit by utilising the free labour of their inmates. Most were employed on tasks such as breaking stones and crushing bones to produce fertiliser. As the 19th century wore on, workhouses increasingly became refuges for the elderly, infirm, and sick rather than the able-bodied poor, and in 1929 legislation was passed to allow local authorities to take over workhouse infirmaries as municipal hospitals.
Although workhouses were formally abolished by the same legislation in 1930, many continued under their new appellation of Public Assistance Institutions under the control of local authorities. It was not until the introduction of the National Assistance Act of 1948 that the last vestiges of the Poor Law finally disappeared and with them the workhouses.

Poor House, Framlingham Castle.

This ‘Red House’ at Framlingham Castle in Suffolk was founded as a workhouse in 1664. The workhouse system evolved in the 17th century, allowing parishes to reduce the cost to ratepayers of providing poor relief. The first authoritative figure for numbers of workhouses comes in the next century from ‘The Abstract of Returns made by the Overseers of the Poor’, which was drawn up following a government survey in 1776. It put the number of parish workhouses in England and Wales at more than 1,800, or about one parish in seven, with a total capacity of more than 90,000 places. This growth in the number of workhouses was prompted by the Workhouse Test Act of 1723, which obliged anyone seeking poor relief to enter a workhouse and undertake a set amount of work, usually for no pay. This system was called indoor relief, and the Act helped prevent irresponsible claims on a parish’s poor rate. The growth was also bolstered by the Relief of the Poor Act in 1782, which was intended to allow parishes to share the cost of poor relief by joining together to form unions, known as Gilbert Unions, to build and maintain even larger workhouses to accommodate the elderly and infirm. The able-bodied poor were instead either given outdoor relief or found employment locally. Workhouses were established and mainly conducted with a view to deriving profit from the labour of the inmates, and not as the safest means of affording relief while at the same time testing the reality of their destitution. The workhouse was in truth at that time a kind of manufactory, carried on at the risk and cost of the poor-rate, employing the worst description of the people, and helping to pauperise the best. By 1832 the amount spent on poor relief nationally had risen to £7 million a year, more than ten shillings per head of population, up from £2 million in 1784, and the large number of those seeking assistance was pushing the system to the verge of collapse. The economic downturn following the end of the Napoleonic Wars in the early 19th century resulted in increasing numbers of unemployed, and this, coupled with developments in agriculture that meant less labour was needed on the land, three successive bad harvests beginning in 1828 and the ‘Swing Riots’ of 1830, made reform inevitable. In 1832 the government established a Royal Commission to investigate and recommend how relief could best be given to the poor. The result was the establishment of a centralised Poor Law Commission in England and Wales under the Poor Law Amendment Act in 1834, also known as the New Poor Law, which discouraged the allocation of outdoor relief to the able-bodied, with all cases offered ‘the house and nothing else’. Individual parishes were grouped into Poor Law Unions, each of which was to have a union workhouse. More than 500 of these were built during the following fifty years, two-thirds of them by 1840. In certain parts of the country there was a good deal of resistance to these new buildings, some of it violent, particularly in the industrial north. Many workers lost their jobs during the major economic depression of 1837, and there was a strong feeling that what the unemployed needed was not the workhouse but short-term relief to tide them over. By 1838, five hundred and seventy-three Poor Law Unions had been formed in England and Wales, incorporating 13,427 parishes, but it was not until 1868 that unions were established across the entire country.
Despite the intentions behind the 1834 Act, relief of the poor remained the responsibility of local taxpayers, and there was thus a powerful economic incentive to use loopholes such as sickness in the family to continue with outdoor relief, as the weekly cost per person was about half that of providing workhouse accommodation. Outdoor relief was further restricted by the terms of the 1844 Outdoor Relief Prohibitory Order, which aimed to end it altogether for the able-bodied poor. As a result, in 1846, of 1.33 million paupers only 199,000 were maintained in workhouses, of whom 82,000 were considered to be able-bodied, leaving an estimated 375,000 of the able-bodied on outdoor relief. Excluding periods of extreme economic distress, it has been estimated that about 6.5% of the British population may have been accommodated in workhouses at any given time. After 1835, many workhouses were constructed with the central buildings surrounded by work and exercise yards enclosed behind brick walls, so-called “pauper bastilles”. The commission proposed that all new workhouses should allow for the segregation of paupers into at least four distinct groups, each to be housed separately: the aged and impotent, children, able-bodied males, and able-bodied females.

The Carlisle Union Workhouse, opened in 1864. It later became part of the University of Cumbria.

In 1836 the Poor Law Commission distributed six diets for workhouse inmates, one of which was to be chosen by each Poor Law Union depending on its local circumstances. Although dreary, the food was generally nutritionally adequate, and according to contemporary records it was prepared with great care. Issues such as training staff to serve and weigh portions were well understood. The diets included general guidance, as well as schedules for each class of inmate. They were laid out on a weekly rotation, with the various meals selected on a daily basis from a list of foodstuffs. For instance, a breakfast of bread and gruel was followed by dinner, which might consist of cooked meats, pickled pork or bacon with vegetables, potatoes, dumpling, soup, and suet or rice pudding. Supper was normally bread, cheese and broth, sometimes with butter or potatoes. The larger workhouses had separate dining rooms for males and females, but workhouses without separate dining rooms would stagger the meal times to avoid any contact between the sexes. Religion played an important part in workhouse life: prayers were read to the paupers before breakfast and after supper each day. Each Poor Law Union was required to appoint a chaplain to look after the spiritual needs of the workhouse inmates, and he was invariably expected to be from the established Church of England. Religious services were generally held in the dining hall, as few early workhouses had a separate chapel. But in some parts of the country there were more dissenters than members of the established church, and since section 19 of the 1834 Poor Law specifically forbade any regulation forcing an inmate to attend church services ‘in a Mode contrary to their Religious Principles’, the commissioners were reluctantly forced to allow non-Anglicans to leave the workhouse on Sundays to attend services elsewhere, so long as they were able to provide a certificate of attendance signed by the officiating minister on their return. As the 19th century wore on, non-conformist ministers increasingly began to conduct services within the workhouse, but Catholic priests were rarely welcomed. A variety of legislation had been introduced during the 17th century to limit the civil rights of Catholics, beginning with the Popish Recusants Act of 1605 in the wake of the failed Gunpowder Plot that year. Though almost all restrictions on Catholics in England and Ireland were removed by the Roman Catholic Relief Act of 1829, a great deal of anti-Catholic feeling remained. Even in areas with large Catholic populations the appointment of a Catholic chaplain was unthinkable. Some guardians went so far as to refuse Catholic priests entry to the workhouse. The education of children presented a dilemma. It was provided free in the workhouse, but had to be paid for by the ‘merely poor’. Instead of being ‘less eligible’, conditions for those living in the workhouse were in certain respects ‘more eligible’ than for those living in poverty outside. By the late 1840s, most workhouses outside London and the larger provincial towns housed only the incapable, elderly and sick. By the end of the century only about twenty per cent of those admitted to workhouses were unemployed or destitute, but about thirty per cent of the population over 70 were in workhouses.
Responsibility for administration of the poor passed to the Local Government Board in 1871 and the emphasis soon shifted from the workhouse as a receptacle for the helpless poor to its role in the care of the sick and helpless. The Diseases Prevention Act of 1883 allowed workhouse infirmaries to offer treatment to non-paupers as well as inmates. The introduction of pensions in 1908 for those aged over 70 did not reduce the number of elderly housed in workhouses, but it did reduce the number of those on outdoor relief by twenty-five per cent. By the beginning of the 20th century some infirmaries were even able to operate as private hospitals. A Royal Commission of 1905 reported that workhouses were unsuited to deal with the different categories of resident they had traditionally housed, and recommended that specialised institutions for each class of pauper should be established, in which they could be treated appropriately by properly trained staff. The ‘deterrent’ workhouses were in future to be reserved for those considered as incorrigibles, such as drunkards, idlers and tramps. On 24 January 1918 the Daily Telegraph reported that the Local Government Committee on the Poor Law had presented to the Ministry of Reconstruction a report recommending abolition of the workhouses and transferring their duties to other organisations. That same year, free primary education for all children was provided in the UK. Then the Local Government Act of 1929 gave local authorities the power to take over workhouse infirmaries as municipal hospitals, although outside London few did so. The workhouse system was officially abolished in the UK by the same Act on 1 April 1930, but many workhouses, renamed Public Assistance Institutions, continued under the control of local county councils. At the outbreak of the Second World War in 1939 almost 100,000 people were accommodated in the former workhouses, 5,629 of whom were children. Then the 1948 National Assistance Act abolished the last vestiges of the Poor Law, and with it the workhouses. Many of the buildings were converted into retirement homes run by the local authorities, so by 1960 slightly more than half of local authority accommodation for the elderly was provided in former workhouses. Camberwell workhouse in Peckham, South London continued until 1985 as a homeless shelter for more than 1,000 men, operated by the Department of Health and Social Security and renamed a resettlement centre. Southwell workhouse, also known as Greet House, in Southwell, Nottinghamshire is now a museum but was used to provide temporary accommodation for mothers and children until the early 1990s. How often we can pass by these buildings and not give a thought to their historical significance.

This week, a thought.
Life is a presentation of choices. Wherever you are now exactly represents the sum of your previous decisions, actions and inactions.


The Bright Side

Right now it is approaching the end of January and this week’s blog is slightly longer than usual. Last month we had the shortest day of the year in terms of daylight, which means quite a few people should begin to feel happier now that our sun rises at an earlier time each day! Of course, we become used to that and then the clocks in the UK go forward an hour. Moon phases reveal the passage of time in the night sky, and on some nights when we look up at the moon it is full and bright, whilst sometimes it is just a sliver of silvery light. These changes in appearance are the phases of the moon, and as the moon orbits Earth it cycles through eight distinct phases. The four primary phases of the moon occur about one week apart, with the full moon its most dazzling stage. For example, we had a New Moon on January 2, a First Quarter on January 9, a Full Moon on January 17 and a Last Quarter on January 25. Then we are back to a New Moon on February 1, which will signal the beginning of the Lunar New Year. This is also called Chinese New Year and will mark the ‘Year Of The Water Tiger’. A New Moon is when our satellite is between the Earth and the Sun, so it’s not visible to us. Technology has progressed so much now and more folk take such excellent photos of the moon and other celestial objects. In addition, we can share these with family, friends, almost anyone we wish, via the Internet technology so many of us can access. But not everyone has either the access to or even the wish to use things like Facebook, Messenger, WhatsApp, Zoom and so many others too numerous to mention. In my early days of taking photographs I used a very simple and straightforward camera, a Kodak Instamatic. Once used, I would take the film in to a local camera shop where the film would be developed, and a few days later I would return to that shop to collect my pictures. Sometimes I would be pleased with the results and other times not, but I could at times get some advice from a friendly shop assistant who was a photographer. I really did learn much in those early days and I am grateful even now for the help I received. I have written in previous blogs about the different cameras I have had, from the basic ‘point and click’ up to the modern Single Lens Reflex (SLR) ones, where a prism and mirror system is used to view an image exactly before it is stored electronically on a memory card. Images can now be modified and linked, with such things as their brightness and contrast adjusted, all at the click of a button. Videos are made quickly and easily using simple smart phones, even if they only last a few seconds. I have no doubt that in time, more ideas will provide what many will see as bigger as well as better. But we should surely not lose sight of the past, the basics, the simple ideas. There are many who will come up with new ideas, but they cannot be expected to create them at will. Likewise, those with new ideas give rise to further developments. As an example, I have previously written in an earlier blog post about Guy Fawkes and the Gunpowder Plot.

Gunpowder is the first explosive to have been developed, and it is popularly listed as one of the ‘Four Great Inventions’ of China, the others being the compass, paper making and printing. Gunpowder was invented during the late Tang Dynasty in the 9th century, whilst the earliest recorded chemical formula for gunpowder dates to the Song Dynasty of the 11th century. The knowledge of gunpowder spread rapidly throughout Asia, the Middle East and Europe, possibly as a result of Mongol conquests during the 13th century, with written formulas for it appearing in the Middle East between 1240 and 1280 in a treatise by Hasan al-Rammah and in Europe by 1267 in the ‘Opus Majus’ by Roger Bacon. It was employed in warfare to some effect from at least the 10th century in weapons such as fire arrows, bombs and the fire lance, before the appearance of the gun in the 13th century. In fact, whilst the fire lance was eventually supplanted by the gun, other gunpowder weapons such as rockets and fire arrows continued to see use in China, Korea, India, and eventually Europe. Gunpowder has also been used for non-military purposes such as fireworks for entertainment, as well as in explosives for mining and tunnelling. The evolution of guns then led to the development of large artillery pieces, popularly known as bombards, during the 15th century, pioneered by states such as the Duchy of Burgundy. Firearms came to dominate early modern warfare in Europe by the 17th century, and the gradual improvement of cannons firing heavier rounds for a greater impact against fortifications led to the invention of the star fort and the bastion in the Western world, where traditional city walls and castles were no longer suitable for defence. The use of gunpowder technology also spread throughout the Islamic world as well as to India, Korea and Japan. The use of gunpowder in warfare diminished during the course of the 19th century due to the invention of smokeless powder, and as a result gunpowder is often referred to nowadays as ‘black powder’ to distinguish it from the propellant used in contemporary firearms.

A Chinese fire arrow utilising a bag of gunpowder as incendiary,
c. 1390.

The earliest reference to gunpowder seems to have appeared in 142AD during the Eastern Han dynasty. An alchemist by the name of Wei Boyang, said to be the ‘father of alchemy’, wrote about a substance with gunpowder-like properties in the ‘Book of the Kinship of Three’, a Taoist text on the subject of alchemy, describing a mixture of three powders that would “fly and dance” violently. However, Wei Boyang is considered to be a semi-legendary figure meant to represent a ‘collective unity’, and the text was probably written in stages from the Han dynasty to 450AD. Although not specifically named, the powders were almost certainly the ingredients of gunpowder, and no other explosive known to scientists is composed of such powders. Whilst it was almost certainly not their intention to create a weapon of war, Taoist alchemists continued to play a major role in the development of gunpowder due to their experiments with sulphur and saltpetre, although one historian has considered that despite the early association of gunpowder with Taoism, this may be a quirk of historiography and a result of the better preservation of texts associated with Taoism, rather than being a subject limited to only Taoists. Their quest for the elixir of life certainly attracted many powerful patrons, one of whom was Emperor Wu of Han. The next reference to gunpowder occurred in the year 300AD during the Jin dynasty, when a Taoist philosopher wrote down all of the ingredients of gunpowder in his surviving works, collectively known as the ‘Baopuzi’. In 492AD, some Taoist alchemists noted that saltpetre, one of the most important ingredients in gunpowder, burns with a purple flame, allowing for practical efforts at purifying the substance, and during the Tang dynasty, alchemists used saltpetre in processing the four yellow drugs, namely sulphur, realgar, orpiment and arsenic trisulphide. A Taoist text warned against an assortment of dangerous formulas, one of which corresponds with gunpowder; in fact alchemists called this discovery ‘fire medicine’, and the term has continued to refer to gunpowder in China into the present day, a reminder of its heritage as a side result in the search for longevity-increasing drugs. A book published in 1185AD called ‘Gui Dong’, The Control of Spirits, also contains a story about a Tang dynasty alchemist whose furnace exploded, but it is not known if this was caused by gunpowder. The earliest surviving chemical formula of gunpowder dates to 1044AD in the form of the military manual known in English as the ‘Complete Essentials for the Military Classics’, which contains a collection of facts on Chinese weaponry. However this edition has since been lost, and the only currently extant copy is dated to 1510AD during the Ming dynasty. Gunpowder technology also spread to naval warfare, and in 1129AD it was decreed that all warships were to be fitted with trebuchets for hurling gunpowder bombs.

By definition, a gun uses the explosive force of gunpowder to propel a projectile from a tube, so cannons, muskets and pistols are typical examples. In 1259AD a type of fire-emitting lance was made from a large bamboo tube, with a pellet wad stuffed inside it. Once the fire went off, it completely spewed the rear pellet wad forward, and it was said that “the sound is like a bomb that can be heard for five hundred or more paces”. The pellet wad mentioned is possibly the first true bullet in recorded history. Fire lances transformed from the bamboo, wood or paper-barrelled firearm to the metal-barrelled firearm in order to better withstand the explosive pressure of gunpowder. From there they branched off into several different gunpowder weapons known as ‘eruptors’ in the late 12th and early 13th centuries. The oldest extant gun whose dating is unequivocal is the Xanadu Gun, because it contains an inscription describing its date of manufacture, corresponding to 1298AD. It is so called because it was discovered in the ruins of Xanadu, the Mongol summer palace in Inner Mongolia. The design of the gun includes axial holes in its rear, which some speculate could have been used in a mounting mechanism. Another specimen, the Wuwei Bronze Cannon, was discovered in 1980 and may possibly be the oldest as well as the largest cannon of the 13th century, though a similar but much smaller weapon was discovered in 1997. So it seems likely that the gun was born sometime during the 13th century. Gunpowder may have been used during the Mongol invasions of Europe, and shortly after the Mongol invasions of Japan, which took place from 1274AD to 1281AD, the Japanese produced a scroll painting depicting a bomb, speculated to have been the Chinese thunder crash bomb. Japanese descriptions of the invasions also talk of iron and bamboo ‘pao’ causing light and fire and emitting 2,000 to 3,000 iron bullets.

A Swiss soldier firing a hand cannon, late 14th to 15th centuries.
Illustration produced in 1874.

A common theory of how gunpowder came to Europe is that it made its way along the Silk Road, through the Middle East. Another is that it was brought to Europe during the Mongol invasion in the first half of the 13th century. Some sources claim that Chinese firearms and gunpowder weapons may have been deployed by Mongols against European forces at the Battle of Mohi in 1241AD; it may also have arrived through subsequent diplomatic and military contacts. Some authors have speculated that William of Rubruck, who served as an ambassador to the Mongols from 1253AD to 1255AD, was a possible intermediary in the transmission of gunpowder. The 1320s seem to have been the takeoff point for guns in Europe according to most modern military historians. Scholars suggest that the lack of gunpowder weapons in a well-travelled Venetian’s catalogue for a new crusade in 1321AD implies that guns were unknown in Europe up until this point, but guns then spread rapidly across Europe. A French raiding party that sacked and burned Southampton in 1338AD brought with them a ribaudequin, a late medieval volley gun with many small-calibre iron barrels set up in parallel on a platform. It was in use during the 14th and 15th centuries, and when the gun was fired in a volley it created a shower of iron shot. But the French brought only 3 pounds of gunpowder. Around the late 14th century, European and Ottoman guns began to deviate in purpose and design from guns in China, changing from small anti-personnel and incendiary devices to the larger artillery pieces most people imagine today when using the word “cannon”. If the 1320s can be considered the arrival of the gun on the European scene, then the end of the 14th century may very well be the departure point from the trajectory of gun development in China. In the last quarter of the 14th century, European guns grew larger and began to blast down fortifications.

In India, gunpowder technology is believed to have arrived by the mid-14th century, but could have been introduced much earlier by the Mongols, who had conquered both China and some borderlands of India, perhaps as early as the mid-13th century. The unification of a large single Mongol Empire resulted in the free transmission of Chinese technology into Mongol conquered parts of India. Regardless, it is believed that the Mongols used Chinese gunpowder weapons during their invasions of India. The first gunpowder device, as opposed to naphtha-based pyrotechnics, introduced to India from China in the second half of the 13th century, was a rocket called the ‘hawai’. The rocket was used as an instrument of war from the second half of the 14th century onward, and the Delhi sultanate as well as Bahmani kingdom made good use of them.

‘Mons Meg’, a medieval Bombard weapon built in 1449.
Located in Edinburgh Castle.

As a response to gunpowder artillery, European fortifications began displaying architectural principles such as lower and thicker walls in the mid-1400s. Cannon towers were built with artillery rooms where cannons could discharge fire from slits in the walls. However this proved problematic, as the slow rate of fire, reverberating concussions and noxious fumes produced greatly hindered defenders. Gun towers also limited the size and number of cannon placements, because the rooms could only be built so big. The star fort, also known as the bastion fort, was a style of fortification that became popular in Europe during the 16th century. These were developed in Italy and became widespread in Europe. The main distinguishing features of the star fort were its angle bastions, each placed to support their neighbour with lethal crossfire, covering all angles and making them extremely difficult to engage with and attack. By the 1530s the bastion fort had become the dominant defensive structure in Italy. Outside Europe, the star fort became an ‘engine of European expansion’ and acted as a force multiplier so that small European garrisons could hold out against numerically superior forces. Wherever star forts were erected, the natives experienced great difficulty in uprooting European invaders. In China, the construction of bastion forts was advocated so that their cannons could better support each other. Gun development and design in Europe reached its most classic form in the 1480s, with guns that were longer, lighter, more efficient and more accurate compared to predecessors only three decades prior, and the design persisted. The two primary theories for the appearance of the classic gun involve the development of gunpowder corning and a new method for casting guns. The ‘corning’ hypothesis stipulates that the longer barrels came about as a reaction to the development of corned gunpowder. Not only did corned powder keep better, because of its reduced surface area, but gunners also found that it was more powerful and easier to load into guns. Prior to corning, gunpowder would also frequently de-mix into its constitutive components and was therefore unreliable. The faster gunpowder reaction was suitable for smaller guns, since large ones had a tendency to crack, and the more controlled reaction allowed large guns to have longer, thinner walls. In India, guns made of bronze have been recovered from Calicut (1504AD) and Diu (1533AD). By the 17th century a diverse variety of firearms were being manufactured in India, large guns in particular. Gujarat supplied Europe with saltpetre for use in gunpowder warfare during the 17th century, and the Dutch, French, Portuguese and English used Chāpra as a centre of saltpetre refining. Aside from warfare, gunpowder was used for hydraulic engineering in China by 1541. Gunpowder blasting followed by dredging of the detritus was a technique which Chen Mu employed to improve the Grand Canal at the waterway where it crossed the Yellow River. In Europe, it was utilised in the construction of the ‘Canal du Midi’ in Southern France, which was completed in 1681 and linked the Mediterranean Sea with the Atlantic via 240km of canal and 100 locks. But before gunpowder was applied to civil engineering, there were two ways to break up large rocks: by hard labour, or by heating with large fires followed by rapid quenching. The earliest record for the use of gunpowder in mines comes from Hungary in 1627AD.
It was introduced to Britain in 1638AD by German miners, after which time records are numerous, but until the invention of the safety fuse in 1831 the practice was extremely dangerous. Further dangers were the dense fumes given off and the risk of igniting flammable gas when used in coal mines. Gunpowder was also extensively used in railway construction. At first railways followed the contours of the land, or crossed low ground by means of bridges and viaducts, but later railways made extensive use of cuttings and tunnels. One 2400-ft stretch of the 1.8 mile Box Tunnel on the Great Western Railway line between London and Bristol consumed a ton of gunpowder per week for over two years. Then there is the Fréjus Rail Tunnel, also called the Mont Cenis Tunnel, a rail tunnel some 8.5 miles (13.7 kilometres) in length in the European Alps, carrying the Turin-Modane railway through Mont Cenis to an end-on connection with the Culoz-Modane railway and linking Bardonecchia in Italy to Modane in France. The tunnel was completed in 13 years, starting in 1857AD, but even with black powder progress was only 25 centimetres a day until the invention of pneumatic drills, which speeded up the work. However, the latter half of the 19th century saw the invention of nitroglycerin, along with nitrocellulose and smokeless powders, which soon replaced traditional gunpowder in most civil and military applications. Believe it or not, there is so much more to tell on this subject and as you can see, we have learned and developed so much over the centuries. I am sure we will continue to do so. As always, I hope that it will be for the good, for the benefit of all.

This week…
A great deal has been written about marriage. I once saw the following quote: “Marriage is an institution – but not everyone wants to live in an institution”. Another is “Marriage can be like a deck of cards. At the beginning, all you need are two hearts and a diamond, but in the end you wish you had a club and a spade”…

Click: Return to top of page or Index page

Accepting Change

As we go through life, change is all around us, day by day. I have written about our universe, our sun and the planets, including this Earth which has changed over millions of years. Though it is only a relatively short space of time that as humans we have been recording these developments, with the technology we have developed and the skills we have learned we can look back and see what has occurred. But these changes are continuing and we are having a marked effect on them. This Earth still spins, seasons change, life ends and new life begins. In many species, life develops further as it adapts to the changes around it. But many species are no longer with us. I was watching an item on YouTube about changes being made to the Catthorpe Interchange on the M1 motorway and how newts had been discovered there. As a result, changes were made to the area where they were in order to preserve it whilst they grew and eventually moved. A newt is a form of salamander; salamanders are a group of amphibians typically characterised by their lizard-like appearance, with slender bodies, blunt snouts, short limbs projecting at right angles to the body, and the presence of a tail in both larvae and adults. Their diversity is highest in the Northern Hemisphere. They rarely have more than four toes on their front legs and five on their rear legs, but some species have fewer digits and others lack hind limbs. Their permeable skin usually makes them reliant on habitats in or near water or other cool, damp places. Some species are aquatic throughout their lives, some take to the water intermittently, and others are entirely terrestrial as adults. They are capable of regenerating lost limbs as well as other damaged parts of their bodies and researchers hope to reverse engineer this remarkable regenerative process for potential human medical applications, such as brain and spinal cord injury treatment or preventing harmful scarring during heart surgery recovery. The skin of some species contains the powerful poison tetrodotoxin and as a result these salamanders tend to be slow-moving and have a bright warning colouration in order to advertise their toxicity. Salamanders typically lay eggs in water and have aquatic larvae, but great variation occurs in their lifecycles. Newts metamorphose through three distinct developmental life stages: aquatic larva, terrestrial juvenile (eft), and adult. Adult newts have lizard-like bodies and return to the water every year to breed, otherwise living in humid, cover-rich land habitats. They are therefore semiaquatic, alternating between aquatic and terrestrial habitats. Not all aquatic salamanders are considered newts, however. More than 100 known species of newts are found in North America, Europe, North Africa and Asia. Newts are threatened by habitat loss, fragmentation and pollution. Several species are endangered, and at least one species, the Yunnan lake newt, has recently become extinct as it was only found in the shallow lake waters and adjacent freshwater habitats near the Kunming Lake in Yunnan, China. The Old English name of the animal was ‘efte’ or ‘efeta’ (of unknown origin), resulting in the Middle English ‘eft’. This word was transformed irregularly into ‘euft’, ‘evete’ or ‘ewt(e)’. The initial “n” was added from the indefinite article “an” by the early 15th century. The form “newt” appears to have arisen as a dialectal variant of ‘eft’ in Staffordshire, but entered Standard English by the Early Modern period where it was used by Shakespeare in ‘Macbeth’. 
The regular form ‘eft’, now only used for newly metamorphosed specimens, survived alongside ‘newt’, especially in composition, the larva being called “water-eft” and the mature form “land-eft” well into the 18th century, but the simplex ‘eft’ as equivalent to “water-eft” has been in use since at least the 17th century. Dialectal English and Scots also have the word ‘ask’ (also ‘awsk’ and ‘esk’ in Scots), used for both newts and wall lizards, which comes from Old English and ultimately from a Proto-Germanic word meaning literally ‘lizard-badger’ or ‘distaff-like lizard’. Latin had the name ‘stellio’ for a type of spotted newt, and Ancient Greek had the name κορδύλος, presumably for the water newt (immature newt, or eft). German has ‘Molch’, from Middle High German. Newts are also known as ‘Tritones’, named for the mythological Triton in historical literature, and ‘triton’ remains in use as a common name in some Romance languages as well as in Greek, Romanian, Russian and Bulgarian.

The Pyrenean brook newt lives in small streams in the mountains.

Newts are found in North America, Europe, North Africa and Asia. The Pacific newts and the Eastern newts are amongst the seven representative species in North America, whilst most diversity is found in the Old World. In Europe and the Middle East, eight genera with roughly 30 species are found, with the ribbed newts extending to northernmost Africa. Eastern Asia, from Eastern India over Indochina to Japan, is home to five genera with more than 40 species. As I have mentioned, newts are semiaquatic, spending part of the year in the water for reproduction and the rest of the year on land. Whilst most species prefer areas of stagnant water such as ponds, ditches or flooded meadows for reproduction, some species such as the Danube Crested newt can also occur in slow-flowing rivers. In fact the European brook newts and European mountain newts have even adapted to life in cold, oxygen-rich mountain streams. During their terrestrial phase, newts live in humid habitats with abundant cover such as logs, rocks, or earth holes. Newts share many of the characteristics of their salamander kin, including semipermeable glandular skin, four equal-sized limbs, and a distinct tail. However, the skin of the newt is not as smooth as that of other salamanders. The cells at the site of an injury have the ability to un-differentiate, reproduce rapidly and differentiate again to create a new limb or organ. One hypothesis is that the un-differentiated cells are related to tumour cells, since chemicals that produce tumours in other animals will produce additional limbs in newts. In terms of development, the main breeding season for newts in the Northern Hemisphere is in June and July. After courtship rituals of varying complexity, which take place in ponds or slow-moving streams, the male newt transfers a spermatophore, which is taken up by the female. Fertilised eggs are laid singly and are usually attached to aquatic plants. This distinguishes them from the free-floating eggs of frogs or toads, which are laid in clumps or in strings. Plant leaves are usually folded over and attached to the eggs to protect them. The larvae, which resemble fish fry but are distinguished by their feathery external gills, hatch out in about three weeks. After hatching, they eat algae, small invertebrates, or other amphibian larvae. During the subsequent few months, the larvae undergo metamorphosis, during which they develop legs, whilst the gills are absorbed and replaced by air-breathing lungs. At this time some species, such as the North American newts, also become more brightly coloured. Once fully metamorphosed, they leave the water and live a terrestrial life, when they are known as ‘efts’. Only when the eft reaches adulthood will the North American species return to live in water, rarely venturing back onto the land. Conversely, most European species live their adult lives on land and only visit water to breed.

The Pacific newt is known for its toxicity.

Many newts produce toxins in their skin secretions as a defence mechanism against predators. ‘Taricha’ newts of western North America are particularly toxic and the rough-skinned newt of the Pacific Northwest actually produces more than enough tetrodotoxin to kill an adult human. In fact some Native Americans of the Pacific Northwest used the toxin to poison their enemies! However, the toxins are only dangerous if they are ingested or otherwise enter the body, for example through a wound. Newts can safely live in the same ponds or streams as frogs and other amphibians and most newts can be safely handled, provided the toxins they produce are not ingested or allowed to come in contact with mucous membranes or breaks in the skin. I have also learned that newts, as with salamanders in general and other amphibians, serve as bioindicators: because of their thin, sensitive skin, evidence of their presence (or absence) can serve as an indicator of the health of the environment. Most species are highly sensitive to subtle changes in the pH level of the streams and lakes where they live. Because their skin is permeable to water, they absorb oxygen and other substances they need through their skin. This is why scientists carefully study the stability of the amphibian population when studying the water quality of a particular body of water.

But of course that is just one example of changes on this Earth and this to me is why it is so very important to be aware of change. I know that change occurs all the time, change is healthy in so many ways. But so often people make changes in a very selfish way, with no thought as to what impact it may have, whether it be on the people around us, on the plants and animals, even to Earth itself. It has been said that many years ago an animal was left on an island, albeit by accident perhaps, but that single animal then preyed on a species local to the island and wiped the species out completely. But the animal could not have been brought to that island without human intervention. We have very strict controls on our borders, as most if not all countries do, and yet there are those who flout the rules, not thinking that the rules should apply to them. A while ago I learned of Bird’s Nest soup, called the ‘Caviar of the East’, but rather than being made from twigs and bits of moss, it is made from the hardened saliva of swiftlets, dissolved in a broth. It is a Chinese delicacy, is high in minerals like calcium, magnesium and potassium and is extremely rare and valuable. However, because it is an animal product, it is subject to strict import restrictions, particularly with regard to H5N1 avian flu, which could cause an epidemic if brought into another country. But some people attempt to bring this item over from such places as China, hiding it in their luggage, even though they are warned not to. It is potentially dangerous to bring such items into another country because of the harm they can do. Nowadays we travel around the world far more easily, we can get on an aircraft and be on the other side of the world in a matter of hours, a trip that would at one time have taken us weeks. I was fortunate enough to have a superb holiday a few years ago which took me around the world to Australia and New Zealand, then up to the United States of America, with a number of superb stopping-off places in between. A few years before I had flown to the U.S.A. and wherever I went, the same strict border controls were in place. In fact, prior to my long cruise I had to have a few vaccinations, with proof that I had done so, and whilst boarding at Southampton a few passengers were not permitted to travel because they had not been vaccinated. As a result, they had to make their own way to our next port of call, which was Tenerife, once they had been vaccinated. Right now we are still in the midst of a pandemic, though there are those who have differing views on it, both in terms of its effect and its treatment. Only time will tell. As expected, it is having a marked effect on us, on our daily lives, the health and welfare of everyone. It is changing how we live, how we interact with family and friends and how we cope. Some I know are coping better than others. There is no doubt that we all have a collective responsibility to manage in these troubled times, to believe the people who are skilled in medicine and not be swayed by the people who only think selfishly of themselves. As I wrote in a blog post last year, some folk want the newest, the latest things, they treasure possessions whilst others consider money itself to be important. It does not matter what country they are from. 
There are those who say that money is the root of all evil, but they are in fact misquoting from the Bible, as the correct version is “For the love of money is the root of all evil: which while some coveted after, they have erred from the faith, and pierced themselves through with many sorrows”. ~ 1 Timothy 6:10. So it is not money itself that is the problem; it is the love of it, and what is done with it, that matters. We read so often how people seek both peace and contentment, and it is often those who lead a simpler life without many possessions who find them, as they have enough food and clothing for themselves and they do what they can to help others. They give thanks every day for all things in their lives, the good and the not so good. Such folk are content. But if those who have money would share it with those who have less, even if it were simply to increase a worker’s basic wage, it would make such a tremendous difference. That is a change which many would gladly accept.

This week I was reminded…
Of a large shop in Peterborough which, many years ago, clearly had a central cash office. As I recall, to pay for goods your money was handed over to an assistant who put it, along with an invoice, into a plastic container. This was placed into a pneumatic pipe system which went between departments, the container whizzed along, your cash was taken and a receipt returned in the same way. I believe that in some areas, even telephone exchanges used them. It is nothing like the electronic systems we use today!

Click: Return to top of page or Index page

Human Evolution

Last week I wrote about how the Sun, along with the planets, were thought to have been formed. This time I will say a bit more about the colonisation of the land, a bit about extinctions and then talk about our human evolution and its history.

An artist’s conception of Devonian flora.

I said last week that the Huronian ice age might have been caused by the increased oxygen concentration in the atmosphere, which caused the decrease of methane (CH4) in the atmosphere. Methane is a strong greenhouse gas, but with oxygen it reacts to form CO2, a less effective greenhouse gas. Oxygen accumulation from photosynthesis resulted in the formation of an ozone layer that absorbed much of the Sun’s ultraviolet radiation, meaning unicellular organisms that reached land were less likely to die, and as a result Prokaryotes began to multiply and became better adapted to survival out of the water. These microscopic single-celled organisms have no distinct nucleus with a membrane and include bacteria. These organisms colonised the land, then along came Eukaryotes, organisms consisting of a cell or cells in which the genetic material is DNA in the form of chromosomes contained within a distinct nucleus. For a long time, the land remained barren of multicellular organisms. The supercontinent Pannotia formed around 600Ma (that is 600 million years ago) and then broke apart a short 50 million years later. Fish, the earliest vertebrates, evolved in the oceans around 530Ma. A major extinction event occurred near the end of the Cambrian period, which ended 488Ma. Several hundred million years ago plants (probably resembling algae) and fungi started growing at the edges of the water, and then out of it. The oldest fossils of land fungi and plants date to around 480 to 460Ma, though molecular evidence suggests the fungi may have colonised the land as early as 1,000Ma and the plants 700Ma. Initially remaining close to the water’s edge, mutations and variations resulted in further colonisation of this new environment. The timing of the first animals to leave the oceans is not precisely known, but the oldest clear evidence is of arthropods on land around 450Ma, perhaps thriving and becoming better adapted due to the vast food source provided by the terrestrial plants. There is also unconfirmed evidence that arthropods may have appeared on land as early as 530Ma. The first of five great mass extinctions was the Ordovician-Silurian extinction and its possible cause was the intense glaciation of Gondwana, which eventually led to a snowball Earth; some 60% of marine invertebrates became extinct. The second mass extinction was the Late Devonian extinction, probably caused by the evolution of trees, which could have led to the depletion of greenhouse gases like CO2, or to eutrophication, the process by which a body of water, or parts of it, becomes progressively enriched with minerals and nutrients. It has also been defined as “nutrient-induced increase in phytoplankton productivity”. Around 70% of all species became extinct. The third mass extinction was the Permian-Triassic, or the Great Dying event, possibly caused by some combination of the Siberian Traps volcanic event, an asteroid impact, methane hydrate gasification, sea level fluctuations and a major anoxic event. In fact, either the Wilkes Land Crater in Antarctica or the Bedout structure off the northwest coast of Australia may indicate an impact connection with the Permian-Triassic extinction. But it remains uncertain whether these or other proposed Permian-Triassic boundary craters are real impact craters, or even contemporaneous with the Permian-Triassic extinction event. This was by far the deadliest extinction ever, with about 57% of all families and 83% of all genera being wiped out. 
The fourth mass extinction was the Triassic-Jurassic extinction event in which almost all small creatures became extinct, probably due to new competition from dinosaurs, who were the dominant terrestrial vertebrates throughout most of the Mesozoic era. After yet another extinction, the most severe of the period, dinosaurs split off from their reptilian ancestors around 230Ma. The Triassic-Jurassic extinction event at 200Ma spared many of the dinosaurs and they soon became dominant among the vertebrates. Though some mammalian lines began to separate during this period, existing mammals were probably small animals resembling shrews. The boundary between avian and non-avian dinosaurs is not clear, but Archaeopteryx, traditionally considered one of the first birds, lived around 150Ma. The earliest evidence for flowering plants comes from the Cretaceous period, some 20 million years later, around 132Ma. Then the fifth and most recent mass extinction was the K-T extinction. Around 66Ma, a 10-kilometre (6.2 mile) asteroid struck Earth just off the Yucatan Peninsula, at what was then the southwestern tip of Laurasia and where the Chicxulub crater in Mexico is today. This ejected vast quantities of particulate matter and vapour into the air that occluded sunlight, inhibiting photosynthesis. 75% of all life, including the non-avian dinosaurs, became extinct, marking the end of the Cretaceous period and Mesozoic era.

The Chicxulub crater on the Yucatan Peninsula in Mexico.

A small African ape living around 6Ma (6 million years ago) was the last animal whose descendants would include both modern humans and their closest relatives, the chimpanzees, and only two branches of its family tree have surviving descendants. Very soon after the split, for reasons that are still unclear, apes in one branch developed the ability to walk upright. Brain size increased rapidly, and by 2Ma the first animals classified in the genus Homo had appeared. Of course, the line between different species or even genera is somewhat arbitrary as organisms continuously change over generations. Around the same time, the other branch split into the ancestors of the common chimpanzee and the ancestors of the bonobo as evolution continued simultaneously in all life forms. The ability to control fire probably began in Homo erectus, at least 790,000 years ago but perhaps as early as 1.5Ma; it is possible that the use and discovery of controlled fire may even predate Homo erectus, with fire possibly being used in the early Lower Palaeolithic. It is more difficult to establish the origin of language, and it is unclear whether Homo erectus could speak or whether that capability did not begin until Homo sapiens. As brain size increased, babies were born earlier, before their heads grew too large to pass through the pelvis. As a result, they exhibited more plasticity and thus possessed an increased capacity to learn and required a longer period of dependence. Social skills became more complex, language became more sophisticated and tools became more elaborate. This contributed to further cooperation and intellectual development. Modern humans are believed to have originated around 200,000 years ago or earlier in Africa; the oldest fossils date back to around 160,000 years ago. The first humans to show signs of spirituality are the Neanderthals, usually classified as a separate species with no surviving descendants. They buried their dead, often with no sign of food or tools. But evidence of more sophisticated beliefs, such as the early Cro-Magnon cave paintings, probably with magical or religious significance, did not appear until 32,000 years ago. Cro-Magnons also left behind stone figurines such as the Venus of Willendorf, probably also signifying religious belief. By 11,000 years ago, Homo sapiens had reached the southern tip of South America, the last of the uninhabited continents, except for Antarctica, which remained undiscovered until 1820AD. Tool use and communication continued to improve, and interpersonal relationships became more intricate. Throughout more than 90% of its history, Homo sapiens lived in small bands as nomadic hunter-gatherers. It has been thought that as language became more complex, the ability to remember and to communicate information improved, so ideas could be exchanged quickly and passed down the generations. Cultural evolution quickly outpaced biological evolution and history proper began. It seems that between 8,500BC and 7,000BC, humans in the Fertile Crescent area of the Middle East began the systematic husbandry of plants and animals, so true agriculture began. This spread to neighbouring regions, and developed independently elsewhere, until most Homo sapiens lived sedentary lives in permanent settlements as farmers. In those civilisations which did adopt agriculture, the relative stability and increased productivity provided by farming allowed the population to expand. 
Not all societies abandoned nomadism, especially those in the isolated areas of the globe that were poor in domesticable plant species, such as Australia. Agriculture had a major impact; humans began to affect the environment as never before. Surplus food allowed a priestly or governing class to arise, followed by an increasing division of labour which led to Earth’s first civilisation at Sumer in the Middle East, between 4,000BC and 3,000BC. Additional civilisations quickly arose in ancient Egypt, at the Indus River valley and in China. The invention of writing enabled complex societies to arise, record-keeping and libraries served as a storehouse of knowledge and increased the cultural transmission of information. Humans no longer had to spend all their time working for survival, enabling the first specialised occupations, like craftsmen, merchants and priests. Curiosity and education drove the pursuit of knowledge and wisdom, and various disciplines, including science, albeit in a primitive form, arose. This in turn led to the emergence of increasingly larger and more complex civilisations, such as the first empires, which at times traded with one another, or fought for territory and resources. By around 500BC there were more advanced civilisations in the Middle East, Iran, India, China, and Greece, at times expanding, other times entering into decline. In 221BC, China became a single polity, this being an identifiable political entity, a group of people who have a collective identity and who are organised by some form of institutionalised social relations, having the capacity to mobilise resources. It would grow to spread its culture throughout East Asia, and it has remained the most populous nation in the world. During this period the famous Hindu texts known as the Vedas came into existence in the Indus Valley civilisation, and advances were made in warfare, the arts, science, mathematics and architecture. The fundamentals of Western civilisation were largely shaped in Ancient Greece, with the world’s first democratic government and major advances in philosophy as well as science. Ancient Rome made advances in law, government and engineering; the Roman Empire was Christianised by Emperor Constantine in the early 4th century, but had declined by the end of the 5th. From the 7th century, the Christianisation of Europe began in earnest. In 610AD Islam was founded and quickly became the dominant religion in Western Asia. The ‘House of Wisdom’ was established in Abbasid-era Baghdad, in Iraq. It is considered to have been a major intellectual centre during the Islamic Golden Age, where Muslim scholars in Baghdad as well as Cairo flourished from the ninth to the thirteenth centuries until the Mongol sack of Baghdad in 1258AD. Meanwhile in 1054AD the Great Schism between the Roman Catholic Church and the Eastern Orthodox Church led to prominent cultural differences between Western and Eastern Europe. In the 14th century, the Renaissance began in Italy with advances in religion, art, and science. At that time the Christian Church as a political entity lost much of its power. In 1492AD, Christopher Columbus reached the Americas, thus initiating great changes to the New World. European civilisation began to change from around 1500AD, leading to both the Scientific and Industrial revolutions. The European continent began to exert political and cultural dominance over human societies around the world, a time known as the Colonial era. 
Then in the 18th century a cultural movement known as the Age of Enlightenment further shaped the mentality of Europe and contributed to its secularisation. From 1914 to 1918 and 1939 to 1945, nations around the world were embroiled in World Wars. Following World War I, the League of Nations was a first step in establishing international institutions to settle disputes peacefully. After failing to prevent World War II, mankind’s bloodiest conflict, it was replaced by the United Nations, and after that war many new states were formed, declaring or being granted independence in a period of decolonisation. The democratic, capitalist United States and the socialist Soviet Union became the world’s dominant superpowers for a time and they held an ideological, often violent rivalry known as the Cold War until the dissolution of the latter. In 1992, several European nations came together to form the European Union and, as transportation and communication have improved, both the economies and political affairs of nations around the world have become increasingly intertwined. However, this globalisation has often produced both conflict and cooperation. As we continue in this beautiful world though, we are at present having to cope with a world-wide pandemic for which no cure has yet been found. We are researching and looking for vaccines that it is said will at least reduce the adverse effects of Covid-19; however, many do not believe that these same vaccines are what we need. As a result, a great many deaths are still being reported in countries right across our world. Some say it is a man-made virus, others are suggesting conspiracy theories, but I feel sure that just as other viruses have been beaten in the past, this one will also be. However, in the meantime we should surely behave responsibly and work together to help reduce the spread of this virus, no matter what our thoughts, ideas or beliefs may be. So that in years to come, others may then look back and learn, in order for all life on Earth to continue.

This week, a simple quote…

“The purpose of life is a life of purpose.”
~ Robert Byrne (22 May 1930 – 06 December 2016)

Click: Return to top of page or Index page

Looking Back

I try not to spend too much time looking back on my life, but there are times when it is good to do so and I have been reminded of a blog post I sent out early last year. A few years ago now a good friend sent me an article about a daughter learning about Darwin’s Theory of Evolution and then her mother telling her about the Sanskrit Avatars, which tell their version of the beginning of life here on Earth. I appreciated that, but there are other peoples too, for example the Aborigines and the American Indians, who all have their own traditions. Whatever way is right, however things occurred, I really do believe that this world, along with the rest of the Universe, didn’t just happen by accident. With looming discrepancies about the true age of the universe, scientists have taken a fresh look at the observable, expanding universe and have now estimated that it is almost 14 billion years old, plus or minus 40 million years. Considering that as well as our sun, our star, there are around 100,000 million stars in just the Milky Way alone, our Earth is a bit small! As well as that, there are an estimated 500 billion galaxies. With all the fighting and killing that we humans have done in the (extremely relatively) short time that we have been around, it is perhaps a good thing that the nearest star system to our sun is Alpha Centauri, which is 4.3 light-years away. That’s about 25 trillion miles (40 trillion kilometres) away from Earth – nearly 300,000 times the distance from the Earth to the Sun. In time, about 5 billion years from now, our sun will run out of hydrogen. Our star is currently in the most stable phase of its life cycle and has been since the birth of our solar system, about 4.5 billion years ago; once all the hydrogen gets used up, the sun will grow out of this stable phase. But what about our Earth? In my blog post last week I wrote about what might happen if we were to take an imaginary quick jaunt through our solar system in the potential ‘last days’ of the sun. There has been speculation, there have been films, tv series, all giving a view on how things were or might be. The film ‘2001: A Space Odyssey’ is just one example. Other films and series, like Star Trek, have imagined beings from other worlds colonising Earth, and some have considered what life might be like if another race were to change things completely here. A favourite of mine, Stargate, started out as a film and then became a series where the Egyptian ruler, Ra, travelled via a star-gate to a far-distant world where earth-like creatures lived. We can speculate and wonder, it is a thing that we humans can do. Though if you know of the late, great Douglas Adams and his tales, we should not panic. Just remember that in his writings, at one point the Earth was destroyed to make way for a hyperspace bypass and that just before that happened, the dolphins left Earth and as they did so, they sent a message saying “So long, and thanks for all the fish”. But I digress. I cannot possibly detail the full history of our Earth here, but I can perhaps highlight a few areas and I shall do my best.
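As an aside, here is a quick Python sanity check of the Alpha Centauri figures quoted above. The conversion factors are the standard rounded ones (a light-year is roughly 9.46 trillion kilometres, or about 5.88 trillion miles, and the Earth-Sun distance about 150 million kilometres), so this is only a rough confirmation, not a precise calculation.

```python
# Rough check of the distances mentioned above, using standard rounded
# conversion factors for a light-year and the Earth-Sun distance.
LY_IN_KM = 9.461e12      # kilometres in one light-year
LY_IN_MILES = 5.879e12   # miles in one light-year
AU_IN_KM = 1.496e8       # mean Earth-Sun distance in kilometres

distance_ly = 4.3        # Alpha Centauri
km = distance_ly * LY_IN_KM
miles = distance_ly * LY_IN_MILES

print(f"About {miles/1e12:.0f} trillion miles ({km/1e12:.0f} trillion km)")
print(f"Roughly {km/AU_IN_KM:,.0f} times the Earth-Sun distance")
```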

Many attempts have been made over the years to comprehend the main events of Earth’s past, characterised by constant change and biological evolution. There is now a geological time scale, as defined by international convention, which depicts the large spans of time from the beginning of the Earth to the present and its divisions chronicle some definitive events of Earth history. Earth formed around 4.54 billion years ago, approximately one-third the age of the Universe, by accretion from the solar nebula – accretion being growth by the gradual accumulation of additional layers of matter. Volcanic outgassing probably created the primordial atmosphere and then the ocean, but the early atmosphere contained almost no oxygen. Much of the Earth was molten because of frequent collisions with other bodies which led to extreme volcanism. Whilst the Earth was in its earliest stage, a giant impact collision with a planet-sized body named Theia is thought to have formed the Moon. Over time, the Earth cooled, causing the formation of a solid crust, and allowing liquid water on the surface. The Hadean eon represents the time before a reliable fossil record of life; it began with the formation of the planet and ended 4 billion years ago. The following Archean and Proterozoic eons produced the beginnings of life on Earth and its earliest evolution. The succeeding eon is divided into three eras: the first brought forth arthropods, fishes and the first life on land; the next spanned the rise, reign, and climactic extinction of the non-avian dinosaurs; and the most recent saw the rise of mammals. Recognisable humans emerged at most 2 million years ago, a vanishingly small period on the geological scale. The earliest undisputed evidence of life on Earth dates from at least 3.5 billion years ago, after a geological crust had started to solidify. There are microbial mat fossils found in 3.48 billion-year-old sandstone discovered in Western Australia and other early physical evidence of a biogenic substance is graphite found in 3.7 billion-year-old rocks discovered in southwestern Greenland. Photosynthetic organisms appeared between 3.2 and 2.4 billion years ago and began enriching the atmosphere with oxygen. Life remained mostly small and microscopic until about 580 million years ago, when complex multicellular life arose. This developed over time and culminated in the Cambrian Explosion about 541 million years ago. This sudden diversification of life forms produced most of the major animal groups (phyla) known today, and divided the Proterozoic eon from the Cambrian Period of the Paleozoic Era. It is estimated that 99 percent of all species that ever lived on Earth – over five billion – have become extinct. Estimates of the number of Earth’s current species range from 10 million to 14 million, of which about 1.2 million are documented and over 86 percent have not been described. However, it was recently claimed that 1 trillion species currently live on Earth, with only one-thousandth of one percent described. The Earth’s crust has constantly changed since its formation, as has life since its first appearance. Species continue to evolve, taking on new forms, splitting into daughter species, or going extinct in the face of ever-changing physical environments. The process of plate tectonics continues to shape the Earth’s continents and oceans and the life which they harbour.

So the history of Earth is divided into four great eons, starting with the formation of the planet. Each eon saw significant changes in Earth’s composition, climate and life. Each eon is subsequently divided into eras, which in turn are divided into periods, which are further divided into epochs. In the Hadean eon, the Earth was formed out of debris around the solar protoplanetary disk. There was no life; temperatures were extremely hot, with frequent volcanic activity and hellish-looking environments, hence the eon’s name, which comes from Hades. Possible early oceans or bodies of liquid water appeared, and the Moon was formed around this time, probably due to a collision between the Earth and a protoplanet. In the next eon came the first forms of life, with some continents existing and an atmosphere composed of volcanic and greenhouse gases. Following this came early life of a more complex form, including some forms of multicellular organisms. Bacteria began producing oxygen, shaping Earth’s third and current atmosphere. Plants, later animals and possibly earlier forms of fungi formed around that time. The early and late phases of this eon may have undergone a few ‘Snowball Earth’ periods, in which all of the planet suffered below-zero temperatures. A few early continents may have existed in this eon. Finally complex life, including vertebrates, began to dominate the Earth’s oceans in a process known as the Cambrian Explosion. Supercontinents formed but later dissolved into the current continents. Gradually life expanded to land and more familiar forms of plants, animals and fungi began to appear, including insects and reptiles. Several mass extinctions occurred, though, in the aftermath of which birds, the descendants of the dinosaurs, and more recently mammals, emerged. Modern animals, including humans, evolved in the most recent phases of this eon.
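To make the nesting of eons, eras, periods and epochs a little more concrete, here is a small illustrative sketch in Python. It names only a handful of the well-known divisions mentioned in these posts and is nowhere near a complete listing; epochs are left out entirely.

```python
# A rough sketch of the geological hierarchy (eon > era > period > epoch),
# naming only a few familiar divisions; this is illustrative, not complete.
GEOLOGICAL_TIME = {
    "Hadean": {},
    "Archean": {},
    "Proterozoic": {},
    "Phanerozoic": {
        "Paleozoic": ["Cambrian", "Ordovician", "Devonian", "Permian"],
        "Mesozoic": ["Triassic", "Jurassic", "Cretaceous"],
        "Cenozoic": [],   # the era of the mammals, up to the present day
    },
}

for eon, eras in GEOLOGICAL_TIME.items():
    if not eras:
        print(f"{eon}: (eras not shown here)")
    else:
        for era, periods in eras.items():
            print(f"{eon} > {era}: {', '.join(periods) if periods else '...'}")
```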

An artist’s rendering of a protoplanetary disk.

The standard model for the formation of our Solar System, including the Earth, is the Solar Nebula hypothesis. In this model, the Solar System formed from a large, rotating cloud of interstellar dust and gas, composed of hydrogen and helium created shortly after the Big Bang some 13.8 billion years ago and heavier elements ejected by supernovae. About 4.5 billion years ago the nebula began a contraction that may have been triggered by a shock wave from a nearby supernova, which would also have set the nebula rotating. As the cloud began to spin faster, angular momentum, gravity and inertia flattened it into a protoplanetary disk that was perpendicular to its axis of rotation. Small perturbations due to the collisions and the angular momentum of other large debris created the means by which kilometre-sized protoplanets began to form, orbiting the nebular centre. The centre of the nebula, not having much angular momentum, collapsed rapidly, the compression heating it until the nuclear fusion of hydrogen into helium began. After more contraction, a ’T Tauri’ star ignited and evolved into the Sun. Meanwhile, in the outer part of the nebula gravity caused matter to condense around density perturbations and dust particles, and the rest of the protoplanetary disk began separating into rings. In a process known as runaway accretion, successively larger fragments of dust and debris clumped together to form planets. Earth formed in this manner about 4.54 billion years ago and was largely completed within 10 to 20 million years. The solar wind of the newly formed T Tauri star cleared out most of the material in the disk that had not already condensed into larger bodies. The same process is expected to produce accretion disks around virtually all newly forming stars in the universe, some of which yield planets. Then the proto-Earth grew until its interior was hot enough to melt the heavy metals; having higher densities than the silicates, these metals sank. This so-called ‘iron catastrophe’ resulted in the separation of a primitive mantle and a metallic core only 10 million years after the Earth began to form, producing the layered structure of Earth and setting up the formation of its magnetic field.

This Earth is often described as having had three atmospheres. The first atmosphere, captured from the solar nebula, was composed of the nebula’s lighter elements, mostly hydrogen and helium. A combination of the solar wind and Earth’s heat would have driven off this atmosphere, as a result of which the atmosphere was depleted of these elements compared to cosmic abundances. After the impact which created the Moon, the molten Earth released volatile gases, and later more gases were released by volcanoes, completing a second atmosphere rich in greenhouse gases but poor in oxygen. Finally, the third atmosphere, rich in oxygen, emerged when bacteria began to produce oxygen. The new atmosphere probably contained water vapour, carbon dioxide, nitrogen, and smaller amounts of other gases. Water must have been supplied by meteorites from the outer asteroid belt, and some large planetary embryos and comets may also have contributed. Though most comets are today in orbits farther away from the Sun than Neptune, some computer simulations show that they were originally far more common in the inner parts of the Solar System. As the Earth cooled, clouds formed and rain created the oceans; recent evidence suggests the oceans may have begun forming quite early. At the start of the Archean eon, they already covered much of the Earth. This early formation has been difficult to explain because of a problem known as the ‘faint young sun’ paradox. Stars are known to get brighter as they age, and at the time of its formation the Sun would have been emitting only 70% of its current power. Thus, the Sun has become 30% brighter in the last 4.5 billion years. Many models indicate that the Earth would have been covered in ice and a likely solution is that there was enough carbon dioxide and methane to produce a greenhouse effect. The carbon dioxide would have been produced by volcanoes and the methane by early microbes, whilst another greenhouse gas, ammonia, would have been ejected by volcanoes but quickly destroyed by ultraviolet radiation. One of the reasons for interest in the early atmosphere and ocean is that they form the conditions under which life first arose. There are many models, but little consensus, on how life emerged from non-living chemicals; chemical systems created in the laboratory fall well short of the minimum complexity for a living organism. The first step in the emergence of life may have been chemical reactions that produced many of the simpler organic compounds, including nucleobases and amino acids, which are the building blocks of life. An experiment in 1953 by Stanley Miller and Harold Urey showed that such molecules could form in an atmosphere of water, methane, ammonia and hydrogen with the aid of sparks to mimic the effect of lightning. Although atmospheric composition was probably different from that used by Miller and Urey, later experiments with more realistic compositions also managed to synthesise organic molecules. Additional complexity could have been reached from at least three possible starting points: self-replication, an organism’s ability to produce offspring similar to itself; metabolism, its ability to feed and repair itself; and external cell membranes, which allow food to enter and waste products to leave but exclude unwanted substances. The earliest cells absorbed energy and food from the surrounding environment. 
They used fermentation, the breakdown of more complex compounds into less complex compounds with less energy, and used the energy so liberated to grow and reproduce. Fermentation can only occur in an oxygen-free environment. The evolution of photosynthesis made it possible for cells to derive energy from the Sun. Most of the life that covers the surface of the Earth depends directly or indirectly on photosynthesis. The most common form, oxygenic photosynthesis, turns carbon dioxide, water, and sunlight into food. It captures the energy of sunlight in energy-rich molecules, which then provide the energy to make sugars. To supply the electrons in the circuit, hydrogen is stripped from water, leaving oxygen as a waste product. Some organisms, including purple bacteria and green sulphur bacteria, use an anoxygenic form of photosynthesis that uses alternatives to hydrogen stripped from water as electron donors, such as hydrogen sulphide, sulphur and iron. Such organisms are restricted to otherwise inhospitable environments like hot springs and hydrothermal vents. The simpler, anoxygenic form arose not long after the appearance of life. At first, the released oxygen was bound up with limestone, iron and other minerals. The oxidised iron appears as red layers in geological strata which are called banded iron formations. When most of the exposed readily reacting minerals were oxidised, oxygen finally began to accumulate in the atmosphere. Though each cell only produced a minute amount of oxygen, the combined metabolism of many cells over a vast time transformed Earth’s atmosphere to its current state. This was Earth’s third atmosphere. Some oxygen was stimulated by solar ultraviolet radiation to form ozone, which collected in a layer near the upper part of the atmosphere. The ozone layer absorbed, and still absorbs, a significant amount of the ultraviolet radiation that once had passed through the atmosphere. It allowed cells to colonise the surface of the ocean and eventually the land. Without the ozone layer, ultraviolet radiation bombarding land and sea would have caused unsustainable levels of mutation in exposed cells. Photosynthesis had another major impact. Oxygen was toxic; much life on Earth probably died out as its levels rose in what is known as the oxygen catastrophe. Resistant forms survived and thrived, and some developed the ability to use oxygen to increase their metabolism and obtain more energy from the same food. The Sun’s natural evolution made it progressively more luminous during the Archean and Proterozoic eons; the Sun’s luminosity increases by around 6% every billion years. As a result, the Earth began to receive more heat from the Sun in the Proterozoic eon. However, the Earth did not get warmer. Instead, geological records suggest that it cooled dramatically during the early Proterozoic. Glacial deposits found in South Africa, which palaeomagnetic evidence suggests must have been located near the equator at the time, point to this cooling. Thus, this glaciation, known as the Huronian glaciation, may have been global. Some scientists suggest this was so severe that the Earth was frozen over from the poles to the equator, a hypothesis called Snowball Earth. The Huronian ice age might have been caused by the increased oxygen concentration in the atmosphere, which caused the decrease of methane (CH4) in the atmosphere. Methane is a strong greenhouse gas, but with oxygen it reacts to form CO2, a less effective greenhouse gas. 
When free oxygen became available in the atmosphere, the concentration of methane could have then decreased dramatically, enough to counter the effect of the increasing heat flow from the Sun. However, the term Snowball Earth is more commonly used to describe later extreme ice ages during the Cryogenian period. There were four periods, each lasting about 10 million years, between 750 and 580 million years ago, when the earth is thought to have been covered with ice apart from the highest mountains, and average temperatures were about −50°C (−58°F). The snowball may have been partly due to the location of the supercontinent straddling the Equator. Carbon dioxide combines with rain to weather rocks to form carbonic acid, which is then washed out to sea, thus extracting the greenhouse gas from the atmosphere. When the continents are near the poles, the advance of ice covers the rocks, slowing the reduction in carbon dioxide, but in the Cryogenian the weathering of that supercontinent was able to continue unchecked until the ice advanced to the tropics. The process may have finally been reversed by the emission of carbon dioxide from volcanoes or the destabilisation of methane gas.
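For anyone who likes to see the chemistry written out, the two reactions at the heart of this passage can be summarised in standard notation. These are the simplified, textbook forms rather than the full biochemical detail:

```latex
% Oxygenic photosynthesis: carbon dioxide and water, driven by sunlight,
% are turned into sugar, with oxygen released as the waste product.
6\,\mathrm{CO_2} + 6\,\mathrm{H_2O} \xrightarrow{\ \text{sunlight}\ } \mathrm{C_6H_{12}O_6} + 6\,\mathrm{O_2}

% Oxidation of methane by that newly available oxygen, converting a strong
% greenhouse gas into a weaker one (carbon dioxide) plus water.
\mathrm{CH_4} + 2\,\mathrm{O_2} \longrightarrow \mathrm{CO_2} + 2\,\mathrm{H_2O}
```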

Astronaut Bruce McCandless II outside of the Space Shuttle Challenger in 1984.

I think that sets the basic scene for the Earth itself, but there is still much to write about in terms of colonisation of land, extinctions and human evolution. But I think this is more than enough for now. Change has continued at a rapid pace and along with technological developments such as nuclear weapons, computers, genetic engineering and nanotechnology there has been economic globalisation, spurred by advances in communication and transportation technology which has influenced everyday life in many parts of the world. Cultural and institutional forms such as democracy, capitalism and environmentalism have increased influence. Major concerns and problems such as disease, war, poverty and violent radicalism along with more recent, human-caused climate-change have risen as the world population increases. In 1957, the Soviet Union launched the first artificial satellite into orbit and, soon afterwards Yuri Gagarin became the first human in space. The American, Neil Armstrong, was the first to set foot on another astronomical object, the Moon. Unmanned probes have been sent to all the known planets in the Solar System, with some, such as the two Voyager spacecraft having left the Solar System. Five space agencies, representing over fifteen countries, have worked together to build the International Space Station. Aboard it, there has been a continuous human presence in space since 2000. The World Wide Web became a part of everyday life in the 1990s, and since then has become an indispensable source of information in the developed world. I have no doubt that there will be much more to find, learn, discover and develop.

This week, as we begin a new year…
When attempting to remember the order of planets in our Solar System, I have found they can be remembered by:
‘My Very Educated Mother Just Served Us Nachos’

Mercury
Venus
Earth
Mars
Jupiter
Saturn
Uranus
Neptune

(Pluto was first discovered in 1930 and described as a planet located beyond Neptune but, following improvements in technology, it was reclassified as a ‘dwarf planet’ in 2006.)

Click: Return to top of page or Index page

A Year Ends, A New Year Begins

We are approaching the end of what for many of us has been the year 2021 but for some, the number will be different because according to tradition, the Hebrew calendar started at the time of Creation, placed at 3761 BCE. So for our current 2021/2022, the Hebrew year is 5782. Right over on the other side of this amazing world it will soon be New Year’s Day, whilst others have a few hours to wait. I will admit to finding it strange a few years ago on my lovely holiday as I crossed the International Dateline a couple of times when I ‘skipped over’ some days, whilst others were counted twice. Whatever our circumstances it has been a very trying and troubling year for so many on this world, this Earth. In years to come I wonder what we will reflect on, those who are doing so. We have both seen and experienced change, of that we may be sure. I have no doubt that there will be further change too, in the years ahead. On earlier blog posts I have said a bit about my young days, growing up in Whittlesey, where almost everyone seemed to know almost everyone else! We moved up from London, at first it was Mum, Dad, my two elder brothers and myself. Dad had managed to get a teaching job which included living in the school house and the school right next door made things easy for me! That school building is now the St Jude’s church. So that move was a really big change for us, though perhaps not quite as much for me because I was less than a year old then! A while later Nan and Pop, who were my paternal grandparents, decided to move from London to Whittlesey and retire there. So we stayed and I grew up in Whittlesey. Older brothers were growing up, one moved away having joined the army, whilst the other settled for a while but job opportunities moved him away too and this meant that I did not see too much of them or their respective offspring as they grew up. Then it became clear that whilst I was learning much where I worked, it seemed that – well, let us just say that my face just didn’t fit! As a result, when the opportunity came for me to move on to a higher grade job with the same company but in Leicester, I went. In truth it was the right thing for me as I met a lovely female and we were married for a while. Further job opportunities gave me greater experience, I moved a few times around the Midlands before finally settling in Leicester. Naturally I talked with parents regarding my first move away from Peterborough but Dad urged me to take the opportunity as he felt it would be good for me. It was then that I learned all about how my grandparents moving to Whittlesey had stopped Dad from doing what he had considered at one time, which was a teaching job right away from Peterborough. I might have found myself being brought up in Swindon or somewhere! But it was not meant to be. There are those, like many of the people I was at school with, who are still happily settled in Whittlesey and looking on Facebook I recognise names, but not faces! As I know I have said before, others moved to places far and wide like the U.S.A, Canada, Australia and to New Zealand. Equally, some of the people I have worked with moved over to England for various reasons, one lad from South Africa had moved here for job reasons but I learned that his claim to fame was as an extra in a crowd scene for a film – ‘Zulu’, I think it was. However, for some the moves were through political turmoil, with folk finding that they and their families were not welcome where they were, due to their race. 
That to me is absolutely awful, we are all human beings but we do not seem to be able to live peacefully together. Perhaps that day will come, but sadly I do not see it occurring for a while yet, when some people wish to be so selfish. Yet their lives too will end, in time.

So it got me thinking about early man. Many of you will have seen the film “2001 – A Space Odyssey”, where apes fight and learn the rudimentary use of bones as tools. The various ages of man have come and gone and our Earth has also changed, but back then early man had no idea of what our world was like. That really is, I think, where the first ‘conspiracy theories’ started. Imagine being told that we were the centre of the universe, or that the Earth was flat, because people had to make sense of their world. I am reminded of the poster advertising the “Flat Earth Society – members all around the globe”. Some years ago I bought a computer program called ‘Civilization’, where a player ‘created’ a new civilisation of their own. This ‘Sid Meier’s Civilization’ is a 1991 turn-based strategy video game developed and published by MicroProse; it was originally written for MS-DOS on a standard personal computer but has undergone numerous revisions for various platforms. The player is tasked with leading an entire human civilisation over the course of several millennia by controlling various areas such as urban development, exploration, government, trade, research, and military. The player can control individual units and advance the exploration, conquest and settlement of the game’s world. The player can also make such decisions as setting forms of government, tax rates and research priorities. The player’s civilisation is in competition with other computer-controlled civilisations, with which the player can enter diplomatic relationships that can either end in alliances or lead to war. The game has sold 1.5 million copies since its release, and is considered one of the most influential computer games in history due to its establishment of the 4X genre. In addition to its commercial and critical success, the game has been deemed quite valuable due to its presentation of historical relationships. A multiplayer remake, ‘Sid Meier’s CivNet’, was released for the personal computer in 1995 and ‘Civilization’ was followed by several sequels starting with ‘Civilization II’, with similar or modified scenarios. I know, I had a copy and played the game for hours!

A world map screenshot from the Amiga version of ‘Civilization’.

In this game, the player takes on the role of the ruler of a civilisation. They start with one, occasionally two, settler units and they attempt to build an empire in competition with two to seven other civilisations. The game requires a fair amount of micromanagement, although less than other simulation games. Along with the larger tasks of exploration, diplomacy and warfare, the player has to make decisions about where to build new cities, which improvements or units to build in each city, which advances in knowledge should be sought (and at what rate), and how to transform the land surrounding the cities for maximum benefit. From time to time the player’s towns may be harassed by barbarians, units with no specific nationality and no named leader. These threats only come from huts, unclaimed land or sea, so that over time and turns of exploration, there are fewer and fewer places from which barbarians will emanate. Before the game begins, the player chooses which historical or current civilisation to play. In contrast to later games in the ‘Civilization’ series, this is largely a cosmetic choice, affecting titles, city names, musical heralds, and colour. The choice does affect their starting position on the “Play on Earth” map, and thus different resources in one’s initial cities, but has no effect on starting position when starting a random world game or a customised world game. The player’s choice of civilisation also prevents the computer from being able to play as that civilisation or the other civilisation of the same colour, and since computer-controlled opponents display certain traits of their civilisations this affects gameplay as well. For example, the Aztecs are fiercely expansionist and generally extremely wealthy. Other civilisations include the Americans, the Mongols and the Romans. Each civilisation is led by a famous historical figure, such as Mahatma Gandhi for India. The scope of this Civilization game is larger than most others. That is because it begins in 4000BC, before the Bronze Age and can last through to AD 2100 on the easiest setting with Space Age and ‘future technologies’. At the start of the game there are no cities anywhere in the world and the player controls one or two settler units, which can be used to found new cities in appropriate sites. Those cities may build other settler units, which can go out and found new cities, thus expanding the empire. Settlers can also alter terrain, build improvements such as mines and irrigation, build roads to connect cities, and later in the game they can construct railroads which offer unlimited movement. As time advances, new technologies are developed. These technologies are the primary way in which the game changes and grows. At the start, players choose from advances such as pottery, the wheel and the alphabet, leading to, near the end of the game, nuclear fission and spaceflight. Players can gain a large advantage if their civilisation is the first to learn a particular technology (the secrets of flight, for example) and put it to use in a military or other context. Most advances give access to new units, city improvements or derivative technologies, for example the chariot unit becomes available after the wheel is developed, and the granary building becomes available to build after pottery is developed. The whole system of advancements from beginning to end is called the technology tree and this concept has been adopted in many other strategy games. 
Since only one technology may be researched at any given time, the order in which they are chosen makes a considerable difference in the outcome of the game and generally reflects the player’s preferred style of gameplay. Players can also build Wonders of the World in each of the epochs of the game, subject only to obtaining the prerequisite knowledge. These wonders are important achievements of society, science, culture and defence, ranging from the Pyramids and the Great Wall in the Ancient age to the Copernicus Observatory and Magellan’s Expedition in the middle period right up to the Apollo programme, the United Nations and the Manhattan Project in the modern era. Each Wonder can only be built once in the world, and requires a lot of resources to build, far more than most other city buildings or units. Wonders provide unique benefits to the controlling civilisation, for example Magellan’s Expedition increases the movement rate of naval units. Wonders typically affect either the city in which they are built, for example the Colossus, every city on the continent, such as J.S. Bach’s Cathedral, or the civilisation as a whole, like Darwin’s Voyage. However, some wonders are made obsolete by new technologies. The game can be won by conquering all other civilisations or by winning the Space Race, reaching the star system of Alpha Centauri. The game has developed quite a bit over the years though, as I have an excellent version on my MacBook Pro which is much improved from the MS-DOS version that I used to play!
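
For anyone who enjoys seeing how such a technology tree might be wired together, here is a minimal sketch in Python. It is purely my own illustration of the idea of a dependency graph, not the game’s actual code; the advance names echo those mentioned above, while the function and variable names are simply invented for the example.

# A minimal, illustrative sketch of a technology tree as a dependency graph.
# The advance names echo the text above; the structure is hypothetical and
# not taken from the game's real data files or code.

PREREQUISITES = {
    "Pottery": [],
    "Alphabet": [],
    "The Wheel": [],
    "Bronze Working": [],
    "Writing": ["Alphabet"],
    "Currency": ["Bronze Working"],
}

UNLOCKS = {
    "The Wheel": ["Chariot (unit)"],      # the chariot needs the wheel
    "Pottery": ["Granary (building)"],    # the granary needs pottery
    "Writing": ["Library (building)"],
}

def researchable(known):
    # An advance may be researched once all of its prerequisites are known.
    return [tech for tech, prereqs in PREREQUISITES.items()
            if tech not in known and all(p in known for p in prereqs)]

def unlocked(known):
    # Everything the civilisation can now build, given its known advances.
    return sorted(item for tech in known for item in UNLOCKS.get(tech, []))

known_advances = {"Pottery", "The Wheel"}
print(researchable(known_advances))   # ['Alphabet', 'Bronze Working']
print(unlocked(known_advances))       # ['Chariot (unit)', 'Granary (building)']

Choosing a new advance then simply means adding it to the set of known technologies and seeing which new choices open up, which is the essence of how such a tree shapes the flow of the game.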

As I have said, it is a cleverly thought-out game, because it mirrors the real world so well. I really do wonder what will happen to us, to this Earth, in the future. There has been much speculation as to whether we will manage to travel to distant stars, to different planets and have interaction with other forms of life. As I said in a blog post earlier this year, life on Earth is based on carbon, perhaps because (so I have learned) each carbon atom can form bonds with up to four other atoms simultaneously. That is a bit technical for me, but it seems that because of that, carbon is well-suited to form the long chains of molecules which then serve as the basis for life as we know it, such as proteins and DNA. In fact, research by some earth scientists at Rice University suggests that virtually all of Earth’s life-giving carbon could have come from a collision about 4.4 billion years ago between this Earth and an embryonic planet similar to Mercury. Science fiction has long imagined alien worlds inhabited by other life, but based on other elements. One example is the rock-eating Horta, a silicon-based life form featured in the original Star Trek series. Also in that series, Mr Spock has green blood because the oxygen-carrying agent in Vulcan blood includes copper, rather than iron, as is the case in humans. For us here, carbon is the backbone of each and every known biological molecule. But life here has taken a finite amount of time to evolve, so who is to say that a life-form on another planet light-years from us has developed to the same level? Don’t be downhearted, but it is a fact, so far as our science will tell us, that stars like our Sun burn for about nine or ten billion years. In fact our Sun is about halfway through its life, so it still has about five billion years to go. After that, the sun will run out of energy and drastically alter the whole of the solar system: oceans will be baked dry and entire planets will be consumed. And worlds that have been icy for so long will finally enjoy their day in the sun. Our star is powered by nuclear fusion, and it turns hydrogen into helium in a process that converts mass into energy. Once the fuel supply is gone, the sun will start growing dramatically. Its outer layers will expand until they engulf much of the solar system, as it becomes what astronomers call a red giant. The life cycle of the sun takes it from the life-giving star that we know today into a swelling red giant and, eventually, a planetary nebula surrounding a tiny white dwarf. Once the sun enters the red giant phase though, the solar system’s denouement is still a subject of debate among scientists. Exactly how far the dying sun will expand, and how conditions will change, aren’t yet clear. But a few things seem likely. The slow death will kill off life on Earth, but it may also create habitable worlds in what are presently the coldest reaches of the solar system. Any humans left around might find refuge on Pluto and other distant dwarf planets out in the Kuiper Belt, a region past Neptune packed with icy space rocks. As our sun expands, these worlds will suddenly find themselves with the conditions necessary for the evolution of life. One scientist believes these may be the ‘delayed gratification habitable worlds’, as late in the life of the sun, in the red giant phase, the Kuiper Belt may be something of a metaphorical Miami Beach!

We can take a quick imaginary jaunt through our solar system in the potential ‘last days’ of the sun. Throughout solar system history, the innermost planet has been baked by the sun. But even today, Mercury still clings to some icy patches. As our star ages, it will vaporise those remaining volatile areas before eventually eliminating the entire planet in a slow-motion version of Star Wars’ Death Star. Venus though is sometimes called “Earth’s twin” because the neighbouring worlds are so similar in size and composition. But the hellish surface of Venus has little in common with Earth’s Goldilocks-type perfect conditions. As the sun expands, it will burn up the atmosphere on Venus before it too is consumed by the sun. Whilst the sun may have 5 billion years left before it runs out of fuel, life on Earth will likely be wiped out a long time before that happens. That’s because the sun is actually already growing brighter. In fact by some estimates, it could be as little as a billion years before the sun’s radiation becomes too much for life here on Earth to handle. That might sound like quite a long time, but in comparison life has already existed on this planet for well over 3 billion years, and when the sun does turn into a red giant, the Earth will also be vaporised, perhaps just a few million years after Mercury and Venus have been consumed. All the rocks and fossils and remains of the creatures that have lived here will be gobbled up by the sun’s growing orb, wiping out any lingering trace of humanity’s existence on Earth. But not all scientists agree with this interpretation. Some suspect the sun will stop growing just before fully engulfing our planet. Other scientists have suggested schemes for moving Earth deeper into the solar system by slowly increasing its orbit. Thankfully, this debate is still purely academic for all of us alive today. Even our young sun’s radiation was too much for Mars to hold onto an atmosphere capable of protecting complex life. However, recent evidence has shown that Mars may still have water lurking just beneath its surface. Mars may escape the sun’s actual reach as it is at the borderline, but that water will likely all be gone by the time the red giant star takes over the inner solar system. Now we look at the gas giant planets. As our red giant sun engulfs the inner planets, some of their material will likely get thrown deeper into the solar system, to be assimilated into the bodies of the gas giants.

Saturn as viewed from a side never visible from Earth.
(Credit: NASA/JPL-Caltech/Jan Regan)

Here, the ringed planet shows a side never visible from Earth. Cassini took 96 backlit photos for this mosaic on April 13, 2017. Because the sun shines through the rings, the thinnest parts glow brightest, and the thicker rings are dark. However, the approaching boundary of our star will also vaporise Saturn’s beloved rings, which are made of ice. The same fate likely awaits today’s icy ocean worlds, like Jupiter’s moon Europa as well as Saturn’s Enceladus, whose thick blankets of ice would be lost to the void. Once our sun has become a red giant, Pluto and its cousins in the Kuiper Belt along with Neptune’s moon Triton may be the most valuable real estate in the solar system. Today, these worlds hold abundant water ice and complex organic materials. Some of them could even hold oceans beneath their icy surfaces — or at least did in the distant past. But surface temperatures on dwarf planets like Pluto commonly sit at inhospitable levels, hundreds of degrees below freezing. However, by the time Earth is a cinder, the average temperatures on Pluto will be similar to Earth’s average temperatures now.

Pluto as imaged by the New Horizons mission.
(Credit: NASA/JHU-APL/SwRI)

It has been said that when the sun becomes a red giant, the temperatures on Pluto’s surface will be about the same as the average temperatures on Earth’s surface now; Earth will be toast, but Pluto will be balmy and brimming with the same sorts of complex organic compounds that existed when life first evolved on our own planet. Pluto will then perhaps have a thick atmosphere and a liquid-water surface. Collectively, the worlds in this new habitable zone, from comet-like space rocks to dwarf planets like Eris and Sedna, will have three times as much surface area as all four of the inner solar system planets combined. This might seem like an academic discussion only relevant to our distant descendants, if they’re lucky enough to survive billions of years from now. However, as has been pointed out by a few astronomers, there are around a billion red giant stars in the Milky Way galaxy today. That is a lot of places for living beings to evolve and then perish as their stars consume them. Who knows what will be in the future but it is fun to speculate!

This week, as we come to the end of 2021, I am reminded of something I shared here in November 2020, and it feels appropriate to repeat it.

“We are all visitors to this time, this place. We are just passing through. Our
purpose here is to observe, to learn, to grow, to love… and then we return
home.” ~ Australian Aboriginal Proverb


Nadolig Llawen a Blwyddyn Newydd Dda!

Or to those who do not speak Welsh, “Merry Christmas and a Happy New Year!” In the last couple of weeks I have written about Christmas. As I said last week, it is generally celebrated on December 25 each year and is a sacred religious holiday as well as being a worldwide cultural and commercial phenomenon. For two millennia, people around the world have been observing it with traditions and practices that are both religious and secular in nature. So we have been remembering and celebrating each year for a very long time. I recently watched an episode of ‘Star Trek: The Next Generation’, with the starship Enterprise and its captain, Jean-Luc Picard, in command. It showed how a planet which was not as yet advanced enough to be a member of the United Federation of Planets was monitored and the inhabitants of that planet were secretly watched, to see how they were progressing. This wasn’t in any way to interfere with them, but just to observe. However, the watchers were discovered and the captain of the Enterprise was therefore seen as some omnipotent super-being, a god who could restore life to the dead. Picard had the difficult job of showing how he and everyone else had a finite life, that they could be hurt, injured and would eventually die. But the planet’s inhabitants really did take some convincing. Picard pointed out that these inhabitants had begun life living in caves, then gradually they progressed to building structures, but that their cave-dwelling ancestors would have seen them as people to be worshipped because of their skills. I think the writers of that Star Trek episode did very well, because if we were to go back two thousand years and use the skills we have learned over that time, what would the people of that time think of us? So no matter what our beliefs may be, here we are. Humans in the 21st century. Here on Earth, we are the most abundant and widespread species of primate, characterised by bipedalism along with large, complex brains. This has enabled the development of advanced tools, culture and language. We are highly social and tend to live in complex social structures composed of many cooperating and competing groups, from families and kinship networks to political states. Our social interactions have led to a wide variety of values, social norms and rituals which bolster human society. Curiosity and a human desire to understand and influence the environment and to explain and manipulate phenomena have motivated our development of science, philosophy, mythology as well as religion and other fields of study. Although some scientists equate the term ‘human’ with all members of the genus Homo, in common usage it generally refers to Homo sapiens, which emerged around 300,000 years ago in Africa, evolving from Homo heidelbergensis and migrating out of Africa, gradually replacing local populations of archaic humans. For most of history, all humans were nomadic hunter-gatherers but the Neolithic Revolution, which began in South-west Asia around 13,000 years ago, saw the emergence of agriculture and permanent human settlement. As populations became larger and denser, forms of governance developed within and between communities and a number of civilisations have risen and fallen. Humans have continued to expand, with a global population of over 7.9 billion in December 2021. Genes as well as the environment influence human biological variation in visible characteristics, physiology, disease susceptibility, mental abilities, body size and life span. 
Though humans vary in many ways, genetically two humans on average are over 99% similar. Generally, men have greater body strength and women have a higher body fat percentage. We are omnivorous, capable of consuming a wide variety of plant and animal material, and have become capable of using fire and other forms of heat to both prepare and cook food. We can survive for up to eight weeks without food, and three or four days without water. We are generally diurnal, sleeping on average seven to nine hours per day. It is quite usual for both the mother and the father to provide care for their children, who are helpless at birth. Over the ages humans have grown and developed, and we presently have a large and highly developed prefrontal cortex, the region of the brain associated with higher cognition. We are intelligent, capable of episodic memory, flexible facial expressions, self-awareness and a theory of mind which is fully capable of introspection, private thought, imagination, volition and forming views on our existence. This has made many great technological advancements and complex tool development possible through reason and the transmission of knowledge to future generations. Language, art and trade are defining characteristics of us humans. Long-distance trade routes have led to cultural explosions and resource distribution that gave humans an advantage over other species.

Interestingly, the word ‘human’ is a Middle English word from the Old French ‘humain’ and ultimately from the Latin ‘hūmānus’, the adjectival form of ‘homō’, or ‘man’ in the sense of humankind. The native English term ‘man’ can refer to the species generally (as a synonym for humanity) as well as to human males. It may also refer to individuals of either sex, though this latter form is less common in contemporary English. Until about 12,000 years ago, all humans lived as hunter-gatherers. Then came the Neolithic Revolution, the invention of agriculture, which first took place in South-west Asia and spread through large parts of the ‘Old World’, consisting of Africa, Europe and Asia. This was before contact with the Americas, which became known as the New World. Agriculture also occurred independently about 6,000 years ago in such places as Papua New Guinea and some regions of Africa. Access to food surplus led to the formation of permanent human settlements, the domestication of animals and the use of metal tools for the first time in history. Agriculture and sedentary lifestyles led to the emergence of early civilisations. Then an urban revolution took place in the fourth millennium BCE with the development of city states, particularly Sumerian cities located in Mesopotamia, and it was in these cities that the earliest known form of writing, cuneiform script, appeared around 3000 BCE. Other major civilisations developing around this time were Ancient Egypt and the Indus Valley. They eventually traded with each other and invented technology such as wheels, ploughs and sails.

Agriculture and domestication of animals led to stable human settlements.

This is getting to be more of a history lesson than I’d realised! But bear with me please. Astronomy and mathematics were also developed and the Great Pyramid of Giza was built. But there is evidence of a severe drought lasting about a hundred years that may have caused the decline of these civilisations, with new ones appearing in the aftermath. Babylonians came to dominate Mesopotamia while others, such as the Minoans and the Shang dynasty, rose to prominence in new areas. The Bronze Age suddenly collapsed about 1200 BCE, resulting in the disappearance of a number of civilisations and the beginning of the Greek Dark Ages. During this period iron started replacing bronze, leading to the Iron Age. In the 5th century BCE, history started being recorded as a discipline, so providing a much clearer picture of life at that time. Between the 8th and 6th centuries BCE, Europe entered the Classical Antiquity age, a period when Ancient Greece and Ancient Rome flourished and around this time other civilisations also came to prominence. The Mayan civilisation started to build cities and also create complex calendars whilst in Africa, the kingdom of Aksum overtook the declining kingdom of Kush and facilitated trade between India and the Mediterranean. In West Asia, the Achaemenid Empire’s system of centralised governance became the precursor to many later empires, while the Gupta Empire in India and the Han Dynasty in China have been described as ‘golden ages’ in their respective regions. Following the fall of the Western Roman Empire in 476 CE, Europe entered the Middle Ages and it was during this period that Christianity and the Church would become the source of centralised authority and education. In the Middle East, Islam became the prominent religion and expanded into North Africa. It led to an Islamic Golden Age, inspiring achievements in architecture, the revival of old advances in science and technology and the formation of a distinct way of life. The Christian and Islamic worlds would eventually clash, with the kingdom of England, the kingdom of France and the Holy Roman Empire declaring a series of ‘holy wars’ to regain control of the Holy Land from Muslims. In the Americas, complex societies would arise starting around 800 CE, whilst further south the Aztecs and Incas would become the dominant powers. The Mongol Empire would conquer much of Eurasia in the 13th and 14th centuries and over this same time period the Mali Empire in Africa grew to become the largest empire on the continent, stretching from Senegambia to the Ivory Coast. Oceania would see the rise of the Tu’i Tonga empire which expanded across many islands in the South Pacific. It was throughout the early modern period (1500–1800) that the Ottomans controlled the lands around the Mediterranean Basin, whilst Japan entered the Edo period, the Qing dynasty rose in China and the Mughal empire ruled much of India. Europe underwent the Renaissance, starting in the 15th century, and the Age of Discovery began with the exploring and the colonising of new regions. This included the British Empire, which expanded to become the world’s largest empire, and the colonisation of the Americas. This great expansion led to the Atlantic slave trade and the genocide of Native American peoples. The period also marked the start of the Scientific Revolution, with great advances in mathematics, mechanics, astronomy and physiology. 
The late modern period, 1800 to the present, saw the Industrial and Technological revolutions bring advances in transport, energy development and imaging technology. The United States of America underwent great change, going from a small group of colonies to one of the global super-powers. The Napoleonic Wars had raged right through Europe in the early 1800s, Spain lost most of its New World colonies and Europeans continued expansion into Oceania as well as Africa, where European control went from 10% to almost 90% in less than fifty years. A tenuous balance of power among European nations collapsed in 1914 with the outbreak of the First World War, one of the deadliest conflicts in history. In the 1930s a worldwide economic crisis led to the rise of some authoritarian regimes and a Second World War, involving almost all of the world’s countries. Following its conclusion in 1945, the Cold War between the USSR and the USA saw a struggle for global influence, which included a nuclear arms race as well as a space race. What I believe is now the Information Age sees the world becoming increasingly globalised and interconnected.

Early human settlements were dependent on proximity to water and, depending on the lifestyle, other natural resources used for subsistence such as populations of animal prey for hunting as well as arable land for growing crops and grazing livestock. Modern humans, however, have a great capacity for altering their habitats by means of technology, irrigation, urban planning, construction, deforestation and desertification. It has been said that human settlements continue to be vulnerable to natural disasters, especially those placed in hazardous locations and with low quality of construction! Grouping and deliberate habitat alteration is often done with the goals of providing protection, accumulating comforts or material wealth, expanding the available food, improving aesthetics, increasing knowledge or enhancing the exchange of resources. It is also said that humans are one of the most adaptable species, despite having a relatively narrow tolerance to many of the earth’s extreme environments. Through invention, humans have been able to extend their tolerance to a wide variety of temperatures, humidity and altitudes. As a result, we are a cosmopolitan species found in almost all regions of the world, including tropical rainforests, arid deserts, extremely cold arctic regions and heavily polluted cities. Most other species are confined to a few geographical areas by their limited adaptability. The human population is not, however, uniformly distributed on the Earth’s surface; the population density varies from one region to another and there are large areas which are almost completely uninhabited, like Antarctica and the vast swathes of ocean. Most humans live in Asia (61%), with the remainder in the Americas (14%), Africa (14%), Europe (11%) and Oceania (0.5%). Within the last century, humans have explored challenging environments such as Antarctica, the deep sea and outer space. But human habitation within these hostile environments is restrictive and expensive, typically limited in duration, and restricted to scientific, military or industrial expeditions. We have briefly visited the Moon and have made our presence felt on other celestial bodies through robotic spacecraft. In addition, since 2000 there has been a continuous human presence in space through the habitation of the International Space Station. Estimates of the population at the time agriculture emerged in around 10,000 BC have ranged between 1 million and 15 million. Around 50–60 million people lived in the combined eastern and western Roman Empire in the 4th century AD. Bubonic plagues, first recorded in the 6th century AD, reduced the population by 50%, with the Black Death killing 75–200 million people in Eurasia and North Africa alone. The human population was believed to have reached one billion in 1800 and then increased exponentially, reaching two billion in 1930 and three billion in 1960, four in 1975, five in 1987 and six billion in 1999. It passed seven billion in 2011 and in 2020 there were 7.8 billion of us. In 2018, 4.2 billion humans (55%) lived in urban areas, up from 751 million in 1950, with the most urbanised regions being Northern America (82%), Latin America (81%), Europe (74%) and Oceania (68%), and with Africa and Asia having nearly 90% of the world’s 3.4 billion rural population. Problems for humans living in cities include various forms of pollution and crime, especially in inner city and suburban slums. 
We have had a dramatic effect on the environment as we are ‘apex’ predators, being rarely preyed upon by other species. Human population growth, industrialisation, land development, overconsumption and combustion of fossil fuels have led to environmental destruction and pollution that significantly contributes to the ongoing mass extinction of other forms of life. We are the main contributor to global climate change, which may accelerate the Holocene extinction, otherwise referred to as the sixth mass extinction or Anthropocene extinction. This is an ongoing extinction event of species during the present Holocene epoch, the more recent part of which is sometimes called the Anthropocene as a result of human activity. The most popular theory is that human overhunting of species added to existing stress conditions, as the extinction coincides with human emergence. Although there is debate regarding how much human predation affected their decline, certain population declines have been directly correlated with human activity, such as the extinction events of New Zealand and Hawaii. Aside from humans though, climate change may have been a driving factor in the megafaunal extinctions, especially at the end of the Pleistocene period. Ecologically, humanity has been noted as an unprecedented ‘global super-predator’ that consistently preys on the adults of other ‘apex’ predators and has worldwide effects on food webs. There have been extinctions of species on every land mass and in every ocean. Overall, the Holocene extinction can be linked to the human impact on the environment and it continues into the 21st century, with meat consumption, human population growth and increasing per-capita consumption considered to be the primary drivers of the decline.

Earth as seen from Space.

The above image shows the Earth as seen from Space in 2016, revealing the extent of human occupation of the planet. The bright lights signify both the most densely inhabited areas and ones financially capable of illuminating them. But there is relatively little variation between human geographical populations, and most of the variation that occurs is at the individual level. Much of human variation is continuous, often with no clear points of demarcation. Genetic data shows that no matter how population groups are defined, two people from the same population group are almost as different from each other as two people from any two different population groups. Dark-skinned populations that are found in Africa, Australia, and South Asia are not closely related to each other. As for our culture, the most widely spoken languages are English, Mandarin Chinese, Hindi, Spanish, Standard Arabic, Bengali, French, Russian, Portuguese and Urdu. Our most practised religions are Christianity, Islam, Hinduism, Buddhism, some Folk religions, Sikhism, Judaism as well as a few unaffiliated ones. Language is the principal form of communication and unique to humans, although many other species have their own forms of communication. Unlike the limited systems of other animals, human language is open, as an infinite number of meanings can be produced by combining a limited number of symbols. Human language also has the capacity of displacement, using words to represent things and happenings that are not presently or locally occurring but reside in the shared imagination of others. Language differs from other forms of communication in that the same meanings can be conveyed through different media, either audibly in speech, visually by sign language or writing and through tactile media such as Braille. Language is central to the communication between humans, and to the sense of identity that unites nations, cultures and ethnic groups. There are approximately six thousand different languages currently in use, including sign languages, and many thousands more that are extinct. But unlike speaking, reading and writing do not come naturally to us and must be taught. Despite this, forms of literature have been present since before the invention of writing, with 30,000-year-old paintings on walls inside some caves portraying a series of dramatic scenes. One of the oldest surviving works of literature is the Epic of Gilgamesh, first engraved on ancient Babylonian tablets about 4,000 years ago. Beyond simply passing down knowledge, the use and sharing of imaginative fiction through stories might have also helped develop the human capabilities for communication and increased the likelihood of securing a mate. Storytelling may also be used as a way to provide the audience with moral lessons and encourage cooperation. We are often the subject of the arts; while most art focuses on individual humans or a small group, in literature the genre of science fiction is known for tackling issues related to humanity as a whole, for example topics such as human evolution or the future of civilisation. This aspect is definitely something I have seen quite clearly in episodes of Star Trek. I feel that we have learned much, yet there is so much more for us to hopefully learn and share with others in a good and positive way that will be of benefit to us all.

This week, I have read…
We all know that Santa has a sleigh, on which he puts all the presents which must be delivered. He has his reindeer, all ready to take him around the world. For Santa, time is special and so that he can get all that must be done in good time he takes with him a Workshop elf. This elf makes sure the sleigh is in good working order and that the presents are packed correctly. The elf is also an engineer and will do repairs if needed, especially with tower blocks going higher and higher as well as aerials, satellite dishes and the like. It’s a hard life, but as you open your presents and thank Santa, please spare a thought for the Workshop elf…


Christmas 2021

Christmas is celebrated on December 25 each year and is a sacred religious holiday as well as being a worldwide cultural and commercial phenomenon. For two millennia, people around the world have been observing it with traditions and practices that are both religious and secular in nature. Christians celebrate Christmas Day as the anniversary of the birth of Jesus of Nazareth, a spiritual leader whose teachings form the basis of their religion. Popular customs include exchanging gifts, decorating Christmas trees, attending church, sharing meals with family and friends and, of course, waiting for Santa Claus to arrive. But the middle of winter has long been a time of celebration around the world. Centuries before the arrival of the man called Jesus, early Europeans celebrated light and birth in the darkest days of winter. Many peoples rejoiced during the winter solstice, when the worst of the winter was behind them and they could look forward to longer days and extended hours of sunlight. In Scandinavia, the Norse celebrated Yuletide from December 21, the winter solstice, through to January. In recognition of the return of the sun, fathers and sons would bring home large logs, which they would set on fire. The people would feast until the log burned out, which could take as many as 12 days. The Norse believed that each spark from the fire represented a new pig or calf that would be born during the coming year.

1848 image of Queen Victoria, Prince Albert and their children.

The end of December was a perfect time for celebration in most areas of Europe. At that time of year, most cattle were slaughtered so they would not have to be fed during the winter. In fact for many, it was the only time of year when they had a supply of fresh meat. As well as that, most wine and beer made during the year was fully fermented and ready for drinking. In Germany, people honoured the pagan god Oden during the mid-winter holiday. Germans were terrified of Oden, as they believed he made nocturnal flights through the sky to observe his people, and then decide who would prosper or perish. Because of his presence, many people chose to stay inside. In Rome, where winters were not as harsh as those in the far north, Saturnalia—a holiday in honour of Saturn, the god of agriculture—was celebrated. Beginning in the week leading up to the winter solstice and continuing for a full month, this was a time when food and drink were plentiful and the normal Roman social order was turned upside down. For a month, enslaved people were given temporary freedom and treated as equals. Businesses and schools were closed so that everyone could participate in the holiday’s festivities. Also around the time of the winter solstice, Romans observed Juvenalia, a feast honouring the children of Rome. In addition, members of the upper classes often celebrated the birthday of Mithra, the god of the unconquerable sun, on December 25. It was believed that Mithra, an infant god, was born of a rock. For some Romans, Mithra’s birthday was the most sacred day of the year. However, in the early years of Christianity, Easter was the main holiday and the birth of Jesus was not celebrated. In the fourth century, church officials decided to institute the birth of Jesus as a holiday. Unfortunately, the Bible does not mention a date for his birth, a fact Puritans later pointed out in order to deny the legitimacy of the celebration. Some evidence suggests that his birth may have occurred in the spring, one argument put forward being that shepherds would hardly have been out herding in the middle of winter. Pope Julius I chose December 25 and it is commonly believed that the church chose this date in an effort to adopt and absorb the traditions of the pagan Saturnalia festival. First called the Feast of the Nativity, the custom spread to Egypt by 432 AD and to England by the end of the sixth century. So by holding Christmas at the same time as traditional winter solstice festivals, church leaders increased the chances that Christmas would be popularly embraced, but gave up the ability to dictate how it was celebrated. By the Middle Ages Christianity had, for the most part, replaced pagan religion. However at Christmas, believers attended church, then celebrated raucously in a drunken, carnival-like atmosphere similar to today’s Mardi Gras. Each year, a beggar or student would be crowned the “lord of misrule” and eager celebrants played the part of his subjects. The poor would go to the houses of the rich and demand their best food and drink and if owners failed to comply, their visitors would most likely terrorise them with mischief. Christmas became the time of year when the upper classes could repay their real or imagined “debt” to society by entertaining less fortunate citizens. It was in the early 17th century that a wave of religious reform changed the way Christmas was celebrated in Europe. 
When Oliver Cromwell and his Puritan forces took over England in 1645, they vowed to rid England of decadence and, as part of their effort, cancelled Christmas. But by popular demand, Charles II was restored to the throne and with him came the return of the popular holiday. The Pilgrims, English separatists who came to America in 1620, were even more orthodox in their Puritan beliefs than Cromwell. As a result, Christmas was not a holiday in early America. From 1659 to 1681, the celebration of Christmas was actually outlawed in Boston; in fact Ebenezer Scrooge had nothing on the 17th-century Puritans, who banned the public celebration of Christmas in the Massachusetts Bay Colony for an entire generation, and anyone exhibiting the Christmas spirit was fined five shillings. By contrast, in the Jamestown settlement, Captain John Smith reported that Christmas was enjoyed by all and passed without incident. It seems though that in America, after the American Revolution, English customs fell out of favour, including Christmas. In fact Christmas wasn’t declared a federal holiday until June 26, 1870. So it seems that it wasn’t until the 19th century that Americans began to embrace Christmas. There are even those who have said that Americans re-invented Christmas and changed it from a raucous carnival holiday into a family-centred day of peace and nostalgia. I am not sure I can agree with that! But what was it about the 1800s that piqued American interest in the holiday? There, the early 19th century was a period of class conflict and turmoil. During this time, unemployment was high and gang rioting by the disenchanted classes often occurred during the Christmas season. In 1828, the New York city council instituted the city’s first police force in response to a Christmas riot. This encouraged quite a few members of the upper classes to begin to change the way Christmas was celebrated in America. In 1819, best-selling author Washington Irving wrote The Sketchbook of Geoffrey Crayon, a series of stories about the celebration of Christmas in an English manor house. The sketches featured a squire who invited the peasants into his home for the holiday. In contrast to the problems faced in American society, this showed how the two groups mingled effortlessly. In Irving’s mind, Christmas should be a peaceful, warm-hearted holiday bringing groups together across lines of wealth or social status. Irving’s fictitious celebrants enjoyed ‘ancient customs’ including the crowning of a Lord of Misrule. Irving’s book, however, was not based on any holiday celebration he had attended and in fact, many historians say that Irving’s account actually “invented” tradition by implying that it described the true customs of the season. Also around this time, English author Charles Dickens created the classic holiday tale, A Christmas Carol. The story’s message, the importance of charity and good will towards all humankind, struck a powerful chord in England as well as the United States and showed members of Victorian society the benefits of celebrating the holiday. The family was also becoming less disciplined and more sensitive to the emotional needs of children during the early 1800s. Christmas provided families with a day when they could lavish attention and gifts on their children without appearing to ‘spoil’ them. As people began to embrace Christmas as a perfect family holiday, old customs were unearthed. 
They looked toward recent immigrants and Catholic and Episcopalian churches to see how the day should be celebrated and in time a Christmas tradition was built which included pieces of many other customs, including decorating trees and exchanging gifts. But although most families quickly bought into the idea that they were celebrating Christmas as it had been done for centuries, some Americans believed they had re-invented a holiday to fill the cultural needs of a growing nation.

In my research I have found a few questions regarding the legend of Santa Claus, which can be traced back to a monk named St. Nicholas. Born in Turkey around 280 A.D., St. Nicholas gave away all of his inherited wealth and travelled the countryside helping the poor and sick, becoming known as the protector of children and sailors. The modern character of Santa is based on traditions surrounding St. Nicholas, with Santa generally depicted as a portly, jolly, white-bearded man, often wearing spectacles, a red coat with white fur collar and cuffs, white-fur-cuffed red trousers, red hat with white fur, and black leather belt and boots, carrying a bag full of gifts for children. He is commonly portrayed as laughing in a way that sounds like “ho ho ho”. This image became popular in the 19th century due to the significant influence of the poem ‘A Visit From St. Nicholas’, also known as The Night Before Christmas and ’Twas The Night Before Christmas after its first line. The poem was first published anonymously in 1823 and later attributed to Clement Clarke Moore, who claimed its authorship in 1837. The story is that on the night of Christmas Eve, a family is settling down to sleep when the father is disturbed by noises on the lawn outside. Looking out of the window, he sees Saint Nicholas on a sleigh which is pulled by eight reindeer. After landing his sleigh on the roof, Saint Nicholas enters the house down the chimney, carrying a sack of toys. The father watches his visitor fill the stockings which are hanging by the fireplace and laughs to himself. They share a conspiratorial moment before Saint Nicholas bounds up the chimney again. As he flies away, he wishes a “Happy Christmas to all, and to all a good night.” So St. Nicholas became known by various names such as Santa Claus, Saint Nick, Kris Kringle or simply ‘Santa’. He is said to bring gifts of toys and sweets on Christmas Eve to well-behaved children and either coal or nothing to naughty children. He is said to accomplish this with the aid of Christmas elves who make the toys in his workshop at the North Pole, distributing the gifts around the world on his sleigh which is pulled through the air by flying reindeer. Christmas traditions around the world are quite diverse, but they share key traits that often involve themes of light, evergreens and hope. Probably the most celebrated holiday in the world, our modern Christmas is a product of hundreds of years of both secular and religious traditions from around the globe, many of them centred on the winter solstice. Most people in Scandinavian countries honour St. Lucia (also known as St. Lucy) each year on December 13. The celebration of St. Lucia Day began in Sweden, but had spread to Denmark and Finland by the mid-19th century. In these countries, the holiday is considered the start of the Christmas season and is sometimes referred to as “little Yule.” Traditionally, the oldest daughter in each family rises early, dressed in a long, white gown with a red sash, and wearing a crown made of twigs with nine lighted candles. She wakes each of her family members and for the day, she is called “Lussi” or “Lussibruden” (Lucy bride). The family then eats breakfast in a room lighted with candles. Any shooting or fishing done on St. Lucia Day was done by torchlight, and people brightly illuminated their homes. At night, men, women and children would carry burning torches in a parade. The night would end when everyone threw their torches onto a large pile of straw, creating a huge bonfire. 
In Finland today, one girl is chosen to serve as the national Lucia and she is honoured in a parade in which she is surrounded by torchbearers. Light is a main theme of St. Lucia Day as her name, which is derived from the Latin word lux, means light. Her feast day is celebrated near the shortest day of the year, when the sun’s light again begins to strengthen. Lucia lived in Syracuse during the fourth century, when persecution of Christians was common. Unfortunately, most of her story has been lost over the years but according to one common legend, Lucia lost her eyes while being tortured for her Christian beliefs during the persecution under the Emperor Diocletian. Others say she may have plucked her own eyes out to protest at the poor treatment of Christians and it is for that reason St. Lucia is the patron saint of the blind. In Finland, many Finns visit the sauna on Christmas Eve and families gather and listen to the national “Peace of Christmas” radio broadcast. It is also the custom there to visit the gravesites of departed family members. In Norway, the birthplace of the Yule log, I have learned that the ancient Norse used the Yule log in their celebration of the return of the sun at winter solstice. “Yule” came from the Norse word ‘hweol’, meaning wheel. The Norse believed that the sun was a great wheel of fire that rolled towards and then away from the earth. If you ever wonder why the family fireplace is such a central part of the typical Christmas scene it is because this tradition dates back to the Norse Yule log. It is probably also responsible for the popularity of log-shaped cheese, cakes and desserts during the holidays. But the tradition of decorating Christmas trees comes from Germany and decorating evergreen trees had always been a part of the German winter solstice tradition. The first Christmas trees explicitly decorated and named after the Christian holiday appeared in Strasbourg (part of Alsace) in the beginning of the 17th century. After 1750, Christmas trees began showing up in other parts of Germany, and even more so after 1771, when Johann Wolfgang von Goethe visited Strasbourg and promptly included a Christmas tree in his novel, The Sorrows of Young Werther. But over in Mexico, papier-mâché sculptures called piñatas are filled with sweets and coins and hung from the ceiling. Children then take turns hitting the piñata until it breaks, sending a shower of treats to the floor. The children race to gather as many of the items as they can. In 1828, the American minister to Mexico, Joel R. Poinsett, brought a red-and-green plant from Mexico to America. As its colouring seemed perfect for the new holiday, the plants, called poinsettias after Poinsett, began appearing in greenhouses as early as 1830. In 1870, New York stores began to sell them at Christmas and by 1900, they were a universal symbol of the holiday. It may come as no surprise that a manger scene is the primary decoration in Central American, South American and most southern European nations, as St. Francis of Assisi created the first living nativity in 1224 to help explain the birth of Jesus to his followers. Further north, most Canadian Christmas traditions are very similar to those practiced in the United States. In the far north of the country, indigenous Inuits celebrate a winter festival called Sinck Tuck, which features parties with dancing and the exchanging of gifts.

Over in France, Christmas is called Noel. This comes from the French phrase les bonnes nouvelles, which means “the good news” and refers to the gospel. In southern France, some people burn a log in their homes from Christmas Eve until New Year’s Day. This stems from an ancient tradition in which farmers would use part of the log to ensure good luck for the next year’s harvest. Equally, Italians call Christmas ‘il Natale,’ meaning “the birthday” whilst in Greece, many people believe in the ‘kallikantzeri’, goblins that appear and cause mischief during the 12 days of Christmas. Gifts are usually exchanged on January 1, St. Basil’s Day. But down in Australia, the holiday comes in the middle of summer and it’s not unusual for some parts of Australia to hit 100 degrees Fahrenheit on Christmas Day. During the warm and sunny Australian Christmas season, beach time and outdoor barbecues are common. Traditional Christmas Day celebrations include family gatherings, exchanging gifts and either a hot meal with ham, turkey, pork or seafood or barbecues. Here in Britain I have learned that the tradition of exchanging Christmas cards can be traced back to England. An Englishman named John Calcott Horsley helped to popularise the tradition of sending Christmas greeting cards when, in the late 1830s, he began producing small cards featuring festive scenes and a pre-written holiday greeting. Our Post Office dates way back to 1660, when it was established by Charles II; under the guise of the General Post Office (GPO), it soon grew into an important organisation, integral to the infrastructure of England during the seventeenth century. Thanks to this postal network, the exchanging of these cards made them almost overnight sensations. Celtic and Teutonic peoples had long considered mistletoe to have magic powers and it was said to have the ability to heal wounds and increase fertility. The Celts hung mistletoe in their homes in order to bring themselves good luck and ward off evil spirits and in the Victorian era, during holidays the English would hang sprigs of mistletoe from ceilings and in doorways. If someone was found standing under the mistletoe, they would be kissed by someone else in the room, although this was a behaviour that was not usually demonstrated in Victorian society. A favourite food at this time of year is Christmas pudding, also known as ‘figgy pudding’ or plum pudding, an English dish dating back to the Middle Ages. Suet, flour, sugar, raisins, nuts and spices are tied loosely in cloth and boiled until the ingredients are “plum,” meaning they have enlarged enough to fill the cloth. It is then unwrapped, sliced like cake and topped with cream. Also, ‘Carolling’ began in England, when wandering musicians would travel from town to town visiting castles and homes of the rich. In return for their performance, the musicians hoped to receive a hot meal or money. In most countries nowadays I think children hang stockings on their bedpost or near a fireplace on Christmas Eve, hoping that they will be filled with treats while they sleep. In Scandinavia, similar-minded children leave their shoes on the hearth. But the best one has to be in the Ukraine, where Ukrainians prepare a traditional twelve-course meal and the family’s youngest child keeps watch through the window for the evening star to appear, a signal that the feast can begin. I do wonder if the older children actually sit and wait…

This week, a Fascinating Fact…
We know the word ‘emphatic’, but there is also ‘phatic’. A phatic expression denotes or relates to language used for general purposes of social interaction, rather than to convey information or ask questions. Utterances such as “hello, how are you?” and “nice morning, isn’t it?” are phatic expressions.


Christmas Approaches…

In a couple of days’ time it will be December 12th. Some of you reading this may be reading it on that very date, but for many it will still be a couple of days away. Already the shops will be getting a little bit busier, although in no way do I think they will be as busy as a few years ago. That has been due to the changes in our lifestyles over the last few years. I see mentions on Facebook of folk who have already put their Christmas decorations up, and the same with trees. I mention December 12th as it was my dad’s birthday and although he sadly passed away some years ago now, I still follow our family tradition of putting up decorations, cards and the like starting on that day. I found it interesting though to research the history of Christmas trees, which goes back to the symbolic use of evergreens in ancient Egypt and Rome. Long before the advent of Christianity, plants and trees that remained green all year had a special meaning for people in the winter. Just as people today decorate their homes during the festive season with trees such as pine, spruce, and fir, ancient peoples hung evergreen boughs over their doors and windows. In many countries it was believed that evergreens would keep away witches, ghosts, evil spirits, and illness. Here in the Northern hemisphere, the shortest day and longest night of the year falls on December 21 or December 22 and is called the winter solstice. Many ancient people believed that the sun was a god and that winter came every year because the sun god had become sick and weak. They celebrated the solstice because it meant that at last the sun god would begin to get well. Evergreen boughs reminded them of all the green plants that would grow again when the sun god was strong and summer would return. The ancient Egyptians worshipped a god called Ra, who had the head of a hawk and wore the sun as a blazing disk in his crown. At the solstice, when Ra began to recover from his illness, the Egyptians filled their homes with green palm rushes, which symbolised for them the triumph of life over death. Early Romans marked the solstice with a feast called Saturnalia in honour of Saturn, the god of agriculture. The Romans knew that the solstice meant that soon, farms and orchards would be green and fruitful once more so to mark the occasion, they decorated their homes and temples with evergreen boughs. In Northern Europe the Druids, the priests of the ancient Celts, also decorated their temples with evergreen boughs as a symbol of everlasting life whilst the Vikings in Scandinavia thought that evergreens were the special plant of the sun god, Balder. But Germany is credited with starting the Christmas tree tradition as we now know it back in the 16th century when devout Christians brought decorated trees into their homes. Some built Christmas pyramids of wood and decorated them with evergreens and candles if wood was scarce. It is a widely held belief that Martin Luther, the 16th-century Protestant reformer, first added lighted candles to a tree. Walking toward his home one winter evening, composing a sermon, he was awed by the brilliance of stars twinkling through all the evergreens. To recapture the scene for his family, he erected a tree in the main room and wired its branches with lighted candles. Most 19th-century Americans found Christmas trees an oddity though. The first record of one being on display was in the 1830s by the German settlers of Pennsylvania, although trees had been a tradition in many German homes much earlier. 
The Pennsylvania German settlements had community trees as early as 1747 but as late as the 1840s Christmas trees were seen as pagan symbols and not accepted by most Americans. It is not surprising that, like many other festive Christmas customs, the tree was adopted so late in America. To the New England Puritans, Christmas was sacred. The second governor of the Pilgrims, William Bradford, wrote that he tried hard to stamp out “pagan mockery” of the observance, penalising any frivolity. Also, Oliver Cromwell preached against “the heathen traditions” of Christmas carols, decorated trees, and any joyful expression that desecrated “that sacred event.” In 1659, the General Court of Massachusetts enacted a law making any observance of December 25 (other than a church service) a penal offence; in addition, people were fined for hanging decorations. That stern solemnity continued until the 19th century, when the influx of German and Irish immigrants undermined the Puritan legacy.

An illustration from a December 1848 edition of the Illustrated London News shows Queen Victoria and her family surrounding a Christmas tree.
Bettmann Archive/Getty Images

In 1846 the popular royals, Queen Victoria and Prince Albert, were sketched in the Illustrated London News standing with their children around a Christmas tree. Unlike the previous royal family, Victoria was very popular with her subjects, and what was done at court immediately became fashionable, not only in Britain but with fashion-conscious East Coast American Society. The Christmas tree had arrived. By the 1890s Christmas ornaments were arriving from Germany and Christmas tree popularity was on the rise around the U.S.A. It was noted that Europeans used small trees about four feet in height, while Americans liked their Christmas trees to reach from floor to ceiling. The early 20th century saw Americans decorating their trees mainly with homemade ornaments, while the German-American sect continued to use apples, nuts, and marzipan biscuits. Popcorn joined in after being dyed bright colours and interlaced with berries and nuts. Electricity brought about Christmas lights, making it possible for Christmas trees to glow for days on end. With this, Christmas trees began to appear in town squares across the country and having a Christmas tree in the home became a tradition around the world, but their history varies from country to country. Here are just a few examples.

Down in Brazil, although Christmas falls during the summer there, they sometimes decorate pine trees with little pieces of cotton that represent falling snow whilst in China, of the small percentage of Chinese who do celebrate Christmas, most erect artificial trees decorated with spangles and paper chains, flowers, and lanterns. Christmas trees are called “trees of light.” In Canada, the German settlers who migrated there from the United States in the 1700s brought with them many of the things associated with Christmas we cherish today, for example Advent calendars, gingerbread houses, biscuits and of course Christmas trees. When Prince Albert put up a Christmas tree at Windsor Castle in 1848, the Christmas tree became a tradition throughout the United Kingdom, the United States, and Canada. Over in Germany, besides the Martin Luther legend, another says that in the early 16th century, people in Germany combined two customs that had been practiced in different countries around the globe. The Paradise tree (a fir tree decorated with apples) represented the Tree of Knowledge in the Garden of Eden. The Christmas Light, a small, pyramid-like frame, usually decorated with glass balls, tinsel and a candle on top, was a symbol of the birth of Christ as the Light of the World. Changing the tree’s apples to tinsel balls and biscuits and combining this new tree with the light placed on top, the Germans created the tree that many of us know today. I understand that a modern Tannenbaum is traditionally decorated in secret with lights, tinsel and ornaments by parents and then lit and revealed on Christmas Eve with sweets, nuts and gifts under its branches. I’ve learned that down in Guatemala the Christmas tree has joined the “Nacimiento” (Nativity scene) as a popular ornament, thought to be because of the large German population there. Gifts are left under the tree on Christmas morning for the children but for some reason parents and adults do not exchange gifts until New Year’s Day. Here in Britain, the Norway spruce is the traditional species used to decorate homes. This was in fact a native species in the British Isles before the last Ice Age and was reintroduced here before the 1500s, but since December 1947 a Christmas tree has been an annual gift to the people of Britain from Norway as a token of gratitude for British support to Norway during the Second World War. The first tree was cut down by Mons Urangsvåg in 1942 during a raid on the Norwegian island called Hisøy, which is located on the west coast between Bergen and Haugesund. After it was cut down, the tree was then transported to England where the Norwegian King was in exile, and given to him as a gift. It is possible to visit the island of Hisøy but only by boat, and from the old tree stump a new tree has since grown. The Christmas tree has been a gift to the people of Britain from Norway every year since then and has provided a central focus for the Trafalgar Square traditional carol-singing programme which is performed by different groups raising money for voluntary or charitable organisations. It is prominently displayed from the beginning of December until 6 January the following year, the Twelfth Night of Christmas, when it is taken down for recycling. The tree is chipped and composted, to make mulch. 
It is typically a fifty to sixty-year-old Norway spruce, generally over twenty metres tall, and is cut in Norway some time in November during a ceremony attended by the British Ambassador to Norway, the Mayor of Oslo and the Lord Mayor of Westminster. After the tree is cut it is shipped to the UK; at one time it was brought over to Felixstowe free of charge by a cargo ship of the Fred Olsen Line, then from around 2007 it was brought into Immingham by the DFDS Tor Line, but since 2018 it has been the responsibility of the Radius Group to transport, guard and erect the tree in Trafalgar Square. The tree is decorated in a traditional Norwegian style and adorned with 500 white lights; since 2008 these have been low-wattage halogen bulbs, using just 3.5kW of power.

Different countries have slightly different traditions when it comes to Christmas trees. In Ireland, they are bought at any time in December and decorated with coloured lights, tinsel and baubles. Some people favour an angel on top of the tree, others a star. The house is decorated with garlands, candles, holly and ivy, whilst wreaths and mistletoe are hung on the door. In Italy, the presepio (manger or crib) represents in miniature the Holy Family in the stable and is the centre of Christmas for families. Guests kneel before it and musicians sing before it. The presepio figures are usually hand-carved and very detailed in features and dress, and the scene is often set out in the shape of a triangle. It provides the base of a pyramid-like structure called the ceppo, a wooden frame arranged to make a pyramid several feet high. Several tiers of thin shelves are supported by this frame, which is entirely decorated with coloured paper, gilt pine cones and miniature coloured pennants. Small candles are fastened to the tapering sides and a star or small doll is hung at the apex, whilst the shelves above the manger scene hold small gifts of fruit, sweets and presents. It has been said that the ceppo follows the old Tree of Light tradition which became the Christmas tree in other countries, and some houses even have a ceppo for each child in the family.

In Japan, for most of those who celebrate Christmas it is purely a secular holiday devoted to the love of their children. Christmas trees are decorated with small toys, dolls, paper ornaments, gold paper fans, lanterns and wind chimes. Miniature candles are also put among the tree branches, and one of the most popular ornaments is the origami swan; Japanese children have exchanged thousands of folded paper “birds of peace” with young people all over the world as a pledge that war must not happen again. Across in Mexico, the principal holiday adornment is el Nacimiento, or Nativity scene. However, a decorated Christmas tree may be incorporated in the Nacimiento or set up elsewhere in the home. As a natural pine is a luxury for most Mexican families, the typical arbolito (little tree) is often artificial, a bare branch cut from a copal tree (Bursera microphylla) or some type of shrub collected from the countryside. Up in Norway itself, Norwegians nowadays often take a trip to the woods to select a Christmas tree, a trip that their grandfathers probably did not make, as the Christmas tree was not introduced into Norway from Germany until the latter half of the 19th century, and it came to the country districts even later. When Christmas Eve arrives the tree is decorated, usually by the parents behind the closed doors of the living room, while the children wait with excitement outside. A Norwegian ritual known as “circling the Christmas tree” follows, where everyone joins hands to form a ring around the tree and then walks around it singing carols. After that, gifts are distributed. Across in the Philippines, fresh pine trees are too expensive for many Filipinos, so handmade trees in an array of colours and sizes are often used. Star lanterns appear everywhere in December. They are made from bamboo sticks covered with brightly coloured rice paper or cellophane, usually with a tassel on each point, and there is usually one in every window, each representing the Star of Bethlehem.
But it seems that over in Saudi Arabia, the European, American, Indian, Filipino and other Christians living there have to celebrate Christmas privately in their homes. Christmas lights are generally not tolerated and as a result most families place their Christmas trees somewhere rather inconspicuous. Christmas is a summer holiday in South Africa, and whilst Christmas trees are not common there, windows are often draped with sparkling cotton wool and tinsel. In Spain, a popular Christmas custom in Catalonia is a “lucky strike” game, where a tree trunk is filled with goodies and children hit at the trunk, trying to knock out the hazelnuts, almonds, toffee and other treats. Up in Sweden, most people buy Christmas trees well before Christmas Eve, but it is not common to take the tree inside and decorate it until just a few days before. Evergreen trees are decorated with stars, sunbursts and snowflakes made from straw, and other decorations include colourful wooden animals and straw centrepieces. I found it fascinating to learn that in Ukraine, Christmas is celebrated on December 25th by Catholics and on January 7th by Orthodox Christians, yet it is the most popular holiday there, so during the whole of their Christmas season, which of course includes New Year’s Day, people decorate fir trees and have parties.

I have found even more fascinating facts about this festive time.

  • In the U.S.A., the famous Rockefeller Center tree stands west of Fifth Avenue, from 47th through 51st Streets in New York City, and dates back to the Depression era.
  • The first tree at Rockefeller Center was placed in 1931 and was a small unadorned tree placed by construction workers at the centre of the construction site. Two years later, another tree was placed there, this time with lights.
  • The tallest tree displayed at Rockefeller Center arrived in 1948 and was a Norway Spruce that measured 100 feet tall and hailed from Killingworth, Connecticut.
  • Between 1887 and 1933 a fishing schooner called the Christmas Ship would tie up at the Clark Street bridge and sell spruce trees from Michigan to the people of Chicago.
  • In 1912, the first community Christmas tree in the United States was erected in New York City.
  • In 1923, President Calvin Coolidge started the National Christmas Tree Lighting Ceremony now held every year on the White House lawn.
  • In 1963, the National Christmas Tree was not lit until December 22nd because of a national 30-day period of mourning following the assassination of President Kennedy.
  • Since 1966, the National Christmas Tree Association has given a Christmas tree to the President and first family.
  • In 1979, the National Christmas Tree was not lit except for the top ornament, in honour of the American hostages in Iran.
  • Christmas trees generally take six to eight years to mature.
  • The tallest living Christmas tree is believed to be the 122-foot, 91-year-old Douglas fir in the town of Woodinville, Washington.
  • Most Christmas trees are cut weeks before they get to a retail outlet.
  • In the past, other types of trees such as cherry and hawthorns were used as Christmas trees.
  • It is said that Thomas Edison’s assistants came up with the idea of electric lights for Christmas trees.
  • Teddy Roosevelt banned the Christmas tree from the White House for environmental reasons.
  • At one time, tinsel was banned because it contained lead. Now it is made of plastic.
  • In the first week, a tree in your home will consume as much as a quart of water per day.
  • You should never burn your Christmas tree in the fireplace, as it can contribute to a build-up of creosote.

This week…
I watched a video recently showing a cat that had somehow managed to get its head stuck inside a tin can and could not get out. A man freed the cat but found it was not wearing a collar, so he was saying to people nearby that he thought the cat was probably wild. My immediate thought was “wild – I expect it was absolutely furious!”
