A Question of Autumn

Why do leaves change colour in the autumn? It’s a simple question with a simple answer, you might say – they start to die. But there is a bit more to it than that, if you’re interested in the science, and how it can produce some of nature’s most picturesque scenery.

Every autumn the leaves of deciduous trees change colour before falling to the ground. This is because leaves contain many chemical pigments, the most important being chlorophyll. Chlorophyll makes leaves green and absorbs sunlight for photosynthesis, the process by which the tree makes the food that helps it grow. Leaves also contain the pigment carotene, which has a yellow colouring. Carotene is present in the leaves all year, but is masked by the green of the chlorophyll.

The process of leaves turning from green to yellow, red or brown depends on the climate. When autumn approaches and the warmer temperatures of summer begin to dip, the chlorophyll within the leaves begins to break down, and the other pigments hidden beneath it, such as the carotene, become visible.

Chlorophyll is dependent on water as well as sunshine. As the climate cools and the tree draws colder water up through its roots, the tree prepares for winter. It does this by growing a thin layer of cells over the water tubes in its leaves, gradually closing them up. Without a regular supply of water, the green chlorophyll starts to disappear and the other colours in the leaf, such as the yellow carotene, can be seen. In some trees, as this layer of cells builds, the water-blocking wall that seals the tubes in the leaf’s stem traps sugar inside the leaf. This turns the sap, and therefore the leaf, red or even purple.

The final part of the process before a leaf falls is when the water within the tree dries up completely. This dehydration kills any remaining green chlorophyll, as well as the yellow and red pigments. Consequently, the leaves turn brown and start to die, becoming dry and crunchy before they fall from the tree.

All in all, it’s quite a complicated and intricate process that provides us with this often beautiful time of year. When it isn’t raining at least!

Today marks the birthday of a man who history has shown to be one of the great pioneers of modern medicine, and whose work led to the advent of vaccination. Bearing in mind how much we now rely on vaccines to protect us against numerous conditions and diseases, his is a birthday worth remembering.

Born in Berkeley, Gloucestershire on 17 May 1749, Edward Jenner was the son of the local vicar. He was only 14 years old when he became an apprentice to a surgeon and began training to be a doctor. Working in the countryside, Jenner noticed that, despite smallpox being rife across England, the milkmaids never suffered from it. They didn’t even show signs of the scarring that commonly affected smallpox sufferers. He did notice, however, that the milkmaids often suffered from the far less serious condition of cowpox. Jenner therefore began to work on the theory that catching cowpox had somehow made the milkmaids immune to smallpox.

Taking his thought processes further, Jenner speculated that if you had already had the relatively harmless cowpox, then perhaps you wouldn’t get the far more lethal disease of smallpox at all. Wanting to prove his theory, in 1796 Jenner carried out his now famous experiment, which involved using a needle to insert pus from Sarah Nelmes, a milkmaid with cowpox, into the arm of an eight-year-old boy, James Phipps. A few days later, Jenner exposed James to smallpox. The boy failed to contract the disease, and Jenner concluded he was now immune to it. Calling this new method vaccination (after the Latin word vacca, meaning cow), Jenner submitted a paper to the Royal Society the following year about his discovery. It was met with some interest, but further proof was requested. Jenner proceeded to vaccinate and monitor several more children, including his own son.

Although the results of Jenner’s study were published in 1798, his work met with opposition, and even ridicule. It wasn’t until 1853, 30 years after Edward Jenner had died, that his smallpox vaccination was made compulsory across both England and Wales. However, Jenner’s work would ultimately lead to a wave of medical innovation, and, further, to the large number of life-saving vaccinations available today. For this, he is surely worthy of remembrance.

Almost every health scare these days seems to concern viruses. From bird flu to Ebola, and now Zika, these pathogens appear to have medicine on the hop. But what exactly is a virus, and why are viruses such a problem?

A typical virus is a remarkably simple machine. It is just a short stretch of nucleic acid (DNA or RNA) surrounded by a protein coat. The nucleic acid contains coded information for making new virus particles, while the protein coat may help the virus gain access to its host. And that is that. Viruses have no membranes, no complicated machinery for carrying out reactions, and no metabolism like that of your own cells. Viruses do not feed, move, or respond to their surroundings like true organisms. And they are so small that they could not even be seen until 1939, following the development of the electron microscope.

However, introduce a virus into a host cell and the results are dramatic. It hijacks the cell’s processes and redirects them exclusively to the manufacture of more of itself. New virus particles are then budded off from the surface, each surrounded by a piece of host cell membrane. Or the cell splits open, releasing hundreds of new particles to infect other cells. Either way, viruses damage and kill our cells, which is what makes us ill when we have a viral infection.

Another trick of viruses provides a hint as to their origins. The genes in our own cells are remarkably like virus particles, also consisting of nucleic acid (DNA in this case) surrounded by protein. Sometimes a virus splices itself into this host DNA like an extra gene. The virus then lies low, being copied with the rest of the DNA when the cell divides, and passing to each of the daughter cells produced. Later, it may emerge without warning to form more viruses in the normal way, causing illness years after it first invaded. This is what happens when the chicken pox virus causes shingles in later life, or when people recovered from Ebola relapse, as recently happened to the Scottish nurse Pauline Cafferkey.

So, if viruses are constructed and can behave just like normal genes, perhaps that’s what they really are? Perhaps they are “escapee genes” that left their cells long ago to take up an independent existence? That would explain why they find it so easy to invade and take over our cells, and why we find it so difficult to defend against them.

Whatever the truth of their ancient origin, viruses present modern medicine with a formidable challenge. Antibiotics do not work against them, and the particles mutate rapidly to keep ahead of vaccines prepared to defeat them. One thing is certain: Ebola and Zika will not be the last health scares they bring us.

Viruses and the diseases that they cause, including ‘flu and Ebola, are covered in depth by the new A Level Biology course recently launched by Oxford Open Learning. You can find out more about the course here: http://www.ool.co.uk/subject/a-level-biology/


For most of us, sound is a good thing. Our personal entertainment systems, myriad music channels, as well as downloads, mean we pretty much listen to what we want to most of the time. But how did this come about? How did we get to have such a wide choice? And just what is the history of radio and recorded sound anyway?

The British Library has decided to preserve as many sound recordings as possible. This will be a national project covering public and private collections. Recordings could be 100 years old, as old as the work of Thomas Edison himself.

The technology used back then is becoming increasingly hard to use now. It must all be digitised, and experts estimate they have 15 years to do it before some of the oldest recordings become impossible to work with.

To give you some idea of the size of the project, the library surveyed nearly 4000 collections containing 2 million items. And they come in all different forms, from material in tubs as big as cake tins to six-inch-long ‘concert cylinders’, made of either wax or lacquer. Some of these already need rare equipment to play them, so digitisation is essential.

So what’s going to be in this sound archive that’s so important, so desirable? Well, the answer is quite a lot: drama and literature recordings, including poetry going back to 1955 and drama to the mid-sixties; oral history, which can be ordinary people telling their stories; and a survey of English dialects going back to the 1950s (did you know, for instance, that the Isle of Wight has its own ‘old’ language, which you and I would never fully understand?).

There are all sorts of music on record too, of course. They call this range ‘jazz to grime, music hall to metal’. There’s classical music going back to 1937, and something the library calls ‘forced entertainment’, which they explain is experimental drama and ‘happenings’.

It’s quite a collection and far too much for anyone to listen to it all. But at least it’s going to be there, and we will know that this really old material is going to be preserved indefinitely.

Virtually anyone who can get on to the British Library website can enjoy these sounds. Imagine studying music or drama – or just being interested in local history or early pop music – and being able to listen to original recordings. Several partner radio organisations are involved in the project, and it’s going to cost £9.5 million of Heritage Lottery funding plus contributions.

So is it worth it – saving old treasures like this that we can all enjoy? What do you think? Sounds good to me.

Scientist and mathematician Galileo Galilei was born on February 15th, 1564, in Pisa, Italy. A pioneer of maths, physics and astronomy, Galileo had a career with long-lasting implications for the study of science.

In 1583, Galileo was first introduced to the Aristotelian view of the universe, which the Church had adopted as the basis of its teaching on how the world worked. A strong Catholic, Galileo supported this view until 1604, when he developed theories on motion, falling objects, and the universal law of acceleration. He then began to openly express his support for the controversial Copernican theory, which stated that the Earth and planets revolved around the sun, in direct contrast to the doctrine of Aristotle and the Church.

In July 1609, Galileo learned about a telescope which had been built by Dutch eyeglass makers. Soon he developed a telescope of his own, which he sold to Venetian merchants for spotting ships at sea. Later that year, Galileo turned his telescope toward the heavens. In 1610 he wrote The Starry Messenger, in which he revealed that the moon was not smooth, but covered with mountains and craters. He also discovered that Venus had phases like the moon’s, and that Jupiter had moons of its own, which did not go around the Earth at all.

With a mounting body of evidence that supported the Copernican theory, Galileo pushed his arguments against church beliefs further in 1613, when he published his observations of sunspots, which refuted the Aristotelian doctrine that the sun was perfect. That same year, Galileo wrote a letter to a student to explain how Copernican theory did not contradict Biblical passages, but that scripture was written from an earthly perspective, and that this implied that science provided a different, more accurate perspective.

In February 1616, a Church inquisition pronounced the Copernican theory heretical, and Galileo was ordered not to “hold, teach, or defend in any manner” the theory regarding the motion of the Earth. He obeyed the order until 1623, when a friend, Cardinal Maffeo Barberini, was elected as Pope Urban VIII. The new pope allowed Galileo to pursue his work on astronomy on condition that it did not advocate Copernican theory.

In 1632, Galileo published the Dialogue Concerning the Two Chief World Systems, a discussion among three people: one supporting Copernicus’ heliocentric theory of the universe, one arguing against it, and one who was impartial. Though Galileo claimed the Dialogue was neutral, the Church disagreed. Galileo was summoned to Rome to face another inquisition, which lasted from September 1632 to July 1633. During most of this time, Galileo wasn’t imprisoned, but, in a final attempt to break him, he was threatened with torture, and he finally admitted he had supported Copernican theory. Privately, though, he continued to say he was correct. This ultimately led to his conviction for heresy, and as a result he spent his remaining years under house arrest.

Despite the fact he was forbidden to do so, Galileo still went on to write Two New Sciences, a summary of his life’s work on the science of motion and strength of materials. It was another work that has helped cement his place in history as one of the world’s most pioneering scientists, even if he was not fully appreciated in his own time. Galileo Galilei died on January 8th, 1642.

By far the biggest killers in today’s Britain are cancer and circulatory disease. Of the 501,424 people who died in 2014, 29% died of cancer and 27% from heart attacks and strokes. There is no doubt as to why charities seeking a cure for these scourges attract so much public support.

By contrast, leaving aside ‘flu and pneumonia, which mainly kill the already weakened elderly and infirm, infectious diseases account for a mere 0.6% of deaths. Your chances of being cut down by one of these in your prime of life are comparable with the threat from road traffic accidents or suicide. The reason we are dying largely from heart disease and cancer is not because they are becoming more virulent, then. It is simply because we are living longer. Whereas in 1900 the average life expectancy in this country was just 48, now it is 81.

Antibiotics, the drugs used to treat bacterial infections, are a recent invention. Alexander Fleming stumbled upon them by accident in a London laboratory in 1928, though the first, penicillin, only went into mass-production in 1944. When it did so, it reduced at a stroke the number of deaths from infections, making hospital operations safe, battlefield wounds less fatal, and many serious diseases treatable.

Bacterial resistance to antibiotics emerged as a problem in the 1950s, but it has now become critical. Resistant bacteria have the ability to transfer their resistance to other species as well as passing it on to their offspring. So, once established, resistance to a particular antibiotic spreads rapidly, and bacteria with multiple resistances emerge. By 2004, bacteria resistant to almost all known antibiotics had appeared, while in 2015, bacteria resistant even to the “antibiotic of last resort” appeared in southern China. It is expected to spread to the west shortly.

In April 2014, the World Health Organization (WHO) sounded the alarm on this topic in no uncertain fashion. It spoke of a “major global threat” from such antibiotic-resistant bacteria, and an imminent return to a pre-antibiotic era, in which people regularly die from the simplest of infections. If and when this happens, you would be far more likely to die from sepsis following a cut, or from airborne or waterborne bacteria, and less likely to live to an age when cancer and heart disease are a concern.

There is, though, a glimmer of light on this dark horizon. Traditional antibiotics are developed from defensive chemicals produced by fungi and bacteria. However, our own cells also produce chemicals that attack bacteria. They are short proteins (peptides) produced on our own cellular protein-assembly machines, called ribosomes. From this comes their acronym, RAMP antimicrobials (ribosomally synthesized antimicrobial peptides). These antimicrobials carry a positive electrical charge on their molecules and are attracted to the negatively charged outsides of bacterial cells. Once attached to the bacteria, they punch holes in the bacterial wall or membrane, killing the cell.

These natural defence molecules have been around for millions of years, during which time bacteria have failed to develop effective resistance to them. So, if effective artificial mimics of natural RAMPs can be made, we may yet avoid a return to the dark ages of pre-antibiotics.

Bacteria, the discovery and action of antibiotics, and the emergence and spread of resistance, are all covered in depth in the new A level Biology course recently launched by Oxford Open Learning. You can find out more about the course here: http://www.oxfordhomeschooling.co.uk/subject/biology-a-level/


2016 is a leap year, which means that the year will have 366 days rather than 365. This extra day is placed at the end of February, meaning that February 2016 will have 29 days, rather than its usual 28.

Leap years occur every 4 years. They are needed to keep the calendar we use (the Gregorian calendar) in alignment with the Earth’s revolutions around the sun. It takes the Earth approximately 365.242199 days to circle the sun once. That is the same as 365 days, 5 hours, 48 minutes, and 46 seconds. As the Gregorian calendar only has 365 days in a year, if there wasn’t an additional day on February 29 every 4 years, our calendar would fall behind by almost six hours each year. That means that, after only 100 years, our calendar would be wrong by approximately 24 days; almost an entire month. To be a leap year, a year has to be exactly divisible by 4. However, if the year is also divisible by 100, it is not a leap year, unless it can be divided by 400 as well! For example, this means that the years 2000 and 2400 are leap years, while 1800, 1900, 2100, 2200, 2300 and 2500 are not (still keeping up?).
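If you find the divisibility rules easier to follow as code, here is a minimal sketch in Python (the function name `is_leap_year` is our own choice for illustration; Python’s standard library offers the equivalent `calendar.isleap`):

```python
def is_leap_year(year: int) -> bool:
    """Gregorian rule: every 4th year is a leap year,
    except century years, unless they are divisible by 400."""
    if year % 400 == 0:
        return True   # e.g. 2000, 2400
    if year % 100 == 0:
        return False  # e.g. 1800, 1900, 2100
    return year % 4 == 0

# The examples from the text: only 2000 and 2400 qualify.
years = (2000, 2400, 1800, 1900, 2100, 2200, 2300, 2500)
print([y for y in years if is_leap_year(y)])  # [2000, 2400]

# Drift without leap days: ~0.242199 days lost per year,
# which over a century is roughly the 24 days mentioned above.
print(round((365.242199 - 365) * 100, 1))  # 24.2
```

Running the checks against the years listed in the text confirms the rule behaves as described.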

The idea of a leap year was first introduced by Julius Caesar in the Roman empire over 2000 years ago. The Julian calendar was different to our more modern Gregorian calendar in that its leap years were worked out more simply, with the only rule being that the year had to be divisible by 4. However, this ended up producing too many leap years! The situation wasn’t corrected until the introduction of the Gregorian calendar, devised by Aloysius Lilius (an Italian physician who named it after Pope Gregory) in 1582, and it wasn’t until 1752 that the Gregorian calendar was adopted in Britain and America, and our own leap year calendar was corrected.

As for the future, well, if you’ve been paying attention you’ll know that the next leap years will be 2020, 2024… and so on.

At 11.03am UK time, on 15th December 2015, Major Tim Peake became the first British government-funded astronaut to journey to the International Space Station.

Major Peake and his fellow astronauts, Tim Kopra and Yuri Malenchenko, will spend the next six months on the ISS after their craft, a Soyuz capsule, achieved a faultless launch from a standing start at the Baikonur Cosmodrome in Kazakhstan. Reaching Earth orbit in only 8 minutes and 48 seconds, it then took just six hours for the Soyuz to catch up with the space station, which travels at 17,500mph as it orbits the Earth.

During the final approach, the probe of the Soyuz was captured by a series of hooks on the docking cone of the International Space Station, allowing all three astronauts to pass inside at a height of 400km from Earth.

A Major in the British Army, the 41-year-old Peake is the first Briton ever to be accepted into the European Astronaut Corps. He had been training for this mission since 2009, and beat 8000 other hopeful astronauts to take part.

Peake follows in the footsteps of Helen Sharman, who became the first Briton to go into space on Project Juno, a co-operative venture between a number of British companies and the Soviet government. In 1991, Sharman spent a week aboard the Russian Mir space station. Other British astronauts, Michael Foale and Piers Sellers, who flew missions on the US space shuttle, had to take dual citizenship in order to attain their dream of travelling in space.

Peake’s mission, named Principia after Sir Isaac Newton’s treatise on gravity, is to undertake a vast array of scientific experiments in the ISS’s Columbus laboratory with his colleagues, including research into the headaches caused by space travel, as well as space walks. The last of these, Peake has said, would be his “ultimate” space experience.

Peake’s mission is also the first to have received funding from the government, and perhaps this, as well as the great public interest generated, will mean that there will be more British astronauts in the years to come.

The national curriculum states that children in primary school should learn how to write basic programs; to ‘debug’ (find and fix mistakes in programs so that they work); to use technology to store and organise content; and to understand how technology is used outside of the school setting. It is impossible to ignore the fact that computer literacy is now vital for many aspects of daily living, and children should be equipped with the skills they need as early in their lives as possible.

Information technology is based on logic: the idea is that you follow a set of steps in a particular order so that you can reach a desired outcome. Developing and enhancing the logic skills of children will not only benefit their capabilities with regards to computers and digital devices, it also helps to improve numeracy. Mathematics is also centred around logic, meaning children who have good IT skills will be better able to understand the subject’s problems and concepts.

The study of IT at primary school is also an important part of preparing for secondary school. When children enter secondary school, it is assumed that they will be proficient with and confident in using technology. Although most young people have frequent and regular access to IT at home, and use devices such as PCs, tablets and mobile phones, they might not necessarily understand how they work. Formal education in information technology allows children to start secondary school with the ability to use computers and other devices to organise their work, participate in activities, and engage fully with all aspects of the curriculum.

There is increasing hysteria over children accessing social media sites and apps. The temptation is for parents and teachers to ban all such activities in a bid to protect them. However, banning these sites and apps with no discussion or explanation only makes them more alluring for children. It also means that when they enter secondary school, they are ill prepared for the murky world that can accompany online interaction. Children who study IT at primary school can be better protected from online bullying or abuse by being informed and educated about what is acceptable behaviour, and how and when to get help if it is needed.

Studying IT at primary school can help to develop research skills from a young age. Children who learn how to access the material they need and what kinds of sources and content are most useful and relevant, will be better prepared for secondary and university study. Although books still have an important part to play in the study of many subjects, online research skills are vital to the education of young people in the 21st century.

The Autumn Equinox marks the beginning of autumn on September 23rd. This year the precise moment autumn begins is at 4:21 a.m., when the sun appears to cross the celestial equator from north to south.

The word equinox means ‘equal night’: at an equinox, night and day are almost exactly the same length. This occurs twice a year, once at the start of spring, and then again to signal the beginning of autumn in late September.

Throughout history the autumn equinox has given rise to a number of traditions. In Greek mythology the arrival of autumn is associated with the goddess Persephone returning to the underworld to be with her husband, Hades. This event was seen as a time to reflect on recent successes and failures, and to carry out rituals for protection through the coming months.

The Pagan calendar shows that the autumn equinox is marked with Mabon. This is one of the eight Sabbats (a celebration based on the cycles of the sun), which celebrates the second harvest, gives thanks for the days of sunlight, and helps to prepare for the coming of winter. Mabon, as with many Pagan festivals, was replaced by the Christian church with Christianised observances. It was replaced by Michaelmas (also known as the Feast of Michael and All Angels), which falls a few days after the equinox, on September 29.

It isn’t just Western countries that celebrate the coming of autumn. In China it is marked with the Moon Festival. At this time the abundance of the summer’s harvest is celebrated with the making of mooncake, which is baked full of lotus, sesame seeds, and a duck egg or dried fruit.

In Japan, the Buddhist celebration of Higan, or Higan-e, is a week long observance which occurs during both the September and March equinoxes. Higan means “the other shore”, and refers to the spirits of the dead reaching Nirvana. The period of the equinoxes is so important in Buddhism that both weeks have been national holidays since the Meiji period (1868-1912).

All in all, then, there is a lot more to this change of season than a new colouring of the leaves and a chill in the air!

