Homework is a necessary part of education, as it helps children to continue learning outside of the classroom.
However, without the presence of a teacher, it’s up to parents to provide support if a child is struggling with an answer, and our research shows that many aren’t comfortable with this responsibility.
Only a third of UK parents say they feel confident helping with homework, while just 6% correctly answered all three of the primary school questions in our test.
Think you can do better? Take our homework quiz below to find out!
It was written by a Year 3 teacher using English, maths and science questions she would typically ask her class. See if your knowledge is up to scratch, or if you need to go back to school.
How did you do? Share your score and thoughts on children’s homework by tweeting us @OxfordHomeSch
Whether you are an adult learner or a teenager who is juggling multiple subjects, working efficiently and effectively can be challenging.
But it doesn’t have to be. The solution lies in being organised – specifically with your time.
Whether you prefer a handwritten calendar or an electronic one, think about colour coding it.
Perhaps you could assign a different colour for each subject. Or maybe a different colour for different aspects of your life.
This is a great visual method to ascertain whether you are spending enough time on your learning, and helps you dedicate a solid part of your day to it rather than thinking ‘I’ll do that later’ and never quite getting around to it.
Effective learning doesn’t depend on how many hours you put in. It depends on what you do in that time.
So when organising your learning time, don’t simply slot study periods into your diary. List what you will specifically work on during that time. This will not only help you stay on track but also ensure that you are making steady progress in all areas that need attention.
Don’t forget to schedule in some relaxation too!
Look ahead at your learning schedule and think about what you need to do now, and what can wait until tomorrow.
It can be overwhelming when you have a long list of tasks – especially if you feel like all of them had to be completed yesterday. But when you zoom in, you will see that you can divide your list into manageable chunks.
This will help you actually complete your list and is a great strategy if you have a tendency to procrastinate.
We all like the feeling of being successful. So when we find something difficult, we can often be tempted to avoid it. This is the opposite of what you need to do. Think about it: if you spend more time on things you find hard, they will soon become easier.
Tips 1 to 3 feed into this – if you dedicate specific time to the harder topics, and prioritise them over ones you have already mastered, your learning will be more effective.
Some of us work better in the mornings, others at night. Still others find it is easier to work in the afternoon. Find out what your own peak learning time is. It will be when you make the most progress, feel freshest and absorb learning best.
Cramming before an exam is tempting and, in principle, it can be effective. But only as long as you choose your study time wisely.
On 25th January 2019, the Doomsday Clock was moved closer to midnight, from three minutes to two and a half minutes to twelve.
Created by the board of the Bulletin of the Atomic Scientists in 1947, the Doomsday Clock began as a visual representation of the world’s response to nuclear threats. In contrast to the perils it represents, the idea of the clock is very simple: the nearer to midnight the minute hand is placed, the closer the board of Atomic Scientists believes the world is to disaster, with midnight representing the moment of worldwide apocalypse.
The aim of this shock tactic is to raise awareness of how close human beings are getting to destroying the planet they inhabit. Speaking to USA Today, a representative from the Atomic Scientists explained that the clock “conveys how close we are to destroying our civilisation with dangerous technologies of our own making.”
When the Doomsday Clock was first invented, the scientists involved were also working on the Manhattan Project, the programme responsible for the construction of the first nuclear weapons. Very aware of the consequences of what they were doing, they introduced the clock to warn of the weapons’ power. In this first instance, the hands were set at seven minutes to midnight.
Since its birth, the clock hands have been moved backwards and forwards. At its ‘safest’, it was set at seventeen minutes to midnight in 1991. In 1953, at the height of the Cold War, the hands were moved to two minutes to midnight after the USA tested the hydrogen bomb.
The Bulletin of the Atomic Scientists gives the reason behind the current placement of the hands at two and a half minutes to midnight in 2019 as “the failure of President Trump and other world leaders to deal with looming threats of nuclear war and climate change”.
There is no doubt that the reasoning behind the Doomsday Clock is both serious and worrying, but what factors are used to conclude its position?
Eugene Rabinowitch, from the Bulletin of the Atomic Scientists, explains that several factors are taken into consideration when deciding the placement of the hands. These include nuclear threats, climate change, bioterrorism, biosecurity and side threats, such as cyber warfare. “The Bulletin’s clock is not a gauge to register the ups and downs of the international power struggle; it is intended to reflect basic changes in the level of continuous danger in which mankind lives in the nuclear age.”
Just how accurate is the Doomsday Clock, then? Well, the sad truth is that we won’t know until it’s too late. It can’t be denied, however, that it does make you stop and think.
2019 marks 150 years since the Periodic Table was created in 1869. This easily recognisable chart, which displays and orders every known chemical element, has become a staple reference point in the world of Science, particularly Chemistry.
It is the Russian scientist Dmitri Mendeleev who is credited with the creation of the table. However, when he first put together his chart showing the elements, it looked rather different to the one we have today. As Science News reminds us, ‘When Dmitrii Mendeleev proposed his periodic table 150 years ago, no one knew what was inside an atom. Today, we know that an element’s place on the table, along with its chemical properties, has a lot to do with the element’s proton number as well as how its electrons are configured.’
Born in 1834, Mendeleev was part of a large Siberian family. After the death of his father, Dmitri’s mother transported her family over 1500 miles to St. Petersburg. Once there she saved enough to allow her son to go to school, where his advanced intellect quickly became clear. By the time he was an adult, he was already a brilliant scientist. Mendeleev famously wrote a textbook, Chemical Principles, because he couldn’t find a decent book on Chemistry that was written in Russian.
There had been other scientists who had come close to creating a workable table of the chemical elements before Mendeleev. The earliest attempt to classify them was in 1789, when the French scientist Antoine Lavoisier grouped them by their properties into gases, non-metals, metals and earths. However, it was Mendeleev who finally managed to arrange them into an order that worked.
His discovery came when, in February 1869, he was writing the properties of the elements on pieces of card and arranging and rearranging until, as a spokesman for the Royal Society of Chemistry explains, “he realised that, by putting them in order of increasing atomic weight, certain types of element regularly occurred. For example, a reactive non-metal was directly followed by a very reactive light metal and then a less reactive light metal. Initially, the table had similar elements in horizontal rows, but he soon changed them to fit in vertical columns, as we see today.”
One of the reasons Mendeleev’s work was so groundbreaking was that he was forward-thinking enough to leave spaces within the table, with a mind to the chemical element discoveries of the future.
Scientific advancements and discoveries since have indeed meant that the Periodic Table has gradually expanded, with many new elements added over the years. Four new elements were added in 2016 alone.
Although Mendeleev never received a Nobel Prize for his work, the 101st element to be discovered was named Mendelevium after him. 2019 has been declared the “International Year of the Periodic Table of Chemical Elements (IYPT2019)” by the United Nations General Assembly and UNESCO. For information about the activities taking place across the UK and around the world, visit https://www.iypt2019.org/
War is the only proper school for a surgeon.
The First World War was a watershed moment in history. Never before had such a relatively short period in time seen such seismic shifts in technology, society and culture. The newly industrialised nature of the conflict and parallel stalemate of the trenches, all under near-constant artillery bombardment, was fertile ground for rapid innovation. In just four years the battlefields of France and beyond saw the introduction of tanks, militarised aircraft, machine guns and chemical warfare. But what war harms, society inevitably must find ways to heal. These novel technologies of death and destruction brought with them wounds and bodily disorders completely new to medicine, and as a result the medical field would embark on a journey of similarly hasty scientific advancement.
The nature of trench warfare meant that, with soldiers’ bodies protected most of the time, there was a disproportionate number of head and facial injuries. Surgeons were at a loss as to how best to treat these horrific wounds and burns, often stitching together open wounds with no time to consider the consequences of the healing process.
At Sidcup in London, a New Zealand-born, British-trained surgeon, Harold Gillies, was a crucial figure in the development of reconstructive surgery. Gillies advocated a highly experimental, never-before-seen method of treating facial gunshot and burn victims with skin grafts – taking tissue from, for example, the chest or leg and using it to repair the face – a technique still in mainstream use today. In a pre-antibiotic age, his pioneering “Pedicle Tube,” a tube of skin leading from the donor site to the graft site, allowed blood flow from a healthy area of the body to the injured area, nourishing the graft tissue, and preventing infection.
Other advancements in the treatment of physical injury included the Thomas splint, developed by Welsh surgeons Hugh Owen Thomas and Robert Jones, which drastically reduced the number of deaths from broken bones, and the mobile X-ray unit, invented by Marie Curie in France and launched onto the battlefields with the help of 150 female operators.
It wasn’t just physical injury that soldiers risked on the frontlines. Disease, including the 1918 flu pandemic, accounted for around one third of military casualties, while around six million civilians perished due to disease and war-related famine. After seeing the widespread death caused by typhoid fever during the Second Boer War, a British bacteriologist called Almroth Wright lobbied the British Army to provide 10 million vaccines against the disease to its troops on the Western Front, preventing, by some estimates, around half a million deaths.
Infection originating from wounds was also rife, thanks in part to the foul conditions in the trenches, where lice and mud were ubiquitous. An antiseptic solution developed by the French-British partnership of Alexis Carrel and Henry Dakin drastically reduced the need for amputation due to sepsis.
From the ashes of war progress so often springs, the decay and destruction of conflict powering innovation and change. The First World War was billed as the war to end all wars, a title that as we well know could not have proved further from the truth, but soldiers and doctors of subsequent conflicts benefited immeasurably from the new medical knowledge, technologies and techniques that emerged from it.
On December 22nd 2018, after President Trump’s request for a federally-funded wall along the Mexican-American border was denied by the opposing Democratic Party, the United States Government found itself, once again, partially shut down. Three weeks later, this remains the situation, the country held in an ever more expensive, damaging stalemate by a stubborn president and a gridlocked Congress.
The shutdown has had a great impact on federally-funded services and has affected nearly 800,000 federal workers, causing many to have to work without pay. Shutdowns have occurred with alarming regularity under this President, but the latest instance is the most significant to date, now the longest on record and showing little sign of ending.
Every year, the President must sign budget legislation comprised of 12 appropriation bills, which outline the allocation of the federal budget to different government services and agencies, including the Department of Justice, the Transport Security Administration, the US Department of Education, the Environmental Protection Agency, and the Internal Revenue Service. Currently, Trump is refusing to sign a bill that does not include the requested funding for a wall at the Mexican-American border.
Trump’s unrelenting demand for $5 billion to pay for a Mexican-American Iron Curtain has halted the approval of federal spending for the 2019 fiscal year. Democrats in Congress refuse to support any further spending on the wall, while Trump continues to threaten extending the shutdown for months or even years until Congress approves funding for the wall.
The US Constitution grants Congress the ‘power of the purse’ and, therefore, the power to appropriate government funds. The 1974 Budget and Impoundment Control Act reinforced this, granting Congress greater budgetary power and curtailing presidential involvement in appropriating funds. Government shutdowns usually arise when the President and the House and Senate are unable to resolve budgetary disagreements before interim deadlines in a budgetary cycle. The American government’s unique division of power sets the framework for a shutdown that would not otherwise be possible in countries with parliamentary systems.
However, America is no stranger to full or partial government shutdowns. Classified as a ‘partial shutdown’, this is the 21st shutdown in American history and, as already stated, the longest. The first government shutdown occurred on May 1st, 1980, but lasted only one day. The most notable previous shutdown happened in 1995-1996, when the government shut down for 21 days due to a dispute between Democratic President Bill Clinton and the Republican majority in Congress.
Regardless of how long it takes for the government to resolve the shutdown, the days in which the government isn’t functioning have considerable consequences for national service and federal employees. Chief among these is the fact that the federal employees working in the departments affected by the partial shutdown will not receive pay until it is over.
For many departments, such as the Transport Security Administration, workers are still required to go to work every day despite this lack of pay. Some workers, meanwhile, are simply unable even to get to work, putting greater pressure on the colleagues remaining in roles vital to airline safety. Additionally, with the National Park Service affected by the shutdown, a third of national parks have been closed, while those still open are left vulnerable to vandalism and without basic maintenance.
Now, Trump is looking to declare a national emergency to circumvent Congress and build his wall. While the future of the wall is still uncertain, the repercussions of Trump’s ceaseless fixation on illegal immigration across the Mexican-American border and his inability to compromise with Congress on funding have certainly been, and continue to be, felt across America.
Spelling is an important part of our everyday lives, from developing our language at school all the way through to adulthood.
With British culture becoming more Americanised each year, we thought it would be interesting to find out how many Brits are influenced by American phrases and spelling, perhaps without even realising it.
Our recent research found that almost half of Brits (48%) strongly agree that it’s important for children to learn British spellings, with only 6% strongly agreeing that Brits should accept the move towards American ways of spelling.
Do you think your spelling has what it takes to spot the difference? Take our quiz to find out if you can find the Americanisms amongst the British spellings!
How did you do? Share your thoughts on Americanisms and your results with us on Twitter @OOLTrust
Education is constantly evolving. Billed as one of the fastest growing tech markets in the UK, our schools collectively spend more than £900 million a year on education technology, or edtech as it’s commonly dubbed. Nor does the sector show any sign of slowing down.
Curriculums change through the years, and with them the means of presenting their educational content to school pupils. We’ve already seen chalkboards exchanged for interactive whiteboards and projectors, and textbooks largely swapped out for laptops and computers. We’re all familiar with these developments, but ground-breaking progress continues to be made.
On October 16th, 2018, the BBC published a report on a parliamentary meeting that served as a landmark event for technology in education. Based at Middlesex University, a robot named Pepper sat down with MPs to discuss the impact that robotics and artificial intelligence have had on education, and how things could move forward in the future. While all the questions and answers were predetermined, the main thrust of the conversation with Pepper was to encourage a blend of technology and human oversight, rather than replacing one with the other.
This merger focuses on viewing technology as something that is of service to teachers and pupils rather than something to be subservient to. An LSE study has already shown that banning smartphones in schools significantly improved results, but the question worth asking is: can technology be repurposed for education’s sake?
Though robotics and AI are being introduced to the learning environment, for now they largely handle the more administrative tasks in the schooling arena; for example, recording test results or managing student data. That said, the robot Pepper has taken on front-line learning duties too, such as helping special needs children with their numeracy development. Clearly, this edtech is of enormous help to teachers, who have been notoriously overworked for years, resigning and even falling ill from the stress of their overstretched roles.
An app called Kahoot! has also made waves amongst school pupils both in the UK and the US. It allows teachers to create their own digital games that their pupils can access through the app, enhancing their learning through a fun use of technology. The app had a recorded 50 million monthly users in June 2017, which shows just how quickly edtech can gain traction in popularity. Under teacher supervision, apps enable the learning experience to become exciting and interactive in a way that textbooks, unfortunately, can’t be.
To some degree, edtech allows children to have a more prominent hand in their education. It gives them greater agency in terms of not only what they learn, but how they learn it. Technology is something that young people are very familiar with, and that same familiarity can spur their engagement in the classroom. Edtech use means that learning becomes a less passive experience; pupils can now get involved using their screens, instead of listening to teachers monologue in ways they can’t fully comprehend.
Research conducted in 2014 questioned whether school pupils absorb information better when they’re taught under specific learning styles and techniques. In 2019, perhaps surprisingly, the question of which method is best remains a hotbed of contention and controversy.
It’s well known that pupils can excel in certain subjects and may struggle to master others, and of course there’s no shame in finding anything difficult. It has rightly remained the principle of education in good schools to nurture a child’s desire to learn, rather than to relentlessly push them into acquiring top-end grades to the detriment of their wellbeing. Learning is an organic and diverse process and it suffers when enforced under superficial measures.
This said, an array of questions comes into play here: can pupils decipher the information they need from blocks of text, or are more practical study methods their forte? Will they improve from class group work, or can they thrive using an online course at home? Do they need images to tackle a subject, or a teacher issuing instructions at every step?
Each learning method in the VAK (visual, auditory and kinaesthetic) model aims to ensure that every child has an access point into learning, breaking down the barriers that prevent them from fully understanding any given topic. A child who prefers visual means can, theoretically, stick to the books and videos while avoiding any physical or listening-based activity. But does it make sense to make the act of learning so linear?
Complications arise when it comes to taking each method and making it applicable to every subject. Can a visual learner use images to really understand playing sport in physical education? Can an auditory learner excel in a silent reading period of an English class? Will their future workplace cater to that singular method alone? When a pupil is confined to a single way of learning, it risks creating a paradoxical classroom culture and restricting the kinds of information they can absorb in the future too.
Moreover, a research paper in 2004 recorded as many as 71 different learning styles, but after accumulating their data the scholars themselves described the field as “extensive, opaque, contradictory and controversial”. Again, this state of argument appears to have changed little to date. While some children did indeed find their studies to be worthwhile under a personally tailored regimen, others criticised the lack of diversity. Do we ignore the things we’re not good at, or do we work to hone our skills?
Children need to know that learning is undoubtedly for them. When it comes to getting started or revising for exams, something like VAK can certainly be a plus. It’s okay to have favoured ways of doing things, but then again, school is about being flexible and engaging with a never-ending canvas of ideas. There should be a constant circulation of learning styles for children to acquaint themselves with – not only so they can play to their strengths, but also so they can improve on the methods of learning in which they’re not so well versed.
In 2018 the BBC reported that over the last three years the number of children being homeschooled in the UK has risen by around 40%. It’s not hard to see why; for parents, ensuring their child’s schooling is top quality is vital, and homeschooling is definitely worth considering as the new school year starts. Whether you’re considering homeschooling for your little ones or terrible teens, choosing to self-teach offers the perfect method for many parents who seek a more hands-on approach to their children’s education. In the UK, as a parent you must ensure your child receives a full-time education from the age of 5, moving through Key Stages 1-3 and on to GCSE and potentially A-Level education.
So is homeschooling right for you? Whatever the age or abilities of your child(ren), learning from home presents many benefits. Let’s look at a few of these advantages, which may help you decide.
Two of the main reasons influencing UK parents’ decision to homeschool are protecting their children’s mental health and avoiding exclusion. Being in a large classroom environment can present a number of challenges for children, including exposure to bullies, feelings of inadequacy from being around superior-performing peers and being singled out for being ‘different’ from other children. Many children may feel as if they simply don’t ‘fit in’. Homeschooling offers a solution to avoid these situations and protect your children’s mental health and wellbeing.
The chance to learn one-to-one rather than one-to-many offers many children the chance to feel fully involved and immersed in their own learning. This increases their chances of remaining engaged and interested in their studies. This also allows you, as a parent, to build a stronger bond with your child; to be able to identify their strengths and weaknesses and work with them on these. It is attention that they may not get in a large classroom environment.
Homeschooling allows your child to proceed through their education at their own pace rather than that of scheduled class. Every child is unique, with their own abilities, and these abilities may vary from subject to subject. If your child needs more help with Mathematics and less so with English, you can adjust their learning schedule accordingly.
Homeschooling also means healthier sleeping patterns and more time to study – you have the time to flex your child’s learning timetable around your lifestyle and circumstances. You can take holidays when you want, too. A definite win-win.
Homeschooling offers many benefits over more traditional school classroom study. It’s worth weighing up the pros and cons of both options before making the decision to homeschool, of course, and there are plenty of resources to help you do this, including the UK Government’s website, which can provide further advice.