Tag Archives: academic buoyancy

Researching the ‘emotional learner’

To what extent do emotions impact on academic achievement? This is a question I’ve been grappling with for nearly two years. More specifically, can positive emotions help students to cope more appropriately with day-to-day setbacks (daily resilience/academic buoyancy) and, if so, how can we nurture such emotions?

American psychologist Barbara Fredrickson has proposed that positive emotions help us in a number of ways. Specifically, while negative emotions such as fear narrow our cognitive processes by triggering our survival instincts, positive emotions work in the opposite direction. Interest, for example, triggers our desire to explore and encourages us to re-frame failure and setbacks in a more positive way. Furthermore, Reinhard Pekrun and his colleagues at the University of Munich have found that positive emotions are positively associated with engagement while negative emotions such as boredom, anxiety and hopelessness predict negative academic outcomes.

What I’ve quite rapidly begun to realise is that emotions are slippery things – they just won’t keep still – especially in teenagers! Another problem is that there are just too many emotions to measure, so you have to narrow it down to specifics. I initially decided to look at the role of boredom in academic buoyancy but then decided it might be more positive to look at interest. I finally settled on the exploration of interest and how it relates to the way pupils cope with daily setbacks (e.g. does intrinsic interest in a particular subject lead to a more positive response to, say, failing a test in that subject?).

Measuring emotions.

I’m now attempting to work out how I can measure all of this. On a very simple level I’m trying to identify a correlation between interest and academic buoyancy – both of which can be measured using previously validated and widely used scales. I’ve decided to recruit a sample of year 12 students embarking on a course in psychology for the first time. They’ll be asked to complete an online questionnaire each week for around eight weeks (see below if you’d like to be involved).

Yes, I can already hear the objections. Not only am I looking for a correlation (which doesn’t necessarily imply causation), but I’m also using self-completion questionnaires that are prone to social desirability effects and demand characteristics. The longitudinal nature of the study should help here, so long as the sample is sufficiently large (although this will result in huge data sets – a good thing in terms of the data, but costly in terms of the time needed to collate and analyse them).
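For anyone wondering what the first pass of that analysis might look like, it amounts to computing a Pearson correlation between the two scale scores. A minimal Python sketch, with invented scores purely for illustration (the real study would use totals from the validated scales):

```python
from statistics import mean

def pearson_r(xs, ys):
    """Pearson correlation coefficient between two equal-length lists."""
    mx, my = mean(xs), mean(ys)
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sum((x - mx) ** 2 for x in xs) ** 0.5
    sy = sum((y - my) ** 2 for y in ys) ** 0.5
    return cov / (sx * sy)

# Hypothetical weekly scale scores (1-5) for six students --
# interest in the subject and academic buoyancy. Illustrative only.
interest = [4.2, 3.1, 4.8, 2.5, 3.9, 4.5]
buoyancy = [3.8, 2.9, 4.5, 2.8, 3.5, 4.1]
print(f"r = {pearson_r(interest, buoyancy):.2f}")
```

A positive r here would be consistent with (but, as above, would not prove) the idea that interest supports buoyancy.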

Of course, I could add weight to any results (and, let’s be honest, there is no guarantee that the data will support my hypothesis) by conducting a second study within a laboratory environment – I’ll lose some ecological validity but I’ll gain some control. If the results of my (as yet undefined) study 2 correlate with the results of the first study then I might be on to something.

Why bother?

Each year we are told that more and more young people are seeking help for stress and anxiety caused by the proliferation of high-stakes testing. Teachers are in a position to identify possible psychological problems but should not be expected to become amateur counsellors. If help is needed, professionals should provide it, and it’s becoming clear that external agencies will become more involved in pupil wellbeing over the next few years. As the stakes get higher, so will the psychological problems experienced by young people, and I suspect there will be a huge number of ‘consultants’ offering interventions that have been neither tested nor validated in any meaningful way. The more data we have on aspects beyond the classroom, the better we are able to target useful interventions. Viewing pupils as ‘emotional learners’ could be just one way of providing evidence-based programmes that nurture both wellbeing and academic achievement.

[Could your school help with my research? I’m looking for Year 12 Psychology students new to the subject in September 2015 (NB have not studied GCSE Psychology) who would be prepared to complete a weekly online ‘diary’ for around 8 weeks. Contact via Twitter in the first instance @psychologymarc – more details to follow].

Perceptions of Failure: Is there a role for Positive Psychological Capital?

Consider the following two scenarios:

Matilda has just been given an essay back from her teacher and it’s not the result she hoped for. The teacher has given her lots of feedback and advice on how to improve her essay, and she reads it thoroughly and pledges to correct her errors and re-submit it in a few days’ time. She is disappointed but understands that if she acts on the feedback her grade should increase.

Matty has completed the same essay and, just like Matilda, didn’t get the result he wanted. With Matty this always seems to be the case and constant poor grades have left him demoralised. Again, there is lots of feedback and advice on how to improve but Matty doesn’t read it – he’s a failure, he always fails and there seems to be very little he can do to fix the problem.

There are several psychological factors at play here. We could say that Matilda is displaying a Growth Mindset while Matty is surely displaying a fixed one. We could also suggest that Matty is showing a certain degree of learned helplessness (he has become so fixated on failure that he can’t see a way out) as well as self-handicapping tendencies. These can be viewed as both cognitive and emotional responses to failure – I see it all the time in my Sixth Form students.

As well as the established reasons for Matty’s behaviour explained above, we could also view Matilda’s and Matty’s responses in terms of Positive Psychological Capital (or PsyCap). Although PsyCap is a concept rarely applied to education, its related components of high self-efficacy, optimism, hope and resiliency have been found to be important motivational components in academic success and, although these components might need revising in terms of education, the general framework seems suitably relevant.

The Role of Academic Buoyancy.

It’s highly likely that Matilda would score higher for academic buoyancy than Matty as, on the surface, she appears more able to ‘bounce back’ from minor (yet personally significant) setbacks such as a disappointing grade on an essay. From his own research, Dave Putwain at Edge Hill University has speculated that buoyant individuals may not view academic failure as threatening to either personal aspirations or self-worth, because they believe in their ability to bounce back from it (Putwain et al., 2012). He further suggests that buoyant individuals do not hold an expectation of failure, because they believe they can respond positively to the challenge of evaluative-performance events – academic buoyancy, in other words, is built on positive ways of approaching academic setbacks rather than on merely coping with them. Another way to put this would be to say that Matilda has accumulated more positive psychological capital, while Matty views failure as an end result due to his lack of it.

For teachers, this creates interesting opportunities. In a society so obsessed with success and failure, how do we promote a more positive view of failure within our students? Boys appear particularly prone to fear of failure (although the evidence is mostly anecdotal), which would explain why my male students are less likely to hand in homework than my female students – they avoid the risk of failing, partly because of their difficulties in dealing with it.

Putwain, D. W., Connors, L., Symes, W., & Douglas-Osborn, E. (2012). Is academic buoyancy anything more than adaptive coping? Anxiety, Stress, and Coping, 25(3), 349–358. Available from: http://www.ncbi.nlm.nih.gov/pubmed/21644112 [Accessed: 10 December 2013].

Should we try to ‘teach’ resilience?

Resilience is the latest ‘buzz’ word in education. I’ve lost count of the times politicians have spoken about it (without actually understanding the concept) and the number of schools that have implemented programs to ‘teach’ it.

As I’ve mentioned before, I prefer to use the term ‘academic buoyancy’ to describe the ability to ‘bounce back’ from those small but personally significant setbacks students encounter every day (everything from a disappointing grade on a test to those inevitable patches of poor performance). Such incidents represent low-level stressful situations rather than major attacks on self-confidence or perceived abilities.

Even though many schools have implemented programmes, these interventions remain difficult to assess because few of them resemble each other or measure the thing they are supposed to be promoting. A recent systematic consultative review found that many resilience programmes within schools used the term ‘resilience’ in such a vague and conceptually weak manner that the authors found it difficult to identify those which could realistically be described as resilience-based (Hart & Heaver, 2013). Furthermore, results from the largest UK trial of resilience training in schools (the UK Resilience Project) continue to be largely ignored, perhaps due in part to disappointing outcomes and criticism concerning the intervention package (Coyne, 2013).

While the issue of definitions is certainly a problem, I think an equally destructive issue surrounds the view that we must teach resilience, rather than concentrating on factors that nurture academic buoyancy.

Four factors that appear to strengthen academic buoyancy.

1. Academic Self Concept.
Those learners who view their academic selves in a positive way appear more able to deal with daily setbacks. ASC represents a specific sub-category of self-esteem but recognises that our view of ourselves as learners is state rather than trait specific (so you might have high ASC for English but not for Maths). It would therefore follow that you cope better with setbacks in subjects where your ASC is high but not in subjects where it is low. ASC is, in part, related to our past experiences of ourselves as learners – poor experiences in Maths will lead to a negative view of ourselves within the area of Maths (e.g. Marsh & Martin, 2011).

2. Positive Emotions & Emotional Regulation.
Perhaps a little more controversial because of the relationship to the Positive Psychology movement. Nevertheless, the evidence suggests that negative emotions do negatively impact on buoyancy and that those who experience more positive affect are better able to safeguard themselves from setbacks through their ability to re-frame failure in more positive ways (Putwain & Daly, 2013; Tugade & Fredrickson, 2004).

3. Implicit Theories of Intelligence.
Carol Dweck’s ‘mindset’ theory has been marketed to death, but the general theory remains sound. Those who view their own intelligence as malleable (the so-called ‘Growth Mindset’) are better equipped to deal with setbacks in a more constructive way (e.g. Dweck, 2000).

4. Growth Goals.
Setting goals can be a very powerful tool, particularly if those goals are incremental and represent a ‘better than last time’ or ‘personal best’ approach. And never neglect good feedback. (e.g. Liem, Ginns, Martin, Stone, & Herrett, 2012)

The assumption being made is that these factors influence and nurture buoyancy. Of course, there is more likely to be a reciprocal relationship and much more work needs to be done in order to better understand these complex relationships.

Ultimately, it’s less about teaching resilience and more about encouraging those factors that allow resilience to flourish.

Coyne, J. (2013). Positive psychology in the schools: The UK Resilience Project. PLOS Blogs. Retrieved October 18, 2014, from http://tinyurl.com/nt6ehs5
Dweck, C. S. (2000). Self-theories: Their role in motivation, personality, and development. Psychology Press.
Hart, A., & Heaver, B. (2013). Evaluating resilience-based programs for schools using a systematic consultative review. Journal of Child and Youth Development, 1(1), 27–53.
Liem, G. A. D., Ginns, P., Martin, A. J., Stone, B., & Herrett, M. (2012). Personal best goals and academic and social functioning: A longitudinal perspective. Learning and Instruction, 22(3), 222–230. doi:10.1016/j.learninstruc.2011.11.003
Marsh, H. W., & Martin, A. J. (2011). Academic self-concept and academic achievement: Relations and causal ordering. The British Journal of Educational Psychology, 81(Pt 1), 59–77. doi:10.1348/000709910X503501
Putwain, D. W., & Daly, A. L. (2013). Do clusters of test anxiety and academic buoyancy differentially predict academic performance? Learning and Individual Differences, 27, 157–162. doi:10.1016/j.lindif.2013.07.010
Tugade, M. M., & Fredrickson, B. L. (2004). Resilient individuals use positive emotions to bounce back from negative emotional experiences. Journal of Personality and Social Psychology, 86(2), 320–333. doi:10.1037/0022-3514.86.2.320

Metacognition and Academic Growth

What do we mean by ‘meta-cognition’?

Meta-cognition relates to the process of actively thinking about our own learning. It’s often referred to as ‘learning skills’ or ‘learning to learn’ and is centred on one’s ability to evaluate and monitor one’s own learning, readjusting as necessary through continual self-monitoring. It also includes the ability to self-regulate one’s own learning in terms of managing motivation.

Meta-cognitive Regulation

This refers to the adjustments people make in order to help them control their own learning and includes:

  • Planning
  • Information Management Strategies
  • Comprehension Monitoring
  • ‘de-bugging’ strategies
  • Evaluative and Progress Goals
  • Knowing when and where to use particular strategies for learning and problem solving
  • How and why to use such strategies
  • The use of prior knowledge to plan a strategy for approaching a learning task
  • Taking the necessary steps to:
    • Problem Solve
    • Reflect on and/or evaluate the results
    • Modify the approach as needed

Meta-cognitive Knowledge

This relates to what individuals know about themselves as ‘cognitive processors’, what they understand about the different approaches that can be used for learning and problem solving, and their knowledge of the demands of a particular task.

In my experience, many students are generally unable or unwilling to evaluate their own learning. However, the students who do best are often the ones who can self-evaluate and self-regulate when given the opportunity to do so (for example, through careful consideration of teacher feedback). For this reason I’m going to look at my own practice, specifically the way in which I present feedback and how I expect my students to approach it.

Does it really work?

Over the past few years teachers have become more concerned with ‘evidence-based’ approaches to teaching rather than relying on untested and often highly erroneous ones (e.g. Learning Styles and Brain Gym). A great deal of the pressure for evidence-based practice has grown from a grass-roots level through social media (predominantly Twitter), culminating in the ResearchED movement.

The teaching of metacognitive strategies, as well as an awareness of meta-cognition in general, has strong empirical support.

Hattie (2009), in his synthesis of more than 800 meta-analyses of learning interventions, found meta-cognitive strategies to have an effect size* of 0.71, suggesting a high impact on educational achievement.

The Education Endowment Foundation has found similar results, finding that meta-cognitive and self-regulatory strategies can add between 7 and 9 months additional progress on average.


How should we ‘teach’ meta-cognitive strategies?

If the impact of meta-cognitive strategies is so large, why are students still so poor at self-evaluation and self-regulation? It could be that many schools view meta-cognition as a faddy bolt-on rather than a highly effective tool for improving student outcomes – the strategies never become embedded in the system. Meta-cognitive skills need to be part of the culture of the school and be employed in every lesson (rather than being taught in isolation). I would also argue that feedback is a major part of the process and that feedback needs to be detailed, useful and attached to growth goals. The process then becomes a cyclical one that spirals outwards as learning and growth become visible.

The recognition of meta-cognition is particularly interesting as it so easily feeds into a more joined-up set of initiatives that incorporate other evidence-based interventions such as resilience/buoyancy and Mindset.

*Effect Size is a measurement of the effectiveness of the intervention or strategy based on the results of meta-analysis (the analysis of several studies in the same area). An effect size of 0.4 or above is considered to be within the ‘zone of desired effects’. The greater the effect size, the more the strategy or intervention is seen to be effective. But note that some meta-analyses will be based on far fewer studies than others, leading to lower reliability.
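The arithmetic behind an effect size like the ones quoted above is straightforward: for two independent groups it is simply the difference between group means divided by a pooled standard deviation (Cohen’s d). A hedged Python sketch – the scores below are invented purely to illustrate the calculation:

```python
from statistics import mean, stdev

def cohens_d(group_a, group_b):
    """Cohen's d for two independent groups, using a pooled SD."""
    na, nb = len(group_a), len(group_b)
    pooled_var = ((na - 1) * stdev(group_a) ** 2 +
                  (nb - 1) * stdev(group_b) ** 2) / (na + nb - 2)
    return (mean(group_b) - mean(group_a)) / pooled_var ** 0.5

# Hypothetical post-test scores: control class vs. a class taught
# meta-cognitive strategies. Numbers are illustrative only.
control      = [52, 48, 55, 50, 47, 53, 49, 51]
intervention = [58, 54, 60, 55, 52, 59, 56, 57]
d = cohens_d(control, intervention)
print(f"d = {d:.2f}")  # compare against the 0.4 'zone of desired effects'
```

Meta-analytic effect sizes like Hattie’s 0.71 are aggregates of many such calculations across studies, which is why the number of underlying studies matters for reliability.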

Fear, Failure and Memory.

Sometimes I think we neglect the impact anxiety and fear has on our students. With the sudden interest everyone seems to have in cognitive psychology (usually referred to as ‘cognitive science’ by the political elite*) there is, quite rightly, a growing fascination with how we can help learners to recall all the information we’ve been filling their heads with.

…but what use are these strategies if our students are so terrified of failing that they can barely recall their names (let alone the components of the working memory model – I know, ironic isn’t it)?

Alright, I’m being melodramatic.

…or am I?

Consider the following quote from the book ‘everyone is talking about’:

A fear of failure can poison learning by creating aversions to the kinds of experimentation and risk taking that characterize striving, or by diminishing performance under pressure, as in a test setting. In the latter instance, students who have a high fear of making errors when taking tests may actually do worse on the test because of their anxiety.

Make it Stick (Brown, Roediger & McDaniel, 2014)

It seems that fear of failure is having a detrimental impact on working memory capacity because the student is directing resources away from memory and into the monitoring process – so the student is so busy thinking about performance and the monitoring of possible mistakes that there is little working memory capacity left to take care of the job in hand.

It also seems to be worse for girls…

One study investigating anxiety and performance in mathematics found that anxiety and worry in females were much more likely to negatively impact on working memory. More specifically, the researchers identified a causal chain from the worry component of anxiety to visuospatial working memory to maths performance, with worry placing more strain on visuospatial working memory in females (Ganley & Vasilyeva, 2014).

Those students who are less test anxious also appear to be more resilient and perform better on tests than those with increased levels of test anxiety (Putwain, Nicholson, Connors, & Woods, 2013).

Fear of failure is also more likely to lead to cognitive strategies such as self-handicapping, which in turn further perpetuate failure (Bartels & Herman, 2011).

Implications of this kind of research into emotion and learning are quite clear – rather than ignoring or eliminating the fear experienced by students, educationalists should encourage more positive ways of dealing with the fear of failure. Students need to fail (I’ve been banging on about that for a while now) but the ‘idea’ of failure needs a serious re-framing. While we might not yet be ready for a French-style ‘Festival of Errors’ or a Californian ‘FailCon’ (after all, let’s face it, no Brit wants to admit they’ve cocked up!), there is certainly a need for some cognitive readjustment.

* Michael Gove likes to use the term ‘cognitive science’ – I suspect it’s his way of hiding the fact that he thinks psychology is neither a science nor a proper subject.


Bartels, J., & Herman, W. (2011). Fear of Failure, Self-Handicapping, and Negative Emotions in Response to Failure. Online Submission. Retrieved from http://eric.ed.gov/?id=ED524320
Brown, P. C., Roediger, H. L., & McDaniel, M. A. (2014). Make it stick: The science of successful learning. Belknap Press.
Ganley, C. M., & Vasilyeva, M. (2014). The role of anxiety and working memory in gender differences in mathematics. Journal of Educational Psychology, 106(1), 105–120. doi:10.1037/a0034099
Putwain, D. W., Nicholson, L. J., Connors, L., & Woods, K. (2013). Resilient children are less test anxious and perform better in tests at the end of primary schooling. Learning and Individual Differences, 28, 41–46. doi:10.1016/j.lindif.2013.09.010


Questioning the stability of academic buoyancy.

Back in October I conducted a small-scale exploratory study into three constructs (academic self-concept, academic buoyancy and implicit theories of intelligence). You can read the details here. A few weeks ago I asked the same students to complete the questionnaires again to see whether these constructs remain stable over time. I was particularly interested in academic buoyancy (day-to-day resilience) due to the forthcoming AS exams. What I wanted to confirm was that those students who considered themselves resilient at time 1 (October 2013) still considered themselves resilient at time 2 (May 2014). This would be measured using the Academic Buoyancy Scale (Martin and Marsh, 2007), a four-item measure of academic buoyancy (AB) that has proved reliable over time and within different settings.

Let’s get some of the problems with the ‘study’ out of the way now.

At time 1, the sample consisted of 41 year 12 students. At time 2, due to a number of factors (including subject/school drop-out and a lower volunteer rate), this had dropped to 27. The final sample is therefore very small and far from representative – predominantly white, middle class and with a higher proportion of female participants.

However, as this was an exploratory study, I was simply looking for general patterns that might establish further avenues of investigation.

Ethical Issues.

The study was conducted in line with the ethical procedures of the University of York. Participants were volunteers and gave informed written consent (all participants were over the age of 16). They had the right to withdraw from the study at any time (including the withdrawal of their data).

What did the data show?

Data analysis was conducted using the R statistical package. A paired t-test found a significant difference between AB at time 1 and AB at time 2 (p < 0.01), with an effect size of 0.675. By Cohen’s (1988) conventions this represents a medium-to-large effect, so the difference between the two time points appears to be meaningful rather than trivial.
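Since the same students were measured twice, the relevant calculations are a paired t statistic and a paired effect size (one common choice, d_z, divides the mean difference by the standard deviation of the differences). A Python sketch with hypothetical scores – not my actual data, which were analysed in R:

```python
import math
from statistics import mean, stdev

def paired_t_and_cohens_d(time1, time2):
    """Paired t statistic (df = n - 1) and Cohen's d_z for two
    repeated measures on the same participants."""
    diffs = [b - a for a, b in zip(time1, time2)]
    n = len(diffs)
    sd = stdev(diffs)                      # sample SD of the differences
    t = mean(diffs) / (sd / math.sqrt(n))  # paired t statistic
    d = mean(diffs) / sd                   # Cohen's d_z
    return t, d

# Hypothetical AB scale scores (1-5) for the same eight students
# in October (time 1) and May (time 2). Illustrative only.
october = [4.0, 3.5, 4.2, 3.8, 4.5, 3.9, 4.1, 3.6]
may     = [3.2, 3.0, 3.9, 3.1, 4.0, 3.3, 3.8, 3.0]
t, d = paired_t_and_cohens_d(october, may)
print(f"t = {t:.2f}, d = {d:.2f}")
```

Note that d_z is only one of several paired effect-size conventions, so any comparison with independent-groups benchmarks should be made cautiously.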

What does all this mean?

Results would suggest that AB isn’t stable and is moderated by other factors. The timing of the second data collection (a week before the start of AS exams) could play a role in the difference between the two sets of scores, raising the question: do students feel less confident about their abilities at different times? Outcome measures (in the form of AS results) can be examined in August and could (but only ‘could’) yield more information.

Where now?

The plan now is to use experience-sampling methods (ESM) to collect data on a number of factors ‘as they happen’. The problem with much of the research into academic buoyancy is that participants are asked to complete measures in isolation (e.g. “I am good at dealing with setbacks”). ESM allows participants to respond to these measures in a more realistic, moment-by-moment way via electronic ‘prompts’ sent to mobile devices. ESM tends to produce large data sets, depending on the number of prompts and the length of the study, so sample sizes can be smaller (and, for practical reasons, need to be). An additional possibility would be to supplement the ESM data with an end-of-day/end-of-week questionnaire to investigate the difference between immediate and retrospective self-assessments.

What’s the point?

Emotion appears to impact on learning. Research has suggested that factors such as self-concept, boredom, anxiety and resilience can have both positive and negative effects on academic outcomes, as well as on cognitive functions like attention. Understanding the nature of these factors could help us develop interventions to stabilise some of them. Emotion impacts on cognition: stress can heighten recall up to a point, but too much anxiety leads to inaccurate recall. The so-called Yerkes-Dodson law suggests that performance increases as physiological and mental arousal increases, up to an optimum level, after which cognitive functions begin to decline. Although the Yerkes-Dodson law is somewhat dated, more recent research appears to support its validity.
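The inverted-U shape behind the Yerkes-Dodson idea can be caricatured with a toy quadratic. To be clear, the law itself is a qualitative observation, not this (or any particular) equation – the sketch below, with an invented optimum, just makes the shape concrete:

```python
def yerkes_dodson(arousal, optimum=0.5):
    """Toy inverted-U: performance (0-1) peaks at an optimum arousal
    level and falls away symmetrically either side of it.
    Purely illustrative -- the real law is qualitative."""
    return max(0.0, 1.0 - ((arousal - optimum) / optimum) ** 2)

# Performance rises towards the optimum, then declines past it.
for a in (0.1, 0.3, 0.5, 0.7, 0.9):
    print(f"arousal {a:.1f} -> performance {yerkes_dodson(a):.2f}")
```

The pedagogical point survives the caricature: a little exam pressure may sharpen recall, while too much degrades it.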

In a system where more and more of our young people are suffering from heightened levels of anxiety (the reasons for which are highly debatable), examining their daily classroom lives can provide rich data on how, when and why they do and do not learn.

What do we really mean by resilience?

Character, resilience, buoyancy, grit – these concepts have been floating about a lot lately.

In February the All-party Parliamentary Group on Social Mobility published the document ‘Character and Resilience Manifesto’ while in the same month Tristram Hunt told conference delegates at the AQA Creative Education conference that character and resilience can and should be taught in schools (a point he returned to recently in an address to the Institute for Effective Education in York). In March Liz Truss suggested that mindfulness lessons should be introduced to improve resilience in schools, adding weight to a growing agenda on the importance of non-cognitive skills.

If I’ve learnt anything over the past few months, it’s that educationalists are concerned our obsession with exam success is crowding out these non-cognitive skills. The politicians, I cynically assume, are just along for the ride. I’ve blogged about resilience before, only I don’t tend to call it resilience, because resilience as it stands in the research literature isn’t always the resilience I’m interested in. More recently, the term ‘grit’ has also entered the non-cognitive skill lexicon; unfortunately, it’s often seen as synonymous with resilience – which it isn’t. Ultimately I often end up discussing the wrong thing with people, because what I mean by resilience and (occasionally) ‘grit’ just isn’t what they mean. And herein lies the problem.

Resilience, according to the literature, is the ability to overcome major adversity. Much of the research has focussed on the way in which certain at-risk groups cope with major negative outcomes. This isn’t necessarily what others think it is – what they usually mean is day-to-day resilience, what psychologists Herb Marsh and Andrew Martin call ‘buoyancy’. Academic buoyancy refers to individuals’ ability to cope with all those everyday challenges (like a bad mark on a test or an impending exam) which are more characteristic of the lives of students than those covered by resilience. Buoyancy assumes minor negative outcomes, the accumulation of which can lead to major academic underachievement. Like resilience, buoyancy is a dynamic process – it isn’t stable and is state dependent (in the same way academic self-concept is state dependent). What we mean by this is that a student might be confident in, say, maths, where they display a positive self-concept and high buoyancy, but less confident in English, where their self-concept is negative and their buoyancy low. Self-esteem represents a global measure and doesn’t necessarily relate to either academic or non-academic self-concept – increasing a child’s self-esteem won’t make them more confident in English if their English academic self-concept remains negative.

So what about ‘grit’? Angela Duckworth, the American psychologist who coined the term, defines it as “perseverance and passion for long-term goals”, which is distinctly different from buoyancy and resilience (although all three are related). Grit does appear to predict resilience, which in turn predicts buoyancy, but grit remains a rather vague term with a much smaller research base than either resilience or buoyancy. Hunt used the terms ‘grit’ and ‘resilience’ interchangeably at the IEE, assuming that the former was simply an American manifestation of the latter. Duckworth, however, suggests that grit is a trait (it remains relatively stable over time), whereas resilience and buoyancy, as already mentioned, are dynamic processes – they aren’t stable.

So, with what appears to be a number of misunderstandings and misconceptions, how does all this play out on the ground? Several schools have certainly embraced the ‘idea’ of resilience but have interpreted it in different ways. The Knowledge is Power Programme (KIPP), in the United States, is certainly one of the more successful ones. KIPP has identified 24 characteristics that schools should try and develop, while Bedford Academy in the UK has narrowed these down to ‘the magnificent seven’ (grit, zest, optimism, social intelligence, gratitude, curiosity and self-control) while Rossett School in North Yorkshire, UK concentrates on the ‘3R’s’ (Resilience, Reflectiveness and Respect). Marsh and Martin identify the 5C’s (or motivational predictors) of academic buoyancy (Confidence, Coordination, Commitment, Composure and Control), suggesting another set of characteristics that could form part of a resilience or ‘character’ intervention.

Measuring day-to-day resilience presents yet another problem. KIPP, Bedford Academy and Rossett all issue pupils with a score (dependent on the system used) as part of the reporting system. In the case of Rossett, this appears to be subject specific – correctly identifying resilience as state (not trait) based. However, research into resiliency and buoyancy tends to use self-completion psychometric tools, and it would be interesting to discover whether these pupils would rate their resiliency similarly to the teacher rating (and which one we should choose as ‘accurate’). Unfortunately, any systematic investigation into the effectiveness of these systems is proving hard to track down, and outcome measures don’t seem to exist.

I’m certainly in favour of programmes that help to improve resilience and character. The problems lie in the absence of both a common terminology and accurate measurement, in terms of progress and outcomes. If we can’t trust the tools and we can’t measure the outcome, it all turns into a bit of a farce.