What is morality if the future is known?


In the movie Arrival, a linguist learning an alien language gains access to a consciousness that knows the future. Unlike our consciousness, which is sequential and runs from cause to effect, theirs sees the whole arc of time simultaneously. Their life is about discerning purpose and enacting events, while ours is about discerning good outcomes and deploying our free will and volition to those ends.

In Ted Chiang’s Story of Your Life, on which the screenplay is based, this is explained theoretically with reference to Fermat’s principle of least time. This states that the path taken by a ray between two given points is the path that can be traversed in the least time. Lurking behind this idea is the realisation that nature has an ability to test alternative paths: a ray of sunlight must know its destination in order to choose the optimal route. Chiang has his protagonist muse about the nature of free will in such a scheme: the virtue would not be in selecting the right actions, but in duly performing already known actions, in order that the future occurs as it should. It’s a bit like an actor respecting Shakespeare enough not to improvise one of his soliloquies.
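For readers who want the principle in symbols: the modern statement is that the ray’s path between two points makes the travel time stationary (strictly ‘stationary’ rather than ‘least’, though the least-time form is the one Chiang uses). Writing $n(s)$ for the refractive index along a candidate path from $A$ to $B$, and $c$ for the speed of light:

```latex
% Fermat's principle: the path actually taken by a ray between A and B
% makes the travel time T stationary under small variations of the path.
\delta T \;=\; \delta \int_{A}^{B} \frac{n(s)}{c}\, \mathrm{d}s \;=\; 0
```

It is this variational form – the comparison of a whole family of paths at once – that gives the eerie impression that the ray ‘knows its destination’ before it sets out.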

While this thought experiment made me wonder about the ethics curriculum in alien schools – all virtue ethics and Stoicism? – it also made me pause to notice that in many ways we do know the future. While we may not have access to the narrative of our own individual futures, when it comes to AI we do have some inkling about the most likely futures we will face as a species. It has been the signal mission of the Science Fiction genre to set out these futures for our inspection. If we know our futures, what is the moral task?

In his book Life 3.0, Max Tegmark identifies 12 ‘AI Aftermath Scenarios’, ranging from Libertarian Utopia to Self-destruction. These variously describe the relationship between AI and humans, including ‘Zookeeper’ wherein AI retains humans essentially as pets. These 12 scenarios alone would give us fertile ground for devising ethical curricula, but to give you a flavour of the task here are some starter questions. If Zookeeper were a future, we would not be in charge. How could we act now to ensure that our keepers were benign? Cue John Rawls, and the end of our complacency about leaving the coding up to other people and to the free markets.

The problem with life’s current business logics is that they are irredeemably Enlightened and therefore utilitarian. ‘If x then y’ is about – obviously – optimising outcomes. Which would be bad news for us as a species if we were no longer in charge. While we are, we are kept safe, because behind this po-faced scientific rationalism there is a deep-seated commitment to the dignity of the human person: I will not harvest one of your kidneys, euthanise your ailing granny, or sterilise your disabled daughter. But this is not a logic that has been transparently translated into computer programs: Asimov’s fictional Laws of Robotics famously forbid harm to humans, but this has been quickly ignored in the case of military drones and autonomous weapons. The philosopher John Rawls’ Veil of Ignorance is a way of thinking about the construction of society such that the designers should assume they might be the losers as well as the winners, on the view that designing in fairness for paupers as well as princes would result in a just society. Should we therefore now use author privilege to bake our supremacy into the basic global codes for AI, and more precisely into our systems of law?

Tegmark’s scenarios suggest myriad other approaches and dilemmas. For instance, they include various Utopias, where humans and AI peacefully co-exist, because of the abolition of property rights and the introduction of guaranteed income. Does this possible future shed light on our current debates about Universal Basic Income?

Yet another scenario sees humans gracefully exiting in favour of their superior AI offspring. If we decided that we actively wanted to be bettered, how would we accelerate discoveries about consciousness to devise ever-better androids? In any scenario, our ethics and values must needs be reviewed, which has implications not only for public policy but also for law and for education.

But regardless of which thought experiment about the future you choose to run, the bottom line is this: if we know the range of likely outcomes, why are we not planning more thoroughly for them?

Robot Dread


I sense a morbid fear behind our catastrophizing about androids, which I reckon is to do with a loss of autonomy. It’s true that for periods in history tribes and people have assumed they have no autonomy, life being driven by the fates or by a predetermined design or creator, so this could be a particularly modern malady in an era that luxuriates in free will. But concern about the creep of cyborgism through the increasing use of technology in and around our bodies seems to produce a frisson of existential dread that I have been struggling to diagnose. Technology has always attracted its naysayers, from the early saboteurs to the Luddites and the Swing Rioters, and all the movements that opposed the Industrial Revolution, but this feels less about livelihoods and more about personhood.

Raymond Cattell famously identified two types or modes of human intelligence: crystallized intelligence and fluid intelligence. His model is of course not the last word, but it’s a useful lens through which to view this particular problem. His categorization distinguishes between the facts and experiences that are crystallized as the sum total of our learned knowledge – our databases, if you like – and the fluid intelligence that abstracts from those databases to solve novel problems or make intuitive leaps. In trying to zero in on our scruple about AI, I think we are groping towards an understanding of the latter as something that is particularly human and which we would hold to be special. As more and more species are found to be able to use tools to solve problems, we tend to obsess about this vanishing ground of particularity, which, between Google and the great apes, leaves us telling the story of Einstein’s beam of light to reassure ourselves that we’re still somehow neurobiologically distinctive.

I think our dread is an existential fear of being programmed, and of waking up to find we have been someone’s puppet all along. All those times when we felt we were wrestling with our consciences or weighing up weighty arguments, and the rules were already driving us towards a decision we had felt was our very own and not preordained in someone else’s playbook. If we have no agency, we have no freedom, except in an illusory and manipulative way.

But while I appreciate this fear, I suspect it is misbegotten. It is of course theoretically possible that AI will be able to programme us in the future. It’s doing that already in many ways, but at the moment with our willing consent, when it comes to our health and life management. And it’s this word ‘consent’ that is crucial. Do we know enough about AI to be actively giving our consent, globally, in an informed way? And are we given the opportunity to consent? I think not, which is why the dread is useful, if it galvanizes us into interest and action. We should be asking sharp questions about programming and controls, and the ownership of code, and about which red lines we as a species think it is not yet safe to cross. Does our driverless car choose to sacrifice the granny or the toddler, and do we consent to that coding? Are we happy that our every online reaction and transaction has become the database for AI, embedding as objective fact the sum total of all our flawed, subjective human interactions? Are we happy with a global Intellectual Property regime that both confers and protects corporate ownership of AI technology, without sufficient regulation or accountability to nation states?

It may be that one day we will find out we were programmed after all. But while we still rejoice in our free will, we need to exercise it, and not let this unspecified dread confine us to the sofa, stupefied, watching SciFi. Meanwhile, we still don’t really know enough about Cattell’s fluid intelligence, which does feel distinctive. Could we do more in our schools to develop this muscle, rather than maintaining our narrow focus on A*s in the STEM subjects that the robots have already nailed? It might make us more human, and able to programme ever more human robots too…

Robots don’t have childhoods


I’m sitting on the beach at North Berwick, with clear views out to the Bass Rock and May Isle, watching the children play. My daughter digs a deep hole, then runs off to find hermit crabs in the rock pools. Nearby, a young boy is buried up to the neck while his sister decorates his sarcophagus with shells. On the shore, a toddler stands transfixed by a washed-up jellyfish, while two older girls struggle to manipulate a boat in the shallows, trying to avoid the splashing boys playing swim-tig.

We’re under the benign shadow of the North Berwick Law, where there’s a bronze-age hill fort, so it’s likely this holiday postcard scene has not changed much since this part of Scotland was first settled, thousands of years ago, when children just like these dug holes, found crabs, and frolicked in the sea. I feel a wave of such sadness, thinking forward in time. Will this beach still play host to the children of the far distant future, or will we have designed out childhood by then? Robots don’t have childhoods because they don’t need them. Humans still do, but I wonder how much time you’ve spent trying to figure out why?

At the moment we need a childhood to grow physically, and to develop mentally towards adulthood and independence from our parents. All robots are adult already, so don’t need this rather awkward and inefficient phase: just a quick test, then the on button. As a species, humans are ridiculously slow to mature. This is so obviously problematic when compared to other species that there must be an evolutionary reason for keeping this comparative design flaw. It seems that to develop a brain of human complexity takes time, hence this slow process.

But if we could decode the brain, could we not short-circuit the process by cloning adults and programming them directly? This of course is the ultimate design goal of AI, and we’re familiar with it from a whole host of SciFi movies: whether we keep humans as well, or simply use the secrets of their brains to evolve beyond them, remains to be seen.

It might seem obscene, in these halcyon days of the UN Convention on the Rights of the Child, to empty my beach by indulging in a thought experiment about the future of childhood. But given that our technology has already overtaken our capacity to agree global ethical red lines in so many areas, we need to confront this spectre in order to work out not only why it feels anathema to us, but what we might do in response.

What are these children doing on the beach? All parents who have spent interminable hours in dilapidated playgrounds will have had the same thought, wondering whether there might be an easier way. They are playing, of course, but with Darwin’s eyes we can also see that they are learning. They are learning about their physicality and their preferences; they are learning about other people and about relationships; they are learning about the natural world and about the world’s rules. So beware the child with no scabs on their knees: they have not yet learned about taking risks. We’re quite quixotic about childhood. Most of us loathed our own, but it seems we will fight to our dying breath to protect the childhoods of those we love, so our children still tuck a baby tooth under their pillow, and write to Santa Claus.

We’re in a transition. AI can already do many of the cognitive things that humans can do, more quickly, accurately and cheaply, and it’s improving all the time. This creates a dilemma. Because the future is not yet here, we’re still competing hard in the previous race, and on its terms. While most parents know that Google has already overtaken their children, and there is more to education than information, the current social frenzy is still about doing your utmost to get your kids into the Gradgrind School of Facts. But the shadow of the future is already here, so we know that today’s highly-prized selective crammers, with a zeal for STEM and an ability to churn out volumes of A stars and Oxbridge places, have maybe 10-15 years to rake it in before their product becomes obsolete. We do need human computers in the interim, to programme AI for us, but once we’ve aced machine learning they will also become defunct. Meanwhile the crammers could save costs and boost performance by removing their STEM teachers in favour of so-called intelligent tutors, because AI-led learning already outperforms traditional learning in most settings.

Yet there is in traditional education a core curriculum which has not yet been improved by AI: moreover it may not be susceptible to AI in the way that STEM undoubtedly is. As I’ve argued elsewhere, I think the key to our humanity isn’t to be found in the clean lines of rationality that would delight any programmer, but in our junk code: the mistakes, the regrets, the dreams, the grief, the envy, the fear, and the joy. We learn these kinds of things very messily on the beaches and in the playgrounds of our childhoods; but at school we learn them particularly in the arts and the humanities: through the myths and stories about human waywardness; the mind-stretching disciplines of philosophy; and the creativity and exuberance of music, drama and art. In all of these, we learn the fundamentally frustrating qualitative nature of argument and criticism, where there are no clear-cut yes/no answers and it is nigh on impossible to score 100%. (By the way, it’s not that we can’t learn these things in STEM subjects, it’s just that they are not taught that way at elementary level.)

I’d wager we learn these junk code things with the particular help of the emotions, because of the role the amygdala plays in memory and in survival. And if that’s the case, it’s the reason we need to stop designing out bad stuff like not coming first or fluffing your lines in front of your peers. We might learn joy from a well-done sum, but we don’t learn shame or embarrassment or chagrin or schadenfreude. Kids need to wallow in the absolute limits of being human in order to feel these limits for themselves: this is a vital prerequisite for the rule of law as well as for human ingenuity and invention. Robots have to act within the bounds of the theoretically known because they are victims of their programming. If only to design better robots, we need to keep pushing at these bounds, in order to extend them: no paradigm shift was ever created without this very human recalcitrance.

So while we decide whether or not childhood is a state that we want to protect in the future, if we develop the know-how to avoid it, we should relish the very essence of childhood, by fighting back against the prevailing policy that prioritises STEM. Indeed we should reverse this trend while there is still time, and make the arts and the humanities both compulsory and subsidised in all formal education. And maybe we should risk teaching philosophy to those kids on the beach: versions of the trolley problem are a daily reality for them anyway, so they might be best placed to help us solve it.

A Year of Universal Basic Income?


Following the first cases of Covid-19 in the UK in January and February, lockdown was announced on the evening of Monday 23 March. Since then, citizens have been working from home, except for keyworkers, or have been laid off or furloughed under the government’s Job Retention Scheme. As at 12 May, 7.5 million jobs had been furloughed, and the British Chambers of Commerce reported that 71% of businesses surveyed had furloughed some staff. This means that the UK government is currently paying the wage bill for about a quarter of all UK employees. The scheme will be gradually phased out, with some part-time working and employer contributions, finally ending in October.

The economy has taken a huge hit. The decline in GDP in 2020 is likely to be the largest since WW2. The Office for Budget Responsibility and The Bank of England have published scenarios of a GDP fall of 14%, which compares rather badly to the 4.2% fall during the financial crisis in 2009. As an example, in April and into May, retail footfall was down by 75-80% compared with a year ago. Claims for the Universal Credit benefit increased by 2.5 million between 16 March and 5 May, and The Bank of England projects the unemployment rate rising to 9% in the second quarter of 2020, compared to 4% before the crisis. Many think this is optimistic, given that the US is already on 15%, having not deployed the policy option of a job retention scheme to slow down the rate of layoffs.

None of this is good news for the economy, but it is very bad news for household debt. Before the crisis, government data showed that household debt had peaked in Q2 2008 at 147% of household disposable income. It then declined to 124% by late 2015, but growth in household debt accelerated from early 2016, and the debt-to-income ratio had risen to 128% by mid-2017. In Q3 2019 it stood at 126.8%.

Even before the crisis, the Financial Conduct Authority had established that 12% of UK adults (5.9 million people) have no savings and investments at all, and that a further 37% (19.1 million people) have savings or investments of less than £10,000: almost half of UK adults, then, have either no savings or savings worth less than £10,000. Of those in the most vulnerable financial category, 3.7 million said that, if they lost their main source of income, their household could only continue to cover living expenses for under a week without borrowing money or asking for help from friends or family. A survey in March showed that 49% of those polled expected to have difficulty in paying bills, with 57% of those working saying their earnings were lower than in the previous week. In May, the Citizens Advice Bureau reported that an estimated 6 million people had already fallen behind on a household bill due to Coronavirus, and that 4 million had fallen behind on rent, council tax or utility bills, where they will have little protection from debt collection when temporary protections on enforcement expire. It is true that many utility companies, banks and landlords have offered payment holidays for lockdown, which is very welcome. But these arrangements merely defer payments rather than cancel them, so arrears will mount, leaving many households needing either to reschedule their payments to make them affordable, or to face a sudden jump in payments once the holiday ends.

These are not the feckless debts of a bling generation, these are cost-of-living debts. Those in the clutches of the high-interest lenders were already using these clear and simple, readily available and impersonal on-line loans to finance rainy-day purchases. In 2014 research by the Competition and Markets Authority suggested that the average payday loan is for around £260, lent over 30 days. Reasons for these loans fell into three categories. While some were used to finance living costs, they were most often used for emergency expenses including the repair or replacement of cars, boilers, and white goods. The other major category was seasonal, particularly the need to buy Christmas presents, or new school uniform and shoes for a new term. Very few were for more frivolous expenditure, for example a last-minute holiday or luxury item. This suggests that a cushion of just £300 savings might enable most households to avoid these kinds of emergency loans.

Looking forward to a post-Covid, post-Brexit Britain, the future looks very bleak, particularly for the financially vulnerable. Even the less vulnerable may now be cautious about post-lockdown spending, just when the economy needs a boost. So my proposal is for the UK government to introduce a year of Universal Basic Income (UBI), which would provide both cushion and spending. Those who do not need it should be invited to donate it to a charity of their choosing, because the UK’s charities too are suffering unprecedented falls in income, with many set to close.

First mooted in Thomas More’s Utopia in 1516, UBI is defined by lifelong champion Guy Standing as ‘a modest amount of money paid unconditionally to individuals on a regular basis; intended to be paid to all, regardless of age, gender, marital status, work status and work history’. Many countries around the world are experimenting with it. Since 1982, residents of Alaska have been paid an annual Permanent Fund Dividend, funded by a share of the profits from the state’s oil industry; last year the payment amounted to £1,300 per citizen. Compare this with the recent pilot in Finland, which gave 2,000 randomly selected unemployed people an income of around £500 a month, and with Guy Standing’s 2019 proposal that the UK rate should be £48 a week. A report by Reform Scotland suggested giving adults £100 per week and children £50 per week. Modelling conducted by the RSA for their report on a basic income for Scotland found that in Fife, a basic income of £2,400 a year would reduce relative household poverty by 8.5% and halve destitution, while a basic income of £4,800 a year would reduce relative household poverty by 33% and end destitution completely. Foodbanks could become a thing of the past.

Malcolm Torry argues that UBI could be funded through a 5% rise in income tax, while the new economics foundation proposes scrapping the tax-free personal allowance to finance it. Others have argued that it could largely be financed by the reinvention of the current benefits system. My more modest proposal is to introduce this for a year only, as part of the Covid-19 recovery plan, both for its short-term effects and in order to test it properly as an option for the future. It would therefore need to be carefully designed to maximise its potential to generate useful research data. It might also be used to model a separate ‘gap year’ product in the tradition of National Service, whereby every citizen in the future would have the right to a year off, to use for volunteering and citizenship, or perhaps to re-train or re-skill.
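To give a sense of the sums involved, here is a back-of-envelope sketch of the gross annual cost of a year’s UBI at Standing’s proposed £48 a week. The adult population figure (roughly 52 million UK adults) is my own assumption for illustration, not a number from any of the proposals above, and the calculation ignores clawbacks, tax changes and benefit savings:

```python
# Rough gross cost of one year of UBI at Guy Standing's proposed rate.
# UK_ADULTS (~52 million) is an illustrative assumption, not an official figure.
RATE_PER_WEEK = 48          # £ per adult per week (Standing's 2019 proposal)
WEEKS = 52
UK_ADULTS = 52_000_000      # assumed adult population, for illustration only

annual_cost = RATE_PER_WEEK * WEEKS * UK_ADULTS
print(f"£{annual_cost / 1e9:.0f}bn per year")  # prints: £130bn per year
```

On these assumptions the gross bill is in the region of £130bn a year, which is why the funding question – a 5% income tax rise, scrapping the personal allowance, or reinventing the benefits system – matters so much, and why a time-limited one-year trial is the more modest ask.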

Opposition to UBI tends to centre on two key issues: one is moral hazard, the other its political irreversibility. The latter argues for a very clear Covid-related one-year programme, to permit a political U-turn should the experiment not work. The former argument is less easily dismissed. Would citizens abuse the system, and would it drive dependency and idleness? Is there any evidence that these schemes in fact work? As a Christian, I hear these concerns, but I hear the moral arguments more loudly. While the economic results of the recent UBI pilot in Finland were unremarkable, one finding stood out for me: ‘the basic income recipients were more satisfied with their lives and experienced less mental strain than the control group. They also had a more positive perception of their economic welfare.’ Given the mental ill-health effects of Covid-19 and lockdown, a year’s UBI would be a gracious way to honour the dignity of the citizen and to say thank you for their efforts to protect each other during the pandemic.

with thanks to Andrew Phillips from the Jubilee Centre for help with research

How to Budge Up


The head-hunter Joanna Moriarty has a wonderful turn of phrase. She says it’s great that men are appointing women to senior roles, but now that women have finally made it into the boardroom, she reckons the men still need to ‘budge up’ to accommodate them. I know quite a few men who struggle with this, because it seems that whatever they do it’s never quite right. It either feels patronising, or it’s not supportive enough. And it’s in everyone’s interests that women perform well, particularly in the boardroom, not just in service of better results but also to be role models for those coming up behind them. So how might men budge up, exactly? Here are 7 golden rules:

1 Avoid Dominating

First, some obvious physical things. Do watch your body language and tone, and of course the appropriateness of what you say and where you let your gaze fall. Her neckline may be a response to the weather or the fashion, not an invitation for scrutiny. In meetings if your power casts a long shadow think carefully about seating plans and how to ensure that the women in the room are best placed to participate. Keep a score on meeting contributions, and if the women aren’t contributing, tactfully invite them to. Take care in Q&A not just to select the men, who will tend to ask the first questions, and to explicitly invite the women present to ask questions too [I know, we should just get on with it: we’re trying!]. When women do contribute, use your body language and attention to encourage it: turning to your notes or your phone sends out a too-obvious signal to the other men present that they too should feel free to switch off when a woman talks.

2 Avoid Mansplaining

Social media is rife with examples of embarrassing situations where eminent women have suffered the indignity of a confident man authoritatively correcting them about a matter on which they are frankly expert. But this kind of ‘mansplaining’ occurs routinely in situations where women are still in the minority and men are used to being in charge. There is prevention, and there is cure. The first is about research. Most women I know go into meetings armed with a Google search, so they know who they’ll be meeting and can avoid faux pas. Then they listen carefully to pick up cues, and adjust their pitch accordingly. Call it years of social conditioning if you like, but it’s this finely honed intuition that has kept the social wheels turning for generations. But if your male antennae fail you, you’ll find the basilisk stare your ally in remedying the problem. If you do find yourself in full flow, keep your eyes peeled for this vital clue. If you spot it, stop talking, and apologise.

3 Avoid Manterrupting

Numerous studies show that women are more likely to be interrupted than men. It’s more likely they’ll be interrupted by a man, but women are also more prone to interrupting other women than to interrupting a man. So apart from letting her finish, you might also use your power in the room to ensure everyone gets to finish. Granted, there are the office bores who will always require interrupting, but if that does not apply, practise active listening to keep you from breaking in, and keep an eye out for anyone else who may need your support to ensure their contribution is heard.

4 Avoid Hepeating

Hepeating is when a male colleague takes the credit for making a brilliant intervention that had in fact already been made earlier in the meeting by a woman. It seems that those men who do hepeat are often unaware that a female colleague has already made that point. Perhaps they don’t quite hear female voices as authoritative, or are poor listeners. If it is conscious bias, stop it. The theft of ideas is not acceptable in any walk of life. If it is unconscious bias, watch out for it, and again check for basilisk stares if the room suddenly goes chilly after you’ve made a particularly brilliant remark.

5 Avoid Rescuing

In the National Gallery there’s a wonderful picture from c.1470 of St George and the Dragon by Uccello. Enter St George, on his charger, with his enormous lance! But the damsel who was supposed to be in distress is perfectly calm: she already has the dragon on a leash. Chivalry is not dead, but it’s often misplaced at work. Please check before you rescue. She may not have said or done what you would have, but she may indeed have thought long and hard about it, and may even be right. It may feel risky to you to let her lead, but it’s terminally undermining to be corrected in public, particularly when it’s rare that a ‘mistake’ has actually been made.

6 Avoid Auto-competing

Many men have been raised by their families and their schooling to strive to win, both on the field and off. Competition is viewed as such a wholesome discipline that it is enshrined unassailably within our economic system. But Shelley Taylor’s research shows that men and women have different biological responses to stress. It turns out that a competitive ‘fight or flight’ response is typically male, whereas women under pressure are more likely to reach out and communicate, a response she has dubbed ‘tend and befriend’. While this finding has much broader ramifications, in this context it means that while it feels entirely normal for men automatically to compete in stressful contexts, for most women this does not feel like a normal response, and can in fact feel hostile when she is on the receiving end. Male working environments tend to reek of the accumulated effect of generations of men competing with each other, and in extreme environments like trading floors this can magnify operational risk. But even in ‘normal’ working environments with male-dominated cultures, the endless cut-and-thrust that feels like sport to men feels wearing and unnecessary to women. Always reacting to a perceived challenge with an attack is unlikely to be a productive strategy, and will tend to shut down your female colleagues rather than get the best out of them. And beware: any ‘tend and befriend’ responses they suggest when the pressure is on might feel weak and foolish to you, primed as you are to triumph in a zero-sum game. Keep your cards close to your chest and don’t give the game away! But they have the maths on their side: game theorists have shown that in repeated interactions, co-operative strategies tend to outperform pure defection.
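The game-theory point can be made concrete with a minimal iterated prisoner’s dilemma. The payoff numbers below are the standard textbook values, chosen for illustration rather than drawn from any particular study: two co-operators each do better over time than two defectors, even though defection pays more in any single round.

```python
# Minimal iterated prisoner's dilemma: co-operation vs defection over time.
# Payoffs are the standard textbook matrix (an illustrative assumption).
PAYOFFS = {            # (my move, their move) -> my payoff per round
    ("C", "C"): 3,     # mutual co-operation: both do well
    ("C", "D"): 0,     # the sucker's payoff
    ("D", "C"): 5,     # temptation to defect
    ("D", "D"): 1,     # mutual defection: both do poorly
}

def play(strategy_a, strategy_b, rounds=100):
    """Play repeated rounds; return cumulative payoffs for each side."""
    score_a = score_b = 0
    last_a = last_b = "C"          # both treated as co-operating initially
    for _ in range(rounds):
        move_a = strategy_a(last_b)
        move_b = strategy_b(last_a)
        score_a += PAYOFFS[(move_a, move_b)]
        score_b += PAYOFFS[(move_b, move_a)]
        last_a, last_b = move_a, move_b
    return score_a, score_b

tit_for_tat = lambda opponent_last: opponent_last   # co-operate, then mirror
always_defect = lambda opponent_last: "D"

coop_score, _ = play(tit_for_tat, tit_for_tat)       # 3 per round each -> 300
defect_score, _ = play(always_defect, always_defect)  # 1 per round each -> 100
```

Two tit-for-tat players accumulate three times the payoff of two habitual defectors over the same hundred rounds, which is the sense in which the ‘tend and befriend’ response has the maths on its side.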

7 Ask?

Finally, thank you for your help in getting us this far, and for your help in securing more representative boardrooms in the future. We know it’s incredibly fraught, trying to avoid the extremes of avuncular patronage on the one hand and political-correctness-gone-mad on the other. So it may be that it’s simply easiest for you to ask. Could you risk asking your female colleagues for feedback on how supported by you they currently feel, and whether there’s anything else you might do to help them be at their best in the workplace?

Character and Confidence


The FT’s Sarah O’Connor unleashed a bit of a storm when she wrote recently that teaching state-school kids firm handshakes was patronising, and that ‘character education’ had no place in the national curriculum. I should like to contest this, both as the Chairman of Gordonstoun school, which invented character education, and as a state-school-educated Scot who was lucky enough to attend Lucie Clayton Finishing School in Kensington one summer.

Be more unicorn


You can’t avoid the unicorns, if you go into any gift shop these days. They’re everywhere – key-rings, handbags, fairy-lights; you can even buy sequinned t-shirts that say ‘be more unicorn’ on the front. It’s funny how much appeal this mythical beast has.

In Scotland, where it’s the national animal, Stirling Castle has gone unicorn-mad. In the Queen’s Inner Hall, you’ll find seven hand-woven unicorn tapestries hanging on the walls. They’re based on the famous Hunt of the Unicorn series from the 1500s, now in the Metropolitan Museum in New York. Re-creating them in Stirling was quite a project, costing £2 million and taking 13 years. The panels show a unicorn being hunted, tamed by a virgin, killed, then appearing alive again in captivity. This narrative is assumed to be an allegorical interpretation of the life of Christ, although no-one really knows what’s going on in this story.

How to design in character education

By | Business | No Comments

Extreme Performance

Among adults, one of the most common phobias is public speaking. Well, it seems that Gordonstoun could teach us a thing or two about how to fix that. When our Head of Music first heard the results of research into the value of our out-of-classroom offer, he responded by mounting weekly recitals. Struck by the compelling pedagogy of challenge, he realised that the students needed more than just the once-termly terror of a public concert if they were really to embed public performance skills. And what could be more terrifying than standing in the loneliness of a concert hall surrounded by your peers, waiting to play?

Transferable learnings

When we launched the Edinburgh University research into the non-classroom curriculum at Gordonstoun, it was greeted with predictable wails about public school privilege. “It’s easy for them – they’ve got a YACHT!” But this is to misunderstand why we did it. We did it because being a charity is about more than sharing facilities or providing bursaries. It is about being generous with anything we’ve been lucky enough to learn. Gordonstoun has already shared Outward Bound and the Duke of Edinburgh’s Award with the world. As one of the founders of character education – now very much in vogue – we wanted to assist the public debate about it by providing real data. We’re one of the only schools that can do that, because we have 80 years’ worth of alumni to ask, all of whom were immersed in a carefully designed character-based curriculum, long before it became trendy. And character education cannot be tested at the time of acquisition, because the very point of it is its durability, and the fact that a good one will keep delivering for you throughout your lifetime. So this multi-method longitudinal study is very important. We even asked parents, because they can often see the changes in their children more readily than the children themselves can, particularly when they are pre-career.

So let me summarise the essence of our findings, and why they are so transferable, not only through school populations, but for everyone involved in youth work, training and development.

(1) Learning to try

Our research showed that a varied and repeated out-of-classroom curriculum that is compulsory for all students compels them to try things they would otherwise avoid. Our findings showed us that this ‘have a go’ mentality lasts well beyond the Gordonstoun years, and has inspired many to keep trying new things for the rest of their lives. Schools that wish to emulate this need only make more of their non-academic curriculum compulsory, so that young people gain a broader exposure to experience, and learn not to be afraid of trying something new.

(2) Learning to fail

Having to try everything means that failure is inevitable, given that it is unlikely that everyone will be good at everything. Students learn to fail, and they learn how others fail too. They learn that they may need other people to succeed, but also that they may be better than others at unexpected things. Again, schools wishing to help students learn to fail well need only identify non-examined elements of the curriculum where there is an opportunity for experimentation, and create a safe environment where failure is not considered socially terminal. This could be normalised by delaying specialisation and by making elements of sporting or drama activities compulsory for all.

(3) Learning to try again

Because the curriculum is regular and repeated, students have to have another go, even if they failed last time. So they learn resilience, and about conquering their fears, both about their own abilities, and about how their peers will react to them. This teaches students how to pick themselves up, and many alumni told us that this ability to bounce back had been crucial in helping them to navigate subsequent career setbacks. Anyone working with young people who wants to help them learn this important life skill would need to design in opportunities for students to identify their fears and set about conquering them, whether it is public speaking, a fear of heights or water, or just plain social shyness.

(4) Social levelling

We found that in the melting pot that is Gordonstoun, the out-of-classroom curriculum is a fantastic leveller. No-one cares who your parents are on exped if you forgot to pack the hot chocolate. And because students often found themselves being led or rescued by peers they would never have expected to thrive in these contexts, it engenders a humility and respect for other people based on ability and character, and not on culture or background. Any school can deliver these lessons by exposing the same groups of students to a range of contexts, where different people will have a chance to shine each time.

(5) Gender

For the women in our sample, being pitted against men in so many different scenarios instils a particularly steady career confidence. Working together both in and out of the classroom, they were bound to have seen men be worse as well as better than them in such a wide variety of contexts that their expectations in the workplace are very different, which has helped our female alumni to thrive. Again, any opportunity for mixed-gender groups to face challenges together can help with this, if the range of opportunities offered is sufficient to generate multiple data points.

Character breeds confidence

I taught leadership for over a decade at Ashridge Business School, where I had the opportunity to meet thousands of senior leaders, and to learn about their challenges. What they told me was that they wanted to be more confident. What the Gordonstoun research shows is that confidence is a natural by-product of the experience of facing your fears, time after time, and surviving them. This robs your fears of their power to stop you in your tracks, because you know you have developed the power to prevail. If all schools and those who provide youth activities took these findings to heart, and adapted them to use in their own contexts, we wouldn’t have business leaders who are too scared to do the right thing. We’d have brave leaders of character, which is what the world so desperately needs today.

A version of this post appeared in Insider on 20 July 2018.

For the sake of honour

By | Business, Theology | One Comment

Honour is one of those words that gets bandied about rather a lot. Sometimes it’s used just as a label, as in the Honours of Scotland; ‘it wasn’t me, Your Honour’; and ‘she gave him a gong in the Honours’. We also talk about ‘honour’ killings, as well as Honorary degrees. But what does it mean when we say things like: ‘I’m honoured to meet you;’ ‘I promise on my honour;’ or even ‘wilt thou love her, comfort her, honour, and keep her?’ These usages seem to invoke a sense of respect and virtue, something that is more about an orientation or a behaviour.

Honour is one of those old-fashioned words, like manners. But when we use it of someone, we refer to that rather rare and durable characteristic of their being reliably moral. We think people are honourable if they do the right thing. We tend to notice it all the more if it proves costly: our mental picture is probably of a tweedy and stoic English gent standing on a lonely pier, waving goodbye to his true love because she deserves better. So is honour as outdated as curtsying to cakes, and should we have none of it? On the contrary, we need honour more than ever, and we need to start teaching it to our children again. Read More