Future Learning

What Is the Current State of Learning?

In the last century, humanity has made great strides in improving access to education for all children. More than 91 percent of the world’s children now attend primary school, but that still leaves 57 million children without access to learning. We have a long road ahead of us if we are to re-imagine the Future of Education. Today’s most pressing challenges in learning include improving the quality of education and ensuring every child gets an education.

Education quality varies widely in different locations around the world. This quality difference stems from any number of factors, including poorly trained teachers and administrators, under-resourced infrastructure, and outdated rote-based curricula that do not teach skills relevant to today’s economies. And depending on the location, socio-economic problems can often interfere with learning—problems ranging from child malnourishment and illness to mental health issues, violence, and the caregivers’ inability to pay school fees.

Most children who don’t attend school live in Sub-Saharan Africa, in conflict or war zones, and in rural areas without transportation. Children not attending school might also face additional challenges, such as having disabilities or being female in locations where girls face more restrictions than boys.

In a world of dramatic, accelerating technological and economic change, we all may face a future of rapidly changing jobs. This relentless pace of change is increasingly demanding lifelong learning to supplement our formal education. We will therefore need to be flexible in terms of how we meet our basic needs while finding stimulation and purpose in our learning and work.

The guide on the side or the sage on the stage?

Markham — 21st-century teacher

The great fiction that a teacher today has become a “guide on the side” is now hardwired into nearly every conversation about the future of teaching and learning. Teachers don’t deliver information any longer; they act as co-constructivists and facilitators, sitting shoulder to shoulder with students.

Why raise objections to this new narrative? First, it’s disingenuous. Teachers still stand at the front of the room. They teach, using traditional tools and tapping their repository of information to share with students. They lecture. Yes, sometimes too long, but a competent teacher knows when to sit down or ask questions.

The second objection is aimed at education’s habit of settling for shiny new terms when the facts demand a deeper commitment to truth-telling. The truth is that in the emerging era of project-based learning, personalisation, 21st-century skills training, commitment to social-emotional growth, and attention to equity and social challenges, the complexities of teaching can’t be captured by a simple ‘You’re now a guide on the side’ mandate. Teaching in this ecosystem calls upon a rich, demanding skill set that has transformed the profession into one of the most complex, creative, and (potentially) rewarding jobs on the planet.

Given the numbers of teachers expressing dissatisfaction with their jobs, leaving the profession, or reporting burnout, one might conclude the opposite. But the turmoil can be traced to the system of pacing guides and testing that forces compliance. Educators are tired of teaching inside the lines. In schools focused on innovation, the job may be challenging, but it’s also energizing precisely because it invokes deep purpose and reward.

Acknowledging the new state of the profession is critical. With standards-obsessed systems backed by high-stakes testing wilting under the increasing pressure of on-demand, self-directed learning, schools will yield to more flexible curricula, online options, and strengths- and skills-based outcomes that support the journey of learning, not the final degree.

As systems change, inevitably a teacher’s role will have to be reconceptualized as a new mental model evolves around what it means to ‘teach.’ Evidence shows how difficult this mind shift will be. Despite the decades-old ‘guide on the side’ conversation, no corresponding attention is paid yet to developing the facilitation skills and coaching protocols that teachers need for effective people management. The focus instead remains on classroom management and traditional behavioural tools.

Preparing teachers for this new role amplifies the challenge. Under industrial rules, a teacher is trained (‘prepared’) to implement a skill—to follow a pacing guide, roll out a reading program, deliver content, and ‘manage’ a classroom. But complex professions operating in dynamic environments already foresee ‘training’ becoming obsolete. There is an increasing demand for the ‘T-shaped person’, who has both the breadth and depth to respond to variety and novelty.

The observational and relational skills necessary for deep facilitation and mentoring in inquiry environments meet this standard of complexity. Rather than being prepared, teachers will need to prepare themselves. Techniques will matter, but true competency will derive from experience, practice, and agile learning within an ecosystem of constant growth.

This sounds a bit theoretical compared to the seat-time and one-size-fits-all approach to teacher preparation. But transforming our mental model of a teacher is not really that difficult. First, stop relying on the one-stop category of ‘guide on the side’ and start identifying the skill sets necessary to be a ‘future-ready’ teacher. Undoubtedly the nomenclature will change over the next decade, but projections on digital learning, personalisation, creativity and contribution indicate at least five categories of teacher skilfulness:

  • Practitioner. No matter how much Google or AI invades the classroom, teachers will still deliver knowledge. But in inquiry classrooms, teachers mainly deliver on the fly with ‘just in time’ information in response to student questions and wonders. Since knowledge can’t be easily scripted, pre-packaged, or confined to shop-worn lessons, teachers will need to do a deep dive into their subjects and know not just the subject, but the field. More important, they will need to master a new skill set focused on project-based learning and inquiry practices.
  • Facilitator. Teasing out the true roles of the guide on the side is the next step. A facilitator’s prime job is to set up the conditions for optimal learning by building safety, community, and relationships into the environment. Setting challenges, building successful teams, monitoring deeper learning, and combining design thinking with high-quality PBL practices come next. In many ways, the required skill set is knowing how to put all the pieces in place for deeper learning—and then getting out of the way.
  • Coach. In a world with infinite paths to success, personalization is inevitable. Each student will start at a different place and end in a different place; each will bring unique talents and perspectives to the journey. A coach teaches and models skills, listens deeply enough to know individual needs, and realizes that coaching is not just conversation but an exchange that succeeds through respectful protocols. The skill set? Teachers will need explicit skills in offering feedback and techniques across thinking, creating, designing, collaborating, and communicating domains.
  • Mentor. The Mentor shares the skill set of the therapist. However daunting, teachers will need to expand their comfort zone and be willing to teach, assess, highlight, value, and offer support for empathy, curiosity, perseverance, and the range of positive strengths identified as successful behaviours in today’s world. This extends the Coach’s role into a much more personal and engaged relationship with students, requiring deep observational skills backed by empathy, deep listening, attentive presence, and an attitude of openness and nonjudgment.
  • Changemaker. Students will not remain silent or standards-compliant as the globe contends with climate change, inequality, or migration. As the innovation meme intensifies, they will want to find purpose, put the sustainable development goals into action, and in general move well beyond the four walls of school. For teachers, resistance will be futile. Rather, the new skill set of the future-ready teacher is to become a co-learner and co-creator, working with students on service-learning projects or finding ways to apply classroom knowledge to authentic issues. This trend is already visible; expect it to accelerate.

The next step? Let go of seat-time metrics to certify teachers. Instead, focus on the professional journey and full immersion in a digital and face-to-face ecosystem that invites deep collaboration, on-demand knowledge, shared practices, high-quality feedback, and teacher-led systems leadership. In other words, start a rich conversation and keep it going through every means possible. That’s the way forward for teaching in the 21st century.

Stephen Lee

Hypocognition is a censorship tool that mutes what we can feel.

Can you find the symbol that is different from the rest?

[Image: a grid of identical angular, squiggly symbols with one odd symbol hidden among them]

How long did that take you? Let’s try another one. Find the symbol that is different from the rest:

[Image: the same grid, rotated 90 degrees to the right, now readable as a field of 5s containing a single 2]
It is the same image that you saw before, just rotated 90 degrees to the right. Only this time, it is much easier to spot the differing symbol. The reason we are experts at discerning the number 2 from the number 5s is precisely that: they are 2 and 5 – numerical conceptions that we have developed from an early age, mental representations imbued with meaning. Disable the conceptual access, and we’d see nothing but a jumble of angled lines, the same way that we grimaced at the squiggly symbol in the earlier image: alien and unrecognisable, barely distinguishable from its likewise oddly shaped neighbours.

It is a strange feeling, stumbling upon an experience that we wish we had the apt words to describe, a precise language to capture. When we don’t, we are in a state of hypocognition, which means we lack the linguistic or cognitive representation of a concept to describe ideas or interpret experiences. The term was introduced to behavioural science by the American anthropologist Robert Levy, who in 1973 documented a peculiar observation: Tahitians expressed no grief when they suffered the loss of a loved one. They fell sick. They sensed strangeness. Yet, they could not articulate grief, because they had no concept of grief in the first place. Tahitians, in their reckoning of love and loss, and their wrestling with death and darkness, suffered not from grief but a hypocognition of grief.

No one, in fact, is immune to hypocognition. In our research with the psychologist David Dunning at the University of Michigan, we asked participants: have you ever heard of the concept of benevolent sexism?

If you haven’t, this is a term describing a chivalrous attitude that appears favourable towards women, but actually reinforces traditional gender roles and perpetuates gender stereotypes. When a professor says ‘Women are fragile and delicate creatures,’ or when a neighbour jests ‘I let my wife deal with paint colours – women are good at that kind of stuff,’ you can sense the discomfort lingering in the air. Such comments reflect benevolent sexism because they sound like compliments, but carry presumptions of women as either the fragile damsel in need of protection or the default caretaker laden with household labour.

We then asked: how often have you noticed benevolent sexist comments or behaviours over the past two weeks? The results were striking. People who were hypocognitive of a concept noticed instances of it less often around them, compared with the people who knew the concept. Lacking the concept of benevolent sexism blinds you to its occurrence. Knowing the concept of benevolent sexism renders visible its manifestation.

On the flip side, if you have never heard of shoeburyness, consider yourself blessed. People who know the concept (shoeburyness: the vague uncomfortable feeling of sitting on a seat that is still radiating warmth from someone else’s bottom) are plagued by the sensation more often than those who are hypocognitive.

Hypocognition is not readily cured by acquiring a new word. Nor do ‘Words of the Year’ frequently succeed in becoming permanent fixtures of the lexicon. Nevertheless, the proliferation of neologisms can lend affirmation to unspoken moments of disquietude, to an amorphous cloud of restlessness in the modern world.

Before I knew what phubbing was, I didn’t have the guts – or the word – to call out my friend for phubbing me (snubbing me for her phone) in the middle of a conversation. And now… I still don’t – not when I myself can barely resist the urge of being figital (excessively checking one’s digital device) and curb my own performative busyness. But alas, though I am far from escaping the sprawling influences of digital addiction, I am no longer hypocognitive of them. As cognitive psychology affirms, having a verbal label – even a nonsensical terminology, an apparent portmanteau – can distil a nebulous phenomenon into an experience that’s more immediate and concrete.

If the prerequisite to addressing a problem is to identify it, what happens when the identifier remains hypocognised? In describing his nontraditional family arrangement, the American writer Andrew Solomon noted the poverty of language to mirror the modern complexities of relatedness. In the absence of an expanding lexicon, we default to denotations bounded by the traditional descriptors of a nuclear family. ‘My husband and I are often asked whether our son George’s surrogate mother is “like an aunt”,’ Solomon wrote in The Guardian in 2017. ‘We are asked which of us is “really the mom”. Single parents are routinely asked what it is like to be “both mother and father”.’

But the darkest form of hypocognition is one born out of motivated, purposeful intentions. A frequently overlooked part of Levy’s treatise on Tahitians is why they suffered from a hypocognition of grief. As it turns out, Tahitians did have a private inkling of grief. However, the community deliberately kept the public knowledge of the emotion hypocognitive to suppress its expression. Hypocognition was used as a form of social control, a wily tactic to expressly dispel unwanted concepts by never elaborating on them. After all, how can you feel something that doesn’t exist in the first place?

Intentional hypocognition can serve as a powerful means of information control. In 2010, the Chinese rebel writer Han Han told CNN that any of his writings containing the words ‘government’ or ‘communist’ would be censored by the Chinese internet police. Ironically, these censorship efforts also muffled an abundance of praise from pro-leadership blogs. An effusive commendation such as ‘Long live the government!’ would be censored too, for the mere mention of ‘government’.

A closer look reveals the furtive workings of hypocognition. Rather than rebuking negative remarks and rewarding praises, the government blocks access to any related discussion altogether, rendering any conceptual understanding of politically sensitive information impoverished in the public consciousness. ‘They don’t want people discussing events. They simply pretend nothing happened… That’s their goal,’ Han Han said. Regulating what is said is more difficult than ensuring nothing is said. The peril of silence is not a suffocation of ideas. It is to engender a state of blithe apathy in which no idea is formed.

Nevertheless, I’d like to think that the attempt at hypocognising a concept can often propel a more urgent need for its expression. The emergence of a unifying language of #MeToo gives voice to those who were compelled into silence. The materialisation in 2017 of a new gender glossary lends credence to the existence of those whose identity departs from the rigid binaries of man and woman. Ideas and categories that are yet to be conceptualised leave open aspirational possibilities for future progress. Every now and then, a new term will bubble up; a new concept will burst forth – to give meaning to walks of life previously starved of recognition, to instil life into our inchoate impulses, to tell the stories that need to be told.


According to a study featured in New Scientist, men are seen as more desirable when they are in relationships.

When single women were given the choice to pursue a relationship with either a single man or a taken man, 59 percent said they were interested in the single man, while 90 percent said they would give the man who was attached a try.

The researchers believed that the reason for this is that men in relationships are “preselected”.

Preselection is the idea that women feel attracted to men they think other women want.

Hector Castillo explained this concept in a YouTube video with the following credit card analogy.

Take a credit card. It’s a card that lets you spend money you don’t currently have, in exchange for a promise to pay it back later. The more reliably you make (or miss) those payments, the better (or worse) your credit score becomes. Over time, credit companies essentially pre-approve you for credit depending on your credit history.

The analogy applies like this: when a woman sees a man with a woman who is more attractive than she is, it’s as if the man has a credit score above 670, which is considered good. Whereas if the man is with a woman who is less attractive than she is, that’s the equivalent of a credit score of 629, which is considered bad.

Researchers in a study published in Scientific Reports confirmed this observation as they found that men with girlfriends gain an “attractiveness boost” and are suddenly seen as more attractive than they normally would be. The girlfriend is essentially approving the man for the other women around him, making him look more attractive.

It is as if the girlfriend is holding a sign pointing towards the man that says: “This man is desirable, sexy and totally dateable.” Women notice this and begin to find the man desirable, sexy and totally dateable.

This is why a guy often notices that when he is single, almost nobody pays him any attention, but the moment he begins dating someone, he suddenly gets hit on by a few women.

Why Do Women Use Preselection?

If you consider female evolutionary strategy, it’s mostly about minimizing risk.

Women can produce offspring at a rate of roughly one every nine months. So they will seek out the highest-quality man they can get in order to make sure their offspring have the highest chance of survival (both in terms of good genes and the resources the man provides).

Every guy knows this, so men will try to portray themselves as one of these high-quality men.

And women know men do this. So they look for ways to make it easier to choose men.

If you consider close relationships, they take a ridiculous amount of time to form. Researchers have actually studied how long it takes to build a relationship. They found that it takes, on average, 50 hours to turn an acquaintance into a friend, 90 hours to turn a friend into a good friend, and 200 hours to turn a good friend into a best friend.

Women simply don’t have the time to go on a date with every single guy they find attractive. So what do they do? They outsource that task to other women.

We depend on others for our opinions. This is even more true for decisions that involve a lot of uncertainty. The right choice isn’t always easy to determine, and so we make our decisions based on what others are doing.

Mate selection is no different. And since it’s a naturally uncertain task, women will most often take the most secure path by looking for already existing proof that the man is of value. What better evidence is there than to see a man surrounded by attractive women?

Women are the choosy sex and have a greater abundance of sexual options than men. So if another girl is with a man, most of the work is already done for her.

Most guys are average, so if a woman who naturally has a lot of options picks one, that means he has attractive traits that made her choose him.

She feels more secure. She isn’t taking a risk on a single man (whose mate value is unclear), making it a far more reliable signal of attraction than anything a man could say (he could lie) or do (he could easily pretend).

How to Use Preselection To Get Women

Now that you’re sold on preselection, or at least consider that the idea has some merit, how can you use it to attract more women into your life?

Here are three ways:

  1. Let Women See You With Other Women
  2. Develop Confidence and Outcome Independence
  3. Learn to Speak Womanese

Let’s look at each:

#1 — Let Women See You With Other Women

The obvious way to use preselection is to let other women see you with another woman/women. The more attractive the woman, the better.

According to the research paper on Mate Choice Copying in Humans, the attractiveness of the woman beside the man also matters, even more so than how attractive the man is. The research also suggests that it works the other way: a man with an unattractive female partner sees a decrease in perceived attractiveness.

The writer Chase Amante believes there is a ‘rank’ to the people women can see you with. In his blog, he gave the list below, in order of most alluring to least:

  • A group of attractive/young women following or watching you
  • You with multiple attractive/young women doting on you
  • A mixed group or a group of men following or watching you
  • You with a single attractive/young woman doting on you
  • You with a very cool-looking buddy hanging out and having a good time
  • You with friends, having a good time

So if you’re going out in a public setting, bring a wing-woman to help introduce you to other women. Or you can invite your friends, male and/or female, to hang out with you.

The social proof will work in your favor and you’ll find women being much more receptive to you when you approach them.

#2 — Develop Confidence and Outcome Independence

Confidence and Outcome Independence are two of the most attractive traits a man can have. Women are most attracted to a man who knows exactly what he wants from life and spends so much time pursuing his mission that he can’t afford to be distracted from his dreams by a woman.

Confidence is about how capable you think you are. You know you’re not perfect, but you know that you have the skills needed to accomplish the things you want in life. Confidence naturally develops as you become more experienced with a good number of attractive women.

Outcome Independence is when you are not shaken by how a specific scenario turns out. If it works out, good. If it doesn’t, also good. Either way, you win, because you are detached from the outcome.

So if you act nervous (lacking confidence) or needy (lacking outcome independence), her attraction to you will plummet. You have to develop these traits to the point where you feel completely relaxed and non-needy. I talk more about this here.

Basically, you want to:

  • Approach more women. The more women you interact with, the more comfortable you will get with them. You become less attached to any one girl because you know that there are plenty of women out there. Success breeds confidence, so by having positive experiences you allow yourself to feel more confident in that area of your life.
  • Seek self-approval. All of your negative emotions come from your reaction to the external world. When others disapprove of you or insult you, or when you regret decisions, all of these things empty your emotional tank. So from now on, only you can give yourself approval.

#3 — Learn to Speak Womanese

Men’s preferred style of communication is one that’s more direct, straight-to-the-point, and leads to an objective truth. Women, on the other hand, prefer to sub-communicate.

Sub-communication is a feminine style of conversation that is based on relating through emotions, ambiguity, double meaning, and indirection. Essentially, it’s a style of communication that favors harmony.

Women don’t like talking like men. They find logical, short conversations boring. For women, conversations are about harmony and relating. They’re about feeling understood and connecting with the other person. So, if you can speak Womanese, you show an indirect type of preselection. It communicates that you have interacted with many women before.

One of the main traits of female sub-communication is ambivalence. That is, when a statement is made that could mean two different opposite things. Here are a few examples of what women say and what they actually mean to help you better understand this concept:

  • “Maybe.” Usually, it means no.
  • “Yes”. It could mean yes, no or maybe. You have to look at the context and her body language.
  • “It’s fine,” “It’s OK,” “Don’t worry.” It’s not fine. You did something wrong and now she’s mad at you for it.
  • “We’ll see.” Usually means no.
  • “I think we should be friends.” AKA: I don’t find you attractive.

When a woman says something, it helps to read her body language, decipher her tone of voice, and look at her facial expressions, or else you will miss out on what she is truly saying.

Women often keep their conversations ambiguous for the following reasons:

  • To keep social harmony
  • To avoid responsibility and deny consequences from certain actions
  • To show interest without saying it, which would put her on the spot
  • To keep options open and avoid committing too soon

Once you begin to understand this preferred mode of female communication, you’ll show women that you are experienced and preselected by other women.


Preselection is the most powerful form of attraction for women. It helps women decide you’re a good fit for them by outsourcing their uncertain decision-making onto other women.

There is nothing a woman finds more attractive than a man other women find attractive. And with preselection, that man becomes you.

Why you might still be single.

After hitting rock bottom at 40 years old, finding myself in an emotionally abusive relationship I didn’t know how to get out of or recover from, I let myself get pretty comfortable in my singledom for the majority of my 40s. I equated falling in love with pain and betrayal, so being alone felt much safer.

That particular relationship (now nine years ago) catapulted me onto a journey of desperately seeking how to heal the excruciating pain, and to my surprise, inspired me to become a heartbreak coach.

My coaching approach stems from the idea that our thoughts create our results.

Over the last five years, I’ve worked tirelessly to become an expert on helping women heal their hearts with their minds. But in the back of my mind, it gnawed at me that if I really was the creator of my reality and could heal the worst pain through conscious thinking, I knew that it was my brain’s unconscious thinking that was keeping me single and “safe.”

I finally had to recognize that my most-of-the-time single status wasn’t because dating sucks or my city sucks, or all the good ones were gone, or the trauma from my past relationship was the reason I “couldn’t” let myself love again.

I was single because I kept CHOOSING to be.

I was single because I believed it was “hard” to find my ideal girl.

I was single because my fear of getting hurt was greater than my desire to create an epic love story, which, I’m so happy to share, I have now finally created.

Editor’s note: Although this article uses male pronouns, the advice applies to all sexual orientations and gender identities.

So, how did I do it?

1) I stopped blaming past failed relationships and started using them as LEARNING LESSONS. After another Ms. Unavailable would go running for the hills, I’d ask myself, WHAT DID I LEARN? I couldn’t undo the past, so I decided to look for evidence to support how it SERVED me, vs. hurt me.

2) I fell in love with myself. Many people are super uncomfortable with this aspect of our work together – I used to be too. But if you aren’t authentically in love with you, how can you expect someone else to be? I started to evaluate who I was both as an individual and a partner (without an actual partner in the picture) and began to really soak up my value as both. Loving myself meant taking regular inventory of how proud I was of the man I’d become, honoring my commitment to grow in multiple areas of my life, and owning my worth, with or without a woman in the picture.

3) I got clear on the ideal woman I wanted to attract – I focused on the QUALITIES I was looking for in a partner. I wanted an emotionally available woman. Someone who was looking for a lifelong partnership. I wanted us to laugh and have crazy chemistry, but I wasn’t fixated on what she was supposed to look like – I just wanted to FEEL a certain way around her. I wanted to feel an innate trust and comfort in my body. Honesty and loyalty were also important to me.

4) I became the man my ideal woman would want to attract. This doesn’t mean I changed who I was for Future Ms. Right, but I thought about how I could up-level myself as a partner for my ideal girl, who would be way more up-leveled than the crummy ones I had been attracting. If I wanted to attract better, I needed to become better. For me, this was taking my business to the next level. There was something about becoming a total boss dude that not only turned me on, but I believed it would turn my ideal partner on. In order to grow my business and myself as a coach, I had to grow my self-worth, and I saw so clearly that this would translate into my worth as a partner. This might be different for you. In what area of your life do you know you’re not hitting your potential? Your finances? Health? An unhealed relationship? Focusing on improving other areas of your life will for sure create space for Ms. Right to enter.

5) I prepared for her arrival. I started to focus on my living space and created actual physical room for her. I had a whole drawer in my bathroom that was hers with a pink (womanly!) toothbrush waiting for her to use. I left empty shelves and extra hangers for her in my closet. I PRACTICED THOUGHTS I’D THINK AS IF SHE WAS ALREADY HERE, like, “I’m so lucky to have her, and she’s so lucky to have me.” Or, “It’s SO FUN seeing her after a long day of work.” I used my imagination and had FUN being the man in my ideal relationship BEFORE she arrived.

6) I didn’t let a bad date derail my commitment to finding her. No date was a waste of my time. It was a step closer to me finding her. I refused to let myself take a break from dating just because I went out with five girls in a row. I accepted that awkward, brutal dates were a part of the process, and I wasn’t going to let them deter my belief that my ideal girl was definitely out there.

7) I self-coached before each date. Right before the first date with my now-girlfriend, I felt super nervous because my brain had decided I already liked her. I had to check myself because of the many times I thought someone seemed great on screen and then not so great in real life. So I paused and found the thought creating the anxiety: I REALLY WANT US TO LIKE EACH OTHER! It sounds like an innocent thought, but it was coming from a place of pressure versus ease. I got clear and recognized that it would be a bummer if we didn’t hit it off, but I WOULD SURVIVE, like I had the prior 100 shitty first dates. The nerves didn’t fully go away with this new thought (which I don’t expect to happen on a first date), but I was showing up more present and clear than I had been over my many years of unconsciously dating.

Keep in mind that being responsible for your singledom isn’t an opportunity to shame and blame yourself. It’s an opportunity to heal, grow, and manifest the partnership of your dreams.

I had a blast implementing this new approach, and I tease my girl all the time that I created her with my brain!

So don’t forget to HAVE FUN as a single man, putting yourself out there in a completely new way. If you commit to these steps, you’ll for sure feel a powerful shift that will PUSH you so much closer to your dream partner. Why not give it a shot?

3 Words with More Impact than “I Love You”

How many times have you said ‘I love you’ to your partner or spouse?

Depending on the length of your relationship (and your level of expressiveness), it might be a hundred, a thousand, or a million times.

Expressing words of love is essential in every relationship, but what if you could learn 3 words that would light up your lover AND take your relationship to an even deeper level?

There are three almost magical words that when said with real sincerity will deepen your connection and allow your lover to feel really seen.



‘I appreciate you’ are the three magic words all of us need to say more often, in all our relationships, but certainly in our intimate ones.

Perhaps it’s because we so rarely hear these words that they are so powerful.

As one male attendee told us after we taught this at a business event:

“I tell my wife ‘I Love You’ all the time, but I used your 3-words and she actually started crying.”

“Then I started crying.

“In two decades, I’m not sure we’ve felt this connected! I just always thought she knew I appreciated her, but saying it was powerful.”

You may elicit a response like this the first time you say it, but hearing ‘I appreciate you’ never gets old!

Even after 17 years together, we say it to each other every day and it still feels like an emotional booster shot.

Appreciation is the cornerstone of a long-lasting relationship.

In a relationship, it’s easy to take each other for granted or fixate on the stuff that gets under your skin, but by looking for things to appreciate and then expressing that appreciation, you’ll both feel happier and more connected.

Research from the journal Cognition and Emotion shows that gratitude is THE quality that makes people want to spend time with you.

‘I appreciate you’ can actually be more powerful than ‘I love you’.


1. ‘I Love You’ Might Be Triggering — Almost every human has experienced heartbreak, trauma and betrayal at the hands of someone who “loved” them, so we often have mixed feelings about love. Appreciation, on the other hand, is usually a “safe” emotion for nearly everyone and is not so charged.

2. ‘I Love You’ Can Get Watered Down — we hear ‘I love you’ a lot in a relationship, and it can become almost a given, so it gets “watered down”. “I know you love me,” is a frequent response. Because we hear it so often, we almost take it for granted, and it loses its power.

3. ‘I Love You’ Can Be Manipulative — ‘I love you’ can be used to appease, mollify or even shut up our spouse. “You know I love you” can be manipulative, but ‘I appreciate you’ is often surprising and heartfelt.

4. Appreciation Heightens EVERYTHING — recent studies in gratitude, AKA appreciation, prove that when we are in a state of appreciation our physical, mental and emotional health drastically improves. Read How the Power of Gratitude Can Change Your Life. So imagine what being in a state of gratitude towards your partner could do to deepen connection, fan the flames of passion and enrich your friendship, not to mention boost both of your health and vitality.

Feel awkward or don’t know how to start?

Three Powerful Ways to Say I Appreciate You

• Simply look them in the eye and say, “I appreciate you”.

• Thank your lover for their unique qualities: “I appreciate that you’re so funny/kind/ambitious/loving.”

• Appreciate the little things: “I appreciate that you always make the coffee.” “I appreciate that you pick up the kids.” “I appreciate that you got the dry-cleaning today.”

BTW: we aren’t suggesting you don’t say I love you, but rather that you truly say it with heart AND you add in ‘I appreciate you’.

The key, of course, is to really mean it!

What can you appreciate about your partner today?

The White Man’s Jesus.

In the Bible itself, bodies matter, but not the way they do now. The ancient texts have sick bodies and healed bodies, pierced bodies and resurrected bodies. But for the most part, the Bible is pretty quiet about the colour of those bodies’ skin or the tone of their hair. To understand our contemporary obsession with the actors’ bodies in The Bible mini-series, we need to consider why something that is so silent in the Bible has become so salient in our approaches to it.

Throughout the 19th century, as new technologies allowed for the mass production and distribution of Bible images, some religious teachers worried that they could hinder the mission of the Church. One Presbyterian minister in New York City cautioned his congregants in the 1880s not to trust the imagery of Jesus they saw in picture-book Bibles and on stained-glass windows. ‘It is a remarkable thing in the history of Christ that nowhere have we any clue to His physical identity. The world owns no material portraiture of His physical person. All the pictures of Christ by the great artists are mere fictions.’

Just as it was time for slavery to end, it was also time for women and men of colour to refuse the language and images that associated darkness with evil, and whiteness with good

There was a serious theological reason for that minister’s concern: the lack of biblical detail about Christ’s physical features was crucial to the universal appeal of Christianity: ‘If He were particularised and localised — if, for example, He were made a man with a pale face — then the man of the ebony face would feel that there was a greater distance between Christ and him than between Christ and his white brother.’ Instead, because the Bible refused to describe Jesus in terms of racial features, his gospel could appeal to all. Only in this way could the Church be a place where the ‘Caucasian and Mongolian and African sit together at the Lord’s table, and we all think alike of Jesus, and we all feel that He is alike our brother’.

The theme of a universal Jesus has been a common response from American Christians to the question of what Jesus looked like. In 1957, Martin Luther King Jr’s advice column in Ebony magazine received a letter that asked: ‘Why did God make Jesus white, when the majority of peoples in the world are non-white?’ King answered with the essence of his political and religious philosophy. He denied that the colour of one’s skin determined the content of one’s character, and for King there was no better example than Christ. ‘The colour of Jesus’ skin is of little or no consequence,’ King reassured his readers, because skin colour ‘is a biological quality which has nothing to do with the intrinsic value of the personality’. Jesus transcended race, and he mattered ‘not in His colour, but in His unique God-consciousness and His willingness to surrender His will to God’s will. He was the son of God, not because of His external biological makeup, but because of His internal spiritual commitment.’

But in a society that separated people based on colour, God’s son wasn’t the only challenge for image-makers: the devil was, too. During the Civil War, one northern African-American, T Morris Chester, had announced that just as it was time for slavery to end, it was also time for women and men of colour to refuse the language and images that associated darkness with evil, and whiteness with good. Nearly a century before Malcolm X gained notoriety for such claims, Chester asked his fellows to wield consumer power to effect change. If, he said, you ‘want a scene from the Bible, and this cloven-footed personage is painted black, say to the vendor, that your conscientious scruples will not permit you to support so gross a misrepresentation, and when the Creator and his angels are presented as white, tell him that you would be guilty of sacrilege, in encouraging the circulation of a libel upon the legions of Heaven’.

By refusing the idea of the dark devil, Chester was going up against centuries of Christian iconography. Throughout medieval Europe, it was entirely regular to describe Satan as dark or black. Witches were known for practising ‘dark arts’, and in early colonial America when British immigrants to the New World accused others of being witches, they too conflated darkness with the demonic. The devil was everywhere in Salem in 1692, and he could take any number of physical forms. He did not always come in blackness or redness: Sarah Bibber saw ‘a little man like a minister with a black coat on and he pinched me by the arm and bid me to go along with him’. But most often he did: one witnessed Satan as a ‘little black bearded man’. Another saw him as ‘a black thing of a considerable bigness’, and yet another beheld the devil in the form of a black dog. The devil came as a Jew and as a Native American as well. In The Wonders of the Invisible World (1693), the Puritan theologian Cotton Mather associated Indians and black people with the devil: he wrote that ‘Swarthy Indians’ were often in the company of ‘Sooty Devils’, and Satan presented himself as ‘a small Black man’.

Because of America’s history and its contemporary demographics, there is almost no way to depict Bible characters without causing alarm

In the 20th and 21st centuries, debates over how to depict biblical figures have grown louder and more contentious. In large part, this is because of the increased importance of visual imagery in US culture. Whether at the movies or on TV, in magazines or on the internet, Americans produce and consume images at a staggering rate. Even in the 1930s, some African-American teenagers who took part in sociological surveys answered the question ‘What colour was Jesus?’ with ‘All the pictures of Him I’ve seen are white.’ That seemed definitive enough. Decades later, when Phillip Wiebe, professor of philosophy at Trinity Western University in Canada, interviewed people for his book Visions of Jesus (1997), a man named Jim Link reported having a visionary experience in which Jesus ‘had a beard and brown shoulder-length hair, and looked like the popular images of Jesus in pictures’.

At times, films have tried to avoid controversy by obscuring biblical characters, as in Ben-Hur (1959) or The Robe (1953). In those cases, we see the back or the arm of Jesus, but never his face. At other times, filmmakers have seemed to beg for controversy, such as the casting of the black actor Carl Anderson in the role of Judas Iscariot in the film Jesus Christ Superstar (1973), released just five years after Martin Luther King Jr’s assassination.

Questions of race and identity have now become inescapable elements of any public presentation of the Bible. Mel Gibson digitally altered The Passion of the Christ (2004) to transform the actor Jim Caviezel’s eyes from blue to brown — in an attempt to make his Jesus character look more Jewish. But even with this change, and a prosthetic nose attached to Caviezel’s face, some critics nonetheless denounced the film for presenting Jesus as a typical white American man, excluding, as those earlier ministers had worried, the ‘man of the ebony face’.

The Bible mini-series is yet another example of how Americans have portrayed Bible characters visually, debated what those characters did or should look like, and discussed whether those figures should be put into flesh at all. The debates haven’t simply been about religion. They have also shown how entangled politics and religion are in America, with questions such as whether President Obama is working on the side of God or the side of the devil. And big money is involved — whether in the form of high ratings and advertising revenue from TV and film aimed at the huge evangelical Christian market, or in the lucrative industries that publish Bibles and tracts depicting, perhaps unwittingly, Jesus and the devil on opposite sides of a racial divide.

Because of America’s history and its contemporary demographics, there is almost no way to depict Bible characters without causing alarm. To call Jesus ‘black’ signals political values that are associated with the radical left. In 2008, President Obama’s pastor Jeremiah Wright almost cost him the Democratic nomination because of his claims that ‘Jesus was a poor black man’. However, to present Jesus as white in a society where African-Americans, Asian-Americans, and Latino Americans make up increasing numbers of the population is quickly understood as a code for a conservative worldview. Little wonder, then, that some Americans are choosing to describe Jesus as ‘brown’ as a way to avoid the white-black binary. If one attends an anti-conservative rally in the US, for instance, one is likely to find a poster that reads: ‘Obama is not a brown-skinned, anti-war socialist who gives away free health care. You’re thinking of Jesus.’

Civil Disobedience

Civil discourse is in an accelerating downward spiral of coarse insult, free-flying contempt and general meanness. We will surely soon reach bottom, an inevitably inarticulate resting place where we quit wasting words and just mutely flip each other off. Since bemoaning our uncivil culture is almost as prevalent as incivility itself, let me forgo any ritual hand wringing. I register the culture here because it so influences me: as public discourse grows crueller, nastier and more aggressive, my temptations to be uncivil increase apace, and I don’t like that.

My growing temptations to incivility are diverse and predictable. When one encounters disrespect, the desire to answer in kind is strong. Likewise, with so many pitched to provoke anger, one wants to give them just the outrage they invite. More basically, I find it ever harder to like people and so to act as if I like them – misanthropy does not seem so unreasonable as it once did. But incivility’s most powerful appeal is that it can seem downright righteous.

The desire to be civil, in its cleanest and most robust form, is a desire to be moral, to treat others humanely, with respect, toleration and consideration. But if one wants to be moral, one must also know that, in order to be good, sometimes one cannot be nice. The imperative to treat others civilly is never responsibly total because sometimes a moral good is won in rudeness. To display disrespect or enmity, to mock or shun, to insult or shame – these can be moral gestures. For even as we need to respect humanity, valuing human beings can sometimes require disrespecting some of them, precisely the ones who deny or damage our shared humanity. To show such people respect and consideration might let them have their way a bit, let them continue in their destructive ways.

My sneering contempt for your terrible moral outlook might not stop you, but maybe my disdain can slow you down or discourage others from doing like you do. This, then, is where temptation is at its greatest. There are many who do not so much succumb, but actively embrace it. The world at present is not just full of rude people, it is full of people being rude because they judge it to be righteous. I feel the pull. But I have doubts.

Can our self-conscious minds save us from our selfish selves?

Like all living things, humans are organisms, biological entities that function as physiological aggregates whose constituent parts operate with a high degree of cooperation and a low degree of conflict. But unlike other organisms, humans possess a rogue component – a brain network that can, at will, choose to defect and undermine the survival mission and purpose of the rest of the body. This is the network that underlies human consciousness, and especially our capacity for autonoetic, or reflective, self-awareness, the basis of the conceptions that underlie our greatest achievements as a species – art, music, architecture, literature, science – and our ability to appreciate them.

The autonoetically conscious human brain is the only entity in the history of life that has ever been able to choose, at will, to terminate his or her own existence, or even put the organism’s physical existence at risk for the thrill of simply doing so – the other cells and systems be damned. Some argue, on the basis of anecdotal evidence, that some other animals also commit suicide. But whether such behaviours are truly intentional, in the sense of being based on a thought about causing one’s self to cease to exist, is controversial. In the late-19th century, the sociologist Émile Durkheim proposed that the term ‘suicide’ should be used only in cases of death resulting directly or indirectly from a positive or negative act that the individual knows or believes will produce the intended result – death. Durkheim argued that conceiving a goal of this kind depends on possession of a reflective form of consciousness that other animals lack – that the physiological capacities they possess are insufficient to this purpose. He concluded that true suicide, in its various forms, is a social condition of humans.

Early humans are believed to have been unremarkable compared with coexisting fauna. Then, at some point (estimates range between 50,000 and 200,000 years ago), something happened to distinguish our ancestors from the rest of the animal kingdom. They developed novel capacities and ways of existing and interacting with one another – language; complex hierarchical relational reasoning; representation of self versus other; mental time-travel. Autonoetic consciousness, the human ability to know about our own existence, was the result.

That autonoesis might be unique to humans does not mean that it appeared out of the blue. For one thing, our primate ancestors had sophisticated cognitive capacities, including working memory and executive functions. These made possible the integration of perceptual and mnemonic information in real time, and the ability to deliberate about alternative courses of action. Such capacities are known to depend on networks involving lateral areas of prefrontal cortex. This is important because both human and nonhuman primates possess these areas, but other mammals do not. Perhaps these networks allowed ancestral primates to have a noetic (factual or semantic) consciousness of objects and events, including the ability to distinguish between what is useful and what is harmful, and maybe even to have a simple semantic version of self-awareness. But they would not have been able to experience their self as an entity with a personal past, and imagine possible futures, including the existential realisation of future nonexistence. This capacity for autonoesis, I propose, depended on the emergence of unique, enriched features of prefrontal networks that humans are known to possess, but that even other primates lack.

Given that autonoetic consciousness can undermine the survival goals of the organism, it must have had useful consequences. Perhaps it enabled the ability to have a self-focused perspective on the value of objects and events to the individual – to the self. Without the involvement of one’s subjective self, what we humans call emotions cannot be experienced. Other animals might have some kinds of emotional experiences in significant situations in their lives, but without autonoesis they cannot have the kinds of experiences we do.

The personal, self-centred nature of the autonoetic mind leads it to assume that it is always in charge of its body’s actions. Indeed, so-called free will is one of our most cherished narratives. For example, Judeo-Christian religions teach that humans attain heaven in the afterlife through their choices in life. René Descartes’s dualistic philosophy was an attempt to reconcile such religious conceptions in light of the scientific revolution begun by Copernicus and Galileo. The philosopher Søren Kierkegaard later proposed that anxiety is the price we pay for this freedom to choose. While some movements in modern science – behaviourism being a prime example – have attempted to suppress consciousness as a scientific construct, consciousness itself did not let that rejection stand. Today, the science of consciousness is a vibrant field.

The kind of consciousness supported by our unique kind of brain has enabled us to conquer frontiers. We have the power to change the environment to meet our needs; satisfy our whims, desires and fantasies; and protect ourselves from our fears and anxieties. Imagining the unknown inspires us to find new ways of existing. Pursuing these comes with risks, but we can also anticipate them and conceive of possible solutions in advance.

Our thirst for knowledge has led to scientific and technological discoveries that have made life, at least for the lucky among us, easier in many ways. We don’t have to forage for food or drink in dangerous settings – attacks by other species, which are so common in the animal kingdom, are simply not part of daily life for most humans. Food is kept fresh by refrigeration. We easily combat seasonal changes in temperature with other convenient appliances. We have access to medications to treat, and even prevent, common illnesses, and surgical procedures can fix and, in some cases, replace damaged body parts. We can electronically communicate with people anywhere in the world instantaneously.

The internet has indeed transformed life in ways worth celebrating but, like most good things, it comes at a cost. It has made it easier to be self-centred, facilitating realignments of interests that oppose the common good and challenge commonly accepted beliefs through hearsay and rumour, and even outright lies. False assertions gain credence simply through rapid repetition. Some use such tactics to undermine the value of science and its contributions to life and wellbeing, and to attack the foundations of our social structures, including our government, and its safety nets for those in need, and its checks and balances against tyranny.

The pace of change to our ecosystem has become fast and furious. Global temperatures and sea levels are rising. Weather patterns are in flux. Forests are burning. Deserts are expanding. Species are becoming extinct at unprecedented rates. Many alarmed observers have called for efforts to reverse, or at least slow, changes brought on by our choices. According to the astrophysicist Adam Frank, the Earth will surely persist in some form, but it is likely that some of the life forms present today will not make it. History tells us that large organisms with energy-demanding lifestyles are especially vulnerable to environmental reconfigurations. Never, in the history of life, has any species asked more of the environment than we have.

Pondering such issues, the philosopher Todd May recently asked: ‘Would human extinction be a tragedy?’ He concluded that the planet might well be better off without us, but that such an outcome would indeed be a tragedy, as we have achieved remarkable things as a species. Autonoesis, I contend, has made these possible. But it also has a dark side. With self-consciousness comes selfishness, and narcissism, enabling our most troubling and base dispositions towards others – distrust, fear, hate, greed and avarice. According to the philosopher Christophe Menant, it is the root of evil.

Yet only self-conscious minds can come to the realisation, as May’s mind did, that we have an obligation to confront our selfish nature for the good of humankind, as a whole. To act on this will require a global effort. If we succeed in joining together to rise above short-sighted policies and self-indulgent desires we might avert some of the more drastic changes in the configuration of life, and preserve some kind of future for our descendants.

We persist as individuals only if we persist as a species. We don’t have time for biological evolution to come to the rescue – it’s too slow a process. We have to depend on more rapid avenues of change – cognitive and cultural evolution – which, in turn, depend on our autonoetic minds. In the end, whether humans will be part of the Earth’s future is up to us – to the choices our self-conscious minds make.

What makes a super Dad?

To understand the role of the father, we must first understand why it evolved in our species of ape and no other. The answer inevitably lies in our unique anatomy and life history. As any parent knows, human babies are startlingly dependent when they are born. This is due to the combination of a narrowed birth canal – the consequence of our bipedality – and our unusually large brains, which are six times larger than they should be for a mammal of our body size.

So mum births her babies early and gets to invest less time in breastfeeding them. Surely this means an energetic win for her? But since lactation is the defence against further conception, once over, mum would rapidly become pregnant again, investing more precious energy in the next hungry foetus. She would not have the time or energy to commit to finding, processing and feeding her rapidly developing toddler.

At this point, she would need help. When these survival-critical issues first appeared around 800,000 years ago, her female kin would have stepped in. She would have turned to her mother, sister, aunt, grandma and even older daughters to help her. But why not ask dad? Cooperation between individuals of the same sex generally evolves before that between individuals of different sex, even if that opposite-sex individual is dad. This is because keeping track of reciprocity with the other sex is more cognitively taxing than keeping track of it with someone of the same sex. Further, it has to be of sufficient benefit to dad’s genes for him to renounce a life of mating with multiple females, and instead focus exclusively on the offspring of one female. While this critical tipping point had not yet been reached, women fulfilled this crucial role for each other.

But 500,000 years ago, our ancestors’ brains made another massive leap in size, and suddenly relying on female help alone was not enough. This new brain was energetically hungrier than ever before. Babies were born more helpless still, and the food – meat – now required to fuel our brains was even more complicated to catch and process than before. Mum needed to look beyond her female kin for someone else. Someone who was as genetically invested in her child as she was. This was, of course, dad.

Without dad’s input, the threat to the survival of his child, and hence his genetic heritage, was such that, on balance, it made sense to stick around. Dad was incentivised to commit to one female and one family while rejecting those potential matings with other females, where his paternity was less well-assured.

As time ticked on and the complexity of human life increased, another stage of human life-history evolved: the adolescent. This was a period of learning and exploration before the distractions that accompany sexual maturity start to emerge. With this individual, fathers truly came into their own. For there was much to teach an adolescent about the rules of cooperation, the skills of the hunt, the production of tools, and the knowledge of the landscape and its inhabitants. Mothers, still focused on the production of the next child, would be restricted in the amount of hands-on life experience they could give their teenagers, so it was dad who became the teacher.

This still rings true for the fathers whom my colleagues and I research, across the globe, today. In all cultures, regardless of their economic model, fathers teach their children the vital skills to survive in their particular environment. Among the Kipsigis tribe in Kenya, fathers teach their sons about the practical and economic aspects of tea farming. From the age of nine or 10, boys are taken into the fields to learn the necessary practical skills of producing a viable crop, but in addition – and perhaps more vitally – they are allowed to join their fathers at the male-only social events where the deals are made, ensuring that they also have the negotiation skills and the necessary relationships that are vital to success in this tough, marginal habitat.

In contrast, children of the Aka tribe of both sexes join their fathers in the net hunts that take place daily in the forests of the Democratic Republic of Congo. The Aka men are arguably the most hands-on fathers in the world, spending nearly half their waking time in actual physical contact with their children. This enables them to pass on the complex stalking and catching skills of the net hunt, but also teaches sons about their role as co-parent to any future children.

And even in the West, dads are vital sources of education. I argue that fathers approach their role in myriad different ways dependent upon their environment but, when we look closely, all are fulfilling this teaching role. So, while Western dads might not appear to be passing on overtly practical life skills, they do convey many of the social skills that are necessary to succeed in our competitive, capitalist world. It is still very much the case that the wheels of success in this environment are oiled by the niceties of social interaction – and knowing the rules of these interactions and the best sort of person to have them with gives you a massive head start, even if it is just dad’s knowledge of a good work placement.

Fathers are so critical to the survival of our children and our species that evolution has not left their suitability for the role to chance. Like mothers, fathers have been shaped by evolution to be biologically, psychologically and behaviourally primed to parent. We can no longer say that mothering is instinctive yet fathering is learned.

The hormonal and brain changes seen in new mothers are mirrored in fathers. Irreversible reductions in testosterone and changes in oxytocin levels prepare a man to be a sensitive and responsive father, attuned to his child’s needs and primed to bond – and critically, less motivated by the search for a new mate. As a man’s testosterone drops, the reward chemical dopamine increases; this means that he receives the most wonderful neurochemical reward of all whenever he interacts with his child. His brain structure alters in those regions critical to parenting. Within the ancient, limbic core of the brain, regions linked to affection, nurturing and threat-detection see increases in grey and white matter. Likewise enhanced by connectivity and the sheer number of neurons are the higher cognitive zones of the neocortex that promote empathy, problem solving and planning.

But crucially, dad has not evolved to be the mirror to mum, a male mother, so to speak. Evolution hates redundancy and will not select for roles that duplicate each other if one type of individual can fulfil the role alone. Rather, dad’s role has evolved to complement mum’s.

Nowhere is this clearer than in the neural structure of the brain itself. In her 2012 fMRI study, the Israeli psychologist Shir Atzil explored the similarities and differences in brain activity between mothers and fathers when they viewed videos of their children. She found that both parents appeared similarly wired to understand their child’s emotional and practical needs. For both parents, peaks of activity were seen in the areas of the brain linked to empathy. But beyond this, the differences between the parents were stark.

The mother’s peaks in activity were seen in the limbic area of her brain – the ancient core linked to affection and risk-detection. The father’s peaks were in the neocortex and particularly in areas linked to planning, problem solving and social cognition. This is not to say that there was no activity in the limbic area for dad and the neocortex for mum, but the brain areas where the most activity was recorded were distinctly different, mirroring the different developmental roles that each parent has evolved to adopt. Where a child was brought up by two fathers, rather than a father and a mother, the plasticity of the human brain had ensured that, in the primary caretaking dad, both areas – mum’s and dad’s – showed high levels of activity so that his child still benefited from a fully rounded developmental environment.

Fathers and their children have evolved to carry out a developmentally crucial behaviour with each other: rough-and-tumble play. This is a form of play that we all recognise. It is highly physical, with lots of throwing up in the air, jumping about and tickling, accompanied by loud shouts and laughter. It is crucial to the father-child bond and the child’s development for two reasons. First, the exuberant and extreme nature of this behaviour allows dads to build a bond with their children quickly; it is a time-efficient way to get the hits of neurochemicals required for a robust bond, crucial in our time-deprived Western lives where it is still the case that fathers are generally not the primary carer for their children. Second, due to the reciprocal nature of the play and its inherent riskiness, it begins to teach the child about the give and take of relationships, and how to judge and handle risk appropriately; even from a very young age, fathers are teaching their children these crucial life lessons.

And how do we know that dads and kids prefer rough-and-tumble play with each other rather than, say, having a good cuddle? Because hormonal analysis has shown that, when it comes to interacting with each other, fathers and children get their peaks in oxytocin, indicating increased reward, from playing together. The corresponding peak for mothers and babies is when they are being affectionate. So, again, evolution has primed both fathers and children to carry out this developmentally important behaviour together.

This has meant that, to ensure the survival of mother and baby and the continued existence of our species, we have evolved a shortened gestation period, enabling the baby’s head to pass safely through the birth canal. The consequence of this is that our babies are born long before their brains are fully developed. But this reduced investment in the womb has not led to an increased, compensatory period of maternal investment after birth. Rather, the minimum period of lactation necessary for a child to survive is likewise drastically reduced; the age at weaning of an infant child can be as young as three or four months, a stark contrast to the five years evident in the chimp. Why is this the case?

If we, as a species, were to follow the trajectory of the chimpanzee, then our interbirth interval (the time between the birth of one baby and the next) would have been extraordinarily long; so complex and so energy-hungry is the human brain that we would have been unable to replace, let alone increase, our population. So, evolution selected for those members of our species who could wean their babies earlier and return to reproduction, ensuring the survival of their genes and our species. But because the brain had so much development ahead of it, these changes in gestation and lactation lengths led to a whole new life-history stage – childhood – and the evolution of a uniquely human character: the toddler.