Critique of The World Economic Forum’s Vision for Education

Raphael, The School of Athens, 1509–1511

From a survey of 3,500 businesses in 2015, the British Chambers of Commerce (BCC) found that companies are dissatisfied with the quality of the young candidates entering the job market.[1] Other articles frame millennial job seekers as suffering from entitlement, sloppiness, a lack of persistence and a flimsy grip on their emotions: a lack of soft skills, as it were, which makes hiring them challenging.[2] The internet also has no shortage of vile descriptions of millennials; they have short attention spans, are self-centred and are motivated by passion rather than reason and common sense. They lack the Protestant ethic,[3] one blogger wrote: the old tradition of rolling up your sleeves, working the skin off your hands and sweating blood just to create a meaningful life.

This impotence of the youth appears to be corroborated by mountains of unemployment data. Britain, a country with a moderate national unemployment rate of 5.4% in 2015, had a youth unemployment rate almost three times higher, at 14.9%. This phenomenon is not unique to Britain. The American Bureau of Labor Statistics (BLS) reported similar numbers for youth unemployment, that is, people between 18 and 34 years old, whose unemployment rate was 14.4%,[4] again more than three times the national average of 3.9%. The South African numbers are jaw-dropping, but present a similar statistical reality, with a youth unemployment rate of 63% against a backdrop of a 30.1% national average.[5] From all this, we can conclude that the search parties are out. Companies and policymakers around the world are desperately looking for solutions to curb youth unemployment.

So far, the media, large economic institutions, businesses and policymakers are wagging their fingers at schools, blaming them for failing to produce economically viable individuals. For instance, the World Economic Forum (WEF) published a report titled “New Vision for Education: Unlocking the Potential of Technology”[6], a decree, as it were, that outlines what schools ought to do to get their act together.

The report identifies 16 skills that it deems “critical” for the 21st century. The skills are grouped into three categories, namely: a) Foundational Skills, b) Competencies and c) Character Qualities. The first category, Foundational Skills, includes literacy, numeracy, ICT literacy and financial literacy, essentially the skills traditionally associated with schools. Competencies, the second category, includes critical thinking, creativity, communication and collaboration, while the third, Character Qualities, includes curiosity, persistence, adaptability and leadership, amongst others. The report goes on to declare that the WEF surveyed 100 countries and found gaps in the extent to which schools produce individuals with these skills, and, as expected, it claims that developed countries do a better job.

And the solution? The WEF developed a model called the Closed Loop, in which a team of educators comes together, first to determine learning objectives. They then compose a curriculum, which becomes a scripted manual from which teachers deliver instruction. Finally, with the help of technology, they continuously assess the learners and provide tailored interventions to help them succeed. The report closes with a catalogue of software products and case studies implementing the Closed Loop model; needless to say, it cites only the success stories.


To truly appreciate the scope of trying to craft a vision for the future of education, it is necessary to begin by briefly scanning the annals of history to get a sense of our present relationship with learning. Throughout the ages, the nature of education changed to track the intellectual fashions of the times. For instance, Socrates (c. 470–399 BC) and his student Plato (c. 428–348 BC) established idealism. That is the notion that there is a perfect or ideal version of all things, and that our pursuit of knowledge will invariably lead us back to a preexisting ideal. In other words, there is nothing new under the sun.

Interestingly, the side effects of idealism manifested most prominently in medicine. Hippocrates, the father of modern medicine and a contemporary of Socrates and Plato, held a similar view. To him and his followers, illnesses were a kind of intrusion, an imbalance that disturbed the ideal, healthy state of the body. He reasoned, therefore, that there must be physical causes of illness, and that the correct diagnosis could lead to the right remedy. That was revolutionary thought in a world steeped in alchemy, magic and the supernatural. In the spirit of the ideal, however, the Hippocratic school held that diagnosis ought to be strictly observational and non-intrusive, lest one disturb the body even further by prodding it, introducing all sorts of foreign objects and removing it still further from the ideal. And that turned out to be a problem of historic proportions.

In his masterpiece The Great Influenza, John Barry notes that it took some 2,000 years (with blips of advancement here and there) for the medical world to truly break this Hippocratic hangover.[7] Only in the 18th and 19th centuries, when the Enlightenment, a different philosophical construct, was fully manifest, did physicians find the courage to dissect, prod and probe the body to understand how it works and to discover what could otherwise never be known. In this sense, we can see how the metaphysics of idealism created an epistemological blind spot for the medical world, and how philosophies can manifest as both advances and limitations for education.

This understanding, therefore, raises the following questions. What are the philosophical underpinnings of our times? And how are these ideas manifesting themselves in the metaphysics of education? Only by first answering these questions can a vision for the future of education be crafted in earnest.


Legend has it that during the early 1300s, a Mongolian merchant arrived home in a small village after months of travel.[8] He embraced his wife and children, exchanged pleasantries and wild stories of his expedition with neighbours, and eventually retired to get some rest. A few days later, however, he developed a fever, which his wife treated dutifully with herbs, prayer and other traditional magic. A week later, still bedridden, he developed growths of pus on his neck, armpits and groin. They kept growing until they were as large as meatballs, oozing with warm pus. Eventually the meatballs burst open, giving off a miasma of rotting flesh. After that, his hands and feet started losing all sensation and turning black. Days later, he started coughing non-stop until his lungs eventually ruptured and he began drowning in his own blood. And then, finally, he expired. After his burial, his wife developed a similar fever; then his children, then the village, and eventually the strange disease spread along trade routes until it reached the walls of Kaffa.

Kaffa, now Feodosiya in Ukraine, was a cosmopolitan Italian trading outpost with a mixture of merchants and citizens from different parts of the world. In 1346 the city was under siege from the mighty Mongol army, under the leadership of Jani Beg. While the Italians braced themselves behind Kaffa's reinforced walls, the Mongol soldiers were fighting a different war outside. They were harassed by a mysterious disease, beginning with a fever, followed by growths of giant meatballs on their necks, the blackening of limbs and ruptured lungs. The disease spread swiftly from one soldier to the next and threatened to sabotage the siege. But Jani Beg, unrelenting, hatched a plan to force the Italians out from behind Kaffa's formidable walls. He ordered his generals to round up the dead bodies, load them onto catapults and hurl them over the city walls, in what became the first known act of biological warfare. As expected, the Italians and the merchants and citizens from other parts of the world fell prey to the disease and clamoured out of the city, mostly going back home and inadvertently infecting others. A global pandemic began.[9]

Within a few years, the disease, which became known as the Black Death, had wiped out more than 60% of the European population.[10] Being staunch believers that the church mediated between them and God, people turned to the Pope, Clement VI, for answers. He declared that the disease was an act of wrath from God for the sins of the world, and ordered mass graves. Slowly but surely, however, people grew apathetic about the church's inability to summon God's will to their salvation. In response to this, and to avoid the logistics and humiliation of having to bury thousands of people, the Pope consecrated the River Rhone and had the disease-ridden corpses hurled into it, inadvertently infecting more people downstream.[11] Amazingly, rumours spread that the Jews were poisoning wells and the water supply. Then, on the 14th of February 1349, a thousand Jews were burnt alive in Strasbourg, in what is known as the Strasbourg Massacre, and many more were expelled from the city.

In the meantime, physicians were completely stumped by the disease. No amount of non-intrusive observation and prognostication led them to its true causes. Absurdly, some physicians advised the Pope to keep an array of torches around him at night to chase the disease away. Some 300 years later, in 1665, in the dark alleys of London, the plague was travelling through rats and fleas to infect more people.[12] (Note that some scholars question whether the Black Death of the 14th century and the Great Plague of the 17th century were even the same disease; nevertheless…)[13] Still in the dark, the physicians suspected that cats and dogs were the carriers. In 1665 more than 40,000 cats and dogs were killed in London; of course, the rats multiplied and the disease spread exponentially. Soon after that, the government hired watchmen to enforce 40 days of lockdown on anybody who showed symptoms, together with their entire family. Incidentally, the trend of the summer holiday house also began amongst the rich as they retreated from London to the countryside.

There was no solution to the plague, and small pockets of intellectuals began expressing their frustration. To articulate their disillusionment, consider these words from Sir Roger Scruton: “When we pray, we do not command the world to obey us. On the contrary, we humbly acknowledge our lack of power and ask God to intervene on our behalf. Prayer is a recognition of our weakness and a resolve, at the same time, to deserve God's help.”[14] This was, in essence, the divine relationship that the thinkers of the time grew dissatisfied with, and they sought a new way of thinking. Amongst these intellectuals was René Descartes, who defiantly declared, “I think, therefore I am,” thus breaking the covenant between man and God.

Descartes and his contemporaries wrote feverishly to develop a new system of thinking that quenched their curiosity about how the world works, rather than leaving it to magic and God. They ushered in a new intellectual era, the Age of Enlightenment (also called the modern era), in which, for the first time, curiosity, thinking and reasoning were elevated above the occult. Needless to say, education changed radically to incentivise critical thinking through the scientific method[15] instead of faith through the scriptures. This new mindset gave rise to Isaac Newton, Galileo Galilei and Charles Darwin, to mention but a few. Even politics changed. God-like monarchies underwent civil wars and revolutions that reformed blood-tied systems of government into constitutional democracies in which power was legitimised by the people. Economies also changed towards capitalism, once again moving away from monarchies and theocracies towards an individualist, “self-made” worldview. Simply put, during the 300 years of the Enlightenment era, the Western world changed and leapfrogged over other empires, notably the Chinese and African empires, that had been just as powerful and innovative, if not more so, in the preceding centuries.

By the late 1700s, people were gathering spontaneously to share ideas and learn from one another. These gatherings matured into the universities as we know them today, all of which claimed a space in society as hubs for intellectual discourse and for the pooling of resources for scientific endeavours. Owing to the foresight and fortitude of Horace Mann, an American educational pioneer, schools were reformed to include children from all religious and ethnic backgrounds. Their curricula were also changed to suit the times and to loosen the grip of religion, for the most part at least.[16] Mann argued that a society cannot be free and ignorant at the same time, a thought that influenced policymakers throughout the United States and the world. Sadly, it is an idea that large pockets of society today seem to have forgotten.


Today we are still enjoying the fruits of the Enlightenment era, and indeed of the pre-Enlightenment age. But we must also contend with our own times, the post-modern era, the new intellectual flavour that influences us today. For instance, it is now normal for a customer to know more about a product than a salesperson; gone are the days when companies could pummel consumers into submission with a barrage of advertisements. Furthermore, a student can know more about a subject than their teacher. All of this is a consequence of the internet, which ushered in the information age.

As a result, the individual is empowered to service their self-interests more acutely, and to find others with whom they share those interests more readily. Ultimately, the individual is likely to adopt beliefs and values that are not prescribed by their biological or geographic community. Instead, we can reach into the virtual cosmos, as it were, and find other, more resonant communities and people. Manuel Castells, a professor at the University of Southern California, calls this process individuation, following Carl Jung, who postulated that a wholesome person is one who has discovered and come to terms with all aspects of their being; one who can resist mass-mindedness, especially of the kind that was prevalent in Marxism and Nazism.[17] Castells argues that “…there is a shift toward the reconstruction of social relationships, including strong cultural and personal ties that could be considered a form of community, on the basis of individual interests, values, and projects.”[18]

Given Castells's arguments, however, it is also worth noting a 2017 report from the World Health Organisation (WHO) citing more than 300 million cases of depression and anxiety, seemingly a new psychological epidemic of our times.[19] Some scholars attribute this to narcissism, a byproduct of hyper-individuation. In his book Individuation and Narcissism, Mario Jacoby points out that blindly pursuing self-interest can lead to the notion that happiness is (in a Platonic sense) an ideal that ought to be chased at whatever cost.[20] In the same vein, it follows that ideas that appear to differ from or threaten one's happiness ought to be eradicated. Ultimately this intellectual fashion, supported by the ease of finding in-groups and the low cost of slandering and provoking others behind the veil of the internet, promotes intolerance and anxiety amongst people and communities with seemingly different objectives or ideals.


Even though education has changed throughout the ages, the prevailing constant is the promise of passing down valuable knowledge to the next generation, such that they might live wholesome and productive lives. As it stands, we live under the influence of three intellectual eras: the pre-modern, the modern and the post-modern. These intellectual personalities, as it were, are all grappling for a share of our being. From this perspective, we can appreciate the overwhelming difficulty of trying to manage, let alone conceive of, a comprehensive system for the future of education. Nevertheless, any grand vision for the future must respond to these complexities as effectively as Horace Mann did for American schools in his time.

The World Economic Forum's vision tries to follow in Mann's footsteps by advising, and even referring to, technologies that offer tailored learning pathways. Indeed, this is a response to Castells's phenomenon of individuation. What is confusing about the vision, however, is that it also promotes the central development of scripted curricula. The report presents this notion in its Closed Loop model and in a case study in which it tried to resolve the poor quality of teaching in some Kenyan schools.

The first challenge with this approach is that if learners have access to the internet, they will have access to the best teachers around the world, making the scripted lessons obsolete. A second, more fundamental problem is the assumption that a central committee knows what ought to be learned in the 21st century. We might recall that at the dawn of the Enlightenment era, disciplines emerged spontaneously out of curiosity rather than from the dogged promulgation of preconceived curricula. There was simply no way of knowing what the new domains of study would be, except to excavate them with the scientific method. Similarly, the modern, individuated child is spontaneously learning podcasting, video editing, artificial intelligence, screencasting and other technologies without the aid of formal education. If there is anything to learn from history, it is that the current notion of preconceived curricula runs the risk of shrinking into obscurity, as did the pre-Enlightenment schools.

The WEF's report is not entirely flawed. Basic skills such as literacy and numeracy have enjoyed a degree of importance and relevance throughout history. The report recognises this and even extends the catalogue of fundamental skills to include ICT literacy (presumably basic computer literacy), financial literacy and cultural literacy. Given the general advantages that these fundamental skills bring to daily life, they should indeed be imported into the “post-modern” school. The Competencies presented in the report, namely critical thinking, creativity, communication and collaboration, are, however, misappropriated as 21st-century skills. They are old, Enlightenment-era skills that hang on the backbone of the scientific method. Be that as it may, they too should be imported into the post-modern school, not as an innovation but as a return to the traditions of the scientific method as a pedagogical tool, particularly for younger children.

The third category of the WEF's 21st-century skills (leadership, adaptability, grit and curiosity) calls for more scrutiny. Understanding personality has been a cause of tremendous agony for psychologists and psychoanalysts for decades, and the Big Five personality traits are amongst the latest developments.[21] This theory hinges on a long-standing assumption that people naturally develop different personality traits, and that this is what makes us different from one another. In other words, what the WEF's report calls Character Qualities are better understood as personality traits that develop naturally, to varying degrees, in different people. An entrepreneurial person, for instance, will pursue their goals even at the cost of leaving their country or being cast out by their family; that is supposedly grit. In terms of the Big Five theory, it is because they are highly disagreeable, highly open to experience and highly conscientious, and less because they were taught to be like that. On the contrary, entrepreneurial people exhibit these “leadership” qualities, adaptability, grit and cultural awareness, even when they are wholly incompetent. In this sense, framing Character Qualities as skills that presumably everybody must acquire is a misnomer, because these qualities occur naturally and in varying degrees amongst people, as they have throughout history.

Unfortunately, the report is silent on the most significant intellectual malaise of the post-modern era, namely the byproducts of hyper-individuation: narcissism, depression, anxiety and intolerance. It could be that the notion of Character Qualities was an attempt to capture the essence of this post-modern reality. Be that as it may, the solutions to character development are similarly ancient. One of the most pervasive multicultural phenomena is the rite of passage. The English had fraternities such as the Scouts, where children learned cultural values, respect, teamwork and the like. African and some Asian cultures also have practices in which young boys and girls are taken out of society, secluded and taught cultural values, poetry, music and history.

Even though pockets of these cultural schools still prevail, they are largely outdated and shunned by the public; more alarmingly, there is no replacement except for television and smartphones. In this respect, there is something to learn from the Chinese. According to Professor Zhang Weiwei, China is a civilisation-state: a civilisation in the sense that it espouses its traditional values to preserve a common identity and unity amongst its people, and a state in the sense that it uses modern institutions to contend with the world.[22] Perhaps the post-modern education system can similarly combine history and modernity to reclaim the soul of education from the underworld of post-modernity.


Given the gargantuan task of crafting a vision for the future of education, the WEF's report is short on sophistication. Instead, it bends towards propagating a system, the Closed Loop, which not only contradicts itself, as shown herein, but attends only shallowly to the complexities of education. Be that as it may, the report makes a fair attempt at capturing the need for universal literacy. However, it is also necessary to introduce the rigours of the scientific method in schools from an early age, rather than the present regurgitation of mostly irrelevant academic material. Lastly, the education system must find practical, dare I say traditional, ways of inculcating essential human values.

We live in altogether different times. It could well be that businesses are finding it challenging to work with millennials and younger people. However, it is also worth noting that millennials, the internet-driven customers, are finding it equally difficult to work with traditional businesses. Therefore, instead of wagging fingers at the education system to manufacture proper humans, it is more sensible to appreciate the extent to which our times have changed over a relatively short period.

Instead of looking to large institutions such as the World Economic Forum for one-size-fits-all solutions, we ought to accept individuation as the more prevalent cultural phenomenon. In other words, the sovereignty of the individual is rising, and policies that cling to archaic hierarchical systems rather than new networked social systems will fizzle into irrelevance, as did the monarchies and the churches in the wake of the Enlightenment era. In a more technical sense, policies (and indeed education) must shift from a strictly macro perspective and attend to the micro, meso and exo levels of people's needs[23], a topic for another day.

[1] “BCC: Businesses and schools ‘still worlds apart’ on readiness for work.” (accessed Aug. 24, 2020).

[2] “Why SA’s Millennials and money don’t add up.” (accessed Aug. 24, 2020).

[3] “Why Youth Are Unemployable,” Adam F. C. Fletcher, Jul. 10, 2014. (accessed Aug. 24, 2020).

[4] “A-13. Employment status of the civilian noninstitutional population by age, sex, and race.” (accessed Aug. 24, 2020).

[5] Statistics South Africa, “Vulnerability of youth in the South African labour market.” (accessed Aug. 24, 2020).

[6] World Economic Forum, “New Vision for Education: Unlocking the Potential of Technology,” 2015. [Online]. Available:

[7] J. M. Barry, The Great Influenza: The Story of the Deadliest Pandemic in History. Penguin, 2020.

[8] “Bubonic Plague,” Jan. 20, 2016. (accessed Aug. 24, 2020).

[9] M. C. Kalu, “Birth of the Black Plague: The Mongol Siege on Caffa,” WAR HISTORY ONLINE, Jul. 28, 2018. (accessed Aug. 24, 2020).

[10] World Health Organisation, “Plague,” 2017. (accessed Aug. 27, 2020).

[11] “Clement VI | pope,” Encyclopedia Britannica. (accessed Aug. 24, 2020).

[12] The Pandemic That Shook London | The Great Plague | Timeline. 2017.

[13] D. Mackenzie, “Did bubonic plague really cause the Black Death?,” New Scientist. (accessed Aug. 27, 2020).

[14] R. Scruton, Roger Scruton – On “Harry Potter.” 2017.

[15] H. Andersen and B. Hepburn, “Scientific Method,” in The Stanford Encyclopedia of Philosophy, Summer 2016., E. N. Zalta, Ed. Metaphysics Research Lab, Stanford University, 2016.

[16] “Horace Mann | Biography & Facts,” Encyclopedia Britannica. (accessed Aug. 26, 2020).

[17] C. G. Jung, The Undiscovered Self: The Dilemma of the Individual in Modern Society, Edition Unstated edition. New York: Berkley, 2006.

[18] M. Castells, “The Impact of the Internet on Society: A Global Perspective,” OpenMind, 2013. (accessed Aug. 26, 2020).

[19] World Health Organisation, “Depression and Other Common Mental Disorders: Global Health Estimates,” 2017. Accessed: Aug. 26, 2020. [Online].

[20] M. Jacoby, Individuation and Narcissism: The psychology of self in Jung and Kohut. Taylor & Francis, 2016.

[21] A. G. Y. Lim, “The Big Five Personality Traits,” Simply Psychology, 2020.

[22] W. Zhang, The China Wave: Rise of a Civilizational State. World Century Publishing Corporation, 2012.

[23] U. Bronfenbrenner, “Ecological systems theory,” in Six theories of child development:  Revised formulations and current issues, London, England: Jessica Kingsley Publishers, 1992, pp. 187–249.