How artificial intelligence can be used to increase human intelligence and fundamentally change the way we learn

Artificial intelligence is swiftly infiltrating organizations across nearly every sector and industry. On an earnings call last year, Google CEO Sundar Pichai announced that the tech giant was “transitioning to an AI-first company.” Since then, Pichai has doubled down on that promise, expanding AI research with a tremendous push in China and rebranding the research division as Google AI. Although clearly a leader in the space, Google is essentially doing the same as nearly every other company navigating this revolution: utilizing AI to create smarter machines.

Advances in AI and machine learning are undoubtedly powerful and fascinating, especially when this technology promises to simplify life and take on menial tasks for us. Robotic assistants, autonomous vehicles, drones, even smartphones: all provide troves of data, and the possibilities seem endless. This awe turns to panic, though, when it comes to AI threatening our livelihood. According to a 2017 report, an estimated 38 percent of U.S. jobs are at risk of being eliminated by automation within the next 15 years. That makes our economy the most susceptible to AI-driven unemployment among highly industrialized nations.

Why? I believe the answer may lie deep within our education system. While that system has evolved over the years, at scale it remains quite outdated. We see this repeatedly as American students continue to trail their peers in the very economically developed nations that, not coincidentally, are predicted to fare better against AI.

Overhauling learning in the classroom and in the workplace

The methods we use to teach and learn have remained fairly stagnant, but the pace of technological advancement demands change if we are to stay relevant. With the age of AI upon us, this gap could widen even further. Historically, an instructor teaches a group about a predetermined topic for a scheduled amount of time, using various visual tools to convey content on which the students will eventually be tested. Of course, we have thrown some computers and the occasional “death by PowerPoint” into the mix over the last few decades. Until now, though, how much has technology truly spurred drastic improvements in learning itself?

The last time the U.S. education system received any notable overhaul was arguably during the Industrial Revolution. Unfortunately, while that era brought free public education, it also ushered in the one-size-fits-all classroom that plagues education to this day. Much of this longstanding learning environment stands in stark contrast to what cognitive science describes as ideal conditions for learning: it constrains the time a learner has to retain information, and it is built around standardized assessments that encourage short-term cramming rather than true retention and understanding.

I’d wager that history will show the overhaul and modernization of our education system coinciding with another key period: the Information Age. Facing an era in which we depend on technology for virtually every aspect of our daily lives, what if we put it to better use? If American students improved their performance in lagging areas such as math and science, would they be headed toward a future with more promising prospects and opportunities? If we caught up to other countries in education results, with learning that complements how the brain truly works, could we improve our chances against the rise of automation? Could we use AI not simply to make machines smarter, but to make people smarter as well?

I certainly think so, and it is imperative that we start working along those lines, and fast. Many roles that can be fully automated will be eliminated, but experts predict that AI and robotics will actually create millions of new jobs, many of them more compelling and more fruitful than those they replace. The issue most populations will face, however, is adapting their skill sets to meet the demands of those roles. The World Economic Forum’s most recent report on the future of jobs found that the potential for new technological expansion is most often halted by “multi-dimensional skill gaps” across both local and global labor markets. In fact, such gaps are expected to be the single most common barrier to technology adoption across industries.

After all, the rigid traditional learning environment is not limited to primary, secondary, and post-secondary education; it also permeates workforce-specific and technical training in nearly every field. Education in both the classroom and the workplace needs to be overhauled to help bridge these skill gaps. What if it isn’t the skills themselves that are difficult for workers to grasp, but our fundamental approach to learning?

Where traditional learning meets machine learning

A significant factor in the lack of evolution in learning over the last two and a half millennia can be boiled down to a forgotten element of the traditional ecosystem. Historically, the focus has been on two core aspects: the “what,” the content placed in front of learners, and the “where,” the environment in which that content is delivered. Although both subject matter and environment are vital to the learning process, a crucial component has been largely snubbed: the “how.” The medium. The delivery. The last mile in the learning process, and the connection between the material being taught, how the brain works to process it, and how a specific individual retains the knowledge. Enter a formidable and well-timed combination of proven cognitive science principles, unprecedented data availability, and AI algorithms.

Two specific concepts are key here: adaptive distributed learning (i.e., spacing out learning over time) and retrieval practice (actively attempting to recall previously studied material).
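To make retrieval practice concrete, here is a minimal sketch in Python. It is purely illustrative, with hypothetical names and structure rather than anything from Cerego’s actual system: instead of re-reading material, the learner is prompted to recall it from memory, and each attempt is logged so that future reviews can be spaced accordingly.

```python
# A minimal, hypothetical sketch of retrieval practice: the learner is
# asked to recall an answer rather than re-read it, and every attempt
# is recorded. Illustrative only; not Cerego's API or data model.
from dataclasses import dataclass, field
from datetime import datetime

@dataclass
class Card:
    prompt: str
    answer: str
    history: list = field(default_factory=list)  # (timestamp, correct) pairs

def quiz(card: Card) -> bool:
    """Prompt the learner to actively retrieve the answer, then log the result."""
    response = input(f"{card.prompt} ")
    correct = response.strip().lower() == card.answer.lower()
    card.history.append((datetime.now(), correct))
    print("Correct!" if correct else f"Not quite. The answer is: {card.answer}")
    return correct

cards = [Card("In what year did the French Revolution begin?", "1789")]
for card in cards:
    quiz(card)
```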

Do you recall the dates you memorized during an all-night study session for a history test, or how to count to one hundred in French? For most of us, sadly, the answer is no. Why? Because research has shown time and time again that information learned by cramming doesn’t stick and offers virtually no long-term benefit for true knowledge transfer. The phenomenon is known as the “forgetting curve”: the rate at which something is forgotten after it is first learned. Researchers have also found that there is a predictable, optimal time to review information if it is to be retained successfully.
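The forgetting curve is commonly approximated as exponential decay, where predicted recall after t hours is R = exp(-t/s) and s is a “stability” value that grows with each successful review. The sketch below, again a simplified illustration with assumed parameters rather than any published or proprietary algorithm, shows how such a model can suggest the optimal review time: schedule the next session for just before predicted recall drops below a target threshold.

```python
import math
from datetime import datetime, timedelta

def predicted_recall(hours_elapsed: float, stability_hours: float) -> float:
    """Ebbinghaus-style forgetting curve: recall probability R = exp(-t / s)."""
    return math.exp(-hours_elapsed / stability_hours)

def next_review(last_review: datetime, stability_hours: float,
                target_recall: float = 0.8) -> datetime:
    """Schedule the next review for when predicted recall hits the target.

    Solving exp(-t / s) = R for t gives t = -s * ln(R).
    """
    hours_until_target = -stability_hours * math.log(target_recall)
    return last_review + timedelta(hours=hours_until_target)

# Example: a fact with an assumed 24 hours of memory "stability," reviewed now.
stability = 24.0
print("Predicted recall after 6 hours:", round(predicted_recall(6.0, stability), 2))
print("Review again at:", next_review(datetime.now(), stability))

# After each successful recall, stability grows, so reviews are spaced
# further and further apart. The doubling below is an arbitrary
# illustration, not an empirically fitted value.
stability *= 2.0
```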

Over the last handful of years, technology has given us the opportunity to apply cognitive theories about learning in the real world. By putting personalized, unique data to the test via machine learning and AI algorithms, we’ve moved from a theoretical understanding of learning and small-scale experiments to full-scale, real-life implementations. At Cerego, we’ve developed the ability not only to increase a learner’s retention but also to improve the speed at which they learn. Our team has worked with over five million users and witnessed significant results across education, the private sector, and government. When a group of dental students at NYU began using this approach, the class achieved a near-perfect pass rate of 99.7 percent; before adopting the technology, its historic pass rate was only 80 percent.

AI’s potential to increase human intelligence and performance goes beyond an enhanced learning experience for students. It can surface previously invisible data that allows instructors and managers to identify and practice unique approaches that increase retention. It can also predict future performance, recognizing and outlining characteristics specific to each learner. Such metrics may sound like futurist hopes compared with our currently accepted methods of assessment and personalization, but the truth is, this future state is much nearer than you might expect.

As is the case with most substantial reform efforts or periods of significant evolution, the path is not explicitly clear, nor is there a rulebook to follow. One clear step toward achieving widespread adoption of optimized learning methods is, quite predictably, education and knowledge sharing on the topic. Historically, it has been the consumer or end user who drives and demands change, often largely through spending power. When it comes to learning, every single one of us is the end user in some capacity, and we would all stand to benefit. AI fatigue may have set in for some, but we can all agree that reinvigorating the learning experience is a far more exciting use for the technology than applications and machines that ultimately enable us to think and do less.

AI is here to stay, and we’re starting to find that, when grounded in brain science and delivered to learners in the right format, it can increase human performance. By augmenting human potential in this way, we may just be able to begin closing the widening skill gaps and ensure that we can successfully work alongside robots, instead of being at risk of replacement altogether.

-Andrew Smith Lewis, co-founder, Cerego