Using tea bags does not transform us into tea masters


A little knowledge is a dangerous thing, and it can sometimes be mistaken for, or masqueraded as, expertise.

We can put the kettle on and use a tea bag – it does not mean we have become tea experts. We can put a coffee capsule in the machine, and it does not mean we have become baristas. We can open a bottle of wine, and that does not mean we have become wine experts. We can create a basic graph in a spreadsheet, and that does not mean we have become expert analysts. We can write prompts for Gemini, ChatGPT, and other LLMs, but that does not mean we have become AI experts.

Pyramids of knowledge 

Searching for "pyramid of knowledge" will most likely bring two types of pyramid to our attention. One has a base composed of data, supporting information and knowledge, with wisdom at the top. The other has a foundation of remembering, supporting understanding, applying, analyzing, and evaluating, with creating at the top.

One pyramid models the transformation of data into wisdom; the other models the activities educators use to support learning and skills development. Learning is unlikely to occur without data being transformed into information, and information into knowledge, before wisdom emerges.

We have long sought the automation of learning, and generative AI embeds many aspects of both pyramids into technological solutions. We have reached the point where we need to adapt and adopt to avoid falling behind.

AI literacy is not the only answer.

Did you know that the 21st Century Skills Framework has evolved to incorporate AI literacy, transforming traditional competencies like the "4Cs" into a hybrid model of human-machine collaboration? This updated framework prioritizes critical thinking through the lens of algorithmic bias and verification, creativity via generative AI prototyping, and communication through advanced prompt engineering.

Modern standards, such as the UNESCO AI Competency Framework, emphasize an ethical "human-in-the-loop" approach, requiring learners to move beyond technical usage toward a deep understanding of data privacy, digital citizenship, and the responsible application of AI to solve complex, real-world problems. Yet there is little consideration of the knowledge, skills, and wisdom required to use the outputs of those systems safely and securely.

Imagine a medical doctor assisted by AI or advanced statistical methods in diagnosing patients. Their medical experience, knowledge, and training will guide their decision-making process. The automation of data processing augments their intelligence and may help them diagnose more patients.

Now imagine a student with very little understanding or knowledge of the subject they are studying. They could use generative AI to conduct their research, generate code, and produce content for their essays. Would they be sufficiently equipped to identify errors in the AI-generated output?

Learning is doing, practicing, and solving problems

Learning is remembering, understanding, applying, analyzing, evaluating, and creating. Each of those processes transforms data into information, and information into knowledge. Wisdom comes with experience and reflection. Struggling through solving problems, reading, researching, and questioning builds resilience, and it is through that struggle that individuals can demonstrate their full capabilities.

Leaning on AI can reduce the discovery and playfulness of learning. It can limit our imagination and block access to a wealth of information. That is why, until we are specialists, we should not rely on AI to mask our lack of skills, knowledge, and understanding. Instead, it should flag that we need to go back to school. Using tea bags does not make us tea masters.
