Tag: GenAI for Teachers Series

  • Attention as a Commodity: Neurofeedback and AI in the Changing Landscape of Education


    An abstract representation of two closed systems with only one connecting element or wave between them, using the style of interconnected layers, swirling patterns of light, and neural connections

    The merging of neurofeedback technologies and Generative AI holds the potential to reshape education by turning attention into a measurable commodity – how does this affect what it means to learn, and teach?

    GenAI can now observe pupils and analyse their discourse. It seems we are moving towards a future where it not only observes pupils but actively measures their attentiveness, quantifies their engagement, and seeks to shape the very nature of how they learn. Recent research is uncovering how neuroscience, neurofeedback technologies, and GenAI are transforming attention into a measurable, commodifiable asset. As educators, how should we navigate the ethical labyrinth that arises when attention—once a personal, cultivated skill—becomes something to be monitored, manipulated, and monetised? How does this relate to the rights of the child?

    Attention and Its Implications in Teaching

    The ‘new science of education’ made a significant impact on teacher education after Dylan Wiliam’s tweet in 2017. Since then, it has gained immense momentum, sparking debates on social media and within academia. Attention has become the golden currency of this new science, rapidly achieving the status of a gold standard. You may recall the viral video of Pritesh Raichura, which ignited widespread discussion in the sector and seemed to provoke a Marmite response—you either love it or hate it. So why has attention become such a hot topic and led to such extreme reactions? To answer this, we need to examine its concerning connection to neurofeedback training—also known as brain training—and explore how GenAI could play a pivotal role in the system.

    Until the last ten years or so, cognitive science was largely kept outside of the educational domain. But in 2023, in response to growing interest, the Core Content Framework (CCF) was introduced for initial teacher education. The CCF, and the soon-to-be ITTECF, brought with them a requirement for all providers to educate trainees on the theory and application of cognitive science. The more I work with trainees and explore the fundamentals of cognitive load theory (Sweller, 1988), the more I recognise that attention is the key hinge point: the connection between what the teacher feels they ‘do’ and what the learner ‘does’.

    Attention is a complex field of study, one that the CCF and ITTECF heavily simplify. However, a clear aim emerges: attention is treated as a commodity to be maximised in order to achieve long-term information retention. I propose that attention serves as the connecting mechanism for two otherwise predominantly closed systems: the external world (the classroom) and the internal world (the learner). In this light, attention could be likened to the role of gravity in brane theory: it permeates and connects otherwise distinct systems. Its effects are felt in and across each system, but the connection is not always tangible or observable.

    Generative AI offers new possibilities for supporting this dynamic. By gathering real-time data from classrooms and operationalising advancements in neurofeedback technology, GenAI could provide personalised insights into students’ attentional patterns. AI-driven adjustments could ensure that key moments of instruction align with students’ peak cognitive receptivity, thereby enhancing memory retention and engagement. But is this ethical? How does this relate to the free will of the pupil? Is this the responsibility of the teacher?

    Navigating Challenges and Ethical Considerations

    While the integration of GenAI and neurofeedback tools offers exciting possibilities, educators must address the accompanying ethical concerns. Monitoring and measuring attention could shift the focus from cultivating student autonomy to prioritising compliance and control. If attention becomes a commodity to be technologically quantified, are we at risk of reducing students to data points and Fourier transforms?

    A major concern is the commercialisation of attention-focused devices. As highlighted in a recent paper by Kotouza, Pickersgill, Pykett, and Williamson (2025), neurofeedback technologies and EEG devices have already been marketed for use in classrooms, with some targeting “students with high numbers of disciplinary ‘office referrals’” to improve their concentration. How should educators respond when faced with the growing pressure to adopt such tools? What safeguards are needed to prevent data exploitation, bias in AI algorithms, and breaches of privacy?

    Attention should be cultivated as both a skill and a choice. Using the marriage of neurofeedback and GenAI systems has the potential to remove or heavily reduce this choice. The UN Convention on the Rights of the Child (UNCRC) states that every child has the right to:

    • Relax and play
    • Freedom of expression
    • Be safe from violence
    • An education
    • Protection of identity
    • Sufficient standard of living
    • Know their rights
    • Health and health services

    There are multiple aspects of this list which are at risk through the use of systems that aim to maximise attention.

    So, where’s the line between control exhibited by a teacher, and control exhibited by a machine? Pritesh’s video maximises attention skilfully – how does this compare to a neurofeedback-GenAI system which effectively does the same?

    We are currently in a pivotal period of change and ethical debate – what does it mean to be a teacher in this context of neurofeedback and GenAI and how can these tools enhance education without compromising the agency, privacy, and humanity of learners?

    In conclusion, the convergence of neurofeedback, GenAI, and education offers both promise and peril. As educators, we must critically evaluate whether these technologies truly serve the learner or merely reduce them to data-driven outputs. The potential to enhance attention and engagement is undeniable, but so too is the risk of undermining student agency and the foundational principles of education. 

    The question remains: how can we harness these advancements responsibly, ensuring they enhance teaching rather than challenging humanity at its core? As we grapple with this pivotal moment in educational evolution, the choices we make today will shape not only the future of teaching but the very essence of what it means to learn.

  • Redefining Teacher Identity in the Age of GenAI


    GenAI Teacher Identity = Humanoid inside an apple

    What’s your teacher GenAI identity? The starry-eyed? The cautious? The early adopter? Is our GenAI use starting to create a typology?

    A few months ago, it was revolutionary that GenAI could create a lesson plan. Since then, more and more tools and services have emerged that serve the same function in different ways: reducing teacher workload by creating materials for them. This in itself was enough to grab headlines, but is that enough anymore? Like the widespread use of pre-made schemes of work, the shiny appeal of functionality soon gets lost to the notion of alignment. In this post, I argue that we’ve gone past the point of being impressed by functional possibility, into the realm of matching a GenAI tool to your specific teacher identity. Vanilla shouldn’t be the only flavour available.

    To understand this shift from function to alignment, we need to delve into the concept of teacher identity, which is well established in the literature. Rushton, Rawlings Smith, Steadman, and Towers (2023) define teacher identity as socially constructed, dynamic, and hybrid. They acknowledge that it is influenced by a range of individual factors such as biographies and narratives, alongside emotion, social contexts, and relationships with others. So in this GenAI age, how does this map to GenAI tool use? I’ll explore each factor in turn, looking at how these aspects can be translated and connected to GenAI tool use.

    Is Teacher GenAI Identity Really a Thing?

    Is GenAI socially constructed? As it is trained on a large subset of the internet, I’d argue that it is. I’d go so far as to say the concept of GenAI itself has parallels with a giant socially constructed machine such as Viki in I, Robot. Teachers have user profiles for these GenAI tools and accounts, so by using them, they are, by definition, subscribing to the beginnings of a Teacher GenAI Identity. What could the diversity of these identities look like? How well do they map to the classroom? How representative or relational are they to the identity observed in classroom practice? These are all questions that we are just beginning to explore. This aligns well with the idea of a teacher identity being ‘dynamic’. Would a teacher identity survive without a GenAI aspect? At what point will there be any teachers left who don’t use GenAI in some way or another? This human-machine way of interfacing could be seen as the ultimate ‘hybrid’ component of teacher identity.

    Staff Room of the Future - Colourful Cartoon clay figures sitting on chairs in a room

    It seems likely that a GenAI Identity could indeed fulfil the criteria of being socially constructed, dynamic, and hybrid, but what about the other aspects? How can GenAI be used with real-life teacher narratives? A classroom is full of narratives. A teacher has their own narratives. Arguably, this is such a nuanced and personally-loaded concept that it is difficult to see how it could be mirrored by the digital domain. I’ve spent some time trying to deconstruct this concept of narratives, especially for teachers early in their training or careers. A narrative is context-dependent and value-laden, and awareness of them requires advanced reflective capacity. I’ve discovered something, though, with prior knowledge scenarios. Our prior knowledge generator is perhaps the best way to illustrate this. While we work with the ‘new science’ within the CCF/ITTECF, our profession will be focusing on establishing children’s prior knowledge for some time to come.

    Prior knowledge is more than just prior learning, though. It is contextual and dependent on lived experience and personal significance. These are difficult aspects for beginning teachers to master, as their lived experience and database of what prior knowledge could look like for children is in its infancy. So our open-source prior knowledge scenario prompt generator aims to provide a scaffold to promote the consideration of the narratives of the individuals in a class. In this capacity, although the scenarios may not be exactly representative of the specific needs of the class, they encourage a teacher to consider the diversity of the possible prior knowledge of the children in their class. A scaffold for one size does NOT fit all approach.
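    As an illustration of what such a scaffold might look like under the hood, here is a minimal sketch of a function that assembles a prior-knowledge scenario prompt for a GenAI model. The template wording, function name, and fields are hypothetical assumptions for illustration; they are not the actual open-source generator’s prompt.

    ```python
    # Illustrative sketch only: assembling a prompt that asks a GenAI model
    # for diverse pupil prior-knowledge scenarios. The template wording and
    # parameters are assumptions, not the real generator's prompt.

    def build_prior_knowledge_prompt(subject: str, year: str, topic: str,
                                     n_scenarios: int = 3) -> str:
        """Compose a prompt requesting contrasting pupil profiles."""
        return (
            f"You are helping a beginning teacher plan a Year {year} {subject} "
            f"lesson on {topic}.\n"
            f"Generate {n_scenarios} contrasting pupil profiles describing what "
            "prior knowledge each child might bring, including lived experience "
            "and personal significance, not just prior learning.\n"
            "For each profile, suggest one question the teacher could ask to "
            "surface that prior knowledge."
        )

    prompt = build_prior_knowledge_prompt("science", "4", "friction")
    ```

    The point of the scaffold is not the template itself but that it forces the consideration of several different children, rather than one imagined ‘average’ pupil.
    
    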

    Continuing with the idea of individual differences, how can we view a Teacher GenAI Identity as being affected by emotion or relationships with others? I’ve observed that the mere fact that there is the capability of GenAI being used to do ‘teacher’s work’ evokes an emotional response in individuals, such as those who are actively resistant to AI in education (Shea, 2024). How can a team of colleagues function, in the future age of GenAI adoption, if there are huge differences in the response to its use? How do departments and institutions address this, and is there even a need or requirement to? Is an early adopter of GenAI Teacher tools at a social advantage? Are they perceived as holding a position of responsibility to influence the GenAI identities of others in a team? In this regard, I could argue that a Teacher GenAI Identity is not just a personal attribute but one that is socially impacted and relevant to teams of employees and their functionality.


    Towards a Typology

    So if teachers can be thought of as having a Teacher GenAI Identity, what could the diversity of these identities look like? Stay with me and consider my rapidly developing (and slightly humorous) typology:

    🤖 The early adopter: Has been involved since GenAI was just AI. Can explain the difference and also talk at length about machine learning. Has already created their own agents and published several papers. Could be utilised to upskill other employees but is struggling to find common ground with ‘the cautious’.

    🤔 The intrigued: Can see the potential to save time and get creative. Will happily attend any specific training and give things a go but lacks deep understanding of how it works or how it could be developed within education.

    😇 The saviours: Have happily subscribed to its use as the ‘new normal’. Convinced it will solve all problems for their institution and maybe even humanity.

    😿 The Shakespeare mourners: Have already charged GenAI with the crime of ending humanity’s ability to write, and maybe even think.

    🙄 The accepting: Will go with the flow. Will use it when it is normalised into operational function but won’t develop it or appraise it.

    ⚠️ The cautious: Worried about the impact upon cognitive development through over-reliance on it. Will cite evidence that taxi-drivers and younger people have already lost critical thinking skills because of it.

    🪄 The magicians: Convinced that it works by magic, and will happily believe in the power of magic. If magic saves them time, then magic they will use!

    😱 The scared: Convinced it will end humanity, or at the very least be the death of critical thinking skills.

    🤩 The starry-eyed: Infatuated with its existence and excited by the sheer possibilities of use and development.

    🪤 The boundary-observing: Like the starry-eyed, but also aware that within the sheer possibilities of future use and development could be AGI and human enslavement.

    As we look to how Teacher GenAI Identity may progress, which of these ‘types’ do you best associate with at present? If, however, you would require a more descriptive typology, the UNESCO AI Competency Framework for Teachers might be a good place to start. Being a competency framework, it is concerned with tangible and measurable outcomes that are explained and exemplified in detail in the report. The basis can be seen in the table opposite. For the definition of Teacher GenAI Identity, there appears to be a need to incorporate the notion of competency. This opens the gates of discussion surrounding the related attributes of self-concept, values, and beliefs. How does a GenAI-produced lesson plan echo the self-concept, values, and beliefs of the individual teacher? This can be explored through considering the specifics of the generation of lesson plans.

    Teacher GenAI Identities in GenAI-produced Lesson Plans

    Most of the lesson plan GenAI tools I’ve used follow some kind of repeatable structure. Aila (Oak National Academy) reliably provides reasonably detailed prior knowledge, retrieval starter quizzes, and a lesson plan based on ‘lesson cycles’. Teachmate consistently produces adaptive strategies at the end of its plans. Twinkl’s Ari (Twinkl Educational Publishing) seems to have some variability in lesson structure, but the overall approach follows a reasonably standard format. I could go on for many other tools.

    We could question whether every lesson should be based on a repeatable, set, rigid structure. Tools are now integrating opportunities for the teacher to adapt the generated content either during or after generation, but how likely is it that a teacher will actually do this? I see a need for research in this area. Recent research from TeacherTapp is telling us how many teachers are using GenAI tools, but what are they actually doing with the outputs? Modifying them? Using them vanilla-style? Where is the space in this construct for a teacher ‘taking the children outside’ for a lesson, or otherwise deviating in a way that could be an obvious, extreme example of the ‘manifestation of teacher identity’?

    There are those who will argue that all lessons should follow a rigid structure heavily related to specific objectives and not deviate away even slightly, to maintain attention on the required content and outcomes. In this scenario, the more rigid GenAI tools fulfil their purpose.

    Many will wonder, or perhaps ‘feel’, that teaching is more than this static structure of what a lesson should be. What is it about me that makes my lesson planning different to yours? A bigger question could be framed as ‘how does a lesson plan reflect or indicate aspects of teacher identity?’. Our Lesson Inspector tool was created to provide a critically reflective scaffold to begin to explore this question. I know my teacher identity has aspects that strongly admire dual coding and holistic consideration of the personal significance components of prior knowledge. Could we get to a situation where teachers can choose their tool based on that tool’s alignment to aspects of their teacher identity? Or perhaps they fancy a move out of their comfort zone and want to know which tool could provide this.

    Moving forward, it is likely that to answer these questions of GenAI and identity, we will need to consider the space for affect within GenAI tools. Could affective GenAI soon permeate this realm too? What would that mean for tool use? How could my preference for a gel pen or a bit of plasticine modelling be represented through a GenAI tool’s knowledge of my teacher identity? Taking this a step further, how long until the user doesn’t input or select from options related to their teacher identity, and their identity is inferred from their social media profiles, like LinkedIn…

    Perhaps I have now morphed from ‘the starry eyed’ into ‘the scared’…😱🤖

  • Content Alignment: Using Lesson Inspector on GenAI Lesson Plans: Aila & Teachmate (and a dragon…)


    Clay 3D robot with a magnifying glass,inspecting waveforms

    GenAI lesson plan tools are amassing rapidly, but how can we check their output against statutory frameworks?

    Both Aila and Teachmate offer lesson planning tools and will quickly and easily provide a lesson plan. Aila is free, whereas Teachmate is a subscription service (for the lesson plan tool). Here we put lesson plans generated by Aila and Teachmate through our Lesson Inspector and discuss the reports. We conclude that our Lesson Inspector should be further explored as a content alignment tool and as a prompt to promote critically reflective practice. The Inspector reports begin to indicate, and consequently quantify, the diversity of GenAI-produced lesson plans.

    Even before GenAI tools burst into our pedagogically-informed lives, we pondered on how to judge the ‘goodness’ of a lesson plan. There is no definition for a ‘good’ lesson plan, as essentially this concept could be broadly thought of as dependent upon:

    • Context (the needs of your specific learners)
    • Teacher identity (your epistemological beliefs about knowledge etc.)
    • School ethos (what values does your school promote?)
    • Policy (what frameworks do you have to work within?)

    When asked to define a ‘good’ lesson plan, most teachers struggle to condense their thoughts into a formula or set of beliefs. This is something that educationalists intuitively understand but may find challenging to operationalise. Yet in my experience, it is something student teachers and ECTs commonly yearn for.

    Introducing Lesson Inspector for Content Alignment

    Enter the Lesson Inspector: a tool we’ve developed to evaluate lesson plans based on the themes present in the Core Content Framework (CCF), the Initial Teacher Training and Early Career Framework (ITTECF), and the Teachers’ Standards. Our Lesson Inspector offers two metrics: a quantitative score and a qualitative evaluation/analysis by criteria. The qualitative analysis provided by Lesson Inspector offers potential for content alignment of the LLM with the aforementioned statutory frameworks.
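    To make the two metrics concrete, here is a deliberately toy sketch of the idea: a plan text is checked against a handful of framework themes, producing a percentage score plus a per-theme qualitative note. The themes, keywords, and keyword-matching approach are illustrative assumptions only; the actual Lesson Inspector is LLM-based and far more nuanced.

    ```python
    # Toy sketch of the two-metric idea: a quantitative score plus a
    # qualitative note per framework theme. Themes and keywords here are
    # invented for illustration; the real Lesson Inspector uses an LLM.

    FRAMEWORK_THEMES = {
        "prior knowledge": ["prior knowledge", "retrieval", "recap"],
        "adaptive teaching": ["scaffold", "adapt", "support"],
        "assessment": ["quiz", "question", "check for understanding"],
    }

    def inspect_plan(plan_text: str) -> tuple[int, dict[str, str]]:
        """Return a percentage score and a per-theme qualitative note."""
        text = plan_text.lower()
        notes = {}
        hits = 0
        for theme, keywords in FRAMEWORK_THEMES.items():
            matched = [kw for kw in keywords if kw in text]
            if matched:
                hits += 1
                notes[theme] = f"Evidence found: {', '.join(matched)}"
            else:
                notes[theme] = "No evidence found - consider this theme."
        score = round(100 * hits / len(FRAMEWORK_THEMES))
        return score, notes

    score, notes = inspect_plan(
        "Starter: retrieval quiz on last lesson. Scaffold the main task."
    )
    ```

    Even this crude version shows how a plan with no framework evidence at all would bottom out at zero, while the qualitative notes, rather than the number, are what invite reflection.
    
    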

    In this discussion, we use two lesson plan generators (Aila and Teachmate) to produce lesson plans on the same topic of KS4 waves, with the same objective (below). We then upload these to Lesson Inspector and create Inspection reports for them both. We will discuss:

    • User Process: The steps and methodology each tool uses and how the user inputs their lesson information.
    • Report content: such as prior knowledge and other relevant themes included in the Lesson Inspector report.

    For a fair comparison, both tools were given the same objective from the AQA (2015) GCSE Physics Specification: “Students should be able to describe wave motion in terms of their amplitude, wavelength, frequency, and period.”

    Erm, great but…what is ‘content alignment’?

    You could be forgiven for having missed the introduction of this term in the GenAI context. We are using the definition from AI for Education, as seen in the screenshot adjacent. Essentially, the terms ‘correctly calibrated’ and ‘curriculum standard’ found within this definition could (and should) be debated. Here, we will introduce a ‘curriculum standard’ (the CCF/ITTECF and Teachers’ Standards, although the latter is an assessment framework and not a curriculum) and discuss the notion of being ‘correctly calibrated’.

    But first, let’s look at how these two interfaces are used to produce the lesson plans, what is produced, and what Lesson Inspector provides as reports.

    content alignment

    Aila vs. Teachmate User Process

    Aila is free. Aila is government funded. These are immediate reasons to consider using Aila. One positive about the process of using Aila is that it is designed to involve the teacher at every step. This means that after each input and generation cycle, the user has the opportunity to refine the generated output. This can be quite versatile in terms of tailoring to specific needs, but it comes at the expense of requiring more active user input, meaning that it takes a bit longer to generate a plan. We will explore this user-centric functionality in another post.

    When considering the time scales, though, do keep in mind that although quicker, Teachmate is NOT free. It cost me £6.99 at the time of publishing this. However, most non-government-backed generators also charge, so this is not uncommon.

    The interfaces are completely different: Teachmate has minimal inputs (see screenshot) and Aila seeks verification at each step of the generation process. See the Aila video below to see how this process looked for our lesson plan generation on waves. The video is sped up by a factor of 6, so keep this in mind! (Although this kind of duration is not uncommon in GenAI tools.) In this video, all that was entered into the interface was the topic, subject, phase, and objective, the same as for Teachmate.

    Teachmate Planning Inputs

    teachmate input screenshot
    A screenshot of what was entered to produce the Teachmate lesson plan on Waves.

    Aila Planning Inputs

    A video (x6) of the process of entering inputs into Aila for the generation of the Waves lesson plan.


    Output: The Waves Lesson Plans

    You’ll probably want to see what was produced. See below for the .pdfs. It’s easy to see how comparing these two can start to sketch out a skeleton for the diversity of GenAI-produced lesson plans.

    Aila Waves Lesson Plan

    See what was produced from the video of the Aila input process (above).

    Waves-dgnvajzp-Lesson-plan

    Aila Waves Lesson Plan PDF

    Teachmate Waves Lesson Plan

    See what was produced from the Teachmate input process in the screenshot (previous section).

    TMAI-Lesson-Plan-Understanding-Wave-Motion

    Teachmate Waves Lesson Plan PDF


    Lesson Inspector Analysis (Reports)

    Both the lesson plans were put through our Lesson Inspector. See below for the .pdf versions of the reports (we are working on making these look a lot prettier!). There are several pages to each.


    Discussion on the Lesson Plans and Reports

    Starting from the same objective, it’s interesting to see the paths that each tool has taken. This echoes what anyone in teacher education will have noticed: there can be huge diversity in planning, given the same initial boundary conditions. In some sense, it could be argued that lesson plan generation tools should mimic this. Certainly the probabilistic nature of GenAI should mean that, by definition, a vast array of lesson plans should occur.
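    That probabilistic claim can be made concrete with a toy example. Language models sample each token from a probability distribution, and a temperature parameter controls how spread out that distribution is; the sketch below is a generic softmax-with-temperature illustration, not any specific tool’s sampler.

    ```python
    # Toy illustration of why identical prompts yield varied plans:
    # generation samples tokens from a distribution, and temperature
    # controls how concentrated that distribution is.
    import math

    def softmax(logits: list[float], temperature: float = 1.0) -> list[float]:
        """Convert model scores into sampling probabilities."""
        scaled = [l / temperature for l in logits]
        m = max(scaled)                      # subtract max for stability
        exps = [math.exp(s - m) for s in scaled]
        total = sum(exps)
        return [e / total for e in exps]

    logits = [2.0, 1.0, 0.1]   # scores for three candidate next tokens
    cool = softmax(logits, 0.5)  # low temperature: near-deterministic
    warm = softmax(logits, 2.0)  # high temperature: more varied output
    ```

    At low temperature the top candidate dominates and outputs converge; at higher temperature the probability mass spreads out, so repeated runs diverge into a wider variety of plans.
    
    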

    That brings us on to considering how useful, or correct, this diversity of plans could be.

    Many will wonder if the Inspector could score a perfect zero. We’d invite you to consider the (very brief) lesson plan below, and amuse yourself with the Inspector’s report…

    a cute clay dragon breathing fire at children

    My Lesson will be about fire. I will get the children to sit on the carpet and then get a dragon to breathe fire at them.

    fire-dragon-report

    Fire Dragon Report PDF


    Summary – ‘Correct Calibration’

    Once you’ve recovered from the hilarious analysis of that final lesson plan, it’s clear that a perfect zero can indeed be scored. This brings us back to the initial idea of content alignment involving correct calibration of curriculum standards.

    If we assume a static curriculum standard based on the CCF/ITTECF and the Teachers’ Standards, we need to consider the ability of the Lesson Inspector to guide users towards the correct calibration of these GenAI tools when creating lesson plans. Our scoring metric and evaluation/analysis framework provide a teacher with a quick and easy sense check against the frameworks they work within. These two metrics have very different impacts – the quantitative score could be used as an immediate quality indicator, although we have designed it to essentially be a ‘hook’ to ensure the report reader reflects upon the qualitative analysis – a means to promote reflective practice upon curriculum goals.

    If we return to our initial understanding of the values that contribute to a ‘good’ lesson plan, we can argue that content alignment is at least partially achieved if the Lesson Inspector reports facilitate and promote reflective practice. These reports provide a space for teachers to critically consider the output and reflect on how it aligns with their values, beliefs, and ideas about the curriculum.

    Consider an example: a teacher who adheres to a very teacher-led approach and does not favour student-led practice. The Lesson Inspector report suggests student-led practice as an area for improvement and notes the reliance on teacher-led methods. Does this mean the content is not aligned? The teacher reflects on these comments from the report and critically thinks about their ideal use of teacher-led versus student-led practice. They ultimately decide they are satisfied with their initial plan. Does this mean the report is not useful? Does it mean the content is misaligned?

    The Lesson Inspector cannot answer these questions of pedagogical choice and teacher identity. However, it can provide a set of structured prompts to help users explore and solidify their concepts of a ‘correctly calibrated’ learning experience.

    Forthcoming Developments

    Future developments of the Inspector will include creating different ‘curriculum standards’ based on instructional frameworks such as Rosenshine (see our other blogs) or other learning strategies like UDL and enquiry-based learning. There is even potential for custom frameworks tailored to an institution’s own set of values.

    It’s clear that Lesson Inspector is a valuable tool to begin to explore the content alignment of the lesson plans generated by GenAI tools. Our forthcoming development will centre on:

    • Diverse testing of other content generated by Aila, Teachmate and other GenAI content generators.
    • Discussion of each thematic strand in the reports in detail and with reference to specific CCF/ITTECF criteria. 
    • Development of an assessment framework that can be used for QA. 

    Watch this space. 🐉🔥

  • Reviewing TeachmateAI


    TeachmateAI logo in clay animation style with question marks and brains around it

    TeachmateAI claims the market share for teacher tools, but what does it offer for ITT?

    The GenAI teacher tool landscape is pretty well established now, with many companies offering very shiny tools claiming to reduce teacher workload. So what do you get for free, how can we work with these free tools, and what do they have to offer teachers and trainee teachers? This series reviews several existing tools, exploring what they produce for an imaginary teacher wanting to teach friction to year 4. Here we focus on TeachmateAI and conclude: Free, basic, shallow, introductory.

    TeachmateAI is a tool that offers a free (restricted) account and several paid options. You can find it here: https://teachmateai.com/

    Here, we create an imaginary teacher, wanting help with their practice, who has the following user inputs: subject: science, year: 4, topic: friction, objective: to understand factors affecting friction.

    The free account gives you access to a few tools, as shown in the screenshot. Some of these are aimed at school leadership (for example, the SIP writer). Here we focus on the free teacher tools only. Some of these tools would be more useful for other subjects, for example the reading book recommendations and mini saga. These don’t lend themselves well to the friction lesson our teacher wants help with, so we won’t be focusing on them here.

    Let’s explore these tools:

    • Activity ideas generator
    • Concept explainer
    • Jingle generator

    A nice aspect of these tools is the option to refine results. This is common in other tools and seems to be turning into the industry standard. However, we will explore the current limitations of this functionality, in contrast to using a general-purpose GenAI chatbot instead of the TeachmateAI tool.

    If you’d prefer to skip ahead to our summary, scroll down.

    Listing of Free tools by Teachmate

    Activity Ideas Generator

    One of the options to ‘refine my answer’ was to ‘differentiate’. Once we had recovered from the shock of the use of the ‘dirty word’ of ITT, we couldn’t help but click on it and see what was produced. Compare the two below. Perhaps the educational use of the word ‘differentiation’ (which has varied somewhat over the last 20 years) hasn’t been used by the model, and a more generalised definition of the word has been applied instead?

    The vanilla response to the user inputs creates…

    TMAI-Creative-Activities-for-Understanding-Factors-Affecting-Friction-in-Year-4-Science

    It gives ten ideas for activities, as you can see in the documents. The ideas are arguably quite ‘large and diverse’, probably in keeping with the title ‘creative activities’. There’s not a huge amount of detail, but it’s a good place to start if you need ideas. Teachers may be left thinking, ‘how exactly do I execute this idea?’

    The ‘Differentiation’ version (now wash your mouth out!)

    TMAI-Creative-Activities-for-Understanding-Factors-Affecting-Friction-in-Year-4-Science-differentiated

    The ‘differentiation’ aspect seems to be a bolt-on to the original ideas. Despite being labelled as ‘differentiation’, these bolt-ons are in fact nothing to do with SEND; they are pre-planned modifications to the activity. This is somewhat in contradiction to ITTECF 5.4: ‘Adaptive teaching is less likely to be valuable if it causes the teacher to artificially create distinct tasks for different groups of pupils or to set lower expectations for particular pupils.’

    Really, these are more akin to prompts for teachers to remember to scaffold. We would have liked to have seen more to hit ITTECF 5.8: ‘High quality teaching for all pupils, including those with SEND, is based on strategies which are often already practised by teachers, and which can be developed through training and support.’


    Concept Explainer

    It produced a three-page explainer. Being scientists, we are sad to see it hasn’t actually pulled on any of the wealth of information that is out there, for example the Institute of Physics ‘IOP Spark’ platform (which we REALLY hope gets integrated with GenAI very soon, hint-hint nudge-nudge!) or anything from the Association for Science Education and their database. Another criticism is that it’s written with a severe lack of technical vocabulary and shows no link with prior and sequential ideas on forces. It’s no match for the great plans that Plan Assess have created without the use of GenAI. On the plus side, it’s a start if a teacher has no experience or existing perception.

    When prompted to refine the answer, we just HAD to click on the ‘add in some common misconceptions’ option. The second document below contains the amendments. Again, it’s a start for a teacher to remember to consider misconceptions, but it misses an opportunity to signpost readers to sources of information where they can deepen their subject knowledge.

    What it creates, a la vanilla style…

    TMAI-Factors-Affecting-Friction

    Basic and non-technical, but essentially it’s a start to thinking about the subject knowledge.

    Refined with ‘add in some common misconceptions’

    TMAI-Understanding-Factors-Affecting-Friction-with-misconceptions

    With added misconceptions. We’d like to have seen links for teachers to deepen their subject knowledge here.


    Jingle Maker

    We are all for the arts and science combined. But perhaps this is a somewhat underwhelming jingle on friction… from TeachmateAI?!

    So, we asked Copilot to make a jingle on friction, for comparative purposes. Indulge yourself with its response!

    friction jingle, created by Teachmate
    A jingle about friction, created by Copilot

    Summary

    Let’s face it, this is good for free. It’s an introduction. It somewhat misses the opportunity to link to subject-specific services and platforms for the development of subject knowledge. The functionality is somewhat clunky: it ignores many requests typed into the ‘refine my answer’ box without indicating that it can’t help you. Definitely not hitting the adaptive teaching mark or ITTECF criteria.

    Let’s pull out the pros and the cons:

    Pros:

    • These tools were free
    • Gets teachers thinking about misconceptions and scaffolding their materials
    • Fun little jingles

    Cons:

    • Depth of subject knowledge and misconceptions is missing.
    • No links to subject-specific information sources or platforms for teachers to improve the depth of their subject knowledge or subject pedagogy
    • The concept of differentiation is contestable and not aligned to the ITTECF.
    • No integration of SEND awareness, and not useful for adaptive teaching.
    • Ignores ‘refine my answer’ requests.

    So, in summary let’s say: Free, basic, shallow, introductory.

    If anyone has paid for the full version and would like to add to this, please contact us!