42 Comments

"We know that it is possible to deliberately create high growth cultures, and we know that software makes it easier to scale them. But there are many open questions. (If there is interest, I could do a write-up of my understanding of this design problem.)" 🙋‍♂️

author

Adding 1000 hrs to my todo list, thank you very much.

Sep 26 · Liked by Henrik Karlsson

Also would love to read more about this. Would it help if I sent you some karma points? :)


Is this post out yet? :)

author

No. I have 934 hours left to go.


Will create a timer 😅


This is very interesting, Henrik. I think a lot of us would like to know more. Screen time is such a big issue in education, and I am wondering how this will fit into that conversation.

I could see Simulated Intelligence (SI; I refuse to call them AI) doing a sort of rote teaching, but most of teaching, to me, is about modeling from the teacher, so I have a fair bit of skepticism. The emptiness, the flatness, of the large language models is definitely something that we wouldn't want emulated. I feel like we already have too much of 'seeming alive without actually being alive'.


What if, now that the bulk of tutoring, marking and grading could be handled otherwise, the role of the teacher was to be incredibly present in the classroom: simply walking around, engaging each child on their level, sharing enthusiasm, prompting possible directions in lines of questioning/learning, offering readings, suggesting books, poems, movies, artists, shows and other sources or points of interest that might serve as a departure point, from a personal list they were themselves compelled to nurture and cultivate, the better to guide young minds. They would be offering up a wealth of personal and idiosyncratic insights from a lifetime of learning, a warm and human salve for the flatness of machine interaction.

If a ‘teaching’ KPI was the ‘cultivation of curiosity’, those who teach would be inspired to be (and associate with) high-growth individuals themselves, becoming mentors in place of administrators. I think something key is not leaving students to hack it alone with GPT, but rather interweaving better, stronger, more interesting human relationships into the process. If your rote work disappears as a teacher (in this case), that gives you the opportunity to attend to the higher levels of your vocation, the very things we often complain we aren’t able to get stuck into due to lack of time and resources. I think there is an opportunity here for the teacher/student dynamic to level up, not decay.


Absolutely. This is exactly what I would like to see happen. The problem, to my mind, is that real learning and real teaching (not rote memorization, or even the demonstration of solving neatly set-up problems, but understanding real-world environments and applying what you understand) are incredibly difficult to measure.

Actually having done a good job teaching may not be noticeable for, let's just say, 20 years, and by then there are so many confounding factors that it may never be obvious who the great teachers are and are not. Just as teaching requires a human and not a machine, evaluating requires a human and not a metric. I have never seen anything to make me think these problems can be well addressed at scale.


Much of tech has the unfortunate fate of being bastardized. For instance, most of the bandwidth on the internet goes (or went) to Netflix. I agree with you and your other readers: these AI tutors can help with personalization of learning - "if" they are used properly. For instance, AI helping them with individual projects. As in your article on geniuses: most invested time in their own projects, possibly out of boredom. An AI tutor, which can shine a light on the darkness around their own research projects - outside of schoolwork - might just create other J.S. Mills.

Apr 4, 2023 · Liked by Henrik Karlsson

This is great. I think there is a big difference between tutoring and the milieu you talk about. I think about Mike Piazza getting batting practice from Ted Williams, or Picasso himself getting instruction from his father the artist/professor. It seems like we have a ways to go before we replace that type of human interaction and training by seeing it being done live.


I think the best tutors inspire their tutees. I wonder if AI tutors will be able to make mini subcultures in each of their sessions, making learning personalized and purposeful for kids who'd otherwise not care.

author

Exactly!

My gut reaction is to think that AI can't do this (though they can play an important role in organizing cultures, which is something I think your work points at). But maybe it is possible for AI to do this, if we can figure out what is needed to activate our instincts for attachment, status seeking, social learning etc. Maybe adding a world, maybe creating a strong personality - I don't know the lower bound of what is needed to create that connection. Perhaps video games can point at this - though it's never worked for me; I've never cared about video games or felt attached to characters in them. And: something about this vision seems spooky to me.

Apr 4, 2023 · Liked by Henrik Karlsson

Yes--video games have a particular way of engendering "motivation"/"attachment." When used in order to engender motivation, video games seem spooky to me, too (I'm worried I'll become attached to aspects of virtual reality that aren't ultimately productive to attach to).

I may be more optimistic about AI's ability to motivate w/o immersive media. Maybe all it needs are (1) rich context on what the student cares about, and (2) the ability to concisely connect the lesson content to those core cares. I've seen GPT-4 make pretty concise connections, and though I'm not sure how soon it'll become possible, getting "rich context" doesn't seem impossible either.
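A small sketch of what (1) and (2) could look like in practice. This is my own illustration, not something from the thread: the student profile, the lesson text, and the prompt wording are invented, and the `openai` Python client and the `gpt-4` model name are assumptions.

```python
from openai import OpenAI

client = OpenAI()

# (1) rich context on what the student cares about (gathered elsewhere, e.g. from past chats)
student_context = "Loves dinosaurs, builds elaborate Minecraft worlds, asks a lot of 'why' questions."
lesson = "Fractions: comparing 1/3 and 1/4."

# (2) ask the model to concisely connect the lesson content to those core cares
resp = client.chat.completions.create(
    model="gpt-4",
    messages=[
        {"role": "system",
         "content": f"You are tutoring a student. What they care about: {student_context}"},
        {"role": "user",
         "content": f"In two sentences, connect this lesson to what I care about: {lesson}"},
    ],
)
print(resp.choices[0].message.content)
```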


One challenge will be to create AI tutors that don't feel like tutors, so that students can learn at a variable pace and aren't put off by perceived dictatorial-style tutoring. I'm not as worried about cultural acceptance; if it's effective, it should only take a generation or so to become commonplace.

author
Apr 4, 2023 · edited Apr 4, 2023 · Author

Do you mean that you expect it to be taken up effectively by students in schools if it is proven effective and they don't feel dictatorial?

Edit: I should add that I'm suspecting a big cultural pushback against using AI tutors. I just suspect that many cultures, especially school cultures, will not be able to leverage them effectively.

Apr 4, 2023 · edited Apr 4, 2023

Yeah, essentially I'm saying that people will latch onto the novelty of an AI tutor only if it is not just effective, or as good as or better than a human, but also fundamentally different. Otherwise AI tutors will have the same negative connotations as a human tutor. How to achieve this will take some iteration. I think you mentioned video games in another reply as a starting point, which makes sense. I've seen game apps for young children that do wonders for developing vocabulary.

author

My impression is that tools like this have tended not to make a difference in school so far. For example, Khan Academy is a great service, and much, much better than anything that happened in the schools I attended, but I remember seeing an RCT claiming it did not raise test scores when implemented in schools. What do you think will be different about AI? Is it that they will, if done right, be more personal and feel less top-down?

Apr 4, 2023 · edited Apr 4, 2023

If it works, I expect it will work for the same reasons as private tutoring by a human: the student is engaged and incentivized to learn by having someone more adept whom they can emulate and who can guide discovery for them. If GPT-5 is as advanced as some are predicting (it's projected to release by the end of the year), then I can imagine AI tutors that can be fine-tuned to meet different needs for different students. Of course, we don't know exactly how technology will advance, so one weak guess I can make with my limited knowledge of AI is that it could have long-term memory, so students can build rapport with their AI tutors as they would with a human. Long-term memory is a simple example of how AI might advance to make interaction more meaningful.

author

It will be exciting to see!

You can already add a weak type of long-term memory to GPT-4, by using a plugin that logs all conversations and then retrieves them using, I guess, semantic search. Similar to a setup I used here: https://escapingflatland.substack.com/p/semantic-search
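Roughly, that kind of memory can be sketched in a few lines. This is my own illustration of the general idea, not the plugin or the setup from the linked post; the `openai` client calls and the `text-embedding-3-small` model name are assumptions.

```python
# Sketch of "weak long-term memory": log each exchange, embed it, and retrieve
# the most similar past exchanges to paste back into the next prompt.
import math
from openai import OpenAI

client = OpenAI()
memory = []  # list of (text, embedding) pairs, i.e. the conversation log


def embed(text):
    resp = client.embeddings.create(model="text-embedding-3-small", input=text)
    return resp.data[0].embedding


def remember(exchange):
    """Store one question/answer exchange in the log."""
    memory.append((exchange, embed(exchange)))


def cosine(a, b):
    dot = sum(x * y for x, y in zip(a, b))
    norm = math.sqrt(sum(x * x for x in a)) * math.sqrt(sum(y * y for y in b))
    return dot / norm


def recall(query, k=3):
    """Return the k logged exchanges most similar to the current query."""
    q = embed(query)
    ranked = sorted(memory, key=lambda item: cosine(q, item[1]), reverse=True)
    return [text for text, _ in ranked[:k]]
```

Whatever `recall` returns would simply be prepended to the next GPT-4 prompt, which is more or less all this kind of 'memory' amounts to.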


Yes, but based on my interactions with GPT-4, it doesn't incorporate the material from its interactions into its training. The training set is essentially static, not dynamic. I suspect that the whole thing is more complicated than that, but those are the basics.

I think that it is very easy to overestimate these programs. I have a good example here from a story that I had GPT-4 'write': https://comfortwithtruth.substack.com/p/chatgpt-writes-me-a-story-and-answers-a8f

The punchline is that it only predicts which word should come next in its response to a prompt. When it writes a story there are no characters or plot, just simulated characters and a simulated plot.
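For anyone who wants to see what "only predicts the next word" looks like in code, here is a minimal sketch of an autoregressive loop. It is my own illustration, assuming the Hugging Face `transformers` library and the small `gpt2` checkpoint rather than GPT-4 (whose weights aren't public).

```python
# The whole "story" is just this loop: score the next token, pick one, append, repeat.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("gpt2")
model = AutoModelForCausalLM.from_pretrained("gpt2")

ids = tokenizer("Once upon a time", return_tensors="pt").input_ids
for _ in range(30):                                # generate 30 tokens, one at a time
    logits = model(ids).logits[:, -1, :]           # scores for the next token only
    next_id = logits.argmax(dim=-1, keepdim=True)  # greedily pick the most likely one
    ids = torch.cat([ids, next_id], dim=-1)        # append it and go again

print(tokenizer.decode(ids[0]))
```

There is no plan for characters or plot anywhere in the loop; each word is chosen only from the words that came before it.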


Related to the “5% problem” in success reports of automated tutoring.

These tools may disproportionately benefit students who need the least help, while failing to address the needs of the broader student population.

Success was measured from just 4.7% of study participants, and the analysis excluded 37.8% of schools and 19.1% of districts.

https://www.linkedin.com/posts/szymonmachajewski_education-instcon24-d2lfusion-activity-7219023521962708994-wEAQ?utm_source=share&utm_medium=member_ios


Thank you, Henrik, for this. I have been teaching and mentoring kids from rural areas and under-resourced urban areas for the last 10 years. I can vouch for the importance of learning cultures. I am interested to know your thoughts on how to deliberately create and scale high-growth cultures.


I work at an agile learning centre with self-directed young people, on a programme where we help them set goals. Broadly, we ask them to look at their past, their present, the future they want, the journey to get there, and how they will define success. I honestly believe that LLMs are a powerful tool, but in the context of your argument about culture, as russett's comment gets at, they are no different from Google.

You are right in your assumption that culture is the bottleneck, but I think a very specific aspect of culture is going to prove to be the problem, and that is school itself. The reason Khan Academy has so little impact in schools, and the reason you are right that if AI and LLMs were "introduced in schools, I doubt most children would leverage these systems to grow significantly faster", is that schools erode our intrinsic motivation.

All the young people who come to our community from school have to go through a process of deschooling, where they essentially play and reject any notion of anything that looks like formal learning (how long this takes depends on how negative their experience of school was). But when they come out the other side they are able to flourish, set their own goals, and work towards them on their own terms. As you note, teachers will become less useful; we call ourselves mentors or facilitators, and that probably is the future of the profession.

And when young people are in that space, with plenty of intrinsic motivation, the tools will actually prove useful and have an impact. Just this week I took a tool we use to help think about goals, called a learning sprint, and applied ChatGPT to it.

You start with a goal and then you note down all the possibilities you can think of within that goal. Say the goal is the Romans: you write down anything you know about the Romans and anywhere you could learn about the Romans - the What to Learn and the How to Learn. Then, out of those possibilities, you build a story - a question, a project proposal, or a maker project, depending on the nature of the goal. Lots of themes of entertainment might come up - gladiators, plays, colosseums, lions - and so the story might be: how were the Romans entertained?

Then you create a task list of how you would go about answering that question, and you have a few weeks' worth of research/making/writing/thinking to delve into.

ChatGPT can be used to help at multiple stages. You can prompt it to help with the What to Learn and the How to Learn, creating a broader range of possibilities to draw on to create your story. You could put all the possibilities you have listed into it and get it to point out the themes to you, to help you craft your story. You could ask it to help as you create the task list. The sub-task of Researching Roman Plays is itself a broader task that can be narrowed down into more specific tasks: who are the famous Roman playwrights, find age-appropriate translations of plays, translations of plays in picture-book format, which were most popular at the time and which plays are most read now? These can all be researched through the interface, all the while building on the previous questions as you narrow down to what you want or need.

These are all tasks that a facilitator is there to help you with as you progress, but you could outsource them to an LLM if a facilitator were not there. You could take the learning sprint tool, work on it with me for an hour, then go home and, now that you know how to use a learning sprint, work on your other three goals in your own time with ChatGPT to assist you. Or a facilitator could theoretically do all of these things with you but might not really know enough about the Romans, so you could use an LLM in the presence of a facilitator and get the best of both worlds.
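A rough sketch of how the ChatGPT-assisted stages above could be scripted against a chat model. This is my own illustration, not an existing implementation of the learning sprint: the prompts and the `learning_sprint` helper are invented for the example, and the `openai` client and the `gpt-4` model name are assumptions.

```python
from openai import OpenAI

client = OpenAI()


def ask(prompt):
    """One call to the chat model; returns plain text."""
    resp = client.chat.completions.create(
        model="gpt-4",
        messages=[{"role": "user", "content": prompt}],
    )
    return resp.choices[0].message.content


def learning_sprint(goal):
    # Stage 1: broaden the possibilities - the What to Learn and the How to Learn
    what = ask(f"List things a curious child could learn about {goal}.")
    how = ask(f"List places and ways a child could learn about {goal}.")
    # Stage 2: build a story - spot the themes and suggest one guiding question or project
    story = ask(
        f"Here are possibilities about {goal}:\n{what}\n{how}\n"
        "Point out the common themes and suggest one guiding question or project."
    )
    # Stage 3: break the story into a few weeks' worth of small, concrete tasks
    tasks = ask(f"Break this into a few weeks of small research tasks for a child:\n{story}")
    return story, tasks


story, tasks = learning_sprint("the Romans")
```

A facilitator could sit with a young person while running something like this, which is the 'best of both worlds' arrangement described above.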

But as I note, this relies on the intrinsic motivation of an eight-year-old walking up to you and going "I need to learn about the Romans. Will you help me?" We have had the ability for young people to think up these goals and then go out and find the information for at least the last twenty years, and so in that sense it is no different from Google. In some ways I like to think of it as a more powerful Google: Google is a tool that releases self-directed learners into the world and allows the whole world to permeate back at them, and LLMs are just the same but much more powerful and intuitive, as they do the heavy lifting of searching and filtering for you. But as you recognise, culture was causing the bottleneck somewhere even when we had Google to find the answers to almost everything, and I believe it is the way school indoctrinates us in our relationship with learning, taking away intrinsic motivation and leaving us reliant on outside forces to motivate us, tell us, teach us, and grade us.

The key learning objective really is knowing how to use the software and knowing how to ask the right questions; that is the practice: what questions are going to give me the answers I want? You tweeted about recursive lists to find further authors to read when delving into a topic. An eight-year-old is not going to know to ask that. But our job as mentors is to help them develop with these tools so that those types of questions will seem logical and second nature to them as they get older. And when a tween says "I want to learn to be brave", as one did to me yesterday, it is about knowing that an LLM is probably not going to be able to help you as much as a facilitator who knows you intimately. That exploration is best done back in the real world with those meat sacks we call people.


“But soon we will be able to spend less precious human time on basic tutoring; instead, the emotional labor we do to support each other can be invested in a more leveraged way.”

I have a feeling that part of the power of tutoring comes from having contact with an adult who is really excited about understanding the world, and who also believes you can understand it too. That is, the process of teaching has a role in instilling culture; it's not merely a complement to it. If my hypothesis is right, there will still be a place for humans doing basic tutoring. Maybe human tutors can serve solely as advisers and guides, but I'm not sure this is enough.


Interesting words, and I think the thesis that culture is a limiting factor in converting AI access to growth/learning is a correct one. I feel confused by a few things, though:

- If AI progress continues the way it has been, what's the point in scaling such high-growth cultures? We're already at a point where LLMs perform better on knowledge tasks than a large portion of the population could. Is there a benefit to people outside of some small portion of the population learning to use LLMs for growth in this manner?

- It seems like we're moving towards a world where having a broad knowledge base internalized is less valuable. In particular, we're moving towards a world where we can access knowledge "just-in-time", in the same way that we currently (or at least prior to pandemic-related supply chain issues) managed logistics. In a few years, it may be that your obsessions with esoteric questions are just a new variant of entertainment, the new way people like you "distract themselves and wriggle out of work".

I suspect that a key culture shift will be that people move from "just Google it" to "just ask ChatGPT" -- and once that happens, and once a new generation grows up with LLMs and is as fluent with prompting them as millennials are with searching Google, and as AI companies make LLMs easier to use, what's the difference between the world we inhabit and the one you worry we won't?

author

These are good questions. I guess I find it valuable to pursue creating cultures that allow people to flourish and become skilled because 1) of an underlying assumption that this is a more pleasant way of living than the alternative, and 2) I don't think we know how fast and how far AI will evolve, or how big a role humans will play in solving the problems that need solving, so we might as well do what we can to make sure that people are there to solve them, in case AI won't.


The phrase "high-growth culture" seems to be a euphemism for high-culture growth. Rather than glorifying and prizing the culture of intellectualism, prize multiculturalism and different kinds of knowledge. That doesn't mean that we should value gang culture, but it does mean that we shouldn't lose sight of the fact that human success and survival won't come from a singular focus on high-culture knowledge. Farmers and fishers don't need to learn about Pascal or Bertrand Russell.

Blair Kettle


Really nice post, Henrik! :) Wondering whether you have any references on how people develop particular cultures, or change cultures in a particular direction, at scale?


Your talk about exceptional people reminded me of chapter 9 of Atomic Habits (the chess prodigies).

They were tutored since childhood to play chess. And I guess their becoming great at chess was only possible with their parents' guidance.

That's why I think that if a kid wants to excel at something, they either need to discover, from a young age, something they're obsessed with, or be highly directed by their guardians. I guess this is also what happened with Tiger Woods. He grabbed a golf club when he was 2 years old. By age 12, he already had his 10k hours in (I think; don't quote me on that math).

This was a great read! I was thinking about AI and school, but I had a much darker vision than yours, because there are companies marketing AI not as a tutor but as something that just does your homework. Your text gave me hope for a better future! Thanks!

author

The chess example is Polgár, I assume?

I'm actually a bit skeptical of the idea of early specialization. I think it makes sense in "kind learning environments" like chess and sports. But generally speaking, most people who go on to do exceptional work tend to have a fairly long sampling period where they try different things, because you can go further if the thing you do fits you perfectly and excites you profoundly. I remember seeing, for example, a study on professional soccer players: early on they actually did less structured soccer training than their peers - instead trying many sports and playing a lot of unstructured soccer - only accumulating more soccer training hours in their later teenage years.


On this topic I recommend “Range”. The book shows how generalists who combine multiple areas are catalysts for innovation.

https://www.amazon.com/Range-Generalists-Triumph-Specialized-World/dp/0735214484


"Can we figure out ways to scale access to high-growth cultures? Are there ways for more people to grow up in cultures that approximate in richness that J.S. Mill, Pascal, and Bertrand Russel had access to?"

I think this is the key question right here, and I think on some level the answer is replicating it in microcosm, in a way that stays tight-knit across geographical barriers, and then trying to grow and spread that seed from there.


I’m definitely interested in reading about your understanding of the high-growth-cultures design problem (or scaling problem). I guess that you are talking about designing in-school cultures (or learning-pod cultures, homeschool co-op cultures, or even adult study groups or intellectual circles), but when I read “high growth cultures” I immediately think about the cultural groups in America that exhibit disproportionate academic success: Jewish Americans, East Asian immigrants (plus the other successful minorities described in The Triple Package book https://books.google.com/books?id=4F6MAgAAQBAJ&pg=PT3), and the American elites (what Matthew Stewart calls “the 9.9 percent” https://www.theatlantic.com/magazine/archive/2018/06/the-birth-of-a-new-american-aristocracy/559130/)

There seems to be a neo-strict school trend (https://www.economist.com/britain/2023/01/16/why-super-strict-classrooms-are-in-vogue-in-britain) trying to “scale” the East Asian disciplinarian style in schools (because, I guess, it is easier than scaling the other styles).

The Triple Package book aims to provide an explanation for why some groups "seize on education as a route to upward mobility”. It argues that education and hard work are not a good explanation for success; they are a “dependent variable”. Some of the key motivators described in the book are a constant sense of insecurity and a feeling of not being good enough. So I see a design problem there (although perhaps not the one you had in mind), and even if it were feasible, I’m not sure whether it would be a good idea to scale insecurity.

author

I think about the problem broadly (I'm more interested in cultures outside of school, actually, but I work with a bunch of people building good schools, so I think a lot about that, too). I'll look at the links you sent. If the trick is insecurity, then, yes, scaling that might not be great. There are a lot of cases where people become excellent without abuse or insecurity (in the bad sense of that word), and I think we should aim at scaling those types of cultures, where people flourish. But I'm not going to dictate that: different people have different goals and needs, so we need diversity.
