Humanizing adaptive learning for ELT

March 17th, 2014


Part 1:  Knewton, adaptive learning, and ELT
Part 2:  Open platforms and teacher-driven adaptive learning

The debate over adaptive learning at eltjam, Philip Kerr’s blog, and Nicola Prentis’ Simple English has been both fascinating and instructive, not only because of the posts themselves but also because of the great dialogue in the comments. It’s a fun topic because it involves our core beliefs regarding language acquisition, effective teaching, and the roles that technology can play.

That said, I can’t help but feel that in some respects we’re thinking about adaptive learning in a limited way, and that this limited perspective, combined with Knewton confusion, is distorting how we approach the topic, making it into a bigger deal than it really is. But, given the potential power that the new “adaptive learning” technology may indeed have, we do need to see clearly how it can help our teaching, and where it can potentially go wrong.

Adaptive learning in context
I wrote “adaptive learning” in scare quotes above because I think the name itself is misleading. First, in a very important way, all learning is adaptive learning, so the phrase itself is redundant.  Second, the learning, which is carried out by the learner, is not what the vendors provide: “Knewton…constantly mines student performance data, responding in real time to a student’s activity on the system. Upon completion of a given activity, the system directs the student to the next activity.” That is not adaptive learning, but rather adaptive content; it is the content sequence (of “activities”) that adapts to the learner’s past performance. We can call adaptive content “micro-adaptation”, since it happens at a very granular level.
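To make the distinction concrete, here is a toy sketch of micro-adaptation – my own invention for illustration, not Knewton’s actual algorithm – in which the next activity is simply chosen to match the learner’s recent performance:

```python
# A toy sketch of "adaptive content" (micro-adaptation). The names and the
# rule are invented for illustration; this is not Knewton's actual algorithm.

from dataclasses import dataclass, field

@dataclass
class Activity:
    name: str
    difficulty: int                               # 1 (easy) .. 5 (hard)

@dataclass
class LearnerProfile:
    scores: list = field(default_factory=list)    # proportion correct per completed activity

    def recent_mastery(self) -> float:
        recent = self.scores[-3:] or [0.5]        # no history yet? assume mid-level
        return sum(recent) / len(recent)

def next_activity(profile: LearnerProfile, pool: list) -> Activity:
    """Adapt the content sequence: pick the activity whose difficulty
    best matches the learner's recent performance."""
    target = 1 + round(profile.recent_mastery() * 4)   # map 0..1 onto 1..5
    return min(pool, key=lambda a: abs(a.difficulty - target))

pool = [Activity("gap-fill: present perfect", 2),
        Activity("error correction: past modals", 4),
        Activity("reading: inference questions", 5)]

strong_learner = LearnerProfile(scores=[0.9, 0.8, 1.0])
print(next_activity(strong_learner, pool).name)    # -> reading: inference questions
```

Even a trivial rule like this “adapts”, which is exactly the point: the adaptation is in the content sequencing, not in the learning itself.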

Now, good teachers have been adapting to our students for how long…millennia? We assign homework based on what we know of our students’ strengths and weaknesses (adaptive content).  In the communicative classroom, we are always adjusting our pacing, creating new activities, or supporting spontaneous discussion based on our perception of the students’ needs in that moment (the adaptive classroom).  Dogme is one kind of adaptive learning in the classroom. And, when the stars align, educators can successfully design and deploy a curriculum, including methods and approaches, that iteratively adapts to student needs over time (the adaptive curriculum). We can call the adaptive curriculum “macro-adaptation”.


So how does the new, algorithmic adaptive learning, such as that which Knewton helps deliver, address each of these categories?

+ As we saw above, the content level is where Knewton focuses, and it’s limited to task-level online content that can be objectively scored (micro-adaptation). But it can do amazing things with this limited data, especially when the data is aggregated (“big data”). Knewton can change the activity sequence in real time to better fit the student’s performance, and can then make statistical inferences about the quality of specific activities and sequences of activities (a toy sketch of this idea follows this list).

+ For the classroom, students would need tablets or smartphones in order to input the data that Knewton needs. I can think of some very cool pairwork and groupwork tasks involving tablet-based activities, but these aren’t individualized and so would be out of Knewton’s scope. Presumably the student data can only be created by individual tasks, which would severely limit its utility in a communicative classroom. However, the content-level input resulting from students’ online work (e.g. homework, or from a blended course) could be valuable for teachers to have and could help optimize classroom lesson planning.

+ For the curriculum category, algorithmic adaptive learning can analyze the student performance data resulting from the content level, and then deliver insights that can potentially be fed into the curriculum, helping certain aspects of the curriculum iterate and adapt over time (there are limitations here that are discussed below).
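To illustrate what those content-level insights might look like, here is a deliberately simple sketch – again my own invention, not anything Knewton actually does – that aggregates objectively scored results across students and flags activities that may need revision:

```python
# A toy illustration of content-level data feeding curriculum-level insight:
# aggregate objectively scored results across students and flag activities
# with unusually low success rates. Data and threshold are hypothetical.

from collections import defaultdict
from statistics import mean

# (student_id, activity_id, score 0..1) -- imagined logged results
results = [
    ("s1", "prepositions gap-fill", 0.4), ("s2", "prepositions gap-fill", 0.3),
    ("s3", "prepositions gap-fill", 0.5), ("s1", "present perfect MC",    0.9),
    ("s2", "present perfect MC",    0.8), ("s3", "present perfect MC",    0.7),
]

scores_by_activity = defaultdict(list)
for _student, activity, score in results:
    scores_by_activity[activity].append(score)

for activity, scores in sorted(scores_by_activity.items()):
    rate = mean(scores)
    if rate < 0.5:                        # crude "needs review" threshold
        print(f"Flag for curriculum review: {activity} (mean score {rate:.2f})")
```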

So as a tool, Knewton has potential for the ELT profession. But whether the tool is used appropriately or not does not depend on Knewton, but rather on the publishing partners that use Knewton’s tools. All Knewton does is provide publisher LMSs with a hook into the Knewton data infrastructure. Knewton is a utility. It’s the publishers that decide how best to design courses that use Knewton in a way that is pedagogically appropriate and leverages Knewton’s strengths to provide adaptive content, classrooms, and curricula. It is the publishers that must understand Knewton’s limitations. As a tool, it can’t do everything – it can’t “take over” language learning, or relegate teachers to obsolescence, although Knewton’s marketing hyperbole might make one think that.

Limitations for ELT

If Knewton’s ambition is one concern, then another is that it is not specifically designed for ELT and SLA, and therefore may not understand its own limitations. Knewton asks its publishing partners to organize their courses into a “knowledge graph” where content is mapped to an analyzable form that consists of the smallest meaningful chunks (called “concepts”), organized as prerequisites to specific learning goals. You can see here the influence of general learning theory rather than SLA/ELT, but let’s not concern ourselves with nomenclature; let’s just call their “knowledge graph” an “acquisition graph”, and call “concepts” anything else at all, say…“items”. Basically, our acquisition graph could be something like the CEFR, and the items are the specifications in a completed English Profile project that detail the grammar, lexis, and functions necessary for each of the can-dos in the CEFR. Now, even though this is a somewhat plausible scenario, it opens Knewton up to several objections, foremost among them the degree of granularity and linearity.
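For illustration only, here is what a tiny fragment of such an “acquisition graph” might look like, using the renamed terms above; the goal, the items, and the format are hypothetical, not Knewton’s actual knowledge-graph schema:

```python
# A hypothetical fragment of an "acquisition graph", in the renamed terms
# above: "items" mapped as prerequisites to a CEFR-style can-do goal.
# The goal, the items, and the format are invented for illustration.

prerequisites = {
    "B1 can-do: describe past experiences": {
        "past simple forms",
        "common irregular verbs",
        "time expressions (ago, last ...)",
    },
}

def ready_for(goal: str, mastered_items: set) -> bool:
    """A goal is 'unlocked' once every prerequisite item has been mastered."""
    return prerequisites[goal] <= mastered_items        # subset test

mastered = {"past simple forms", "common irregular verbs"}
print(ready_for("B1 can-do: describe past experiences", mastered))   # False

# Of course, a learner might master the missing item outside any tracked
# activity sequence (a song, a conversation), which is where the linearity
# assumption starts to creak for language learning.
```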

A common criticism of Knewton is that language and language teaching cannot be “broken down into atomised parts”, but can only be looked at as a complex, dynamic system. This touches on the grammar McNugget argument, and I’m sympathetic. But the reality is that language can indeed be broken down into parts, and that this analytic simplification is essential to teaching it. Of course, language should not be taught only this way; we always need to emphasize the communicative whole rather than the parts, and use meaning to teach form. But to invalidate Knewton because it uses its algorithms on discrete activities is to misunderstand the problem. Discrete activities are essential, in their place. The real problem that Knewton faces in ELT is that both the real-time activity sequencing and the big-data insights derived from these activity sequences are less valuable than in other domains, and could be misleading.

They are less valuable in ELT for at least two reasons. First, the big-data insights come from a limited subset of activities: much student-produced language and learning data is simply not captured.

Second, language learning is less linear than other, general learning domains (e.g. biology, maths). Unlike in those domains, most language students are exposed to ungraded, authentic, acquirable language (by their teacher, by the media, etc.) that represents, approximately, the entirety of what is to be learned. Algebra students are not exposed to advanced calculus on an ongoing basis, and if an algebra student were exposed to calculus, the student wouldn’t be able to “acquire” calculus the way humans can acquire language. Therefore, for ELT, the cause-effect relationship between Knewton’s acquisition graph and the map of prerequisite items is to some extent invalidated, because the student may have acquired a prerequisite item by, say, listening to a song in English the night before, not by completing a sequence of items. That won’t happen in algebra.

Because of these limitations, Knewton will need to adapt its model considerably if it is to reach its potential in the ELT field. They have a good team with some talented ELT professionals on it, who are already qualifying some of the stock Knewton phraseology (viz. Knewton’s Sally Searby emphasising, in the last Knewton response on eltjam, that for ELT knowledge graphing needs to be “much more nuanced”). And hopefully Knewton’s publishing partners will design courses with acquisition graphs that align with pedagogic reality and recognize these inherent limitations.

Meanwhile, as they work to overcome these limitations, where can Knewton best add value to publisher courses? I would guess that some useful courses can be published straightaway for certain self-study material, some online homework, and exam prep – anywhere the language is fairly defined and the content more amenable to algorithmic micro-adaptation. Then we can see how it goes.

In Part 2 of this post we’ll focus on the primary purpose of adaptive learning, personalization, and how this can be achieved by Knewton as well as by teacher-driven adaptive learning using open platforms.  As Nicola points out, we need to start with “why”….


Do you have ELT experience in Saudi Arabia?

July 18th, 2013


We’re just starting a fascinating, multi-year ESP project in Saudi Arabia, designing and supporting the English programs for seven new colleges. Blended learning of course, using the English360 platform. We’ll be designing a custom curriculum, authoring content, and creating an online community of practice for the teaching team.

Now, usually we do the course design and content authoring in house, but this is a big project with a near-immediate start date, so we’re looking for a few people to work with, primarily ELT/ESP editors, authors, teacher trainers, and consultants with experience in Saudi Arabia. Teachers are welcome as well, especially if you have some innovative ideas on adapting lessons and methods to the culture.

So, if you’re a freelance ELT/ESP professional with experience in Saudi Arabia, and are looking to put in 5-20 hours weekly on a new project, please get in touch. You can work virtually from wherever you are, and get to work with Jeremy Day and Valentina Dodge, which is always fun and sometimes exciting.  Email me with your CV and whatever information you think relevant at cleve (at) english360.com and use “KSA project team” as the subject line please.


My IATEFL session slides, and general conference impressions

April 22nd, 2013


Here are the slides of my IATEFL session, and thanks to everyone who emailed me asking for them.

I thought the session went well, although if I do this talk again I’ll switch up the organization a bit, in order to match the case studies to the framework. This would give the audience a real-life example for every step in the framework, instead of leaving all the examples to the end. And of course I had to rush the end, because, 30 minutes, come on! Sessions that short are approaching TED Talk length, and in my opinion a professional conference is more about in-depth professional development, not the slick and shiny “ideas worth spreading” TED format that is compelling yet often lacking in depth.

I spent most of the conference at the English360 stand, which meant I couldn’t see many sessions – frustrating. But the level of interest in open platforms in general, and English360 in particular, was intense. So our time at the stand was spent in energetic discussion about blended learning with our colleagues and customers. Usually you can learn as much doing that as you can going to sessions, if you really listen.

Thoughts on IATEFL Liverpool:

+ It was great to see the interest in the “flipped classroom” – many sessions on that theme. Valentina’s session was packed, and the room held 300. I have some lingering concern about the term though. I still don’t see the difference between “flipped classroom” and a well-designed, integrated blended language learning course. “Flipped” seems to be the buzzword for what we’ve been saying all along…input, heads-down tasks, and drilling usually take place in the online component, and collaborative, communicative stuff F2F. Furthermore, we really need to differentiate between the aims and approach of the flipped classroom in non-ELT courses (where it originated) and in ELT contexts. They are very different beasts. Paul Braddock, in his excellent session in the LTSIG pre-conference event, did in fact point this out, although he differentiated things primarily through a constructivist lens, whereas I think most of the difference is in the domain itself…acquiring a language is different from learning, say, history, so what you are flipping, and why, will change as well.

+ I fully understand what Pearson is doing with the “all-iPad, zero print books” tactic at their stand, and I applaud them. But Pearson didn’t execute on the concept well at all, and I think the message was lost on the audience there.

+ Panic among the publishers!  There is so much re-organization and job cutting going on among the larger publishers that their staff is feeling immense stress. We had a lot of job inquiries at our stand from recently downsized senior staff – very qualified folks. I encouraged them all to go freelance of course, and to use an open platform (e.g. English360) to deliver products and services.

Here are the slides (you can download them from Slideshare):


IATEFL 2013 Preview: Growing your ELT career with technology

April 5th, 2013


Like many of you I’m sure, I can’t wait for next week’s IATEFL 2013 conference!

It’s great to see everyone of course, but this year I’m also very much looking forward to delivering my session, because it’s something I feel very strongly about: career development in ELT.

I feel strongly about this topic because I see so many teachers who would like to branch out within ELT into new roles, who have the talent and energy to do so, but aren’t sure exactly how. The result is that the ELT profession as a whole loses out on a tremendous amount of talent and innovation at exactly the moment when, as a profession, we need it most.


The great thing is that it’s never been easier for teachers to move into new roles such as materials design, consulting, research, school ownership, authoring and self-publishing. Why? Because the technology that is available to us today opens up opportunities that just were not there 10 years ago.

So my talk is about how to do this. We’ll look at a practical, 6-step framework that you can use for career development in ELT and for reaching your personal, professional and financial goals. Here’s an overview:

  1. The ethos of the new web and what it means for your career
  2. The essential skill set of our technology environment, and how to use it
  3. Defining the best career direction
  4. Building your “platform” as an entrepreneur or intrapreneur
  5. The essential technology tool kit
  6. Building your community

We’ll also look at case studies of teachers who have successfully moved into new roles, and see what worked (and what didn’t).

So please consider coming:  Thursday 14:00 (session 3.3, hall 4b).

 


IATEFL Chile plenary presentation

July 30th, 2012


Great conference in Santiago last Friday and Saturday – well worth the long flight. Pete Sharma was excellent as always and Penny Ur’s sessions were both spectacular. Her talk on higher order thinking skills in task design and teaching really made me think of how to help teachers apply those concepts to the content that they author in the English360 platform. Based on Penny’s talk I had an idea for a nifty continuum for drill design that might be useful – a good example of how a great conference session can act as a catalyst for further thought.

Anyway, I told the audience that the slides from my talk would be available on the blog, so here they are. Comments welcome.


Daily Tweet Digest

March 20th, 2012


  • Extraordinarily useful session on ELF by @chiasuan, great mix of theory and practice, audio examples. Audience is enthralled #IATEFL #BESIG
  • RT @ericbaber: Nervously waiting to be interviewed for #IATEFL online! Actually, Eric's never nervous…he is sangfroid personified.


Breaking news: English360 is now independent

September 7th, 2011


As we announced to our customers last week, English360 is now independent of Cambridge University Press, and we are now a wholly owned and fully autonomous organization. This is, of course, very exciting for all of us at English360, and not just from a business perspective: it’s exciting because it’s the next step in fulfilling our shared vision of where education is going, and how teachers will use technology.

How it all began

English360 was founded 6 years ago as a tiny, teacher-led start-up with big plans but few resources. We presented an alpha version of our web application at BESIG in Berlin in 2007 (it went live on the web for the first time the night before the session!). We had a clear vision, but as a tiny start-up we faced huge challenges when entering into a global ELT community dominated by big players.

BESIG Berlin 2007: The English360 launch session

Now, as it happened, Cambridge University Press was scouting new technology at BESIG that year. They attended our session, and to make an extremely long story short, English360 and Cambridge entered into a joint venture, creating Cambridge-English360 Ltd. It was an inspired partnership for a range of reasons: Cambridge got some cutting-edge technology, together with the team that built it. English360 received:

  • a wide range of Cambridge courses and resources to re-purpose in the platform
  • support from the global Cambridge sales teams
  • financial support (we were now able to pay ourselves a salary)
  • co-branding with the strongest brand in ELT

The partnership was successful. Together with Cambridge, we launched over 50 courses in the platform. We signed up thousands of users in dozens of countries and on every continent. We picked up the David Riley Award for Innovation and were shortlisted for an ELTon. We made the software more powerful and added some extremely cool content authoring templates. Many challenges remained, but we had managed to gain traction towards our goal.

So what happened?

And of course what happened is what always happens: this very success stressed the organization. We’d proven the concept: an open platform that gives teachers and schools unprecedented ability to work with publisher content, author their own content, and combine the two into personalized courses, to be delivered online, in class, or as blended learning.

And, now having proven the concept, we were inundated with new ideas and new projects for the platform, from ourselves, from our partner, and from customers. We worked hard to continue to prioritize and execute, but soon both partners realized that we were in danger of losing focus.

So we mutually decided to allow both partners to use the platform as needed, enabling each to dedicate the focus necessary for their own projects. It made perfect sense, and then about a week later we all realized that at this point there was really no reason for a joint venture any more, and that it would be better for both partners to maintain a strong strategic alliance, but without having the complications of a formal joint venture.

So that is what we’ve accomplished: an amicable separation that jettisons the problems but maintains what works. Cambridge University Press deserves tremendous credit for the enlightened, collaborative, teacher-focused business philosophy that provided the flexibility for this new relationship. English360 is now independent, but with the content agreements in place for all the Cambridge resources currently in the system, and new agreements for new Cambridge content as well (we’re launching the first new course in October).

What does this mean for English360 going forward?

For us, independence has some intriguing advantages, some of which you can probably guess. Everyone on the English360 team is tremendously excited about this next step in our progress. We’ll discuss these advantages in “Part 2” of this post, coming soon.


BESIG 2009

November 25th, 2009


Attended the BESIG conference in Poznan last weekend, with fellow English360’ers Paul Colbert and Brian Anderson. As always it was great to actually meet with colleagues who had previously been only virtual: met Karenne and Anne face to face finally. Discussed an interesting new project that Cornelia and Paul have cooked up. Met with lots of folks that I only see once a year.

Vicki Hollett’s plenary and subsequent session were great. My takeaway was her discussion of teaching functional language for authenticity when establishing relationships, whether business or social. Main point: those nice lists of functional phrases we have in BE coursebooks need an upgrade.

Another highlight was Jeremy Day‘s session on “Results-focused ESP”.   Jeremy gave us an observation that was new to me. I’m paraphrasing here but he was discussing the question “Who is the most important person in the learning process?” and we were all thinking “the student” (as opposed to teacher-centered, or materials-centered classes of course). Jeremy’s point was that another perspective, especially in ESP, is to see that the most important person isn’t even in the classroom. If we are teaching English for nursing, the most important person is actually the patient, who will be communicating with the nurse (our student). If we are teaching English for students who work in a call center, the most important person is the customer, who will need our student to resolve an issue with a product. This expansion of who we prioritize as stakeholders in the learning process is spot on.


“…this is it. The big one.”

June 17th, 2009


NYU Professor Clay Shirky (via email from Diane Tucker):

“I’m always a little reticent to draw lessons from things still unfolding, but it seems pretty clear that … this is it. The big one. This is the first revolution that has been catapulted onto a global stage and transformed by social media. I’ve been thinking a lot about the Chicago demonstrations of 1968 where they chanted ‘the whole world is watching.’ Really, that wasn’t true then. But this time it’s true … and people throughout the world are not only listening but responding. They’re engaging with individual participants, they’re passing on their messages to their friends, and they’re even providing detailed instructions to enable web proxies allowing Internet access that the authorities can’t immediately censor. That kind of participation is really extraordinary.

Traditional media operates as source of information not as a means of coordination. It can’t do more than make us sympathize. Twitter makes us empathize. It makes us part of it. Even if it’s just retweeting, you’re aiding the goal that dissidents have always sought: the awareness that the outside world is paying attention.”

From Nico Pitney in The Huffington Post.


Collaboration and 360° content creation.

May 20th, 2009


The traditional publisher model of expert-authored, professionally edited language teaching course books is often necessary, but seldom sufficient for optimal learning.

Although they are a wonder of high-quality teaching content, scope and sequence, and production values, course books have their issues. They may take 3–5 years from conception to classroom, and are usually designed for general appeal to a passive mass audience. They are expensive to produce. Authors are far from the needs of different cultures, different students, and different teachers. Contentious topics are avoided.

Thus, the problem is keeping content relevant, current and personalized. Slang, technology, and cultural references evolve more quickly now than ever before. Content and references lose validity by the time they are 5 years old (and often by the time they are 5 months old). And they may not have been personally relevant to the student anyway, since a “common denominator” approach invariably leaves many students yawning.

So what’s a teacher to do? Well, most teachers have the solution: they supplement the core course book to one degree or another. They supplement with web resources, authentic material, teacher- and school-developed content, content from other course books and resources, and activities and projects that teachers come up with on the fly.

And, critically, they supplement (or, for the Dogme folk, replace) with content brought to the learning process by the students themselves.

If a teacher has the skills, resources, and experience, the result can be an optimal mix of pre-defined language content, and personally, culturally, and professionally relevant and engaging content.

But, it’s not easy. For most teachers, we’re talking analog: photocopiers, tape, manila envelopes and file cabinets. For other teachers it’s a mind-boggling succession of web 2.0 apps, user names, and passwords…each one cool and useful but scattered around in info silos throughout the net. What the two approaches have in common is that teachers lack the time to implement them.

Today’s digital technologies will soon open up possibilities for meeting these challenges. Group authoring platforms and collaboration tools will allow groups of teachers (and students) to work together, pool their energy, and create materials and lesson plans that in terms of both quantity (definitely) and quality (optimally) were formerly only possible from publishers. Print-on-demand, e-learning, and PDFs provide a delivery mechanism that again was previously only available to large publishers.

Large-scale collaboration will lead to the same result in language learning material that Wikipedia brought to encyclopedias: a dramatically wider range of topics (Wikipedia has 10 times the articles of a traditional encyclopedia). This long tail of content will provide the custom course work that will result in radically personalized learning – we’ll have as many courses as we have students. And as we’ve seen with Wikipedia, it’ll be fast and it’ll be cheap. And most importantly, what it will be is open.

So, the coming collaborative content has many advantages: speed, relevance, flexibility, personalization, and the capacity to mix authored, student-generated, authentic and web content into a more rounded approach. Through collaboration, this “360° content creation” adopts and adapts content from a wide range of sources, leading to learner-centered content that transforms passive learners into active ones, and a mass-audience approach into a personalized one.
