Ep 6: Stella Lee on the role of ethics in EdTech

Learn about how to go beyond the L&D buzzwords to create meaningful e-learning that is aligned with strategic business goals.

Inês Pinto

In our most recent episode of Untold Stories in Learning & Development, Stella Lee talks about the importance of ethics in learning technology, covering a wide range of topics from data protection to accessibility and inclusivity. She also discusses the human biases behind tech development and what we can do to ensure a more inclusive, fair, and equitable learning environment.

Key takeaways from this episode

  • The main ethics concerns in EdTech
  • How unconscious bias and human error show up in EdTech
  • The shifts happening in the role of L&D

Show notes

00:30: Introducing Stella Lee
03:00: Stella’s motivation for working in corporate learning
06:15: What does ‘ethics’ mean in EdTech?
12:15: How unconscious bias shows up in EdTech
14:29: The ethics of digital surveillance tools
17:20: The main ethics challenges in EdTech
22:08: The impact of bias and human error in technology
26:30: Ethics and inclusion in tech
29:15: The biggest milestone in corporate learning
31:55: The biggest challenge facing L&D
34:05: The future of the L&D industry
36:40: Wrapping up

Highlights

“What kind of learning theory is your technology grounded on? What kind of assumptions are you making about the learning process, the learning experience? And what assumptions are you making about your learners?”

“Because we’re dealing with tech, [ethics] should be a constant, evolving awareness and reflection on what we do.”

“In learning, you don’t want to always make it easy for people. You don’t want to just feed people more of the things that they’re good at or they like.”

“How do you think not just from your perspective, with your lens, but also get out of your view and look at how other people see the world?”

“The first question you should ask is: Is this a problem that learning can solve?”

Resources mentioned in this episode:

  • Race After Technology by Dr. Ruha Benjamin

Featured in this episode: Stella Lee, Paradox Learning

Stella Lee’s first love was actually painting – something she still enjoys to this day. But it was only after getting a PhD in Computer Science that her interest in learning technologies began. From there, she spent a few years working in corporate learning before taking the leap to found her own consulting company, Paradox Learning. Today, with 20 years of experience under her belt, she works on enterprise-wide learning strategy and governance, digital ethics for learning, AI and e-learning applications, LMS design, evaluation, and learning analytics.

Connect with Stella

LinkedIn
Website

Full transcript

Kasper: Welcome to another episode of Untold Stories in Learning & Development. And today I have a very special guest with me, it’s Stella Lee. So, hi Stella! Welcome to the podcast.

Stella: Thank you for having me.

Kasper: Yeah, really happy there. So maybe we can start out by doing a short introduction from your side. Can you tell me something about you?

Stella: Yeah, happy to. So, my name is Stella Lee. I’m based in Calgary, Alberta, Canada so I’m Canadian. I have about 20 plus years of experience in L&D – EdTech, particularly. I started out in the academic world as… Well, my academic background, actually, I started… I was a painter, as an artist.

Kasper: Oh?

Stella: Yeah, I know.

Kasper: That’s interesting.

Stella: Yeah. And that was my first love, still my love. And then I ended up having a PhD in Computer Science, focusing on learning technologies. So that’s my journey. And I started working in academia, primarily in education technology – how do we use technology to enable teaching and learning? How do we help teachers and students and the community to learn better and to empower them with tech, right? So, after I got my PhD, I promptly left academia and went into corporate learning because I was curious about what’s the other side. And after a few years of working full-time, I decided that I would be better off doing consulting. So, I started my own consulting company about 18 years ago. I consult internationally. A lot of my clients are larger public sector governments and private multinationals. On the other side, I also work with start-ups. I consult in the EdTech space, working with start-ups to help advise on product development, to advise on user experiences as well as just product roadmaps. So that’s where I’m at, yeah.

Kasper: And so, the switch that you sort of made – so what is it? Why did you make the switch? What did you want to achieve? What is your interest there? Is it tech? Is it education?

Stella: Oh, like when you say switch, like when I was in academia and switched to corporate? Or when I switched my…

Kasper: No, maybe the first choice to go into the learning world as a first step. So what was the motivation?

Stella: Right. Um, I think I’m always a lifelong learner. I felt like that’s just very much who I am. I grew up with… my dad is the perfect model. He is a lifelong learner. He’s always curious about things. And he’s always, always learning. So I got it from him. I just think that’s part of who I am. I just didn’t know, when I was younger, how I could make a living out of it. I was just like, okay, I love to learn. I love schools. I like to help people. I like to facilitate learning. And I also always loved using technology and gadgets and tinkering. And somehow it all fit. As I started out working – even as a student, I would work in the lab and I would help people use technology in the lab. And so it was a natural progression once I started working. At the time, e-learning and teaching with tech was the next big thing. And so I ended up getting a job at a university helping faculty to develop tech skills for learning and helping build online curricula. And so it was the right place at the right time. And it just went from there.

Kasper: Okay, I understand that. And how do you weigh the educational components versus the technical components? So how much of you is like tech driven and how much is like education driven or is that balanced?

Stella: Oh, this is a trick question, isn’t it? I think it’s like the Venn diagram, you know, the three circles?

Kasper: Yeah.

Stella: I think my whole guiding principle is always the intersection of the three circles. The first one, obviously, is educational pedagogy, right? Or learning. And then the second circle is the tech component. And the third intersecting circle is – because I came from the arts, I came from, you know, design – it’s the design component.

Kasper: And you’re sort of in the sweet spot of the three.

Stella: Yes, exactly where they all three intersect. It’s what the happy spot is, right?

Kasper: Yeah.

Stella: And so yeah. So I guess 33.3% for each one?

Kasper: Oh, yeah, I was just curious about sort of where the drive comes from, because we always talk about a specific topic. And the topic you chose is EdTech ethics. And I’m in EdTech myself. So Easygenerator is an EdTech company. And I actually, to be frank, never thought about ethics in EdTech. So I’m really curious to start that off: what is your definition of that? What is it?

Stella: Yeah, Kasper, can I just kind of backtrack for a minute?

Kasper: Yeah.

Stella: When you asked me, like, “What component of tech and what component of, you know, learning do you place your values in?” To me, my biggest pet peeve is to see companies that are driven by tech only or, you know, put so much emphasis on tech when we should be thinking about the learners and the learning, right? So what drives me when I look at tech and when I look at projects that I work with, when I give advice, I’m always looking at that first. It’s: what kind of learning theory is your technology grounded on? What kind of assumptions are you making about the learning process, the learning experience? And what assumptions are you making about your learners? Right? And that should be driving everything that you do. So when you say, “Oh, I never heard about tech ethics and I’m curious about that” – to me, it’s so closely connected. Because if you don’t think about that, then you’re really not thinking about your target users, right?

Kasper: Well, it depends – maybe it’s because I don’t know the definition of the word. So maybe you can explain it to me.

Stella: Well, I really… I think it’s a broad, broad area. It covers privacy issues, it covers access, it covers any algorithmic biases, it covers intellectual property. And so it’s a broad range of topics that sits under the ethics umbrella term. And I think what is interesting… Obviously, when people think about ethics it’s the moral principle of doing what’s right and what’s wrong and identifying that. But to me, in addition to a guiding principle, because we’re dealing with tech, it should be a constant, evolving awareness and reflection on what we do. It needs to be an ongoing process. It’s not like we look at the ethical implication at the beginning and that’s it and we’re done. You know what I mean? As you collect data, as you grow your user base, you have to keep thinking and reflecting and integrating that practice, constantly. So, to me it is an ongoing reflection of what’s right, what’s wrong, where the gray areas are, from access to privacy, to biases, to intellectual property, and many other areas as well.

Kasper: So I’m trying to apply that to our own product and company. So, we are an authoring tool for employees, for subject matter experts. Yeah, so for me, if I then think about the ethics of that, a basic rule is indeed that all the content created by those people is, of course, their copyright. But even at Easygenerator, we don’t – although technically we could – we don’t even have access to that content. So if you, for example, get stuck in a course and want to have support from our support team, you have to grant them access to that content, otherwise they will not be able to reach it. Is that the kind of thing you’re talking about? To be really strict with those rules? And of course also the data that we store and how you handle privacy. Is that the thing that I should…?

Stella: That’s one aspect of it, but it’s broader than that, right? To get, you know… Okay, so perhaps you don’t collect the data that are deemed personal, or you don’t store them, you don’t sell them – which is good. But is that a clear policy that you make transparent to your end users?

Kasper: Correct. Yeah, because we do store some data and we have a data privacy policy for that, indeed.

Stella: And to a greater extent: are you allowing your users to have control over… Like, can they give consent? Are they in control? Can they opt out? Those are part of ethics as well.

Kasper: I’m thinking of the GDPR legislation we have in Europe, and you have a separate, similar thing, basically, in the States. So for example, one of the things is the right to be forgotten. So indeed, in Easygenerator, we have a button that you can click as an author and not only your content, but also everything that we ever sort of knew – for example, support things – all that is deleted. So those are the things we’re talking about then.

Stella: Yeah. That could be one aspect of it. Also, when you’re creating a tool, right? What kind of assumptions are you making about users? Are you inadvertently excluding certain people? Are you creating some sort of assumptions about people that perhaps are not correct? So those are things that, you know, I think tech creators also need to keep in mind. Are you presuming… Let’s say, if you’re a learning content creation company, is the assumption that video-based learning is better? Are you trying to tell people, you know, are you influencing people subtly about certain things, right? Or are you excluding people that perhaps cannot, you know, access the technology? If they don’t have a fast connection speed, what are the implications of that? Are you being clear about who you’re excluding? Are you being clear on providing accommodations or alternatives? And it could also touch on accessibility issues, right?

Kasper: Yeah, so not consciously excluding people, but much more like making sure that access is really easy. Is that what you’re talking about?

Stella: Yeah, so those are things that, you know, perhaps people don’t always consciously think about, you know what I mean? So I think tech ethics is important because, first of all, it gives you that awareness of, you know, thinking through these problems more carefully, or these issues more carefully. And to say, “Yep, we addressed that.” “Yep, that’s covered.” “No, that’s not what we mean.” “We’re excluding by intention because there’s a reason for it.” So those are the topics that we talk about. And tech ethics has been around for a long time; EdTech ethics has also been around for a long time. But I think what makes it more interesting now is that it’s so ubiquitous, right? Like, our data is being collected… think about the smartphone that we use. Just that single device itself. How many data points are being collected? And how many of those are being commercialized? How many of those are being sold, right? And people are not aware of it. And what makes it dangerous is also… it’s not just one-dimensional data. It’s not just in my phone, but it could also be in a smart home device like a thermostat. It could be in my alarm system. It could be everything else that’s integrated, and at work as well. Like, there are so many work productivity tools that are tracking you now. And I think now it’s worse since COVID, because we’ve all had to work at home. And so companies are always a little bit paranoid about all their people working at home. Are they actually really working? So, let’s check on them. The sales of surveillance tools actually went up, I think, in the first two or three months of COVID. Like, March, April, and May of 2020, the sales of surveillance tools went through the roof. So I think it’s a very timely topic as well.

Kasper: Yeah, I agree with you. At Easygenerator, I have sort of the same experience. We are already a remote company in the sense that we do have offices and people work there, but they also work from home. And the whole sales process, everything we do, is online because our customers are all over the world. So, we don’t go there. So for us, when COVID hit and we had to close the office with the first lockdown, everybody just took their laptop and worked from home the next Monday. But we actually saw productivity rise by 20%. And I never, ever thought to install any surveillance. We are really output driven. So we look at the results people produce. So I’m not really interested in: are they online? How much time do they spend? As long as they reach their goals within the boundaries that we set for that, I’m happy. But that is not a common thing?

Stella: Yeah, that’s because you get it, right? That’s because you have trust in your employees. And that’s because you’re output driven, and you don’t care if… You know, if somebody takes a 30-minute break, you’re not tracking that as long as they complete their projects. And as it should be. But that takes a shift in mindset in some organizations, right? It’s very much organization-dependent, depending on the culture. You know, some industries too are much more traditional in their thinking. I think, naturally, being in a tech start-up, you’re already ahead in terms of thinking about, you know, how you are going to manage and lead people remotely. But so many companies don’t have that experience.

Kasper: No, no, I can imagine that. Yeah, I was aware that we had an edge on a lot of other companies because we were already, like, halfway there when COVID happened. But if you look at… Are there specific areas where you are concerned? You mentioned a couple of things that you’re sort of focusing on with a specific concern. But do you think that things are not going in the right direction, or do you see, in general, improvement?

Stella: Um, there are many concerns and challenges, especially now that you see so many products that are AI or machine learning driven. And that complicates the issue as well, because that means there’s more data, there are more datasets available. That means it’s, you know, taking in a large volume of data, but it’s also spitting out large volumes of data. And there are lots of biases in… Like, what data do you feed to an algorithm? For example, when you’re feeding learning data from a training data set from a department that is traditionally more male than female, perhaps you didn’t consciously think that you’re feeding in data that is biased from a gender perspective. Or if you’re feeding data from a department that traditionally has more older people or more younger people, you’re also feeding in data that is not representative of all age groups, right? But you don’t always think about that. You know, I think it was Amazon that used a recruitment tool that discriminated against women? But they didn’t know that until they started running it through the system and realized that the algorithm was picking out words that are deemed feminine in quality and giving them a lower ranking. So yeah. And there are many, many examples of that out there. Even Google’s image recognition had a hard time recognizing people of color. Facial recognition is a technology that’s particularly problematic, right? If you’re a woman of African descent, the error rate is like 30%+, as opposed to if you’re a white male with light skin, where the error rate is, I think, maybe 3%? 5%? So there’s a significant difference, right?

Kasper: Oh, that’s really interesting. But would you then see a role…? So it’s not that people have the wrong intentions with EdTech, but they’re just not aware of things going wrong? Is it your role to make companies aware of: wait a minute, you think you’re making a proper selection, but you are not?

Stella: Well, I mean, a lot of times it’s unconscious biases. But they could be conscious biases, too. It’s a combination of things, right? You get these kinds of biases from sourcing data that you didn’t know was biased. Or it could be something happening in the algorithm that makes it more biased. For example, when you’re using a recommendation system. Like any of the learning experience platforms – when you’re recommending content, it creates a positive feedback loop. So, if it’s something that people use often, it gets ranked higher. But then there’s no way of kind of opening the algorithm to say, “Well, is there something missing?”

Kasper: Yeah. You’re creating, like your bubble based on the algorithm.

Stella: Yeah. It’s like your own echo chamber, right? It just feeds you more of the same things that you like. But in learning, you don’t want to always make it easy for people. You don’t want to just feed people more of the things that they’re good at or they like.

Kasper: Yeah, you want to go outside of the box every now and then.

Stella: Yeah. Or do you want to challenge them? Or do you want to have something that’s more serendipitous? You know, it’s like when you go shopping, you end up buying things you didn’t think you were going to buy, but you chanced upon something. Because you see it, and it reminded you of something else. It’s that serendipity that is not happening. In fact, it’s almost trying to exclude that as you give more feedback to the system.

Kasper: So making things transparent, being aware of things. I spoke the other day to somebody who said, about biases, that we’re all biased. So if you say you’re not biased, it’s not true. The only thing that you can do is be aware of that and handle it in the proper way. So that’s basically what you’re telling me as well.

Stella: Absolutely, yeah. And I think the perception is that people think, oh, technology’s not biased because it’s, you know, neutral. But it’s not true. Because who built it? Humans, right? And we built assumptions into it. So I think it’s not so much that it’s concerning, but more about what we need to know. We need to have that level of awareness before we can actually do something about it. And on top of that, there are also human errors… we might not intentionally do something wrong, but we might mislabel something. Don’t forget, with AI, for example, when you do classification, it’s actually a very manual process, right? Humans have to label things. So, you can make mistakes, too.

Kasper: Yep, absolutely. I’m great at that.

Stella: Yeah, me too, right? Like, when you get tired, you know, you don’t pay attention, you put a typo in there, you forget to enter something, there’s an empty field… So all of that would also have an impact. And I think what makes it more challenging now is that it’s not so easy to spot them because, you know, it’s not transparent. When you feed it to an algorithm, to a machine learning process, it’s not always easy to kind of open this black box and say, “Oh, hang on a minute. It’s just this step,” you know? You can’t always pinpoint it. So I think that’s problematic. And also when it gets to your end users, when it gets things wrong, when it makes a wrong recommendation, or when it makes wrong predictions or assumptions… it could demotivate people from learning. There’s predictive learning analytics to kind of predict your success rate or your risk factors. Like, is this employee at risk of, you know, disengaging from work? What if you get it wrong? Then you’re essentially accusing people of things that they, you know, are not doing or haven’t done yet. So, you have to be very careful. And there’s a lot of that HR technology out there. Even Microsoft had a productivity score that came out a couple of years ago and it got huge pushback. So they had to… because they were reporting productivity scores at an individual level, so that your manager could see them. To basically track you and say: you’re spending 30% of your day replying to emails. But it’s not giving you context, right? Perhaps it’s a day you need to do that, because there’s an emergency happening, but it’s not giving you the nuances. It’s not giving you the story behind that.

Kasper: That’s interesting, because you sort of made me think – I never looked at Easygenerator, our tool, from that perspective. So, what do we want? We want to empower people to work, but I never thought about the question: are we biased in a way? And will that have an impact on what kind of people are attracted to the tool? So yeah, I really should look into that, because I don’t have a clue, to be frank.

Stella: And yeah, I think when you look at your user base, you know, they’re self-selecting, too, right? They are drawn to you because perhaps they are more visually-inclined people already. And so, you know, what does that say about the tool? They give you feedback to say, “Oh, we want more visuals.” But you’re reinforcing the same thing from the same people that are giving you feedback, and perhaps you’re missing out on other perspectives.

Kasper: I do notice… a couple of years ago, we were really not paying attention to accessibility. That was really not on par. And that’s something that we’ve really upped our game with, and it also sort of opened my eyes to how diverse that issue already is by itself. Yeah.

Stella: Oh, for sure. And even just when you’re trying to do business globally, there are also cultural differences, right? Like, color is highly culturally dependent. So even just having that awareness – it’s not so much about being unethical, I think it’s just about being inclusive. And I would classify that as an ethical topic, you know: how to be more inclusive. How do you think not just from your perspective, with your lens, but also kind of get out of your view and look at how other people see the world, right? And to say, “Okay, is my product making assumptions about people that are only using it from Europe and North America, but not from other parts of the world?” You know?

Kasper: Correct. Okay.

Stella: Because I’m doing a project currently with the Asian Development Bank (essentially like the World Bank, but focusing on the Asian region). And one of the big topics that comes up all the time is that a lot of the EdTech products are very European- and North American-centric. And they might not be applicable to the region because, depending on the maturity level of the country from a tech and an EdTech perspective, they might not be able to use them: does it work offline? Do you need fast internet access? Are you making assumptions about learning from a very Western perspective? So there are a lot of products that we review where we have to put caveats, because we’re advising Ministries of Education in the region. So we have to say, “Okay, these are some products, there are some good examples of EdTech out there. But this one, perhaps, the terminology is very North American-focused, so pay attention to that.” Or the way they go about, you know, designing learning has certain assumptions that might not apply to your country.

Kasper: No, I don’t think… I see that, that’s very clear to me. Okay. Timewise, we need to go to the next topic, otherwise we’re running really, really late. So, I just want to pick your brain on a couple of things – how you look at corporate learning in general. So, what do you think, in your mind, is the biggest milestone in corporate learning that we’ve reached so far?

Stella: Um, I think in corporate learning, we’ve gotten to a point… there’s a maturity about understanding and appreciating different types of learning. I think we’ve kind of gotten past the idea that learning is a one-off event in a classroom. You know, I think that’s been the thinking for a very long time. Recently, there’s a little bit more diversity in thinking about what learning entails – it’s not just a classroom, an event, a workshop, but also incorporating the richness of what’s happening within your organization: what’s happening informally, what’s happening socially? How is knowledge being transferred, how is it being shared? So, looking at it in a more holistic way. I mean, don’t get me wrong, we still have a long way to go. But I’ve started hearing a lot more conversations on that, right? I start hearing people talk about the ecosystem. So, I think we’ve kind of hit a milestone there. I think L&D as a department is also becoming more consultative in nature and more integrated with the business instead of just being a silo. And I think we’ve gotten to a point where we’re able to push back a little bit. Because historically, other departments and other people come and say, “We need a course,” “We need training on this topic.” And it’s really prescriptive, and it’s driven from other places. And historically, L&D has lacked the confidence to say, “Hang on a minute.”

Kasper: Yeah, there’s a problem here. What are we going to do?

Stella: Yeah, why do you want a course on that, right? Is this even helping? Or is this… Like I like to say, the first question you should ask is: is this a problem that learning can solve? Or is this a problem where other things need to be involved? Because sometimes it’s an issue of communication, not a learning problem. So I think we have built a bit more of that confidence in consulting across the business more broadly.

Kasper: Okay. Is that then also one of the challenges you see for corporate learning? To grow further in that? To be more critical about those kinds of things and focus much more on the results of learning? And not just creating something?

Stella: Kasper, I don’t think we have time to talk about all of that! Yes, yes, and yes. Definitely, I think the challenges all start with the changing role of L&D, right? I think every year, and more pressingly in the past two years, it’s: how do we upskill and retrain, right? Do we need to be a coach? Do we need to be a facilitator? Do we need to be a mentor? Do we need to be an e-learning content creator? Do we need to now also be data analysts? Do we need to, you know, be EdTech specialists? There are just so many, you know, areas of specialty now. I see it going one of two ways. I think some L&D professionals will become more of a generalist, kind of consulting at a broader level. And then some of them will become more niche focused, right? So some of them will…

Kasper: Like data analysts…

Stella: Exactly. Or maybe EdTech ethicists, you know?

Kasper: Yeah. But at the same time, it’s also sort of a change from a… sorry, from a top-down approach much more into a bottom-up approach, from a controlling perspective into a facilitating perspective – is that correct?

Stella: I think so. It’s more distributed and more grassroots, as well. I mean, I’m speaking very, very broadly. But yeah, it depends on how big the organization is, the culture, and everything else. I think you have to come up with something that works for your organization, as well.

Kasper: Yeah, that makes sense. So, if you take that forward, 5 years from now, where will we be with corporate learning? How will that look? What is the biggest change between now and then?

Stella: 5 years?

Kasper: That’s like a lifetime, I know.

Stella: You’re in tech as well – how can we talk about 5 years? 6 months? Well, I see it being more personalized. I think it needs to be more personalized. And I think, like I already talked about, it’s more niche focused. And I think it needs to be a little bit more of a balanced approach as well. It’s not just looking at… Again, I think this whole ecosystem is going to continue to develop and it’s going to continue to pop in and connect to different areas of the business. So, I can see that continuing to evolve that way as well. And who knows about tech? I think tech will always change. But I think, you know, that gets back to your earlier question: we still have to think about what’s driving this field, right? What are your values? And where do you place your emphasis? And to me, it’s always back to that Venn diagram of: how do you balance the three while putting your learner at the center of it, putting your people at the center of it?

Kasper: But that also increases your problem, because if you want to make it more personal, put the learner at the center, it’s all about data. So then the whole ethics story becomes even more relevant, of course, because you have to track more data in order to advise somebody on a personal level.

Stella: Absolutely. So it’s a balance, right? I think there’s going to be tension all around. And I think our challenge is also: how do you balance this tension as things get more complex, as tools get more sophisticated? I think it’s a great problem to have. I think it’s a very exciting time that we live in. Look at the range of tools we have available. Look at the type of people that go into L&D. I think having that diversity of people coming in is also going to bring some new blood with different perspectives.

Kasper: Okay, I understand that. To sort of bring this to an end, I have two final questions for you. So who is your ultimate learning hero? Somebody we all should look up to?

Stella: Oh, I’ve got so many of them. But recently, I really like Dr. Ruha Benjamin. She’s a professor in, I think, Sociology at Columbia University. And she’s not a learning-focused hero, but she talks about race and technology and how it, you know, shapes the world. And her book is “Race After Technology”. I highly recommend it.

Kasper: Okay, and do you know the title of that book?

Stella: “Race After Technology.”

Kasper: Okay. That’s interesting. Because, well, that basically already touched on my second question: is there a book we should read or a blog we should go to? Would that be the book that you recommend to all of us to read?

Stella: I definitely would.

Kasper: Interesting, I will, because that’s a new one for me. That sounds like an interesting, different angle that can sort of shed new light on things that we’re working on.

Stella: Yeah, I often like to look outside the learning field to inform my practice and my research. I like to bring in different… I think sociology is very relevant. I think psychology is very relevant. I think policy is very relevant. So, you know, I like to look outside of L&D in terms of looking at heroes, aspirations, new learnings. I think that helped push the envelope in our field.

Kasper: Yeah, well that’s a good lesson by itself. And, well, thank you for taking the time to talk to us.

Stella: Of course. Now, I feel like I want to interview you and ask you the same questions.

Kasper: Maybe we can do that another time. That’d be interesting. Yeah.

Stella: Absolutely.

Kasper: Thank you.

Stella: Yep. Thanks for having me.

About the author

Inês Pinto is the content manager at Easygenerator. Originally from Portugal, she grew up in Canada and the US before returning to Europe to complete her university studies. She currently resides in Rotterdam with her husband, daughter, and two dogs.