Grace Fuisz, Psychology, '19
As computer-mediated teaching becomes increasingly prevalent, it is important to account for the difference in social experience for learners when their teacher is an avatar or AI system. This essay describes some of the differences between learning in mediated and unmediated contexts, offering suggestions for creating successful digital instructors and future directions for research.
In Punished by Rewards, Alfie Kohn criticizes conditional academic praise, claiming that social rewards can be as enduring and impactful as tangible rewards when it comes to decreasing intrinsic motivation. We want our teachers, our parents, our authority figures to like us, and social rewards exacerbate our need to please others by verbalizing when we have performed adequately. There is essentially no difference between receiving an A or a 'check' on an essay and receiving a confident "good job!" from the teacher.
Either way, we learn that our performance contributes to our value and to being socially accepted.

This, of course, does not mean that teachers should not interact with their students or should avoid feedback altogether. Social interactions, both with peers and with teachers, are incredibly helpful for efficient learning. Collaboration with peers and comfortable, social classroom environments foster confident and deep learning. Respect, perceptions of credibility and competency, and feelings of trust toward a teacher all shape the quality of a student's learning. As grade school students are handed iPads in the classroom and computer-mediated learning environments are integrated into nationwide curricula, one might wonder how the social aspects of education will change.
Social computers
There is overwhelming evidence that computers are treated as social beings. People demonstrate the same in-group/out-group effects with computers as they do with humans, rating computer agents that share their ethnic identity as more attractive, trustworthy, persuasive, and intelligent than computers of a different race (Nass & Moon, 2000). Just as with humans, we are able to recognize the personality of a computer and the similarity of a computer's personality to our own. Submissive people rate a submissive computer as friendlier and more competent than dominant people do, representing another in-group bias (Nass, Moon, Fogg, Reeves, & Dryer, 1995). Even a minimal manipulation (assigning a participant and a computer to the same arbitrary team) is enough to elicit in-group bias behaviors toward the computer (Nass & Moon, 2000).
We apply gender stereotypes to computer agents, rating male-voiced computers as more competent on stereotypically masculine topics (computers, technology) than female-voiced computers; female-voiced computers are rated as more competent on feminine topics such as love and relationships. Furthermore, when a male-voiced computer praised another computer during an evaluation, the evaluated computer was rated as more competent than when the same praise was given by a female-voiced computer (Nass & Moon, 2000). This paradigm, wherein people mindlessly react to computers as social beings, has been termed Computers as Social Actors (CASA) (Nass & Moon, 2000). It does not seem to matter whether participants are aware that the computer is not human; computers are still perceived as social entities and receive the same patterns of cooperation, trust, and socially desirable behavior that are expected in human-human interactions. When participants were asked by a computer agent what they wanted to watch in a waiting room, they were more likely to pick the socially desirable choice (a documentary) over an action movie or television, even though the social influencer was a computer (Kim & Baylor, 2006).
Computers as teachers
The image of a young child playing on an iPad in a stroller, watching videos on the subway, or playing on his mother's iPhone is all too familiar in the age of the internet and smartphones. Many of the apps marketed for children's entertainment promise learning in the form of strategy, mathematics practice, and language exposure. Because people interact with computers as social beings, the importance of social learning environments, of trust in and respect for computers, and of the harm of social rewards should apply to computer-human interactions and learning as well. So how do we create effective computer teachers?
Multimedia representations used for teaching are commonly referred to as pedagogical agents (Atkinson, Mayer, & Merrill, 2005), and range from animated text, to video recordings of lectures, to avatars and artificial intelligence. Current applications of computers as teachers use artificial intelligence for simple content delivery and feedback, mainly in higher education (Popenici & Kerr, 2017). Essentially, it seems that the most effective computer teachers are similar to human teachers and are used to augment (rather than replace) traditional teaching methods. College students are more motivated to learn and give more favorable ratings to an artificial intelligence (AI) tutor with an older voice than to an agent with a young voice. This finding suggests that students place the AI in the stereotypical professor role, associating age with authority and experience (Edwards, Edwards, Stoll, Lin, & Massey, 2018).
Pedagogical agents can personalize interactions with content, which motivates and supports the learner more than static material like a textbook. The addition of a pedagogical agent matters more for the learner's subjective experience and enjoyment than for the student's actual performance, although there is evidence that pedagogical agents can increase transfer success (Krämer & Bente, 2010).
The visual presentation of a pedagogical agent is important for successful learning. Multimedia instruction is more effective for increasing test scores when it includes social cues (a video of a speaking professor instead of a static relevant image) (Töpper, Glaser, & Schwan, 2014). Compared to on-screen text, an animated, speaking pedagogical agent displaying the same content increased performance on transfer tests and interest ratings for the material (Moreno, Mayer, Spires, & Lester, 2001). Similarly, an animated image was rated as more engaging, person-like, credible, and instructor-like than a static image, although animacy did not affect performance directly. Surprisingly, an animated image was not significantly more person-like than on-screen text without an accompanying image, although it was more person-like than a static image (Baylor & Ryu, 2003). Perhaps this effect is due to our current familiarity with texting and e-mail, which normalizes on-screen text as a representation of live human communication. How the avatar is represented visually seems to matter only when the instructor is demonstrating a physical skill. If the pedagogical agent relies on observational learning to demonstrate a physical movement, it is most effective when represented with a human-like body (Krämer & Bente, 2010).
As previously mentioned in the gender stereotyping of computer agents, a computer's voice shapes our perceptions of competency and credibility. Consistent with the importance of animacy to learning, computer voices are most effective in teaching when they are human-like (as opposed to off-the-shelf machine or robotic voices). In a study by Atkinson, Mayer, and Merrill (2005), students performed better on practice problems and had higher rates of both near and far transfer when learning from a pedagogical agent with a human-like voice, compared to one with a machine voice. This effect was consistent even when accounting for the effects of novelty and learning context: the study was replicated in a high school classroom in addition to a lab setting.
The implication of the CASA paradigm is that social rewards coming from a computer would be as detrimental and complicated as those coming from a human. Praise from computers, then, should come only in the forms Kohn suggests: computer-human praise should focus on what people do, as opposed to the quality of their work, to draw focus away from performance goals; praise should be specific; and praise should be genuine (Kohn). An animated pedagogical agent with a thumbs-up and a "good job!" speech bubble is counter-productive to learning and, in theory, decreases intrinsic motivation as much as a grade or a sticker for good behavior. Instead, animated agents could focus on learning achievements: "you used a new word," "you tried a new strategy." With artificial intelligence, specific and unique praise may become more achievable in the near future, as it is difficult to program a simple avatar to provide unconditional and genuine support.
Future directions
Future research should explore the three conditions of harmless praise. As defined by Kohn, praise is less harmful than more tangible rewards when 1) praise is less salient, 2) praise is less controlling, and 3) praise is not promised in advance. These conditions could easily be applied to pedagogical agents by testing intrinsic motivation after a pedagogical agent violates any or all of these rules. For example, to violate the controlling-praise rule, a pedagogical agent could socially reward a learner with: "Congratulations! You met the standard score for students of your age." This statement is controlling because it sets an expectation for the learner, so that if they do not receive similar praise going forward they know they have failed or disappointed the agent. It also violates a socially supportive learning environment by comparing the learner to other students, so that the student feels the need to compete with their peers for success. This aspect of cooperation could also be studied in digital learning environments by testing the effectiveness of computer-mediated learning in cooperative versus competitive peer environments.
References
Atkinson, R. K., Mayer, R. E., & Merrill, M. M. (2005). Fostering social agency in multimedia learning: Examining the impact of an animated agent's voice. Contemporary Educational Psychology, 30, 117-139.

Baylor, A. L., & Ryu, J. (2003). The effects of image and animation in enhancing pedagogical agent persona. Journal of Educational Computing Research, 28(4), 373-394.

Edwards, C., Edwards, A., Stoll, B., Lin, X., & Massey, N. (2018). Evaluations of an artificial intelligence instructor's voice: Social Identity Theory in human-robot interactions. Computers in Human Behavior.

Kohn, A. (1993). Punished by rewards: The trouble with gold stars, incentive plans, A's, praise, and other bribes. Boston: Houghton Mifflin Co.

Krämer, N. C., & Bente, G. (2010). Personalizing e-learning: The social effects of pedagogical agents. Educational Psychology Review, 22(1), 71-87.

Moreno, R., Mayer, R. E., Spires, H. A., & Lester, J. C. (2001). The case for social agency in computer-based teaching: Do students learn more deeply when they interact with animated pedagogical agents? Cognition and Instruction, 19(2), 177-213.

Nass, C., & Moon, Y. (2000). Machines and mindlessness: Social responses to computers. Journal of Social Issues, 56, 81-103.

Nass, C., Moon, Y., Fogg, B. J., Reeves, B., & Dryer, D. C. (1995). Can computer personalities be human personalities? International Journal of Human-Computer Studies, 43, 223-239.

Popenici, S. A. D., & Kerr, S. (2017). Exploring the impact of artificial intelligence on teaching and learning in higher education. Research and Practice in Technology Enhanced Learning, 12(22). https://doi.org/10.1186/s41039-017-0062-8

Sproull, L., Subramani, M., Kiesler, S., Walker, J. H., & Waters, K. (1996). When the interface is a face. Human Computer Interaction, 11(2), 97-124.

Töpper, J., Glaser, M., & Schwan, S. (2014). Extending social cue based principles of multimedia learning beyond their immediate effects. Learning and Instruction, 29, 10-20.

