In her research, Assistant Professor Anastasia Kuzminykh explores the role of communication context in how we share and perceive digital information. Recent research topics have included parents' sharing of information about their young children online, suggestions for improving people's attention in online meetings, and human users' relationships with virtual assistants like Apple's Siri and Amazon's Alexa, including our tendency to anthropomorphize them.
The way an agent like Siri or Alexa is perceived, says Kuzminykh, “impacts how it’s accepted and how people interact with it, how much people trust it, how much people talk to it, and the way people talk to it … People start attributing lifelike quality and traits to these agents.”
Kuzminykh, whose expertise is in human-computer interaction and who joined the Faculty of Information this term, carried out the virtual assistant study with a research team while at the University of Waterloo, where she completed her PhD.
Participants interacted with Siri, Alexa and Google's unnamed assistant, and were then interviewed by researchers about their perceptions. Overall, Alexa was perceived to be older, while Siri, says Kuzminykh, was seen to be "using her sass as a defense mechanism." Google's assistant landed in the middle, described as "more professional" than the other two. The identified differences were found to be consistent among users, as were the visual portraits they created using a software program.
With an academic background in psychology and ethnography before she switched to computer science, Kuzminykh says she has always been fascinated by how people communicate. After first researching human-to-human communications, her focus is now on communications between humans and machines and systems design. Among other things, she explores “how much we bring from human-to-human interaction to human-to-machine interaction.”
In two other recent papers, co-authored with the Microsoft researcher Sean Rintel, Kuzminykh explores people's attentiveness in video conferences to try to understand how to make online meetings more comfortable and effective. Kuzminykh says the goal was to focus on the more procedural aspects of attention as opposed to the cognitive aspects, which have been well studied.
Using the framework of "attention as action," she showed the "multilayered nature of attention" and identified possible avenues of technological support. For example, when people can't signal as they would in person, it is harder for them to manage their social interactions and to participate appropriately in the meeting dynamics. Online, speakers often find it hard to avoid looking at just one thing: "In person, I make sure that I'm not just talking to one person or one direction in the room. You need to switch focus, talk to everyone. That's not something I can control remotely – I'm only looking at whatever I can see," said one participant in the study.
"We need help seeing into that room," said Kuzminykh, who believes artificial intelligence could be used to develop new features to augment human vision in meetings. Then speakers could, for example, zoom to where everyone is looking rather than focusing on what she calls static "Brady Bunch" style walls of video.
In a paper to be published later this year, Kuzminykh will delve into the topic of parents' strategies for sharing information about their children online. "'Sharenting,' as it's called, is very complicated legally and socially," says Kuzminykh. "It's a very polarizing topic," raising questions about who has the right to share and about intended communication contexts.
This term, Kuzminykh, who was hired through a completely online process as a result of the pandemic, is teaching INF2169: User-Centred Information Systems Development, which looks at different research methods in user-centred design and how to synthesize them in order to inform the design of system features. "It's a project-based course with each group working on their own topic," says Kuzminykh. "I'm a big advocate of owning your project."