AI may be taking the world by storm, but its roots date back to the 1950s – long before ChatGPT and other large language models became commonplace. Today, MacEwan students are examining the implications – and opportunities – that come with this technology, and how it impacts our work, school and life.
See how these MacEwan students (a small representation of the more than 300 who presented from across faculties at the university’s biggest-ever Student Research Day) are digging deeper to ask questions about implications, opportunities and risks of AI in areas such as public service, health care and psychology.

Can AI help teach future nurses?
First intrigued by deep learning and anatomical knowledge retention during an anatomy course, Bachelor of Science in Nursing classmates Taij Mann, Kiara Ukrainetz and Sarah Burden approached their prof about looking at changes AI is bringing to education.
Together with Dr. Yuwaraj Narnaware (a.k.a. “Dr. Raj”), a member of MacEwan’s Immersive Learning Institute, the team analyzed the extent and capacity of AI in teaching and learning, specifically in nursing education compared to medicine and allied health disciplines.
“AI will change how educational institutions teach and evaluate students,” says Mann. “Students will need to understand its strengths and limitations to be competent health-care providers in the near future.”
The team’s findings also explored current applications, identified research gaps and highlighted opportunities for advancing nursing education with AI, including within the curriculum at MacEwan.
Their research – which culminated in presentations at Student Research Day and at the Human Anatomy and Physiology Society's annual conference in Pittsburgh – is far from over.
“I do not understand AI nearly as well as I thought. The more I study the topic, the more I respect its complexity,” says Mann.

Helping people face their fears, virtually
A group of computer science students decided to learn through exposure and actually create a virtual reality (VR) application alongside their research project.
“Two of us are psychology minors and hearing about the rise of technology in all sorts of different therapies inspired us,” says Tomer Mazor, Bachelor of Science student.
Mazor and his classmates Tyler Hardy and Ryan-Jay Rosales, under the supervision of Dr. Mahmoud Elsaadany and Dr. Shoukry Shamseldin, assistant professors in the Department of Computer Science, were curious if VR could help in exposure therapy. They chose acrophobia – fear of heights – and began to build, creating a VR app that simulates different heights, allowing individuals to gradually expose themselves to greater heights and reduce their fear response over time.
The addition of a heart rate monitor that communicates with the app lets users watch their body's response to the exposure in real time.
“VR can be more than just a way to play video games,” says Mazor. “It’s possible to use VR to simulate an aspect of real life, and use that simulation to help people in their day-to-day lives.”

What if ChatGPT wrote public policy?
When Bachelor of Arts student Brandon Biglow set out to look at how large language models, like ChatGPT, are being used in the public service, he didn’t realize the path his research would take.
“As I started engaging with public servants and reviewing the literature, the project shifted,” says the Political Science Honours major. “The real story was less about automation and more about ethics, trust and accountability.”
Under the supervision of Associate Professor Dr. Brendan Boyd, Biglow found himself asking fewer questions about what the tools looked like and more questions about what public servants thought about the tools.
“It gave me a bottom-up perspective – one grounded in real experiences.”
With funding from an Undergraduate Student Research Initiative (USRI) grant, Biglow was able to engage with even more public servants.
“AI is already embedded in the public sector, and it’s reshaping how government functions from the inside out. That raises big questions – about transparency, using emerging technologies responsibly and what it means to keep public service human in an age of machine-generated decisions,” he says. “This project is about making sure we don’t just adopt technology because we can, but because we understand its impact and are prepared to use it responsibly.”

Can students practice nursing using VR?
It may seem odd to see a computer science student interested in developing games take on a research project about nursing.
But it actually makes perfect sense once you learn that Jehdi Aizon’s project is a virtual reality simulator to help first- and second-year nursing students learn.
Working alongside Dr. Sam Qorbani, who had been speaking with assistant professor Melanie Neumeier about potential cost-effective ways to engage and train nursing students, Aizon decided to try to come up with a solution. She created a virtual reality simulator for an infusion pump.
“First- and second-year nursing students can train on medical machines in a portable and accessible way,” says Aizon. “They can improve their efficiency and understanding even before they get to use a physical machine.”
Aizon says that research like this can innovate and improve current training methods for nursing students. “It can increase engagement, information retention and is a budget-friendly solution.”
And while she understands research that is so niche might not appeal to a broad audience, to students considering research, she says, “No matter how little or big your hypothesis was proved, as long as you learned something, it is worth doing.”

Can training help humans identify real from AI-generated images?
Cadence Mutch, Deyan Vulkov and Lindsay Downs first thought about detecting AI-generated images in a final group project for Dr. Michelle Jarick’s brain and cognition course.
“We wanted to know how people decide what’s real or fake and if they can get better at doing that,” says Mutch. Working with Dr. Jarick, the three Psychology majors delved into belief bias, educational training interventions and eye-tracking technology.
After discovering a gap in the research involving how people visually inspect images, the students used eye-tracking technology to explore gaze patterns during fake image detection tasks.
“We tested how well people could tell the difference between real and AI-generated images,” says Mutch. “Then we gave some participants a short training video showing common AI mistakes.”
The team found that education, attention patterns and even personal beliefs affect how people judge whether an image is real or not.
“Together, our research paints a bigger picture about how people interact with AI-generated content – and how psychology can help us better understand (and maybe even combat) the spread of misinformation online.”

Super-fast, wearable technology that doesn’t get kinked up
When a group of computer science students embarked on a research project under the supervision of Dr. Mohammed Elmorsy, they wanted to look into the future of wearable, wireless technology.
After one student co-researcher, along with Dr. Mahmoud Elsaadany and Dr. Shoukry Shamseldin, assistant professors in the Department of Computer Science, ran successful simulations with a thin, flexible antenna structure called a Printed Ridge Gap Waveguide, they called Matthew Kostawich to the table to help pull together their findings.
“Even when wrapped or bent, the material was still able to transmit, allowing gadgets to communicate reliably,” says Kostawich. “This means future wearable technology, like virtual reality headsets or advanced medical monitors, can be flexible, reliable and perform well – even if they get bent.”
And thanks to a USRI grant, Kostawich was able to share these findings virtually at ICECE 2024, the XVIII International Conference on Electronics, Information and Communication Engineering, where he also earned the Best Performance award.
But research for the sake of research wasn’t the only takeaway for Kostawich. “Presenting this research at an international conference and winning an award taught me the value of clearly communicating complex ideas to diverse audiences.”

Is research a game?
Not all research projects are fun and games, but Keanu Burr’s was.
“I chose to minor in Classics and major in Computer Science in the Gaming stream, so I knew that I wanted to create, code and produce a video game that uses ancient history as inspiration,” says Burr, who based his persuasive game designed to raise awareness about healthy eating on Homer’s The Iliad.
“Endless runners are mildly popular and represent a suitable coding challenge,” he says.
Under the supervision of computer science prof Dr. Chinenye Ndulue, Burr did tutorials on the widely used video game engine Unity, while teaching himself the other skills he needed.
“The project specifically involved coding the player's movement, physics, and interaction with objects such as power-ups, as well as using models and textures from a student assets pack to create a number of different levels to play through.”
“There are so many aspects to game design. Technical knowledge with coding, logical thinking for puzzles and challenges, creativity for design, and a drive and passion to see the project through to the end,” says Burr. “I learned a huge amount about game engines and the process of week-to-week game development.”