Laura Petrich, PhD Student in Robotics & Machine Learning – Interview Series


Laura is currently pursuing a Ph.D. in Computing Science under the supervision of Dr. Patrick Pilarski and Dr. Matthew E. Taylor. She received a B.Sc. with Honors in Computing Science from the University of Alberta in 2019 and an M.Sc. in Computing Science from the University of Alberta in 2022. Her research interests include reinforcement learning, human-robot interaction, biomechatronics, and assistive robotics. Drawing inspiration from her anatomical studies with Dr. Pierre Lemelin, Laura aims to develop control methods for robotic manipulation with the goal of increased functionality, usability, reliability, and safety in the real world.

We sat down for an interview at the annual 2023 Upper Bound conference on AI, held in Edmonton, AB and hosted by Amii (Alberta Machine Intelligence Institute).

What initially attracted you to computer science?

So, going back to high school, I really wanted to be a health care provider to start with. Work-life balance is really important to me, so I could spend a lot of time with my family. And I've always wanted to simply help people. I ended up taking a computer science class in my first year of university, and it just felt like such an amazing tool that you could use to solve problems. I fell in love with problem solving and decided that being able to work in the space of assistive technology, where I could put that problem solving to use and actually help people in the process, was where I wanted to be.

How did you figure out that assistive technology was your passion?

I took a robotics class in my undergrad, where we used Lego Mindstorms kits to learn the fundamentals of mechatronics and robotics. It all just clicked: it was so much fun to work with, and you could write programs and then immediately see the results. So, it just made sense to merge my love of working with these robotic systems with my desire to help people. Assistive technology fits exactly into that space.

Could you define mechatronics for our audience?

It would be just about anything inside the robotics sphere where you have these hardware systems and you can control them in order to enact change in the environment.

What are some of the different use cases you've worked on for that technology?

Right now, I'm working in the BLINC Lab (Bionic Limbs for Improved Natural Control) with Patrick (Pilarski), and the main use case there is upper limb prosthetics. So, we have these smart robotic devices that you can control through myoelectric signals, and the main use case is how we can control these prosthetic limbs, which are attached to the human body, to do what the user wants.

How long have you been working on that specifically?

I just started my PhD in January. I did my master's in the robotics and computer vision group at the University of Alberta, where I worked on robotic manipulation with robotic arms.

I’m just now venturing into the world of prosthetics.

What are some safety concerns that you would say come with the technology at the moment?

With these smart prosthetic devices, we always keep safety first and foremost. That is always what we think about first, because these devices are attached to a human. So, at the end of the day, the device doesn't have the final say. The human is always in complete control. We can make suggestions to the human, say, "I think that you want to do this," but they always have final control over what happens. So, we're always thinking about the safety of the person who is going to be using these devices.

In a previous interview, Patrick was talking about how normally the brain has to learn to adapt to the device, but in this case the device uses machine learning to adapt to the brain. Could you discuss your views on this?

Yes. So, we want to build continual learning systems that run on these devices. It's all about mapping the signals from the user, which in our case would be EMG signals. With surface EMG, you place electrodes on the person's residual muscles, and then what do you do with those input signals? We want to map them to robotic motion, right? Do you want the hand open? Do you want the hand closed? First we have to decide how we're doing that mapping, and that is where machine learning comes into play.

You can have pattern recognition systems, so we can predict what is happening with a particular muscle activation. We want it to continually learn and adapt over time. Think about your muscles right now: if you were to go to the gym, your muscle shape would change. So, do we have to completely train a new machine learning model? No. We want the device and the machine learning components to adapt to the person over time as they undergo changes or as their intent changes.
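To make that idea concrete, below is a minimal sketch of what an EMG pattern-recognition pipeline with continual adaptation could look like. This is not the BLINC Lab's actual system: the gesture labels, the time-domain features, and the choice of an incrementally trained scikit-learn classifier are all illustrative assumptions.

```python
# Minimal sketch of EMG pattern recognition with continual adaptation.
# Gesture labels, features, and classifier choice are illustrative only.
import numpy as np
from sklearn.linear_model import SGDClassifier

GESTURES = ["rest", "hand_open", "hand_close"]  # assumed label set

def extract_features(window: np.ndarray) -> np.ndarray:
    """Simple time-domain features per EMG channel.

    window: (n_samples, n_channels) array of surface-EMG readings.
    Returns mean absolute value, waveform length, and RMS, concatenated.
    """
    mav = np.mean(np.abs(window), axis=0)
    wl = np.sum(np.abs(np.diff(window, axis=0)), axis=0)
    rms = np.sqrt(np.mean(window ** 2, axis=0))
    return np.concatenate([mav, wl, rms])

# A linear model trained incrementally, so it can keep adapting to the user
# as their signals drift (muscle changes, electrode shift, changing intent).
clf = SGDClassifier(loss="log_loss")

def update(window: np.ndarray, label: str) -> None:
    """One continual-learning step: adapt the model with a labelled window."""
    x = extract_features(window).reshape(1, -1)
    clf.partial_fit(x, [GESTURES.index(label)], classes=np.arange(len(GESTURES)))

def predict(window: np.ndarray) -> str:
    """Map a new EMG window to a suggested prosthesis action."""
    x = extract_features(window).reshape(1, -1)
    return GESTURES[int(clf.predict(x)[0])]
```

The design point that echoes the answer above is the use of incremental updates: rather than retraining a new model from scratch, each labelled window nudges the existing one, so the mapping keeps adapting to the person.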

What kind of timeline do you think it will be until we see these out in the real world?

I hope within my PhD. My goal is to solve this control problem for upper limb prostheses.

That would be amazing. And what is your vision for the future of assistive robotics, for example on a 10-year or 20-year timeline?

I envision a world where individuals who want to have a prosthetic limb for use in their everyday life would be able to use it the way we use our arms. So, to make it reliable, make it intuitive, make it easy to use, that is what I would like.

And do you see a customized future where a smaller person would have a smaller prosthetic? Or do you think they would all be the same?

No, absolutely. I think for something as personal as this, you want these devices to feel like an extension of your own body. Right? You want it to be a part of you. So, that brings us down to personalized and individualized healthcare. These prosthetic limbs would have to be shaped and adapted to the person who is going to be using them, at least in my view. We already see this. Right now, if you go to, say, the Glenrose Rehabilitation Hospital here in Edmonton and you're getting fit for a new prosthesis, they take a 3D model of your residual limb and they personalize the custom fit of your prosthetic limb to you. So, we're already seeing this happen, and for these smart devices, it will just be even more so.

So, it will be custom 3D printed, probably, for the user.

Yeah, custom 3D printed, custom fit, and then custom personalized control systems, also.

And how long does it take for a user to learn how to use one of these systems?

Our goal is that we would be able to train a more generalized machine learning model, and then be able to individualize it to the person within a five- or ten-minute training session. That would be a goal of ours.
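As a rough illustration of that workflow, the sketch below shows how a generalized gesture classifier could be individualized from a short calibration session. The model file name, data shapes, and the assumption of a joblib-saved scikit-learn model supporting incremental updates are hypothetical, not a description of the lab's actual setup.

```python
# Hypothetical personalization step: load a generalized EMG gesture classifier
# and adapt it with a few minutes of the user's own labelled data.
# The model file, data shapes, and pass count are illustrative assumptions.
import joblib
import numpy as np

def calibrate(user_features: np.ndarray,
              user_labels: np.ndarray,
              pretrained_path: str = "general_emg_model.joblib"):
    """Individualize a pretrained classifier with a short calibration session.

    user_features: (n_windows, n_features) feature vectors from the session.
    user_labels:   (n_windows,) integer gesture labels.
    """
    clf = joblib.load(pretrained_path)  # model trained across many users
    # A handful of incremental passes over the small calibration set shifts
    # the decision boundaries toward this user's own signal patterns.
    for _ in range(5):
        clf.partial_fit(user_features, user_labels)
    return clf
```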

Once you've achieved this, what would you like to work on next?

A huge part of my research is also working with the anatomy department at the University of Alberta, and I'm completely fascinated with how we, as humans, manipulate the world around us. So, everything to do with the human upper limb. Our arms and our hands are completely fascinating. I would like to focus all my efforts on upper limb prostheses and really making these devices usable for people in the real world.

In what ways is it more difficult than the lower limbs?

For the lower limb, the motion is very repetitive. If you think of our walking gait, it's an easier control problem to solve. Whereas, for the upper limb, you need to be able to control objects in 3D space, and you have so many more degrees of freedom. If you think about it, when we close our eyes and reach around, we can still see the world around us through our hands. We have these amazing sensory organs that we can use to explore the environment. So, to be able to give that back to the medical community through prosthetic limbs would, I think, be amazing.

Can you share how you draw inspiration from anatomical studies?

I have been working with Dr. Pierre Lemelin in the anatomy department at the University of Alberta for the past five years now. I'm hoping that through the study of human anatomy and understanding exactly how we manipulate objects in the environment, what our nerve pathways are, and what muscles are activated, we can use that knowledge to not only improve, first of all, the structure of prosthetic limbs.

Are there small places in the actual design of the prosthetic limb that we can change, a small tweak in mechanical design, that would give a huge increase in function? But also, the underlying control systems, how we use them. If we can understand exactly which nerves are firing when we're thinking, "Oh, open my hand," then can we use that to predict what the user wants to do with their prosthetic limb, and then enact that motion in the robotic device?
