Connections with Evan Dawson
How locally developed AI tools are helping parents of Deaf children learn ASL
1/24/2025 | 52m 10s | Video has Closed Captions
The development of an AI-powered, augmented reality tool to help parents of Deaf children learn ASL
University of Rochester researchers Zhen Bai and Wyatte Hall are developing AI-powered, augmented reality tools to help parents of Deaf children learn ASL. They’ve been working with the Deaf community to develop and test the software, including early collaborations with the Rochester School for the Deaf. Guest host Sarah Abbamonte explores the new technology and its implications with the team.
- From WXXI News, this is "Connections."
I'm Sarah Murphy Abbamonte filling in for Evan Dawson.
(upbeat music) Our connection this hour was made at a new faculty orientation breakfast when Dr. Zhen Bai, an Assistant Professor of Computer Science at the University of Rochester and the Biggar Family Fellow in Data Science at The Goergen Institute, met Dr. Wyatte Hall, a deaf researcher and Assistant Professor at the University of Rochester Medical Center's Department of Public Health Services.
Both share an interest in childhood development and learning.
The first five years of a child's life are crucial for language development, shaping cognitive abilities, enhancing communication skills, and supporting social interactions.
Within the deaf community, 90% of deaf children are born to hearing parents.
Their child may be the first deaf person a parent ever meets.
Supporting hearing parents to gain fluency in American Sign Language or ASL can have a significant benefit on language development for deaf and hard of hearing children.
To address this, Dr. Bai and Dr. Hall are collaborating on a program called the Tabletop Interactive Play System or TIPS.
Over the next hour, we'll learn more from both of them about TIPS, how it works, the development process, and where it's headed.
We're also joined by Karen Fisher-Malley.
Karen is the Director of Early Childhood Programs and Kindergarten at the Rochester School for the Deaf.
She worked with the TIPS research team on its development and helped recruit participants for an initial interview study in 2023.
Also, joining us in studio is our ASL interpreter, Byron Behm.
As always, we invite your feedback during the program by calling 1-844-295-TALK.
That's 1-844-295-8255.
You can also email us at connections@wxxi.org or leave a comment on our YouTube feed.
Zhen, Wyatte, Karen, and Byron welcome.
Since we're now live streaming daily to YouTube, I'd like to ask each of our guests to introduce themselves to our audience and provide a short visual description.
I'll start.
I am a cis woman in my early forties with an olive complexion and short, dark brown hair.
Today, I am wearing a white sweater and black jeans with glasses.
Dr. Bai, if we could start with you, and we'll work our way around the studio.
- Okay.
Hello, everyone.
Thank you for having us.
So I'm a thin Asian woman.
And today, I'm wearing a dark gray dress and a vest.
- [Sarah] Thank you.
Dr. Hall?
- Good morning, Dr. Wyatte Hall.
I'm wearing a floral blue shirt with a gray vest.
Brown hair.
In my late thirties.
- [Sarah] And Karen?
Hi, I'm Karen Fisher-Malley, and I am in my forties.
I am wearing, I guess it would be a mustard sweater with glasses and dark long hair.
- Wonderful, and Byron?
- Hi, I am Byron.
I'm a middle aged white individual with a black long sleeve shirt.
- Wonderful, thank you.
I'll start with either Dr. Bai or Dr. Hall.
Would you like to get us started describing TIPS, what it is, how it works?
and we'll talk also about kind of some of the problems and the inspiration for why you're working on this project.
- Yeah, sure.
I guess I can start first.
So the TIPS system, it's a smart technology that helps people to interact, particularly helping hearing parents better interact in American Sign Language with their DHH child.
And kind of imagine, you know, the parent and the child is at the, you know, dinner table, and they brought up some toys on the table and they were just, you know, playing.
And, you know, the parent like playing with a goldfish, maybe goldfish in a pan, and they're telling a story.
But, you know, just, it sounds really new to him, and he has the desire to express, to show those wonderful things in front of his child.
But he's, you know, run short of knowledge.
So he may just look it up online, ASL dictionary, about how to sign goldfish, and then try to sign it.
So there are technology available here, but as you can see, the moment might be already gone.
You know, the kids might be already moving on, look at something else, you know, playing with other toys.
So I think the main interesting aspect and also challenge for the TIPS system is that we are trying to really help parents be in the moment.
And in the moment that they wanna express something, they wanna share something with their kid, they can quickly get access to the relevant ASL sign, and then they can reproduce it right in the moment.
So that's kind of in a nutshell, the technology.
I can tell more about the technical detail, but before that, I wonder if Wyatte would like to chime in, introduce anything about like the background.
- Sure.
So I think the inspiration for this project came from what you just mentioned, is that most deaf children are born to hearing families.
And like you had said, the first deaf child, for many families, that they've ever met is their own deaf child.
And so they're often at a loss.
"What do I do?"
You know, we wanna, you know, make the best of this.
And typically that comes from what the doctor recommends.
But what we've seen over the years is that there's a tremendous impact on the child when they don't have access to language.
And every day the child's missing access, that has prolonged impact on their development.
And we can think about different ways to make it easier for parents to connect or bridge that communication access.
You know, if they're going to an appointment or have someone that's providing access, but that only happens once a week, well, what's happening in the times between when the parent is just playing or interacting with their child?
And that's where this type of technology and this system can really help make those connections happen.
So that's where we're really excited to see where this project's headed.
- Yeah, absolutely.
Karen, I wonder if you wanna chime in here talking about the supports in the systems for families with very small children.
- Of course.
So just like Wyatte had mentioned, young deaf children need access to language just as hearing children do, from day one.
And for deaf children, that's a visual language, and, in our country, American Sign Language.
So providing opportunities and resources for families as early as we can is gonna set our deaf and hard of hearing children, babies, on the right track for developing language, which in turn helps social emotional growth, cognitive growth, and just having different resources will provide families a connection that they will be able to, in turn, provide for their child.
- Great, thank you.
Dr. Bai, I wanna turn back to you.
You offered, so lovely, a few minutes ago to kind of walk us through the technology that TIPS uses.
- Yeah, so you can imagine TIPS as a third party observer.
So it has a webcam.
It has a mic that is connected with this so-called speech to text technology.
And with those live-streamed into the system, the system's able to, in real time, analyze the visible objects on the table, like the toys, and the interaction, like the hand interaction with the toys.
At the same time, because it's serving hearing parents, they might, you know, also speak while they're interacting with their child.
So the system's able to listen to the parent speaking.
And so that is kind of the observant perspective.
And at the same time, it will analyze the situation.
Oh, okay, you know, a goldfish is brought up.
There is a goldfish toy.
And let's just quickly look up into the ASL dictionary to bring relevant ASL signs.
And then there is a projector, overhead projector, to project the relevant ASL sign in the video form right in front of the parent.
So this loop is very fast.
So as soon as the parent finishes speaking, maybe with a short delay, the video will be presented so that they can immediately offer the sign that is relevant and still connected with the current interaction they're having with their kid.
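The loop Dr. Bai describes, watch the table, listen to the parent, look up the sign, project the video, can be sketched roughly as below. Everything here is a hypothetical placeholder (stub detectors and a toy dictionary), not the actual TIPS implementation:

```python
# A minimal sketch of one pass of the TIPS observe -> look up -> project
# cycle, as described in the interview. All names are illustrative.

def detect_objects(frame):
    # Stub: the real system runs computer vision on the webcam frame.
    return frame.get("visible_toys", [])

def transcribe(audio):
    # Stub: the real system runs speech-to-text on the parent's microphone.
    return audio.get("speech", "")

def tips_loop(frame, audio, asl_dictionary):
    """Return the sign videos to project for this moment of play."""
    objects = detect_objects(frame)
    words = transcribe(audio).lower().split()
    # Match both spoken words and visible toys against the ASL dictionary,
    # preserving order and dropping duplicates.
    relevant = [w for w in dict.fromkeys(words + objects) if w in asl_dictionary]
    # The real system projects these clips onto the tabletop via overhead projector.
    return [asl_dictionary[w] for w in relevant]

signs = tips_loop(
    {"visible_toys": ["goldfish"]},
    {"speech": "Look a goldfish"},
    {"goldfish": "goldfish_sign.mp4", "look": "look_sign.mp4"},
)
# signs -> ["look_sign.mp4", "goldfish_sign.mp4"]
```

The key constraint from the conversation is latency: the lookup has to complete within a short delay after the parent finishes speaking, or the teachable moment is gone.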
- That's great.
And I know for our audience members you were lovely enough to provide a video, so we were able to see the technology a little bit before the show today.
So I'm wondering if, for those who haven't been able to see the video yet, talk to us about what devices these are on.
Is this an at-home system or do you have to come on campus to use it?
How does that work?
- Yeah, so we started with this tabletop version, which involves a surface.
You can think of it as the table surface or a carpet.
And it has an overhead projector with a camera mounted on top.
And you wear a microphone, basically.
- Okay.
- And then just interact with your kid, just like usual how you, you know, play toys together, yeah.
- Great, great.
And are you looking at making this available in kind of different formats or different technologies, devices?
- Yes, so we have been running initial studies to gather feedback from both hearing parents and also other stakeholders like deaf and hard of hearing parents and, you know, professionals in ASL and early DHH child education.
So we get those feedback and we explore different form factors or, you know, just the form of the display to better offer this non-intrusive interaction.
And we have explored.
So this projection-based display is one of the options.
And we also studied smartwatch, smart glasses, and tablet.
So we got a lot of really useful feedback.
And the projection-based display is one of the highly voted options because people think that it's kind of less distracting when they're interacting.
- Less intrusive.
- Yeah, less intrusive.
The tablet is another one that people think is very practical, and they have the device at home already.
So that's also motivated us.
Now, we have another parallel project called Portable TIPS.
Portable TIPS uses only a tablet, and the tablet system will be able to, you know, monitor the voice input and bring up the relevant sign as the interaction goes.
So to answer your earlier question, so we are looking for parents and their kids to help us test the current working prototype, both the tabletop version, that will be in our lab setting on campus at U of R, and the tablet version.
We are really interested to get feedback about, you know, just with the minimum function, is it useful at all?
And what are the, you know, gaps that parents find that we need to keep working on further to accommodate their day-to-day interaction?
- Mm hmm.
- Yeah.
- Great, thank you.
Dr. Hall, I wonder if you could take us back to kind of the seed in the beginning of this idea, and how did you work through what TIPS was gonna look like in its first or early iterations?
- Yeah, so, really, it really started with Zhen, but, like you had mentioned, we bumped into each other at a faculty orientation.
And faculty orientations can be pretty awkward sometimes.
You have these awkward icebreakers, and you have to start sharing a little bit of your work, and your science, and all that, but we actually hit it off really well because we were at the same table by chance, and I was sharing a little bit of my work, which has been primarily focused on language deprivation in this early stage of life when people don't get access to language, the different cascading effects that happen on development that really exist throughout life.
So I've been really motivated in studying the effects of that and how we can actually prevent those cascading effects.
And Zhen had experience doing similar work with other populations and was interested in technology.
And so we shared at our icebreaker, and then we met a few additional times afterwards.
But what it looked like is that we recognized that it's already hard for parents to find out that they have a deaf child and be told that maybe they need to learn another language that's not a traditional spoken language.
That becomes a really big barrier for some people.
So what are some ways that we can reduce that barrier, those fears, those, you know, anxieties or the initial response of "I don't have time to learn another language?"
Well, can we use technology to help address that?
And so it kind of came out of a spontaneous interaction of like, this is what we're dealing with, can we fix this or can we address that?
And that's where the ideas sparked from there.
- Great, yeah.
And Karen, how, how did you get involved in the process?
When did you come in?
- Well, Wyatte and I have known each other a while and worked on a lot of projects in regards to language deprivation and the impact of early language access for deaf and hard of hearing children.
And somehow I got matched up with Zhen.
And that was in 2019, I believe, or early 2020 when I was invited to go to the University of Rochester with a parent, which was a wonderful experience.
This was a hearing parent that went with me that had a deaf child and was very motivated to learn American Sign Language and follow the trajectory necessary for their child to become fluent, which meant they needed to become fluent as well.
So Zhen welcomed us into her research area, and I met a lot of wonderful people she was working with and we were able to see what her vision was.
And then we didn't touch base again until 2024.
And I was invited again to go and see the work that has happened in the last five years.
And what I appreciate is that you really take into account everyone's perspective, which is important when working with the deaf community because as hearing people, we don't have all the knowledge.
So I've appreciated that you've really worked with so many different people to make sure that it's, you know, valuing the deaf community.
- Yeah, I'd like to add on top of that.
- [Sarah] Please.
- There was the period that, you know... - [Sarah] Yeah, I heard Karen say 2020, and I was like, Ooh.
- But even through that process, Karen has been very supportive to help us reach out to parents and help us recruit participants.
So we conducted an online survey during that, you know, remote period, and we had in-person interviews as well with parents, both hearing parents and deaf and hard of hearing parents, who have a DHH child in this early childhood stage.
So those are really essential support and feedback that we got, that helped us to kind of, you know, have the early touch base with potential users, whether this is the sort of technology that they would find useful or some people might benefit from using.
And I think it started from there.
We spent more time, you know, with the feedback we gathered to help us understand, you know, how to create the user interaction experiences and the right functionality that people appreciate and actually think is useful, not because the technology is there that we can do it, but because it's useful, and also there are things they want.
Maybe the technology's not there yet, and that's what leads to a lot of potential technical innovation that we can particularly focus on.
So that's just something I would like to add on that.
And I think we've been really fortunate to have a lot of support through the deaf community.
Yeah.
- Yeah, I just wanted to add to that.
I think exactly what you shared.
I think this creates that foundation.
And like you had said with the innovations that will come out of this, you know, thinking about, you know, if we get to, you know, 3D holograms as an example, that is a tremendous advantage to visual languages.
So this is the start of where we can go.
And I think, you know, we needed to make that step now, but we're really excited to see where this develops and what will come of it in the future.
- Great, great.
Karen, what supports or resources have been out there for parents prior to the development of TIPS and similar type projects?
- So, as you know, we live in Rochester, and we have a very strong and vibrant deaf community.
So the resources that we might have here might not be resources that are found in other states, other counties.
The resources... With technology, I feel a lot of barriers have been broken down.
So being able to connect online with other deaf and hard of hearing organizations, different schools, and allowing families to be able to find a connection themselves.
Like, "Oh, I have a deaf child."
"Oh, I have a deaf child too."
So the technology has allowed families to make faster connections, but maybe not necessarily all the resources that they need to get that jumpstart.
So, you know, as a people person, a teacher at heart, it's always best to learn a language in person.
But there are barriers that come into play.
I live in a town that has no other deaf person.
I live three hours from a school for the deaf where those types of resources can benefit families and kind of make them not feel alone.
So I think there's definitely been resources out there, but it depends on where you live and your accessibility to them.
- Great.
We are talking with Dr. Zhen Bai, Dr. Wyatte Hall, Karen Fisher-Malley, and Byron Behm on a new technology, the TIPS system that they are developing for hearing parents of deaf children who are looking to gain fluency in ASL.
As always, we invite your feedback during the program.
You can call us at 1-844-295-TALK, 844-295-8255, or email us at connections@wxxi.org.
And if you're listening and watching on our YouTube feed, you can leave a comment there as well.
So we talked a little bit before we got on air about stories.
I'd love to hear from anyone on the panel, if you wanted to share some particular success stories or lessons learned so far in the development process.
- [Zhen] Yeah, I- - I'm just trying- - Get started- - I think being, you know, someone starting the project, not a signer, and it sounds like a completely new language, I think that's something, both the language itself as a visual-spatial language and also the kind of appropriate and ethical way of collaborating with the deaf community.
I think that is something that me, myself, and my team have been growing knowledge of constantly.
And the more we, you know, get in touch and learn things, the more we realize we have to learn.
Yeah.
So that's something I think it's helping us, too.
I think because this is an interesting project, you think about it.
It's targeting hearing parents, but they have potential future members of the deaf community.
They're raising potential members.
And there is just such a different modality of communication in the family and in the extended community, like, you know, in their schools, in their communities, and this heterogeneity of communication.
I think that's something I personally find that it's a lot of really valuable lesson to learn when we are trying to work with different stakeholders that would all benefit or all care for the particular, you know, positive development of DHH child.
So that's something I think personally I find.
And, also, I mean, I felt like I have such a supportive collaboration and consultation group, but personally, I also really want to have the future opportunity to work with students from the deaf community, you know, who on the team would be able to, you know, participate in the research and conduct studies.
We had some successful experiences having a summer student from the Bridges Fellow Program.
So I wish I could have students working with us all year round.
- I was gonna add to that.
You know, I think one of the things that really impressed me from the start, not just when we met, but immediately after, is when we started, you know, sharing these ideas Zhen always would look to the community.
And I think this is a little bit about the historical context, and this is really unfortunate, but oftentimes hearing people will see the issues or say, "Oh, here's an issue with the deaf community."
They'll take an idea and just try to run with it, but not actually have any deaf people on board.
And the end product doesn't actually directly benefit the deaf community.
It started with the best of intentions, but it didn't have that collaboration with the community from the start.
And we've seen that happen over and over again.
But with Zhen, we started to get the community on board from the start, and that's how connections with Karen happened and with the community, and it was really organic.
I think that was one of the key ingredients to build that stronger foundation from the start.
And I think if we didn't have that, that wouldn't get us to where we're at now.
So that spirit of inclusion, of bringing people on board, if you're, you know, working with the community, making sure the community stays on board, that's really important from the start.
- Yeah, absolutely.
And, and Zhen, you bring up a good point about the difference between ASL and, say, a spoken language in that, you know, I think there's a lot of misconceptions about that.
And then Dr. Hall, I don't know if you wanna speak to that, but that it is, you know, it's a fully embodied language.
And that these are things that have to be taken into consideration.
I know we have to take it into consideration here just to do this program, as we've discussed.
So, and then on top of that, there's also, to Karen's point, different sign languages, ASL being one of them.
And within ASL, just like any language, there's dialects and nuances regionally.
How is your technology kind of combating those differences or including those differences?
- So, to start, I mean, first of all, we need a couple hours to get to that.
(panelists laughing) - I always find that with these programs.
- Yeah, for sure.
So I think that you bring up such an interesting point, and that could be a whole talk in itself, but American Sign Language is one of many sign languages.
I would say that, you know, each country would have their own form of sign language.
Like, British Sign Language is so different- - Very different.
- Than American Sign Language.
If I was talking to a deaf individual in England, I would really struggle to find common connections.
And even though we share written English, our sign languages are very different.
So American Sign Language is its own language, and I see that being a common misconception, you know, from the start.
It's not a visual form of English.
It's not just gesturing.
You know, the ASL grammar takes into account facial expressions, movements, you know, hand shapes.
And so ironically enough, I think that it's a lot more efficient than English, but I'm a little biased, but I think, you know- - No, that's a great way to look at it, yeah.
- Yeah, English is very linear.
You know, you have to get to the full sentence, but ASL, you can hold information in space and quickly reference it again later, so you can get to your point faster at times.
So sometimes even, I don't know if the viewers on the show can see, I could be done signing, but Byron the interpreter is still talking because he's not done with the English linear sentence that he's trying to get to.
But the point is, that's where, for some individuals who are learning ASL, it's not as simple as acquiring another spoken language if you don't have context or, you know, some sort of familiarity with visual languages.
So I think this type of technology would encourage individuals to grow with their child.
I would imagine that some hearing parents, if they find out that they have a deaf child, they might feel like, "I'll never be good enough to be able to be fluent in signing," and drop American Sign Language completely.
But if we make parents more comfortable with something that could be difficult, but they can have a little bit more ease picking up ASL, that has lasting impacts on that child's development.
- Yeah, and Karen, I imagine that's also a consideration at RSD.
- Yes.
So we work really hard to welcome families as soon as they enter our program, and feel comfortable, and make it a safe place because when you have a deaf child, and they come to a deaf school, it becomes a family school.
So it's imperative that everyone feels at ease in learning.
And learning anything new at any time in life can be very overwhelming.
It's much easier for a young child that's deaf to pick up American Sign Language because it's that window of opportunity.
Their brain is just ready for it.
As we become older, it becomes more challenging.
So just creating a place, a space where families can feel at ease.
But then also we only get to see them an hour a week or do a home visit an hour a week.
If you have a child, you know they are busy all the time, all day, at night sometimes.
And we need to be able to communicate with them.
We need to be able to say, "We need to change our diaper," or "Oh, are you hungry?"
and give them that language.
And unfortunately you don't have a deaf person 24-7 right there giving you the signs.
So being able to provide a tool after they've started making those connections in the deaf community can really potentially support them in those moments of, "Oh, the water's hot in the bath, I need to tell you that, I need to give you that language."
- Sure.
- So I think that, you know, it's scaffolding, it's really building together what our deaf community and deaf individuals can provide for the families, but then also, hopefully, creating a tool in those moments that we don't have someone that we can quickly call or tap on the shoulder and ask for help.
- Absolutely, and I think, you know, our conversation last week with Dr. Buckley from NTID and a couple of other folks, what we're seeing in the news with just emergency preparedness and disaster planning that has excluded all sorts of people with disabilities, but particularly the deaf and hard of hearing community, and how, as you say, even just water in a bathtub being hot, like we need to communicate things in an expedient and efficient manner.
So that's great.
Thank you so much.
- Sure.
- We're gonna take a short break.
When we come back, we'll have more from our guests.
We'll be right back.
(upbeat music) - [Megan] I'm Megan Mack.
Coming up in our second hour, guest host Sarah Abbamonte continues the conversation about inclusion by discussing inclusion in the hospitality sector.
Travel went down during the COVID pandemic, but it's back up, and so is visitor spending.
But travelers with disabilities need to make different accommodations.
We talk with local people in the hospitality sector about how they are making the environment more inclusive.
That's next hour.
(upbeat music plays out) - [Announcer] Support for your public radio station comes from our members, and from Great Tie Consulting, providing management consulting services to help support not-for-profit organizations.
Services include C-suite staffing support, executive coaching, and business and fund development.
More at greattieconsulting.com. And Ralph Honda, providing guidance throughout the car buying experience for more than 50 years, working to help individuals and families find a vehicle for life's journey, whatever the mile marker.
Online at ralphhonda.com.
- This is "Connections."
I'm Sarah Murphy Abbamonte sitting in for Evan Dawson.
Today, we're talking to local researchers on developing AI tools to assist hearing parents of deaf children to learn American Sign Language more efficiently.
I wanna welcome back my guests, and also make sure that you, our audience, know you are always welcome to talk back to us.
1-844-295-TALK.
That's 1-844-295-8255.
You can also email us at connections@wxxi.org or leave a comment on our YouTube feed.
And I do just wanna give a note to CY, who wrote in to us via email about the regional differences, which we had talked about just before the break.
But thank you so much, CY, for listening and for writing in today.
I wanna kind of pick up where we left off and talk about the actual technology that goes into the TIPS system.
So we've mentioned that it's AI, and there are always conversations to be had around artificial intelligence.
For the technology, is it native AI?
Is it cloud-based?
How is that?
- Great, so we host the AI system in our lab.
So it's not connected into the cloud, but we do, yeah, so like for example, the speech-to-text component, we used the technology called Whisper.
So it can be hosted right now.
We host it on our lab machine so that, you know, the data stays in the lab, so we can make sure that it's secure.
And there is a component where we use a large language model.
So that part is in the cloud right now.
What it does is that it finds...
It's not translating.
It will map the input text-based sentence into what we call Gloss, which is a labeling system, I would say.
So it's not a translation system.
- [Sarah] Okay.
- It's a labeling system that helps to label the corresponding ASL sign mapped to the spoken word.
And we use that, then, to retrieve from the ASL dictionary to identify the right video that aligns with the parent's expression.
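The labeling step described here, an English sentence mapped to gloss labels that then key into a sign dictionary, could be sketched as follows. The gloss table and function names are invented for illustration; per the interview, the real system uses a cloud-hosted large language model for this mapping rather than a lookup table:

```python
# A hedged sketch of text-to-gloss labeling followed by dictionary retrieval.
# Real ASL glossing is far richer than this: ASL grammar reorders and drops
# words rather than mapping English one-to-one, which is why the speakers
# stress it is a labeling system, not a translation system.

GLOSS = {"fish": "FISH", "goldfish": "FISH-GOLD", "play": "PLAY"}  # toy table

def text_to_glosses(sentence):
    """Label spoken words with corresponding ASL gloss labels."""
    return [GLOSS[w] for w in sentence.lower().split() if w in GLOSS]

def retrieve_videos(glosses, dictionary):
    """Fetch the ASL dictionary video clip for each gloss label."""
    return {g: dictionary[g] for g in glosses if g in dictionary}

videos = retrieve_videos(
    text_to_glosses("Let's play with the goldfish"),
    {"PLAY": "play.mp4", "FISH-GOLD": "goldfish.mp4"},
)
# videos -> {"PLAY": "play.mp4", "FISH-GOLD": "goldfish.mp4"}
```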
At the same time, we also have...
So in addition to this one-to-one mapping, we also look into some smarter mapping techniques.
So, you know, sometimes you've already received the same sign as you're, you know, interacting with your kid.
So we want the system to be smarter, to almost kind of read the situation, read your experience better, so it will minimize what's recommended to you.
Only the essential ones will be recommended to you.
Or some parents, based on our chat with them, are also interested in... "Okay, you know, I'm also interested in a more diverse range of recommendations.
Maybe something that I did not really bring up myself, but it's relevant to the situation."
Like if, you know, we are talking about the shape or the size or the situation of the play, parents are, you know, open to getting additional signs relevant to the current situation.
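The recommendation behavior described, suppressing signs the parent has already been shown and optionally surfacing related signs beyond what was literally said, might look something like this sketch. The related-signs table is a hypothetical stand-in for whatever the real system learns from the interaction history:

```python
# A sketch of sign recommendation: filter out already-seen signs, and
# optionally widen the suggestions with situation-relevant related signs
# (shape, size, color). Illustrative only, not the TIPS implementation.

RELATED = {"goldfish": ["orange", "small", "swim"]}  # hypothetical table

def recommend(candidates, already_shown, include_related=False):
    """Trim candidate signs down to the essential, unseen ones."""
    picks = [s for s in candidates if s not in already_shown]
    if include_related:
        # Widen with related signs the parent didn't bring up themselves.
        for s in candidates:
            picks += [r for r in RELATED.get(s, []) if r not in already_shown]
    return picks

# First session: "goldfish" is new, so it is recommended.
first = recommend(["goldfish"], already_shown=set())
# Later: the parent has seen it, so only related signs are suggested.
later = recommend(["goldfish"], {"goldfish"}, include_related=True)
```

This is the sense in which the system "grows" with the family: what counts as essential depends on each parent's own history with the vocabulary.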
- So it's learning just like the Netflix algorithm learns what we wanna see in our queue.
- You can think about it that way.
- [Sarah] At a very simplistic- - That's actually the main reason why we want to study the at-home usage of such technology.
I think just like what we brought of the, you know, the regional differences, the language differences, another difference, and a huge one, is just individual differences.
If you have one parent-child interaction, you have that one parent-child interaction.
The day-to-day interaction, the vocabulary, and their learning trajectory are all different.
So that's why we want to have technology possibly just grow as the parent and the child grow.
- And I would think too, kind of the formality, informality of being in a public or lab setting versus being at home would also contribute to that.
Yeah, in a lab setting, you can consider it, we have more control, we can have more channels of human-human interaction captured so that we can analyze, we can have a higher fidelity of understanding of what happened between the parent and child.
That's not possible when you're thinking about the at-home environment, right?
You can't have all the cameras following you all the time.
So, you know, that's a very pragmatic challenge.
And so we want to minimize the sensing channels, but at the same time still provide satisfactory, or at least useful, functionality that can be used on a day-to-day basis.
Yeah.
- Thank you, and we've talked a lot about, you know, the parent-child relationship.
Are you working with whole families?
Are you working with siblings, with grandparents, or is it just strictly parent-child at this point?
- That's a great question.
So right now, the two studies that I mentioned, with parent and child interacting using our system, those are, yeah, strictly parent and child.
But we do present our technology at public events whenever we can.
We participated in, for example, the Deaf Day of Play at Strong Museum.
And often, you know, it's the whole family that gets involved in those events.
So I think if the technology is successful between parent and child, I can imagine it could also be useful for other close family members and even, you know, extended community members, yeah.
- How about with Rochester School of the Deaf, Karen?
Do you guys take a whole family approach?
How does that work?
- Yes, we do take a whole-family approach.
As the saying goes, it takes a village to raise a child, and a child is not always with their parents.
They might be with a grandparent, or a babysitter, or an aunt, or a sibling.
So at Rochester School for the Deaf, when we host events, all the family is welcome.
We have what's called a pint group on Friday.
I just left it before coming here, and grandparents are there and babysitters are there.
We're really welcoming the whole community and learning the language together.
And one really exciting thing we did this year was implementing Sibling American Sign Language classes, because siblings are always communicating with each other, and being able to start sharing a language with one another is very important.
So we often use the word "parents," I think, but we have to start using the term "family" more, because family is just, you know, different for everybody.
But we definitely have a wonderful group of families that join us on Fridays.
- Wonderful.
Oh I'm glad to hear that, yeah.
Awesome.
What are the plans for the future?
Where are we taking TIPS next?
I know, Zhen, you had mentioned wanting to recruit for additional studies.
- Yeah.
- If you wanna share some information on that.
- Yeah, so our next stage, we're preparing for two user studies.
One is an in-lab study inviting parent and child to use the tabletop version of the system in the lab setup.
And in the other study, we want to bring this portable TIPS system into people's homes.
And from there, we want to be able to run somewhat longer sessions, maybe 15 to 30 minutes per day, for however many days will work with the family, so that we get a more in-depth, in-context understanding of the usage, and the needs and challenges that they face, yeah.
- Yeah, I was gonna add, for what you just shared, we need to grow our financial capital to be able to do that.
And right now, we've applied for grants from places like the National Science Foundation.
Different grants would help us expand not just the research, but our capacity to reach other families, so that we can learn more about how to best fit their needs.
And in order to do that, we need, you know, financial resources as well.
- Absolutely.
How does it work?
So you're in different departments at U of R. You're both at U of R. We've talked a lot about the tech, but I'd love to hear from the public health side.
You know, how does that work within the other work that you're doing or within the remit that you have for your center?
- Yeah, so my work focuses primarily on public health, language acquisition, and language deprivation as a public health crisis, as a public health issue.
This project falls under that larger umbrella of work.
- And I'm at the University of Rochester Medical Center, and Zhen's on the University of Rochester River Campus. Even though we're part of the same institution, we're working to bridge more collaborative efforts between them, and this project has always been a part of my lab.
So I always feel like it's an extension of that, and that's what we're hoping to continue.
- Great, and from the public health perspective, do you see this technology branching out into other areas?
I know we've talked a bit in the past about, you know, first responders or emergency preparedness.
Do you see it as a training tool for those folks?
- For sure.
I think, as far as this technology goes, I can definitely see its use cases helping people who are interested in learning ASL, like Karen mentioned, people in different areas, different counties that might not have local resources.
And it's, again, not just parents who are learning sign language, but you can think about first responders, or think about other individuals who want to develop basic communication tools in American Sign Language and just want to interact with this technology.
So often we see technology developed for one use case that actually benefits so many other use cases.
And we can definitely see that this can follow in the same pattern.
- Yeah, absolutely.
It definitely can; that's very apparent.
I wanna bring in another email from CY.
"What about individuals with fine motor skills disabilities, or who might be missing a limb, or a hand, or even just a finger or two, and would struggle with sign language?"
- Yeah, so I don't have a specific answer, but I know there are deaf individuals with physical disabilities.
I mean, some individuals with CP still use sign language for communication, and it doesn't always look the same as other sign language users', but they develop their own way of signing.
I've also seen people with other physical disabilities. I guess any individual has a way to use sign language; it can really be tailored to the individual.
- Zhen or Karen, if you wanted to add anything.
- Yeah, I think that's a very interesting topic, and I would love to learn more about the background. Maybe that would help us think about, in the long run, how the technology can generalize to other individuals and community members, yeah.
- And I'm thinking too of the deaf-blind community, and using tactile sign language.
Do you see that as a potential research area?
- Yeah, I think in terms of deaf-blind community, absolutely, yes.
But there's a lot that depends on other technologies as well.
So for tactile sign language, for example, we'd need additional resources that provide interactive tactile feedback.
And I think of this as a core technology: once we get the AI nailed down, not just the speech-to-text but the sign language component, then we can integrate it with other technologies that would really help serve the deaf-blind community.
I definitely think that's possible.
- Yeah, one thing, thinking about the benefit to other communities: one aspect of this parent-child communication that we didn't get much chance to talk about, beyond the communication modality itself, is joint attention.
- [Sarah] Mm, yes.
- So it's not only the language itself, but also the cues you pick up, like second nature, to understand people's attention.
Often parents may not pay close attention to where the child's visual attention is, and even if they sign in front of the child, if the child isn't paying attention, that wouldn't work.
So that's another line of work in parallel with the content of the interaction: we also want to look at the attention aspect.
From my prior experience working with children with autism, who also have room for improvement in joint attention, maybe there's some potential in that domain to cross paths with that community.
- Yeah, you brought up a really good point, and it made me think of another example.
Sometimes parents or individuals, they don't always understand that the deaf child is not simply a hearing child that can't hear.
And how they process the world and how they interact with the world is gonna be very different.
And so your joint attention looks different for deaf children compared to hearing children.
So, you know, if you think about the incidental learning, like, if a hearing child is walking around and they're not necessarily paying attention to a specific thing, but they're getting a lot of audio input, that incidental learning is not just going to, you know, translate well for a deaf child.
So a lot of deaf kids miss that auditory input.
So understanding joint attention is a really important part of the research process.
- Yeah.
- Yeah, that's a really good point.
Karen, is there anything you wanted to add?
- No, I agree that joint attention and making sure the language is accessible, which means is the child looking, is the child attending, is always an important thing, especially when you're working with very young children.
Their interests or what you might be trying to achieve might not be something that they are wanting to do.
So really having that in the back of your mind that, you know, we can try to lay out the perfect plan, but children are pretty incredible, and they let us know what they're interested or not interested in and what they would like to attend to or not attend to.
We see that every day.
But that joint attention is a very, very important part of language development for deaf and hard of hearing children.
- Yeah, thank you.
Wyatte, your comment earlier about the different technologies and having to look to other sources of research and development to move TIPS forward in certain ways, are there folks who are doing that kind of research?
Are you in contact with them?
- So my research focuses in that public health sector.
So Zhen really is the only person I'm working with that's really tackling the technology side.
And quite honestly that's like a little bit of a stretch for me 'cause I don't have that technology background.
But we are partnering with deaf schools.
We have a project in California that focuses on early intervention for the state of California.
And then there's other areas that people are trying to, you know, address that can all help in some way.
And we see that technology has a public health impact; it's ingrained in our lives.
And that's gonna continue to develop.
And that's where I think, you know, this project will continue to support that.
- Okay, great.
And we are getting close to the end of our hour here.
I agree.
We could have gone on for many, many more hours on this topic of conversation.
There are many veins that we could have gone down.
Quickly, if folks wanna learn more about your work on TIPS, if they wanna be included in future studies, how do they get in contact with you?
- Yeah, absolutely.
So we have an online contact form.
Maybe the program host could help post it on the page of today's episode.
Anybody interested, parents, ASL instructors, early-education teachers for DHH children, all are welcome to just, you know, fill in the form.
It takes like one minute, and then we'll follow up to keep you posted on our upcoming studies, yeah.
- And if students are interested in working with you?
- Absolutely, absolutely.
Just search my name and, yeah, University of Rochester.
You will find my email address, our website.
- Same for me too.
- Yeah, reach out.
I mean, it's not only for the studies.
It could just, you know, simply be for a chat and some exchange of thoughts.
You know, the most inspiring ideas often come out of a casual chat.
- Great, thank you so much.
- Yeah.
- [Sarah] And Karen, folks can find Rochester School of the Deaf... - On our website.
Drive by on St. Paul Street.
Facebook.
We're here.
- Wonderful.
Thank you to my guests Zhen, Wyatte, Karen, and Byron.
I'm Sarah Murphy Abbamonte sitting in for Evan Dawson.
Thank you to our executive producer, Megan Mack, Julie Williams, Rob Braden, and the whole "Connections" team.
And thank you for listening to member-supported public radio.
(upbeat music) (upbeat music continues) ♪ Ho ♪ (upbeat music plays out) - [Announcer] This program is a production of WXXI Public Radio.
The views expressed do not necessarily represent those of this station, its staff, management, or underwriters.
The broadcast is meant for the private use of our audience.
Any rebroadcast or use in another medium without expressed written consent of WXXI is strictly prohibited.
"Connections" with Evan Dawson is available as a podcast.
Just click on the connections link at wxxinews.org.