On 21 May, Adelaide's A11y (accessibility) meetup was joined by Vision Australia's Charlii Parker, Dr. Scott Hollier and Microsoft's Emmanuele 'Manny' Silanesu.
[Title slide: Global Accessibility Awareness Day (GAAD) lunch and learn. The presentation opens visually on a group of four people presented in the Microsoft Teams format]
Cliff Edwards (00:00:05):
Okay. I know everybody's time's precious, so we'll make a start. Hi everyone. As today's meetup originates from Adelaide, it's important to acknowledge that the land we meet on today is the traditional lands of the Kaurna people, and that we respect their spiritual relationship with their country. We also acknowledge the Kaurna people as the custodians of the Adelaide region, and that their cultural and heritage beliefs are still as important to the living Kaurna people today. We also pay respects to the cultural authority of Aboriginal people attending from other areas of Australia. I'd also like to acknowledge that today is Global Accessibility Awareness Day. The purpose of today is to get everyone talking, thinking and learning about digital inclusion. To help celebrate, there's a number of free online events happening throughout the day, from A11y Bytes to Intopia to Vision Australia; you can check them all out on the Global Accessibility Awareness Day website.
Cliff Edwards (00:00:59):
Here today, we're joined by Vision Australia's Charlii Parker. Charlii's here to give a brief overview of what to look forward to at our next meetup. Dr. Scott Hollier is our guest presenter today. Scott will begin with an overview of his life journey as a person who's legally blind and his work with the W3C, and discuss the evolution of assistive technologies. Closing off, Microsoft's Manny Silanesu is here with details on Microsoft's upcoming accessibility webinar series being made available for us all. Before handing over to Charlii, thank you and welcome to everyone who registered today to attend. 123 registrations, those numbers are fantastic and help support the next meetup, and of course highlight and maintain the interest in digital inclusion. Just a reminder that today's meetup is being recorded and will be made available to everyone after the event.
Cliff Edwards (00:01:47):
A couple of housekeeping items, just to help the presenters and cut down on the background noise, we'll be muting everyone by default. If you'd like to ask a question of the presenters, there's a number of ways you can do that. Microsoft Teams has a raise your hand function. This is located in the control bar; for those using tab browsing, if you go past Share and More actions (the three dots), you'll hit Raise your hand. This will give us a visual cue that you want to ask a question.
Cliff Edwards (00:02:13):
Another way is to use chat and show conversation. This is one past Raise your hand. Activating Show conversation will display the chat window. If you don't feel comfortable raising a question in a public forum, I'll be here to provide support. You can reach me in the chat, email me at email@example.com, or phone me direct; I'll pop my details in the chat in a sec. You can of course hang around at the end to ask questions of the presenters and each other. We put an hour and a half down for that, but we probably won't go that long. So now I'd just like to hand over to Charlii from Vision Australia.
Charlii Parker (00:02:49):
Hello and thank you. Happy Global Accessibility Awareness Day everybody. I'm really, really happy to be here at this presentation, and looking forward to hearing what Dr. Scott Hollier has to say. We are holding a GAAD event today as well at Vision Australia, online; there are tickets available through Humanitix and it's available on the Digital Access accessibility events page as well. So, if anybody has some time left this afternoon and wants to attend another event, we have Bruce Maguire speaking, who was the person who brought Australia's first landmark case for digital accessibility, against the Sydney Olympic Games. He was one of the people who first brought digital accessibility into the light. At the next event in a few weeks' time, I'll be talking about the effects of COVID; all of our lives have changed drastically and significantly, so I'll be looking at things that have changed and also what the future might look like when life gets back to what will be the new normal, I guess.
Cliff Edwards (00:04:17):
Thanks Charlii. Now I'd like to welcome today Dr. Scott Hollier as our main presenter. Scott joins us from Western Australia, and over the time we've been developing the online accessibility policy and toolkit, Scott's been a source of expertise and support. Dr. Scott Hollier, everyone.
Dr. Scott Hollier (00:04:39):
That's great. Thank you, Cliff, and thanks so much for the opportunity to present today. It really is a great pleasure and privilege, especially on this Global Accessibility Awareness Day. I'm going to turn my phone off in just a tick so it doesn't ring halfway through the presentation, but before I do, people often ask me, how do I use a mobile device being visually impaired? So I thought, before I do, I'll just quickly hold it up to the camera so you can see it. What I have here is an app called Big Launcher, which presents my phone as six big buttons. This phone isn't anything particularly special; it's just a pretty standard Android phone.
Dr. Scott Hollier (00:05:16):
And that launcher really helps me to be able to use the little sight I have to visually get around it. It's also got the TalkBack screen reader turned on, so if I put my finger on the screen, it'll read things out to me. When I hear what I want, like ABC, I can double tap on that. It will then load up the stories, and I can again put my finger on the screen and it reads these out, and I can use gestures to get back. So that's just a very quick overview of some of the assistive technologies built into everyday devices. I'll just power it off now.
Dr. Scott Hollier (00:05:54):
And the fact that there is all this great built-in technology just makes a huge difference to the opportunities for people with disability. So now I'm going to turn on my own screen reader on the computer. I'm just going to navigate through the buttons and I'll get my screen sharing for the presentation.
Dr. Scott Hollier (00:06:23):
Okay. I'm hopefully sharing out the desktop now, and you'll see a big zoom difference if it's coming through properly. How's that, Cliff, is that coming through?
Cliff Edwards (00:06:34):
Yep, absolutely fine for me, and zoomed like you say, Scott.
Dr. Scott Hollier (00:06:38):
Very good. Okay. I'll just zoom out. And I will turn off my screen now and I think we're good to go.
Dr. Scott Hollier (00:06:46):
Fantastic. Well look, again, many, many thanks for the opportunity to share with you today. It is wonderful to be coming to you from Perth, WA to share this information today on this very special day. So I'm going to be sharing a bit about digital access, both from a personal perspective in my own journey, and then I'm going to be moving into some W3C work relating to the Research Questions Task Force. For those that don't know me, I'm a Digital Access Specialist and the director and co-founder of The Centre for Accessibility Initiative. I've done a bit of work in academia in this space, around research to support people who are blind or vision impaired, and I'm also legally blind myself; I'll explain a little bit more about that as we go along. To begin with, I'd like to play you a video from The Centre for Accessibility, which I think really illustrates the importance of today.
[video from The Centre for Accessibility plays]
Helen Errington (00:07:39):
Since the internet has come into being it's, it's really expanded my access to a range of what goes on out there in the big wide world.
Peter Blockey (00:07:50):
Most websites are good, but having access for the disabled people is often hidden.
Julia Hales (00:07:57):
When I log on to something different that I haven't used before. It is quite hard to understand it.
Hugo Flavelle (00:08:05):
Every website has too much words. I don't like that. Very difficult to find stuff.
Grace King (00:08:19):
If you decide that you want to have a carousel that changes pictures so that my screen reader spits out random stuff while I'm shopping, I'm probably not going to be able to buy your product.
Kyle Quinn (00:08:31):
There's heaps of sites, but they just have to be easy to use.
Helen Errington (00:08:35):
We're a huge section of the public, we're 20% of the population. So if you don't make it accessible, you're missing out on 20%.
Grace King (00:08:44):
Putting a very good text description of anything, a product, or just a picture of something in a news story, it's really helpful because it gives me the information that I need to know.
Julia Hales (00:08:55):
For the writing, it needs to be bigger so you can understand about the language and also the videos as well, that will make it much easier as well.
Hugo Flavelle (00:09:05):
I like pictures on websites.
Peter Blockey (00:09:12):
It's very visual for the deaf community they can understand it better.
Kyle Quinn (00:09:15):
Clear and simple, just clear and simple.
Helen Errington (00:09:22):
You want my cash, you make your website accessible and I'll be there.
Grace King (00:09:27):
This is the best time right now to be a person who's blind. If you follow all of the accessibility guidelines. I'm able to use the website. I'm able to do my banking. I'm able to do my shopping.
Kyle Quinn (00:09:39):
And you don't have to go through heaps of books, you can get information in what, less than a minute.
Helen Errington (00:09:45):
We're not that weird. We're not that strange. We want to, we've got thoughts and hopes and dreams of all people. We want to come into, come into life and be welcomed and included.
With thanks to participants in order of appearance, Helen Errington, Peter Blockey, Julia Hales, Hugo Flavelle, Grace King, Kyle Quinn. Accessibility is about more than compliance; it's about people. Centre for Accessibility, accessibility.org.au.
[video from The Centre for Accessibility ends]
Dr. Scott Hollier (00:10:21):
The Centre for Accessibility is essentially an initiative that's been created to try and advocate and promote the importance of digital access, and like many, we do have a range of services. So if anyone would like any more information about that, please feel free to get in touch.
Dr. Scott Hollier (00:10:33):
So I'd like to just shift the focus a little bit away from the bigger picture and come down to the individual. One of the things that I've noticed, if you are a developer or a designer, is that it might be the case that you do a lot of accessibility work and you have an interest in this area, but you may not always get to see the fruits of your labour. You may not always get to see the great benefit that your hard work and support brings to the lives of people with disabilities. So I want to share a little bit about my personal journey to start with today, just to explain the importance of the support that people provide to people with disability, and just what a difference that does make.
Dr. Scott Hollier (00:11:10):
So to explain a little bit about my journey, I was born in Kalamunda, which is a hills suburb of Perth, and my childhood was pretty happy. In fact, one of the best memories I have was when I was five years old and my Mum said that Fat Cat was going to come and visit our house. Now, I'm not sure if you have Fat Cat in Adelaide, the Channel Seven mascot, but even to this very day over here, Fat Cat says good night to children at seven thirty. So when I was the age of five, it was like a superhero was coming to my house, and sure enough, one day there's a knock at the door, and Fat Cat was there and came in and we played on the swings and had a great day. It's a really fond memory I have in my mind.
Dr. Scott Hollier (00:11:50):
The only problem was that the next day I went to my kindie class and told my teachers about Fat Cat coming to visit, and they immediately put me in the naughty corner, believing that I was telling a lie. Fortunately, my Mum took some photos on the day, and it was this photo that's on the screen now, of myself and Fat Cat on the swing, that finally convinced the teachers that perhaps I was telling the truth after all. I'm sure this had absolutely nothing to do with the fact that my uncle worked for Channel Seven at the time. The reason I mention this is because whilst at the age of five it was a very happy time for me, for my parents it was quite a challenging time, because they started to realise that their son had a vision impairment. Certainly, during those early days, they started to realise that I had not much night vision, and that sometimes things that were a similar colour I had trouble picking out.
Dr. Scott Hollier (00:12:37):
So basically they took me to a specialist, and the specialist said, look, your son's going to go blind; he's got this eye condition called Retinitis Pigmentosa. Essentially your son's future will be to play flute on street corners or work in a telephone call centre for the blind, and he'll certainly have to go to a special school. You can appreciate that receiving that news would be extremely difficult. My parents, being teachers and valuing education, went to get a second opinion, and the other specialist said, yes, the prognosis is correct, but how long it takes for your son to go blind is unknown, and really the best thing to do is just to take life as it comes and make the most of it. And it's fair to say I've never been more thankful for that second opinion.
Dr. Scott Hollier (00:13:20):
So as such, I was very fortunate to go through mainstream schooling, and there were two teachers during my education that made a huge difference to my life. There was a teacher in Year 9 who noticed that there was a range of equipment that could help me in my education, and because this was the late 1980s, disability support in education was still a relatively new and evolving thing. He managed to identify assistance which my parents didn't know about, and that made a huge difference to my education. I managed to get to university and I did computer science, but while my computing went all right, my maths not so much. And it was a course coordinator named Steve Casell, who I still keep in touch with today, who got in touch and even called me up on the mid-semester break, and very kindly looked at how I could restructure my course so that I didn't fail maths twice and get kicked out of uni. Thanks to the intervention of those two educators, I was fortunate to go on and do further studies, including a PhD.
Dr. Scott Hollier (00:14:23):
And I mention this because if it wasn't for critical support at critical times, my life would be completely different. Sometimes people have said, when it comes to independence, doesn't support take your independence away? I tend to look at it a bit like trying to change a light bulb by jumping up to the roof and unscrewing it: it is technically possible, but it's a very hard way to do it. It's much, much easier if you can get a ladder. A ladder means you can still independently change the light bulb, but you've got that support around you. And it's fair to say that when it comes to the work that you're involved in, having that support at critical times for people with disability does lead to independence.
Dr. Scott Hollier (00:15:04):
One of the big challenges is employment; there is currently 59% unemployment amongst people with disability. When I finished my studies, it was challenging to try to get my first job. I'd often make it to interview like my friends, but when I'd mention my eyesight, the atmosphere in the room would change, and you could tell that things were going to be a little more challenging. And I'd often miss out on the job despite making interview. Eventually, I managed to figure out how to deal with the main question that would come up, which is, do you have a license? My answer would be no. So the next time I had an interview and I was asked, do you have a driver's license, you might need to go and do IT stuff off site, I said, look, I don't have a driver's license, but if you need me to go anywhere, I will get a taxi and I'll pay for it myself; you'll never have to worry about that aspect. Not only did I get that job, which was my first job in the industry, but for the two years I was in that role, I never once had to go anywhere, which is a real testament to how sometimes disability can be a little bit about marketing as well as the disability itself.
Dr. Scott Hollier (00:16:07):
The last point I'd just like to address before moving on to the main topic is the challenge of getting out the front door. Unlike that famous footy mantra, I believe failure is absolutely an option, sometimes. When you have a disability, there are those days and those darker moments where things can be tough, and you may not always get to achieve the things that you're hoping to achieve. So during those times it's really important to try to figure out, okay, what are those small things that I have a lot of trouble with? Then hopefully if I can overcome those small things, it can potentially lead to doing the bigger things.
Dr. Scott Hollier (00:16:43):
One of the things I find particularly challenging is when I'm travelling places; pre-COVID-19 I travelled a fair bit to present and do motivational speaking. It's good to be able to do that, but I do find myself feeling a bit nervous, a bit apprehensive, especially in situations like conferences, where I need to ask people to help me find the toilet, or getting into a taxi knowing you'll get dropped off somewhere unfamiliar. But I find that if I can keep persevering at those little things, it does lead to the big things being possible. And as such, I've had a lot of things in my life which have been very fortunate. I have a picture here of the sail training ship Leeuwin, and there's a little waving blob at the top of the mast, and that's me in that scenario. I've also been very fortunate to carry the Olympic torch, which was a great privilege and honour; I was nominated because of some charity work I was involved in, and it was a day of handing out joy. I was very, very privileged to have that opportunity. I have a picture here of myself and my cousin in a scuba tank with sharks in it. I still have my limbs, so that did seem to go all right. I've also been fortunate to travel to all seven continents. I've got a picture here of myself in front of the pyramids of Egypt, one of the hottest places on earth, through to the coldest: a picture here of me standing on the Southern Ocean, just off the Antarctic Peninsula. The reason I mention all of this again is because none of this would be possible if it wasn't for the dedication, help and support of others to enable me to achieve these things.
Dr. Scott Hollier (00:18:23):
And again, coming back to Global Accessibility Awareness Day: in your work you may not always get to see the outcome, the wonderful things that people with disability are able to achieve, that independence, because you've made that website accessible or you've fixed up that accessibility issue on that app. But please believe me, it does make a real and profound difference to the lives of people with disability. And whether you are joining this call because it's the first time you've really come across accessibility, or you're a seasoned professional, your interest makes a difference. I just wanted to cover that before the main topic. If anyone would like to learn a bit more about my personal journey, I have a book, Outrunning the Night: a life journey of disability, determination, and joy; more details are at outrunningthenight.com. In addition to all the usual formats, surprisingly, there's actually a Vietnamese version as well, and that's a really interesting story how that came about. So if you'd like to know more about that, please get in touch.
Dr. Scott Hollier (00:19:19):
So now to shift to the main part of today. In order to embrace accessibility, really two things need to happen to support people with disability. One is that people with disability need to have the assistive technologies on their device of choice. As was demonstrated with my phone at the beginning, I'm using technologies designed to assist people with disability, and that's a critical part of access. The great thing is that whether we're talking about Mac or Windows or iOS or Android, those features are largely in place now, which is terrific; there are so many options there. But in order for those tools to work, we have to make sure our content is accessible. And so this really comes to the crux of today: what can we do to ensure that our content is accessible? What work is going on out there to make sure that we can support people with disability using those assistive technologies?
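As a small aside for the developers in the audience, one of the simplest "accessible content" checks that assistive technologies depend on is that every image carries an alt attribute. This is a minimal, hypothetical sketch using Python's standard-library HTML parser; the file names are made up for illustration:

```python
from html.parser import HTMLParser

class AltTextChecker(HTMLParser):
    """Collects <img> tags that lack an alt attribute entirely."""
    def __init__(self):
        super().__init__()
        self.missing = []

    def handle_starttag(self, tag, attrs):
        if tag == "img":
            attr_map = dict(attrs)
            # alt="" is valid for purely decorative images;
            # a *missing* alt leaves screen reader users with nothing.
            if "alt" not in attr_map:
                self.missing.append(attr_map.get("src", "<no src>"))

checker = AltTextChecker()
checker.feed('<img src="logo.png" alt="Company logo"><img src="chart.png">')
print(checker.missing)  # images flagged for review
```

A check like this only catches absent alt text, of course; whether the description itself is useful still needs a human.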
Dr. Scott Hollier (00:20:14):
Cliff said that most of the group are familiar with the international Web Content Accessibility Guidelines, or WCAG standard as we call it, the current version being 2.1. So I believe that most of you are already familiar with WCAG. Today, given that it's about raising awareness of accessibility, I'm going to share with you the work I'm involved in that looks at more of the cutting edge, leading edge developments around accessibility. But if you're not familiar with WCAG 2.1, if this is a new topic for you and this is really the first time you're getting some exposure to it, we do have an introductory resource at The Centre for Accessibility, at accessibility.org.au/resources. But look, anyone that wants to start this journey, please get in touch. I'd be absolutely delighted to provide you with free resources and guide you through that sort of accessibility 101 process, because it's always exciting when people are new to this topic and get really passionate and want to embrace the way things are going.
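For readers new to WCAG, one of its most concrete, machine-testable requirements is colour contrast. The sketch below implements the relative luminance and contrast ratio formulas as defined in WCAG 2.1 (success criterion 1.4.3 requires at least 4.5:1 for normal-size text):

```python
def relative_luminance(r, g, b):
    """WCAG 2.1 relative luminance from 0-255 sRGB components."""
    def channel(c):
        c = c / 255
        return c / 12.92 if c <= 0.03928 else ((c + 0.055) / 1.055) ** 2.4
    return 0.2126 * channel(r) + 0.7152 * channel(g) + 0.0722 * channel(b)

def contrast_ratio(fg, bg):
    """Contrast ratio between two colours, from 1:1 up to 21:1."""
    l1, l2 = relative_luminance(*fg), relative_luminance(*bg)
    lighter, darker = max(l1, l2), min(l1, l2)
    return (lighter + 0.05) / (darker + 0.05)

# Black text on a white background gives the maximum ratio of 21:1.
print(round(contrast_ratio((0, 0, 0), (255, 255, 255)), 1))  # 21.0
```

Tools like browser dev-tools contrast checkers compute exactly this figure; the formula is handy when you need to validate a colour palette in bulk.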
Dr. Scott Hollier (00:21:12):
But for today, I'm going to focus on a number of topics associated with my involvement with the W3C. And I'm going to take a deep breath here as I explain the group that I'm involved in: it is the W3C WAI APA RQTF. They love their acronyms. So just to break that down: the W3C is the World Wide Web Consortium, which is responsible for international web standards, and this work falls under the Web Accessibility Initiative (WAI) group of the W3C. The part I'm involved in is the Accessible Platform Architectures (APA) working group; there's a number of working groups that develop web standards and notes. And my particular subset of that is the Research Questions Task Force, the RQTF. That's really the work I'm going to be focusing on today.
Dr. Scott Hollier (00:21:59):
To explain what the RQTF is: we're a bit like an advanced scouting party. We look at what new technology is over the horizon, and we break it down in terms of how other W3C groups can take on that information. We also provide our own notes and advice regarding the access implications of those new and emerging technologies. So today I'm going to cover off a few of the things that we're working on at the moment and have published recently, looking at things like XR accessibility, real-time communication and remote meetings accessibility, and also CAPTCHA, which is always a hot topic when it comes to accessibility.
Dr. Scott Hollier (00:22:34):
So to begin with, I'll start with that work on XR. The question I often get asked is, well, what exactly is XR? And that's an extremely good question. XR is effectively an umbrella term that covers things like virtual reality, augmented reality and mixed reality, all those immersive environments, and what the access implications are for those. As you can appreciate, it's very much an evolving space, and a space of great interest at the moment as things continue to move into that augmented and mixed reality space. So we're very keen to provide guidance in terms of what the access implications are.
Dr. Scott Hollier (00:23:18):
And it's worth asking the question then: is there really much work going on in this space? The answer is yes, there are some really interesting projects. Some projects that have already been happening in the XR space are things like people with a cognitive disability using augmented reality devices, like the old Google Glass, where people can go down the street and have people with them in real time to provide that support: maybe interpret signage, or help to support people who might be a bit nervous about interacting in that environment. So that's a great thing. There's apps like Be My Eyes, which have a similar support mechanism. There's also a lot of that real-time communication and use of assistive technology, so making sure that we can carry over the assistive technologies into the immersive environment, for example, making virtual reality screen reader accessible.
Dr. Scott Hollier (00:24:16):
And also one of the really interesting projects I've come across through our research, as we've developed this advice, is using traditional aids within a virtual environment. So for example, I use a white cane to get around, and when I'm in a virtual environment, perhaps I might like to use that cane as well, to tap around virtual reality and understand the environment using those things which I use every day. Virtual reality also has the option to be very liberating: for example, if you use a wheelchair, you can fly in VR. So there are those applications, but also the application of maybe you just want to explore the immersive environment the same way that you explore the real world.
Dr. Scott Hollier (00:25:03):
So it's really interesting to see a lot of this work developing, and for this reason we're very keen to make sure that there are good developments and advice in this space for developers. But what if you could actually even just control virtual environments with your mind? Well, this might seem a bit crazy and out there, but at the Consumer Electronics Show in Las Vegas in January, a company demonstrated something called NextMind, which does exactly that. And I've got a little video here to show.
Game narrator (00:25:55):
[Video plays: adult man in black T-shirt wearing VR headset, standing in front of a TV on which a video game plays] You can move on. Perfect. Now, the extraction point is not far away.
Dr. Scott Hollier (00:26:07):
So, that's just a very short clip demonstrating the NextMind product released in January. One of the things really exciting about it is when you think about the implications of such a product for people with disability. Whether you have a mobility impairment or a vision impairment or other disabilities, the idea that you could just think and control things to that level of accuracy, in that example playing a game, opens the door for really exciting immersive opportunities. So it's really things like this which demonstrate why XR has huge potential for people with disabilities and why important guidance on XR is required. Because of that, we've had a look at what the current barriers are for people with disability getting into XR, and what advice we can provide to developers, through the XR note that we're publishing, to help with that. So there are a few barriers. One of the big ones is that XR is very reliant on gesture controls. For those of you who remember the Nintendo Wii when it came out in 2006, the Wii had a lot of great games like Wii Sports, but a lot of the games that came out for the Wii were just gesture based, because the controller was gesture based. Unfortunately, because there was such a heavy reliance on gesture, a lot of the games became almost unplayable, or just painful, having to always use gesture controls, and XR to some degree is facing the same thing. We need to make sure that there are multiple inputs and different ways that people can interact with the environment, not just gesture, and this is an issue at the moment. Another problem is that the hardware is all very proprietary.
Dr. Scott Hollier (00:27:45):
So whether we're talking about HoloLens, the HTC Vive or the Oculus Rift, there's not really a consistent feature set of accessibility across these products, and because of that it is challenging to have consistency around accessibility. There are also issues, and these are some things we've also provided guidance on, around captioning and audio description. If you have captions available for someone who's deaf or hearing impaired in that virtual environment, where do you put the captions, and how do you indicate if someone's talking behind you? This is all work that is being developed at the moment; likewise audio description placement and other things along those lines. So there are still issues around really clear spatial audio and around how we convey that information to people with disability. As such, we've focused a lot on trying to highlight what the possible scenarios are and what possible ways there are to address them.
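The "someone's talking behind you" caption problem can be made concrete with a little geometry. This is a hypothetical sketch, not from any W3C note or real XR API: given the viewer's heading and a speaker's bearing in a 360-degree scene, it picks a coarse direction hint that a caption could display alongside the speaker's words:

```python
def caption_direction(viewer_yaw_deg: float, speaker_yaw_deg: float) -> str:
    """Return a coarse hint of where a speaker is relative to the viewer,
    for attaching to a caption in a 360-degree scene."""
    # Normalise the angular difference into the range -180..180 degrees.
    diff = (speaker_yaw_deg - viewer_yaw_deg + 180) % 360 - 180
    if abs(diff) <= 45:
        return "ahead"
    if abs(diff) >= 135:
        return "behind"
    return "right" if diff > 0 else "left"

print(caption_direction(0, 170))   # behind
print(caption_direction(90, 100))  # ahead
```

A real implementation would update this continuously as the viewer turns, and might render an arrow or edge glow rather than a word, but the underlying calculation is this simple.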
Dr. Scott Hollier (00:28:47):
On the screen I have a picture of the Xbox Adaptive Controller, which is a good example of the types of devices that we should be considering when we develop for XR. The XAC is a controller which can have many inputs: basically you can map any button or aspect of the controller to switches or any other assistive technologies, which is absolutely fantastic, and it's been a great success for Microsoft. So this just demonstrates that there is some great hardware out there, and if we ensure that our XR environments can support this type of thing, then it really does open the door for people with disability to embrace those benefits. What we've been doing to try and accommodate this is that we've created a document called XAUR. It sounds a bit like some sort of evil character in a superhero movie, I know, but XAUR is the XR Accessibility User Requirements.
Dr. Scott Hollier (00:29:42):
What the XAUR aims to do is provide advice to developers on what they can do to ensure that accessibility is part of XR development. Now, it would take a bit long today to go through all 18 scenarios that we've come up with, and I'd like to share a few other things during the session as well, but I've picked out five scenarios which I think highlight where XAUR is going and give a bit of insight into what we're doing with it. So the first one here is looking at navigation. Effectively, what we need to do is make sure that we can use a variety of different inputs to effectively navigate around an XR environment. As I was saying before, it's quite heavily gesture based at the moment, and we want to make sure that a variety of navigation options are available, whether it's a controller, whether it's thought via NextMind, whether it's more traditional input methods, to make sure that navigation is effective in an XR environment for whatever tools people with disability want to use.
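The design principle behind that navigation scenario is to decouple navigation intents from input devices, so a gesture, a keyboard, a gamepad or a brain-computer interface can all trigger the same action. The sketch below is a hypothetical illustration of that pattern; all the names (device strings, `NavigationRouter`) are invented for the example, not from XAUR or any real XR toolkit:

```python
from typing import Callable

class NavigationRouter:
    """Routes (device, signal) pairs to device-independent navigation intents."""
    def __init__(self):
        self._bindings: dict = {}   # (device, signal) -> intent name
        self._actions: dict = {}    # intent name -> handler

    def register_action(self, intent: str, handler: Callable[[], None]):
        self._actions[intent] = handler

    def bind(self, device: str, signal: str, intent: str):
        self._bindings[(device, signal)] = intent

    def dispatch(self, device: str, signal: str):
        intent = self._bindings.get((device, signal))
        if intent:
            self._actions[intent]()

router = NavigationRouter()
moves = []
router.register_action("move_forward", lambda: moves.append("forward"))
# The same intent is reachable from several very different devices:
router.bind("gesture", "swipe_up", "move_forward")
router.bind("keyboard", "w", "move_forward")
router.bind("bci", "focus_target", "move_forward")

router.dispatch("keyboard", "w")
router.dispatch("bci", "focus_target")
print(moves)  # ['forward', 'forward']
```

Because new devices only need a `bind` call, nothing in the environment itself has to know whether the user navigated by gesture, switch or thought.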
Dr. Scott Hollier (00:30:39):
And we also want to make sure that there's good cognitive support. So this scenario is looking at the ability to replace things with other things which are easier to understand. For example, if you're someone with an intellectual disability, it would be much easier to go through an environment and, in real time, have complex words switched out for understandable symbols to make that environment much easier to grasp. This is certainly very possible: you could have a symbol feature set which would replace some of the more complex parts of a virtual environment. So this is another one which we think is important, to make sure that there is that capability and to give people the ability to simplify, to make it clear and simple, as was mentioned in The Centre for Accessibility video, in the XR space. The next one is something that's very close to my heart.
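The real-time word-to-symbol substitution described in that scenario can be sketched in a few lines. The symbol set here is entirely hypothetical (a real one would come from an established symbol vocabulary chosen with users):

```python
import re

# Hypothetical symbol set: complex phrases mapped to simple symbols.
SYMBOLS = {
    "emergency exit": "🚪➡",
    "refreshments": "☕",
}

def simplify(text: str, symbols: dict) -> str:
    """Replace complex phrases in signage text with easy-to-recognise symbols."""
    for phrase, symbol in symbols.items():
        text = re.sub(re.escape(phrase), symbol, text, flags=re.IGNORECASE)
    return text

print(simplify("Refreshments near the emergency exit", SYMBOLS))
```

In an XR environment the same substitution would be applied to rendered labels and signage on the fly, toggled on as a user preference.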
Dr. Scott Hollier (00:31:33):
This is about screen magnification, which, as a screen magnifier user, is something that I personally find really useful. So the idea here is that, as you saw just when I shared my screen, I had a portion of the screen viewable rather than the whole screen, because I zoomed in to make use of my limited sight. Because of that, if you are zoomed in, it might be the case that you can only see a very small, blown-up part of a very big environment, and it's easy to get lost in a 360-degree environment if you're zoomed in and only seeing a very small part of it. So what this is about is making sure that there is some sort of map or similar aspect so that, if someone is zoomed in, they can still get a good understanding of where they are in that XR space.
Dr. Scott Hollier (00:32:21):
The scenario I've got here is looking at sign language, and I think one of the areas that XR really opens the door for in terms of disability is support for people who are deaf and use sign. There isn't a lot of great support at the moment in the broader web for people who use sign language; it's not something you come across very often. But in the XR space there's huge potential. Not only could people sign to each other very comfortably in XR, but you could also have avatars that could take information and then act as a translator, signing directly to the person who has sign language as their first language. So this opens up great opportunities for people who sign to be able to interact in that environment, to have text or anything else translated and pushed into an avatar that can sign, and to provide a much more effective, interactive opportunity for people who sign.
Dr. Scott Hollier (00:33:20):
And the last scenario I'll go through in this section today is looking at the needs of people who are deafblind. Again, this is a disability group that doesn't get a lot of access at the moment. Most people who are deafblind will interact using a refreshable braille display, which is where you have a series of pins that go up and down to create different braille characters that people can feel. But if you think about an XR environment, it really opens the door for people who are deafblind, because not only could you still use that type of interaction, but you could also feel a whole lot of different objects, so you could get much more tactile feedback in that environment. We want to make sure that developers consider people who are deafblind, so that there is as much immersion and interaction as possible.
Dr. Scott Hollier (00:34:11):
Another disability group, I think, could really benefit beyond what's available now in the XR space. So that's essentially XR, and I'll provide some links in the notes at the end as to where it stands. Now I'm going to shift tack and go to CAPTCHA. We just published our CAPTCHA advice in December last year, so it's relatively new, and a big update on the CAPTCHA advice previously out there, because there had been so many new technologies and CAPTCHA methods. For those who might not be aware of what CAPTCHA is, CAPTCHA stands for the Completely Automated Public Turing test to tell Computers and Humans Apart. Most people will know CAPTCHAs from the picture I have on the screen here, which is the squiggly text on a bitmap image, where you then have to type in those characters to indicate that you're not a bot.
Dr. Scott Hollier (00:35:05):
However, the problem is that people with disability often find these CAPTCHAs difficult and often then get placed in the bot category. There are a number of different types of CAPTCHAs, and when it comes to people with disability, there's an equivalent number of issues. That original CAPTCHA is almost impossible for someone who has low vision and completely impossible for someone who is blind. "But wait," I hear many people say, "there are audio CAPTCHAs; you can listen to a CAPTCHA instead." Well, yes, that's true, but often those are difficult to hear as well, and I'll come to that shortly. Another interesting thing about CAPTCHAs is that they're very biased towards the English-speaking world. CAPTCHAs almost always use English words or the English character set, and that means a majority of the world simply can't input them.
Dr. Scott Hollier (00:36:00):
You can appreciate the challenge if you don't speak another language and a CAPTCHA comes up in something else, and what that challenge would be in trying to then type it out. So there are a lot of issues with that traditional CAPTCHA that make it really difficult for people with disability. Now, I was saying that audio CAPTCHAs are a bit difficult, so I'd like to put that to the test. I'm going to play an audio CAPTCHA. I've just noticed that my audio does seem to be a little quieter today, so you may have to turn up your device just a little bit to hear this. I'm going to play it, and then we'll see if anyone has a good understanding of what it says.
CAPTCHA audio (00:36:42):
[Unintelligible audio CAPTCHA plays]
Dr. Scott Hollier (00:36:55):
So I'll just play that one more time.
CAPTCHA audio (00:37:03):
[Unintelligible audio CAPTCHA plays again]
Dr. Scott Hollier (00:37:09):
So I'll throw it to Cliff, and if anyone wants to have a go at what all that was, maybe they can reach out to him. Cliff, do you have any thoughts on that?
Cliff Edwards (00:37:21):
I was listening intently, but I've got absolutely no idea, Scott.
Dr. Scott Hollier (00:37:25):
No worries. And look, if you also feel that you have absolutely no idea what that was, or maybe you can pick out a word or two, please don't worry, because I gave a presentation in the studios of the ABC not that long ago using their fantastic speaker equipment, and no one was able to pick all the words in that CAPTCHA. So this really gives you an example of why audio CAPTCHAs just don't really work either. This is a very common audio CAPTCHA, and yes, I've listened to it many, many times, and there are still one or two words that I just can't pick up.
Dr. Scott Hollier (00:38:07):
So clearly CAPTCHAs are a problem, and the question then is, well, what do we do about it, and what are the implications? Because of that, the RQTF did a lot of research, and I was the senior editor for this. What we found, interestingly, is that the traditional CAPTCHA with the squiggly text is not only inaccessible but also not that secure anymore. This is a technology from 15 years ago, and to be frank, computers now actually have a pretty good way of cracking those, with so much greater OCR development these days. So while people with disability are considered bots, some of the bots are now considered humans. That is certainly a problem for something that is meant to be a security mechanism. Also, while we probably couldn't figure out those audio CAPTCHAs, digital assistants these days have a fair chance of understanding them.
Dr. Scott Hollier (00:39:05):
So again, the purpose of those has largely been undone by the evolution of technology. The question then is, well, if those CAPTCHAs don't work, which CAPTCHAs do? There are a number of CAPTCHA options; some are accessible and some are not. Common CAPTCHAs that you'll come across on the web, in addition to the traditional CAPTCHA, which is still fairly widely used, include reCAPTCHA v2, which is the one with the tick box saying "I'm not a robot". There's also reCAPTCHA v3, which doesn't bother the user at all, so that's very appealing. And there are things like honeypots, which have a hidden field: if that field gets filled in by a bot, it flags the submission as coming from a bot. So that's another one that doesn't bother the user, which is quite good.
Dr. Scott Hollier (00:39:55):
And there are things like visual CAPTCHAs that compare, say, a person to a robot, or a man to a woman. There are 3D CAPTCHAs, there are logic puzzles, there are email registration verification processes, there's SMS, and these days we're seeing a lot more in terms of biometrics, so using Face ID or your fingerprint to identify who you are. So there are a lot of different CAPTCHA options out there. The question then is, what is the best solution? Well, in terms of reCAPTCHA v2, we see that "I'm not a robot" tick box quite a lot, and I've got an example of it on screen. That's an option to consider; certainly a screen reader can get to that tick box and enable it, so it could be a valid option. However, there is one major problem with accessible CAPTCHAs. I have a little clip here from The Big Bang Theory, and it illustrates the point far better than I can.
[TV: Big Bang Theory plays showing 3 cast members in front of a computer, 1 cast member is on a mobile]
Big Bang Theory Cast (00:40:52):
All right. Right. His enemies list. Ooh. He updated the interface. You can search by first name, last name or length of grudge.
Big Bang Theory Cast (00:41:01):
Let's see. Yep. Right here, Sam When.
Big Bang Theory Cast (00:41:04):
Great. What did he do?
Big Bang Theory Cast (00:41:05):
Hold on. I have to agree to the terms of service. No, I'm not a robot. Okay. Which of these are plants and we're in.
Dr. Scott Hollier (00:41:19):
Now, did you pick up the problem with that process? The main issue is that the "I'm not a robot" tick box worked well, but unfortunately it then fell back to an inaccessible CAPTCHA; in the case of Howard's attempt, it was plants. I've got another example on the screen here: it might be traffic lights or trucks, or identifying people in pictures, and this fallback CAPTCHA is completely inaccessible. So one of the big problems we have with accessible solutions like reCAPTCHA v2 and v3, which doesn't bother you at all, is that if they don't work, if it still thinks you're a bot, they fall back to an inaccessible CAPTCHA, and basically you're right back to the beginning. Interestingly, with reCAPTCHA versions two and three, a lot of people believe that these CAPTCHAs work by monitoring your movements around the screen and figuring out, based on your mouse and other navigation, that you're not a bot. But it's actually a little bit more interesting than that, because what Google actually does is keep track of all the things you've been browsing and doing online, especially if you're using Chrome. So by the time you actually get to one of these reCAPTCHAs, Google already has a pretty good idea of whether you're a bot or not, and that's a large part of how it determines whether to let you through. If you install a brand new browser, or you're using a computer from a different country, you'll almost always fail the CAPTCHA test. If you're using a computer you've used over a long period of time, then you'll probably be fine. So because of that, first-time users of computers are almost certainly going to fail, and then it will fall back to an inaccessible CAPTCHA.
Dr. Scott Hollier (00:43:05):
So what are the best solutions? Well, basically, things like honeypots are a good option, because they don't actually identify the user and they don't create cognitive accessibility barriers, although I do appreciate that there can be spam implications for forms without obvious CAPTCHAs. Email verification is still a favourite because it's very accessible, and SMS is still widely used by government. And certainly there are federated identities: if you're with Microsoft, Apple or Google, there is a range of things within those ecosystems that can figure out who you are through your different accounts and devices. The only downside to those is that they have to know who you are, not just whether you're a bot or not. So that's a bit of an overview regarding CAPTCHA, and yes, our advice on that was published not that long ago. The last major thing I'd like to cover off today is something that has really been important in recent times.
Dr. Scott Hollier (00:44:08):
And that is remote meetings and the accessibility of remote meetings. One of the things that has been challenging with the coronavirus pandemic is the need for us all to scramble to get online. It certainly has been a time of change, and a very difficult time for the world, as we've never gone through anything like the virus and its implications. But for people with disability, this could actually offer some encouragement. With 59% unemployment amongst people who are blind or vision impaired, there is a door opening as a byproduct of what's happening, and that is that there are now unprecedented, and I know that word gets used a lot, but there are unprecedented opportunities. In education, there are courses that were never thought possible to deliver online that are now being delivered online, and in employment, jobs that we never thought could possibly be remote are possible now.
Dr. Scott Hollier (00:45:02):
And the fact that we're all online today, as I'm doing this through Teams, is a further example of just how accustomed we're getting to doing everything remotely. So these new opportunities online do open a door for people with disability to gain new educational and employment opportunities, and it'll be interesting to see, as time goes on, what the implications are. But when it comes to the accessibility of remote meetings, there is a concern that in the rush to get online, as we have all had to do in the past few months, there are access implications. This is what a lot of the work we're doing in the RQTF at the moment is looking at. We have a Remote Meetings wiki, and we also have broader advice regarding real-time communications, with more technical user requirements in what is called the RAUR, the Real-Time Communication Accessibility User Requirements.
Dr. Scott Hollier (00:45:58):
That's something which is almost completed, but the Remote Meetings wiki is really where we want to provide guidance, both to developers building tools and to people using the tools, to make sure that the information is accessible. Some examples that have come up are things like: if we have a sign language interpreter on a Zoom call, is it possible at the end of the meeting to have a recording of just that person signing, rather than having the video change perspectives as different people talk? Can we freeze it on the interpreter, so that becomes the record for someone who is deaf? Can we make sure the keyboard shortcuts on our remote meeting platforms don't clash with keyboard shortcuts used by assistive technologies? These are just some of the aspects that are covered off in the wiki.
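The shortcut-clash concern above lends itself to a simple automated check: compare a meeting tool's proposed shortcuts against a list of combinations reserved by assistive technologies. This is an illustrative Python sketch only; the reserved list here is hypothetical, not an authoritative registry of screen reader commands.

```python
# Hypothetical sketch: flag proposed application shortcuts that clash with
# shortcuts commonly reserved by assistive technologies. The reserved set
# below is illustrative, not a real AT shortcut registry.

RESERVED_AT_SHORTCUTS = {
    "Insert+Down",    # e.g. a screen reader "read from here" style command
    "Insert+F7",      # e.g. a screen reader elements-list style command
    "CapsLock+Space", # e.g. a modifier pass-through style command
}

def find_clashes(app_shortcuts):
    # Return the proposed shortcuts that collide with reserved AT shortcuts.
    return sorted(set(app_shortcuts) & RESERVED_AT_SHORTCUTS)

print(find_clashes(["Ctrl+Shift+M", "Insert+F7"]))  # ['Insert+F7']
```

A real implementation would need per-platform, per-screen-reader shortcut data, but the principle of auditing for clashes before shipping defaults is the same.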
Dr. Scott Hollier (00:46:51):
So our wiki is evolving rapidly, and I would very much welcome anyone who has thoughts on what should be in the wiki to provide that extra guidance. The wiki looks at things like what WCAG guidance is applicable, and what other W3C standards, like the Authoring Tool Accessibility Guidelines and User Agent Accessibility Guidelines, are applicable. We provide guidance on how to make sure that other materials around your remote meeting are accessible. For example, in something like today's session: are my presentation slides accessible? I'll be very happy to provide copies to anyone who would like them, and Cliff, I'll touch base with you about providing the slides. But are my slides accessible? I certainly hope they are; I've done a bit of work to make that the case, to practise what I preach. It's important to have guidance on how you can check your slides for accessibility.
Dr. Scott Hollier (00:47:43):
What about other documents and notes relating to remote meetings? If people have text chat during a meeting, is that part of the meeting accessible? So we've provided a lot of guidance in the wiki. We're also looking at some sector-specific guidance: in an educational setting, for example, you want to make sure that the learning management platform, where all the lecture slides and videos are kept for students, is accessible as well. So there's lots of evolving guidance in this space, and again, if anyone has anything they would like to add, I certainly welcome that. Now, just outside of the RQTF, there are a few other things going on as well, which I thought would be good to draw your attention to at the end. While WCAG 2.1 is the current definitive standard for digital access, there is also WCAG 2.2, which is now in draft and evolving.
Dr. Scott Hollier (00:48:43):
And for people who are particularly interested in support for cognitive disability, this is really about trying to get some additional guidance around that. Cognitive disability has often fallen into the less-implemented Level AAA aspects of WCAG, and there isn't as much guidance in WCAG as there should be; WCAG 2.1 seems a bit of a missed opportunity for that. So WCAG 2.2 is looking to come up with about 10 new success criteria to provide some additional guidance in this space. There's only one draft success criterion at the moment, so it's an evolving process, but the draft is now available. The bigger one is what will be known as WCAG 3.0: the acronym's expansion has been changed, so it'll become known as the W3C Accessibility Guidelines 3.0. The reason for that is that WCAG 3.0 is looking to encompass pretty much everything we've talked about today, including immersive environments, driverless cars, all sorts of things, as well as the web, authoring tools and user agents, all bundled into one standard.
Dr. Scott Hollier (00:49:51):
The initiative, which was codenamed Silver, has been going for a number of years now, and it's looking really exciting. There is a draft expected before the year is out.
Dr. Scott Hollier (00:50:03):
Also, it being Global Accessibility Awareness Day, there could be some announcements on some of these things that I'm not across. Just be aware that, yes, there are a lot of new things popping out today, but that's my understanding as to where those are at. So, look, essentially that's most of it, and I've got all the references here. The XAUR is not too far off being completed, as is the RAUR. The Remote Meetings wiki is evolving rapidly, the CAPTCHA advice came out in December last year, and yes, I've got links to the WCAG 2.2 draft there as well. That's really pretty much it from me. I'd just like to again thank Cliff and the South Australian Lunch and Learn Group for the opportunity to present today. It is an absolute pleasure and privilege to be a part of this, especially on such a special day.
Dr. Scott Hollier (00:50:55):
If anyone is interested, I've put a few thoughts and reflections on what Global Accessibility Awareness Day means on my website at Hollier.info, if you'd like to have a read of that. If anyone wants to get in touch with me, firstname.lastname@example.org is probably the easiest way to do that, and my website is there too. The Centre for Accessibility website is also available, and if anyone wants to follow me on Twitter, it's @scotthollier. I only tweet about digital access news, not about my lunch, so if you do follow that, you can be assured that you will get just this type of information. Thanks very much everyone. Thank you, Cliff. Cliff, I'll hand back to you for questions.
Cliff Edwards (00:51:37):
Thank you, Scott. I really value your time today and your ongoing support here in South Australia. Opening up to everyone: any questions for Scott? As we said before, Scott's happy to hang around after the main presentation and answer any questions at the end, but if anybody's got any questions now, I'm happy to take those.
Dr. Scott Hollier (00:52:09):
Maybe I'll just unshare my screen. Cliff, because my screen is taking a bit of time to catch up, I might get you to guide me to the unshare screen button. Am I close?
Cliff Edwards (00:52:24):
Go up. It's obscured a little bit by popups. Go up, go down. Go to the right, one more. Yep, there. Excellent. So, last call for any questions from the floor. No? No one. We do have a question from Ted for you, Scott.
Dr. Scott Hollier (00:52:53):
Cliff Edwards (00:52:55):
What's the recommendation for making sure a honeypot is not discovered by screen reader or keyboard users?
Dr. Scott Hollier (00:53:05):
It's a good question, and there isn't an easy answer. You can certainly hide form elements, but screen readers are getting more and more clever all the time and can often still pick up a honeypot field. So what I would recommend as the easiest option is, in the label for that form field, to have something for the bots, but then also add "Do not fill in" or something along those lines. There are different coding techniques you can use to hide form fields, but there's not a simple answer beyond labelling, to let people know not to put anything in that box.
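Scott's answer, combining hiding techniques with an explicit "do not fill in" label as a fallback, can be sketched as follows. This is an illustrative Python sketch, not code from any discussed project: the field name, markup, and check are all hypothetical, and the markup is built as a plain string for demonstration.

```python
# Hypothetical sketch of an accessible honeypot. The field is moved
# off-screen for sighted users, kept out of the tab order, and explicitly
# labelled so screen reader users who still encounter it know to leave it blank.

HONEYPOT_FIELD = "website"  # an innocuous-looking name bots like to fill in

def honeypot_markup() -> str:
    # aria-hidden and tabindex="-1" hide the field from most assistive tech;
    # the label is the fallback Scott suggests for screen readers that
    # announce the field anyway.
    return (
        '<div style="position:absolute; left:-9999px" aria-hidden="true">'
        f'<label for="{HONEYPOT_FIELD}">Leave this field empty (do not fill in)</label>'
        f'<input id="{HONEYPOT_FIELD}" name="{HONEYPOT_FIELD}" '
        'type="text" tabindex="-1" autocomplete="off">'
        "</div>"
    )

def is_probably_bot(form_data: dict) -> bool:
    # A human leaves the honeypot empty; many bots fill in every field.
    return bool(form_data.get(HONEYPOT_FIELD, "").strip())

print(is_probably_bot({"name": "Ada", "website": ""}))       # False
print(is_probably_bot({"name": "x", "website": "spam.biz"})) # True
```

The server-side check never identifies the user, which is why honeypots avoid the privacy trade-off of federated-identity approaches mentioned earlier.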
Cliff Edwards (00:53:53):
Thanks, Scott. Ted, if you want to reach out to me after the meeting, we do actually use a honeypot solution in the website design system for our reusable template websites across South Australian government. I'm not a technical person, but I can certainly put you in contact with the people who are. Just to reassure everyone on that: we quite often get people concerned about potential levels of spam, but we get next to nothing on the sites that we manage, and what we do get we can delete within 10-15 seconds. We take the view that those 10-15 seconds are my time, rather than something impacting and causing a barrier for people with assistive technology needs. So it's a worthwhile thing to introduce, and for anybody who's interested, we'll be able to put you in contact with our vendor, who developed the solution in partnership with Vision Australia.
Cliff Edwards (00:54:51):
Well, thanks again, Scott. Really, a fantastic presentation again; thanks for all your support. I'd just like to hand over to ... We've got a bit of outside noise, so I'll try and find who that is and mute them. But I'd like to hand over to Manny. I'm really looking forward to this one, to hear more about Microsoft's accessibility webinar initiative, and I'm not going to steal any of your thunder. Over to you, Manny.
Cliff Edwards (00:55:29):
No, can't hear you, Manny.
Emmanuele Silanesu (00:55:34):
Two secs. How about now?
Cliff Edwards (00:55:36):
Yep, coming through loud and clear.
Emmanuele Silanesu (00:55:38):
I'm afraid that echo may be coming from me, because I was about to share a video, so hopefully ... Is there an echo of my voice? Nope. Okay. Is my screen sharing? Just to be sure before I kick off.
Cliff Edwards (00:55:52):
Emmanuele Silanesu (00:55:53):
Excellent. All right. So my name is Manny. I'm the Modern Workplace Director at Microsoft Australia, and I'm also on the Accessibility Council, part of our Diversity and Inclusion Council at Microsoft Australia. This is certainly one of the areas that I'm very passionate about. My story is not too different to yours, Scott, so I might share a little bit about it. When I was two years old, I contracted bacterial meningitis; meningococcal is a common name for it today, but back then it was bacterial meningitis.
Emmanuele Silanesu (00:56:33):
At that point in time, I was in a coma for a number of weeks. As a child, as a toddler, I'd already learned how to start speaking and walking, but when I awoke from that coma, I had lost all of those fine motor functions, as well as the ability to speak. The doctors at that point of time said to my parents that that was pretty much the state that I would stay in for the rest of my life. My mother refused to believe that. She went for multiple opinions on that as well. Even at the age of six, obviously after numerous years of occupational therapy, the doctors kept telling her that I would need to attend a special school. She, again, insisted that, no, he will attend the same school that his brother and his sister attended.
Emmanuele Silanesu (00:57:23):
By that point in time I had, obviously, regained a lot of those functions, but I was still undergoing occupational therapy: a decade, or nearly 12-odd years of it, by the time I hit high school. To the eye, physically, people would not know that I had any remnants of that meningococcal or meningitis, but I still have a brain tremor today that causes the right side of my body to shake. So my handwriting is pretty much the only thing left that people would notice, or if I choose to carry a cup of coffee in my right hand, they may notice me shaking. But my handwriting is the one thing that really held me back at high school. My teachers could not read what I was writing, and it was at about grade 10 that my science teacher, Mr. Lee, whom I recall very fondly, recognised that I had abilities that were not being recognised because of this difficulty with my handwriting.
Emmanuele Silanesu (00:58:25):
If you can imagine, this was probably mid-1994; laptops, before Windows 95 and still running on DOS, were just starting to become more and more prevalent. He applied to the Department of Education to get me a laptop. Then in 1996 in Queensland we had the QCS test: if you wanted to go to university, you had to sit the QCS test, and it's all handwritten over a number of days. He applied to make sure that I was able to use that laptop, so I was the first ever student in Queensland to sit the QCS test with a laptop. The Department of Education basically sent out people to watch over my shoulder to make sure that I wasn't cheating or copying and pasting. We didn't even really have the internet at that point in time; if you think back, it was dial-up modems, etc.
Emmanuele Silanesu (00:59:14):
That's why I have a passion here: I personally believe that I wouldn't be where I'm at today at Microsoft had I not had the ability to use that technology. I would not have been able to make it to university, because I would not have been able to complete that examination, which was the prerequisite to get me there. I was the first ever in my family to attend university, and it wasn't as if I was coming from a privileged background either; the school I grew up attending was probably one of the toughest in Brisbane at the time. Moving on from there, I'm going to share with you today a little sneak peek of some of the things that we've got planned to share as a webinar series across South Australia. We've just finished the recording; we actually ran this with the New South Wales Government earlier in the year.
Emmanuele Silanesu (01:00:08):
I decided that the best way for us to create this, to be able to share it with as many people as possible, was to create a video, because with a video I'm able to make sure that it is completely, well, to the best of my ability, accessible. Sometimes I speak too fast, and I do apologise for that, but you hopefully can see and read the captioning service that I have built into PowerPoint. This is part of PowerPoint, and in my opinion everybody should turn it on whenever they present, irrespective of whether they think there is somebody in the audience who has a hearing challenge, as I like to think of it, because I even find that when I have the captioning turned on, I'm more focused. It actually helps me to stay focused on the presentation. Now, some of you may be reading the captions and trying to find whether there are any mistakes.
Emmanuele Silanesu (01:01:03):
I know I do that all the time. I'm constantly looking to see if it is accurate, but it keeps me focused on the content, and if I do lose focus I am able to go, "What was that word?" and quickly refer back to the captions. All in all, this service has improved immensely over the past few years; it is one of those things that we have invested quite heavily in at Microsoft, as one of our accessibility features. I guess it really comes down to this focus on being able to create a more inclusive world. We are extremely passionate about this at Microsoft, and Satya Nadella, who is our CEO, really brought this on when he rewrote the company's mission. I think most of us would be familiar with Microsoft's initial mission, which was to put a PC on every desk in every home and office.
Emmanuele Silanesu (01:01:59):
At the time when Bill Gates came up with that mission, it was largely thought that that was just impossible. Today I can say that, whilst we're not there in every home in every third-world country as such, we've made amazing progress in the emerging world, and an enormous amount of funding and resources have been put towards ensuring that technology is available, democratised for every single person, no matter what their social standing might be or where they were born. But it isn't just based on money. Technology should be accessible, and it should be available to everybody in every context. And I can notice here that my caption has just covered up this: "Every person. Empowered. In every context."
Emmanuele Silanesu (01:02:57):
So I'll put my captions back now. We think of it this way from an accessibility point of view: it is not a personal health condition; it is generally, in our opinion and in my experience, a mismatch of human interactions. If we're able to use technology in a way that helps match people with their skills, so that they're able to unlock whatever it is that they want to do using technology, in my case the ability simply to type my exams rather than having to handwrite them, we can unlock their potential. From that perspective, and we were talking about the statistics earlier, it isn't just a small amount of the population; this helps a very large portion of the population. Think of it from the perspective that 70% of disabilities are actually invisible. There are so many disabilities that people can't necessarily see, similar to the shaking on the right-hand side of my body, which wouldn't be visible to most people.
Emmanuele Silanesu (01:04:14):
We're just getting somebody in the background with background noise.
Cliff Edwards (01:04:19):
Sorry about that, Manny. I can't see who that is. What I'm going to do is mute everyone, and then I'll unmute you again.
Emmanuele Silanesu (01:04:27):
Okay, no problem. I am back; I have unmuted myself now. Scott, can you hear me? Yes, you can, I can see you. All good, you've moved towards the camera; that was the cue I needed. You're muted, but nonetheless I got what I needed there. So, from this perspective, we are thinking about how we create technology that isn't just for people with a permanent disability. We also think of it from the perspective that there are temporary and situational disabilities. In the scenario we're in right now, working from home, there are quite a number of instances where accessible technology helps people in different situations. There's the example of carrying a baby; or we might have an accident and break an arm, or, as I was unfortunate enough to do, break a collarbone. All of a sudden I realised how important it was to use this technology in a different way in order to still get things done, to still stay connected.
Emmanuele Silanesu (01:05:51):
Microsoft is really focused now, largely since Satya became the CEO of the company, on making sure that we develop our technology first and foremost with accessibility in mind. It isn't a small market, as I mentioned, with over one billion people that identify as having a disability, and I am confident that there is a much larger number that don't self-identify as having some form of disability. Our job is to lower the stigma that makes people feel they can't self-identify, because in the workplace, back to Scott's point, there are a lot of technologies that can assist people to be productive, and, in my opinion, more productive.
Emmanuele Silanesu (01:06:39):
There's an example of that. One of my very close friends and one of my allies is Kenny Singh. He features in some of the videos that we'll see in the webinar. Kenny is also vision impaired; visually, he looks like anyone else, so he's not visually impaired, he is vision impaired. He was one of our highest billable resources at one point in time, because he was working on a specific project where he could check and create technical documentation faster than any of the other consultants we had. Really, what it came down to was that Kenny was using technology and the skills he'd built through his loss of vision: speech recognition, and also the read-aloud function to play back the technical documentation that a lot of us would find extremely boring to listen to. But he was extremely passionate about it.
Emmanuele Silanesu (01:07:49):
He made it his mission to make sure that technical documentation was 100 per cent correct, which is essential when it comes to technology: you don't want to refer to the technical documentation and find there's an error in it. At the same time, he made it more humanistic, so it wasn't so boring for people. He's developed, and he's now moved on to a cyber security role within our business. Again, I am extremely passionate about the fact that technology unlocks that capability. He and I are on a bit of a mission to increase the employment of people with disability within our organisations and our governments, commercial as well as public sector. There are massive opportunities for this billion-plus people in the world if we just enable them with the technology that is available, freely in most instances, and don't attach a stigma or make them have to make allowances, such as Scott ensuring that he will pay for his own taxis. That should be what I call a reasonable accommodation, one that we could provide that person.
Emmanuele Silanesu (01:09:02):
From that perspective, really what we're driving here, and I'm going to do some demonstrations to give you a bit of a sense of what we're talking about, is that there really are no limits to what people can achieve when technology reflects the diversity of the people that use it. In Australia, we think about diversity in so many different ways, and we have a huge opportunity here. In the webinar I break it down into seven pillars. We have six pillars on the screen, but the pillar I've added, which has never been more important than it is today, is mental health: the tools that are also there within our technology to help us with our own mental health.
Emmanuele Silanesu (01:09:48):
But, you know, we look at vision, hearing, cognitive, speech, mobility and neural as those pillars throughout the video, with demonstrations of the different technology available and how we're enabling people in each area. The idea is for you to be able to share that with anybody who may benefit from it, and also with colleagues who are not even aware that they can use this technology, similar to the accessibility checker. Again, it comes down to this: every document that's sent out should be spellchecked and accessibility checked. I'm going to give a little bit of a demonstration of that in a moment.
Emmanuele Silanesu (01:10:33):
But the key thing is why this matters, why this is so important to Microsoft right now. On the screen, I have an image of stairs. The stairs make the building inaccessible, not the wheelchair. Then there is an image on the screen of three steps with an aluminium ramp and somebody being pushed in their wheelchair up that ramp. This is an afterthought, and it's actually not that accessible. It's better than not having the ramp, don't get me wrong, but it is an afterthought. Our mission now is the next picture I have on the screen: a set of stairs with a ramp beautifully designed into them, made out of a stone material. It has a nice gradual gradient; it's not a straight line up beside the stairs, so you do not have to exert a large amount of effort to get up it, because it's a much longer ramp. It is aesthetically pleasing, and in fact more people probably walk the ramp than walk the stairs, because, whether through muscle soreness or stiffness, the ramp is more accessible to them.
Emmanuele Silanesu (01:11:59):
Again, this comes back to those two images that I've described. Hopefully, if you cannot see the images, I've given an accurate verbal description. What they illustrate is something that we should be thinking about in technology: accessible features should not be an afterthought. They should be built into technology, just as a ramp should be built into the physical stairs we see. From that perspective, Satya made this comment to the Microsoft shareholders meeting back in November 2016: "We will focus on designing and building products that our customers love and that are accessible to everyone and built for each of us."
Emmanuele Silanesu (01:12:44):
Now, you may not be aware, but Satya has two children who greatly benefit and will greatly benefit from this technology. Hence, he himself has a vested interest in making sure that we create accessible technology for the future. That comes back to his mission of empowering every person and every organisation on the planet to achieve more. That really is what he would like to achieve as his legacy; it's what drives him every day. With that, I'm going to flick over to a short video that I'm going to share with you. Hopefully the audio is loud enough for everybody to hear. I'm just going to bring up the video, drag it across to the monitor I'm sharing, and hit play.
Saqib Shaikh (01:13:50):
I'm Saqib Shaikh. I lost my sight when I was seven. Shortly after that, I went to a school for the blind, and that's where I was introduced to talking computers. That really opened up a whole new world of opportunities. I joined Microsoft 10 years ago as a software engineer. I love making things which improve people's lives, and one of the things I've always dreamt of since I was at university was this idea of something that could tell you at any moment what's going on around you.
Pivothead Smart Glasses (01:14:25):
I think it's a man jumping in the air, doing a trick on a skateboard.
Saqib Shaikh (01:14:31):
I teamed up with like-minded engineers to make an app which lets you know who and what is around you. It's based on top of the Microsoft intelligence APIs, which makes it so much easier to make this kind of thing. The app runs on smartphones, but also on the Pivothead smart glasses. When you're talking to a bigger group, sometimes you can talk and talk and there's no response, and you think: is everyone listening really well? Or are they half asleep? You never know.
Pivothead Smart Glasses (01:15:03):
I see two faces, 40 year old man with a beard looking surprised, 20 year old woman looking happy.
Saqib Shaikh (01:15:09):
The app can describe the general age and gender of the people around me and what their emotions are, which is incredible. One of the things that's most useful about the app is the ability to read out text.
Service staff member (01:15:23):
Hello, good afternoon. Here is your menu.
Saqib Shaikh (01:15:25):
Great. Thank you. I can use the app on my phone to take a picture of the menu, and it's going to guide me on how to take that correct photo.
Seeing AI voice (01:15:34):
Move camera to the bottom, right and away from the document.
Saqib Shaikh (01:15:37):
And then it will recognize the text. Read me the headings.
Seeing AI voice (01:15:41):
I see appetisers, salads, paninis, pizzas, pastas.
Saqib Shaikh (01:15:46):
Years ago, this was science fiction. I never thought it would be something that you could actually do, but artificial intelligence is improving at an ever-faster rate. And I'm really excited to see where we can take this. As engineers, we're always standing on the shoulders of giants. Building on top of what went before. And in this case, we've taken years of research from Microsoft research to pull this off.
Seeing AI voice (01:16:09):
I think it's a young girl throwing an orange Frisbee in the park.
Saqib Shaikh (01:16:12):
For me, it's about taking that far-off dream and building it one step at a time. I think this is just the beginning.
Satya Nadella (01:16:24):
I'm here with Anne Taylor from our accessibility team. And Anne, you're going to talk to us about how you use AI in your daily life.
Narrating Voice (01:16:32):
And this morning.
Anne Taylor (01:16:34):
Absolutely, Satya. So Seeing AI is an ongoing research project that uses AI for computer vision to enhance the understanding and perception of the physical environment for people who are blind or have low vision. This app has been available for free on the iOS App Store for over a year now, and our customers have found it to be a valuable tool for interacting with their physical environment. Isn't that great?
Satya Nadella (01:17:05):
That's awesome. And I know you yourself use it every day. You want to show us how you use this app?
Anne Taylor (01:17:10):
Sure. This app has nine channels, and each of them offers specialised features, such as identifying products by using the barcode scanner. Let me show you.
Narrating Voice (01:17:22):
A smart phone, a box.
Anne Taylor (01:17:23):
I have Seeing AI set on the product channel, and I will scan the barcode.
Seeing AI voice (01:17:32):
Processing, Microsoft Surface Dial, cursive.
Satya Nadella (01:17:35):
That's pretty cool. So you were just able to switch to the product channel, put an object in front of it, and it recognised it. That's great.
Anne Taylor (01:17:43):
Now I would like to show you the one that I use most, the short text channel.
Narrating Voice (01:17:49):
She positions a braille business card.
Anne Taylor (01:17:52):
So first let me switch to it.
Seeing AI voice (01:17:54):
Share button, channel, person, product, document, short text.
Narrating Voice (01:18:00):
Taps her phone, points it at the card. Anne Taylor, Supportability Director Accessibility, Microsoft Corporation, Anne Taylor.
Satya Nadella (01:18:10):
That's so cool. You even had the card upside down and it was able to just read it.
Anne Taylor (01:18:12):
Yes, indeed. By the way, it's a typical occurrence for me to read business cards upside down.
Satya Nadella (01:18:20):
No, I can quite imagine. And can you then even take handwriting and recognise it as well?
Anne Taylor (01:18:25):
Yes. And this is really meaningful to me personally, because for the first time ever, I can use this app to independently read personal notes written to me by my family and my loved ones.
Satya Nadella (01:18:40):
Let's give it a try. This might be a real test of this app because if it can read my handwriting, anything is possible.
Narrating Voice (01:18:48):
He scribbles a note.
Anne Taylor (01:18:54):
Let me go ahead and switch to that channel.
Seeing AI voice (01:18:55):
Channel currency, scene preview, color preview, handwriting preview.
Narrating Voice (01:19:00):
She touches the note and positions her phone over it.
Anne Taylor (01:19:03):
I will take a photo, and it will pick out what you wrote here.
Seeing AI voice (01:19:05):
Take picture. Processing. Accessibility is awesome.
Anne Taylor (01:19:12):
Yes, accessibility is awesome indeed.
Satya Nadella (01:19:15):
Absolutely, we can agree on that. To see you use this application, the fact that we have this ever-growing list of channels and the power it brings to you, and to be able to put it to everyday use and empower people to do more with technology: it's fantastic. Thank you very much for showing us this application in use.
Anne Taylor (01:19:35):
Thank you very much. Satya, let's celebrate accessibility.
Narrating Voice (01:19:39):
Please high five Microsoft.
Emmanuele Silanesu (01:19:43):
So, that is just a taste of the videos that we have available. Like I said, I have about an hour and a half of content, and that's just one demonstration for vision. One of the areas I'm going to highlight now is a point that Scott actually made, which is that we should all be providing accessible content and checking our documents for accessibility. So I'm actually going to share a PowerPoint. I'll drag it up to my other monitor, so give me two seconds while I bring it up. Now, I've got a brand new PowerPoint coming up, and I'll make it full screen for everybody. For example, I'm going to delete my slide and start with a brand new slide.
Emmanuele Silanesu (01:20:38):
It's just a normal title slide that I've added in here. Now I'd like to insert a picture to make my title slide a little bit nicer than it would normally be. So I'm going to pick a picture that's on my device; I've got some photographs on here, and in this instance I'm going to add this nice vase of flowers into the slide. Using the artificial-intelligence engine that's built into the cloud, two things have now happened with that picture. The first is it's given me a few design ideas on the right side, so somebody who doesn't necessarily have the ability to design slides has some ideas for making the title slide a little bit more formatted and a little bit more attractive. At the same time, the picture has automatically had alternative text put on it.
Emmanuele Silanesu (01:21:40):
And so somebody's screen reader will automatically read that alternative text out, and you can edit it. It says that it is a vase of flowers on a table. That is actually a vase of flowers taken at my sister's house, in my sister's kitchen; I took that photo on my mobile phone. It also gives a note at the bottom to say that this description was automatically generated. Now, we have the accessibility checker built into all of our Office products. In the bottom here, it actually says that there is an accessibility issue and that I should investigate. If you go to the Review ribbon, just where you would normally find the spelling check, you can now also find the accessibility checker. What it will ask, before you send out the document, is for you to check whether that alternative text is correct, because it's generated using machine learning.
Emmanuele Silanesu (01:22:37):
And if you say, "No, actually I would like to change that image description slightly," you can. I'm going to give you an example of this right now. I'm going to create another new slide, another blank one, and in this one I'm going to put a different image to give you a sense of this. So I am going to insert a picture from this device. Cliff shared with me the website he was working on, I think with Ralph, the other day, around recreational fishing. So I wanted to prove to Cliff that I really do love my recreational fishing, and this is a picture I have inserted of one of my best friends holding one of the fish that we caught early last year. We did actually put this fish back, I'd like to say, because we are conservationists; we do not want a fish of this size to be affected.
Emmanuele Silanesu (01:23:26):
We took the photo and put it straight back, touching it as little as possible. The description it has come up with here is "a person holding a fish". I could go in there and edit that to say what kind of fish it is, because this is actually a mulloway, and the machine learning will start to learn by pattern recognition and give more accurate descriptions, because if I edit it, it starts to get more granular. So next time somebody puts a photo up, it might actually start to recognise what kind of fish it is, rather than just that it is a fish. That is the power of the artificial intelligence we have available to us today. So again, alternative text is just a small thing, because most of us never think of it, but for somebody who's using a screen reader it matters: I know there is nothing more frustrating for my friend Kenny than to receive an email with a screen snip in it that has no description of what it is.
Emmanuele Silanesu (01:24:26):
And all it says is, here's our new organisation change, and he's sitting there having to get somebody else to tell him what changed, because his screen reader does not have the ability to read that org chart. So this is one of the things we are thinking about in terms of how artificial intelligence can help people who are using this assistive technology to be so much more productive, just as an example. Now, my last demo, and I'm conscious of the time, Cliff, so wrap me up at any time, because as I said to you on the phone, I can talk about accessibility features till the cows come home and it's time to go to bed. I have lots and lots of demonstrations, but the last one I wanted to showcase is some of the reading technology built in for people who experience visual crowding, dyslexia, dysgraphia and so on.
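The alt-text check described above can be made concrete in code. This is not the Office accessibility checker itself, just a minimal sketch of the same idea using Python's standard library: scan an HTML fragment, such as the body of an email, and flag any images that carry no alternative text for a screen reader. The file names and function name here are made up for the example.

```python
from html.parser import HTMLParser


class AltTextChecker(HTMLParser):
    """Collect <img> tags that lack a non-empty alt attribute."""

    def __init__(self):
        super().__init__()
        self.missing = []

    def handle_starttag(self, tag, attrs):
        if tag == "img":
            attrs = dict(attrs)
            # Flag images with no alt attribute, or an empty one,
            # mirroring what an accessibility checker asks you to
            # review before a document or email goes out.
            if not attrs.get("alt"):
                self.missing.append(attrs.get("src", "<unknown>"))


def images_missing_alt(html: str) -> list:
    """Return the src of every image with missing or empty alt text."""
    checker = AltTextChecker()
    checker.feed(html)
    return checker.missing


doc = """
<p>Org update attached.</p>
<img src="org-chart.png">
<img src="flowers.jpg" alt="A vase of flowers on a table">
"""
print(images_missing_alt(doc))  # → ['org-chart.png']
```

Run over the example fragment, it reports only the org chart, the screen snip with no description, while the flowers image with its generated caption passes.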
Emmanuele Silanesu (01:25:23):
We now have the ability, and Alt+Q is the keyboard shortcut, to ask questions when you're in the Office suite. So Alt+Q for question, and you can search for what you're looking for. In this instance you might type "read out loud", and you can use natural text; it doesn't have to be the exact term you're looking for. I was actually looking for Immersive Reader in this instance, so I need to spell it right. What Immersive Reader will do is pop out that text, which just happens to be an email we sent a few months back, into a separate screen, and it will remove the visual crowding for me. From that perspective, it will also read it out loud.
Emmanuele Silanesu (01:26:10):
And this has been fantastic for children in schools at the moment who are schooling from home, with lots of information being sent to them through online classrooms, yet without the support teacher they may have had in the classroom to help them with their reading and with the work they need to do, and with the burden that places on their parents as well. So I'm going to get my little button out of the way here; that little box is in the way. Give me two seconds while I bring up the Immersive Reader. I've got three monitors running at once now, so things keep popping up on different monitors, but I'm only sharing one with you.
Emmanuele Silanesu (01:27:02):
So the Immersive Reader is coming up. It clears away the rest of the information so that you get a clean screen. It sends this information anonymously and privately through the cloud, and it will break the text down. We have the ability in here to change the text preferences. You can change colours for people that might have different colour-perception issues, to make it easier for them, along with different font sizes, different spacings and so on. We even have the ability to highlight the nouns or the verbs for the kids. And to be honest, that doesn't just help kids; it helps me, because sometimes when I help my son with his homework, I'm not sure whether something is a noun or a verb. I can't remember those things from high school, but we have the ability to highlight them here as well.
Emmanuele Silanesu (01:27:54):
So, is my screen still sharing, Scott? I'm looking at you guys. Yep, I see nodding; I just wanted to be sure that it didn't pop away. I can also get it symbolised, so that it will break the words up for the student as well. I'm using a student example because I use this with my kids, but this is impactful for people all over the planet, no matter what age you are, because it gives you the ability to consume content. I now listen to my emails on walks rather than sitting in front of my computer reading them, and I find that I actually read more emails by having them read to me than I do when I'm sitting in front of my laptop, because I get bored very quickly and I switch to the next email, and someone goes, "Did you read my email?"
Emmanuele Silanesu (01:28:44):
And I'm like, "Oh, I read the subject." So now I'm using my tools in Outlook. On mobile, we have a preview feature where you can have Cortana read your emails out loud to you as well. But in this instance, you can also slow it down and speed it up. Kenny listens to his learning at 400%, or four times speed. I listen to my learning, compliance training as we like to think of it at Microsoft, at 200%, and that's about as fast as I can go before it becomes unintelligible to me. But Kenny's trained his ears; he can listen at double the speed I can. So you have the ability to do that here: when you press play, it will highlight the words, and you can also change the speed of it and the voice of it, and go up to 150% if you'd like. I'm going to stop that demo. And I'm looking at you, Cliff: how much time have I got left?
Cliff Edwards (01:29:49):
I think that's time. But if you want to keep going, people can drop off or stay. We are recording it, so we will make it available to everyone after the meeting.
Emmanuele Silanesu (01:30:01):
Now, as I said, we have all seven pillars, as I like to refer to them, with demonstrations of all of the different accessible technologies that are available. I've just tried to highlight a couple that might be easy for people to utilise immediately and to refer to your friends. The Seeing AI app that was featured in those videos is downloadable from the iOS App Store. It's always a question: why did we do it on iOS and not on Android? I believe we're working on it now for Android, but at the time, given the demographics of people that were using iPhones, with the money invested we could get three times the number of people using it on an iOS device immediately. From that perspective, I have thousands of different resources, so my last plug is microsoft.com/accessibility, a great website which has all of the resources available, third parties and so on, on the Windows ecosystem as well as the Android and iOS ecosystems. We're not just a Windows platform anymore.
Emmanuele Silanesu (01:31:07):
And with that, we will send out dates, Cliff will be sharing with you the dates for the webinar. And I look forward to you sharing that with a number of your colleagues and friends and us sharing this message together. Thanks again, Cliff, and really appreciated the time today.
Cliff Edwards (01:31:25):
Thank you, Manny. There's a couple of questions in the chat for you as well. Just to add something here while you have a look for those: in the research we undertook for the online accessibility policy toolkit, the consistent message we received back, Australia-wide and internationally, was that Microsoft is recognised as a world leader in a lot of areas of this work. So I'm really excited for the webinar video series, and we'll send more details out to everyone on the meetup distribution lists, the website and through our media and comms.
Emmanuele Silanesu (01:32:06):
Yeah. Like I said, it's a mission; we're never there. It's not a case of "we are there and this job is done". It will be a job that we continue to undertake every time we build a new product, a new technology or a new service for our customers: how we can make it just that little bit more accessible for every single one of us.
Cliff Edwards (01:32:27):
Thanks again, Manny. I know we have sort of pushed for time and we've gone over, but how do you want to manage the questions in the chat? Just to answer them in the chat after the meeting? Or do you want to do that now?
Emmanuele Silanesu (01:32:41):
Look, I'm happy to go back and scroll through and type a response to each of them; that way, people will get a notification when I've responded to them all. But if anybody wants to come off mute and ask me a question now, I'm happy to take it. I've got another half an hour up my sleeve; I could keep talking.
Cliff Edwards (01:32:59):
Yep, absolutely. So yeah, officially we'll close, but if anybody wants to hang around and ask questions of Manny, Charlii, Scott, or anybody else they see in the meeting, I'm more than happy to keep the meeting open. Thanks everyone. Thanks Charlii. Thanks Scott. And thanks Manny, for your presentations and time today. We really appreciate it. And I'll open it up to anybody brave enough to ask any questions or come off mute.
Emmanuele Silanesu (01:33:37):
Is Charlii still there? I know Charlii did ask me a question before the meeting.
Charlii Parker (01:33:42):
I am still here. Yes.
Emmanuele Silanesu (01:33:44):
Charlii, your question was how to end a Teams meeting using a keyboard shortcut. I didn't want to disrupt Scott, since he had already started by the time I found the answer, so I'll give it to you now. I found two shortcuts that worked, because they both cut me off. The first one was Ctrl+Shift+B, B for Bob: Ctrl and Shift and B held together. Now, that was in the preview version of Teams, so I'm not a hundred per cent sure if it's in the generally available version, but Ctrl+Shift+B did work for me. The other one I used was, I think, Ctrl+Q, but it logged me out of Teams completely. So that one not only ended the meeting, it also logged me out of Teams, and I had to log back in.
Charlii Parker (01:34:39):
Well, I will pass those on to Chris for our webinar today, to make things easier on him. Thank you for that.
Dr. Scott Hollier (01:34:49):
Manny, can you hear me?
Hey, I just wanted to thank you very much for your talk. That was terrific, and Microsoft have always been great leaders in this space. I also want to thank Microsoft for looking at the Android version of Seeing AI, because I've been a long-time Android user for my personal needs. Yes, I know that iOS tends to be the platform of choice for people who are vision impaired, but Android has come on leaps and bounds. So to have apps like Seeing AI, and perhaps Soundscape if that's an option, supported there, it would be terrific to see that development happening.
Emmanuele Silanesu (01:35:27):
It's something we continually get asked about, and one that we've fed back to the research and development team who develop those apps; I believe that work is underway. Seven years ago, five, I might be exaggerating, you wouldn't have even heard of Microsoft developing Office applications for iOS, because they were considered such a formidable competitor. It's actually brought a new perspective, in that it isn't a case that people have to use the technology we create; people get to choose which technology they want to use, whichever is best for them. That's why we are working now on the Android version, and a lot of our new developments are actually coming through on the Android platform as well as iOS.
Charlii Parker (01:36:22):
Just to come in on that: having worked with people who are blind or have low vision, Seeing AI has been an absolute lifesaver for those that have iPhones. I've set up their devices so they could just pop the phone on a little stand, put their letters under it, and read their own mail for the first time in their lives. So on behalf of those people, they are so thankful. And a lot more blind and low-vision users are using Android now, just because it has come a long way and the cost is generally a lot better; they'll be very excited to hear that it's coming for Android as well.
Emmanuele Silanesu (01:36:59):
Don't quote me on when.
Charlii Parker (01:37:02):
I won't put it out there. So there's obviously no GAAD announcement about Seeing AI for Android?
Emmanuele Silanesu (01:37:07):
Well, I've actually got to get up at midnight tonight to join some of the presentations, so I don't know; I honestly have no idea. But I'll be going to bed early tonight to get some sleep in, because I've got some sessions to join in the middle of the night.
Dr. Scott Hollier (01:37:24):
In fairness, there are some decent third party...