Although one in six Australians has some form of hearing loss, deafness tends to be forgotten. Perhaps because the majority of deaf Australians don’t sign, there are no visible reminders of just how many of us there are around. Many deaf people don’t announce their hearing loss and you may never know that several of the people you communicate with are actually struggling to understand you.
As an example of how we are forgotten, the train station closest to where I lived for 20 years has rubber strips on the ground and braille on the machines for blind people, and ramps for people who use wheelchairs, but there is no accommodation for Deaf people. To find out when the next train is coming, or any other information, one must press a button and listen. Clearly, efforts have been made to make the station accessible, but it seems those efforts did not include deaf people.
The emergency department of the hospital closest to me has an intercom in order to access it late at night. One time I drove a friend to emergency at two in the morning. I left my friend in the car while I went to get hospital staff, but I was unable to talk with them through the intercom to explain what was needed. It was very stressful. I had to get my extremely ill friend to come and handle the access side of things! This makes it difficult for me to be a good support person and carer when the need arises, as I become a liability to the very person I am attempting to support!
In the small town where I live, the police station is not staffed around the clock. They have an intercom that connects people to the police station in the next town. So if I go to the police, I can’t communicate with them!
When filling in forms, it is routinely required to give a phone number, but there is nowhere for me to tick to say that this is for text messages only, no voice calls. People routinely call me on the phone, and to understand how stressful that is, read The stress of a phone call for Deaf people.
When couriers or postal services deliver parcels to my address, they will sometimes call me to clarify when they can deliver or where to leave the parcel. It doesn’t occur to them that I might be Deaf and unable to take their call. When entering phone numbers on parcels, there is no box to tick saying that I am Deaf and should be contacted via SMS. Frequently, parcels are returned to the depot, and the only way for me to track them down is to make a phone call.
The support departments of numerous organisations can only be contacted by phone, and will only deal with me by phone call. We live in an era where email is ubiquitous, yet many organisations, the ATO and Centrelink among them, refuse to deal with me by email. I use the National Relay Service (NRS) to call such organisations, but I am frequently on hold for a long time, and the chance of the NRS cutting out before the call is complete is high. Even then, I can’t convince the organisation to begin the conversation by email. Organisations regularly ask me to call them, and it doesn’t occur to them that I might be Deaf. It would be better if they said, ‘please call or email,’ and then provided email contact details as well as phone details.
With the pandemic, I worried that contact-tracers for Covid-19 would not consider the possibility that the person they are trying to contact is Deaf. Usually when people make voice calls to my phone I ignore them, because the alternative is to hassle someone into ‘helping’ me, and usually that is intensely frustrating. If only I could be confident they would send a text message as well as attempting a voice call.
How to include Deaf people
When you make or update something, whether it is a service, a form, a website, a product, a building or a course, ask the question: how will Deaf people access this?
This needs to become normalised. We need a huge campaign to get people to change their approach and start incorporating Deaf access. The more people who do it, the more forms that allow you to tick that you are Deaf, and the more Auslan interpreters and open captions there are, the more others will think to do the same.
A campaign to kick-start this, with the government leading the way, would be excellent.
I was recently asked to judge the Readings YA book award. I found myself quite confronted by this. I mean, who am I to judge books? Won’t my judgement be skewed towards my personal taste in books? I like books that explore contemporary issues, with valuable messaging, where I learn something new, and enjoy escaping into the world the author has created. I know my enjoyment of the escape is partially made possible when a writer writes well – smoothly and seamlessly. But part of my enjoyment is surely personal taste – I like, for instance, gothic-type worlds, and contexts that are radical, feminist, pushing the boundaries of the status quo. How can my judgement of a book be fair? It will be skewed towards my personal taste.
But a friend pointed out that the fact that my book, Future Girl / The Words in my Hands, has won that particular award as well as many others, means that I have the capacity to pick up on certain aspects of writing, such as whether a book is ‘well written’ or not, whether an enjoyable world was successfully created, and how to make it interesting enough for readers to persist with. So when organisers ask me to judge an award, what they are really asking is for me to identify the book(s) that have the qualities mine has. In that sense, it’s ok for my personal taste to inform the decision. It’s ok for me to consider whether I personally enjoyed going into that world or not. It’s ok to favour books that push the boundaries, because that’s what people are looking for now – not just me.
The books shortlisted for the award are terrific. I have very much enjoyed reading them so far and highly recommend them as good quality YA books. I am very glad I am not the only judge, though; any personal taste of mine that is not a ‘universal’ qualitative assessment of the book will be balanced by theirs. I am very curious to see how our discussions of the books will go, whether we agree about our experiences of reading them, and which should win.
School seems like such a good thing. Education is power, right? We are still fighting for the rights of girls in so many countries to be educated. And yes, while we need an education, there is something about our school system that sets up an expectation for how our successful life should be. The system grooms us to fit into the constructs of our society, to accept a life of 9-5 slavery to pay for a 30 year mortgage. Where’s the focus on life? On what makes a good life? Might we be happier if we ditch some of the things we are working to pay for, and spend more time just being, connecting with nature and the people we love? But the idea of finding that balance does not seem to be a core component of our education system. Instead we are indoctrinated to believe it is normal, healthy, and successful to devote most of our waking hours to a financially profitable career.
This painting is a part of my exhibition, Love, Lies and Indoctrination, which can be viewed online here.
If you’d like to buy this piece, it’s available here.
Auslan was brought to Australia by the Deaf convict Betty Steele, who used British Sign Language (BSL). Over time, and with separation from England, our sign language has evolved into a separate but similar language. Presumably it was Betty Steele or one of her friends who made up the sign for ‘Australia’ – you can think of the sign as picking up people in England and disposing of them by dropping them down in Australia.
This video shows signs for the name of our country, Australia, and our states. Notice that several of them are simply letters of the alphabet.
New South Wales
My apologies, but the video omits a few places. Here’s a description for how to sign them:
Tasmania – fingerspell T A S
Hobart – fingerspell H and point down
Australian Capital Territory – fingerspell A C T
Canberra – With your non-dominant hand, form a ’1’ with the pointer finger. With your dominant hand, create the letter C. Rest the letter C on top of the pointer finger.
This post is part of my free online Auslan course. See the rest of the course here.
To learn more about what it is really like to be Deaf, details about the Deaf community and how Auslan is used by Deaf people, read my book, Future Girl.
If you like my book, Future Girl / The Words in My Hands, and are interested to do some art journaling along those lines, you might like to try my art therapy course, Pour Your HeART Out. In this video I talk about what you’ll get from the course and show snippets of the course content.
When the pandemic started, I began to watch ABC news, via Apple TV. Although the news has closed captions, which I set to be switched on, there are numerous problems with this technology, and these are problems that occur across many streaming and live television platforms. Streaming and television services may feel that in providing captions, they have ticked ‘access for Deaf people’, but more needs to be done to truly provide access. In this article I describe some of the issues and measures that can be taken to fix this.
For news, live captions are often used, which means that a person sits there typing what is said, and the words appear on the screen a little later. With live captions there can be a significant lag between the speech and the captions, which means that when we see a picture on the screen, we don’t understand what that picture is about until a few seconds later, when the captions appear. Meanwhile, the picture changes, so that when the captions do appear, we are looking at totally different content. To relate the captions to the picture, we need to mentally recall what was on the screen a few seconds ago, while simultaneously collecting the current images ready for when their captions appear. This is an exhausting process.
To make it even more challenging, it is common for errors to occur in the captioning – that is natural since it is a human sitting there typing. But even so, the errors are sometimes ludicrous. I recall seeing ‘cupboard 19’ instead of ‘covid 19’ and ‘palla shan’ instead of ‘Palaszczuk’ (the Queensland premier). Surely a person captioning the news has an obligation to become familiar with the words used in the topics of the day? Mentally correcting the numerous errors while coping with the time lag adds another level of stress to the experience.
A further frequent problem is that live news is often shown with scrolling text at the bottom of the screen, summarising breaking news and the main news items of the day. When the subtitles are overlaid on top of this scrolling text, it is challenging for the brain to filter out the background text and read only the subtitles. It is a chaotic image to process.
Combine these three issues together and watching the news becomes a form of mental gymnastics.
On Apple TV, the ABC news has some further problems. First is that when I watch the live-stream news, the captions only sometimes work. I have noticed that if I wait until an hour after the news segment, it will be published as a standalone episode, and the subtitles are more likely to work then. This means waiting a whole hour, rather than watching it live, which is frustrating.
Further to this, when episodes are published with captions, those captions were generated as live captions and are not corrected. When watching the news live, I accept that there are some technical issues I need to put up with. But when I do not get to watch the news live, I would appreciate the captions being done properly, synced with the image and with errors corrected. It would only take a small amount of time for an editor to go through the captions, sync them and correct them. But this is not done, so regardless of when I watch the news, I have to do the stressful mental gymnastics involved in processing live captions.
I emailed ABC to raise these issues with them but they never got back to me. Likewise, other Deaf people have complained and had no response.
Since the start of the pandemic, many government officials have had an interpreter with them when they present to the press, which is a welcome improvement when it comes to access. However, news services that then broadcast these press conferences do not edit with the interpreter in mind. For example, they will show a brief snippet of the premier making a specific statement, but because the Auslan interpreting runs slightly behind, the footage will show the interpreter signing the end of the previous topic, then the start of what the premier says, and then the interpreter is cut off mid-sentence when the premier stops talking, not yet finished relaying. Thus, attempting to watch interpreters on the news is a disjointed and frustrating experience. The only Deaf people who benefit are those who actually attend the press event in person, which of course is a small proportion of those who would like to access the news. Just as captions need to be synced, news segments need to be edited with Auslan interpreters in mind, so that the interpreters sign complete sentences that make sense.
In general, on streaming platforms, there are some common problems that frequently occur when it comes to captions:
While some platforms allow you to turn on captions and then they remain on for everything you watch, many platforms require you to turn on the captions afresh for every episode. This is annoying.
On some platforms, within a single series, there may be captions for some episodes but not others. This is very disappointing when we get drawn into a series and then discover it is only partially captioned.
For overseas movies and television series that are shown in another language, frequently the overseas language component is captioned, but when the actors speak English for some scenes (as frequently happens in foreign shows), the captions for the English scenes are often forgotten. Thus we miss part of the story.
If you are involved with a news or streaming platform, I encourage you to take on board these issues and create processes to resolve them. It would not take a great deal of effort to:
Correct and sync live captions before publishing an episode;
Double-check the captions are actually streaming during live news segments;
Double-check the captions are actually present before publishing an episode;
Remove the scrolling breaking news text or place it in a different part of the screen than where captions routinely appear, such as at the top of the screen;
Edit with Auslan interpreters in mind so that segments make sense to Deaf audiences too.
If you are a Deaf person frustrated by these issues and want to do something about it, please copy this entire post and send it to your preferred television platforms, asking them to take action.
Thank you for my new plan, and the funding you have given me to support my disabilities. However, I am confused by your decisions. I applied for a review of my plan because my circumstances had changed – my disabilities have increased (i.e., I now need to use a wheelchair, whereas before I didn’t). I was hoping to get some additional support for the areas where I am now struggling.
I understand that in order for you to be confident that these new disabilities are real, you require me to be assessed by an occupational therapist (OT) and by doctors, and for reports to be written about me and my functional capacity.
I must admit I have been shocked at the amount of time and cost involved in doing these assessments. I would estimate that over the past year, since I first applied for the review, the appointments and the reading/editing of reports (a separate and very long report for each thing I request you fund) have used up an entire half of my available energy. That’s a LOT. It has been an exhausting and draining process. But I have persisted with it in the hope that eventually I will be set up for this new disability, and I can stop focusing on it and instead get on with my life.
I have also been a bit shocked that such a large amount of my previous plan had to be spent on the various professionals who have assessed me and written all the reports. I don’t dispute that their time is valuable and should be paid for. But perhaps condensing the reports, and not requiring quite so much paperwork, would be more efficient. It would be less for you to read, take up less of my time, and cost less in OT fees. More of my plan could be spent on things that actually support me.
Given that you spent so much on my OT, I am rather surprised with my new plan that you disregarded her recommendations. What was the point in having her write all those reports if they weren’t going to be heeded? Was this not, perhaps, a waste of taxpayers’ money? And my time? Did you folk at NDIS actually read the reports? Because the plan you gave me doesn’t seem to reflect anything in the reports about my specific circumstances. It seems rather… generic.
I was surprised to see that in some areas where I am struggling the most you actually cut my funding, rather than increasing it, despite my disability increasing. Does this mean you felt my previous plan was too generous? Does this mean you don’t believe my increased disability needs support? If the report from the OT is not enough to convince you, what would it take? And… why did you require those reports in the first place?
Could I suggest a more efficient system?
You could require a single functional assessment from an OT, which outlines my disabilities and functional capacity. This could be entered into your system and never need to be written about again. Each time a new disability/functional limitation is encountered, it could be added to your database, along with details about how this affects a person.
Let’s give an example of Deafness. Deafness affects different people in different ways, and no two Deaf people are the same. But there are some broad categories that inform the types of support needed. For example, Deaf people who use Auslan and did not have access to language as young children usually have difficulty with English. A Deaf person with a large amount of difficulty with English and little childhood language access usually finds it very hard to access information in today’s world. For example, such a person might struggle to do a Google search because they cannot read and interpret the results. They might struggle with everyday tasks such as cooking and health management because they have never received adequate education in Auslan in these areas. They might have difficulty understanding subtitles on TV. This could affect their behaviour, making ‘social appropriateness’ difficult when it comes to interacting with the hearing world. Such a person needs significant support to catch up on missed learning areas and help them access and function in our society. Providing an interpreter for social and work occasions doesn’t even come close to providing the type of help people in this situation need.
The average OT that I have met has only a superficial understanding of Deafness, and thus does not realise that they need to enquire about these types of barriers. We Deaf people have been advocating for each other and teaching OTs about our needs, and getting them written into the reports. But it’s an exhausting and draining process, and different Deaf people have different levels of access to advocacy, and varying skills when it comes to self-advocacy, so there are plenty of Deaf people who need additional support to access the world but are not getting it, because they don’t know how to ask for it and their OTs don’t realise it is needed.
For every disability, for every subtle way it affects people that is brought to the attention of the NDIS, this could be entered into a database, so that forevermore, NDIS staff and OTs have access to this information.
It would not be difficult to create a program that OTs and NDIS planners can use which accesses this information and helps inform the types of support. For example, enter ‘Deaf’ and up comes a question about the level of skill with the English language. Depending on the skill level entered, a range of supports are listed, which an OT can select from (and add to if a required support is not listed).
That way, every bit of advocacy that has ever been done for any Deaf person who struggles with English would all be right there in the database, along with explanations and justifications for those who are not familiar with why a certain support relates to a certain disability. As the database expands, we would no longer need to advocate for ourselves.
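For any programmers reading this, the database idea above could be sketched very simply. This is a minimal illustration only: the disability names, follow-up questions and supports below are hypothetical examples I have invented, not NDIS categories.

```python
# Sketch of the proposed supports database (all data hypothetical).
# Each disability entry carries a follow-up question and a mapping from
# answers to suggested supports, which an OT could select from or extend.

DISABILITY_DB = {
    "Deaf": {
        "question": "Level of skill with written English?",
        "supports": {
            "low": [
                "Auslan tutoring to catch up on missed learning",
                "Support worker for information access (forms, mail, internet)",
                "Auslan interpreter for appointments",
            ],
            "high": [
                "Auslan interpreter for social and work occasions",
                "SMS/email-based contact arrangements",
            ],
        },
    },
    "Orthostatic intolerance": {
        "question": "Can the person tolerate standing briefly?",
        "supports": {
            "no": ["Recline-capable wheelchair", "Home modifications"],
            "yes": ["Lightweight wheelchair for longer outings"],
        },
    },
}


def suggest_supports(disability: str, answer: str) -> list[str]:
    """Look up suggested supports for a disability, given the OT's answer."""
    entry = DISABILITY_DB.get(disability)
    if entry is None:
        # New circumstance: the OT writes a short report, and the
        # disability is then added to the database for everyone after.
        return []
    return entry["supports"].get(answer, [])


print(suggest_supports("Deaf", "low"))
```

Each time a new circumstance is approved, it would be added to the database, so the lookup grows richer over time and the advocacy never has to be repeated.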
My new disability, which requires me to use a wheelchair, is orthostatic intolerance (OI) – an inability to tolerate being upright. When I stand up, blood is not maintained in my brain, and I feel faint and sick and cannot think. This appears to be a fairly obscure disability, since the wheelchair companies I dealt with did not seem to have encountered my needs before, and thus didn’t know how to accommodate them. My OT, likewise, did not understand. I had to do a lot of educating of the professionals around me, a lot of correcting of their reports, and a lot of rejecting of proposed wheelchairs that didn’t meet my needs. My disability is actually a standard thing, well established medically, and the symptoms are outlined in numerous places all over the internet. I assume I am not the first person who has been funded by NDIS for OI. It would be really good if the information from those who have gone before me was captured and able to be utilised for me. If I am truly the first, then the condition I have described, and the ramifications it has on my everyday life, should be captured so that the next person who applies with OI does not have to go through the enormous draining process I have gone through to educate the professionals.
NDIS, by capturing this information, and using it to refine and gradually standardise supports for various disabilities and circumstances, you could make life so much easier for all of us, and so much cheaper for taxpayers too! OTs would only ever need to write a report for a new circumstance that has not yet been encountered in your database.
To assess me, the OT and I could sit down together at a computer, enter the disabilities, and then choose which supports are appropriate for me. A standard amount could be given by NDIS for each of these supports, and if a greater amount is needed, the OT would write a short explanation of why. If NDIS accepts it, that new circumstance with the greater level of support would, of course, then be entered into the database.
Think how much more efficient it would be! Not only that, but once we have entered our circumstances into the system, we would be given a list of supports we can expect to receive, just as we receive a tax estimate once we have entered our financial details at tax time. I hope you will consider my suggestion.
In the meantime, I await my appeal and hope that my next plan will be better tailored to my circumstances.
Yours in hope, Asphyxia
PS. If you like my suggestion, feel free to let NDIS know. Maybe with enough numbers they will listen. You could send an email to firstname.lastname@example.org and say, ‘I support Asphyxia’s idea of creating a database of disabilities and supports, to reduce costs, reports and the need for self-advocacy.’ Or just email them the image attached to this post. Please share this post with others who may be interested in supporting the concept.