How to do accessible social media – 20th October 2020



Welcome

ANNIE MANNION: Hello, everyone. Welcome to today's webinar. It is just gone 1pm, so I will give everyone a chance to join. There are lots of people registered today; it is a very popular topic. If you have joined online, feel free to drop into the Q&A box to say hi to the panellists and the other attendees. We disabled the chat function as we discovered it caused problems for people using screen readers. I will leave it a few moments for you to arrive. I can see there are lots of you here now. Glad you could make it today.

OK, let's start now. Hello, everyone. Welcome to: How to do accessible social media. My name is Annie Mannion, I'm the Digital Communication Manager at AbilityNet, and I'll run you through what to expect from today's session. So, just to go through a few bits of housekeeping – Daniel, if you could move on to the next slide.

Housekeeping

We do have live captions on the webinar, provided by MyClearText – thank you, Heather. You can turn those on using the closed caption option on the control panel. There are also additional live captions via Streamtext. You can also access the slides for today on the website at abilitynet.org.uk/social-webinar.

If you have technical issues and need to leave early, don't worry: you will receive an email with the recording, the transcript and the slides on Thursday. Depending on how you joined the webinar, you should see a Q&A window. If you want to ask Daniel any questions, do drop them in the Q&A area for us to address later on, either after today's session or in a follow-up blog on the website. We also have a feedback page you will be directed to at the end, so if you have any future topics you would like us to cover in our webinars, please do let us know.

For those of you who are not yet familiar with AbilityNet, we support people of any age, living with any disability or impairment, to use technology to achieve their goals at home, at work and in education. We do this by providing specialist advice, services and free information resources, and I will share a little more about our services at the end of the webinar. OK, great.

Poll and Results

So, just before Daniel begins his presentation, I will start with a poll. I will launch the poll now. Please can you tell us what social media content you mainly produce? Is it mainly blog posts or articles; microblogging or updates – tweets, status updates, photos, things like that; podcasts or video; or all of the above? Depending on how you joined the webinar, you may not see the poll, but you can respond in the Q&A panel. I will leave that open for a few moments for anyone who sees the poll on the screen and wants to engage with it.

OK, I will end the poll now. And I can see the results: 12% of you have said blog posts or articles. The majority of you, 51%, have said microblogging – tweets, status updates or photos. 3% are using podcasts or video, and 34% of you say all of the above. I will stop sharing the poll now, and over to Daniel for some accessibility tips for all of the items mentioned.

How to do accessible social media

DANIEL MCLAUGHLAN: Thank you, Annie, for that wonderful introduction. Welcome, everyone. That poll is very interesting. There is definitely a lot of content today to do with microblogging, Twitter and Instagram, but also things on video as well, so hopefully you will find those useful.

Introduction

Social media is awesome. That is the first thing I want to say.
It gives us a lot of power and the potential to reach out worldwide and to reach a diverse audience. There are about 4 billion social media users – more than half of the population of the planet – and at least 2.5 billion are using Facebook alone. It also has the potential to inform, educate and entertain. We have seen that this year with the global pandemic: we have been able to keep in touch with friends and family, to get news and updates, and to share advice and information. We are always connected; it is always on. Most of us have smartphones where we can access the major apps – Twitter, Facebook, Instagram, YouTube – so we're always connected.

But that potential, that power, does not come without a warning label. If we don't think about the people accessing our content and the particular access needs they may have, we can unintentionally exclude. A lot of what I am talking about today will highlight particular disabilities and particular accessibility considerations, but this is for everyone, for all of us. Our bodies are constantly changing, our situations and contexts are constantly changing, and if we don't build that in, in terms of design and how we share content, we are ultimately excluding ourselves.

I am looking at four of the major social media platforms – Facebook, Instagram, Twitter and YouTube – and also breaking those down into what I am calling social content building blocks: text, images, audio and video. Hopefully, by doing that, if there is a particular platform I have not discussed, you can take the principles away and apply them to that platform.

Text

So, text... text is the foundation of the web. It is in the very technology: HTML, HyperText Markup Language, is built around the idea that we can connect through text to make a network of ideas. Text by itself is fundamentally accessible if we use it in the right way.

Complexity

In terms of access needs, consider the complexity – the bar, the level of complexity you are setting when producing your text. Avoid making assumptions about the education, the literacy or the knowledge of the people interacting with your content. Keep it simple, basically. Use simple words with fewer syllables.

Be direct and use the active voice: "the Government produced a report" rather than "a report was produced by the Government."

Avoid jargon and abstract concepts. Abstract concepts can be particularly difficult for someone who has aphasia, which is a speech and language condition, commonly following a stroke. It does not affect intelligence, but it affects the ability to understand and to produce language.

In general, aim for a lower secondary education reading level. That is ages 11-13, or in the UK – in England at least – years 7-9 in high school.

Language

Consider language... we often use everyday terms without thinking, but those terms can come with their own biases, their own meanings beyond the ones we intended. Being mindful of what those negative connotations could be will stop us unintentionally excluding people. Think about common phrases we may use, like 'blacklist' or 'blind drunk', and how they reinforce a negative stereotype of someone being somehow different or less than, due to the colour of their skin or because they have a particular disability.

Use gender-inclusive language where possible. Don't always go for the masculine default. Remember that gender itself is a spectrum, so don't fall into the binary trap of he/she.
If you can, depending on the context, it may be appropriate to ask people how they wish to be interacted with – which pronouns they use, for example. There is an accessibility lead, Fen Slattery, who created a fantastic zine, Pronouns 101. It is on their Itch.io page and it breaks down some of the most common pronouns and how to use them.

Avoid phrases that suggest victimhood – phrases like "suffers from" or "wheelchair bound". It reinforces the stigma that disability is something to be ashamed of, when that is not the case at all; many people are proud of their disability. It's a core part of their identity.

And avoid reinforcing existing stigma, particularly when we're talking about mental illness or mental health – saying words like "crazy", or using the term OCD to mean that you're very detail oriented. OCD is a real condition with real-world impact, so be mindful of using terms like that.

Provide trigger warnings on your content. I might be scrolling through Twitter and find myself on a thread, and before I know it I have gone through tweets that potentially have content in them that is going to be traumatic for me. So provide a trigger warning – it's very common on Twitter to use TW or CW for trigger warning or content warning – just to ultimately give me the choice about whether I want to interact with the content.

Emojis and Symbol Fonts

Think about your emojis and symbol fonts. Emojis are fantastic for communicating – they convey an emotion very easily and quickly – but they come with their own inherent, literal meanings. I have three on the slide: a grinning face, a smiling face with hearts and a confused face. That third emoji does not say "confused" to me; it says unsure or worried. But beyond that, the literal meaning of the emoji is something picked up by assistive technologies. A screen reader, for example, will announce the literal term for that emoji, its literal definition. That can be laborious if you fill a profile or tweet with emojis.

With symbol fonts – this is where you see stylised, almost italic fonts, which is quite common in people's profiles or in tweets to highlight something – that is not a real font. It is a set of Unicode characters that is often not picked up by assistive technologies, so if I am using a screen reader – and it's not exclusive to screen readers – I will often miss out on that information entirely. As a general rule of thumb: unless your platform gives you built-in options to change the font, avoid copying and pasting fonts from another site like that. And be sparing with your emojis.

By way of example, I have a very short clip of a Twitter profile I created. I'm using some stylised text and symbol fonts in the profile, with a couple of emojis, and a tweet underneath with the clapping hands emoji after each word to give a round of applause. This is what it sounds like when you run it with the VoiceOver screen reader built into iOS.

[ Short clip of VoiceOver reading out a Twitter profile and a tweet ]

DANIEL: So, the takeaway there – I hope it came across – is that in the profile the stylised text and symbol fonts were not announced. All we heard were the two emojis, cat face and standing chick; again, those are quite literal descriptions. In the tweet I had a sentence along the lines of "It's a good idea to wake up your owner at 3 o'clock in the morning", but after each word there was a clapping hands emoji, so we heard that quite repetitively. That is going to be extremely annoying, but also potentially quite painful, to someone who is navigating that content.
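To make the point concrete: emojis and "fancy" symbol-font letters are just Unicode characters with formal names, which is roughly what a screen reader works from. Below is a minimal Python sketch (not part of the talk) using the standard library's unicodedata module; the sample tweet is made up.

```python
import unicodedata

# Every emoji and "symbol font" letter is a distinct Unicode character with a
# formal name. Screen readers typically announce something based on that name,
# and stylised letters may be skipped or mangled entirely.
samples = ["😀", "😕", "👏", "𝐀"]  # grinning face, confused face, clapping hands, bold maths 'A'
for char in samples:
    print(f"{char}  U+{ord(char):05X}  {unicodedata.name(char, 'UNNAMED')}")

# A clap after every word forces a listener through the emoji name repeatedly,
# which is roughly what the VoiceOver clip demonstrated.
tweet = "It's 👏 a 👏 good 👏 idea"
spoken = "".join(
    f"({unicodedata.name(c).lower()})" if ord(c) > 0xFFFF else c
    for c in tweet
)
print(spoken)
```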
Hashtags

Think about your hashtags. A hashtag is a way to tag a particular word, with a pound sign (#) in front of it: if I write content under a term and you write content under the same term, people can explore that term to bring up both pieces of content. But be mindful about how you write them.

I suggest 'camel case': capitalise the first letter of each word. It is easier to read, as I can see what the words are, and it gives breathing space. If I am using something like a screen reader, it will give the natural pauses between the words.

You also want them to be short, in terms of readability; if they are long, they are difficult to read.

And usually put them at the end of the post. I don't want to wade through seven or eight hashtags to get to the content of the post. Typically at the end, but you can – like the example on the slide here about our TechSharePro event from last year – have a hashtag in the middle of a sentence where it is appropriate, as it will be announced as "hashtag Tech Share Pro".

Images

So, images. We have lots of choices when it comes to sharing images. The major platforms all allow us to share them, and they can be static – selfies, for example – or animated images, like GIFs.

Image Descriptions

Provide image descriptions. Just as we cannot assume the level of literacy when it comes to our text, we cannot assume that everyone is perceiving our images in the same way. It is not to say that someone must be completely blind: they may have a form of visual impairment that means they rely on assistive software, or images may appear blurry to them, for instance if they have cataracts. Again, try to avoid assumptions and don't think in binary terms – "this person cannot see the image at all". Image descriptions are useful whenever someone accessing them is using assistive technology, so that they can get the purpose of what that image is meant to be.

I keep saying assistive technology because it is not just about screen readers. I mentioned that text is fundamentally accessible. If you provide a textual alternative for images it can be repurposed: it can be sent to a refreshable braille device for someone who is deafblind; it can be picked up by search engines, which can use it to semantically understand your content; and it can also be used by the web browser, so if for any reason the image fails to load, there is text there to convey the purpose of the image.

Most images on social media are informative. We talk about images as being functional, informative or decorative. The obvious functional example is the logo on a website, typically a link that takes you to the home page. But we don't often have that option on the social media platforms; the images we put there will be selfies, pictures of products, that sort of thing. So they are informative, and we want to provide an image description that conveys that information.

On some platforms – all the major platforms can do this – it is easier than on others; on some, unfortunately, the option is hidden. Some platforms will do it for you: they take your image and generate automatic alternative text. You need to review those, as they are often not fit for purpose. I have an image on this slide, a picture of my cat, from Instagram. I used a similar image in the Instagram profile, and the automated text was "blue sky, sunglasses" – so completely wrong.

Keep it short. Like the hashtag, get to the point. Tell me the relevant information in the image. If I am sighted I can glance at the image and take in the key detail.
I don't need to know that, in the image of the cat, the blanket has a pattern with lines and circles; that may be too much information. And avoid redundant information: you don't have to say it is an image or a picture, as assistive technology like screen readers will announce that sort of information.

How to add image descriptions on Twitter

So, on Twitter, the option is there to add an image description, also known as alternative text. When you attach an image to a tweet, there is a button with a plus symbol and "ALT" – so, add alt text. When you activate that, you get an input prompting you to describe the photo. The contrast on that input is very bad – it is grey text on a black background – but that is where you write the description. Here is an example for the cat in the garden: "Black and ginger tortoiseshell cat..." I picked out what was relevant. Writing a good image description is as much an art as a science, and I am not saying this is the best example, but I tried to keep it short – 125-140 characters – and get to the point. When you post the image, in the timeline there is an ALT badge to signify that the particular image has alt text. It is not available to me as a sighted user; it is read out by a screen reader when I land on the image.

How to add image descriptions on Instagram

Instagram is similar. Typically on Instagram you choose an image and pick a filter. Then, before you post the image, you add a caption, which appears below the image, and you can add different accounts to post it to. But at the bottom there is an Advanced Settings option, at least on the iOS version. This is in bad contrast again – light grey text on a white background. Activating that gives you more options, one of which is Write Alt Text; that is where you want to describe your image.

So I have another short clip, an example of going through that process. This one is from Twitter, but it should be the same kind of thing.

[ Short clip of VoiceOver on iOS reading out an image tweet with alternative text ]

So, it was the same image on Twitter. The VoiceOver screen reader announced the hashtags as part of the tweet – I had capitalised them – and it announced the description of the image: "A black and ginger tortoiseshell cat on a white stone wall."

How to add image descriptions on Facebook

Facebook again provides the option, and it is slightly different depending on whether you are on desktop or mobile. I have iOS screenshots – I'm Apple biased, I'm afraid. Uploading the image on Facebook, you post the image first, then above the image there are three dots. When you activate those, there is an options menu, and one of the options is Edit Photo; you can then choose to edit the alt text. On desktop, I think it is also called Edit Photo, but it is not in the dots – it is on the image itself, where a menu pops up. You can add in the alt text and publish the image. Facebook, if you are on desktop, will give you automatically generated alt text, and there is an option to override it. It does not appear on the iOS app, though; there you have to enter it yourself.

Best Practice for Image Descriptions

In summary for images – best practice for image descriptions, rather:

Keep it clear and concise: 125-140 characters.

Be meaningful; the context matters. The way that I describe the cat on my Instagram feed is different to how a pet adoption agency might describe it on their business Instagram feed.

Include the relevant information. You don't have to describe everything in the image; even as a sighted person I will not take all of that information away.
Only include what, in the context of the page, you are trying to convey to me.

A slight sidebar for functional images: if it is a link that leads somewhere, describe the function – "AbilityNet home page", for example – and give me the confidence of knowing where that image will take me.

Front-load the important information. You should write in proper sentences but, again, get to the point. I don't want to sit through a whole description to find the relevant part of the image you are trying to convey to me.

Avoid redundant words like "image of" or "picture of"; the screen reader will announce those.

And on Instagram – this is popular, but it is also quite a good idea in terms of awareness and of providing information in multiple channels – consider putting the description in the post itself. So you can add it as alt text and also provide it in the post. For someone like me, a visual learner, having that redundant information of the image plus the text complements both.

Infographics and Images of Text

Think about your infographics – illustrations where you are trying to communicate an idea, often combining text and diagrams – and images of text. As a rule of thumb, when we are talking about accessibility, we say not to use images of text, as they don't scale well, particularly what we call raster graphics, the .PNGs and .JPEGs, which lose quality when you scale them up. But if you do use them – it is common for branding, and for putting up about ten images in a row to create tutorials – think about the font choice and the colour contrast and, crucially, as there is text in the image, provide the text alternative, the alt text.

In terms of font choice, sans serif fonts – the ones without the curly tails – are easier to read. I have an image on the slide from an account I follow on Instagram, @recoveryconnor, who graciously gave me permission to use the image. He talks about eating disorders and mental health, but what is good about the way he has done his title cards on Instagram is that he uses a very emboldened font: black text on a light green background, and every letter is a distinct shape. That is not only easier to read compared to more cursive, swirling fonts, it is also really good for someone who is dyslexic, for example, as there is more to hold on to – and he is actually dyslexic himself. The contrast on that, I think, was 11:1 with the green background, which is way above the recommended minimum of 4.5:1. Again, if I have low vision or colour blindness, it is not only about not relying on colour alone; sometimes the colours themselves can be washed out, and having a high level of contrast makes it easier to see. I am not suggesting that this particular font is going to be perfect for every use case – obviously not – but do consider your font choice, and consider using sans serif fonts that are easier to read.

Also consider contrast for graphics. When creating infographics and diagrams, the parts relevant to understanding need to have sufficient contrast. The minimum contrast ratio there is 3:1 – I would say minimum, and there are tools out there, such as the Colour Contrast Analyser, to check the contrast – but try to exceed that minimum; it is a very base level. And again, if you have text in the image, provide a text alternative so that if I cannot perceive the image and am using a screen reader, I can still get access to that information.

Audio and Video

Audio and video... a lot of the major platforms now allow us to produce both audio and video.
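An aside on the contrast figures in the previous section: the 4.5:1 and 3:1 ratios come from the relative-luminance formula in WCAG 2.x, which is what tools like the Colour Contrast Analyser implement. Here is a minimal Python sketch of that calculation (not part of the talk); the example colours are only illustrative.

```python
def linearise(channel_8bit):
    # Convert an 8-bit sRGB channel to its linearised value (WCAG 2.x definition).
    c = channel_8bit / 255
    return c / 12.92 if c <= 0.03928 else ((c + 0.055) / 1.055) ** 2.4

def relative_luminance(rgb):
    r, g, b = (linearise(c) for c in rgb)
    return 0.2126 * r + 0.7152 * g + 0.0722 * b

def contrast_ratio(rgb1, rgb2):
    lighter, darker = sorted((relative_luminance(rgb1), relative_luminance(rgb2)), reverse=True)
    return (lighter + 0.05) / (darker + 0.05)

# Black on white is 21:1; #767676 grey on white is roughly 4.5:1, the minimum
# WCAG asks for body text. Graphical objects such as chart lines need 3:1.
print(round(contrast_ratio((0, 0, 0), (255, 255, 255)), 1))        # 21.0
print(round(contrast_ratio((118, 118, 118), (255, 255, 255)), 2))  # about 4.54
```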
Transcripts

Firstly, I want to talk about transcripts, particularly when it comes to things like podcasts, where you are only providing audio. If you are not providing a text alternative – a transcript – and are relying on audio alone, then if I cannot perceive it, perhaps because I am deaf or because I am listening with the sound off, I need that textual information in order to access the same content.

Aside from YouTube, I am not aware that many of the platforms provide support for transcripts. Twitter, for example, had an experimental voice feature recently – I don't know how popular it has been since – but it was purely voiced tweets, with no transcript, so it immediately sets up a barrier for the deaf community. There is no set format for a transcript, but what you can do is provide a link in your tweet or your Instagram bio to a Google Doc or to a web page where you host the transcript.

The example on the slide is from the Ladybug Podcast. I enjoy listening to the show and they always provide a transcript. There is no set format; they have the speaker names in bold text with the timings, and the paragraph underneath. You can include headings if you are doing a presentation like this one, where you are talking about many topics – those may not be conveyed by the dialogue alone, so breaking the transcript up into different sections makes it easier to read. You want to describe not only the dialogue but important sounds and, crucially, if it is a video, the visual events. You may be showing something in the video but not describing it in the audio: if you say "click here" and I can't perceive the screen, the location will be lost on me. So providing that description in the transcript is essential. Again, format the text: use punctuation and break it into paragraphs to support understanding.

How to add transcripts on YouTube

YouTube allows transcripts and produces them for you automatically. When you post a video on YouTube, not only will it create automated captions; once those are available, a transcript will be available too. Anyone viewing your video can click the three dots under the video – again, a common pattern – to open an options menu, which has an Open Transcript option. You get a side panel with the transcript, with each line of dialogue and its timestamp. What is good about this type of player is that it is also interactive: if you click on any of the lines, you can skip to that part of the video, and it stays in sync. You can see this on TED talks; it is a useful way of thinking, "What did they say?", and skipping back to the part of the video you are interested in. Unfortunately, there is no feature to download the transcript, but you can copy and paste it into another document; that works as well.

Best Practice for Transcripts

Best practice for transcripts? You are reproducing the dialogue, but you can tidy it up – if you are like me, I say "um" and "ah" a lot, so you can remove that. Break it into sections and paragraphs, use bulleted lists, and make it easier to read and navigate. Use headings, and if there is a topic you are describing and another article to reference, put in a link to the article. If you have a podcast, especially with many speakers, indicate who is speaking and, if it is relevant, how they are speaking: whispering, shouting, is there an emotion being conveyed? And again, describe the important background sounds and visual events.

Captions

So, captions... another form of text alternative, in this case for video.
You are wanting to provide a text alternative for the audio. My wife, for example, is profoundly deaf: she relies not only on transcripts for podcasts but on captions for video in order to interact with the content. She relies on lipreading, so sometimes with a video it is obvious what is being said, but the speaker may be off screen, have their back to the camera, or simply be a very difficult person to lipread. So providing captions can be beneficial for someone who is deaf.

But bear in mind it is not always the ultimate solution for the deaf community. If someone is a full signer – a British Sign Language signer – English is not their first language. British Sign Language is a language in its own right, with its own grammatical structure and rules, so providing captions may not be tailored to everyone in the audience. And it goes beyond the deaf community: captions are useful if you don't speak the language of the video, and they reinforce what you are hearing in the audio with text, which can be a useful way not only of learning but of understanding the content being discussed.

Like transcripts, you want to capture the essential audio – not just the dialogue but sound effects, off-screen events and background music as well. Some of the platforms do provide automated captions. You cannot rely on those without checking what they say; they can have missing words or be completely incorrect. And guidelines do exist; I have linked a couple on the slide. There is a service in America, the DCMP – the Described and Captioned Media Program – with a style guide on how to present captions, and here in the UK there are the BBC Subtitling Guidelines, which have good recommendations for things like the number of lines and how to break the lines so that they are easier to read.

A quick sidebar on subtitles versus captions: you see the terms used interchangeably, but they are not the same. Subtitles convey the dialogue; they are intended for those who can hear the dialogue but not necessarily understand the language, so they are good for translation. Captions include all of the essential audio – dialogue, music, sound effects – and are intended for audiences who cannot hear the audio. On the different social media platforms the terms are used interchangeably; don't lose sleep over it, but there is a slight difference there.

How to add captions on Twitter

So, how do you add captions on Twitter? Twitter has a Media Studio – I believe it is in the options menu, or otherwise you can go to studio.twitter.com – and it is available if you use the Twitter API or Twitter Ads. You can select a video file, go to the Subtitles tab and upload a subtitle file. There is a format for this: it is basically a text file, an .SRT file, with a number for each subtitle line, the dialogue and the timestamps.

If you don't see that option, the other suggestion is to create your subtitles outside of Twitter and burn them into the video, so they are a permanent part of it. This is what we call open captions. I have linked to Courtney Craven, founder of Can I Play That?, a site that publishes accessibility guidelines for games. They created an article explaining how to take captions from YouTube and burn them into the video using a tool called HandBrake. When you upload to Twitter, the captions will always be there.

Instagram is confusing, as it has so many different services for video.
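To illustrate the .SRT subtitle format mentioned for Twitter above, here is a minimal Python sketch (not part of the talk) that writes a two-cue subtitle file; the timings and caption text are made up.

```python
# Each .SRT cue is: a sequence number, a timing line (the comma separates
# seconds from milliseconds), one or two lines of caption text, then a blank line.
cues = [
    ("00:00:00,500", "00:00:03,000", "Hello everyone, and welcome."),
    ("00:00:03,200", "00:00:06,500", "[gentle music] Today: accessible social media."),
]

with open("captions.srt", "w", encoding="utf-8") as srt:
    for number, (start, end, text) in enumerate(cues, start=1):
        srt.write(f"{number}\n{start} --> {end}\n{text}\n\n")
```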
How to add captions on Instagram

Instagram has Stories, which are 15-second clips – you can put up an image or a video, as many as you want, and they stay on your profile for 24 hours unless you save them as a highlight so people can reference them later. Reels is Instagram's answer to TikTok: 15-second clips with stickers and filters. And there are Instagram Live and Instagram TV.

Instagram TV now has automated captions – that is a recent development, as of last month – but you have to do a bit of work to get them. When you are publishing to Instagram TV, in the post itself you have to go, similar to the alt text, to the Advanced Settings menu, where there is an auto-generated captions option. If I want to view videos on Instagram TV with captions, I also need to make sure that in my own settings I have turned on the auto-generated captions option. Then, as a last step, you may have to tap on the video to get the captions to appear – I have an example of my friend, Ra, trying out the captions – and they may not necessarily appear straight away. For all intents and purposes they are OK; they serve a purpose. You can edit them before publishing, which is good, and Ra did not use a special set-up and they were fairly accurate.

Failing that – and that is just for Instagram TV; there is not, to my knowledge, an option to upload a caption file – lots of services exist. I have added Kapwing, which takes your video, auto-generates captions and gives you an editor to tweak them; you can then download the video, or download the caption file if you want. There are also lots of services where – again, don't rely on the automated output – you can pay a captioner, just like we have Heather from MyClearText providing the captions for us right now, and they can turn the captions around very quickly at a relatively low cost.

With something like Instagram Live, you can save the video out to Instagram TV, but unfortunately it does not then give you the option to add captions. You can, however, download the video, burn in the captions and re-upload it to Instagram TV. Otherwise there are apps such as Clipomatic or Apple Clips. Apple Clips is free; I have tried it and it is fairly accurate.

Facebook is slightly different depending on whether you have a business page or just a personal post. On a personal post, when you add a video, after you have published it you get a notification to say it is processing and then that the video is ready. When it is ready you can edit the video, and in the options there is an option for captions. If you are a business page you have the option to auto-generate captions, but you will want to review them to make sure they are appropriate.

How to add captions on YouTube

YouTube is probably the platform we are most familiar with for this sort of thing. When you add videos to YouTube, it automatically starts to generate automatic captions, though they may not appear immediately. Go to YouTube Studio – from your profile, select YouTube Studio – and in the navigation there is a Subtitles option. In there you can see the initial language track; it will say "Add" if you don't have subtitles yet. When they are ready you can see the automatic captions track, with the option to duplicate or edit it. When you select that, you get three options to choose from: to upload a file; to auto-sync, where you provide a transcript of the dialogue and YouTube automatically works out the timings for you;
or to type in the captions and set the timings yourself. And lastly there is auto-translate: once you have created the captions, it will attempt to translate them into another language.

In the screenshots here I have done that with an old video of mine, using the automated captions, and selected the Type Manually option. It shows the edit transcript screen, and we can quickly see there are problems: lots of ums and ahs, and words that are wrong. I can edit those, click Assign Timings, and the next screen shows the individual captions with their start and end times, along with a video preview, a timeline and keyboard shortcuts – I can type in a caption, press Shift and Space to play the video, and there is an option to pause the video when I start typing – so it is set up to help you produce the captions quickly.

Live Streams

YouTube also has live streams. As far as I understand from YouTube's own documentation, they only provide automated captions on a live stream for channels with over 10,000 subscribers. I believe what happens afterwards is that those live captions are replaced by the standard auto-generated captions, which appear on the video after the event has taken place. Otherwise, you can do what is happening here: you have an external captioner, and they will use something like a decoder to send captions to your live stream while you are doing it. I have linked on the slide to YouTube's documentation about the set-up you may need for that. Failing that, there are services that produce automated captions; one I have heard of that is free is Web Captioner. Its support for YouTube is experimental, but it picks up the audio and tries to transcribe your captions live. That is worth exploring if you want to add captions to your live stream.

Best Practice for Captions

So, best practice for captions: keep them clear, consistent and accurate, and be true to the dialogue – you don't want to be changing the dialogue; you want to give everyone the same experience. Allow time for them to be read: I believe the BBC guideline is something like 160 words a minute, and if you look in the style guides they talk about specific timings. Avoid obscuring the visual content – you can see this on broadcast TV with a sporting event, where sometimes the captions are at the top of the screen so they don't cover the scores or something else at the bottom, and vice versa. Usually use no more than two lines per caption; sometimes you can get away with three, but don't obscure too much content or give people too much to read. Write in proper sentences: capitalise letters, use proper nouns and punctuation, and break at appropriate points, such as a comma or the end of a sentence. And convey not only the dialogue but the relevant sound effects, the music and its source.

Audio Description

Lastly, in terms of audio and video, I want to talk quickly about audio description. Here you are providing an audio track that describes the visuals in the video that are necessary for understanding. The best practice is to not need it: if you plan ahead and know the content of the video, you can describe the visuals in the dialogue, making audio description unnecessary. That is relevant for a how-to video demonstrating a technique – if you literally describe what you are doing and the features to be aware of, you may not need audio description. Nor is it necessarily required with a single speaker talking into the camera with an unchanging background.
In that case there is no visual information needed to understand the video; it is only dialogue. You must be mindful of title cards, though: convey that information in the dialogue, or provide it as text around the video as well.

And the key reason I want to mention this is that YouTube is now, as of very recently, supporting multiple audio tracks. I have a screenshot from an Assassin's Creed trailer: there is an option on the video to have not just the original audio track but a track with the audio description within it, so in the pauses between the dialogue you can hear the audio describer describing the visual events in the video.

Closing Summary

So, in summary... we have been through a lot, very quickly! We have looked at text, images, and audio and video. If you take things away from today, what can you do?

Please always provide a text alternative for your images. I should have mentioned that on Chrome I use an extension for Twitter that basically will not let me post an image without alt text – I think it is called Required Alternative Text, so look it up.

In tweets and profiles, use legible fonts. Avoid symbol fonts; they are not picked up by assistive technology.

If you are using images of text, ensure there is sufficient contrast between the text and the background. If you have an illustration, think about the key parts necessary for understanding and make sure you have sufficient contrast between the adjacent colours.

Be sparing in your use of emojis. They can make our tweets look busy and full, but they will also be announced by things like screen readers, making it laborious to get through the content.

Write your hashtags in what is known as 'camel case', capitalising the first letter of each word. It gives breathing room in terms of reading and in terms of the announcement by things like screen readers, making them easier to understand.

And please, please provide captions on your videos, transcripts for your podcasts and, where relevant, audio description on videos.

Useful Links

I will end with some useful links that I have added here. The Government Digital Service has a page about writing about disability and what sort of terminology is appropriate. I have linked to a dictionary by an engineer, Tatiana Mac: Self-Defined. It is full of terms we use that could be problematic, ableist or discriminatory. There are content warnings on it, but it is a great resource, and it is also open source. I have linked to the Stroke Association's guidelines on aphasia and how to present information in accessible ways. And then there are Twitter's and YouTube's own pages on how to add image descriptions and captions. I could not find a lot of information regarding Instagram TV, as that is very new, so I have linked to an article that describes how to turn on automatic captions on Instagram TV. Thank you very much!

Questions and Answers

ANNIE: Wow! A whistle-stop tour of accessible social media. Thank you, Daniel, for all the practical pointers. I am sure that you all have a lot of questions to ask – we have lots waiting already. I doubt we will manage to cover all of them now, so we will capture the questions in a follow-up blog with the answers and add it to the website in the next couple of days. Apologies if we don't get to your question right now. I will pick out some of the questions for you, Daniel. The first one: can descriptions be applied to GIFs as well?

DANIEL: As far as I know. If you are thinking of Twitter, yes, you can. Yes, they definitely can – I have tried it!

ANNIE: Another question, from Ruby – she knew about Twitter and alt text.
Can you go back to previous posts to edit them and add alt text?

DANIEL: Really good question. On Twitter you cannot, but on Instagram you can, so please do go back and add the alt text.

ANNIE: If images include both text and a picture, which should be described in the alt text first?

DANIEL: Oh! I think that depends; it is subjective. Think of the purpose of the image with the text. What is the most important information you are trying to convey? If it is the text people should take away, and the image is there to give it colour and background – almost decorative – you may not need to describe the image, so prioritise the text. But it depends on whether the image or the text is more relevant. Include both, obviously, if both are relevant.

ANNIE: OK. A question from Rama: automatic captioning and subtitles are helpful for English-speaking videos – do you have suggestions for automatic tools for non-English videos?

DANIEL: I don't; that is a little beyond my area of experience. I would be happy to research it, but I don't know any tools off the top of my head. The only thing I have come across is YouTube's auto-translate option, but I can't speak to how reliable that is.

ANNIE: A couple of questions on this, from Priscilla: how can we make Instagram Stories accessible?

DANIEL: Yeah, so Instagram Stories have problems; basically it is a slideshow, so it has problems with timing. You have to physically hold a finger on the screen to stop the slideshow, which causes issues for people with motor impairments, and the time alone can be a factor. If you are putting clips in, you can create your subtitles elsewhere and upload the clips that way. I have seen people using the built-in text tools to add text around the clip – be mindful of the colour contrast there, particularly when putting text at the top of a Story. It's not always obvious, but there is a little square, I think with a little A in it; when you tap it, it puts a background behind the text, and you can tap it a couple of times for different options, so pick dark against light to make good contrast. That is some of what I can think of at the moment.

ANNIE: Blitzing through the questions! A question from Carita: will alt text be shared if we add alt text on Instagram and use the share option to Facebook, or do you have to re-write it?

DANIEL: I don't know about Facebook. I know that when you embed Instagram photos on a website, no, the alt text does not pull through. That may change – I know Instagram wrote recently about updates to the API that developers use, so there may be more options – but what does pull through is the caption underneath the Instagram post, so that is a reason why it is good to double up. I don't know about Facebook; I have done it a lot but I didn't check whether it pulls through. I expect that the caption will, at least, so try it with the caption.

ANNIE: The video you played of the screen reader describing photos and tweets – is it sped up automatically or does it read at that speed?

DANIEL: Apologies – that was the speed set on the screen reader, and to be honest that is slow by screen reader standards. I don't use one day to day, just for work, and that was my comfortable setting at the time I recorded the video, though for the purposes of this it was maybe a little too fast. Yes, you can change the speech rate on all screen readers, including the VoiceOver screen reader built into iOS, to make it slower if you need to.

ANNIE: From Cariss: can you edit the auto-generated transcripts on YouTube?
People have problems with them not being close enough to the audio.

DANIEL: As far as I understand it, the transcript comes from the captions. If you go to the subtitles menu to edit your captions and select the transcript view – the part where it shows the text – you can edit that text and it will be pulled through into the captions. I have not seen anything to say otherwise. They do not seem to provide a separate transcript option: whatever you provide for captions is put together as a wall of text and pulled through into the transcript. So, yes.

ANNIE: Also on subtitles or captions: what is the recommended text size for videos on social media?

DANIEL: I think it comes down to percentages – a percentage of the screen size – as people view on different devices. I would suggest looking at the BBC Subtitling Guidelines, but I don't have a figure for you right now. It is worth mentioning I am doing a session on accessible video in December, which people may wish to join, where I intend to go into more detail on captions; that is the kind of thing I will include there. But a particular percentage of the screen size is what I can think of at the moment.

ANNIE: I have seen people using image descriptions within the caption of Instagram images. Is there an advantage to doing that over alt text, or are both used for different reasons?

DANIEL: I don't know about different reasons, but I think there are advantages in the sense that it raises awareness, as people will start to question why you are writing image descriptions. If you know nothing of disability or accessibility you might think, "That is something I should be doing." It also provides the information in two channels, and that redundancy helps different learning styles and helps people accessing your content in different ways. I think that until our platforms are a little better, it is a good idea to add that redundant information, just in case – back to the example of where the alt text on an image does not pull through to another service. So I would certainly not advise against it. I don't see negative issues; it is a little repetitive, but the benefits outweigh the negatives.

ANNIE: One or two more questions. We are limited by the platforms adding more accessibility features – what more do you think is needed or currently missing from these corporations, and how can we influence them?

DANIEL: Yeah! That's a big question! I wanted to have a slide that basically just said, you know, raise your voices about this. Get on the accessibility hashtags and tell Twitter, Instagram and Facebook when it is not up to scratch. YouTube recently removed their community-contributed captions because the feature was not getting used, but the argument there is that it was not used because people did not know about it – it was hidden. So the features must be more visible. They are not special features; they benefit all of us, so make them the default, part of the standard options. Studies have shown that when captions are enabled by default, people don't turn them off; it reinforces the information by having it in multiple channels. It is a really big topic: more consistency across the platforms, and just more interest. You have seen in some of my screenshots, for example, that even the accessibility features themselves are inaccessible – they have poor contrast and small fonts, and they are hidden away. There is a lot more I could say on that topic!
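A practical footnote to the alt text points raised in the Q&A: the platform APIs Daniel mentions do expose image descriptions, so anyone posting from their own tooling can include them there as well. Below is a rough sketch assuming the third-party tweepy library and Twitter API v1.1 access – neither is covered in the session – with placeholder credentials, filename and text.

```python
import tweepy

# Placeholder credentials: substitute real keys from a Twitter developer account.
auth = tweepy.OAuthHandler("CONSUMER_KEY", "CONSUMER_SECRET")
auth.set_access_token("ACCESS_TOKEN", "ACCESS_TOKEN_SECRET")
api = tweepy.API(auth)

# Upload the image, attach an image description, then post the tweet.
media = api.media_upload("cat-on-wall.jpg")
api.create_media_metadata(media.media_id,
                          "A black and ginger tortoiseshell cat on a white stone wall.")
api.update_status(status="Enjoying the sunshine. #CatsOfTwitter",
                  media_ids=[media.media_id])
```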
ANNIE: Looking at the time, I think we have to wrap it up there. As I mentioned, we will try to capture any unanswered questions and put together some text for our website; we can do that after today.

In Closing

I wanted to share some information that may be of interest to people on the webinar. We run online training sessions on digital accessibility; you can find out more at abilitynet.org.uk/training, and there is an exclusive 10% discount code, which is Socialmedia10. You can sign up to our newsletter for the latest announcements on digital accessibility at abilitynet.org.uk/newsletter. We also have a YouTube channel, a podcast and a suite of accessibility services to suit your organisation.

And finally, don't forget about our next webinars at abilitynet.org.uk/webinars. The next session is on Tuesday the 10th of November with Christopher Patnoe, who heads up digital accessibility at Google, and just after that is 'How to spot an online scam and avoid it' on the 24th of November.

Thank you, Daniel, and thank you all for joining us. We will be in touch with you soon. Bye, everyone!

DANIEL: Thank you very much, everyone.