Cognitive Web Accessibility: Readability 2012

Published in 2012, these resources are original studies, literature reviews and/or related articles that cite references.

  • Reading Adaptations for People with Cognitive Disabilities: Opportunities
    "Some people with cognitive disabilities have difficulty with aspects of reading other than seeing and decoding text. The aim of this note is to bring to the Symposium a number of opportunities for research that may lead to ways to adapt textual content so as to make it easier to read for these people."
  • Social Networking Service for People with Cognitive or Speech and Language Impairments
    "Social media has become an important tool for social networking. However, most social networking services are very challenging for people with learning disabilities or cognitive impairments. The problems are mostly related to understanding the different concepts of the environment and related terminology, but the accessibility and usability problems common to all internet services also apply.
  • Including Easy to Read, Legibility and Readability into Web Engineering
    "This position paper discusses the feasibility and a possible structure to include "Plain Language" or "Easy to Read" into the process and workflow of Web-engineering regarding the inherent (or micro-) workflow of authoring in "Plain Language" or "Easy to Read", and the (meta-) workflow of authoring and designing for the web in general. Following studies of Web-Engineering workflows and experiences out of day by day practice, legibility and readability in most cases are treated as an additional or parallel activity, comparable to the concept of (screen reader) accessible "extra pages" we had more than a decades ago. Most often customising text is recognised as a requirement for content development - a follow up of the design and implementation phase, neglecting the need for addressing those requirements already in structure, design, navigation, look and feel, functionality and layout. Legibility and Readability is meant as both, a service for a specific user group (people with cognitive disabilities) and at the level of general readability (e.g. "Plain Language" use) nevertheless lacking the state of a general requirement of Web Accessibility. To avoid inefficiency, inaccessible websites and extra costs, those new aspects demand for early planning and inclusion in the Web Development process. Moving legibility / readability from an add-on or posterior activity to an integral part of the Web Engineering process will evolve accessibility."
  • Accessibility 2.0 - Providing improved Access to text information for People with cognitive and intellectual disabilities by user generated content
    "The usage of electronic information and communication services is of great relevance in different parts of people's daily life, including private and work activities. People who are not able to use modern information technology - e.g. due to a disability - are therefore threatened to be excluded from modern information society. Especially people with intellectual disabilities regularly face problems preventing them from accessing information. These problems include hard or unfamiliar words, long sentences or complex phraseology. In addition an illogical or irreproducible information structure can prevent people of this target group from accessing information too. This submission to the RDWG Online Symposium describes an ongoing PhD project that implements an approach using concepts typical for "Web 2.0" applications. The basic idea is to build up a community that composes explanations or easier to understand alternatives for hard to understand text elements. The community may create explanations using different formats including text, audio, video and images. These explanations are stored by a server forming a "glossary" service. An extension in the users' browsers queries this glossary and enhances the original web contents with these explanations where appropriate."
  • MIA - My Internet Assistant for successfully reading and using web content
    "Many public and governmental services, regulations and information aim at people who are unemployed, have low education levels, have disabilities or are elderly. The same groups have low levels of digital skills compared to the population at large and experience problems in comprehending the digital content and applying it in realistic tasks. According to studies of adult literacy, about 10% of the Dutch population has a literacy level that is insufficient to use written information to fully participate in society, realize their own goals and develop their knowledge and competences. The proportion of the Dutch population that has inadequate digital skills to fulfill ordinary daily-living tasks on the web is even much bigger. That means that large parts of web content is inaccessible for people who, for reasons of cognitive or auditory disabilities, or because of inadequate schooling, lack the required written text and/or digital literacy. The problem of difficult-to-understand web content is not simply solved by making web texts easier to read."
  • Evaluation of Terminology Labeling Impact over Readability
    :"One of the factors that can decrease readability is the presence of specialized language and advanced terminology. Such terminology can influence the understanding and the ease of read for lay persons. Several research studies have identified that many of the existing web sites within specialized domain use language with advanced terminology that is often inappropriate for their target audience. Examples of domains where this can happen are medicine, technology, low, finance and others. There are several Natural Language Processing (NLP) tools that are trying to cope with this. In this paper we evaluate the efficiency of a NLP tool that is labeling medical terminology with the terms definition"
  • Calculating text complexity during the authoring phase
    "Reading and understanding texts containing long sentences, unusual words, and complex linguistic structures can be very hard for persons with cognitive or learning disabilities. Knowing the readability level of a document, users have the opportunity to choose the most suitable text, from a collection of documents delivering the same information. Considering that the availability of texts annotated with the proper level of readability is strictly relied to the availability of software supporting the authoring, and that the literature is still neglecting this aspect, we have been working on SPARTA2, a tool supporting the authoring of highly accessible texts."
  • Easy-to-read text characteristics across genres
    "Traditional readability indices and formulas have mainly rested on shallow features such as word and sentence length, while deeper linguistic features contributing to text understandability have been ignored. Furthermore, the needs of specific groups of readers have generally been overlooked, and easy-to-read texts have predominantly been produced in order to fit a broad audience including second-language-learners, dyslectics, beginning readers and persons with cognitive disabilities. We suggest a NLP (natural language processing) method for readability assessment of texts, based on deep linguistic features. We also propose some properties that might characterize text genre and correspond to a certain level of interest and intelligibility for a narrowed target group. Finally, in an ongoing project we exploit finds from previous activities, and suggest a method to supply easy-to-read newspaper texts with symbol support for persons in need of AAC (Augmentative and Alternative Communication)."
  • Improving the Readability of User-generated Content in Web Games Using Text Normalisation
    "User-generated content (UGC) has transformed the way that information is handled on-line. In this paradigm shift, users create, share and consume textual information that is likely to present informal features such as poor formatting, misspellings, phonetic transliterations, slang or lexical variants (Ritter et. al., 2010). These texts found in social networks, chats or blogs, usually offer poor accessibility for people with cognitive disabilities or people not familiar with these non-standard language deviations. Moreover, when social web gaming entered mainstream thanks to the appearance of new technologies such as HTML5, that gave support to hundreds of graphically-rich multiplayer web games, additional challenges appeared. The HTML5 canvas element is merely a low-level drawing surface and text rendered with this element lacks support for automatic accessibility or localisation tools. Thus, the informal and noisy textual UGC found in in-game chats is usually difficult to understand for both people and accessibility tools such as screen readers and text simplification or normalisation applications. Also, some on-line gaming communities develop their own sub-culture and vocabulary which can exclude newcomers. In order to overcome these challenges, web and social multiplayer game developers should normalise user input in order to provide alternate clean texts. For this reason we propose TENOR, a multilingual text normalisation web service with aim to help web and on-line game developers to process in real-time textual UGC in a way that can be understood by the majority of users."
  • Bridging the Gap between Pictographs and Natural Language
    "When using digital pictograph communication environments, such as the WAI-NOT environment (www.wai-not.org), which aims at users with cognitive disabilities, users can give input in two forms. They can select pictographs from a two-level category system or they can use text, which is then converted into pictographs. In the conversion from text to pictographs, we see that only straightforward string matching procedures are currently applied, resulting in two types of problems. The first type is the fact that words possibly do not match with the name of a pictograph, so no pictograph is generated, for instance when verbs are conjugated because no lemmatisation takes place. The second type is the fact that words occasionally match with names of wrong pictographs, so a wrong pictograph is generated. Both types of problems lead to difficulties in understanding the message converted into pictographs"
  • Guidelines or standards for Easy-to-read?
    "Despite of many efforts, there is still no overall acceptance on the universal linguistic principles of easy-to-read (ETR). The principles vary from detailed and strict standards (e.g. Inclusion Europe 2009) to holistic and loose guidelines (e.g. IFLA 2010) which all try to guide the writers and authors to create simplified texts for non-fluent readers. This discrepancy seems to be connected to two main factors: to the definition of the persons needing ETR and to the definition of text genres that are supposed to be simplified. This paper aims to discuss these factors and analyze the different ETR-principles in order to summarize their adaptability to create different text genres and to serve different users of ETR."
  • Some Challenges for developing an Easy-to-Read Website
    "The guidelines for Easy-to-Read -material are originally intended for printed media. So quite understandably these rules mainly contain guidelines that concern how to E2R-content and how to present it. But Internet is a different kind of media. In addition to content, also the user interface and the structure of the site should be designed to be accessible when easy-to-read language is used in network services.One important issue when E2R-material is published in Internet is the fact that E2R-users have very different needs. For some user groups an easier content is enough but some user groups need an user interface which is considerably easier to use. User studies have also shown that the most important factors which influence the accessibility of a website for E2R-users are difficult to state as a simple rule or a guideline."
  • Easy-to-Read and Plain Language: Defining Criteria and Refining Rules
    "The technical aspects of Web content accessibility are discussed since many years and are addressed by international guidelines and legal regulations in many countries. The importance of understandable content and accessible information for persons with learning difficulties has only recently begun to receive increased attention. The rules and guidelines for understandable Web content are more heterogeneous. Often they were defined as ad-hoc rules and lack scientific evidence. This paper analyzes the differences between E2R and Plain Language (PL) with regard to target groups and guidelines. We present a linguistic analysis of selected criteria to get a better understanding of the guidelines for the two language levels.
  • Easy-to-Read on the Web
    "The symposium aimed to explore the user needs and state of the art in research, development, and practice to contribute to a common understanding of easy-to-read on the Web. It is intended to encourage the development of better guidance, support, and tools for developers, designers, and users, and to inform researchers, standards developers, and policy makers on how to better address easy-to-read on the Web. In particular, it is intended to analyze how to better connect, elaborate, and integrate the user needs in web accessibility guidelines and techniques."
  • Reporting Simply: A Lexical Simplification Strategy for Enhancing Text Accessibility
    "We propose an approach to simplifying lexical content, based on an empirical analysis of a parallel corpus of original and manually simplified texts in Spanish. We focus on the treatment of reporting verbs (RepV) – verbs that introduce both direct and indirect speech when reporting a speaker's language as a specific type of lexical units that have rather consistently received the same treatment by human editors. The present work is part of the Simplext project aimed at developing an automatic text simplification system for Spanish in order to make newspaper articles more accessible to readers with cognitive disabilities. The treatment of reporting verbs is just one element within the lexical module of the said system. Constructions containing these verbs are particularly common in the journalistic genre. The simplification of these expressions could, therefore, enhance readability of these texts for people with cognitive disabilities, and thus improve their accessibility to these essential sources of information."
  • Estimating Dyslexia in the Web
    "In this study we present an estimation of texts containing English dyslexic errors in the Web. A classification of lexical errors is proposed and unique dyslexic errors are distinguished from other kind of errors due to spelling and grammatical errors, typos, OCR errors and errors produced when English is used as a foreign language. A representative sample of each kind of error is used to calculate a lower bound for the prevalence of dyslexia in the English Web. Although dyslexia has been studied in the context of Web accessibility, to the best of our knowledge, an estimation of Web texts containing dyslexic errors was unknown. Our results are useful to tackle future work in Web accessibility among dyslexic users focusing not only in the interface but also in the text content."
  • How Bad Do You Spell?: The Lexical Quality of Social Media
    "In this study we present an analysis of the lexical quality of social media in the Web, focusing on the Web 2.0, social networks, blogs and micro-blogs, multimedia and opinions. We find that blogs and social networks are the main players and also the main contributors to the bad lexical quality of the Web. We also compare our results with the rest of the Web finding that in general social media has worse lexical quality than the average Web and that their quality is one order of magnitude worse than high quality sites."
  • DysWebxia: A Model to Improve Accessibility of the Textual Web for Dyslexic Users
    "The goal of this research is to make textual content in the Web --especially in Spanish and English-- more accessible to people with dyslexia. The techniques that we will use to make the Web more accessible are Natural Language Processing (NLP) for its content (text) and Web design guidelines for its layout. To find out which solutions tackle better our purpose we will test a diverse set of Web pages examples. The main methodology to evaluate these examples will be eye tracking using regular and dyslexic students. In the case that our findings show that there are strategies that make the Web more accessible for dyslexic users, we plan to develop and application which includes such results, transforming a regular Web site into a dyslexic friendly Web site."
  • On Measuring the Lexical Quality of the Web
    "In this paper we propose a measure for estimating the lexical quality of the Web, that is, the representational aspect of the textual web content. Our lexical quality measure is based in a small corpus of spelling errors and we apply it to English and Spanish. We first compute the correlation of our measure with web popularity measures to show that gives independent information and then we apply it to different web segments, including social media. Our results shed a light on the lexical quality of the Web and show that authoritative websites have several orders of magnitude less misspellings than the overall Web. We also present an analysis of the geographical distribution of lexical quality throughout English and Spanish speaking countries as well as how this measure changes in about one year."
  • Graphical Schemes May Improve Readability but not Understandability for People with Dyslexia
    "This study explores the relation between text readability and the visual conceptual schemes which aim to make the text more clear for these specific target readers. Our results are based on a user study for Spanish native speakers through a group of twenty three dyslexic users and a control group of similar size. The data collected from our study combines qualitative data from questionnaires and quantitative data from tests carried out using eye tracking. The findings suggest that graphical schemes may help to improve readability for dyslexics but are, unexpectedly, counterproductive for understandability."
  • Lexical Quality as a Measure for Textual Web Accessibility
    "We show that a recently introduced lexical quality measure is also valid to measure textual Web accessibility. Our measure estimates the lexical quality of a site based in the occurrence in English Web pages of a set of more than 1,345 words with errors. We then compute the correlation of our measure with Web popularity measures to show that gives independent information. This together with our previous results implies that this measure maps to some of the WCAG principles of accessibility."
  • A Mobile Application for Displaying More Accessible eBooks for People with Dyslexia
    "In this paper we present an ebook reader for Android, which displays ebooks in a more accessible way according to user needs. Since people with dyslexia represent a substantial group with a reading disability, we designed a set of specific guidelines which are included in the tool. These layout guidelines for people with dyslexia are based on a user study with a group of twenty two users with dyslexia. The data collected from our study combines quantitative data from tests carried out using eye tracking and qualitative data from interviews, questionnaires and the think aloud technique. The ebook display includes the most readable options observed with the eye tracking and user preferences; however the settings are customizable."
  • There are Phonetic Patterns in Vowel Substitution Errors in Texts Written by Persons with Dyslexia
    "In this work we present an attempt to analyze vowel substitutions found in a corpus of Spanish texts written by children with dyslexia, taking into account the phonetic nature of the errors. First, we present a brief characterization of dyslexia (Section 2 ), followed by a discussion of the relevance of errors produced by persons with dyslexia (Section 3). In Section 4, the corpus of Spanish texts written by children with dyslexia is described, and in Section 5 we put forward a typology of errors based on the corpus and provide the frequency of occurrence of each class of errors. A phonetic analysis of vowel substitution errors is presented in detail in Section 6. Finally, in Section 7 we summarize the conclusions of the study."
  • An Eye Tracking Study on Text Customization for User Performance and Preference
    "This paper presents a user study which compares reading performance versus user preference in customization of the text. We study the following parameters: grey scales for the font and the background, colors combinations, font size, column width and spacing of characters, lines and paragraphs. We used eye tracking to measure the reading performance of 92 participants, and questionnaires to collect their preferences. The study shows correlations on larger contrast and sizes, but there is no concluding evidence for the other parameters. Based on our results, we propose a set of text customization guidelines for reading text on screen combining the results of both kind of data"
  • IDEAL: a Dyslexic-Friendly eBook Reader
    "We present an ebook reader for Android which displays ebooks in a more accessible manner for users with dyslexia. The ebook reader combines features that other related tools already have, such as text-to-speech technology, and new features, such as displaying the text with an adapted text layout based on the results of a user study with participants with dyslexia. Since there is no universal profile of a user with dyslexia, the layout settings are customizable and users can override the special layout setting according to their reading preferences."
  • The Presence of English and Spanish Dyslexia in the Web
    "In this study we present a lower bound of the prevalence of dyslexia in the Web for English and Spanish. On the basis of analysis of corpora written by dyslexic people, we propose a classification of the different kinds of dyslexic errors. A representative data set of dyslexic words is used to calculate this lower bound in web pages containing English and Spanish dyslexic errors. We also present an analysis of dyslexic errors in major Internet domains, social media sites, and throughout English-and Spanish-speaking countries. To show the independence of our estimations from the presence of other kinds of errors, we compare them with the overall lexical quality of the Web and with the error rate of noncorrected corpora. The presence of dyslexic errors in the Web motivates work in web accessibility for dyslexic users."
  • Optimal Colors to Improve Readability for People with Dyslexia
    "In this study we analyze how an specific aspect of text customization, text and background colors, can improve readability of people with dyslexia. Our user study compares two kinds of data, quantitative (user performance) and qualitative (user preferences), taking into consideration previous recommendations and the color luminosity ratio prescribed by the WCAG 2.0. (W3C, 2008)."
  • A readability evaluation of real-time crowd captions in the classroom
    "We ran a study to evaluate the readability of captions generated by a new crowd captioning approach versus professional captionists and automatic speech recognition (ASR). In this approach, captions are typed by classmates into a system that aligns and merges the multiple incomplete caption streams into a single, comprehensive real-time transcript. Our study asked 48 deaf and hearing readers to evaluate transcripts produced by a professional captionist, ASR and crowd captioning software respectively and found the readers preferred crowd captions over professional captions and ASR.
  • The Next 50 Years: A personal view
    "I review history, starting with Turing’s seminal paper, reaching back ultimately to when our species started to outperform other primates, searching for the questions that will help us develop a computational account of human intelligence.. I illustrate how these answers can influence a research program, describing the Genesis system, a system that works with short summaries of stories, together with low-level common-sense rules and higher-level concept patterns...I conclude by suggesting, optimistically, that a genuine computational theory of human intelligence will emerge in the next 50 years if we stick to the right, biologically inspired questions, and work toward biologically informed models."
  • Tracking the Mind’s Eye: A New Technology for Researching Twenty-First-Century Writing and Reading Processes
    "This article describes the nature of eye-tracking technology and its use in the study of discourse processes, particularly reading. It then suggests several areas of research in composition studies, especially at the intersection of writing, reading, and digital media, that can benefit from the use of this technology"
  • Significance of learner dependent features for improving text readability using extractive summarization
    Information and Communication Technologies play a major role in all types of day-to-day activities, including government, public and social domains. The need for HCI aspects to be taken care of in these activities has become predominant. In particular, incorporating HCI features in the academic environment is getting more attention. In the case of reading materials associated with any type of academic process or content delivery, much focus must be given so that even people with learning difficulties can get the information at their level of ability. Though a lot of information is available in the academic scenario in the form of text books, courseware, online resources, web documents etc., not all content is readable by all. Tools and technologies must be made available to help people with learning difficulties, including dyslexics. The idea is to reduce the burden on learners with the help of assistive technologies. In this direction, this paper focuses on highlighting the importance of various features to be considered for making text readable by all, using summarization techniques. Text summarization based on both document-level and learner-level features is discussed for making content readable for people with learning difficulties. The experiments show that learner-dependent features improve the readability of the text by using a summary as a preview of the text.
  • The recognition of web pages' hyperlinks by people with intellectual disabilities: an evaluation study.
    "One of the most mentioned problems of web accessibility, as recognized in several different studies, is related to the difficulty regarding the perception of what is or is not clickable in a web page. In particular, a key problem is the recognition of hyperlinks by a specific group of people, namely those with intellectual disabilities...The referred analysis indeed shows that not only did these specific participants gain a better understanding of the demanding task, but also they showed an improved perception concerning the content of the navigation menu that included hyperlinks with images."
  • Web Article Publishing Guidelines
    "The following is a proposed standard for bringing more semanticity to articles on the Web. We hope that by providing some simple guidelines we can help publishers make their content a little more presentable with Readability while also making the Web a bit more semantic."
  • Ergonomics of usability/accessibility-ready websites: Tools and guidelines
    "The purpose of this research is to study the available literature on usability/accessibility ready websites and their tools and guidelines. The research findings will help web engineers to build websites and web services accessible for all the target audience, including people with special needs. A descriptive/interpretive research method was used for the study of usability, accessibility, globalization, readability and culture differences based on related literatures and on previous studies by academics and industrial institutions."
  • Accessible Content Generation an Integral Part of Accessible Web Design
    "The importance of accessible Web design significantly rose within the last years. This is also reflected in a growing set of legal regulations that demand for accessible Web design. Unfortunately these regulations usually consider technical accessibility only, the complexity of the language used plays a minor role. Thus a huge group of people cannot make use of the content presented on Web pages. This paper discusses in detail the important interaction of accessible Web design and Easy-to-Read to generate accessible content and shows which phases of the design process demand for intensive user involvement."
  • SymbolChat: A flexible picture-based communication platform for users with intellectual disabilities
    "Persons with intellectual disabilities benefit from participating in the modern information society, especially the World Wide Web, social media and Internet-mediated communication services." "Overall, the results show that social inclusion for people with intellectual disabilities can be improved with customizable communication tools. The implemented communication platform forms a solid basis for further improvements and new communication services. In addition, we found that users with motor impairments would greatly benefit from alternative input and output methods for symbol browsing and selection."


Visit The Clear Helper Blog: Developing best practices of Web accessibility for people with intellectual / cognitive disabilities.