
John Scott

Apr 14, 2019

Talking Accessibility Data in Ohio

Benchmarking your Journey to Inclusion

Talking to the Team at the University of Toledo

On our way across the country to California for the CSUN conference, we made a quick two-day stop in Ohio to visit the University of Toledo (Learn LMS) and Bowling Green State University (Canvas LMS), where we held our Universal Design for Learning Workshop and a Data Review. We also had the chance to sit down with the team at the University of Toledo to learn about their journey with Ally so far, as well as hear their reactions following their data-review session.

 

[https://youtu.be/oFyn6LfmxQ4](https://youtu.be/oFyn6LfmxQ4)

We did a full [Ally] implementation in the Fall. We were prepared for pushback and all kinds of things, and it didn’t happen. Faculty have been using it, faculty have been embracing it. We were worried about it being accepted, and it’s been wonderfully received.

 

– Melissa Gleckler, Educational Technologist 

Three ways to track your progress

In our most recent Ally Community Story, Eric Kunnen from Grand Valley State University shared how GVSU is using Ally data to systematically address barriers to inclusion. When you're dealing with tens of thousands of content items across the learning management system, using data to inform strategy and benchmark progress is crucial to establishing a scalable, sustainable approach. At each campus stop on the Tour, we've held a "Data Review" session with key personnel to analyze key Ally data points and chart a pathway forward.

So how can you benchmark your progress in improving the accessibility of the content in your LMS? Of course, you can refer to your overall accessibility score in the institutional report for a quick, high-level view of changes from term to term, year to year, or month to month. But depending on the amount of content in your LMS, change may appear glacial at this scale. Further, changes in content from term to term can cause fluctuations in the overall score that may not be a reliable indicator of progress. For a more fine-grained look at your progress over time, here are three ways to benchmark your journey to inclusion.

Issue-specific progress

From the Institutional Report, focus on progress across key issues. Five issues we focus on in the “Data Review” are:

  • Total number of scanned documents

  • Total number of untagged documents

  • Total number of documents missing headings

  • Total number of images missing a description

  • Total number of documents with contrast issues

When you hover over each issue in the Institutional Report, you can see the total number of files with that issue out of the total number of files that could be affected by that issue. Record and timestamp those numbers offline to mark your starting point, and check those numbers periodically to see your progress on that issue. You can even calculate the percentage of files with that issue, and compare your numbers to what we saw in our data study across 21 million course files.
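If you'd rather script that bookkeeping than keep a spreadsheet by hand, here's a minimal sketch in Python. The issue names and counts below are placeholders (record the actual totals from the Institutional Report hover-overs); the script appends a timestamped row per issue to a running CSV.

```python
# A sketch for logging issue benchmarks over time. The counts below are
# placeholders; record the real totals from the Institutional Report.
import csv
from datetime import date

# issue name -> (files with the issue, files that could have the issue)
snapshot = {
    "untagged_documents": (1250, 4800),
    "documents_missing_headings": (960, 7200),
    "images_missing_description": (3100, 9500),
    "documents_with_contrast_issues": (540, 7200),
}

with open("issue_benchmarks.csv", "a", newline="") as f:
    writer = csv.writer(f)
    for issue, (affected, total) in snapshot.items():
        pct = 100 * affected / total
        writer.writerow(
            [date.today().isoformat(), issue, affected, total, f"{pct:.1f}"]
        )
```

Re-run it each term and you have a clean time series per issue that you can chart or compare against the benchmarks from the data study.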

Course-level progress

Focusing on progress within individual courses can also be an effective way to benchmark. In our Fresno State Community Story, Bryan Berrett and Walt Hebern share how they benchmarked pilot course progress, including total time to remediate a course. There are a few ways to benchmark course-level progress. You can take a snapshot in time of all your courses by using the Export button in your Institutional Report. If you find this CSV overwhelming to work with, you can also export individual courses by navigating to the "Courses" tab and searching for and selecting individual courses. You can export a CSV of an individual course, or simply record the key issues and score for each course you want to benchmark. Track your progress by re-running the export at a later date and seeing how the course score and issues change over time.
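To see how courses moved between two snapshots, you can diff a pair of exports. Here's a sketch, assuming each export is a CSV with columns named `courseId` and `score`; those names are hypothetical, so substitute the headers your export actually uses.

```python
# Compare course scores between two Institutional Report exports.
# Column names "courseId" and "score" are assumptions; check your CSV.
import csv

def scores(path):
    with open(path, newline="") as f:
        return {row["courseId"]: float(row["score"]) for row in csv.DictReader(f)}

before = scores("export_january.csv")
after = scores("export_april.csv")

# Report the change for every course present in both snapshots
for course_id in sorted(before.keys() & after.keys()):
    delta = after[course_id] - before[course_id]
    print(f"{course_id}: {before[course_id]:.2f} -> {after[course_id]:.2f} ({delta:+.2f})")
```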

You might start with high-impact courses, such as those with high enrollment numbers or with a large amount of content, and consider ways to support faculty in fixing those issues. Jeremy Olguin at Chico State has taken a course-level approach, offering to inventory course files and accessibility issues for faculty, and then suggesting files that they can start fixing on their own, such as image descriptions and fixing headings in Word docs, as well as files that his team can fix for them, such as tagging PDFs or replacing a scanned document.

Usage data progress

A third way to benchmark your progress is by requesting your usage data, which includes files altered through the Instructor Feedback and downloads of Alternative Formats; you can request access to this data by filing a Behind the Blackboard ticket. For files altered through the Instructor Feedback, you might begin by comparing your overall number of clicks on the Ally indicators with the total number of attempts to fix files. Early in your roll-out, you may see a high number of indicator clicks against a low number of attempts to fix the file, as instructors first begin exploring the feedback and raising their awareness of accessibility issues. Across all Ally campuses, we have seen instructors attempt to fix a file about one-third of the time they click an indicator, so if your conversion to fixes remains low, you might consider offering some targeted professional development. Keep in mind, instructors might fix files without uploading them through Ally, such as by deleting the old file and adding the fixed file as a new one. Ally data won't reflect these fixes, so you might refer to the course-level issue data for more complete progress tracking.
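That one-third conversion rate is easy to compute for your own campus once you have the usage export. A sketch follows, assuming hypothetical column names (`indicatorClicks` and `fixAttempts`); your Behind the Blackboard export may be shaped differently.

```python
# Compute the indicator-click to fix-attempt conversion rate.
# Column names "indicatorClicks" and "fixAttempts" are assumptions.
import csv

clicks = fixes = 0
with open("usage_data.csv", newline="") as f:
    for row in csv.DictReader(f):
        clicks += int(row["indicatorClicks"])
        fixes += int(row["fixAttempts"])

rate = fixes / clicks if clicks else 0.0
print(f"Fix attempts per indicator click: {rate:.0%} (cross-campus average: ~33%)")
```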

You can also look at your file fixes and alternative format downloads by Course ID. You can then cross-reference the Course ID with your Institutional Report export to determine which courses and faculty are making fixes to their content, or where downloads are happening. When you observe courses where instructors have made a significant number of fixes, you may have found a campus champion. Reaching out with an email thanking them for their contributions, and working with them to encourage peers to do the same, can help build momentum and drive adoption. If you see a course with a large number of alternative format downloads, you might also reach out and let the faculty know: "Did you know students are downloading alternative formats of your course content, and that by making a few small fixes to your files, you can improve the quality of those formats? Here are a few places to get started."
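The cross-referencing itself is a simple join on the Course ID. Here's a sketch that surfaces the top candidates for champion outreach; the file names and column names (`courseId`, `fixAttempts`, `courseName`, `instructor`) are placeholders for whatever your two exports actually contain.

```python
# Join usage data to the Institutional Report export on course ID and
# rank courses by fix attempts. All column names here are placeholders.
import csv

def load(path, key):
    with open(path, newline="") as f:
        return {row[key]: row for row in csv.DictReader(f)}

usage = load("usage_data.csv", "courseId")
report = load("institutional_report.csv", "courseId")

# Courses present in both files, most fixes first
ranked = sorted(
    (cid for cid in usage if cid in report),
    key=lambda cid: int(usage[cid]["fixAttempts"]),
    reverse=True,
)
for cid in ranked[:10]:
    row = report[cid]
    print(row["courseName"], row["instructor"], usage[cid]["fixAttempts"])
```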

 

 
