A student sits in lecture with their head in their hands, looking at the results of AI transcription on their laptop.
Created with DALL·E

AI Transcription Isn’t Working for Students with Disabilities. Here’s How to Fix It.

“So let’s check credit seminars typically right like on people a lot like discussion, a lot of policy and that. But they’re also like similar to like, regular kids freedom to talk with each other. Quite a bit to let me note, I’m not the only one who teaches dance class.”

This paragraph is a transcript of a lecture at Georgetown University Law Center, generated by state-of-the-art AI technology. Do you think you could turn it into something useful? Think carefully about your answer, because the Law Center expects students with disabilities to do just that. Georgetown is just one of many law schools in the United States implementing technologies that not only fail to address accommodations needs, but simply do not work. In this article, we explain how such technologies have harmed both students with disabilities and the entire student body, and we offer recommendations for how to deploy these technologies moving forward.

The accommodations process, past and present

The Americans with Disabilities Act (ADA) and Section 504 of the Rehabilitation Act of 1973 require postsecondary institutions to provide reasonable accommodations for disabled students. This means that when a student submits medical records to the Law Center’s Office of Disability Services (ODS) demonstrating the existence of a disability, Georgetown must ensure that the student has “an equal opportunity to achieve the same result or the same level of achievement as others.”

Peer note-takers, along with class recordings, have long been recognized as reasonable and effective accommodations for disabled students who cannot take notes on their own, for example, due to a motor or cognitive limitation. In the past, such students could request a note-taker, and ODS would pay a classmate to take notes on their behalf. This fall, administrators purchased an Otter.ai business license and began providing it to disabled students instead of note-takers, sometimes over students’ objections.

If AI transcription is used to accommodate students with disabilities, it should complement, not replace, student note-takers. This is not only good and inclusive policy but may also be necessary to provide the “reasonable accommodations” required by law.

AI transcription technology does not work 

“I guess denoted by detention for right back to the five month veil of pretrial detention, who gets held back they get filled in between the preliminary hearings for this detention, I guess, assume they’re during the chapter three on the bus.”

AI transcriptions are often neither accurate nor usable. Otter does not allow real-time error correction, so students must note errors and their timestamps in a separate document as they occur. Requiring this kind of monitoring from someone who has requested assistance with note-taking due to cognitive or physical limitations is far from an “accommodation”: it distracts from the learning experience and hinders class engagement. It also requires students with disabilities to spend additional time outside of class replaying the audio recording and correcting mistakes. Irrelevant conversation, grammatical errors, and filler words often riddle the output, making this a laborious and lengthy process.

This burden is especially felt by law students, because Otter cannot properly process law-specific terminology, such as case names and terms of art. The software was not originally designed with the legal field in mind. One student reported spending fifteen hours outside of class working to create usable transcripts for just two class periods. This would be an enormous burden on any law student, but is especially arduous for students with disabilities, many of whom already have less time to dedicate to their studies.

AI biases harm our most marginalized communities

AI models inherit biases from their training data and design, and those biases disproportionately impact marginalized communities. With transcription tools, this bias manifests in two ways: 1) the software design process does not incorporate the perspectives of people with disabilities, and 2) the resulting models transcribe the accents and languages of people from diverse communities inaccurately.

First, without incorporating the perspectives of people with disabilities, transcription tools cannot adequately serve them. As an Otter executive explained, “We didn’t build a product with accessibility at the forefront, it came to us.” Though Otter is certainly trying to improve its accessibility, the problems mentioned above show there is still a long way to go.

Second, AI transcription tools can only recognize accents and terms they have been trained to understand. Otter supports only English, mostly as spoken with North American or European accents. As a result, Otter is worse at transcribing classmates and professors who are non-white, who learned English as a second language, or who speak with an accent. Beyond Otter, this is a widespread and well-documented problem: a recent study found that error rates for automated speech recognition tools were nearly twice as high for Black speakers as for white speakers. By serving white professors better than professors who speak African American Vernacular English or accented English, Otter cannot help but perpetuate inequality.

No, the solution isn’t just more of the same AI

It is possible to improve the accuracy of AI transcription models with more training data drawn from a more diverse set of speakers and a greater breadth of legalese. But that training should not come at the expense of the time and privacy of Georgetown Law students.

The Office of Disability Services licensed Otter in part to replace Panopto videos, which were universally accessible during the first two years of the COVID-19 pandemic. In response to faculty concerns that the recordings stifled classroom discussion, GULC revoked the universal accessibility policy and began sharing Otter with disabled students instead. Yet Otter recordings capture student speech just as Panopto did; the only difference is that students in Otter-recorded classes are not notified that their every question, comment, or remark will be recorded, raising serious concerns around consent norms.

Otter’s retention of class recordings on its servers to train its “proprietary artificial intelligence technology” is also troubling. Independent evaluations have flagged Otter’s privacy practices; a journalist who interviewed a Uyghur activist once received an email from Otter about their conversation afterwards; and Otter shares data with third-party advertisers. Research also shows that de-identified data is not truly anonymous and that AI models can leak their training data, posing a real risk that class recordings could be exposed elsewhere. If students knew about these practices, the use of Otter could well chill free speech in the classroom.

Moreover, for students who do not wish to disclose their disability status, using Otter is a privacy nightmare. First, students must obtain permission from professors in order to record a class, inviting ableism, stereotyping, and discrimination. Second, ODS advises students to sit at the front of the classroom and keep Otter open on their laptop screens to monitor errors, which “outs” disabled students to peers sitting nearby. Third, disabled students register for the service using their Georgetown emails, handing Otter a list of students with disabilities and their contact information. Why does this matter? A person’s disability status is considered “High Risk Data” at Georgetown, meaning access to it should be restricted as much as possible, not organized for third parties to access or share.

Finally, training Otter to be usable in the legal field should not require disabled students to put in hours of labor correcting transcripts. Nor should those students be forced to rely in the interim on substandard or incorrect transcriptions, especially transcriptions of marginalized professors and students. There is a long history of bad actors in the United States exploiting the data and bodies of the most marginalized, including disabled people, for profit. If Otter wants to improve its AI and its data, it should proceed ethically: offer its services explicitly as a beta test or compensate users for their labor, while securing the consent and protecting the privacy of all participants involved.

Recommendations, and the bigger picture

Let us be clear: despite their flaws, Otter and other AI transcription tools can still meet specific needs for some disabled people. But while AI transcription could be a useful tool for some students, it cannot fully replace note-taking or class recordings as a reasonable accommodation.

Moreover, AI transcription at Georgetown Law is but one part of a larger, concerning trend of introducing AI technologies into the classroom without due consideration of their harms. Institutions should take a more careful approach to integrating AI systems, following the principles in the White House’s Blueprint for an AI Bill of Rights: 1) ensure systems are safe, accurate, and effective; 2) take proactive measures to prevent algorithmic discrimination; 3) maintain individuals’ data privacy; 4) provide notice and explanation to users on how and why the system is being used; and 5) allow users to opt out of automated systems. Based on these principles, we offer specific recommendations for how to implement these systems moving forward:

Recommendations for Georgetown and other universities

  1. Bring back note-takers: If Georgetown continues to use Otter, it should follow a “both-and” approach by offering it as a complement to, not a replacement for, preexisting, functional accommodations such as note-takers.
  2. Provide notification and consent: The school should notify students when a class is recorded. Georgetown should also ensure that Otter does not store class recordings on its servers for longer than a semester or use them to train its AI models unless all students consent to such use. Otherwise, Georgetown should license software from a different vendor that offers these safeguards.
  3. Provide community representation for tech procurement: Georgetown should create a procurement board for technology that’s supposed to serve disabled students, and ensure disabled students or faculty are represented on it. This could prevent similar issues in the future and mirrors proposals in other contexts such as surveillance technologies and pretrial incarceration algorithms.

Recommendations for Otter and other AI vendors

  1. Include customers in the process: To create a tool that meets the needs of the disability community, ensure members of that community are involved from the start by hiring executives, engineers, and designers who are themselves disabled. Elevate accessibility as a priority by creating a role dedicated to the issue, such as a Chief Accessibility Officer. More broadly, implement similar practices for other marginalized communities.
  2. Use a values-based approach to guide the sale and use of technology: Technology vendors have the power to shape the ecosystem by embedding human values in their software. Create an AI ethics policy that respects individual dignity and autonomy. Sell and market the technology as a complement to, not a replacement for, existing accommodations. Don’t overstate the AI model’s efficacy; publish model cards to be transparent and honest about its training process and biases.
  3. Make AI models better, the right way: Train AI software to be more effective across various contexts. But don’t experiment on vulnerable people without their consent and knowledge. Ensure that when using people’s data for training, each individual provides informed consent and is compensated for any costs incurred.

These proposals are not groundbreaking; the accommodations process is already supposed to be individualized and tailored to each student’s needs. Our recommendations would both address students’ immediate needs and drive the systemic changes needed to prevent such harms from recurring. By demonstrating the responsible use of AI technologies in education, and by fully respecting the needs of our disability community, both educational institutions such as Georgetown and AI vendors such as Otter can set an example for others.

E.L. Tremblay & Ashwin Ramaswami

E.L. Tremblay
Disability Law Student Association President; Georgetown Law, J.D. expected 2023; Colgate University, B.S. 2016.

Ashwin Ramaswami
GLTR Staff Member; Georgetown Law, J.D. expected 2024; Stanford University, B.S. 2021.

Thanks to Rebecca Johnson and Joanne Springer for their thoughts and feedback on this piece, and to Amanda Levendowski for reviewing this piece in her capacity as Faculty Advisor of the Disability Law Students Association (DLSA).