As artificial intelligence makes inroads in the legal system, court reporters are pushing back, highlighting the potential risks of relying solely on AI and automatic speech recognition (ASR) for capturing and transcribing legal proceedings.
Recently, the National Court Reporters Association (NCRA), a 125-year-old trade group representing over 12,000 professionals, hired Washington, D.C.-based lobbying firm Farragut Partners to help it address the emerging use of AI in the judicial system. The NCRA has expressed concerns about the potential harms of using AI and ASR in legal proceedings without a stenographic reporter present to verify the chain of custody of the official record.
The push for AI and ASR in the legal system is driven by the promise of increased efficiency and cost savings. Proponents argue that these technologies can streamline the process of capturing and transcribing legal proceedings, reducing the need for human court reporters. However, the NCRA and other experts in the field have pointed out several critical issues that must be addressed before AI and ASR can be trusted to handle such sensitive and important tasks.
One of the primary concerns is the accuracy of these technologies. As highlighted in a white paper published by the NCRA in November 2023, AI and ASR systems have been shown to exhibit significant biases based on factors such as race, gender, and age. For example, a Stanford University study found that the error rates of ASR systems were nearly double for Black speakers compared to white speakers. This bias raises serious questions about the fairness and reliability of using these technologies in a legal setting, where the accuracy of the record is of utmost importance.
Lisa Migliore Black, Vice-Chair of the NCRA STRONG Committee, emphasizes the gravity of the situation: "In a justice system dedicated to equality, automated transcription has no place because it continues to yield transcripts that contain many 'inaudible' parentheticals as well as higher error rates than trained human court reporters."
Another issue raised by the NCRA is the potential for manipulation and tampering of digital audio and video files. With the rise of deepfake technology and voice cloning, it has become increasingly easy to alter recorded content, with potentially severe consequences in a legal setting. Without proper safeguards and a clear chain of custody, the integrity of the official record could be compromised, eroding public trust in the judicial system.
While the adoption of AI and ASR in the legal system is still in its early stages, it is crucial that policymakers, courts, and legal professionals carefully consider the potential risks and limitations of these technologies. The NCRA's efforts to bring attention to these issues and push for regulations and guidelines are an essential step in ensuring that the pursuit of efficiency does not come at the cost of accuracy, fairness, and public trust.
As Sue A. Terry, Chair of the NCRA STRONG Committee, aptly puts it, "Success cannot be measured by short-term budgetary considerations but instead should be measured by honest, equal, and fair treatment for all parties."
The battle between court reporters and AI in the legal system is not just about protecting jobs; it is about preserving the integrity and reliability of the official record upon which the entire judicial process depends.