SCOTUS and Artificial Intelligence
Full-length view of the United States Supreme Court building in Washington, D.C.


You may have encountered artificial intelligence (AI) in classroom assignments, in the songs you listen to, and even in the movies and videos you watch. Now, AI is being used to make the Supreme Court more accessible to everyday Americans. Here, btw examines how this innovative use of technology makes the Supreme Court of the United States (SCOTUS) more democratic. 

The Oyez Project 

The Supreme Court of the United States makes decisions that shape our everyday lives. Yet very few Americans ever get to see, hear, or experience that decision-making process for themselves. A professor at Northwestern University named Jerry Goldman believed that the public should have easy access to what goes on behind the closed doors of the Supreme Court.  

So in 1996, Professor Goldman created a nonprofit project called Oyez, a website where people could go to listen to the Supreme Court’s oral arguments and opinions. Oyez included access to audio recordings of every case since 1955, when the Supreme Court began recording its courtroom proceedings. This was a big deal because access to the recordings had been very limited: people who wanted them often had to wait months, and some recordings were lost altogether. Before Oyez, many Americans weren’t even aware that the proceedings were being recorded at all. 

During the COVID-19 pandemic and lockdown, the Court was forced to broadcast its oral arguments live so that the public could listen in. Since then, the justices’ arguments have continued to be broadcast. But the announcement of a decision has always been limited to the people in the courtroom at the time. And since no cameras are allowed in the courtroom, only a few hundred Americans throughout history have ever experienced a decision announcement firsthand. 

“On the Docket” 

To capture the experience of being inside the courtroom at the moment when an important decision is announced, Professor Goldman’s team used AI to create avatars of the Supreme Court justices. By matching the avatars with real audio, they were able to create realistic videos of the decision announcements. The first one they created was of Chief Justice John Roberts delivering a fourteen-minute summary from the bench of the Supreme Court’s decision granting U.S. presidents immunity from prosecution while in office. It is followed by a video of Justice Sonia Sotomayor voicing her dissent (disagreement) from the decision. The project is called “On the Docket.” 

How Real Is Reality? 

Goldman’s team faced several ethical concerns when creating AI-generated avatars and videos of the justices delivering their decisions. How lifelike should the videos be, and should they carry some kind of warning label to indicate that they aren’t real? Ultimately, the team decided to make the videos slightly cartoonish, so that it is clear to the naked eye that they aren’t real, and to label them as AI-generated. 

The People’s Court 

Over the years, Goldman’s Oyez project has sought to give the people greater access to the Supreme Court in other ways as well. For example, Goldman used a digital camera to create virtual reality tours of the courtroom and the chambers of several of the justices. 

With “On the Docket,” Goldman’s team hopes to share snippets of the videos through social media and YouTube. Goldman believes this is the best way to get Gen Zers interested in watching the decisions that will shape their future. 

What Do You Think? The name of Professor Goldman’s project, Oyez, comes from the French verb ouïr, which means “to hear.” Why do you think his team chose to call it this?