Lawless Playground


The world was set on fire when OpenAI launched ChatGPT to the public in late November 2022. For the first time, everyone from schoolchildren to the elderly had access to machine learning-assisted information. Before long, more than 4,000 generative artificial intelligence (AI) tools became available for everyday use—from chatbots and search engines to research and voice generation.

Digital technology is shaking up the legal world, touching every corner of the profession and sparking fresh legal and policy debates. At the forefront of this transformation are experts at Maryland Carey Law, who are exploring the crossroads of law and technology, delving into how these changes are shaping law, policy, legal education, and the practice of law itself.

A New Frontier

Professor Liza Vertinsky says that the legal community faces significant complications with the adoption of new technology. Lawyers must adapt quickly, and the legal industry must innovate to keep pace. That is a challenge when preparing new lawyers for the digital quagmires they may face. “We have to teach our students how existing laws apply to new technology, and how they must respond to it agilely as lawyers,” she says.

“It is important that faculty help students think big picture,” says Markus Rauschecker ’06, lecturer and executive director of the Center for Cyber, Health, and Hazard Strategies. “We need to think of the implications and the best way to proceed with the law and policy making to ensure we don’t lose sight of core values such as social justice, equity, and fundamental values when moving at these breakneck speeds.”

Rauschecker teaches courses on cybersecurity law for students in the JD and Master of Science in Law programs. “Maryland Carey Law is one of the first law schools to offer cybersecurity law,” he says. “When we started the program a decade ago, technological policy issues were not a concern for state legislation. It has evolved to the point that the team now provides counsel to government agencies and private corporations on the legal framework.”

Another example of Maryland Carey Law blazing a trail in law and technology is its testing of AI on legal knowledge. Professor Andrew Blair-Stanek has used OpenAI’s GPT software to take legal exams. In November 2023, OpenAI released GPT-4 Turbo, which was touted as being able to pass the bar exam. Blair-Stanek had the technology take fall 2023 law school final exams, which were graded on the same curve as the students’ exams. The results, he mused, were underwhelming. The software earned three B+’s, a B, three B-’s, a C+, and a D. Still, the results improved overall from the prior semester, when Blair-Stanek tested an earlier version of OpenAI’s GPT. “It will be interesting to see how OpenAI’s models perform in the future,” Blair-Stanek wrote.

Advanced Technology and Justice

As society becomes more digitized, technology is playing a bigger role in shaping important decisions in areas like criminal justice, housing, labor, health care, and education. In the field of critical race and digital studies, Professor Chaz Arnett is uncovering how seemingly neutral technologies can reinforce old patterns of racial inequality and injustice. “We have turned to advanced technology as a way to deal with some of the ‘less savory’ aspects of our justice system, such as police bias and racial discrimination,” he says. Arnett emphasizes that advancing technologies—like algorithms, surveillance tools, social media platforms, predictive analytics, and digital databases—can carry racial biases and cause harm.

“For example, the use of electronic ankle monitors as an alternative to detention has several implications,” he explains. He says that corrections—the supervision and management of individuals who have been arrested, convicted, or sentenced for criminal offenses—is being outsourced to the community. “Policies and procedures around this have created burdens and raised several questions, such as whether it is safe and whether it is beneficial.”

Arnett explores themes of technology and policing, technology’s obstruction of racial justice, and ways of engaging in local and national advocacy in his scholarship and teaching. “My overarching goal is to prepare the next generation of lawyers to think critically about these issues,” he says.

Also at the intersection of law and technology is forensic evidence in today’s criminal cases. In addition to traditional methods, such as fingerprints and DNA, new technology brings digital and algorithmic methods, including surveillance tools, into criminal legal systems. The work in forensic defense will come into play this fall with the launch of the Innocence Project Clinic, a transformative collaboration between the Office of the Maryland Public Defender, University of Baltimore Law School, and Maryland Carey Law. Students in the clinic will represent clients who maintain their innocence but have been convicted of serious crimes in Maryland state courts.

Dean Renée Hutchins Laurent is enthusiastic about the new clinic, stating, “This collaboration represents a significant step forward in our efforts to provide our students with hands-on legal experience while making a meaningful impact on the lives of those who have been wrongfully convicted.”

Containing the Chaos

While generative AI news has exploded in the last three years, the technology itself has been around for nearly a century thanks to scientific pioneers such as Alan Turing and Herbert Alexander Simon.

What is new is the wide availability of AI tools.

Generative AI software that can assist lawyers with research, briefs, and contracts may seem like a boon for the legal field. Yet, Professor Patricia Campbell, director of the Intellectual Property Law Program and the Maryland Intellectual Property Legal Resource Center, says that many of these products create negative value. “These AI-generated tools sometimes produce a low-quality product. It still needs to be vetted by someone with a legal background,” says Campbell. This raises concerns, such as clients thinking they don’t need a lawyer now that they have access to this software, or lawyers themselves using these tools as shortcuts instead of doing their own legal research. “Generative AI can be helpful as a starting point, but a lawyer needs to review every single word to ensure a brief, contract, or other document works for their client,” she adds.

Law students need to recognize both the usefulness of these tools in the classroom and the perils of relying on them as a crutch. Campbell points out that there have been instances where briefs were submitted to the court containing citations to cases that never existed. One such example was described to students at Maryland Carey Law by judges from the Fourth Circuit when they spoke at the school in March 2025.

The generative AI tool had created a hallucination: a made-up answer that appeared legitimate because it followed patterns in the tool’s training data. “It’s a tool. That doesn’t mean lawyers and legal experts shouldn’t perform due diligence and fact check the work,” she says.

But fear not, says Nathan D.M. Robertson ’12, retired director of information policy and management and copyright law lecturer. “It’s all going to work out. The copyright world has been panicking about new technology for decades, such as MP3s, VCRs, and Xerox machines. Creativity didn’t end. Digital technology, such as generative AI, is going to make us rethink the ways we do things, but it’s all going to work out.”

Wild, Wild West of AI Copyright

Since generative AI tools became publicly available in 2022, more than 15 billion AI-generated images have been created, according to Leonardo.Ai, and the number continues to skyrocket. In a space once reserved for designers and artists creating original artwork with digital tools, anyone can now generate new images with a few prompts in a generative AI tool. Images and text are only the beginning. There are now tools that let anyone create music, video games, proposals, contracts, and even voice-generated podcasts.

Intellectual property and copyright law experts are concerned that this new content is being created and used without a regulatory framework. “There are so many questions that we don’t know the answers to – it’s like when the internet started and we claimed it was the ‘wild, wild west’ of digital technology and copyright law,” says Campbell. She notes that the same questions from 25 to 30 years ago still resonate today: should we regulate it, how do we regulate it, or will it work itself out?

According to Robertson, there are three major areas where generative AI products potentially run afoul of copyright law:

1. Training of Generative AI Tools: Generative AI tools, such as ChatGPT, Claude.ai, and others, need to copy massive amounts of data for training purposes so the tool can generate a product or response based on the patterns in that data. Because these companies replicate data without a license, their actions may infringe on the copyright owner’s exclusive rights. The courts have yet to decide this issue.

2. Copyright Infringement with a New Creation: A generative AI tool can produce an image or narrative that heavily borrows from a copyrighted work. A user might ask the generative AI tool to create a logo or presentation and add it to a website, not knowing that it closely resembles an existing work. If the copyright owner can prove both copying and “substantial similarity,” the copyright owner may have an infringement case against both the maker of the generative AI tool and the user.

3. Non-copyrightable Work: The U.S. Copyright Office requires human authorship in order to register copyright in a work. If the creative output is solely the product of generative AI, it is not a work of authorship and is not protected by copyright at all. Close cases can arise when a human creator guides the generative AI’s output, and the question becomes whether the human’s contributions to the work qualify as creative authorship. However, simple prompts probably do not qualify as authorship for purposes of copyright law; instead, the prompts are merely ideas. And ideas, says Robertson, are not copyrightable.