Automated Legal Guidance Redesign | The regulatory review



Agencies need to be careful when using chatbots, virtual assistants, and preset menus to share legal information.

Federal agencies carry out many functions and responsibilities. One of these is helping members of the public understand and apply the law. Increasingly, agencies provide this assistance through automated legal guidance tools such as chatbots, virtual assistants, and other automated systems.

For example, when people have questions about their immigration status, they can turn to Emma, the U.S. Citizenship and Immigration Services’ computer-generated virtual assistant. When they have questions about their student loans, they can ask Aidan, the U.S. Department of Education’s virtual assistant for federal student aid. And when they have questions about how personal and business activities affect their U.S. federal tax obligations, they can consult the Internal Revenue Service’s Interactive Tax Assistant.

There are a number of explanations for federal agencies’ increasing use of automated legal guidance tools. Under the Plain Writing Act of 2010, agencies are required to communicate complex legal rules and procedures to the public in “plain language.” However, formal law—including statutes, regulations, and court decisions—is often so complex that it is difficult, if not impossible, for most members of society to understand.

Agencies also often lack the resources to fully explain legal issues through the use of human customer service representatives. In addition, agencies face pressure to provide services comparable to the private sector, where automated customer service tools have become commonplace. Automated tools appear to be helping agencies address a number of these issues by translating complex legal material and making it more accessible.

As a result, several federal agencies now use automated legal guidance tools to respond to tens of millions of inquiries about the law each year. Other agencies are considering implementing these tools as a supplement to, or replacement for, human customer service representatives. Despite the prevalence of this shift, scholars who have studied technology and artificial intelligence in government agencies have not focused on agencies’ use of automation to explain the law.

To address federal agencies’ increasing use of automated legal guidance, the Administrative Conference of the United States (ACUS) tasked us with examining these tools and offering recommendations for reform.

In our study, we surveyed the use of automated legal guidance tools across all federal agencies and conducted an in-depth study of two main models of automated legal guidance. The first—a decision-tree “answer” model—requires users to click through topics online to find answers to their questions. In contrast, the second—a natural-language “sort” model—allows users to enter their questions in natural language and then uses artificial intelligence to sort the questions into categories and provide relevant information.
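The contrast between the two models can be illustrated with a minimal sketch. Everything here is hypothetical—the topics, answers, and keywords are invented for illustration, and the naive keyword matcher merely stands in for the machine-learning classification a real natural-language tool would perform:

```python
# Model 1: decision-tree "answer" model. The user navigates a fixed menu of
# topics; the tool returns only pre-written answers stored at the leaves.
# All topics and answer text below are hypothetical illustrations.
DECISION_TREE = {
    "filing": {
        "deadline": "Returns are generally due in mid-April.",
        "extension": "You may request a filing extension.",
    },
    "refunds": {
        "status": "Check your refund status with the online tracker.",
    },
}

def answer_via_tree(choices):
    """Follow the user's sequence of menu clicks down to a stored answer."""
    node = DECISION_TREE
    for choice in choices:
        node = node[choice]
    return node

# Model 2: natural-language "sort" model. The user types a free-text
# question, and the tool classifies it into a category before responding.
# A real tool would use a trained classifier; simple keyword overlap is
# used here only to make the sorting step concrete.
CATEGORY_KEYWORDS = {
    "filing": {"file", "return", "deadline", "due"},
    "refunds": {"refund", "money", "back"},
}

def sort_question(question):
    """Assign a free-text question to the category with the most keyword hits."""
    words = set(question.lower().split())
    scores = {cat: len(words & kws) for cat, kws in CATEGORY_KEYWORDS.items()}
    best = max(scores, key=scores.get)
    return best if scores[best] > 0 else "unsorted"
```

The sketch also makes the study’s central concern concrete: in both models, the user ultimately receives a short, pre-written simplification of the law, whether reached by clicking through a menu or by automated classification of a free-text question.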

We explored how these different models of automated legal guidance produce responses that conform to or deviate from the underlying law. To learn how agency staff themselves think about such tools, we also interviewed federal agency staff who have direct responsibility for the most well-developed automated legal guidance tools used by the federal government, or oversight responsibility for guidance at agencies that have developed such tools. We also interviewed U.S. Government Accountability Office staff who work with agencies to develop such tools.

We have found that automated legal guidance tools offer many benefits to agencies and the public. They enable agencies to respond to public inquiries more efficiently than human customer service representatives, help the public navigate complex legal regimes and, for certain inquiries, provide accurate answers based on the underlying law. Automated legal guidance tools also allow agencies to disclose their views to the public in an easily accessible format.

But automated legal guidance tools also have drawbacks. In their attempt to offer simple explanations of complex law to the public, automated legal guidance tools may provide advice that deviates from the underlying law. This occurs in both the decision-tree “answer” model and the natural-language “sort” model. For example, both models may present unsettled law as unambiguous, add an administrative gloss to the law, and omit discussion of statutory and regulatory exceptions and requirements. These deviations can mislead the public about how the law applies to their personal circumstances.

Currently, agencies’ automated legal guidance tools also provide users with little information about the underlying law on which the guidance relies, and little or no warning about the guidance’s limited legal authority or users’ inability to rely on it as a matter of law. We also found that no federal agency publishes archives of changes made to its automated legal guidance.

The potential for automated legal guidance to mislead members of the public, coupled with the public’s inability to meaningfully rely on such guidance, may exacerbate disparities in fairness between members of the public who have access to reliable advice through legal advisers and those who do not.

Interviews with federal agency officials also revealed that agency officials did not adequately appreciate some of the shortcomings of automated legal guidance. We heard little concern from agency officials about reliability issues, as they took the position that members of the public do not, or should not, rely on automated legal guidance. Agency officials held this belief in part because they viewed automated legal guidance as merely providing “information,” not as a source of law. This reaction was common, even though millions of people turn to automated legal guidance each year to get answers about the law from federal agencies.

Automated legal guidance plays an important role in advising members of the public about the law, and agencies will undoubtedly continue to use it to explain the law to the public. Agencies should nonetheless be aware of the potential drawbacks of such guidance, especially as its use expands.

Based on our report, the ACUS full assembly adopted 20 recommendations earlier this year regarding agencies’ use of automated legal guidance in the following topic areas: design and management, accessibility, transparency, and reliability.

These recommendations include, among other things, a call for agencies to consider when and whether a user’s good-faith reliance on guidance from automated legal guidance tools should serve as a defense against penalties for noncompliance. The recommendations also encourage agencies to allow users to obtain a written record of their communications with automated legal guidance tools, including date and time stamps.

In addition, agencies should explain the limitations of the guidance users receive when the underlying law is unclear or unsettled. To the extent possible, agencies should provide access through automated legal guidance tools to the legal materials underlying the tools, including relevant statutes, rules, and judicial decisions. More generally, agencies should design and operate automated legal guidance tools in ways that promote fairness, accuracy, clarity, efficiency, accessibility, and transparency.

It is critical that agencies follow these best practices when implementing automated legal guidance tools. As our research revealed, automated legal guidance can enable agencies to communicate complex law to the public effectively. But it can also lead the government to present the law as simpler and clearer than it is — a phenomenon that current agency practices threaten to exacerbate, including by making automated legal guidance appear more personalized than it is.

Ultimately, our report and the ACUS recommendations provide agency staff with guidance to maximize benefits and minimize costs as they implement automated legal guidance to help members of the public learn about and comply with the law.

Joshua D. Blank is a professor and director of strategic initiatives at the University of California, Irvine School of Law.


Leigh Osofsky is a professor and associate dean for research at the University of North Carolina School of Law.

This essay is part of a three-part series for the Administrative Conference of the United States entitled Using Technology and Contractors in the Administrative State.
