Racism in predictive justice: the issues with algorithmic policing

A new article by Will Douglas Heaven, senior AI editor at the MIT Technology Review, has called for an end to the use of AI-powered predictive policing and justice. The article looks at a number of ways that race feeds into AI algorithms, and how this can harm minorities. It suggests that current AI systems, when applied to justice, end up reinforcing existing systemic racism, and may even amplify bias, as judgements formed by a supposedly objective system then entrench the prejudices that shaped it.

Heaven, therefore, suggests that until AI has been developed to the point where it can be genuinely objective, it should not be used in such an important decision-making capacity, particularly as discussions continue in the US and globally around racism and bias in the justice system.

Visit the MIT Technology Review to read the full argument.


BSB to use AI to carry out online testing

The Bar Standards Board announced on 12 May 2020 that the Bar Professional Training Course and Bar Transfer Test assessments, which were delayed from April to August, will be carried out online using Pearson’s OnVUE secure global online proctoring solution, which allows for remote invigilation. Holding the exams within this timeframe means that students with pupillage offers can take these up in the autumn, rather than facing further delays.

The BSB has said that the “OnVUE system uses a combination of artificial intelligence and live monitoring to ensure the exam is robustly guarded, deploying sophisticated security features such as face-matching technology, ID verification, session monitoring, browser lockdown and recordings.” However, the system has drawn criticism for potentially prejudicing students with young children, as it automatically ends the test if another person is detected in the presence of the examinee.

BSB director-general Mark Neale said: “Since the current health emergency began… students and transferring qualified lawyers have had to face considerable uncertainty, which we very much regret, and I am delighted that we can now deliver centralised assessments remotely in August with Pearson VUE’s state-of-the-art online proctoring system.”

For more information see the full article on the BSB site.


AI Regulation in Europe

Abstract

With the regulation of Artificial Intelligence (AI), the European Commission is addressing one of the central issues of our time. However, a number of core legal questions are still unresolved. Against this background, the article in a first step lays regulatory foundations by examining the possible scope of a future AI regulation, and by discussing legal strategies for implementing a risk-based approach.

In this respect, I suggest an adaptation of the Lamfalussy procedure, known from capital markets law, which would combine horizontal and vertical elements of regulation at several levels. This should include, at Level 1, principles for AI development and application, as well as sector-specific regulation, safe harbors and guidelines at Levels 2-4. In this way, legal flexibility for covering novel technological developments can be effectively combined with a sufficient amount of legal certainty for companies and AI developers.

In a second step, the article implements this framework by addressing key specific issues of AI regulation at the EU level, such as: documentation and access requirements; a regulatory framework for training data; a revision of product liability and safety law; strengthened enforcement; and a right to a data-free option.

Citation
Hacker, Philipp, AI Regulation in Europe (May 7, 2020). 

Download the full paper from the SSRN


Key Elements of Responsible Artificial Intelligence – Disruptive Technologies and Human Rights

Abstract

One major challenge facing humankind in the 21st century is the widespread use of Artificial Intelligence (AI). Hardly a day passes without news about the disruptive force of AI – both good and bad. Some warn that AI could be the worst event in the history of our civilization. Others stress the chances of AI diagnosing, for instance, cancer, or supporting humans in the form of autonomous cars. However, because AI is so disruptive, the call for its regulation is widespread, including calls by some actors for international treaties banning, for instance, so-called “killer robots”. Nevertheless, until now there is no consensus on how, and to what extent, we should regulate AI. This paper examines whether we can identify key elements of responsible AI, spells out what exists as part of “top down” regulation, and how new guidelines, such as the 2019 OECD Recommendations on AI, can be part of a solution to regulate AI systems. In the end, a solution coherent with international human rights is proposed to frame the challenges posed by AI that lie ahead of us without undermining science and innovation; reasons are given why and how a human rights-based approach to responsible AI should inspire a new declaration at the international level.

Citation
Voeneky, Silja, Key Elements of Responsible Artificial Intelligence – Disruptive Technologies and Human Rights (January 1, 2020). Freiburger Informationspapiere, January 2020.

Available from the SSRN


Professions and Expertise: How Machine Learning and Blockchain are Redesigning the Landscape of Professional Knowledge and Organisation

Abstract

Machine learning has entered the world of the professions with differential impacts. Engineering, architecture, and medicine are early and enthusiastic adopters. Other professions, especially law, are late and in some cases reluctant adopters. And in the wider society, automation will have huge impacts on the nature of work. This paper examines the effects of artificial intelligence and blockchain on professions and their knowledge bases. We start by examining the nature of expertise in general and then how it functions in law. Using examples from law, such as Gulati and Scott’s analysis of how lawyers create (or don’t create) legal agreements, we show that even non-routine and complex legal work is potentially amenable to automation. However, professions are different because they include both indeterminate and technical elements that make pure automation difficult to achieve. We go on to consider the future prospects of AI and blockchain for the professions and hypothesise that as the technologies mature they will incorporate more human work through neural networks and blockchain applications such as the DAO. For law, and the legal profession, the role of lawyer as trusted advisor will again emerge as the central point of value.

Citation
Flood, John A. and Robb, Lachlan, Professions and Expertise: How Machine Learning and Blockchain are Redesigning the Landscape of Professional Knowledge and Organisation (August 9, 2018). Griffith University Law School Research Paper No. 18-20.

Available from the SSRN site.


AI-Enabled Business Models in Legal Services: From Traditional Law Firms to Next-Generation Law Companies?

What will happen to law firms and the legal profession when the use of artificial intelligence (AI) becomes prevalent in legal services? This paper addresses this question by considering specific AI use cases in legal services, and by identifying four AI-enabled business models (AIBM) which are relatively new to legal services (if not new to the world). These AIBMs are different from the traditional professional service firm (PSF) business model at law firms, and require complementary investments in human resources, intra-firm governance and inter-firm governance. Law firms are experimenting with combinations of business models. We identify three patterns in law firm experimentation: first, combining the traditional PSF business model with the legal process and/or consulting business models; second, vertically integrating the software vendor business models; and third, accessing AIBMs from third-party vendors to take advantage of contracting for innovation. While predicting the future is not possible, we conclude that how today’s law firms transform themselves into tomorrow’s next generation law companies depends on their willingness and ability to invest in necessary complements.

Citation

Armour, John and Sako, Mari, AI-Enabled Business Models in Legal Services: From Traditional Law Firms to Next-Generation Law Companies? (July 12, 2019). Available at SSRN.


Event: Challenges of Global Digitalisation for Governance and Justice

16-17 September 2019, Luxembourg
European Institute of Public Administration (EIPA)

About this course

Digitalisation is rapidly transforming our world and affects governance, businesses and justice. In light of this, there is an urgent need to adopt solutions responding to global digital changes in automation, artificial intelligence, blockchain technology, the digitalisation of legal practices and services, and electronic evidence.

The seminar will address the main issues at play in overcoming the challenges of the rapid scientific and technological changes faced by governments, economies and markets, as well as justice systems. Centred on global cooperation and technological convergence, the seminar will explore the following solutions: governance innovation and technological distribution, taxation of the digital market economy, digital justice, protection of people’s fundamental rights, and the generation of new digital rights.

Who is this course for

Public administration officials, legal officers and technical staff of national public administration and ministries, justice professionals, legal counsellors, practicing lawyers, EU staff.

Read more and book…


SRA launches legal access challenge

Legal Access Challenge launched to encourage innovation

  • Six in 10 don’t think the legal system in England and Wales is set up for ordinary people
  • Many who experience a legal problem don’t take professional advice, citing cost and trust as key barriers
  • Eight in 10 say it needs to be easier for people to access legal guidance and advice
  • We are partnering with Nesta Challenges to launch a prize to make legal support more accessible and affordable through new technology

New research from Nesta Challenges reveals six in ten (58%) people in England and Wales think the legal system is not set up for ordinary people, with the vast majority wanting it to be easier for people to access legal support.

The research was conducted to mark the launch of the Legal Access Challenge – a new prize we are running in partnership with Nesta Challenges – which aims to help more people access legal services through new technology.

The survey also found one in seven (15%) people in England and Wales have experienced a legal issue in the last 10 years; although with only half (51%) of all respondents confident they can identify whether a problem is a legal matter, the true figure is likely to be far higher. We know from existing data that very few people seek professional advice from a solicitor or barrister when they have a problem, and the research showed people are instead turning to friends and family (20%) or Google (16%) for legal advice.

When asked about barriers to accessing legal advice, seven in ten (68%) cite the high cost, followed by uncertainty about the cost (56%) and knowing who to trust (37%). The vast majority (79%) believe it needs to be easier for people to access legal guidance and advice for themselves.

There is a widespread belief that technology could be the solution to this, with six in 10 (59%) saying they think technology could lead to better services to help people resolve their legal problems. People believe that the biggest benefits to using a digital service for legal advice would be having a fixed price upfront for legal fees (38%), being able to understand their rights (26%) and having access to cheaper legal advice and information (23%).

Part of our wider programme to drive innovation in the sector, the Legal Access Challenge will offer £250,000 in grants to help innovators develop new technology solutions to help make legal advice more affordable and accessible for the majority.

Chris Gorst, Head of Better Markets, Nesta Challenges, said: “For too many people, legal support and advice seems out of reach and reserved for those with the time and money to navigate a complex legal system.

“Technology is not a panacea, but in many areas of our lives it has transformed the choice, convenience and quality available to us and this could be true in legal services too. The UK is a world leader in both technology and legal services, and there is a huge economic and social opportunity in bringing these together.

“We are launching the Legal Access Challenge to help demonstrate what technology can do and to bring these new solutions to market. We want to see digital solutions that directly support individuals and small businesses to access legal services conveniently and affordably, and which can help close the ‘legal gap’ we currently face.”

Nesta Challenges is part of Nesta, the innovation charity, and offers financial prizes to stimulate innovative solutions to some of the biggest challenges society faces. The team works with regulators, policymakers and others to help make markets more competitive and open, advising on how regulatory reforms and targeted public investment programmes can work together to achieve greater impact.

Anna Bradley, Chair of the SRA Board, said: “Whether they are dealing with a personal legal matter, or running a business, people need to be able to get legal support when it really matters.

“Having access to professional advice is important at those life changing moments. And for small businesses, it can make the difference between success and failure.

“There are real barriers for people looking for help and the innovative use of technology is one way of tackling those barriers.

“We want our regulation to support new ideas. The Legal Access Challenge can help to drive the development of new approaches which will deliver tangible benefits to the public, opening up access to legal services for as many people as possible.”

The Legal Access Challenge is open to entrants until 11 August 2019. More information can be found at www.legalaccesschallenge.org


The implications of AI on legal regulators and how they can use it

At last year’s ICLR Annual Conference in The Hague, ICLR members came together to present on the implications of AI for legal regulators and how they might harness this technology to their advantage. Panellists drew on input from ICLR members about how their own institutions were engaging with artificial intelligence, as shown in the infographic below:

The presentation covered various aspects, including:

  • What is Artificial Intelligence? … And what it isn’t: Steve Wilson, Standpoint Decision Support
  • What are the Potential Risks to be Managed: Bridget Gramme, Center for Public Interest Law at the University of San Diego School of Law
  • How Legal Regulators can use AI: Crispin Passmore, Solicitors Regulation Authority
  • Getting into Artificial Intelligence: Alison Hook, Hook Tangaza

You can access the full presentation here:  ICLR Artificial Intelligence Presentation


Interested in the impact of new technologies on regulation? Get involved at this year’s annual conference. Contact Jim McKay (jamesmckay@lawscot.org.uk) to become involved as a speaker or session moderator. 


California Bar Exploring Opportunities To Deploy AI

The agency is examining how artificial intelligence could help it review misconduct complaints and administer the bar exam.


The State Bar of California has started wading into the artificial intelligence waters.

The agency is exploring ways AI could help bolster the efficiency of its attorney discipline system and assist with administering the bar exam.

The bar recently entered into a contract with the MITRE Corporation to help it develop and evaluate algorithmic processes for identifying whether an attorney misconduct complaint could be closed without investigation.

State Bar Executive Director Leah T. Wilson said if an AI tool can be crafted to help the bar more speedily review whether to close or investigate complaints, it would reduce the administrative burden on staff. That in turn would allow the bar to shift its “human resources to other parts of the case processing continuum,” Wilson said.

Freeing up staff to assist with serious allegations of lawyer misconduct could be beneficial in light of the state auditor’s recent recommendation that the State Bar not hire as many new employees for its discipline unit as desired.

Wilson said an AI tool could also assist in ensuring a level of consistency and standardization in the bar’s review of the roughly 16,000 attorney misconduct complaints it receives annually. The technology could be especially helpful amid a nearly 60 percent increase in complaints made to the bar in recent months, a bump that has come as the agency began accepting online complaints.

It was the transition last fall to permitting online complaints that allowed the bar to even contemplate an AI tool for reviewing such filings, according to Wilson.

She stressed that the MITRE project is in the early stages, and the bar will closely examine the effectiveness of the tool developed.

“If it’s not reliable to a very high degree of statistical significance, we couldn’t implement it,” Wilson said.

The bar’s $90,000 contract with MITRE, which was signed last month, calls for the company to complete its work on the project by mid-October.

Meanwhile, the State Bar is planning to use AI to help it administer the First-Year Law Students’ Examination, known as the “Baby Bar.” Law students completing their first year of law study at an unaccredited law school or through the Law Office Study Program are among those who must take the test.

At two of the sites where the Baby Bar will be given next month, the bar will be piloting the use of AI proctors for the essay portion of the exam that is taken on computers, Wilson said.

Live proctors will be there as well to ensure things go smoothly and to respond if the AI software alerts them to any patterns of eye movement or gestures out of the norm for test takers.

“If all goes well, it is our intention to deploy AI proctoring for the July bar exam,” Wilson said.

The bar began exploring AI proctoring because it struggled last year to attract enough proctors to administer the exams it gives to prospective lawyers.

Wilson said there will still need to be some human proctors present at future exams for security purposes and to help with any issues that arise. But she said a successful pilot of AI proctoring would reduce the overall need for human proctors moving forward.

On both the bar exam and attorney discipline fronts, Wilson said the bar is seeking to strike the right balance between being forward-leaning while protecting against the risks that arise with the deployment of new technologies.

“I think it is the responsibility of good government everywhere to figure out how we can take advantage of technology to improve the services that we provide to the public,” she said.

Separate from the initiatives mentioned above, a bar task force is actively working to identify regulatory changes that would provide members of the public with greater access to legal services through technology, such as AI. The next meeting of the Task Force on Access Through Innovation of Legal Services is Monday, May 13.


*This article first appeared on Evolve the Law. 
