Author: Dr. Aline Tanielian Fadel*
Diversity in Arbitration
Dispute Resolution Technology
“Say not, ‘I have found the truth’, but rather, ‘I have found a truth’.”
– Khalil (Kahlil) Gibran, The Prophet
As a university lecturer in law, I am frequently asked by my students: “What is the solution to this practical problem?” and I usually reply: “Law is not an exact science; focus on the reasoning that leads you to the solution you think is correct.”
Ideally, judges and arbitrators resolving a particular dispute would all reach the same solution, “the” truth, by applying the same law and rules to the same facts, but dissenting opinions and decisions taken by a majority vote are here to remind us that in reality they do not: rules of interpretation, personal beliefs and an arbitrator’s legal or technical background, among other causes, lead to a particular solution, “a” truth.
Predictive analytics are an artificial intelligence (“AI”) tool designed precisely for that: they can predict the outcome of a dispute on the basis of the previous awards and opinions rendered by the arbitrator a party is considering appointing. Predictive analytics thus aim to help their users, particularly the parties to an arbitration agreement and their counsels, select the arbitrator who is most likely to side with their position and arguments, i.e., their “truth.”
Indeed, like most AI tools developed nowadays in the field of legal services to assist with legal research (e.g., LexisNexis, Ross Intelligence), contract review and due diligence (e.g., ThoughtRiver, Leverton) and e-discovery (e.g., EDR), among other services, predictive analytics tools (e.g., ArbiLex, Lex Machina, Westlaw Edge):
use machine learning, wherein the AI identifies patterns and varies its algorithm based on already-existing data and user feedback. A subset of machine learning is deep learning models (or, artificial neural networks), which are inspired by the structure of a human brain. It identifies features without human intervention by learning from heavy volumes of pre-existing data.
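The pattern-drawing described above can be sketched in a few lines of purely illustrative code. Everything here is invented for this post (the data, the field names, the model); real predictive analytics tools use far richer features and far more sophisticated machine-learning models, but the core idea of inferring a tendency from past awards is the same:

```python
# Hypothetical sketch: learning an arbitrator's "pattern" from past awards.
# All data and field names are invented for illustration only.
from collections import defaultdict

def train(past_awards):
    """Count outcomes per (arbitrator, issue) pair in past award data."""
    counts = defaultdict(lambda: defaultdict(int))
    for award in past_awards:
        counts[(award["arbitrator"], award["issue"])][award["outcome"]] += 1
    return counts

def predict(model, arbitrator, issue):
    """Predict the most frequent past outcome; None if no track record."""
    outcomes = model.get((arbitrator, issue))
    if not outcomes:
        return None  # a newcomer: no pattern to draw from
    return max(outcomes, key=outcomes.get)

awards = [
    {"arbitrator": "A", "issue": "jurisdiction", "outcome": "upheld"},
    {"arbitrator": "A", "issue": "jurisdiction", "outcome": "upheld"},
    {"arbitrator": "A", "issue": "jurisdiction", "outcome": "declined"},
]
model = train(awards)
print(predict(model, "A", "jurisdiction"))  # -> upheld
print(predict(model, "B", "jurisdiction"))  # -> None (first-time candidate)
```

Note how the sketch already exposes the problem discussed below: for an arbitrator with no prior awards in the data, there is simply nothing to predict.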
One might think that with such features, predictive analytics tools should broaden the choice of arbitrators because they enable parties to analyze the trends and patterns of arbitrators from all backgrounds and nationalities, thereby contributing to the diversity of arbitrators.
But how can a pattern be drawn for candidates seeking a first appointment as arbitrator? If predictive analytics tools become the norm for nominating arbitrators, would they exclude newcomers from the pool? This was the question I asked during the very interesting webinar, “Artificial Intelligence and the Changing Face of International Arbitration,” organized by ICDR Young & International on 17 July 2020, with ArbiLex’s founder, Ms. Isabel Yishu Yang, among other exceptional speakers.
Ms. Yang’s reply to my above-mentioned question confirmed that predictive analytics are “pro-diversity” tools, even towards newcomers to the international arbitration field (I), provided that they are built on the basis of appropriate data and algorithms, failing which they could fall into the trap of promoting the “pale, male and stale” arbitrator stereotype (II).
I. PREDICTIVE ANALYTICS ARE “DIVERSITY-FRIENDLY”
With the significant volume of data that can be processed by AI tools and their machine-learning capacity, predictive analytics are the ideal tool to provide extensive information about arbitrators from every nationality, background, legal system, and location.
Moreover, with appropriate translation tools, even opinions and awards issued in various languages may be included in the processed data, providing information in English about arbitrators who do not use English as a working language.
Ms. Yang had clarified during the above-mentioned AI-related webinar of July 2020 that “diversity is an objective we can model,” meaning that the algorithm can be customized to suit the client’s request (whether that client is a party to the arbitration, a third-party funder, etc.) and could therefore prioritize diversity or be as diverse as instructed by the client.
In the same context, the IBA’s Arbitration Committee article, “AI arbitrator selection tools and diversity on arbitral panels,” mentions a similar opinion of Kwan, Ng and Kiu:
Diversity is an objective that data scientists can model like any other. Should a client prioritise diversity in their selection, factors such as race, age, and gender can all be built into a model to suggest new candidates. Alternatively, a chosen algorithm could turn a blind eye to race, geography, age and gender, focusing instead on suggesting candidates based on their knowledge in particular areas of law, languages spoken, average time taken to render a final award, any potential conflict of interest, and even availability.
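The point made by Kwan, Ng and Kiu, that diversity factors “can all be built into a model” or, alternatively, omitted from it, can be illustrated with a minimal scoring sketch. All candidates, factors and weights below are invented for illustration; no real tool's methodology is being described:

```python
# Hypothetical sketch: the same candidate data scored with or without a
# diversity factor, depending on the weights the client chooses.
def score(candidate, weights):
    """Weighted sum of candidate attributes; unweighted factors count as 0."""
    return sum(weights.get(k, 0) * v for k, v in candidate.items())

candidates = {
    "veteran":  {"expertise": 0.95, "availability": 0.2, "diverse": 0.0},
    "newcomer": {"expertise": 0.60, "availability": 0.9, "diverse": 1.0},
}

# A client prioritising diversity gives that factor real weight...
pro_diversity = {"expertise": 1.0, "availability": 0.1, "diverse": 1.0}
# ...while a model that "turns a blind eye" simply omits it.
blind = {"expertise": 1.0, "availability": 0.1}

for label, weights in (("pro-diversity", pro_diversity), ("blind", blind)):
    best = max(candidates, key=lambda c: score(candidates[c], weights))
    print(f"{label}: {best}")
```

With these illustrative numbers, the pro-diversity weighting recommends the newcomer while the “blind” weighting recommends the veteran, which is exactly the sense in which diversity is “an objective that data scientists can model like any other.”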
As for newcomers in the international arbitration field, their articles, opinions and blogs could all be fed as data into the predictive analytics tools to allow these tools to draw a pattern and predict the arbitrators’ position on a specific issue.
This is particularly feasible if the clients using predictive analytics are interested in finding new candidates for appointment as arbitrators, or are even bound to do so by law or the applicable arbitration rules, whether to avoid conflicts of interest or repeat appointments of the same arbitrator or simply because well-known arbitrators are unavailable: in its 2017 annual international arbitration survey, Bryan Cave Leighton Paisner found, for instance, that an overwhelming 92% of respondents said they would welcome more information about new and less well-known candidates.
Therefore, if the users of predictive analytics follow the pro-diversity trend of international arbitration institutions and the numerous organizations and initiatives currently advocating for diversity within the international arbitration community, they would be satisfied with the “diversity-friendly” outcome of the predictive analytics tools because research undertaken by the European Commission has shown that “by contrast to the human brain, algorithms offer opportunities to better visualise, measure, detect and ultimately correct discriminatory biases if proper legal regulation and public policy is put in place.”
Nevertheless, the issue with including newcomers to international arbitration in predictive analytics tools is probably less one of avoiding discrimination against them, and more one of how effectively a track record can be created so that their positions and opinions can be identified.
II. PREDICTIVE ANALYTICS RELYING ON HOMOGENEOUS DATA: A HURDLE TO DIVERSITY?
The IBA’s Arbitration Committee has highlighted the issue of pattern (or track record) creation in the context of newcomers in the arbitration field, pointing out that predictive analytics are more accurate with longer track records. Consequently, this could lead to the exclusion of newcomers:
the need for a longer track record presents a significant barrier to entry for more diverse arbitrator candidates. Indeed, a longer track record increases the ability of AI tools to predict future outcomes and, in turn, improves predictive accuracy. In the case of arbitrators, a track record is built on a given individual’s experience in case management and decision-making, including a number of pending and concluded arbitrations. It is easy to see how such a model would favour experienced arbitrators and keep newcomers and those with less experience out of the race.
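The statistical intuition behind this barrier can be sketched simply: the fewer past awards an arbitrator has, the wider the uncertainty around any pattern drawn from them. The illustration below uses the standard binomial margin-of-error approximation, which shrinks roughly as one over the square root of the number of awards; it is not taken from any specific tool:

```python
# Hypothetical illustration of why longer track records improve predictive
# accuracy: the margin of error of an estimated outcome rate shrinks as
# roughly 1/sqrt(n), so models "trust" arbitrators with more past awards.
import math

def margin_of_error(n, p=0.5, z=1.96):
    """95% confidence half-width for an observed rate p over n awards."""
    return z * math.sqrt(p * (1 - p) / n)

for n in (5, 20, 100):
    print(f"{n:3d} awards -> +/- {margin_of_error(n):.2f}")
```

A candidate with 5 awards carries roughly four times the predictive uncertainty of one with 100, which is precisely how a model that optimizes for accuracy ends up favouring experienced arbitrators.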
Furthermore, as mentioned above, predictive analytics are customized to their users’ requests. Despite the overwhelming number of articles and initiatives promoting diversity, it has been pointed out that many parties and their counsels nominate the same or famous arbitrators, leaving the task of choosing less well-known or more diverse arbitrators to the arbitral institutions. Yet “[p]arties are responsible for a large proportion of arbitrator appointments in most institutions (in 2017, for example, 76% of Vienna International Arbitral Centre (VIAC) cases, 58% of ICC cases, and 49% of LCIA cases).” Parties and their counsels using predictive analytics may therefore favor experienced arbitrators with long track records, built on data from previously rendered arbitral awards, over diversity and new faces in arbitration, and the predictive analytics tool will naturally follow these customers’ requests.
Consequently, it is crucial to raise awareness of the importance of diversity in the data and algorithms used by predictive analytics tools: track records and patterns could also be built on the academic opinions and published materials of professors, experts and renowned practitioners from all over the world who may have fewer arbitration appointments under their belt, but similar or perhaps greater expertise than other well-known arbitrators. In other words, the diversity of data and algorithms is the key to the diversity of arbitrators found through predictive analytics.
However, even if it is true that predictive analytics, like all technologies, are neutral towards diversity (as the writer William Gibson put it: “I think that technologies are morally neutral until we apply them. It’s only when we use them for good or for evil that they become good or evil.”), it remains difficult to model them to include newcomers to the arbitration field when those newcomers lack the elements needed to build a track record, without which predictive analytics cannot do what they are designed to do: predict.
* Dr. Aline Tanielian Fadel is an arbitrator and lecturer at Université Saint-Joseph in Beirut (USJ), and was appointed Partner and Head of the Arbitration Practice at Eptalex – Aziz Torbey Law Firm in Beirut in 2020. She has participated in numerous arbitration proceedings before the ICC, the Beirut Chamber of Commerce and Industry (BCCI), the DIFC and DIAC (both in Dubai), along with several ad hoc arbitration proceedings. She is a member of the Beirut Bar Association, the ICC Arab Arbitration Group, and ArbitralWomen, and is listed as arbitrator on the panels of the BCCI, the VIAC (Vienna) and the ADGM (Abu Dhabi).
 Aditya Singh Chauhan, Future of AI in Arbitration: The Fine Line Between Fiction and Reality, Kluwer Arb. Blog (Sept. 26, 2020), http://arbitrationblog.kluwerarbitration.com/2020/09/26/future-of-ai-in-arbitration-the-fine-line-between-fiction-and-reality/.
 Numerous articles have already highlighted the positive impact of diversity in arbitration and, more specifically, in the decision-making process of the arbitral tribunal. We will therefore not develop that aspect: the references at the end of the following article from the IBA’s Arbitration Committee constitute for example good reading materials tackling this matter. See generally Allyson Reynolds & Paula Melendez, AI arbitrator selection tools and diversity on arbitral panels, Int’l Bar Ass’n, https://www.ibanet.org/Article/NewDetail.aspx?ArticleUid=97cb79fa-39e9-48c1-8cb0-45569e2e62af.
 A recording of the webinar can be viewed at ICDR Young & International, Artificial Intelligence and the Changing Face of International Arbitration, YouTube (July 16, 2020), https://youtu.be/peqWhovuHVg. The question is discussed from 1:10:00 onwards.
 Meaning a white man of a certain age. For instance, at the 2014 ICCA Miami Conference, the international arbitration community gathered to address the question, “Who are the arbitrators?” The answer, panel attendees were told, was “male, pale, and stale.” Joseph Mamounas, ICCA 2014. Does “Male, Pale, and Stale” Threaten the Legitimacy of International Arbitration? Perhaps, but There’s No Clear Path to Change, Kluwer Arb. Blog (Apr. 10, 2014), http://arbitrationblog.kluwerarbitration.com/2014/04/10/icca-2014-does-male-pale-and-stale-threaten-the-legitimacy-of-international-arbitration-perhaps-but-theres-no-clear-path-to-change/.
 Artificial Intelligence and the Changing Face of International Arbitration, supra note 3. See Ms. Yang’s answer from 1:13:00 onwards.
 AI arbitrator selection tools and diversity on arbitral panels, supra note 2.
 James Kwan, James Ng & Brigitte Kiu, The Use of Artificial Intelligence in International Arbitration: Where Are We Right Now?, 22 Int’l Arb. L. Rev. 19, 21 (2019).
 Bryan Cave Leighton Paisner, Diversity on Arbitral Tribunals: Are We Getting There? (Jan. 12, 2017), https://www.bclplaw.com/en-US/insights/diversity-on-arbitral-tribunals-are-we-getting-there.html.
 See for example, ArbitralWomen, the Equal Representation in Arbitration Pledge (ERA Pledge), the Rising Arbitrators Initiative (RAI) supporting arbitration practitioners under 45 with their first arbitration appointments, among others.
 Janneke Gerards & Raphaële Xenidis, Algorithmic discrimination in Europe: Challenges and opportunities for gender equality and non-discrimination law: A special report, European Comm’n (2020), at 11.
 AI arbitrator selection tools and diversity on arbitral panels, supra note 2.
 Carol Mulcahy & Victoria Clark, BCLP International Arbitration Surveys: Party Appointed Arbitrators and the Drive for Diversity, Kluwer Arb. Blog (Oct. 24, 2018), http://arbitrationblog.kluwerarbitration.com/2018/10/24/permanent-contributor-12/.
 Gemma Anderson, Richard Jerman & Sampaguita Tarrant, Diversity in International Arbitration, Practical Law, available at https://www.mofo.com/resources/insights/190318-diversity-international-arbitration.html.