Thursday, June 6, 2019

France Bans Judge Analytics, 5 Years In Prison For Rule Breakers

France Bans Judge Analytics, 5 Years In Prison For Rule Breakers. Artificial Lawyer, June 4, 2019. https://www.artificiallawyer.com/2019/06/04/france-bans-judge-analytics-5-years-in-prison-for-rule-breakers/


Excerpts without links:

In a startling intervention that seeks to limit the emerging litigation analytics and prediction sector, the French Government has banned the publication of statistical information about judges’ decisions – with a five-year prison sentence set as the maximum punishment for anyone who breaks the new law.

Owners of legal tech companies focused on litigation analytics are the most likely to suffer from this new measure.

The new law, encoded in Article 33 of the Justice Reform Act, is aimed at preventing anyone – but especially legal tech companies focused on litigation prediction and analytics – from publicly revealing the pattern of judges’ behaviour in relation to court decisions.

A key passage of the new law states:

‘The identity data of magistrates and members of the judiciary cannot be reused with the purpose or effect of evaluating, analysing, comparing or predicting their actual or alleged professional practices.’ *

[...]

Insiders in France told Artificial Lawyer that the new law is a direct result of an earlier effort to make all case law easily accessible to the general public, which was seen at the time as improving access to justice and as a big step forward for transparency in the justice sector.

However, judges in France had not reckoned on NLP and machine learning companies taking the public data and using it to model how certain judges behave in relation to particular types of legal matter or argument, or how they compare to other judges.
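The article doesn’t describe how any vendor actually builds such models, but a minimal, purely illustrative sketch of the kind of analysis at issue might look like the following – grouping published decisions by judge and matter type and computing simple outcome rates. All of the field names, judge labels and records below are invented for illustration.

# Illustrative only: a toy aggregation of per-judge outcome statistics of the
# sort the new French law prohibits publishing. All data below is invented.
from collections import defaultdict

# Hypothetical records standing in for published case law.
decisions = [
    {"judge": "Judge A", "matter": "commercial lease", "outcome": "claimant wins"},
    {"judge": "Judge A", "matter": "commercial lease", "outcome": "claimant loses"},
    {"judge": "Judge A", "matter": "employment", "outcome": "claimant wins"},
    {"judge": "Judge B", "matter": "commercial lease", "outcome": "claimant wins"},
    {"judge": "Judge B", "matter": "employment", "outcome": "claimant loses"},
]

# Count outcomes per (judge, matter type) pair.
counts = defaultdict(lambda: {"wins": 0, "total": 0})
for d in decisions:
    key = (d["judge"], d["matter"])
    counts[key]["total"] += 1
    if d["outcome"] == "claimant wins":
        counts[key]["wins"] += 1

# Print a simple comparative statistic per judge and matter type.
for (judge, matter), c in sorted(counts.items()):
    rate = c["wins"] / c["total"]
    print(f"{judge} | {matter}: {c['wins']}/{c['total']} claimant wins ({rate:.0%})")

Publishing even this kind of simple per-judge comparison – never mind a full predictive model – is what Article 33 now appears to prohibit.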

In short, they didn’t like how the pattern of their decisions – now relatively easy to model – was potentially open for all to see.

Unlike in the US and the UK, where judges appear to have accepted the fait accompli of legal AI companies analysing their decisions in extreme detail and then creating models as to how they may behave in the future, French judges have decided to stamp it out.

Various reasons for this move have been shared on the Paris legal tech grapevine, ranging from the general need for anonymity, to the fear among judges that their decisions may reveal too great a variance from expected Civil Law norms.

One legal tech expert in France, who wished to remain anonymous, told Artificial Lawyer: ‘In the past few years there has been a growing debate in France about whether the names of judges should be removed from the decisions when those decisions are published online. The proponents of this view obtained this [new law] as a compromise from the Government, i.e. that judges’ names shouldn’t be redacted (with some exceptions to be determined) but that they cannot be used for statistical purposes.’

Whatever the reason, the law is now in effect and legal tech experts in Paris have told Artificial Lawyer that, as far as they interpret the regulations, anyone breaking the new rule can face up to five years in prison – which has to be the harshest example of legal tech regulation on the planet right now.
Forbidden knowledge…

That said, French case law publishers, and AI litigation prediction companies such as Prédictice, appear to be ‘doing OK’ without this specific information being made available. This is perhaps because, even with the judges taken out of the equation, there is still enough information in the rest of the case law material to be of use.

Moreover, it’s unclear whether a law firm, if asked by a client, could manually – or using an NLP system – collect data on a judge’s behaviour over many previous cases and create a statistical model for that client’s use, as long as it never published the results to any third party. That said, it’s not clear this would be OK either. And with five years in prison hanging over your head, would anyone want to take the risk?

But the point remains: a government and its justice system have decided to make it a crime to reveal, through statistical and comparative analysis, how their judges think about certain legal issues.

Some of the French legal experts Artificial Lawyer talked to this week asked what this site’s perspective was. Well, if you really want to know, it’s this:

[...]



Part of the French text covering the new law is below:

‘Les données d’identité des magistrats et des membres du greffe ne peuvent faire l’objet d’une réutilisation ayant pour objet ou pour effet d’évaluer, d’analyser, de comparer ou de prédire leurs pratiques professionnelles réelles ou supposées.

La violation de cette interdiction est punie des peines prévues aux articles 226-18, 226-24 et 226-31 du code pénal, sans préjudice des mesures et sanctions prévues par la loi n° 78-17 du 6 janvier 1978 relative à l’informatique, aux fichiers et aux libertés.’

(The second paragraph, in translation: violation of this prohibition is punishable by the penalties provided for in Articles 226-18, 226-24 and 226-31 of the Penal Code, without prejudice to the measures and sanctions provided for by Law No. 78-17 of 6 January 1978 on information technology, data files and civil liberties.)

(* Translated version above, via Google.)
