• These thoughts by Paul Rawlinson, Global Chair of Baker McKenzie, are sobering and realistic. Will lawyers become extinct in the age of automation? His observations include, “…(T)he market will kill those who don’t adapt. They are the ones who should be scared of the machines. For them, the robots are coming. The really wise lawyers, they know it’s not one versus the other. For those who can find ways to use AI to augment, not replace, judgement and empathy, I believe the future is very bright indeed.”

  • The Legal AI Forum has commissioned a survey (Artificial intelligence and the future of law) of “200 professionals within the legal sector” and presented the results in this report. Results such as those charted in the report suggest their interviewees may be on the leading edge of things. More coverage of this very optimistic report here.

  • From Artificial Lawyer: “Global law firm Linklaters has partnered with the International Swaps and Derivatives Association, (ISDA) to build a platform that automates significant parts of derivatives documentation and also helps to negotiate initial margin (IM) issues.”

  • AI in healthcare: New data sources pose ethical conundrum for AI. “Technologists developing AI tools for healthcare must “completely re-engineer” their data flows around de-identified data to avoid regulatory hurdles, Stanley Crosley, an attorney who chairs the data privacy and health information governance team at Drinker Biddle, said.”

  • More from SOLI2018 here, including, “That includes embracing artificial intelligence rather than being fearful of it. Robots will not take your job,” said Shawnna Hoffman, global cognitive legal co-leader at IBM. “Robots will take away the things that annoy you, like processing invoices.”

  • If you’re at all interested in the legal (especially liability) implications of autonomous vehicles, read this post from Artificial Lawyer.

  • Also from Artificial Lawyer: “Luminance is branching out into the regulatory world in order to expand its offering by covering areas such as Brexit impact on contracts and GDPR compliance. … the company’s initial strategy of focusing only on M&A due diligence is well and truly over, with a mission now to capture a greater share of the NLP-driven doc review market across different practice areas.”

  • Here are 14 Ways Law Firms Are On-Point With Their Tech Game. Good examples.

From Law firms:

Clifford Chance: Clifford Chance establishes Best Delivery and Innovation Hub for Asia Pacific in Singapore.

Finnegan (podcast): Susan Tull on Patenting the Future of Medicine. “Artificial intelligence, or AI, is rapidly transforming the world of medicine. AI computers are diagnosing medical conditions at a rate equal to or better than humans, all while developing their own code and algorithms to do so. With the rise of AI, there are new issues of patentability, inventorship, and ownership that must be addressed.”

Hogan Lovells (white paper): ADG Insights: Artificial Intelligence. “…(T)he top legal and political issues affecting the aerospace, defense, and government services (ADG) industry.”

Littler (survey of 1,111 in-house counsel, human resources professionals and C-suite executives): The Littler® Annual Employer Survey, 2018. “Recruiting and hiring is the most common use of advanced data analytics and artificial intelligence, adopted by 49 percent of survey respondents.”

  • Thomas B. Edsall contributed this opinion piece to the NYT about the impacts of AI and other major economic changes (e.g., getting Trump elected): Industrial Revolutions Are Political Wrecking Balls. Sobering.

  • And now, some unsettling news about increased use of AI by Facebook and the Russian military.

  • Finally for this week, here’s a thought piece for your weekend by Anthony Giddens, former director of the London School of Economics and member of the House of Lords Select Committee on Artificial Intelligence: A Magna Carta for the digital age. Among his recommendations:

The main elements of that charter are that AI should:

Be developed for the common good.

Operate on principles of intelligibility and fairness: users must be able to easily understand the terms under which their personal data will be used.

Respect rights to privacy.

Be grounded in far-reaching changes to education. Teaching needs reform to utilize digital resources, and students must learn not only digital skills but also how to develop a critical perspective online.

Never be given the autonomous power to hurt, destroy or deceive human beings.