AI explainability: Regulations and responsibilities

As AI applications have become increasingly pervasive, demands for the accountability and explainability of AI systems have grown. Despite some regulatory developments, there is little guidance on what constitutes an acceptably rigorous “explanation”. In the absence of such clarity, and as AI systems become more complex, complying with undefined “explainability” requirements poses a difficult challenge for regulators and industry alike.
