AI Attestation Services: How Do You Know Your AI Models Perform as Expected
This IDC Perspective discusses artificial intelligence (AI) attestation services and how businesses can determine whether their AI models are performing as expected. Corporations have been investing significantly in AI since generative AI (GenAI) became real, usable, and value adding. Organizations can see major potential benefits in the areas of customer satisfaction, new and/or enhanced products, customer growth, and revenue growth. With these benefits come the risks AI can present if models are inadequately developed, evaluated, assessed, implemented, maintained, and used for business decisions. There is a potential need for independent, third-party AI attestation or peer-review service providers that can deliver the added level of trust and certainty that the risks of implementing certain AI models are manageable.

"AI is here to stay, and it is only going to become more sophisticated for business use. We now have an opportunity in these nascent stages to drive best practices in the development, management, maintenance, and enhancement of AI for critical business needs," says Philip Harris, research director, Governance, Risk, and Compliance Services and Software at IDC. "By taking these initial steps carefully and thoughtfully, the trustworthiness of AI will grow, and organizations will significantly reduce the risk they must contend with going forward."
Please Note: Extended description available upon request.
Executive Snapshot
Situation Overview
What Is Needed?
Types and Number of Organizations Using the Models
Independent AI Peer Review or Attestation Services