AMA: Effective AI rollout requires insight into responsibilities

Wed 30 July 2025

AI has a growing impact on healthcare worldwide, in areas ranging from diagnosis and administration to advice and prediction. This has prompted the American Medical Association (AMA) to advocate for clarity about who is responsible for what in the rollout and use of AI: in short, AI governance. At the same time, AI must continue to support the work of healthcare professionals, who must remain in the lead.

According to the AMA, the share of American physicians using AI tools nearly doubled in 2024, from 38 to 68 percent. As adoption grows, so does the need to use AI responsibly, safely, and effectively. The AMA deliberately defines AI as augmented intelligence to emphasize that its role is to support healthcare professionals, not replace them.

"Clinical decision-making should still rest with clinicians," said Dr. Margaret Lozovatsky, Chief Medical Information Officer and Vice President of Digital Health Innovations at the AMA. "AI simply enhances their ability to make those decisions." That distinction, Lozovatsky argues, is crucial. AI directly impacts patient care and outcomes, and as it becomes more embedded in daily practices and workflows, it brings new challenges and responsibilities.

Real Risks

"There are real risks associated with implementing these technologies," Lozovatsky stated during a recent AMA webinar. "That's why it's important to understand the critical need for governance." 

The fundamental pillars of responsible AI implementation, according to the AMA, are:

  • Establishing management accountability and structure.
  • Forming a working group to develop priorities, processes, and policies.
  • Reviewing current policies.
  • Developing AI policies.
  • Defining processes for project intake, vendor evaluation, and assessment.
  • Updating standard planning and implementation processes.
  • Establishing an oversight and monitoring process.
  • Supporting organizational AI readiness.

Start with Strategic Leadership

Establishing accountability is the first and most essential step in safe, scalable, and meaningful AI integration. Management sets the vision, provides oversight, and ensures that AI and its implementation align with system-wide priorities.

"Involving the leadership team is crucial," says Lozovatsky, a pediatrician and a nationally recognized leader in digital health and healthcare informatics. "All their disciplines will be affected, so buy-in from those leaders is essential."

This leadership often consists of:

  • Chief Medical Officer.
  • Chief Nursing Officer.
  • Chief Quality Officer.
  • Chief Operating Officer.
  • Chief Information or Chief Technology Officer.
  • Chief Digital Officer.
  • General Counsel.
  • Chief Medical Information Officer.

Such a multidisciplinary governance structure guides implementation based on feedback from healthcare providers. "It's crucial to have people representing all these disciplines because they best understand the relevant considerations," emphasizes Lozovatsky. From there, each leader should delegate responsibilities to trusted stakeholders within their domain to shape workflows, policies, and project evaluations.

Establishing Clinical Governance

The AMA recommends a three-pronged model for organizing AI governance:

  1. Clinical Leadership: Top-level support is essential to ensure that AI aligns with organizational priorities and patient care. Leaders should also delegate implementation responsibilities.
  2. Advisory Boards: Committees should evaluate technologies and address clinical concerns, while ensuring that new tools are interoperable with existing systems.
  3. Specialties: Genuine engagement with healthcare professionals on the ground in clinical specialties will contribute to the development and implementation of AI tools that meet the unique needs of those specialties.

This approach should ensure that AI supports broader institutional goals and that resources are allocated appropriately. "Organizations likely already have processes in place for evaluating technology," says Lozovatsky. "They will need to consider the unique aspects of AI and how these will be addressed with existing models and what additional governing bodies are needed," with the goal of ensuring consistent oversight and alignment with institutional priorities.

Integrate Leadership in Clinical Informatics

Clinical informatics encompasses clinical, technical, and operational considerations, making this expertise an integral part of decision-making. "Governance of any clinical technology depends on a deep understanding of what both technology and people can and should do," says Lozovatsky. Involving these experts early in the governance discussion is essential.

Organizations must address several strategic questions before designing a structure:

  • How does AI support our strategic goals?
  • How rigorous should our approach to AI adoption be?
  • What internal capabilities and external partnerships do we need?
  • Who is responsible for oversight and compliance?
  • Does AI fit into our existing governance model, or do we need new committees or roles?

Establishing a governance framework begins with treating AI as a tool to advance institutional goals. Evaluate existing internal capabilities to determine their readiness to assess and implement AI. Healthcare institutions can then make fully informed decisions about integrating AI into existing structures or creating new ones.

Building Trust

Perhaps the most important function of AI governance is building trust with physicians and other healthcare professionals that these tools are safe, with patients that their data is secure, and with leaders that implementing AI will advance their healthcare mission. “Healthcare organizations must ensure that AI is implemented in a safe and thoughtful way,” Lozovatsky argues. “We must demonstrate that we are supporting the care of our patients and our clinicians in their ability to deliver that care.”

A clear objective and strong implementation structures enable healthcare systems to move forward with confidence. Establishing administrative accountability within a workable governance framework supports the collaboration that fosters meaningful AI integration.

AI governance is not just about oversight. It’s about creating a culture of innovation within a structured framework that positions AI as a tool for improving healthcare. “Doing this in a safe and thoughtful way,” Lozovatsky concludes, “supports the care of our patients and our clinicians in delivering it.”

A careful rollout of AI in healthcare requires an equally careful, well-substantiated strategy, with sound AI governance and clarity about roles and responsibilities.