Explaining the Unexplainable: Duke’s New AI Course

10/1/24 Pratt School of Engineering

Duke Engineering's new Coursera specialization on emerging artificial intelligence trends is open to all


“A good decision is based on knowledge, and not on numbers.”—attributed to Plato.

Brinnae Bent, an artificial intelligence (AI)/machine learning (ML) research scientist and executive-in-residence in Duke's Master of Engineering in Artificial Intelligence program, frequently cites this quote in the AI courses she teaches at Duke's Pratt School of Engineering.

Bent said that in most AI and ML courses, students are taught to focus on numbers rather than knowledge. That will not be the case in Duke's newly launched online Coursera specialization on Explainable Artificial Intelligence (XAI), which Bent instructs.

“Explainable AI and interpretable ML enable us to use knowledge, rather than numbers, to better understand the predictions made by a model and subsequently the decisions humans make based on these predictions,” Bent said.
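To make that distinction concrete, here is a minimal illustrative sketch in Python using scikit-learn (the dataset and model are assumptions chosen for brevity, not material from the specialization). A shallow decision tree is interpretable by construction: alongside its prediction, the "number," it can surface the decision rules, the "knowledge," that produced it.

```python
# Illustrative sketch only; dataset and model choices are assumptions, not course content.
from sklearn.datasets import load_breast_cancer
from sklearn.tree import DecisionTreeClassifier, export_text

data = load_breast_cancer()
X, y = data.data, data.target

# A shallow decision tree is interpretable by construction.
model = DecisionTreeClassifier(max_depth=2, random_state=0).fit(X, y)

# The "number": a bare prediction with no explanation attached.
print("Prediction for the first sample:", model.predict(X[:1])[0])

# The "knowledge": the human-readable rules the model actually applies.
print(export_text(model, feature_names=list(data.feature_names)))
```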


The online specialization includes three courses:

  • Developing Explainable AI
  • Interpretable Machine Learning
  • Explainable Machine Learning

The courses are designed to empower future AI leaders with the skills needed to design solutions for real-world challenges that are both powerful and ethically responsible.

As AI rapidly takes the world by storm, transforming fields like health care and finance and shaping everyday life through social media, developing AI systems that are accurate, transparent and trustworthy is critical.

“I strongly believe anyone who is using or building AI systems should learn more about developing responsible AI,” Bent said. “As these systems are increasingly being used for critical decision making, it becomes imperative that those building these systems are considering AI understandability.”

Brinnae Bent (foreground), during her time as a PhD student, preparing to download information from a wearable health monitoring device, with Duke BME faculty member Jessilyn Dunn.

By exploring topics like state-of-the-art (SOTA) interpretable machine learning approaches, mechanistic interpretability, and explainable AI in Large Language Models (LLMs), learners will build the understanding of AI systems that Bent describes.
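As one small example of the kind of post-hoc technique that falls under explainable AI (again an illustrative sketch, not the specialization's own material), permutation feature importance asks how much a black-box model's accuracy drops when each input feature is shuffled, turning raw predictions into a ranked account of what the model relies on:

```python
# Illustrative sketch of a model-agnostic explainability technique; all choices here are assumptions.
from sklearn.datasets import load_breast_cancer
from sklearn.ensemble import RandomForestClassifier
from sklearn.inspection import permutation_importance
from sklearn.model_selection import train_test_split

X, y = load_breast_cancer(return_X_y=True, as_frame=True)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

# Train a "black-box" model whose raw predictions are hard to interpret directly.
model = RandomForestClassifier(n_estimators=100, random_state=0).fit(X_train, y_train)

# Shuffle each feature and measure the drop in held-out accuracy;
# large drops mark the features the model actually depends on.
result = permutation_importance(model, X_test, y_test, n_repeats=10, random_state=0)
ranked = sorted(zip(X.columns, result.importances_mean), key=lambda item: item[1], reverse=True)
for name, score in ranked[:5]:
    print(f"{name}: {score:.3f}")
```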

Students will leave the course with the tools to “enhance the transparency and accountability of AI systems, build trust with users and stakeholders, and address ethical and privacy concerns in AI development,” according to the course website.

The 35-hour program, created by Duke's AI Master of Engineering program and Duke's Learning Innovation and Lifetime Education (LILE), is open to anyone around the world, supporting Duke's institutional goal of leading in the AI field by making this knowledge accessible to everyone.

As AI continues to change our lives, Bent finds joy in being part of this technological journey.

“I love teaching these topics because the topics are emerging,” Bent said. “This is a very fast-paced field. Much of the content of the courses is only a few years old.”

She said that course content on new fields like mechanistic interpretability and explainable AI in generative computer vision includes research as recent as the summer of 2024.

With transparency, interpretability and explainability at the core, this course will be Coursera’s first offering on XAI and will be the first certified online course to cover mechanistic interpretability and XAI in LLMs, Bent said.

Duke students and alumni will receive free access to the course. All students who complete the course will earn a Duke career certificate.
