Charting a Future With AI
The recent Triangle AI Summit aimed to deepen engagement with AI and develop leadership to contend with its potential and risks.

The summit demonstrated Duke’s leadership in the evolving field of artificial intelligence. Faculty from every school are conducting AI research, supported by administrative efforts such as the provost’s initiative and steering committee, which are setting the direction for Duke’s diverse work in AI. In addition, a pilot project with OpenAI provides all undergraduate students with a prepaid license to ChatGPT-4o.
Both Innovation and Challenges
“Enabling the fast-changing world of AI means we look forward to the things we will build over the next year, but which we can scarcely imagine today,” said Tracy Futhey, vice president and chief information officer of Duke’s Office of Information Technology.
New York Times technology reporter Cade Metz delivered the keynote address, and four panels of researchers, industry experts and educators from across the Triangle addressed the effects of AI on society and its role in scientific innovation.

The summit also included a student panel, a teaching showcase of more than 20 demos on how to use AI in the classroom, and two interactive workshops on using AI tools as task-oriented assistants.
The sessions highlighted extraordinary advances AI is bringing to society, but also raised concerns about its dangers and its impact on the workforce and higher education.
Potential in Healthcare
Nicoleta Economou-Zavlanos, director of Duke Health’s AI Evaluation and Governance program, gave examples of how AI is transforming health care.
“We truly do believe in the potential for AI, because we can see that AI can sometimes detect strokes for example in brain scans better than the human eye can,” she said.
Economou-Zavlanos also cited the benefit of ambient technologies, which transcribe conversations between providers and patients, in eliminating the need for tedious note-taking and reducing clinician burnout.
Brinnae Bent, executive in residence in the Engineering Graduate and Professional Programs in the Pratt School of Engineering, discussed her work on an AI-assisted medical device that helps individuals with neurological conditions, such as multiple sclerosis and cerebral palsy, regain mobility.
“Thousands of people all over the United States use this now to walk their daughter down the aisle, or run with their grandkids again,” Bent said.
At the same time, Bent acknowledged the harmful impacts of AI, particularly the targeting of teens by deepfake pornography and the use of racially biased police technologies.
Yakut Gazi, Duke’s vice provost for learning innovation and digital education, added that AI is disproportionately affecting women in the workforce, who hold many of the jobs most disrupted by AI.
Artificial and Human Intelligence Working Together
Another theme was the growing number of people working alongside AI. However, speakers agreed that AI is not yet poised to replace humans in most jobs.
Bent underscored the value of humans in the workforce, citing the fintech company Klarna’s recent replacement of customer service representatives with AI, which was followed by a roughly $40 billion drop in the company’s valuation.
In the same vein, Metz urged the audience to consider AI’s flaws. “There are so many things that we do that these systems do not. We’re just good at dealing with the chaos, the unexpected that comes up in our daily lives,” he said. “Machines are not as good at that. They’re good at recognizing patterns.”
“Hallucinations,” which are incorrect or misleading results generated by AI models, are a prime example of AI’s imperfections. Metz said AI uses probability to generate results, drawing on vast amounts of data from the internet that can be unreliable. Chris Bail, professor of computer science and sociology and director of Duke’s Society-Centered AI initiative, warned that this can fuel the spread of misinformation.
Yang emphasized the importance of thinking critically when using generative AI. “As an educator, I'm really worried about our next generation that is growing up in this world where AI is omnipresent. They just have this unfounded trust in these AI models.”
Joseph Salem, Duke’s vice provost for library affairs, noted the importance of preparing students for a rapidly changing research environment in the age of AI. Student panelists from Duke’s Code+ program echoed the need for higher education to adapt.
“Pandora’s box is open. It’s not something we could ever take back. It’s about evolving and shifting the education system,” Duke sophomore Dara Ajiboy said.
Duke’s AI steering and advisory committees will begin new work around the university’s AI Framework in the fall, creating new opportunities for the broader campus community to engage deeply with AI. To learn more, visit ai.duke.edu.