Call for Participation
On the importance of explanation content for explainable AI: Towards an explanation content research agenda
Organizers
Helmut Degen
Siemens Corporation, USA
helmut.degen@siemens.com
George Margetis
Foundation for Research and Technology – Hellas (FORTH), Greece
gmarget@ics.forth.gr
Stavroula Ntoa
Foundation for Research and Technology – Hellas (FORTH), Greece
stant@ics.forth.gr
Motivation of the Workshop
Explainable AI research has made strong progress in designing and comparing explanation types such as causal, counterfactual, contrastive, example-based, case-based, rule-based, feature-attribution-based, and hybrid approaches. This body of work has advanced our understanding of what kinds of explanations can be generated and how they can be presented.
At the same time, explanation content remains comparatively under-researched: what an explanation should contain beyond its type, how that content should be structured, and how it should adapt to user roles, goals, tasks, and operational contexts. In many papers, content is treated implicitly or described only at a high level, which can leave it unclear how the investigated content elements were derived and how they relate to user needs and task demands. As a result, it can also be unclear whether the studied explanation content is relevant to users at all. This limits comparability across studies, complicates replication, and slows the accumulation of a coherent body of knowledge on explanation content for trustworthy AI.
This workshop is motivated by the need to complement the focus on explanation types with a systematic focus on explanation content as a first-class research object grounded in human-centered perspectives. The workshop aims to advance shared definitions, encourage clearer reporting practices, and identify robust methods for deriving and validating explanation content.
Aim of the Workshop
- Establish a shared understanding and scope for explanation content in XAI, grounded in human-centered needs
- Explore negative consequences for XAI research when explanation content is not identified in a human-centered way
- Map methodological gaps and define a prioritized research agenda for human-centered explanation content research
Expected Workshop outcomes
- A shared scope statement for explanation content in XAI
- A structured set of negative consequences of not identifying explanation content in a human-centered way, with implications for research and practice
- A set of open research questions, methodological gaps, and challenges, grouped by themes
- A prioritized research agenda and follow-up plan, including candidate publication or dissemination targets
Workshop topics
The workshop focuses on the elicitation of explanation content and its role in explainable AI research.
Workshop agenda
Workshop event: 13:30 – 17:30, Sunday, 26 July 2026
The following is a framework for the program of the Workshop:
Time | Program event
30 min. | Collect experiences from all workshop participants regarding the need for human-centered methods for eliciting explanation content
30 min. | Establish a shared understanding of what explanation content is in XAI
45 min. | Explore negative consequences of not identifying explanation content in a human-centered way
45 min. | Identify research gaps and research questions
60 min. | Prioritize the research agenda and define follow-up actions
Guidelines to prospective authors
Submission for the Workshop
Interested participants should submit a position paper (around 1,000 words, excluding references) on the importance of explanation content for explainable AI.
Prospective authors should submit their proposals in PDF format through the HCII Conference Management System (CMS).
Submission for the Conference Proceedings
The contributions to be presented in the context of Workshops will not be automatically included in the Conference proceedings.
However, after consultation with the Workshop organizer(s), authors of accepted Workshop proposals who are registered for the Conference are welcome to submit, through the Conference Management System (CMS), an extended version of their Workshop contribution. Following further peer review, such submissions will be considered for presentation at the Conference and inclusion in the “Late Breaking” volumes of the Conference proceedings, either in the LNCS as a long paper (typically 12 pages, but no less than 10 and no more than 20) or in the CCIS as a short paper/extended poster abstract (typically 6 pages, but no less than 4 and no more than 11).
Workshop organizers are also encouraged to consider and explore the (additional) possibility of preparing a paper (short or long) which will present the collaborative efforts of their Workshop participants, and can be submitted in October 2026 to be considered for publication in the context of the HCII 2027 Conference Proceedings.
Workshop deadlines
Submission of Workshop contributions | April 10, 2026
Authors notified of decisions on acceptance | April 24, 2026
Finalization of Workshop organization and registration of participants | May 29, 2026
Workshop organizers

Helmut Degen
Dr. Helmut Degen is Senior Key Expert for User Experience at Siemens Corporation, Princeton, NJ, USA. He conducts explainable AI (XAI) research for industrial applications at Siemens with a focus on human-computer interaction. He is also the co-chair of the annual International Conference on Artificial Intelligence in Human-Computer Interaction (AI-HCI), affiliated with the HCI International Conference. He received a Master of Science (German: “Diplom-Informatiker”) from the Karlsruhe Institute of Technology and a PhD in Information Science from the Freie Universität Berlin (both in Germany).

George Margetis
Dr. George Margetis is a computer scientist specializing in Human-Computer Interaction (HCI), Human-Centered Artificial Intelligence (HCAI), Ambient Intelligence (AmI), Extended Reality (XR), and Digital Accessibility. Since 2021, he has led the Human-Centered AI research and development activities of the HCI Laboratory of FORTH-ICS, Greece. In this role, he has advocated for human-centric and inclusive design of AI systems, ensuring outcomes that are technologically robust, transparent, usable, and aligned with human values. He is also the co-chair of the annual International Conference on Human-Centered Design, Operation and Evaluation of Mobile Communications (MOBILE), affiliated with the HCI International Conference.

Stavroula Ntoa
Dr. Stavroula Ntoa is a Computer Scientist specializing in Design for All, software accessibility, usability engineering, and User Experience (UX) research and design. She is a Principal Researcher at the HCI Laboratory of FORTH-ICS, Greece, leading the accessible UX research and design activities of the lab. Her research interests focus on Design for All and Universal Access of modern interactive technologies, adaptive and intelligent interfaces, as well as inclusiveness and user experience research in intelligent and Artificial Intelligence environments. She serves as co-chair of the annual International Conference on Artificial Intelligence in Human-Computer Interaction, affiliated with the HCI International Conference.
Registration regulation
Workshops will run as ‘hybrid’ events. Organizers are themselves expected to attend on-site, while participants will have the option to attend either on-site or online. The total number of participants per Workshop cannot be less than 8 or exceed 25.
Workshops are ‘closed’ events, i.e., only authors of accepted submissions for a Workshop will be able to register to attend the specific Workshop.
Workshop registration is complimentary for registered Conference participants, or requires a fee of $95 per Workshop for non-registered Conference participants.