Topics of interest include, but are not limited to, the following, which help reconnect modern algorithmic approaches with the psychological, design, and human-centered dimensions that characterized early research in this field, fostering a more holistic and interdisciplinary approach to recommender systems (RS):
- Critical reflections on the dominance of machine learning in RS research
- Interdisciplinary approaches to RS (e.g., psychology, HCI, design, sociology, cognitive science, STS, computational social science)
- Human-centered recommendation methodologies
- Human-centered evaluation methodologies
- User experience, trust, and transparency in RS
- Case studies of RS failures or unintended consequences in real-world applications
- Design methodologies such as participatory, speculative, or value-sensitive design
- Qualitative and mixed-method research in RS development, evaluation, and the understanding of recommendation needs
- Application of psychological theories to RS (e.g., decision-making, motivation, affect, personality, autonomy)
- Cognitive science perspectives on information filtering and discovery
- Ethical tensions, value conflicts, and societal implications of RS
- RS in sensitive or high-stakes contexts (e.g., education, healthcare, mental health)
- Historical or critical analyses of the evolution of RS research
- Cultural and sociological dimensions of recommendation
- User agency, control, and feedback in interactions with RS
- Long-term impacts of recommendations on user agency and preference development
- Reframing recommender goals: supporting well-being, reflection, or empowerment
- Alternative theoretical frameworks for conceptualizing the recommendation problem
We particularly encourage submissions that challenge established paradigms, highlight methodological diversity, or bring underrepresented perspectives into the conversation.
Paper Submission
We welcome two types of paper submissions:
- Position papers (maximum 6 pages, excluding references)
- Case study papers (maximum 10 pages, excluding references)
Both types of submissions should use the CEURART single-column template, available for download here (ZIP) or via Overleaf.
We particularly encourage case studies that report on challenges, problems, and negative experiences, as these offer valuable insights. Such contributions can serve as a basis for rich discussions at the workshop.
All submissions must be original and not under review at any other conference, workshop, or journal at the time of submission.
Papers must be submitted via Easychair (link).
Each submission will be reviewed by three members of the Program Committee through a single-anonymous peer-review process. Papers will be evaluated on quality, novelty, clarity, and relevance, with an emphasis on fostering engaging discussions at the workshop. Accepted papers will be invited for presentation at the workshop and will be published in the workshop proceedings (most likely via CEUR-WS) after the event.
Call for Provocative Perspectives
In addition to traditional paper submissions, we also invite provocative statements and flash talks that aim to spark discussion, challenge assumptions, or present bold ideas related to recommender systems research. These can include:
- Short, critical reflections or position statements
- Visions for the future of RS
- Lessons learned from RS research or deployment
- Thought experiments, conceptual provocations, or open questions
- Radical critiques of current methodologies or assumptions
For these submissions, please submit an abstract via Easychair (link).