Aims & scope

JCEII aims to:

  • Explore the fundamental principles of human cognition and intelligence to inspire innovative computational models and interaction paradigms;
  • Promote the development of intelligent information processing systems that are deeply informed by cognitive, emotional, and behavioral characteristics of humans;
  • Advance natural and multimodal human-computer interaction through cognitive engineering approaches;
  • Serve as a cross-disciplinary forum connecting cognitive science, artificial intelligence, human-computer interaction, neuroscience, and engineering.

Scope

JCEII publishes original research articles, reviews, case studies, and short communications in areas including, but not limited to, the following:

Cognitive Engineering & Computing

  • Cognitive modeling and simulation
  • Emotion and affective computing
  • Cognitive architectures for intelligent systems
  • Mental workload, attention, and decision-making in interactive systems
  • Neuro-inspired and brain-like computing models

Intelligent Interaction Technologies

  • Brain-computer interfaces (BCIs) and neural interaction systems
  • Natural language interaction and cognitive dialogue systems
  • Human-robot interaction with cognitive adaptation
  • Adaptive and personalized user interfaces

Human-Centered Artificial Intelligence

  • Explainable AI (XAI) from a cognitive perspective
  • Cognitive ergonomics and usability in complex systems
  • Trust, transparency, and user modeling in intelligent systems
  • Interaction in intelligent environments (smart homes, AR/VR, wearable systems)

Human-AI Collaboration

  • Human-in-the-loop systems and interactive machine learning
  • Cognitive task sharing and decision support in AI-assisted systems
  • Trust, transparency, and explainability in collaborative AI
  • Human-AI teaming in dynamic and high-risk environments
  • Feedback loops between user behavior and system adaptation

Multimodal Interfaces

  • Multimodal interaction frameworks integrating speech, image, gesture, gaze, and emotion
  • Multisensory fusion and context-aware interface design
  • Cognitive modeling of multimodal communication and processing
  • Multimodal interaction in AR/VR and extended reality systems
  • Evaluation methods and benchmarks for multimodal systems

Applications and Systems

  • Assistive and rehabilitation technologies
  • Cognitive interaction in healthcare and education
  • Emotion-aware intelligent tutoring systems
  • Interactive systems for autonomous vehicles
  • Industrial and defense cognitive systems