KDD 2026 Workshop: Geometric Space, Architecture and Learning Objective for Large Pre-Trained Models
The Geometric Space, Architecture and Learning Objective for Large Pre-Trained Models (GALOP) workshop at KDD 2026 brings together researchers working on geometric representation spaces, geometry-aware architectures, and learning objectives for large pre-trained models, spanning natural language processing, computer vision, graph learning, knowledge discovery, and scientific AI.
News
- 2026-03-22: Call for papers released.
- 2026-03-11: The workshop proposal was accepted by KDD 2026 as a half-day workshop.
Introduction
Large pre-trained models have transformed artificial intelligence, but most of them still operate in Euclidean space. This workshop explores how geometric principles can improve large pre-trained models through representation spaces, model architectures, and learning objectives. The goal is to bring together researchers from machine learning, data mining, and adjacent fields to discuss recent progress in geometric deep learning for foundation models.
The workshop emphasizes applications across natural language processing, computer vision, graph learning, knowledge discovery, and scientific discovery. It is designed as a meeting point for researchers working on hyperbolic, spherical, mixed-curvature, equivariant, and manifold-aware methods that better respect the structure of real-world data.
Important Dates
All deadlines are 11:59 PM Anywhere on Earth (AoE) unless otherwise specified.
- Workshop paper submission: April 30, 2026
- Workshop paper notification: June 4, 2026
- Camera ready: June 15, 2026
- Final workshop program, materials, and full website online: June 22, 2026
- Workshop date: TBA
- Conference dates: August 9 to 13, 2026
- Venue: Jeju, Korea
Topics of Interest
We welcome submissions on topics including, but not limited to, the following directions:
- Hyperbolic, spherical, and mixed-curvature embeddings for large pre-trained models
- Non-Euclidean word, sentence, document, and multimodal representations
- Geometric transformers and manifold-aware attention mechanisms
- Equivariant and invariant architectures for foundation models
- Metric learning and contrastive objectives with geometric constraints
- Curvature-aware optimization on Riemannian manifolds
- Alignment and fusion across different geometric spaces
- Theory for geometric large pre-trained models, including expressiveness and generalization
- Applications in natural language processing, computer vision, graph learning, knowledge discovery, and scientific discovery
- Benchmarks, evaluation protocols, open-source tools, visualization, and reproducibility resources
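As a small illustration of the first topic above (not taken from the workshop materials), the sketch below computes the geodesic distance in the Poincaré ball, the basic primitive underlying many hyperbolic embedding methods. Points live in the open unit ball, and distances grow rapidly near the boundary, which is what allows hyperbolic space to embed tree-like hierarchies with low distortion:

```python
# Illustrative sketch: Poincare-ball geodesic distance,
# d(u, v) = arcosh(1 + 2 * ||u - v||^2 / ((1 - ||u||^2) * (1 - ||v||^2))).
# Assumes both points lie strictly inside the unit ball.
import math

def poincare_distance(u, v):
    """Geodesic distance between points u and v in the Poincare ball."""
    sq_norm = lambda x: sum(xi * xi for xi in x)
    diff = [ui - vi for ui, vi in zip(u, v)]
    num = 2.0 * sq_norm(diff)
    denom = (1.0 - sq_norm(u)) * (1.0 - sq_norm(v))
    return math.acosh(1.0 + num / denom)

# Near the origin the metric is nearly Euclidean; near the boundary the
# same Euclidean gap corresponds to a much larger geodesic distance.
print(poincare_distance([0.0, 0.0], [0.1, 0.0]))    # roughly 0.20
print(poincare_distance([0.0, 0.89], [0.0, 0.99]))  # roughly 2.45
```

This is only a minimal reference formula; practical hyperbolic embedding libraries additionally handle numerical stability near the boundary and Riemannian optimization.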
Why This Workshop at KDD 2026
The timing is right for KDD: large pre-trained models now dominate modern AI, while their geometric foundations remain underexplored. Recent progress in hyperbolic embeddings for language models, geometric transformers, and curvature-aware contrastive learning shows that geometric principles can materially improve representation quality, reasoning ability, and efficiency.
The topic is especially relevant to the SIGKDD community because knowledge graphs, recommender systems, graph mining, search, and scientific data all carry strong structural signals that are often better modeled with geometric inductive biases. The workshop aims to connect the geometric machine learning community with researchers in data mining and large-scale AI systems.
Submission
- We welcome short research papers of up to 4 pages and full research papers of up to 9 pages, excluding references and supplementary materials.
- All accepted papers are planned to be presented as posters.
- Approximately 4 papers will be selected for oral presentation, and 2 papers will receive outstanding paper awards.
- The workshop follows the current KDD 2026 workshop policy and is planned as an in person event.
GALOP Workshop@KDD2026 Submission
Tentative Program
The workshop is a half-day event. The program will include the following components:
- Opening remarks and workshop overview
- Invited talks on geometric learning and large pre-trained models
- Contributed paper spotlight presentations
- Poster and discussion session
- Panel discussion on the future of geometric AI
- Best paper recognition and closing remarks
The final public schedule will be posted after KDD 2026 confirms the half-day slot.
Invited Speakers
TBA
Organizers
- Menglin Yang, HKUST (GZ)
- Jiahong Liu, CUHK
- Lucas Vinh Tran, JPMorgan Chase
- Rex Ying, Yale University