The Sparse Workshop aims to bring together researchers interested in compiler techniques, programming abstractions, and hardware for sparse computing, including sparse tensor algebra, relational algebra, and graph processing applications. Because of the large number of applications, optimization techniques, data structures, and specialized hardware platforms, there is a growing need for automation, and recent years have seen substantial interest in compiler techniques that automatically generate sparse computing code. The workshop gathers leading researchers from academia and industry for talks on applications, code generation, source code transformation and optimization, automatic scheduling, data structure modeling, compilation to different types of hardware, specialized accelerators, extensions to new kinds of sparse array operations, and applications of these techniques beyond sparsity, such as lossless compression. The workshop will last one day and will include invited talks, discussions, and a small number of submitted talks.
This program is tentative and subject to change.
Mon 16 Jun (displayed time zone: Seoul)
Session 1: 09:00 - 10:10
09:00 Talk (20 min): Insum: Sparse GPU Kernels Simplified and Optimized with Indirect Einsums. Saman Amarasinghe, Massachusetts Institute of Technology
09:20 Talk (20 min): Intelligent Auto-Tuning for High-Performance Sparse Tensor Algebra. Jiajia Li, North Carolina State University
09:40 Talk (20 min): Loop Fusion in Matrix Multiplications with Sparse Dependence. Kazem Cheshmi, McMaster University
10:00 Panel (10 min): Panel 1. Saman Amarasinghe (Massachusetts Institute of Technology), Kazem Cheshmi (McMaster University), Jiajia Li (North Carolina State University)
Session 2: 10:30 - 12:00
10:30 Talk (20 min): Optimizations and abstractions for sparse machine learning. Charith Mendis, University of Illinois at Urbana-Champaign
10:50 Talk (20 min): Distributed Sparse Computing with Legate Sparse. Rohan Yadav, Stanford University
11:10 Talk (20 min): Optimizing Recursive Sparse Computations. Amir Shaikhha, University of Edinburgh
11:30 Panel (20 min): Panel 2. Charith Mendis (University of Illinois at Urbana-Champaign), Rohan Yadav (Stanford University), Amir Shaikhha (University of Edinburgh)
Session 3: 14:00 - 15:20
14:00 Talk (20 min): PyData/Sparse & Finch: extending sparse computing in the Python ecosystem
14:20 Talk (20 min): Compiling and Compressing Structured Tensors
14:40 Talk (20 min): Sparsity-Aware Autoscheduling for Numpy with Finch and Galley. Willow Ahrens, Massachusetts Institute of Technology
15:00 Panel (20 min): Panel 3. Hameer Abbasi (Quansight), Emilien Bauer, Willow Ahrens (Massachusetts Institute of Technology), Mateusz Sokol (Quansight Labs)
Session 4: 15:40 - 17:00
15:40 Talk (20 min): Hyperreal Specifications for Continuous Sparse Data Computations
16:00 Talk (20 min): Quantum Simulation with Sparse Tensors. Meisam Tarabkhah, University of Edinburgh
16:20 Talk (20 min): Sparse Computing Drives Energy-Efficient Computation for Artificial Intelligence
16:40 Panel (20 min): Panel 4
Call for Talks
We are soliciting 15-minute talks and posters for the second Sparse Workshop. Relevant topics include applications, programming language constructs, compilers, libraries and frameworks, and hardware for sparse computing, including sparse tensor algebra, relational algebra, and graph processing. Talks and posters may present new technical work, early-stage ideas, your thoughts on future needs, or other related topics you are excited about; already published work is also welcome. There will be no proceedings, so a talk does not require a submitted paper. If you are interested, please submit a short description (100-200 words).