💡 Since the emergence of ChatGPT, Large Language Models (LLMs) have garnered significant attention, with new advances appearing continuously. LLMs have been applied across domains such as vision, audio, and text. However, tabular data remains a crucial data format in many real-world applications. This repo therefore collects research papers that explore the integration of LLMs with tabular data, aiming to save you valuable time and boost your research efficiency.
✨ Awesome-LLM-Tabular is a curated list of Large Language Models applied to tabular data.
🔥 This project is under active development. Feel free to ⭐ (STAR) and 👀 (WATCH) it to stay updated on the latest developments.
- Table Representation Learning Workshop @ NeurIPS 2023
- Table Representation Learning Workshop @ NeurIPS 2024
@misc{wu2024awesomellmtabular,
  author = {Wu, Hong-Wei},
  title = {Awesome-LLM-Tabular},
  year = {2024},
  note = {Accessed: 2024-05-30},
  url = {https://github.com/johnnyhwu/Awesome-LLM-Tabular},
  orcid = {https://orcid.org/0009-0005-8073-5297}
}
We welcome contributions to keep this repository up to date with the latest research and applications of LLMs in the tabular domain. Whether you want to correct mistakes, add new content, or suggest improvements, your contributions are highly appreciated 🤗.