About me
Email: qwang16 [at] wm [dot] edu
Hello!
I am Qingyun Wang, an incoming Assistant Professor in the Department of Data Science at William & Mary, starting in August 2025.
I am a Ph.D. student in the Siebel School of Computing and Data Science at the University of Illinois at Urbana-Champaign. I have been a member of the BLENDER Lab since 2017, supervised by Prof. Heng Ji. Previously, I graduated summa cum laude from Rensselaer Polytechnic Institute with a dual B.S. degree in Computer Science and Mathematics.
I am among the first researchers to develop a virtual scientific research assistant (i.e., PaperRobot [ACL 2019]) for literature-based discovery by extracting and synthesizing insights from papers. My research interests lie in Automated Literature Understanding and Scientific Discovery. My long-term vision is to develop AI for Scientists (AI4Scientist) tools that effectively accelerate and democratize the entire research lifecycle for scientists, from knowledge acquisition [(NAACL ‘21 Best Demo🏆) I, II, III] and hypothesis generation [IV], to multimedia procedure planning for experiment design [V], experiment execution [VI], paper writing [VII, VIII], and paper-draft evaluation [IX].
Research Interests
- Scientific Multimodal Foundation Models with Critical Thinking: Build a new multimodal scientific LLM that understands formulas, tables, figures, and charts; design a model that can dynamically extract and integrate new multimodal knowledge elements without additional training
- Few-Shot Scientific Knowledge Acquisition: Investigate methods for extracting knowledge from scientific corpora with limited annotation
- Planning and Reasoning in Scientific Domain: Utilize both structured and unstructured knowledge as well as logic rules among knowledge elements to produce trustworthy and explainable results
- Scientific Research Agents with Physical World Interactions: Train a new human-in-the-loop reinforcement learning framework with human, experimental, and literature feedback, which can leverage small datasets in closed-loop discovery platforms; Develop a human-in-the-loop self-driving laboratory that can complete the scientific research lifecycle through interactions with the physical world, such as a robotic laboratory
Prospective students
I am constantly looking for highly motivated PhD students (as fully funded RAs) and interns to join my lab! If you are interested in working with me, please fill out this form. After completing the form, you are also welcome to reach out via email (qwang16 [at] wm [dot] edu). I will read all submitted forms and emails, but I apologize in advance for not being able to respond to each of them. Prospective Students English, Prospective Students Chinese
I’m happy to collaborate and answer questions about my research. I especially encourage students from underrepresented groups to reach out, as I am committed to fostering diversity, equity, and inclusion in our community.
Recent News
Apr 11, 2025: We will organize the VISTA: Visionary Innovation in Standards and Technology of GenAI workshop at ICDM 2025 in Washington, DC!
Dec 11, 2024: Excited to give an invited talk, AI4Scientist: Accelerating and Democratizing Scientific Research Lifecycle at The Cosmic Horizons: AI-Powered Insights into the Universe conference, organized by NSF-Simons AI Institute for Cosmic Origins (CosmicAI), in May 2025!
Sep 20, 2024: We will organize the second AI4Research: Towards a Knowledge-grounded Scientific Research Lifecycle workshop at AAAI 2025 in Philadelphia! Please submit your research on OpenReview by Nov 24!
Aug 20, 2024: Invited talk “SciMON: Scientific Inspiration Machines Optimized for Novelty” at Elsevier.
Aug 15, 2024: Gene-Metabolite Association Prediction with Interactive Knowledge Transfer Enhanced Graph for Metabolite Production was accepted by BIBM 2024.
May 15, 2024: SciMON: Scientific Inspiration Machines Optimized for Novelty was accepted by ACL 2024. [AI News (5th June 2023)] [AI Breakdown] [MMLI Newsletter]
Apr 26, 2024: Invited talk “AIScientist: Toward Automated Literature Understanding and Scientific Discovery” at PSU.
Mar 13, 2024: Named Entity Recognition Under Domain Shift via Metric Learning for Life Sciences was accepted by NAACL-HLT 2024.
Mar 8, 2024: Invited talk “SciMON: Scientific Inspiration Machines Optimized for Novelty” at Oak Ridge National Laboratory.
Jan 18, 2024: Chem-FINESE: Validating Fine-Grained Few-shot Entity Extraction through Text Reconstruction was accepted by EACL 2024 Findings.
Nov 18, 2023: We will present a tutorial on Towards a Human-Computer Collaborative Scientific Paper Lifecycle at LREC-COLING 2024 in Italy!
Oct 2, 2023: We will present a tutorial at EACL 2024 in Malta and organize the first “Language + Molecules” workshop at ACL 2024 in Bangkok!
- May 2, 2023: Multimedia Generative Script Learning for Task Planning was accepted by ACL 2023 Findings.
- Nov 10, 2021: We presented a tutorial on Knowledge-enriched Natural Language Generation at EMNLP 2021. There were more than 95 online and 25 in-person participants.
- June 9, 2021 COVID-19 Literature Knowledge Graph Construction and Drug Repurposing Report Generation won the Best Demo Award🏆 at NAACL-HLT 2021 System Demonstrations! [News Report]
- May 14, 2021: Stage-wise Fine-tuning for Graph-to-Text Generation was accepted by ACL-IJCNLP 2021 SRW. [Twitter]
- March 17, 2021: COVID-19 Literature Knowledge Graph Construction and Drug Repurposing Report Generation was accepted by NAACL-HLT 2021 System Demonstrations. [Twitter]
- Oct 11, 2020: ReviewRobot: Explainable Paper Review Generation based on Knowledge Synthesis was accepted by INLG 2020.
- May 14, 2019: PaperRobot: Incremental Draft Generation of Scientific Ideas was accepted by ACL 2019. [News Report] [Twitter]
- Sept 7, 2018: Describing a Knowledge Base was accepted by INLG 2018 as Oral Presentation.
- Apr 20, 2018: Paper Abstract Writing through Editing Mechanism was accepted by ACL 2018.
- Nov 2017: A Two-Layer Dialogue Framework For Authoring Social Bots was accepted to the 1st Proceedings of the Alexa Prize.
- Nov 30, 2016: Our Wise Macaw team was a Finalist in the $2.5 Million Amazon Alexa Prize. [News Report]