
Posters


| Submission # | Presenting author | Time zone | Poster file | Paper title |
| --- | --- | --- | --- | --- |
| 3 | Mengjia Niu | 1 | Mengjia_Niu_LLM_KG.pdf | Mitigating Hallucinations in Large Language Models via Self-Refinement-Enhanced Knowledge Retrieval |
| 11 | Lei, Ding | 2 | GenIR_How-to_search_36x48_A0_portrait.pdf | Enhancing Mobile "How-to" Queries with Automated Search Results Verification and Reranking |
| 9 | Sean MacAvaney | 0 | GAuGE-A0-v3.pdf | Genetic Approach to Mitigate Hallucination in Generative IR |
| 8 | Jan Malte Lichtenberg | -5 | placeholder.pdf | Large Language Models as Recommender Systems: A Study of Popularity Bias |
| 13 | Yi Fang | 0 | xuyang_genir2024_poster_portrait.pdf | Passage-specific Prompt Tuning for Passage Reranking in Question Answering with Large Language Models |
| 6 | Chaeeun Kim | 9 | GenIR2024-DynamicIR-ChaeeunKim.pdf | Exploring the Practicality of Generative Retrieval on Dynamic Corpora |
| 6 | Chaeeun Kim | 9 | [GenIR2024]DynamicIR_ChaeeunKim.pdf | Exploring the Practicality of Generative Retrieval on Dynamic Corpora |
| 7 | Diji Yang | 0 | Interpretable Answer Framework - Poster - GenIR@SIGIR2024 - V2.pdf | An Interpretable Answer Scoring Framework |
| 10 | Raksha Jalan | 6 | LLM-BRec_poster_submission_10.pptx.pdf | LLM-BRec: Personalizing Session-based Social Recommendation with LLM-BERT Fusion Framework |
| 12 | Anirudh Ravichandran | -4 | sigir_poster [Autosaved].pdf | MERLIN: Multiple Enhanced Representations with LLM generated INdices |
| 16 | Kavin R V | 5 | Poster.pdf | Document Aware Contrastive Learning Approach for Generative Retrieval |
| 16 | Kavin R V | 5 | Kavin_poster2.pdf | Document Aware Contrastive Learning Approach for Generative Retrieval |
| 6 | Chaeeun Kim | 9 | [240717]GenIR2024-DynamicIR-ChaeeunKim.pdf | Exploring the Practicality of Generative Retrieval on Dynamic Corpora |
