Fact Verification in Long-Form Story Text

Project Information

Project Introduction

In long-form text generation, generating a structure (e.g., an outline) before the text itself has become a new paradigm (cite DOC paper), because it makes it easy for humans to interact with the process and control the overall topic. The problem is that the LLM cannot see the full context and is prone to hallucination. The main symptom is that facts become inconsistent over the course of generation, so maintaining a fact-tracking system alongside the generation process is important in this setting.

Traditional contradiction detection methods (the FactScore paper, etc.) check whether statements in a corpus contradict an outside, ground-truth database such as Wikipedia. Detecting contradictions inside a corpus is a different problem: the corpus cannot be directly read as a set of explicit fact statements, and a fact statement is transient, holding only within an interval of the story. These structural properties of the inner context make the contradiction detection task more challenging (see figure below).

Our research proposes a new framework for contradiction inspired by narrative theory and applies it to build a better method for generating story outlines. In narrative theory, an event can be defined as a change of world status. To verify contradictions, we model the world status as a set of fact statements that do not contradict each other. We first apply a large language model to decompose each event into three types of fact statements, which we call pre-facts, post-facts, and strategic facts.
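
As a rough illustration of this modeling, the minimal Python sketch below shows how the world status could be kept as a set of interval-valid fact statements and how an event's pre-facts, post-facts, and strategic facts could be checked against it. The names `Fact`, `Event`, and the `contradicts` judge are hypothetical, introduced only for this sketch; they are not the project's actual code or prompts.

```python
from dataclasses import dataclass, field

@dataclass
class Fact:
    """A single fact statement, valid only within an interval of the outline."""
    statement: str
    valid_from: int          # index of the event that introduces the fact
    valid_until: int | None  # index of the event that invalidates it (None = still holds)

@dataclass
class Event:
    """An event decomposed into the three fact types described above."""
    description: str
    pre_facts: list[Fact] = field(default_factory=list)        # must hold before the event
    post_facts: list[Fact] = field(default_factory=list)       # hold after the event (the world-status change)
    strategic_facts: list[Fact] = field(default_factory=list)  # long-range facts the event commits to

def check_event(world_status: list[Fact], event: Event, contradicts) -> list[tuple[Fact, Fact]]:
    """Return every (world fact, event fact) pair flagged as contradictory.

    `contradicts` stands in for a pairwise judge (e.g. an LLM prompt or an NLI
    model) that returns True when two statements cannot both hold at once.
    """
    conflicts = []
    for new_fact in event.pre_facts + event.post_facts + event.strategic_facts:
        for old_fact in world_status:
            if contradicts(old_fact.statement, new_fact.statement):
                conflicts.append((old_fact, new_fact))
    return conflicts

def apply_event(world_status: list[Fact], event: Event, index: int) -> list[Fact]:
    """Advance the world status: drop facts whose interval has ended, add the event's new facts."""
    surviving = [f for f in world_status if f.valid_until is None or f.valid_until > index]
    return surviving + event.post_facts + event.strategic_facts
```

The sketch only captures the bookkeeping; the actual contradiction judgment is delegated to the `contradicts` callable, which is where the LLM-based decomposition and verification described above would plug in.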