Publications Office of the EU

Ability of chatbots to provide deduced and summarized answers (Q&A)



Benefits of deduced and summarized answers (Q&A)



  - Cognitive Efficiency
  - Relevance & Focus
  - Seamless Conversation
  - Productivity Gains
  - Accuracy & Clarity
  - Accessibility & Inclusivity

Illustration 1

Architecture


Technical setup

  1. The knowledge base is used to create a model by processing the available documents.
  2. Vectorization is applied to each document in the model, and the results are stored in the vector DB (see the ingestion sketch after this list).
  3. RAG uses the vectorized information in the vector DB to ground and refine the LLM's answers.
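
As a minimal, hypothetical sketch of steps 1 and 2 (the embedding model name and the sample documents are assumptions, not the actual OP pipeline), ingestion can look like this:

```python
# Hypothetical ingestion sketch: vectorize documents for the knowledge base.
# The model name and sample documents are illustrative assumptions only.
from sentence_transformers import SentenceTransformer

documents = [
    {"source": "EU Publications", "text": "Example passage about an EU publication."},
    {"source": "EU Law in Force", "text": "Example passage from a legal act in force."},
]

model = SentenceTransformer("all-MiniLM-L6-v2")  # any embedding model could be used

# One embedding vector per passage, stored next to its metadata so that
# sources can later be cited alongside the generated answer.
knowledge_base = [
    {"source": doc["source"], "text": doc["text"], "vector": vec}
    for doc, vec in zip(documents, model.encode([d["text"] for d in documents]))
]
```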

User query

  1. The user performs a query.
  2. The LLM uses RAG to find the most suitable answers (including their sources) in the vector DB.
  3. An answer is generated and provided to the user, based on the vector match between the question and the candidate answers. Potentially, multiple answer styles can be provided, e.g. a simple answer, an advanced answer or an expert answer (see the retrieval sketch after this list).
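
Continuing the same hypothetical sketch, retrieval reduces to embedding the query, ranking the stored passages by cosine similarity and keeping the top matches together with their sources; the helper names below are illustrative only.

```python
# Hypothetical retrieval sketch: rank stored passages against the user query.
# Reuses the `model` and `knowledge_base` objects from the ingestion sketch.
import numpy as np

def cosine_similarity(a, b):
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

def retrieve(query, top_k=3):
    query_vector = model.encode([query])[0]
    scored = [
        (cosine_similarity(query_vector, entry["vector"]), entry)
        for entry in knowledge_base
    ]
    # Best-matching passages first; each keeps its source for citation.
    return [entry for _, entry in sorted(scored, key=lambda s: s[0], reverse=True)[:top_k]]

context = retrieve("Which EU legal acts are currently in force?")
prompt = "Answer using only these sources:\n" + "\n".join(
    f"- [{c['source']}] {c['text']}" for c in context
)
```
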
[Illustration 1 image]

Illustration 2

Technical setup

Documents from the selected sources (EU Whoiswho, EU Publications, EU Law in Force) are ingested into the data processing unit, which vectorizes the available documents by generating embeddings that are stored in the vector DB (for example Elasticsearch or Qdrant).
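
As a hedged illustration of that storage step, assuming the Qdrant option named above (the collection name, vector size and payload fields are assumptions), the embeddings could be loaded as follows:

```python
# Hypothetical storage sketch: keep the embeddings in a Qdrant collection.
# Collection name, vector size and payloads are illustrative assumptions.
from qdrant_client import QdrantClient
from qdrant_client.models import Distance, PointStruct, VectorParams

client = QdrantClient(":memory:")  # a real setup would point at a Qdrant server

client.recreate_collection(
    collection_name="op_knowledge_base",
    vectors_config=VectorParams(size=384, distance=Distance.COSINE),  # 384 = MiniLM size
)

client.upsert(
    collection_name="op_knowledge_base",
    points=[
        PointStruct(
            id=i,
            vector=entry["vector"].tolist(),
            payload={"source": entry["source"], "text": entry["text"]},
        )
        for i, entry in enumerate(knowledge_base)  # from the earlier ingestion sketch
    ],
)

# Nearest-neighbour search for a query vector returns the best-matching payloads.
hits = client.search(
    collection_name="op_knowledge_base",
    query_vector=model.encode(["Which EU legal acts are currently in force?"])[0].tolist(),
    limit=3,
)
```
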
User query

The user performs a query through the OP Portal search or OP Portal Publio (OP’s enterprise applications). A search for the best-suited answer is made using RAG, considering the vectors available in the vector DB. A prompt and the relevant sources are provided to an LLM (for example OpenAI GPT-4 or GPT-3.5, Meta Llama, Anthropic Claude), which, considering the query, the prompt and the relevant sources, creates an answer to be provided to the user. Potentially, multiple answer styles can be provided, e.g. a simple answer, an advanced answer or an expert answer.
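
A minimal sketch of this final generation step, assuming the OpenAI chat completions API stands in for any of the models listed above; the style instructions and helper function are illustrative, not the actual Publio implementation:

```python
# Hypothetical generation sketch: build a grounded prompt and request one of
# several answer styles. Style wording and model choice are assumptions.
from openai import OpenAI

llm = OpenAI()  # requires OPENAI_API_KEY in the environment

STYLE_INSTRUCTIONS = {
    "simple": "Answer in two or three plain-language sentences.",
    "advanced": "Answer in detail and mention the cited sources.",
    "expert": "Answer with precise legal and technical terminology and cite every source.",
}

def answer(query, sources, style="simple"):
    # `sources` is a list of {"source": ..., "text": ...} passages from retrieval.
    context = "\n".join(f"- [{s['source']}] {s['text']}" for s in sources)
    response = llm.chat.completions.create(
        model="gpt-4",
        messages=[
            {"role": "system",
             "content": "Answer only from the provided sources and list them at the end. "
                        + STYLE_INSTRUCTIONS[style]},
            {"role": "user", "content": f"Sources:\n{context}\n\nQuestion: {query}"},
        ],
    )
    return response.choices[0].message.content
```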


 

[Illustration 2 image]