
How to use “OpenAI functions” with a Knowledge Base

This example is based on a notebook in the OpenAI cookbook.

The point of the cookbook is to show how you can retrieve some knowledge, e.g. a list of articles, and supply it to OpenAI as conversational context, letting the OpenAI LLM decide whether an external function should be called to fetch more context.

Hamilton is great for describing dataflows, i.e. DAGs. In this example, we use Hamilton to define the DAGs behind the two functions that OpenAI can call, which are exposed in functions.py. The logic for each lives in its own module:
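To make the idea concrete, here is a minimal sketch of the Hamilton style these modules follow. The function names and logic below are illustrative, not the actual contents of arxiv_articles.py or summarize_text.py: in Hamilton, each function defines a node in the DAG, and its parameter names refer to the outputs of other functions.

```python
# Hypothetical Hamilton-style module (illustrative names only).
# Hamilton wires these together into a DAG: `article_titles` depends on
# `query`, and `top_article` depends on `article_titles`, purely by
# matching parameter names to function names.

def query(raw_query: str) -> str:
    """Normalizes the user's search query (raw_query is a driver input)."""
    return raw_query.strip().lower()

def article_titles(query: str) -> list:
    """Stand-in for an article search; returns titles for the query."""
    return [f"Article about {query} #{i}" for i in range(3)]

def top_article(article_titles: list) -> str:
    """Picks the first search result."""
    return article_titles[0]
```

A Hamilton Driver built over such a module can then be asked for `top_article`, and it will execute only the nodes needed to compute it.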

get_articles()

This is the DAG that is defined when you call get_articles() in functions.py (see get_articles.png).

summarize_text()

This is the DAG that is defined when you call summarize_text() in functions.py (see read_article_and_summarize.png).

To orchestrate all of this, state.py holds the conversation, calls out to OpenAI, and invokes the two functions as appropriate.
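The core of that orchestration is OpenAI function calling: the model is given JSON-schema descriptions of the two functions and, when it decides one should run, the code routes the request to the matching Python function. Below is a hedged sketch of that dispatch step; the spec fields and helper names are illustrative and may differ from what state.py actually contains.

```python
import json

# Hypothetical sketch of the dispatch logic in state.py (names illustrative).
# FUNCTION_SPECS is the JSON-schema description handed to the OpenAI API so
# the model knows what it is allowed to call.
FUNCTION_SPECS = [
    {
        "name": "get_articles",
        "description": "Fetch arxiv articles relevant to a query.",
        "parameters": {
            "type": "object",
            "properties": {"query": {"type": "string"}},
            "required": ["query"],
        },
    },
    {
        "name": "summarize_text",
        "description": "Summarize a previously fetched article.",
        "parameters": {
            "type": "object",
            "properties": {"article_id": {"type": "string"}},
            "required": ["article_id"],
        },
    },
]

def get_articles(query: str) -> list:        # stand-in for the real function
    return [f"article about {query}"]

def summarize_text(article_id: str) -> str:  # stand-in for the real function
    return f"summary of {article_id}"

DISPATCH = {"get_articles": get_articles, "summarize_text": summarize_text}

def execute_function_call(function_call: dict) -> str:
    """Routes a model-requested call to the matching Python function.

    `function_call` mirrors the shape of the API response: a function name
    plus a JSON string of arguments.
    """
    fn = DISPATCH[function_call["name"]]
    kwargs = json.loads(function_call["arguments"])
    return str(fn(**kwargs))
```

The result string would then be appended to the conversation as a function message so the model can use it in its next turn.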

Note: the original cookbook example is not production-ready. To set expectations, this example gets roughly 80% of the way there in translating it into Hamilton code; the remaining 20% is left as an exercise for the reader.

Running the example

You just need to install the dependencies:

pip install -r requirements.txt

and run state.py:

python state.py

The conversation that takes place is defined at the bottom of state.py; modify it there to change the conversation.
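For reference, a conversation in this setting is just an ordered list of chat messages. The snippet below is a hypothetical example of the kind of exchange you might define; the roles follow the OpenAI chat message format, and the exact structure used in state.py may differ.

```python
# Hypothetical conversation (illustrative content; state.py's actual
# structure may differ). Each dict is one turn in the chat.
conversation = [
    {"role": "system", "content": "You are a helpful research assistant."},
    {"role": "user", "content": "Find recent arxiv papers on PPO reinforcement learning."},
    {"role": "user", "content": "Now summarize the first one."},
]
```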