Test Drive: llama-index
Today I’m test driving llama-index, “a data framework for your LLM application.” My task will be to summarize my recent Google location history. I’m just going to do the boring quickstart with barely any modification.
from llama_index.core import VectorStoreIndex, SimpleDirectoryReader

# Load every file in ./data as a document
documents = SimpleDirectoryReader("data").load_data()
# Chunk, embed, and index the documents in an in-memory vector store
index = VectorStoreIndex.from_documents(documents)
# Build a query engine that retrieves relevant chunks and sends them to the LLM
query_engine = index.as_query_engine()
response = query_engine.query("What places have I spent time recently?")
print(response)
The result was the following, which is true but pretty meaningless:
You have recently spent time at “Messina Site and Utility Corp.” in Marblehead, MA and at locations along a walking route with waypoints including ChIJtbU17bsU44kR6pLuGMFRJ4k, ChIJc1v_77sU44kRqSUfpZPkufc, ChIJY8KHV7kU44kRMFeDo84Hiio, and ChIJcQCpSLkU44kRU3N0beAOx6E.
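Those ChIJ… waypoint IDs presumably come straight out of the location-history chunks that were retrieved. To see exactly which chunks the query engine pulled in, you can inspect the response's source nodes. This is a minimal sketch against the response object from the quickstart above; source_nodes, score, and get_content() match recent llama-index core versions, but the exact attributes may vary with yours.

# Peek at the chunks the query engine retrieved for this answer.
# source_nodes is a list of NodeWithScore objects (attribute names assumed
# from recent llama-index core releases).
for source in response.source_nodes:
    print(f"score={source.score}")
    print(source.node.get_content()[:200])  # first 200 characters of the chunk
    print("---")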
Behind the scenes, it created embeddings of chunks of my documents, retrieved the chunks most relevant to my question, and passed them to the LLM along with the query.
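One thing the quickstart glosses over is that the index is rebuilt, and every chunk re-embedded, on each run. llama-index can persist the index to disk and reload it later; here is a sketch of the standard persist/reload pattern, with ./storage as an arbitrary directory name.

from llama_index.core import StorageContext, load_index_from_storage

# Save the embedded index to disk after the first build
index.storage_context.persist(persist_dir="./storage")

# On later runs: reload the index instead of re-reading and re-embedding ./data
storage_context = StorageContext.from_defaults(persist_dir="./storage")
index = load_index_from_storage(storage_context)
query_engine = index.as_query_engine()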