Zack Saadioui
8/26/2024
First, install LlamaIndex with `pip`:

```bash
pip install llama-index
```
Next, create a folder named `data` in your project directory and drop in the document you want to query. LlamaIndex uses OpenAI's `gpt-3.5-turbo` model by default, so you'll also need to expose your OpenAI API key as an environment variable.

On macOS/Linux:

```bash
export OPENAI_API_KEY=YOUR_API_KEY_HERE
```
On Windows:

```bash
set OPENAI_API_KEY=YOUR_API_KEY_HERE
```
Just replace `YOUR_API_KEY_HERE` with your actual API key.
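As an optional sanity check (not a required step in this tutorial), you can confirm the key is visible to Python before going any further:

```python
import os

# Prints True if OPENAI_API_KEY is set in the current environment, False otherwise
print("OPENAI_API_KEY set:", bool(os.environ.get("OPENAI_API_KEY")))
```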
Now create a file named `starter.py` in your project root.
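Here's a minimal sketch of that indexing step, following the standard LlamaIndex quickstart pattern with `SimpleDirectoryReader` and `VectorStoreIndex` from `llama_index.core` (assuming a recent `llama-index` release where these classes live under that package):

```python
from llama_index.core import VectorStoreIndex, SimpleDirectoryReader

# Read every file in the ./data folder into Document objects
documents = SimpleDirectoryReader("data").load_data()

# Build a vector index over those documents (embeddings use the default OpenAI model)
index = VectorStoreIndex.from_documents(documents)
```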
This simple snippet loads the documents from the `data` folder and builds an index over them. At this point, your project should look like this:

```
├── starter.py
└── data
    └── paul_graham_essay.txt
```
Next, add the following lines to `starter.py` to actually query the index:

```python
query_engine = index.as_query_engine()
response = query_engine.query("What did the author focus on growing up?")
print(response)
```

Run the script with `python starter.py` and you should get an answer pulled straight from your document.
Want to see what's happening under the hood? Add some logging to the top of `starter.py`.
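Here's a minimal sketch of that logging setup using Python's standard `logging` module:

```python
import logging
import sys

# Route log output to stdout; DEBUG shows every step LlamaIndex takes
logging.basicConfig(stream=sys.stdout, level=logging.DEBUG)
logging.getLogger().addHandler(logging.StreamHandler(stream=sys.stdout))
```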
This will give you a much more verbose output! You can set the level to `logging.DEBUG` for the full play-by-play, or `logging.INFO` if you want something quieter.
By default, everything you just indexed lives only in memory, which means it disappears as soon as the script exits. To avoid re-indexing your data on every run, persist the index to disk:

```python
index.storage_context.persist()
```

By default this writes the index to a directory named `storage`, though you can point it elsewhere with the `persist_dir` argument.
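On later runs you can load the persisted index instead of rebuilding it. Here's a minimal sketch of that pattern, assuming the default `./storage` location and the `StorageContext` and `load_index_from_storage` helpers from `llama_index.core`:

```python
import os.path

from llama_index.core import (
    VectorStoreIndex,
    SimpleDirectoryReader,
    StorageContext,
    load_index_from_storage,
)

PERSIST_DIR = "./storage"

if not os.path.exists(PERSIST_DIR):
    # First run: build the index from the documents and save it to disk
    documents = SimpleDirectoryReader("data").load_data()
    index = VectorStoreIndex.from_documents(documents)
    index.storage_context.persist(persist_dir=PERSIST_DIR)
else:
    # Subsequent runs: reload the existing index from disk
    storage_context = StorageContext.from_defaults(persist_dir=PERSIST_DIR)
    index = load_index_from_storage(storage_context)
```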
Whether the index was just built or loaded from disk, you query it exactly the same way as before:

```python
query_engine = index.as_query_engine()
response = query_engine.query("What did the author focus on growing up?")
print(response)
```