Description
Traceback (most recent call last):
  File "C:\Users\MP\PycharmProjects\PythonAgentAI\venv\Lib\site-packages\llama_index\llms\utils.py", line 29, in resolve_llm
    validate_openai_api_key(llm.api_key)
  File "C:\Users\MP\PycharmProjects\PythonAgentAI\venv\Lib\site-packages\llama_index\llms\openai_utils.py", line 383, in validate_openai_api_key
    raise ValueError(MISSING_API_KEY_ERROR_MESSAGE)
ValueError: No API key found for OpenAI.
Please set either the OPENAI_API_KEY environment variable or openai.api_key prior to initialization.
API keys can be found or created at https://platform.openai.com/account/api-keys

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "C:\Users\MP\PycharmProjects\PythonAgentAI\main.py", line 10, in <module>
    from pdf import canada_engine
  File "C:\Users\MP\PycharmProjects\PythonAgentAI\pdf.py", line 24, in <module>
    canada_index = get_index(canada_pdf, "canada")
                   ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "C:\Users\MP\PycharmProjects\PythonAgentAI\pdf.py", line 15, in get_index
    index = load_index_from_storage(
            ^^^^^^^^^^^^^^^^^^^^^^^^
  File "C:\Users\MP\PycharmProjects\PythonAgentAI\venv\Lib\site-packages\llama_index\indices\loading.py", line 33, in load_index_from_storage
    indices = load_indices_from_storage(storage_context, index_ids=index_ids, **kwargs)
              ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "C:\Users\MP\PycharmProjects\PythonAgentAI\venv\Lib\site-packages\llama_index\indices\loading.py", line 78, in load_indices_from_storage
    index = index_cls(
            ^^^^^^^^^^
  File "C:\Users\MP\PycharmProjects\PythonAgentAI\venv\Lib\site-packages\llama_index\indices\vector_store\base.py", line 53, in __init__
    super().__init__(
  File "C:\Users\MP\PycharmProjects\PythonAgentAI\venv\Lib\site-packages\llama_index\indices\base.py", line 61, in __init__
    self._service_context = service_context or ServiceContext.from_defaults()
                                               ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "C:\Users\MP\PycharmProjects\PythonAgentAI\venv\Lib\site-packages\llama_index\service_context.py", line 178, in from_defaults
    llm_predictor = llm_predictor or LLMPredictor(
                                     ^^^^^^^^^^^^^
  File "C:\Users\MP\PycharmProjects\PythonAgentAI\venv\Lib\site-packages\llama_index\llm_predictor\base.py", line 109, in __init__
    self._llm = resolve_llm(llm)
                ^^^^^^^^^^^^^^^^
  File "C:\Users\MP\PycharmProjects\PythonAgentAI\venv\Lib\site-packages\llama_index\llms\utils.py", line 31, in resolve_llm
    raise ValueError(
ValueError:
Could not load OpenAI model. If you intended to use OpenAI, please check your OPENAI_API_KEY.
Original error:
No API key found for OpenAI.
Please set either the OPENAI_API_KEY environment variable or openai.api_key prior to initialization.
API keys can be found or created at https://platform.openai.com/account/api-keys
To disable the LLM entirely, set llm=None.
Process finished with exit code 1
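
The traceback fails because no OpenAI key is visible when `pdf.py` loads the index at import time (line 24 above). A minimal sketch of the fix the error message describes, assuming the key is set in `main.py` before the import; the `"sk-..."` placeholder is hypothetical and would normally come from the shell or the PyCharm run configuration rather than source code:

```python
# main.py -- make the key visible before anything from pdf.py runs,
# because pdf.py builds/loads the index at import time.
import os
import openai

# Hypothetical placeholder; prefer setting OPENAI_API_KEY in the
# environment / run configuration instead of hard-coding it here.
os.environ.setdefault("OPENAI_API_KEY", "sk-...")
openai.api_key = os.environ["OPENAI_API_KEY"]

from pdf import canada_engine  # import only after the key is available
```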
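Alternatively, following the hint at the end of the error ("set llm=None"), the index can be loaded with an explicit ServiceContext so the implicit `ServiceContext.from_defaults()` call in the traceback never tries to construct an OpenAI LLM. This is only a sketch: the `persist_dir="canada"` value is a guess at what `get_index()` uses, and the default embedding model is still OpenAI, so a key (or a non-OpenAI embed model) may still be required for embeddings:

```python
# pdf.py -- pass an explicit ServiceContext when loading the index instead
# of relying on the default, which is what raises the OpenAI key error.
from llama_index import ServiceContext, StorageContext, load_index_from_storage

service_context = ServiceContext.from_defaults(llm=None)  # disable the LLM, per the error hint
storage_context = StorageContext.from_defaults(persist_dir="canada")  # hypothetical persist dir
index = load_index_from_storage(storage_context, service_context=service_context)
```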