Framework integrations

LlamaCloud works seamlessly with our open-source Python framework and TypeScript framework.

You can use LlamaCloudIndex as a drop-in replacement for VectorStoreIndex. It offers better performance out of the box, while simplifying setup and maintenance.

You can either create an index via the framework, or connect to an existing index (e.g. created via the no-code UI).

Create new index

In comparison to creating a new index via the no-code UI, you can create an index from Document objects via the framework integration. This gives you more low-level control over:

  1. how you want to pre-process your data, and
  2. using any data loaders from LlamaHub (see the loader sketch under Load documents below).

Note that in this case, the data loading runs locally (i.e. alongside the framework code). For larger-scale ingestion, it's better to create the index via the no-code UI, or to use the files or data sources API via the REST API & clients.

Load documents

from llama_index.core import SimpleDirectoryReader

documents = SimpleDirectoryReader("data").load_data()
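
As an alternative, you can load documents with any LlamaHub loader. A minimal sketch, assuming the llama-index-readers-web package is installed and using a placeholder URL:

from llama_index.readers.web import SimpleWebPageReader

documents = SimpleWebPageReader(html_to_text=True).load_data(
    ["https://example.com/docs"]  # placeholder URL
)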

Create LlamaCloudIndex

import os
from llama_index.indices.managed.llama_cloud import LlamaCloudIndex

os.environ["LLAMA_CLOUD_API_KEY"] = "llx-..."  # can provide API key in env or in the constructor later on

index = LlamaCloudIndex.from_documents(
    documents,
    "my_first_index",
    project_name="Default",
    api_key="llx-...",
    verbose=True,
)

You may also optionally supply an organization_id string parameter to the .from_documents method. This is useful if you belong to multiple organizations that have projects with the same name (more info). In general, it is recommended to supply this parameter whenever your account belongs to more than one organization, so that your code keeps working as more projects are created in those organizations.
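
For example, a minimal sketch with a placeholder organization ID:

index = LlamaCloudIndex.from_documents(
    documents,
    "my_first_index",
    project_name="Default",
    organization_id="YOUR_ORG_ID",  # placeholder value
    api_key="llx-...",
    verbose=True,
)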

Connect to an existing index

import os

os.environ["LLAMA_CLOUD_API_KEY"] = "llx-..."  # can provide API key in env or in the constructor later on

from llama_index.indices.managed.llama_cloud import LlamaCloudIndex

index = LlamaCloudIndex("my_first_index", project_name="Default")
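
The API key can also be passed to the constructor instead of being read from the environment; a minimal sketch:

index = LlamaCloudIndex(
    "my_first_index",
    project_name="Default",
    api_key="llx-...",  # instead of LLAMA_CLOUD_API_KEY in the environment
)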

Use index in RAG/agent application

# retriever: fetch the most relevant nodes for a query
retriever = index.as_retriever()
nodes = retriever.retrieve("Example query")
...

# query engine: retrieve relevant nodes and synthesize an answer
query_engine = index.as_query_engine()
answer = query_engine.query("Example query")
...

# chat engine: hold a multi-turn conversation over the index
chat_engine = index.as_chat_engine()
message = chat_engine.chat("Example query")
...
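
For example, a minimal sketch of inspecting the results above (using the standard response and node objects returned by the framework):

# print the synthesized answer
print(str(answer))

# print each retrieved node's score and a snippet of its text
for node_with_score in nodes:
    print(node_with_score.score, node_with_score.node.get_content()[:100])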

See full framework documentation