I am working on relationship extraction using the PropertyGraphStore class from LangChain, following the approach outlined in the official guide. My goal is to restrict the nodes and relationships being extracted by using SchemaLLMPathExtractor.
Here's the issue:
When I use local models like Llama 3.1 or Mistral through Ollama, no relationships get extracted when SchemaLLMPathExtractor is applied.
If I remove SchemaLLMPathExtractor, it extracts a large number of relationships, but they are not constrained as needed.
Interestingly, when I switch to OpenAI's API instead of Ollama, everything works as expected, even with SchemaLLMPathExtractor.
I’ve followed the instructions from the LangChain documentation and ensured the local models are set up correctly. Here are the steps I’ve tried:
Verified the compatibility of Ollama with the LangChain version I’m using.
Double-checked the schema and input configurations to align with the examples provided in the guide.
Tested with different prompts to ensure that the issue is not with the input format.
Ensured the local Ollama models are set up correctly, as they produce outputs without SchemaLLMPathExtractor.
Checked the schema definitions, ensuring they match the example in the documentation.
Still, I'm facing the same issue with Ollama. There are no errors; the output simply doesn't include any relationships when SchemaLLMPathExtractor is used.
Key Details:
LangChain Version: 0.2.14
Ollama Version: 0.3.9
Local Models Tried: Llama 3.1, Mistral
OpenAI Model Used: gpt-4o-mini
Code for OpenAI:
# Creating an instance of Neo4jPropertyGraphStore with environment configuration
graph_store = Neo4jPropertyGraphStore(
    username=NEO4J_USER,
    password=NEO4J_PASS,
    url=NEO4J_URL,
    database=NEO4J_DB_NAME,
)
vec_store = None
# Creating an instance of SchemaLLMPathExtractor with OpenAI model and schema configuration
kg_extractor = SchemaLLMPathExtractor(
    llm=OpenAI(model=LLM_MODEL, temperature=TEMPERATURE),
    possible_entities=entities,
    possible_relations=relations,
    kg_validation_schema=validation_schema,
    strict=True,
)
# Creating an instance of PropertyGraphIndex with documents and environment configuration
index = PropertyGraphIndex.from_documents(
    documents,
    embed_model=OpenAIEmbedding(model_name=EMBEDDING_MODEL),
    show_progress=True,
    kg_extractors=[kg_extractor],
    property_graph_store=graph_store,
    vector_store=vec_store,
)
Code for Ollama:
# Creating an instance of Neo4jPropertyGraphStore with environment configuration
graph_store = Neo4jPropertyGraphStore(
    username=NEO4J_USER,
    password=NEO4J_PASS,
    url=NEO4J_URL,
    database=NEO4J_DB_NAME,
)
vec_store = None
# Creating an instance of SchemaLLMPathExtractor with Ollama model and schema configuration
kg_extractor = SchemaLLMPathExtractor(
    llm=Ollama(model="mistral:latest"),
    possible_entities=entities,
    possible_relations=relations,
    kg_validation_schema=validation_schema,
    strict=True,
)
# Creating an instance of PropertyGraphIndex with documents and environment configuration
index = PropertyGraphIndex.from_documents(
    documents,
    embed_model=OllamaEmbedding(model_name="mistral:latest", base_url="http://localhost:11434"),
    show_progress=True,
    kg_extractors=[kg_extractor],
    property_graph_store=graph_store,
    vector_store=vec_store,
)
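My current hypothesis is that with strict=True, any extracted triple whose labels deviate even slightly from the schema is silently discarded, so a local model that, say, lowercases the labels ends up producing zero relationships. The toy filter below is my own sketch of that failure mode, not the library's actual code:

```python
# Toy reproduction of strict schema filtering: triples whose labels are not an
# exact match for the allowed sets are silently dropped, leaving no output.
ALLOWED_ENTITIES = {"PERSON", "ORGANIZATION"}
ALLOWED_RELATIONS = {"WORKS_AT"}

def strict_filter(triples):
    # Keep only (subject_label, relation, object_label) triples that match exactly
    return [
        (s, r, o)
        for (s, r, o) in triples
        if s in ALLOWED_ENTITIES and r in ALLOWED_RELATIONS and o in ALLOWED_ENTITIES
    ]

extracted = [
    ("Person", "works_at", "Organization"),  # label casing drifted (local model)
    ("PERSON", "WORKS_AT", "ORGANIZATION"),  # exact match (gpt-4o-mini)
]
print(strict_filter(extracted))  # only the exact-match triple survives
```

If that is what's happening, it would explain why removing the extractor (or using a stronger model) produces relationships while the strict local-model setup produces none.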
Has anyone encountered a similar issue with Ollama and SchemaLLMPathExtractor?
Is there any specific configuration or adjustment required to make this setup work?
Any insights or suggestions would be highly appreciated!
Source: python - Issue with SchemaLLMPathExtractor and Ollama Models for Relationship Extraction in LangChain - Stack Overflow