Example v<=1.0.3

NOTE: This version will be deprecated soon; use the latest version's docs.

Prerequisites-

Python: Ensure Python is installed on your system.
GenAI Knowledge: Familiarity with Generative AI models.
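You can verify that Python is available from the command line (the interpreter is usually `python3` on Unix-like systems and `python` on Windows):

```shell
# Confirm a Python interpreter is on PATH
python3 --version
```

If the package is distributed on PyPI under the name `chatformers`, `pip install chatformers` would typically install it; check the project's own install instructions to confirm the package name.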

Example Usage-

from chatformers.chatbot import chat

# llm_provider_settings = {
#     "provider": 'ollama',
#     "base_url": 'http://localhost:11434',
#     "model": "openhermes",
#     "options": {},
#     "api_key": None
# }
# embedding_model_settings = {
#     "provider": 'ollama',
#     "base_url": 'http://localhost:11434',
#     "model": "nomic-embed-text",
#     "api_key": None
# }
# llm_provider_settings = {
#     "provider": 'openai',
#     "base_url": "https://api.openai.com/v1",
#     "model": "gpt-4o-mini",
#     "options": {},
#     "api_key": ""
# }
# embedding_model_settings = {
#     "provider": 'openai',
#     "base_url": "https://api.openai.com/v1",
#     "model": "text-embedding-ada-002",
#     "api_key": ""
# }

llm_provider_settings = {
    "provider": 'groq',
    "base_url": 'https://api.groq.com/openai/v1',
    "model": "gemma2-9b-it",
    "api_key": "",
}

embedding_model_settings = {
    "provider": 'jina',
    "base_url": "https://api.jina.ai/v1/embeddings",
    "model": "jina-embeddings-v2-base-en",
    "api_key": ""
}

chroma_settings = {
    "host": None,
    "port": None,
    "settings": None
}

memory_settings = {
    "try_queries": True,
    "results_per_query": 3,
}
collection_name = "conversation"
unique_session_id = "012"
unique_message_id = "A01"
system_message = "You are a helpful assistant."
buffer_window_chats = [
    {'role': 'user', 'content': 'what is 7*5?'},
    {'role': 'assistant', 'content': '35'},
    {'role': 'user', 'content': 'now add 4 on that.'},
]
query = "Now add, 100 on that."
response = chat(query=query, system_message=system_message,
                llm_provider_settings=llm_provider_settings,
                chroma_settings=chroma_settings,
                embedding_model_settings=embedding_model_settings,
                memory_settings=memory_settings,
                memory=True,
                summarize_memory=False,
                collection_name=collection_name,
                unique_session_id=unique_session_id,
                unique_message_id=unique_message_id,
                buffer_window_chats=buffer_window_chats)
print("Assistant: ", response)
As you can see, this is an initial conversation with the assistant, where it is not yet aware of any context (compare with the next output)-
Vector Database Queries sliced: ['What is the date of the conversation?', 'Has the user asked about politics or current events before?', 'who is PM of India?']

Processing queries to vector database: 100%|██████████| 3/3 [00:01<00:00, 2.21it/s]
0 past conversation fetched.'
system: You are a helpful assistant.
Here is the memory of old conversations-
{'memories': '[]'}

user: what is 7*5?
assistant: 35
user: who is PM of India?
Assistant: As of my last knowledge update in October 2023, the Prime Minister of India is Narendra Modi. He has been in office since May 2014. Please verify with a current source to confirm this information, as political positions can change.
Below you can see that the assistant remembers the context in the next run; the fetched context is passed in as memories-
Vector Database Queries sliced: ["Who is 'him' referring to?", "What information does the user already have about 'him'?", 'Tell me more about him?']

Processing queries to vector database: 0%| | 0/3 [00:00<?, ?it/s]Number of requested results 3 is greater than number of elements in index 1, updating n_results = 1
Processing queries to vector database: 33%|███▎ | 1/3 [00:00<00:01, 1.84it/s]Number of requested results 3 is greater than number of elements in index 1, updating n_results = 1
Processing queries to vector database: 67%|██████▋ | 2/3 [00:00<00:00, 2.27it/s]Number of requested results 3 is greater than number of elements in index 1, updating n_results = 1
Processing queries to vector database: 100%|██████████| 3/3 [00:01<00:00, 2.14it/s]
3 past conversation fetched.'
system: You are a helpful assistant.
Here is the memory of old conversations-
{'memories': "['user: who is PM of India?\\nassistant: As of my last knowledge update in October 2023, the Prime Minister of India is Narendra Modi. He has been in office since May 2014. Please verify with a current source to confirm this information, as political positions can change.', 'user: who is PM of India?\\nassistant: As of my last knowledge update in October 2023, the Prime Minister of India is Narendra Modi. He has been in office since May 2014. Please verify with a current source to confirm this information, as political positions can change.', 'user: who is PM of India?\\nassistant: As of my last knowledge update in October 2023, the Prime Minister of India is Narendra Modi. He has been in office since May 2014. Please verify with a current source to confirm this information, as political positions can change.']"}

user: what is 7*5?
assistant: 35
user: Tell me more about him?
Insert of existing embedding ID: A01
Add of existing embedding ID: A01
Assistant: Narendra Modi is the Prime Minister of India, having been in office since May 2014. He is a member of the Bharatiya Janata Party (BJP) and the Rashtriya Swayamsevak Sangh (RSS), a Hindu nationalist volunteer organization. Modi was born on September 17, 1950, in Vadnagar, Gujarat.

Before becoming Prime Minister, he served as the Chief Minister of Gujarat from 2001 to 2014. His tenure as Prime Minister has been marked by significant economic reforms, such as the Goods and Services Tax (GST), the Make in India initiative, and efforts to improve infrastructure and digital connectivity.

Modi's government has also been known for its focus on national security, and he has taken a strong stance on issues concerning terrorism and cross-border relations, especially with Pakistan. He has been a polarizing figure in Indian politics, with both supporters praising his leadership and critics raising concerns about religious tensions and democratic backsliding.

Modi's foreign policy emphasizes strengthening India's global standing and relationships, particularly in the Indo-Pacific region. His leadership style is characterized by a strong central authority and a focus on development and progress.

For the latest developments or specific policies, it is recommended to consult current and authoritative sources.
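Judging from the logs above, the prompt sent to the LLM combines the system message, the fetched memories, the buffer window chats, and the new query. A minimal sketch of that assembly (illustrative only; `build_messages` is not part of the chatformers API, and the real implementation may differ):

```python
def build_messages(system_message, memories, buffer_window_chats, query):
    """Assemble the message list: system prompt with memories,
    then the buffer window, then the new user query."""
    system_content = (
        f"{system_message}\n"
        "Here is the memory of old conversations-\n"
        f"{{'memories': {memories!r}}}"
    )
    messages = [{"role": "system", "content": system_content}]
    messages += buffer_window_chats      # short-term context
    messages.append({"role": "user", "content": query})
    return messages

messages = build_messages(
    system_message="You are a helpful assistant.",
    memories="[]",
    buffer_window_chats=[
        {"role": "user", "content": "what is 7*5?"},
        {"role": "assistant", "content": "35"},
    ],
    query="who is PM of India?",
)
for m in messages:
    print(m["role"], ":", m["content"][:60])
```

This mirrors the first run's log: with no stored conversations, the memories slot is an empty list, and only the buffer window supplies context.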

Understanding Settings Parameters-

llm_provider_settings-
provider: the provider name; the examples here use openai, groq, and ollama.
base_url: base URL of the provider's API.
model: name of the model to use.
options: optional; by default the provider's default settings are used.
api_key: API key for the provider (None for a local Ollama server).
# openai llm_provider_settings
llm_provider_settings = {
    "provider": 'openai',
    "base_url": "https://api.openai.com/v1",
    "model": "gpt-4o-mini",
    "options": {},
    "api_key": ""
}

# groq llm_provider_settings
llm_provider_settings = {
    "provider": 'groq',
    "base_url": 'https://api.groq.com/openai/v1',
    "model": "gemma2-9b-it",
    "api_key": "",
}

# ollama llm_provider_settings
llm_provider_settings = {
    "provider": 'ollama',
    "base_url": 'http://localhost:11434',
    "model": "llama3.1",
    "options": {},
    "api_key": None
}