SambaNova
SambaNova's Sambaverse and SambaStudio are platforms for running your own open-source models.
This example covers how to use LangChain to interact with SambaNova models.
Sambaverse
Sambaverse allows you to interact with multiple open-source models. You can see the list of available models and interact with them in the playground.
An API key is required to access Sambaverse models. Get one by creating an account at sambaverse.sambanova.ai.
The sseclient-py package is required to run streaming predictions
%pip install --quiet sseclient-py==1.8.0
Register your API key as an environment variable:
import os
sambaverse_api_key = "<Your sambaverse API key>"
# Set the environment variables
os.environ["SAMBAVERSE_API_KEY"] = sambaverse_api_key
Call Sambaverse models directly from LangChain!
from langchain_community.llms.sambanova import Sambaverse
llm = Sambaverse(
    sambaverse_model_name="Meta/llama-2-7b-chat-hf",
    streaming=False,
    model_kwargs={
        "do_sample": True,
        "max_tokens_to_generate": 1000,
        "temperature": 0.01,
        "process_prompt": True,
        "select_expert": "llama-2-7b-chat-hf",
        # "repetition_penalty": {"type": "float", "value": "1"},
        # "top_k": {"type": "int", "value": "50"},
        # "top_p": {"type": "float", "value": "1"},
    },
)
print(llm.invoke("Why should I use open source models?"))
SambaStudio
SambaStudio allows you to train, run batch inference jobs, and deploy online inference endpoints to run your own fine-tuned open-source models.
A SambaStudio environment is required to deploy a model. Find more information at sambanova.ai/products/enterprise-ai-platform-sambanova-suite.
The sseclient-py package is required to run streaming predictions
%pip install --quiet sseclient-py==1.8.0
Register your environment variables:
import os
sambastudio_base_url = "<Your SambaStudio environment URL>"
sambastudio_project_id = "<Your SambaStudio project id>"
sambastudio_endpoint_id = "<Your SambaStudio endpoint id>"
sambastudio_api_key = "<Your SambaStudio endpoint API key>"
# Set the environment variables
os.environ["SAMBASTUDIO_BASE_URL"] = sambastudio_base_url
os.environ["SAMBASTUDIO_PROJECT_ID"] = sambastudio_project_id
os.environ["SAMBASTUDIO_ENDPOINT_ID"] = sambastudio_endpoint_id
os.environ["SAMBASTUDIO_API_KEY"] = sambastudio_api_key
Call SambaStudio models directly from LangChain!
from langchain_community.llms.sambanova import SambaStudio
llm = SambaStudio(
    streaming=False,
    model_kwargs={
        "do_sample": True,
        "max_tokens_to_generate": 1000,
        "temperature": 0.01,
        # "repetition_penalty": {"type": "float", "value": "1"},
        # "top_k": {"type": "int", "value": "50"},
        # "top_logprobs": {"type": "int", "value": "0"},
        # "top_p": {"type": "float", "value": "1"},
    },
)
print(llm.invoke("Why should I use open source models?"))