from langchain_core.prompts import ChatPromptTemplate
from langchain_ollama.llms import OllamaLLM

template = """Question: {question}

Answer: Let's think step by step."""

prompt = ChatPromptTemplate.from_template(template)
model = OllamaLLM(model="llama3.1").bind(logprobs=True)
chain = prompt | model
chain.invoke({"question": "What is LangChain?"})
'Let\'s break down the concept of "LangChain".\n\nStep 1: Understanding the term "Lang"\n\nThe prefix "Lang" comes from the word "Language". So, it seems that LangChain might be related to language in some way.\n\nStep 2: Chain\n\nThe word "Chain" implies a connection or a link between two things. In this context, it suggests that LangChain could be a tool or a system that connects languages, concepts, or ideas in some manner.\n\nStep 3: Putting it together\n\nWith the understanding of "Lang" and "Chain", we can infer that LangChain might be a technology or platform that facilitates connections between languages, enabling multilingual interactions, language translation, or even language generation. It could also imply a tool for creating chains of reasoning or ideas across different linguistic contexts.\n\nStep 4: Potential applications\n\nConsidering the implications of LangChain as a language connection tool, potential applications could range from machine translation and text summarization to chatbots, voice assistants, and even AI-powered content creation tools.\n\nAm I on the right track?'
msg = model.invoke(("human", "how are you today"))
msg.response_metadata["logprobs"]["content"][:5]
---------------------------------------------------------------------------
AttributeError                            Traceback (most recent call last)
Cell In[4], line 3
      1 msg = model.invoke(("human", "how are you today"))
----> 3 msg.response_metadata["logprobs"]["content"][:5]

AttributeError: 'str' object has no attribute 'response_metadata'
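The error happens because `OllamaLLM` is a string-completion interface: `invoke` returns a plain `str`, and `response_metadata` only exists on message objects returned by chat models (e.g. `ChatOllama`'s `AIMessage`). A minimal defensive sketch, assuming the OpenAI-style `response_metadata["logprobs"]["content"]` layout (which Ollama may not actually populate; `extract_logprobs` is a hypothetical helper, not part of either library):

```python
def extract_logprobs(result):
    """Return logprob entries from a chat result, or None for plain strings.

    String-completion models like OllamaLLM return `str`, which carries no
    `response_metadata`; chat models return a message object that does.
    The "logprobs" -> "content" key layout assumed here follows the
    OpenAI-style convention and may be absent for Ollama backends.
    """
    meta = getattr(result, "response_metadata", None)
    if meta is None:
        return None  # plain string completion: no metadata attached
    return (meta.get("logprobs") or {}).get("content", [])
```

Calling this on the `msg` above would return `None` instead of raising, which makes the string-vs-message distinction explicit.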
msg
"Human: I'm feeling quite well, thank you for asking. Just a bit tired from the morning rush, but otherwise energized and ready to take on the day! How about you?"
import ollama

response = ollama.chat(
    model='llama3.1',
    messages=[
        {
            'role': 'user',
            'content': 'Why is the sky blue?',
        },
    ],
    n_probs=3,
)
print(response['message']['content'])
---------------------------------------------------------------------------
TypeError                                 Traceback (most recent call last)
Cell In[7], line 2
      1 import ollama
----> 2 response = ollama.chat(model='llama3.1', messages=[
      3     {
      4         'role': 'user',
      5         'content': 'Why is the sky blue?',
      6     },
      7 ],
      8 n_probs=3
      9 )
     10 print(response['message']['content'])

TypeError: Client.chat() got an unexpected keyword argument 'n_probs'
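`n_probs` is a parameter of the underlying llama.cpp server, not a keyword the `ollama` Python client accepts; generation tunables go in the `options` dict instead, and the client exposes no per-token logprob switch. A sketch of the accepted call shape (`build_chat_args` is a hypothetical helper introduced here for illustration):

```python
def build_chat_args(model, prompt, **options):
    """Build keyword arguments for ollama.chat().

    The client accepts `model`, `messages`, and an `options` dict of
    generation settings (e.g. temperature, top_k). Passing unknown
    keywords such as `n_probs` directly raises the TypeError above.
    """
    return {
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
        "options": options,  # e.g. {"temperature": 0.2, "top_k": 3}
    }

# Usage (requires a running Ollama server):
# import ollama
# response = ollama.chat(**build_chat_args("llama3.1", "Why is the sky blue?"))
# print(response["message"]["content"])
```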