#1 - Getting Started Building Generative AI Using Hugging Face Open Source Models and LangChain
Let's write an article that conveys the appeal of AI technology. This time, we focus on langchain-huggingface, a package maintained jointly by LangChain and Hugging Face.
langchain-huggingface is a new Python package that brings the latest Hugging Face developments into LangChain and aims to stay continuously up to date. It integrates seamlessly with LangChain, offering an efficient and effective way to use Hugging Face models within the LangChain ecosystem. This partnership is not just a matter of sharing technology; it is also a joint commitment to maintain and continuously improve the integration.
This article focuses on how langchain-huggingface contributes to the evolution of AI technology and on its benefits, with concrete usage examples and scenarios so that readers can picture how it is used and deepen their understanding of and interest in AI.
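As a rough sketch of what the package looks like in practice (the `repo_id`, `model_id`, and token below are illustrative placeholders I chose, not values taken from this post):

```python
# Sketch of the two main entry points langchain-huggingface provides.
# Assumes `pip install langchain-huggingface`.

# 1) Hosted inference through the Hugging Face Inference API:
# from langchain_huggingface import HuggingFaceEndpoint
# llm = HuggingFaceEndpoint(
#     repo_id="mistralai/Mistral-7B-Instruct-v0.2",
#     huggingfacehub_api_token="hf_...",  # placeholder token
# )

# 2) Local inference through a transformers pipeline:
# from langchain_huggingface import HuggingFacePipeline
# llm = HuggingFacePipeline.from_model_id(model_id="gpt2", task="text-generation")

# Either object exposes the standard LangChain call surface, e.g. llm.invoke(prompt).
# The prompt side is ordinary string templating:
def build_prompt(question: str) -> str:
    """Minimal stand-in for a PromptTemplate (helper name is my own)."""
    return f"Question: {question}\nAnswer: Let us think step by step."

print(build_prompt("What does langchain-huggingface provide?"))
```

Both paths return an object that plugs into the rest of the LangChain ecosystem (chains, agents, output parsers) without further adaptation.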
Join my data science community Discord group, where we discuss many things. Happy learning!!
https://discord.gg/u7q6ZNSH
We can call it after quantizing (Mistral).
Amazing video. Thanks a lot Krish, you are a great teacher.
How can we build projects with it? 🤷😐
Thanks for the video! Brief, informative, and to the point.
Sir, how do we make it like a web page where we can chat, just like GPT? I do not know how to connect Python to HTML pages.
Awesome sessions, I always follow your channel. 😍
22:00 That moment when you realize it's Causal and not Casual.
Amazing, this video really helped.
Hi, I am trying this out and getting a rate limit error after calling only once. How can I solve this?
Sir, please continue this series.
Hi Krish,
Can you make a playlist on evaluating LLM models and producing an evaluation matrix, for RAG models and fine-tuned models?
device can't be equal to zero.
# Use a Hugging Face pipeline with GPU
from langchain_huggingface import HuggingFacePipeline

gpu_llm = HuggingFacePipeline.from_model_id(
    model_id="gpt2",
    task="text-generation",
    device=0,  # index of the first CUDA device
    pipeline_kwargs={"max_new_tokens": 100},
)
Error at this line. Help!
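The `device=0` error usually means no CUDA device is visible in the environment; `transformers` pipelines use `-1` for CPU and `0, 1, …` for GPU indices. A minimal sketch of picking the device defensively (the helper name is my own, not part of any library):

```python
def pick_device(cuda_available: bool) -> int:
    """Return a pipeline device index: 0 = first GPU, -1 = CPU."""
    return 0 if cuda_available else -1

# In practice, detect CUDA via torch (requires `pip install torch`):
# import torch
# device = pick_device(torch.cuda.is_available())
#
# from langchain_huggingface import HuggingFacePipeline
# gpu_llm = HuggingFacePipeline.from_model_id(
#     model_id="gpt2",
#     task="text-generation",
#     device=device,  # falls back to CPU instead of raising when no GPU exists
#     pipeline_kwargs={"max_new_tokens": 100},
# )

print(pick_device(False))  # → -1
```

On a free Colab runtime without a GPU attached (Runtime → Change runtime type), this fallback keeps the code running on CPU instead of failing.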
Is it possible to use the HuggingFaceEndpoint module for other downstream tasks, like text-classification and so on?
Which playlist is this present in?
Getting this: Bad request: "Authorization header is correct, but the token seems invalid" after using llm.invoke(""). It got resolved: I changed the token permission from fine-grained to read.
Excellent 🎉🎉🎉
God bless Krish for doing excellent service.
Ran the following code (and equivalents) and it keeps giving me the answer "Boris Becker" instead of Stefan Edberg!
###
from langchain.prompts import PromptTemplate
from langchain.chains import LLMChain

# question = "what is the hypotenuse length of a Pythagorean triangle with sides 3 and 4"
question = "who won the Wimbledon mens final 1990"

template = """Question: {question}
Answer: Let us think step by step."""

prompt = PromptTemplate(template=template, input_variables=["question"])
print(prompt)

llm_chain = LLMChain(llm=llm, prompt=prompt)
print(llm_chain.invoke(question))
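On newer LangChain versions, `LLMChain` is deprecated in favor of piping the prompt into the model (LCEL). A sketch of the equivalent chain, assuming `llm` is already constructed as in the video:

```python
# LCEL replacement for LLMChain: compose prompt and model with `|`.
# Assumes `llm` is an already-constructed LangChain LLM object.
# from langchain_core.prompts import PromptTemplate
# prompt = PromptTemplate.from_template(
#     "Question: {question}\nAnswer: Let us think step by step."
# )
# chain = prompt | llm
# print(chain.invoke({"question": "Who won the Wimbledon men's final in 1990?"}))

# The templating itself is plain string substitution:
template = "Question: {question}\nAnswer: Let us think step by step."
filled = template.format(question="Who won the Wimbledon men's final in 1990?")
print(filled)
```

As for the wrong answer: small instruct models frequently hallucinate on factual trivia like this, and no amount of chain wiring fixes that; retrieval (RAG) or a stronger model is the usual remedy.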
Part 2 please sir
sir bring up the next lecture asap
repo_id = "mistralai/Mistral-7B-Instruct-v0.2"
llm = HuggingFaceEndpoint(repo_id=repo_id, max_length=128, temperature=0.7, token=src_key)
WARNING:langchain_huggingface.llms.huggingface_endpoint:WARNING! max_length is not default parameter.
max_length was transferred to model_kwargs.
Please make sure that max_length is what you intended.
WARNING:langchain_huggingface.llms.huggingface_endpoint:WARNING! token is not default parameter.
token was transferred to model_kwargs.
Please make sure that token is what you intended.
Why am I getting these warnings? Please help.
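Those warnings mean `max_length` and `token` are not constructor parameters of `HuggingFaceEndpoint`, so they get shunted into `model_kwargs`. To the best of my knowledge of the current API, the accepted names are `max_new_tokens` and `huggingfacehub_api_token`; a sketch (the token value is a placeholder):

```python
# Parameter names HuggingFaceEndpoint accepts, replacing the ones that
# triggered the "not default parameter" warnings above.
endpoint_kwargs = {
    "repo_id": "mistralai/Mistral-7B-Instruct-v0.2",
    "max_new_tokens": 128,                 # instead of max_length=128
    "temperature": 0.7,
    "huggingfacehub_api_token": "hf_...",  # instead of token=src_key; placeholder
}

# from langchain_huggingface import HuggingFaceEndpoint
# llm = HuggingFaceEndpoint(**endpoint_kwargs)
print(sorted(endpoint_kwargs))
```

With the recognized names, nothing falls through to `model_kwargs` and the warnings should disappear.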
a Much needed video and series .. 😀
"The function `initialize_agent` was deprecated in LangChain 0.1.0 and will be removed in 0.3.0. Use Use new agent constructor methods like create_react_agent, create_json_agent, create_structured_chat_agent, etc. instead.
warn_deprecated("
I am getting this error while using an agent with an open-source model through the Hugging Face pipeline. I even tried create_react_agent, create_json_agent, and create_structured_chat_agent and still get an error. Can you help me?
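For the `initialize_agent` deprecation, the replacement constructor depends on which agent type you were using. A sketch of the ReAct path under the LangChain 0.1.x API (the hub prompt id `hwchase17/react` is the commonly used public one; `llm` and `tools` are assumed to exist already):

```python
# Migrating from initialize_agent to the per-type constructors.
# Rough mapping from old AgentType strings to their new constructors:
MIGRATION = {
    "zero-shot-react-description": "create_react_agent",
    "structured-chat-zero-shot-react-description": "create_structured_chat_agent",
}

# from langchain import hub
# from langchain.agents import AgentExecutor, create_react_agent
# prompt = hub.pull("hwchase17/react")           # a public ReAct prompt
# agent = create_react_agent(llm, tools, prompt)
# executor = AgentExecutor(agent=agent, tools=tools, handle_parsing_errors=True)
# executor.invoke({"input": "What is 25 * 4?"})

print(MIGRATION["zero-shot-react-description"])
```

Note that the new constructors return only the agent runnable; you must wrap it in an `AgentExecutor` yourself, which is the step most migration errors come from. Small open-source models also often fail to follow the ReAct output format, so `handle_parsing_errors=True` helps.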
Hey, hi @krishnaik06, my first time seeing you, and a detailed video on Hugging Face & LangChain; it's been 4-5 days since I found Hugging Face & LangChain. I have a few questions, and it would be very helpful if you could clear them up.
I'm creating an Android application where I'm using Hugging Face Inference API endpoints with 2 models – Salesforce/blip-image-captioning-large and google/flan-t5-xxl. Is there any limitation on the Hugging Face access token? And are these models paid or free?
I have to deploy this Android application on the Play Store for a large group of people; are there any challenges or risks I might face in the future? Please answer.
Btw, loved your video, I'm getting interested in Generative AI. Keep creating such content, it really helps us. Thank you so much ❤
Why didn't you use the same "mistralai/Mistral-7B-Instruct-v0.2" for the pipeline example? Instead, you used "gpt2".
Kindly create a video in which you cover how to create a UI using Flask or another library in Google Colab for an LLM application…