Agentic AI: at first, it all sounded a bit complex, but once I started exploring, I realized how exciting and powerful these technologies really are. Here is what I learned today.
Generative AI is all about creating things (text, images, or code) using AI. It's what powers tools like ChatGPT.
Agentic AI goes one step further. It’s about building smart agents that can think, make decisions, and perform tasks on their own.
I also came across something called RAG (Retrieval-Augmented Generation), a method where the AI pulls in extra information from outside sources to give better, more grounded answers.
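To make the RAG idea concrete, here is a tiny, self-contained Python sketch. There is no real LLM or vector database here; the documents, the word-overlap scoring, and the prompt format are all made up for illustration.

```python
# Toy RAG: retrieve the most relevant document, then build an augmented prompt.

DOCS = [
    "LangChain is a framework for building LLM applications.",
    "UV is a fast Python package manager written in Rust.",
    "RAG combines retrieval with generation for grounded answers.",
]

def retrieve(question: str, docs: list[str]) -> str:
    """Pick the doc sharing the most words with the question (a stand-in for vector search)."""
    q_words = set(question.lower().split())
    return max(docs, key=lambda d: len(q_words & set(d.lower().split())))

def build_prompt(question: str) -> str:
    """Augment the question with retrieved context before it would go to an LLM."""
    context = retrieve(question, DOCS)
    return f"Context: {context}\nQuestion: {question}\nAnswer using the context."

print(build_prompt("What is UV written in?"))
```

A real system would embed the documents, store them in a vector database, and retrieve by similarity, but the shape of the pipeline (retrieve, then augment the prompt) is the same.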
Tools and Frameworks
Some amazing frameworks and tools I discovered today:
- LangChain and LangGraph: These help you build chatbots and agent-based systems easily.
- CrewAI: another interesting framework, used for building multi-agent applications.
Everything is built using Python.
Some Notes I Got
- AI is not just for researchers anymore. It's being used in everyday roles, such as software development.
- Instead of creating models from scratch, many people now use frameworks to build real applications quickly.
- Best part? Many of these tools and models are open-source and free, so anyone can start learning without spending a lot.
From the many YouTube videos I watched, I learned that while it helps to know some Python, machine learning, or deep learning basics, you don't need to be an expert to learn AI agents. You can still follow along, understand how things work, and gradually grow your skills.
As I continue my journey into the world of AI Agents, I recently learned about something super exciting — a framework called LangChain. Until now, I knew that large language models (LLMs) like GPT could do amazing things like writing, summarizing, and coding. But I always wondered: how do people actually use these models to build real-world applications?
That’s where LangChain comes in!
What Is LangChain?
LangChain is an open-source framework that helps developers (even beginners like me!) build powerful AI applications. Instead of connecting to an LLM directly and figuring everything out from scratch, LangChain gives you a proper structure to connect language models, databases, APIs, and more — all in one place.
It’s like having a toolbox specifically made for creating things like:
- Chatbots
- Smart assistants
- Retrieval-based AI systems (RAG)
- Multi-agent workflows
Cool Features I Learned It Has
- Supports many LLMs, like OpenAI, Anthropic, Google Gemini, and even Meta's models, making it easy to switch between models and test what works best.
- Can connect to vector databases and third-party APIs to give the AI updated, real-world info.
It also comes with companion tools:
- LangGraph: For building stateful AI agents and multi-step workflows.
- LangSmith: For debugging, testing, and monitoring AI projects.
- LangServe: For deploying chains and agents as APIs.
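The "easy to switch between models" point is the part that clicked for me. Here is a stdlib-only Python sketch of the idea: the app talks to a provider by name, so swapping vendors changes one string, not the app code. The provider functions and their replies are fake stand-ins, not real API calls.

```python
# Concept sketch: a provider-agnostic interface, so app code doesn't care
# which LLM vendor is behind it. Both "providers" here are fakes.

from typing import Callable

def fake_openai(prompt: str) -> str:
    return f"[openai] answer to: {prompt}"

def fake_gemini(prompt: str) -> str:
    return f"[gemini] answer to: {prompt}"

PROVIDERS: dict[str, Callable[[str], str]] = {
    "openai": fake_openai,
    "gemini": fake_gemini,
}

def ask(provider: str, prompt: str) -> str:
    """Swap models by name, the way LangChain lets you swap chat-model classes."""
    return PROVIDERS[provider](prompt)

print(ask("openai", "hello"))
print(ask("gemini", "hello"))
```

In real LangChain code the registry entries would be chat-model objects for each vendor, but the design benefit is the same: test what works best by changing one line.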
Most LLMs can’t access current information — they’re trained on data from the past. But LangChain helps you solve this by allowing your app to fetch live data using APIs. For example, you could connect it to a news API or search engine, and now your chatbot can answer questions about today, not just yesterday.
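The "fetch live data" pattern boils down to: decide whether the question needs fresh information, and if so, call a tool instead of answering from memory. Here is a minimal Python sketch; `get_today_news` is a fake stand-in for a real news API, and the keyword check stands in for the model's tool-choosing step.

```python
import datetime

def get_today_news(topic: str) -> str:
    """Stand-in for a real news API call (no network here; the headline is fake)."""
    today = datetime.date.today().isoformat()
    return f"[{today}] Example headline about {topic}"

def answer_with_live_data(question: str) -> str:
    """If the question needs fresh info, call the tool; otherwise answer directly."""
    if "today" in question.lower() or "news" in question.lower():
        return get_today_news("AI")
    return "I can answer that from my training data."

print(answer_with_live_data("Any AI news today?"))
```

In LangChain, the LLM itself decides when to invoke such a tool; this sketch just hard-codes that decision to show the data flow.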
Also, LangChain supports reading files like PDFs and scraping websites. So, I can build an AI that understands documents, chats about them, or even summarizes them. How cool is that?
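Before an AI can chat about a long PDF, the text is usually split into overlapping chunks so each piece fits in the model's context window. Here is a stdlib sketch of that splitting step; the chunk size and overlap are arbitrary example values.

```python
def split_text(text: str, chunk_size: int = 50, overlap: int = 10) -> list[str]:
    """Split text into overlapping character chunks, like LangChain's text splitters do."""
    chunks = []
    step = chunk_size - overlap  # each chunk starts `step` chars after the previous one
    for start in range(0, len(text), step):
        chunks.append(text[start:start + chunk_size])
    return chunks

doc = "A long PDF would be split like this. " * 5
for i, chunk in enumerate(split_text(doc)):
    print(i, repr(chunk))
```

The overlap means the end of one chunk repeats at the start of the next, so a sentence cut at a boundary is still fully visible in at least one chunk.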
It’s Beginner-Friendly Too
I used to think you needed to be an AI expert to do any of this. But LangChain simplifies a lot of complex things. Even though you still need to know some Python, I found that the modular approach helps a lot — each part does one job, and you connect them step by step.
You don’t need to build a giant app all at once. You can start small — like making a simple chatbot — and then grow it as you learn.
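A "start small" chatbot can literally be one function plus a history list. In this Python skeleton, `reply` is a hypothetical stand-in for the LLM call; everything else is the real shape of a simple chat loop.

```python
def reply(message: str, history: list[tuple[str, str]]) -> str:
    """Stand-in for an LLM call; a real bot would send history + message to a model."""
    if "hello" in message.lower():
        return "Hi there! Ask me anything."
    return f"You said: {message} (after {len(history)} earlier exchanges)"

def chat_once(message: str, history: list[tuple[str, str]]) -> str:
    answer = reply(message, history)
    history.append((message, answer))  # keep context for the next turn
    return answer

history: list[tuple[str, str]] = []
print(chat_once("hello", history))
print(chat_once("what can you do?", history))
```

Swapping `reply` for a real model call is the only change needed to grow this into an actual chatbot, which is exactly the modular, step-by-step approach described above.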
Open Source and Widely Used
It’s also fully open-source, which means I can explore the code, try things out, and not worry about expensive tools or paid licenses. And the fact that big companies are using LangChain shows how powerful it is.
This means even beginners can start experimenting and creating useful, intelligent tools. I'm now excited to dive deeper, maybe starting with a chatbot, or a simple app that uses live data from the web.
Setting Up the Project
Let's set up a LangChain project using the UV package manager. I learned this from Krish Naik Hindi's YouTube channel, and it highlights the power of UV and the best practices for working with LangChain.
To set up such a project, we need a tool that is fast, reliable, and easy to use.
That’s where the UV Package Manager comes in. Written in Rust, UV is an extremely fast Python package and project manager. It can replace tools like pip, poetry, pyenv, and virtualenv — all in one — and is 10 to 100 times faster.
Complete Process for Project Setup with UV Package Manager
1. What is UV Package Manager?
UV is a universal Python package manager that:
- Manages Python versions
- Creates and manages virtual environments
- Runs scripts
- Provides a universal lockfile for dependency management
Thanks to Rust, it’s very fast and disk-space efficient.
It is supported on macOS, Linux, and Windows.
2. Setting Up Project Structure
First, create an empty folder (e.g., AgenticAI) to serve as your workspace. Open this folder in VS Code, which is one of the best IDEs for Python development.
Open the terminal and run:
code .
This command will launch VS Code in your project folder.
3. Installing UV Package Manager
UV can be installed in several ways:
Check the official instructions here:
https://github.com/astral-sh/uv
After installation, run uv in the terminal to check that it's properly installed.
4. Initialize a New Project with UV
To initialize a new project, run:
uv init
This creates essential project files like:
pyproject.toml
.gitignore
main.py
README.md
It also initializes a Git repository.
5. Set Up Python Version and Virtual Environment
You can install and manage specific Python versions using UV. Example:
uv python install 3.12.1
To create a virtual environment:
uv venv
This creates an isolated environment for your project (in a .venv folder by default). Activate it with source .venv/bin/activate on macOS/Linux, or .venv\Scripts\activate on Windows.
6. Installing Packages
To install LangChain and its dependencies, simply run:
uv add langchain
You can also use a requirements file:
uv add -r requirements.txt
This is much faster and more efficient than pip.
7. Setting Up for Jupyter Notebook
If you want to use LangChain in a Jupyter Notebook, install ipykernel with:
uv add ipykernel
Then you can select the Python kernel in VS Code and run your code in a notebook.
API Keys: OpenAI, Google Gemini, and Groq
LangChain and other LLM-based apps require API keys from the model providers. Examples:
- OpenAI API
- Google Gemini API
- Groq API (for open-source models)
How to Create a Google Gemini API Key:
Go to Google AI Studio (aistudio.google.com) and sign in.
To create an API key, you may need to create a project in the Google Cloud Console.
If the option to create a key isn’t visible, create a new project in the console and then generate the key.
Groq API Key Setup:
Visit groq.com, log in, and go to the Dev Console, where you'll find various open-source models available for API access.
Create an API key, copy it, and use it in your project.
How to Create an OpenAI API Key:
Visit OpenAI’s website, log in, and create a new API key.
Copy the key and add it to your project.
Use a separate config file like .env to securely store API keys, and keep it out of version control (add it to .gitignore).
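Real projects often use the python-dotenv package for this, but a minimal loader fits in a few lines of stdlib Python. The file contents and the DEMO_API_KEY name below are made-up examples.

```python
import os

def load_env(path: str = ".env") -> None:
    """Read KEY=VALUE lines from a .env file into os.environ (skips blanks and comments)."""
    with open(path) as f:
        for line in f:
            line = line.strip()
            if not line or line.startswith("#") or "=" not in line:
                continue
            key, _, value = line.partition("=")
            os.environ.setdefault(key.strip(), value.strip())

# Demo: write a throwaway .env file, load it, and read the key back.
with open(".env", "w") as f:
    f.write("# demo secrets\nDEMO_API_KEY=sk-demo-123\n")
load_env()
print(os.environ["DEMO_API_KEY"])
```

With the key in the environment, your LangChain code never has to hard-code a secret, and the .env file itself stays out of Git.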
Conclusion
Setting up a LangChain project with the UV package manager is not just fast and easy; it's a best practice for modern Python development.
UV combines Python version management, virtual environments, and dependency management in a single, blazing-fast tool.
Properly configuring API keys is crucial for LangChain-based AI projects.
With access to OpenAI, Google Gemini, and Groq models, you can build powerful AI applications.
With all this in place, I now have a strong foundation for my Agentic AI projects, and I am ready to dive deeper into LangChain and LLM-powered development.
Thanks for reading!
— Ranjan