AI Agent
Environments
UV
How to install
- Definition : a tool for managing Python projects and their dependencies (a faster alternative to pip, pyenv, poetry, virtualenv …)
- Install it with the standalone installer or `pip install uv` (see the uv docs)
How to use
- Initialize a project
cd {directory}
# Create a folder named {prj_name} and initialize uv inside it
uv init {prj_name}
# Or initialize uv in the current folder
uv init
- Fetch and setup other project dependencies
# Copy the pyproject.toml file into your project folder, and then:
uv sync  # run at the project folder
- Add new package without pyproject.toml
uv add {package_name}
uv.lock File
- After setting up dependencies with `uv add` or `uv sync`, a file named uv.lock appears. It records, for the system, the exact resolved sub-dependencies of the main dependencies specified in pyproject.toml.
Jupyter
How to install
- Install the Jupyter extension in vscode
- Install the ipykernel python module using uv
uv add ipykernel --dev  # --dev installs the module for development only
How to use
- Create a `.ipynb` file named main or anything else
- Select the venv created by uv as the kernel
OpenAI Billing
- OpenAI billing page
- $30-$50 might be enough to move forward
Set up project
- Create folder
- Initialize uv
uv init
- Copy pyproject.toml if it is provided
- Synchronize the current project's dependencies with the dependencies specified in pyproject.toml
uv sync  # if not in OneDrive
uv sync --python 3.12  # crewai and other packages depend on python 3.12
# If you're using OneDrive on Windows, either:
# 1. Don't use hardlinks
uv sync --link-mode=copy
# 2. Or clean the cache and sync again
uv cache clean
uv sync
- Set vscode's venv to uv's venv
- Create a `.env` file and add OPENAI_API_KEY (you can use any other name as the env variable)
os.getenv("OPENAI_API_KEY")  # it will return the key
- Run a python file
uv run {python_code}.py
uv run --python 3.12 {python_code}.py
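The `.env` step above can be sketched without extra packages; in a real project you would typically use python-dotenv (`uv add python-dotenv`) and call `load_dotenv()` instead. `load_env` and `MY_API_KEY` below are hypothetical names for illustration:

```python
# Minimal .env-style loader sketch: one KEY=VALUE pair per line.
# load_env and MY_API_KEY are hypothetical; python-dotenv's load_dotenv()
# is the usual real-world choice.
import os

def load_env(text: str) -> None:
    """Put KEY=VALUE lines from a .env-style string into os.environ."""
    for line in text.splitlines():
        line = line.strip()
        if not line or line.startswith("#") or "=" not in line:
            continue
        key, _, value = line.partition("=")
        os.environ.setdefault(key.strip(), value.strip())

load_env("MY_API_KEY=dummy-key-for-illustration")
print(os.getenv("MY_API_KEY"))  # → dummy-key-for-illustration
```

As the notes say, any variable name works as long as you read it back with the same `os.getenv` key.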
Build your first AI response
- How to get a response from the selected model by using the openai module
# Create client to link to openai
import openai
client = openai.OpenAI()
# Arguments
# - model : go to the pricing page and choose one of the models
# - messages : a list of dicts fed into this argument
response = client.chat.completions.create(
model="gpt-4o-mini",
messages=[
{
"role" : "user",
"content" : "How to build usd asset assembly in houdini ?",
}
]
)
# Following that, just retrieve the message from the choice object
choice = response.choices[0]
choice.message.content
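The messages argument above has a fixed shape: a list of dicts, each with a "role" ("system", "user", or "assistant") and a "content" string. A small sketch (the content strings are made up):

```python
# Each chat message is a dict with a role and a content string.
# An optional "system" message sets overall behaviour before the user turn.
messages = [
    {"role": "system", "content": "You are a Houdini pipeline assistant."},
    {"role": "user", "content": "How to build usd asset assembly in houdini ?"},
]

# Every entry must carry both keys
assert all({"role", "content"} <= set(m) for m in messages)
print(messages[0]["role"])  # → system
```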
AI Agent ! what is that ?
- Definition : a system that handles a question or problem on behalf of the user
- An AI acts as an AI Agent when the user asks:
"""
I have the following functions in my system.
`get_weather` `get_currency` `get_news`
All of them receive the name of a country as an argument (i.e. get_news('spain'))
Please answer with the name of the function that you would like me to run.
Please say nothing else, just the name of the function with the arguments.
Answer the following question: What is the weather in Greece ?
"""
- not:
"""
How to get the weather in Greece by using the functions below ?
`get_weather` `get_currency` `get_news`
"""
- With the first prompt, it returns something like
get_weather('greece')
- instead of
"To get the weather in Greece using a hypothetical `get_weather` function, you would typically follow these steps:\n\n1. **Check Parameters Required**: Understand what parameters the `get_weather` function accepts. Common parameters might include the location (in this case, Greece), the date for which you want the weather, and perhaps the type of data you need (current weather, forecast, etc.).\n\n2. **Call the Function**: Use the `get_weather` function with the appropriate arguments.\n\nHere’s a simple example in Python (assuming these functions are provided in a package or module):\n\n```python\n# Example of fetching weather in Greece\n\n# Assume these functions are defined in a module named 'weather_api'\nfrom weather_api import get_weather\n\n# Get current weather in Greece\ngreece_weather = get_weather(location='Greece')\n\n# Output the weather\nprint(greece_weather)\n```\n\n### Example Output\nThe output might be a dictionary or an object with weather details such as temperature, humidity, condition (sunny, rainy, etc.).\n\n### Additional Options\nIf you want to retrieve related information, you could also use the `get_currency` and `get_news` functions in a similar manner:\n\n```python\n# Example of fetching currency and news related to Greece\n\n# Get the currency information\nfrom finance_api import get_currency\n\ngreece_currency = get_currency(country='Greece')\nprint(greece_currency)\n\n# Get news related to Greece\nfrom news_api import get_news\n\ngreece_news = get_news(topic='Greece')\nprint(greece_news)\n```\n\n### Summary\n- Use `get_weather` to fetch weather data for Greece.\n- Use `get_currency` to fetch currency details (like the Euro).\n- Use `get_news` to get the latest news related to Greece.\n\nBe sure to check the documentation for the specific API or library you're using to understand the exact function signatures and available parameters!"
The prompt is really important !
Adding memory
- Goal : make the AI remember previous answers
- Problem : with the code above (Build your first AI response), it will not remember the user's previous question:
user : My name is Taiyeong
AI : Hi Taiyeong, what can I help you with?
user : what is my name
AI : Sorry...
- Solution : append the user input (str) and the AI response (str or object) to a list (or another container)
msg_stack = []
# Ask a question to the AI
msg_stack.append({"role": "user", "content": user_msg})
response = client.chat.completions.create(model="gpt-4o-mini", messages=msg_stack)
# After that, append the answer to the message stack
answer = response.choices[0].message.content
msg_stack.append({"role": "assistant", "content": answer})
# Get another question
msg_stack.append({"role": "user", "content": user_msg})
- Final code
from typing import List

import openai

client = openai.OpenAI()

def call_ai(msg_stack: List[dict]) -> str:
    response = client.chat.completions.create(
        model="gpt-4o-mini",
        messages=msg_stack,
    )
    answer = "Sorry, I didn't get it. Can you explain in more detail?"
    if response:
        answer = response.choices[0].message.content
    msg_stack.append({"role": "assistant", "content": answer})
    return answer

msg_stack = []
while True:
    user_msg = input("Send a message to the LLM...")
    if user_msg == "quit" or user_msg == "q":
        print("AI Answer : Ok, I will close this conversation now, but if you need any help, feel free to reach out to me :)")
        break
    else:
        msg_stack.append({"role": "user", "content": user_msg})
        answer_from_ai = call_ai(msg_stack)
        print(f"User asked : {user_msg}")
        print(f"AI Answer : {answer_from_ai}")
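One practical follow-up to the loop above: msg_stack grows with every turn, so long conversations eventually exceed the model's context window. A hedged sketch of one simple fix, a fixed-size window (`trim_history` is a hypothetical helper; real apps often count tokens instead of messages):

```python
# Sketch: keep the message stack from growing forever by trimming old turns.
# trim_history is a hypothetical helper, not part of the openai module.
def trim_history(msg_stack: list[dict], max_messages: int = 6) -> list[dict]:
    """Keep any leading system message plus the most recent turns."""
    system = [m for m in msg_stack if m["role"] == "system"][:1]
    rest = [m for m in msg_stack if m["role"] != "system"]
    return system + rest[-max_messages:]

history = [{"role": "system", "content": "You are helpful."}]
for i in range(10):
    history.append({"role": "user", "content": f"question {i}"})
    history.append({"role": "assistant", "content": f"answer {i}"})

trimmed = trim_history(history)
print(len(trimmed))  # → 7  (1 system message + the last 6 turns)
```

Dropping old turns loses information, of course; summarizing them into a single message is a common refinement.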
Adding Tools
- Goal : make the AI give purpose-oriented answers when coding
- not a text-based answer but an answer based on API documents
- How ? tools=TOOLS / "role": "tool" / "tool_call_id": tool_call.id / "tool_calls" …
- Give the tool schema to the AI
- Check whether response.choices[0].message.content is None; if it is, the answer used tools and the tool mapping
- Check response.choices[0].message.tool_calls, and then give feedback to the AI
- Append feedback two times:
- the functions and arguments the AI gave us
- the values those functions returned after running them
- Register the functions : see the FUNCTION_MAP variable below
import json

import openai
from openai.types.chat import ChatCompletionMessage

client = openai.OpenAI()
messages = []

def get_weather(city):
    return "33 degrees celsius."

# Map tool names to the actual python functions
FUNCTION_MAP = {
    "get_weather": get_weather,
}

# Tool schema that tells the model which functions it can call
TOOLS = [
    {
        "type": "function",
        "function": {
            "name": "get_weather",
            "description": "Get the current weather for a city.",
            "parameters": {
                "type": "object",
                "properties": {
                    "city": {"type": "string"},
                },
                "required": ["city"],
            },
        },
    },
]

def process_ai_response(message: ChatCompletionMessage):
    if message.tool_calls:
        # 1st feedback : echo the assistant message with its tool calls
        messages.append(
            {
                "role": "assistant",
                "content": message.content or "",
                "tool_calls": [
                    {
                        "id": tool_call.id,
                        "type": "function",
                        "function": {
                            "name": tool_call.function.name,
                            "arguments": tool_call.function.arguments,
                        },
                    }
                    for tool_call in message.tool_calls
                ],
            }
        )
        # 2nd feedback : run each function and return its result to the model
        for tool_call in message.tool_calls:
            function_name = tool_call.function.name
            arguments = tool_call.function.arguments
            print(f"Calling function: {function_name} with {arguments}")
            try:
                arguments = json.loads(arguments)
            except json.JSONDecodeError:
                arguments = {}
            function_to_run = FUNCTION_MAP.get(function_name)
            result = function_to_run(**arguments)
            print(f"Ran {function_name} with args {arguments} for a result of {result}")
            messages.append(
                {
                    "role": "tool",
                    "tool_call_id": tool_call.id,
                    "name": function_name,
                    "content": result,
                }
            )
        call_ai()
    else:
        messages.append({"role": "assistant", "content": message.content})
        print(f"AI: {message.content}")

def call_ai():
    response = client.chat.completions.create(
        model="gpt-4o-mini",
        messages=messages,
        tools=TOOLS,
    )
    process_ai_response(response.choices[0].message)
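The dispatch step inside process_ai_response can be exercised without calling the API at all, since the model only hands back a function name and a JSON string of arguments. A sketch under that assumption (`run_tool` and the argument string are made up for illustration):

```python
# Sketch: the dispatch step from process_ai_response in isolation -
# parse the JSON arguments string the model returns and run the mapped
# function. run_tool is a hypothetical helper.
import json

def get_weather(city):
    return "33 degrees celsius."

FUNCTION_MAP = {"get_weather": get_weather}

def run_tool(function_name: str, arguments_json: str) -> str:
    try:
        arguments = json.loads(arguments_json)
    except json.JSONDecodeError:
        arguments = {}
    function_to_run = FUNCTION_MAP.get(function_name)
    if function_to_run is None:
        return f"Unknown tool: {function_name}"
    return function_to_run(**arguments)

print(run_tool("get_weather", '{"city": "Athens"}'))  # → 33 degrees celsius.
```

Keeping this step separate also makes it easy to unit-test your tools before wiring them to the model.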
CrewAI
Terminology
- crew : a group of agents
- agent : a role-playing worker that helps produce an answer suited to the question's purpose
- task : a unit of work assigned to an agent, with a description and an expected output
Basic workflow of CrewAI
- Use decorators : CrewBase / agent / task / crew
- The CrewBase decorator collects agents and tasks automatically
import dotenv

dotenv.load_dotenv()

from crewai import Crew, Agent, Task
from crewai.project import CrewBase, agent, task, crew

@CrewBase
class TranslatorCrew:

    @agent
    def translator_agent(self):
        return Agent(
            config=self.agents_config["translator_agent"],
        )

    @task
    def translate_task(self):
        return Task(
            config=self.tasks_config["translate_task"],
        )

    @task
    def retranslate_task(self):
        return Task(
            config=self.tasks_config["retranslate_task"],
        )

    @crew
    def assemble_crew(self):
        return Crew(
            agents=self.agents,  # the CrewBase decorator collects agents defined in the class
            tasks=self.tasks,  # the CrewBase decorator collects tasks defined in the class
            verbose=True,
        )

TranslatorCrew().assemble_crew().kickoff(
    inputs={
        "sentence": "I'm Nico and I like to ride my bicicle in Napoli",
    }
)
- Use config yaml files
- This is fed into the python code
translate_task:
  description: >
    Translate {sentence} from English to Italian without making mistakes.
  expected_output: >
    A well formatted translation from English to Italian using proper
    capitalization of names and places.
  agent: translator_agent

retranslate_task:
  description: >
    Translate {sentence} from Italian to Greek without making mistakes.
  expected_output: >
    A well formatted translation from Italian to Greek using proper
    capitalization of names and places.
  agent: translator_agent
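The {sentence} placeholder in the yaml above is filled from the inputs dict passed to kickoff(). Plain str.format shows the same kind of substitution; this is an illustration of the templating, not CrewAI's internal code:

```python
# The task description from the yaml, with its {sentence} placeholder
description = "Translate {sentence} from English to Italian without making mistakes."

# kickoff(inputs={"sentence": ...}) performs this kind of substitution
filled = description.format(sentence="I'm Nico and I like to ride my bicicle in Napoli")
print(filled)
```

This is why the key in inputs must match the placeholder name used in the yaml config.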