MLLM is a library for simplifying communication with various Large Language Models (LLMs) and multi-modal LLMs, such as OpenAI GPT, Anthropic Claude, and Google Gemini. It lets you use `RoleMessage` and `RoleThread` from Threadmem. So, you can also create the thread using MLLM:
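Here is a minimal sketch of creating and populating a thread through MLLM. It assumes `RoleThread` can be imported from the `mllm` package and that `post()` accepts `role`, `msg`, and an optional `images` list, following Threadmem's conventions; the owner and image values are placeholders.

```python
from mllm import RoleThread  # RoleThread comes from Threadmem (assumed re-export path)

# Create a role-based chat thread
thread = RoleThread(owner_id="user@example.com")

# Post a user message; images can be URLs or base64 data URIs
thread.post(
    role="user",
    msg="Describe what you see in this image",
    images=["https://example.com/photo.png"],
)
```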
You can also ask for a structured response by using the `expect` parameter in the `router.chat()` call:
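The sketch below rests on a few assumptions: the router is constructed with a `preference` list of model names, `expect` takes a Pydantic model class describing the desired output, and the validated result is exposed as `response.parsed`. The `Animal` schema and the image URL are purely illustrative.

```python
from pydantic import BaseModel
from mllm import Router, RoleThread  # Router constructor arguments below are assumptions

class Animal(BaseModel):
    species: str
    color: str

# Assumed: the router takes an ordered preference list of model names;
# provider API keys are typically read from environment variables (e.g., OPENAI_API_KEY)
router = Router(preference=["gpt-4-turbo", "claude-3-opus-20240229", "gemini-pro-vision"])

thread = RoleThread()
thread.post(
    role="user",
    msg=f"What animal is in this image? Reply with JSON matching this schema: {Animal.model_json_schema()}",
    images=["https://example.com/cat.png"],
)

# Pass the expected schema so the reply can be validated against it
response = router.chat(thread, expect=Animal)
animal = response.parsed  # assumed attribute holding the parsed Animal instance
print(animal.species, animal.color)
```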