A Private AI Assistant
Large Language Models (LLMs) use neural networks to identify and learn patterns from vast amounts of text. Simply put, they read so much text that, given the beginning of a sentence, they can predict with high accuracy how it continues.
This ability to predict the next word of a given sentence has opened the door to a whole new interface: interacting with machines using natural language.
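To make the idea of next-word prediction concrete, here is a deliberately tiny sketch. It is not how LLMs work internally (they use neural networks, not raw counts); it only illustrates the underlying task: observe which words follow which, then guess the most likely continuation. All names here are illustrative.

```java
import java.util.*;

// Toy next-word predictor: count word bigrams in a small corpus and
// predict the most frequent follower. Real LLMs learn far richer
// statistics with neural networks, but the task is the same.
public class BigramPredictor {
    private final Map<String, Map<String, Integer>> counts = new HashMap<>();

    public void train(String text) {
        String[] words = text.toLowerCase().split("\\s+");
        for (int i = 0; i < words.length - 1; i++) {
            counts.computeIfAbsent(words[i], k -> new HashMap<>())
                  .merge(words[i + 1], 1, Integer::sum);
        }
    }

    // Returns the most frequently observed word after the given one.
    public String predictNext(String word) {
        Map<String, Integer> followers = counts.get(word.toLowerCase());
        if (followers == null) return null;
        return Collections.max(followers.entrySet(),
                Map.Entry.comparingByValue()).getKey();
    }

    public static void main(String[] args) {
        BigramPredictor p = new BigramPredictor();
        p.train("the cat sat on the mat the cat ran away");
        System.out.println(p.predictNext("the")); // "cat" follows "the" most often
    }
}
```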
A.M.I.C.A.
What does it take to build a system where multiple agents can interact asynchronously, respond to user queries, and take proactive action when things change around them?
Many articles already explain how to build an agent with existing libraries and SDKs. Here, I’ll cover what else you need to create a multi-agent system: a future-proof architecture that supports asynchronous communication, works with any tooling or models, and runs locally on varied hardware.
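As a minimal sketch of what "asynchronous communication" between agents can mean, each agent could own an inbox queue that other agents write to without blocking on a reply. This is a hypothetical illustration, not the actual A.M.I.C.A. message bus; the class and agent names are invented for the example.

```java
import java.util.*;
import java.util.concurrent.*;

// Minimal sketch of asynchronous agent messaging. Each agent owns an
// inbox; senders enqueue messages and never wait for a response, so
// agents can work and react independently of one another.
public class AgentBus {
    record Message(String from, String to, String body) {}

    private final Map<String, BlockingQueue<Message>> inboxes = new ConcurrentHashMap<>();

    public void register(String agent) {
        inboxes.put(agent, new LinkedBlockingQueue<>());
    }

    // Fire-and-forget delivery: the sender does not block.
    public void send(Message m) {
        inboxes.get(m.to()).offer(m);
    }

    // An agent drains its own inbox whenever it is ready to work.
    public Message receive(String agent) throws InterruptedException {
        return inboxes.get(agent).take();
    }

    public static void main(String[] args) throws InterruptedException {
        AgentBus bus = new AgentBus();
        bus.register("planner");
        bus.register("researcher");
        bus.send(new Message("planner", "researcher", "find recent papers"));
        Message m = bus.receive("researcher");
        System.out.println(m.from() + " -> " + m.to() + ": " + m.body());
    }
}
```

Decoupling agents through queues like this is what lets the system stay responsive while individual agents take arbitrarily long to think or act.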
Prerequisites
This is a dense article that introduces many concepts without fully defining them. Some familiarity with LLMs, the concept of agentic AI, and how to build agents is advised, in particular with LangChain4J and Jlama.
How Deep Learning Works
The field of Artificial Intelligence began to develop after World War II (the name was coined in 1956). But it wasn’t until the late 20th century that widespread applications began to emerge, with the first spam detectors and recommendation engines.
In the 2020s, the emergence of Generative Artificial Intelligence appears poised to revolutionize the way people interact with technology.
How Semantic Search Works
My most recent side project was a proof-of-concept multi-agent platform called A.M.I.C.A. Among other features, it provides centralized tool management, allowing agents to search for relevant tools when handling user requests.
The initial implementation of the Tool Manager interface was the LuceneToolManager class, which used an in-memory Lucene index to search for the tools relevant to a user prompt.
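The real LuceneToolManager relies on Lucene's inverted index, analyzers, and scoring. To keep this sketch dependency-free, here is a simplified version of the same idea: rank tools by how many of the prompt's words appear in their descriptions. The class and tool names are illustrative, not the actual A.M.I.C.A. API.

```java
import java.util.*;
import java.util.stream.*;

// Dependency-free sketch of keyword-based tool search. The real
// implementation uses an in-memory Lucene index with proper analysis
// and relevance scoring; here we simply rank tools by word overlap
// between the prompt and each tool's description.
public class KeywordToolSearch {
    record Tool(String name, String description) {}

    private static Set<String> tokens(String text) {
        return Arrays.stream(text.toLowerCase().split("\\W+"))
                     .filter(w -> !w.isEmpty())
                     .collect(Collectors.toSet());
    }

    // Returns tools ordered by word overlap with the prompt, best first.
    public static List<Tool> search(List<Tool> tools, String prompt) {
        Set<String> query = tokens(prompt);
        return tools.stream()
                .sorted(Comparator.comparingLong(
                        (Tool t) -> tokens(t.description()).stream()
                                .filter(query::contains).count())
                        .reversed())
                .collect(Collectors.toList());
    }

    public static void main(String[] args) {
        List<Tool> tools = List.of(
                new Tool("weather", "get the current weather for a city"),
                new Tool("calendar", "list upcoming calendar events"));
        System.out.println(search(tools, "what is the weather in Rome").get(0).name());
    }
}
```

The limitation of this approach is also visible here: it only matches exact words, so a prompt like "will it rain tomorrow" would never find the weather tool. That gap is what motivates the semantic-search implementation discussed next.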
After publishing the article, I thought it would make sense to create another implementation based on semantic search, since it is a more powerful mechanism for identifying relationships between text fragments.
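Semantic search compares dense vector embeddings rather than raw words: texts with similar meaning end up with vectors pointing in similar directions, even when they share no vocabulary. The sketch below assumes the embeddings are already available (in practice they would come from an embedding model, for example via LangChain4J); cosine similarity then ranks the candidates. The toy 3-dimensional vectors are invented for illustration.

```java
// Sketch of the core of semantic search: cosine similarity between
// embedding vectors. Real embeddings have hundreds of dimensions and
// are produced by a model; these 3-dimensional vectors are toys.
public class CosineSimilarity {
    public static double cosine(double[] a, double[] b) {
        double dot = 0, na = 0, nb = 0;
        for (int i = 0; i < a.length; i++) {
            dot += a[i] * b[i];   // dot product
            na  += a[i] * a[i];   // squared norm of a
            nb  += b[i] * b[i];   // squared norm of b
        }
        return dot / (Math.sqrt(na) * Math.sqrt(nb));
    }

    public static void main(String[] args) {
        double[] query = {1.0, 0.2, 0.0};
        double[] docA  = {0.9, 0.3, 0.1};  // similar direction -> high score
        double[] docB  = {0.0, 0.1, 1.0};  // different direction -> low score
        System.out.printf("A: %.3f, B: %.3f%n",
                cosine(query, docA), cosine(query, docB));
    }
}
```

Ranking tool descriptions by this score instead of keyword overlap is what lets a prompt like "will it rain tomorrow" match a tool described as "get the current weather", even though they share no words.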