
Exploring AI Memory on the AI Engineering Podcast

Late last year, I joined Tobias Macey on the AI Engineering Podcast. Tobias is a familiar face in the data engineering world. He’s also the creator and host of the widely followed Data Engineering Podcast and the AI Engineering Podcast, where he explores the cutting edge of AI and machine learning by interviewing industry experts.

In this episode, we discussed a topic that is foundational to advancing AI application development: giving Large Language Models (LLMs) memory.

Why Memory Matters in AI

LLMs are impressive in their ability to generate text, but their memory is limited to the information you feed them in a single session, constrained by their context window. This limitation makes it hard for them to handle complex, multi-turn interactions or recall important details across conversations.

During the episode, I introduced how semantic memory systems, like cognee, can solve these challenges. By structuring memory to manage both short-term and long-term context, we can create AI that behaves more naturally, with the ability to "remember" relevant information just like humans do.
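To make the short-term versus long-term split concrete, here is a deliberately simplified, hypothetical sketch in Python. The class and method names are illustrative only and are not cognee's API.

```python
from collections import deque


class TwoTierMemory:
    """Illustrative two-tier memory: a bounded short-term buffer plus a
    persistent long-term store keyed by topic (not cognee's implementation)."""

    def __init__(self, short_term_size: int = 10):
        # Short-term: only the most recent turns, mirroring a context window.
        self.short_term = deque(maxlen=short_term_size)
        # Long-term: facts that should survive across sessions.
        self.long_term: dict[str, list[str]] = {}

    def observe(self, message: str) -> None:
        """Record a conversation turn in short-term memory."""
        self.short_term.append(message)

    def remember(self, topic: str, fact: str) -> None:
        """Promote an important fact to long-term memory under a topic."""
        self.long_term.setdefault(topic, []).append(fact)

    def recall(self, topic: str) -> list[str]:
        """Combine recent turns with any long-term facts about the topic."""
        return list(self.short_term) + self.long_term.get(topic, [])


# Usage: keep recent turns in the buffer, promote durable facts to long-term.
memory = TwoTierMemory(short_term_size=3)
memory.observe("User: I prefer summaries in bullet points.")
memory.remember("preferences", "User prefers bullet-point summaries.")
print(memory.recall("preferences"))
```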

What is Semantic Memory?

Inspired by cognitive psychology, cognee organizes memory into two key components: a short-term working context for the current interaction and a long-term store that persists across sessions.

One of the approaches we discussed was hierarchical memory, which organizes data into layers.

This layered structure allows LLMs to retain and access information more effectively, improving their accuracy and contextual understanding. You can find a more detailed look at memory in AI systems in our previous blog post.
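As a rough picture of how such a layered memory gets built and queried in practice, here is a minimal sketch of cognee's high-level flow: ingest raw data, "cognify" it into structured memory, then search it. Exact signatures and parameter names (such as the `query_text` argument) are assumptions that may differ between cognee versions; see the GitHub tutorials for the current API.

```python
import asyncio

import cognee  # pip install cognee; assumes an LLM API key is configured via environment variables


async def main():
    # Ingest raw content into cognee's data layer.
    await cognee.add("Tobias Macey hosts the AI Engineering Podcast.")

    # Build the memory layers: extract entities and relationships into a
    # knowledge graph alongside vector embeddings.
    await cognee.cognify()

    # Query the structured memory; the parameter name is an assumption and
    # may vary by cognee version.
    results = await cognee.search(query_text="Who hosts the AI Engineering Podcast?")
    for result in results:
        print(result)


if __name__ == "__main__":
    asyncio.run(main())
```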

Real-World Applications

During the conversation, I shared several real-world use cases where cognee is making an impact.

These applications highlight how semantic memory goes beyond traditional vector storage by incorporating graph-based structures to model relationships and retrieve data more intelligently.
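To illustrate what "beyond traditional vector storage" can look like, here is a hypothetical sketch (not cognee's internals) that pairs cosine-similarity lookup over embeddings with a one-hop expansion over a small knowledge graph, so directly related entities are returned even when their vectors are not the nearest matches.

```python
import networkx as nx
import numpy as np


def cosine(a: np.ndarray, b: np.ndarray) -> float:
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))


# Toy embedding store: entity name -> vector (in practice, from an embedding model).
embeddings = {
    "cognee": np.array([0.9, 0.1, 0.0]),
    "semantic memory": np.array([0.8, 0.2, 0.1]),
    "knowledge graph": np.array([0.1, 0.9, 0.2]),
}

# Toy knowledge graph linking entities through typed relationships.
graph = nx.Graph()
graph.add_edge("cognee", "knowledge graph", relation="built_on")
graph.add_edge("cognee", "semantic memory", relation="implements")


def retrieve(query_vec: np.ndarray, top_k: int = 1) -> set[str]:
    # Step 1: vector search for the closest entities.
    ranked = sorted(embeddings, key=lambda name: cosine(query_vec, embeddings[name]), reverse=True)
    seeds = ranked[:top_k]
    # Step 2: graph expansion pulls in directly related entities, capturing
    # relationships that pure vector similarity would miss.
    expanded = set(seeds)
    for seed in seeds:
        expanded.update(graph.neighbors(seed))
    return expanded


# A query close to "cognee" also surfaces "knowledge graph" and "semantic memory".
print(retrieve(np.array([0.85, 0.15, 0.05])))
```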

Challenges and the Future of AI Memory

We also explored some of the challenges involved in building effective memory systems.

Listen to the Episode

If you’re interested in how AI memory works and the technical challenges behind building it, this episode is a must-listen. Tobias’s thoughtful questions and deep expertise made it a truly engaging discussion.

🎧 Listen to the AI Engineering Podcast here.


Whether you're exploring semantic memory, building personalized AI agents, or integrating AI into your workflows, the cognee community is here to support you.



Try cognee today, and check out our tutorials on GitHub.

Written by: Vasilije Markovic, Co-founder / CEO