
AI is changing very fast, and Large Language Models (LLMs) have made things a lot easier. Yet, they face a fundamental limitation: they forget every...
Great share Anmol! I will bookmark this and try it out.
Thanks Ndeye! This took a huge effort because there were no docs, so I had to understand every part of the codebase to write it lol.
That is very impressive. Do you do content writing full time?
yeah pretty much.. I'm not a big fan of full-time jobs for some reason so I prefer doing this on a freelance basis.
Got it! That makes sense!
Really nice breakdown @anmolbaranwal
Thanks Saurabh! There were no docs this time, so I had to figure everything out from the codebase... it was pretty tough.
First time?
nah, but the codebase felt a bit large this time + the deadline was really tight.. that image reminds me of Jack Sparrow in Pirates of the Caribbean (yeah, I can tell you have been through the same experience)
We learn this way mate!
Imagine going through Java 6 code with a Freemarker template that was last updated in 2013, and there's no way that thing builds on my local PC.
This is too awesome!!!
Thank you!
I'll be trying it all out after exams!
thanks Divya! the concept is definitely exciting.
It really is.
Thank you for sharing.
Really appreciate the clear breakdown of the local setup; it's well-structured and easy to follow. The behind-the-scenes architecture of OpenMemory MCP is particularly interesting, especially how it manages context and client communication through SSE. Looking forward to exploring more of its potential for local-first, privacy-focused AI workflows!
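For anyone curious about the SSE side mentioned above, here is a minimal sketch of how a client could connect to a locally running MCP server over SSE and list its tools. It assumes the official `mcp` Python SDK, and the endpoint URL is a hypothetical placeholder, not necessarily OpenMemory's actual route; check the article or the Mem0 docs for the real one.

```python
# Minimal sketch: connect to a local MCP server over SSE and list its tools.
# Assumes the official `mcp` Python SDK; SSE_URL is a placeholder endpoint.
import asyncio

from mcp import ClientSession
from mcp.client.sse import sse_client

SSE_URL = "http://localhost:8765/mcp/sse"  # hypothetical local endpoint


async def main() -> None:
    # Open the SSE transport, then run the MCP handshake over it.
    async with sse_client(SSE_URL) as (read_stream, write_stream):
        async with ClientSession(read_stream, write_stream) as session:
            await session.initialize()
            tools = await session.list_tools()
            for tool in tools.tools:
                print(tool.name, "-", tool.description)


if __name__ == "__main__":
    asyncio.run(main())
```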
What do you mean, Nevo? Sorry, I didn't quite understand... Mem0 launched OpenMemory MCP and this tutorial breaks it down.
Crazy how far this is going, I kinda love the idea of keeping memory local and all under my control.
Yeah, you're totally in control of your data, and it also gives a sense of trust. You can read more here: mem0.ai/openmemory-mcp
So much wild stuff is happening in tech right now, it's exciting :)