Out of curiosity, are you considering getting more RAM with your next computer to run local LLMs in the future?
