Hacker News

MacBook Pro 2020 with 16 GB of system RAM. I think the GPU is Iris Plus? But I don't much keep up on those.

I'm now digging into getting this running in the Terminal... there are a few things I want to try that I don't think the simple interface allows.
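For anyone else trying the same thing, this is roughly the Terminal route at the time of this thread (a sketch, not a definitive recipe: the model path is a placeholder, and it assumes git and Xcode command-line tools are installed):

```shell
# clone and build llama.cpp from source
git clone https://github.com/ggerganov/llama.cpp
cd llama.cpp
make

# run a GGUF model interactively from the command line;
# ./models/your-model.gguf is a placeholder path,
# -c sets the context size and -i enables interactive mode
./main -m ./models/your-model.gguf -c 4096 -i
```

Building from source also makes it easy to pull in upstream fixes later, which is relevant to the reply below.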

Also, I've noticed that when chats get a few kilobytes long, it just seizes up and can't go any further. I complained to it, it spent a sentence apologizing, picked up where it left off... and got about 12 words further.



Hm, yeah, I think I need to update llama.cpp to pick up this fix: https://github.com/ggerganov/llama.cpp/pull/3996

Thanks for trying it!



