New generative AI breakthroughs seem to be happening almost every week. The big question is: are we ready for them? Noted Science Zaddy Kyle Hill explains ...
Ollama is actually pretty decent now, and comparable in speed to ChatGPT on a moderately busy day. I’m enjoying having a constant rubber duck to bounce ideas off.
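For anyone curious what "bouncing ideas off" a local model looks like in practice, here's a minimal sketch against Ollama's local HTTP generate endpoint. Assumptions: the default port 11434, a model name like `llama3` that you've already pulled, and only the standard library.

```python
import json
import urllib.request

# Ollama's default local endpoint (assumes `ollama serve` is running).
OLLAMA_URL = "http://localhost:11434/api/generate"

def build_generate_request(model: str, prompt: str) -> dict:
    """Build the JSON body for a single non-streaming generation call."""
    return {"model": model, "prompt": prompt, "stream": False}

def ask(model: str, prompt: str) -> str:
    """Send the prompt to the local Ollama server and return its reply text."""
    body = json.dumps(build_generate_request(model, prompt)).encode("utf-8")
    req = urllib.request.Request(
        OLLAMA_URL, data=body, headers={"Content-Type": "application/json"}
    )
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read())["response"]

# Example (needs the server running and the model pulled):
# print(ask("llama3", "Explain rubber-duck debugging in one sentence."))
```

Nothing fancy, but that's the whole loop: one POST per question, answer comes back as JSON.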
If your concern is that we’re “not getting anything” in exchange for the training data AI trainers have gleaned from your postings, then those open-source AIs are what you should be taking a look at. IMO they’re well worth the trade.
I’ve been playing with a locally installed instance of big-AGI; I really like the UI, but it’s missing the RAG part. I’m also cobbling my own together, for fun and not profit, to try to stay relevant in these hard times. LangChain is some wild stuff.
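For anyone wondering what the "RAG part" actually involves, here's a rough plain-Python sketch of the core idea, not LangChain's API: score your document chunks against the question, then stuff the best ones into the prompt before it goes to the LLM. The toy word-overlap score and the sample docs are made up for illustration; real pipelines use embeddings and a vector store.

```python
def score(question: str, chunk: str) -> int:
    """Toy relevance score: count of shared lowercase words (real RAG uses embeddings)."""
    return len(set(question.lower().split()) & set(chunk.lower().split()))

def retrieve(question: str, chunks: list[str], k: int = 2) -> list[str]:
    """Return the k chunks most relevant to the question."""
    return sorted(chunks, key=lambda ch: score(question, ch), reverse=True)[:k]

def build_prompt(question: str, chunks: list[str]) -> str:
    """Put the retrieved context ahead of the question, ready for the model."""
    context = "\n".join(retrieve(question, chunks))
    return f"Use this context to answer.\n\nContext:\n{context}\n\nQuestion: {question}"

# Toy corpus; in practice these would be chunks of your own documents.
docs = [
    "Ollama serves local language models over an HTTP API.",
    "Rubber duck debugging means explaining your problem out loud.",
    "RAG retrieves relevant document chunks and adds them to the prompt.",
]
print(build_prompt("How does RAG use document chunks?", docs))
```

Swap in real embeddings and a proper store and you've basically rebuilt what the frameworks give you, which is why rolling your own is such a good way to learn it.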
That’s cool. I haven’t looked at any local/FOSS LLMs or other generators, largely because I don’t have a use case for them.
It’s worth playing around with. This one is a good start; it packages all the basics, including RAG:
https://github.com/imartinez/privateGPT