It takes less power to run my local model than it does to play Baldur’s Gate 3, but I haven’t seen anybody shaming people for playing games. Not every LLM is run by a giant wasteful cloud provider; many are open source and self-hosted.
It’s kind of like saying all vehicles are enormous gas-guzzling pickup trucks and therefore nobody should travel anywhere. Self-hosting on a PC you already own is more like riding a bike in this metaphor.
Show me the percentage of AI prompts done on local models, and if it’s more than a rounding error, I’ll eat my hat.
Also, was your local model trained locally?
What percentage of trips are done on a bicycle?
I’m working on software to help more people do it, but I fear that anti-AI sentiment has lost focus on the problem. Local models are super useful for assistance with code, writing, and all sorts of general tasks. I’ve been working on a tool that lets you tell the computer what you want in plain language, and it generates the corresponding shell command with an explanation of how it works.
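For a rough idea of how a tool like that can work, here is a minimal sketch. It assumes a local model served behind an Ollama-style HTTP chat endpoint; the function names, the prompt wording, and the endpoint URL in the comment are my own illustrative choices, not the actual tool.

```python
import json

# Instruction telling the model to answer in a machine-parseable shape.
SYSTEM_PROMPT = (
    "You translate plain-language requests into a single shell command. "
    'Reply only with JSON: {"command": "...", "explanation": "..."}'
)

def build_request(user_request: str, model: str = "llama3") -> dict:
    """Build the JSON body for an Ollama-style /api/chat endpoint."""
    return {
        "model": model,
        "messages": [
            {"role": "system", "content": SYSTEM_PROMPT},
            {"role": "user", "content": user_request},
        ],
        "stream": False,
    }

def parse_reply(raw: str) -> tuple[str, str]:
    """Extract (command, explanation) from the model's JSON reply."""
    data = json.loads(raw)
    return data["command"], data["explanation"]

# The actual HTTP call (e.g. POSTing build_request(...) to
# http://localhost:11434/api/chat) is omitted here; we just parse a
# canned reply of the expected shape.
reply = '{"command": "du -sh * | sort -h", "explanation": "Show the size of each item, sorted."}'
cmd, why = parse_reply(reply)
print(cmd)  # du -sh * | sort -h
```

The point of asking for JSON is that the command and the explanation come back as separate fields, so the tool can show the explanation and let the user confirm before anything runs.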
Going back to my analogy, it doesn’t matter what you do with your special case of LLM tech. What matters is that most people are using it in a destructive way. People do not care if you use your massive pickup truck to cure cancer; they see a dumb pickup truck and assume you voted for Trump. So I’m pretty sure my analogy stands.
And the answer to the training question is a “no”, I presume.
Yes, I agree, but I think it does matter where we go from here. We could say all vehicles are bad, or we could focus on the source of the problem. Corporate AI is what’s using all the electricity and water; it’s what’s creating the worst issues.
Most open-source models don’t need additional training; they’re already plenty good for most plain-language tasks, and the weights are all free to use. Why would I waste power doing my own training when the public options are perfectly adequate?
Do you know how much energy, and from what sources, was used to train your model?
No, though I’m not really sure how it makes a difference. If I used a different model that was made using fewer resources, what would be improved? Both already exist; using one over the other would not save any energy.
Edit: To go back to the BG3 comparison, I don’t know how many resources were used to make the game, or what sources the developers pulled from. I play it because it’s fun, and another person enjoying it doesn’t cost the world anything (except a little electricity).