

But everyone on Lemmy said LLMs had no usecases


That was 13 years ago; I’m also curious how much of that relates to world events vs. age


Desert boy not having a great vocabulary tho


Depends on the quantization, but it’s still fairly beefy; I couldn’t run it on my homelab with a 3080 Ti, for example.
I generally use smaller 8–12B models and they’re alright depending on the task.
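For a rough sense of what fits: a hypothetical back-of-the-envelope estimate (parameter count × bits per weight, plus some overhead for KV cache and activations) shows why a big model won’t squeeze into a 3080 Ti’s 12 GB even heavily quantized, while 8–12B models at 4-bit do. The overhead factor here is an assumption; real usage varies with context length and runtime.

```python
# Back-of-the-envelope VRAM estimate: params * bits-per-weight / 8 bytes,
# times an assumed ~20% overhead for KV cache and activations.
# Rough sketch only; real usage depends on context length and runtime.

def vram_gb(params_billion: float, bits_per_weight: float,
            overhead: float = 1.2) -> float:
    """Approximate VRAM (GB) needed to load and run a model."""
    bytes_for_weights = params_billion * 1e9 * bits_per_weight / 8
    return bytes_for_weights * overhead / 1e9

GPU_VRAM_GB = 12  # e.g. a 3080 Ti

for params, quant_bits in [(70, 4), (12, 4), (8, 8)]:
    need = vram_gb(params, quant_bits)
    verdict = "fits" if need <= GPU_VRAM_GB else "does not fit"
    print(f"{params}B @ {quant_bits}-bit: ~{need:.1f} GB -> {verdict} in {GPU_VRAM_GB} GB")
```

By this estimate a 70B model at 4-bit still wants ~40 GB, while a 12B model at 4-bit lands around 7 GB, which matches the “smaller 8–12B models” experience above.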


Eh, yes, you’re right, but seriously, what are we gaining from this mission? As far as I’ve heard there’s nothing really new that we’re gaining from it, just… because?
To the OP’s point, there are a multitude of more beneficial projects in the same realm that the money could have gone toward instead.


But they also should visit and be able to interact with prisons lmao.


The fact is, voting for the lesser of two evils is how we got to this point.
Edit: Before you assume, yes, I voted for Biden and Harris in 2020 and 2024.


Proxmox is another option


You don’t think they’d happily target Lemmy if it were larger? It’s still “social media” to them


Except I’m sure they’d charge out the ass and they don’t seem to put any effort into gaming 🤷


Why did you call them LAMP?


I had a friend who gamed on a Mac for a while during that period; most games did work for her.
I do think the M-series chips set it back a bit, though, because most games aren’t targeting ARM, so you have to run them through Rosetta’s x86 translation, which reduces performance.


Does Proxmox count? Then I run lots of Docker containers in LXCs
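A sketch of what that setup looks like with Proxmox’s `pct` CLI; the VMID, template name, and resource sizes below are illustrative, not a recommendation.

```shell
# Create an unprivileged container (VMID, template, and sizes are examples):
pct create 120 local:vztmpl/debian-12-standard_12.7-1_amd64.tar.zst \
  --hostname docker-host --memory 2048 \
  --net0 name=eth0,bridge=vmbr0,ip=dhcp \
  --unprivileged 1

# Enable nesting so Docker can run inside the LXC:
pct set 120 --features nesting=1,keyctl=1

pct start 120
# Then install Docker inside the container as usual.
```

The `nesting=1` feature is the key bit; without it, containers-in-a-container generally won’t start.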


I think their motive was internal usage and the classic embrace/extend/extinguish; plus, GitHub was generally well liked among its users, so maybe it gives MS a bit of a boost on that front.
I can’t imagine they knew what was coming with LLMs, but I could definitely be wrong.
Oh, and they offer it to businesses, so it’s another feather in their MS365 ecosystem cap.


They bought it 8 years ago lol


As someone who uses the slop machine: completely agree. It might help improve them further, and if you don’t want to use it, move to Forgejo or similar (I did that too). If you still want AI help, try learning how to host your own locally, if your GPU can swing it.


Basically just reads like concern trolling to me 🤷


They mean that games mainly work on Windows first, not so much that gamers specifically want Windows.


You can do it locally now pretty easily depending on your use case and hardware; Hugging Face has all the models you’d need, and something like llama-swap handles serving and switching between them.
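A minimal sketch of that local setup; the repo and file names below are placeholders, so swap in whatever quantized GGUF actually fits your VRAM.

```shell
# Download a quantized GGUF from Hugging Face
# (repo and file names here are illustrative placeholders):
huggingface-cli download SomeOrg/SomeModel-GGUF somemodel.Q4_K_M.gguf \
  --local-dir ~/models

# Serve it with llama.cpp's llama-server (OpenAI-compatible HTTP endpoint):
llama-server -m ~/models/somemodel.Q4_K_M.gguf --port 8080

# llama-swap sits in front of several such llama-server commands and
# loads/unloads models on demand based on which one a request names.
```

Once it’s up, anything that speaks the OpenAI chat API can point at the local port instead of a hosted service.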