There could be many use cases, and some of them may even be legitimate, but I’m yet to observe any which have broad applicability, and they should only ever be wielded by a responsible, expert adult.
Oh boy, wonder what kind of gatekeeping would come out of locking tech away unless you’re a “responsible expert adult”!
Jesus Christ, not a single bit of thought went into what you said, did it?
You want to lock everyone but experts away? Who decides what an expert is, POTUS? The GOP? The treasonous institute known as Stanford?
Is it only people with degrees? People making a certain amount? People that can pay a large licensing fee?
What a stupid thing to say.
‘AI’ fluffers sure do love the taste of grift flavoured tokens.
I’d ask what you were thinking, but it’s clear that played no material role in this extrusion. Extrapolating the assertion I constrained to a specific topic to the entirety of ‘tech’ is a bellowing straw man.
Further, the exclusively US-centric examples of inappropriate stewards reveal a vantage squarely rooted inside that noxious bubble. The invocation of treason further betrays an affinity for national subservience.
To refine my original point: in my observation of LLMs in application, the only entities who find them impressive are those who expressly lack proven expertise in the area where they’re being applied. The correlation appears to be nearly linear, and inversely proportional.
LLMs could eventually prove innately useful, but there’s no indication they’re close to that, let alone traversing a relevant vector.
Personally, any world populated with entities who are impressed by LLMs is a world not worth living in.
its a multitool, its applicability neednt be broad imo. anyway, im glad we arent speaking in absolutes.