Political intervention is what started Google, so I don’t see the problem.
How about taking responsibility and just not using services that require it?
Google has shaped the web into what it is over decades so that they could maintain their position of power. This is the very essence and purpose of a monopoly. Yet here you are trying to blame anything but the monopoly for the monopoly’s existence.
Nothing like convincing hundreds of millions of people to abandon a company rather than put any pressure on the small group of greedy people who own it.
My experience was that Wikipedia was specifically called out as being especially unreliable, and that's just nonsense.
Let me clarify then. It's unreliable as a cited source in academia. I'm drawing parallels and criticizing the way people use ChatGPT, i.e., taking it at face value with zero caution and using it as if it's a primary source of information.
Eesh. The value of a tertiary source is that it cites the secondary sources (which cite the primary). If you strip that out, how’s it different from “some guy told me…”? I think your professors did a bad job of teaching you about how to read sources. Maybe because they didn’t know themselves. :-(
Did you read beyond the sentence that you quoted?
Here:
I can get summarized information about new languages and frameworks really quickly, and then I can dive into the official documentation when I have a high level understanding of the topic at hand.
Example: you're a junior developer trying to figure out what this JavaScript syntax is: const {x} = response?.data. It's difficult to figure out what destructuring and optional chaining are without knowing what they're called.
With ChatGPT, you can copy and paste that code and ask "tell me what every piece of syntax is in this line of JavaScript." Then you can check the official docs to learn more.
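For anyone who hasn't seen those two features before, here's a quick sketch of what that line does (the response object below is just a made-up placeholder, not from any real API):

```javascript
// A made-up stand-in for an API response.
const response = { data: { x: 42, y: 7 } };

// Destructuring: pulls the `x` property out of an object into a local variable.
// Optional chaining (?.): if `response` were null or undefined, `response?.data`
// would evaluate to undefined instead of throwing a TypeError.
const { x } = response?.data;
// x is now 42

// Caveat: optional chaining alone doesn't save you here, because
// `const { x } = undefined` still throws. A common pattern is to
// fall back to an empty object with the nullish coalescing operator (??).
const missing = null;
const { x: fallbackX } = missing?.data ?? {};
// fallbackX is undefined, and no error is thrown
```

That's exactly the kind of thing that's hard to search for when you don't know the terms "destructuring," "optional chaining," or "nullish coalescing" yet.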
I think the academic advice about Wikipedia was sadly mistaken.
Yeah, a lot of people had your perspective about Wikipedia while I was in college, but they are wrong, according to Wikipedia.
From the link:
We advise special caution when using Wikipedia as a source for research projects. Normal academic usage of Wikipedia is for getting the general facts of a problem and to gather keywords, references and bibliographical pointers, but not as a source in itself. Remember that Wikipedia is a wiki. Anyone in the world can edit an article, deleting accurate information or adding false information, which the reader may not recognize. Thus, you probably shouldn’t be citing Wikipedia. This is good advice for all tertiary sources such as encyclopedias, which are designed to introduce readers to a topic, not to be the final point of reference. Wikipedia, like other encyclopedias, provides overviews of a topic and indicates sources of more extensive information.
I personally use ChatGPT like I would Wikipedia. It’s a great introduction to a subject, especially in my line of work, which is software development. I can get summarized information about new languages and frameworks really quickly, and then I can dive into the official documentation when I have a high level understanding of the topic at hand. Unfortunately, most people do not use LLMs this way.
You shouldn’t cite Wikipedia because it is not a source of information, it is a summary of other sources which are referenced.
Right, and if an LLM is citing Wikipedia 47.9% of the time, that means that it’s summarizing Wikipedia’s summary.
You shouldn’t cite Wikipedia for the same reason you shouldn’t cite a library’s book report, you should read and cite the book itself.
Exactly my point.
Throughout most of my years of higher education as well as k-12, I was told that sourcing Wikipedia was forbidden. In fact, many professors/teachers would automatically fail an assignment if they felt you were using wikipedia. The claim was that the information was often inaccurate, or changing too frequently to be reliable. This reasoning, while irritating at times, always made sense to me.
Fast forward to my professional life today. I've been told on a number of occasions that I should trust LLMs to give me an accurate answer. I'm told that I will "be left behind" if I don't use ChatGPT to accomplish things faster. I'm told that my concerns about accuracy and ethics surrounding generative AI are simply "negativity."
These tools are (abstractly) referencing random users on the internet as well as Wikipedia and treating them both as legitimate sources of information. That seems crazy to me. How can we trust a technology that just references flawed sources from our past? I know there are ways to improve accuracy with things like RAG, but most people are hitting the LLM directly.
The culture around Generative AI should be scientific and cautious, but instead it feels like a cult with a good marketing team.
Don’t threaten me with a good time!
If anyone ever did this with my likeness after death, even with good intentions, I would haunt the fuck out of them.
Marketing hype is pushing anything with AI in the name, but it will all settle out eventually.
Agreed. "Use it or be left behind" itself sounds like a phrase straight out of a marketing pitch from every single "AI-centric" company that pushes their "revolutionary" product. It's a phrase that I hear daily from c-suite executives who know very little of what they're talking about. AI (specifically generative) has its use cases, but it's nowhere near where the marketing says it is. And when it finally does get there, I think people are going to be surprised when they don't find themselves in the utopia that they've been promised.
If fake experts on the internet get their jobs taken by the ai, it would be tragic indeed.
These two groups are not mutually exclusive
And to what end? Just to have a populace misinformed on literally every subject!
This is a feature, not a bug. We're entering a new dark age, and generative AI is the tool that will usher it in. The only "problem" generative AI is efficiently solving is a populace with too much access to direct and accurate information. We're watching as perfectly functional tools and services are rapidly replaced by something with inherent issues of reliability, ethics, and accountability.