

Way into “not the onion” territory, heh.
Social media isn’t a network; it’s a set of siloed attention farms.
That’s a very important distinction to make IMO. We’ve treated Big Tech like they’re the stewards of the internet for way too long. Facebook, Twitter, Discord, TikTok… those are the villains eating everyone’s attention.
Lemmy isn’t perfect, but it’s definitely more in the “old internet” bucket and more network-like. I love it! And I love that we’re chatting here. But only a tiny fraction of the population engages like this these days.
I dunno. It’s not like brilliant work isn’t being done now, but it really feels like the “average person’s” attention is totally consumed by the hype of social media and influencers, at least from my perspective in the United States. And that now includes high-profile politicians and institutional leaders heavily influenced by their feeds.
This is a great microcosm: https://lemmy.world/post/28290391
Look at all that engagement, for what is basically a group of Instagram influencers taking a high-altitude selfie ride! It utterly dwarfs any kind of “real” space mission now, much less the women who have gone up before. That couldn’t have happened in the 70s; the information environment simply wasn’t conducive to it.
And that has very real ramifications for scientists that need public funding.
I figured. Works for me on desktop Linux too.
AV1 support is just one more of those things Apple has been dense about.
Interesting. That’s what I was afraid of.
I guess I need to make a Lemmy.world support thread for that?
Urgh, imagine how the world would react if hard proof of life on another planet was found now. This second.
The internet would be filled with disinformation and clickbait about it in an instant. Your aunt/uncle and much of your social circle would totally misunderstand it and spread FUD they think they know all about. There would be conspiracy theories broadcast by world leaders. Somehow, it would get politicized. Scientists who dedicated their lives to this would be drowned out. And on the whole… after a few weeks, most people wouldn’t even care and would scroll on to the next controversy.
Contrast this with, like, 1970. Daily life would stop dead. People would huddle around TVs and radios, hanging on every last word from anchors and scientists… it would be a shared existential moment. It would start a new era.
Don’t get me wrong, I don’t mean to romanticize the Apollo era with all its racism, sexism, poverty, rivalry, abuses and so on, but in aggregate the reaction would be so much less shit.
Probably because it’s AV1.
I just noticed it doesn’t work on my iPhone (16), not even through the direct link: https://files.catbox.moe/foodzx.webm
But the VP9 one does. That’s… Disappointing.
Yeah, Book 3 of The Legend of Korra. That whole season is solid gold.
A longer clip with sound: https://files.catbox.moe/8z5ggp.webm
GitHub issue then? Hm, okay.
That’s a great thing about Lemmy, I suppose: there’s somewhere to go to discuss features that isn’t a black hole, lol.
Still, maybe Lemmy.world (and other instances?) should allow uploading of videos that are about the size of images. Maybe a limit of 768KB?
You mean you can’t see the embed in the above comment? What client?
Test:
EDIT: Seems to work! Didn’t realize I could embed videos like that, there’s no shortcut in the browser UI.
Yeah, I guess a link is fine, but can I embed them somehow?
EDIT: Embedding seems to work.
For reference, here are some examples.
A quick VP9 webm video, 1920x1080, 2.8MB: https://files.catbox.moe/l5yvor.webm
A smaller webm, 1920x1080, 1.1 MB and still looks fine: https://files.catbox.moe/foodzx.webm
The GIF: 400x225, 12 FPS, 5.6MB, optimized as best I can, yet it looks like a can of spam. But it’s what Lemmy would take:
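If anyone wants to reproduce the comparison, something like this produces a small VP9 webm (a sketch from memory, not the exact command I used; the input filename and quality value are placeholders):

```python
# Minimal VP9 webm encode via ffmpeg (requires ffmpeg built with libvpx-vp9).
# "clip.mkv" is a hypothetical input; raise -crf for smaller files.
import subprocess

subprocess.run([
    "ffmpeg", "-i", "clip.mkv",
    "-c:v", "libvpx-vp9",       # VP9 video in a webm container
    "-b:v", "0", "-crf", "40",  # constant-quality mode
    "-an",                      # drop audio to save bytes
    "clip.webm",
], check=True)
```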
I mean, “modest” may be too strong a word, but a 2080 Ti-ish workstation is not particularly exorbitant in the research space, especially considering the insane dataset size (years of noisy, raw space telescope data) they’re processing here.
Also, that’s not always true. Some “AI” models, especially old-school ones, run fine on old CPUs. There are also efforts (like BitNet) to run larger ones fast and cheaply.
I have no idea if it has any impact on the actual results though.
Is it a PyTorch experiment? Other than maybe different default data types on CPU, the results should be the same.
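If you want to double-check, a quick sanity test is to run the same weights and inputs on both devices and compare outputs (a sketch with a toy model, not theirs):

```python
# Compare CPU vs GPU outputs for identical weights and inputs.
# Tiny discrepancies come from float reduction order, not from bugs.
import torch

torch.manual_seed(0)
model = torch.nn.Sequential(
    torch.nn.Linear(64, 32), torch.nn.ReLU(), torch.nn.Linear(32, 1)
)
x = torch.randn(128, 64)

cpu_out = model(x)
if torch.cuda.is_available():
    gpu_out = model.to("cuda")(x.to("cuda")).cpu()
    print(torch.allclose(cpu_out, gpu_out, atol=1e-5))  # expect True
```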
That’s even overkill. A 3090 is pretty standard in the sanely priced ML research space. It’s the same architecture as the A100, so very widely supported.
The 5090 is actually a mixed bag: it’s too new, so support for it is hit and miss, and it’s ridiculously priced for a 32GB card.
And most CPUs with tons of RAM are fine, depending on the workload; the constraint is usually “does my dataset fit in RAM?” more than core speed (since just waiting 2x or 4x longer isn’t that big a deal).
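That check is one line of back-of-envelope arithmetic (made-up numbers, obviously):

```python
# Does a dense float32 dataset fit in, say, 200 GB of RAM?
n_rows, n_cols, bytes_per_val = 500_000_000, 50, 4  # hypothetical shape
dataset_gb = n_rows * n_cols * bytes_per_val / 1e9
print(f"~{dataset_gb:.0f} GB")  # ~100 GB -> fits with headroom
```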
The model was run (and I think trained?) on very modest hardware:
The computer used for this paper contains an NVIDIA Quadro RTX 6000 with 22 GB of VRAM, 200 GB of RAM, and a 32-core Xeon CPU, courtesy of Caltech.
That’s a double-VRAM Nvidia RTX 2080 Ti plus a Skylake-era Intel Xeon, an aging circa-2018 setup. With room for a batch size of 4096, no less! Though they did run into a preprocessing bottleneck on the CPU/RAM side.
The primary concern is the clustering step. Given the sheer magnitude of data present in the catalog, without question the task will need to be spatially divided in some way, and parallelized over potentially several machines
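For the curious, the “spatially divide and parallelize” idea is pretty simple in sketch form (made-up tiling and clustering parameters, and it naively ignores spherical geometry near the poles; a real pipeline also has to stitch clusters that straddle tile borders):

```python
# Tile the sky into coarse cells, then cluster each cell independently;
# every cell could run on a different machine.
import numpy as np
from sklearn.cluster import DBSCAN

ra = np.random.uniform(0, 360, 10_000)   # stand-in catalog coordinates
dec = np.random.uniform(-90, 90, 10_000)

# 5-degree tiles, encoded as a single integer id per source
tile = np.floor(ra / 5).astype(int) * 1000 + np.floor((dec + 90) / 5).astype(int)
for t in np.unique(tile):
    mask = tile == t
    pts = np.column_stack([ra[mask], dec[mask]])
    labels = DBSCAN(eps=0.05, min_samples=3).fit_predict(pts)  # -1 = noise
```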
It’s a little too plausible, heh.
Posts/comments aren’t sorted and hidden by an ML model optimizing for engagement, though.
That’s the key. That’s what makes the environments so toxic and addictive.
Maybe sorting by upvotes isn’t perfect, but it’s still leagues away from that.
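The contrast is stark even in sketch form. An upvote-based “hot” sort (roughly the shape of Lemmy’s, from memory; probably not the exact formula) is one auditable line, while the attention-farm version is a per-user black box:

```python
import math, time

def hot_rank(score, created_ts, gravity=1.8):
    # Transparent, identical for every user, and decays with age.
    hours = (time.time() - created_ts) / 3600
    return math.log(max(1, score + 3)) / (hours + 2) ** gravity

def engagement_rank(post, user):
    # Stand-in for what the big platforms do: a personalized ML prediction
    # of "will this keep you scrolling", tuned on your behavior.
    raise NotImplementedError("that's the point: you can't see inside")
```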