• 1 Post
  • 79 Comments
Joined 2 years ago
Cake day: June 17th, 2023

  • There’s a lot more to teaching than just good explanations. I do enjoy trying to explain complex science in more understandable ways, however.

    As for struggling, we all do at times; pushing through is how we get better. Also, science is a little like a spider web. If you look closely at just a few strands, they don’t make obvious sense. It’s only when you build up a broader picture that it becomes obvious and easy. Building that picture, unfortunately, requires pushing through the “what the hell, I can’t make sense of this!” stage.


  • It would be a mix of relative rates and the exact energy.

    If you pick an area of “empty” space where you expect very little dark matter, you get a baseline reading. When you aim at an area expected to be dense in dark matter, you expect a higher reading, e.g. 10 counts per day vs 100 per day. This is basically how radiation detection works on Earth, so the maths is well studied.
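    As a rough illustration of the counting argument (the 10 vs 100 counts per day are just the illustrative figures above, and the 30-day observing window is a made-up number; the statistics are standard Poisson counting, nothing specific to a particular instrument):

    ```python
    import math

    # Illustrative daily rates from the comment above (not real data).
    baseline = 10    # counts/day from an "empty" patch of sky
    target = 100     # counts/day from a region expected to be dense in dark matter

    days = 30                 # hypothetical observing time
    b = baseline * days       # expected background counts
    n = target * days         # observed counts

    # Simple Poisson significance: the excess over background, measured in
    # units of the background's standard deviation (sqrt(b) for Poisson counts).
    excess = n - b
    significance = excess / math.sqrt(b)

    print(f"excess: {excess} counts, ~{significance:.0f} sigma")
    # With these numbers the excess is enormous; the hard part in practice is
    # knowing the baseline accurately, not the arithmetic.
    ```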

    The other thing is energy levels. An electron and a positron annihilating produce photons with a distinct energy. It will vary upwards slightly, due to kinetic energy, but not by much. We also know the annihilation energies of other forms of matter from Earth experiments. A reading distinct from anything normal would be a good signature of an unknown type of matter annihilating.

    There are also extra complications from things like red shift, but those can be measured in other ways, and corrected for.

    The order of theory and discovery also helps. “Finding X that happens to support Y” is a lot weaker than “Predicting X from theory Y, then going and finding it”. If you run 1 million experiments, a 1 in a million result is quite likely by pure fluke. A 1 in a million result from a single, focused experiment is a lot more powerful.
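    To put rough numbers on that last point (a generic probability calculation, not the result of any real search):

    ```python
    # Chance of at least one "1 in a million" fluke when you search many experiments.
    p = 1e-6            # probability of the fluke in a single experiment
    trials = 1_000_000  # number of independent experiments searched through

    p_at_least_one = 1 - (1 - p) ** trials
    print(f"across {trials} experiments: {p_at_least_one:.2f}")  # ~0.63, more likely than not

    # The same result from one pre-registered, focused experiment really is a
    # one-in-a-million event, which is why prediction-then-detection carries weight.
    print(f"single focused experiment: {p:.0e}")
    ```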


  • In a short summary: something is wrong with the spin of galaxies. There is more mass than we can account for, and it’s distributed wrongly.

    Either the laws of gravity are slightly wrong, or there is something out there with mass but almost no interaction with other matter (particularly with light).

    More recent, more detailed studies have shown that the error is not consistent. Therefore either the laws of physics vary from galaxy to galaxy (very unlikely) or it’s something physical, rather than a law error.

    That leaves dark matter, often modelled as WIMPs (Weakly Interacting Massive Particles). They don’t seem to interact with electromagnetism at all, and any strong or weak force interaction is minimal. In effect, they only interact gravitationally.

    We know the interactions are minimal due to gravity mapping. Dark matter seems to form a cloud around galaxies, rather than collapsing inwards. To collapse in, the particles must interact to exchange momentum; if they only interact by gravity, that collapse will be extremely slow.

    That is most of what we can be fairly sure of. There’s a lot of speculation around this, and we might be barking up the wrong tree completely. However, dark matter via WIMPs seems to be the most consistent with the evidence right now.

    Edit to add.

    This experiment seems quite ingenious. It assumes that WIMPs are a mix of both matter and antimatter. Every so often a matter/antimatter pair gets close enough to annihilate, creating a pair of gamma photons. Detecting these would help back up the existence of physical WIMPs. Their energy would also tell us something about the particles’ mass (photon energy = mass energy + momentum energy). That would help narrow down where to look in our particle accelerator data.
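    A rough sketch of that energy bookkeeping (the 511 keV electron-positron figure is standard; the 100 GeV WIMP mass is purely a made-up placeholder):

    ```python
    # Each photon from a particle/antiparticle pair annihilating at rest carries
    # roughly the rest-mass energy of one particle: E_photon ~= m * c^2.
    # Relative motion adds a (usually small) kinetic contribution, which is the
    # slight upward smearing mentioned above.

    ELECTRON_MASS_KEV = 511.0   # electron rest-mass energy in keV (well established)
    wimp_mass_gev = 100.0       # hypothetical WIMP mass in GeV -- placeholder only

    # Electron-positron annihilation: two photons of ~511 keV each.
    print(f"e+/e- annihilation photon: ~{ELECTRON_MASS_KEV:.0f} keV")

    # A WIMP pair annihilating nearly at rest would give two photons of roughly
    # the WIMP's mass-energy each, so a sharp, otherwise unexplained gamma line
    # would both signal new physics and pin down the particle's mass.
    print(f"hypothetical {wimp_mass_gev:.0f} GeV WIMP: gamma line near ~{wimp_mass_gev:.0f} GeV")
    ```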




  • It might also be a single dev who pushed for it. With only a 1-3% market share, the company is unlikely to push resources at it. That 1 dev getting any working version out is a win in many ways.

    Also, most Linux users are a lot better trained at reporting bugs. Most of the time this is a good thing, letting bugs get fixed in FOSS development setups. Unfortunately, in gaming, it ends up making Linux look like a buggy mess. When 60% of your bug reports come from 0.5% of your users, companies can panic, even if the same bugs exist in Windows and it’s just that no one bothers to report them.
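    Putting rough numbers on that (the 60% and 0.5% figures are the illustrative ones above, not real telemetry):

    ```python
    # If 0.5% of users file 60% of the bug reports, how much more often does the
    # average Linux user report a bug than the average Windows user?
    linux_share_of_users = 0.005
    linux_share_of_reports = 0.60

    linux_rate = linux_share_of_reports / linux_share_of_users              # relative reports per user
    windows_rate = (1 - linux_share_of_reports) / (1 - linux_share_of_users)

    print(f"Linux users report ~{linux_rate / windows_rate:.0f}x more bugs per user")  # roughly 300x
    ```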






  • Terry Pratchett captured it well in Death’s speech.

    YES. AS PRACTICE. YOU HAVE TO START OUT LEARNING TO BELIEVE THE LITTLE LIES.

    “So we can believe the big ones?”

    YES. JUSTICE. MERCY. DUTY. THAT SORT OF THING.

    “They’re not the same at all!”

    YOU THINK SO? THEN TAKE THE UNIVERSE AND GRIND IT DOWN TO THE FINEST POWDER AND SIEVE IT THROUGH THE FINEST SIEVE AND THEN SHOW ME ONE ATOM OF JUSTICE, ONE MOLECULE OF MERCY. AND YET—Death waved a hand. AND YET YOU ACT AS IF THERE IS SOME IDEAL ORDER IN THE WORLD, AS IF THERE IS SOME…SOME RIGHTNESS IN THE UNIVERSE BY WHICH IT MAY BE JUDGED.

    “Yes, but people have got to believe that, or what’s the point—”

    MY POINT EXACTLY.

    Justice is a communal lie. It doesn’t exist. Yet, by believing in it, we can make it real.


  • I’ll take compatible.

    Most people game on Windows. Its monolithic nature also means that they will mostly encounter the same bugs.

    Linux has a wider base of functionality. A bug might only show up on Debian, not Ubuntu.

    End result: they spend 60% of their effort solving bugs for 2% of their user base. That’s not cost-viable.

    Compatibility means they just have to focus on one code base. All we ask is that they don’t actively break the compatibility. This is far less effort, and a lot easier to sell to the bean counters.

    Once Linux has a decent share, we can work on better universal standards. We likely need at least 10% to even get a chance there.





  • The proviso for this is that, globally, politicians grow a spine, along with a sense of morality and some long-term planning. It would also require them to deal with the money-hoarding issues of the hyper-rich.

    • The first step is a massive push for renewables. They should regularly be generating 200-500% of grid demand. If nuclear can get up to speed and be part of this, great, but we can’t wait on it.

    • That excess power should be soaked up by large-scale, portable energy storage. Green hydrogen is the current best option, but synthetic fossil fuels could also take up the slack. Depending on the area, desalination could also be combined into this.

    • We need to seriously decarbonise the transport networks. For vans and smaller, electric vehicles win; BYD have demonstrated that low-cost electric cars are viable. For larger vehicles, where electric becomes inefficient, hydrogen is viable. This is where a lot of the excess hydrogen will be going.

    • Carbon credits with teeth. Rather than relying on a planned-economy mindset, we can make capitalism work for us. We need a global fixed carbon emission limit, trending towards net zero on a preset timetable. Credits are bid on, akin to stock market trades, and companies must hold enough credits by the end of the year/period. The fine for not having credits should be a multiple of the closing credit price (10x?); the fine for falsification should be multiples of that, erring towards corporate execution levels. (A rough sketch of the arithmetic follows this list.)

    This will force the easy savings out of the market quickly. It will then force unavoidable emitters to factor in carbon costs.

    • Combined with the carbon credits will be negative credits. If a group takes a ton of CO₂ out of the air, long term, they gain a new credit, which they can sell to emitters. This provides the CO₂ emissions that industry genuinely requires, while still meeting net zero.

    An example of this might be large-scale bio-capture on the open ocean: grow seaweed etc. on pontoons and turn it into a solid. This can then be locked away (old coal mines?), taking the carbon out of circulation permanently.

    • Geoengineering. There are multiple methods of reducing the sunlight incident on the Earth, from powders in the upper atmosphere to Mylar solar shades at the Lagrange point. These would be short-term fixes, but they would buy us time.
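    A minimal sketch of the credit/fine arithmetic from the carbon-credits point above (the 10x multiplier is the figure suggested there; the function name, prices and tonnages are made-up illustrations):

    ```python
    def compliance_cost(emissions_tonnes: float, credits_held: float,
                        closing_credit_price: float, fine_multiplier: float = 10.0) -> float:
        """End-of-period cost for an emitter under the scheme sketched above.

        Credits cover emissions one-for-one; any shortfall is fined at a multiple
        of the closing market price, so paying the fine is never the cheap option.
        Negative (removal) credits would simply add to credits_held.
        """
        shortfall = max(0.0, emissions_tonnes - credits_held)
        return shortfall * closing_credit_price * fine_multiplier

    # Illustrative only: a firm emitted 1,000 t but holds 800 credits, and credits
    # closed at $50/t, so it owes a fine on the 200 t shortfall.
    print(compliance_cost(1000, 800, 50.0))  # 200 * 50 * 10 = 100000.0
    ```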

    None of these require massive reductions in quality of life. They do require changes in how we do things. It’s also worth noting that I’ve not covered the numerous problems to be solved, e.g. power grid upgrades to account for renewables. None of these should be insurmountable, however; they are just engineering or political/policing challenges.

    And no, I’ve no fucking idea how to get politicians to grow a spine and do what’s required for our long-term comfort/survival. Fixing the planet? That’s just a (really big) engineering problem. Fixing human nature? …Fuck knows.





  • For naive signal distances, that can sometimes be true. That’s not how Starlink works, however. It bounces the signal between satellites, each hop adding latency. Overall, fibre wins in almost every situation.

    The bigger problem is saturation. Most things you can apply to radio waves can be applied to light in a fibre. The difference is you can have multiple fibres on the same run. This massively increases bandwidth, and so prevents congestion.

    Just checked the numbers. Starlink orbits at about 550 km, which means a minimum round trip of 1100 km on top of the ground distance. Light in fibre travels at roughly 2/3 of its vacuum speed, so for the satellite path to beat a fibre run you are looking at over 2000 km of ground distance. Even halving that to (optimistically) account for angles, that’s still a LONG run to an initial data center.
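    The rough numbers behind that break-even estimate (the 550 km altitude and ~2/3 c in fibre are standard figures; the geometry is simplified to a straight up-and-down hop, as above):

    ```python
    C = 299_792.458           # speed of light in vacuum, km/s
    FIBRE_SPEED = C * 2 / 3   # light in glass fibre travels at roughly 2/3 c

    ALTITUDE_KM = 550         # Starlink orbital altitude
    HOP_KM = 2 * ALTITUDE_KM  # minimum up-and-down detour, ignoring angles

    def fibre_ms(ground_km: float) -> float:
        return ground_km / FIBRE_SPEED * 1000

    def starlink_ms(ground_km: float) -> float:
        # ground distance covered at vacuum speed, plus the vertical round trip
        return (ground_km + HOP_KM) / C * 1000

    # Find the ground distance where the satellite path first beats fibre.
    break_even = next(d for d in range(100, 10_000, 10) if starlink_ms(d) < fibre_ms(d))
    print(f"break-even around {break_even} km")  # ~2200 km under these assumptions
    ```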