Just a regular Joe.

  • 0 Posts
  • 12 Comments
Joined 2 years ago
Cake day: July 7th, 2023



  • And contributions to codebases that have evolved to meet the team’s own needs, maintained by people who similarly don’t have the time or space to refactor or review as needed to enable effective contributions.

    Enabling Innersource will be a priority for management for only two weeks anyway, before they focus on something else. And if it even makes it into measurable goals, it will probably be gamed so it doesn’t ruin bonuses.

    Do you also work for $GenericMultinationalCompany, perchance? Do you also know $BillFromFinance?



  • Encryption is typically CPU-bound, while many servers are I/O-bound (e.g. file hosting rather than heavy computation), so it will probably be fine.

    Encryption helps in the case that someone gets physical access to the machine or the hard disk. If they can log in to the running system (or dump RAM, which is possible with VMs & containers), it won’t bring much value.

    You will of course need to log in and mount the encrypted volume after a restart.
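    Assuming a LUKS-encrypted data volume (the device name and mount point below are placeholders for your own setup), the post-reboot unlock step looks roughly like this:

    ```shell
    # Sketch: unlock and mount a LUKS-encrypted volume after a reboot.
    sudo cryptsetup open /dev/sdb1 cryptdata   # prompts for the passphrase
    sudo mount /dev/mapper/cryptdata /srv/data

    # And the reverse before maintenance or shutdown:
    sudo umount /srv/data
    sudo cryptsetup close cryptdata
    ```

    Because the passphrase prompt is interactive, a fully unattended reboot and an encrypted data volume are somewhat at odds; that manual step is part of the trade-off.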

    At my work, we want to make sure that secrets are adequately protected at rest, and we follow good hygiene practices like regularly rotating credentials, time-limited certificates, etc. We tend to trust AWS KMS to encrypt our data, except for a few special use cases.

    Do you have a particular risk that you are worried about?


  • Joe@discuss.tchncs.de to Selfhosted@lemmy.world · Secrets Management · 6 days ago

    Normally you wouldn’t need a secrets store on the same server that needs the secrets, as the service/app that uses them often stores them unencrypted anyway. An encrypted disk might be better in that case.

    That said, Vault has some useful features like issuing temporary credentials (e.g. for access to AWS, DBs, servers) or certificate management. If you have these use cases, it could be useful, even on the same server.
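    As an example of the temporary-credentials feature, Vault’s AWS and database secrets engines mint short-lived credentials on each read (the role names here are made up, and assume the engines are already enabled and configured):

    ```shell
    # Sketch: each read returns fresh credentials tied to a lease (TTL),
    # which Vault revokes automatically when the lease expires.
    vault read aws/creds/deploy        # returns a temporary access_key/secret_key
    vault read database/creds/app-role # returns a temporary DB username/password
    ```

    The appeal is that nothing long-lived sits on disk: a leaked credential expires on its own, and revocation is Vault’s job rather than yours.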

    At my work, we tend to store deployment-time secrets either in protected GitLab variables or in Vault. Sometimes we use AWS KMS to encrypt values in config files, which we check in to git repositories.
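    The KMS pattern is roughly the following (the key alias and filenames are hypothetical): only the ciphertext lands in git, and decrypting it requires IAM access to the key.

    ```shell
    # Sketch: encrypt a value with AWS KMS before committing it.
    aws kms encrypt \
      --key-id alias/app-config \
      --plaintext fileb://db_password.txt \
      --query CiphertextBlob --output text > db_password.enc  # base64 ciphertext

    # At deploy time, decrypt it back (KMS infers the key from the ciphertext):
    base64 -d db_password.enc > db_password.bin
    aws kms decrypt \
      --ciphertext-blob fileb://db_password.bin \
      --query Plaintext --output text | base64 -d
    ```

    Access to the secret then reduces to an IAM policy question, which is easier to audit than passing plaintext around.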


  • It typically takes a small core team to build the framework/architecture that enables many others to contribute meaningfully.

    Most OSS projects get bugger all contributions from outside the initial core team, having limited ability to onboard people. The biggest and most active (out of necessity or by design) have a contribution-friendly software architecture and process, and often deliberately organized communities (e.g. K8s & the CNCF) or major corporate sponsors filling the role.

    Free Software and resulting ecosystems seem to have a better chance of contributing to the common good over the long term. This is simply because most companies are beholden to their shareholders, and at some point the urge to squeeze every last cent out of an opportunity comes to the forefront, and many initially well intentioned efforts get poisoned.

    Free Software licenses like the GPL help to protect our freedom and to set open standards, and are essential for the core technology stack.

    When someone can get annoyed with some shitty software or its license terms and reimplement the core functionality in a few days/weeks/months, eventually someone will, and create decent free software that kills off the shitty alternatives, or even just a better commercial alternative. This only works because of the open platforms & protocols.

    One of the major challenges for consumers today is finding good software in the grey goo of projects and app stores. This harks back to OP’s point about curated collections of software. It’s also where the various foundations add value (CNCF, Linux Foundation, Apache) … along with “awesome X” GitLab repos, which are far better than random YouTube videos or ad-riddled blogs and magazine articles.


  • The true strength is in the open interfaces and common protocols that enable competition and choice, followed by the free-to-use libraries that establish a foundation upon which we can build and iterate. This helps us to stay in control of our hardware, our data, and our destiny.

    Practically speaking, there is often more value in releasing something as free software than in commercialising it or otherwise tightly controlling the source code, and for smaller tools and libraries this is especially the case.

    Many bigger projects (e.g. the Linux kernel, Firefox, Kubernetes, Apache*) help set the direction of entire industries, building new opportunities as they go, thanks to the standardization that comes from their popularity.

  • It’s also a reason why many companies release software as open source, especially in the early days, establishing themselves as THE leader… for a while at least (e.g. Docker Inc., HashiCorp).