• zenharbinger@lemmy.world · +33 · 4 months ago

    Some partitions are useful. Keeping /var and /tmp separate can stop DoS attacks by not allowing logs to fill the entire drive, and a separate /home means you can wipe the / partition and keep user data.

    • limelight79@lemm.ee · +12 · 4 months ago

      I’ve had a full /var partition cause all sorts of problems with the system. But I still think it’s good to have four partitions: /, /var, /tmp, and /home. At the very least, split out /home so you can format / without losing your stuff.

        • limelight79@lemm.ee · +2 · 4 months ago

          I can definitely see doing that on a server many people are using. For my personal server, I used to do that, but in the end I couldn’t find much benefit, only headaches (“ahhhh, / is short on space because I forgot to clean up old kernels…”).

          • scratchandgame@lemmy.ml · +3 · 4 months ago (edited)

            I think it would save you someday: when nothing is writing to /usr, a bad write in /home would not cause much damage elsewhere. On a system with one huge root partition, an incomplete write might damage the whole filesystem.

            fsck would be faster. newfs (mkfs) would be faster. I found NetBSD spent a long time running newfs on a 32G root partition (installing NetBSD in Hyper-V).

            Also, for the /tmp partition, if we have 4G of RAM or more we can use a memory filesystem (tmpfs) instead of a physical disk to store things that are cleaned on reboot.
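            A sketch of what that could look like in /etc/fstab (the 2G size cap is an example; adjust to your RAM — tmpfs only consumes memory for what is actually stored):

            ```
            # keep /tmp in RAM; everything in it disappears on reboot
            tmpfs  /tmp  tmpfs  defaults,noatime,nosuid,nodev,size=2G,mode=1777  0  0
            ```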

            • limelight79@lemm.ee · +2 · 4 months ago

              I’m not saying it can’t happen, but I’ve been using Linux since the late 90s and have never had a problem with an incomplete write damaging the file system, or really anything else (except for a recent incident when a new motherboard decided to overwrite the partition tables on my RAID5 array, but that’s a different story). And I have UPSs on the server and desktop, and of course the laptop has a battery in it, so the risk of sudden power loss is extremely low.

              The /tmp thing in RAM is interesting. I was reconfiguring my server’s drive the other day, because I didn’t originally allocate enough space to /var - it worked fine for years until I started playing with plex, jellyfin, and Home Assistant (the latter due to the database size). I was shocked to find /tmp only had a few files in it, after running for years. I think I switched the server to Debian in 2018 or 2019, but that’s just a guess based on the file dates I’m seeing. Maybe Debian cleans the /tmp partition regularly.

    • emptyother@programming.dev · +7 · 4 months ago

      Damn, I’ve always wanted Windows to have that: being able to put user folders on another partition, or even another drive, at install time, and being able to use “dynamic disks” (aka software RAID) to expand partitions across disks as storage requirements grow. I know it’s possible to set up, but only with a lot of workarounds and annoying problems.

      • Magickmaster@feddit.de · +18 · 4 months ago

        Windows user folders are nearly unusable in my opinion; too many programs throw random folders and files everywhere. Especially the Documents folder, with too many games putting incoherent stuff in there.

        • emptyother@programming.dev · +9/-1 · 4 months ago

          Yup, useless folder. There’s one related thing I’ve complained a lot about lately, so I’m gonna complain some more about it:

          Microsoft got this “great” idea of trying to repeatedly trick me into uploading that Documents folder to the cloud. A folder filled with gigabytes of Battlefield and Assassin’s Creed cache files, Starfield mods, MS database files, etc. A lot of files that are in constant change, or locked for the entire session. Annoying as hell. I love OneDrive, but I don’t know why it’s so damn important for them to have those files.

          Sometimes I really wish I could switch to some Linux distro instead.

          • rtxn@lemmy.world · +9 · 4 months ago (edited)

            It’s asinine that Onedrive doesn’t have an equivalent of the decades-old gitignore technology…

            There seems to be a workaround, though - archive link. It should work as long as the local/remote conflict remains unresolved, or until Microsoft decides to just push the remote copy onto the local machine and delete your files instead.

      • rtxn@lemmy.world · +7 · 4 months ago (edited)

        I’m pretty sure you can just mount a volume to C:\Users.

        I definitely wouldn’t recommend changing the userdir paths in the system. Many of the office computers I work with are set up that way and it’s always a pain in the ass when an application expects the home path to be located on C:.

      • maxprime@lemmy.ml · +4/-1 · 4 months ago

        I remember doing this in macOS, when I got my first SSD. I installed it and kept the os on the SSD and mapped my user directory to my hdd. It made upgrades and re-installs much easier, which was a plus because it was actually a hackintosh.

  • Ilgaz@lemm.ee · +16 · 4 months ago

    A separate /home can save you hours or even days on several occasions; however, don’t try crazy things like having Ubuntu’s KDE share the same theme/settings with KDE 6. A /var on a fast drive can work wonders too.

    • psivchaz@reddthat.com · +3 · 4 months ago

      I’m trying out something mildly nutty: putting .steam in /home/steam, then making a “neon” user, and symlinking so that I can try KDE without reinstalling Steam games. If it works I might try it with other files.
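      A sketch of that symlink setup (the paths and the “neon” username are just this thread’s example, nothing standard):

      ```shell
      # one user's home holds the shared Steam data; the new user's ~/.steam points at it
      SHARED=/home/steam/.steam   # existing Steam directory (example path)
      NEWHOME=/home/neon          # home of the freshly created test user (example path)
      ln -s "$SHARED" "$NEWHOME/.steam"
      ```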

      • Ilgaz@lemm.ee · +3 · 4 months ago (edited)

        First of all, you can check out distrobox.it, which can basically run Neon inside your distribution; however, you’d better set a different virtual home for Neon in that case.

        I would first tar up .steam to be on the safe side, but Steam is different: it’s some kind of Ubuntu stable itself residing in that directory. I’m not a big-time gamer, but people laughed at Ubuntu for shipping its snap because of it.

        Long story short, I don’t think Steam would have issues. I meant don’t expect the KDE guys to revert upgraded preferences back to KDE 5, etc. You know they do such things and blame Linux/KDE.

      • Mouette@jlai.lu · +2 · 4 months ago (edited)

        I’ve created a specific partition for Steam games so I can use games across distros without reinstalling them. You can tell Steam to go look in your partition for your games.

        • webghost0101@sopuli.xyz · +2/-1 · 4 months ago (edited)

          I use my Windows drive as a junk drawer for large programs in Linux. It comes with the same benefit: fully accessible from either system.

  • Quazatron@lemmy.world · +12/-1 · 4 months ago

    Partitioning does have benefits, especially in enterprise scenarios. It allows you to specify different policies per mount point (e.g. no executables on /tmp). It prevents a runaway process from filling your hard disk with logs. It lets you keep your data separated from your OS, or have multiple OSes share the same home partition.

    For home use you’ll probably go with something simpler, like separate home, root, and games partitions, for instance.

    Nowadays you should opt for LVM volumes or btrfs subvolumes instead of raw partitions, as these are far more flexible should you later change your mind about the sizes you allocated.
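    With LVM, for example, growing a volume later is one command (the volume group and LV names here are hypothetical):

    ```shell
    # add 20G to the home logical volume and grow its filesystem in the same step
    sudo lvextend --resizefs -L +20G /dev/vg0/home
    ```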

    • Pumpkin Escobar@lemmy.world · +2 · 4 months ago

      Yeah, I really like the archinstall default btrfs layout, 1 subvolume for each of these

        └─root    254:0    0  1.8T  0 crypt /var/log
                                            /var/cache/pacman/pkg
                                            /home
                                            /.snapshots
                                            /
      
    • scratchandgame@lemmy.ml · +1/-1 · 4 months ago (edited)

      Partitioning has benefits, and it is quite easy to set up on “modern GNU/Linux” since they all use graphical installers. For sizes you can refer to OpenBSD’s disklabel(8) man page.

      It increases stability and security, and not only for enterprise.

  • utopiah@lemmy.ml · +11 · 4 months ago

    At least have a dedicated /home partition. This way, if you want to upgrade the OS, change distribution, heck, even migrate to a totally different OS, your actual data is safe. Also, if you need to do a backup, “just” back up /home, which is probably going to be significantly faster and more convenient than backing up the entire OS. It also avoids using e.g. dd and getting a rather opaque image file.

    TL;DR: yes /home keeps your data safe
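    For the backup part, a minimal rsync sketch (the destination path is an example; -x keeps it from crossing into other mounted filesystems):

    ```shell
    # mirror /home to a backup location; --delete makes the copy an exact mirror
    SRC=/home
    DST=/mnt/backup/home   # example destination; could be a NAS mount instead
    rsync -aHx --delete "$SRC"/ "$DST"/
    ```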

    • Avid Amoeba@lemmy.ca · +4 · 4 months ago (edited)

      What’s the benefit of dd-ing a home partition over rsync-ing a home directory’s contents?

      • utopiah@lemmy.ml · +2 · 4 months ago

        Well, it’d result in a single file, which might be easier if you have to copy it to a microSD card or USB stick. To counter my own argument, though: the result of dd can be mounted, quickly getting you a rather useful directory.

        But anyway, my point was rather the opposite: in most cases rsync, rdiff-backup, or even scp (whatever one is most familiar with) to a local NAS, remote server, etc. is usually better, or at least more understandable for somebody who isn’t used to the process.
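        For completeness, the dd route looks roughly like this (the device name is an example; the resulting image can then be loop-mounted read-only to browse it):

        ```shell
        # image the /home partition into a file, then mount the image to get at the files
        sudo dd if=/dev/sda3 of=home.img bs=4M status=progress
        sudo mount -o loop,ro home.img /mnt/inspect
        ```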

  • nottelling@lemmy.world · +10 · 4 months ago

    I’m surprised no one’s mentioned the security implications. Mounting with the nosuid and nodev options can undermine rootkit or privilege escalation exploits.
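    A sketch of what that looks like in /etc/fstab (devices and filesystems here are examples):

    ```
    # user-writable filesystems rarely need setuid binaries or device nodes
    /dev/sda3  /home  ext4  defaults,nosuid,nodev         0  2
    /dev/sda4  /tmp   ext4  defaults,nosuid,nodev,noexec  0  2
    ```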

  • Avid Amoeba@lemmy.ca · +3 · 4 months ago (edited)

    Partitioning (beyond what’s needed to boot)? No. Logical volumes or datasets? Perhaps, but probably not for most trivial setups. Even swap is fine on a file if you need it, and that simplifies disk encryption. Most of my machines run just an EFI partition and an LVM partition; if I need a separate volume for something, I can always create it in LVM.
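    The swap-on-a-file setup is just a few commands; a sketch (the 8G size is an example):

    ```shell
    # create an 8G swap file; chmod 600 keeps other users from reading swapped-out memory
    sudo fallocate -l 8G /swapfile
    sudo chmod 600 /swapfile
    sudo mkswap /swapfile
    sudo swapon /swapfile
    ```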

  • TCB13@lemmy.world · +3/-21 · 4 months ago

    This is mostly a worthless discussion. A computer/device should be considered disposable, as well as all the data on it. Just sync everything in real time to a local “server” with something like Syncthing, and if something goes wrong with your machine, resync it back. Done.

        • TCB13@lemmy.world · +3/-1 · 4 months ago (edited)

          I mean, I’m sure you can find some old laptop, ARM SBC, or anything second-hand with a broken screen that people may even gift you or sell very cheap, to run as your “home server”.

        • theshatterstone54@feddit.uk · +1 · 4 months ago

          Not everyone can afford to host such a service. For some people, bills would be an issue; for others, buying and storing the hardware would be. For me, storing the hardware and hosting anything would be problematic, as I’m in student accommodation, meaning space is limited as it is, let alone with extra stuff lying around. There’s also a clause in my contract which states that if I use too much electricity, I’ll be asked to pay for it rather than having it included in the contract. On top of that, I don’t even need this: it’s overkill for my use case, where I already have a backup of all the cool and important stuff, and a secondary backup of everything that can’t be found on the internet or is very difficult to find. So yeah, that’s what I meant. Not everyone can afford the luxury of doing so.

          • TCB13@lemmy.world · +1/-1 · 4 months ago

            > There’s also a point in my contract which states that upon using too much electricity

            Yes, because an extra $1.50/year would definitely kill your wallet.

            > For me, storing the hardware and hosting anything would be problematic, as I’m in student accommodation meaning space is limited as it is

            This is a valid concern, though. However, you could host it at your parents’ home, for instance. Either way, an RPi and a disk aren’t that big.

            You’re framing this as a luxury when in fact it’s little more than a small time investment to set it up.

    • grue@lemmy.world · +2 · 4 months ago

      Oh yeah, but did you know your server is a computer/device and therefore should be considered disposable, too? Checkmate, atheists! /s

      Honestly, though, you’re not wrong about how always having multiple copies of your data on separate devices is essential. (You do however also need backups, not just synchronized copies, because data-destroying fuck-ups can get sync’d too.)

      I’m not sure what your comment has to do with partitioning, though.

      • TCB13@lemmy.world · +1/-2 · 4 months ago (edited)

        Ahahaha, nice comment. I never said I didn’t have backups. The thing is that once you get your data across multiple machines with something like Syncthing, your life becomes way better and things are easier to deal with. Even if my “server” dies, I still have three more real-time copies of the data (or at least one actually real-time and two others a bit behind, because those machines aren’t always turned on), and the “server” backs up to another local drive plus a long-term offsite backup that gets updated from time to time.

        I’m not sure what your comment has to do with partitioning, though.

        People usually go around suggesting partitioning their disks because they might need to reinstall the system, and that way their home directory “will be safe” from whatever mess forced them into a reinstall. In reality this just introduces unnecessary complexity and is about as likely to fail as a single-partition system. To be fair, I’d consider a btrfs subvolume for home with regular snapshots way more interesting and manageable than plain dumb partitions.
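        A sketch of such a snapshot, assuming /home is itself a btrfs subvolume and a /.snapshots directory exists:

        ```shell
        # read-only snapshot of /home, named by date; cheap thanks to copy-on-write
        sudo btrfs subvolume snapshot -r /home /.snapshots/home-$(date +%F)
        ```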

      • TCB13@lemmy.world · +1/-2 · 4 months ago

        I guess it depends on how you’re running things… and you should have backups anyways.

        My previous point was that once you get your data across multiple machines, your life becomes way better and things are easier to deal with. Even if your “server” dies, you’ll still have more real-time copies of the data on your laptop, desktop, etc., and eventually a long-term offsite backup that gets updated from time to time. Having backups is important, as real-time sync won’t save you from deleting files by mistake.

        A quick way to do things would be to have an SSD (so no noise) on the “server” for your real-time sync and OS, and a mechanical hard drive (usually spun down) that gets a copy of the data via rsync every day. Then you do a weekly or monthly backup of the data to a remote location over the internet, or to a USB hard drive that you physically move to another site.

        If you’re using an SBC, you might run your OS on an SD card, with a 2.5″ SSD for real-time data and a 3.5″ drive for daily backup, plus some other remote/offsite backup solution.
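        The daily rsync copy described above could be a single crontab entry (paths are examples):

        ```
        # m h dom mon dow  command: every day at 03:00, mirror the sync area to the archive disk
        0 3 * * *  rsync -a --delete /srv/sync/ /mnt/archive/sync/
        ```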