How do I send a single snapshot of a parent ZFS filesystem, and the same snapshot of all its descendants, to a new pool?


If we run zfs snapshot -r Tank@20220129, all the child filesystems get that snapshot too.

How can we send that single snapshot of Tank, together with all its child filesystems, to a new pool — without the historical snapshots underneath it?

While zfs send -R Tank@20220129 would send all the child filesystems, it would also send all of their snapshots.

(We could destroy all those snapshots afterwards, but that could mean a great deal of extra data sent, only to be deleted once the transfer is done.)

There does not appear to be a zfs send -r for this.
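To make the intent concrete, here is a dry-run sketch of what a per-dataset "zfs send -r" would amount to. The dataset names are hypothetical examples, and the commands are printed rather than executed:

```shell
# Dry run: one single-snapshot send per dataset, no snapshot history.
# Dataset names below are hypothetical examples.
snap=20220129
cmds=""
for ds in Tank Tank/data Tank/data/media; do
  cmds="${cmds}zfs send ${ds}@${snap} | zfs recv -d newpool
"
done
printf '%s' "$cmds"
```

Each dataset gets exactly one send of the named snapshot, which is what -R does not offer.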

Answer 1

Last edit: you are right, recursion can't be handled cleanly (and not with clones either — every dataset in the tree has to be cloned individually). I understand the rationale that what gets copied is the *inheritance* of a property rather than the currently inherited value, but I struggle to see the use case where that is the most desirable behaviour, and I will file a feature request. When any settable property is changed from its default, all the children automatically switch to inheriting the new value. I have an encryption root and a key-generating dataset where many of the non-default values really belong to the active children. Parent and children ought to have a property that lets them opt out of inheritance, or inherit all or a chosen set of properties; then zfs send -R -p would work the way you expect. Your case needs an upstream feature: sending a single snapshot recursively — and recursively creating clones from a recursively created snapshot — are such obviously expected options that I was surprised to find they don't exist.

I'm fairly sure you could do a zfs clone or similar, instead of my original suggestion of "zfs destroy any snapshots you hit, then do your zfs send | zfs receive with no snapshots left" — which was inelegant, unresearched and lazy. But you don't need clones at all, just a for loop like this:

for ds in $(zfs list -Ho name -r rpool); do \
    zfs send "${ds}@20220129" | zfs recv -d newpool; done

Clones would also work — a fresh zfs snap -r plus zfs send -R then carries only the one snapshot of each new clone. But you can't clone recursively either, so you would need a similar for loop in any case. Alternatively, if you don't mind losing all the ZFS history, rsync everything over to a cleanly built receiving pool whose datasets you create in advance with the properties you want.
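If the rsync route is taken, it could look something like the sketch below. The paths are illustrative, --dry-run should be used first, and note that rsync copies file data only — no ZFS snapshots, clones, or dataset properties come along:

```shell
# Illustrative only: sync the mounted contents of the old pool into a
# pre-created dataset layout on the new pool. Paths are hypothetical.
SRC=/tank/data/      # trailing slash: copy contents, not the directory itself
DST=/newpool/data/
RSYNC_CMD="rsync -aHAXx --delete --dry-run $SRC $DST"
echo "$RSYNC_CMD"
```

-H preserves hard links, -A and -X preserve ACLs and xattrs, and -x keeps rsync from crossing into other mounted datasets, so each dataset must be synced individually.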

So I'm expanding my solution, because it is not that simple or safe, and I'll shortly be doing it on a live but non-critical system. In my case I will also be splitting mirror vdevs and changing some pool and zfs properties on the zfs recv side.

It has always been a wart that zpool and zfs recursive operations could stand improvement. Nor is best practice clear for multiple zsys bootfs entries, zfs-mount-generator, or zfs-zed.service (which does not restart after a systemctl suspend cycle!), and persistent dataset mounts do not reflect the state of zfs-list.cache/pool at boot. Canonical seems to have finished its push on ZFS root and zsys usability — and that work isn't done just because it's an Ubuntu installer option.

for zp in rpool bpool vault; do \
    zpool trim -w $zp; zpool scrub -w $zp; \
    zfs snap -r ${zp}@b4split; \
done
for zp in rpool bpool vault; do \
    zpool attach -w $zp /dev/sda /dev/sde; \
    zpool attach -w $zp /dev/sdc /dev/sdf; \
    zpool split -w $zp ${zp}-offbakup /dev/sdg /dev/sdh; \
    zpool initialize -w ${zp}-offbakup; \
    zpool scrub -w ${zp}-offbakup; \
    zpool export ${zp}-offbakup; \
done

cat << EOF >> /etc/motd
IMPORTANT NOTE TO SELF. Pool on zfs-receive with encryption, zstd, new dataset structure for boot environments.
OUT with ubuntu, snapd!!!, grub, ext4, btrfs, lz4, systemd, docker, x|X*
IN like RSYNC: void-linux-install, build zfsbootmenu AND s6 from source,
wayland, lxc, libvirt pcie passthrough to stripped win11 for mah Civ-6 and Steam
EOF

for zp in rpool bpool vault; do \
    zfs snap -r $zp@pre-b4move; \
    zpool set localhost:PROPERTY_orig="$(zpool get -Ho value PROPERTY $zp)" $zp; \
    zpool checkpoint $zp; \
    zpool upgrade $zp;   # (!) irreversible - older boot media may no longer import the pool
done

for ds in $(zfs list -Ho name -r rpool bpool vault); do \
    # record some original properties for reuse - inherited props belong to the
    # parent dataset, so they revert on recv even with send -R or -p
    zfs set localhost:PROPERTY_orig="$(zfs get -Ho value PROPERTY $ds)" $ds; \
done
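A concrete instance of that loop, with compression standing in for the placeholder property (dry run: the zfs commands are printed, not executed, since they need a live pool; dataset names are examples):

```shell
# Sketch: stash each dataset's current property value in a user property
# (localhost:<prop>_orig) so it can be re-applied after zfs recv.
prop=compression
out=""
for ds in rpool/ROOT rpool/ROOT/garuda; do
  out="${out}zfs set localhost:${prop}_orig=\$(zfs get -Ho value ${prop} ${ds}) ${ds}
"
done
printf '%s' "$out"
```

After the receive, the saved values can be read back with zfs get localhost:compression_orig and re-applied with zfs set.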

Still to figure out: how to fit Void Linux and zfsbootmenu into this hackery, and how to get all the zsys and systemd ZFS mounts to come up automatically after the new recursive zfs send/receive. Consistent inheritance and -o expectations matter a great deal to zfs receive.
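On the mounts-at-boot complaint: zfs-mount-generator reads /etc/zfs/zfs-list.cache/&lt;pool&gt;, which ZED's history_event-zfs-list-cacher.sh zedlet maintains. A hedged sketch of nudging the cache back in sync after a receive (pool name is an example; the commands are printed here, not run):

```shell
# Assumes the zfs-list-cacher zedlet is enabled under /etc/zfs/zed.d.
# Touching the per-pool cache file and generating any property-change
# event makes ZED rewrite the cache that zfs-mount-generator consumes.
pool=rpool
cache=/etc/zfs/zfs-list.cache/$pool
echo "touch $cache"
echo "zfs set canmount=on $pool   # any property change event will do"
```

Until that cache is regenerated, systemd mount units keep reflecting the pre-receive dataset layout, which matches the behaviour complained about above.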

└─# zlsz
bpool/BOOT/garuda              com.ubuntu.zsys:last-used           1665060644
bpool/BOOT/kinetic             com.ubuntu.zsys:last-used           1664996078
bpool/BOOT/pve30-cli           com.ubuntu.zsys:last-used           1664973489
bpool/BOOT/pve30-gnm           com.ubuntu.zsys:last-used           1665060644
rpool/ROOT/garuda              com.ubuntu.zsys:last-used           1665060644
rpool/ROOT/garuda              com.ubuntu.zsys:last-booted-kernel  vmlinuz-linux-lts
rpool/ROOT/garuda              com.ubuntu.zsys:bootfs              yes
rpool/ROOT/garuda/root         com.ubuntu.zsys:last-used           1665060644
rpool/ROOT/garuda/root         com.ubuntu.zsys:last-booted-kernel  vmlinuz-linux-lts
rpool/ROOT/garuda/root         com.ubuntu.zsys:bootfs              no
rpool/ROOT/garuda/srv          com.ubuntu.zsys:last-used           1665060644
rpool/ROOT/garuda/srv          com.ubuntu.zsys:last-booted-kernel  vmlinuz-linux-lts
rpool/ROOT/garuda/srv          com.ubuntu.zsys:bootfs              no
rpool/ROOT/garuda/var          com.ubuntu.zsys:last-used           1665060644
rpool/ROOT/garuda/var          com.ubuntu.zsys:last-booted-kernel  vmlinuz-linux-lts
rpool/ROOT/garuda/var          com.ubuntu.zsys:bootfs              no
rpool/ROOT/garuda/var/cache    com.ubuntu.zsys:last-used           1665060644
rpool/ROOT/garuda/var/cache    com.ubuntu.zsys:last-booted-kernel  vmlinuz-linux-lts
rpool/ROOT/garuda/var/cache    com.ubuntu.zsys:bootfs              no
rpool/ROOT/garuda/var/lib      com.ubuntu.zsys:last-used           1665060644
rpool/ROOT/garuda/var/lib      com.ubuntu.zsys:last-booted-kernel  vmlinuz-linux-lts
rpool/ROOT/garuda/var/lib      com.ubuntu.zsys:bootfs              no
rpool/ROOT/garuda/var/log      com.ubuntu.zsys:last-used           1665060644
rpool/ROOT/garuda/var/log      com.ubuntu.zsys:last-booted-kernel  vmlinuz-linux-lts
rpool/ROOT/garuda/var/log      com.ubuntu.zsys:bootfs              no
rpool/ROOT/garuda/var/tmp      com.ubuntu.zsys:last-used           1665060644
rpool/ROOT/garuda/var/tmp      com.ubuntu.zsys:last-booted-kernel  vmlinuz-linux-lts
rpool/ROOT/garuda/var/tmp      com.ubuntu.zsys:bootfs              no
rpool/ROOT/kinetic             com.ubuntu.zsys:last-used           1664996078
rpool/ROOT/kinetic             com.ubuntu.zsys:last-booted-kernel  vmlinuz-5.19.0-18-generic
rpool/ROOT/kinetic             com.ubuntu.zsys:bootfs              yes
rpool/ROOT/pve30-cli           com.ubuntu.zsys:last-used           1664973489
rpool/ROOT/pve30-cli           com.ubuntu.zsys:last-booted-kernel  vmlinuz-5.15.53-1-pve
rpool/ROOT/pve30-cli           com.ubuntu.zsys:bootfs              yes
rpool/ROOT/pve30-gnm           com.ubuntu.zsys:last-used           1665060644
rpool/ROOT/pve30-gnm           com.ubuntu.zsys:last-booted-kernel  vmlinuz-5.15.60-1-pve
rpool/ROOT/pve30-gnm           com.ubuntu.zsys:bootfs              yes
rpool/USERDATA/garuda          com.ubuntu.zsys:last-used           1665060644
rpool/USERDATA/garuda          com.ubuntu.zsys:bootfs-datasets     rpool/ROOT/garuda
rpool/USERDATA/kinetic         com.ubuntu.zsys:last-used           1664996078
rpool/USERDATA/kinetic         com.ubuntu.zsys:bootfs-datasets     rpool/ROOT/kinetic
rpool/USERDATA/pve30-cli       com.ubuntu.zsys:last-used           1664973489
rpool/USERDATA/pve30-cli       com.ubuntu.zsys:bootfs-datasets     rpool/ROOT/pve30-cli
rpool/USERDATA/pve30-gnm       com.ubuntu.zsys:last-used           1665060644
rpool/USERDATA/pve30-gnm       com.ubuntu.zsys:bootfs-datasets     rpool/ROOT/pve30-gnm
└─# zfs list -o name,used,dedup,secondarycache,sharesmb,acltype,overlay,compression,encryption,canmount,mountpoint,mounted
NAME                            USED  DEDUP          SECONDARYCACHE  SHARESMB  ACLTYPE   OVERLAY  COMPRESS        ENCRYPTION   CANMOUNT  MOUNTPOINT               MOUNTED
bpool                          1.94G  on             metadata        off       off       off      lz4             off          off       /bpool                   no
bpool/BOOT                     1.92G  on             metadata        off       off       on       lz4             off          off       none                     no
bpool/BOOT/garuda               250M  on             metadata        off       off       off      zstd-3          off          noauto    /boot                    no
bpool/BOOT/kinetic              782M  on             metadata        off       off       on       lz4             off          noauto    /boot                    no
bpool/BOOT/pve30-cli            273M  on             metadata        off       off       on       lz4             off          noauto    /boot                    no
bpool/BOOT/pve30-gnm            658M  on             metadata        off       off       on       lz4             off          noauto    /boot                    no
bpool/grub                     5.37M  on             metadata        off       off       on       lz4             off          noauto    /boot/grub               no
rpool                           176G  off            metadata        off       posix     off      lz4             off          off       /rpool                   no
rpool/LINUX                     772M  off            metadata        off       posix     off      lz4             off          off       /                        no
rpool/LINUX/opt                 765M  off            metadata        off       posix     off      lz4             off          noauto    /opt                     no
rpool/LINUX/usr-local          6.95M  off            metadata        off       posix     on       lz4             off          noauto    /usr/local               no
rpool/ROOT                     42.4G  off            metadata        off       posix     off      lz4             off          noauto    /rpool/ROOT              no
rpool/ROOT/garuda              19.7G  off            metadata        off       posix     off      zstd-3          off          noauto    /                        no
rpool/ROOT/garuda/root         3.56G  off            metadata        off       posix     off      zstd-3          off          noauto    /root                    no
rpool/ROOT/garuda/srv           208K  off            metadata        off       posix     off      zstd-3          off          noauto    /srv                     no
rpool/ROOT/garuda/var          5.49G  off            metadata        off       posix     off      zstd-3          off          off       /var                     no
rpool/ROOT/garuda/var/cache    5.46G  off            metadata        off       posix     off      zstd-3          off          noauto    /var/cache               no
rpool/ROOT/garuda/var/lib       192K  off            metadata        off       posix     off      zstd-3          off          off       /var/lib                 no
rpool/ROOT/garuda/var/log      10.1M  off            metadata        off       posix     off      zstd-3          off          noauto    /var/log                 no
rpool/ROOT/garuda/var/tmp      15.5M  off            metadata        off       posix     off      zstd-3          off          noauto    /var/tmp                 no
rpool/ROOT/kinetic             7.26G  off            metadata        off       posix     off      lz4             off          noauto    /                        no
rpool/ROOT/pve30-cli           6.18G  off            metadata        off       posix     off      lz4             off          noauto    /                        no
rpool/ROOT/pve30-gnm           9.28G  off            metadata        off       posix     off      lz4             off          noauto    /                        no
rpool/USERDATA                 13.8G  off            metadata        off       posix     on       lz4             off          off       none                     no
rpool/USERDATA/garuda          11.3G  off            metadata        off       posix     off      lz4             off          noauto    /home                    no
rpool/USERDATA/kinetic          791M  off            metadata        off       posix     on       lz4             off          noauto    /home                    no
rpool/USERDATA/pve30-cli       3.43M  off            metadata        off       posix     on       lz4             off          noauto    /home                    no
rpool/USERDATA/pve30-gnm       1.76G  off            metadata        off       posix     on       lz4             off          noauto    /home                    no
rpool/data                     98.9G  off            metadata        off       posix     off      lz4             off          on        /data                    yes
rpool/data/media               4.01G  off            metadata        off       posix     off      lz4             off          on        /data/media              yes
rpool/data/temp                 192K  off            metadata        off       posix     off      lz4             off          on        /data/temp               yes
rpool/data/vm-300-disk-0       29.9G  off            metadata        -         -         -        lz4             off          -         -                        -
rpool/data/vm-300-disk-1        312K  off            metadata        -         -         -        lz4             off          -         -                        -
rpool/data/vm-300-disk-2        128K  off            metadata        -         -         -        lz4             off          -         -                        -
rpool/data/zvol                65.0G  off            metadata        off       posix     off      lz4             off          on        /data/zvol               yes
rpool/data/zvol/vm-101-disk-0  3.15M  off            metadata        -         -         -        lz4             off          -         -                        -
rpool/data/zvol/vm-101-disk-1  65.0G  off            metadata        -         -         -        lz4             off          -         -                        -
rpool/data/zvol/vm-101-disk-2  6.12M  off            metadata        -         -         -        lz4             off          -         -                        -
rpool/pve                      20.2G  off            metadata        off       posix     off      lz4             off          off       /                        no
rpool/pve/var-lib-pve-cluster   912K  off            metadata        off       posix     on       lz4             off          noauto    /var/lib/pve-cluster     no
rpool/pve/var-lib-vz           16.4G  off            metadata        off       posix     on       lz4             off          on        /var/lib/vz              yes
rpool/pve/zfsys                3.73G  off            metadata        off       posix     off      lz4             off          on        /zfsys                   yes
vault                           759G  off            all             off       off       off      lz4             off          off       /vault                   no
vault/devops                    306G  off            all             off       off       off      lz4             off          off       /                        no
vault/devops/PVE               84.1G  off            all             off       off       off      lz4             off          off       /var/lib                 no
vault/devops/PVE/vz            84.1G  off            all             off       off       off      lz4             off          on        /var/lib/vvz             yes
vault/devops/vm                 222G  off            all             off       off       off      lz4             off          off       /vm                      no
vault/devops/vm/vm-502-disk-0    88K  off            all             -         -         -        lz4             off          -         -                        -
vault/devops/vm/vm-502-disk-1  12.7G  off            all             -         -         -        lz4             off          -         -                        -
vault/devops/vm/vm-502-disk-2    64K  off            all             -         -         -        lz4             off          -         -                        -
vault/devops/vm/vm-510-disk-0  3.08M  off            all             -         -         -        lz4             off          -         -                        -
vault/devops/vm/vm-510-disk-1   209G  off            all             -         -         -        lz4             off          -         -                        -
vault/devops/vm/vm-510-disk-2  6.07M  off            all             -         -         -        lz4             off          -         -                        -
vault/media                     453G  off            all             off       off       off      lz4             off          off       /vault/media             no
vault/media/APP                 192G  off            all             off       off       off      lz4             off          off       /share                   no
vault/media/APP/downloads      15.8G  off            all             off       off       off      lz4             off          on        /share/downloads         yes
vault/media/APP/library_pc      176G  off            all             off       off       off      lz4             off          on        /share/library_pc        yes
vault/media/DOCS               26.6G  off            all             off       off       off      lz4             off          off       /share                   no
vault/media/DOCS/personal      26.6G  off            all             off       off       off      lz4             off          noauto    /share/personal          no
vault/media/DOCS/reference       96K  off            all             off       off       off      lz4             off          noauto    /share/reference         no
vault/media/LINUX              1.29G  off            all             off       off       off      lz4             off          off       /share                   no
vault/media/LINUX/lxsteam      1.29G  off            all             off       off       on       lz4             off          on        /home/mike/.local/Steam  yes
vault/media/MUSIC               167G  off            all             off       off       off      lz4             off          off       /share                   no
vault/media/MUSIC/dj_bylabel    167G  off            all             off       off       off      lz4             off          on        /share/dj_bylabel        yes
vault/media/PHOTO               288K  off            all             off       off       off      lz4             off          off       /share                   no
vault/media/PHOTO/albums         96K  off            all             off       off       off      lz4             off          noauto    /share/albums            no
vault/media/PHOTO/public         96K  off            all             off       off       off      lz4             off          noauto    /share/public            no
vault/media/video              66.2G  off            all             off       off       off      lz4             off          off       /share                   no
vault/media/video/library      66.2G  off            all             off       off       off      lz4             off          on        /share/library           yes
