[illumos-Developer] zpool upgrade and zfs upgrade behavior on b145

Chris Mosetick cmosetick at gmail.com
Thu Sep 9 17:56:48 PDT 2010


Not sure what the best list to send this to is right now, so I have selected
a few; apologies in advance.

A couple of questions.  First, I have a physical host (call him bob) that was
just installed with b134 a few days ago.  I upgraded to b145 using the
instructions on the Illumos wiki yesterday.  The pool has been upgraded (to
version 27) and the zfs file systems have been upgraded (to version 5).

chris@bob:~# zpool upgrade rpool
This system is currently running ZFS pool version 27.
Pool 'rpool' is already formatted using the current version.

chris@bob:~# zfs upgrade rpool
7 file systems upgraded

The file systems have been upgraded according to "zfs get version rpool"

Looks ok to me.

However, I now get an error when I run zdb -D.  I can't remember exactly
when I turned dedup on, but I moved some data onto rpool, and "zpool list"
shows a 1.74x dedup ratio.

chris@bob:~# zdb -D rpool
zdb: can't open 'rpool': No such file or directory

Also, running zdb by itself returns the expected output, but it still says my
rpool is version 22.  Is that expected?

I never ran zdb before the upgrade, since it was a clean install from the
b134 iso that went straight to b145.  One thing I will mention is that the
hostname of the machine was changed too (using these
instructions<http://wiki.genunix.org/wiki/index.php/Change_hostname_HOWTO>).
bob used to be eric.  I don't know if that matters, but I can't open
"Users and Groups" from Gnome anymore ("unable to su"), so something is
still not right there.

Moving on, I have another fresh install of b134 from iso inside a virtualbox
virtual machine, on a totally different physical machine.  This machine is
named weston and was upgraded to b145 using the same Illumos wiki
instructions.  Its hostname has never changed.  When I run the same zdb -D
command, I get the expected output.

chris@weston:~# zdb -D rpool
DDT-sha256-zap-unique: 11 entries, size 558 on disk, 744 in core
dedup = 1.00, compress = 7.51, copies = 1.00, dedup * compress / copies = 7.51
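For what it's worth, the combined figure on that line is just dedup *
compress / copies, as zdb itself labels it; recomputing it from the numbers
zdb printed (a quick awk sketch, nothing more):

```shell
# Recompute the combined ratio from the values in the zdb -D output above
# (dedup * compress / copies).
awk 'BEGIN { dedup = 1.00; compress = 7.51; copies = 1.00
             printf "%.2f\n", dedup * compress / copies }'
# prints: 7.51
```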

However, after the zpool and zfs upgrades *on both machines*, zdb still says
the rpool is version 22.  Is that expected/correct?  I added a new virtual
disk to the vm weston to see what would happen if I made a new pool on the
new disk.

chris@weston:~# zpool create test c5t1d0

Well, the new "test" pool shows version 27, but rpool is still listed at 22
by zdb.  Is this expected/correct behavior?  See the output below for the
rpool and test pool version numbers according to zdb on the host weston.


Can anyone provide any insight into what I'm seeing?  Do I need to delete my
b134 boot environments for rpool to show as version 27 in zdb?  Why does
"zdb -D rpool" give me "can't open" on the host bob?

Thank you in advance,

-Chris

chris@weston:~# zdb
rpool:
    version: 22
    name: 'rpool'
    state: 0
    txg: 7254
    pool_guid: 17616386148370290153
    hostid: 8413798
    hostname: 'weston'
    vdev_children: 1
    vdev_tree:
        type: 'root'
        id: 0
        guid: 17616386148370290153
        create_txg: 4
        children[0]:
            type: 'disk'
            id: 0
            guid: 14826633751084073618
            path: '/dev/dsk/c5t0d0s0'
            devid: 'id1,sd@SATA_____VBOX_HARDDISK____VBf6ff53d9-49330fdb/a'
            phys_path: '/pci@0,0/pci8086,2829@d/disk@0,0:a'
            whole_disk: 0
            metaslab_array: 23
            metaslab_shift: 28
            ashift: 9
            asize: 32172408832
            is_log: 0
            create_txg: 4
test:
    version: 27
    name: 'test'
    state: 0
    txg: 26
    pool_guid: 13455895622924169480
    hostid: 8413798
    hostname: 'weston'
    vdev_children: 1
    vdev_tree:
        type: 'root'
        id: 0
        guid: 13455895622924169480
        create_txg: 4
        children[0]:
            type: 'disk'
            id: 0
            guid: 7436238939623596891
            path: '/dev/dsk/c5t1d0s0'
            devid: 'id1,sd@SATA_____VBOX_HARDDISK____VBa371da65-169e72ea/a'
            phys_path: '/pci@0,0/pci8086,2829@d/disk@1,0:a'
            whole_disk: 1
            metaslab_array: 30
            metaslab_shift: 24
            ashift: 9
            asize: 3207856128
            is_log: 0
            create_txg: 4
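To make the version discrepancy easier to eyeball, here is a small sketch
that pulls just the pool name and version out of zdb-style output.  The
sample string below is trimmed from the run above; on a live system you
could pipe zdb straight into the awk instead:

```shell
# Extract pool name and on-disk version from zdb-style output.
# Sample text trimmed from the zdb run above.
zdb_output='rpool:
    version: 22
test:
    version: 27'
printf '%s\n' "$zdb_output" | awk '
/^[^ ].*:$/   { pool = substr($0, 1, length($0) - 1) }  # unindented "name:" line
/^ +version:/ { print pool, $2 }'                       # indented version field
# prints:
# rpool 22
# test 27
```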