I’m a bit confused why my TrueNAS Scale boot drive doesn’t have enough space for an update. This is my first time ever performing a TrueNAS update, as I only built the server and installed the OS a little over a month ago.
The boot device is an Intel Optane M10 16GB drive which only has TrueNAS Scale on it (running on bare metal).
The storage size shows up differently in two places: System Settings → Boot reports 12.5GiB, while Storage → Disks reports 13.41GiB.
And as far as I’m aware, a 16GB drive should show up as 14.9GiB? The same way the 10TB drives, for instance, show up as 9.1TiB…?
As far as I know, I could fix this issue by making a backup of the config file, reinstalling the most recent version of TrueNAS Scale, logging in, and importing the config file, and everything would basically “be the same” (datasets, permissions, networking settings, VMs, apps…).
But I’m just curious why I’m having this issue at all, since the 16GB drive should fulfil the 16GB boot-drive requirement for TrueNAS Scale. Also, apologies if this is a very noob issue… as it probably is.
Thanks to anybody who reads this & I hope you have a nice day!
I don’t remember offhand if those totals include snapshots. It seems unlikely you’d have that many boot-pool snapshots within a month, but it might be something to check on before reinstalling.
I checked the “Boot Environments” & there’s only the one entry there. If anything, I’d guess I didn’t format the drive appropriately during install or something… The 13.41GiB is weird to me, as it should show up as 14.9GiB, & I believe that if I had 14.9 there wouldn’t be any issue.
I have multiple entries that stem from previous upgrades, which I thought might be consuming the missing space.
Now that we know that’s not the case, I suspect that good old GB vs. GiB confusion robbed you of your missing storage.
Also, TrueNAS needs to partition the boot drive to enable UEFI boot. This will take away a little over half a GB of space.
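Rough back-of-the-envelope math, assuming the module really holds the advertised (decimal) 16GB:

16GB = 16 × 10^9 bytes = 16,000,000,000 bytes
16,000,000,000 ÷ 1,073,741,824 (bytes per GiB) ≈ 14.9GiB
14.9GiB − roughly 0.5GiB for the boot/EFI partitions ≈ 14.4GiB left for the boot-pool

So about 14.4GiB of usable space would be expected; the 12.5–13.41GiB you’re seeing is what suggests something else is off.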
To validate (example output is sketched after these steps):
access the shell (System Settings → Shell)
become root (sudo -i; enter user credentials)
lsblk lists all partitions sorted by drive, including size in GiB. You should have no problem identifying your boot drive. It should list SIZE for the complete drive as about 14.9G. Also, it should have 3 partitions; only one, the largest, is used for storage of the boot-pool. This partition should be listed with a SIZE of around 14.4G.
lsblk -b lists the same data, but in total bytes. Some calculator math should clarify that nobody stole storage from your 16GB drive.
zfs list boot-pool will show you the total space USED and AVAIL in your boot pool partition in GiB. USED+AVAIL should total close to 14.4G
zfs list boot-pool -p will show the same in bytes, allowing you to verify no bytes were harmed or lost in creating the boot pool.
Bonus: you can list all datasets of your boot pool, including how much space is consumed by dependent datasets, using the following command: zfs list boot-pool -ro space
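For reference, roughly what I’d expect to see on a healthy 16GB boot drive (the device name and exact numbers below are only an illustration; yours will differ):

lsblk
NAME        MAJ:MIN RM  SIZE RO TYPE MOUNTPOINTS
nvme0n1     259:0    0 14.9G  0 disk
├─nvme0n1p1 259:1    0    1M  0 part
├─nvme0n1p2 259:2    0  512M  0 part
└─nvme0n1p3 259:3    0 14.4G  0 part

zfs list boot-pool
NAME        USED  AVAIL  REFER  MOUNTPOINT
boot-pool  4.02G  10.3G    96K  none

If lsblk already shows the whole disk as noticeably less than 14.9G, then it’s the drive itself that is smaller than advertised, not TrueNAS eating your space.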
Thank YOU for taking the time!
And thank you for the excellent guide/explanation, I really appreciate it!
It would seem that the “16GB” Optane module is not actually 16GB.
I bought 2 of these drives on AliExpress a few weeks ago, so I plugged the other one into the NAS via USB.
The drives say they’re 16GB, so they should show up as roughly 16 × 0.93 ≈ 14.88GiB. I also plugged in another random 16GB USB flash drive, & while it didn’t quite hit the 14.88, it did show a higher storage capacity than the Optane.
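In case it’s useful to anyone else checking theirs: from the shell, lsblk -b -d -o NAME,SIZE prints the raw byte count for each drive (-b for bytes, -d to hide the partitions). A drive that genuinely holds 16GB should report at least 16,000,000,000 bytes there; anything noticeably below that would line up with the 13.41GiB reading.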
So, I guess my conclusion is that these Optane drives aren’t really 16GB, or at least not the 2 I got from AliExpress; perhaps these are some “not quite perfect” models or something.
If anything, this does confirm that these are unfortunately not suitable for TrueNAS Scale. @jode Thank you very much for your time & help!