Linux zfs boot issue

My system runs ZFS with three pools: bpool, rpool, and docker-pool. While attempting to back up an NVMe drive that Home Assistant is installed on, I put the NVMe in an enclosure and connected it to the ZFS system. Unfortunately, the system froze during the backup. I did a hard reset, but the Home Assistant NVMe was still connected when the system came back up, which caused a problem because that NVMe also has a /boot partition on it. Now the ZFS system isn't importing bpool properly on boot and drops into emergency mode. From emergency mode I'm able to mount bpool to /boot, but booting does not continue when I try to leave emergency mode. My intuition is that I need the block ID of bpool and should update fstab with it, but I'm not sure. Please, I need help recovering this system.
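For reference, this is roughly what I've been running from emergency mode to get /boot mounted by hand (I'm typing this from memory, so the exact sequence may be slightly off):

```shell
# List pools the system can see but has not imported;
# the stale Home Assistant NVMe seemed to confuse this scan
zpool import

# Force-import bpool by name, ignoring the hostid mismatch
# left behind by the hard reset
zpool import -f bpool

# Mount the datasets; with a ZFS /boot the mountpoint normally
# comes from the dataset's mountpoint property, not from fstab
zfs mount -a
```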

And btw, I love my kvm!

Thank you,

I don't have anything that boots from ZFS, so I'm not sure what's needed in your fstab. What does your current fstab look like? Can you post your zpool status and zfs list -t all output? What distribution are you using? Are you booting via GRUB, zfsbootmenu, or something else? And the recovery you're talking about: is it happening from the initrd?
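If it helps, something like this run from emergency mode should collect everything I'm asking about in one go (dataset names will of course differ per system):

```shell
# Gather diagnostics for the failed bpool import
cat /etc/fstab        # is /boot even listed here?
cat /etc/os-release   # which distribution
zpool status          # health of the pools that did import
zpool import          # pools visible but not yet imported
zfs list -t all       # all datasets and snapshots
```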

I am using ZFS on root. To be honest, if I were setting it up again, I would use ext4 for the boot partition instead of ZFS. A recent distribution upgrade broke my ZFS boot process; at some point I had to use zfsbootmenu to boot my system.

I should provide an update on this. I overreacted a bit and assumed the problem was worse than it was. It turned out that, in the process of trying to back up the Home Assistant drive, there must have been only a little space left on my root partition, and the backup filled the drive to 100%. I'm not sure why that caused it to boot only into emergency mode, but after clearing some files from the root partition the system booted normally again. Thanks for the suggestions though!
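For anyone who lands on this thread later, these are the sorts of commands that revealed the full disk and freed space (the snapshot name below is just a placeholder; check your own zfs list first):

```shell
# Confirm the root pool is out of space
zfs list -o name,used,avail,refer rpool
df -h /

# Snapshots often hold the space; list them sorted by size
zfs list -rt snapshot -o name,used -s used rpool

# Destroy an old snapshot to reclaim space
# (placeholder name; verify against your own output)
# zfs destroy rpool/ROOT/ubuntu@old-snapshot
```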