Teaming NICs between Windows 10 and XenServer

I have a Windows PC where I'm trying to team/bond two NICs together for more throughput between it and my XenServer host. The two machines are directly attached. On the Windows side I installed the Intel drivers and configured Static Link Aggregation (SLA); I tried the other teaming modes, but none of them gave me a 2Gb connection speed. On the XenServer side I run two NICs in an Active-Active bond. I tried other configurations there as well, but this is the only combination that shows a 2Gb connection speed on both machines.

I'm really at a loss. I've tested with iperf and with file transfers from my SSD to the Samba share. Link lights for both connections are on, and both flash when I push traffic over the link. I even ran multiple tests at once to make sure it wasn't a 1Gb-per-session limit. I've spent a few days on this and haven't gotten anywhere.
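In case the exact configuration matters, this is roughly how the bond was set up on the XenServer side (a sketch from memory; the network name is arbitrary and the PIF UUIDs are placeholders for the real ones):

```
# List the physical interfaces to find the UUIDs of the two NICs
xe pif-list params=uuid,device

# Create a network for the bond, then the bond itself.
# balance-slb is XenServer's Active-Active mode.
NET=$(xe network-create name-label=bond0-net)
xe bond-create network-uuid=$NET pif-uuids=<eth0-uuid>,<eth1-uuid> mode=balance-slb
```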

Thanks for any help,
Maxamus456

Of course it was something stupid: it mostly just needed a reboot to start working. I can now get multiple 1Gb connections at once, but still not a single 2Gb connection.
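For anyone following along, this is roughly how I've been testing (iperf 2 syntax; the IP is a placeholder for the iperf server):

```
# Single TCP stream - tops out around 1Gb/s for me
iperf -c 10.0.0.10 -t 30

# Several parallel streams (-P 4) to the same endpoint - still about
# 1Gb/s total in my case, which is what rules out a per-session cap
iperf -c 10.0.0.10 -t 30 -P 4
```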

The way I finally got above 1Gb was to run iperf against two VMs on the bonded interface. If I hit both at the same time from my PC I get 2Gb aggregate, but if I hit only one I get 1Gb. So the bond is working; the question now is why each individual VM is limited to 1Gb. Nothing useful shows up when I run ethtool against the interface or try to cat its speed file.
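For completeness, these are the checks that came up empty on the host (interface names are placeholders; substitute whatever the bond and its member NICs are actually called):

```
# Link speed of each bond member NIC
ethtool eth0 | grep -i speed
ethtool eth1 | grep -i speed

# Speed file for the bond interface itself ("bond0" is a placeholder) -
# this is the file that shows nothing useful for me
cat /sys/class/net/bond0/speed
```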