** Comment from web user: brianlala **
Right... so this is sort of by design. Before I implemented this, I was finding that:
* Launching the script on one server (let's call it the 'controller') and setting RemoteInstall Enable="true" would, as expected, launch an instance remotely on every other server in the farm.
* However, each server in an SP2013 farm (due to prereqs etc.) needed to reboot (a couple of times, even).
* Upon returning from a reboot, each server would see RemoteInstall Enable="true" and, upon completing its local install, would then try to go out to each _other_ server and do a remote install (each thinking it was the 'controller'), effectively all stepping on each other's toes.
So basically I changed the behaviour so that (at least if you set ParallelInstall to true) you initially launch a bunch of remote sessions; then, as each server reboots and autologon does its thing, it independently continues its local installation, and you end up with basically the same result.
The immediate workaround, then, is either to ensure that ParallelInstall is true and let the reboots & autologon do their thing, or to kick off the entire remote install from a server that's not a target for SP installation (e.g. I like to use the farm's SQL server). This non-SharePoint server acts as the 'controller' for the initial remote launch.
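For reference, the settings I'm talking about live in the input XML and look roughly like this (quoting from memory, so double-check the exact element names and placement in your own AutoSPInstallerInput.xml; the password value is just a placeholder):

    <Install>
      <!-- Enable="true" launches the install remotely on each other farm server;
           ParallelInstall="true" kicks them all off at once rather than one at a time -->
      <RemoteInstall Enable="true" ParallelInstall="true"/>
      <!-- Lets each server log itself back in and resume its local install
           after the prereq reboots -->
      <AutoAdminLogon Enable="true" Password="FarmAccountPasswordHere"/>
    </Install>

With that in place you only need to launch the script once from the 'controller', and each target server will carry on by itself through its reboots.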
Hope that makes sense?
Brian