I am pretty sure the field of view of each eye tracker would overlap with the other's, and that would give you unpredictable results.
Theoretically, if it were possible to make every eye tracker aware of the others and sync/calibrate them so that only one is active for a particular area of the screen, I believe that could work. It sounds like something that could be done at the software level, but I do not think it is possible at the moment. The same could apply to ultra-wide curved monitors.
As a starting point for that, how would Windows handle two eye trackers connected to the same PC? Is it possible to address a particular eye tracker for, let's say, calibration? And at the API level, is it possible to intercept the coordinates from two eye trackers, process them, and then forward them as if they came from a single eye tracker?
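To make the "intercept, process, forward as one" idea concrete, here is a toy sketch of the spatial hand-off logic. Everything in it is an assumption for illustration: `GazeSample`, `GazeRouter`, the `"left"`/`"right"` tracker IDs, and the 0.5 split point are all made up, and no real eye-tracker SDK is called. A real implementation would receive samples via the vendor's callback API and feed them into something like the `on_sample` method.

```python
from dataclasses import dataclass
from typing import Optional

# Hypothetical gaze sample. Real SDKs deliver similar structures
# (normalized screen coordinates) through a streaming callback.
@dataclass
class GazeSample:
    tracker_id: str  # which physical tracker produced this sample
    x: float         # normalized 0..1 across the combined desktop
    y: float         # normalized 0..1 vertically

def owner_of(x: float) -> str:
    # Simple spatial hand-off: the left tracker "owns" the left half
    # of the desktop, the right tracker owns the right half.
    return "left" if x < 0.5 else "right"

class GazeRouter:
    """Merge two physical trackers into one virtual gaze stream.

    A sample is forwarded only if it falls inside the region its
    tracker is responsible for; samples from the overlapping field
    of view of the *other* tracker are suppressed.
    """
    def __init__(self) -> None:
        self.last_forwarded: Optional[GazeSample] = None

    def on_sample(self, sample: GazeSample) -> Optional[GazeSample]:
        if owner_of(sample.x) == sample.tracker_id:
            self.last_forwarded = sample
            return sample  # expose as the single "virtual" tracker
        return None        # the other tracker owns this region: drop
```

For example, a sample at x=0.2 reported by the right-hand tracker would be dropped, because that area belongs to the left-hand one. A production version would need hysteresis around the boundary and timestamp-based synchronization, since the two trackers do not sample in lockstep.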