I have another question.
Say each player syncs to the server at 24 KB/s.
That would mean the server needs 24 KB of RAM per player, per second.
If your server has 7.5 GB of RAM and you budget 6.8 GB of it, keeping the rest as overhead, that means
6,800,000 KB / 24 KB ≈ 283,000 players can be handled by that RAM. If you only have 2 vCPUs, you might have to cut that number down to 10% of the total in case something goes wrong, so about 28,000 players can be handled. Is this true or false? And if the server connection is 1 Gbps, i.e. 125 MB/s (125,000 KB/s), then it should be able to handle about 5,200 players total?
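To make the arithmetic explicit, here's the back-of-envelope calculation I'm doing as a quick Python sketch (the 24 KB/s per-player figure, the usable RAM, and the 10% CPU safety factor are just my assumptions from above, not numbers from your docs):

```python
# Back-of-envelope capacity estimate. All figures are assumptions, not
# measured values: 24 KB/s per player, 6.8 GB of usable RAM, a 10% safety
# factor for 2 vCPUs, and a 1 Gbps (~125,000 KB/s) uplink.

PER_PLAYER_KB_S = 24           # assumed per-player sync rate
USABLE_RAM_KB = 6_800_000      # 6.8 GB expressed in KB
CPU_SAFETY_FACTOR = 0.10       # keep only 10% in case something goes wrong
LINK_KB_S = 125_000            # 1 Gbps ≈ 125 MB/s ≈ 125,000 KB/s

ram_limit = USABLE_RAM_KB // PER_PLAYER_KB_S        # ≈ 283,000 players
cpu_limited = int(ram_limit * CPU_SAFETY_FACTOR)    # ≈ 28,000 players
bandwidth_limit = LINK_KB_S // PER_PLAYER_KB_S      # ≈ 5,200 players

# The effective cap is the smallest of the three limits.
effective_cap = min(cpu_limited, bandwidth_limit)
print(ram_limit, cpu_limited, bandwidth_limit, effective_cap)
```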
If 5,200 players per server at 24 KB/s is correct, based on the lowest of those limits (the internet connection's bandwidth), then do you provide instances that come with 1 vCPU and 1 GB of RAM? In theory that should be enough to handle that many players. And if that's true, doesn't that mean that both Improbable and the person paying for the service are overpaying?
It's just that 1,000 players sounds like too low a number, all things considered. And according to you, that would be $10.80 an hour, or roughly $93,000 a year, for 1/5th of the player count.
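(For reference, if the instance runs around the clock, that's $10.80/hour × 24 hours × 365 days ≈ $94,600 a year, so the ~$93,000 figure is in the right ballpark.)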
If a single instance can handle 5,200 players, then multiple instances shouldn't be needed!
If you can accurately explain and describe why this is not the case, that would be appreciated.
If you could also provide the bandwidth that one player using the shooter template uses, that would also be helpful. Thanks.