First off, this differs from my other thread because this scheme is 100% client-centric, and has no need for a server at all except for maybe tracking.
This would be a scheme for a scalable, distributed P2P multiplayer setup. It wouldn't be able to detect or combat hacked clients, since oversight would be too taxing, but for the frequent case of friends just wanting to play a private game but lacking a server, this would be a godsend.
The system works by distributing every chunk of the world to every user with a bittorrent-like peer-to-peer protocol. This can be done fairly quickly and easily due to the relatively small size of maps and the fact that there will be a high seeder-to-leecher ratio at any given time. Each chunk has a globally-synchronized timestamp of the last time it was updated. Newer versions of the chunk replace older versions in the swarm. The torrent system would be used strictly for archived chunks, which are chunks that all the players have left and are no longer editing, or are occupied and have been routinely compressed and saved to file every few seconds. It is slower and lower priority, but keeps everyone on roughly the same page.
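To make the "newer versions replace older versions" rule concrete, here's a rough Python sketch of the kind of chunk index each client could keep. All the names here (the class, the coordinate key, the payloads) are made up for illustration, not anything from the actual game:

```python
class ChunkSwarmIndex:
    """Hypothetical per-client index of archived chunks.

    A copy with a newer archive timestamp replaces an older copy,
    which is how the torrent-like layer supersedes stale chunks.
    """

    def __init__(self):
        self._chunks = {}  # (cx, cz) -> (timestamp, payload)

    def offer(self, coord, timestamp, payload):
        """Accept an offered chunk only if it is newer than what we hold."""
        current = self._chunks.get(coord)
        if current is None or timestamp > current[0]:
            self._chunks[coord] = (timestamp, payload)
            return True
        return False  # stale copy from the swarm; ignore it

    def latest(self, coord):
        """Return (timestamp, payload) for the newest copy we have."""
        return self._chunks.get(coord)
```

The point is just that the globally-synchronized timestamp is the version id, so two peers comparing copies never need a server to decide which one wins.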
A chunk is defined as occupied if a client has a player or entity located on that chunk or within manipulable distance of that chunk.
Each client is aware of every other player's position and their view distance. If a chunk is visible to a player, we will call that chunk relevant. If a chunk is occupied by multiple players, we will call that chunk contested. The torrent system would prioritize relevant chunks over unseen chunks. Relevant chunks that are occupied by another player are handled by the second system described below.
Using ping and bandwidth estimations, the clients build and maintain a routing table. For relevant and contested chunks, the clients modifying a chunk use the routing table to figure out the fastest route to inform all relevant players of their action, prioritizing contested players over merely relevant players.
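A rough sketch of how the routing-table ordering could work, again in made-up Python (the cost weighting is a guess, just to show the idea of ranking peers by ping and bandwidth and notifying contested players first):

```python
def build_routing_table(peers):
    """peers: dict of name -> (ping_ms, upload_kbps).
    Returns peer names sorted so the fastest links come first."""
    def cost(name):
        ping_ms, upload_kbps = peers[name]
        # Lower ping and higher upload score better; weights are arbitrary.
        return ping_ms + 1000.0 / max(upload_kbps, 1)
    return sorted(peers, key=cost)


def notify_order(routing_table, contested, relevant):
    """Contested players get the update first, then merely relevant
    players, each group in routing-table (fastest-first) order."""
    ordered = [p for p in routing_table if p in contested]
    ordered += [p for p in routing_table if p in relevant and p not in contested]
    return ordered
```

So a client modifying a contested chunk walks this list top to bottom when sending out its action.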
Updates would consist of actions done to the chunk since the last archiving using the update protocol Notch mentioned in the dev blog. Since an archiving happens once every few seconds, the sequence of events is never very complicated and the data transferred remains very small. After a set amount of time has passed, and all parties agree on the state of the chunk, the chunk is archived again and distributed to all non-local parties via the torrent system.
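Since I don't know the details of Notch's actual update protocol, here's only a hypothetical sketch of the archive cycle: accumulate timestamped actions, reconcile them by timestamp order, snapshot, clear the log. Every name in it is invented:

```python
class ChunkSession:
    """Accumulates block actions since the last archive of one chunk."""

    def __init__(self, blocks):
        self.blocks = dict(blocks)  # (x, y, z) -> block type
        self.pending = []           # actions since the last archive

    def apply(self, timestamp, pos, block_type):
        """Record an action; order doesn't matter until archiving."""
        self.pending.append((timestamp, pos, block_type))

    def archive(self):
        """Reconcile pending actions by timestamp, snapshot, clear the log.
        This snapshot is what gets compressed and handed to the torrent layer."""
        for _, pos, block_type in sorted(self.pending):
            self.blocks[pos] = block_type
        self.pending.clear()
        return dict(self.blocks)
```

Because an archive happens every few seconds, `pending` stays tiny, which is the whole reason the update messages stay small.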
Entities and other natural things would be hosted on whichever client is closest, and should be synchronized and able to switch seamlessly between clients. Map generation can be done by anyone, but only after they've loaded the entire world via the torrent.
This system would also be an easy way to implement infinite maps, since it plays out on the client just like single player except when your player is near other players. You also wouldn't have to worry about server crashes, since all clients have identical copies of the world, give or take a few seconds of modifications plus propagation time.
Again, this wouldn't work without complete trust between the clients, would use a lot more total bandwidth than a centralized server, and would have some difficulty dealing with particularly slow clients, but it wouldn't need any server-grade hardware and fills a niche that a lot of players would probably appreciate.
Depending on just how much control Notch gives us over servers and multiplayer protocol, we might even be able to implement this independently as a 3rd party server app.
Did I miss anything?
**********An Argument for Why this Protocol Will Work*********************
Let me first explain this:
If there are sufficient available seeders and your upstream traffic is not overloading your network, bittorrent can and will max out your bandwidth. It is the optimal method for distributing files over a large number of computers, paradoxically increasing in speed as the number of peers increases. Distributing iterations of the archived world map is close to the ideal environment for bittorrent. At any given time, all clients will have something like a minimum of 95% of all chunks up-to-date, each chunk is so small its compressed form can fit into a single UDP packet, and no client is dividing its bandwidth between multiple torrents at any given time. Even with slow connections, all nodes should be able to update all chunks within a few seconds, and even if they can't, the player won't see the chunks that are out of date.
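The single-UDP-packet claim is easy to sanity-check. A sketch, assuming zlib compression and a conservative ~1400-byte datagram payload under a typical 1500-byte MTU (both of those numbers are my assumptions, not anything from the game):

```python
import zlib

def fits_in_udp(chunk_bytes, mtu_payload=1400):
    """Check whether a chunk's zlib-compressed form fits in one datagram.

    A mostly-uniform chunk (lots of air or stone) is extremely
    repetitive, so it compresses to a tiny fraction of its raw size.
    """
    return len(zlib.compress(chunk_bytes, 9)) <= mtu_payload
```

For example, a hypothetical 16x16x128 chunk of one block type is 32,768 raw bytes but compresses to a few dozen, so it fits with huge room to spare; only a very "noisy" chunk would need to spill into a second packet.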
The bittorrent protocol is used by many games, most notably Blizzard titles, for static updates like patches, but they always use centralized servers for actual gameplay since bittorrent is simply not responsive enough. But that's fine. We are doing the same thing here.
For fast response time, a monolithic server is an absolute necessity. The server handles player/player and player/environment interactions, resolves all disputes over who did what and in what order, and presumably that server has low latency for all clients. Well, we don't need a server large enough to encompass the entire world population, because there is nothing a player on the north end of the continent can do to affect the player on the south end that can't be handled via bittorrent.
So we make local servers for handling local interactions, and sub-servers for distributing and collecting messages in that area, all made up of local clients and organized by optimal speed and bandwidth. If there are too many people in one place and the local nodes are overloaded, non-local nodes will begin joining in to help distribute the load. This would diminish the torrent bandwidth, but that makes no immediate difference.
Having a multi-tiered server setup like this is sort of like a ghetto-rigged cluster farm like what large-population MMOs actually use. This is harder to do the more involved the server is in how the game is run, but in our situation, servers have a very simple task. In Notch's block update protocol, the order of actions is based on timestamps and can be reconciled after the fact. Due to this and the fact that servers won't be doing legal-move verification, the only job they really have is to pass messages as quickly and efficiently as possible.
Interestingly, if this local server/cluster method and the global torrenting method are beginning to sound a lot like the same thing, that is because they are. The only differences between the two are the kind of data being passed between clients (based on the proximity of one player to another) and which clients you are sending data to (based on speed and local priority, but optimized globally; I think this problem may be NP-complete, but for a few dozen to a few hundred peers, who cares?).
I still need to solidify the protocol for verifying that everyone in a contested area has received every update and agrees on the results before archiving, and also how the routing tables will work, but this all looks very doable.
This sounds like it would work well, and possibly be easier (for the player) to set up than a custom private server. For these reasons, I like this idea.
Great for anyone without a server-style internet and computer setup. It should be an option, maybe determined automatically based on which would work better for your current setup.
I only understood 50% of that but it sounds great!
I'm not totally convinced the OP actually does either. Bittorrent is for static files. Once you start changing chunks, I'm not sure if error-checking will still work anymore. At least all the buzzwords seem to be in the right locations.
As far as distributing files when servers are offline? Sure. Lock the size at something manageable (for broadband), like half a gig of data.
And distributed servers wouldn't be impossible either. Just not "Bittorrent" like it's a magic word that makes things work. It would be... fairly easy (?) to simply keep a local copy on each client, changing stuff as needed. The problem is, each player needs to download all the changes since they left, each time they reconnect. This means players will be stuck not-playing for protracted periods of time, as they download static (yet changed) chunks, and update those with dynamically changed chunks. Instead of "block X, Y, Z on chunk 2134 is now type 37" they'll need to do a LOT more.
Instead of a server keeping track, each player will need to send a copy of all their block changes to each co-client- shouldn't be too bandwidth intensive though, I suppose.
For LAN games, this might be workable. Use the torrent to distribute a single saved world to each client. Have each client use the relatively unlimited LAN bandwidth.
Pros: Lots more monsters. Each PC might keep track of a set of monsters, and simply update their positions.
Cons:
- New players "connecting" to a changing world could be very hard. Look at any FPS match-style game right now: each player loads a local copy of the map, and then the positions of all the players, names, dropped items, etc. Now add in a changing world map of large size.
- Downloading a lot of changes for a large map on every join would be... interesting. Chunks would need "last updated" values, so clients don't get old data, then have to toss it and get NEW data.
- Cheater vulnerability: imagine if someone figures out how to join and re-update every chunk...
- Bandwidth issues can be run into, impacting gameplay.
- Who's going to be the tracker? DHT protocol? More bandwidth.
- Some people (well, their ISPs) have issues with Bittorrent stuff. A lot of Blizzard's (for example) users cannot download their stuff via torrents. This has been an issue since beta, in fact. (I was there!)
Alright, I think I made a mistake in calling it bittorrent. I was using it to illustrate the nature of the protocol rather than to imply it was true bittorrent. The protocol, like bittorrent, connects peers to one another and arranges for each peer to contribute new sections of the world to leeching clients. Unlike bittorrent, the tracker (which will be centralized until I work out a more elegant way to store it) keeps track of the latest version id of each chunk, i.e. the synchronized timestamp from when it was last archived. When a new chunk is archived, the tracker broadcasts the new chunk id to all clients, which ask around for it bittorrent-style until they get the transfer. The distribution is slow at the start, but picks up speed as more clients get the piece. This is fine because, again, these are low-priority pieces being transferred by this protocol. In any case, it's no worse off at the start than a centralized server would be on the same connection.
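To pin down what the tracker would actually do, here's a minimal sketch. It only stores version ids and broadcasts announcements; the chunk data itself still moves peer-to-peer. Class and method names are invented for illustration:

```python
class Tracker:
    """Centralized tracker: holds the latest archived version id per
    chunk and broadcasts new ids to every registered client."""

    def __init__(self):
        self.latest = {}   # (cx, cz) -> version id (archive timestamp)
        self.clients = []

    def register(self, client):
        self.clients.append(client)

    def announce(self, coord, version_id):
        """Accept an archive announcement; ignore stale ones."""
        if version_id > self.latest.get(coord, -1):
            self.latest[coord] = version_id
            for client in self.clients:
                client.on_new_version(coord, version_id)


class Client:
    """Stub client: queues newly-announced chunks to fetch from peers."""

    def __init__(self):
        self.wanted = []  # (coord, version_id) pairs to ask peers for

    def on_new_version(self, coord, version_id):
        self.wanted.append((coord, version_id))
```

Note the tracker never touches chunk payloads, which is why its bandwidth needs stay trivial even as the world grows.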
Error checking will be in the form of checksums, but will be unable to discriminate against exploits or hacked clients. If you can think of a cost-effective way to handle it, I'm all ears.
Without a persistent online presence, there would be no way to keep the world consistent. Simply can't be done. Unless you have fantastic logistical skills, you will need a client that stays in the game and acts as a seed when needed. That client doesn't need to keep an active player, though. It's annoying to have to rely on something that's essentially a server in a serverless configuration, but I don't see any way around that. If it's any consolation, you could probably stick a whole bunch of groups' worlds on the same server, since that's all it would be doing.
As far as bandwidth concerns go, local change messages will almost certainly use UDP, which is very susceptible to overloading the client's bandwidth and will need to be carefully monitored and regulated. To make things easier, the torrent-like chunk updating method will probably remain TCP for the time being, just to have fewer things to manage.
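By "monitored and regulated" I mean something like a token bucket on the send side, so a burst of block changes can't saturate a client's upstream. A sketch, with made-up numbers:

```python
class TokenBucket:
    """Simple token-bucket limiter for outgoing UDP change messages.

    rate is in bytes/second; burst is the largest instantaneous spike
    allowed. Messages that don't fit get queued or dropped upstream.
    """

    def __init__(self, rate, burst):
        self.rate = rate
        self.burst = burst
        self.tokens = burst
        self.last = 0.0

    def allow(self, size, now):
        """Return True if a message of `size` bytes may be sent at time `now`."""
        # Refill tokens for the time elapsed since the last check.
        self.tokens = min(self.burst, self.tokens + (now - self.last) * self.rate)
        self.last = now
        if size <= self.tokens:
            self.tokens -= size
            return True
        return False
```

Each client would tune `rate` to some fraction of its measured upstream so the torrent layer and the game itself still have headroom.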
Multiple players logging in shouldn't be too much of a problem during regular use because players will enter/exit at a staggered rate and the seeders will far outnumber the new leechers at any given time. I will admit, though, starting a brand new world and everyone jumping in all at once would suck something fierce.
tl;dr answer:
Not bittorrent, but kind of like bittorrent.
Most of those cons shouldn't be a problem most of the time. The others are major concerns with no easy solution I can see.
I'm really hoping that the small filesize will let me hand-wave the really tricky load balancing problems.
It's a kind of network with its own separate pros and cons, and one of its cons is that it needs a lot of peers connected in order to reach top speed.
Plus everyone who plays the game using this will drive up each other's bandwidth usage with their ISPs...
Some countries put limits in GB on internet use, you know... Having to communicate continuously with multiple clients in a "server" will use up more resources...
It's kinda like this: you can't gain anything good without sacrificing something.
Echo on Blizzard's problem; however, it is important to note that Blizzard and most other games don't really use the bittorrent protocol, but their own proprietary FTP protocols, systems they have put millions into in some cases, like SOE. Notch doesn't have that kind of development money or teams to do it for him.
Equ1N0X-RedStone Engineer and Cybernetic architect.