The console I spent the most time on was the GBA. It was so much fun...
Back to the subject. I wouldn't be surprised either. However, it would not be GBA games, but rather Game Boy Color games. I remember playing Mario Kart on the GBA, and I would be surprised if the positions there were stored as integers.
A Rocket Science tester and engineer. Well, at least when it will be released. =)
I'm French so English is not my native language. Please correct me when I make a mistake and don't worry for me, it will be even better. Thanks in advance! =) Current number of corrected posts/things: 7
I wonder if those old GBA games would qualify. It really wouldn't surprise me at all if those original Pokemon games used nothing but integers and fancy animation tricks for the world and everything in it.
I'm actually designing a game engine at the moment that avoids exactly this problem by using an integer grid. I'll let you know how it goes. =)
Yeah, the original Sapphire/Ruby/Emerald were probably integer-based. You move in person-sized steps.
As for the game block limits, I don't know them. lol
Basically, though, 32-bit is the first limit. I'll bet that's what Minecraft is based on right now, but I have nothing to do with the code. Most 32-bit chips can emulate 64-bit arithmetic, it just gets slower; native 64-bit is obviously faster. After 64-bit you hit 128. Intel has been rumored to be building a 128-bit CPU, with Microsoft building a specialized version of "Windows" for it; otherwise you're stuck with emulation. After that come the obvious 256-bit, 512-bit, 1024-bit. It was either Haswell or the next generation that gets 512-bit AVX (AVX-512). With that you can do 512-bit SIMD work, but it's slower for general use, has to be coded for specifically, and really works the CPU hard. It's also incompatible with anything older than the chips that support it.
tl;dr The only limits worth pushing are possibly to 64-bit.
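For a concrete look at what the first of those limits means, here is a minimal sketch of signed 32-bit wrap-around (Python's own integers are arbitrary-precision, so the overflow has to be simulated explicitly):

```python
# Simulate fixed-width signed 32-bit overflow, the first limit in the
# chain above. Real 32-bit hardware wraps exactly like this.

def wrap_int32(x: int) -> int:
    """Reduce x into the signed 32-bit range [-2**31, 2**31 - 1]."""
    return ((x + 2**31) % 2**32) - 2**31

print(wrap_int32(2**31 - 1))  # 2147483647, still fits
print(wrap_int32(2**31))      # -2147483648, overflow wraps negative
```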
If someone in an S class cuts me off, I don't get upset. He obviously has important things to do. Now, someone in a Toyota? We're fighting.
These forums are dumb and break my links. I'm bubbleawsome on steam too.
64 bits? That should be enough space for anyone. =)
Should be. Some people want more.
I was just pointing out that there is always a limit. Even if someone did take the code to AVX-512, you'd have people teleporting to the edge and wondering how to expand it. The signed 64-bit integer limit is +/- 9,223,372,036,854,775,807, about 9.2 quintillion. Holy crap.
With AVX-512 later this year (turns out it will be next-gen Knights Landing), a full 512-bit signed integer would top out around 2^511, roughly 6.7 x 10^153. (Strictly speaking, AVX-512 is SIMD over smaller lanes rather than native 512-bit integer math.)
I don't know how that translates into blocks though.
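To put those numbers in perspective, here is a quick sketch; the one-block-per-metre mapping is an assumption borrowed from Minecraft's coordinate grid:

```python
# Signed integer ceilings, and a rough "blocks" interpretation assuming
# one integer unit per one-metre block.

MAX_INT32 = 2**31 - 1   # 2,147,483,647 (~2.1 billion)
MAX_INT64 = 2**63 - 1   # 9,223,372,036,854,775,807 (~9.2 quintillion)

# At one block per metre, the signed 64-bit limit already reaches
# nearly a thousand light-years from the origin in each direction.
METERS_PER_LIGHT_YEAR = 9.4607e15
light_years = MAX_INT64 / METERS_PER_LIGHT_YEAR

print(MAX_INT32, MAX_INT64)
print(round(light_years), "light-years of blocks")
```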
The thing is, 64 bits is more than enough for applications ~now~. Even if the native word size were increased to 128 bits, very little software would make meaningful use of it. As such, there isn't really any significant demand for 128-bit at the moment, and hence it isn't going to reach the market very quickly.
In the future, of course, the demand will increase, the technology will improve, and so on. However, I think it's somewhat undeniable that CPU computing doesn't have much left to bring to the table. It's certainly got some juice left, but, how to put it, it's already had its impact on the world. It will still improve, and will always have its place, but the next big thing will likely be an entirely new technology, Quantum Computing being the obvious pick (though it's perhaps the least understood technology ~of all time~).
I believe in the Invisible Pink Unicorn, bless her Invisible Pinkness.
What I am most looking forward to, and could be possible in 20 years, is optical computing. A perfect optical chip on a 45 nm process (basically built off the original Core i series) would be equivalent to something like 38 million GHz. Obviously it wouldn't be perfect, but even one-thousandth of that estimate would be 38,000 GHz.
A perfect semiconductor computer could operate at roughly 10-15 million GHz. That number comes from dividing the physical size of a gate by the uninterrupted speed of the electron (or, in your example, the photon) and taking the inverse as a clock rate. In practice, it just isn't that simple. The biggest problem facing optical computers is the nature of photons. Photons have some serious limitations: they essentially don't interact with each other, only doing so indirectly in the presence of electrons, and even then the interaction is weak (electrons interact strongly and consistently, which makes them much easier to build logic from).
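The size-divided-by-signal-speed estimate works out as below; the 30 nm figure is an assumed gate-scale distance, chosen to show where a number in the ten-million-GHz range comes from:

```python
# Back-of-envelope transit-time limit: treat one clock cycle as one
# signal crossing of a given distance at the speed of light.

C = 3.0e8  # speed of light in vacuum, m/s

def transit_limit_ghz(distance_m: float) -> float:
    """Upper-bound clock rate, in GHz, if one cycle = one transit."""
    return (C / distance_m) / 1e9

print(transit_limit_ghz(30e-9))  # ~1e7 GHz across an assumed 30 nm gate
print(transit_limit_ghz(1e-2))   # only ~30 GHz across a whole 1 cm die
```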
Personally, I believe optical computing will probably have matured by about 2020, and will likely offer no significant advantage over semiconductor computers outside of some very niche applications. They can't really be revolutionarily faster, because they will still be restricted to binary logic (or else they would be Quantum Computers). That's not to say they can't be faster, they might even be a lot faster; it's just that extra raw speed isn't something that would change the world the way the first computers or the internet did.
The thing with Quantum Computers is that their speedups are structural, not clock-based. Grover's search, for example, needs only on the order of sqrt(N) queries where a classical unstructured search needs about N. At the very worst, a quantum computer can still emulate binary logic and match the classical running time. And for some specially structured problems (Deutsch-Jozsa being the classic example), a single quantum query settles what costs a deterministic classical machine exponentially many. These differences are just not things that can be overcome by increasing clock speed.
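As a toy illustration of the kind of gap being described here (it only counts queries, it does not simulate the quantum algorithm itself), Grover's unstructured search needs roughly (pi/4) * sqrt(N) queries where a classical scan needs N:

```python
import math

def classical_queries(n: int) -> int:
    # Worst-case unstructured search: look at every entry.
    return n

def grover_queries(n: int) -> int:
    # Optimal Grover iteration count is roughly (pi/4) * sqrt(n).
    return math.ceil((math.pi / 4) * math.sqrt(n))

# A million entries: ~1,000,000 classical lookups vs ~786 quantum queries.
print(classical_queries(1_000_000), grover_queries(1_000_000))
```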
It's been around for ages and it's called photonics.
How would a 45 nm photonic IC be faster than any electronic IC? For starters, both transfer energy at close to the speed of light, and the logic gates for photonics are extremely poor (near non-existent, given the current state of the science). Photonic circuitry has no real advantage over electronic apart from being resistant to EMP, which makes it unsurprising that the US military is among the only serious developers of the technology. Photonics has less noise and might be capable of greater density, but the cost and manufacturing of such a process is likely too high for mass adoption, unless something changes in the next couple of decades.
It's more likely that photonic computing will never catch on and we'll stick with electronic until quantum computing is figured out. Photonic is way too much of a drastic change in infrastructure with little-to-no benefit over electronic.
Note however that the prominent development in quantum computing is based on photonic technology, so maybe we will see it - just not based on a binary system (so no "Intel photonic series" CPU upgrades lol).
Gotta love them Fiber Internets... Oh wait... we live in Australia... nvm.
But yeah, photons should be good for Quantum Computing, since they are relatively easy to entangle.
Still, it's worth noting that superconducting Quantum Computers are far more advanced, because photon-driven quantum computers keep running into problems with the way photons behave. I think the record is something like 12 photons with only a couple of entangled pairs, compared to superconducting machines which (supposedly) have 500+ qubits.
But regardless, after Quantum Computing, my next interest is DNA computing. One petabyte per gram? I mean, c'mon, you could store the entire internet in a freaking teaspoon (with space left over), plus DNA is so robust that you could probably do this without even damaging the data within.
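Taking the one-petabyte-per-gram figure at face value, the teaspoon arithmetic looks like this (the ~5 g teaspoon and the zettabyte-scale internet are loose assumptions for illustration):

```python
# Rough density check. 1 PB/gram is the figure from the post above;
# published DNA-storage estimates run considerably higher.

PB_PER_GRAM = 1.0
TEASPOON_GRAMS = 5.0          # assume DNA packs about like water

teaspoon_pb = PB_PER_GRAM * TEASPOON_GRAMS   # 5 PB in a teaspoon

# Storing a zettabyte (1e6 PB) at this density would take a metric
# tonne of DNA, so the "whole internet in a teaspoon" version leans on
# the much higher densities reported in the literature.
grams_for_zettabyte = 1e6 / PB_PER_GRAM

print(teaspoon_pb, grams_for_zettabyte / 1e6, "tonnes")
```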
Oh my god. Wow. I've heard that scientists have stored data in DNA strands, but I never thought they'd be able to compute with it.
Biomolecular computing. In principle, I believe it is the idea of programming cells and molecules to perform certain behaviors. It has applications in computational science, of course, but its most valuable uses will likely be medical (programming cells to produce and release medication, for example, can fall into this area of study).
Man, that sounds like we'll be long dead and our bones turned to dust by then. Think about it: that's literally "god-mode science", the tech to literally create life. Exciting, sure, but nanotechnology is right around the corner (yet another field that photonics will not work well in, funnily enough).
Actually, they've already made some headway. IIRC there was an experiment some years ago where they made an automaton that could detect and attempt to treat early signs of prostate cancer (of course the experiment probably would not have worked in a living person, but still).
But it's very interdisciplinary, and will probably be relevant to all technology, full stop. The sheer compactness and robustness of DNA data storage alone would be significant for many, many areas. In addition, given that DNA is effectively just software, and cells are effectively just organic nanomachines, I think it will have significant implications for nanotechnology as well.
How different is the map storage with this mod? I'm writing a program that converts maps from Dwarf Fortress to Minecraft, and I'd like to use this mod if possible, since the source maps can get up to 900 blocks high when converted, without removing bits.
Very different. 0% of tools designed to read vanilla worlds will be able to read tall worlds without adding specific support for tall worlds.
You make it sound so simple though. It's literally Man = God, it's a kind of development that - to some - surpasses some very-distant (possibly impossible) technology like teleportation or time compression / space folding. Programming a cell is one thing, creating DNA so cells can make *themselves* and communicate with each other in a way that we dictate is a whole other story. IMO, even if we had (were given?) the technology to do it, we simply wouldn't be able to understand it - it's beyond human.
Really though, all that is just "IMO" - I'm not a scientist and while I do disagree, I sure hope I end up being wrong! Though I'm confident that advanced electronics will progress deep into cybernetics before genetic science and biological computing really gets anywhere.
By the way, I just thought of something: will you include new biome generation in Tall Worlds (like in Climate Control), or will you only provide an API for mods to create their own vertical/horizontal biomes?