Ah. I think I got it. The thing was, I was assuming that there was an infinitesimally small number that is both non-zero and infinitely long. By my logic, that would indeed be 0.00000...1, but due to the definition of infinity, it is impossible to have the one after those zeroes. You know what? My head is hurting now.
~Bair
EDIT: Um, I think. Or am I just confusing you all as well as myself?
I'd like to think of myself as intelligent, but given the sheer variety and vastness of the cosmos that surrounds me, it is readily apparent that this is not the case.
Almost there. In short, the one doesn't exist because the end of the nines doesn't exist.
Limits are used to find out what number something approaches. So yes, 0.999... repeating tends to approach 1. Indeed, it gets infinitely close to 1, to the point where it's so much easier simply to call it 1. However, it's an asymptote. It does not, and never will, reach 1.
The fact that lim f(x) as x -> INF = 1 is true, by the definition of limits. Limits are not meant to find out the value of a function, though. They're meant to find out the value that the function approaches.
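The "approaches" behavior is easy to see numerically. A minimal sketch in plain Python, using exact fractions so float rounding can't muddy the picture: the partial sums 0.9, 0.99, 0.999, ... leave a gap to 1 of exactly 1/10^n, which shrinks toward 0.

```python
from fractions import Fraction

# Partial sums of 0.9 + 0.09 + 0.009 + ...: the n-digit truncation of 0.999...
def partial_sum(n):
    return sum(Fraction(9, 10**k) for k in range(1, n + 1))

# The gap to 1 shrinks by a factor of 10 with each extra digit:
# it is exactly 1/10**n for the n-digit truncation.
for n in (1, 2, 5, 10):
    print(n, 1 - partial_sum(n))
```

Every *finite* truncation falls short of 1, which is the "approaches" part; the question in this thread is about the infinite decimal, which is the limit itself.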
An asymptote exists because of a divide-by-zero error, where x/0 literally doesn't equal anything. But in this case it hits one at infinity.
I always wanted to argue with a brick wall; I suppose the internet is the second-best option.
If you watch my video you will find the explanation of why .333... is 1/3. It uses the proven geometric summation theorem. I used the exact example in the video, which is officially recognized by every Algebra 2 book in production, by college professors, and by mathematicians.
I did watch the video... there were a few errors. First of all, 0.333... repeating does not equal 1/3. Indefinite numbers cannot be definite fractions. For it to equal 1/3, it would have to be finite. Second, in the end, all he did was prove that 1.9/1.9 equals 1... not 1.9999... repeating. Then again, it's hard for me to tell from the video. I would appreciate it if someone wrote the proof down.
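Since someone asked for the proof written down: here is the geometric-series argument, which I believe is the one the video relies on. Treat 0.999... as an infinite series with first term a = 9/10 and ratio r = 1/10, and apply the sum formula a/(1 - r), valid for |r| < 1.

```latex
0.999\ldots = \sum_{k=1}^{\infty} \frac{9}{10^k}
            = \frac{9/10}{1 - \frac{1}{10}}
            = \frac{9}{10} \cdot \frac{10}{9}
            = 1
```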
What? Not all of them. Look at the exponential function, f(x) = e^x. It gets arbitrarily close to the x-axis, but never actually reaches it. x/0 isn't the only asymptotic function. Also, infinity isn't a number. A function can never reach infinity, only approach it.
Anyways, I'm going to bed now. I'll pick up again later, if there are any new responses. So far, though, I have yet to see a convincing proof.
They're used to find the values of functions at certain points.
Well, you are saying that nothing can be infinity. Yes and no. Yeah, it isn't a number, but an assumption that something goes on forever. It is in the geometric summation formula, AISB, which is explained in the video. This formula is accepted by professors and mathematicians.
People seem to have a hard time understanding that there can be more than one way to represent the same value.
The difference between 0.9... and 1 would be an infinitesimal. In the set of real numbers, there are no infinitesimals that are not equivalent to 0. So, in real numbers, 1 - 0.9... = 0. And, if a - b = 0, then a = b.
I don't get why there's always such heavy argument about this topic. You're not going to find a mathematician who will disagree with the proposition that 0.9... is equal to 1 when talking about real numbers. So, why are there always laypeople insisting on arguing so vehemently that this is wrong?
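For completeness, the short algebraic argument usually given alongside the infinitesimal one (this is the standard textbook presentation, not anything new):

```latex
\begin{aligned}
x       &= 0.999\ldots \\
10x     &= 9.999\ldots \\
10x - x &= 9.999\ldots - 0.999\ldots = 9 \\
9x      &= 9 \\
x       &= 1
\end{aligned}
```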
Because it's counter-intuitive.
'Tis not contrary to reason to prefer the destruction of the whole world to the scratching of my finger.
Real numbers are so weird. You'd think that 0.999... would be the number that just satisfies x < 1, but apparently x = 1, which would actually be x <= 1 instead of x < 1. Never a problem with integers.
Good thing we pretty much never have to deal with such things in real life, because you can't ever claim that much accuracy.
Because the concept of infinity is not well understood by the general population.
People think that infinity is something very big, but still measurable in some way.
Saying that 0.999... <= 1 is correct, but only because 0.999... = 1.
0.999... !< 1 (not smaller than)
If I take a cake and cut it in three exactly equal pieces I have 3 pieces of 1/3 of cake.
1/3 = 0.333...
3 x 1/3 = 0.999...
When I put the three pieces of cake back together I again have the entire cake.
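The cake argument can be checked with exact rational arithmetic. A minimal sketch using Python's `fractions` module, which stores 1/3 exactly instead of as a truncated decimal:

```python
from fractions import Fraction

third = Fraction(1, 3)        # one exact piece of the cake
cake = third + third + third  # put the three pieces back together

print(cake)       # 1: the pieces reassemble the whole cake
print(cake == 1)  # True
```

The decimal string 0.333... and the fraction 1/3 are two notations for the same exact value, which is the whole point of the thread.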
Too bad .999... isn't a real number.
They're used to find the values of functions at certain points.
f(x) = x
Lim f(x) as x -> 5 = 5
It really is that simple.
Yes, I know what limits mean. That would be read as "The limit of f(x) as x approaches 5 is 5." That means that the output of the function approaches five as the function tends to five. The limit of f(x) as x -> INF is one. That is absolutely correct.
Is it counter-intuitive? It seems like someone could be convinced either way. Supposing that most people agree that it does equal one, I'd imagine that the opposite would be the counter-intuitive one.
As far as I'm still concerned, 1 =/= 0.99999~. I don't see how it can be, because when you graph it, it simply never reaches 1. The only way it does is when you round it, and we all know rounding 9 to 10 doesn't make those two the same value.
I understand the argument that the space between 1 and 0.9999~ is, technically, infinitely small, but I view that as a logical inconsistency.
For anyone who has a problem understanding this fundamental aspect of math in association to reality...
...I can't imagine what type of problems you might have trying to understand how imaginary numbers play into reality...
Time to go learn some things... or else just never use math in whatever career you go into...
Also, I don't understand why this deserves its own topic... or how it made it to 4 pages... *sigh*
I understand the argument that the space between 1 and 0.9999~ is, technically, infinitely small, but I view that as a logical inconsistency.
The problem you're having is that you actually believe there to be "space" between 1 and 0.999~.
There is not. This is mathematically proven and has been demonstrated to you numerous times.
The problem you're arguing about lies in the fact that writing out mathematics with decimals is not entirely consistent without understanding how this fundamental principle works.
The proper way to write 0.999~ would be 1, except in contexts where it is understood that 0.999~ equals 1.
In that case (as is the case within all formal mathematics) there is no problem writing it either way.
If it helps you to understand it... Think of it as an error in writing, rather than a logical error. It is not a logical error.
It is a conceptual error within writing. Nothing more.
Guys, 0.9999999999999999999999999 is not 1, because there is no number in between. That is the point of infinity, so we can put the numbers next to each other without having to think that there is an infinite amount of numbers in between. 0.999999999999 is not 1 because there IS a decimal showing the difference. But it is infinitely long. That is the point of infinity. There is a difference, but it is extremely small. Because 1/3 is NOT 0.33333333333333333333... You CANNOT DEFINE 1/3 OF 1! So do not multiply a non-existent number by 3 to get 3/3 of 1, because 1/3 of 1 times 3 = 3/3 of 1 = 1. But 0.3333333333333... is not and will never be 1/3.
OK, thanks. Please let us know when you've finished having every math book out there rewritten to accommodate your proclamation.
Why?
Well, if you take away the 9s, where are you going to put the 1? After INFINITY 0s? How?
And what does Wolfram think?
Yep, the 1/3 = .333..., times 3 = .999... = 1 argument is good.
I am hungry now, too.
You have .99999999 repeating, or 3/3, etc.
Well, there is no average in between .99999999 repeating and 1, so it's basically the same number.
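The "no number in between" observation is exactly what separates the finite truncations from the repeating decimal. A sketch with exact fractions: for any *finite* 0.99...9 there is always a midpoint strictly between it and 1, so every truncation really is less than 1; only the full repeating decimal leaves no such room.

```python
from fractions import Fraction

def truncation(n):
    """The n-digit truncation 0.99...9 as an exact fraction: (10^n - 1) / 10^n."""
    return Fraction(10**n - 1, 10**n)

# For any finite truncation, the average with 1 lies strictly between the two,
# so every finite 0.99...9 is genuinely smaller than 1.
for n in (1, 5, 20):
    t = truncation(n)
    mid = (t + 1) / 2
    assert t < mid < 1
# Only the infinite repeating decimal leaves no room for such a midpoint,
# which is exactly why 0.999... equals 1.
```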
It isn't 'basically' the same number. It is EXACTLY the same number.
This is the only reason this topic is still going. We can all let it die now....
....I hate math.