See this MUST READ article: https://www.extremefinitism.com/blog/why_every_proof_that_0-999_equals_1_is_wrong/

This video investigates why 0.999... (zero point nine recurring) does not equal 1, despite the many arguments to the contrary.

Also see the article: https://www.extremefinitism.com/blog/the-sting-the-long-con-of-0-999-1/

This video is part one of two: it explains the problem, and the next video (part two) proposes a solution.

For more information about why the notion of 'infinity' should be removed from mathematics go to ExtremeFinitism.com.

There are already far too many links on the Internet to articles that claim 0.999... = 1; please do not try to post any here.

It is difficult for most people to question the accepted framework of mathematics because we have all been indoctrinated from an early age. But all the rules of mathematics are man-made; they are not universal truths. The reason these particular rules (about what you can and cannot do with divergent and convergent series) were devised was result-driven. We wanted to believe that we could work with 'infinite' objects, but there were problems working with divergent series. So instead of admitting that we cannot shift and add or subtract the terms of an endless series, they simply made up rules to avoid the problem cases.

I'm told I can't argue that 0.999... does not equal 1 because, by the accepted rules of mathematics, it does. To argue against this would be to argue against the whole basis of mathematics. But this is EXACTLY what I am doing.

Consider this simple argument:

First consider that each decimal place must have a corresponding point on the so-called 'number line'. So in the case of 0.999..., 0 is on the number line, 0.9 is on the number line, 0.99 is on the number line, and so on. Note that this is not a process of creating points one after another; all of these points must already exist on the number line. Since all of these points are fixed and already exist, which of them is the last one? Which one of them is closest to the number 2?
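The points described above can be listed explicitly for any finite number of decimal places. Here is a minimal Python sketch (the function name `partial_sums` is my own, and exact fractions are used to avoid floating-point rounding):

```python
from fractions import Fraction

def partial_sums(n):
    """Return the first n+1 partial sums of 9/10 + 9/100 + 9/1000 + ...,
    i.e. the points 0, 0.9, 0.99, 0.999, ... as exact fractions."""
    sums = [Fraction(0)]
    for k in range(1, n + 1):
        sums.append(sums[-1] + Fraction(9, 10**k))
    return sums

points = partial_sums(4)
# -> [0, 9/10, 99/100, 999/1000, 9999/10000]
```

Every point the function can ever produce is strictly less than 1, and no matter how many points are generated, none of them is a 'last' one.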

Here our forefathers had a choice. They could have admitted that the idea that such endless non-zero terms can exist on the number line was problematic. Another option was to use slippery wordplay to ignore the problem. The solution was driven by commerce (not pure mathematics), because it was so much easier to work with quantities of goods if everyone used decimals instead of fractions. So they devised the concept of a 'least upper bound' and claimed that we don't need to think about the partial-sum points on the number line: we don't worry about the problem of a 'last digit'; we simply exclude that conversation from mathematics. Mathematicians concluded that they could simply choose to avoid any argument they don't like.

This raised the more important issue of what exactly mathematics is. Was it just the use of symbols as shorthand for describing activities and properties that can apply to physical objects? If so, then all the fundamental rules of mathematics would have a corresponding real-world physical description, which would provide evidence of validity. And so the endless series 9/10 + 9/100 + 9/1000 + ... could represent any process where something gets closer to the amount '1' by repeatedly increasing by 9 tenths of the remaining amount at each stage. In the real world this process must end at some point in time (it must be finite), but if we are modelling what occurs during the time that it has not stopped, then 0.999... could be said to provide a shorthand description of the activity.
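The process in the paragraph above — repeatedly increasing by 9 tenths of the remaining amount — can be simulated for any finite number of steps. A sketch (the function name `approach_one` and the step count are illustrative assumptions, again using exact fractions):

```python
from fractions import Fraction

def approach_one(steps):
    """At each stage, add 9/10 of the remaining gap to 1.
    Returns (amount, remaining_gap) after the given number of steps."""
    amount = Fraction(0)
    for _ in range(steps):
        gap = 1 - amount
        amount += Fraction(9, 10) * gap
    return amount, 1 - amount

amount, gap = approach_one(5)
# -> amount == 99999/100000, gap == 1/100000
```

After any finite number of steps the gap is exactly 1/10**steps: it shrinks by a factor of ten each stage but is never zero while the process runs.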

Or perhaps mathematics was some mysterious thing, completely detached from the physical world, that was somehow revealed to us? In this case any evidence-based process for determining validity goes out of the window, because whatever rules we claim to have mysteriously discovered we can then claim to be valid. Indeed, we can devise spurious arguments like 'consistency is all that matters'. Obviously someone might argue that we could invent a religion that was consistent, so consistency does not equate to validity. But in mathematics we could have very well-defined rules and claim that this makes it 'rigorous'. We could argue that consistency is validity as far as mathematics is concerned (effectively ignoring any counter-arguments). Then we can maintain consistency by having rules that effectively avoid any problem areas. In this scenario we can assert that we can work with actual infinities, and simply invent rules to avoid the inevitable problems.

So again our forefathers had a choice, and sadly they chose the latter. I believe these choices were mistakes. I reject the whole basis of our current mathematics. I argue that it is based on illogical arguments and founded on supernatural beliefs.

For a better understanding please see these videos:

https://youtu.be/OghUe5C5cDU

https://youtu.be/0AargMjeW_4
