QUOTE(Raijinili @ Oct 3 2010, 12:07 AM)
QUOTE(Elnendil)
If 1 is the greatest common divisor and you're assuming there is a lower common divisor
What? No, we don't assume that.
All I was doing was pointing out that you cannot choose a lower common divisor for this proof: there is no positive integer smaller than 1, so nothing smaller could divide both 2k and 2k+1, which means neither can divide the other evenly no matter which positive integer k you pick. I also noted that the pair 1|2 can't occur under this definition, because 2k = 1 forces k = 0.5, which isn't an integer. Note that I restricted k to the positive integers at the end, which also omits 0 (that would give the pair 0|1). There may be a better formulation of this proof out there that also covers 1|2, but I have yet to work it out or hear about it; this was a spur-of-the-moment idea I put together in the middle of a Calculus II course.
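Just to make that concrete, here's a quick Python check (my own sketch, not part of the original argument) that for positive integers k, neither of 2k and 2k+1 divides the other:

```python
# For every positive integer k checked, neither of the consecutive
# numbers 2k and 2k+1 divides the other evenly.
for k in range(1, 100):
    a, b = 2*k, 2*k + 1
    assert b % a != 0  # 2k+1 leaves remainder 1 when divided by 2k
    assert a % b != 0  # 2k < 2k+1, so the remainder is 2k itself, not 0
print("no pair 2k, 2k+1 divides each other for k = 1..99")
```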
I am saying this, once again:
For 2k and 2k+1, the GCD is 1. Here, let me do it with the Euclidean algorithm:
2k + 1 = 2k(1) + 1
2k = 1(2k) + 0.
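To illustrate those two steps, here's a minimal Python sketch of the Euclidean algorithm (euclid_gcd is just my name for it, nothing from the original post):

```python
def euclid_gcd(a, b):
    # Division with remainder, repeated: gcd(a, b) = gcd(b, a mod b).
    # For a = 2k+1, b = 2k these are exactly the two steps above:
    #   2k+1 = 2k(1) + 1, then 2k = 1(2k) + 0.
    while b != 0:
        a, b = b, a % b
    return a

for k in range(1, 100):
    assert euclid_gcd(2*k + 1, 2*k) == 1
print("gcd(2k, 2k+1) = 1 for k = 1..99")
```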
The two division steps hold for any nonnegative k, and here we restrict to positive integers, as in 1, 2, 3, ..., n. To find the least common multiple, it's LCM(a, b) = ab/(a, b), where a = 2k, b = 2k + 1, and (a, b) is the GCD. Since (a, b) = 1, the LCM is 2k(2k+1).
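And the LCM formula can be sanity-checked the same way (again a sketch of my own, this one using the standard library's math.gcd):

```python
from math import gcd

def lcm(a, b):
    # lcm(a, b) = ab / gcd(a, b); since gcd(2k, 2k+1) = 1, this is just ab.
    return a * b // gcd(a, b)

for k in range(1, 100):
    assert lcm(2*k, 2*k + 1) == 2*k * (2*k + 1)
print("lcm(2k, 2k+1) = 2k(2k+1) for k = 1..99")
```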
So what is your issue? Post a counterproof or something else. I guess I'm sounding overly harsh here, but you're suddenly dismissing the concept, and I want to know the details and see some sort of counterproof that works so I can retract the statement.