Posted by Matt.
Prove that a number 10^(3n+1), where n is a positive integer, cannot be represented as the sum of two cubes of positive integers.
Thanks
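Before proving it in general, the smallest case can be sanity-checked by brute force. The sketch below (assuming Python) tests n = 1, i.e. whether 10^4 = 10000 is a sum of two positive cubes; only a, b up to 21 matter, since 22^3 = 10648 already exceeds 10000.

```python
# Brute-force check for n = 1: is 10^4 = 10000 a sum of two positive cubes?
N = 10**4
found = [(a, b) for a in range(1, 22) for b in range(a, 22)
         if a**3 + b**3 == N]
print(found)  # []
```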

Maths 
MathMate,
We will examine the sum of the cubes of two positive integers, A and B. Without loss of generality (taking A to carry no more factors of 2 than B), write
A = 2^{n}X and
B = 2^{n+k}Y,
where
X is not divisible by 2,
n is a nonnegative integer, and
k is a nonnegative integer.
A^{3}+B^{3}
= (A+B)(A^{2} - AB + B^{2})
= 2^{n}(X + 2^{k}Y) · 2^{2n}(X^{2} - 2^{k}XY + 2^{2k}Y^{2})
= 2^{3n}(X + 2^{k}Y)(X^{2} - 2^{k}XY + 2^{2k}Y^{2})
Thus, whenever k ≥ 1, both factors X + 2^{k}Y and X^{2} - 2^{k}XY + 2^{2k}Y^{2} are odd (X is odd, and every other term in them is even), so the exact power of 2 dividing A^{3}+B^{3} is 2^{3n}: its exponent is a multiple of 3. But 10^{3n+1} = 2^{3n+1}·5^{3n+1} carries the exact power 2^{3n+1}, and 3n+1 is never a multiple of 3. Hence
10^{3n+1} = A^{3}+B^{3}
is impossible in this case. (When k = 0, so that A and B carry the same power of 2, X + Y can be even and the power of 2 is no longer pinned down this way; that case needs a separate argument.)
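The exponent claim can be spot-checked numerically. The sketch below (assuming Python; `v2` is a helper name introduced here, not from the thread) verifies that when A and B carry different powers of 2, the exponent of 2 in A^{3}+B^{3} is exactly 3·min(v2(A), v2(B)).

```python
def v2(n):
    """Exponent of 2 in n (the 2-adic valuation)."""
    e = 0
    while n % 2 == 0:
        n //= 2
        e += 1
    return e

# Check: when A and B carry different powers of 2,
# the exponent of 2 in A^3 + B^3 is exactly 3 * min(v2(A), v2(B)).
for A in range(1, 60):
    for B in range(1, 60):
        if v2(A) != v2(B):
            assert v2(A**3 + B**3) == 3 * min(v2(A), v2(B))
print("checked")  # checked
```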
Jon Zhan,
Nice answer, but please try to use modular arithmetic (mod); that way is easier.
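Jon does not say which modulus he has in mind; one that happens to settle the problem completely is 7 (an assumption here, not something stated in the thread). Cubes mod 7 land only on 0, 1, and 6, so a sum of two cubes mod 7 lies in {0, 1, 2, 5, 6}, while 10^{3n+1} mod 7 is always 3 or 4. A quick check in Python:

```python
# Cubes mod 7 take only the values {0, 1, 6}, so a sum of two
# cubes mod 7 lies in {0, 1, 2, 5, 6}.
cubes = {(x**3) % 7 for x in range(7)}
sums = {(a + b) % 7 for a in cubes for b in cubes}
print(sorted(cubes))  # [0, 1, 6]
print(sorted(sums))   # [0, 1, 2, 5, 6]

# 10^(3n+1) mod 7 is never in that set:
residues = {pow(10, 3*n + 1, 7) for n in range(1, 100)}
print(sorted(residues))  # [3, 4]
print(residues & sums)   # set() -- no overlap
```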
Sean,
Hey, your answer is hard to follow, because it doesn't really explain anything! Try to make it clearer.
SY