A rectangular football field is 64 meters wide and 100 meters long. A player runs from one corner of the field in a diagonal line to the opposite corner.

To solve this problem, we can use the Pythagorean theorem, which states that in a right triangle, the square of the hypotenuse (the longest side) equals the sum of the squares of the other two sides. Here, the diagonal the player runs is the hypotenuse, and the length and width of the field are the two legs of the right triangle.

So, we can calculate the diagonal distance as:

sqrt(64^2 + 100^2) = sqrt(4096 + 10000) = sqrt(14096) ≈ 118.7 meters

Therefore, the player runs approximately 118.7 meters from one corner of the field to the opposite corner.
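As a quick sanity check, the same calculation can be done in a few lines of Python (the variable names here are just for illustration):

```python
import math

width = 64    # field width in meters
length = 100  # field length in meters

# math.hypot computes sqrt(width**2 + length**2), the hypotenuse
diagonal = math.hypot(width, length)
print(round(diagonal, 1))  # → 118.7
```

Using `math.hypot` avoids writing the squares and square root by hand and is numerically safer for very large or very small inputs.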