For #1-3, an angle of rotation and the (x', y') coordinates are given. Find the (x, y) coordinates.

1. angle = 30 degrees, (√3, 2)
2. angle = 60 degrees, (-1, 1)
3. angle = 45 degrees, (√2, -√2)

To find the (x, y) coordinates given the angle of rotation and the (x', y') coordinates, you can use the rotation matrix. The rotation matrix for a counterclockwise rotation is as follows:

| cosθ  -sinθ |
| sinθ   cosθ |

Here, θ represents the angle of rotation in radians. To convert degrees to radians, use the formula: radians = degrees * π / 180. Multiplying this matrix by the column vector (x', y') gives the original coordinates: x = x'·cosθ - y'·sinθ and y = x'·sinθ + y'·cosθ.
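
As a quick sanity check on the hand calculations below, here is a minimal Python sketch of the same computation (the function name rotate_axes and its argument order are my own choices for illustration, not part of the original problem):

    import math

    def rotate_axes(theta_deg, x_prime, y_prime):
        # Convert the rotation angle to radians, then apply the
        # counterclockwise rotation matrix to the (x', y') vector.
        theta = math.radians(theta_deg)
        x = x_prime * math.cos(theta) - y_prime * math.sin(theta)
        y = x_prime * math.sin(theta) + y_prime * math.cos(theta)
        return x, y

For example, rotate_axes(30, math.sqrt(3), 2) returns values close to (0.5, 2.598), which matches the exact answer worked out in #1 below.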

For each question, follow these steps to find the (x, y) coordinates:

1. For angle θ = 30 degrees and (x', y') = (√3, 2):
- Convert the angle to radians: θ = 30 * π / 180 = π / 6.
- Substitute the values into the rotation matrix and multiply:
| cos(π / 6) -sin(π / 6) | | √3 | | x |
| sin(π / 6) cos(π / 6) | * | 2 | = | y |
- Simplify the equation:
| √3/2  -1/2 |   | √3 |   | x |
|  1/2  √3/2 | * |  2 | = | y |
- Solve for (x, y) by multiplying the matrix by the column vector:
x = (√3/2)(√3) - (1/2)(2) = 3/2 - 1 = 1/2
y = (1/2)(√3) + (√3/2)(2) = √3/2 + √3 = 3√3/2.

Therefore, the (x, y) coordinates are (1/2, 3√3/2).
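
A quick numeric check of #1 (a standalone sketch; the decimals are just approximations of the exact values above):

    import math

    theta = math.radians(30)
    x = math.sqrt(3) * math.cos(theta) - 2 * math.sin(theta)
    y = math.sqrt(3) * math.sin(theta) + 2 * math.cos(theta)
    print(x, y)  # approximately 0.5 and 2.598076, i.e. 1/2 and 3*sqrt(3)/2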

2. For angle θ = 60 degrees and (x', y') = (-1, 1):
- Convert the angle to radians: θ = 60 * π / 180 = π / 3.
- Substitute the values into the rotation matrix and multiply:
| cos(π / 3) -sin(π / 3) | | -1 | | x |
| sin(π / 3) cos(π / 3) | * | 1 | = | y |
- Simplify the equation:
| 1/2  -√3/2 |   | -1 |   | x |
| √3/2   1/2 | * |  1 | = | y |
- Solve for (x, y) by multiplying the matrix by the column vector:
x = (1/2)(-1) - (√3/2)(1) = -1/2 - √3/2 = -(1 + √3)/2
y = (√3/2)(-1) + (1/2)(1) = -√3/2 + 1/2 = (1 - √3)/2.

Therefore, the (x, y) coordinates are (-(1 + √3)/2, (1 - √3)/2).
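
A quick numeric check of #2 (standalone; the decimals approximate -(1 + √3)/2 and (1 - √3)/2):

    import math

    theta = math.radians(60)
    x = -1 * math.cos(theta) - 1 * math.sin(theta)
    y = -1 * math.sin(theta) + 1 * math.cos(theta)
    print(x, y)  # approximately -1.366025 and -0.366025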

3. For angle θ = 45 degrees and (x', y') = (√2, -√2):
- Convert the angle to radians: θ = 45 * π / 180 = π / 4.
- Substitute the values into the rotation matrix and multiply:
| cos(π / 4) -sin(π / 4) | | √2 | | x |
| sin(π / 4) cos(π / 4) | * | -√2 | = | y |
- Simplify the equation:
| 1/√2  -1/√2 |   |  √2 |   | x |
| 1/√2   1/√2 | * | -√2 | = | y |
- Solve for (x, y) by multiplying the matrix by the column vector:
x = (1/√2)(√2) - (1/√2)(-√2) = 1 + 1 = 2
y = (1/√2)(√2) + (1/√2)(-√2) = 1 - 1 = 0.

Therefore, the (x, y) coordinates are (2, 0).
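
A quick numeric check of #3 (standalone; the result should come out essentially (2, 0)):

    import math

    theta = math.radians(45)
    x = math.sqrt(2) * math.cos(theta) - (-math.sqrt(2)) * math.sin(theta)
    y = math.sqrt(2) * math.sin(theta) + (-math.sqrt(2)) * math.cos(theta)
    print(x, y)  # approximately 2.0 and 0.0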