Yes, the U.S. should have annexed Hawaii. The islands' mid-Pacific location made them a valuable waypoint for trans-Pacific trade and an important military asset, most notably the natural harbor at Pearl Harbor. Annexation also opened economic opportunities for American businesses, particularly the sugar industry, which gained tariff-free access to the U.S. market. Additionally, many American residents and planters already had deep commercial and political ties to Hawaii and actively supported annexation.