Informed consent is a fundamental ethical and legal requirement in healthcare. It ensures that patients understand the risks, benefits, and alternatives of a treatment or procedure before agreeing to it. Nurses play a critical role in the consent process by providing clear, comprehensive information, answering patients' questions, and verifying that patients genuinely understand their options.