Informed consent is both a legal and an ethical requirement in healthcare. Nurses must ensure that patients understand the nature of the proposed treatment, its risks and benefits, and any available alternatives before consent is given. Failure to obtain proper informed consent can expose both the nurse and the healthcare facility to legal liability.