Do my warts need to be treated?

    Last Updated: October 25, 2023

    Whether a wart is treated depends on personal preference and symptoms. If the wart is not painful, has not spread, and is not in a location that causes cosmetic or physical discomfort, simple observation may be an option, since up to two-thirds of warts go away on their own within 2 years. For people who do seek treatment, the least expensive and least painful options should be tried first.[1] One drawback of medical treatments is that they generally work by causing tissue damage, which can leave permanent scarring, whereas warts that resolve on their own generally do so without leaving a scar.[1]

    It is important to see your doctor for an exam in the following circumstances:[2][3]

    • If the warts are located in the anal or genital areas: Warts in these areas may become cancerous and need to be examined and treated by a physician.[1]
    • If a wart changes color or increases in size: This could indicate the formation of abnormal, possibly precancerous cells that need to be evaluated.
    • If a wart is painful, interferes with daily activities, or is cosmetically distressing.
    • If warts begin to spread to more skin areas: This could indicate an immune system problem and should be checked out by a doctor.