Posted on: 13 Sep 2024
Read Time: 6 minutes
The practice of researching, diagnosing and treating skin conditions has evolved remarkably throughout the ages. From the earliest known accounts of skin conditions, recorded on clay tablets in ancient Mesopotamia some 5,000 years ago, to the current use of artificial intelligence, important individuals and discoveries have shaped the specialty’s perspectives. New knowledge and technology continue to broaden our capabilities, improve quality of life, and contribute to longer life expectancies.
Modern humans evolved in Africa around 315,000 years ago, and while visible changes to the skin have been recognised for as long as humans have existed, it was the Mesopotamian peoples – in present-day Iraq and north-eastern Syria – who first took an interest in understanding skin conditions, from the beginning of recorded history around 3100 BCE. Their understanding was framed by religion and astrology, and skin disease was often regarded as a sign of divine retribution. Mesopotamia is considered the first civilisation to describe acne, moles, and warts.
Among the oldest preserved medical accounts relating to skin conditions are Egyptian papyrus scrolls dating to around 1600 BCE. These scrolls describe how to diagnose and treat rashes, burn wounds, ulcers, and even tumours using herbs and natural ointments, as well as surgical procedures. Medical practice in Egypt was inextricably linked to religion, with priests providing care in temples and skin conditions being attributed to the will of the gods.
Beginning around 1200 BCE, the ancient Greeks rejected the use of incantations as a form of treatment but, like the Egyptians, attributed skin conditions to causes beyond the skin itself – in their case, an imbalance of the four humours: black bile, yellow bile, blood, and phlegm. Greek medicine placed a strong emphasis on a healthy diet and exercise, as well as the harmony between internal goodness and external beauty. The Greeks described the anatomy and physiological functions of the skin, including sweating and glandular secretion. Hippocrates, known as the “Father of Medicine,” proposed the first classification of skin conditions, dividing dermatoses into two categories: exanthematic diseases, or rashes, and idiopathic diseases, whose causes were unknown. The Greeks are credited with laying the foundation for dermatological practice.
The Roman Empire, which dates from 27 BCE, sought to preserve Greek medical knowledge. The Romans valued water’s healing qualities, used plants for their antiseptic properties, and studied the causes of hair colour, as well as conditions such as leprosy and ulcers. They assigned each condition to a specific type of doctor; some dedicated themselves to the treatment of the eyes, ears, or skin, with the latter responsible for prescribing medicinal baths. At the height of the Empire, science began to be applied to Roman medicine despite a strong medical-religious impetus.
The Middle Ages, which lasted from around 500 to 1500 CE, were a time of relative intellectual stagnation in Europe, with superstition dominating beliefs about illness and medical care. Standard medical knowledge was based chiefly on surviving Greek and Roman texts, preserved in churches and monasteries, which remained the primary sources physicians relied on when treating patients.
This lasted until the Renaissance, which took place between the 15th and 16th centuries and saw a tremendous increase in knowledge as well as a reappraisal of Greek rationality, reigniting interest in medicine, including the study of the skin. The period marked the transition from the Middle Ages to modernity, with an emphasis on reviving and surpassing the ideas and achievements of classical antiquity.
In 1543, Flemish physician and anatomist Andreas Vesalius described the distinction between the epidermis, the outer layer of the skin, and the dermis, the inner layer, as well as pores, nerves, and fat, marking a significant milestone in the formalisation of skin research. In 1572, Italian physician Girolamo Mercuriali completed the first scientific study devoted to skin diseases, emphasising the importance of future research.
As the study of the skin expanded in the 18th and 19th centuries, three major centres of medical research emerged in Europe – in the United Kingdom, France, and Austria – establishing the science of dermatology. In 1736, French physician Jean Astruc wrote the first comprehensive treatise on sexually transmitted infections and is regarded as the “Founder of Modern Dermatology”. The first great dermatology school, the Hôpital Saint-Louis, opened in Paris in 1801, and the first textbooks and atlases were published during this period.
By the mid-19th century, dermatology had established itself as a widely recognised branch of medicine, and publications that helped physicians recognise symptoms in their patients were in circulation. Hospitals such as the Charité in Berlin began to establish dedicated dermatological departments.
In 1845, British physician James Arnott pioneered the use of freezing techniques for therapeutic purposes, observing tumour shrinkage and pain relief. Other freezing methods emerged in the decades that followed.
In 1865, British dermatologist Alexander Balmanno Squire was the first to document a skin condition using photography. Throughout the history of medicine, diagnoses and treatments had been recorded in writing, through notes and medical records, as well as through oral communication between professionals. Photographs soon became a cornerstone of care and education in dermatology.
The scientific revolution and technical advancements transformed dermatology during the 20th century. Scientific societies, journals, and academic congresses helped to consolidate the specialty and attract attention from the medical and business communities. Furthermore, the practice of dermatology grew to include a wide range of surgical, diagnostic, and cosmetic procedures.
In 1963, American dermatologist Leon Goldman pioneered the use of lasers in dermatological treatment, demonstrating the selective destruction of pigmented structures in the skin, such as melanomas, with applications as broad as scar treatment and tattoo removal. By the 1960s, dermatology had definitively transformed into a clinical-surgical specialty.
Dermatologists also began to adopt skin rejuvenation treatments, such as fillers. In the late 1970s, Stanford University in the United States developed the first injectable dermal implant for filling soft tissues. In 1992, Canadian dermatologist Alastair Carruthers and ophthalmologist Jean Carruthers pioneered the use of botulinum toxin, or Botox, to treat expression wrinkles.
Teledermatology was first used by the United States in Somalia in 1992, providing dermatological assistance to soldiers through satellite radio, which allowed for video consultations between soldiers and dermatologists. Since the late 1990s and early 2000s, information and communication technologies have revolutionised social interactions and content sharing, expanding access to dermatology services and fostering scientific research in the process.
Skin conditions have been known to mankind since its origin, as their primarily visual nature allows for early recognition. Breakthroughs in accurate diagnosis and effective treatment showed early signs of development in antiquity, but had to wait millennia for more meaningful advancement. It could be argued that more was accomplished in the 20th century than in all the preceding millennia, and the task at hand is to continue that level of advancement through the 21st century.