We don’t know anything at all about these people—where they came from, what language they spoke, what led them to settle on such a lonesome outpost on the treeless edge of Europe—but from all the evidence it appears that Skara Brae enjoyed six hundred years of uninterrupted comfort and tranquillity.

Through the whole of the medieval period, till well into the fifteenth century, the hall effectively was the house, so much so that it became the convention to give its name to the entire dwelling, as in Hardwick Hall or Toad Hall.

Every member of the household, including servants, retainers, dowager widows, and anyone else with a continuing attachment, was considered family—they were literally familiar, to use the word in its original sense.

The head of the household was the husband—a compound term meaning literally “householder” or “house owner.” His role as manager and provider was so central that the practice of land management became known as husbandry. Only much later did husband come to signify a marriage partner.

The principal effect of serfdom was to remove the serf’s freedom to move elsewhere or marry outside the estate. But serfs could still become prosperous. In the late medieval period, one in twenty owned fifty acres or more—substantial holdings for the time. By contrast, freemen, known as ceorls, had freedom in principle but often were too poor to exercise it.

However, slavery from the ninth to eleventh centuries in England was not quite the kind of dehumanizing bondage we think of from more modern times, as in the American South, for instance. Although slaves were property and could be sold—and for quite a lot: a healthy male slave was worth eight oxen—slaves were able to own property, marry, and move about freely within the community. The Old English word for a slave was thrall, which is why when we are enslaved by an emotion we are enthralled.

Bare earth floors remained the norm in much of rural Britain and Ireland until the twentieth century. “The ‘ground floor’ was justly named,” as the historian James Ayres has put it.

In humbler dwellings, matters were generally about as simple as they could be. The dining table was a plain board called by that name. It was hung on the wall when not in use, and was perched on the diners’ knees when food was served. Over time, the word board came to signify not just the dining surface but the meal itself, which is where the board comes from in room and board. It also explains why lodgers are called boarders and why an honest person—someone who keeps his hands visible at all times—is said to be aboveboard.

After an evening meal, the inhabitants of the medieval hall had no bedrooms in which to retire. We “make a bed” today because in the Middle Ages that is essentially what you did—you rolled out a cloth sleeping pallet or heaped a pile of straw, found a cloak or blanket and fashioned whatever comfort you could.

Whatever the losses in warmth and comfort, the gains in space proved irresistible. So the development of the fireplace became one of the great breakthroughs in domestic history: it allowed people to lay boards across the beams and create a whole new world upstairs.

(Bedroom was first used by Shakespeare in A Midsummer Night’s Dream in about 1590, though he meant it only in the sense of space within a bed. As a word to describe a dedicated sleeping chamber, bedroom didn’t become common until the following century.)

For a time, transportation to Australia was seriously considered for malfeasant bakers. This was a matter of real concern for bakers because every loaf of bread loses weight in baking through evaporation, so it is easy to sell an underweight loaf without meaning to. For that reason, bakers sometimes provided a little extra—the famous baker’s dozen.

Where ice really came into its own was in the refrigeration of railway cars, which allowed the transport of meat and other perishables from coast to coast. Chicago became the epicenter of the railway industry in part because it could generate and keep huge quantities of ice.

Chicago got its first lobster in 1842, brought in from the East Coast in a refrigerated railway car. Chicagoans came to stare at it as if it had arrived from a distant planet. For the first time in history food didn’t have to be consumed close to where it was produced.

You don’t have to venture far into any New England forest to find the ghostly house foundations and old field walls that denote a farm abandoned in the nineteenth century. Farmers throughout the region left their farms in droves, either to work in factories or to try their hand at farming on better land farther west. In a single generation Vermont lost nearly half its population.

At Monticello in the early nineteenth century Thomas Jefferson grew 23 different types of peas and more than 250 kinds of fruits and vegetables. (Unusually for his day, Jefferson was practically a vegetarian and ate only small portions of meat as a kind of “condiment.”)

Part of the reason people could eat so well was that many foods that we now think of as delicacies were plenteous then. Lobsters bred in such abundance around Britain’s coastline that they were fed to prisoners and orphans or ground up for fertilizer; servants sought written agreements from their employers that they would not be served lobster more than twice a week. Americans enjoyed even greater abundance. New York Harbor alone held half the world’s oysters and yielded so much sturgeon that caviar was set out as a bar snack.

Many people considered the potato an unwholesome vegetable because its edible parts grew below ground rather than reaching nobly for the sun. Clergymen sometimes preached against the potato on the grounds that it nowhere appears in the Bible. Only the Irish couldn’t afford to be so particular. For them, the potato was a godsend because of its very high yields. A single acre of stony soil could support a family of six if they were prepared to eat a lot of potatoes, and the Irish, of necessity, were. By 1780, 90 percent of people in Ireland were dependent for their survival exclusively or almost exclusively on potatoes. Unfortunately, the potato is also one of the most vulnerable of vegetables, susceptible to more than 260 types of blight or infestation. From the moment of the potato’s introduction to Europe, failed harvests became regular. In the 120 years leading up to the great famine, the potato crop failed no fewer than twenty-four times. Three hundred thousand people died in a single failure in 1739. But that appalling total was made to seem insignificant by the scale of death and suffering in 1845–46.

Relief was infamously slow to come. Months after the starvation had begun, Sir Robert Peel, the British prime minister, was still urging caution. “There is such a tendency to exaggeration and inaccuracy in Irish reports that delay in acting on them is always desirable,” he wrote. In the worst year of the potato famine, London’s fish market, Billingsgate, sold 500 million oysters, 1 billion fresh herrings, almost 100 million soles, 498 million shrimps, 304 million periwinkles, 33 million plaice, 23 million mackerel, and other similarly massive amounts—and not one morsel of any of it made its way to Ireland to relieve the starving people there. The greatest part of the tragedy is that Ireland actually had plenty of food. The country produced great quantities of eggs, cereals, and meats of every type, and brought in large hauls of food from the sea, but almost all went for export. So 1.5 million people needlessly starved. It was the greatest loss of life anywhere in Europe since the Black Death.

A seventeenth-century black man in Virginia named Anthony Johnson acquired a 250-acre tobacco plantation and grew prosperous enough to be a slave owner himself. Nor was slavery a southern institution at first. Slavery was legal in New York until 1827. In Pennsylvania, William Penn owned slaves. When Benjamin Franklin moved to London in 1757, he brought with him two slaves, named King and Peter.

In factories, workers were expected to be at their places from 7:00 a.m. to 7:00 p.m. on weekdays and from 7:00 a.m. to 2:00 p.m. on Saturdays, but during the busiest periods of the year—what were known as “brisk times”—they could be kept at their machines from 3:00 a.m. to 10:00 p.m.—a nineteen-hour day. Until the Factory Act of 1833, children as young as seven were required to work as long as adults. In such circumstances, not surprisingly, people ate and slept when they could.

The best light of all came from whale oil, and the best type of whale oil was spermaceti from the head of the sperm whale. Sperm whales are mysterious and elusive animals that are even now little understood. They produce and store great reserves of spermaceti—up to three tons of it—in a cavernous chamber in their skulls.

In 1846, America had more than 650 whaling ships, roughly three times as many as all the rest of the world put together. Whale oil was taxed heavily throughout Europe, so people there tended to use colza (a type of oil made from cole seeds) or camphene (a derivative of turpentine), which made an excellent light, though it was highly unstable and tended, unnervingly, to explode.

By the summer of 1859, Bissell and his partners were out of funds. Reluctantly, they dispatched a letter to Drake instructing him to shut down operations. Before the letter got there, however, on August 27, 1859, at a depth of just under seventy feet, Drake and his men hit oil. It wasn’t the towering gusher that we traditionally associate with oil strikes—this oil had to be laboriously pumped to the surface—but it produced a steady volume of viscous blue-green liquid. Although no one remotely appreciated it at the time, they had just changed the world completely and forever.

Eventually, they started making purpose-built barrels with a capacity of forty-two gallons, and these remain today the standard measure for oil.

In the year of Drake’s discovery, America produced two thousand barrels of oil; within ten years, it produced well over four million barrels, and in forty years that figure was sixty million. Unfortunately, Bissell, Drake, and the other investors in his company (now renamed the Seneca Oil Company) didn’t prosper to quite the degree that they had hoped.

Fireplaces just aren’t efficient enough to keep any but the smallest spaces warm. This could be overlooked in a temperate place like England, but in the frigid winters of much of North America the fireplace’s inadequacies at projecting warmth into a room became numbingly apparent.

Using a flame made from a rich blend of oxygen and alcohol, Gurney could heat a ball of lime no bigger than a child’s marble so efficiently that its light could be seen sixty miles away. The device was successfully put to use in lighthouses, but it was also taken up by theaters. The light not only was perfect and steady but also could be focused into a beam and cast onto selected performers—which is where the phrase in the limelight comes from. The downside was that the intense heat of limelight caused a lot of fires. In one decade in America, more than four hundred theaters burned down. Over the nineteenth century as a whole, nearly ten thousand people were killed in theater fires in Britain, according to a report published in 1899 by William Paul Gerhard, the leading fire authority of the day.

Altogether, the Great Fire of 1666 consumed 13,200 houses and 140 churches. But it was actually the second Great Fire of London. A fire in 1212 was far more devastating. Though smaller in extent than the one of 1666, it was swifter and more frenzied, and leaped from street to street with such dreadful rapidity that many fleeing citizens were overtaken or left without escape routes. It was also more deadly, claiming twelve thousand lives (versus five people killed in the 1666 fire, as far as is known). For 454 years, the fire of 1212 was known as the Great Fire of London. It really still ought to be.

Charles Darwin, driven to desperation by a mysterious lifelong malady that left him chronically lethargic, routinely draped himself with electrified zinc chains, doused his body with vinegar, and glumly underwent hours of pointless tingling in the hope that it would effect some improvement. It never did.

Then in the early 1870s, Hermann Sprengel, a German chemist working in London, invented a device that came to be called the Sprengel mercury pump. This was the crucial invention that actually made household illumination possible. Unfortunately, only one person in history thought Hermann Sprengel deserved to be better known: Hermann Sprengel. Sprengel’s pump could reduce the amount of air in a glass chamber to one-millionth of its normal volume, which would enable a filament to glow for hundreds of hours. All that was necessary now was to find a suitable material for the filament.

But all the problems were finally resolved, and on the afternoon of September 4, 1882, Edison, standing in the office of the financier John Pierpont (J. P.) Morgan, threw a switch that illuminated eight hundred electric bulbs in the eighty-five businesses that had signed up for his scheme.

The term drawing room is a shortening of the much older withdrawing room, meaning a space where the family could withdraw from the rest of the household for greater privacy, and it has never settled altogether comfortably into widespread English usage.

Then English farmers discovered something that Dutch farmers had known for a long time: if turnips, clover, or one or two other amenable crops were sown on the idle fields, they miraculously refreshed the soil and produced a bounty of winter fodder into the bargain. It was the infusion of nitrogen that did it, though no one would understand that for nearly two hundred years.

It fell to the great Captain James Cook to get matters onto the right course. On his circumnavigation of the globe in 1768–71, Captain Cook packed a range of antiscorbutics to experiment on, including thirty gallons of carrot marmalade and a hundred pounds of sauerkraut for every crew member. Not one person died from scurvy on his voyage—a miracle that made him as much a national hero as his discovery of Australia or any of his other many achievements on that epic undertaking. The Royal Society, Britain’s premier scientific institution, was so impressed that it awarded him the Copley Medal, its highest distinction.

The real breakthrough came in 1912, when Casimir Funk, a Polish biochemist working at the Lister Institute in London, isolated thiamine, or vitamin B1, as it is now more generally known. Realizing it was part of a family of molecules, he combined the terms vital and amines to make the new word vitamines. Although Funk was right about the vital part, it turned out that only some of the vitamines were amines (that is to say, nitrogen-bearing), and so the name was changed to vitamins to make it “less emphatically inaccurate,” in Anthony Smith’s nice phrase.

He was injected with 1,000 milligrams of vitamin C and was restored to life almost at once. Interestingly, he had never acquired the one set of symptoms that everyone associates with scurvy: the falling out of teeth and bleeding of gums.

Exactly the same considerations apply to the vitamins’ fellow particles, the minerals. The fundamental difference between vitamins and minerals is that vitamins come from the world of living things—from plants and bacteria and so on—and minerals do not. In a dietary context, minerals is simply another name for the chemical elements—calcium, iron, iodine, potassium, and the like—that sustain us.

Pepper accounted for some 70 percent of the spice trade by bulk, but other commodities from farther afield—nutmeg and mace, cinnamon, ginger, cloves, and turmeric, as well as several largely forgotten exotics such as calamus, asafoetida, ajowan, galangal, and zedoary—began to find their way to Europe, and these became even more valuable. For centuries spices were not just the world’s most valued foodstuffs, they were the most treasured commodities of any type.

It would be hard to name any figure in history who has achieved more lasting fame with less competence. He spent large parts of eight years bouncing around Caribbean islands and coastal South America convinced that he was in the heart of the Orient and that Japan and China were at the edge of every sunset. He never worked out that Cuba is an island and never once set foot on, or even suspected the existence of, the landmass to the north that everyone thinks he discovered: the United States.

Everyone but Columbus could see that this was not the solution to the spice problem, and in 1497 Vasco da Gama, sailing for Portugal, decided to go the other way to the Orient, around the bottom of Africa. This was a much trickier proposition than it sounds. Contrary prevailing winds and currents wouldn’t allow a southern-sailing vessel to simply follow the coastline, as logic would indicate. Instead it was necessary for Gama to sail far out into the Atlantic Ocean—almost to Brazil, in fact, though he didn’t know it—to catch easterly breezes that would shoot his fleet around the southern cape. This made it a truly epic voyage. Europeans had never sailed this far before. Gama’s ships were out of sight of land for as much as three months at a time. This was the voyage that effectively discovered scurvy. No earlier sea voyages had been long enough for the symptoms of scurvy to take hold.

Vasco da Gama was a breathtakingly vicious man. On one occasion he captured a Muslim ship carrying hundreds of men, women, and children, locked the passengers and crew in the hold, carried off everything of value, and then—gratuitously, appallingly—set the ship ablaze. Almost everywhere he went, Gama abused or slaughtered people he encountered, and so set a tone of distrust and brutish violence that would characterize and diminish the whole of the age of discovery.

In 1519, Ferdinand Magellan set off in five leaky ships, in a brave but seriously underfunded operation, to find a western route. What he discovered was that between the Americas and Asia was a greater emptiness than anyone had ever imagined Earth had room for: the Pacific Ocean. No one has ever suffered more in the quest to get rich than Ferdinand Magellan and his crew as they sailed in growing disbelief across the Pacific in 1521. Their provisions all but exhausted, they devised perhaps the least appetizing dish ever served: rat droppings mixed with wood shavings. “We ate biscuit which was no longer biscuit but powder of biscuits swarming with worms,” recorded one crew member. “It stank strongly of the urine of rats. We drank yellow water that had been putrid for many days. We also ate some ox hides that covered the top of the mainyard … and often we ate sawdust from boards.” They went three months and twenty days without fresh food or water before finding relief and a shoreline in Guam—and all in a quest to fill the ships’ holds with dried flowerbuds, bits of tree bark, and other aromatic scrapings to sprinkle on food and make into pomanders. In the end, only 18 of 260 men survived the voyage. Magellan himself was killed in a skirmish with natives in the Philippines. The survivors did very well out of the voyage, however. In the Spice Islands they loaded up with fifty-three thousand pounds of cloves, which they sold in Europe for a profit of 2,500 percent, and almost incidentally in the process became the first human beings to circle the globe. The real significance of Magellan’s voyage was not that it was the first to circumnavigate the planet, but that it was the first to realize just how big that planet was.

On November 5, 1492, on Cuba, two of his crewmen returned to the ship carrying something no one from their world had ever seen before: “a sort of grain [that the natives] call maiz which was well tasted, bak’d, dry’d and made into flour.” In the same week, they saw some Taino Indians sticking cylinders of smoldering weed in their mouths, drawing smoke into their chests, and pronouncing the exercise satisfying. Columbus took some of this odd product home with him, too.

Less happily, the Columbian Exchange also involved disease. With no immunity to many European diseases, the natives sickened easily and “died in heapes.” One epidemic, probably viral hepatitis, killed an estimated 90 percent of the natives in coastal Massachusetts. A once-mighty tribal group in the region of modern Texas and Arkansas, the Caddo, saw its population fall from an estimated 200,000 to just 1,400—a drop of more than 99 percent. An equivalent outbreak in modern New York would reduce the population to 56,000—“not enough to fill Yankee Stadium,” in the chilling phrase of Charles C. Mann. Altogether, disease and slaughter reduced the native population of Mesoamerica by an estimated 90 percent in the first century of European contact. In return, the natives gave Columbus’s men syphilis.

People continued to fight over the more exotic spices for another century or so, and sometimes even over the more common ones. In 1599, eighty British merchants, exasperated by the rising cost of pepper, formed the British East India Company with a view to getting a piece of the market for themselves. This was the initiative that brought King James the treasured isles of Puloway and Puloroon, but in fact the British never had much success in the East Indies, and in 1667, in the Treaty of Breda, they ceded all claims to the region to the Dutch in return for a small piece of land of no great significance in North America. The piece of land was called Manhattan.

Although pepper and spices were what brought the East India Company into being, the company’s destiny was tea. In 1696, the government introduced the first in a series of cuts in the tea tax. The effect on consumption was immediate. Between 1699 and 1721, tea imports increased almost a hundredfold, from 13,000 pounds to 1.2 million pounds, then quadrupled again in the thirty years to 1750.

By 1800, tea was embedded in the British psyche as the national beverage, and imports were running at twenty-three million pounds a year. Virtually all that tea came from China. This caused a large and chronic trade imbalance. The British resolved this problem in part by selling opium produced in India to the Chinese. Opium was a very considerable business in the nineteenth century, and not just in China. British and American citizens—women in particular—took a lot of opium, too, mostly in the form of medicinal paregoric and laudanum. Imports of opium to the United States went from 24,000 pounds in 1840 to no less than 400,000 pounds in 1872, and it was women who mostly sucked it down, though quite a lot was given to children, too, as a treatment for croup. Franklin Delano Roosevelt’s grandfather Warren Delano made much of the family’s fortune by trading opium, a fact that the Roosevelt family has never exactly crowed about.

With the canal, the cost of shipping a ton of flour from Buffalo to New York City fell from $120 to $6, and the carrying time was reduced from three weeks to just over one. The effect on New York’s fortunes was breathtaking. Its share of national exports leaped from less than 10 percent in 1800 to over 60 percent by the middle of the century; in the same period, even more dazzlingly, its population went from ten thousand to well over half a million.

Just 130 workers were needed on-site, and none died in its construction—a magnificent achievement for a project this large in that age. Until the erection of the Chrysler Building in New York in 1930, it would be the tallest structure in the world. Although by 1889 steel was displacing iron everywhere, Eiffel rejected it because he had always worked in iron and didn’t feel comfortable with steel. So there is a certain irony in the thought that the greatest edifice ever built of iron was also the last.

What particularly galled the Europeans was that nearly all the technological advances in steel production were made in Europe, but it was America that made the steel. In 1901, J. P. Morgan absorbed and amalgamated a host of smaller companies into the mighty U.S. Steel Corporation, the largest business enterprise the world had ever seen. With a value of $1.4 billion, it was worth more than all the land in the United States west of the Mississippi and twice the size of the federal government if measured by annual revenue.

John D. Rockefeller made $1 billion a year, measured in today’s money, and paid no income tax. No one did, for income tax did not yet exist in America.

By the early twentieth century, 10 percent of all British aristocratic marriages were to Americans—an extraordinary proportion.

Cornelius Vanderbilt—“Commodore” as he liked to be known, though he had no actual right to the title—didn’t offer much in the way of sophistication or intellectual enchantment, but he had a positively uncanny gift for making money. At one time he personally controlled some 10 percent of all the money in circulation in the United States.

To assist in this new line of development, he hired a young man named Thomas A. Watson. Together the two threw themselves at the problem in early 1875. Just over a year later, on March 10, 1876, a week to the day after Bell’s twenty-ninth birthday, the most famous moment in telecommunications history occurred in a small lab at 5 Exeter Place in Boston, when Bell spilled some acid on his lap and sputtered, “Mr. Watson, come here, I want to see you,” and an astonished Watson in a separate room heard the message clearly.

By the early twentieth century Bell’s telephone company, renamed American Telephone & Telegraph, was the largest corporation in America, with stock worth $1,000 a share. (When the company was finally broken up in the 1980s to satisfy antitrust regulators, it was worth more than the combined worth of General Electric, General Motors, Ford, IBM, Xerox, and Coca-Cola, and employed a million people.) Bell moved to Washington, D.C., became a U.S. citizen, and devoted himself to worthwhile pursuits. Among other things, he invented the iron lung and experimented with telepathy. When President James A. Garfield was shot by a disgruntled lunatic in 1881, Bell was called in to see if he could help locate the bullet. He invented a metal detector, which worked beautifully in the laboratory but gave confused results at Garfield’s bedside. Not until much later was it realized that the device had been reading the presidential bedsprings.

A ha-ha is a sunken fence, a kind of palisade designed to separate the private part of an estate from its working parts without the visual intrusion of a conventional fence or hedge. It was an idea adapted from French military fortifications, which Vanbrugh would have first encountered during his years of imprisonment in France.

The Villa Chiericati, with its striking portico of triangular pediment and four severe columns, isn’t just rather like the White House, it is the White House, but weirdly transferred to what is still a working farm a little beyond the city’s eastern edge.

The system was enshrined in a series of laws known as the Navigation Acts, which stipulated that any product bound for the New World had either to originate in Britain or pass through it on the way there, even if it had been created in, say, the West Indies, and ended up making a pointless double crossing of the Atlantic. The arrangement was insanely inefficient, but gratifyingly lucrative to British merchants and manufacturers, who essentially had a fast-growing continent at their commercial mercy. By the eve of the revolution America effectively was Britain’s export market.

Fortunately, science was standing by to help. One remedy, described by Mary Roach in Bonk: The Curious Coupling of Sex and Science (2008), was the Penile Pricking Ring, developed in the 1850s, which was slipped over the penis at bedtime (or indeed anytime) and was lined with metal prongs that bit into any penis that impiously swelled beyond a very small range of permissible deviation. Other devices used electrical currents to jerk the subject into a startled but penitent wakefulness.

Isaac Baker Brown became a pioneering gynecological surgeon. Unfortunately, he was motivated almost entirely by seriously disturbed notions. In particular, he grew convinced that nearly every female malady was the result of “peripheral excitement of the pudic nerve centring on the clitoris.” Put more bluntly, he thought women were masturbating and that this was the cause of insanity, epilepsy, catalepsy, hysteria, insomnia, and countless other nervous disorders.

Syphilis was for a long time a particularly unnerving disease because of the way it came and went in three stages, each worse than the last. The first stage usually showed itself as a genital chancre, ugly but painless. This was followed some time later by a second stage that involved anything from aches and pains to hair loss. Like first-stage syphilis, this would also resolve itself after a month or so whether it was treated or not. For two-thirds of syphilis sufferers, that was it. The disease was over. For the unfortunate one-third, however, the real dread was yet to come. The infection would lie dormant for as long as twenty years before erupting in third-stage syphilis. This is the stage nobody wants to go through. It eats away the body, destroying bones and tissue without pause or mercy. Noses frequently collapsed and vanished. (London for a time had a “No-Nose’d Club.”) The mouth might lose its roof. The death of nerve cells could turn the victim into a stumbling wreck. Symptoms varied, but every one of them was horrible. Despite the dangers, people put up with the risks to an amazing degree. James Boswell contracted venereal diseases nineteen times in thirty years.

When President James A. Garfield was shot in 1881, it wasn’t the bullet that killed him, but doctors sticking their unwashed fingers in the wound. Because anesthetics encouraged the growth of surgical procedures, there was in fact probably a very considerable net increase in the amount of pain and suffering after the advent of anesthetics.

One well-known case was that of Eleanor Markham of upstate New York, who was about to be buried in July 1894 when anxious noises were heard coming from her coffin. The lid was lifted and Miss Markham cried out: “My god, you are burying me alive!” She told her saviors: “I was conscious all the time you were making preparations to bury me. The horror of my situation is altogether beyond description. I could hear everything that was going on, even a whisper outside the door.”

The ancient Greeks were devoted bathers. They loved to get naked—gymnasium means “the naked place”—and work up a healthful sweat, and it was their habit to conclude their daily workouts with a communal bath.

The worst disease of all, because it was so prevalent and so devastating, was smallpox. (Smallpox was so called to distinguish it from the great pox, or syphilis.)

Until the eighteenth century, when vaccination came in, smallpox killed four hundred thousand people a year in Europe west of Russia. No other disease came close to the totals smallpox achieved.

By the time Europeans began to visit the New World in large numbers, they had grown so habitually malodorous that the Indians nearly always remarked on how bad they smelled. Nothing, however, bemused the Indians more than the European habit of blowing their noses into a fine handkerchief, folding it carefully, and placing it back in their pockets as if it were a treasured memento.

The Romans were particularly attached to the combining of evacuation and conversation. Their public latrines generally had twenty seats or more in intimate proximity, and people used them as unselfconsciously as modern people ride a bus.

Charles II always took two attendants with him when he went into the lavatory. Mount Vernon, George Washington’s home, has a lovingly preserved privy with two seats side by side.

Most people continued to use chamber pots, which they kept in a cupboard in their bedrooms or closets, and which were known (for entirely obscure reasons) as jordans. Foreign visitors were frequently appalled by the English habit of keeping chamber pots in cupboards or sideboards in the dining room, which the men would pull out and use as soon as the women had withdrawn.

The most notable feature about anecdotes involving toilet practices is that they always—really, always—involve people from one country being appalled by the habits of those from another. There were as many complaints about the lavatorial customs of the French as the French made of others.

This problem was resolved by one of the great and surely most extraordinarily appropriate names in hygiene history, Thomas Crapper (1837–1910), who was born into a poor family in Yorkshire and reputedly walked to London at the age of eleven. There he became an apprentice plumber in Chelsea. Crapper invented the classic and, in Britain, still familiar toilet with an elevated cistern activated by the pull of a chain.

Cholera wasn’t terribly feared at first, for the decidedly unworthy reason that it was thought primarily to affect poor people. It was accepted wisdom almost everywhere in the nineteenth century that the poor were poor because they were born to be.

Cholera became known as “the poor man’s plague.” In New York City, more than 40 percent of the victims were poor Irish immigrants.

But then cholera began to strike down people in well-to-do neighborhoods, too, and very quickly the terror became general. People had not been so unnerved by a disease since the Black Death. The distinguishing feature of cholera was its quickness. The symptoms—violent diarrhea and vomiting, agonizing cramps, crushing headache—came on in an instant. The mortality rate was 50 percent, and sometimes higher, but it was the swiftness of it—the fearful, headlong transition from complete wellness to sudden agony, delirium, and death—that people found terrifying. To see a loved one well at breakfast and dead by suppertime was a horrifying experience.

It was Henry Chadwick who decided, oddly and endearingly, that the symbol for a strikeout should be a K because it is the last letter of the word struck. (He had already used S’s for so many actions on the field that he felt he needed to enlist another letter for striking out.)

John Snow made the most careful maps showing exactly where cholera victims lived. These made intriguing patterns. For instance, Bethlehem Hospital, the famous lunatic asylum, had not a single victim, while people on facing streets in every direction were felled in alarming numbers. The difference was that the hospital had its own water supply, from a well on the grounds, while people outside took their water from public wells.

Snow announced his findings in a pamphlet of 1849, On the Mode of Communication of Cholera, which demonstrated a clear link between cholera and water contaminated with human feces. It is one of the most important documents in the history of statistics, public health, medicine, demographics, forensic science—one of the most important documents, in short, of the nineteenth century. No one listened, and the epidemics kept coming.

All wigs tended to be scratchy, uncomfortable, and hot, particularly in summer. To make them more bearable, many men shaved their heads, so we would be surprised to see many famous seventeenth- and eighteenth-century figures as their wives saw them first thing in the morning. It was an odd situation. For a century and a half, men got rid of their own hair, which was perfectly comfortable, and instead covered their heads with something foreign and uncomfortable. Very often it was actually their own hair made into a wig. People who couldn’t afford wigs tried to make their hair look like a wig.

Before the rise of cotton, slavery had been in decline in the United States, but the gin made cotton profitable on a vast scale, and picking it remained extremely labor-intensive, so the demand for slave labor grew enormously. At the time of Whitney’s invention slavery existed in just six states; by the outbreak of the Civil War it was legal in fifteen. Worse, the northern slave states like Virginia and Maryland, where cotton couldn’t be successfully grown, turned to exporting slaves to their southern neighbors, thus breaking up families and intensifying the suffering for tens of thousands.

At the same time, the booming cotton mills of England needed huge numbers of workers—more than population increase alone could easily provide—so increasingly they turned to child labor. Children were malleable, worked cheap, and were generally quicker at darting about among machinery and dealing with snags, breakages, and the like. Even the most enlightened mill owners used children freely. They couldn’t afford not to. So Whitney’s gin not only helped make many people rich on both sides of the Atlantic but also reinvigorated slavery, turned child labor into a necessity, and paved the way for the American Civil War. Perhaps at no other time in history has someone with a simple, well-meaning invention generated more general prosperity, personal disappointment, and inadvertent suffering than Eli Whitney with his gin.

The figures usually cited are that one-third of children died in their first year of life and half failed to reach their fifth birthdays. Even in the best homes death was a regular visitor.

Puerperal fever was particularly dreaded because it came on suddenly, often several days after a successful hospital birth when the mother was completely well and nearly ready to go home. Within hours the victim would be severely fevered and delirious, and would remain in that state for about a week until she either recovered or expired. More often than not she expired. In the worst outbreaks, 90 percent of victims died.

In the most celebrated case, Princess Charlotte, second in line to the British throne, died giving birth to her first child in 1817 because the presiding physician, Sir Richard Croft, would not allow his colleagues to use forceps to try to relieve her suffering. In consequence, after more than fifty hours of exhausting and unproductive contractions, both baby and mother died. Charlotte’s death changed the course of British history. Had she lived, there would have been no Queen Victoria and thus no Victorian period. The nation was shocked and unforgiving. Stunned and despondent at finding himself the most despised man in Britain, Croft retired to his chambers and put a bullet through his head.

Measles killed more children in the nineteenth century than any other illness.