5 Things That Still Exist But Should Disappear

Everything is evolving: We stopped using candles because electric lights were cheaper and more versatile. We stopped using straight razors because disposables were more convenient, and you didn’t have to worry about slitting your own throat if you sneezed. But sometimes outdated ideas live on purely by inertia, like…

Lectures

The word “lecture” comes from the Latin legere, “to read.” You may have noticed, however, that there is very little reading involved in a lecture: A teacher speaks, and students take notes. That’s because the original lecture was literally reading: an instructor read a book aloud while students copied down every word. Until relatively recently, books were prohibitively rare and valuable. The lecture gradually evolved into someone simply talking about the subject, but it’s still basically graduate-level story time.

So why are we still doing it when books are (mostly) cheap, and information is freely available on the internet? Even back in the 1500s, people thought the printing press spelled the end of lecturing, but it persists today, so it must be the best way to learn, right? “Yup, don’t change anything,” says the kid watching porn on his phone.

A good deal of research has gone into more active learning approaches, and it suggests they’re more effective and require less background knowledge to learn the same material. (In fact, some studies rank lecturing as the least effective teaching method measured.) That doesn’t mean “teaching” is an elaborate scam or anything. It’s just that maybe, on a structural level, kindergarten has it more together than your sophomore survey class.

Cars With Engines In Front

The overwhelming majority of cars have a front-engine design. Mid- and rear-engine designs are reserved for fancy sports cars and time machines. Clearly these other designs have some advantages, but there must be some principle of physics that makes putting an engine somewhere else more expensive, right? Like cost equals distance over brand name or something?

Not really. All other things being equal, it makes the most sense for a big, heavy power source to be square in the middle. It’s worth pointing out that car design is a complex alchemy of art and science, and there are well-designed cars with the engine located in the front and the back. But playing purely by averages, mid-engine wins out.

Mid-engine cars have better balance and power delivery, as well as a lower “moment of inertia”: with the car’s heaviest component near its center and a roughly 50:50 weight distribution, they turn in and handle better in general. Many high-end manufacturers either don’t make front-engine cars at all or reserve them solely for their low-end models.
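If you want to see why engine placement changes the “moment of inertia,” here’s a toy calculation. It models a car as a few point masses on a line and computes the yaw inertia about the center of mass; all the masses and positions are illustrative assumptions, not real vehicle specs.

```python
# Toy yaw-inertia comparison: why a centered engine helps a car rotate.
# All numbers below are illustrative assumptions, not real vehicle data.

def yaw_inertia(masses_and_positions):
    """Moment of inertia about the center of mass for point masses on a line.

    masses_and_positions: list of (mass_kg, position_m) pairs, positions
    measured back from the front axle.
    """
    total_mass = sum(m for m, _ in masses_and_positions)
    com = sum(m * x for m, x in masses_and_positions) / total_mass
    return sum(m * (x - com) ** 2 for m, x in masses_and_positions)

# Assume a chassis of two 600 kg lumps near the axles (0 m and 2.6 m),
# plus a 200 kg engine whose position we move around.
chassis = [(600.0, 0.0), (600.0, 2.6)]

front_engine = yaw_inertia(chassis + [(200.0, 0.3)])  # engine up front
mid_engine = yaw_inertia(chassis + [(200.0, 1.3)])    # engine in the middle

print(f"front-engine yaw inertia: {front_engine:.0f} kg*m^2")
print(f"mid-engine yaw inertia:   {mid_engine:.0f} kg*m^2")
# The mid-engine layout has less inertia about its center of mass, so the
# same steering force rotates the car faster.
```

With these made-up numbers the mid-engine layout comes out around 8 percent lower, which is the whole point: mass far from the center resists rotation.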

In the early days of motorized cars, it was easier to convert horse-drawn carriages than to build an entirely new car from scratch. So the first adaptations of this technology put the engine in the same place as the horses it was replacing: in the front. You should be pulled along, as God wants, not pushed – that’s the devil’s locomotion. So just to avoid a little intellectual discomfort for their consumers, manufacturers stuck with an objectively worse design. And anyway, that’s why the Beetle is the world’s best car.

Tipping – A Remnant Of A Racist Practice


For most of human history, common people bartered. Coin money was reserved for the ancient world’s Rockefellers or Kardashians. Tipping didn’t come to America until after the Civil War, when it was considered “deeply un-American” and opposed by people as diverse as Trotsky, Twain, and Taft. Tipping created an automatic assumption of class division. That might fly in Europe, where classism is baked into the culture, but America was supposed to be the land of equality. It was therefore considered rude to tip someone, since all men were equal. All white men, anyway.

Americans had no problem insulting black people with currency. In 1902, a newspaper writer described tipping thusly: “Negroes take tips, of course, one expects that of them – it is a token of their inferiority. But to give money to a white man was embarrassing to me.”

Tipping eventually did catch on in America, because it filled an economic niche after the Civil War – a way to legally oppress black people. When the first minimum wage laws were passed, they couldn’t discriminate based on race. Most of the tipped jobs were occupied by black people. So the racists of the day, using good ol’ American ingenuity, created the “tipped minimum wage” we know today. This made it legal to pay anyone who made part of their salary from tips far less than the minimum wage other workers earned. This loophole in the minimum wage law was mostly used to underpay black workers.

Today, tipped minimum wage is much the same, only now people of all races can get the discriminatory treatment. Anyone in the service industry knows that certain types of people are tipped more, and others are tipped less. It would be illegal to pay your attractive employees more, or to pay your white workers more, but tipping ensures that’s exactly what happens.

Some restaurants have moved away from tipping, raising their servers’ salaries to a livable wage. That means they have to increase their prices on paper, even if the total cost (including tip) doesn’t actually go up. But because customers are shortsighted, that’s meant nothing but heartache for the restaurants. If all restaurants made the switch simultaneously, this wouldn’t be a problem. But if you could coordinate the actions of the entire restaurant industry, you could make the McRib permanently available, and that would obviously be the end of the civilized world.

Daylight Saving Time

DST was initially adopted as a measure to save energy during wartime. It first became law during World War I, and then kept popping back up during other wars and energy emergencies. But like many other things suspended during wartime (like a huge chunk of our civil rights), it eventually became a part of regular life.

But lighting no longer makes up a significant portion of household electricity usage; most of it now goes to air conditioning and electronics. So when people return home during a hotter time of day, they run their air conditioning more, and the energy savings aren’t so clear. Until recently, most of Indiana was in the Eastern Time Zone and did not observe Daylight Saving Time. Its switch in 2006 provided the perfect test bed. When pre- and post-DST power usage were compared, Indianans turned out to use more electricity under DST. In that state alone, Daylight Saving Time was estimated to cost residents an additional $9 million annually, and the pollution from the increased energy usage carries a social cost (mostly adverse health effects) of between $1.7 million and $5.5 million per year.

The 2005 Energy Policy Act specifically requires Congress to conduct research on the efficacy of DST, and it states they should repeal Daylight Saving Time extensions if proposed gains are not realized. However, in classic bureaucratic form, Congress has overwhelmingly supported it and continually re-approved and extended DST in spite of scientific research telling them to do the opposite. Sounds familiar.

Why would our representatives keep a law that modern science has thoroughly failed to substantiate? Well, money, of course. When people have more evening daylight, they go out shopping, barbecuing, and visiting friends. That’s worth millions to industries like retail stores and barbecue product manufacturers. The boost to those industries might be a valid argument for keeping DST if its downsides were purely economic, but they aren’t. The day of the spring-forward time jump is one of the most hazardous of the year for strokes and heart attacks. It drastically increases traffic and workplace accidents. And because people get up an hour earlier, they’re less rested for the entire summer. Then there’s the increase in “cyberloafing” (wasting more time at work slacking off and reading humorous online articles).

Grand Juries

In medieval England, one of the main ways to determine someone’s guilt or innocence was “trial by ordeal.” In this case, “ordeal” is a way of saying “nearly kill them through boiling water, drowning, or hot irons, and see if God saves them.” So you can understand why an accusation even going to trial was a bit of a hassle for the accused. Rather than kill someone every time God was taking a nap, authorities decided to put a check on this system with grand juries, who would decide if a case even warranted going to trial.

As trials became slightly less barbaric, going to one still remained quite a burden for common folk, so grand juries evolved with the times and were exported to England’s colonies. After the U.S. split from England, the grand jury remained a right guaranteed by the Fifth Amendment to every American standing in a federal court. Grand juries were included in the Bill of Rights as a check on prosecutorial power: our courts operate on an adversarial system, and one of the two adversaries has the considerable resources of the government backing it up. While that’s an admirable goal, the modern grand jury has all the drawbacks of an outdated, stupid system while also retaining all the drawbacks of a highly modernized bureaucratic waste of resources. It doesn’t protect anyone from anything.

In 2010, roughly 162,000 suspects were pursued in criminal proceedings by U.S. attorneys, and of all of those, only eleven failed to receive a bill of indictment from a grand jury. You’re about as likely to be struck by lightning as to walk away without an indictment. Unless you are a police officer, that is. As New York’s former chief judge Sol Wachtler put it, “any prosecutor who wanted to could get a grand jury to indict a ham sandwich.”

How do prosecutors manage such an impressive success rate? Turns out, there is practically zero supervision of the grand jury process, and defendants aren’t even allowed to be present (much less defend themselves). Grand juries are cloak-and-dagger affairs conducted under a level of secrecy normally reserved for nuclear launch codes and the Colonel’s 11 herbs and spices.

And even if everything were out in the open, the usual protections for defendants at trial do not apply to grand jury proceedings. The U.S. Supreme Court has held that the standard for evidence admissible to a grand jury is frighteningly low: The Fourth Amendment (freedom from unreasonable search and seizure) doesn’t apply, hearsay counts as valid evidence, and evidence presented before a grand jury can affect the inevitable trial to come. This means that in some states, prosecutors can (and do) bring unreliable witnesses before grand juries and use their unreliable testimony against the defendant, who never got to question those witnesses, because the defendant wasn’t at the grand jury proceedings.

Despite the fact that grand juries were created as a bulwark against the “ordeals” of false accusations, in the modern era they’re at best a waste of resources and at worst an unfair advantage handed to government prosecutors. They’re a relic of medieval law that the framers of the Constitution hoped would shield citizens from unfounded prosecution. Instead, they often enable unjust harassment by largely unchecked prosecutors. The U.S. and Liberia are the only countries in the world that continue the practice. No offense to Liberia, but maybe that’s not such a good thing.