Marinating meats for the grill seems to bring out the inner herbalist in even the most hard-boiled of home cooks. A little olive oil, some lemon juice, a handful of herbs, some exotic spices -- whatever smells right. It’s almost like we’re designing a scented bath oil rather than a seasoning for meat.
The truth is, though, that marinades rarely do much good.
In fact, in some cases -- those that call for a long soak -- they can actually do more damage than good.
Though composing complicated marinades may be satisfying on a certain intuitive level, with few exceptions, the mixture won’t do much more than coat the surface of the meat. It won’t tenderize it, and it will only impart the more forceful flavors.
No matter how long you soak it, most marinades won’t penetrate more than the outside eighth of an inch. That’s because meat is made up mostly of water (about 75% by weight) and water and oily marinades don’t mix. This is true whether you’re marinating for a half-day or for a week.
In most cases, that isn’t actually a bad thing. Most meats we marinate are thin cuts -- chicken pieces or beef or pork steaks. With these thinner cuts you’re almost always guaranteed to get a good bit of seasoned surface meat when you take a bite.
But in some cases, marinating can actually damage the meat. If there is much acidity in the marinade -- vinegar or lemon juice, for example -- too long a bath can make the meat mealy.
This is based on the same science that leads some to believe that marinating “tenderizes.” Acid does denature protein -- it unwinds the tightly balled strands -- and that does make meat softer.
But remember that marinades rarely penetrate beyond the surface. So what is actually happening is that the outside of the meat is becoming overly tender -- mealy -- while the inside remains mostly untouched.
If you want to make a tough cut of meat more tender, it’s better to simply slice it thin, either before or after grilling.
And as for those complicated flavor combinations, subtlety tends to go out the window when it’s asked to compete with the primal flavors of meat and char.
Surprisingly, simply seasoning generously with salt and freshly ground black pepper will work wonders for flavor. This is particularly true if you do it 30 to 45 minutes in advance.
Try this sometime: Cook three steaks, one that has been salted and peppered in advance, one that has been seasoned just before grilling and a third that is seasoned only afterward. The difference is astonishing. Steak seasoned at the end tastes like meat with salt. Steak seasoned just before grilling is a bit better. But steak seasoned early has a deep, complex flavor and a much richer brown crust.
If you do want to get more complex with marinades, remember that it’s going to take some big flavors to stand up to the taste of seared meat -- use garlic, shallots and other members of the onion family, along with dried peppers and other spices.
Acidity should be considered more a flavoring than a tenderizer -- and it’s an important one. Because most of the meats we grill are fairly fatty (that’s what makes them good for the grill), adding vinegar or citrus juice is a good balance.
The notable exception to the rule that marinades only work skin-deep is brining. That’s because salty water can more easily penetrate the meat than an oil-based marinade. And brining does more than just give flavor. It actually makes the meat seem juicier by increasing protein’s ability to retain moisture.
Dry-brining -- liberally salting the meat and letting it stand for several hours or even days -- does much the same thing. What happens in this case is that the salt pulls moisture from the surface, dissolves in that liquid and then is absorbed deep into the meat. This can take as little as a couple of hours for a thin fish fillet, a half-day for thin steaks or three or four days for a whole turkey.
Simply salting meat and letting it stand may not have the romance of more complicated marinades, but it works.
Star Trek Food Replicator: The Implications
In 176 episodes of Star Trek: The Next Generation, the food replicator was used to make food or drink 372 times. Of those, 311 times it was shown being used by Captain Picard to order “Tea. Earl Grey. Hot.”
You said it in his voice in your head, didn’t you? For all his diplomatic credentials, in all of those hundreds of cups of tea, he never bothered looking at the FAQ for the thing and figuring out how to put in a shortcut for the order. “Hit me,” or “tea me,” or a simple snap of the fingers. We’ve figured out the clap-on light, they’ve figured out interstellar travel, and no one can figure out keyword shortcuts in the future? This is why the Ferengi will rule over our bloated corpses.
Food replicators work as a refined offshoot of transporter technology. Basically, somewhere on the Enterprise is a (presumably sealed) compartment full of nothing but the complex proteins and fats that make up food. The replicator transports an appropriate quantity of various gunks (and that’s the real word, I think I saw it on the Food Network) and assembles them at a molecular level into the desired dish of food or drink. And when you’re done and throw the dishes back in there? It takes all the leftover gunk, sorts it, and puts it back into the pile.
Simple induction also tells us why no one in the Star Trek universe ever has to go to the bathroom. Because human wastes are composed of the same molecules that went in the oral inbox, that means that all that waste would logically just be thrown back in the pile of gunk by the replicators. In early generations, this would translate into the toilets flushing straight into the replicators, naturally. But in later generations, with advanced AI and the presence of micro-transporters, there’s no reason for such primitive plumbing. No, your colon and bladder are simply emptied automatically by the transporter system directly into the replicator gunk pile as they need to express themselves.
At any given moment, someone on screen in Star Trek: The Next Generation is probably invisibly shitting their pants.
But take things to even more logical conclusions. First, there are inevitably going to be some annoying hipster dicks who insist that they can tell the difference between replicator food and real food (and I seem to remember this actually happening on screen).
In addition, because it’s working from recipes specified down to the molecular level, does that mean that every dish comes out looking identical to every other time it was ordered? Imagine if every single time you ate a hamburger, every ripple in the lettuce and the angle of every sesame seed were exactly the same. I know, McDonald’s, right? But even if you are going to point out that no, there are probably algorithms that randomize that sort of thing, that leads to even more questions. In food with no true randomness, people will latch on to irrelevancies as mattering even more than they do now.
You think food snobs are pedantic now? Imagine when they can have holy wars over whether buns are better with 99 sesame seeds arranged in parallel configurations, or 117 sesame seeds in a crosshatch pattern. Or since all food is essentially ideal typed, there will be culinary rebels who insist on reprogramming the recipe to have a single burnt sesame seed on that bun. When everything is perfect, we will wage war over the proper imperfections.
Can you imagine the custom foods? Steaks that are indistinguishable from political figures you hate, cupcakes of the breasts of your commanding officer. You can sit down to a birthday cake shaped exactly like a celebrity’s erection, down to the molecule. With “you’re more gorgeous every day” tattooed in your best friend’s handwriting around the shaft. And you can have that every night for dinner for the rest of your life. And you won’t get fat because the replicators can zap the bad cholesterol out of your stomach while you sleep.
Also recall that if a computer is tracking every little pile of molecules, there’s no reason that, with almost infinite data storage, it wouldn’t track which molecule went where. So with a little computer savvy, a deeply disturbed individual could set it up so that he was always eating the molecules from the waste products of whichever unfortunate person he is obsessed with.
And last, let’s just get to the charred corpse in the corner. If that thing can produce any food, that means that it can produce human meat. Which means that in the future, that’ll be a rite of passage, like discovering Internet porn. At some point with this technology, everyone will try a human burger.
But also remember that transporter technology is good enough to copy human beings, i.e., down to their DNA. Which means that you could eat not only generic human steaks, but specific human steaks.
Which means that someone on the Enterprise is a closed system, their every meal a steak composed of their own flesh, constructed of the molecules of their last bowel movement.
Oh, and you know that somebody is fucking Barclay.
Microwave Cooking
Microwave meals get a bad rap. It’s no mystery why: No matter how many new-age “organic” meals pop up in the freezer aisle, microwave-ready cuisine has long meant Hot Pockets and sad Lean Cuisine dinners, days-old leftovers, and nights spent wordlessly chewing in front of the TV.
Joseph Joseph, the British kitchenware brand known for exceptionally clever cooking gear, is changing that. Its M-Cuisine line includes six sets of devices designed to make something approaching fine dining (in this case, “fine” being more elevated than a frozen burrito) possible in a microwave.
There’s a surprisingly long list of foods you can cook in a microwave. Poached eggs, pasta, and steamed vegetables typically get prepared on a stovetop, but with the right settings, a microwave works too. The trouble for most people, says Antony Joseph, the company’s creative director and half of the fraternal founders, comes down to gear and space. Nearly everyone has a microwave, but “there is a lack of good quality cookware available that is designed specifically for this ubiquitous kitchen appliance,” he says. “As a result, many consumers are forced to improvise with unsuitable containers that often create a mess, break, or are simply unsafe.”
M-Cuisine aims to solve those problems. Everything’s made with injection-molded polypropylene, so it’s microwave-safe and easy to clean. Some of the bowls use a double-walled design, which prevents scalding exteriors. More importantly, though, the pieces are designed around a specific constraint: the microwave plate. You only have one to work with at a time (versus a stovetop with four burners), which can slow the cooking process. To combat that, the Joseph brothers, along with British design firm Youmeus Design, found ways to pack a ton of utility onto the microwave plate.
Take the stackable cooking set ($42). It has a pot at the base for preparing rice or pasta, a steamer basket for vegetables or fish, a pan for sautéing, and a lid like a typical microwave plate cover. This lets you prep four dishes more or less at once. Different foods require different cook times, so you can pause the microwave to add or remove dishes as needed without juggling lots of cookware. Likewise, because the Joseph brothers designed the omelet bowl ($12) with one flat-edged side, you can prep in it and then tilt it over to cook eggs, as if on a griddle.
The Joseph brothers are especially adept at identifying cultural macro-trends, and designing for them. They’ve made a consistent focal point of the constraints of urban living for many years now. Many of the brand’s hallmark products—the folding cutting board, the rainbow-hued nesting bowls and spoons, the flat-stacked cooking spoon set—cater specifically to the needs of young adults who live in cramped apartments but have good taste and an interest in cooking.
M-Cuisine speaks to another, adjacent trend: People don’t really cook anymore, but like to feel as though they do. Joseph Joseph did some in-house research and found that 60 percent of consumers in the US spend a maximum of 30 minutes cooking dinner, compared to one hour in the 1980s and 100 minutes in the 1960s. Because more women work now than in the 1960s, and because of the surge in fast food and pre-made meals, those figures won’t surprise anyone. What’s interesting now, however, is the proliferation of delivery start-ups—like Blue Apron or Plated—that let us keep cooking meals in 30 minutes, but revive the feeling of a home-cooked meal. M-Cuisine taps into that consumer desire. It’s convenient, but not too convenient.
Big Harvests Counter the Doomsayers
This week’s autumn equinox is traditionally the time for the harvest festival. I have just taken a ride on the combine harvester cutting wheat on my farm. It is such a sophisticated threshing machine that long gone are the days when I could be trusted to take the controls during the lunch break. A screen showed how the GPS was steering it, inch-perfect and hands-free, along the edge of the unharvested crop; another screen gave an instant readout of the yield. It was averaging over five tonnes per acre (or 12 tonnes per hectare) — a record.
My farm is not alone in this. Everywhere in Britain this autumn, at least where the August downpours did not flatten and rot the crops, yields have broken records.
In the Lincolnshire Wolds, Tim Lamyman smashed the world record for wheat yield per acre, held for the past five years by a New Zealander. He also set a new world record for oilseed rape yield — he did this last year too, but lost it over the winter to a New Zealander. (Britain and New Zealand have the right combination of day length and soil moisture that breeds big wheat and rape crops.)
Unfortunately for farmers, this extraordinary harvest cannot make up for the steep fall in prices, and farm incomes will be down, not up. Bumper crops elsewhere are the main reason for those low prices. Globally, the cereal harvest this year will be very close to last year’s huge record. The Food and Agriculture Organisation’s food price index is now well below where it was throughout the 1960s and 1970s: that is to say, it’s proving cheaper and easier to feed seven billion today than it was to feed three billion in 1960.
This was not supposed to happen. Food prices rose in 2008 and again sharply in 2011, encouraging those who foresaw a Malthusian breaking point, where population would outstrip food supply. The eco-gloomsters who had talked for decades about a coming food crisis, even while famines faded, thought their day had come at last.
Yet the 2008-13 hump in food prices, which hurt poor people but helped farmers, was largely caused by Europe’s and America’s barmy decision, at the behest of the eco-gloomsters, to feed 5 per cent of the world’s grain crop to motor cars instead of people, in the mistaken belief that this was somehow good for the environment. We are still doing that, but at least we’ve stopped increasing the amount, so each year’s harvest increase can now go into food.
Last week, my fields were yielding 60 or 70 grains (seeds) of wheat for every grain that had been planted a year before. This would astonish our ancestors. A farmer in England in the 1300s was lucky to get four grains for every grain he planted. One of those four had to be saved for next year’s planting, leaving a precarious three to feed not only his own family but the various chiefs, priests and thieves who fed off him.
The truly surprising thing about this bounty is that not only are yields going up and up, in Britain as in the rest of the world, but that the amount of land required to produce that food is going down; and so is the amount of pesticide and fertiliser. Not just in relative terms, but in absolute terms. The acreage devoted to wheat and barley in Britain has fallen by 25 per cent since the 1980s. Pesticide usage in this country has halved since 1990. Nitrogen consumption in agriculture is now 40 per cent below the level of the 1980s, while mineral phosphorus and potassium use are down by 60 per cent — though that is partly because more sewage and chicken excrement are being treated and recycled for use on farms.
The world cereal harvest grew by 20 per cent in the past ten years (cereals provide 65 per cent of our calories). It needs to grow by another 70 per cent in the next 35 years to feed 2050’s nine billion people, probably with more affluent tastes.
It is on track to do so and to release a huge area from growing food at the same time. That means more nature reserves, more golf courses, more horsey-culture and hobby farming, more forests and wild land.
Jesse Ausubel, of Rockefeller University in New York, has run the numbers. To paraphrase his paper with British examples, if we keep lifting average yields towards the demonstrated levels of Tim Lamyman, stop feeding wheat to cars and rape to lorries, restrain our diets lightly and reduce waste, then an area the size of India could be released globally from agriculture over the next 50 years.
In the 19th century, the world increased its harvest by breaking new ground — in North America, Russia, Argentina and Australia. In the early 20th century it increased the harvest by replacing horses with tractors and releasing the land used to grow hay for horses. In the late 20th century the harvest went up because of synthetic fertiliser, which does not need land to produce it as manure does. Short-strawed wheats, better pesticides and safer storage and transport helped too.
Today, precision farming is driving the harvest up. Satellites tell farmers exactly which parts of each field should get more or fewer seeds, more or less fertiliser. Less wasteful fertilisers that do not escape into weeds or the local water course are coming. Better varieties are arriving all the time, though wheat harvests are now dwarfed, worldwide, by maize, which is benefiting from genetic modification. Pesticides, growth regulators, mineral supplements — all can now be fine-tuned to give the most benefit and do the least collateral harm.
Meanwhile the efficiency with which a chicken turns feed into meat has roughly trebled in the past 50 years, so even meat farming is constantly cutting the size of its ecological footprint.
From the point of view of farmers’ incomes, this is not a happy picture. With population growth slowing all the time, and Africa rapidly joining Asia in using new machinery, better varieties and more fertiliser, the world may be glutted with food for the rest of the century, keeping prices low.
Barring disasters, of course. Luddite greens are determined to prevent genetic and chemical innovations that are good for the planet as well as the harvest. A massive volcanic eruption could cause a global famine. But meanwhile, bountiful harvests mean more space for nature.
Online Chef Services
One evening earlier this summer, Joseph Yoon pulled a cooler big enough to store a corpse down the long hallway of a sleek Upper West Side building and stopped at a door emanating house music, which was opened by an expensively tousled young man holding a laptop. The young man showed him to the kitchen and resumed working on his spreadsheet as Yoon inspected the work area. “You only have five plates?” Yoon asked, trying not to sound crestfallen. Yoon is one of the highest-rated chefs on Kitchit, a San Francisco–based start-up that enables users to select a chef — from self-taught gourmands like Yoon to credentialed culinary-school graduates — from a “Chef Marketplace” and summon their catering services with a tap of their smartphone. In the three years since its New York launch, the site has created a roving community of hundreds of chefs, who have fanned out nightly across the city, setting up shop everywhere from Park Avenue penthouses to cramped Greenpoint apartments, cooking customized menus for people who might not otherwise book caterers for private dinner parties. And therefore don’t necessarily have napkins. “A lot of it is like guys, PR folks, tech people,” says Tessa Liebman, an International Culinary Center graduate with a profile in the Chef Marketplace. “You see a lot of really amazing apartments.”
And sometimes chefs see some not-so-nice behavior. “When there’s families involved, things tend to come out,” says Mark Tafoya, a career private chef who keeps a profile on Kitchit. “Like, I meant to say, ‘Please pass the salt,’ but instead I said, ‘You ruined my life, you bitch!’” Once, a bachelorette party asked if he’d be able to fill in for a missing stripper. “I said, ‘After four hours of cooking in a hot kitchen, you don’t want to see what’s under this.’”
Mostly, though, the people were kind. “Some nerds tried to give me a pot brownie once!” says Liebman. And chefs were happy to have a market to exercise their creativity and sell their services. The business was lucrative: Yoon, who creates customizable menus for dinner parties ranging from $50 to $500 a head, says he made six figures last year. “When they first started, I was like, ‘Oh my God, this is awesome,’” says Dave Martin, who became famous on Top Chef for his line “I’m not your bitch, bitch,” and estimates he made $75,000 from Kitchit in his first year. “Kitchit was a great platform.”
He says “was” because last month, Kitchit suddenly announced it was changing direction. As it turns out, making 12.5 percent off of bespoke, high-end cooking has not been lucrative enough for the company, which received $7.5 million in funding from venture-capital firm Javelin Venture Partners last year. This week, it shut down the Chef Marketplace to focus its efforts on what has been the most profitable part of the business, Kitchit Tonight. The service, so far offered only in San Francisco, provides same-day catering starting at $39 a head. Users just select a set menu. Ingredients are prepped and portioned at the company’s kitchen. Then chefs, who are paid an hourly rate, pick them up and prepare them at people’s homes. As co-founder Ian Ferguson describes it, it’s “like Blue Apron, but with a full-service chef experience.” In San Francisco, where the company had tested out the concept, the service was also vastly more popular, and the margins simply higher. (And, because Kitchit now controls all of the costs, the operation is more predictable than a matching service between chef and client.) There will, he said, “unfortunately be a temporary suspension of New York service while we prepare to launch Kitchit Tonight here.”
Earlier this year, Kitchensurfing, which started with a similar model to Kitchit, also did away with its dinner-party format and focused only on offering simpler chef services for $25 a head — a chef arrives with pre-prepped ingredients and cooks the meal in under 30 minutes. At the time, “a lot of chefs quit,” says Yoon of the industry’s attitude toward Kitchensurfing’s change, thinking “it was really bullshit and underhanded.” (Kitchensurfing, for its part, says the move has resulted in rebooking rates that are “significantly higher” than the dinner-party service, and they’ve hired the chefs as employees instead of independent contractors.)
When Kitchit announced its own changes, many of the service's chefs were devastated. “I feel like I was treated like a commodity, no warning, no time to plan ahead, no warmth, no nothing,” read one comment on the private Kitchit Facebook page. “It was cold and I am disheartened by you. You built an amazing sand castle, filled it with good people then kicked it down.”
They were especially upset that the shutdown came just before the start of the holiday season, when the hors d’oeuvres business really booms. One message plainly asked, “Would you please consider extending the end date of the Marketplace to Jan. 1, 2016 to give both chefs and clients a chance to work together for the holidays and end the year on a good note?” Kitchit declined. “With the success of Kitchit Tonight we reached a crossroads that required us to focus all of our energy in one place,” Ferguson says.
Tafoya was angry, but not entirely surprised. “These are geeks who went to business school,” he says, comparing the company to Uber and other Silicon Valley companies that have fostered the “gig economy.” As a chef who has been working “since you had to put an ad in the yellow pages,” he will be fine, he reasons. It’s the chefs whose businesses grew with Kitchit that are in a dicier position. “I’m not freaking out — yet,” says Liebman, who has regular clients and a pop-up dinner series, Methods & Madness, to fall back on. Some will migrate to online-booking services that focus on luxury “experiences” (private quarterback camp with a football player, yacht rentals, cooking classes with well-trained chefs), like If Only, Cloud 9 Living, and Go Dream. Yoon, who is friendly with Kitchit management, hopes he will be able to consult on menu planning for the new operation, when and if it launches in New York.
“I knew it would end, I knew the gift horse would close its mouth,” says Martin, who used to work in the tech industry. “It’s a start-up. They’ve got their money and they have to spend it.”
Unsurprisingly, Martin says he will not be Kitchit’s bitch. The only New York chefs who will agree to work with the new model are, he speculates, lesser-trained chefs, short-order cooks, and culinary students. “And, like, how great of an experience is that?” he says. “We all know we can get a takeout meal.”
In fact, it will now be easier than ever, given that Google, Uber, and now Amazon are muscling their way into the restaurant-delivery space, delivering fresh-cooked food to customers in shorter and shorter times, which may eliminate the need for cooks to show up at someone’s home altogether. “All these non-food people starting food businesses because they think it’s cool,” Martin groans. “Millennials think they know everything because they can Google it. Someone needs to say, ‘Guys, there’s no money in food.’”
Some people relish putting on an apron and cooking dinner. Others, though, find cookery a black art best delegated to a domestic helpmeet, a microwave oven or, failing either of those, the local home-delivery service. But Mark Oleynik, a Russian-born scientist and engineer now based in London, hopes to change this state of affairs by introducing a further option: a robot cook that is as good as a Cordon Bleu chef but which can be installed in an average house. A prototype of his idea, unveiled this week at an industrial fair in Hanover, Germany, has been demonstrating its culinary prowess in public, by whipping up an excellent crab bisque.
Specialised cooking devices, such as Thermomix, made by Vorwerk, a German firm, do already exist. These, though, are essentially food-processors with bells and whistles. Dr Oleynik has taken a different approach. Instead of building a complex food-processor, he has set out to make his machine resemble a mini-kitchen, complete with conventional appliances and utensils. This can, in principle, be used to cook more or less anything. A pair of dexterous robotic hands, suspended from the ceiling, assemble the ingredients, mix them, and cook them in pots and pans as required, on a hob or in an oven. When the dish is ready, they then serve it with the flourish of a professional.
The robochef’s hands are human-sized, and have jointed fingers and thumbs. They are made by Shadow Robot, another British firm, which has supplied similar hands to several research organisations, including America’s space agency, NASA. Teams from Stanford University, in California, and the Sant’Anna School of Advanced Studies, in Pisa, Italy, also worked on the project. Dr Oleynik’s company, Moley Robotics, hopes to have the first commercial model on sale in 2017, with a price tag of around £10,000 ($15,000).
The machine’s finesse comes because its hands are copying the actions of a particular human chef, who has cooked the recipe specially, in order to provide a template for the robot to copy. The chef in question wears special gloves, fitted with sensors, for this demonstration. Dr Oleynik’s team also shoot multiple videos of it, from different angles. These various bits of data are then synthesised into a three-dimensional representation of what the chef did while preparing the dish. That is turned into an algorithm which can drive the automated kitchen.
To make the crab bisque it is turning out in Hanover, the robot has copied Tim Anderson. Mr Anderson was the winner, in 2011, of “Master Chef”, a cookery programme popular on British television. The robot faithfully follows Mr Anderson’s every movement, carefully melting butter in a saucepan and using an electric whisk with precisely the same motions that he employs. Mr Anderson thought the robot would mess things up, but he has been impressed by its ability to capture the subtleties involved in preparing the dish. “Small things matter in cooking,” he says, “and the robot is very consistent.”
Sacre bleu!
Dr Oleynik’s plan is to support his automated kitchen with an online library of more than 2,000 recipes. And, because it is copying the idiosyncrasies of particular people, the service he plans will let a user select not only a dish but also its creator—in effect, bringing a virtual version of a celebrity chef into the user’s house to cook it for him. Dr Oleynik is also working on ways for home chefs to upload their own favourite recipes, to save them the trouble of cooking those recipes themselves.
In the current prototype, the ingredients need to be prepared in advance (the robot has not yet been trusted with knives) and placed at preset positions for it to pick up. That, though, should change with future versions. These will include fridges, in which a stock of ingredients can be stored and selected by the robot as required. With further development the automated kitchen will be made more compact and gain more equipment. And it can also, if desired, be switched to manual, because all of the implements and utensils involved are pieces of normal kitchenware. Indeed, Dr Oleynik thinks that with proper programming the kitchen could actually become a cookery teacher—helping neophyte chefs by giving them practical demonstrations of particular operations.
The robot kitchen is not, admittedly, perfect. One design flaw is that although the prototype is programmed to put used utensils into a washing-up bowl, it does not actually go on to do the washing up—a drawback often associated with human chefs, as well, it must be said. To turn the robo-chef into the mass-market product he hopes it will become, Dr Oleynik is therefore planning to add a dishwasher.
ChefCuisine Machine
Gastronomic meals used to necessitate an early-morning trip to the market followed by hours of painstaking labour in the kitchen.
No longer. France’s top female chef is promoting a machine that turns vacuum-packed capsules into gourmet dishes at the press of a button.
Anne-Sophie Pic, the only French woman to boast three Michelin stars, says ChefCuisine will enable the most ham-fisted of amateur cooks to produce dishes worthy of her illustrious restaurant in Valence, central France.
What is claimed to be the world’s first fully automatic haute cuisine machine will, she says, have the same impact on cooking as Nespresso did on coffee.
Detractors have denounced the €199 Chinese-made appliance as a nightmarish object likely to undermine French cooking skills and destroy the nation’s gastronomic heritage.
They say that it comes amid the alarming spread of projects designed to fool French families into believing that gastronomy can be quick and easy.
ChefCuisine, which was developed by Nutresia, a Swiss start-up, resembles a coffee capsule machine and functions in much the same way — except that it uses plastic packets containing vacuum-packed dishes created by Pic.
Each packet has a microchip that tells the machine how long it needs to be cooked, and at what temperature: suprême de volaille for 15 minutes at 63C, turnip fondant for 12 minutes at 83C, and chicken juice with date chutney for eight minutes at 68C, for instance.
Users have nothing more strenuous to do than to place an online order for the food capsules before putting them in the machine, filling it with water, pressing the on-off button, and waiting.
Those who want to impress their dinner guests can use the pipettes and the pastry bags supplied as accessories to make pretty patterns with reheated sauces and purées.
The dishes are sophisticated: foie gras with lemon confit for €12, for instance, or beef fillet with soya honey, mungo beans and ginger and crunchy vegetables for €16.
Pic claims that they are the nearest thing yet to home-made haute cuisine since the food is pre-cooked following her own recipes, delivered within 24 hours and then reheated with restaurant-style precision.
“When the cooking is controlled so that you reach an exact temperature, it offers perfect textures, unequalled taste and a rigorous quality,” the chef said.
La Tribune de Genève daily newspaper described the result as “quite convincing”. It said that the cauliflower cream was unctuous and the scallops “firm under the teeth”, while the association of foie gras with lemon and pear “works well”.
Yet there are no desserts, since Nutresia was unable to reproduce the necessary textures, and the project as a whole has proved to be controversial.
“This seems to me to be a very bad idea,” said François-Régis Gaudry, a French food critic who hosts a popular radio show on France Inter, the state-owned station.
He said that ChefCuisine, which is on sale in France, Belgium and Switzerland, was among a host of schemes designed to “externalise restaurants and to give gourmets the impression that they have nothing else to do than to open their mouths”.
Gaudry added: “People are being asked to live in a totally hermetic world where meat comes in a plastic packet. If this continues, we won’t know what a cow looks like in 15 years’ time.”
The promoters of similar projects argue that they meet a demand for high-quality meals in a society where families have little time to cook.
However, Gaudry countered: “How much time do we spend on our smartphones? Why not spend that time cooking? Wouldn’t it be better if all those people who take photos of themselves and put them on Instagram went to a cheese shop instead to find out what a good camembert is?”
Pic rejected his criticism, saying that her aim was to “encourage the French to cook and to democratise cuisine at home”. She added: “There are moments when you want to cook and you have time for it, and there are moments when you lack time but when you don’t necessarily want to sacrifice your desire to eat well.”
Apples That Don't Go Brown
On a cloudless September morning, the world’s most infamous apple farmer sat down at a table and carved into a $5 million Golden Delicious. Harvest had arrived early here in the verdant Okanagan Valley of British Columbia, 50 miles north of the Washington border, and fat, shiny apples were practically tumbling off their branches. But the apple Neal Carter was neatly slicing into here on his awning-covered, plant-lined patio wasn’t one of the ones his family orchard sells to distributors around the world — in fact, it wasn’t one any grocery shopper has encountered before.
This apple had been carefully grown somewhere in Washington state, the result of millions of dollars and two decades of labor. Break apart its unremarkable surface to reveal its flesh, wait long enough, and you’ll see what’s different: It remains pure white. It doesn’t start to brown right after you take a bite and leave it on the kitchen counter. In fact, it doesn’t start to brown until it molds or rots. It doesn’t bruise, either. Through a feat of genetic engineering, Carter’s apples hold on indefinitely to the pearly-white insides that inspired their name — the Arctic.
The Arctic was conceived by Carter’s company, Okanagan Specialty Fruits, which he runs with his wife, Louisa, and four other full-time employees, newly under the umbrella of a large biotech company that bought it this year. It’s an intended solution to what Carter sees as two interrelated problems: First, millions of pounds of perfectly good apples get dumped every year because they look a little too bruised or brown, the victims of an instinctive human aversion to fruits and vegetables that aren’t smooth, shiny, and symmetrical. And at the same time, North American consumers, accustomed to 100-calorie packs and grab-and-go everything, have developed an impatience for food that can’t be quickly eaten. “An apple’s not convenient enough,” Carter, 58, with reddish hair graying at the temples, told me. “That’s the truth. The whole apple is too much of a commitment in today’s world.”
Taken together, these two trends mean that while apple consumption has flatlined in the United States for decades, a staggering number of apples go to waste. That’s an obvious problem for apple farmers, but it’s also a problem for an increasingly crowded world, and a nation in which only 13% of Americans eat their recommended daily servings of fruit. The way Carter sees it, the Arctic is a solution to all that: nutritious, attractive, always ready to eat, sliced, dried, juiced, whole. Natural.
It’s an innocuous-enough-sounding answer to a very real question, presented by an eminently likable guy running a small family business. But the race to create the world’s most convenient apple — a race that fundamentally blurs the distinction between natural and unnatural — won’t be won without a fight, and getting to the Arctic was far from easy. Browning is a natural and common mechanism in fruit, one that has evolved over millennia; counteracting it isn’t exactly like flipping a switch. And even if the science had been simple, Carter would still have had to contend with forces arguably stronger: a vocal movement against genetically modified organisms in general and the Arctic in particular, and a slew of competitors also hoping to make the apple more attractive to consumers. All of this was made harder by his total budget of roughly $5 million for the whole project, a tiny fraction of what biotech-food giants would spend on a single crop.
Even though there’s no evidence that the Arctic is unsafe for consumption — and leading scientific bodies and loads of studies have concluded that genetically modified foods are as safe as conventionally bred foods — will people want to eat an apple they know is engineered not to brown? Will people accept food visibly changed by technology? Up until now, genetic engineering’s benefits may have seemed abstract to the average consumer. Though GM corn, soybean, and canola make their way into animal feed and all kinds of processed foods, only small amounts of a few such crops (papaya, sweet corn, zucchini, squash) are actually eaten directly by humans. So while many people eat genetically modified foods all the time, they’re rarely forced to look at them, to really consider the engineering that went into giving their food the properties it has.
The Arctic will change that. If consumers do embrace Carter’s invention, it’ll be an indication that they may also be ready for other kinds of GM foods in the works, like heart-healthy purple tomatoes and cancer-fighting pink pineapples. If they don’t, it’ll be 19 years of work and millions of dollars down the drain for a product that consumers are afraid to buy.
The Arctic won approval in the U.S. and Canada this spring, but it won’t roll into supermarkets for a few years. So I drove to British Columbia to be among the first people in the world to try one. My host smashed an Arctic Golden Delicious against its regular counterpart, carved them into identical pieces, and waited.
In the mid-’70s — long before the Arctic and the outcry against it — Carter took a year off from the University of British Columbia to travel with his brother around rural parts of the developing world. In Egypt, Carter watched workers use crude machines to scoop water out of the Nile and pour it into an irrigation ditch. That’s a lot of work, he thought. Don’t these guys know there’s a pump?
The experience would spark in him a lifelong interest in solving world problems with agricultural ingenuity. He returned home, daunted at the challenges farmers face in producing food for a population expected to reach 9 billion by 2050. The crisis of insufficient and unequally distributed food and water is becoming acute since most of the world’s available farmable land is already being farmed, and rivers, lakes, and inland seas are disappearing. Soils are eroding. Climate change is wreaking havoc on temperatures and rainfall patterns.
Whether genetically modified crops have improved yields is debatable, but Carter and other experts believe they can be a — even if not the — solution. And they see genetic engineering as the latest iteration of a process that started thousands of years ago, when farmers began selectively breeding plants and animals for traits such as faster growth or bigger seeds. Apples in particular have been transformed dramatically by commercial cultivation and serendipitous acts of nature over the last two millennia. The apples grocery store shoppers pluck off shelves in 2015 are vastly different from the ones first discovered in Kazakhstan, or even the ones grown by Johnny Appleseed in the 19th century.
“Can we afford to not embrace a life-saving technology like agricultural biotechnology?” Carter asked in a 2012 TEDx Talk. Plant genomics research “is leading us to be able to develop new crops that meet real-world problems like drought, saline soils, poor water quality, and many, many more … This is a huge challenge and biotech crops are leading the way and allowing us to address it.”
In 1982, Carter graduated with a bio-resource engineering degree and married Louisa, a forestry major. He joined Agrodev, an international agricultural development company that helps farmers adopt new technologies and build infrastructure. The two eventually settled in Summerland, a tiny, lakeside British Columbia town filled with wineries, and started their own orchard. By 1995, Agrodev was thinking about agricultural technologies of its own, so Carter went looking for ideas at the government-run Pacific Agri-Food Research Centre in Summerland. There he met David Lane, a cherry and apple breeder newly in charge of crop biotechnology projects.
Lane had an idea on his mind. A team of Australian scientists had recently identified the biological process behind browning in potatoes, and Lane suspected the same force was at work in apples. In intact apple cells, enzymes called polyphenol oxidases, or PPO, stay separate from compounds called phenols. But as soon as a knife rips through the skin, as soon as air starts rushing in, the cell walls break down, the compounds mix, and the flesh deepens into shades of caramel. (This ancient process evolved so the flesh would release the seeds and allow them to propagate, Amit Dhingra, a Washington State University horticultural genomicist, told me.)
If there were a way to tone down PPO, Lane thought, it could plausibly slow or stop the browning process. No one knew how to do this, but Carter wanted to try. “If you’re a grower, you would understand immediately the amount of apples that are tossed out because of superficial scuff marks,” he said. “So there’s a huge cost to the grower, packer, shipper, retailer, processor, all the way down the value chain, and then, ultimately, I think most consumers understand the ‘yuck’ factor around apples going brown.”
They certainly do. In the U.S., Canada, Australia, and New Zealand, more fruits and vegetables are lost or wasted than consumed across the supply chain, according to the United Nations Food and Agriculture Organization. A study in the Journal of Consumer Affairs estimated that $15 billion in fresh and processed fruit was lost from the U.S. food supply in 2008 — about $9 billion at the consumer level and the rest at the retail level. Apples, the second-most consumed fresh fruit in the U.S. behind bananas, make up a good chunk of that waste: an estimated 1.3 billion pounds every year, or a $1.4 billion loss, with a sizable yet unknown portion due to off-coloring or soft spots.
Dave Henze of Holtzinger Fruit Company, which packs and ships apples from mostly family-owned and independent growers in Washington, estimates that bruising and browning force him to dump about 5% of his supply, or 2 million pounds, every year. “A lot of apples aren’t packed because maybe they don’t have the right shape or right color, but they’re a perfectly good eating apple,” he said. Some get juiced or sliced, but “there’s a huge amount of food that is thrown away or not used.”
Soon after Carter and Lane met, Agrodev lost interest in potato-browning. But Carter wouldn’t give up on apples. He licensed the Australian scientists’ technology, raised money from family and friends, got a grant from the Canadian government, and rented lab space in the Pacific Agri-Food Research Centre. Now, looking back on November 1996, Carter can only describe himself as “naïve as hell.” “David [Lane] made it all sound like it was going to be a lot easier than it was,” he recalled. “Classic scientist, right? ‘Oh yeah, two years and all this is done.’”
Long before the genetically engineered apple, there was a genetically engineered tomato. The Flavr Savr ripened more slowly, lasted longer, and in 1994, became the first commercially grown food with a genetic change that U.S. customers could see and feel. Since then, GMOs — mostly designed by agricultural behemoths like Monsanto and beloved by farmers for their ability to fight off pests, diseases, and drought in the field — have quickly and aggressively entered our food supply. Today, about 90% of all corn, soybeans, and cotton grown in the U.S. is genetically modified.
But as GMOs have grown pervasive, their opposition has become organized and vocal. Monsanto in particular became a high-profile symbol when it engineered some of its first crops to resist a weed killer it also made, which critics say forced farmers to buy its products, endangered the environment, and ultimately didn’t solve the problem it promised to solve: Weeds are now becoming immune to that weed killer. Activists stage worldwide rallies against Monsanto and protest in stores believed to carry its products. In 1999, scientists developed “Golden Rice” to counter vitamin A deficiency, which causes blindness in up to half a million children in developing countries every year. Despite studies finding that the nutrient-rich rice is safe and boosts health, activists have destroyed a field trial in the Philippines, filed to block all field tests and feeding studies, and helped keep it off the market 16 years after its invention. In 2005, two organic food retailers launched the Non-GMO Project, which has gone on to label nearly 35,000 products as “GMO-free.” And in 2012, Canadian researchers, in the face of protests, gave up on genetically modifying pigs to produce less environmentally harmful manure.
Buoyed partly by a rising appetite, at least in certain circles, for food perceived or marketed as “natural” — locally grown, minimally processed, organic — supermarkets and manufacturers including Ben & Jerry’s, Whole Foods, General Mills, and Chipotle have banned or restricted GMOs. State and national legislators have passed or tried to pass GMO-labeling laws; Connecticut, Maine, and Vermont all require some form of GMO labeling. Earlier this year, the White House announced it would re-evaluate its regulatory process for bioengineered crops.
As the GMO debate raged on, Carter and his handful of scientists plugged away. They dabbled in peaches, apricots, cherries, and pears, but ultimately, their budget forced them to focus on just two Arctic varieties, one sweet and the other tart: the Golden Delicious and Granny Smith. “Neal never gave up, ever,” Louisa said.
Until its sale, Okanagan Specialty Fruits was a tiny operation, unlike the mega-corporations that spend an estimated $136 million on developing and getting approval for just one GMO. And in many ways, it still is. Headquarters is essentially the Carters’ home, where family and work are indistinguishable. Over lunch, I sat with Louisa, 57, the co-founder and chief financial officer, and Joel, their 28-year-old son who helps part-time, in their kitchen as they chatted about the work left to do and an upcoming wedding. Apple-shaped magnets pinned family photos to the fridge, next to biotech-themed magnetic poetry (“agriculture and genetically modified biotechnology is exciting research”) and a political cartoon poking fun at the Arctic. A running tally of the harvest was scribbled on the chalkboard near the home office where Carter can be found when not in the fields. Carter estimates he’s worked 60 to 80 hours a week for the last 20 years. “Other people might say, ‘If you work out the net-present value of what we put in and what’s going on, we better quit,’” Lane said. “But not Neal.”
In the late 1980s, a biologist tried to darken purple petunias with an extra copy of the pigmentation gene — but the flowers bloomed white. Something had made the genes cancel, rather than enhance, each other.
The underlying biology, unlocked by Nobel Prize–winning scientists in 1998, involves how genes are regulated in plants and animals. Messenger RNA instructs the cell to create proteins, the building blocks of tissues and organs. It turns out there’s a natural mechanism — RNA interference, as it’s called — that can silence those instruction-carrying molecules. Carter’s team made copies of the browning-controlling genes, slightly modified such that they would trigger RNA interference, and stuck them into the apple genome. As counterintuitive as it sounds, the extra set of genes ultimately prevents the original genes from being expressed.
It’s an elegant solution. But the science wasn’t always clear, and the company ditched hundreds of test fruits before arriving, in 2004, at the Arctic Golden Delicious, No. 743, and the Arctic Granny Smith, No. 784. Carter says Arctics can last up to four weeks with refrigeration, though they still mold and decay eventually. In September, he and I were munching slices of apples that’d been picked the previous fall, cut in January, dried, and never refrigerated. He showed off photos of Arctics ground into neon-bright juices and smoothies.
Carter talks about farming the way kindergarten teachers talk about graduation day. “You see something grow all season long — and boom, there it is, a bin full of apples heading to be packed and off to the marketplace,” he told me. “You get a sense you’re contributing.”
Here on his orchard, clad in a blue fleece and square glasses, Carter looks and acts more like an earnest dad than a mad scientist. But if anything can make a Bond villain out of an apple farmer, it might be the pitched, protracted, and at times deeply personal battle over GMOs. As the Carters planted trees and tinkered with seeds, they increasingly bumped up against a movement that was suspicious not just of GMOs, but of the Arctic in particular.
In 1999, protesters chopped down 652 of the Carters’ personal, non-Arctic trees. In 2006, the Carters changed the name from Okanagan Biotechnology Inc. to Okanagan Specialty Fruits in anticipation of criticism (“We realized the ‘biotech’ handle was a tough one,” Carter said). But the vitriol was unleashed in full view when the company submitted requests to four regulatory agencies in the U.S. and Canada to approve the apple for sale.
“This is ridiculous. Fruit is supposed to brown and go bad, it is part of life,” read one of more than 178,000 letters, most negative, to the U.S. Department of Agriculture from 2012 to 2014. “To change the apple from how it was intended to be could change the way the apple affects us down the road and the harmful effects might not be seen for a couple generations or more.” “Genetically modified food is poison and the biggest threat to our health on this planet! No to GMO apples!!!!!!!” More than 461,000 people also signed anti-Arctic petitions to the USDA in late 2013 and 2014.
Outside the biotech industry’s Chicago convention in 2013, a protester in a gas mask dropped apples into a cart as another tipped it over, yelling, “They put poison on those apples!” Anti-GMO sites disseminated images of apples with syringes and fangs. In a “fact sheet” for the public, Friends of the Earth warned, “From apple pie to baby’s first applesauce and the apple in your child’s lunchbox, apples are a core part of a natural, healthy diet. However, apples are about to become not-so-natural, and consumers, especially parents and other caregivers, may soon want to think twice about that apple a day.” Food and Water Watch warned of danger to the Rosh Hashanah custom of dipping apples in honey. “By next year, something could be sadly amiss with our annual tradition. Our apples could be genetically engineered and our honey could be somewhat endangered.” The U.S. Apple Association’s current stance on the Arctic is that “the choice, very simply, is up to consumers.”
Okanagan Specialty Fruits goes on the offensive by regularly talking to the media, blogging, and answering commenters’ questions. Even in defending its existence to a skeptical public, the company is relentlessly cheerful. “Yesterday we used cookie cutters to make some apple fish, stick those in some blue jello!” Carter joked on Reddit.
But these efforts do little to appease opponents’ criticism. “It’s clever marketing to use the word ‘Arctic’ for white and pure,” Martha Crouch, a biologist and consultant for the anti-GMO group Center for Food Safety, told me. “But in fact … it’s deceptive.”
Now that the apples have been approved, the biggest threat to the Arctic is a consumer boycott, whether formal or informal. “We know it’s not a health concern of any kind,” said Mark Powers, vice president of the Northwest Horticultural Council, which represents the Pacific Northwest fruit tree industry. “It really comes down to perception and marketability.” It’s also unclear whether farmers will want to grow Arctics; they may have problems growing or sending them to countries that restrict GMOs, like parts of the European Union and Japan. Okanagan Specialty Fruits says it’s heard from many interested growers, but won’t name them due to their fear of industry pushback.
Tim Dressel, a fourth-generation apple farmer in New Paltz, New York, told me, “GMO science, despite what a lot of people think, is a really amazing technology … and certainly not something to be afraid of per se. But with the general attitude towards GMOs right now, bringing apples into the mix is not necessarily something we need to do, especially for something that is only strictly a cosmetic issue.”
Indeed, how big a deal is browning, really? “As if this was a huge societal problem that needed to be solved,” Crouch said, laughing. She and her organization argue that all that time and money would be better spent on educating people to stop wasting food and keep produce fresh in old-fashioned ways.
But then again, maybe not. After all, browning and bruising aren’t problems just in apples. In 2008, American retailers and consumers were throwing out 3.7 billion pounds of fresh potatoes a year, a $1.8 billion loss. That spurred J.R. Simplot, one of the nation’s largest privately held companies, to create the Innate potato. Much like the Arctic, the Innate’s bruising-controlling enzyme is turned off. Also much like the Arctic, it’s faced protests as it’s won regulatory approvals over the last year. But Simplot sensed a need. “The number-one consumer complaint for fresh potatoes is bruising,” spokesman Doug Cole told me. Carter isn’t the only one working on inventing a more attractive apple: Over the last 15 years, while Okanagan Specialty Fruits was quietly working on the Arctic, sliced, preserved apples turned into a multimillion-dollar industry. And for Carter, that’s a problem.
Sterile, just shy of freezing, and alive with the roar of a hundred machines, the 60,000-square-foot Crunch Pak factory feels like a comically large operating room for apples. Red, green, and gold orbs bob through chutes of water, march into slicing machines, and plop onto conveyor belts in the form of crescent-shaped chunks. Inspectors in masks, gloves, and smocks then send them to their final destinations: laser-perforated plastic bags and lunch trays sold across America.
If the Arctic is gunning to be the most convenient, easiest-to-eat, longest-lasting apple around, Crunch Pak, the nation’s largest provider of sliced apples, seems like a good place to check out the competition. Located in the sunbaked Wenatchee Valley in Central Washington, it chops up 6 million slices a day.
My guide, the friendly and fast-talking Marketing Director Tony Freytag, founded Crunch Pak with two apple growers in 2000. Early pioneers in the apple-slice business, the trio initially thought, “It’s going to be hard enough to sell this idea because people are going to say, ‘It’s just an apple,’” Freytag recalled. “But we saw where convenience was going.”
It was a prescient observation. American apple consumption has dropped off significantly in the last three decades, from an average of 20 pounds per person per year between 1986 and 1991 to just 16 between 2006 and 2011. Meanwhile, other produce transformed into ultra-convenient forms skyrocketed in popularity. In 1986, a California farmer cut up ugly, broken carrots and single-handedly launched the baby carrot craze. Earthbound Farm in California pioneered bagged lettuce in the late 1980s and early ’90s. Apples missed that wave.
Crunch Pak’s apple slices aren’t genetically modified. But they’re not entirely natural, either. Their magic ingredient is NatureSeal, a proprietary powder of calcium salts and vitamin C invented in the late 1990s. Mixed with water and sprayed on produce, it extends the shelf life of sliced fruits for at least three weeks with refrigeration before they start browning. And it’s been a blockbuster: Crunch Pak’s slices have been sold in virtually every major supermarket — Wal-Mart, Kroger, Target, Sam’s Club, Costco, Publix, Safeway, Albertson’s — and fast-food joints like Carl’s Jr., Arby’s, Chick-fil-A, and Denny’s. The privately held company says it racks up sales in the low nine digits.
The day of my visit was especially busy due to the harvest, so the 800,000 pounds of apples sliced that week were fresh off the trees. But otherwise, Crunch Pak relies on a lot of fruit that was picked up to a year prior, Freytag told me. Harvest happens but once a year, and the industry has devoted almost unbelievable time and effort to stretching out that supply for as long as possible to meet never-ending consumer demand. “The goal is,” Freytag said, “if you eat something in July, it’s going to be just as good as if you ate it in October right after you harvested it.”
A ripening apple takes in oxygen and gives off carbon dioxide. To slow that process, growers and pickers put them in controlled atmosphere storage rooms, the fruit equivalent of hibernation caves, until the time comes to be sliced or shipped to retailers. The temperature is almost freezing, the oxygen severely reduced, the humidity relatively high; a human couldn’t breathe. Even this setup alone can’t feed appetites year-round, which is why retailers import apples from countries like New Zealand and Chile, whose harvest happens during North America’s off-season.
The Crunch Pak factory is to Carter’s farm as Disneyland is to a jungle gym. It is an elaborate and precise operation manned by 900 employees, 24 hours a day, six days a week. Computers and cameras vigilantly monitor every condition, from temperature to humidity to contamination, to spot any risk of spoiling and bruising.
All this chilling, slicing, spraying, bagging, and shipping make the definition of “fresh” a little awkward. “You would never think to go to your refrigerator and slice an apple and cut it up in pieces and put it in a baggie and come back in 10 days and say ‘I’ll have that,’” Freytag admitted. “That’s not a visual that we want.” Refrigeration and preservatives aside, pre-manufactured slices can seem silly. How hard is eating an apple out of your hand — or slicing it yourself? Isn’t pre-slicing rewarding laziness? “Slicing up apples and putting them in plastic bags to turn them into a fast-food item seems to be going in the wrong direction to me,” Crouch, of the Center for Food Safety, said, “rather than helping people reconnect with whole foods.”
But in some cases, the alternative might be eating little or no fresh fruit at all. When Cornell University researchers visited upstate New York schools, they learned that braces and small mouths aren’t ideal for chomping into whole fruit; teenage girls said doing so was “unattractive,” according to their 2013 study. So the researchers gave fruit slicers to eight elementary schools, which then sold an average of 60% more apples. The experiment was repeated at three middle schools, where average daily apple sales went up 71% compared to non-slicing schools. Significantly more students also actually ate the apples, instead of throwing them away. The stark results prodded the Wayne Central School District, which participated in the study, to start offering sliced apples full-time to its 2,300 students.
In a way, a Crunch Pak slice and an Okanagan Arctic are mirror images. The first comes off the tree “natural,” then is subjected to a battery of chemicals and machines designed to make it more palatable. The second grows with its engineering already built in, and then can be eaten plain. But both companies are rivals in the race to make apples convenient and fresh for as long as possible, and both approaches are fundamentally similar: They’re complicated, expensive inventions used by humans to wrest control of nature, united by the underlying principle that rather than adapt American eating habits to the fruit we have, we should adapt the fruit to the eating habits we have.
Back on the Carters’ Summerland patio, I found myself reaching right over the normal, slightly browned Golden Delicious slices to snatch a white Arctic, succumbing to some deep preference for whatever looked freshest, prettiest — and easiest.
“Most people don’t really recognize that fact, but there’s a lot of people who will only eat an apple after it’s sliced,” Carter said, watching me nibble on one slender piece after another. Otherwise, “you’ve got to get that knife out, that cutting board out, slice it up, deal with the core. People say, ‘I’m going to buy grapes or something else I can just pop in my mouth.’ Those are the guys we’re going after.”
You might call the other threat to the Arctic the accidental Arctics. These are a handful of apple varieties, crossbred in recent years, that somehow ended up with lowered PPO. There’s the RubyFrost, the Eden, and the big golden Opal, which is not-so-subtly advertised as a “non-GMO apple” that “naturally doesn’t brown,” and “the first U.S. apple variety to be verified by the Non-GMO Project.” Not surprisingly, Carter isn’t a fan of these newcomers. He doubts they’re actually non-browning. He calls the Opal “not a terribly good eating apple” and points out that his technology can transform any (tasty) variety.
But to those concerned about genetic engineering, these apples may sound like ideal alternatives.
GMO opponents like the fact that non-browning apples such as the Opal were created through the marriage of two familiar fruits — not by a gene-silencing technique like RNA interference, which they worry could change the genes of someone who eats food altered by it.
Those making this argument often cite a 2011 study in Cell Research. A team led by Nanjing University in China reported finding bits of rice RNA in the blood of men, women, and mice, which was surprising: Never before had these types of molecules been found to survive digestion and cross into the bloodstream. Even more alarming, the scientists reported a sign of bodily change and perhaps harm: One molecule appeared to shut down a gene involved in removing unhealthy cholesterol.
The study never mentions GM crops. Nevertheless, activists interpreted it to mean that the same kind of genetic engineering behind the Arctic could allow small RNA molecules to manipulate human gene expression in potentially harmful ways. A dozen environmental and consumer organizations cited the paper in asking corporations like Burger King, Subway, and Baskin-Robbins to boycott Arctics. Baby-food maker Gerber, McDonald’s, and Wendy’s responded to say they had no plans to use them.
Yet many biologists derided that reaction as overly cautious at best and alarmist at worst. Attempts at replication of the study showed no more than trace amounts of RNAs in the blood of monkeys, mice, honeybees, and athletes, even after they ate food chock-full of those molecules. Some other, controversial studies argue that small plant RNAs, somewhat similar in function to the ones that suppress browning in the Arctic, may affect various species’ physiology under specific conditions. “Of all the different genetic modification strategies you can use, RNA interference is probably the one that has the potential to be the safest and most specific,” said Kenneth Witwer, a Johns Hopkins University molecular biologist who was among those who failed to replicate the findings, adding, “The weight of the evidence in the field is that this is not a phenomenon we have to worry about.”
Perhaps the more significant point is that no technology is risk-free. Even crossbreeding, the classic agricultural practice, is unpredictable because genes are transferred at random. “If you cross two red apples, you can get some yellow apples just because there’s dominant genes and recessive genes,” said Susan Brown, who leads Cornell University’s apple-breeding program. She recently crossed two breeds, certain she’d get extra flavor. “I ended up with a lot of progeny that tasted like soap.”
In the late 1960s, a research team crossbred the Lenape potato, only to discover it was genetically predisposed to produce a lot of an alkaloid called solanine — a natural defense mechanism that, in large doses, can kill humans. Celery naturally contains psoralens, irritant chemicals that ward off pests and diseases. But grocery store workers have experienced skin rashes after handling celery bred to have increased psoralens.
GMO critics take issue with how the U.S. and Canada evaluate the safety of GMOs for consumption: by the product, not the process by which it’s made. Developers are asked to identify the new genetic traits; new toxins, allergens, or proteins; and nutritional changes. If regulators conclude that food from the new plant variety will be as safe as food from conventionally bred varieties, as in the Arctic’s case, the crop is approved.
It’s true the system is set up to catch established, not unknown, toxins and allergens — and, again, no technology is risk-free. But genetic engineering introduces relatively few proteins compared to other methods of producing new traits. And after two decades, there hasn’t been any credible evidence to suggest that GMOs harm human health.
Okanagan Specialty Fruits likely wouldn’t have existed if not for the Carters’ siblings, cousins, aunts, uncles, and friends, like Carter’s former Agrodev boss and local growers — the bulk of the company’s roughly 45 shareholders. But that support came with a unique pressure. “It put some responsibility on my shoulders, because they’re not rich people,” Carter said. When he felt pessimistic at times, he’d suggest that they hold off on investing; they’d respond, “No, no, no, Neal, we trust you; you’re going to get this done,” he recalled.
In the winter and spring, when regulatory approvals seemed all but sure, Carter began to realize that, as much money and effort as his little company had poured into the first 19 years, their resources would almost certainly not be enough to get the Arctic to growers around the world, advertise it, and sell it. He began talking to Intrexon, a $4.5 billion synthetic biology company with an eclectic set of businesses that engineer cow reproductive technologies, fast-growing fish, and disease-curbing mosquitoes. The team hadn’t necessarily been looking to sell, but they realized that doing so could finally reward their investors, some of whom had died over the years. In February, two weeks after the U.S. Department of Agriculture approved the Arctic, Intrexon bought Okanagan Specialty Fruits for $41 million.
For an operation that from the start prided itself on being tiny and beholden to no one, the sale to a big biotech corporation seemed like quite a change. “Okanagan Specialty Fruits is a small, grower-led company with just seven employees, which often makes us seem like a small fish in a very big pond,” it blogged in 2013, referring to bioengineered food giants like Monsanto, Syngenta, and Bayer. Two years later, Carter told me that despite the sale, “We still have the same team in place. We’re still doing all the same things.”
Intrexon is not Monsanto. Still, CEO Randal Kirk sees it as building a brighter, more efficient world in which food is either ultra-unique (the Arctic) or ultra-cheap (salmon that grow in half the time). “I don’t think our size or our capital should be counted against us by virtue of those facts alone,” he said in an interview. “I would simply ask people eventually to judge us according to what we do, what we offer. In the case of the Arctic apple, I think everybody who has tried it has liked it, and that encourages us greatly.”
Under Intrexon’s wing, Okanagan Specialty Fruits is beginning to distribute saplings to growers. Next, its scientists are considering using their technology to alter other apple varieties, or turn off browning in pears and cherries, or make peaches resist diseases. The possibilities are many. After all these years in the field, it’s almost time for Carter to think about projects besides the Arctic. But for now, he’s just happy to finally start sending Arctics on their way to grocers, restaurants, and homes. “Does it taste like a GM food?” he asked me outside his house that day, the last slivers of the Arctic still pale in the summer sun. “Really. Can you distinguish it?” And the truth was, no. It tasted sweet, tender, and crisp. It tasted like an apple.
Anti-GMO Fear Mongering Slapped Down
In my estimation, genetically modified organisms (GMOs) attract fear mongering of the highest order, second perhaps only to vaccines. For reasons that escape me, there has been a concerted effort to marginalize or outright stop the use and development of GMOs at every level of the food supply. The expressed reasons vary, ranging from conspiracy-laden anti-corporate narratives to Frankenfood fears of the unknown. I have noted a severe ideological bent to these objections, which defy the scientific evidence of safety and efficacy. They are narratives built almost entirely on the nonscientific foundations of chemophobia, the naturalistic fallacy, and fear of the unknown.
Over the last few years the fear mongering has grown. With that growth has come a reasonable-sounding tactic demanded by some advocates: mandatory labeling of GMO foods.
For a variety of reasons I outlined in my previous post, “GMO Labeling: Consumer Protection or Fear Mongering?”, requiring the labeling of GMO food is an attempt to foster the false idea that GMO food is a significantly dangerous product, akin to cigarettes, which carry a warning label because of their demonstrable harm. The goal is to mislead consumers with a patently false equivalence. Labeling has been strongly supported by the organic food industry, which uses its own self-regulated labels to charge a premium for an essentially undifferentiated product. For these obvious reasons the organic industry has funded and promoted the push to label GMO products, culminating in a recent petition to the FDA for a government-mandated GMO food label. This month the FDA responded with a definitive and comprehensive denial of the petition; a PDF copy of its response is freely available.
The FDA pretty much sums up my position. Specifically, it pointed out:
The petition does not provide evidence sufficient to show that foods derived from genetically engineered [GE] plants, as a class, differ from foods derived from non-GE plant varieties in any meaningful or uniform way, or that as a class, such foods present any different or greater safety concerns than foods developed by traditional plant breeding.
All agriculture poses risks to the population and the environment. Despite decades of scrutiny there is no evidence that GMO crops pose any greater risk than conventional agriculture. The FDA requires labeling if a product poses a direct health risk, is misleading, or differs significantly in nutrition from similar products. There is no evidence of any of this in the case of GMOs, though that lack of evidence has not persuaded anti-GMO advocates. Belief, however, does not equate to fact.
The proposal sounds logical because some food is already labeled organic. But even though the organic food lobby would like you to believe its labeling equates to superior quality, that belief too is unfounded, as detailed in Skeptoid episode 166, episode 19, and multiple blog posts on this site.
Organic labels are marketing claims. They have a specific meaning, and producers charge a premium price for them. Although the label’s purveyors want consumers to perceive it as a promise of wholesome superiority, it is realistically just a sales device. Conventionally grown tomatoes are just as healthful as their organic siblings, and organic candy bars and chips are the same junk food at a higher price. Other consumer labels work the same way: a “Made in the USA” label says nothing about the quality or construction of a product, or any of the other factors consumers might want to take into account. Such labels make a product desirable to a segment of the market, attracting a certain population or justifying a premium price; they have little to do with the product itself.
In its response to the petition, the FDA said that mandatory labeling of GMOs makes a claim that the product differs in quality, content, or safety. The response goes on to explain, unequivocally, that there is no evidence to support that position, saying, in part:
The simple fact that a plant is produced by one method over another does not necessarily mean that there will be a difference in the safety or other characteristics of the resulting foods. The determining factor is the final food product and its objective characteristics in comparison to its traditional counterpart, not the process used to produce the plant from which the food was derived.
Further, the response notes:
Although foods from GE plants may not have been on the market for the length of time as plants produced through conventional plant breeding techniques that does not mean that all resulting foods are any less safe. To date, we have completed over 155 consultations for GE plant varieties. The numbers of consultations completed, coupled with the rigor of the evaluations demonstrate that foods from GE plants can be as safe as comparable foods produced using conventional plant breeding.
The FDA response also gives a complete, detailed refutation of the petition’s specifics, which were based on what the anti-GMO community points to as the best evidence against GMO products. It addressed claims that GMOs pose unknown risks, are responsible for harmful environmental effects, and that consumers are demanding labeling.
Rather than offering blanket dismissals, the FDA was clear and precise: it broke down the research and dismantled it scientifically. The response letter relates how the evidence was flawed and limited, and how it was used to buttress unsupported conclusions, then reviews why each claim fails and how those failures apply under the agency’s statutes. It was cathartic to read. Frankly, it gives me a glimmer of hope that at least some parts of the US government assigned to protect us act on science, not ideology.
I am not saying that genetic modification is without risk or that corporations are purely benevolent; rather, these problems attend all food production equally, and it is senseless to single out a promising new technology because we are afraid of what we do not understand.
For the fear mongers: you can still buy organic food and lessen your anxiety. Meanwhile, much of the world endures daily hunger. As long as children are starving, it makes no sense to stifle the science that offers the best chance of blunting that growing problem, especially when there is no demonstrable harm.
You're Fat, Not Fit
It’s bad news for the big-boned: the popular idea that you can be “fat but fit” appears to be untrue.
Even the fittest obese people are 30 per cent more likely to die early than the most out-of-shape individuals of a normal weight, a major study has shown. The results suggest that obesity is more dangerous than unfitness, and losing weight is more important than hitting the gym. Celebrities such as Adele, Nigella Lawson and the actor Jack Black have claimed that it is possible to be fat and fit. However, scientists said that people who maintain that their weight is not a problem are probably kidding themselves and the harm will appear in time.
The study also found that athlete-level conditioning offered no greater protection against early death than average fitness. Researchers in Sweden looked at data on 1.3 million men who were weighed and given cycling tests when they were conscripted at 18, between 1969 and 1996. They found that the fittest conscripts were 51 per cent less likely to have died since the tests than the least fit.
However, the fittest obese men were 30 per cent more likely to have died than the least fit men of normal weight, researchers report in the International Journal of Epidemiology. The least fit obese men were 21 per cent more likely to die early than the fittest obese men.
“If you’re obese and unfit, it’s better to lose weight than to go for your fitness,” said Peter Nordström, senior author of the study. Even the least fit people who were merely overweight at 18 were 13 per cent less likely to have died than the fittest obese men.
It was unclear whether fatness and unfitness caused early death or were markers of dangerous genetic factors, but the finding “challenges the . . . idea that obese individuals can fully compensate mortality risk by being physically fit”, Professor Nordström said. Studies which have suggested there is such a thing as “healthy obesity” had tended to look at older people, which could explain why results differed when assessing people at 18.
Among people less fit than average, the risk of early death fell steadily as they got fitter, but beyond average fitness it did not fall further.
“It’s dangerous to have fitness below the mean but if your fitness increases it doesn’t mean you have a lower risk of dying. People believe that if they train hard they have lowered their risk. But you don’t have to be Andy Murray. He will have no lower risk of dying [early],” Professor Nordström said.
Matt Capehorn, of the National Obesity Forum, said exercise was important to reduce cardiovascular disease, as an obese person who exercised was less likely to develop heart disease than a normal-weight person who did not. However, diseases such as diabetes were linked to obesity. “There appears to be no such thing as fit and fat, and even those showing no metabolic disturbance due to their obesity . . . are still statistically much more likely to become metabolically unhealthy,” he added.
Sirtfood Diet
Finally – a way to lose weight that involves eating, not going hungry. You won’t have heard of it yet, but the diet of 2016 is all about everyday ingredients – many of which you’ll already have in your cupboard or fridge – that experts say turbocharge weight loss and help you to live longer. They’re called sirtfoods and, according to researchers, they mimic the effects of fasting and exercise.
Still not persuaded? What about the fact that these plant-based ingredients include red wine (especially pinot noir), chocolate (so long as it’s at least 85 per cent cocoa content) and coffee (ideally black)? For a certain wealthy, body-conscious set in west London, the sirtfood diet has become a way of life. Converts already include heavyweight boxer David Haye, the Olympic gold-medallist sailor Sir Ben Ainslie and Lorraine Pascale, the model and BBC television chef, all of whom say they have never looked, or felt, better.
So, as well as wine, chocolate and coffee, what are these sirtfoods? It’s no coincidence that among the top 20 (see below) are ingredients that form the basis of diets in parts of the world that boast the lowest incidence of disease and obesity, such as the Mediterranean and Japan (where people eat five times as many sirt ingredients as we do).
Experts even claim that including just a few of these ingredients in your diet can counterbalance some of the junk food we eat. It’s as though they nuke the bad stuff.
To understand how the diet works, a quick biology lesson. Sirtfoods are rich in specific natural plant chemicals that activate genes in our bodies. These genes are known as sirtuins and they first came to light when researchers discovered that resveratrol, a compound in red-grape skin and red wine (especially pinot noir), dramatically increased the lifespan of yeast. Why the excitement? Because from yeast to humans and everything in between, the fundamental principles are nearly identical. If you can manipulate something as tiny as yeast successfully, then the potential exists for the same benefits in humans.
Sirtfoods have since become a focus of attention in the world of nutrition science, and many more sirt-rich ingredients have been discovered. Sirtuins are special because they orchestrate processes deep within our cells that influence our susceptibility to disease and, ultimately, our lifespan. Experts even argue that, if sirtfoods are eaten in the right quantity, their discovery could be as profound as the moment vitamins were understood 100 years ago. Over the past decade, pharmaceutical companies have invested vast sums into researching a magic sirt pill. It turns out it’s much more effective to actually eat.
The irony is that Aidan Goggins and Glen Matten, the men behind the sirtfood diet, didn’t set out to launch a weight-loss programme. Since writing The Health Delusion, an award-winning book lambasting the supplement industry, the authors, who have master’s degrees in nutritional medicine, have been fascinated by sirtfoods and how to harness their potential – not in pill form, but by eating real food. However, their schtick is all about wellbeing and fitness. They did not foresee the dramatic weight-loss potential: “Dieting has never been our thing.”
The latest food phenomenon began at a high-end health club, KX, in South Kensington, London, where Goggins and Matten work as consultants. Last year, they had what sounds like a simple idea. Most people have some sirtfoods in their diet already, but not nearly enough. What would happen if they packed someone’s diet with as many of the richest-known sources of sirtuin as possible?
Their timing was serendipitous. We’ve become a nation of nutrition nerds, obsessed by carefully calibrated eating plans. We’ve learnt how to pronounce quinoa. We know all about goji berries.
What they didn’t realise is that sirtfood takes nutrition science to the next level.
In a pilot trial, Goggins and Matten put 40 people on a new regime (only one person dropped out). It began with three days of juices made from the sirtuin-rich ingredients kale, celery, parsley and matcha tea (like normal green tea on steroids) plus one sirt meal, followed by four days of two juices and two sirt meals. The juices were designed to be the mainstay of the diet, packed with nutrition (the only supplements clients required were selenium and vitamin D). The average daily calorie quota over the seven days was 25 per cent less than the recommended average for women, and about twice the allowance of a fast day on the 5:2 diet; most of the participants were already in good shape. If the diet helped these people, surely it would have even more benefits for the rest of us.
Within a week, clients were ringing up saying that not only did they feel amazing, but weight was melting away. Indeed, the results were so dramatic that Goggins and Matten – who describe themselves as “serious sceptics” – spoke to the kitchen staff to check they were serving the correct portions. “This seemed too good to be true,” says Matten. “We kept thinking these must be outlying results, but it kept happening.”
“The original trial was all about stimulating rejuvenation and cellular repair,” adds Goggins. “We had no concept that the average weight loss would be half a stone. Because there was a degree of calorie restriction, we knew that people would lose weight, but this was way beyond what we imagined.”
By making sirtfoods the centre of an eating plan, they realised they were effectively boosting the effects of cutting calories, which is why clients could eat more for part of the trial, but still lose weight. Thus, without really looking for it, they had an irresistible formula on their hands. Eat, lose weight, don’t get hungry (or, for that matter, hangry – that broiling rage that overcomes anyone on the 5:2 at about 4.30pm on a fast day).
It seemed that combining sirtfoods – rather than eating them in isolation – actually increased their potency. Clients were also increasing their muscle mass, which wouldn’t usually happen on a plant-based diet. Most weight-loss regimes falter because people lose muscle as well as fat, and if they do lose weight they end up looking gaunt. But with this regime they were toning up, even without exercise. “ ‘Hold on a minute,’ we thought,” says Matten. “ ‘This is bigger than we realised.’ ” It was as though they were putting people’s bodies through a reboot.
I’m on Day 2 when I meet Goggins and Matten at their club. I’ve got my second juice of the day in my handbag and I’m wondering how early is too early to eat my ration of dark chocolate. I’m in luck. Alessandro Verdenelli, the club’s head chef, serves a sirt lunch, including carrot spaghetti with a puttanesca sauce and chocolate tart for pudding. I eat everything. Not bad for a diet. The following day, I’ve lost 2lb. It’s slightly weird, as though my body is cannibalising itself. The only problems are the distinctive squeal of the juicer as it tries to eviscerate leafy kale, and the bits of parsley between my teeth.
But couldn’t one argue that the weight loss is merely down to cutting out calories? “The fact is, the calorie deficit is not massive,” counters Matten. “The overall weight loss is over and above what we would expect. That is what piqued our interest. What else is at work here?” He says many clients reported back that the portion sizes were so big, they couldn’t finish them. It turns out sirtfoods both fill you up and suppress your appetite.
And what about the flaw in every diet: the weight regain when you stop? “I have more emails from people saying they might be losing too much in the long term than saying it has come back on,” replies Goggins. He also suggests that some people can do the diet every three months for a fat-loss boost.
The first week is a hyper-success phase to get people motivated. There follow another seven days on a more relaxed regime of three meals a day. “That’s when the real wellness derives,” says Matten. He is keen to make clear that this is as much about wellbeing as weight loss. (There is a sense that the weight loss is almost annoying, which helps, paradoxically, to convince me of its efficacy. Especially after three days when I’ve lost another pound.)
Everyone at the gym started talking about the diet. The thrice-daily green juice became as popular as the soy lattes. Jodie Kidd had a nearly nude photoshoot coming up and asked Goggins and Matten to put her on the diet. Sir Ben Ainslie wondered if he’d have won even more Olympic gold medals (four at the last count) if he’d been eating sirtfoods. Another client, David Carr, a member of Ainslie’s America’s Cup team, lost almost 2 stone – and reversed his type 2 prediabetes. Sixty clients have now done the diet, with the average weight loss 7lb in 7 days.
Lorraine Pascale has been sirting (it’s a new verb already) for a year now. She tells me the diet has been “life-changing”. She sleeps better, her memory has improved, her skin is clear – and her body is leaner. “I’m 43 and I’ve never been happier with the way I look,” she says. She has lost weight, but for her the best benefits have been physical and mental.
The former heavyweight boxing champion David Haye is training for his comeback fight later this month. For three years he’s battled a serious shoulder injury. He’s eating sirtfoods. A year ago, he was carrying 22lb in excess body fat. Not any longer. “I always knew I wasn’t optimising my nutrition like I should be,” he explains, despite being vegan. “Boxers are a bit archaic on the nutritional side of things. I’m 35, and I knew I had to live smarter.”
His problem – apart from being sedentary because of injury – was eating enough healthy food and not losing muscle. “If you lose too much, you don’t have enough energy to train. I need enough nutrients to train hard – twice a day, for six days a week – and to recover.” He has his sirt diet delivered to him every morning: four meals, two juices and one snack. “This is about 50 years’ time,” he says. “The damage I do to my body and what I can do about it now. It is possible to eat like this for the rest of my life.”
How do I fare? Over 7 days, I lose 6lb, and fall off once. My partner, who does it too, goes down two jeans sizes. It takes a couple of days to get used to, but it’s a revelatory way to eat.
I predict it is only a matter of months before a certain well-known sandwich chain starts flogging sirt shots – yoghurt-size pots of strawberries, walnuts and chocolate flakes to sprinkle over porridge. We’ll have sirt Thursdays (just as there were office 5:2 Mondays two years ago) and hold sirt suppers.
Unlikely as it sounds, expect a run on kale.
Cooking Set Us Free
Forget about brains. If you want to know the real secret of humanity’s success it is this: at some point we stopped wasting half our waking hours chewing yams and instead tucked into tenderised sliced meat.
That is the thesis of Harvard scientists, who noticed that human heads and jaws changed markedly about 2½ million years ago. They have attributed the change to food preparation methods they think freed us up to stop masticating and start building civilisations.
To reach their conclusion, they went back to basics — asking subjects to eat raw goat’s meat in a laboratory, with only prehistoric tools to prepare it.
“This is a topic of great importance to everyone on the planet, but it is something we often don’t think about: just how important chewing is in terms of how we live,” Daniel Lieberman, professor of human evolutionary biology, said.
“A chimp gets up in the morning, eats as much as it can, waits until its belly is empty, eats as much as it can and so on. At some point in human evolution there was a shift and we started to chew less. Typical humans in non-industrial societies spend 5 per cent of the day chewing. Most of us spend much less than that — just a few minutes of each day. This shift is extraordinarily important. It is made possible by the fact we eat food that is heavily processed.”
The most important form of food processing is cooking, but what interested Professor Lieberman was that the evidence suggested that cooking did not develop until 500,000 years ago — almost 2 million years after the first evidence of meat eating.
Meat eating is crucial because it enables us to get more calories fast, but that only works if we do not have to chew it too much. Professor Lieberman gave people in his laboratory a diet of two thirds raw vegetables and one third raw goat’s meat. He said the goat’s meat was very difficult to eat until he chopped and pounded it, after which participants were at least able to swallow it. When they did, they received the same calorific intake as a vegetable diet, but with significantly fewer chews. For each per cent of calories replaced by meat there were 156 fewer chews per day. Replacing a third of the calories with meat gave them 2 million fewer chews per year.
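As a quick sanity check, the per-day and per-year figures hang together if the 156-chew saving is read as applying per percentage point of dietary calories replaced by meat — an assumption of mine, not something the article spells out:

```python
# Sanity check on the chewing figures from Lieberman's study.
# Assumption (mine, not stated in the article): the "156 fewer chews
# per day" saving applies per percentage point of dietary calories
# replaced by processed meat.
CHEWS_SAVED_PER_PERCENT_PER_DAY = 156
PERCENT_REPLACED = 100 / 3   # "a third of the calories"
DAYS_PER_YEAR = 365

daily_saving = CHEWS_SAVED_PER_PERCENT_PER_DAY * PERCENT_REPLACED
yearly_saving = daily_saving * DAYS_PER_YEAR

print(f"{daily_saving:,.0f} fewer chews per day")
print(f"{yearly_saving / 1e6:.1f} million fewer chews per year")
```

Under that reading, roughly 5,200 fewer chews per day works out to about 1.9 million per year, which matches the "2 million fewer chews per year" figure.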
Professor Lieberman’s thesis, published in the journal Nature, is that this change allowed humans to devote that excess chewing time to other activities and also precipitated a change in skull size that, long before the arrival of cooking, gave us the teeth we have today. “If Homo erectus walked into a dentist’s office, the dentist would have a hard time telling the difference,” he said. Except, perhaps, for the fact that there would be a lot more raw meat stuck between the teeth.
What Is 'Natural'?
I.
Americans have until May 10th to help the Food and Drug Administration with one of philosophy's greatest riddles: What is the meaning of "natural"?
Given our current attitudes, the riddle might be better described as religious. Data show that 51 percent of us shop for "all natural" food – shelling out some $40 billion a year on these products. We even choose natural over organic, market analysts have found. Natural has become the non-denominational version of kosher, and orthodoxy is on the rise.
The religiosity is apparent in the 4,863 public comments that have already been submitted to the FDA online. Natural and unnatural read like Manichean synonyms for good and evil. Some comments are explicitly theological: "Natural should be limited to those ingredients that have been created by God." Others refer to violations of Mother Nature's intentions. Behind virtually all of them pulses an intense desire for salvation from modernity's perceived sins: GMOs, pesticides, chemicals, artificiality, synthetics. We ate, greedily, from the tree of scientific knowledge. Now we are condemned to suffer outside of Eden, unless we find a natural way back in.
Fair warning, though: Crowdsourcing theology is no easy task. This latest effort is actually round three for the U.S. government. Back in 1974, the Federal Trade Commission proposed codifying a simple definition: "Natural" foods are "those with no artificial ingredients and only minimal processing." Public comments poured in. The FTC deliberated for nine years, then gave up.
"A fundamental problem exists," explained then-chairman James C. Miller. "The context in which 'natural' is used determines its meaning. It is unlikely that consumers expect the same thing from a natural apple as they do from natural ice cream."
The FDA's first attempt met with a similar fate. In 1991 the agency invited input on the definition of "natural," noting widespread belief that natural foods are "somehow more wholesome." But like the FTC, the FDA also gave up, this time blaming the failure on us: "None of the comments provided FDA with a specific direction to follow for developing a definition."
That was fine until 2009, when a wave of lawsuits started to hit food manufacturers. Plaintiffs argued that Snapple's "all natural" designation was deceptive because its drinks contained high fructose corn syrup. Ditto for many of Nature Valley's products — which, it was noted, were deceptively festooned with "images of forests, mountains, and seaside landscapes." Twin lawsuits against Ben and Jerry's and Häagen-Dazs helped to clarify what consumers expect from "natural" ice cream — not Dutch-processed cocoa, apparently, which is alkalized with potassium carbonate, a synthetic ingredient. Even Whole Foods — the Church itself! — is currently being sued for advertising its bread as "all-natural," despite containing sodium acid pyrophosphate, a synthetic leavening agent allowed in organic products (you might know it as baking powder).
Fearing endless and ambiguous legal woes, representatives of the food industry issued petitions requesting that the FDA standardize the term. At the same time, the Consumers Union, a non-profit associated with Consumer Reports, called on the FDA to prohibit any use of the word or related derivations. (One wonders how the group envisions this playing out for Nature Valley, Back to Nature, Amy's Naturals, Organic by Nature, and the countless other companies whose names incorporate derivations of natural.)
I spoke about the wisdom of defining natural food with Georgetown Law professor and false advertising expert Rebecca Tushnet. "My initial reaction is that it's a good idea," she tells me. "People think natural is better than organic, but natural doesn't have a specific meaning. That's confusing. Corporations also need a clear definition so they can use the term and stop getting sued."
Her position makes sense. After all, rabbinic courts have established rules about the meaning of kosher. Otherwise the kosher seal would be useless. The time has come for government authorities, with our help, to do the same for the meaning of natural food.
II.
Before attempting to answer the question of what "natural" means, it's worth noting that until recently, no one really asked it.
Though the distinction between natural and artificial — that is, made by man's art — dates back at least to Aristotle, the popular romanticization of natural food stands in stark contrast to pre-modern culinary philosophies. In keeping with the idea that you are what you eat, refined people ate refined food. According to historian Rachel Laudan, "for most of history people wanted the most refined, the most processed, the most thoroughly cooked food possible. This was regarded as the most simple and natural food, because all the dross had been removed by the purifying effects of processing and cooking, particularly fire. Ideal foods were sugar, clarified butter or ghee, white bread, white rice, cooked fruit, wine and so on."
Similarly, classical Chinese texts routinely express pity for early humans who, without the benefit of agriculture and cooking technology, were forced to eat directly from nature. "In ancient times," reads the Huainanzi, "people ate vegetation and drank from streams; they picked fruit from trees and ate the flesh of shellfish and insects. In those times there was much illness and suffering, as well as injury from poisons." Only through the alchemy of cooking, these Chinese philosophers concluded, could "rank and putrid foods" be transformed into something good to eat.
Both in the East and the West, there has always been a minority of ascetics who denied themselves cooked, flavorful food and the products of agriculture. But unlike today, such ascetic denial was intended to distance the practitioner from the physical world, nature included. The ideal wasn't unprocessed food, but rather no food at all. Early Daoist tales tell of "spirit men" who subsisted entirely on wind and water.
"Food was flesh and flesh was suffering and fertility," writes the scholar Caroline Walker Bynum, describing the attitude of pious medieval Christian women. "In renouncing ordinary food and directing their being toward the food that is Christ, women moved to God...by abandoning their flawed physicality."
The turn towards redemptive natural foods didn't begin until the 18th century, when Romantics, led by Rousseau, began looking to the culinary past for guidance. Haute cuisine was blamed for the vices of the rich; country food bred virtuous peasants, their nature unspoiled by human artifice. "Our appetite is only excessive," wrote Rousseau in 1762, "because we try to impose on it rules other than those of nature."
But among those who favored the culinary dictates of nature, there was little agreement upon their content. For Rousseau, it was vegetarianism: "One of the proofs that the taste of flesh is not natural to man is the indifference which children exhibit for that sort of meat." This idea gained traction in the 19th century, most famously in poet Percy Bysshe Shelley's 1813 essay A Vindication of Natural Diet, which blamed flesh-eating — "unnatural diet" — for a litany of woes including disease, crime and depravity. Some physicians were convinced, but many others continued to emphasize the centrality of meat to our natural diets. A popular medical text of the late 19th century expresses the tension in a section that could easily apply today:
"On my table are two books on the diet question, written by two well-known physicians. One proves at great length that the natural diet of man is the vegetable diet. Meat, this author claims, is unnecessary and injurious. ... The other author differs from the foregoing very radically. In his view the natural diet of the normal man is largely flesh food. When doctors disagree who shall decide?"
Only with the dominance of mechanized food production did the argument over "natural" begin to focus on the deleterious effects of processing, and come to look something like what it does in the FDA comments. In the mid-19th century, health food pioneer Sylvester Graham (of graham cracker fame) advocated for vegetarianism, but also for the superiority of whole grains and natural, unprocessed foods.
"It is nearly certain that the primitive inhabitants of the earth ate their food with very little, if any artificial preparation," he wrote approvingly, in stark contrast to the ancient Chinese. "Food in its natural state would be the best."
During the same period, food chemistry exploded — accompanied by concerns over dangerous chemicals. In her history of sugar, Wendy Woloson reports that as early as the 1830s, the medical journal The Lancet carried articles warning about popular British candies, exported to America, that were adulterated with "red oxide of lead, chromate of lead, and red sulphuret of mercury." These candy makers also used cheap, poisonous dyes to attract children. Nor was it just children: People suffered the ill effects of strychnine in beer, sulphate of copper in pickles, and countless other poisonous additives that proliferated in a largely unregulated food industry.
Notwithstanding increased oversight — most prominently the 1906 establishment of the FDA — 20th century agricultural developments brought additional concerns. In her 1962 bestseller Silent Spring, Rachel Carson called attention not only to the environmental harms of pesticide use, but also to their presence in our foods. "Packaged foods in warehouses are subjected to repeated aerosol treatments with DDT, lindane, and other insecticides, which may penetrate the packaging materials," she wrote. To make matters worse, Carson warned that the government was powerless to protect us: "The activities of the Food and Drug Administration in the field of consumer protection against pesticides are severely limited."
Given the last hundred years of food history, it's hard not to sympathize with those who venerate natural food. Medical authorities have come to agree with Graham on the benefits of whole grains. Diets rich in highly refined carbohydrates – the kind found in cookies, chips and other processed snack foods – and sugary drinks are implicated in rising obesity rates and related health problems. Meanwhile, articles run on a near daily basis about the potential dangers of synthetic chemicals used to produce and package these foods. The powerful corporate giants that produce them spend heavily to influence science and public policy. Worst of all, there appears to be a revolving door between the companies and regulatory agencies.
It's no wonder that people are scared. Skepticism seems warranted — which means that faith in the most recent incarnation of "natural" food, far from being irrational religiosity or a relic of the romantic past, might be a good way to keep ourselves and our families safe.
III.
Despite these legitimate concerns, the long and checkered history of "natural" cautions against an uncritical embrace of the term, especially as some kind of panacea.
Philosophers warn of the "appeal to nature" fallacy, in which good is equated with natural. In addition, there seem to be nearly insurmountable difficulties with defining the term in the first place. Even the well-known food writer and activist Michael Pollan sees no real way forward. Confronted by "such edible oxymorons as 'natural' Cheetos Puffs," he throws up his hands: "Nature, if you believe in human exceptionalism, is over. We probably ought to search somewhere else for our values."
Nevertheless, in the very same essay, Pollan indicates that some common sense version of natural really should guide our choices. It's not hard, he says, to figure out which of two things is more natural: "Cane sugar or high-fructose corn syrup? Chicken or chicken nuggets? GMOs or heirloom seeds?" The opposite of natural, on his reading, is artificial or synthetic, and it's clear that the former should be preferred to the latter.
But is that really true? I interviewed philosophers and chemists to see if there was some kind of consensus on the matter. It turns out that those who think professionally about the issue are no less confused or divided than the rest of us.
Take the philosophers. Joseph LaPorte of Hope College specializes in the language we use to classify the natural world and has written extensively on the idea of "nature" and "naturalness."
"To be sure, natural doesn't mean safe," he told me. "Nature produces some of the most formidable toxins in the world. But when it comes to packages of chemicals, as they exist in foods or fragrances, nature is a good bet, or at least a clue, because coevolution often suggests its safety and efficacy."
Not so fast, says York University's Muhammad Ali Khalidi, also a philosopher of science who specializes in classificatory language. "Something very recent might be safe," he points out, "and something that's been around for hundreds of years could be very dangerous." Case in point: Ayurveda, or traditional Indian medicine, has long prescribed herbal remedies that contain dangerous heavy metals. Smoked meats, a mainstay of non-industrial food production, are now known to increase cancer risk.
Nor is the lack of consensus limited to the safety of natural food. Scientists also disagree on whether it makes sense to distinguish natural from synthetic products at all. Richard Sachleben, an organic chemist, told me flat-out that all chemicals are natural. Petroleum, he explained, was originally algae. Coal used to be forests.
"The natural enthusiasts, they like to distinguish things based on origin," he says. "But that doesn't make any sense. Think about this: I could raise a pig in my backyard, and feed it corn that I grow myself. I could slaughter the pig and render the fat. I could ferment my corn and distill out the ethanol. Then I could boil wood ash, put this all together, and make bio-diesel. It would look no different chemically than if I used products derived from petroleum."
But when I talked to Susie Bautista, a long-time flavor chemist turned blogger, she had no problem distinguishing between natural flavors — "which are made with natural starting material, like fruits, roots, leaves and bark" — and artificial flavors that are synthesized, bottom-up, out of chemical building blocks derived from sources like petrochemicals.
"I think it's entirely reasonable to want natural flavors," she says. "As a Mom and a consumer, I would lean towards natural flavors."
What, then, should we take from all this? If nothing else, the issues surrounding "natural" do not admit of easy answers. Those who shop for natural foods and fear "chemicals" are not necessarily irrational or anti-science. They shouldn't be mocked by (well-meaning) satirists who refer to water as dihydrogen monoxide or list the chemical contents of an "all-natural" banana. At the same time, there's no good evidence that parents who eschew natural food and embrace GMOs are poisoning their children. Industrial agriculture, whatever its defects, shouldn't be confused with the work of (Mon) Satan.
No one put the situation better than novelist John Steinbeck, who ruefully recognized these opposing perspectives within himself:
"Even while I protest the assembly-line production of our food, our songs, our language, and eventually our souls, I know that it was a rare home that baked good bread in the old days. Mother's cooking was with rare exceptions poor, that good unpasteurized milk touched only by flies and bits of manure crawled with bacteria, the healthy old-time life was riddled with aches, sudden death from unknown causes, and that sweet local speech I mourn was the child of illiteracy and ignorance."
Indeed, it's this conflicted understanding of natural, tempered by tolerance and compassion, that I heard from Nobel-Prize-winning chemist Roald Hoffmann. In addition to his accomplishments as a scientist, Hoffmann is a prolific poet and playwright who has written extensively on the intersection of science and religion, and the meaning of "natural." During our long conversation he expressed sympathy for both sides of the debate, and maintained that there were no easy answers.
"Agriculture itself is the greatest invention for manipulating the natural and changing it that the world has ever known," says Hoffmann. "I would like people to be aware of that, and the chemical basis for it."
Nevertheless, he also maintained that everyone, laypeople and scientists alike, is attracted to what is natural — a claim that has empirical support. For Hoffmann, natural isn't just about healthfulness, or the environment. It isn't a matter of physical identity. Even if synthetic diamonds are completely indistinguishable from geologically produced diamonds, the origin story matters: They are the same and not the same (which is also the title of one of Hoffmann's books).
I wondered: Did he prefer natural products himself?
"I would like to believe there is something to the construction of natural as good for us and Earth," he replied after a long pause, and then laughed. "I know my wife believes so!"
Ultimately, Hoffmann thinks that fear, however irrational, can only be tempered with empowerment. "No amount of knowledge, no matter how skillfully and widely taught, will assuage fear of the synthetic," he argues, "unless people feel that they have something to say, politically, in the use of the materials that frighten them."
It is for this reason that we should applaud the FDA's current project, difficult though it may be. All of us would do well to browse the submissions, either to increase our understanding of faith that differs from our own, or to reflect on the faith that we already hold. After doing so, perhaps you'll be inspired to submit your own reflection, and together — the same and not the same — we will muddle onward in humanity's long journey towards unraveling the riddle of "natural."
Baby Carrots
Ten years ago, NPR opened a radio news segment with a few words about a man few knew. Mike Yurosek, a carrot farmer from California, had passed away earlier that year. The homage was short —it lasted no more than 30 seconds — but for many of those listening, it must have been eye-opening.
"He actually invented these things," Stephen Miller, then an obituary writer with the New York Sun, said, holding a bag of baby carrots. "Not many people know that baby carrots don't grow this way."
Small carrots that sprout from the ground do exist; upscale restaurants serve them as appetizers or alongside entrees. But those look like miniature versions of the much larger vegetable. The smooth, snack-sized tubes that have come to define carrot consumption in the United States are something different. They're milled, sculpted from the rough, soiled, mangled things we call carrots, and they serve as an example, though perhaps not a terribly grave one, of how disconnected we have all become from the production of our food.
"The majority of consumers have no clue what they’re eating or how it’s produced," said David Just, a professor of behavioral economics at Cornell who studies consumer food choices. "There are so many people who honestly believe there are baby carrot farmers out there who grow these baby carrots that pop out of the ground and are perfectly convenient and smooth."
It's hard to overstate the ingenuity of the baby carrot, one of the simplest and yet most influential innovations in vegetable history. The little carrot sculptures (or baby-cut carrots, as they're sometimes more precisely called) not only revived a once-struggling carrot industry, but they also helped both curb waste on the farm and sell the vitamin A-rich vegetables at the supermarket.
The baby carrot, like so many inventions before it, was birthed by necessity.
In the early 1980s, the carrot business was stagnant and wasteful. Growing seasons were long, and more than half of what farmers grew was ugly and unfit for grocery shelves. But in 1986, Yurosek, itching for a way to make use of all the misshapen carrots, tried something new. Instead of tossing them out, he carved them into something more palatable.
At first, Yurosek used a potato peeler, which didn't quite work because the process was too laborious. But then he bought an industrial green-bean cutter. The machine cut the carrots into uniform 2-inch pieces, the standard baby carrot size that persists today.
When Mike Yurosek & Sons, Yurosek's now-defunct California company, delivered his next batch to Vons, a local grocery chain, he included a bag of the new creation. He suspected he was on to something but hardly anticipated such an enthusiastic response. "I said, 'I'm sending you some carrots to see what you think,' " Yurosek recounted in a 2004 interview with USA Today. "Next day they called and said, 'We only want those.' "
Vons wasn't the only one impressed. Grocers, distributors, carrot buyers, and, most importantly, some of Yurosek's most formidable competitors took notice. In the years that followed, baby carrots ballooned into big business, nudging the biggest carrot producers in the country to join in and feed the frenzy.
"When we realized this wasn't a fad, this was real, everybody jumped on the bandwagon," Tim McCorkle, director of sales for Bolthouse Farms, one of the nation's leading carrot producers, recalled in a 1998 interview with the Chicago Sun-Times. "This idea inverted the whole carrot-growing business."
It also helped lift the industry out of a rut. In 1987, the year after Yurosek's innovation, carrot consumption jumped by almost 30 percent, according to data from the USDA. By 1997, the average American was eating roughly 14 pounds of carrots per year, 117 percent more than a decade earlier. The baby carrot had more than doubled carrot consumption.
Today, baby carrots dominate the carrot industry. The packaged orange snacks are now responsible for almost 70 percent of all carrot sales.
One market analysis summed up the transformation: "The development and rapid consumer acceptance of packaged fresh-cut carrot products during the 1990s has helped the carrot industry evolve from a supplier of low-value bulk products to marketer of relatively upscale value added products ... fresh-cut carrot products have been the fastest growing segment of the carrot industry since the early 1990s. Within the $1.3 billion fresh-cut vegetable category, carrots account for the largest share (about half) of supermarket sales, followed distantly by potatoes, celery, and others."
Of all the reasons for the rise of America's favorite carrot, there is likely nothing that has propelled baby carrots quite like their convenience. That quality was important to Americans in the 1980s, and it's even more prized now.
As people have found themselves with less time to sit down at restaurants or even cook at home, convenience has guided all sorts of decisions about food, especially when there is an option that requires little more than opening a packet.
"Baby carrots have transformed the way people think about carrots," said Just, the behavioral food economist. "The fact that you don't have to peel them, that it involves so little prep, is key."

"Baby carrots are also small enough to fit in your mouth," he added. "They're bite-sized and ready to be eaten. They're easy."
The fuzziness about the baby carrot's origins may also have helped its success.
Recent marketing efforts to further boost their popularity have positioned them as an alternative to junk food, rather than a different way to eat carrots. The packaging was changed to mirror that used for potato chips. "Eat ’Em Like Junk Food," the 2010 TV, print, and digital ads suggested, likening the vegetable vehicle to Doritos and other snack foods. The campaign was a hit, boosting sales by 13 percent, succeeding, at least in part, by further disassociating baby carrots from their parent.
The truth is that it probably doesn't matter all that much whether someone understands that the smooth little 2-inch carrot cut-outs they're devouring didn't come out of the ground that way. Just maintains that knowing this probably wouldn't change anyone's consumption patterns, save perhaps for a small group of hardcore naturalists, since the processing involved is comparatively minimal.
Still, the disconnect obscures a real achievement. Baby carrots, the ones that don't grow in the ground, have done more than simply boost the sales of carrot producers around the country — they have turned the carrot industry into a much more efficient and much less wasteful endeavor.
At a time when most ugly vegetables go to waste in the United States, ugly carrots are carved and sold at a premium. What's more, moving the peeling process to the factory has allowed the carrot industry to make use of the scraps that used to end up in people's trash bins.
"It's something pretty amazing about baby carrots that I'm sure people don't appreciate," Just lamented. "The same people probably think selecting only for regular carrots is more environmentally friendly."
Electric Taste
You’re having dinner in a virtual reality game. You approach the food, stick out your tongue – and taste the flavours on display. You move your jaw to chew and feel the food between your teeth.
Experiments with “virtual food” use electronics to emulate the taste and feel of the real thing, even when your mouth is empty. This tech could add new sensory inputs to virtual reality or augment real-world dining experiences, especially for people on restricted diets.
In one new project, Nimesha Ranasinghe and Ellen Yi-Luen Do at the National University of Singapore made a device that uses changes in temperature to mimic a sweet taste. The user places the tip of their tongue on a square of thermoelectric elements that are rapidly heated or cooled, hijacking receptors that normally trigger the sensation of sweetness. They presented their work at the 2016 ACM User Interface Software and Technology Symposium (UIST) in Tokyo last month.
In an initial trial, it worked for about half of participants. Some also reported a spicy sensation when the device was around 35 °C and a minty taste when it was 18 °C. The researchers envisage such a system being embedded in a glass or mug to make low-sugar drinks taste sweeter and help people cut their sugar intake.
But food isn’t just about taste. Also at UIST, a team from the University of Tokyo presented a device that uses electricity to simulate chewing foods of different textures. Arinobu Niijima and Takefumi Ogawa’s Electric Food Texture System places electrodes on the masseter muscle – a muscle in the jaw used for chewing – to give a hard or chewy sensation as you bite down. “There is no food in the mouth, but users feel as if they are chewing some food due to haptic feedback by electrical muscle stimulation,” says Niijima.
To give the “food” a harder texture, they stimulated the muscle at a higher frequency, while a longer electric pulse simulated a more elastic texture. Niijima says their system was best at mimicking the texture of gummy sweets.
Both projects are still in the experimental stage but their goal is to help people with special dietary needs. “Many people cannot eat food satisfactorily because of weak jaws, allergies and diet,” says Niijima. “We wish to help them to satisfy their appetite and enjoy their daily life.”
Ranasinghe has already experimented with a digital lollipop that emulates different tastes, and a spoon embedded with electrodes that amplify the salty, sour, or bitter flavour of the food eaten off it. He says that a Singapore hospital is planning a long-term study with the spoons to try to reduce sodium intake in its elderly patients.
All this tech could one day be incorporated into a VR headset. “I think the main advantage is to increase the immersion inside the virtual environment,” says Ranasinghe. He gives an example: an astronaut could put on a headset, soak in a relaxing view from back home, and have a nice cup of virtual coffee.
Farming insects may be more sustainable than raising meat, but so far that hasn't been quite enough to convince most Westerners to eat them.
Marketing them as delicious, exquisite delicacies, though? That might do the trick.
The global demand for meat drives environmental decline, from forest depletion and soil erosion to increased water use and the release of greenhouse gases.
Insect farming is easier on the environment, says Joost Van Itterbeeck, visiting scientist at Rikkyo University in Tokyo and co-author of the book Edible Insects: Future Prospects for Food and Feed Security. And, he adds, "The nutritional benefits are very obvious in terms of proteins, minerals and vitamins."
But as nice as that all sounds, Westerners are just plain disgusted by bugs on the dinner plate. And save-the-planet discussions don't seem to be changing their minds.
Current marketing tactics for eating insects tend to point out environmental and health benefits. But a new study published in Frontiers in Nutrition suggests it might be better to focus on taste and experience, such as highlighting how much dragonflies taste like soft-shelled crabs.
Hiding crickets in cookies
This doesn't come as a surprise to Kathy Rolin, who knows something about getting people to try edible insects.
She and her husband, James, originally started their business, Cowboy Cricket Farms, to sell whole frozen crickets to food manufacturers. After finding that more first-time bug eaters opt for cookies baked with cricket flour instead of a whole cricket, they decided to expand their business to sell Chocolate Chirp Cookies directly to consumers.
They found the Chocolate Chirps had better profit margins. "We mainly market the cookies, because who doesn't like a chocolate cookie?" says Kathy Rolin.
There have been calls to appeal to consumers' tastes before, but now there is evidence that appealing to the senses might actually work.
The study shows that a willingness to try edible insects — in this case, a chocolate-covered mealworm — depends on what advertisement a person reads before deciding whether to eat it. When the ad focused on taste and experience rather than environmental or health claims, more people were willing to try the worms.
In the study, 180 volunteers reviewed informational flyers for an edible insect start-up company. The wording differed only in one sentence: "Eating meat has never been so _______," meat referring to the meaty part of the insect in this case. The sentence ended with either "good for the environment," "good for the body," "exotic" or "delicious." The researchers classified the latter two as hedonic marketing, appeals to the senses.
After reflecting on the ad, participants were then given the option to try a chocolate mealworm truffle, which contained whole and visible worms. Participants who read the hedonic marketing claims were more likely to try the truffle, which the researchers attributed to higher-quality expectations suggested by the advertisements.
Fighting disgust
Promoting taste may convince more people to try insects because it veers our reaction away from disgust. "It's not a rational response," says Val Curtis, a professor at the London School of Hygiene & Tropical Medicine and author of the book Don't Look, Don't Touch: The Science Behind Revulsion. "We have an innate response to things that might make us sick by feeling disgusted and, therefore, don't want to consume them."
Disgust can be easily generalized, and bugs on the dinner plate trigger the "ick" reaction because we associate them with the cockroach scurrying across the floor. The result? A ruined appetite.
Telling people to eat insects for the sake of the planet, the researchers argue, won't convince a stomach that has already said "no."
"Saving the planet is not something we've evolved to do," notes Curtis.
Instead, the researchers suggest that hedonic advertising is a better way to entice would-be diners to eat bugs, because it helps prevent the disgust response.
The cockroach rises
If we can clear that hurdle, insects could potentially become as common as lobster — which was once referred to as the "cockroach of the sea" and fed to prisoners and servants. But when railways began to spread across America and lobster was served to unsuspecting travelers — who didn't know that the crustaceans were considered "trash food" — the passengers took a liking to the taste, and lobster began to soar in popularity.
A related story surrounds sushi, which didn't start gaining widespread acceptance in the U.S. until the mid-'60s. When high-end restaurants started serving raw fish, it went from unpalatable to popular.
Now, both lobster and sushi are considered delicacies, a trend that was propelled by another effective form of advertising: status appeal.
Rolin thinks insects could follow the same trend. "We've noticed that there's been quite a few celebrities who have endorsed the idea of [eating] insects." Recently, actress Nicole Kidman revealed her "secret talent" of bug consumption in a Vanity Fair video by eating a four-course insect meal complete with fried grasshopper dessert, and singer Justin Timberlake served up bug dishes at a recent album release party.
Marketing campaigns that focus on a favorable bug-eating experience, perhaps by showing celebrities eating them, might be enough to distract people from the disgust response long enough to get them to try it.
Reframing the bug
"I would say if you're going to market insects, you take them as far away from anything slimy or crawling or creepy or too leggy," says Curtis. "Meat is sold as a tasty product, and all pictures of animals have been taken off the packaging. I would say just do exactly the same with insects."
One way to do this is by changing the name of the dish. We've done this with other foods: We eat pork, not pig; and beef, not cow. When serving ant larvae, it may be better to use their alternative food name: escamoles, a delicacy served in Mexico City.
While taste and experience may prove to be a good way to promote eating insects, that shouldn't discount environmental claims. Eco-friendly campaigns do get people to think more about food sustainability; they're just not quite enough to get most people to put their money where their mouth is, so to speak.
But by advertising escamoles in garlic sauce with cilantro and chipotle? It just might.