Archive for category food science

Colors to dye for

I grew up in the golden age of kids’ cereals, first with Trix from General Mills—introduced in 1954 in three colors: raspberry red, orangey orange and lemony yellow (now also wildberry blue, grapity purple and watermelon)—followed in 1963 by Froot Loops from Kellogg—also in red, orange and yellow—Toucan Sam style (now also green, blue and purple). Back then nobody worried much about how these manufacturers colored their cereals, artificially or otherwise. Nowadays, however, a consensus has built up about a “rainbow of risks” posed by synthetic food dyes. Political pressure to ban these purportedly harmful additives continues to build across the spectrum, from Gavin Newsom to Robert F. Kennedy, Jr.

This sets the stage for some interesting history from American Heritage magazine on letting the food industry “poison” us, as RFK, Jr. puts it. Senior Editor Bruce Watson reported in the November/December issue that “many of our first food-safety laws arose after healthy young volunteers became sick when they tried commercial foods containing toxic additives.” These daredevils comprised “The Poison Squad,” created in 1902 by Harvey Washington Wiley—who became known as the “Father of the Pure Food and Drug Act” when it became law in 1906.

“NONE BUT THE BRAVE CAN EAT THE FARE.”

– Sign posted outside the Department of Agriculture building to enlist human ‘guinea pigs’

As historian Deborah Blum noted in her book The Poison Squad: One Chemist’s Single-Minded Crusade for Food Safety at the Turn of the Twentieth Century, Wiley deserves credit for “one of the most significant experiments in the 20th century.” For example, just prior to his crusading work, hundreds or perhaps thousands of children died from milk “embalmed” with formaldehyde.

Without lessening the current concern over artificial dyes, we can be thankful for the relative safety of our food compared to the fare of the early 1900s. Still, I do not advocate going back to the days when potential poisons were tested on human subjects—though I suppose there are worse things than being tasked with eating large quantities of Trix and Froot Loops, provided, of course, that the milk is not embalmed. ; )


Hoping to cell-abrate meat substitutes before I die

As a consultant on statistical design and analysis of experiments, I’ve been working with many leading-edge developers of cell-based meats (and fish). I am a carnivore—I love a juicy burger, tender pulled pork, a medium-rare steak or barbecued chicken. However, I’d happily switch to lab-grown protein once it passes a properly designed double-blind taste test. This would be a huge breakthrough, sparing animals and greatly reducing greenhouse gases—including those from “enteric fermentation” (a nice way of referring to cow farts, ha ha).

Some experts do not foresee this happening in our lifetime, according to this report last February by the CBC. But after reading this cover story, posted yesterday by Chemical & Engineering News, on recent developments in lab-grown meats, I am more optimistic.

There is a fly in the food, so to speak, though: I cannot eat lab-grown meat while wintering in my Florida home—it’s been banned per this May 1 press release from Governor DeSantis. No fair!

“Today, Florida is fighting back against the global elite’s plan to force the world to eat meat grown in a petri dish or bugs to achieve their authoritarian goals.”

– Governor Ron DeSantis

By the way, I do agree with the Governor on one thing: I am not a big fan of eating bugs. On the other hand, I applaud a Stat-Ease client from Bulgaria—Nasekomo (meaning “insect”)—for developing a high-protein chicken feed made from soldier flies. I helped one of their researchers with her experimentation, after first being assured that the EU approves the use of their product only for animals, not humans. She told me that chickens that eat the fly-based feed tend to be less aggressive and healthier. Sounds good to me: Cock-a-doodle-doo!


Classic case of sensory testing snubbing off a beer snob




The feature story on sensory evaluation in the new issue of ASTM Standardization News brings back a fond memory of a rare victory over an overly smug colleague.

I developed a taste for sensory science as a young chemical engineer determined to prove that mass-produced American lagers differed only imperceptibly—consumers having been brainwashed by deceptive advertisers. This hypothesis drew strong condemnation from one of my colleagues—a chemist named Harold who dissed lesser brews such as Old Milwaukee, which he deemed “Old Swillwaukee.”

To put this beer snob to the test, I organized a tasting at a Super Bowl party attended by a dozen or so fellow researchers. Beforehand, I consulted a sensory professional whom our employer had hired to guard against “off odors” from our manufacturing plants. She advised that we limit each beer to a small sip, then eat unsalted crackers and wash them down with water before going on to the next brew. Also, both the presenter of the beer and the taster should be blind to the brand, thus avoiding bias.
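For what it’s worth, here is a minimal sketch (mine, not from the post or from ASTM) of how such a blinded, randomized presentation might be generated. The brands and panel size are placeholders.

```python
import random

# Hypothetical setup for a blinded tasting (brands and panel size are illustrative).
beers = ["Miller", "Budweiser", "Old Milwaukee", "Michelob"]

# Give each brand a random three-digit code; the key stays with someone not serving or tasting.
codes = random.sample(range(100, 1000), k=len(beers))
blinding_key = {str(code): brand for code, brand in zip(codes, beers)}

# Each taster receives the coded samples in an independently randomized order.
for taster in ["Taster 1", "Taster 2", "Taster 3"]:
    serving_order = list(blinding_key)   # coded IDs only -- no brand names
    random.shuffle(serving_order)
    print(taster, "->", serving_order)

# Reveal blinding_key only after all ratings are recorded.
```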

However, given my mission to snub a beer snob, we first rated a selection of undisguised beers—including Miller, Budweiser, Old Milwaukee and others (in those days there were no ‘craft’ brews*). Harold rated “Old Swillwaukee” dead last. Just as I planned! Then we repeated the tasting with the order re-randomized, but this time without revealing the names. Harold rated Old Milwaukee at the top of his list, providing a Super Bowl victory for me (badly needed, being a Vikings fan).

My conclusion from this experience, and from my work over the years helping food scientists improve the taste and other attributes of their products, is that it’s best to adhere to ASTM’s upcoming revision of the Guidelines for the Selection and Training of Sensory Panel Members. For beer and the like, bear down on the Standard Guide for Sensory Evaluation of Beverages Containing Alcohol (E1879).

“We make panelists learn chemical names. For example, isoamyl acetate is a specific compound that smells like candy banana…I make panelists drink heavy cream for mouthfeel attributes. They’re unfazed by whatever we give them anymore because it’s always weird.”

– Ali Schultz, sensory manager, New Belgium Brewing Company and leader of the current revision to E1879 (“Accounting for Taste”, ASTM Standardization News, January/February 2024)

However, if you are having a party, it’s more fun to be unprofessional and ignore the mandates to sip and spit, etc. ; )

*PS: The specialty beer brewers are getting a bit out of control nowadays, IMO. For example, I just got an alert from my Stillwater, Minnesota neighborhood microbrewer Lift Bridge about this weekend’s release of Taking Care of Breakfast—a “barrel aged imperial breakfast stout aged in 10-year Willett and 6-year Wild Turkey bourbon barrels, infused with peanut butter and banana chips.” This new brew comes in at 12% ABV. It may be best to stick with orange juice first thing in the morning.


To bean or not to bean, that is the question for coffee

In my most recent blog post on coffee I reported that a finer grind may not always be better. Now another piece of the puzzle for producing java that jives falls into place: Spritz your beans with water.

Evidently this is not a new discovery—those who really know their coffee-making craft routinely moisten their beans to reduce clumping in the grind. A new study, reported here by New Scientist, reveals the underlying culprit: static electricity. Following the link to the original publication, I see that the research team, led by a volcanologist (sensible, considering the lightning generated by particle-laden eruptions), deployed this $3000 German-made, handcrafted machine to produce extremely uniform grinds. I will definitely buy one soon (after winning the lottery).

Another approach to better coffee takes a completely different route—create it from cells grown in bioreactors. Environmentalists like this because the demand for sun-grown beans leads to destruction of rain forests. Per this Phys.Org heads-up, a Finnish team just released a recipe to accelerate the creation of a new “coffee ecosystem.” This seems promising. But there is a problem: though the current lab-grown concoctions contain twice as much caffeine as ever before, the level remains far below that of farmed beans.

A third approach avoids the problems of traditional coffee-making altogether by going to a beanless brew, such as the imitation now being rolled out by Seattle-based Atomo Coffee. Based on this January 24th report by the CBS Saturday Morning show, I would be willing to give it a try, especially given that they load up their brew with caffeine at the upper end of the normal range for real coffee. Full steam ahead!

One last idea (my caffeine levels now running low) for improving the taste of coffee is being selective about the shape and material of your cup. For example, see what the Perfect Daily Grind says about pouring your brew into a wine glass or other specialty containers.

“A drinking vessel has a significant impact on perception of flavour and aroma because it changes the way the coffee smells and tastes, as well as how you drink coffee. What’s more, our senses, feelings, and emotions also impact how we experience coffee.”

– Marek Krupa, co-founder and CFO of Kruve


Variation in eggs presents perplexing problems for preparation

Today is World Egg Day.

I’m a big fan of eggs—my favorite being ones perfectly poached in an Endurance Stainless Steel Pan. However, the eggs that come from my daughters’ hens vary in size far more per container than store-bought, graded ones. I work around this by adding or subtracting time based on my experience. I really should weigh the eggs and design an experiment to optimize the time.

Coincidentally, I just received the new issue of Chance, published by the American Statistical Association. An article titled “A Physicist and a Statistician Walk into a Bar” caught my eye because one of my Stat-Ease consulting colleagues is a physicist and another is a statistician. I was hoping for a good joke at both of their expense. However, the authors (John Durso and Howard Wainer) go in a completely different direction with an amusing, but educational, story about a hypothetical optimization of soft-boiled eggs.

The problem is that recipes suffer from the “flaw of averages”—smaller eggs get undercooked and bigger ones end up overcooked unless the time is adjusted (as I well know!).

While the physicist sits over a pint of beer and a pad of paper scratching out possible solutions based on partial differential equations related to spheroidal geometry, the statistician assesses data collected on weights versus cooking time. Things get a bit mathematical at this point* (this is an ASA publication, after all), but in the end the statistician determines that weight versus cooking time can be approximated by a quadratic model, which makes sense to the physicist based on the geometry and makeup of an egg.

I took some liberties with the data to simplify things by reducing the number of experimental runs from 41 to 8. Also, based on my experience cooking eggs of varying weights, I increased the variation to a more realistic level. See my hypothetical quadratic fit below in a confidence-banded graph produced by Stat-Ease software.
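The graph itself is not reproduced here, but the underlying analysis is easy to sketch outside of Stat-Ease. The eight weights and times below are my own placeholders, not the Chance article’s data; the sketch fits a quadratic model of cooking time versus egg weight and computes a 95% confidence band for the mean response.

```python
import numpy as np
import statsmodels.api as sm

# Placeholder data: egg weight (g) and cooking time (min) -- not the article's numbers.
weight = np.array([48., 52., 55., 58., 61., 64., 68., 72.])
time_min = np.array([4.0, 4.3, 4.5, 4.8, 5.0, 5.3, 5.7, 6.1])

# Quadratic model: time = b0 + b1*weight + b2*weight^2
X = sm.add_constant(np.column_stack([weight, weight**2]))
fit = sm.OLS(time_min, X).fit()
print(fit.params)

# 95% confidence band for the mean cooking time across a grid of weights
grid = np.linspace(weight.min(), weight.max(), 25)
Xg = sm.add_constant(np.column_stack([grid, grid**2]))
band = fit.get_prediction(Xg).summary_frame(alpha=0.05)
print(band[["mean", "mean_ci_lower", "mean_ci_upper"]].head())
```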

Perhaps someday I may build up enough steam to weigh every egg, time the poaching and measure the runniness of the resulting yolks. However, for now I just eat them as they are after being cooked by my assessment of the individual egg-size relative to others in the carton. With some pepper and salt and a piece of toast to soak up any leftover yolk, my poached eggs always hit the spot.

*For example, they apply Tukey’s ladder of variable transformations – a method that works well on single-factor fits, where the choice of power can be related to the shape of the curve (concave or convex corresponding to going up or down the powers, respectively). It relates closely to the more versatile Box-Cox plot provided by Stat-Ease software. Using the same data that Durso and Wainer presented, I found that the Box-Cox plot recommended the same transformation as Tukey’s ladder.
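If you want to try that comparison on your own data, here is a minimal sketch of the Box-Cox side using scipy; the response values are placeholders. The estimated lambda plays the same role as a rung on Tukey’s ladder (lambda near 0.5 suggests a square root, near 0 a log).

```python
import numpy as np
from scipy import stats

# Placeholder response values (Box-Cox requires positive data).
y = np.array([4.0, 4.3, 4.5, 4.8, 5.0, 5.3, 5.7, 6.1])

# Box-Cox picks the power (lambda) that maximizes the normal log-likelihood,
# analogous to choosing a rung on Tukey's ladder of powers.
_, best_lambda = stats.boxcox(y)
print(f"Recommended lambda: {best_lambda:.2f}")

# A rough, text-only version of the Box-Cox plot: log-likelihood over candidate lambdas.
for lam in (-2, -1, -0.5, 0, 0.5, 1, 2):
    print(lam, round(stats.boxcox_llf(lam, y), 2))
```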


Experimenting to make spirits more enticing

Spirits are distilled alcoholic drinks that typically contain 40% alcohol by volume (ABV) or 80 “proof”. Until the Pandemic, I avoided spirits—preferring to imbibe less intoxicating beers and wines. However, during the Quarantine, I made it my mission to drink up a stock of tequila that my Mexican exchange student’s father Pepe sent me when his daughter told him about the terrible cold in Minnesota.

Down the hatch went Don Julio and the like over some months…and yet the quarantine dragged on. Tiring of tequila, I pivoted to bourbon, starting with top-shelf Woodford Reserve and settling, after serial pairwise testing, on bottom-shelf Evan Williams. Why pay more when you cannot discern a difference?
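As a hedged illustration (mine, not the actual tally), the outcome of such serial pairwise testing can be summarized with a simple binomial test: if you truly cannot discern a difference, you should pick the top-shelf pour about half the time.

```python
from scipy.stats import binomtest

# Hypothetical tally: in 12 blind paired pours, the top-shelf bourbon was preferred 7 times.
n_trials, n_top_shelf_wins = 12, 7

# Test against 50/50 guessing; a large p-value means no evidence of a real preference.
result = binomtest(n_top_shelf_wins, n_trials, p=0.5, alternative="two-sided")
print(f"p-value = {result.pvalue:.3f}")
```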

Last week my research on spirits expanded to rye whiskey purchased after a tour of the Chattanooga Whiskey Experimental Distillery. See my guide Sam pictured with a measurement guide for a key variable—the degree of charring in the storage barrels.

The mash bill for my bottle is malted rye, yellow corn, caramel malted rye (providing a smoother taste) and chocolate malted rye (not sure what that is, but it sounds tasty).

It seems to me that multifactor design of experiments (DOE) would be an ideal tool for contending with the many process, mixture and categorical inputs to whiskey optimization. Once upon a time I toured Dewar’s Aberfeldy distillery in central Scotland. The tour concluded with my first taste of whiskey—shockingly strong. However, what interested me most was a simulator that allowed visitors to vary inputs and see how the output rated for taste. Unfortunately, I only had time for one-factor-at-a-time (OFAT) testing and desperate stabs at changing multiple inputs at once.
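Here is a minimal sketch (my invention, with made-up factor names and levels) of the kind of run sheet a multifactor DOE would use on such a simulator: a two-level factorial covers every combination of three factors in eight randomized runs, which, unlike OFAT, lets you estimate interactions.

```python
from itertools import product
import random

# Hypothetical whiskey factors and levels -- illustrative only.
factors = {
    "barrel_char": ["light", "heavy"],
    "rye_in_mash_pct": [51, 95],
    "aging_years": [4, 8],
}

# Full 2^3 factorial: all eight combinations, run in random order.
runs = list(product(*factors.values()))
random.shuffle(runs)
for i, levels in enumerate(runs, start=1):
    print(f"Run {i}:", dict(zip(factors, levels)))
```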

If the spirit moves you (pun intended), please contact me for help designing your experiments and tasting the results.


Masterful experiment delivers delectable chocolate chip cookies

There’s no better place to learn about design of experiments (DOE) than your own kitchen. Not being much of a cook or a baker, I do well by restricting my food science to microwave popcorn. Therefore, I happily agreed to help fellow DOE expert Greg Hutto advise his student Jessica Keel on designing an experiment on homemade chocolate chip cookies.

“Want to learn more in your own kitchen? Try making some cookies with different variations in ingredients. It’s a fantastic way to understand and help perfect your signature chocolate chip cookie.”

– Danielle Bauer, The (Food) Science of Chocolate Chip Cookies

Optimizing cookies involves a tricky combination of mixture components and process factors. Furthermore, adhering to a gold standard for valid statistical studies—randomization—presents great difficulties. For each run in the combined design, the experimenter must mix one cookie according to the specified recipe and then bake it at the stated time and temperature. It’s much simpler to make a trayful of cookies with varying ingredients and bake them all at once. This can be accommodated by a specialized DOE called a split plot.*
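To make the split-plot idea concrete, here is a minimal sketch with illustrative settings (not Jessica’s actual design): the hard-to-change oven conditions define the whole plots, one tray per setting, and the recipe variations are randomized within each tray.

```python
from itertools import product
import random

# Whole-plot (hard-to-change) factors: one oven setting per tray -- illustrative levels.
oven_settings = list(product([325, 375], [12, 18]))   # (temperature deg F, bake time min)
random.shuffle(oven_settings)                          # randomize the order of whole plots

# Sub-plot (easy-to-change) recipe variations baked together on each tray.
recipes = ["high butter", "high brown sugar", "no vanilla", "control"]

for plot, (temp_f, minutes) in enumerate(oven_settings, start=1):
    tray = random.sample(recipes, k=len(recipes))      # randomize cookie positions within the tray
    print(f"Whole plot {plot}: {temp_f} F for {minutes} min -> {tray}")
```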

Jessica took on a big challenge: coming up with not one, but two chocolate chip cookie recipes—soft-and-thick versus thin-and-crispy. Starting from the specifications for Original Nestle Toll House Chocolate Chip Cookies, she used Design-Expert® software (https://www.statease.com/software/design-expert/) to lay out an optimal, combined experiment design based on a KCV model.** Jessica divided the runs into two blocks to spread the work over her Saturday-Sunday weekend. The experiment determined the effects of four recipe components—butter, granulated sugar, brown sugar and vanilla—baked while varying two hard-to-change process factors—temperature and time—in convenient groups (whole plots).

Jessica cleverly measured the density (by water displacement) and breaking strength (via the ‘penny test’ pictured) of each cookie before handing them over to her panel of tasters for sensory evaluation of taste, appearance and softness on a scale of 1 (worst) to 9 (best).

Focusing on taste alone, this combined mixture-process experiment led to a recipe—heavy on butter and vanilla-free—that, when baked at the ideal conditions of 325 deg F for 18 minutes, scores nearly perfect, as can be seen in the ternary contour plot produced by Design-Expert.

See Jessica’s full report for all the details. Then do your own optimal mixture-process experiment to ‘level up’ your homemade chocolate chip cookies. Yum!

*For details, see this tutorial from Stat-Ease that deploys a combined split-plot design to create a “rich and delicious” Lady Baltimore Cake.

**See my webinar on How to Unveil Breakthrough Synergisms Between Mixture and Process Variables.


Never ending quest for the perfect grind of coffee

This graphic illustration from the National Coffee Association provides some amazing statistics in support of the claim that their beverage reigns supreme. I am drinking more than my per-‘cupita’ (pun intended) share of the nearly half a billion mugs of coffee that Americans down every day.

Back before we all started working from home during the pandemic (and kept on doing so afterwards), my son Hank (now VP of Software Development) and most of our Stat-Ease colleagues jived on java (the real stuff, not the coding language). He and our lead statistician Martin Bezener (now President) conducted a very sophisticated experiment on coffee grinding, as reported in our September 2016 Stat-Teaser. Check out Hank’s dramatic video detailing the split-plot coffee experiment.

With the aid of Design-Expert® software’s powerful statistical tools, Martin discovered the secret to delicious coffee: use a burr grinder, not a blade grinder, and go for the finest granulation. Based on these findings, I upgraded to the highly rated Baratza Encore, which works really well (though it’s very noisy!).

However, a new study, published this May in a Special Issue on Food Physics, reveals uneven extraction in coffee brewing. Evidently, “a complicated interplay between an initial imbalance in the porosities and permeabilities” creates “a cutoff point” where “grinding coffee more finely results in lower extraction.” Along the same lines, but with open content and some nice pictures and graphs to lighten up a lot of dense math (e.g., Navier-Stokes equations for fluid dynamics), see this earlier publication on Systematically Improving Espresso. It “strongly suggests that inhomogeneous flow is operative at fine grind settings, resulting in poor reproducibility and wasted raw material.”

So now that experiments show that finer may not always be better, the quest for the perfect grind continues!
