Applying multifactor testing to a wine-making simulator
The Pudding, a digital publication devoted to data-driven visual essays on culture, currently features a very interesting essay, Wine and Math, A Model Pairing. The author, Lars Verspohl, provides many eye-catching graphics of the analytics behind producing quality wines.
What got my attention was a simulator for making red Portuguese Vinho Verde. Verspohl sifted through a dataset of 1600 wines to develop a model that predicts quality based on 11 factors. You can slide these up and down to try making a fine wine—rated at 7 or more on a scale of 10.
Not being content with haphazard searching on so many variables, I set up a multifactor test. Using version 13 of Design-Expert® software (free trial here), I laid out a minimum-run (plus 2) screening design on the 8 factors ranked most important by Verspohl’s Random Forest analysis, bypassing the bottom 3 (pH, residual sugar and free sulfur dioxide). I then worked through the 18 combinations and recorded the quality results in percent.
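For readers who want to play along outside of Design-Expert, here is a minimal sketch of an analogous screening layout. It assumes a regular 2^(8-4) resolution IV fractional factorial (16 runs) in place of the proprietary minimum-run (plus 2) design, with the eight factor names taken from the wine-quality dataset behind the simulator:

```python
# Sketch: a regular 2^(8-4) resolution IV fractional factorial (16 runs)
# as a stand-in for Design-Expert's minimum-run (plus 2) screening design.
# Factor names follow the wine-quality dataset; levels are coded -1/+1.
from itertools import product
import numpy as np

factors = ["fixed acidity", "volatile acidity", "citric acid", "chlorides",
           "total sulfur dioxide", "density", "sulphates", "alcohol"]

runs = []
for a, b, c, d in product((-1, 1), repeat=4):   # base factors A-D: full factorial
    e, f, g, h = a*b*c, a*b*d, a*c*d, b*c*d     # generators E=ABC, F=ABD, G=ACD, H=BCD
    runs.append((a, b, c, d, e, f, g, h))

design = np.array(runs)
print(design.shape)   # (16, 8): one row of coded slider settings per simulator run
```

A resolution IV layout keeps main effects clear of two-factor interactions, which is what a screening pass needs.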
As shown on its Pareto plot of effects, Design-Expert revealed that only 5 of the effects tested proved significant.
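The ranking itself is easy to reproduce by hand: for a two-level design, each main effect is simply the average response at the factor's high level minus the average at its low level. A sketch of that calculation and a Pareto-style bar chart (the function name and plotting choices are mine, not Design-Expert's):

```python
# Sketch: estimating main effects from a two-level screening design and
# ranking them by absolute size, Pareto-style.
import numpy as np
import matplotlib.pyplot as plt

def pareto_of_effects(design, y, names):
    """design: n x k matrix of coded (-1/+1) levels; y: n responses (quality %)."""
    n = len(y)
    effects = 2.0 * design.T @ np.asarray(y, float) / n   # high-level mean minus low-level mean
    order = np.argsort(np.abs(effects))[::-1]              # largest absolute effect first
    plt.bar([names[i] for i in order], np.abs(effects[order]))
    plt.ylabel("|effect| on quality (%)")
    plt.xticks(rotation=45, ha="right")
    plt.tight_layout()
    plt.show()
    return effects

# usage: pareto_of_effects(design, quality_results, factors)
```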
The numeric optimization tools led to the optimal red Vinho Verde, flagged in this 3D plot at the highest alcohol and lowest volatile acid levels. Settings for the other attributes are indicated by the positions of the slide bars, e.g., sulphates at the high level*. Factors defaulted to their midpoints are the ones that did not get picked for the model.
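As a rough analogue of that numerical optimization, assuming a main-effects-only model fitted to the significant factors, the best predicted quality comes from pushing each model factor to the bound matching the sign of its coefficient and leaving the rest at their midpoints, which is the pattern of slide bars described above. A sketch under those assumptions (the helper name is mine):

```python
# Sketch: optimizing a main-effects-only model fitted to the significant factors.
# Factors in the model go to the bound favored by their coefficient; the rest
# stay at the coded midpoint (0).
import numpy as np

def optimal_settings(design, y, names, significant):
    """significant: column indices of the factors kept in the model."""
    X = np.column_stack([np.ones(len(y)), design[:, significant]])
    coefs, *_ = np.linalg.lstsq(X, np.asarray(y, float), rcond=None)
    settings = {name: 0.0 for name in names}               # default: coded midpoint
    for coef, idx in zip(coefs[1:], significant):
        settings[names[idx]] = 1.0 if coef > 0 else -1.0    # push toward the better bound
    predicted = coefs[0] + sum(abs(c) for c in coefs[1:])   # best predicted quality (%)
    return settings, predicted
```

Design-Expert's optimizer does more than this, of course, handling interactions and desirability trade-offs, but the sign-of-the-coefficient logic is the gist.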
Now that I’ve solved this simulator, my next mission is to locate a bottle of red Vinho Verde for some one-glass-at-a-time testing.
*This result surprised me, as I am not a big fan of sulfurous compounds in wines. My skepticism is borne out by another take on the Vinho Verde wine here. The only way to resolve the conflicting results would be to do an actual experiment on the composition of a red wine, ideally a mixture design for optimal formulation.
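For a sense of what that might look like, here is a minimal sketch of a simplex-lattice layout, a classic starting point for a mixture design; the component count and lattice degree are placeholders, not a proposal for an actual wine formulation:

```python
# Sketch: generating a {q, m} simplex-lattice mixture design, the kind of layout
# a formulation experiment might start from (components here are hypothetical).
from itertools import product
from fractions import Fraction

def simplex_lattice(q, m):
    """All blends of q components with proportions in {0, 1/m, ..., 1} summing to 1."""
    levels = [Fraction(i, m) for i in range(m + 1)]
    return [blend for blend in product(levels, repeat=q) if sum(blend) == 1]

for blend in simplex_lattice(3, 2):    # {3, 2} lattice: 6 candidate blends
    print([float(x) for x in blend])
```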
Archer’s Big Bounce Experiment
I am a big fan of University of Minnesota Athletics—even more so now after they sponsored a Science of Basketball project for grade schoolers. My 9-year-old grandson Archer jumped at the chance to put a variety of basketballs to the test with my help. For the results, see the video we submitted to the UMn judges.
Archer’s findings (wood being better than rubber for bounce) stand out in graphics generated with Design-Expert software.
Archer enjoyed doing this science project. I feel sure it helped him understand what it takes to design an experiment, do it properly and analyze the results. My only disappointment is that the high-tech cell-phone app for measuring height, which I used for my experiment on elastic spheres, failed, most likely due to too much echo in the gym.
However, I discovered another intriguing basketball-physics experiment at the Science Buddies STEM website. It determines where a bouncing ball’s energy goes. This requires deployment of an infrared temperature gun with a laser beam. Awesome! Archer will like that (if he can wrestle the laser gun away from me).
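To get a rough feel for what that experiment measures, here is a back-of-the-envelope sketch with assumed round numbers (not measurements from Archer's runs): the height a ball fails to recover on each bounce corresponds to energy that mostly ends up as heat in the ball and floor.

```python
# Sketch: where the bounce energy goes, using assumed round-number inputs.
M = 0.62       # ball mass, kg (typical basketball; assumed)
C = 2000.0     # specific heat of the ball material, J/(kg*K) (rough value for rubber)
G = 9.81       # gravitational acceleration, m/s^2
h_drop, h_rebound = 1.80, 1.30   # hypothetical drop and rebound heights, m

energy_lost = M * G * (h_drop - h_rebound)    # J not returned as rebound height
restitution = (h_rebound / h_drop) ** 0.5     # coefficient of restitution
temp_rise = energy_lost / (M * C)             # K, if all lost energy heated the ball

print(f"energy lost per bounce: {energy_lost:.1f} J")
print(f"coefficient of restitution: {restitution:.2f}")
print(f"upper-bound warming per bounce: {temp_rise*1000:.1f} mK")
```

The per-bounce warming works out to a few thousandths of a degree at most, which is presumably why the procedure calls for many repeated bounces and a sensitive infrared thermometer.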