data hangovers and analysis paralysis...
Data conferences have changed — or is it me? My focus has always been on social injustice and inequity, making me what one might call a ‘laggard’ when it comes to the latest and greatest in tech. My horror is watching algorithmic models praised and lauded as they set about redefining the world and its social abstractions as scientific and commercial inputs to drive market share.
And we are giddy with enthusiasm. The “we” I refer to here is indeed the “royal we”.
I cannot quantify how many statistics courses add up to a knowledge base equal to the task of making sense of all the work we assign to data interpretation, but I am guessing: a lot.
Our modern quantified environments include algorithms that trade stocks passively — think exchange-traded funds (ETFs) or index-tracking funds. They are essentially trading among themselves, without human accountability. Passively driven by data.
We don’t exactly have a better plan. The holy grail of objectivity has been plagued by research studies that lack reproducibility and by big-data infrastructures recognized for gigantic carbon footprints that stomp on our environment, devastating everything in their wake. Meanwhile, we have replaced our frequentist ‘rational’ mindsets with Bayesian ‘subjectivity’.
Remember the coin toss experiment designed to estimate the probability of heads or tails? When the first four or five tosses yield heads, for example, we are tempted to stray from the persistent 50% probability. Frequentist statistics is what you likely think of when you are haunted by your college textbooks: think p-values and confidence intervals.
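A minimal sketch of the frequentist framing (the variable names are mine, not anything from this essay): under the null hypothesis of a fair coin, five heads in a row yields a p-value below the conventional 0.05 threshold, even though the coin itself has not changed.

```python
# Frequentist view: test a fair-coin null after seeing 5 heads in 5 tosses.
# Under the null, P(5 heads in 5 tosses) = 0.5 ** 5.

n_tosses = 5
n_heads = 5
p_null = 0.5  # probability of heads if the coin is fair

# One-sided p-value: chance of a result at least this extreme under the null.
p_value = p_null ** n_tosses  # 0.5^5 = 0.03125

print(f"p-value for {n_heads}/{n_tosses} heads: {p_value:.5f}")
# Below the usual 0.05 cutoff, yet the next toss is still 50/50.
```

The tension the paragraph describes lives exactly here: the test invites a "reject the null" verdict from a streak that a fair coin will produce about 3% of the time.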
Bayesian theory lets you view outcomes from your own perspective, rather than as a right answer or the frequentist “rejection of the null”. You are left with a “less likely” or “more likely” that forms the foundation of AI and machine learning. We have evolved from hoping to approximate objectivity to a full lean into the subjective nature of algorithmic thinking. I am a Bayesian when contemplating specific decisions, such as risk models for health screening, but I am less enthusiastic when the outcomes are interpreted as fact and applied to economic theory and financial models.
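The Bayesian alternative can be sketched with a conjugate Beta–Binomial update — a standard textbook construction, not something specified in this essay. Start with a uniform prior over the coin's bias and update it with each observed toss:

```python
# Bayesian view: update a Beta prior on the coin's bias after 5 heads.
# A Beta(a, b) prior plus h heads and t tails gives a Beta(a + h, b + t) posterior.

a_prior, b_prior = 1, 1   # uniform Beta(1, 1) prior: no opinion about the bias
heads, tails = 5, 0       # the streak that tempts us away from 50%

a_post = a_prior + heads  # 6
b_post = b_prior + tails  # 1

posterior_mean = a_post / (a_post + b_post)  # 6/7, roughly 0.857

print(f"posterior Beta({a_post}, {b_post}), mean bias = {posterior_mean:.3f}")
# No verdict of "biased" or "fair", just a "more likely" that shifts with evidence.
```

The output is the subjectivity the essay describes: a degree of belief that moves with the data rather than a binary accept-or-reject decision.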
In the shift from frequentist to Bayesian statistics, where statistics move from the discernment of knowledge to the production of action, these mathematical methods function directly on the level of value. They cease to aid in the selection of what is true, pointing instead to which course of action will produce the most profit.

— Justin Joque, Revolutionary Mathematics: Artificial Intelligence, Statistics and the Logic of Capitalism