2 Feb 2025

Computing Is The New Corn Syrup

In 2006, Michael Pollan published "The Omnivore's Dilemma". In it, he attempted to convince America to pay more attention to just how much corn had come to dominate our diet. Inspection of food labels revealed that a startlingly large amount of our food had corn syrup in it, and corn was used as fodder for animals such as cows that we had assumed were eating grass (not normally considered an expensive food).

Like many others who read Pollan's book, I started seeing corn everywhere once I had finished it, and it was ridiculous. It was as if there were a law that all things which could have corn syrup in them must have corn syrup in them. Where I live in Texas, many people would buy "Mexican Coke", because in Mexico (as in nearly any other spot on the planet) sugar was cheaper than corn syrup, and sodas made with corn syrup did not even taste as good. But, for some reason, in America all sodas had to use corn syrup, so you had to seek out "Mexican Coke" to avoid it.

The reason, of course, turned out to be that (owing in part to government incentives) we had massively overproduced corn, and built an entire industry whose continued existence relied on us continuing to overproduce it, and thus also required us to overconsume it. It was almost as if corn syrup had a negative price: you had to pay more to get foods without it, not only when they used cane sugar or honey instead, but even when they simply had no sweetener at all. Corn syrup was put into nearly everything, and it took special effort and attention to find foods without it.

I have realized that computing has become the new corn syrup.

In this case, it is not primarily due to government incentives (although they may have had an impact on the margins), but the result is the same: a massive superabundance, and an unwillingness to admit it or produce less. The driver seems to have been Wall Street, and tech executives' efforts to maintain the facade of being high-growth startups instead of the mature, slow-growth companies they have actually become.

The reason is most easily explained by looking at the "P/E ratio", the ratio of price to earnings. For a mature company, it was traditionally expected that the correct price was something like 15 times earnings. So, if the company had a profit of about $1 per share per year, the fair price would be about $15 per share (depending on interest-rate expectations, maybe $20, but you get the idea).

Now for a startup this metric doesn't work. If the company is going to be 10x as big in a year, then the price of a share now should be considerably more than 15x the earnings per share. If you had expected Apple, Facebook, Microsoft, or other such stocks to have a P/E of 15 in their early days, and refused to buy them at 20x or 30x, you would have made a colossal mistake and missed out on the opportunity to buy a stock that was headed upwards. Startup companies can double or triple their earnings in a relatively short time, so you expect them to have a higher P/E ratio (that is, to be more expensive than their current earnings would suggest). In fact, you shouldn't necessarily expect them to be making a profit at all, since as they grow, the cost of their R&D will be spread across many more customers.
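To make the arithmetic concrete, here is a back-of-the-envelope sketch in Python. The earnings figure, the multiples, and the 10x growth assumption are all invented for illustration; they are not data about any real company.

    # A rough illustration of the P/E arithmetic above. All numbers are
    # invented assumptions for the example, not data from a real company.

    def implied_price(earnings_per_share, pe_ratio):
        """Share price implied by a chosen price-to-earnings multiple."""
        return earnings_per_share * pe_ratio

    # A mature company earning $1/share, at the traditional 15x multiple:
    print(implied_price(1.00, 15))               # 15.0

    # A startup expected to grow earnings 10x: paying 30x today is only
    # 3x the earnings you expect it to have once the growth arrives.
    future_eps = 1.00 * 10
    print(implied_price(1.00, 30) / future_eps)  # 3.0 -- the "forward" P/E

The point of the last line is that an expensive-looking multiple on a fast grower quietly becomes a cheap multiple on the earnings the company is expected to have.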

Unfortunately, this created a perverse incentive for tech company CEOs to keep pretending that they were still high-growth startups, long past the point when they were not, because as soon as they were considered "mature" they would start to be evaluated with a much more skeptical eye. If your tech startup is not yet profitable, that is not necessarily a worry; but if you are a mature company and you are not yet profitable, then you are in trouble, and Wall Street will punish your stock price.

There was also a set of industry-specific incentives, which we can approximate as "Moore's Law", that pushed tech companies to sprint as fast as they could into a future of faster chips and denser data storage. Admitting that you had enough capacity for the number of customers you expected to have was tantamount to admitting that it was time for people to sell your stock.

What do you get if you have a set of incentives which does not punish a CEO for being unprofitable, but does punish them for not increasing capacity? You get corn syrup.

Why does my new clothes dryer come with an app? Why does it take special effort to find a refrigerator without built-in WiFi? It's not because the consumer said, "oh, if only my refrigerator would send me an alert if I accidentally leave the door open"; it's because anything which can include computing will, unless you go to special effort to find an option without it.

Why do modern movies use CGI, even in cases where there is no real need for it? It's not because moviegoers love CGI (usually), or were clamoring for more, or disliked movies that had insufficient CGI.

Why have we turned our classrooms into places where on-call tech support is often needed, even though neither the teachers nor the students had a problem with whiteboards?

I could detail the reasons why I believe automobiles now have a lot more computing than they should, but I think most people understand this point well enough already. We did not ask for more touchscreens and fewer tactile controls, and if asked we would have refused it; it was given to us anyway.

Now I do not claim to know the mechanism whereby the companies responsible for overproducing a superabundance of computing power are able to convince appliance makers and movie studios and the rest to use it. Perhaps they are investors in their own customers? Perhaps every non-tech CEO wants to pretend their company is also a tech company, so that they can get the stock price that comes with that? I don't know for sure, but what I can tell you is the result, which is that we have stuffed computing into every corner of life, and in many of those corners it not only does no good, it is positively harmful.

None of this, I should make clear, means that all computing is bad or a mistake, any more than all corn is bad to eat. I like cornbread, and corn on the cob, and I like to have a laptop and WiFi at the coffee shop. But we have long since gone past the point of putting computing into only those products and places where we benefit from it (or even want it); we have entered the age where computing has a negative price, and it will take special effort to find products and places that do NOT have it.

I do believe that, like corn syrup, the over-computerization of our world will trigger a backlash (or perhaps is already beginning to). However, when a large industry has built up this much momentum, it can take years of backlash to get it to change its ways. In the meantime, we can expect ever-more-ridiculous injection of computing power into products and places that do not benefit from it, and may even suffer from it. If you want to avoid this "computing syrup", you will have to work hard to do so.