Social critics tend to use the word "capitalism" when discussing the West. In fact, of course, we live in a mixed economy, i.e. one combining elements of capitalism and socialism. A society in which 40-50% of GDP is allocated to the state can hardly be described as free-market capitalism.
Sometimes the things being complained about are plausibly more traceable to capitalism than to socialism (e.g. pollution), but more typically it is simply assumed that the capitalist part must be responsible, with little basis for the assumption. For example, the supposed dumbing down that tends to accompany massification is often blamed on capitalism, although massification is likely to be correlated with income redistribution.
During and after the 2008/09 financial meltdown, numerous commentators were ready to blame the crisis on "capitalism". Simplistic labels invite simplistic assessments: we were living under capitalism, hence it must be capitalism's fault; it was the banking system that went wrong, and banking is part of capitalism, hence capitalism was to blame; and so forth. In fact, the circumstances that led to the meltdown were a complex mixture of markets and intervention.
If you interfere with the market, and the market goes wrong, you should consider the possibility that the interference contributed to the problem, or may even have been the primary source of the problem.
The fact that nothing this sophisticated tends to happen, and that instead we get knee-jerk blaming of "capitalism", usually without anyone bothering to consider what the term means, should perhaps not surprise us. Many (most?) contemporary social commentators have studied for degrees in humanities subjects. The humanities are increasingly dominated by the Marxist worldview, though nowadays it tends to come under the more respectable-sounding label of "Critical Theory". Marxism involves commitment to the belief that capitalism is unjust, intrinsically flawed, bound to end badly, and so on; and Critical Theory tends uncritically to identify Western societies as "capitalist".
I have not yet come across any accounts of the COVID-19 pandemic that finger capitalism as a culprit, but I suspect it's just a matter of time.
At the moment, we seem to be at the gloom/despair stage rather than the anger/blame stage. The Atlantic, for example, asks whether the USA is a failed state. America, like other nations, is temporarily in a bad way, but it seems a little premature to start talking about failed states.