Caveat
As Ted Wrigley noted, there's a difference between a system that develops to have a clash in values between the part and the whole, and one that develops parts that might lead to the failure of the whole. Let's try both.
Short Answer
Is there a name for the phenomenon where a system that selects for one quality eventually leads to optimizing that quality at the cost of others?
Yes. It's called specialization. A person who spends all of their time designing databases stands to benefit from the supply-demand curve, but might find themselves homeless if the tech bubble suddenly bursts and they lack the social skills to get help from others. But you seem interested in particular cases of specialization where it leads to unintended or collateral negative consequences. Your examples cast a wide net, so we'll examine a few cases to suggest some terms.
Long Answer
Specialists, Generalists, and Fragility
In systems thinking from an ecological perspective, the best terms might be fragility/antifragility.
A fragile system is one whose parts specialize in a way that opens the system as a whole up to failure, particularly through black swan events. Instrumental convergence, by contrast, is little more than a fancy label for AI systems that avoid specialization and hedge their bets using a variety of general-purpose strategies. From WP, quoting Bostrom:
Several instrumental values can be identified which are convergent in the sense that their attainment would increase the chances of the agent's goal being realized for a wide range of final goals and a wide range of situations, implying that these instrumental values are likely to be pursued by a broad spectrum of situated intelligent agents.
Going back to the example of a database programmer, the relevant adage is "The jack of all trades is a master of none, but oftentimes better off than a master of one."
In general systems theory, championed early on by Bertalanffy and extended by Nassim Nicholas Taleb, the notion that a system might evolve in a way that opens the whole to failure starts from a classic example: it's easy to believe that all swans are white, only to have that claim collapse with the discovery of a single black swan, raising the specter of the scandal of induction. The black swan example is practically an idiom in epistemology.
Black swan theory is largely epistemological insofar as it seeks to characterize systems of knowledge. From WP:
Taleb's problem is about epistemic limitations in some parts of the areas covered in decision making. These limitations are twofold: philosophical (mathematical) and empirical (human known) epistemic biases. The philosophical problem is about the decrease in knowledge when it comes to rare events as these are not visible in past samples and therefore require a strong a priori, or an extrapolating theory; accordingly predictions of events depend more and more on theories when their probability is small. In the fourth quadrant, knowledge is uncertain and consequences are large, requiring more robustness.
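Taleb's point that rare events are "not visible in past samples" can be made concrete with a toy simulation. This is a sketch only; the one-in-five-hundred probability and forty-year observation window are arbitrary assumptions chosen purely for illustration:

```python
import random

random.seed(0)

# Toy model: each year there is a small chance of a catastrophic "black swan".
# (The 1-in-500 probability and 40-year window are arbitrary illustrative assumptions.)
P_BLACK_SWAN = 1 / 500
YEARS_OBSERVED = 40

history = [random.random() < P_BLACK_SWAN for _ in range(YEARS_OBSERVED)]
empirical_estimate = sum(history) / YEARS_OBSERVED

print(f"true yearly probability:      {P_BLACK_SWAN:.4f}")
print(f"estimate from observed years: {empirical_estimate:.4f}")
# With these numbers the observed sample usually contains no catastrophe at all,
# so the naive empirical estimate is 0.0 -- the rare event is invisible in past data.
```

Any plan optimized against that empirical record treats the catastrophe as having probability zero, which is precisely the fragility at issue.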
So, in this context, an optimization in game-theoretic terms might make a system outperform in one domain while being susceptible to failure in another. Cognitive biases and informal fallacies are an example of this according to behavioral economist Daniel Kahneman, whose Thinking, Fast and Slow defends a view of the human brain and mind grounded in evolutionary psychology. To wit:
The book's main thesis is that of a dichotomy between two modes of thought: "System 1" is fast, instinctive and emotional; "System 2" is slower, more deliberative, and more logical. The book delineates rational and non-rational motivations or triggers associated with each type of thinking process, and how they complement each other.
The usual explanation rests on the image of a human on the plains of Africa whose brain has to choose between being paranoid and alive, or logical and eaten, when the bushes are rustling and possibly hiding hungry lions. That same paranoia today might be counterproductive, since paranoia combined with in-group/out-group thinking can lead to war and the death of humans who might be better off collaborating.
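The trade-off can also be put in rough game-theoretic terms with a second sketch. Again, this is illustrative only; the payoff values, shock probability, and starting capital are invented assumptions. It compares a specialist strategy that is optimal in the common environment but exposed to ruin in a rare one against a generalist that hedges:

```python
import random

random.seed(1)

# Hypothetical per-round payoffs (invented numbers, for illustration only).
# The specialist does best in the normal environment but is exposed to ruin
# in a rare shock; the generalist earns less in good times but survives bad ones.
PAYOFFS = {
    "specialist": {"normal": 10, "shock": -500},
    "generalist": {"normal": 6, "shock": -5},
}
P_SHOCK = 0.01            # assumed probability of the rare adverse environment
STARTING_CAPITAL = 100
ROUNDS = 200
TRIALS = 5_000

def ruin_probability(strategy: str) -> float:
    """Fraction of trials in which the strategy's capital ever falls to zero or below."""
    ruined = 0
    for _ in range(TRIALS):
        capital = STARTING_CAPITAL
        for _ in range(ROUNDS):
            env = "shock" if random.random() < P_SHOCK else "normal"
            capital += PAYOFFS[strategy][env]
            if capital <= 0:
                ruined += 1
                break
    return ruined / TRIALS

for strategy in PAYOFFS:
    print(f"{strategy}: probability of ruin over {ROUNDS} rounds ~ {ruin_probability(strategy):.1%}")
# Typical result: the specialist goes bust in a sizable fraction of runs, because a
# single shock early on wipes it out, while the generalist essentially never does.
```

The point is not the average payoff but the ruin: the specialist's superiority in the common case is inseparable from its exposure to a single event that ends the game.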
Competition, Cooperation, and Coopetition
The way biological systems develop to preserve some genes over others, or over the organism itself, is related to the selfish gene, biological altruism, and reciprocal altruism. Sociobiology suggests that psychological altruism is a consequence of the biological sort.
In your business example, you might conceive of a system developing so that the whole collapses to benefit one of its parts. When corporate profits are put above environmental sustainability (externalizing costs and the Tragedy of the Commons), a business may destroy the source of its profit, such as by eradicating the environment, as in Dr. Seuss's The Lorax (a toy sketch of this commons dynamic appears at the end of this section), or it may destroy its customer base by maximizing short-term profits for shareholders and destroying the company, a phenomenon related to the golden parachute:
A golden parachute is an agreement between a company and an employee (usually an upper executive) specifying that the employee will receive certain significant benefits if employment is terminated. These may include severance pay, cash bonuses, stock options, or other benefits. Most definitions specify the employment termination is as a result of a merger or takeover, also known as "change-in-control benefits", but more recently the term has been used to describe perceived excessive CEO (and other executive) severance packages unrelated to change in ownership.
In these cases, systems clearly specialize in direct relation to, and at the expense of, parts, wholes, collaborators, or competitors. It might be possible to categorize this as some unified class of parasitism-predation. From WP's article on parasitism:
The entomologist E. O. Wilson has characterised parasites as "predators that eat prey in units of less than one".
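The commons dynamic mentioned above, a business destroying the source of its own profit, can be sketched the same way. Every parameter here is an invented assumption: a short-sighted harvester that extracts a large fraction of a renewable stock each season collapses it, while a restrained harvester ends up with more overall.

```python
# Toy renewable-resource model; all parameters are invented for illustration.
# The stock regrows logistically each season; a short-sighted harvester takes a
# large fraction of it, a restrained one takes a small fraction -- roughly the
# logic of the Tragedy of the Commons.

GROWTH_RATE = 0.25   # assumed fractional regrowth rate per season
CAPACITY = 1000      # assumed carrying capacity of the resource
SEASONS = 30

def total_harvest(harvest_fraction: float) -> float:
    """Sum of harvests over all seasons when taking harvest_fraction of the stock each season."""
    stock = float(CAPACITY)
    harvested = 0.0
    for _ in range(SEASONS):
        stock += GROWTH_RATE * stock * (1 - stock / CAPACITY)  # logistic regrowth
        catch = harvest_fraction * stock
        harvested += catch
        stock -= catch
    return harvested

for label, fraction in [("short-sighted (60%/season)", 0.60), ("restrained (10%/season)", 0.10)]:
    print(f"{label}: total harvest = {total_harvest(fraction):.0f}")
# The short-sighted policy gets a large haul in the first seasons, then the stock
# collapses and yields almost nothing; the restrained policy takes less at first
# but far more overall -- the system destroys the source of its own profit.
```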
Summary
Specialization has three manifestations that can lead to negative outcomes. First, a system might specialize in a way that lets it perform optimally under one set of conditions and fail under another; this is fragility. Second, in a part-whole relationship, the specialization of one part may benefit it at the expense of, or externalize costs onto, other parts; this is altruism and parasitism. Third, between whole systems, the terms altruism and predation apply. Is there a single term to describe these various exemplars? I'd say corporate employment, but I don't think that's the term you're looking for.