You make the assumption that well-being and autonomy are two different and separable things, but are they?
Like, this is not just another iteration of the Matrix, where you plug a real person into a simulation in which they interact with events, right? I mean, you argue that you want to simulate their decisions and make them for them. That would either not work and create terrible discomfort for the person, because their own decisions would be overridden without them knowing why, which might drive them insane or encourage them to act irrationally just to retain their agency, making it borderline impossible for any machine to keep up. Or you'd essentially turn them into a zombie with no mind of their own, at which point you didn't so much save a child as a psychological corpse, a body in a vegetative state. And whether such an organism can even experience well-being is quite debatable.
It also presupposes answers to the mind-body problem. Because if mind and body are interlinked, then putting one to sleep and playing with the other would also radically alter your psychology, for better or for worse. So idk, if that doesn't work 100% (and there's reason to believe it doesn't, given that no such device exists or is realistically conceivable), you might end up in a state where the user tries to hurt themselves just to "feel" something, because parts of their body are telling them something is wrong, and they want to scratch that itch to find out where it is, unable to locate it in the simulation.
And depending on the model, autonomy is a part of well-being, so at some point the well-being machine is bound to fail at its job because it cannot produce autonomy. Like, even without unnecessary brutality, people wouldn't think of prisons as positive, despite them basically being well-being machines that offer you lots of spare time, free food, and close contact with other people. And even if it's not a penal institution for people with problems, putting just regular people under unfree conditions would probably be closer to torture. Now, if you've never experienced anything else, you might be able to cope with it better, but I'd still think it would leave you with a lack of well-being.
And on the other hand, if you have no well-being, then your thoughts are likely occupied with the question of how to change that. Even your autonomy, in the sense of deciding for yourself how to tackle the challenges you face, relies on you being sufficiently well off not to act purely on instinct. So yeah, if the alternative is death, anything else would probably be better; at least it would buy you time to find something better, and I think that would be the main motivation for parents to take that option. I don't think anybody would pick it if they had a better alternative.
I mean, there are people in the real world who make that decision: people who live in crisis regions and send their kids to relatives far away, or even to strangers, in hopes of giving them a better chance at life. So I guess there would be a strong urge to take that option, but not if there were any better alternatives.
Also, can other people determine what well-being means for you? Beyond some very basic things that doesn't really work, and we see it fail quite frequently; the autonomy to decide for yourself what counts as well-being is often itself essential to your well-being.
Idk, it's kind of like the trolley problem, where you have to choose between two evils, and in any realistic version of it you just agonize until you pick one.
Last but not least, there is the practical component of your thought experiment: it's quite unlikely that a highly destabilized society with rampant violence and death would be capable of building such a machine. And if they did, they'd probably not do it for the benefit of the children but to run simulations on them as real-life participants, and since simulations are usually run on edge cases, that might be very unpleasant.
Like, you don't have to overthink it, but suppose you're not dealing with hard determinism in real life but with people who do in fact have autonomy. Then this scenario would basically be a computer game, and each and every move of the player would either spawn another universe to simulate, or the player would sooner rather than later find spots where the developers didn't expect anyone to go. And the amount of processing power required to render a perfect world, one that wouldn't feel like torture and that fakes autonomy without actually granting it, would definitely be too much for your wasteland scenario.
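Just to put rough numbers on that branching claim, here's a toy back-of-the-envelope sketch; the branching factor and decision rate are made-up assumptions, not anything from your scenario:

```python
# Toy estimate of how fast a branch-per-decision simulation blows up.
# Assumed (made-up) numbers: 2 options per decision, one decision per second.

OPTIONS_PER_DECISION = 2      # assumed branching factor
DECISIONS_PER_SECOND = 1      # assumed decision rate
SECONDS_PER_DAY = 86_400

decisions = DECISIONS_PER_SECOND * SECONDS_PER_DAY
universes = OPTIONS_PER_DECISION ** decisions  # branches after a single day

# Python handles big ints natively; print the order of magnitude instead of
# the full number (it has roughly 26,000 digits).
print(f"decisions in one day: {decisions}")
print(f"universes to track:   ~10^{len(str(universes)) - 1}")
```

Even with absurdly aggressive pruning of branches, that growth rate is the problem, not the rendering itself.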