The ecology of conviction

Crossposted from world spirit sock puppet.

In balance and flux


Someone more familiar with ecology recently noted to me that it used to be a popular view that nature was ‘in balance’, with some equilibrium state that it should be returned to, whereas the new understanding is that there was never an equilibrium state: natural systems are always changing. Another friend who works in natural management also recently told me that their role in the past might have been trying to restore things to their ‘natural state’, but now the goal was to prepare yourself for what your ecology was becoming. A brief Googling returns this National Geographic article by Tik Root along the same lines: “The ‘balance of nature’ is an enduring concept. But it’s wrong.” In fairness, the article seems to be arguing against both the idea that nature is in a balance so delicate that you can easily disrupt it, and the idea that nature is in a balance so sturdy that it will correct anything you do to it, which sounds plausible. But it doesn’t say that ecosystems are probably in some kind of intermediately sturdy balance, in many dimensions at least. It says that nature is ‘in flux’ and that the notion of balance is a misconception.

It seems to me, though, that there is very often equilibrium in some dimensions, even in a system that is in motion in other dimensions, and that that balance can be very important to maintain.

Some examples:

  • a bicycle
  • a society whose citizens have a variety of demeanors, undergoing broad social change
  • a human growing older, moving to Germany, and getting pregnant, while maintaining body temperature and blood concentrations of various chemicals within narrow ranges

So the observation that a system is in flux seems fairly irrelevant to whether it is in equilibrium in some dimensions.

Any system designed to go somewhere relies on some of its parameters remaining within narrow windows. Nature isn’t designed to go anywhere, so the issue of what ‘should’ happen with it is non-obvious. But the fact that ecosystems always gradually change along some dimensions (e.g. grassland becoming forest) doesn’t seem to imply that there isn’t still balance in other dimensions, where they don’t change so much, and where change is more liable to lead to very different and arguably less good states.

For instance, as a grassland gradually reforests, it might continue to have a large number of plant-eating bugs and bug-eating birds, such that the plant-eating bugs would destroy the plants entirely if there were ever too many of them, but as they become more numerous, the birds also flourish, and then eat them. As the forest grows, the tree-eating bugs become more common relative to the grass-eating bugs, but the rough equilibrium of plants, bugs, and birds remains. If the modern world were disrupting the reproduction of the birds, so that they were diminishing even while the bugs to eat were plentiful, threatening a bug-explosion-collapse in which the trees and grass would be destroyed by a brief insect plague, I think it would be reasonable to say that the modern world was disrupting the equilibrium, or putting nature out of balance.

The fact that your bike has been moving forward for miles doesn’t mean that suddenly leaning a foot to the left is meaningless, in systems terms.

What is going on in the world?


Here’s a list of alternative high level narratives about what is importantly going on in the world—the central plot, as it were—for the purpose of thinking about what role in a plot to take:

  • The US is falling apart rapidly (on the scale of years), as evident in US politics departing from sanity and honor, sharp polarization, violent civil unrest, hopeless pandemic responses, ensuing economic catastrophe, one in a thousand Americans dying of infectious disease in 2020, and the abiding popularity of Trump in spite of it all.
  • Western civilization is declining on the scale of half a century, as evidenced by its inability to build things it used to be able to build, and the ceasing of apparent economic acceleration toward a singularity.
  • AI agents will control the future, and which ones we create is the only thing about our time that will matter in the long run. Major subplots:
    • ‘Aligned’ AI is necessary for a non-doom outcome, and hard.
    • Arms races worsen things a lot.
    • The order of technologies matters a lot / who gets things first matters a lot, and many groups will develop or do things as a matter of local incentives, with no regard for the larger consequences.
    • Seeing more clearly what’s going on ahead of time helps all efforts, especially in the very unclear and speculative circumstances (e.g. this has a decent chance of replacing subplots here with truer ones, moving large sections of AI-risk effort to better endeavors).
    • The main task is finding levers that can be pulled at all.
    • Bringing in people with energy to pull levers is where it’s at.
  • Institutions could be way better across the board, and these are key to large numbers of people positively interacting, which is critical to the bounty of our times. Improvement could make a big difference to swathes of endeavors, and well-picked improvements would make a difference to endeavors that matter.
  • Most people are suffering or drastically undershooting their potential, for tractable reasons.
  • Most human effort is being wasted on endeavors with no abiding value.
  • If we take anthropic reasoning and our observations about space seriously, we appear very likely to be in a ‘Great Filter’, which appears likely to kill us (and unlikely to be AI).
  • Everyone is going to die, the way things stand.
  • Most of the resources ever available are in space, not subject to property rights, and in danger of being ultimately had by the most effective stuff-grabbers. This could begin fairly soon in historical terms.
  • Nothing we do matters, for any of several reasons (moral non-realism, infinite ethics, living in a simulation, being a Boltzmann brain, …?)
  • There are vast quantum worlds that we are not considering in any of our dealings.
  • There is a strong chance that we live in a simulation, making the relevance of each of our actions different from that which we assume.
  • There is reason to think that acausal trade should be a major factor in what we do in the long term, and we are not focusing on it much and are ill-prepared.
  • Expected utility theory is the basis of our best understanding of how best to behave, and there is reason to think that it does not represent what we want: for instance, Pascal’s mugging, or the option of destroying the world with all-but-one-in-a-trillion probability in exchange for a proportionately greater utopia, etc.
  • Consciousness is a substantial component of what we care about, and we not only don’t understand it, but are frequently convinced that it is impossible to understand satisfactorily. At the same time, we are on the verge of creating things that are very likely conscious, and so being able to affect the set of conscious experiences in the world tremendously. Very little attention is being given to doing this well.
  • We have weapons that could destroy civilization immediately, which are under the control of various not-perfectly-reliable people. We don’t have a strong guarantee of this not going badly.
  • Biotechnology is advancing rapidly, and threatens to put extremely dangerous tools in the hands of personal labs, possibly bringing about a ‘vulnerable world’ scenario.
  • Technology keeps advancing, and we may be in a vulnerable world scenario.
  • The world is utterly full of un-internalized externalities and they are wrecking everything.
  • There are lots of things to do in the world, we can only do a minuscule fraction, and we are hardly systematically evaluating them at all. Meanwhile massive well-intentioned efforts are going into doing things that are probably much less good than they could be.
  • AI is a powerful force for good, and if it doesn’t pose an existential risk, the earlier we make progress on it, the faster we can move to a world of unprecedented awesomeness, health, and prosperity.
  • There are risks to the future of humanity (‘existential risks’), and vastly more is at stake in these than in anything else going on (if we also include catastrophic trajectory changes). Meanwhile the world’s thinking about and responsiveness to these risks is incredibly minor, and they are not taken seriously.
  • The world is controlled by governments, and really awesome governance seems to be scarce and terrible governance common. Yet we probably have a lot of academic theorizing on governance institutions, and a single excellent government based on scalable principles might have influence beyond its own state.
  • The world is hiding, immobilized and wasted by a raging pandemic.

It’s a draft. What should I add? (If, in life, you’ve chosen among ways to improve the world, is there a simple story within which your choices make particular sense?)

Condition-directedness


Opposite attractions
