"Trajectory changes" are "very persistent/long-lasting improvements to total value achieved at every time" conditional on not having an existential catastrophe - in other words, interventions that increase the chances of good futures as opposed to merely okay ones (Forethought Foundation).
From the vantage point of the present, it is very hard to reliably cause trajectory changes, because we can't foresee whether our actions' long-run effects will be good or bad. However, we can take actions that empower future generations to create the futures they want. These include:
- Doing global priorities research so future generations have more knowledge they can use to decide what matters
- Improving future generations' ability to predict the effects of their actions
- Improving governance so future generations can more easily decide what matters and act on their collective values
- Improving humanity's general capacity to innovate and solve problems (e.g. science/tech policy and infrastructure, public goods markets)
This list is similar to the taxonomy of EA focus areas proposed here.
This approach may also be valuable from a deontological perspective: if future generations have a right to choose their own future, then we ought to help them make that decision themselves; making irreversible trajectory changes on their behalf would violate that right.
Based on this focus area taxonomy, I think the most pressing area is global priorities research. Compared to the other focus areas, it seems highly neglected relative to its importance, whereas many people inside and outside EA are already doing object-level work on x-risks, forecasting, governance, and so on.