Defining global catastrophic risk
Beacon uses a straightforward definition: global catastrophic risk is anything that could kill, or cause massive suffering to, a billion or more people in a single event or tightly coupled chain of events.
This includes both existential risks (threats to humanity's long-term survival or potential) and catastrophic risks that fall short of extinction but would cause suffering at a civilisational scale. It includes risks from suffering itself (s-risks), not only risks from death.
This definition is deliberately concrete. It guides every project evaluation: does this work plausibly reduce the probability or severity of a billion-scale catastrophe? If yes, it's in scope. If the connection is too indirect or speculative, we'll help you find a better home.
Risk categories Beacon prioritises
We allocate our attention and resources based on where we believe the marginal impact is greatest, weighing the importance of the risk, the tractability of interventions, and how neglected the area is relative to its stakes.
AI safety and AI governance
This is where we expect the large majority of our portfolio to sit. The intensity and near-term nature of AI risk, combined with the rapid scaling of capabilities, make this the area where good projects face the most acute infrastructure needs. Technical safety research, alignment work, AI governance and policy, evaluations, and interpretability all fall squarely within scope.
Biosecurity and pandemic preparedness
Engineered pandemics and dual-use biotechnology represent the next tier of catastrophic risk by our assessment. Projects working on detection, attribution, countermeasure development, governance of dual-use research, and pandemic preparedness infrastructure are within scope.
Nuclear risk
We acknowledge nuclear risk as a genuine catastrophic threat, but it is substantially less neglected than AI safety or biosecurity. There are established institutions, treaties, and funding streams dedicated to nuclear risk reduction. Beacon is unlikely to be the best home for nuclear-focused work unless it involves clear intersections with AI (e.g., AI-enabled command and control risks, autonomous weapons systems) or other emerging technology risks.
Emerging technology risks and civilisational resilience
Catastrophic risk doesn't stand still. We're open to sponsoring work on emerging threats that don't fit neatly into existing categories: novel dual-use technologies, engineered risks at scale, or civilisational resilience work that directly addresses recovery from and robustness to catastrophic events. These are evaluated case by case.
Longtermism-adjacent and cross-cutting work
Some valuable work doesn't map cleanly to a single risk category but contributes to our understanding of, or ability to respond to, catastrophic risk. Field-building, forecasting, institutional design for existential risk governance, and similar cross-cutting efforts may be in scope depending on the specifics of the project. We evaluate these case by case, with a higher bar for demonstrating GCR relevance.
What falls within scope
In general, Beacon sponsors work that:
- Has a plausible, articulable connection to reducing the probability or severity of a billion-scale catastrophe
- Requires a nonprofit structure (fiscal sponsorship, 501(c)(3) status, grant administration)
- Is led by people with the skills and track record to execute on the proposed work
- Would benefit from Beacon's GCR-specific domain expertise, network, and infrastructure
This includes research, policy, field-building, technical development, evaluations, community infrastructure, and rapid-response efforts, provided they meet the GCR relevance bar.
What falls outside scope
The following are important areas of work that Beacon is not well-placed to support. We respect these efforts and will help you find sponsors better suited to them.
- Climate change (in isolation). Climate work is critical, but it operates on different timescales and has a mature ecosystem of dedicated funders and organisations. Beacon's infrastructure doesn't add value here. The exception would be climate scenarios that interact with other catastrophic risks at the billion-scale threshold.
- Local disaster response. Emergency response and disaster relief require specialised operational infrastructure that Beacon doesn't have. We can offer high-level guidance on scaling coordination across response efforts, but direct disaster response is outside our wheelhouse.
- General poverty reduction and development. Valuable work, but too broad for Beacon's focused model. Organisations like GiveWell and GiveDirectly have deep expertise here.
- Work without a clear GCR connection. If the link between your project and billion-scale catastrophic risk requires more than a few sentences to explain, we're probably not the right fit.
Edge cases
Some of the most important work happens at boundaries. Beacon pays particular attention to:
- Situations that are scaling into catastrophic territory: emerging threats that aren't yet billion-scale but are on trajectories that could get there
- Work on which authorities, institutions, or treaties can bind global responses to catastrophic events
- Research into when and how previously out-of-bounds interventions become justified as threat levels escalate
- Cross-domain projects that don't have obvious homes elsewhere (e.g., AI-enabled biosecurity monitoring, governance work spanning multiple risk domains)
If you're not sure whether your project is in scope, get in touch. We'd rather have the conversation than have you self-select out of a good fit.
Alternative fiscal sponsors
If your work doesn't align with Beacon's GCR focus, it may be better served by a fiscal sponsor with broader scope. That work matters; it just needs different infrastructure.
Curated referral list in development. Beacon is consulting with partner organisations to compile recommendations for projects we can't serve. In the meantime, the National Network of Fiscal Sponsors maintains a directory of fiscal sponsors across a range of mission areas.