We have released the 2024 Report for our Emerging Challenges Fund, detailing its latest grants, past grants' outcomes, and strategic direction.
Donate to the ECF today. Your contribution would help diversify our funding base and enable rapid support for highly impactful organisations that benefit from broad-based backing. The Emerging Challenges Fund aims to identify funding gaps that other donors cannot or will not fill. Our flexibility allows us to move swiftly across diverse areas, from diplomatic dialogues with China to legislative support for the EU.
About the ECF
Our Grants
New Grants (2024 Grants)
For summaries of each project funded, its theory of change, and the track record of its host organisation, please read our 2024 Report.
| Recipient | Amount | Summary |
|---|---|---|
| Six EU civil society organisations | $733k | EU AI Act: Emergency Support |
| Horizon Institute for Public Service | $150k | Placing technical expertise in government |
| Centre for Long-Term Resilience | $150k | Advising on biosecurity & AI policy |
| Pacific Forum | $50k* | Convening US/China Track II dialogues |
| Council on Strategic Risks | $210k | Promoting non-nuclear deterrence |
* Some projects were jointly funded by the ECF & Longview's partners at our recommendation.
Grant Updates (2023 Grants)
For summaries and updates on each of our 2023 grants, please read our 2024 Report.
| Recipient | Amount | Summary |
|---|---|---|
| FAR.AI | $100k | Found vulnerabilities in superhuman AI |
| Harvard University | $110k* | Investigated LLMs’ beliefs and guardrails |
| Model Evaluation and Threat Research | $220k* | Automated AI evaluations |
| Brown University & Oregon Research Institute | $20k* | Protected nuclear decisions from bias |
| Carnegie Endowment for International Peace | $52k* | Revealed escalation risks are highly uncertain |
| Blueprint Biosecurity | $50k* | Catalysed Project AIR, a plan for Far-UVC |
| Center for Communicable Disease Dynamics | $80k | Supplemented research funding for GCBRs |
| Nuclear Threat Initiative | $100k | Shaped the Biological Weapons Convention |
* Some projects were given additional funding by our Nuclear Weapons Policy Fund or one of our partners at our recommendation.
Looking Ahead (2025 Grants)
In 2025, we aim to unearth more opportunities that the ECF is particularly well-suited to fund. With your support, we aim to fill critical funding gaps at organisations in need of rapid financial support and a diversity of donors.
To support the ECF in 2025, donate now!
Our Strategy
Over the next decade, emerging technologies will pose significant challenges to global security. Rapid advances in AI, for example, could lower the barriers for malicious actors to carry out large-scale biological or cyber attacks, bring democratic processes under unprecedented strain, and accelerate scientific and economic progress like never before. At present, we are neither ready to face the next deadly pandemic nor equipped to navigate escalating geopolitical tensions as global superpowers build more nuclear weapons, of more types, on more platforms.
The ECF aims to prepare the world for these challenges. We prefer to support projects that meet Longview’s usual grantmaking criteria and pass two further tests:
- Does the project have a legible theory of impact? ECF grantees must have a compelling, transparent case for their impact that a range of donors will appreciate.
- Will the project benefit from diverse funding? Often, support from a large number of donors, rather than from a single organisation or donor, is of particular value.
In 2024, we allocated over half of the ECF to civil society organisations invited to help draft the EU AI Act's Code of Practice. These grants were especially well-suited for the Fund, as (i) providing expertise to those shaping the implementation of the EU AI Act is both vitally important and legible; (ii) it is important that these civil society organisations are funded by a diversity of sources and remain credibly independent of any single interest group; and (iii) the Fund allowed us to make the grants quickly to fill an urgent need. We aim for this to exemplify the ECF's grantmaking strategy: quickly supporting clear opportunities that other philanthropists overlook or are not well placed to support.
For those seeking to invest in a safer future, this fund provides unique expertise across beneficial AI, biosecurity, and nuclear weapons policy, and fills critical funding gaps at organisations in need of rapid financial support and a diversity of donors.
Donate to the ECF. Note that the ECF used to be called the Longtermism Fund.
About Longview Philanthropy
Longview Philanthropy is an independent and expert-led philanthropic advisory for major donors who want to do the most good possible with their giving. Our operations are funded directly by donors who believe in our mission, giving us the ability to reliably provide free-of-charge, independent, and expert-backed grant recommendations. By growing the community of major funders in our focus areas, we create more opportunities for organisations to start, sustain, and scale their impactful projects—contributing to a safer and more secure future.
Our Services
Our grantmaking and advisory teams offer a variety of philanthropic services at no cost. Whether you're beginning your giving journey or managing an established portfolio, Longview can help you create lasting impact.
Funds (Public & Private)
We manage specialised funds with distinct focus areas. Our public funds—the Emerging Challenges Fund and the Nuclear Weapons Policy Fund—are open to all donors. For major donors, we offer private funds featuring enhanced reporting and confidential insights.
Grant Recommendations for Major Gifts
For donors wishing to make large gifts, we offer access to grant recommendations drawn from our top opportunities. These concise analyses help donors find and fill the most critical funding gaps.
End-to-End Effective Giving Service
For donors seeking to develop significant philanthropic portfolios, we provide a bespoke end-to-end service at no cost. This includes detailed analysis, expert-led learning series, residential summits, tailored strategic planning, grant recommendations, due diligence, and impact assessment.
Get in Touch
To explore using these services—or to recommend someone who may benefit from them—please get in touch with our CEO, Simran Dhaliwal, at simran@longview.org.
Our Focus Areas
Our core priority is safely navigating emerging technology, with most of our grantmaking in the following focus areas:
- Beneficial AI. Cultivating expertise, refining policy, and scaling up technical safety efforts.
- Biosecurity. Strengthening defence-focused biotechnologies and global norms.
- Nuclear Weapons Policy. Opposing destabilising systems, arms races, and unintended escalation.
In addition to these core focus areas, Longview has made substantial grants and recommendations in global priorities research, media, global health & development, and animal welfare.
Beneficial AI
Massive investments are creating AI models that can surpass many human abilities. This rapid progress poses significant risks, from unprecedented cyberattacks on critical infrastructure to the loss of control over powerful, autonomous systems capable of deception. Philanthropy can reduce these risks by improving governments’ capacity, shaping national and international policy, and supporting innovative technical solutions.
Biosecurity
The next pandemic could be engineered to be far deadlier than COVID-19 or even the Black Death. Advances in synthetic biology could allow malicious actors to cheaply create and deploy pandemics without expertise. We aim to promote defence-focused biotech and global norms—like creating systems to detect novel pathogens and building consensus for their widespread use—to defend the world against the worst pandemics.
Nuclear Weapons Policy
The US, Russia, and China have entered a new arms race, bringing destabilising systems online as key arms control treaties expire. Meanwhile, philanthropic funding for nuclear policy has fallen by half, to under $40 million per annum. We support pragmatic initiatives to reduce worst-case risks from nuclear weapons, such as opposing maximalist proposals for expanding nuclear arsenals and working to prevent conflicts from inadvertently escalating to nuclear war.
Support organisations that Longview recommends
To use our services or make large gifts to organisations that we recommend, please get in touch with our CEO, Simran, at simran@longview.org.
Or support the ECF today!