
METR (Model Evaluation and Threat Research)

Formerly "ARC Evals", METR was incubated at the Alignment Research Center and is now a standalone non-profit.

From their website:

METR is a research nonprofit that works on assessing whether cutting-edge AI systems could pose catastrophic risks to society.

We build the science of accurately assessing risks, so that humanity is informed before developing transformative AI systems.
