Goodhart’s law famously says: “When a measure becomes a target, it ceases to be a good measure.” Although it originated in economics, it’s something we have to grapple with at OpenAI when deciding how to optimize objectives that are difficult or costly to measure.
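The failure mode Goodhart’s law describes can be seen in a toy simulation (illustrative only, and not any particular production setup): a proxy metric equals the true value plus independent measurement noise, and we select the candidate with the best proxy score. The harder we optimize the proxy, the more the winner is selected for noise rather than true value, so the proxy increasingly overstates quality.

```python
import random

random.seed(0)

def selected_gap(n, trials=200):
    """Average amount by which the proxy overstates the true value
    of the candidate chosen by proxy score, when selecting the best
    of n candidates. proxy = true_value + independent noise."""
    total = 0.0
    for _ in range(trials):
        candidates = [(random.gauss(0, 1), random.gauss(0, 1))
                      for _ in range(n)]
        # Pick the candidate that looks best on the proxy.
        true_v, noise = max(candidates, key=lambda c: c[0] + c[1])
        total += noise  # how much the proxy overstated this winner
    return total / trials

light = selected_gap(10)       # mild optimization pressure
heavy = selected_gap(10_000)   # heavy optimization pressure
print(f"proxy overshoot, best of 10:     {light:.2f}")
print(f"proxy overshoot, best of 10,000: {heavy:.2f}")
```

Searching over more candidates drives the overshoot up: with more optimization pressure, the selected candidate’s proxy score is inflated mostly by noise, which is exactly the decoupling Goodhart’s law warns about.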