A new toolkit is designed to help technologists envision the potential risks, and the resulting worst-case scenarios, that could emerge if their technologies were used for malicious purposes in the future. This allows developers to anticipate issues and design ethical solutions from the outset. The tools are called the “Ethical OS Toolkit” (which the developers dub “Or: How Not to Regret the Things You Will Build”). The toolkit has been tested by some twenty technology companies, schools and start-ups. Early adopters include Mozilla and Techstars.
The toolkit gives technologists information about the new risks they should be paying attention to, as well as choices they can make to safeguard users, communities, society and their own companies. The kit was designed by the Institute for the Future’s Jane McGonigal, an expert in collaborative foresight and human interaction, and Samuel Woolley, director of the Institute for the Future’s Digital Intelligence Lab.
Each toolkit includes:
14 “risky futures” scenarios through which technologists can imagine risky situations that could arise from new technologies.
A checklist of eight risk zones where hard-to-anticipate and unwelcome consequences are most likely to emerge.
Seven future-proofing strategies that help technologists prioritize identified risks.
According to Jane McGonigal: “The future is like a game. We have to play it to see how it turns out. But if we wait until the future actually happens, it’s too late to shape it or change our strategies for a better outcome.”
She adds: “But we can play it in advance. Our goal is to engage tens of thousands of tech workers in anticipating the long-term impacts of what they’re building today, so they can use that foresight to make better, more ethical choices now.”
Going forward, the Institute for the Future and Omidyar Network’s Tech and Society Solutions Lab will continue to work with tech companies, academics, product managers and others to market the toolkit.