I first met Natalie through my friends at FutureEverything, and we quickly bonded over our shared determination to tackle inequality. Natalie works as Curator & Editor at FutureEverything, an innovation lab for digital culture and festival in Manchester, and is also a researcher at the futures research lab Changeist.
As a curator, writer and researcher exploring the intersection of culture, futures, design and technology, she has a particular interest in the uses and abuses of personal data, ethics and innovation in wearable technologies and issues surrounding the Quantified Self, as well as an interest in the cultural and social impact of engineering.
Natalie is also co-curator of the Haunted Machines conference, an ongoing research and curatorial project that reflects on narratives of magic and hauntings in our relationship to technology. Here's what she has to say...
What’s the biggest lesson you’ve learned over the last year?
Never to make assumptions, particularly about how you think people should or will behave, as people will always surprise you. I spend a lot of time thinking about the social, political and ethical futures of a particular area, alongside the technological and economic, and I'm constantly surprised by how assumptions change across cultures, communities and generations.
A good portion of my work is in exploring those assumptions and seeing where clear lines can be drawn from them to where they make it into reality. It's something I, and the futures research lab Changeist, where I spend some of my time, have worked into projects such as Thingclash, which has ultimately been about examining the assumptions we hold around the Internet of Things, and how different contexts and personas cause us to rethink what those might be. It's something we're bringing into our work at FutureEverything as we tackle CityVerve, Manchester's IoT demonstrator, where we're firmly putting people, not technology, first.
What’s your burning question of the moment?
I've got a couple, but the main one for me is how to embed ethics and futures methodologies into design in a way that makes teams understand how important they are for actual people, so that ethics isn't seen as a luxury or a huge cost, but becomes a tool in a team's arsenal for tackling difficult design problems. So often ethics is forgotten or treated as an add-on, but for me it's essential to making sure we create things that reduce abuse, understand individual complexity, and are responsible. I think using futures and foresight methodologies can enable us to see these ethical consequences in ways that are applicable, and that can be learned and used by teams throughout the design process.
Opening yourself up to uncertainty, and learning how to explore change and potential futures, allows you to step back and consider what might go wrong, and for whom, in the process making your work more resilient. I'm driven by the need to make sure we don't create technology or design that hurts, excludes, or limits people because the designers didn't quite understand that these consequences might, and can, happen. So many of the stories we see today about technology causing harm are the result of designers not quite understanding the complexities waiting for their work in the real world, from legal implications to social behaviours. I know it's expensive to have a person specifically trained in futures, social sciences or ethics, but it doesn't have to be. You can train a team to start thinking in this way and bring in specialists when you need them, in the early days of design rather than when something goes wrong.
My other burning questions and current areas of research revolve around how to work futures methodologies into curatorial practice, and exploring the absences present in data; who isn't being represented, what is being hidden, and what exists in the gaps.
What’s the most inspiring thing you’ve seen/ heard/ read in the last year?
I love the work that Kate Crawford and Meredith Whittaker are doing with AI Now, a project run by The White House and NYU, which looks at the social and ethical implications of artificial intelligence, a subject incredibly close to my heart. I think there's so much conversation about what the technological capacities and aspirations might be that we forget that these systems are going to be made by people and live with people, which means taking into account the prejudices and biases of those that author them, and of those that will use them in the real world. The talks and discussions are fantastic, and I hugely recommend that anyone working in AI, from developers to human resources, go and watch them.
I'm also constantly inspired by the work of people around me who are tackling the social implications of design and technology. To name a few: Anab Jain, Deb Chachra, Sara Hendren, Georgina Voss, Sara Watson, Willow Brugh and Lydia Nicholas. There are many more I could name.
What's your one piece of advice to students out there?
There's an excellent quote by an artist and designer I hugely admire, Sara Hendren, whose work on assistive and adaptive technology is revolutionary: 'court ambiguity, walk toward inequality, above all yield to heartbreak'.
She gives this advice to her students at Olin, one of the best engineering schools in the world (in my opinion), and ultimately it's about understanding that people might use your technology in ways you didn't anticipate, actively working to make the world a better place for those that need it, and understanding that over the course of your work you're going to get heartbroken, and learning from it and telling others about it. I've probably hugely simplified that, so apologies Sara if you're reading this! It's something I want to bring more into my own dealings with students, and others I do workshops with. It's such a wonderful way to approach a problem, and such a necessary divergence from the tired, overused, and frankly boring 'failure' trope seen over and over again in design thinking. What you don't see is the people who didn't get back up again because of inequality, both structural and institutional; you only see those with the privilege and connections to help them back up then advocating the virtues of failure. We need to talk more about heartbreak, and vulnerability, and where we're letting people down by not addressing inequality head-on.
You can read the rest of the profiles here: