# Hazard: Lacks Community Involvement

```{image} ../../images/hazards/lacks-community.png
:alt: A red diamond-shaped outline (like a warning sign) with figures in the middle who have empty speech bubbles above their heads.
:width: 250px
```

## Description

This hazard applies when technology is produced without input from the community it is supposed to serve.

## Examples

__Example 1:__ Research into cures for Autism ([which are generally not wanted by Autistic people](https://www.theguardian.com/commentisfree/2009/jan/14/autism-health)).

__Example 2:__ The Samaritans Radar app highlighted people on Twitter who might be struggling to cope, and was [withdrawn following wide criticism](https://www.samaritans.org/about-samaritans/research-policy/internet-suicide/samaritans-radar/).

__Example 3:__ [Algorithmic colonisation of Africa](https://script-ed.org/article/algorithmic-colonization-of-africa/).

## Safety Precautions

- Be proactive in reaching out to communities, including diasporic communities. If you are not sure how to contact a community, consider whether it is possible to do your work ethically in its absence.
- Consider how issues of power, consent and trust may be present in your interactions with the relevant communities.
- Ask the people who the work is about whether they want this kind of solution, and co-create or [co-produce](https://involve.org.uk/resources/methods/co-production) research with them as partners.
- Consider where the benefit of the data science work lies, and ensure that communities receive benefits from this work that are desirable to them.
- Test the effectiveness of the algorithm or technology for different marginalised groups.
- Consider where you may inadvertently be [classifying or ranking](ranks-classifies) one community over another when deciding whose views 'count' towards addressing the research problem.