Identifying your audience#

If you would like to run a workshop, you will want to try to identify a range of people to attend who can bring different perspectives to the session. This is a brief overview of the types of stakeholders we have usually encountered, and an activity to get you thinking about whom you might invite.

Audience types#

| Audience group | Knowledge Exchange | Encouraging Diversity |
| --- | --- | --- |
| **Project Team**<br>Principal investigators, research associates and assistants, project managers, software engineers… | The team gain a new tool to reflect on their work and affect decisions, whilst also having space to clarify whatever ethical concerns they already had. | Data Hazard Labels can inform teams working on different projects. |
| **Impacted Community/ies**<br>Product end-users, caring professions, marginalised communities… | People who will be affected by, or use, the research output can feed into its design, affecting the researchers' decisions. | A project developing a system for detecting cancer cells can involve cancer patients and clinicians. |
| **Organisational Representatives**<br>Partnerships, legal, governance, human resources, finance, research ethics committee… | Professionals from the institution where the project is carried out can share their specialist knowledge. | Legal teams can help understand privacy concerns, whilst human resources professionals might have notes on recruitment. |
| **Funders**<br>Funders of different projects | Funders can learn about the social impact and risks of scientific research. | The technical feasibility of a project can be impacted by morally informed judgements, and funders can feel better informed through collaborative sessions like these. |
| **Outside Perspectives**<br>A variety of subject specialisms | The team gain a novel perspective on a topic they know well. | A philosopher will have a unique perspective on the ethics of a data science project. |

There are many stakeholders in data science research projects, and even more reasons to build strong ties with them. In this section, we suggest five coarse-grained groups of stakeholders. The list is not intended to be exhaustive, but these are important parties to engage with. Data Hazard Training sessions are a great way to establish a shared vernacular when discussing data ethics with these parties.

The project team#

It is important to discuss matters of ethics with the entire team. Whilst certain formal responsibilities may be clearly established, a project’s moral implications can fall to the entire team.

Involving the entire team can lead to excellent discussions. From experience, we have found that facilitating discussions in a safe environment with various research team members can help individuals at different career stages voice their concerns about a project. The Data Hazard Labels then serve to establish conceptual clarity to discuss those concerns. Ultimately, these discussions can shape projects for the better.

Furthermore, project teams do not last forever. Researchers will continue their careers after a project ends and the team dissolves, but Data Hazard Labels give them a basis for continuing to reflect on data ethics afterwards. In this way, facilitators can help promote a cultural change within data science towards a more reflexive approach.

Impacted communities#

Research outputs might impact diverse communities. Such outputs include tools that researchers use for technical analyses, that clinicians use for diagnosing patients, that policy-makers consult to inform their decisions, or that the public uses for various reasons. It is important that, once the project team's intended output is known, the communities it might impact are identified.

Whilst Data Hazard Training sessions do not explore your project team's research in depth, inviting impacted communities establishes a shared conceptual framework for discussing ethical questions as the research project progresses.

Indirectly, Data Hazard Training sessions allow project teams to gain insight from their relevant communities. Attendees should feel encouraged to point to the Data Hazard Labels they find most relevant. This can then be captured by the session’s facilitator and shared with the project team later on.

Organisational representatives#

Research institutions often have different business units, such as legal, partnerships or human resources. Inviting their representatives to Data Hazard Training sessions can help demonstrate the institution’s care for the ethics of its research outputs. It can also foster a more trusting work environment, where colleagues from non-research positions gain greater understanding of the technical and ethical aspects of the research produced.

Furthermore – and similarly to inviting entire project teams – this creates a space where business professionals can share their insights from their own specialist backgrounds. In this way, they can influence decisions within a project and provide perspectives and even solutions that might otherwise be missed.

Funders#

Those who fund research projects will be interested in knowing the impact those projects will have, as well as the different factors that influence their methodologies and final outputs. Once again, Data Hazard Training sessions provide a shared set of terms for discussing the ethics of data science, and they also convey the urgency of data ethics. For all these reasons, bringing funders into Data Hazard Training sessions can help foster a culture of understanding: a shift from viewing ethics as "moral constraints" on methodological decisions to treating it as the moral basis from which to conduct a project well.

Outside perspectives#

Project teams can sometimes lack exposure to other disciplines; a team might be highly technical, for example. Data Hazard Workshops are a great way to bring diverse disciplines into dialogue: by creating a safe space to discuss complex ethical issues, experts from different fields can each bring their own perspectives to the debate. Through interdisciplinary discussion, better-informed ethical decisions can be made.