Helping others

Many areas still lack meaningful government regulation, and the current UK GDPR, as enforced by the ICO, falls short in various ways. The societal impacts of autonomous solutions remain unknown, leaving NHS staff and patients yet to be convinced of their safety and reliability. This uncertainty is magnified by the rapid pace of technological advancement, with each new technology adding its own complexities to the picture.

Solution design focused on data flows, feature requests or system architecture is often not enough to identify and mitigate the risk of harm presented by autonomous systems, regardless of their Human-in-the-Loop (HITL) or Human-in-Control (HIC) design.

HD Labs helps organisational leads build trust in, and demonstrate anticipated compliance for, their data-driven technologies. We do this by developing stakeholder-level transparency: a bottom-up approach. We equip organisational leads with the tools and knowledge they need to deploy and manage trustworthy, safe autonomous solutions; we provide the resources and skills required to uncover and understand stakeholder values and concerns; and we supply the methods to overcome the barriers to trust and successful adoption.

Our approach

We design the service to be affordable and accessible, working with a unique combination of high-quality, evidence-based intellectual assets that are also open-source and in the public domain.

Our Ethical Digital service is delivered as a series of facilitated workshops and co-design and co-production sessions, with the outputs of each stage feeding into the next. Key outputs include the Harms Modelling matrix and the XAI Interpretable Toolkit, which together form the foundation of a tailored Ethical Assurance System.

We offer continuous support after a successful handover, including Trustworthy Socio-Technical Reviews. In these reviews we assess autonomous decisions for data drift and identify novel adverse outcomes: for example, those arising from refreshed population intelligence, from newly emerged patient values and concerns, or from industry-specific horizon-scanning.

HD Labs Ethical Digital service approach

Join our Ethics Advisory Board

The Ethics Advisory Board monitors and maintains industry guidance on AI. It oversees current best practice for the use of automation, AI and data handling across all of HD Labs' projects.

Being this open with industry aligns us with open-source practice and fulfils our mission statement, vision and brand values. Interested?