Updated: 2023-09-13

Introduction

After following the development of generative AI for many years, we decided at the beginning of 2023 to incorporate AI into our operations: to actively track its development, to learn more, and to truly start using AI in our business.

Context

At Digitalist Open Tech, we use AI as a working tool for our everyday job-related tasks. We also build specific AI services for internal use as well as for our clients.

We know that how we build and use AI services affects us as employees, our clients, and, in the long run, society as a whole.

Purpose

The purpose of this AI policy is to ensure that Digitalist uses artificial intelligence (AI) tools responsibly, transparently, and in a manner that respects individual privacy, fosters trust, and promotes fairness. This policy applies to all employees and subcontractors who interact with AI tools on behalf of Digitalist.

Target group

All employees and subcontractors we work with are responsible for adhering to this policy and for ensuring that no prohibited data is shared with AI tools.

Scope

This policy covers any AI tool used within Digitalist and any data processed or shared by these tools, regardless of its format or storage location.

Ethical guiding principles

Digitalist is committed to adhering to the following ethical guiding principles when using and developing AI tools:

Transparency

We will communicate clearly and openly about our use of AI tools and their purposes, capabilities, and limitations. We do this internally, in conversations with our clients, and on our digital channels.

Privacy and data protection

We will protect the privacy of individuals by adhering to EU data protection laws and regulations, and by implementing robust security measures to safeguard personal and sensitive information.

Bias - fairness and non-discrimination

As far as we possibly can, we will strive to prevent unfair bias and discrimination in AI systems and to ensure that they treat all individuals equitably. We are aware that this is not always the case.