Our Ethics Code

July 15, 2021

Synthetic Media revolutionises content creation. It allows people from all walks of life to create content in a capacity that was previously available only to a few. It is a powerful tool, but, like any tool, it can be misused.

Voice Cloning, like other deepfake technology, can be used for nefarious purposes, including disinformation, defamation, and information warfare, as well as phone-call fraud and hacking attacks.

Any technological advancement carries risks. Technology that alters human perception rightly raises ethical questions about its purpose and its effect on the fabric of society.

Our answer is that we should not fear new technologies simply because they are potent. Instead, we should tame them and put them to good use.

The virtuous cycle of incorporating new Synthetic Media technologies into our societies has already started. As the dangers of deepfakes grow, so does public awareness. As the realism of Synthetic Media grows, so does the ability of detection techniques to spot it. As incidents of Synthetic Media misuse grow, so does the ability of modern legal systems to tackle wrongdoing and to strike a sound balance between benefits and risks.

The legal framework around digital rights and impersonation is already solid and sufficient for most foreseeable misuses. What is different now is the scale and accuracy at which impersonation can be carried out. This is a problem of scale, and it can be solved gracefully by solutions operating at the same scale: by enabling everyone to detect synthetic content, and law enforcement to trace its creators.

At Altered, our mission is to empower and democratise voice content creation. Our technologies are powerful and could be misused if they were opened to third parties, e.g. via an API. Instead, we choose to guard them behind the walls of our own platform, which we fully control. This allows us to enable creativity while reducing the risks of misuse. Further, we have codified our commitment to the ethical use of our technologies in the following Principles and Pledges:

Principles

  • No misuse: We will be vigilant and take measures against use cases deemed illegal or harmful.
  • Digital Rights Protection: Protecting the Digital Rights and Intellectual Property of Voice Owners is central to us and to our business model. Voice Owners and their licensors have rights, and we will take appropriate measures to support them in accordance with the law.
  • Accountability: We will do our best to provide accountability in our platform, operations and business.
  • Selectivity: We retain the right to refuse to license or sell our products to clients when we are unsure of their conduct or intentions.

Pledges

  • Commoditise Detection: We pledge to commoditise synthetic content detection: to reduce the friction associated with detecting synthetic speech and to make detection available to everyone, especially users without technical expertise.
  • Commoditise Accountability: We pledge to ensure accountability in the use of our platform's tools. A combination of logs and digital signatures (audio watermarking) will allow us to trace illegal or harmful content back to its creation.
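To illustrate the watermarking idea behind that pledge, here is a deliberately simplified, hypothetical sketch, not Altered's actual scheme: it hides a creator identifier in the least significant bits of 16-bit PCM audio samples, so that the identifier can later be recovered from the file. Production audio watermarks use far more robust techniques (e.g. spread-spectrum embedding) designed to survive compression and re-recording.

```python
import numpy as np

def embed_watermark(samples: np.ndarray, payload: bytes) -> np.ndarray:
    """Hide `payload` in the least significant bits of 16-bit PCM samples.

    Toy illustration only: a real watermark must be imperceptible and
    robust to compression, resampling, and re-recording.
    """
    bits = np.unpackbits(np.frombuffer(payload, dtype=np.uint8))
    if len(bits) > len(samples):
        raise ValueError("audio too short for payload")
    marked = samples.copy()
    # Clear the lowest bit of each sample, then write one payload bit into it.
    marked[: len(bits)] = (marked[: len(bits)] & ~1) | bits
    return marked

def extract_watermark(samples: np.ndarray, n_bytes: int) -> bytes:
    """Read `n_bytes` of hidden payload back out of the sample LSBs."""
    bits = (samples[: n_bytes * 8] & 1).astype(np.uint8)
    return np.packbits(bits).tobytes()

# Usage: embed a (hypothetical) creator ID into a stand-in audio buffer.
audio = np.zeros(256, dtype=np.int16)       # silent audio stand-in
marked = embed_watermark(audio, b"usr42")
print(extract_watermark(marked, 5))         # recovers b"usr42"
```

Because only the lowest bit of each sample changes, the perturbation is at the quantisation-noise level and inaudible, which is why LSB schemes are a common teaching example even though they are fragile in practice.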