Denmark’s Bold Deepfake Law

What It Means for Law & IT Students


The deepfake issue has been causing understandable fear and anxiety about how our public image could be misused. This year, however, Denmark has come out swinging, announcing a first-of-its-kind law in Europe, a move that could reshape how artificial intelligence interacts with human rights.

Here's what it means!

The Legal Lens


Why the Law Matters

Under Denmark’s proposed bill, individuals will be able to demand the removal of unauthorised deepfake content portraying them and seek compensation in civil court. Platforms that fail to comply may face significant penalties under the EU’s Digital Services Act, sending a clear message that responsibility is shared between users and tech companies.

Crucially, the law makes room for satire and artistic expression, creating a much-needed balance between freedom of speech and digital privacy.

Denmark is also set to take over the rotating presidency of the Council of the EU in the second half of 2025, and the government hopes to set a precedent that can be adopted across Europe. In effect, the country is redrawing the boundaries of digital personhood, and sending a strong signal to both regulators and developers around the world.

What BAC Law Students Should Take Away

For the legal eagles pursuing law at BAC, this development shows how emerging technologies are challenging "legacy" legal principles.

Where does free speech end and identity theft begin? How do we regulate non-physical harms in a digital space? And who is held accountable?

Denmark’s model brings constitutional law, IP law, privacy rights, and human rights into direct conversation with artificial intelligence, making it an excellent real-world case study for BAC students exploring Legal Systems, Cyber Law, Constitutional Law, and Technology & Regulation.

As the legal profession continues to evolve, future lawyers will increasingly need to work at the intersection of law, ethics, and digital infrastructure. Whether it’s drafting legislation, defending clients harmed by digital impersonation, or shaping policy for responsible AI, this is the legal frontier, and you are well-positioned to lead in it.

The Comp Sci Lens


A Wake-Up Call for Tech Students at UNIMY

UNIMY students, listen up. You are being trained to become the next wave of software engineers, AI developers, and computer scientists. And Denmark’s legislation serves as a powerful reminder:

Tech doesn’t exist in a vacuum.

Building AI systems requires not just technical skill, but an understanding of the ethical and legal frameworks those systems operate within. The tools we build today shape the digital world people live in tomorrow. This is a huge responsibility.

That's why UNIMY’s programmes in Computer Science, Cybersecurity, Computer Engineering, and Software Engineering are designed to equip you with both the technological expertise and the ethical mindset to navigate this complicated space.

From machine learning models and biometric data processing to platform moderation and algorithmic responsibility, you are taught to think critically about what you build, and who it affects.