Right to be forgotten online

Right to be forgotten and digital security: when you can erase personal data online, when you can’t, and how to protect yourself effectively.


Table of contents

  • What the right to be forgotten really means
  • Right to be forgotten and GDPR: what the law says
  • The difference between erasure and de-indexing
  • When you can exercise the right to be forgotten
  • When the right to be forgotten does NOT apply
  • Right to be forgotten and public figures
  • The role of search engines in digital security
  • Technical limits of the right to be forgotten
  • Right to be forgotten and social networks
  • The responsibility of the aware user
  • The right to be forgotten as part of a security strategy

The right to be forgotten is one of the most frequently mentioned concepts when discussing digital security, online privacy, and personal data protection, yet it is also one of the most misunderstood. Many users believe it means they can delete any information about themselves from the Internet at any time, without limits. Reality is far more complex.

This article offers an in-depth explanation of what the right to be forgotten really is, when it can be exercised, when it does not apply, and why it represents a crucial pillar of modern digital security. The goal is to provide a clear and realistic guide for digitally aware users who want to understand not only their rights, but also the legal, technical, and cultural limits of data deletion online.

What the right to be forgotten really means

The right to be forgotten emerged from the need to balance two fundamental rights: the right to privacy and the right to information. In the digital era, this balance has become extremely delicate, because the Internet does not forget and endlessly replicates information.

Legally speaking, the right to be forgotten allows individuals to request the removal, de-indexing, or restriction of processing of personal data that are no longer relevant, up to date, or necessary for the purposes for which they were originally collected.

Example
Consider an old news article about a minor crime committed decades ago. Even if accurate, it may still appear prominently in search results linked to a person’s name, creating a form of permanent digital punishment. The right to be forgotten exists to address exactly this kind of imbalance.

Right to be forgotten and GDPR: what the law says

In Europe, the main legal reference is the General Data Protection Regulation (GDPR). In Italy, the supervisory authority is the Garante per la protezione dei dati personali.

The GDPR explicitly recognises the right to erasure (Article 17), commonly referred to as the right to be forgotten. Individuals may request the deletion of their personal data when, among other situations (a short illustrative sketch follows the list):

  • the data are no longer necessary for their original purpose
  • consent has been withdrawn
  • the data have been unlawfully processed
  • erasure is required to comply with a legal obligation

It is essential to understand that this right is not absolute. The regulation itself establishes several exceptions to prevent abuse and protect the collective memory and freedom of information.

The difference between erasure and de-indexing

One of the most common misunderstandings concerns the difference between data erasure and de-indexing.

Erasure means the actual deletion of data from the servers of the data controller, such as when you close an account or delete a profile from an online service.

De-indexing, on the other hand, applies to search engines. It means that the content remains online but no longer appears when someone searches for a person’s name. This is the most well-known application of the right to be forgotten in relation to search engines.

From a digital security perspective, de-indexing significantly reduces exposure while preserving the balance between privacy and the public’s right to information.

When you can exercise the right to be forgotten

The right to be forgotten can legitimately be exercised only under specific conditions, which must always be evaluated case by case.

Example
If a piece of information is accurate but no longer relevant, and continues to harm a person’s reputation without any real public interest, a removal request may be justified.

Another common case is the withdrawal of consent. If you previously allowed a service to process your data and later change your mind, you may request deletion, unless legal retention obligations apply.

From a digital security standpoint, this right is essential to limit the circulation of personal data that could otherwise be exploited for identity theft, phishing, or unauthorized profiling.

When the right to be forgotten does NOT apply

Understanding when the right to be forgotten does not apply is just as important, and far less frequently explained.

The right cannot be exercised when data processing is necessary to protect freedom of expression and information.

Example
News of public interest, even if uncomfortable, cannot be removed simply because it damages someone’s reputation.

It also does not apply when data must be retained to comply with legal obligations, such as tax, judicial, or administrative records.

From a digital security perspective, this means not everything related to us can be “cleaned” from the web. Prevention is often far more effective than attempting deletion later.

Right to be forgotten and public figures

The boundary between individual rights and public interest becomes even thinner when dealing with public figures. Politicians, business leaders, and professionals in prominent roles enjoy a lower level of privacy protection than private citizens.

This does not mean they lose all privacy rights, but that the right to be forgotten is assessed using much stricter criteria. Information relevant to public debate is unlikely to be removed, even if old.

In terms of digital security, this translates into higher reputational exposure and a stronger need for conscious online presence management.

The role of search engines in digital security

Search engines play a central role in the practical application of the right to be forgotten. They are not responsible for the content itself, but for its indexing.

When a de-indexing request is submitted, search engines assess several factors: relevance, timeliness, public interest, the individual’s role in society, and the accuracy of the information.

From a digital security point of view, this process determines how easily personal data can be found and potentially misused.
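Before submitting a request, it can help to organise these factors in writing. The Python sketch below is only a hypothetical note-taking structure; it does not reproduce how any search engine actually weighs the criteria, and every name and value in it is invented for illustration.

from dataclasses import dataclass, asdict

@dataclass
class DeindexingRequestNotes:
    url: str                  # the result you want de-indexed
    search_query: str         # the name-based search it appears for
    relevance_today: str      # is the content still relevant?
    timeliness: str           # how old is it, and what has changed since?
    public_interest: str      # any residual public interest in the content
    role_in_society: str      # private citizen, professional, public figure
    accuracy: str             # is the information accurate and complete?

notes = DeindexingRequestNotes(
    url="https://example.com/old-article",
    search_query="Jane Doe",
    relevance_today="Refers to a situation resolved years ago",
    timeliness="Published in 2009, no follow-up coverage since",
    public_interest="No ongoing public role connected to the facts",
    role_in_society="Private citizen",
    accuracy="Accurate but outdated",
)

for factor, value in asdict(notes).items():
    print(f"{factor}: {value}")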

Technical limits of the right to be forgotten

Even when a request is approved, there are technical limitations to consider. The Internet consists of copies, caches, archives, screenshots, backups, and mirrors.

Removing content from a single source does not guarantee its complete disappearance. This is one of the reasons the right to be forgotten cannot be considered a magic solution.

It highlights the importance of preventive digital security: limiting the amount of personal data shared online in the first place.
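The point about archives is easy to verify. The sketch below, assuming Python with the third-party requests library, asks the Internet Archive’s public Wayback Machine availability endpoint whether a snapshot of a given page exists; the URL used here is only an example, and the absence of a snapshot on this one service proves nothing about other copies.

import requests

def wayback_snapshot(url):
    # Query the Wayback Machine availability API for the closest archived copy.
    resp = requests.get(
        "https://archive.org/wayback/available",
        params={"url": url},
        timeout=10,
    )
    resp.raise_for_status()
    closest = resp.json().get("archived_snapshots", {}).get("closest")
    if closest and closest.get("available"):
        return closest["url"]
    return None

snapshot = wayback_snapshot("https://example.com/removed-article")
print(snapshot or "No snapshot found here, which still does not prove no copies exist")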

Right to be forgotten and social networks

Social networks represent one of the most complex environments. Deleting a post or an account does not automatically remove all copies or shares made by other users.

Legally, platforms must remove data under their control, but they cannot intervene on content saved or reposted elsewhere.

Here, the right to be forgotten meets the limits of distributed digital security, where data control is fragmented across multiple actors.

The responsibility of the aware user

For a digitally aware user, the right to be forgotten should not be seen as a shortcut to fix careless online behaviour, but as a protective tool for specific situations.

True digital security is built on everyday choices: what we share, with whom, on which platforms, and with which privacy settings.

Understanding the limits of the right to be forgotten helps foster a culture of digital responsibility, which is essential in today’s aggressive information ecosystem.

The right to be forgotten as part of a security strategy

Integrating the right to be forgotten into a broader digital security strategy means considering it alongside other tools: data minimization, digital identity management, and online reputation monitoring.

It is not just about deleting, but about controlling and reducing exposure over time.

Conclusion

The right to be forgotten is a powerful yet limited tool, essential for protecting digital dignity but often misunderstood. It does not guarantee total erasure of the past, nor can it override public interest or legal obligations.

For aware users, understanding when it applies and when it does not is key to truly strengthening digital security, shifting from a reactive approach to a preventive one.

In a world where digital memory is potentially infinite, real protection starts with knowledge.


Questions and answers

  1. Does the right to be forgotten allow deletion of any online content?
    No, it applies only under specific conditions and includes important exceptions.
  2. Can I request removal of a true but negative article about me?
Only if it no longer serves the public interest and its continued visibility causes disproportionate harm.
  3. Is de-indexing the same as deletion?
    No, the content remains online but becomes harder to find.
  4. Are social networks required to delete everything I post?
    They must remove data under their control, not copies made by others.
  5. Does the right to be forgotten apply to public figures?
    Yes, but under much stricter limitations.
  6. How long does it take to get a response to a request?
Usually within one month, though the deadline can be extended in complex cases.
  7. Can I contact the data protection authority directly?
    Yes, if the data controller fails to respond or unjustifiably refuses.
  8. Does the right to be forgotten protect against identity theft?
    It helps reduce exposure but does not replace other security measures.
  9. Do deleted data disappear forever?
    Not always, due to backups, archives, and copies.
  10. What is the best defence for personal digital security?
    Prevention: sharing less data and doing so consciously.