The Directive on Copyright in the Digital Single Market might fundamentally change the way works and other subject matter protected by copyright are used on online platforms. Article 17 of the Directive obliges such platforms, based on information they receive from rightholders, to ensure that copyright-infringing uploads made by their users are prevented and/or removed through an internalised procedure over which platform operators exercise control. Important enforcement mechanisms are thereby left to private entities that alone decide what content can (or cannot) be available on their platforms. More problematically, the immense volume of uploads seems manageable only by automated AI-based filtering technologies incapable of properly distinguishing between lawful and unlawful uses, which creates a serious danger of overblocking perfectly legitimate content and thus of significant limitations of fundamental rights such as the right to freedom of expression. At the same time, Article 17 also prohibits general monitoring of the content available on platforms and clearly mandates safeguarding users' rights protected by exceptions and limitations to copyright, such as quotation, criticism and review, or caricature, parody and pastiche. This obligation of result leads to an unsolvable conflict and creates great difficulty for Member States in implementing the provision in a manner compatible with fundamental rights. Moreover, the lack of clarity of the provision has led Member States to interpret it in significantly diverging ways, jeopardising the objective of harmonisation behind the Directive. The interpretation guidelines from the European Commission provided for in Article 17(10) to help Member States in their implementation efforts were therefore eagerly awaited, in particular since the provision is facing an action for annulment before the Court of Justice of the European Union for potentially violating the right to freedom of expression. The Guidance on Article 17, finally issued by the Commission in June 2021 after several postponements, provides some useful clarifications and certain safeguards for users of protected works, but unfortunately does not depart from a system based on monitoring and automated filtering and is thus likely to fail to protect fundamental rights in an appropriate manner. This paper analyses the main additions and proposed interpretation tools that the Guidance brings to platforms' content moderation as mandated by Art. 17 CDSM. It argues that, in order to establish a virtuous content moderation system and to help Member States implement Article 17 in a balanced way, it would have been essential to address the more fundamental concerns raised by Article 17, in particular the fact that privately operated algorithmic tools, rather than independent assessors guided by copyright law's equilibrium, decide what content should be available online, and to acknowledge the inherent limits and flaws of the technology. We conclude that, in the absence of a truly independent arbiter between the interests of users, platforms and rightholders, Article 17 of the Directive is unlikely to comply with European fundamental rights and the basic principles of EU law.