Amendments Concerning Video-Sharing Platforms: What Do Moldova’s Recent Audiovisual Legislation Updates Actually Entail?

In May 2025, video-sharing platforms (hereinafter “VSPs”), a field regarded as relatively new even within the media community, became a prominent subject of discussion following proposed amendments to Moldova’s Audiovisual Media Services Code[1] (AMSC). Among other objectives, these legislative updates aim to transpose European rules on such platforms into national law.
An Overview of Video-Sharing Platforms and their Regulation
A VSP is an online service that allows users to upload, watch, share, and comment on videos generated primarily by its users. Popular examples include YouTube, TikTok, Vimeo, and Facebook Video. Moldovan law can impose regulatory obligations only on VSPs that fall within the state’s jurisdiction, specifically those headquartered in Moldova or with significant economic or organizational ties to local entities. A prominent local example of a VSP operating fully under Moldovan jurisdiction is Play.md, which lets users upload, view, and comment on videos, create their own channels, stream live content, and build playlists.
VSPs are required to create a safe and transparent environment through various measures, including content filters and warnings designed to protect minors. They must actively prevent users from accessing violent, hateful, or terrorist-related content, provide accessible mechanisms for reporting inappropriate videos, respond promptly to user complaints, and publish transparent information regarding ownership, content guidelines, and user-support procedures. Additionally, VSPs must adhere to advertising rules, including requirements related to user-generated advertising.
It is important to note that VSPs do not bear editorial responsibility—they are not legally obliged to manually verify or approve every video before it is published—as long as they have effective mechanisms enabling users to report problematic content and the ability to promptly remove or moderate such content afterward.
Moldova’s Video-Sharing Platform Legislation: Past and Present
Since its adoption in 2018, the AMSC has included general provisions addressing VSPs. However, these provisions have never been enforced in practice. Currently, the AMSC:
- Defines essential terms such as “video-sharing platform provider,” “user-generated video,” and “video-sharing platform service” (Art. 1);
- Identifies VSP providers under Moldovan jurisdiction as subjects of the Code (Art. 2(4)(e));
- Authorizes the Audiovisual Council (AC) to draft, approve, and supervise regulations and monitoring methodologies applicable to VSPs (Art. 75(3)(b) and (h));
- Grants authority to the AC to handle complaints regarding VSP activities, including those related to copyright and related rights (Art. 75(4)(c));
- Requires the AC to maintain and regularly update an official registry of media service providers, VSP providers, and distributors of media services (Art. 75(4)(j)).
A set of amendments to the AMSC, adopted in their final reading on July 10, 2025 (though not yet promulgated), updates and significantly expands the regulatory framework applicable to VSPs.
Introducing Detailed Obligations for VSPs (Chapter VIII¹)
The most notable amendment involves the addition of a new chapter (Chapter VIII¹), imposing specific and detailed obligations on VSPs under Moldovan jurisdiction. In essence, platform providers must take appropriate measures to ensure:
- Protection of minors from user-generated videos and audiovisual commercial communications likely to impair their physical, mental, or moral development, particularly materials containing pornography or unjustified violence.
- Protection of the general public from programs, user-generated videos, and audiovisual commercial communications that include hate speech or content classified as criminal offenses (e.g., public incitement to terrorism, child pornography, fascist, racist, or xenophobic content).
Furthermore, VSPs must adopt specific measures concerning advertisements and product placement. Where a platform knows, or has been notified, that an uploaded video contains advertising or sponsorship, it must clearly inform viewers. In practice, this means that content uploaders must declare any paid promotion included in their videos and that the platform must display a clear notice (e.g., “Includes paid promotion”), as illustrated in the sketch below.
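To make the disclosure flow concrete, here is a minimal TypeScript sketch. Everything in it (`VideoUpload`, `containsPaidPromotion`, `disclosureNotice`) is a hypothetical illustration, not a construct of the law or of any real platform’s API.

```typescript
// Hypothetical data model: the uploader declares commercial content at upload
// time, or the platform sets the flag after being notified of hidden advertising.
interface VideoUpload {
  id: string;
  title: string;
  containsPaidPromotion: boolean;
}

// Returns the notice the video player should display to viewers, if any.
function disclosureNotice(video: VideoUpload): string | null {
  return video.containsPaidPromotion ? "Includes paid promotion" : null;
}

// Example: a declared sponsorship triggers the viewer-facing notice.
const clip: VideoUpload = { id: "v1", title: "Gadget review", containsPaidPromotion: true };
console.log(disclosureNotice(clip)); // "Includes paid promotion"
```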
Specific Compliance Measures Mandated by Article 61¹ of the AMSC
To meet these objectives, the future Article 61¹ of the AMSC lists specific compliance measures VSPs must implement:
- Updated Terms and Conditions: Platform terms must explicitly prohibit illegal and harmful content. Community guidelines must reflect the legal obligation to prevent content negatively affecting minors, promoting hate speech, or inciting criminal activity.
- User-based Advertising Disclosure System: Platforms must implement mechanisms enabling users uploading videos to declare if their videos contain commercial communications (ads, sponsorships) when known, ensuring transparency of sponsored content.
- Illegal Content Reporting Mechanisms: Platforms must provide transparent and user-friendly reporting mechanisms, such as a clearly visible “Report” button next to each video, allowing users to flag content such as pornography, violence, or hate speech (the first sketch after this list illustrates such a mechanism, including the trusted-flagger prioritization described below).
- User Feedback and Complaint Resolution: Platforms must establish processes to communicate clearly with users regarding actions taken following content reports. Additionally, effective and accessible procedures must exist for appealing content moderation decisions (e.g., removal of user content).
- Age Verification and Parental Controls: Platforms must introduce effective age-verification systems to restrict minors’ access to potentially harmful content, alongside comprehensive parental-control tools. Such verification may require logging in to an account confirmed as 18+ or providing proof of age before accessing restricted content. For instance, YouTube announced in 2020 that it would require European users to verify their age through ID documents or credit cards if automated verification fails.
- Content Rating Systems: Platforms must enable classification (rating) of content by creators or users based on suitability (e.g., “suitable for general audiences,” “16+,” etc.). A consistent rating system helps viewers avoid unsuitable content and helps platform algorithms avoid recommending harmful content to minors (the second sketch after this list combines such ratings with the age gate described above).
- Protection of Minors’ Personal Data: Article 61¹ explicitly prohibits processing personal data collected from minors (through age-verification or parental-control mechanisms) for commercial purposes such as direct marketing, profiling, or behavioral advertising.
- Trusted Flaggers: Inspired by the EU Digital Services Act (DSA), the amendments introduce the institution of “trusted flaggers.” Platforms must prioritize reports from these certified experts and address their notifications promptly. Trusted-flagger status may be granted by the Audiovisual Council, upon request, to any individual or entity established in Moldova that demonstrates specialized knowledge and competence in identifying illegal content. The AC will develop a certification procedure and evaluate platforms’ cooperation with trusted flaggers.
- Internal Dispute Resolution Mechanisms: Platforms must implement internal mechanisms for the amicable resolution of disputes with users (e.g., over account suspensions or content removals). Users can appeal moderation decisions through these mechanisms without limiting their right to seek external legal remedies (courts, mediation). This resembles the Oversight Board created by Facebook, an independent body that reviews controversial moderation decisions; platforms may eventually be encouraged to adopt similar independent appeal bodies.
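As a purely illustrative aid, the first sketch below models, in TypeScript, a reporting queue in which notifications from trusted flaggers are processed before ordinary user reports. Every name in it (`ContentReport`, `ModerationQueue`, `fromTrustedFlagger`) is a hypothetical construct, not something the AMSC prescribes.

```typescript
// Hypothetical report model: ordinary user reports versus reports from
// AC-certified trusted flaggers, which must be handled with priority.
type ReportReason = "pornography" | "violence" | "hate_speech" | "other";

interface ContentReport {
  videoId: string;
  reason: ReportReason;
  submittedAt: Date;
  fromTrustedFlagger: boolean;
}

// A simple moderation queue: trusted-flagger reports are dequeued first,
// remaining reports in order of submission.
class ModerationQueue {
  private reports: ContentReport[] = [];

  submit(report: ContentReport): void {
    this.reports.push(report);
  }

  next(): ContentReport | undefined {
    this.reports.sort(
      (a, b) =>
        Number(b.fromTrustedFlagger) - Number(a.fromTrustedFlagger) ||
        a.submittedAt.getTime() - b.submittedAt.getTime()
    );
    return this.reports.shift();
  }
}

// Usage: the later trusted-flagger report is still handled first.
const queue = new ModerationQueue();
queue.submit({ videoId: "v7", reason: "other", submittedAt: new Date("2025-07-01"), fromTrustedFlagger: false });
queue.submit({ videoId: "v9", reason: "hate_speech", submittedAt: new Date("2025-07-02"), fromTrustedFlagger: true });
console.log(queue.next()?.videoId); // "v9"
```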
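The second sketch, under the same caveat, shows how creator-assigned ratings could feed an age gate combined with parental controls. The rating labels and age thresholds are assumptions made for illustration, not the scale the law or the AC will ultimately adopt.

```typescript
// Hypothetical rating scale, loosely following the article's examples.
type Rating = "general" | "12+" | "16+" | "18+";

const minimumAge: Record<Rating, number> = {
  general: 0,
  "12+": 12,
  "16+": 16,
  "18+": 18,
};

interface Viewer {
  verifiedAge: number | null; // null if the viewer's age is not verified
  parentalControlMaxRating?: Rating; // ceiling set by a parent, if any
}

// Restricted content is served only to viewers whose verified age meets the
// rating's threshold, and only within any parental-control ceiling.
function mayWatch(viewer: Viewer, rating: Rating): boolean {
  const required = minimumAge[rating];
  if (required === 0) return true; // open to everyone
  if (viewer.verifiedAge === null || viewer.verifiedAge < required) return false;
  if (
    viewer.parentalControlMaxRating !== undefined &&
    required > minimumAge[viewer.parentalControlMaxRating]
  ) {
    return false;
  }
  return true;
}

// Usage: an unverified viewer is denied 16+ content; a verified adult is not.
console.log(mayWatch({ verifiedAge: null }, "16+")); // false
console.log(mayWatch({ verifiedAge: 17 }, "16+")); // true
```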
The Enhanced Role of the Audiovisual Council (AC)
The legislative amendments significantly strengthen the AC’s powers to ensure compliance with the new rules for online platforms. Under the future Article 61¹(11), if a platform under Moldovan jurisdiction hosts content that violates the law (e.g., child pornography or videos inciting hatred), the AC may require the platform, within 24 hours of notification, to remove or block the content, display user warnings, or even suspend the offending user’s account for one to three months. Platforms must comply within the specified timeframe and inform the AC of the actions taken. The sketch below models this order-and-deadline flow.
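A minimal sketch, assuming hypothetical names throughout (`EnforcementOrder`, `OrderedAction`), of how a platform might represent an AC order and its 24-hour compliance window; the action types mirror the measures listed in the paragraph above.

```typescript
// Hypothetical model of an enforcement order under the future Article 61¹(11).
type OrderedAction =
  | { kind: "remove_or_block"; videoId: string }
  | { kind: "display_warning"; videoId: string; warning: string }
  | { kind: "suspend_account"; userId: string; months: 1 | 2 | 3 };

interface EnforcementOrder {
  notifiedAt: Date; // the 24-hour window runs from notification
  action: OrderedAction;
}

const COMPLIANCE_WINDOW_MS = 24 * 60 * 60 * 1000;

function complianceDeadline(order: EnforcementOrder): Date {
  return new Date(order.notifiedAt.getTime() + COMPLIANCE_WINDOW_MS);
}

function isOverdue(order: EnforcementOrder, now: Date): boolean {
  return now.getTime() > complianceDeadline(order).getTime();
}

// Usage: an order notified at noon must be executed by noon the next day.
const order: EnforcementOrder = {
  notifiedAt: new Date("2025-09-01T12:00:00Z"),
  action: { kind: "suspend_account", userId: "u42", months: 1 },
};
console.log(complianceDeadline(order).toISOString()); // "2025-09-02T12:00:00.000Z"
```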
Additionally, the new law introduces Article 61², requiring VSP providers to submit a preliminary declaration to the AC before offering services in Moldova. This declaration must include the platform’s name, address, ownership structure (for transparency purposes), a description of the service, the protection measures adopted, and the planned launch date. The AC will then enter the platform in a public registry of providers under Moldovan jurisdiction; the sketch below shows one possible shape of such a declaration.
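Purely as an illustration of the required fields, here is a hypothetical TypeScript shape for the declaration; the field names and the sample values are invented, not the official form.

```typescript
// Hypothetical shape of the preliminary declaration described in Article 61².
interface PreliminaryDeclaration {
  platformName: string;
  address: string;
  ownershipStructure: string; // disclosed for transparency purposes
  serviceDescription: string;
  protectionMeasures: string[]; // e.g., age verification, reporting tools
  plannedLaunchDate: string; // ISO date, e.g., "2026-01-15"
}

// Invented example values for a fictional local platform.
const declaration: PreliminaryDeclaration = {
  platformName: "ExampleVid",
  address: "Chișinău, Moldova",
  ownershipStructure: "100% owned by Example Media SRL",
  serviceDescription: "User-uploaded video hosting and live streaming",
  protectionMeasures: ["age verification", "report button", "content ratings"],
  plannedLaunchDate: "2026-01-15",
};
```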
This requirement primarily applies to local platforms: major global platforms (e.g., Google/YouTube, Meta/Facebook, TikTok) currently have no legal entities in Moldova, which raises jurisdictional challenges.
Challenges Ahead
While they represent significant progress, the amendments may pose certain implementation challenges, notably regarding transparency and auditing mechanisms, which the law does not yet explicitly provide for. The secondary regulations to be drafted by the AC (as foreseen in Article 61¹(13)) should therefore include periodic compliance-reporting requirements.
Moreover, implementing even minimal protective measures against harmful or illegal content demands significant resources. Current regulations do not differentiate platforms based on size, potentially disadvantaging smaller local platforms. A possible remedy would be for the AC to introduce a transition period or provide technical support to help smaller platforms achieve compliance.
This publication is the result of activities carried out within MAAM – Media Advocacy Action for Moldova: Empowering Moldova’s Public Watchdogs to Safeguard Media Freedom, a project co-funded by the Italian Ministry of Foreign Affairs and International Cooperation and the Central European Initiative. All opinions expressed represent the views of the author and not those of the co-funding institutions.