Traficom | Finnish Transport and Communications Agency

Obligations for online platforms

In their operations, online platforms must comply with the obligations under the Digital Services Act (DSA), which entered into force on 17 February 2024. Traficom is the main supervisory authority for the regulation in Finland. For multinational platform giants, the obligations entered into force in the summer of 2023.

Under the regulation, online platforms are a subset of intermediary services. Certain obligations apply to all intermediary services, but online platforms also have their own special obligations. However, these special obligations do not apply to micro and small enterprises.

Common obligations of online platforms and hosting services

In their terms and conditions, service providers must include information on any restrictions that they impose on the information provided by the recipients of the service. The terms and conditions must be clear and easy to understand, and they must also be published in a machine-readable format. Intermediary services must also inform users of any significant change to the terms and conditions. In addition, special attention must be paid to the fundamental rights and freedoms of users when the restrictions are applied and enforced. If the service is mainly used by minors, the terms and conditions must be written so that minors can understand them.

At least once a year, service providers must publish reports on any content moderation that they engaged in during the relevant period. At minimum, the information specified in more detail in Article 15 of the Digital Services Act (DSA) must be mentioned in the report. The reporting obligation does not apply to micro or small enterprises.

The online platform must provide a mechanism that allows anyone to submit a notice concerning illegal content. When the online platform receives a notice, it must:

  • send a confirmation of receipt, if the party that submitted the notice provided their contact details
  • process the notice and decide on the suspected illegal content quickly and diligently
  • notify the submitter of its decision and of the available possibilities for redress
  • state whether automated means were used to process the notice or to make the decision.
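The notice-handling steps above can be sketched as a simple workflow. This is a minimal illustration, not a reference implementation: all class, function and field names are hypothetical, and the content review is a placeholder for the platform's own moderation policy.

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class Notice:
    """A notice of suspected illegal content (hypothetical model)."""
    content_id: str
    explanation: str
    submitter_email: Optional[str] = None  # contact details are optional

def review_content(content_id: str, explanation: str) -> bool:
    """Placeholder review: a real platform would apply its own
    moderation policy and the applicable law here."""
    return "illegal" in explanation.lower()

def handle_notice(notice: Notice) -> dict:
    """Process a notice following the four steps listed above and
    return a summary of the actions taken."""
    actions = {
        "confirmation_sent": False,
        "decision": None,
        "decision_notified": False,
        "automated_means_disclosed": True,  # step 4: always disclose
    }

    # Step 1: confirm receipt, but only if contact details were provided.
    if notice.submitter_email:
        actions["confirmation_sent"] = True

    # Step 2: review the reported content quickly and diligently.
    is_illegal = review_content(notice.content_id, notice.explanation)
    actions["decision"] = "remove" if is_illegal else "no_action"

    # Step 3: notify the submitter of the decision and redress options.
    if notice.submitter_email:
        actions["decision_notified"] = True

    return actions
```

The confirmation and decision notifications are conditional on the submitter having provided contact details, mirroring the first bullet above.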

If an online platform restricts the use of the service because the user has provided illegal content or does not comply with the terms and conditions, the online platform must state the reasons for the restriction. A statement of reasons is required only if the online platform has the user’s contact information and the matter does not involve spam.

The reasons must be stated no later than the day on which the restriction takes effect, regardless of why or how the restriction has been implemented. However, the obligation to provide reasons does not apply when a national judicial or administrative authority has ordered the online platform to act against the illegal content.

The service provider’s restrictions may include:

  • removal of content, disabling access to content, or demoting content
  • suspension, termination or other restriction of monetary payments
  • suspension or termination of the provision of the service in whole or in part
  • suspension or termination of the account

The statement of reasons must identify: 

  • what the decision concerns and, where relevant, its territorial scope and duration
  • the facts and circumstances on which the decision is based 
  • whether automated means have been used 
  • concerning illegal content, a reference to the legal ground relied on and the reason why the information is considered to be illegal content on that ground
  • concerning the incompatibility with the terms and conditions, a reference to the contractual ground and reasons 
  • information on the possibilities of redress, meaning an internal complaint-handling system, out-of-court dispute settlement and judicial redress.
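The required contents of a statement of reasons can be captured as a simple record with a completeness check. This is an illustrative sketch under the assumption that every decision must cite either a legal ground (illegal content) or a contractual ground (terms and conditions); the field names are hypothetical.

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class StatementOfReasons:
    """Items a statement of reasons must identify (hypothetical field names)."""
    decision: str                        # what the decision concerns
    territorial_scope: Optional[str]     # where relevant
    duration: Optional[str]              # where relevant
    facts_and_circumstances: str         # basis of the decision
    automated_means_used: bool           # whether automation was involved
    legal_ground: Optional[str] = None       # for illegal-content decisions
    contractual_ground: Optional[str] = None # for terms-and-conditions decisions
    redress_options: tuple = (
        "internal complaint-handling",
        "out-of-court dispute settlement",
        "judicial redress",
    )

def validate(sor: StatementOfReasons) -> list:
    """Flag missing items: facts must be stated, and the decision must rest
    on either a legal ground or a contractual ground."""
    problems = []
    if not sor.facts_and_circumstances:
        problems.append("missing facts and circumstances")
    if sor.legal_ground is None and sor.contractual_ground is None:
        problems.append("missing legal or contractual ground")
    return problems
```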

If the online platform discovers any information giving rise to a suspicion that a criminal offence involving a threat to the life or safety of a person has taken place or will take place, the platform is obliged to notify the police about the matter. 

The service provider must react quickly if it receives an order from a national judicial or administrative authority to act against illegal content or to provide information about certain users of the service. The service provider must inform the authority that issued the order of the actions it has taken as a result of the order, and of whether and when the order was implemented.

The service provider must also inform the service user of the order and of the actions taken as a result of it. The notification must be given at the latest when the actions are implemented, or by the time specified by the authority. Users must also be informed of the reasons for the actions, the appeal possibilities and the territorial scope of the order.

All service providers must publish information about the contact point, which allows national or EU authorities to communicate directly and electronically with the service provider. In addition to the contact information, it must be stated in which languages the authorities can be contacted. In addition to the national languages, the contact point must work in English or another language commonly spoken in Europe.

Service providers must also publish contact information so that users of the service can contact them. Users must be offered the opportunity for user-friendly, electronic, direct and fast communication with the service provider. A means of contact that is not wholly automated must also be offered.

The published contact information for contacting the authorities and users must be kept up to date.

Obligations that only apply to online platforms 

The online platform must provide users with an electronic internal complaint-handling system, free of charge, that remains available for at least six months from certain decisions made by the online platform.
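The minimum six-month availability period reduces to a simple date comparison. In this sketch the six months are approximated as 183 days; the exact counting rule is an assumption and would in practice follow the applicable legal rules on time limits.

```python
from datetime import date, timedelta

# Assumption: six months approximated as 183 days for illustration.
COMPLAINT_WINDOW = timedelta(days=183)

def complaint_in_window(decision_date: date, complaint_date: date) -> bool:
    """Return True if the complaint arrives within the minimum period during
    which the internal complaint-handling system must remain available."""
    return decision_date <= complaint_date <= decision_date + COMPLAINT_WINDOW
```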

The possibility to lodge a complaint must be provided for a decision made by the online platform concerning a notice on suspected illegal content. There must also be a possibility to lodge a complaint on a decision to restrict the use of the service based on the illegal content of the information provided by the recipients or incompatibility with the terms and conditions of the online platform. 

Online platforms must handle the complaints in a timely, non-discriminatory, diligent and non-arbitrary manner. If the complaint has sufficient grounds, the online platform must also reverse its decision without delay.

The online platform must make the decisions under the supervision of appropriately qualified staff, and not solely on the basis of automated means.

The online platform must inform complainants of its decision and of the possibility of out-of-court dispute settlement and other available possibilities for redress.

Trusted flaggers

Upon application, Traficom can grant the status of trusted flagger to any entity that meets the specified requirements.

Online platforms must ensure that notices of suspected illegal content submitted by trusted flaggers are given priority and are processed and decided upon without delay.

If, based on the information available to the online platform, a trusted flagger has submitted a significant number of notices that are inaccurate, insufficiently precise or inadequately substantiated, the online platform must notify Traficom.

An online platform provider must suspend, for a reasonable period of time and after issuing a prior warning, the provision of its services to users that frequently provide manifestly illegal content. It must likewise suspend, for a reasonable period of time and after issuing a prior warning, the processing of notices and complaints submitted by parties that frequently submit manifestly unfounded notices or complaints.

When deciding on suspension, the online platform must assess, on a case-by-case basis and in a timely, diligent and objective manner, whether the private individual or entity is engaging in misuse. The online platform must take into account all relevant facts and circumstances that are available, such as:

  • the absolute numbers of items of manifestly illegal content or manifestly unfounded notices or complaints, submitted within a given time frame;
  • the relative proportion thereof in relation to the total number of items of information provided or notices submitted within a given time frame;
  • the gravity of the misuse, including the nature of illegal content, and of its consequences; 
  • where it is possible to identify it, the intention of the private individual or the entity.
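The four factors above could be combined in code along the following lines. This is only an illustrative heuristic: the thresholds and scoring are invented for the example, and the DSA requires a case-by-case, objective assessment rather than any fixed formula.

```python
def assess_misuse(flagged_items: int, total_items: int,
                  gravity_weight: float, intent_shown: bool) -> bool:
    """Combine the listed factors into a single judgement.

    flagged_items  -- absolute number of manifestly illegal items or
                      manifestly unfounded notices in the time frame
    total_items    -- total items or notices submitted in the same frame
    gravity_weight -- severity of the misuse and its consequences (0..1)
    intent_shown   -- whether an intention to misuse is identifiable
    """
    if total_items == 0:
        return False
    proportion = flagged_items / total_items  # relative proportion of misuse

    score = 0
    if flagged_items >= 20:    # illustrative absolute-number threshold
        score += 1
    if proportion >= 0.5:      # majority of the submissions are misuse
        score += 1
    if gravity_weight >= 0.7:  # serious content or consequences
        score += 1
    if intent_shown:
        score += 1
    return score >= 2  # at least two factors point to misuse
```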

The online platform provider must explain, in a clear and detailed manner, in its terms and conditions its policy on misuse and give examples of the facts and circumstances that it takes into account in its assessment.

In addition to the common reporting obligations of online platforms and hosting services, the obligations of online platforms include:

  • reporting on the disputes handled by out-of-court dispute settlement bodies and their outcomes, as well as on suspensions of the provision of the service and on the handling of notices and complaints
  • publishing information on the monthly number of users of the service in its online interface
  • notifying the Commission’s Transparency Database of its decisions on restrictions.

Providers of online platforms are not allowed to design, organise or operate their online interfaces in a way that deceives or manipulates the users of their service or that otherwise materially distorts or impairs their ability to make free and informed decisions. This prohibition concerning the so-called dark patterns is monitored by the Consumer Ombudsman when the online platform provides the service to consumers.

The monitoring of advertising has been divided between the Consumer Ombudsman and the Data Protection Ombudsman. The Consumer Ombudsman monitors the identifiability and transparency of advertising when the matter involves commercial advertisements from a trader to a consumer. The Consumer Ombudsman also monitors the online platform’s obligation to provide a function for declaring commercial communications.

The Data Protection Ombudsman monitors the identifiability and transparency of advertising when social or political advertising is involved. The advertisements must state clearly how the recipients of the advertising have been determined and how the user can change these parameters. In addition, the Data Protection Ombudsman monitors that advertisements are not presented based on profiling that uses special categories of personal data, such as information on health, political opinions and ethnic origin.

Online platforms that use recommender systems must describe the main parameters used in their recommender systems, as well as any options to modify them, to the recipients of the service in their terms and conditions in plain and intelligible language. The obligation is monitored by the Data Protection Ombudsman.

Online platforms must put measures in place to ensure the privacy, safety and security of minors on their services. The prohibition against presenting advertisements based on the profiling of minors applies to online platforms. The obligation is monitored by the Data Protection Ombudsman.

Obligations of online marketplaces

Online platforms also include online marketplaces where consumers and traders conclude distance contracts. Private individuals also do business with each other on these platforms, but the regulations do not apply to those cases. Online marketplaces include sites for selling items as well as travel and hotel reservation sites, among others. In addition to the obligations on hosting services and online platforms, online marketplaces have their own additional obligations.

In addition to information needed to identify the trader, online marketplaces must require that traders provide the company’s payment account details and a self-certification by which the trader commits to offering only products or services that comply with the applicable regulations. The online marketplace must verify the reliability of the information provided by the trader. If there is a justified reason to doubt the accuracy of the information, the online marketplace must ask the trader to correct information that is inaccurate, incomplete or out of date. If the trader fails to correct the information, the online marketplace must suspend the provision of its service to that trader without delay.
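The collect / verify / correct / suspend flow described above can be sketched as follows. The data fields and outcomes are hypothetical, and the reliability check is a stand-in for the marketplace's own verification against, for example, official registers.

```python
from dataclasses import dataclass

@dataclass
class TraderInfo:
    """Information an online marketplace collects from a trader (illustrative)."""
    name: str
    payment_account: str
    self_certification: bool  # commits to offering only compliant products

def onboard_trader(info: TraderInfo, looks_reliable) -> str:
    """Sketch of the verify / correct / suspend flow.

    `looks_reliable` is a callable standing in for the marketplace's own
    checks on the accuracy of the trader's information.
    """
    # Required information must be present before the trader can sell.
    if not (info.name and info.payment_account and info.self_certification):
        return "request missing information"

    # If the information cannot be verified as reliable, the trader is asked
    # to correct it; failing that, the service must be suspended.
    if not looks_reliable(info):
        return "request correction or suspend"

    return "approved"
```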

The traceability of traders is mainly monitored by Traficom. The Consumer Ombudsman monitors that the online platform has made enough information about the trader accessible to the recipients of the service. The information must be published at least on the online interface of the online platform where the information on the item or service is provided.       

The online marketplace must design its online interface in a way that enables traders to comply with the obligations concerning them. These refer to pre-contractual information, compliance and product safety information. The online marketplace must also assess in advance whether the traders have provided the information in accordance with the regulations and carry out random checks after the fact in order to identify illegal products or services.       

The online marketplace is obliged to inform consumers if illegal products or services have been offered via the platform. The notice must state that the product or service is illegal. Information on the identity of the trader and potential means of redress must also be provided at that time. If the online marketplace does not have the contact information of the consumers, it must publish the information in question in its online interface. This obligation of the online marketplace is monitored by the Consumer Ombudsman.

Obligations and monitoring of multinational platform giants

Online platforms also include multinational platform giants, i.e. very large online platforms and search engines. These are services with more than 45 million monthly users within the EU, such as Facebook, Instagram, Google Search, TikTok and the Amazon online store. The European Commission designates the operators and services that meet the definition of a very large online platform. In addition to the obligations that apply to all online platforms, a set of additional obligations has been imposed on multinational platform giants.

The operation of the platform giants is monitored by the Commission together with the authorities of the country in which the main establishment of the platform giant in question is located. None of the platform giants have a main establishment in Finland.

Further information

Do you have questions regarding the implementation of the Digital Services Act? You can contact us by filling in the form.

Traficom's role in the new regulation about digital services and data

Regulation (EU) 2022/2065 of the European Parliament and of the Council on a Single Market For Digital Services and amending Directive 2000/31/EC

(EU) 2022/2065, valid from 19 October 2022
