Platform privacy
Our platforms offer a high level of security and comply with current European privacy regulations. How we have implemented this on our platforms is explained below.
Version October 2024
Terms of use
The webmaster (administrator) of the platform can create terms of use, which members must then accept.
Member data
Active members For members, at least the email address is stored. This email address is needed to access the platform. In addition, members can enrich their profile with additional personal data. Often the organization where the member is employed has established guidelines for this.
Inactive members Members may be assigned “inactive” status. Inactive members no longer have access to the platform. Access can only be reactivated by the webmaster (administrator).
The profile data of inactive members are visible only to the webmaster (administrator); normal users can only see the name, profile picture and contributions made on the platform.
An inactive member can also be removed. In that case, the profile is deleted and their contributions are anonymized.
User statistics Statistics are kept on user activity on the platform. These include user contributions, such as posted files and comments, and learning outcomes. The number of visitors to pages and the interaction on those pages are also tracked.
Inappropriate content The liability and responsibility of online platform providers are regulated at the European level by the Digital Services Act (DSA). To comply with this, the platform offers members the possibility to flag content as inappropriate.
Cookies
The application uses only functional cookies (see the illustrative sketch after this list):
The email cookie makes it easier for you to sign in on your next visit.
The locale cookie stores your language preference.
The session and wax cookies are needed to keep you logged in from page to page.
The timezone cookie allows us to format dates and times for your local time zone.
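As an illustration only, the sketch below shows how such functional cookies could be read on the client and how a timezone cookie could drive local date formatting. The cookie names ("locale", "timezone") and the fallback values are assumptions based on the descriptions above; they do not document the platform's actual implementation.

```typescript
// Minimal sketch, assuming a browser context and the cookie names described above.

// Read a cookie value by name from document.cookie.
function readCookie(name: string): string | undefined {
  const match = document.cookie
    .split("; ")
    .find((entry) => entry.startsWith(`${name}=`));
  return match ? decodeURIComponent(match.split("=")[1]) : undefined;
}

// Use the (assumed) "timezone" cookie to format a date in the member's local time zone.
const timeZone = readCookie("timezone") ?? "Europe/Amsterdam"; // fallback is an assumption
const formatted = new Intl.DateTimeFormat("en-GB", {
  dateStyle: "medium",
  timeStyle: "short",
  timeZone,
}).format(new Date());

// Use the (assumed) "locale" cookie to pick the interface language.
const locale = readCookie("locale") ?? "en";
console.log(`Locale: ${locale}, current time: ${formatted}`);
```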
Infrastructure security
The platform was developed by Fellow Digitals. Fellow Digitals has been developing platforms for employee collaboration and development since 1997. Security is a high priority for Fellow Digitals: we comply with the applicable guidelines, laws and regulations in the field of information and data security. Fellow Digitals is ISO 27001, ISO 27701 and NEN 7510 certified.
For the hosting of its platforms, Fellow Digitals uses the services of Exonet, a Dutch hosting provider that uses data centers in the Netherlands. Fellow Digitals also maintains a separate fallback environment in case of calamities. Exonet is ISO 27001, ISO 9001 and NEN 7510 certified.
Explanations of AI applications
The AI Act The European Artificial Intelligence Regulation, or the Artificial Intelligence Act (AI Act), was adopted by the European Parliament and entered into force in 2024. The AI Act aims to ensure that AI systems placed on the European market and used in the EU are safe and respect EU fundamental rights and values.
The AI Act aims to create more trust in AI systems in Europe by ensuring the safety of users. The idea behind the AI Act is to regulate artificial intelligence through a risk-based approach. The AI Act classifies AI systems into four risk categories:
Unacceptable risk: These AI systems are prohibited. Examples are AI systems that manipulate human behaviour.
High risk: This category covers systems that are considered high-risk and are used, for example, in education, critical infrastructure or law enforcement. The use of these systems is subject to strict requirements.
Limited risk: An example of a limited-risk AI system is a chatbot. The provider must take into account the risks that may arise, and these systems are subject to a transparency obligation: people must be informed when they interact with a limited-risk AI system.
Minimal risk: These AI systems can be developed and used under existing legislation; the AI Act imposes no additional requirements on them. Examples of AI systems with minimal risk are spam filters and search engines.
The AI systems that Fellow Digitals makes available in its applications fall into the ‘limited risk’ category.
Functionalities In the functionalities below, Fellow Digitals provides support through AI systems. For translations, DeepL is used: an AI-powered service that generates translations based on text entered by users. The service is invoked when the user activates the translation button. Data is used only for translation and is not stored by DeepL. AI translations may contain errors and may not fully capture all the nuances or context of the original text. We recommend checking translations for accuracy, especially if the content is crucial or sensitive.
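As a minimal sketch of what such a translation call can look like, the example below uses DeepL's public REST API (v2) as generally documented. It is not the platform's actual integration: the API key, example text and target language are placeholders.

```typescript
// Illustrative only: translate a piece of user-entered text via DeepL's REST API.
async function translate(text: string, targetLang: string): Promise<string> {
  const response = await fetch("https://api-free.deepl.com/v2/translate", {
    method: "POST",
    headers: {
      "Authorization": "DeepL-Auth-Key <YOUR_API_KEY>", // placeholder key
      "Content-Type": "application/json",
    },
    body: JSON.stringify({ text: [text], target_lang: targetLang }),
  });
  if (!response.ok) {
    throw new Error(`Translation failed: ${response.status}`);
  }
  const data = await response.json();
  // DeepL returns one translation per input text.
  return data.translations[0].text;
}

// Example: triggered when the user activates the translation button.
translate("Goedemiddag, welkom op het platform.", "EN")
  .then((result) => console.log(result))
  .catch((err) => console.error(err));
```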
Transparency obligation: Explanation of use of AI in applications Fellow Digitals uses AI systems to optimise services to users. In the context of the transparency obligation from the AI Act, Fellow Digitals wants to inform its users about this as well as possible, by indicating in which functionalities AI systems are used and for what purpose. In case of questions about the use of AI in Fellow Digitals' functionalities, please contact privacy@fellowdigitals.com.
In every functionality that uses an AI system, Fellow Digitals indicates that the functionality is supported by an AI system, so that the use of the AI system is identifiable to users. Likewise, reference is made to this note, which discusses the AI Act and the parts of this legislation relevant to Fellow Digitals. Moreover, within the applications, the result generated by the AI system is marked in such a way that it is clear to the user that it was generated by artificial intelligence (‘machine-readable format’). This ensures that the data is portable and that the generated result can be exchanged between systems.
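As an illustration only, the snippet below shows one way AI-generated output could be labelled in a machine-readable format. The field names and structure are assumptions and do not describe Fellow Digitals' actual format.

```typescript
// Hypothetical machine-readable label for AI-generated content.
interface AiGeneratedContent {
  content: string;     // the generated text
  aiGenerated: true;   // explicit machine-readable flag
  system: string;      // which AI system produced it, e.g. "DeepL"
  generatedAt: string; // ISO 8601 timestamp
}

const example: AiGeneratedContent = {
  content: "Good afternoon, welcome to the platform.",
  aiGenerated: true,
  system: "DeepL",
  generatedAt: new Date().toISOString(),
};

console.log(JSON.stringify(example, null, 2));
```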
Before using AI systems, users should inform themselves about the capabilities and possible limitations of the AI system. To this end, the section ‘Functionalities’ lists the AI systems used in the applications, explaining the purpose of these systems and their limitations.
Disclaimer Fellow Digitals only uses third-party AI systems in its applications.
The user of the platform is responsible for the data they process via the available AI systems and for the correct interpretation and use of the results. Naturally, the user must at all times comply with all relevant and current laws and regulations, such as those concerning the use of AI systems and privacy (GDPR/AVG).
Fellow Digitals can never be held liable for the processing of data by AI systems in its applications or for the results they generate.
Questions?
For questions about information security, please send a message to privacy@fellowdigitals.com. To find out how the above-mentioned functionalities work, please refer to the support sites of Viadesk and Coursepath.