A window of opportunity at risk: Germany’s implementation of the EU Platform Work Directive 2024/2831

Platform work and its precarious forms have become the focus of a contested debate, spanning remote and location-based on-demand labour (Hoose et al., 2025). Scholars, trade unions, employer associations, and policymakers at both EU and national levels advance divergent views on how platform labour should be classified, regulated, and governed. Yet the adoption of Directive 2024/2831 provides governments with a renewed opportunity to improve the working conditions of platform workers beyond the Directive’s minimum standards. After a lengthy legislative process, Member States must transpose its provisions into national law by December 2026. According to EU institutions, the Directive aims to enhance protections for more than 28 million people working through digital labour platforms.

Platform work, precarity, and regulatory challenges: a scholarly perspective

From a scholarly perspective, the legal challenges of platform work, including the lack of minimum wages and access to social security, have been widely criticised. Pongratz and Bormann (2017) and Serfling (2019), for example, call for stronger regulation to ensure fair working conditions, a position also supported by recent empirical findings (Hertwig et al., 2025; Sieker, 2022). Digitally mediated labour markets introduce significant structural challenges. While platforms promise flexibility and accessible income opportunities (Hertwig & Papsdorf, 2022), they typically rely on opaque algorithmic management, limited procedural protections, and the classification of workers as independent contractors. As a result, many platform workers are excluded from employment-based rights such as social security and collective representation (Cohen, 2017; Gruber-Risak, 2018).
Legal scholarship has emphasised that digital labour platforms function not merely as intermediaries but as market organisers that structure access to work, set binding rules, and exercise indirect control through algorithmic management (Kocher, 2022). This organisational power, combined with the absence of corresponding accountability mechanisms, contributes to structural precarity and asymmetric power relations between platforms and workers (Witzak & Hertwig, 2025; Salehi et al., 2015). Key governance problems include limited transparency in algorithmic decision-making (Waldkirch et al., 2021), the absence of effective appeal mechanisms (Kocher, 2022), and unpaid forms of labour, such as task searching, which significantly reduce workers’ effective wages (Toxtli et al., 2021). Recent findings on Germany by Beckmann et al. (2024) highlight the need for institutionalised state protection, mandatory social security contributions, solidarity-based structures, and stronger regulation of platform operators to reduce risks for platform workers. In this context, Hertwig et al. (2025) stress that the effectiveness of the Directive will depend decisively on national transposition, particularly on how far existing labour law jurisprudence, such as the Federal Labour Court’s focus on platform control and steering of work, is incorporated.

Transposing the Platform Work Directive: beyond status-based regulation

The Directive establishes minimum standards to address employment misclassification (i.e. the classification of platform workers as self-employed despite substantial platform control), algorithmic management, and enforcement deficits in platform work. Its central legal innovation is a rebuttable presumption of an employment relationship (i.e. a legal default that an employment relationship exists unless the platform proves otherwise) where a digital labour platform exercises control over work performance.
While earlier drafts specified concrete indicators of control, the final text leaves key implementation choices to the Member States in defining the conditions under which a platform is to be regarded as an employer. Beyond status, the Directive regulates how platforms use algorithms to manage work by requiring transparency and ensuring that decisions with major consequences for working conditions are taken or reviewed by humans. It also strengthens procedural rights and involves social partners, setting EU-wide minimum standards while leaving implementation largely to Member States.

In my reading, the debate on the transposition of the Directive has increasingly narrowed to questions of legal classification and the precise definition of control criteria. While this discussion is undoubtedly important, it risks obscuring the broader institutional stakes of implementation. Comparative evidence shows that platform workers’ access to social protection differs markedly across Member States due to nationally specific insurance schemes, eligibility thresholds, and administrative arrangements, meaning that the Directive will interact with highly heterogeneous institutional contexts across Europe (De Becker et al., 2024). Against this backdrop, in a country like Germany, with comparatively high labour standards and a long tradition of social partnership, transposition is not merely a technical exercise. It constitutes a strategic decision about whether digital labour markets are to be integrated into the existing regulatory framework or treated as a lasting exception. Given the extensive empirical evidence on precarious platform work, Germany should use the Directive to extend established labour standards into the digital economy. In my view, a scientifically robust transposition would move beyond status alone and instead focus on five interrelated regulatory dimensions.

First, functional status and social protection.
The Directive rightly introduces a rebuttable presumption of employment where platforms exercise control. National implementation should ensure that social security obligations follow functional dependency, regardless of contractual labels. This includes extending minimum labour standards and social protection mechanisms to platform workers whose economic risks are structurally shifted onto them, even when formal self-employment persists.

Second, algorithmic management must be treated as a regulatory object. Digital platforms govern work primarily through algorithms that allocate tasks, set prices, evaluate performance, and impose sanctions. These systems are not neutral technologies but instruments of control. A credible transposition must therefore combine transparency obligations with enforceable procedural rights: access to meaningful explanations, human review of automated decisions, and safeguards against opaque deactivation. Algorithmic governance must be recognised as a form of private regulation subject to public oversight.

Third, economic fairness requires recognising hidden labour and differentiated platform models. Empirical research shows that a substantial share of platform work consists of unpaid labour: searching for tasks, waiting for assignments, coordinating work, and managing reputational systems. Ignoring these activities systematically underestimates working time and depresses effective hourly earnings. Regulation must therefore account for platform-induced waiting and search times, particularly in microtask environments. At the same time, platform work is highly heterogeneous. Microtask platforms and project-based platforms generate fundamentally different risks, calling for differentiated regulatory responses rather than one-size-fits-all solutions.

Fourth, collective capacity and procedural governance are indispensable. Individual…
