Eventfold · 5 min read

The EU Digital Omnibus Is About to Change the AI Act's Timeline — What Event Organizers Should Actually Do

Author: Lucas Dow

The EU Digital Omnibus on AI is in trilogue negotiations right now, and the practical effect — if and when it is adopted — is that parts of the AI Act that event organizers have been preparing for are about to get delayed. That sounds like good news. It mostly is not.

Here is why.

What Is Actually Changing

The Digital Omnibus is a legislative package that does a few things at once, but the part most relevant to event platforms is that it adjusts the implementation timeline for the EU AI Act's high-risk system obligations. Article 6(1) and the corresponding obligations were originally scheduled to apply from August 2, 2027. The Digital Omnibus is likely to shift some of the associated high-risk provisions further out.

But — and this is the part the headlines miss — the August 2, 2026 deadline for several core provisions is still on the books. Specifically:

  • Transparency obligations, including the rule that AI-generated content must be labelled as such.
  • General-purpose AI model obligations, which apply to the vendors behind most of the AI features in event platforms today.
  • Governance and penalties provisions, which define who enforces what and at what cost.

If the trilogue collapses or formal adoption slips past August 2, 2026, the original high-risk obligations also apply as written. Either way, August 2, 2026 is the operational date event organizers need to plan around.

Why Transparency Is the Provision That Actually Matters

For most event organizers, the high-risk AI classification conversation has always been slightly academic. Unless you are running a recruitment event where AI is used to evaluate candidates, or using emotion recognition on attendees (which is prohibited regardless), your AI use probably does not fall into the high-risk category.

What does apply to you is the transparency rule.

Starting August 2, 2026, any AI-generated content distributed to attendees has to be identifiable as such. That includes:

  • AI-drafted speaker biographies
  • AI-generated session summaries
  • AI-personalized attendee emails
  • AI-generated images used in event marketing
  • AI-written FAQ responses from chatbots

This is not theoretical. The vast majority of event platforms — including ours — generate or assist in generating these artifacts in 2026. If the attendee cannot tell they are interacting with AI-generated content, and the content is material to their experience, you are going to be out of compliance.
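What labelling looks like in practice is up to you; the Act requires that AI-generated content be identifiable, not any specific wording. Here is a minimal sketch of one way to attach a disclosure before content reaches attendees. The function names, data structure, and disclosure text are illustrative assumptions, not language prescribed by the regulation.

```python
# Sketch: append a human-readable AI disclosure to attendee-facing content.
# The disclosure wording below is an illustrative assumption, not mandated text.

from dataclasses import dataclass

AI_DISCLOSURE = "This content was generated with the assistance of AI."


@dataclass
class AttendeeContent:
    body: str           # the text that will be shown to attendees
    ai_generated: bool  # flag set by whatever pipeline produced the content


def render_for_attendee(content: AttendeeContent) -> str:
    """Return the content, with the AI disclosure appended when required."""
    if content.ai_generated:
        return f"{content.body}\n\n[{AI_DISCLOSURE}]"
    return content.body


bio = AttendeeContent(
    body="Dr. Example has keynoted forty conferences.", ai_generated=True
)
print(render_for_attendee(bio))  # body followed by the bracketed disclosure
```

The useful part of this pattern is the flag travelling with the content: if every AI-assisted artifact carries it from the moment of generation, the disclosure decision happens in one place instead of being remembered per email, per bio, per chatbot response.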

The fine structure under the AI Act is steep: up to €15 million or 3% of worldwide annual turnover for most violations, and higher still for prohibited practices. For an individual event organizer operating a single-country event, you are probably not the enforcement target. But the providers you rely on are, and the contractual cascade runs downhill fast.

Member State Enforcement Is Still Uneven

One of the more striking findings from the spring 2026 AI Act analyses is that only 8 of 27 EU member states have formally designated their national competent authorities. The enforcement apparatus is not evenly distributed. In some member states, you could be out of compliance today and nothing would happen. In others, the authorities are already active.

For event organizers operating across multiple EU countries — which is most of the professional conference market — this is the uncomfortable asymmetry. Your Stockholm event might face different enforcement than your Munich event, even under identical AI use.

The practical implication is that compliance cannot be per-jurisdiction. It has to be per-operation, tuned to the strictest enforcement you will face across any market you serve.
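The "strictest market wins" logic can be sketched in a few lines. The enforcement scores below are invented placeholders for illustration, not an assessment of any real member state.

```python
# Sketch: derive one operation-wide compliance bar from the strictest market
# served. The scores are illustrative assumptions, not real enforcement data.

ENFORCEMENT_LEVEL = {
    "SE": 1,  # placeholder: less active authority
    "DE": 3,  # placeholder: more active authority
    "FR": 2,
}


def compliance_bar(markets: list[str]) -> int:
    """Return the strictest enforcement level across all markets served."""
    return max(ENFORCEMENT_LEVEL[m] for m in markets)


print(compliance_bar(["SE", "DE"]))  # → 3: the stricter bar applies everywhere
```

In other words, an event series spanning Stockholm and Munich does not get Stockholm's bar in Stockholm and Munich's bar in Munich; it gets one bar, and it is the higher one.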

The Four-Item Action List

If the Digital Omnibus moves forward cleanly, some of the more onerous obligations slide out. If it does not, they do not. Either way, here is what an event organizer can do in the next few weeks that covers both scenarios.

1. Inventory your AI-generated content. Every email, landing page, speaker bio, session summary, attendee recommendation, and chatbot response that is AI-generated or AI-assisted. You probably have more of it than you think. The inventory is the prerequisite for everything else.

2. Update your attendee-facing disclosures. Registration terms, privacy notice, and event communications should explicitly acknowledge AI use. This does not have to be alarming language. It has to be clear.

3. Document your processing records. Under GDPR Article 30 you already maintain a record of processing activities. Add AI tools and providers to that record. Note the legal basis. Note whether the provider is established in the EU. This is 30 minutes of work and it is the first thing an authority asks for.

4. Confirm vendor AI Act posture. Ask your event platform vendor specifically: which of your features fall under general-purpose AI model obligations, which under transparency obligations, and which are outside the scope? The answer tells you a lot about whether the vendor has done their compliance homework. If they do not know the answer, that is the answer.
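Steps 1 and 3 above can share a single artifact: an inventory whose entries carry the fields an Article 30 record needs. The sketch below is one possible shape for that record; the field names, tools, and providers are illustrative assumptions, not a prescribed format.

```python
# Sketch: an AI-use inventory (step 1) with the fields needed to extend a
# GDPR Article 30 record of processing activities (step 3). All names and
# entries are illustrative assumptions.

from dataclasses import dataclass


@dataclass
class AIUseRecord:
    artifact: str         # what is generated, e.g. speaker bios
    tool: str             # which AI tool produces it
    provider: str         # the vendor behind the tool
    provider_in_eu: bool  # is the provider established in the EU?
    legal_basis: str      # GDPR legal basis for the processing
    attendee_facing: bool # does it reach attendees (transparency rule)?


inventory = [
    AIUseRecord("Speaker biographies", "DraftBot", "ExampleAI Ltd",
                True, "legitimate interest", True),
    AIUseRecord("Internal session analytics", "TrendTool", "ExampleAI Ltd",
                True, "legitimate interest", False),
]

# Attendee-facing artifacts are the ones that must carry an AI disclosure
# from August 2, 2026.
needs_labelling = [r.artifact for r in inventory if r.attendee_facing]
print(needs_labelling)  # → ['Speaker biographies']
```

Once the inventory exists in this shape, the other steps fall out of it: the disclosure list for step 2, the Article 30 addendum for step 3, and the per-feature questions to put to your vendor for step 4.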

Why This Is Not a Delay Opportunity

The temptation with regulatory delays is to push the work back. For the Digital Omnibus, that would be a mistake for a specific reason: the provisions most likely to be delayed are the ones least likely to apply to most event organizers anyway. The provisions that are still live on August 2 are the ones that hit everyone.

Event organizers who treat the Digital Omnibus as an excuse to defer AI Act preparation are going to spend the second half of 2026 scrambling. The ones who treat it as noise around a deadline they are already preparing for are going to be fine.

I wrote a 118-day compliance checklist earlier this year. The timeline has shifted slightly. The work has not.