

This proposal has been implemented

Congratulations! Your proposal has been one of the most voted and you will be able to present it on Friday 20th October at Decidim Fest. You have received an email with more information and details.

Benchmarking Decidim: What Can We Learn From Similar Platforms

Simonas Zilinskas


Decidim isn't alone in the field. Other platforms have been around for years, each with its own take on features and user experience. And whether you treat them as competition or not, they're a chance for us to learn and improve.

This is why at Open Source Politics we would like to suggest a workshop to take a clear-eyed look at what others are doing right and wrong. We'll assess, discuss, and find inspiration.

We won't tiptoe around tough questions. If a platform has a feature we lack, we'll ask why, and whether we need it. If their user interface does something better than ours, we'll see what we can change. This isn't about flattery or imitation. It's about understanding our space better.

The end goal? Come out of the workshop with real, concrete ideas for how to make Decidim better. No vague concepts or grand visions—just actionable strategies we can work on right away.


  1. Exposition by OSP (30 minutes): This past summer (and for the past few years), our team at OSP conducted a comprehensive benchmarking analysis of Decidim, taking a deep dive into a selection of competing platforms. This rigorous exercise allowed us to identify a number of intriguing practices that diverge significantly from Decidim's approach. In the initial segment of our workshop, we will guide you through these noteworthy contrasts one by one.
  2. Team Assembly (5 minutes): Following the exposition, we will segment participants into three distinct breakout groups. Each team will be tasked with delving into a subset of the major deviations identified during our benchmarking process.
  3. Collaborative Analysis (30 minutes): Each group will embark on an in-depth exploration of their assigned contrasts, teasing out potential learnings and implications for Decidim.
  4. Interactive Insights Sharing (15 minutes): After the analysis, each group will take center stage to share its insights and recommendations. We ask each group to keep its presentation under 5 minutes, leaving time for a quick transition between groups.
  5. Conclusion and Path Forward (10 minutes): To conclude, we will synthesize the key points discussed during the workshop, distill the crucial takeaways, and chart a course for future action based on the collective insights and recommendations. Our aim is to turn these discoveries into clear, actionable inspiration for Decidim's continued evolution.

Language of the session: English

Logistical needs: For this workshop, we would require a projector for displaying a presentation, a wall to attach sticky notes to, and enough chairs for everyone to sit in groups of 10 around 3 big tables.

Maximum number of participants: 30

Number of people facilitating the workshop: 3
