Changes at "Improvement of the Decidim’s metrics consistency throughout the application"
Title
- +{"en"=>"Improvement of the Decidim’s metrics consistency throughout the application"}
Body
Is your feature request related to a problem? Please describe
When I want to find out how many proposals exist on a platform (in this example I'll take https://ecrivons.angers.fr/), I get different answers depending on where I look. The search displays that there are about 2631 proposals. However, the statistics at the bottom of the front-office display a total of 2331. Finally, in the back-office, this number is 2575 (these numbers may change a bit over time, but the gap is there).
The conclusion is that Decidim has multiple ways to measure what's going on on a platform.
- Decidim::Search: when searching an empty string in the main search box of a Decidim instance, we use the Decidim::Search namespace, which uses a complex system of hooks triggering computations that may not have finished when the page is displayed.
- Decidim::Stats: the statistics displayed on the front-office; they are computed in real time.
- Decidim::Metrics: the main concern of this proposal, displayed in the back-office; these are the most detailed ones. Currently, like Decidim::Search, they are computed with a Rake task. This task runs every 24h by default, which can be confusing in periods of high participation (for instance, the voting phase of a participatory budget, with dozens or hundreds of votes every day); a scheduling sketch follows this list.
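To make the 24h delay concrete, here is what scheduling that task typically looks like. This is a minimal sketch assuming the whenever gem; the task name (decidim:metrics:all) and the nightly time are my assumptions and should be double-checked against your Decidim version:

    # config/schedule.rb (whenever gem), illustrative only.
    # Metrics are recomputed once per night, so back-office numbers can lag
    # up to 24h behind the real-time front-office stats.
    every 1.day, at: "3:00 am" do
      rake "decidim:metrics:all"
    end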
All of this creates many misunderstandings among administrators, who are unsure which number is the right one.
I took time to look at Decidim::Metrics, since it appeared to be the most complete of the three Ruby namespaces on the subject. What I found is that even for metrics that seem quite similar (proposals and “proposals accepted”), the ways they are counted can diverge significantly.
Let me give one example: there are 2 indicators that count proposals: a “total” one and a second that counts only accepted proposals.
The way total proposals are currently counted is the following (a sketch follows the list):
- we harvest all participatory spaces (published or not)
- we harvest all published components of these participatory spaces
- we harvest all proposals that belong to one of these components
- … and we remove the following:
  - proposals that are withdrawn (therefore, remaining proposals respect the condition state != withdrawn)
  - proposals that are moderated
  - proposals that aren't published (keep it in mind, it'll make sense just after)
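For reference, here is a minimal sketch of that query chain. It is not a verbatim excerpt from Decidim: the scope names (published, not_hidden, except_withdrawn) are assumptions based on my reading of the code, and it takes participatory processes as one example of a space.

    # Rough sketch of the current "total proposals" metric, NOT actual Decidim code.
    spaces     = Decidim::ParticipatoryProcess.all          # published or not
    components = Decidim::Component.published
                                   .where(participatory_space: spaces)
    total_proposals =
      Decidim::Proposals::Proposal
        .where(component: components)
        .except_withdrawn   # drops state == "withdrawn"
        .not_hidden         # drops moderated proposals
        .published          # drops unpublished proposals
        .count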
Now, when counting accepted proposals (sketched below):
- the harvesting process is similar, that's consistent…
- … and we remove the following:
  - proposals that are not accepted (therefore, remaining proposals respect the condition state == accepted)
- … and that's all: proposals that are moderated are also counted, and so are proposals that aren't published, which could lead to a situation where there are more accepted proposals than total proposals.
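Under the same assumptions as the previous sketch, the accepted counter boils down to something like this, with the two missing filters spelled out:

    # Rough sketch of the "accepted proposals" metric, same assumptions as above.
    accepted_proposals =
      Decidim::Proposals::Proposal
        .where(component: components)
        .where(state: "accepted")
        .count
    # Missing compared to the total:
    #   .not_hidden -> moderated proposals are still counted here
    #   .published  -> unpublished proposals are still counted here
    # which is how "accepted" could, in theory, exceed "total".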
Describe the solution you'd like
- A clear description in the documentation of how statistics are computed throughout the application and why.
- Rethink the current implementation of how these numbers are computed in order to have consistent counting for each object (i.e. proposals are counted the same way comments are counted, i.e. objects that are published, belong to a published component and are not moderated, period); a sketch of this idea follows the list. If not, detail this behavior either in the admin panel or in the documentation and explain it.
- In a bolder but more time-consuming approach, we could at the same time have options that allow admins to modify the behavior of metrics (like a tickbox that allows counting moderated objects, another tickbox that allows counting unpublished objects, etc.).
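As a sketch of what such a unified rule could look like (everything here is hypothetical, not existing Decidim code; method and scope names are illustrative only), a single counting helper shared by Search, Stats and Metrics, with opt-in flags for the bolder variant:

    # Hypothetical unified counter. One rule for every countable object
    # (proposals, comments, ...): published records, in a published component,
    # not moderated, with admin-controlled exceptions.
    def countable(scope, include_moderated: false, include_unpublished: false)
      scope = scope.where(component: Decidim::Component.published)
      scope = scope.not_hidden unless include_moderated
      scope = scope.published  unless include_unpublished
      scope
    end

    # Strict rule by default, relaxed only if an admin ticks the boxes:
    countable(Decidim::Proposals::Proposal.except_withdrawn).count
    countable(Decidim::Proposals::Proposal.except_withdrawn,
              include_moderated: true).count

The default would match what the documentation describes (published, in a published component, not moderated), and the tickboxes from the previous point would simply flip the keyword arguments.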
Describe alternatives you've considered
We currently use external software (Metabase) to build our visualizations, but not all organizations have the resources of a full-time data analyst to understand these subjects.
Additional context
Could this issue impact users' private data?
Since it's a rewrite of the current visualizations, it should normally not affect anything but the way data is currently communicated on the platform.
Funded by
- Documentation work can be done by Open Source Politics
- No funding (yet) for the correction of measures