At Wikimedia Foundation we make extensive use of Turnilo for data visualization (thanks!).
One of the datasets I use most in my day-to-day work is our 'netflow' data: statistics about the network traffic our routers handle.
It's natural to think of this data in terms of gigabits/second of bandwidth, or packets/second transmitted and received. However, the y-scales on Turnilo's line chart are always relative to whatever temporal granularity I've selected in the time split. I don't want to have to think about bytes per 5 minutes or packets per hour; I want to think about gigabits/second and packets/second.
If there were some way to indicate that a given measure -- or all measures in a given cube -- is best represented as a rate over time, with that unit independent of the split-by-time granularity, that'd be really nice.
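For concreteness, this is the conversion I currently do in my head. A minimal sketch (function name and figures are hypothetical, just to illustrate the arithmetic):

```typescript
// Convert a bytes-per-bucket measure to gigabits per second.
// bucketMs is the time-split granularity, e.g. 5 minutes = 300000 ms.
function bytesPerBucketToGbps(bytes: number, bucketMs: number): number {
  const bitsPerSecond = (bytes * 8) / (bucketMs / 1000);
  return bitsPerSecond / 1e9;
}

// e.g. 75 GB transferred in a 5-minute bucket is a 2 Gbit/s average rate:
// bytesPerBucketToGbps(75e9, 300000) === 2
```

The point of the request is that the chart could do this normalization itself, so the y-axis reads the same regardless of the chosen bucket size.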
I imagine this as another action on a measure (any measure? or maybe some marker in config? but what about introspection ...) in the left panel. There you would select "Rate", the series would be added to the top bar, and a popup would open where you can select a "rate base", so for example "Clicks per second" or "Clicks per minute". Under the hood Turnilo would divide the measure value by $MillisecondsInInterval and multiply by the rate base (1000 for seconds, 60000 for minutes, etc.).
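A sketch of that computation as described above (names are hypothetical; `$MillisecondsInInterval` stands for the length of the current time bucket):

```typescript
// Rate bases in milliseconds, per the proposal above.
const RATE_BASE_MS = { second: 1000, minute: 60000 } as const;

// Turn a per-interval measure value into a per-second or per-minute rate:
// divide by the interval length in ms, then scale by the chosen rate base.
function toRate(
  value: number,
  millisecondsInInterval: number,
  base: keyof typeof RATE_BASE_MS
): number {
  return (value / millisecondsInInterval) * RATE_BASE_MS[base];
}

// 3000 clicks in a 1-minute bucket is 50 clicks per second:
// toRate(3000, 60000, "second") === 50
```

With this scheme the displayed value is independent of the time-split granularity: halving the bucket size halves `value` and `millisecondsInInterval` together, leaving the rate unchanged.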