Summarize metrics with random deletion
You have a metric with one result every second. You can't keep this granularity forever; it would be too big. The standard solution is to produce e.g. hourly logs with summaries, e.g. min, max, mean, p50, p99. My suggested alternative: just keep the original data points, but randomly delete some. You can then run any aggregation over them when required.
How does random deletion affect expected percentiles?
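As a sketch of the idea, here is a small simulation (with hypothetical lognormal "latency" data and an assumed keep-probability of 10%): delete each point independently at random, then compute percentiles over the survivors and compare them with the percentiles of the full data.

```python
import random

def percentile(sorted_xs, q):
    """Nearest-rank percentile of a sorted list (q in [0, 100])."""
    idx = min(len(sorted_xs) - 1, int(q / 100 * len(sorted_xs)))
    return sorted_xs[idx]

random.seed(42)

# One simulated hour of per-second metric values (hypothetical data).
full = [random.lognormvariate(0, 1) for _ in range(3600)]

# Random deletion: keep each point independently with probability `keep`.
keep = 0.1
sample = [x for x in full if random.random() < keep]

# Any aggregation can still be run over the surviving points.
for q in (50, 99):
    f = percentile(sorted(full), q)
    s = percentile(sorted(sample), q)
    print(f"p{q}: full={f:.3f} sampled={s:.3f}")
```

Because each point survives independently with the same probability, the survivors are a uniform random sample, so their order statistics are unbiased-ish estimators of the original percentiles; the tail percentiles (p99) are noisier than the median simply because fewer surviving points land in the tail.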
All content copyright James Fisher 2018. This post is not associated with my employer.