[WIP] - Victoria Metrics #27
base: main
Conversation
Hey, thanks for all the work!

Performance-wise, with an export of nearly 3 million data points, the InfluxDB RAM usage is about the same (~500 MiB) and the time to push is a tad longer (~1m30s vs 1m10s on the same export). Too bad the switch to v2 doesn't improve RAM usage or ingest time, as deploying on a Raspberry Pi with 1-2 GB of RAM is a common setup.

As stated in #20, I'm really not fluent in Flux. With some proper data available in Grafana I tried to replicate some dashboards, and I found it quite a bit more challenging to write the queries, since there is no visual builder like in v1 and the syntax is more complex. Also, for people wanting to create their own panels, the SQL-like syntax of InfluxQL is more familiar, I guess. It looks like there is a way to run InfluxQL queries against a v2 database (https://docs.influxdata.com/influxdb/v2/query-data/influxql/), which could alleviate some of the migration burden.

TBH I don't know what to think of a move to InfluxDB v2, especially with alternatives like VictoriaMetrics around (https://github.com/VictoriaMetrics/VictoriaMetrics?tab=readme-ov-file#prominent-features). On the other hand, I'm eager to switch away from InfluxDB 1.8 as it's now unsupported, the Python lib is unmaintained, etc., but right now InfluxDB v2 doesn't have enough arguments to take its place (and there seems to be a v3 on the way?). Maybe it's also me not knowing the v2 ecosystem that well. Feel free to weigh in on any of these points!
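(For anyone who wants to try the InfluxQL-on-v2 route from the link above, here is a rough sketch of what a query could look like from Python, assuming a DBRP mapping already points the old database name at a v2 bucket; the URL, token, database, measurement and field names below are placeholders, not anything from this repo.)

```python
# Hypothetical sketch only: run an InfluxQL query against an InfluxDB v2 bucket
# through the v1-compatibility /query endpoint. This requires a DBRP mapping so
# that the "db" name below resolves to a bucket. URL, token and names are placeholders.
import requests

INFLUX_URL = "http://localhost:8086"   # assumed local InfluxDB v2 instance
TOKEN = "my-api-token"                 # placeholder API token

resp = requests.get(
    f"{INFLUX_URL}/query",
    params={
        "db": "health",  # name of the DBRP mapping, not the bucket itself
        "q": "SELECT mean(bpm) FROM heart_rate WHERE time > now() - 7d GROUP BY time(1h)",
    },
    headers={"Authorization": f"Token {TOKEN}"},
)
resp.raise_for_status()
print(resp.json())  # same JSON response shape as the InfluxDB 1.x /query API
```

That route would keep existing InfluxQL-style panels usable even if the storage moved to v2.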
@k0rventen - WIP like before, but with Victoria Metrics.
Hey! I played around with your code, and after fiddling a bit with Victoria and the ingester code, I was able to import my data. It took less time (1m40 vs 2m15) while consuming at most a tenth of the RAM of InfluxDB (50 MB vs 600). That's very impressive.

On the Grafana side it's still quite blank, but I can see some of the data when doing raw requests. I see you're using the Prometheus datasource for VM. As we are now building our own Grafana image to provision the dashboards etc., would it be a bad idea to try out their own datasource (https://docs.victoriametrics.com/victoriametrics-datasource/)? I'll give it a try whenever I have some time.
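(For context on the ingest side: a single-node VictoriaMetrics accepts InfluxDB line protocol on its /write endpoint, and the stored data is then queryable through its Prometheus-compatible /api/v1/query API, which is why the stock Prometheus datasource works in Grafana. The sketch below is only an illustration with made-up metric names, not the actual ingester code from this PR.)

```python
# Illustrative sketch (not the PR's ingester): push a couple of points to a
# single-node VictoriaMetrics instance using InfluxDB line protocol on /write.
# URL, measurement and field names are made up for the example.
import requests

VM_URL = "http://localhost:8428"  # assumed single-node VictoriaMetrics

# One line per point: measurement,tag=value field=value timestamp
# VictoriaMetrics exposes this as a metric named "heart_rate_bpm" by default
# (measurement + "_" + field name), with "device" kept as a label.
lines = "\n".join(
    f"heart_rate,device=fitbit bpm={bpm} {ts_ns}"
    for ts_ns, bpm in [
        (1726480800000000000, 62),
        (1726480860000000000, 64),
    ]
)

resp = requests.post(f"{VM_URL}/write", params={"precision": "ns"}, data=lines.encode())
resp.raise_for_status()  # VictoriaMetrics answers 204 No Content on success
```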
I'm up for it; Grafana updates are going to be delayed a bit for me. I'm working on a single time-series DB to import all my Apple, Fitbit, and ResMed data too, to get a comprehensive build. When I get the data-side loading working for all three, I'll come back and clean up the dashboards.
I'm having some issues with the tests but please take a look.