Support other DB engines #423
Replies: 13 comments
-
Can you test with v1.1.14? Although I technically prepared for other DB engines to be used, I don't think it's needed after today's changes.
-
Hi Marcel. But: the waiting now happens every time I open MA, ca. 6 seconds in my case. It wouldn't be such a problem if the wait were only needed for navigation. When opening MA, I typically open the queue to see what is next. The queue now reports 'no items found' during these first 6 seconds, and only then does the queue load. That is very confusing! If there can be a waiting or loading indication, I agree this works well enough. Once you have released, I will prune my local collection to just my favorites; that will speed things up as well 😉
-
Ah, that's an easy one to fix by just loading the queue first.
-
I saw that the loading order has now changed in v1.1.18 👍
-
That has nothing to do with the database but with the browser. 1000 items within 1 second is great.
-
I was referring to opening MA in HA.
-
Ah yes, that is indeed related to DB speed. At the start of the MA panel, all items are loaded into the browser "store".
-
Another advantage of using a different DB engine like MariaDB is that the extensive writing can happen in a remote database, especially when HA Core runs on a Raspberry Pi with an SD card.
-
@groenmarsmannetje: Thanks for your remarks. You don't need to worry about extensive writes. Once the database is created, it's mainly used for lookups (reads). The write frequency of the recorder is totally different (so I use MariaDB for that as well, even on an SSD). My main reason for this request was to speed up the opening of the MA UI (it took over 1 second per 1000 tracks). That is now fully resolved by a great improvement Marcel made to load the listings in chunks: #347. There is another issue about making the DB relocatable so it doesn't get included in backups, but that doesn't require another DB engine either. This feature request is therefore low priority.
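For anyone curious what "loading the listings in chunks" means in practice, here is a minimal sketch of the idea only; `get_tracks` and the chunk size are hypothetical stand-ins, not the actual implementation from #347:

```python
# Minimal sketch of chunked listing loading; get_tracks() and chunk_size are
# hypothetical placeholders, not the real code behind #347.
from typing import Iterator, List


def get_tracks(offset: int, limit: int) -> List[dict]:
    """Placeholder for a backend call that returns at most `limit` tracks."""
    raise NotImplementedError  # stand-in for the real API call


def iter_tracks_in_chunks(chunk_size: int = 500) -> Iterator[List[dict]]:
    """Yield the library page by page so the UI can render each chunk as it arrives."""
    offset = 0
    while True:
        chunk = get_tracks(offset=offset, limit=chunk_size)
        if not chunk:
            return  # library exhausted
        yield chunk
        offset += len(chunk)
```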
-
@erkr not totally true. Earlier this week I installed this integration and it started to scan my library of 20k+ songs. After some hours it reported errors in the log that the database had become corrupt, and in the front-end I did not see the scanned songs anymore. So I removed the whole integration and will wait until a better database solution is possible. It looked great though.
-
Support for other DB engines is already there in the backend, but there simply is not yet any frontend for it. The extensive writes only happen when the DB is being populated, nothing too fancy.
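As a rough sketch of how a backend can stay engine-agnostic behind a single connection URL (assuming SQLAlchemy; `MASS_DB_URL` and both URLs are purely illustrative, not an existing Music Assistant setting):

```python
# Sketch only, assuming SQLAlchemy; MASS_DB_URL and both example URLs are
# illustrative, not an existing Music Assistant option.
import os

from sqlalchemy import create_engine

# Default: the bundled SQLite file on local storage.
DEFAULT_DB_URL = "sqlite:////config/music_assistant.db"


def get_engine(db_url: str | None = None):
    """Use whatever engine the URL points at; SQLite stays the fallback."""
    return create_engine(db_url or os.environ.get("MASS_DB_URL", DEFAULT_DB_URL))


# e.g. a remote MariaDB, so the heavy initial-scan writes land off the SD card:
# engine = get_engine("mysql+pymysql://ma:secret@dbhost:3306/music_assistant?charset=utf8mb4")
```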
-
Let's try to separate this FR from the errors you encountered parsing 20k+ songs! Please file the error logs you had in a separate issue. That should work.
-
Outdated v1 info
-
With large collections, opening the artists or tracks tab takes some time to load. Maybe it's a nice feature to support other databases if that would increase performance, like the recorder that I connected to MariaDB.