export results support #55
This is implemented in my dev branch, but conflicts with the chunked loading. Not sure what the interaction should be (we could load all chunks on export, but it would be memory-expensive).
Thanks for the implementation, I'll merge it manually somehow. Chunked load fills the model with 100 rows at a time. Each time a new chunk is extracted from the results, rows get appended to the model. For extract, instead of appending, these 100 rows could be replaced. Memory in this case would only need to be as big as those 100 rows. Two considerations for this case:
If we want to be more adventurous, extracts should not be limited to one per tab, but handled more like a job queue. This means that another tab next to the results grid and messages should be added, and so on. For this we'd need a bit more work, since closing a connection or a tab should cancel subsequent operations... My implementation is not the best in this regard.
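The replace-instead-of-append idea above can be sketched without any Qt specifics. This is only an illustration of the bounded-memory windowing, with a plain vector standing in for the QStandardItemModel; `Window`, `Row`, and `replaceWith` are hypothetical names, not code from either branch:

```cpp
#include <cassert>
#include <cstddef>
#include <string>
#include <vector>

// Stand-in for one fetched result row.
using Row = std::vector<std::string>;

// Hypothetical fixed-size window over the full result set: instead of
// appending every chunk to the model, the same 100 slots are reused.
struct Window {
    static constexpr std::size_t kChunk = 100;
    std::vector<Row> rows;          // never grows past kChunk

    // Replace the window contents with the next chunk. Memory stays
    // bounded by kChunk rows no matter how large the result set is.
    void replaceWith(std::vector<Row> chunk) {
        assert(chunk.size() <= kChunk);
        rows = std::move(chunk);    // old rows freed, new chunk takes their place
    }
};
```

With this shape, the extract loop keeps calling `replaceWith` until the source is exhausted, and peak memory stays at one chunk.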
Just looking at your implementation of Query in threading: since Query::run() adds the database, executes the query, and then removes the database, how would this work with forward-only queries initiated by the GUI thread, which only return a row on QSqlQuery::next()? Anyhow, export needs to be done in its own thread so Csv can initiate its own Query and Database connection.
In addition, I was thinking about a bigger use case for later, so I just want to drop ideas here. Use case:
An idea for how the described use case could be achieved:
The reason this management of established connections is mentioned so many times is that a database connection can be removed only by the thread that created it. Thoughts? Would this be too complicated?
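One common way around the "only the creating thread may remove the connection" rule is to give every worker thread its own uniquely named connection and tear it down in the same thread that added it. A Qt-free sketch of just the naming part; `connectionName` is a hypothetical helper, and in Qt the resulting name would be passed to QSqlDatabase::addDatabase and later QSqlDatabase::removeDatabase from within the worker:

```cpp
#include <sstream>
#include <string>
#include <thread>

// Build a connection name that is unique per thread, so each worker
// adds and later removes its own QSqlDatabase instead of sharing one
// connection across threads (which QSqlDatabase does not allow).
std::string connectionName(const std::string& base) {
    std::ostringstream out;
    out << base << '_' << std::this_thread::get_id();
    return out.str();
}
```

Each job then opens, uses, and removes its connection entirely inside its own thread, which sidesteps the removal restriction.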
It actually is forward-only right now, but it lacks a few things that I haven't implemented yet.
This setup lets you have forward-only, UI-responsive, quick-closing, single-run, memory-light queries. The one drawback to this approach is that there is no good way to pause the data-to-disk flow (other than killing the query), so if you don't have a disk large enough to hold the result data you are out of luck.
The only gotcha I can think of with the ExportJobs is when queries are not idempotent.
Having export repeat the query would cause a change to the database, which is not what the user expects to happen. So long as it's more "query to file" instead of "export grid to file", it should be fine.
I think I missed one fact, please correct me if I'm wrong: the query opens a connection, pulls the whole rowset into the model, then closes the connection.
If we do this, I'd very much like to have it configurable. The reason is, I'm testing this on a huge table, so whenever I run something it'll pull 3+ GB from the db to the client, which I don't want. So, several options here:
This applies to how we get the results, but I'm also thinking about how to decide between a QtConcurrent::run and a QThread implementation.
This is true, but I think whoever does this must be aware that anything but a SELECT will have an impact on the db state. So, different buttons, or behavior defined through settings?
It makes sense to be defined like that. What I did was a quick win; I didn't yet use delegates, so I'd have to spend a lot of time to get it working.
Anyhow, I'll try to patch something up to make your Csv class work in the simplest way, so at least some export functionality is working.
FYI I've added export-to-clipboard support to my branch asw-dev@137f175. The change has 2 parts: capturing Ctrl+C events on the grid and a "full" export on button click. In each case I'm assuming that headers should be included (but that could be an option on a tool button). As you are currently working on the export code, I figured it would be best to let you handle this merge.
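The "grid selection to clipboard text" part can be sketched like this, assuming the usual tab/newline clipboard format spreadsheets expect. The grid is a plain vector here; in the real code the data would come from the QStandardItemModel and the result would go to the application clipboard. `selectionToClipboard` and its parameters are hypothetical names:

```cpp
#include <cstddef>
#include <string>
#include <vector>

using Grid = std::vector<std::vector<std::string>>;

// Serialize headers plus the selected row range as tab-separated text,
// one line per row: the format users expect when pasting a grid into a
// spreadsheet. Header inclusion is a flag so it can later be driven by
// a tool-button option.
std::string selectionToClipboard(const std::vector<std::string>& headers,
                                 const Grid& rows,
                                 std::size_t first, std::size_t last,
                                 bool includeHeaders = true) {
    std::string out;
    auto appendRow = [&out](const std::vector<std::string>& row) {
        for (std::size_t i = 0; i < row.size(); ++i)
            out += (i ? "\t" : "") + row[i];
        out += '\n';
    };
    if (includeHeaders) appendRow(headers);
    for (std::size_t r = first; r <= last && r < rows.size(); ++r)
        appendRow(rows[r]);
    return out;
}
```

The Ctrl+C path and the button path could then share this one serializer and differ only in how the row range is chosen.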
Sure, no prob. The only thing is, this might take a while because of that rewrite. I guess all can be taken in from your commit except the change in the .pro file, since there is no Query class anymore in master. The idea was to have a class that would manage all stuff related to queries, since we have 4 happening for every tab:
All of these should be running over their own connection/thread. I'm just having issues designing this, since this damn QSqlDatabase::database() won't work if the database is created in another thread, and it needs to stay open since rows are taken chunk by chunk, 100 at a time.
Point is, if you want to merge it into master, I have no issues with that. I can rebase my branch since I did reset all the work I did, so no worries.
I tried to make a class Query and move it to its own thread (to standardize export and UI queries), but for some reason triggering a different query on the same object didn't work. I tried different things and nothing worked. I will probably have to reset again.
Export now works. The bad:
The good:
As a learning process this is somehow OK, but the result of what I did is a PoS. It needs to be simplified and streamlined a lot, and with this we come back to the start of the refactoring headache.
The last post was addressed in #47, but a bit more work is needed before this can be closed. When the refactoring of the Query part is done, I'd like to add a dialog here and then push parameters into Csv. Parameters would be:
If possible, this dialog should display exactly how the output would look (it would have a text preview below the settings). Since c/p takes data from the QStandardItemModel, I'm not sure if we can distinguish which datatype a value is.
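On the "can we distinguish which datatype it is" question: once values sit in the model as display text, the original type is gone unless it is stored alongside, e.g. in a user role next to the display role. A sketch of that idea with a hypothetical `TypedCell`; none of these names exist in the codebase:

```cpp
#include <string>

// What the driver would normally tell us via the value's type, kept
// explicitly next to the display text so copy/paste and export code
// can still quote strings but leave numbers bare.
enum class CellType { Text, Number, DateTime };

struct TypedCell {
    std::string display;   // what the grid shows (the display role)
    CellType type;         // hypothetical extra tag (e.g. a user role)
};

// Format one cell for CSV: quote text, pass numbers through untouched.
std::string toCsvField(const TypedCell& cell) {
    if (cell.type == CellType::Number) return cell.display;
    return "\"" + cell.display + "\"";
}
```

If the tag is written into the model at fetch time, both the preview in the dialog and the final export can format per type instead of guessing from the string.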
This was done in the refactoring2 branch, but still pending:
In the locale branch, different locales are now supported, as well as custom date/time formats. So, todos before this issue can be closed:
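For the custom date/time format part, a minimal Qt-free sketch of rendering a timestamp with a user-chosen pattern. In Qt this would be QDateTime::toString with a format string plus QLocale; `formatTimestamp` is a hypothetical helper using strftime-style patterns instead:

```cpp
#include <ctime>
#include <iomanip>
#include <sstream>
#include <string>

// Render a timestamp with a user-chosen strftime-style pattern, so an
// export dialog can offer custom date/time formats.
std::string formatTimestamp(std::time_t t, const std::string& pattern) {
    std::tm tm = *std::gmtime(&t);  // UTC; std::gmtime is not thread-safe, fine for a sketch
    std::ostringstream out;
    out << std::put_time(&tm, pattern.c_str());
    return out.str();
}
```

The pattern string is exactly what the dialog would collect from the user and push into Csv along with the other parameters.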
The latest commit solves the 2 mentioned bullets. Still remaining:
This was more or less done. The results somehow cover what I had in mind, so the question now is: do we need to do more before this can be closed?
I say it's good as is. Can always create a new case for anything more. |
To improve it, I think the following should be done (together with that data formatter):
All in all, this c/p thing needs to be updated/cleaned/refactored, I think.
The button currently exists but is not hooked up to anything