This is one of my typical workflows: con (say, odbc) --> tbl --> some dplyr ops --> save (as a table). Note that the data does NOT come into R's memory. But the code to achieve this is clumsy. I would like to write:
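For context, a minimal sketch of that pipeline (the DSN, table names, and dplyr verbs are illustrative assumptions, not from the original):

```r
library(DBI)
library(dplyr)
library(dbplyr)

# Connect via odbc (DSN name is illustrative)
con <- DBI::dbConnect(odbc::odbc(), dsn = "my_dsn")

# Lazy table reference: no rows are pulled into R
lazy_tbl <- dplyr::tbl(con, "src_tbl")

# dplyr verbs are translated to SQL; still lazy, still server-side
result <- lazy_tbl %>%
  dplyr::filter(x > 0) %>%
  dplyr::summarise(total = sum(y, na.rm = TRUE))

# Materialise the result as a table in the database
dplyr::compute(result, name = "dest_tbl", temporary = FALSE)
```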
DBI::dbWriteTable(con, DBI::Id("a", "b", "c"), value = <sql>)
# or: DBI::dbAppendTable(con, DBI::Id("a", "b", "c"), value = <sql>)
Right now, value is understood to be a data.frame in the R session's memory. Some database backends also allow value to be a filename.
Q1. Would overloading DBI::dbWriteTable to cover this case add value without breaking its semantics?
Q2. Would a new generic, say DBI::dbWriteQuery, make sense?
PS:
dbplyr::compute does this job in most cases. In practice, it has hit roadblocks like "only temporary tables are supported" (depending on the DB backend), whereas raw DBI::dbExecute gets the job done.
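For reference, the raw DBI::dbExecute workaround can be sketched like this (assuming `result` is a lazy dbplyr table; the target name matches the DBI::Id("a", "b", "c") above, and CREATE TABLE ... AS syntax varies by backend, e.g. SELECT ... INTO on SQL Server):

```r
library(DBI)
library(dbplyr)

# Render the lazy dplyr pipeline to its SQL text
qry <- dbplyr::sql_render(result)

# Create a permanent table from that query in one server-side statement;
# no data passes through the R session
DBI::dbExecute(con, paste0("CREATE TABLE a.b.c AS ", qry))
```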
IMHO, support for value = <sql> should live at the DBI level, not in dbplyr.