
Should DBI::dbWriteTable support overloading the value argument to accept a SQL query? #488

@talegari

Description


This is one of my typical workflows: con (say odbc) --> tbl --> some dplyr ops --> save (as a table). Note that the data does NOT come into R's memory. But the code looks clumsy:

tbl_dbplyr %>% 
  dbplyr::sql_render() %>% 
  paste("create table a.b.c as", .) %>%
  DBI::dbExecute(con, .)

This could be (intuitively) replaced by:

DBI::dbWriteTable(con, DBI::Id("a", "b", "c"), value = <sql>)
# or: DBI::dbAppendTable(con, DBI::Id("a", "b", "c"), value = <sql>)

Right now, value is understood to be a data.frame in the R session's memory. Some database backends also allow value to be a filename.

Q1. Would overloading DBI::dbWriteTable to cover this case add value without breaking its semantics?
Q2. Would a new generic, say DBI::dbWriteQuery, make sense?

PS:

  1. dbplyr::compute does this job in most cases. In practice, it has hit roadblocks such as "only temporary tables are supported" (depending on the DB backend), whereas a raw DBI::dbExecute gets the job done.
  2. IMHO, support for value = <sql> should be at the DBI level, not in dbplyr.
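For illustration, a minimal sketch of what the proposed generic could look like, implemented on top of existing DBI functions. The name dbWriteQuery is hypothetical (it is not part of DBI), and the CREATE TABLE ... AS syntax is assumed to be supported by the target backend:

```r
# Hypothetical helper (not part of DBI): materialize a SQL query as a
# table entirely server-side, without pulling data into R's memory.
dbWriteQuery <- function(con, name, sql) {
  stopifnot(is.character(sql) || inherits(sql, "SQL"))
  ddl <- paste(
    "CREATE TABLE",
    DBI::dbQuoteIdentifier(con, name),  # handles DBI::Id() qualifiers
    "AS",
    sql
  )
  DBI::dbExecute(con, ddl)
}

# Usage (assuming a connection `con` and a dbplyr lazy table `tbl_dbplyr`):
# dbWriteQuery(con, DBI::Id("a", "b", "c"), dbplyr::sql_render(tbl_dbplyr))
```

Routing the identifier through DBI::dbQuoteIdentifier() is what a DBI-level generic could standardize, since it already knows how each backend quotes qualified names.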
