I have incoming data that I want to store on disk in a database or something. This incoming_data is just an example; in reality, one incoming_data set will have roughly 5k rows and about 50k columns, and the entire final file will be about 200-400 GB.

My question is: how do I add new data as columns to the database without loading the file into RAM?
# current approach
library(duckdb)
library(duckplyr)

path <- "D:\\R_scripts\\new\\duckdb\\data\\DB.duckdb"
con <- dbConnect(duckdb(), dbdir = path, read_only = FALSE)

# write the first chunk of data into the database
dbWriteTable(con, "my_dat", incoming_data())

#### how to do something like this, but on disk? ####
# my_dat <- cbind(my_dat, incoming_data())
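One way to get cbind-like behaviour without pulling anything back into R is to stage each new chunk in its own table and column-bind on the database side with DuckDB's POSITIONAL JOIN, which matches rows by position rather than by key. This is a minimal sketch, assuming incoming_data() is the chunk generator above, that every chunk has the same number of rows in the same order, and that each chunk brings distinct column names:

# stage the new chunk as its own table
dbWriteTable(con, "staging", incoming_data(), overwrite = TRUE)

# column-bind by row position; duplicate column names across
# my_dat and staging would make SELECT * fail here
dbExecute(con, "
  CREATE TABLE my_dat_new AS
  SELECT * FROM my_dat POSITIONAL JOIN staging
")
dbExecute(con, "DROP TABLE my_dat")
dbExecute(con, "ALTER TABLE my_dat_new RENAME TO my_dat")
dbExecute(con, "DROP TABLE staging")

Note that this rebuilds the table, so each new chunk costs a pass over everything written so far, which adds up at 200-400 GB. The ALTER TABLE ... ADD COLUMN route avoids the rewrite but would mean one statement per incoming column, which is unworkable with 50k columns per chunk.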
Thanks. This is a very broad question, and not a good fit for this issue tracker. Either way, 50k columns sounds like way too many. Any chance you can make the data "longer"?
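For reference, here is a minimal sketch of that "longer" layout. It assumes incoming_data() is the chunk generator from the question and that the measurement columns share a common type; each wide chunk becomes (row_id, variable, value) rows, and appending rows never rewrites what is already on disk:

library(duckdb)
library(dplyr)
library(tidyr)

con <- dbConnect(duckdb(), dbdir = path, read_only = FALSE)

# reshape one wide chunk into (row_id, variable, value) triples
chunk_long <- incoming_data() |>
  mutate(row_id = row_number()) |>
  pivot_longer(-row_id, names_to = "variable", values_to = "value")

# append instead of cbind: each new chunk only grows the table
dbWriteTable(con, "my_dat_long", chunk_long, append = TRUE)

A wide view of any subset of variables can then be recovered later, e.g. with a filtered pivot_wider() in R or DuckDB's PIVOT, without ever materializing all 50k columns at once.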