Sorry if I missed something or this is a dumb question.

I set up Docker according to the README and it works, both with local CSV/JSON output and with JSON database output. However, I don't understand how to correctly switch the database to the non-JSON (separate columns) mode.

When I switch to non-JSON, scraping just stops and freezes without an error message after the third page job. I tried this multiple times and it always freezes after exactly three page jobs; I'm assuming you wait for three results before writing them to the database?

I found and used the SQL files, but those didn't help either. A few things I noticed:

- json-up.sql drops the non-JSON columns, but json-down.sql doesn't drop the json column, so I assumed the NOT NULL constraint on json was blocking the script from populating the columns. But deleting the json column manually didn't help either.
- The columns created by json-down.sql are far fewer than the datapoints that are actually extracted, so either this SQL file is faulty or I'm missing something important?

I would have expected the database to contain a column for every datapoint that is otherwise extracted as a CSV column.
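For context, here is roughly what I expected json-down.sql to do; the table name and column names below are just my guesses, not the project's actual schema:

```sql
-- Hypothetical sketch of a down migration reversing json-up.sql.
-- Restore the per-field columns that json-up.sql dropped
-- (one ADD COLUMN per extracted datapoint, same set as the CSV columns):
ALTER TABLE results ADD COLUMN title TEXT;
ALTER TABLE results ADD COLUMN url TEXT;
-- ...and so on for the remaining datapoints...

-- Drop the json column so its NOT NULL constraint no longer
-- blocks inserts that only populate the separate columns:
ALTER TABLE results DROP COLUMN json;
```

Is that the intended workflow, or is there a separate step I'm missing?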