I’ve been fighting for days now just to get god damn logging set up. I’ve had to write a ton of code by hand, because PG doesn’t provide any automated mechanism for this and barely documents it beyond this page: https://www.postgresql.org/docs/12/runtime-config-logging.html#RUNTIME-CONFIG-LOGGING-CSVLOG
- Set up the `postgres_log` table exactly as described on that page.
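(For context, in case that page moves: the `postgres_log` definition it gives for PG 12 looks like this, as best I can reproduce it here; the linked page is authoritative.)

```sql
CREATE TABLE postgres_log
(
  log_time timestamp(3) with time zone,
  user_name text,
  database_name text,
  process_id integer,
  connection_from text,
  session_id text,
  session_line_num bigint,
  command_tag text,
  session_start_time timestamp with time zone,
  virtual_transaction_id text,
  transaction_id bigint,
  error_severity text,
  sql_state_code text,
  message text,
  detail text,
  hint text,
  internal_query text,
  internal_query_pos integer,
  context text,
  query text,
  query_pos integer,
  location text,
  application_name text,
  PRIMARY KEY (session_id, session_line_num)
);
```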
- Set up my `postgresql.conf` like this (also as the page says, except it describes things only vaguely and left me to work out the details myself):

```
log_destination = 'csvlog'
logging_collector = on
log_directory = 'C:\\pglogs'  # Yes, it requires doubled \ chars or else it strips them entirely...
log_filename = 'PG_%Y-%m-%d_%H;%M;%S'
log_rotation_age = 1min
log_rotation_size = 0
log_truncate_on_rotation = on
```
- Coded my own mechanism to constantly go through the `.csv` files, skipping any that PG reports as still in use via `pg_current_logfile`, feed them into PG’s table, and then delete them. This took a huge amount of time and effort, and not a word about it is mentioned in that "manual".
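For the curious, my mechanism boils down to something like this sketch (Python with psycopg2; the `C:\pglogs` path and the `postgres_log` table come from my setup above, while the DSN and the function names are just illustrative):

```python
import glob
import os

LOG_DIR = r"C:\pglogs"  # matches log_directory in postgresql.conf

def files_to_import(csv_files, current_logfile):
    """Return the CSV files that are safe to load: everything except
    the file PG reports as still in use via pg_current_logfile('csvlog')."""
    current = os.path.basename(current_logfile or "")
    return sorted(f for f in csv_files if os.path.basename(f) != current)

def import_and_delete():
    import psycopg2  # kept local so the helper above has no dependencies
    with psycopg2.connect("dbname=mydb") as conn, conn.cursor() as cur:  # your DSN here
        # Ask PG which CSV file it is currently writing to.
        cur.execute("SELECT pg_current_logfile('csvlog')")
        (current,) = cur.fetchone()
        for path in files_to_import(glob.glob(os.path.join(LOG_DIR, "*.csv")), current):
            with open(path, encoding="utf-8") as f:
                # Feed the finished file into PG's table...
                cur.copy_expert("COPY postgres_log FROM STDIN WITH csv", f)
            conn.commit()
            os.remove(path)  # ...and then delete it.
```

The file-filtering logic is split out into its own function purely so the "skip the file PG is still writing" rule is testable without a running server.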
- PostgreSQL creates both `PG_2020-09-20_00;56;19.csv` (in CSV format) and `PG_2020-09-20_00;56;19` (in plain-text format). I obviously don’t want the extensionless files. Why are they created?
- Every minute (as configured) PG creates new log files, even if there’s nothing new to log. The result is an endless stream of empty log files, which my custom script dutifully goes through, "imports" and then deletes. How do I tell PG to stop doing that? Creating empty files that my ever-running script deletes seconds later seems like pointless wear and tear on my disk.
- Why isn’t all of this automated? Why do I have to spend so much time cobbling together a solution to import the CSV files back into PG? For that matter, why are they dumped to CSV files in the first place? Why can’t PG log directly into that database table? Dumping CSV files only to have them COPYed back into the database and deleted seems like a pointless exercise.