I have an auto-increment primary key on each table: one unique ID consisting of 10 digits. Each table has ~ 6 BIGINT columns (values are small, 1–60k) and 0 to 4 VARCHAR columns (up to ~ 500 characters, on average 5–50 characters).
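For reference, this is roughly the shape of one of the tables (real table and column names differ; this is just a sketch):

```sql
-- Approximate shape of one table (names are placeholders)
CREATE TABLE example_table (
    id     BIGINT UNSIGNED NOT NULL AUTO_INCREMENT,  -- 10-digit unique ID
    col_a  BIGINT NOT NULL,   -- values are small, ~1-60k
    col_b  BIGINT NOT NULL,
    col_c  BIGINT NOT NULL,
    col_d  BIGINT NOT NULL,
    col_e  BIGINT NOT NULL,
    col_f  BIGINT NOT NULL,
    text_a VARCHAR(500),      -- 0 to 4 of these per table, avg 5-50 chars
    text_b VARCHAR(500),
    PRIMARY KEY (id)
) ENGINE=InnoDB;
```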
I have been fighting with this for months and can’t get it to production stage :( — basically throughput drops from ~ 170 inserts (from the app’s perspective) to ~ 40 after just ~ 200–500k inserts.
This is nothing — I’ve worked with a DB that held trillions of rows, with auto-increment and huge VARCHARs (a paid solution, though :().
I have already tweaked the config many times, but I still get to the point where the DB server is using ~ 950% CPU and the .NET Core app ~ 25% (of all cores).
The machine has an i9-9900K (8c/16t), 64 GB RAM, and 2x 2 TB NVMe drives.
I can’t even run a 5-minute API test, because the DB won’t be able to process all the data coming off the queue 🙁 (the API can accept ~ 20k/s).
Buffers, read IO, InnoDB commit tweaks, etc. have all been applied; nothing seems to work.
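For reference, these are the kinds of InnoDB knobs I’ve been adjusting (the values shown are just examples for a 64 GB box, not necessarily what I’m currently running):

```ini
[mysqld]
# Keep most of the working set in memory
innodb_buffer_pool_size = 32G
# Larger redo log so bulk inserts don't force constant checkpointing
innodb_log_file_size = 2G
# 0 or 2 relaxes per-commit fsync (trades durability for insert speed)
innodb_flush_log_at_trx_commit = 2
# Avoid double-buffering through the OS page cache on NVMe
innodb_flush_method = O_DIRECT
# Tell InnoDB the storage can sustain high IOPS
innodb_io_capacity = 10000
innodb_io_capacity_max = 20000
```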
It looks like it can’t handle even this little data for some reason, and I can’t figure out why. (I’ve never had real experience with free databases, so I only assume it should be able to insert 300k records within 60 seconds and sustain that up to ~ 10 TB.)
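To put that target in concrete terms: 300k rows in 60 seconds is ~ 5,000 rows/s. If the app batches multi-row INSERT statements at, say, 1,000 rows each, that’s only ~ 5 statements per second — something like this (table and column names are placeholders):

```sql
-- One round trip inserts many rows at once
INSERT INTO example_table (col_a, col_b, col_c, text_a)
VALUES
    (1, 2, 3, 'abc'),
    (4, 5, 6, 'def'),
    -- ... up to ~1,000 tuples per statement ...
    (7, 8, 9, 'ghi');
```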