Is a relational database with a dynamic number of tables a good design?


I have a use case where I want to create one table per entity; the underlying application that owns the entity publishes records into that table.

Each such table has the same fixed structure, so if there are 5 entities in my system, there will be 5 different tables with an identical schema.

The schema is generic, with one JSON column for flexibility. I do not expect to query on the fields inside the JSON. I expect the following queries on each entity (sketched below):

  1. On the auto-increment id primary key column with LIMIT and OFFSET, where I need to read X rows starting from the record with id Y.
  2. On the creation date column with LIMIT X.
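
For concreteness, each table and the two query patterns would look roughly like this (the table and column names are illustrative, and query 1 is sketched with a filter on id; an equivalent LIMIT/OFFSET form would also work):

    CREATE TABLE entity_foo_records (
        id          bigserial PRIMARY KEY,                  -- auto-increment id
        created_at  timestamptz NOT NULL DEFAULT now(),     -- creation date
        payload     jsonb                                   -- generic JSON column, never queried by field
    );

    -- 1. Read X rows starting from the record with id Y
    SELECT *
    FROM entity_foo_records
    WHERE id >= 1000        -- Y
    ORDER BY id
    LIMIT 50;               -- X

    -- 2. Most recent X rows by creation date
    SELECT *
    FROM entity_foo_records
    ORDER BY created_at DESC
    LIMIT 50;               -- X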

I expect thousands of such entities to be created on the fly, so in turn there will be thousands of tables in the database.

In the future, when one of these entities has fulfilled its purpose, its table would simply be dropped.
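
To illustrate the lifecycle (the entity_<name>_records naming scheme is only an assumption for illustration): when a new entity appears, the application creates its table on the fly, and when the entity is done, the table is dropped.

    -- A new entity "bar" is created on the fly
    CREATE TABLE entity_bar_records (
        id          bigserial PRIMARY KEY,
        created_at  timestamptz NOT NULL DEFAULT now(),
        payload     jsonb
    );

    -- "bar" has fulfilled its purpose
    DROP TABLE entity_bar_records;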

I expect most of these tables to have no more than 100 rows, while a few will grow to at least 1M rows as time goes by. This design makes the data easy to query because my application can determine the table name from the entity name.

Is this a bad design?

Is there a practical limit, performance-wise, to the number of tables in a database in an RDBMS (the design above is with PostgreSQL 11 in mind)?

Should I use a different datastore than an RDBMS to achieve this? Any suggestions?