How does a search engine discover/crawl a website whose pages are loaded from a database?

I am building a new blog in Django. All the posts and their details are stored in an SQLite database (the default that ships with Django). When a post's URL is requested, the server renders and returns the page using data loaded from the database. My question is: since these pages don't exist as static files and are only generated when someone requests their URL, how will a search engine such as Google crawl the site and show it in search results when it's relevant to a query? Is it necessary to provide a sitemap for this to happen?
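For context, the sitemap I'm asking about would just enumerate the URLs the server can render on demand, one entry per post. Here is a minimal stdlib-only sketch of what such a file contains (the domain, URL pattern, and slugs are made up for illustration; in a real project Django's `django.contrib.sitemaps` framework would generate this from a queryset):

```python
# Sketch of a sitemap for database-backed pages: one <url>/<loc>
# entry per post URL the server can render on demand.
# base_url, the /posts/<slug>/ pattern, and the slugs are all
# hypothetical examples, not taken from a real project.
from xml.etree import ElementTree as ET

def build_sitemap(base_url, slugs):
    ns = "http://www.sitemaps.org/schemas/sitemap/0.9"
    urlset = ET.Element("urlset", xmlns=ns)
    for slug in slugs:
        url = ET.SubElement(urlset, "url")
        loc = ET.SubElement(url, "loc")
        loc.text = f"{base_url}/posts/{slug}/"
    return ET.tostring(urlset, encoding="unicode")

print(build_sitemap("https://example.com", ["first-post", "second-post"]))
```

The output is the standard sitemap XML that a crawler can fetch from a well-known URL, which is what lets it find pages that are never linked-to statically.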