Is there a problem with this approach to handling invalid web probes?

Like any other website owner, I get frequent probes for vulnerabilities, e.g. requests for .php, .sql, or .gz pages.

These used to appear in my log files as 404 responses (we host on ASP.NET Core). These requests also consume server time and processing, since the server has to:

  1. handle the request in the pipeline
  2. check the static files for a match
  3. check the routing table for a match
  4. redirect to the error-handler middleware
  5. log the error
  6. show the 404 error page

Using Postman, I see that a generic 404 returns about 5.44 KB of HTML.

So I’ve added a middleware handler that checks for requests ending in suspect extensions such as these and simply clears the response, with no further processing. That response is now 129 bytes, but it returns a 200.
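For clarity, a sketch of the kind of middleware I mean (simplified; the extension list is illustrative, and I've shown a bare 404 variant here rather than the cleared-body 200 described above):

```csharp
var builder = WebApplication.CreateBuilder(args);
var app = builder.Build();

// Illustrative list of extensions that only ever show up in vulnerability scans.
string[] suspectExtensions = { ".php", ".sql", ".gz" };

app.Use(async (context, next) =>
{
    var path = context.Request.Path.Value ?? string.Empty;
    if (suspectExtensions.Any(ext =>
            path.EndsWith(ext, StringComparison.OrdinalIgnoreCase)))
    {
        // Short-circuit immediately: no static-file lookup, no routing,
        // no error-handler redirect, no 404 error page.
        context.Response.StatusCode = StatusCodes.Status404NotFound;
        return;
    }

    await next();
});

app.MapGet("/", () => "Hello");

app.Run();
```

Registering this before the static-files and routing middleware is what skips steps 2–6 of the pipeline above.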

My question, therefore: is this a safe approach? I could change the response to a 404, although I doubt the scanners would treat the blank 200 as a valid hit.