I work for a company in which the majority of our overall platform consists of a massive, monolithic web application. Nearly all of the application's functionality resides on two servers: our main, external application and our secondary, internal application.
The main application does the majority of the heavy lifting. Most of its functionality consists of various CRUDs, reports, and displays. The secondary, internal server handles many of the longer-running tasks and operations that don't necessarily need to be returned to the user immediately. It functions as a fire-and-forget type of server.
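To make the fire-and-forget handoff concrete, here's a minimal sketch (in Python, purely illustrative; our actual stack differs). The request handler only enqueues work and returns immediately, while a background worker, standing in for our internal server, drains the queue:

```python
import queue
import threading

# Queue connecting the "web" side to the background worker. In our real
# setup this is a second server, not an in-process thread.
task_queue: "queue.Queue" = queue.Queue()
results = []

def worker():
    """Drain tasks until a None sentinel arrives."""
    while True:
        task = task_queue.get()
        if task is None:                      # sentinel: stop the worker
            break
        results.append(f"processed {task}")   # stand-in for the long-running job
        task_queue.task_done()

t = threading.Thread(target=worker, daemon=True)
t.start()

# The web request handler only enqueues and returns -- no waiting.
task_queue.put("nightly-report")
task_queue.put("bulk-email")
task_queue.put(None)
t.join()
```

The point is that the caller never blocks on the work itself; it only pays the cost of the enqueue.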
Overall, the main platform consists of dozens and dozens of smaller, self-contained tools: things like scheduling, forms, and data lookup. Most of these smaller tools are fairly well packaged and could one day be refactored into microservices, and they'd scale quite well. The boundaries between those services make sense from a business standpoint, as they're fairly well defined.
However, nearly all of our applications reside on the same web server, within the same web application. Everything from the core site functionality to fairly resource-intensive reports lives in the same core application.
We've started making the switch to microservices to ease the burden of having everything running on a single server. But I find myself asking whether it's appropriate or advisable for everything to simply be a web application.
Would certain applications, such as reporting, be better suited as standalone services that aren't running from within IIS or some other host? Or is the performance lost a worthwhile trade-off for the convenience gained?
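For context, the kind of thing I'm imagining is a report service hosted as its own process rather than inside the main IIS application. A hypothetical sketch (Python here just for illustration; `run_report` is a placeholder for whatever resource-intensive query the real report performs):

```python
import json
import threading
from http.server import BaseHTTPRequestHandler, HTTPServer
from urllib.request import urlopen

def run_report(name):
    """Placeholder for a resource-intensive reporting query."""
    return {"report": name, "rows": 3}

class ReportHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        # Treat the URL path as the report name and return JSON.
        body = json.dumps(run_report(self.path.lstrip("/"))).encode()
        self.send_response(200)
        self.send_header("Content-Type", "application/json")
        self.send_header("Content-Length", str(len(body)))
        self.end_headers()
        self.wfile.write(body)

    def log_message(self, *args):  # silence per-request logging for the demo
        pass

# Bind to an ephemeral port and serve on a background thread,
# simulating a standalone process the main app would call over HTTP.
server = HTTPServer(("127.0.0.1", 0), ReportHandler)
threading.Thread(target=server.serve_forever, daemon=True).start()

port = server.server_address[1]
result = json.loads(urlopen(f"http://127.0.0.1:{port}/monthly-sales").read())
server.shutdown()
```

The main application would then be just another HTTP client of this service, so a runaway report can't starve the core site's worker processes.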