Most Python frameworks ship with a development web server of some kind, usually accompanied by a warning that it isn't meant for production use. How different do these tend to be from their production equivalents?
I haven't quite decided which framework to go with, much less what production server to use, so it's difficult for me to pin this down to 'compare development server X to production server Y.' With that said, let me make the question a little more precise: in your past experience with a Python framework, how much time did you have to spend getting your application up and running on a production system once it had been developed on a development server? Or did you skip the development server and develop your app on a server closer to what you would use in production?
The lower environments should match the production environment as closely as possible given the resources available. This applies to all development efforts, regardless of whether they are Python-based or even web-based. In practice, most organizations are not willing to spend that kind of money; in that case, try to make at least the environment directly below production as close to production as possible.
Some of the variables to keep in mind are:
many times there are multiple machines (app server, database server, web server, load balancers, firewalls, etc.) in a production environment. Keep all of these in mind.
Operating Systems
number of CPUs. Moving from a single-CPU lower environment to a multi-core production environment can expose multi-threading issues that were never exercised in testing
load balancing. Lower environments are often not load balanced. If you are replicating sessions (for instance) across multiple production app servers, you should try to do the same in a lower environment
Software / library versions
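To illustrate the CPU point above: check-then-act logic that happens to behave on a lightly loaded single-CPU box can fail once threads truly interleave. Here is a minimal sketch; the withdraw scenario and amounts are invented for illustration, and a `Barrier` stands in for the unlucky scheduling a multi-core production machine makes far more likely:

```python
import threading

balance = 100  # shared state with no lock protecting it

def withdraw(amount, barrier):
    global balance
    if balance >= amount:   # check
        barrier.wait()      # force both threads past the check before either acts,
                            # simulating the interleaving a multi-core box exposes
        balance -= amount   # act

barrier = threading.Barrier(2)
threads = [threading.Thread(target=withdraw, args=(80, barrier)) for _ in range(2)]
for t in threads:
    t.start()
for t in threads:
    t.join()

print(balance)  # -60: both withdrawals passed the check, overdrawing the account
```

Wrapping the check and the debit in a single `threading.Lock` block fixes the race; the point is that a single-CPU test environment may never surface it.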
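On the last point, it is worth comparing library versions between environments programmatically rather than by eye. A small sketch using only the standard library (the package names checked here are placeholders; substitute your own dependencies):

```python
from importlib.metadata import PackageNotFoundError, version

def report_versions(packages):
    """Return a {package: version-or-None} map for comparison across environments."""
    found = {}
    for pkg in packages:
        try:
            found[pkg] = version(pkg)
        except PackageNotFoundError:
            found[pkg] = None  # installed in one environment but missing here
    return found

# Hypothetical dependency list; run in each environment and diff the output.
print(report_versions(["flask", "gunicorn", "psycopg2"]))
```

Pinning exact versions in a requirements file and installing from it in every environment avoids the drift in the first place.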