How should we handle running out of database connections?
I'm looking for commonly used approaches / industry best practices.
Assume the following hypothetical scenario: an app server accepts 200 user requests, and each of them needs database access, but the database's max_connections is 100.
If all 200 users make requests at the same time and we only have 100 max_connections, what happens to the other requests that cannot be served because of the connection limit?
In the real world:
Are the remaining 100 requests held in some sort of queue on the app servers and kept waiting for database connections?
Or do they error out?
Basically, if the database server can handle 100 connections, and all of the web connections "require database access," then you must ensure that no more than 100 requests are allowed to be active at any one instant. That is the "ruling constraint" of the scenario. (It may be one of many.)
You can accept "up to 200 simultaneous connections" on the web server, but you must enqueue those requests so that the limit of 100 active requests is never exceeded.
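A minimal sketch of that idea, using a counting semaphore as the queue. The numbers here are scaled down (a cap of 3 and 10 requests stand in for the 100/200 in the scenario), and the threaded "requests" and simulated query are hypothetical stand-ins for real web handlers:

```python
import threading
import time

MAX_DB_CONNECTIONS = 3   # stands in for the database's max_connections (e.g. 100)
NUM_REQUESTS = 10        # stands in for 200 simultaneous user requests

# A counting semaphore caps how many requests may hold a DB connection at once.
# Requests beyond the cap block on acquire -- an implicit waiting queue.
db_slots = threading.Semaphore(MAX_DB_CONNECTIONS)

active = 0
peak = 0
lock = threading.Lock()

def handle_request(req_id):
    global active, peak
    with db_slots:               # waits here if all connection slots are in use
        with lock:
            active += 1
            peak = max(peak, active)
        time.sleep(0.05)         # simulate the database query
        with lock:
            active -= 1

threads = [threading.Thread(target=handle_request, args=(i,))
           for i in range(NUM_REQUESTS)]
for t in threads:
    t.start()
for t in threads:
    t.join()

print(peak)  # never exceeds MAX_DB_CONNECTIONS
```

All 10 requests complete, but at most 3 ever touch the "database" at the same time; the rest simply wait their turn.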
There are many, many ways to do that: load balancers, application servers, connection pools, Apache/nginx directives. Sometimes the web front-end work is broken out among many different servers working in parallel. No matter how you do it, there must be a way to regulate how many requests are active, and to queue the remainder.
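Most real pools also answer the "queue or error out?" question from above with a timeout: wait for a free connection for a bounded time, then fail fast. A toy illustration (the `SimplePool` class and its fake connections are hypothetical, not any real library's API):

```python
import queue

class SimplePool:
    """Toy connection pool: acquire blocks up to `timeout` seconds, then fails fast."""
    def __init__(self, size, connect):
        self._conns = queue.Queue()
        for _ in range(size):
            self._conns.put(connect())

    def acquire(self, timeout=None):
        try:
            return self._conns.get(timeout=timeout)
        except queue.Empty:
            raise RuntimeError("no free database connection (pool exhausted)")

    def release(self, conn):
        self._conns.put(conn)

# Demo with fake "connections" (plain objects); real code would open DB sockets.
pool = SimplePool(size=2, connect=lambda: object())

c1 = pool.acquire()
c2 = pool.acquire()
try:
    pool.acquire(timeout=0.1)   # pool exhausted: error out instead of waiting forever
except RuntimeError as e:
    print("error:", e)
pool.release(c1)
c3 = pool.acquire(timeout=0.1)  # succeeds once a connection is returned
```

So in practice you get both behaviors: requests queue briefly while connections turn over, and only error out once the wait exceeds the configured timeout.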
Also note that, even though you might have "200 active connections" on the web server, it is highly unlikely that all 200 of those clients clicked the mouse at precisely the same instant. Requests come in at random rates, so you might never encounter any such delay at all. The system must nonetheless be engineered to handle the worst case.