How does Facebook's chat server work?
-
I recently learnt in networking class that a TCP port is a 16-bit number, so there are at most 65,535 ports, and I assumed a single server machine can therefore run at most 65,535 services. I have to develop a group chat program in which a single server manages sending and receiving data for all clients. I thought that if 10 clients connect, the server uses 10 ports (3000, 3001, …), so what happens when 70,000 people join the group chat? By that logic the server would run out of ports. Yet Facebook chat works with far more than 65,535 people online and chatting at the same time. How does Facebook's chat server manage its clients?
-
Answer:
A server can support far more than 65,535 concurrent connections, because for TCP the deciding factor is not the number of ports it can listen on. What identifies a connection is the unique combination of address and port on BOTH the source and destination computers. For example:

tcp 0 0 X.73.2.71:59537 X.17.196.92:80 ESTABLISHED 24791/google-chrome
tcp 0 0 X.73.2.71:59520 X.17.196.92:80 ESTABLISHED 24791/google-chrome
tcp 0 0 X.73.2.71:59519 X.17.196.92:80 ESTABLISHED 24791/google-chrome
tcp 0 0 X.73.2.71:59482 X.17.196.92:80 ESTABLISHED 24791/google-chrome

Here I have four connections open to the same server on the same port; what differs is MY port number, the one each request originates from. All web requests go to port 80 (ignoring secure ones for the moment), so if we were limited to one connection per port, every web server would be limited to a single request at a time, which can't scale. Instead, all systems accept multiple requests on the same port, and incoming data is demultiplexed based on the combination of source address, source port, destination address and destination port.

In practice, the limiting factor on how many connections a server can handle is its processing power. Roughly 1k-10k concurrent requests are possible on some servers, and far more on load balancers (which distribute the requests across many back-end servers). At some point, though, a server starts to spend more of its processing time managing the connections than processing data for them, and that's not a good state to be in.

Additionally, if you can process requests fast enough, you can open a connection, receive the headers, process and send a response, then close it, all very quickly (sometimes in less than 100 ms, i.e. 0.1 s). So the faster you can process data, the fewer concurrent connections you need active at any one time.
Jonathan Wright at Quora
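To tie this back to the original group-chat question, here is a minimal sketch in Python (an illustration, not how Facebook actually implements chat) of a broadcast server that serves every client through a single listening port, reusing port 3000 from the question. The operating system keeps the connections apart by each client's own address and port, so the server never needs a second listening port; the helper names (accept, read, clients) are local to this example and error handling is omitted for brevity.

import selectors
import socket

HOST, PORT = "0.0.0.0", 3000    # one listening port for every client
sel = selectors.DefaultSelector()
clients = {}                    # accepted socket -> client's (address, port)

def accept(listener):
    conn, addr = listener.accept()       # addr is the client's (ip, port) pair
    conn.setblocking(False)
    clients[conn] = addr
    sel.register(conn, selectors.EVENT_READ, read)
    print(f"connected: {addr}, total clients: {len(clients)}")

def read(conn):
    data = conn.recv(4096)
    if not data:                         # client closed the connection
        sel.unregister(conn)
        del clients[conn]
        conn.close()
        return
    sender = clients[conn]
    for other in clients:                # broadcast to the whole "group"
        if other is not conn:
            other.sendall(f"{sender}: ".encode() + data)

listener = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
listener.setsockopt(socket.SOL_SOCKET, socket.SO_REUSEADDR, 1)
listener.bind((HOST, PORT))
listener.listen()
listener.setblocking(False)
sel.register(listener, selectors.EVENT_READ, accept)

while True:
    for key, _ in sel.select():
        key.data(key.fileobj)            # dispatch to accept() or read()

In a real deployment the practical ceilings are per-process file-descriptor limits, memory per connection and CPU, which is why large chat systems typically spread users across many such servers behind load balancers, as the answer above notes.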