I work for a daily newspaper, and we run a web server so that everyone can read the news. The problem is that as more people visit the site, it gets slower and slower. We already use caching, but it is still slow. I need a solution for this.
My understanding of nginx is that each of its worker processes is single threaded, but very fast. Hence, on a multi-processor server, a single worker will only use one core. The solution is to run one worker per core: nginx supports this natively through the `worker_processes` directive, with the master process managing the workers, so you don't need a separate frontend to distribute requests between instances.
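As a minimal sketch, the relevant directives in `nginx.conf` look something like this (the connection count is an illustrative value, not a recommendation):

```nginx
# nginx.conf -- illustrative fragment
worker_processes auto;        # spawn one worker process per CPU core

events {
    worker_connections 4096;  # max concurrent connections per worker
}
```

With `worker_processes auto;` (or an explicit count matching the number of cores), nginx itself spreads the load across all processors.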
Norman is right. A multi-process approach is the correct way to scale nginx.
Also, it helps to evaluate your request pipeline and measure the throughput at each stage. For example:
1. What does each request do? Does it involve database queries?
2. What backend is serving the pages? FastCGI, PHP? If so, has the caching been tuned to optimize page rendering?
3. Has the content been correctly separated into static and dynamic parts, with the static content offloaded to a 'cookie-free' domain and correct cache-expiry headers?
4. What is the cache miss rate? In some cases, such as WordPress backends, publishing a new page can invalidate the entire cache for all other pages if not configured correctly. That setting is worth checking.
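To illustrate points 2-4, here is a hedged sketch of an nginx FastCGI page cache plus static-asset expiry, assuming a PHP-FPM backend (the socket path, cache sizes, and expiry times are illustrative assumptions, not tuned values):

```nginx
# illustrative fragment, not a drop-in config
http {
    # FastCGI page cache (point 2)
    fastcgi_cache_path /var/cache/nginx levels=1:2 keys_zone=pagecache:10m
                       max_size=1g inactive=60m;

    server {
        listen 80;

        # static content with long cache-expires (point 3)
        location /static/ {
            expires 30d;
            add_header Cache-Control "public";
        }

        # dynamic pages served through PHP-FPM, with caching (points 2 and 4)
        location ~ \.php$ {
            fastcgi_pass unix:/run/php-fpm.sock;   # assumed socket path
            include fastcgi_params;
            fastcgi_cache pagecache;
            fastcgi_cache_valid 200 10m;
            # expose HIT/MISS so the miss rate can be measured (point 4)
            add_header X-Cache-Status $upstream_cache_status;
        }
    }
}
```

Exposing `$upstream_cache_status` in a response header (or logging it) is what makes the cache miss rate in point 4 observable in the first place.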
Also, if caching demonstrably improved performance compared against having no cache, then putting a CDN in front can help even more.
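One way to get that indication is to compute the miss rate from the access log, assuming nginx's `$upstream_cache_status` variable has been added to the log format (the log path is an assumption):

```shell
# count MISS lines against all requests in the access log
awk '{ total++ } /MISS/ { miss++ } END { printf "miss rate: %.1f%%\n", 100*miss/total }' access.log
```

A low miss rate suggests the cache is doing its job and a CDN would mostly offload bandwidth; a high miss rate points back at the cache configuration itself.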