solution to OCR / search through 4 million pieces of paper and 10,000 added daily
By : user2144512
Date : March 29 2020, 07:55 AM
I hope this helps you. Divide and conquer! If you do decide to go down the route of doing this in-house, your design needs to have scalability built in from day 1.
|
Java Opensource E-Commerce solution
By : warneverchange
Date : March 29 2020, 07:55 AM
I wish I had a fix for the issue. I'm on the hunt for more or less the same thing. Up until now, KonaKart has been my prime candidate, but their entire stack isn't open source; it's limited to the following (snippet from http://www.konakart.com/product/customization):
|
Can CouchDB handle 15 million records daily?
By : user3258200
Date : March 29 2020, 07:55 AM
Hope that helps. Frankly, at this time, unless you have very good hardware, Apache CouchDB may run into problems. Map/reduce will probably be fine; CouchDB's incremental map/reduce is ideal for your requirements, and as a developer you will love it. Unfortunately, as a sysadmin, you may notice more disk usage and I/O than expected.
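As a rough illustration of what such an incremental view could look like, here is a minimal sketch written in TypeScript syntax (CouchDB itself evaluates plain JavaScript stored in a design document). The field name `timestamp` and the view name are my own assumptions, not from the question.

```typescript
// Sketch of a CouchDB view that buckets documents by calendar day.
// emit() is injected by CouchDB's view server at index time.
declare function emit(key: unknown, value: unknown): void;

// Map function: assumes each document carries an ISO-8601 `timestamp` field.
function byDayMap(doc: { timestamp?: string }): void {
  if (doc.timestamp) {
    // Use the date part (YYYY-MM-DD) as the key, count 1 per document.
    emit(doc.timestamp.slice(0, 10), 1);
  }
}

// For the reduce step, CouchDB's built-in "_count" reduce is enough here:
// paired with the map above it yields per-day record counts.
```

Because CouchDB's map/reduce is incremental, only documents added or changed since the last query are re-indexed, which is what keeps the indexing side workable at high daily insert volumes.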
|
Which technology should I use to handle 1 million * 1 million calculation per 30 seconds
By : vsam
Date : March 29 2020, 07:55 AM
Hope this helps. There's a data structure called a QuadTree. Keep the data points updated in the quad tree and you will have a much smaller data set to compare the values against. As clients log in, move, and send you data points, you change their location in the quad tree. Now the QuadTree holds a 2D map of all your data points, split into buckets, and each bucket contains 4 other buckets that may or may not have points in them. When you're trying to find everyone within X of a given data point, you look at all the points in the bucket that point is in. Then you look at all the points in the buckets 'around' that bucket (there are 8 of them: N, S, E, W, NW, SW, NE, SE). You keep going until the distance to the buckets (and therefore all the points in them) is greater than your search range.
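The bucket-pruning idea described above can be sketched roughly like this. This is an illustrative TypeScript point quadtree under my own assumptions (names such as `QuadTree`, `insert`, and `queryRadius` are not from any particular library), not the poster's actual code.

```typescript
interface Point { x: number; y: number; id: string; }

// Each node covers a square region and splits into four children
// once it holds more than `capacity` points.
class QuadTree {
  private points: Point[] = [];
  private children: QuadTree[] | null = null;

  constructor(
    private x: number,      // lower-left corner of this bucket
    private y: number,
    private size: number,   // side length of the square bucket
    private capacity = 8,
  ) {}

  insert(p: Point): boolean {
    if (!this.contains(p)) return false;
    if (this.children === null) {
      this.points.push(p);
      if (this.points.length > this.capacity) this.subdivide();
      return true;
    }
    return this.children.some(c => c.insert(p));
  }

  // Collect all points within `radius` of (cx, cy). Whole subtrees whose
  // bucket lies farther away than `radius` are skipped, which is the
  // pruning the answer describes.
  queryRadius(cx: number, cy: number, radius: number, out: Point[] = []): Point[] {
    if (!this.intersectsCircle(cx, cy, radius)) return out;
    for (const p of this.points) {
      if (Math.hypot(p.x - cx, p.y - cy) <= radius) out.push(p);
    }
    if (this.children) {
      for (const c of this.children) c.queryRadius(cx, cy, radius, out);
    }
    return out;
  }

  private contains(p: Point): boolean {
    return p.x >= this.x && p.x < this.x + this.size &&
           p.y >= this.y && p.y < this.y + this.size;
  }

  private intersectsCircle(cx: number, cy: number, r: number): boolean {
    // Distance from the circle centre to the nearest point of this square.
    const nx = Math.max(this.x, Math.min(cx, this.x + this.size));
    const ny = Math.max(this.y, Math.min(cy, this.y + this.size));
    return Math.hypot(cx - nx, cy - ny) <= r;
  }

  private subdivide(): void {
    const h = this.size / 2;
    this.children = [
      new QuadTree(this.x, this.y, h, this.capacity),
      new QuadTree(this.x + h, this.y, h, this.capacity),
      new QuadTree(this.x, this.y + h, h, this.capacity),
      new QuadTree(this.x + h, this.y + h, h, this.capacity),
    ];
    for (const p of this.points) this.children.some(c => c.insert(p));
    this.points = [];
  }
}
```

Updating a client's position would be a removal from its old bucket plus an insert at the new one; removal is omitted here to keep the sketch short.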
|
Optimize Nginx + PHP-FPM for 5 million daily pageviews
By : MWPDX
Date : March 29 2020, 07:55 AM
This may help you. We run a few high-volume websites which together generate around 5 million pageviews per day. We have fairly overkill servers as we anticipate growth, but we are getting reports from a few active users that the site is sometimes slow on the first pageview. I've seen this myself every once in a while: the first pageview takes 3-5 seconds, then it's instant for the rest of the day. This has happened to me maybe twice in the last 24 hours, so not often enough to figure out what's happening. Every page on our site uses PHP, but one of the times it happened to me it was on a PHP page that doesn't make any database calls, which makes me think the issue is limited to Nginx, PHP-FPM, or network settings. 2 minor tips:
|