What are some examples of startups processing big data in a cost-effective way?
Especially by utilizing the latest advancements in cloud computing and big data processing?
Answer:
At Playtomic I process a huge volume of data for casual games in real time - yesterday was just under 700 million events thanks to an immensely popular new game. The way I do it cost-effectively (only 5 servers, and 3 are just beefy VPSs) is:
- Bundle events, so 700 million events is handled by maybe 300-400 million actual requests.
- Write to log files instead of directly updating the database; players just hit static files, so there is minimal work for the server and no consequences for a database far away.
- Zip the log files to keep bandwidth use down, even though they have to be unzipped almost immediately on another machine.
- Separate everything into purpose-specific processes, so the data goes through multiple streamlined pipelines instead of getting caught up waiting for something to finish - even something as mundane as unzipping a file can get in your way if your process has to finish other tasks before it unzips more files.
- Aggregate the data as much as possible before committing it to the database, to reduce the number of database operations.
Ben Lawry at Quora
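The aggregate-before-commit step described in the answer above is the part that most directly cuts database load: many raw events collapse into a handful of batched writes. Below is a minimal Python sketch of that stage under assumed details not given in the answer - gzip-compressed, newline-delimited JSON event logs in an `incoming/` directory, and a SQLite table `event_counts(game_id, event_name, day, total)` standing in for the real database.

```python
# Minimal sketch of "aggregate before committing" for zipped event logs.
# Assumptions (not from the original answer): events are gzip-compressed
# newline-delimited JSON, and an event_counts table already exists with a
# UNIQUE(game_id, event_name, day) constraint.

import gzip
import json
import sqlite3
from collections import Counter
from pathlib import Path


def aggregate_log_file(path: Path) -> Counter:
    """Collapse raw events into (game_id, event_name, day) -> count totals."""
    totals = Counter()
    with gzip.open(path, "rt", encoding="utf-8") as fh:
        for line in fh:
            event = json.loads(line)
            totals[(event["game_id"], event["event_name"], event["day"])] += 1
    return totals


def commit_totals(db_path: str, totals: Counter) -> None:
    """One batched upsert per run instead of one write per raw event."""
    conn = sqlite3.connect(db_path)
    with conn:  # single transaction
        conn.executemany(
            """
            INSERT INTO event_counts (game_id, event_name, day, total)
            VALUES (?, ?, ?, ?)
            ON CONFLICT(game_id, event_name, day)
            DO UPDATE SET total = total + excluded.total
            """,
            [(g, e, d, n) for (g, e, d), n in totals.items()],
        )
    conn.close()


if __name__ == "__main__":
    grand_total = Counter()
    for log_file in Path("incoming").glob("*.log.gz"):
        grand_total.update(aggregate_log_file(log_file))
    commit_totals("analytics.db", grand_total)
```

The point of the batch is that a single `executemany` inside one transaction replaces millions of per-event writes, which is what keeps the database tier small.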
Other answers
It really depends on whether the data crunching has to be done exactly when you want to do it, or whether you can be interrupted. If you can be interrupted, then there are deals available where you get a massively reduced cost in return for the provider being able to kick you off for a premium-paying customer. AWS Spot Instances are an example of this, but we know of several others (some of which are cheaper in certain use cases). http://www.strategic-blue.com. There is also a startup called MastodonC which moves Hadoop jobs around so that they are performed in the greenest data centers (which often corresponds to the cheapest, due to cheap green power).
James Mitchell
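As an illustration of the interruptible-capacity model mentioned above, here is a hedged sketch of requesting an AWS Spot Instance with boto3's `request_spot_instances` call (Spot capacity can also be requested via `run_instances` with `InstanceMarketOptions`). The AMI ID, instance type, key pair name, and maximum price are placeholders, not recommendations.

```python
# Sketch: request interruptible (Spot) capacity for a batch crunching job.
# All identifiers below are placeholders for illustration only.

import boto3

ec2 = boto3.client("ec2", region_name="us-east-1")

response = ec2.request_spot_instances(
    SpotPrice="0.10",        # maximum hourly price you are willing to pay
    InstanceCount=1,
    Type="one-time",         # AWS may reclaim the instance when capacity is needed
    LaunchSpecification={
        "ImageId": "ami-0123456789abcdef0",  # placeholder AMI
        "InstanceType": "m5.xlarge",
        "KeyName": "my-keypair",             # placeholder key pair
    },
)

request_id = response["SpotInstanceRequests"][0]["SpotInstanceRequestId"]
print("Spot request submitted:", request_id)
```

The trade-off is exactly the one described in the answer: the job must tolerate being interrupted, so it suits checkpointable batch work rather than latency-sensitive serving.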
http://www.tableausoftware.com/ http://www.atigeo.com/ http://www.climate.com/ (recently acquired). Many, in fact. And they seem to cluster in the Greater Seattle Area, which seems to be a rising Silicon Valley for Big Data. Also, if "cost-effective" is defined via profits per expenses, quite a few quant trading companies are processing big data really, really well ;-)
Dima Korolev
Solomonix is a Europe-based startup that uses medical epidemiology skills and algorithms to process big data in new, more cost-effective ways.
Stan Stalnaker