How can we get faster access to Google Analytics data export API ?

  • We've built a real-time web analytics dashboard for a customer, using data we export from Google Analytics. For safety reasons we chose not to store the traffic data, so on every load the dashboards have to connect to Google Analytics to fetch it, and that takes very long. Loading time is quite high: our slowest dashboard makes 5 or 6 requests to GA and can take up to 60 seconds (for example, comparing each indicator with the previous time period doubles the loading time). * Any ideas? * How do web analytics professionals usually deal with this issue? * Are there tweaks that could help us, or online posts dealing with this subject? * Is there a way to get faster access by contacting Google?

  • Answer:

    I've been dealing with the same issue and could only find one solution for it: run a cron job and cache the data for a limited period of time. This way the dashboard would load almost instantly.
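A minimal sketch of that approach, assuming a Python backend. The `fetch_ga_report` function here is a hypothetical stand-in for whatever slow call your dashboard currently makes to Google Analytics; a cron job (or any scheduler) would call `get_report` periodically so user requests always hit the warm cache:

```python
import time

CACHE_TTL = 15 * 60  # keep each cached report for 15 minutes
_cache = {}          # report key -> (timestamp, data)

def fetch_ga_report(report_key):
    # Hypothetical stand-in for the real (slow) Google Analytics request.
    return {"report": report_key, "rows": []}

def get_report(report_key):
    """Return a cached report, hitting Google Analytics only when it has expired."""
    now = time.time()
    cached = _cache.get(report_key)
    if cached is not None and now - cached[0] < CACHE_TTL:
        return cached[1]  # still fresh: no round-trip to GA
    data = fetch_ga_report(report_key)
    _cache[report_key] = (now, data)
    return data
```

With the cron job refreshing the cache in the background, dashboard requests never wait on Google Analytics at all; the TTL controls how stale the data is allowed to get.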

Claudiu Murariu, via Quora

Other answers

It sounds like you can't cache or store the data you get. I agree with Claudiu Murariu that that is the best approach, but if you can't do it, there are still other things you can try.

Optimize the query

If your loading time is 60 seconds, first look at optimizing your query. A few tips:

1. Shorten the time span or aggregate dates. If you use the date as a dimension, long time spans (longer than 1 month) slow things down a lot. If you're just comparing week-over-week or month-over-month data, consider using ga:week or ga:month as your dimension.
2. Make sure you're using the v3 API with JSON. JSON is a smaller download, which might speed things up for you.
3. Filters are slow. Plain and simple: they slow things down, and it gets worse over long time periods.
4. Reduce the number of dimensions. Same here: extra dimensions make things slow.

Optimize your pulling methods

There's only so much you can do to optimize the query; the rest is up to you on your end.

1. Paging + date ranges + parallel requests = VERY fast. If you split up the request by date range and by pages (using "start-index" and "max-results"), the individual requests will be very fast. I've been able to issue all those requests in parallel and aggregate them in memory a little later.
2. Send Accept-Encoding: gzip in your header. This compresses the returned data, speeding up the download itself.
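The splitting step above can be sketched as follows. This is an illustrative Python snippet, not a complete client: `build_paged_requests` only constructs the v3 Core Reporting parameter sets ("start-index"/"max-results" per date chunk), which you would then fire concurrently with whatever HTTP library you use, sending the Accept-Encoding: gzip header on each request:

```python
from datetime import date, timedelta

def split_date_range(start, end, days_per_chunk=7):
    """Split [start, end] into consecutive sub-ranges of at most days_per_chunk days."""
    chunks = []
    cursor = start
    while cursor <= end:
        chunk_end = min(cursor + timedelta(days=days_per_chunk - 1), end)
        chunks.append((cursor, chunk_end))
        cursor = chunk_end + timedelta(days=1)
    return chunks

def build_paged_requests(start, end, page_size=1000, pages=3):
    """Build one set of query params per (date chunk, page) pair."""
    requests = []
    for chunk_start, chunk_end in split_date_range(start, end):
        for page in range(pages):
            requests.append({
                "start-date": chunk_start.isoformat(),
                "end-date": chunk_end.isoformat(),
                "start-index": page * page_size + 1,  # GA result indices are 1-based
                "max-results": page_size,
            })
    return requests
```

Because each request covers a short span and a single page, every one of them returns quickly; issuing them all in parallel and merging the row sets in memory is what makes this pattern fast.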

Christopher Le
