I run a public proxy server because it generates a lot of internet fog for me, and I do my best to support freedom of speech on the internet. I have written many proxy server applications over the years; HTTP is my middle name.
Now my poor old proxy server is getting flooded with web-bot traffic: bots fake-clicking adverts (contracts sold on Tor) or referer-spamming like mad. The program soon works out whether a client is human or not by looking at the number of requests for image files and style-sheets, and it will also test the client using a 302 HTTP redirect.
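The heuristic above can be sketched roughly as follows. This is an illustrative guess at the approach, not my actual code: the thresholds, the asset-extension list, and the per-IP bookkeeping are all assumptions. The idea is that a real browser fetches images and style-sheets alongside every page and follows redirects, while a crude bot requests pages only.

```python
from collections import defaultdict

# Per-client request counters, keyed by client IP (illustrative sketch).
stats = defaultdict(lambda: {"pages": 0, "assets": 0, "followed_302": False})

# File types a real browser pulls in automatically (assumed list).
ASSET_EXTENSIONS = (".png", ".jpg", ".gif", ".css", ".js", ".ico")

def record_request(client_ip, path):
    """Count each request as either a page or a supporting asset."""
    s = stats[client_ip]
    if path.lower().endswith(ASSET_EXTENSIONS):
        s["assets"] += 1
    else:
        s["pages"] += 1

def looks_like_bot(client_ip):
    """Flag clients that request pages but never the images or
    style-sheets those pages reference, or that ignore a 302
    redirect probe. The threshold of 5 pages is an assumption."""
    s = stats[client_ip]
    if s["pages"] >= 5 and s["assets"] == 0:
        return True
    if s["pages"] >= 5 and not s["followed_302"]:
        return True
    return False
```

A redirect probe would set `followed_302` when the client requests the Location target of a 302 the proxy deliberately issued.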
If they turn out to be web-bots, I feed them bunk cached data, and they eat it up all day long without the program wasting my bandwidth relaying anything from the origin server.
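Serving stale cached data to flagged clients might look something like this minimal sketch. The cache structure and the empty-200 fallback are assumptions; the point is simply that a flagged bot never causes an upstream fetch.

```python
# Hypothetical store of stale responses, keyed by request path.
bunk_cache = {}

def handle(client_ip, path, fetch_upstream, is_bot):
    """Answer bots from the stale cache; only real clients cost
    upstream bandwidth. `fetch_upstream` is whatever callable the
    proxy uses to relay a request to the origin server."""
    if is_bot:
        # Whatever stale copy we have, or an empty 200 OK --
        # the origin server is never contacted on a bot's behalf.
        return bunk_cache.get(path, (200, b""))
    response = fetch_upstream(path)
    bunk_cache[path] = response  # keep a copy to feed future bots
    return response
```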
In other words, your SEO programs are useless and not very well written. Even if I let all your Google Analytics requests through to Google, you all run far too fast, and Google would soon start dumping the requests; they are not stupid, even if they are on the make.
Slow down a bit, check that you actually got an HTTP 200 OK, and try to make your bots look human, because you are fooling no one.
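For the bot writers: the bare minimum of that advice looks like this. A sketch only; the delay, redirect limit, and the injected `fetch` callable (returning status, headers, body from whatever HTTP library you use) are my assumptions.

```python
import time

def polite_get(url, fetch, delay=1.0, max_redirects=3):
    """Pace requests, follow 302 redirects, and verify the final
    status is 200 OK instead of blindly trusting the body.
    `fetch` is any callable returning (status, headers, body)."""
    for _ in range(max_redirects + 1):
        status, headers, body = fetch(url)
        if status in (301, 302) and "Location" in headers:
            url = headers["Location"]
            time.sleep(delay)  # pace yourself between hops
            continue
        if status == 200:
            return body
        return None  # anything but 200 OK means you were rejected
    return None
```

A client that ignores the redirect probe or never checks the status code is exactly what gets flagged and fed bunk data.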