Categories: Google

Google Analytics Tracking Change on March 20th?

I noticed a sudden trend in my Google Analytics data starting on March 20th. I thought it was a fluke, until three days later the change in numbers still persisted.

All signs point to Google Analytics blocking an IP range and/or bots from being tracked.

Starting on March 20th, the number of pageviews and unique visits dropped suddenly. This also coincided with an increase in pages per visit (shown below).

Pages Per Visit

I have compared my pageview data to Clicky and AdSense, and their data shows no change. This leads me to only one conclusion: GA is blocking some type of traffic and has not announced it yet.

Did you discover similar findings?  Let me know.

Categories: Google, Rant

Your idea of privacy is dead

I hear a lot about privacy these days.  More often than not, it’s about Facebook or Google and their disregard for privacy.  I think part of the problem is that privacy means different things to different people.

I know for a fact that how I view privacy online is different from how other people view it.  I accept that the internet is changing rapidly and that the ideas of privacy that once existed are long gone.  Online handles are a thing of the past (though there are still exceptions, hi Reddit).

I hear things like…

Ditch Google and use DuckDuckGo because they don’t track users.  Don’t use Gmail because they read your email.  Ditch Facebook because Zuckerberg said privacy is dead.

In fact, the most recent knock against Facebook is that Facebook Home can take over your phone and opens up all kinds of potential privacy violations.  Reading that story led me to write this blog post.

Let’s face facts here.  The modern web and your personal data are intertwined.  There is no going back.

When Google announced their unified privacy policy, it was actually a great leap forward.  The people who complained four years ago that Google was reading their email now love Google Now.

Gmail will see that you booked plane tickets or had a package shipped and will personalize your Google Now experience.  This was not possible five years ago.  Google Glass was not even possible a year ago.

Will these companies use this data to show advertisements and make money?  Yes, of course.  As it turns out, they are companies that need to make a profit.

A privacy breach will happen from time to time on the modern web.  I expect the modern web to be responsible and use SSL and OAuth to share my data securely.  When they are careless and screw things up, they deserve the bad press.  But a story about Facebook Home saying that it “destroys any notion of privacy” is complete and utter bullshit.

That’s not to say that a service I sign up for with my email address and phone number should be able to turn around and sell that information to a third party.  I don’t want them to publish my phone number without my consent.  Those are violations of my privacy and not something I would agree to.

However, if I sign up for Facebook and they decide to use the fact that I liked ESPN in a personalized advertisement, then so be it.  I understand the tradeoffs of the modern web.

I find it funny that we get bombarded with credit card offers and flyers from ten different companies in the mail and no one seems to mind.  We accept those privacy violations offline, even though they offer us nothing in return.  Yet online companies that are innovating and using private data in new ways get so much grief.

No one said you had to join Facebook or use Google or Twitter.  If you want to pretend it’s 2002, then that’s fine.  The rest of us are moving forward.  We don’t need you to come along.

This is progress, people.  Sit back, relax, and stop your whining.

Categories: Google

Why I’m Sticking with Google Chrome

I just read an article entitled “Why I’m Switching (Back) to Firefox” by Cameron Paul.  He explained that Firefox has made huge improvements in the past couple of years, while Chrome has experienced slowness and memory issues as of late.  Google is in it to make money, while Firefox stands for “freedom and privacy on the web.”

He goes on to say: “Chrome is starting to feel a lot like Firefox did in 2008.” Ouch.

It is true that Firefox has improved a lot over the past four years.  They have neat things for developers, like a command line in the dev tools.  At this point Firefox is more than capable of being your primary browser again.  I use it on occasion, but mostly for testing purposes.

So is it time to switch? No.

They had my complete loyalty back in 2004-2008.  Every one of us who used their browser said the same thing: it was slow and crashed all the time.  In those four years it felt like Firefox hardly improved at all.  If anything, the experience got worse up until 2010.

Firefox, and the state of web browsers in general, lacked innovation and vision.  Chrome brought about the modern web as we know it.  Without it, the JavaScript-heavy applications that exist today would not have been possible.

Why did it take the release of Google Chrome for Mozilla to start working on Firefox’s core issues?  Memory, responsiveness, and page render speed were core problems that existed for years.

For the past four years Firefox has spent its time playing catch-up to Chrome.  How can a browser that had been out for a decade be so far behind Chrome technologically?

Since day one, Chrome has handled extensions and tabs as separate processes to help alleviate memory issues.  As far as I know, Firefox still does not operate in this manner.

Now, because Firefox has nearly caught up with Chrome, it’s time to switch?

Am I supposed to be impressed that Chrome finally gave them a roadmap to use?

Am I supposed to be impressed that they finally fixed their memory issues that lasted for 6 painful years?

If Firefox is about “freedom and privacy on the web,” then Google Chrome is about being as fast as possible and quick to adopt and develop cutting-edge features.

Categories: Cloudflare, Google, Optimization

Google PageSpeed Service Review

I’ve been on the lookout for a service that would reliably speed up various web projects such as FantasySP and Top IAmA.  The faster the pages load, the happier my users will be.  I first started out using Cloudflare, but their service has been getting worse and worse.  Their paid features aren’t worth the money, and using Cloudflare will most likely slow down response times.  A winning combination!

Then I heard Google PageSpeed Service was in invite-only beta (not to be confused with mod_pagespeed, which is an Apache module that offers very similar features).  Google PageSpeed Service is very similar to Cloudflare: all traffic is passed through their servers via a DNS change.  I was finally accepted into the program and have spent the past few days tweaking my sites for the best performance.

Before we get to the results, let’s first go over why FantasySP tends to load a little slowly to begin with.

What’s really slowing down my site are advertisements and third-party JavaScript. No surprise there, right?  I offer my members a banner-free experience, but the average Joe has to load a page that is slower than I’d like.  To measure front-end performance I use NewRelic.  Prior to using Google PageSpeed Service, the average web page loaded anywhere from 6 to 8 seconds.

Enter Google PageSpeed Service

I started out using the Recommended Settings, which cover the things you are probably familiar with: remove whitespace, combine CSS, minify JavaScript/CSS, use Google’s CDN, optimize images, etc.  I decided to go all in with Google even though I already used Amazon CloudFront as my CDN.  Google adds some additional goodies, such as converting small images to inline data URIs, further reducing HTTP requests. It also automatically converts all references to images in stylesheets to run off their CDN, even if the image is not local to my server.  (Goodbye, Amazon CloudFront?)
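
To make the inlining idea concrete, here is a minimal sketch of what turning a small image into an inline data URI looks like. PageSpeed does this rewriting automatically on its own servers, so you would never run anything like this yourself; the file name and MIME type below are made up for illustration.

// Illustrative only: PageSpeed performs this rewrite automatically.
import { readFileSync } from "node:fs";

function toDataUri(path: string, mimeType: string): string {
  const base64 = readFileSync(path).toString("base64");
  return `data:${mimeType};base64,${base64}`;
}

// A stylesheet reference like url(icons/star.png) becomes url(data:image/png;base64,...),
// which removes one HTTP request at the cost of a slightly larger stylesheet.
console.log(toDataUri("icons/star.png", "image/png"));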

Google PageSpeed went live on Christmas day, the 25th.

NewRelic: Browser performance

Immediately, all pages started to load in under 5 seconds.  Best of all, no functionality on the site was broken, and I did not detect any additional latency from using Google’s service.

On December 26th I decided to enable additional experimental features that are labeled “high risk.”  I also enabled a new low-risk feature called prefetch DNS resolve.

The best way to deal with slow JavaScript is to defer it, which should prevent scripts from hanging the browser and slowing down page rendering.  Google claims this is “high risk” and may break things, so I made sure to do thorough testing on a day with less traffic than normal.
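
For anyone wondering what “defer JavaScript” actually means, here is a minimal sketch of the general pattern; this is my illustration of the idea, not Google’s rewriter, and the placeholder type text/deferred-js is invented. Scripts ship with a type the browser ignores, so the page can render first, and they are swapped back in and executed after the load event.

// Sketch of the general deferral pattern (not PageSpeed's implementation).
window.addEventListener("load", () => {
  // Find every script that was neutralized with a placeholder type.
  document.querySelectorAll<HTMLScriptElement>('script[type="text/deferred-js"]')
    .forEach((placeholder) => {
      const script = document.createElement("script");
      if (placeholder.src) {
        script.src = placeholder.src;   // external script: request it now
      } else {
        script.text = placeholder.text; // inline script: copy its body
      }
      placeholder.replaceWith(script);  // inserting the live tag triggers execution
    });
});

Because nothing executes until after the page has rendered, none of those scripts can hang the initial paint, which lines up with the DOM processing drop discussed next.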

Once defer JavaScript was enabled, you’ll notice that DOM processing decreased even further, whereas page rendering actually increased.  Unfortunately, I cannot explain why that is (experts, feel free to chime in).  So the next question might be: is deferring JavaScript even worth it, according to that graph?  The difference between the 25th and 26th hardly seems to matter.

Let’s have a look at what deferring JavaScript does in a browser not known for speed: Internet Explorer 9.  After enabling Google PageSpeed on December 25th, page load times decreased to under 6 seconds for the first time in seven days.  On the 26th, deferring JavaScript decreased page load times to around 3 seconds.  Keep in mind that number includes banner advertisements.

Load Times in Internet Explorer 9

Clearly deferring JavaScript helps with IE9, but what about the other browsers?

Windows Firefox
Mac Chrome 23
Windows Chrome 23

In every browser, you’ll notice that DOM processing and network times decreased, whereas page rendering times increased.  My theory is that since the JavaScript is deferred, the DOM is processed a lot faster, but that doesn’t mean the page is fully rendered.  Again, feel free to chime in here with your own explanation.  I *think* the end user will feel as though the pages load a lot faster, despite the additional page rendering time.

Unfortunately, I am unable to test Android and iPhone browser performance because those users are directed to a different subdomain. Plus, I don’t think Google supports deferring JavaScript for those browsers.  Older browser performance in IE8/IE7 remains unchanged because many of Google’s optimizations are for modern browsers only.

According to my testing, the previous bottlenecks like advertisements and rendering Google Charts no longer slow down page loads.

Next up is performance as seen from Google Webmaster Tools.  When I used Cloudflare, Googlebot noticed huge performance issues: Cloudflare caused my 200 ms response times to double and even triple.  Will the same happen with Google’s service?

Google Webmaster Tools

As you can see, passing traffic through Google PageSpeed servers does add a penalty of about 100 ms to response times.  Their servers are doing an awful lot of optimization behind the scenes, so this is not at all surprising.  The trade-off is that the end user gets entire seconds shaved off their load times.  I think I’ll take that any day.

More Advanced Optimization Techniques

Of course, I was not satisfied there and wanted to push the boundaries of what Google PageSpeed Service can do even further.  Next up is cache and prioritize visible content.  Cloudflare has something similar, but they call it Railgun. Railgun also requires you to run an extra process to send HTTP data back and forth to Cloudflare, showing them which HTML content is stale so it can be cached.  I have no idea how well Railgun performs, since no one has actually reviewed the service.

Google’s cache and prioritize visible content works a bit differently.  You do not need to run any additional service on your server.  Instead, they parse the HTML based on a set of rules you specify and rely on JavaScript to speed up rendering times.  Their website says: “Non-cacheable portions are stripped from the HTML, and the rest of the page is cached on PageSpeed servers. Visible content of the page is prioritized on the network and the browser so it doesn’t have to compete with the rest of the page.”  Visible content means content that the end user can actually see in their browser.  I do not know whether it is user-specific or whether they just base it on the most common resolution.  So anything above the fold will load immediately, whereas content below the fold may load via a JavaScript call.
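
Here is a rough sketch of how I picture that working; this is my reading of the description above, not Google’s implementation, and the fragment URL and element ID are invented. The cached, above-the-fold HTML ships with an empty placeholder, and the rest of the page is fetched and injected by JavaScript once the visible portion has rendered.

// Fill the below-the-fold placeholder after the visible content has rendered.
async function loadBelowTheFold(): Promise<void> {
  const response = await fetch("/fragment/below-the-fold"); // hypothetical endpoint
  const html = await response.text();
  const slot = document.getElementById("below-the-fold");   // placeholder in the cached page
  if (slot) slot.innerHTML = html;
}

window.addEventListener("load", () => { void loadBelowTheFold(); });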

If deferring JavaScript works on your site, then this feature should work once configured correctly.  Here is their full video explaining the feature:

I went ahead and applied FULL caching to Top IAmA with no rules.  This site has no advertisements and very little JavaScript, so response times here are a best-case scenario.  Every page that gets rendered can be cached for 30 minutes.  This means that Google only needs to fetch a page once every 30 minutes; otherwise it serves the page straight from their servers.  Googlebot shows the results:

Googlebot crawl speed for Top IAmA

You’ll notice two things about this graph: 1) the response times at the far left are terrible, and 2) the response times under Google PageSpeed with cached content are extremely fast.  Response times went from 200 ms to around 50 ms.  The response times at the far left are a result of Cloudflare’s full caching features with Rocket Loader enabled; as I mentioned earlier, avoid Cloudflare at all costs.  The response times in the middle of the graph are from my own server.
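
For reference, a 30-minute cache lifetime is just max-age=1800 in HTTP terms. However PageSpeed actually picks up the lifetime (I am not certain whether it reads origin headers or only its own settings), the equivalent origin-side declaration looks like this minimal sketch:

// Tell any cache sitting in front of the origin that responses may be reused for 30 minutes.
import { createServer } from "node:http";

createServer((req, res) => {
  res.setHeader("Cache-Control", "public, max-age=1800"); // 30 minutes
  res.setHeader("Content-Type", "text/html; charset=utf-8");
  res.end("<html><body>rendered page</body></html>");
}).listen(8080);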

I did attempt to add Google’s cache and prioritize visible content to FantasySP.  Once I tallied up all of the IDs and classes needed for user-specific data, I previewed the feature site-wide.  After some tweaking of class names, everything seemed to be working without any issues, although I soon ran into occasional JavaScript errors under IE10 that would halt rendering of the full page.

This shows how fragile the feature is, because a single JavaScript error will cause rendering to fail.

The error for IE10 shows as:

SCRIPT5007: Unable to get property ‘childNodes’ of undefined or null reference
f2ae86053f8f64f57a4ef28a17bd0669-blink.js, line 30 character 224

Every other browser seemed to have no issues whatsoever.  IE10 seemed to load the fastest, with Chrome, Safari, and Firefox not too far behind.  I applied this technique to a subset of pages on FantasySP that I feel confident will have no errors and can use the speed boost.  Have a look at this Russell Wilson or Matt Ryan page.  During page load, watch the console to see how different sections of the page load via JavaScript calls from blink.js. If you don’t see it on the first page load, refresh to see the cached version appear.

Google Analytics

The final set of data looks at the “Avg. Document Content Loaded Time (sec)” metric in Google Analytics under Site Speed -> DOM Timings.  Google Analytics explains this metric as: “Average time (in seconds) that the browser takes to parse the document and execute deferred and parser-inserted scripts (DOMContentLoaded), including the network time from the user’s location to your server.”

I broke the data up by browser: Chrome, Internet Explorer, Safari, and Firefox.  Internet Explorer comes in just a hair faster than Chrome, while Safari seems to be the worst.

DOMContentLoaded

Google Analytics confirms what the NewRelic graphs show: users will perceive page loads as much faster with JavaScript deferred.

Conclusion

According to NewRelic browser metrics and testing, Google PageSpeed Service offers significant improvements to load times.  It is relatively easy to set up and is better than any other competing service.  Sites both big and small should use at least some of the services provided.

Google PageSpeed Service with advanced optimization techniques is a perfect solution for simple WordPress blogs that use Disqus commenting.  All you have to do is add an ignore rule for /wp-admin/* and enjoy a much faster blog that should be able to handle an almost unlimited amount of traffic.  I’d love to see results from anyone else out there willing to try.

Overall, I’d say using Google PageSpeed is a no-brainer.  Test it out and let me know your thoughts.

Update #1: Over at the Hacker News thread, Jeff Kaufman, who works with the PageSpeed team, gave some very insightful comments that are a must-read.

Update #2: This blog you are currently reading now uses Google PageSpeed Service with advanced caching techniques, such as Google’s cache and prioritize visible content.

Categories: Google

Your Search Engine Sucks

I hear a lot about alternative search engines.  If you do some Googling on the subject, you’ll quickly realize that there are many “alternatives” out there.  I’m not talking about companies such as Yahoo and Bing.  Of course we know that these guys are alright when it comes to search, but even they come up short.  So let’s quickly ponder the type of challenges a search start-up faces in this space.

DuckDuckGo is the first one that comes to mind.  I see them pop up quite a bit since they constantly appear on Hacker News.  The truth is that they would be fortunate to capture Bing’s share of search, let alone actually compete with Google.

The real question is why?  Why not at least give it a try, right?  After all, Google Chrome came out not too long ago and has quickly captured a large chunk of the market.  Keep in mind four things about that: 1) the browser market was no longer innovating at a fast pace, 2) Google has an immense amount of resources at their disposal, 3) Google saw inefficiencies and knew they could improve upon them, and 4) developing a browser fit well into their existing business model and future business opportunities.

Recently Google released a video on an internal discussion about search.  Watch that video and let this quote soak in for a moment:

The specific change discussed in this video improves spelling suggestions for searches with more than 10 words and it impacts only .1% of our traffic.

Google’s platform is so mature that these are the current problems they are solving.  Do you really think that is what is going on at DuckDuckGo and similar search engines?  Hardly.  They are trying to solve problems that were solved years ago and will be playing catch-up for years to come, assuming they are still around.  Keep in mind that .1% of Google’s traffic is at least 100 times more than DuckDuckGo’s TOTAL traffic right now.

The only hope a company has is to appeal to a specific niche or hope Google implodes.  A company that prides itself on being a general, all-purpose search engine is wasting everyone’s time, because it will not come close to Google in terms of accuracy.  Your best bet is that Google beats itself and does stupid things like implementing changes that appear to hurt accuracy, favoring its own properties, or breaking referral data that is critical for SEO.

So which alternative search engines are worthwhile?  Obviously Twitter’s search performs better than Google for real-time queries.  For example, if I want to know whether Firehost is having issues, I turn to a Twitter search.

So what problem does DuckDuckGo solve?  Privacy concerns?  I’m not so sure.  The people who complain about privacy are the same ones who use Twitter and Facebook on a daily basis.  Five years ago, searches based on your previous search history were “revolutionary” and “the future,” but now they are “invasive.”  Interesting how perceptions change, isn’t it?

Agree or disagree? I’d love to hear it.

Categories: Google

Two-Step Google Verification SMS Spam

Not too long ago I started to receive an endless stream of Google two-step verification codes on my cell phone.  I had not changed anything, I had no way of stopping it, and I hoped the problem would resolve itself.  Well, it turns out it does not.  What you need to do is use Google Authenticator to receive these codes instead of opting for SMS text messages.

To do this, go to your Account Settings for two-step verification and you will see the option to make the switch. Not sure where this is?  Log into Gmail and click your name in the top right.

Google Authenticator works for both Android and Apple phones.

Enjoy! 🙂

Categories: Google

Switch from Google Chrome DEV to Beta Channel

If you are like me, then you’re sick of the Chrome DEV channel getting screwed up by new bugs. (I thought that was the whole point of the Canary build?)  Recent hair-pulling bugs include fonts no longer rendering properly, sluggish performance on Twitter, and random crashes that weren’t there a build ago.

Well, fear not.  I just decided to downgrade from Chrome DEV to BETA and I kept ALL my user data.  Here is what you do:

  1. Back up your “User Data” folder first.  For Windows 7 users it lives at C:\Users\YOUR-USERNAME\AppData\Local\Google\Chrome\User Data\. You’ll notice that this folder is fairly big; mine weighed in at around 1 GB. If you are on an earlier version of Windows or a different OS, then head here for the other paths. (If you’d rather script this step, see the sketch after the list.)
  2. Next, close Chrome and go to Add/Remove Programs to uninstall it.  When asked, do not remove your user data.
  3. Now open up another browser and head over to the beta channel download page.
  4. After you install, Chrome will open and you’ll notice that it did not save any of your settings. Don’t panic.
  5. Close Chrome and head over to the folder again: C:\Users\YOUR-USERNAME\AppData\Local\Google\Chrome
  6. Rename “User Data” to “User Data OLD” (just in case) and then copy over your backed-up folder.
  7. Open Chrome and rejoice, all of your data is exactly how you left it.
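
If you would rather script the backup from step 1 than copy the folder by hand, here is a small convenience sketch in Node-flavored TypeScript; the backup folder name is my own choice and a reasonably recent Node version is assumed.

// Copy Chrome's profile folder to a sibling backup folder before uninstalling.
import { cpSync } from "node:fs";
import { join } from "node:path";

const userData = join(process.env.LOCALAPPDATA ?? "", "Google", "Chrome", "User Data");
cpSync(userData, `${userData} Backup`, { recursive: true });
console.log(`Profile copied to ${userData} Backup`);
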
No more weird DEV bugs!  Enjoy. 🙂

Categories: Google, SEO

Is it Time for a New Google Syntax for Comments?

The average blog post is fairly easy for Google to crawl and index.  However, what happens when a single page has several hundred comments?  How does Google decide what’s important to include in a search query and what to ignore?  Heck, in some cases the comments on a Reddit post are more important than the actual summary of the article submission itself.

A lot of websites have been adding a voting aspect to comments for some time now.  However, no search engine (as far as I know) has looked into taking these votes into account when crawling.  I propose a new type of “rich snippet” syntax for Google: Comment Syntax.

What if websites included this syntax so Google could not only clearly identify comments, but also quickly pick the most popular or useful ones to showcase in certain search queries?  This could be applied to sites such as Digg, Reddit, Yahoo Answers, Stack Overflow, and just about any WordPress blog.
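
To make the idea concrete, here is a rough sketch of what such markup could look like, written in the same microdata style that existing rich snippets use.  Every attribute value here is invented for illustration; this vocabulary does not exist yet, which is exactly the point of the proposal.

<div itemscope itemtype="http://example.org/Comment">
  <span itemprop="author">some_redditor</span>
  <meta itemprop="upvoteCount" content="1482">
  <meta itemprop="downvoteCount" content="37">
  <p itemprop="commentText">This comment is more useful than the article it replies to.</p>
</div>

With something like that in place, Google could rank or excerpt individual comments by their vote counts instead of treating the whole comment section as one blob of text.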

What do you guys think?  Would this be helpful?

Categories: Google, SEO

Free Google Search Keyword Tracking

Finally, an advanced Google keyword ranking checker that is free.  The site is aptly named KeywordRankings and is geared toward SEO folks looking to keep a history of where their keywords rank in Google.

Feature List:

  1. The entire service is free.
  2. Add unlimited domains.
  3. Add unlimited keywords.
  4. Each keyword ranking is checked once per day.
  5. Each domain is checked for indexed pages in Google once per day.
  6. Graphs are created for each keyword’s history and each domain’s indexed-page history.
  7. A secure, shareable public link is provided for each keyword to show clients.
  8. Available for google.com, google.co.uk, google.ca, google.dk, google.es, and google.it.
  9. Keyword Queue Ranking decides in what order your keywords get updated.
  10. Demo account available: log in as [email protected] / demo.

The site is coded in PHP with a jQuery front-end and is extremely fast.  Anyone involved in search engine optimization should be happy with it.  At the moment it is in invite-only mode so I can squash all of the bugs.

There have been questions as to how the site determines each user’s “Keyword Queue Ranking.”  There are a multitude of factors, such as how long you’ve been a member.  I cannot give out all the details, but I can say that it is an ongoing process and will be continuously tweaked.

Of course, if you’d prefer not to deal with the Keyword Queue Ranking, the option to donate to the site is available to get priority keyword checking.  The site has not received any donations yet, so for just $1 you will be at the top of the list.  Over time, if everyone donates $1, then you would need to donate another dollar to be first again.

Essentially, the price for first place in keyword priority is determined by you guys, not me.  The site will show you where you rank in the keyword queue.
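
Purely to illustrate the ordering described above, here is a guess at what such a scheme could look like; it is not the site’s actual algorithm, which is not public.  Donations take priority, and membership age breaks ties.

// Illustrative guess at a queue ordering, not KeywordRankings' real logic.
interface Member {
  name: string;
  donationTotal: number; // dollars donated, highest goes first
  memberSince: Date;     // older accounts win ties
}

function queueOrder(members: Member[]): Member[] {
  return [...members].sort((a, b) =>
    b.donationTotal - a.donationTotal ||
    a.memberSince.getTime() - b.memberSince.getTime()
  );
}

// A $2 donor outranks everyone who has given $1, which matches the
// "donate another dollar to be first again" behavior described above.
console.log(queueOrder([
  { name: "alice", donationTotal: 1, memberSince: new Date("2010-05-01") },
  { name: "bob",   donationTotal: 2, memberSince: new Date("2011-01-15") },
]).map((m) => m.name)); // [ "bob", "alice" ]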

I am allowing 5 new signups. (1 SIGNUP LEFT)  Enter code: fantasysp.com

If you have any questions or comments, respond below.

Categories: Google

Gmail Notifier Not Working? It’s because of HTTPS

I’ve encountered a simple yet annoying problem with Google’s Gmail Notifier.  It turns out that if you select the option to always use HTTPS, the notifier will not work.  The fix, a registry edit, can be found on the Notifier page.

Surely Google could do a better job of notifying us of the problem without making us search for it.  How about a message saying that if you use HTTPS, remember to download the fix?