I hear a lot about privacy these days. More often than not, it’s about Facebook or Google and their disregard for privacy. I think part of the problem is that privacy means different things to different people.
I know for a fact that the way I view privacy online differs from how other people view it. I accept that the internet is changing rapidly and that the ideas of privacy that once existed are long gone. Online handles are a thing of the past (though there are still exceptions, hi Reddit).
I hear things like…
Ditch Google and use DuckDuckGo because they don’t track users. Don’t use gmail because they read your email. Ditch Facebook because Zuckerberg said privacy is dead.
In fact, the most recent knock against them is that Facebook Home can take over your phone and offers all kinds of potential privacy violations. Reading that story led me to write this blog post.
Let’s face facts here. The modern web and your personal data are intertwined. There is no going back.
When Google announced their unified privacy policy, it was actually a great leap forward. The same people who complained four years ago that Google was reading their email love Google Now.
Gmail will see that you booked plane tickets or had a package shipped and will personalize your Google Now experience. This was not possible five years ago. Google Glass was not even possible a year ago.
Will these companies use this data to show advertisements and make money? Yes, of course. As it turns out, they are companies that need to make a profit.
A privacy breach will happen from time to time on the modern web. I expect the modern web to be responsible and use SSL and OAuth to securely share my data. When companies are careless and screw things up, they deserve the bad press. But a story claiming that Facebook Home “destroys any notion of privacy” is complete and utter bullshit.
I’m not saying that I want to sign up for a service with my email address and phone number and they can turn around and sell this information to a third party. I don’t want them to publish my phone number without my consent. Those are violations of my privacy and not something I would agree to.
However, if I sign up for Facebook and they decide to use the fact that I liked ESPN in a personal advertisement, then so be it. I understand the tradeoffs of the modern web.
I find it funny that we get bombarded with credit card offers and flyers from ten different companies in the mail and no one seems to mind. We accept those privacy violations offline, even though they offer us nothing in return. Yet companies online who are innovating and using private data in new ways get so much grief.
No one said you had to join Facebook, use Google, or Twitter. If you want to pretend it’s 2002, then that’s fine. The rest of us are moving forward. We don’t need you to come along.
This is progress, people. Sit back, relax, and stop your whining.
I just read an article entitled “Why I’m Switching (Back) to Firefox” by Cameron Paul. He explained that Firefox has made huge improvements in the past couple of years, while Chrome has experienced slowness and memory issues as of late. Google is in it to make money, while Firefox stands for “freedom and privacy on the web.”
He goes on to say: “Chrome is starting to feel a lot like Firefox did in 2008.” Ouch.
It is true that Firefox has improved a lot over the past four years. They have neat things for developers, like a command line in the dev tools. At this point Firefox is more than capable of serving as your primary browser again. I use it on occasion, but mostly for testing purposes.
So is it time to switch? No.
They had my complete loyalty from 2004 to 2008. Every one of us who used their browser said the same thing: it was slow and crashed all the time. In those four years it felt like Firefox hardly improved at all. If anything, the experience got worse up until 2010.
Why did it take Google Chrome coming out for Mozilla to start working on Firefox’s core issues? Memory, responsiveness, and page render speed were core problems that existed for years.
For the past four years Firefox has spent its time playing catch-up to Chrome. How can a browser that had been out for a decade be so far behind Chrome technologically?
Since day one, Chrome handled extensions and tabs as separate processes to help alleviate memory issues. As far as I know, Firefox still does not operate in this manner.
Now, because Firefox has nearly caught up with Chrome, it’s time to switch?
Am I supposed to be impressed that Chrome finally gave them a roadmap to use?
Am I supposed to be impressed that they finally fixed their memory issues that lasted for 6 painful years?
If Firefox is about “freedom and privacy on the web”, then Google Chrome is about being as fast as possible and quick to adopt and develop cutting edge features.
I’ve been on the lookout for a service that would reliably speed up various web projects such as FantasySP and Top IAmA. The faster the pages load, the happier my users will be. I first started out using Cloudflare, but their service has been getting worse and worse. Their paid features aren’t worth the money, and using Cloudflare will most likely slow down response times. A winning combination!
Then I heard Google PageSpeed Service was in invite-only beta (not to be confused with mod_pagespeed, an Apache module that offers very similar features). Google PageSpeed Service is very similar to Cloudflare: all traffic is passed through their servers via a DNS change. I was finally accepted into the program and have spent the past few days tweaking my sites for best performance.
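For the curious, the DNS change is just a CNAME swap so requests hit Google’s servers first. A hypothetical zone entry might look something like this (the domain is a placeholder, and the exact CNAME target is whatever the PageSpeed Service console assigns you):

```
; hypothetical zone entry -- use the exact CNAME target shown
; in your own PageSpeed Service admin console
www.example.com.  3600  IN  CNAME  pagespeed.googlehosted.com.
```

Once the record propagates, Google fetches from your origin, optimizes, and serves the result from their edge.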
But before we get to the results, let’s first go over why FantasySP tends to load a little slowly to begin with.
Enter Google PageSpeed Service
Google PageSpeed went live on Christmas Day, December 25th.
Immediately all pages started to load in under 5 seconds. Best of all, no functionality on the site was broken, and I did not detect any additional latency by using Google’s service.
On December 26th I decided to enable additional experimental features that are labeled “high risk”. I also enabled a new low-risk feature called prefetch DNS resolve.
According to my testing, the previous bottlenecks like advertisements and rendering Google Charts no longer slow down pageloads.
Next up is performance as seen from Google Webmaster Tools. When I used Cloudflare, Googlebot noticed huge performance issues: Cloudflare caused my 200 ms response times to double and even triple. Will the same happen with Google’s service?
As you can see, passing traffic through Google PageSpeed servers does incur a penalty of about 100 ms in response times. Their servers are doing an awful lot of optimization behind the scenes, so this is not at all surprising. The trade-off is that the end user gets entire seconds shaved off their load times. I think I’ll take that any day.
More Advanced Optimization Techniques
Of course, I am not satisfied there and want to push the boundaries of what Google PageSpeed Service can do even further. Next up is to cache and prioritize visible content. Cloudflare has something similar, which they call Railgun. Railgun also requires you to run an extra process that sends HTTP data back and forth to Cloudflare, showing them which HTML content is stale so the rest can be cached. I have no idea how well Railgun performs, since no one has actually reviewed the service.
You’ll notice two things about this graph: 1) the response times on the far left are terrible, and 2) the response times under Google PageSpeed with cached content are extremely fast. Response times went from 200 ms to around 50 ms. The response times on the far left are a result of Cloudflare’s Full Caching feature with Rocket Loader enabled. As I mentioned earlier, avoid Cloudflare at all costs. The response times in the middle of the graph are from my server.
The error for IE10 shows as:
SCRIPT5007: Unable to get property ‘childNodes’ of undefined or null reference
f2ae86053f8f64f57a4ef28a17bd0669-blink.js, line 30 character 224
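I have not dug into the minified blink.js, but SCRIPT5007 is what IE10 throws when a script dereferences a DOM lookup that came back null or undefined. A defensive sketch of the pattern (the function and element names are mine, not from the actual file):

```javascript
// Hypothetical guard: return an element's childNodes, or an empty
// list when the lookup returned null/undefined -- the exact case
// IE10 chokes on with SCRIPT5007.
function safeChildNodes(node) {
  return node && node.childNodes ? node.childNodes : [];
}

// Usage sketch: iterate without risking the error, even when the
// element does not exist (outside a browser, it never will).
var el = typeof document !== "undefined" ? document.getElementById("widget") : null;
var count = safeChildNodes(el).length;
```

A rewriting proxy like PageSpeed can reorder or defer scripts, so a lookup that used to succeed may now run before the node exists; guarding the dereference is the cheap fix.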
The final set of data looks at the “Avg. Document Content Loaded Time (sec)” metric in Google Analytics under Site Speed -> DOM Timings. Google Analytics explains this metric as: “Average time (in seconds) that the browser takes to parse the document and execute deferred and parser-inserted scripts (DOMContentLoaded), including the network time from the user’s location to your server.”
I broke the data up by browser: Chrome, Internet Explorer, Safari, and Firefox. Internet Explorer comes in just a hair faster than Chrome, while Safari fares worse.
According to NewRelic browser metrics and testing, Google PageSpeed Service offers significant improvements to load times. It is relatively easy to set up and is better than any other competing service. Sites both big and small should use at least some of the services provided.
Google PageSpeed Service with advanced optimization techniques is a perfect fit for simple WordPress blogs that use Disqus commenting. All you have to do is add an ignore rule for /wp-admin/* and enjoy a much faster blog that should be able to handle an almost unlimited amount of traffic. I’d love to see results from anyone else out there willing to try.
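For the hosted Service the ignore rule is configured in the admin console, but for comparison, the open-source mod_pagespeed Apache module expresses the same idea as a directive, roughly:

```apache
# Keep the rewriter away from the WordPress admin --
# mod_pagespeed's equivalent of a PageSpeed Service ignore rule
ModPagespeedDisallow "*/wp-admin/*"
```

Either way, the point is the same: never let the optimizer cache or rewrite logged-in admin pages.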
Overall I’d say using Google PageSpeed is a no-brainer. Test it out and let me know your thoughts.
I hear a lot about alternative search engines. If you do some Googling on the subject, you’ll quickly realize that there are many “alternatives” out there. I’m not talking about companies such as Yahoo and Bing. Of course we know these guys are alright when it comes to search, but even they come up short. So let’s quickly ponder the challenges a search start-up faces in this space.
DuckDuckGo is the first one that comes to mind. I see them pop up quite a bit, since they constantly appear on Hacker News. The truth is that they would be fortunate to capture Bing’s share of search, let alone actually compete with Google.
The real question is why? Why not at least give it a try, right? After all, Google Chrome came out not too long ago and quickly captured a large chunk of the market. Keep in mind four things about that: 1) the browser market was no longer innovating at a fast pace; 2) Google has an immense amount of resources at their disposal; 3) Google saw inefficiencies and knew they could improve upon them; 4) developing a browser fit well into their existing business model and future business opportunities.
Recently Google released a video on an internal discussion about search. Watch that video and let this quote soak in for a moment:
The specific change discussed in this video improves spelling suggestions for searches with more than 10 words and it impacts only .1% of our traffic.
Google’s platform is so mature that these are the current problems that they are solving. Do you really think that is what is going on at DuckDuckGo and similar search engines? Hardly. They are trying to solve problems that were solved years ago and will be playing catch-up for years to come, assuming they are still around. Keep in mind .1% of Google’s traffic is at least 100 times more than DuckDuckGo’s TOTAL traffic right now.
The only hope a company has is to appeal to a specific niche or to hope Google implodes. A company that prides itself on being a general all-purpose search engine is wasting everyone’s time, because it will not come close to Google in terms of accuracy. Your best bet is that Google beats itself by doing stupid things: implementing changes that appear to hurt accuracy, favoring its own properties, or breaking referral data that is critical for SEO.
So which alternative search engines are worthwhile? Obviously Twitter’s search handles real-time queries better than Google. For example, if I want to know whether Firehost is having issues, I turn to a Twitter search.
So what problem does DuckDuckGo solve? Privacy concerns? I’m not so sure. People who complain about privacy are the same ones who use Twitter and Facebook on a daily basis. Five years ago, searches based on your previous search history were “revolutionary” and “the future”, but now they are “invasive”. Interesting how perceptions change, isn’t it?
Not too long ago I started to receive an endless stream of Google two-step verification codes on my cell phone. I had not changed anything, and with no way of stopping them I hoped the problem would resolve itself. Well, it turns out that it does not. What you need to do is use Google Authenticator to receive these codes instead of opting for SMS text messages.
To do this, go to your account settings for two-step verification and you will see the option to make the switch. Not sure where this is? Log into Gmail and click your name in the top right.
Google Authenticator works for both Android and Apple phones.
If you are like me, then you’re sick of the Google Chrome DEV channel getting screwed up by new bugs. (I thought that was the whole point of the Canary build?) Recent hair-pulling bugs include fonts no longer rendering properly, sluggish performance on Twitter, and random crashes that weren’t there a build ago.
Well, fear not. I just downgraded from Chrome DEV to BETA and kept ALL my user data. Here is what you do:
Backup your “User Data” folder first. For Windows 7 users it lives at C:\Users\YOUR-USERNAME\AppData\Local\Google\Chrome\User Data\ . You’ll notice that this folder is fairly big; mine weighed in at around 1 GB. If you are on an earlier version of Windows or a different OS, then head here for more paths.
Next, close Chrome and go to Add/Remove Programs to uninstall. When asked, do NOT remove your user data.
The average blog post is fairly easy for Google to index and crawl. However, what happens when a single page has several hundred comments? How does Google decide what’s important to include in a search query and what to ignore? Heck, in some cases the comments on a Reddit post are more important than the summary of the submitted article itself.
A lot of websites have been adding voting to comments for some time now. However, no search engine (as far as I know) has looked into taking these votes into account when crawling. I propose a new type of “rich snippet” syntax for Google: Comment Syntax.
What if websites included markup so Google could not only clearly identify comments, but also quickly pick the most popular and useful comments to showcase in certain search queries? This could be applied to sites such as Digg, Reddit, Yahoo Answers, Stack Overflow, and just about any WordPress blog.
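To make the idea concrete, here is a sketch of what such markup could look like, borrowing schema.org-style microdata. To be clear, this is my illustration, not an existing Google syntax; the property names and values are made up for the example:

```html
<!-- Sketch only: schema.org-style microdata for a voted-on comment.
     Property names and values are illustrative, not an adopted spec. -->
<div itemscope itemtype="http://schema.org/Comment">
  <span itemprop="author">example_user</span>
  <meta itemprop="upvoteCount" content="1482">
  <div itemprop="text">
    The most useful comment on the page would live here.
  </div>
</div>
```

With vote counts exposed like this, a crawler could rank the comments on a page the same way the site’s own users already have.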
Finally, an advanced Google keyword ranking checker that is free. The site, aptly named KeywordRankings, is geared towards SEO people looking to keep a history of where their keywords rank in Google.
Entire Service is free
Add unlimited domains
Add unlimited keywords
Each keyword ranking is checked once per day
Each domain is checked for indexed pages in Google once per day.
Graphs are created for each keyword’s history and each domain’s index history.
A secure, shareable public link is provided for each keyword to show clients.
Available for google.com, google.co.uk, google.ca, google.dk, google.es, and google.it.
Keyword Queue Ranking decides in what order your keywords get updated.
The site is coded in PHP with a jQuery front-end and is extremely fast. Anyone involved in search engine optimization should be happy with it. At the moment it is in invite-only mode so I can squash all of the bugs.
There have been questions as to how the site determines each user’s “Keyword Queue Ranking”. There are a multitude of factors, such as how long you’ve been a member. I cannot give out all the details, but I can say that it is an ongoing process and will be continuously tweaked.
Of course, if you’d prefer not to deal with the Keyword Queue Ranking, there is the option to donate to the site to get priority keyword checking. The site has not received any donations yet, so for just $1 you will be at the top of the list. Over time, if everyone donates $1, then you would need to donate another dollar to be first again.
Essentially the price for 1st in keyword priority is determined by you guys, not me. The site will show you where you rank in the keyword queue.
I am allowing 5 new signups. (1 SIGNUP LEFT) Enter code: fantasysp.com
If you have any questions or comments, respond below.
I’ve encountered a simple yet annoying problem with Google’s Gmail Notifier. It turns out that if you select the option to always use HTTPS, the notifier will not work. The fix, a registry edit, can be found on the Notifier page.
Surely Google can do a better job of notifying us of the problem rather than making us search for it. How about a message saying: if you use HTTPS, remember to download the fix?