I just read an article entitled “Why I’m Switching (Back) to Firefox” by Cameron Paul. He explains that Firefox has made huge improvements in the past couple of years, while Chrome has experienced slowness and memory issues as of late. Google is in it to make money, while Firefox stands for “freedom and privacy on the web.”
He goes on to say: “Chrome is starting to feel a lot like Firefox did in 2008.” Ouch.
It is true that Firefox has improved a lot over the past four years. It has neat things for developers, like a command line in the dev tools. At this point Firefox is more than capable of being your primary browser again. I use it on occasion, but mostly for testing purposes.
So is it time to switch? No.
They had my complete loyalty from 2004 to 2008. Every one of us who used their browser said the same thing: it was slow and crashed all the time. Within those four years it felt like Firefox hardly improved at all. If anything, the experience got worse up until 2010.
Firefox, and the state of web browsers in general, lacked innovation and vision. Chrome brought about the modern web as we know it. Without it, the JavaScript-heavy applications that exist today would have been impossible to develop.
Why did it take the release of Google Chrome for Mozilla to start working on Firefox’s core issues? Memory usage, responsiveness, and page render speed were core problems that existed for years.
For the past four years Firefox has spent its time playing catch-up to Chrome. How can a browser that has been around for a decade be so far behind Chrome technologically?
Since day one, Chrome handled extensions and tabs as separate processes to help alleviate memory issues. As far as I know, Firefox still does not operate in this manner.
Now, because Firefox has nearly caught up with Chrome, it’s time to switch?
Am I supposed to be impressed that Chrome finally gave them a roadmap to use?
Am I supposed to be impressed that they finally fixed their memory issues that lasted for 6 painful years?
If Firefox is about “freedom and privacy on the web,” then Google Chrome is about being as fast as possible and being quick to adopt and develop cutting-edge features.
I’ve been on the lookout for a service that can reliably speed up various web projects of mine, such as FantasySP and Top IAmA. The faster the pages load, the happier my users will be. I started out using Cloudflare, but their service has been getting worse and worse. Their paid features aren’t worth the money, and using Cloudflare will most likely slow down your response times. A winning combination!
Then I heard Google PageSpeed Service was in an invite-only beta. (Not to be confused with mod_pagespeed, which is an Apache module that offers very similar features.) Google PageSpeed Service is very similar to Cloudflare: all traffic is passed through their servers via a DNS change. I was finally accepted into the program and have spent the past few days tweaking my sites for best performance.
Before we get to the results, let’s first go over why FantasySP tends to load a little slowly to begin with.
What really slows down my site are advertisements and third-party JavaScript. No surprise there, right? I offer my members a banner-free experience, but the average Joe has to load a page that is slower than I’d like. To measure front-end performance I use New Relic. Prior to using Google PageSpeed Service, the average web page loaded anywhere from 6 to 8 seconds.
Enter Google PageSpeed Service
I started out using the Recommended Settings, which cover the things you are probably familiar with: remove whitespace, combine CSS, minify JavaScript/CSS, serve assets from Google’s CDN, optimize images, etc. I decided to go all in with Google even though I already used Amazon CloudFront as my CDN. Google adds some additional goodies, such as converting small images to inline data URIs, further reducing HTTP requests. It also automatically rewrites all image references in stylesheets to run off their CDN, even if the image isn’t local to my server. (Goodbye, Amazon CloudFront?)
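To give a concrete sense of what the inline-data optimization does, here is a rough sketch (in Node, not PageSpeed’s actual code) of turning a small image into a data URI so it no longer costs a separate HTTP request. The file name is just a placeholder.

```js
// Sketch of the image-inlining idea: embed small image bytes directly in the markup
// as a base64 data URI instead of referencing them with a separate HTTP request.
// (Not PageSpeed's actual code; 'icon.png' is a placeholder file.)
var fs = require('fs');

function toDataUri(filePath, mimeType) {
  var base64 = fs.readFileSync(filePath).toString('base64');
  return 'data:' + mimeType + ';base64,' + base64;
}

// <img src="icon.png"> effectively becomes <img src="data:image/png;base64,iVBORw0...">
console.log('<img src="' + toDataUri('icon.png', 'image/png') + '">');
```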
Google PageSpeed went live on Christmas day, the 25th.
NewRelic: Browser performance
Immediately all pages started to load in under 5 seconds. Best of all, no functionality on the site was broken, and I did not detect any additional latency by using Google’s service.
On December 26th I decided to enable additional experimental features that are labeled “high risk”. I also enabled a new low risk feature called prefetch DNS resolve.
The best way to deal with slow JavaScript is to defer it, which should prevent scripts from hanging the browser and slowing down page rendering. Google claims this is “high risk” and may break things, so I made sure to do thorough testing on a day with less traffic than normal.
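To make the idea concrete, here is a simplified sketch of how a defer-JavaScript rewrite generally works. This is not Google’s actual implementation (their rewriter and script type are their own); it just shows the pattern of marking scripts so the parser skips them, then executing them after the page has loaded.

```js
// Simplified defer-JavaScript pattern (not PageSpeed's actual code).
// Assume the rewriter changed <script> tags to a non-executable type so the HTML
// parser skips them, e.g. <script type="text/deferred" src="...">.
window.addEventListener('load', function () {
  var deferred = document.querySelectorAll('script[type="text/deferred"]');
  for (var i = 0; i < deferred.length; i++) {
    var original = deferred[i];
    var script = document.createElement('script');
    if (original.src) {
      script.src = original.src;   // external script: fetch and run it now
    } else {
      script.text = original.text; // inline script: copy the code over
    }
    original.parentNode.replaceChild(script, original); // executes in place
  }
});
```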
Once defer JavaScript was enabled, DOM processing decreased even further, whereas page rendering actually increased. Unfortunately I cannot explain why that is. (Experts, feel free to chime in.) So the next question might be: is deferring JavaScript even worth it, according to that graph? The difference between the 25th and 26th does not seem to matter much.
Let’s have a look at what deferring JavaScript does in a browser not known for speed: Internet Explorer 9. Once Google PageSpeed was enabled on December 25th, page load times dropped under 6 seconds for the first time in seven days. On the 26th, deferring JavaScript brought page load times down to around 3 seconds. Keep in mind that number includes banner advertisements.
Load Times in Internet Explorer 9
Clearly deferring javascript helps with IE9, but what about the other browsers?
Windows Firefox / Mac Chrome 23 / Windows Chrome 23
In every browser, you’ll notice that DOM processing and network times decreased, whereas page rendering times increased. My theory is that since the JavaScript is deferred, the DOM is processed a lot faster, but that doesn’t mean the page is fully rendered. Again, feel free to chime in here with your own explanation. I *think* the end user will feel as though the pages load a lot faster, despite the additional page rendering time.
Unfortunately I am unable to test Android and iPhone browser performance because those users are directed to a different subdomain. Plus, I don’t think Google supports those browsers for deferring JavaScript. Performance in older browsers like IE8/IE7 remains unchanged, because many of Google’s optimizations are for modern browsers only.
According to my testing, the previous bottlenecks like advertisements and rendering Google Charts no longer slow down pageloads.
Next up is to see performance from Google Webmaster Tools. When I used Cloudflare, Googlebot noticed huge performance issues. Cloudflare caused my 200 ms response times to double and even triple. Will the same results appear for Google’s Service?
Google Webmaster Tools
As you can see, passing traffic through Google PageSpeed servers does incur a penalty of about 100 ms in response times. Their servers are doing an awful lot of optimization behind the scenes, so this is not at all surprising. The trade-off is that the end user gets entire seconds shaved off their load times. I think I’ll take that any day.
More Advanced Optimization Techniques
Of course, I was not satisfied there and wanted to push the boundaries of what Google PageSpeed Service can do even further. Next up is to cache and prioritize visible content. Cloudflare has something similar, which they call Railgun. Railgun also requires you to run an extra process that sends HTTP data back and forth to Cloudflare so they can tell which HTML content is stale and which can be cached. I have no idea how well Railgun performs, since no one has actually reviewed the service.
Google’s cache and prioritize visible content works a bit differently. You do not need to run any additional service on your server. Instead, they parse the HTML based on a set of rules you specify and rely on JavaScript to speed up rendering times. Their website says: “Non-cacheable portions are stripped from the HTML, and the rest of the page is cached on PageSpeed servers. Visible content of the page is prioritized on the network and the browser so it doesn’t have to compete with the rest of the page.” Visible content means content the end user can actually see in their browser. I do not know whether it is user specific or whether they just base it on the most common resolution. Either way, anything above the fold loads immediately, whereas content below the fold may load via a JavaScript call.
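Here is a rough sketch of the general pattern as I understand it. This is not Google’s blink.js; the /fragments endpoint and data attribute are made up for illustration, but it shows how below-the-fold sections can be filled in by JavaScript after the cached, above-the-fold HTML is already on screen.

```js
// Rough illustration of prioritizing visible content (not Google's actual blink.js).
// The cached HTML ships with above-the-fold markup plus empty placeholders; each
// placeholder is filled in afterwards from a hypothetical /fragments/ endpoint.
document.addEventListener('DOMContentLoaded', function () {
  var placeholders = document.querySelectorAll('[data-deferred-section]');
  Array.prototype.forEach.call(placeholders, function (el) {
    var xhr = new XMLHttpRequest();
    xhr.open('GET', '/fragments/' + el.getAttribute('data-deferred-section'));
    xhr.onload = function () {
      el.innerHTML = xhr.responseText; // swap the placeholder for the real content
    };
    xhr.send();
  });
});
```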
If deferring JavaScript works on your site, then this feature should work once configured correctly. Here is their full video explaining the feature:
I went ahead and applied full caching to Top IAmA with no rules. This site has no advertisements and very little JavaScript, so response times here are a best-case scenario. Every page that gets rendered can be cached for 30 minutes. This means that Google only needs to fetch a page once every 30 minutes; otherwise it is served straight from their servers. Googlebot shows the results:
Googlebot crawl speed Top IAmA
You’ll notice two things about this graph: 1) the response times at the far left are terrible, and 2) the response times under Google PageSpeed with cached content are extremely fast. Response times went from 200 ms to around 50 ms. The response times at the far left are a result of Cloudflare’s full caching feature with Rocket Loader enabled. As I mentioned earlier, avoid Cloudflare at all costs. The response times in the middle of the graph are from my server.
I did attempt to add Google’s cache and prioritize visible content to FantasySP. Once I tallied up all of the IDs and classes needed for user-specific data, I previewed the feature site-wide. After some tweaking of class names, everything seemed to be working without any issues. However, I soon ran into occasional JavaScript errors under IE10 that would halt rendering of the full page.
This shows how fragile the feature is: a single JavaScript error will cause rendering to fail.
The error for IE10 shows as:
SCRIPT5007: Unable to get property ‘childNodes’ of undefined or null reference
f2ae86053f8f64f57a4ef28a17bd0669-blink.js, line 30 character 224
Every other browser seemed to have no issues whatsoever. IE10 seemed to load the fastest, with Chrome, Safari, and Firefox not too far behind. I applied this technique to a subset of pages on FantasySP that I feel confident will have no errors and can use the speed boost. Have a look at this Russell Wilson or Matt Ryan page. During page load, watch the console to see how different sections of the page load via JavaScript calls from blink.js. If you don’t see it on the first page load, refresh to see the cached version appear.
Google Analytics
The final set of data comes from the “Avg. Document Content Loaded Time (sec)” metric in Google Analytics, under Site Speed -> DOM Timings. Google Analytics describes this metric as: “Average time (in seconds) that the browser takes to parse the document and execute deferred and parser-inserted scripts (DOMContentLoaded), including the network time from the user’s location to your server.”
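If you want to sanity-check that number yourself, you can read roughly the same timing out of the browser with the Navigation Timing API. This is just a quick client-side approximation; Google Analytics’ own collection differs in the details.

```js
// Quick approximation of the GA "Document Content Loaded Time" metric using the
// Navigation Timing API (GA's own measurement and aggregation differ in detail).
window.addEventListener('load', function () {
  var t = window.performance && performance.timing;
  if (!t) return; // older browsers without the Navigation Timing API
  var domContentLoadedMs = t.domContentLoadedEventEnd - t.navigationStart;
  console.log('DOMContentLoaded fired after ' + domContentLoadedMs + ' ms');
});
```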
I broke the data up by browser: Chrome, Internet Explorer, Safari, and Firefox. Internet Explorer comes in just a hair faster than Chrome, while Safari seems to be the worst.
DOMContentLoaded
Google Analytics confirms what the New Relic graphs show: users will perceive page loads as much faster with JavaScript deferred.
Conclusion
According to NewRelic browser metrics and testing, Google PageSpeed Service offers significant improvements to load times. It is relatively easy to set up and is better than any other competing service. Sites both big and small should use at least some of the services provided.
Google PageSpeed Service with advanced optimization techniques is a perfect fit for simple WordPress blogs that use Disqus commenting. All you have to do is add an ignore rule for /wp-admin/* and enjoy a much faster blog that should be able to handle an almost unlimited amount of traffic. I’d love to see results from anyone else out there willing to try.
Overall I’d say using Google PageSpeed is a no brainer. Test it out and let me know your thoughts.
Update #2: This blog you are currently reading now uses Google PageSpeed Service with advanced caching techniques, such as Google’s cache and prioritize visible content.
I love data, especially data that can tell a story. I was looking at my GitHub account and wondering whether I could see a pattern in the times I make my commits. Naturally, I was curious and pulled up my punchcard.
By looking at my punchcard, what type of job do you think I have? Can you guess?
Github punchcard
I noticed a few trends:
The most obvious one is that I am not a morning coder and my commits generally don’t start appearing until around 10AM.
I clearly work seven days a week.
My work day does not end at 5PM.
I generally try to wrap up whatever I am working on by 5PM. This is especially true from Wednesday – Friday.
I work the least on a Saturday.
On Thursday and Friday I tend to work even later than earlier in the week. My guess is that I’m trying my best to finish what I’m working on before the weekend comes. On the weekend I generally add some polish to whatever I released during the week, so that tends to be more fun coding with less heavy lifting.
Monday and Tuesday I tend to work way past 5PM. My guess is that I’m eager to get going at what I’m coding and after the weekend I’m a bit more refreshed.
When it comes to a Saturday night I tend to code the latest. So much for partying. Boy, I’m such a geek.
Some things you may not know: I work for myself and run my own company, FantasySP. Hopefully my commits show that I am dedicated to the project. As the sole employee, I feel the pinch to get things done.
Is my commit pattern different from yours? Does your day job force you onto a much different schedule? Do salaried developers work seven days a week as well?
I encourage everyone to write about their own punchcards.
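If you want to dig into your own numbers beyond GitHub’s graph, the raw data is available from the API. Here is a quick Node sketch using the stats punch-card endpoint; OWNER/REPO are placeholders, unauthenticated requests are rate-limited, and the endpoint can return a 202 while GitHub computes the stats, in which case just retry.

```js
// Pull the raw punchcard data from the GitHub API: an array of [day, hour, commits]
// triples, where day 0 = Sunday. OWNER/REPO are placeholders for your own repo.
var https = require('https');

var options = {
  hostname: 'api.github.com',
  path: '/repos/OWNER/REPO/stats/punch_card',
  headers: { 'User-Agent': 'punchcard-script' } // GitHub's API requires a User-Agent
};

https.get(options, function (res) {
  var body = '';
  res.on('data', function (chunk) { body += chunk; });
  res.on('end', function () {
    if (res.statusCode !== 200) {
      console.log('Got HTTP ' + res.statusCode + ' (stats may still be generating, retry)');
      return;
    }
    JSON.parse(body).forEach(function (entry) {
      var day = entry[0], hour = entry[1], commits = entry[2];
      if (commits > 0) {
        console.log('day ' + day + ', ' + hour + ':00 -> ' + commits + ' commits');
      }
    });
  });
});
```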
People tend to form an opinion on an operating system immediately, way before they have had a chance to use it. For example, it is widely “known” that Windows Vista is crap. Yet if you sat down at a computer running Windows Vista, you would likely just assume it was running Windows 7. Perception is everything. Sometimes it’s wrong.
Windows 8 has a vastly different tiled Start screen in place of the Start menu. We all know this. At first glance, your immediate reaction to drastic change is negative. Completely normal. Then I saw my dad use Windows 8.
Section of Windows 8 Start Screen
For an older user, Windows 8 is a blessing. From their point of view, you are presented with a very inviting and straightforward screen of colorful tiles. Hidden away is the complicated desktop with dozens of icons. Gone is the Start menu with endless rows of applications. The new interface puts all of the everyday tasks right there in front of you: email, internet, weather, calendar, and more.
My dad never has to leave the easy-to-use Start screen. So how does this new interface simplify things?
Take email: my dad has accounts with both AOL and MSN, which caused a few problems:
Previously he would sign in and out of those each day.
He would not know if he had new mail or not until he signed in.
He had to learn different interfaces to send and receive email.
I could have fixed this by creating a Gmail account and having those emails funnel into it. But that would give him yet another email account and possibly confuse him even more. Not to mention Gmail might be too complicated, with labels and priority inbox and such; perhaps that’s not the best inbox for him. I wasn’t about to use Outlook, because that is even more complicated.
With the new Windows Mail app, it could not be easier. I added each of his email accounts to the Mail app, and he can easily switch between them. He also knows immediately when he gets a new email. The best part is that it all exists within the same easy-to-use interface.
What Microsoft did is actually VERY smart. They made an extremely easy-to-use interface that the MAJORITY of their users would be able to master. Both young and old can pick up this interface very quickly. They released Windows phones and tablets with the same exact interface. If I bought my dad a Surface tablet or a Windows Phone, he would immediately know how to use it.
That’s rather brilliant, don’t you think? All of those untapped users could buy a Windows Phone tomorrow and feel right at home.
At the same time, this new interface hurts the advanced user. We don’t want it and have no use for it. I have Windows 8 running and am extremely satisfied with every aspect except this new Start screen. For the advanced user, Windows 8 is an incremental update over Windows 7. The new interface is not a deal breaker, but I’m not exactly thrilled with it either. If I don’t want to use it, then I don’t have to. Advanced users can work around the problem.
It’s also blatantly obvious that by creating this interface Microsoft is yet again putting their apps front and center and conditioning their users to stay with Microsoft products.
I will keep this short and to the point. A lot of the news I see on tech blogs revolves around the same type of start-up: one that gets X amount of seed money, or an additional round of funding, or takes on SUPER CEO Joey Wallnuts, or makes some outrageous claim about growth and how using Node.js makes their work desks levitate.
In many cases, these places are not profitable, and in some cases they never intended to be. Their main goal is likely to flip the start-up and sell it off at the peak of its valuation before the shit hits the fan. Before people realize that what they are building is a pile of shit.
So what do they do? The CEO and investors pump up the start-up by feeding tech blogs heaps of bullshit that get written up as if they were facts. Before you know it, the bullshit keeps mounting and the start-up has a very high valuation within a year or two. The pump and dump is complete, and the investors make their money and move on to the next victim.
Another successful start-up saves the internets yet again!
The problem? These are not real companies, and they should not be referred to as real companies. The real start-ups out there are ones you’ve never heard of. In fact, it’s an insult to call them start-ups. They are internet companies whose main goal is profitability and longevity.
The average bootstrapped company’s story is boring. It often involves keeping costs down with slow but consistent growth over several years. The stack the company is built on is nothing special, either Rails or PHP. They failed to get funding or tech press. The problems they solve are often terribly boring and possibly highly technical, the company employs a handful of people, and, worse yet, no one gets rich overnight. They tend to put their heads down, plug away, and grow something out of nothing with hard work instead of smoke and mirrors.
Talk about a snoozefest. These may not be interesting, but they are the companies you should root for and aspire to be a part of.
September and October have been eyebrow-raising months for the SEO community, and the blogs are going nuts. As you know, a bunch of algorithm updates have occurred that may affect many sites and search queries. The blogs make it seem like the SEO-Apocalypse is upon us. But should you be nervous when these updates come out? I think not.
…Panda update that impacts 2.4% of English search queries and is still rolling out
Holy shitballs, that sounds scary. Will my site be impacted negatively? Should I hire an SEO consultant tomorrow to prevent the impending doom? All signs point to no (unless you want to hire me, of course).
All websites are designed for the end user. The end user should be able to find everything easily and access your content easily. Google has made it abundantly clear that SEO is about merging the end-user experience with a machine-readable experience. Google tries its best to mimic the end-user experience in its algorithms. Chances are that if your end users are having a lovely time on your site, then Google agrees.
Of course, we are talking in general terms here. It goes without saying that you should also make sure meta descriptions are present, that you are using proper HTML markup, etc. Just about every SEO-related change you can make to a website is about helping Google understand what the user experience is all about. If you have lots of incoming links from quality sites, then guess what: the end-user experience must be great at your site. If you tend to write articles about hot topics and include keyword-friendly titles, then chances are you will get more traffic. (Hint: this article’s title is both amusing and SEO friendly.)
So why all the doom and gloom from SEO blogs? Fear of the SEO-Apocalypse brings pageviews, so SEO blogs write about these updates in nauseating detail. Don’t get me wrong, I like to be in the know when an algorithm update hits, and you should be aware as well. But as it turns out, there are a lot of websites out there doing shady shit. Those websites will lose their traffic and then complain on SEO blogs as if they were innocent victims. If you are buying links (it’s not hard to tell that this is a paid article), exchanging links with crappy sites, running pages light on content and heavy on ads, or otherwise trying to game Google, then your site will be hit by these updates. There are a lot of terrible sites out there, and Google will (hopefully) eventually weed them all out.
That is why you should look forward to a major Google update: chances are it will help you rank higher. If you run a respectable site, follow the rules, and concentrate on the end-user experience, then you have nothing to worry about. Occasionally a respected site is hit negatively by an update, and those cases are usually rectified. Nobody’s perfect.
Now, are there gray areas in an algorithm update? Of course. Does Google just so happen to release updates that seem to favor their own properties and lead to more revenue? Probably. Does Google violate its own best practices by cramming five ads above the fold for many searches? You bet.
But the truth is that none of that should matter to you. Stick to the user experience. Stick to producing good content. Stick to making sure Google is able to properly crawl and present content in search results and you will be fine.
It appears as though Twitter is in the midst of rolling out Twitter Card summaries to more publishers. I have been seeing an increase in summaries shown in my Twitter feed from new accounts, but the rollout seems to be ongoing. No tech blogs have covered this as far as I know, so this post is an update on what I’ve been seeing, since I could not find any information out there.
So here is what I’ve seen so far:
Twitter.com does NOT seem to have Twitter Card support enabled for summaries (at one point it did), nor does TweetDeck. In fact, the only way to see Twitter Card summaries right now appears to be on a mobile device running the latest version of the Twitter app. I have confirmed that they work in the Android version, but I am not 100% sure whether the iOS version shows them.
Summaries do not appear for every tweet posted, even though the meta tags are valid and the account is supported. Some tweets end up with a summary, while others get ignored. I do not see any pattern to explain this. My best guess is that they may be having trouble crawling the URLs and grabbing the metadata in a timely manner.
Here are tweets that show with summaries on the mobile app, though I’d be surprised if they show up inline below:
Over the past few weeks I worked on and off on a side project called Top IAmA. Its goal was to present popular Reddit IAmAs in a more readable format. There were a few criteria the site had to meet in order for me to be happy with it:
The site design had to be minimal and lightweight. I wanted the content to do the talking.
It had to be able to handle large amounts of traffic, in the event it became popular.
It had to reliably and accurately collect questions and answers.
The design is something I quickly came up with in Photoshop. Once I had the basic framework up I just went straight to CSS/HTML and added some polish along the way. Check out early photoshop mockups of the homepage and article page:
Early homepage draft
Early article draft
As I mentioned, it needed some type of heavy caching in the event that the site became popular. The content of the site would not change frequently and there is no login system, so I knew caching would be a lot easier. I thought about creating a new cloud server to avoid any issues Top IAmA might cause for FantasySP, but ultimately I decided it would be safe to host it on the same server. The easiest thing to do was to use Cloudflare and “cache everything”: HTML, CSS, images, etc. Most HTTP requests will not even reach my server thanks to Cloudflare’s caching. In fact, it should be able to handle being on the homepage of Reddit.
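For what it’s worth, the origin still has to advertise that its pages are cacheable. Below is a minimal sketch of the kind of Cache-Control header involved, using Node’s built-in http module purely for illustration (not necessarily my actual stack); how Cloudflare’s “cache everything” rule interacts with these headers ultimately depends on their settings.

```js
// Minimal illustration of an origin telling edge caches (and browsers) that a page
// may be cached. Node's http module is used only for the sketch; Cloudflare's page
// rules and edge TTL settings can override these values.
var http = require('http');

http.createServer(function (req, res) {
  res.writeHead(200, {
    'Content-Type': 'text/html; charset=utf-8',
    'Cache-Control': 'public, max-age=1800' // allow caching for 30 minutes
  });
  res.end('<html><body>rendered page goes here</body></html>');
}).listen(8080);
```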
Once the project was complete, I submitted it to Hacker News, and much to my surprise the folks there really seemed to like it, as it hit the front page. I will continually tweak Top IAmA’s ability to accurately collect questions and answers. That is by far the toughest hurdle to overcome. The next phase is to collect comment links and indent questions when they are within the same thread. No easy task, but then again, it wouldn’t be fun if it were easy.
I was just checking out Googlebot’s response times with Cloudflare enabled and disabled. The chart below is straight from webmaster tools for FantasySP.
Googlebot Response Times in Crawl
Back in March and April I had Cloudflare enabled, and I noticed elevated response times. I thought it might be due to Cloudflare, so I disabled it. Back then, I thought 400+ ms was bad.
As you can see, May response times were much better with Cloudflare disabled. The only problem was that my server’s load also increased, so I re-enabled Cloudflare towards the end of May and kept it on up until a few days ago. Now response times are well above 1,000 ms. Why is their performance degrading so fast? This seems to fall in line with their announcement of Business and Enterprise plans. Perhaps the free accounts are suffering because of it?
The past two weeks have been filled with opinions from experts on Facebook and its IPO. I would say about 66% lean towards the negative side and say to avoid the stock at all costs. Why? Because GM pulled their ads. Because a survey said 40% of users will never click on an ad. Because experts on TV, who don’t even understand the web, feel that Facebook is overvalued.
I’m not an expert when it comes to stocks or IPOs. I’m not going to write about P/E ratios, shares outstanding, or their market cap. I am going to write about their product and its influence over the web.
Pages Per Visit & Reach
Facebook has roughly the same number of pageviews per day as Google, according to Alexa. They are the second most visited website on the web, just behind Google.
According to Alexa, Facebook is estimated to have 12.24 pages/visit. For comparison, Reddit is listed as having 10.6 pages/visit. So how accurate is this metric? Reddit’s own blog post from January shows 13.00 pages/visit.
This means that Alexa likely undercounts by almost 2.5 pages/visit, which would put Facebook closer to 15 pages/visit. As someone who runs a website and is a web developer, I assure you that 15 pages per visit is absolutely, insanely good. Their users are addicted and love Facebook. Few sites or companies can pull off those kinds of metrics.
Social is a Fad
Many experts believe that social media is a fad and/or Facebook could easily be replaced. I strongly disagree. Remember when web portals were going to be a thing of the past?
Yes, it’s true that Facebook is not the first social website. Friendster and MySpace came way before Facebook. Unfortunately, their websites lacked the innovation and sophistication that their user base was desperately craving. Facebook forced the social platform to grow up, and users graduated from MySpace/Friendster to Facebook.
Keep in mind that when you see commercials, Facebook and Twitter pages are often shown. Facebook does not pay these companies for the advertising. The companies are actually WILLING to showcase these two social networks because that is how important they are to advertising and reaching a customer base.
Today social websites are as important to the web as search engines. Speaking of search engines…
Google
If Google has taught us anything, it’s that being the first to the market means nothing. It’s about who innovates and pushes the platform to the next level. Facebook did that with social and Google did that with search. Altavista = Friendster. Excite = MySpace.
Sound familiar? Here is what the experts were saying back when Google was the young company being valued: “To see a market capitalization valuing Google as a mature company is assuming a best-case scenario which isn’t a for-sure outcome. It still has a long way to go to justify growing into that kind of market value,” said Michael Cohen, director of research with Pacific American Securities.
Cohen added that in addition to Yahoo!, Google will face increased pressure from Microsoft, which has been stepping up its research and development efforts in its MSN Internet business.
Potential Growth
Does anyone realize how Google makes money? They make most of their revenue from advertisements. Guess what other company makes most of its revenue from advertisements? Facebook. Some of my points here piggyback on VentureBeat’s awesome article.
The charts there show revenue growth compared to similar IPOs. Their take is that year-over-year growth is going down, not up. That is obviously a bad sign; however, the graph also indicates that Zynga’s year-over-year growth is better than Facebook’s. It’s amazing what you can make a graph do. It tells me absolutely nothing, because Zynga would not exist without Facebook. Meanwhile, LinkedIn has perhaps one-tenth the pageviews of Facebook. So how much weight should we really put behind something like this?
The truth is that Facebook’s growth potential is insanely high. They haven’t entered China, and their mobile platform is still in its infancy. Facebook realizes this and is already taking action to correct it, most notably by buying Instagram.
Their current revenue numbers could change drastically once they figure out mobile and their platform continues to mature.
Conclusion
What I wrote here is from a web developer’s perspective. I honestly don’t care about an investor’s expert opinion when he may have a hard time understanding the product. He looks at the revenue numbers and makes a judgement relative to the industry giants. What he sees on paper simply doesn’t add up to the hype or the valuation. That is completely understandable. Software and the scope of the web are not tangible things that people can easily wrap their heads around.
As I said, I am not a stock expert. I don’t dwell on P/E ratios or shares outstanding. I don’t know where their stock will be in 6 or 12 months, but I do know that Facebook, as a product, will be as popular as ever. And that, by itself, has to be worth something on the open market.