Continuing my series of posts about Web Performance Optimization (WPO), here is another thought: use a Content Delivery Network (CDN) to speed up web pages and save money.
Even though bits travel fast, it all comes down to distance and the number of bits. The closer you are to your users, the faster your web pages will load. That's where CDNs help us web site owners. While a robust web site might have geographically distributed content servers for performance and redundancy, maintaining this infrastructure comes at a cost.
CDNs provide a balanced distribution platform that allows content providers to store resources closer to their clients, making everything a bit faster. Here at Geekzone we currently use MaxCDN, but have also played with Fastly and Amazon CloudFront. We currently have a mixed DNS and CDN solution (which I will expand on in another post).
CDNs can be used in many different ways. The most common are Push and Pull. With Push CDNs you are responsible for loading your web resources to their servers, while Pull CDNs will automatically retrieve your web resources from a nominated origin server when a request first comes in.
Below is the stats panel for one of our CDN configurations with MaxCDN, where you can see how the content is distributed through the nodes and how much data is used up every day:
And below you can see the traffic (in number of hits) including cache hits and non-cache hits:
Coming from New Zealand, where data traffic is usually one of the highest costs in a web site operation, CDNs have the side effect of helping web site owners save on traffic. You can see that our CDN serves something between 400 MB and 1.2 GB a day, depending on traffic, with 90% cache hits. This means 90% of the requests are served from the CDN caches directly, without ever reaching our servers.
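To make the traffic saving concrete, here is a small sketch of the arithmetic, using the figures quoted above (the function name is my own, for illustration):

```python
def origin_traffic_mb(total_mb, cache_hit_ratio):
    """Traffic (in MB) that still reaches the origin server after the
    CDN has answered its share of requests from cache."""
    return total_mb * (1 - cache_hit_ratio)

# On a busy 1.2 GB day with 90% cache hits, only about 123 MB of
# requests ever touch our own servers.
print(origin_traffic_mb(1.2 * 1024, 0.90))
```

The other 90% is bandwidth we no longer pay to serve ourselves.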
CDN configuration can be as simple as creating new DNS records pointing a resource domain to the CDN subdomain created for your specific configuration. If your web site doesn't currently use a separate domain for serving those resources (images, scripts, CSS, static HTML), there are solutions that can automatically rewrite those URLs when a page is requested.
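For a Pull CDN the change really can be a single DNS record. A minimal sketch, using a hypothetical cdn.example.com resource domain and a made-up CDN-assigned hostname:

```
; Hypothetical zone file fragment: the resource subdomain becomes an
; alias for the hostname the CDN provider assigned to this pull zone.
cdn.example.com.    IN    CNAME    example-zone.provider-cdn.com.
```

Once the record propagates, requests for cdn.example.com are answered by the CDN's nearest node, which pulls from your origin on the first miss.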
When using a CDN it's important to make sure your web resources are configured with appropriate cache expiry and public caching headers. If this is not possible to configure on your server, there's always a setting on the CDN that will allow you to override the origin server's settings with new default values.
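As a sketch of the server side, a hypothetical Apache configuration (assuming mod_expires and mod_headers are enabled) could mark static resources as publicly cacheable for 30 days:

```apache
# Illustrative only - adjust the types and lifetimes to your own site.
<IfModule mod_expires.c>
    ExpiresActive On
    ExpiresByType image/png "access plus 30 days"
    ExpiresByType text/css "access plus 30 days"
    ExpiresByType application/javascript "access plus 30 days"
</IfModule>
<IfModule mod_headers.c>
    # 2592000 seconds = 30 days; "public" lets shared caches (the CDN) keep it.
    Header set Cache-Control "public, max-age=2592000"
</IfModule>
```

In practice you would scope the Cache-Control header to the static resource paths only, so dynamic pages aren't cached by mistake.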
In another post I will talk about latency - make sure to subscribe to my RSS feed. Of course if you run a web site and think a Web Performance Optimization project could help you improve metrics, please contact me and we can work on this.
Continuing my series of posts about Web Performance Optimization (WPO), here is a thought: focus on high impact web pages first. This might seem obvious when you read it, but from my experience most people don't actually put limits to a WPO project and over time the benefits are diluted.
The first thing to do is to identify possible candidates for a WPO project. In a previous project we found that one single script received 80% of all requests. We (the web site owner and myself) decided to concentrate efforts on this web page first.
Basically, we applied the Pareto Principle and concentrated our efforts on the page responsible for 80% of total requests, using only 20% of the overall time of a full WPO project, with more immediate results. That leaves time to work on the remaining 20% of pages, which could take up to 80% of the project time, if needed.
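The selection step can be sketched in a few lines. The page names and shares below are made up for illustration; in a real project they come from your analytics export:

```python
# Hypothetical analytics data: (page, share of total requests).
pages = [
    ("/forum/topic.asp", 0.80),
    ("/news.asp", 0.12),
    ("/profile.asp", 0.05),
    ("/about.asp", 0.03),
]

def pareto_candidates(pages, threshold=0.80):
    """Return the smallest set of pages (busiest first) whose combined
    share of requests reaches the threshold."""
    selected, covered = [], 0.0
    for page, share in sorted(pages, key=lambda p: p[1], reverse=True):
        if covered >= threshold:
            break
        selected.append(page)
        covered += share
    return selected

print(pareto_candidates(pages))  # ['/forum/topic.asp']
```

In the project described above a single page already covered 80% of requests, so the candidate list had exactly one entry.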
Obviously if you have a page that is hit only a few times a day but still manages to bring the whole web site down, then this should be looked at too.
The tool of choice for this part of the project is web site analytics (Google Analytics is my favourite - it's free!). Data needs to be collected for a while to help determine the exact focus of the sub-project.
Once a web page is selected, a holistic approach takes place. Waterfall diagrams (I will talk about these in a later post) can be used to determine the balance of back end and browser side load times, helping decide which side needs the more urgent attention. Scripts can monitor events and report back with signals that pinpoint specific areas causing slow rendering on the client side.
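As a rough illustration of that back end vs browser side split, one can total up the timing phases from a waterfall export (browser developer tools can produce these as HAR files). The entries and numbers below are entirely made up:

```python
# Hypothetical waterfall entries (milliseconds). "wait" approximates
# back-end time (time to first byte); "receive" is transfer and
# client-side delivery time.
entries = [
    {"url": "/", "wait": 420, "receive": 80},
    {"url": "/style.css", "wait": 30, "receive": 110},
    {"url": "/app.js", "wait": 25, "receive": 240},
]

backend = sum(e["wait"] for e in entries)
frontend = sum(e["receive"] for e in entries)
print(backend, frontend)  # whichever is larger gets attention first
```

Real waterfalls have more phases (DNS, connect, render), but even this crude split tells you where to start digging.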
I will keep posting in this series - make sure to subscribe to my RSS feed. Of course if you run a web site and think a Web Performance Optimization project could help you improve metrics, please contact me and we can work on this.
For the last couple of years I have worked on Geekzone to improve web performance, with great results. What started me on this path was our limited computing resources at the time and a desire to use those resources in the best way possible. From there I found out more and more about Web Performance Optimization and accumulated some tricks that could help some of you out there.
There are so many variables in web performance that in most cases just adding more CPU power or memory won't necessarily speed up the user experience on the other side of the wire. That is because most of the time spent by a web browser loading a web page is due to factors other than the server's processing capability. Steve Souders says 80% - 90% of the end-user response time is spent on the front end.
Any time a web page takes longer than a couple of seconds to load there is a chance the visitor will simply close the window or navigate away to another web site. In some cases this is reflected in lost sales. The faster your web site, the easier it is for customers to transact with your business.
While web site designers and developers can't control the line speed at the end, they can control what's loaded on the browser when their web pages are rendering. Being smart about it is how we implement web performance optimization.
There are of course things that can be done on the server side as well. Recently I was asked by a friend to find out why his retail web site was performing so badly, and why at peak times his Microsoft SQL Server would just grind to a halt, often requiring a few reboots a day. This was obviously something on the server side, not the client side.
First I found his server was running in a virtual environment with minimal memory, causing the database server to spend most of its time swapping between memory and disk. I also looked at his Google Analytics reports and found 80% of the traffic was landing on a single script, which we agreed to concentrate the work on. I found things such as a "SELECT *" query over a table containing a few million records, with a long "WHERE" clause and a similarly long "ORDER BY" clause - and not a single index defined in the entire database. And this sort of query was used about ten times in that page alone.
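A small sketch of why the index mattered, using SQLite for illustration (the actual site ran Microsoft SQL Server; the table and column names here are made up):

```python
import sqlite3

con = sqlite3.connect(":memory:")
con.execute("CREATE TABLE products (id INTEGER, category TEXT, price REAL)")

query = "SELECT id, price FROM products WHERE category = ? ORDER BY price"

# Without an index the planner has to scan the whole table for every query.
plan_before = con.execute("EXPLAIN QUERY PLAN " + query, ("widgets",)).fetchone()[-1]

# An index covering both the WHERE and the ORDER BY columns lets the
# planner seek directly to matching rows and skip the sort entirely.
con.execute("CREATE INDEX idx_cat_price ON products (category, price)")
plan_after = con.execute("EXPLAIN QUERY PLAN " + query, ("widgets",)).fetchone()[-1]

print(plan_before)  # a full table scan
print(plan_after)   # a search using idx_cat_price
```

Over a table with a few million rows, and with the query run ten times per page, the difference between a full scan and an index seek is the difference between a crashing server and a responsive one.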
Needless to say, working on that script was enough to practically solve the server side of the problem. And solving the server problem also improved revenue, since there were no more lost sales due to server crashes and slow response times.
But sometimes things are not that clear and the boundary between server and client performance is fuzzy. In some cases web and database servers are working just fine and a web page still takes a long time to load. That's where Web Performance Optimization projects come in.
I've created a new Web Performance Optimization category in my blog and I will start posting small tips and comments about tools I have used.
Of course if you run a web site and think a Web Performance Optimization project could help you improve metrics, please contact me and we can work on this.
A couple of things Windows Phone could do better:
- If I plug my Windows Phone to my PC and use Zune to search the marketplace for an app I have already purchased (but it's not on the handset for whatever reason - uninstalled, new handset, hard reset, whatever), then Zune does not offer a "Reinstall" button. The only option is "Cancel". It tells me I already bought the app, so it won't charge me again, but it's not smart enough to offer to reinstall. I have to go to the WindowsPhone.com website for that.
- Since Windows Live Mesh can sync up to 5GB to the cloud (in addition to the SkyDrive storage), why can't Windows Phone see those files? I don't want to create files on SkyDrive, but I don't mind synchronising from my PC to the cloud with Live Mesh since it's automatic and transparent.
Such small things that would make a huge difference.
What would you like to see implemented on Windows Phone?
Every few months I remember to post the Geekzone browser stats... This time (the 30 days through 23rd March 2012) I noticed it was the first time Internet Explorer has dropped to third place (worldwide visits), although it's still barely holding on to second place when it comes to New Zealand-only visits.
Here is the worldwide chart (more than 500,000 visits), showing Google Chrome just ahead of Firefox and Internet Explorer in third place. The difference between those three browsers is so small, I would say they are in a tie:
And here is the New Zealand only chart (more than 250,000 visits), showing Google Chrome in first, followed by Internet Explorer and Firefox. Again the numbers are so close we could call this a tie.
A friend on Twitter (@slyall) asked about after-hours and business-hours stats. Many suspect the high number of Internet Explorer users is concentrated in businesses that are slow to update their systems, either due to budgetary constraints or browser requirements when accessing old systems. So I created two segments: New Zealand Only Business Hours (more than 150,000 visits, 8am through 6pm) and New Zealand Only After Hours (more than 70,000 visits, 6pm through midnight). Here are the two charts:
Now we can clearly see a significant drop in Internet Explorer usage, from 32.12% down to 22.51%. Chrome increases a little, Firefox barely moves, and Safari is the one that gains the most.
Just for curiosity's sake, here is the chart showing the Internet Explorer version distribution, New Zealand only:
Obviously the Geekzone audience tends to be more tech savvy, but from my experience it is a good mix, representing all sorts of people - technical and non-technical. Some wouldn't know how to update their browsers; others work in IT and do it all. I bet if other New Zealand publishers publicised their numbers (Trade Me for example), we would see higher Internet Explorer and Safari (on Mac OS) usage than Chrome and Firefox, thanks to their more mainstream presence.
What do you think?
A few years ago Microsoft bought a sync technology and "enhanced" it. Windows Live Mesh turned into Windows Live Sync and then Windows Live Mesh again. Each time that technology changed names I tried it.
I tried hard to have a successful relationship with Windows Live Mesh. It's imperative that something like this "just works" (TM). I have thousands of files in My Documents, in a development repository, and in my music and video folders that I need to keep in sync between my desktop and laptop. The promised sync would allow me to grab the laptop and go, free of my desk.
With every single iteration Microsoft failed to deliver. The latest failure came today, when I saw a blog post about a Windows Live update: "This minor release includes numerous updates across all of the programs in the Essential package, including Windows Live Messenger, Mail, Photo Gallery, Movie Maker, Family Safety, Writer, and Mesh."
It's a minor update. It shouldn't break anything, right?
It did break. EVERYTHING.
The My Documents folder is no longer syncing - neither to the cloud nor to my laptop. The DevRepository folder is receiving files from the cloud, even though just before the update everything was in sync. Together both folders hold about 4.5GB of data. And Windows Live Mesh is downloading all of this, even though nothing has changed (well, nothing in my data).
The My Pictures, My Videos and PrintScreen folders all sync to my laptop only (no cloud involved). As you can see, Windows Live Mesh thinks BOTH sides are waiting to receive files - effectively a deadlock, since neither side should have been flagged to update anything and each side is waiting for the other.
It also turned off the option to sync Microsoft Office styles, templates, custom dictionary and email signatures I have in place.
Do the engineers behind this product even bother testing what they release? This is the most appalling piece of recklessly released software I've ever seen.
This week I got confirmation of another sponsor for our Geekzone Freeview Pizza 2012: HP is coming to the party, with three HP mini laptops to give away, one for each of the events. Not only did I get the confirmation, I have already received the laptops:
We have been running these pizza evenings for the last six years, always with great sponsors. These are not geeky evenings, but a time when some of our users can meet each other and put faces to names and avatars. As in previous years, we know of some Geekzone users flying to attend all three events (Auckland, Wellington, Christchurch) - I will be hosting all three, so I'm looking forward to seeing everyone around.
We are still open to other prize sponsors, as usual!
More and more we see blog posts and Twitter links to "infographics", those snippets of information put together by a hipster graphic designer. And that was ok and fun, until the infographics started being hijacked by spammers.
You just have to gain a small audience to soon be contacted by someone offering you "this infographic that will be interesting to your readers", in exchange for a link.
Even large blogs fall for it, as seen on Fast Company, which posted an infographic from a web site called "Online Graduate Programs" (intentionally not linked). Their blog post starts by pointing out something we already knew from way back, when a sponsored post appeared on RWW in February 2011 with the title "The cost of slow sites: visitors, Revenue and Google Rankings".
A quick look around "Online Graduate Programs" doesn't inspire much confidence - even the topic chosen for the infographic is a bit distant from the site's own subject.
Basically, spammers found out that "new media" loves page views. Creating infographics with rehashed information collected from different sources, regardless of how current it is, and spamming blog owners to get them published is a cheap way of getting an audience that will click through to their sites.
Yes, I know. It's hard to get this concept into some people's heads. But yes, it is spam.
There is no doubt that mobile data roaming charges are too high, as evidenced, once again, by someone who failed to turn off mobile data on his phone while travelling overseas, then ran to the paper and got the bill wiped, generating a long discussion on Geekzone and some interesting comments on Twitter.
Mobile data roaming charges are too damn high. Taking Telecom New Zealand, the telco involved in this case, as an example: you can buy 2GB of mobile data on a prepaid plan for NZ$50 for use in New Zealand. This works out to about NZ$0.02 per megabyte of mobile data at home.
Yet when you travel overseas they charge you NZ$4.00 per megabyte while in Australia or the USA. This is 163 times more expensive than the local rate. Imagine you are actually going to Germany, where mobile data roaming will cost you NZ$30 per megabyte, or 1228 times more expensive than the local rate - a stunning NZ$30,720 per gigabyte. Remember that the same gigabyte here in New Zealand would cost you "only" NZ$25 on prepaid.
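The multiples above are simple arithmetic on the quoted prices:

```python
# Reproducing the roaming arithmetic (prices as quoted in 2012).
local_per_mb = 50.0 / 2048    # NZ$50 prepaid for 2 GB at home
australia_per_mb = 4.00       # roaming in Australia or the USA
germany_per_mb = 30.00        # roaming in Germany

print(int(australia_per_mb / local_per_mb))  # 163 times the local rate
print(int(germany_per_mb / local_per_mb))    # 1228 times the local rate
print(germany_per_mb * 1024)                 # NZ$30720.0 per gigabyte
```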
How can this be justified?
I have contacted Telecom New Zealand for a breakdown of mobile data roaming costs. This is their reply:
For all of our international roaming rates, we negotiate rates with the international carriers we have agreements with. The roaming charges that are passed through to our customers are largely determined by the rates that we are charged by these carriers.
There are also other costs associated with enabling international roaming for our customers. For example, with each international carrier, we need to set up appropriate billing systems. We must also establish a signalling arrangement between Telecom and each international carrier that we have an agreement with.
To keep our international roaming charging simple for our customers to understand, we have five roaming zones set up, based on the frequency that our customers travel to each roaming destination.
The international roaming charges negotiated with international carriers are passed through to customers, but it is not clear how much impact these have on the final pricing. Telecom is planning some actions on this front, though nothing related to pricing:
The Smartcaps product will provide updates on mobile data usage by sending customers an SMS at up to five chosen dollar-amount thresholds, or unlimited data value can also be requested. Once the chosen threshold is met, roaming usage is stopped until the customer accepts further usage.
In addition, smartphone customers will soon be able to download a new mobile app called XT Telecom Roaming which allows customers to check roaming rates, country codes, and troubleshoot any roaming queries while not impacting their data usage. This app will be available to download for free from the app store on customers' Android or iPhone devices before departing New Zealand for their international destination. The app doesn't require data usage to run, so using it overseas will incur no roaming cost to the customer.
Currently Telecom New Zealand sends an SMS to customers when they first connect to a roaming partner, explaining how much voice, SMS and data cost. If you know the cost per megabyte, and you know you will have to pay for any and all usage overseas, it is fairly easy to put two and two together and decide either not to use mobile data roaming or to seek an alternative, such as a local prepay SIM card.
You should not think this affects Telecom New Zealand customers only, as our other mobile network operators (Vodafone New Zealand and 2degrees) aren't far behind in terms of prices.
It doesn't look like the New Zealand government is blind to this. Survey results published in June 2011 say this:
The Minister for Communications and Information Technology, Steven Joyce, says four out of five New Zealand businesses surveyed say the cost of data roaming is prohibitive to their staff doing business in Australia.
The Minister has today released the results of a survey from the Ministry of Economic Development which asked New Zealanders and New Zealand businesses how they stay in touch when travelling across the Tasman.
The survey of 534 New Zealanders travelling to Australia was carried out between July 2010 and January 2011 and informed the decision of the New Zealand and Australian governments to conduct a joint investigation into whether regulatory intervention is required in the trans-Tasman roaming market.
The limits placed on staff wanting to use their smartphones, tablets and laptops to access the Internet was one of the significant findings.
- most New Zealand individuals and business travellers take a mobile phone with them when they travel to Australia, and most New Zealand business travellers take a laptop with them when they travel to Australia;
- both individuals and businesses attempt to limit use of mobile roaming services in Australia, with the main reason being concern about the cost;
- the majority of businesses believe that roaming services contribute to their staff's ability to work effectively while in Australia.
Unfortunately the link from that press release to the full survey results is dead.
Because mobile data roaming involves companies in two different countries, we can't even ask a regulator to take a stand: one of the telcos will be outside the regulator's jurisdiction, and the other can just say "that's what I am charged, I am just passing on the costs". Anything to change this would require international agreements.
And on that note, the trans-Tasman investigation into mobile roaming charges should have released a draft decision by the end of 2011, but it's now delayed to mid-2012, with a final report expected no earlier than the end of 2012.
Currently EU lawmakers are pushing to have mobile data roaming prices regulated across member countries. Their proposal is to cap mobile data roaming at 0.20 euro per megabyte (still about 205 euros per gigabyte) but, you guessed it, with no impact on any country outside the EU.
Sometimes it isn't easy for customers to ditch their mobile number and go with a local replacement in each country they visit. Business people depend on being contactable, and families rely on communications. A temporary change of number is ok, but if one is travelling across multiple countries it becomes an inconvenience.
What do you think should be done? Regulation, transparency in costs, customers walking away from roaming?