My window to the world


Legal video, movie and TV downloads and streaming options in New Zealand

By Mauricio Freitas, posted: 30-Jul-2012 12:39

Last updated 24 AUG 2014

In another post I started a list of legal music downloads in New Zealand, and thanks to the many suggestions from folks in the comments and on Twitter I decided to compile a similar list for movies and TV episodes.

I don't see as many options for downloading or streaming movies and TV series as we can find for music, but here are the ones available in New Zealand now:

Like I said in the previous post, just because you pay to download something doesn't make it legal - it could be that someone is taking your money without having the legal rights to distribute the movie. The list above includes only legal video, movie and TV services available to New Zealand users.

Also, movie and TV services in New Zealand have long delays when releasing new content, unlike the music services. This is probably one of the reasons why so many people buy music but pirate videos. We just wish the industry would catch up with the times.

If you know of any other options, post in the comments...



Legal music downloads and streaming options in New Zealand

By Mauricio Freitas, posted: 30-Jul-2012 08:46

Once again I see someone starting a discussion on Geekzone asking if downloading music from some file sharing site is legal under the Copyright (Infringing File Sharing) Amendment Act 2011...

Their reasoning is that the technology currently used to track file downloads is heavily focused on peer-to-peer (P2P) distribution, so in their eyes, if it's not P2P then it must be legal, right?

Wrong. Copyright infringement is still illegal, regardless of which technology is used. Paying a Russian site for music does not make it legal - you are probably giving your money (and credit card details) to someone who doesn't hold the rights to distribute the content.

We keep asking for new forms of digital content distribution, and while movies and TV series are still behind the times, there are some good music options in New Zealand, including downloads to own, streaming, free, paid, reward points, or a combination.

Take your pick from this list of music services available in New Zealand, and stay legal:
If you know of any other options, post in the comments...

UPDATE: I have posted a list of legal movie and TV downloads and streaming in New Zealand as well.



How advertising delivery can be bad for your web site, your readers and advertisers

By Mauricio Freitas, posted: 19-Jul-2012 20:11

Another advertising order for Geekzone, another reason to be happy. But I'm actually sad - sad for my readers and advertisers.

You probably know by now I try to get maximum performance out of the servers we use. I also work hard, using different software, services and techniques to get the site as fast as possible.

Many people use ad blockers for different reasons. Some find that ads slow down their PCs, others say ads can be a vector for malware. Some say ads slow down web page load times.

To solve this last problem we use different approaches: we use Google DFP for ad delivery (and gain speed thanks to its worldwide network of caches), load the ad tags asynchronously with JavaScript, and enable single-request mode for ad delivery.

Assuming we are hosting the creatives (ads) with Google DFP, a single call is all that's needed to get the image and the parameters to show it on the page.
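
For anyone curious what that looks like in practice, here is a minimal sketch of asynchronous loading with single-request mode using the Google Publisher Tag. The ad unit path and element id are made up for illustration; this is not our actual configuration.

```typescript
// Sketch of async GPT loading with single-request mode enabled.
// gpt.js itself is loaded with an async <script> tag in the page <head>;
// the ad unit path and div id below are hypothetical.
const w = window as any;
w.googletag = w.googletag || { cmd: [] };

w.googletag.cmd.push(() => {
  const googletag = w.googletag;

  // Define an ad slot and attach it to the publisher ads service.
  googletag
    .defineSlot('/1234567/example_leaderboard', [728, 90], 'div-gpt-ad-leaderboard')
    .addService(googletag.pubads());

  // Single Request Architecture: one HTTP call fetches creatives for every
  // defined slot, instead of a separate request per slot.
  googletag.pubads().enableSingleRequest();
  googletag.enableServices();
});

// Where the ad container sits in the page, render the slot:
w.googletag.cmd.push(() => w.googletag.display('div-gpt-ad-leaderboard'));
```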

If the advertiser is using DoubleClick (a Google product that agencies use to manage campaign workflows), Google is smart enough to serve the ads exactly as it would with hosted creatives - that is, in a single call.

Between advertisers and publishers there's almost always an agency that represents the publisher, trying to sell the available inventory. These agencies get paid a commission on each sale they manage to complete. They also like to know how many impressions and clicks campaigns are getting. As a publisher using Google DFP I can easily give agencies access to real-time reports for their campaigns. But I haven't seen a single agency take advantage of this feature.

Instead, these agencies load the tags supplied by the advertisers into their own systems. In turn they give the publishers their own tags. And we obviously need to load our own scripts to manage the delivery.

So instead of having one script that loads an ad with a single call (the Google DFP and Google DoubleClick integration), we have a script that loads the agency script, which in turn loads the advertiser script, which in turn loads the ads.

This adds incredible latency to the whole ad delivery chain. These agencies usually don't have servers close to end users. They don't use CDNs. Things get slow. And when things get slow, users navigate away. And when users navigate away, they don't see the ads.

For all intents and purposes Google DFP delivered the code and counted one impression. But by the time the browser has loaded the second script and is waiting for the third, the user might have closed the window or clicked a link to go elsewhere. So the agency doesn't count the impression. Then they complain there's a difference between my counter and theirs.

Another important thing: Google DFP is smart enough to deliver more impressions of the ads that perform better. In other words, if the advertiser supplies more than one ad, Google DFP will make sure it shows more of the creatives that are getting the higher number of clicks. If we run an agency tag we lose that control and can't count the clicks, so all ads are delivered evenly. This means the optimisation that could benefit the advertiser and attract more clicks is lost.
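
To illustrate the idea (this is a simplified sketch, not DFP's actual algorithm), optimised rotation just means creatives with a higher click-through rate get a proportionally larger share of impressions:

```typescript
// Simplified illustration of click-through-rate-weighted creative rotation.
// Not DFP's actual algorithm - just the general idea described above.
interface Creative {
  id: string;
  impressions: number;
  clicks: number;
}

function pickCreative(creatives: Creative[]): Creative {
  // CTR with a small prior so brand-new creatives still receive some traffic.
  const weights = creatives.map(c => (c.clicks + 1) / (c.impressions + 100));
  const total = weights.reduce((sum, weight) => sum + weight, 0);

  // Weighted random pick: higher CTR, more impressions.
  let r = Math.random() * total;
  for (let i = 0; i < creatives.length; i++) {
    r -= weights[i];
    if (r <= 0) return creatives[i];
  }
  return creatives[creatives.length - 1];
}
```

When the agency tag hides the clicks from the ad server, every creative ends up with the same weight and this feedback loop disappears.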

In the end advertisers lose the opportunity to get more clicks, our readers see pages slow down, and agencies act as middlemen trying to do more than they should - getting technical where they don't have the capability and not actually adding any value.

This is not a rant at one specific agency. Most agencies work like this. They just don't understand that a fast web means more business for everyone.



Broadband in New Zealand according to OECD: a six month update

By Mauricio Freitas, posted: 19-Jul-2012 10:30

About six months ago I posted some New Zealand broadband statistics collected from the OECD Broadband Portal. These numbers have now been updated by the OECD, and here are the latest figures after six months:

1. Fixed broadband subscriptions per 100 inhabitants

December 2011: #17 (26%, 1,138,830 connections)
June 2012: #17 (26.9%, 1,174,790 connections)

Although there's no change in position, there was a small increase in connections - about 36,000 more, or roughly 3% growth.

2. Wireless broadband subscriptions per 100 inhabitants

December 2011: #14 (54%, 2,380,709 connections)
June 2012: #9 (67.5%, 2,946,260 connections)

What an incredible jump for wireless broadband - more than 565,000 new connections and a rise of five places. This includes 53.7% mobile broadband as part of a mobile plan and 12.9% dedicated mobile data subscriptions. The remaining connections are satellite and terrestrial fixed wireless.



Windows Phone updates: no better than the old Windows Mobile updates

By Mauricio Freitas, posted: 17-Jul-2012 14:53

This was supposed to be a long blog post. I have deleted everything and will leave just this: the Windows Phone 7 update experience is terrible. I wish they would at least once actually copy something from another company.



Could Amazon beat Apple and create the largest MVNO in the world?

By Mauricio Freitas, posted: 12-Jul-2012 08:48

By now you have probably read reports around the web about Amazon's efforts to create its own smartphone. Articles on Bloomberg and The Wall Street Journal say it's being tested.

I'm pretty sure it is. I can't go into details now without risking someone's job, but from the bits and pieces I've collected I'd say this has been going on for some time now.

If Amazon continues with its Amazon Kindle Fire ecosystem (and why not?) then it will be an Android-based device. But what I haven't seen yet is any comment on how Amazon's work with telcos from around the world could make it the largest Mobile Virtual Network Operator (MVNO) around.

Generally speaking, an MVNO works with a mobile operator to get bulk access to network services and resells these under its own branding and prices.

Amazon, through its Kindle 3G strategy, already has relationships established with a mobile operator in almost every country in the world.

This is how it works: when you buy a Kindle 3G you receive an eBook reader (or tablet as is the case with the Kindle Fire) with mobile data access enabled and ready to use. You can turn the Kindle 3G on, select a book online from the Amazon store using its browser and have it delivered in seconds, without ever coming close to a computer. This also includes subscriptions - newspapers, magazines and blogs.

At no point do you have to sign a contract with a mobile operator. You don't even need to know which mobile operator your Kindle is connecting to, or how much it charges for the content download (mobile data charges are already included in the book price). You don't have to insert a SIM card into your Kindle 3G (there isn't even visible access to it). It's ready to use out of the box. All you know is that Amazon is sending the book to you "over the air", automatically charged to your Amazon account.

Currently this is for electronic content downloads only, but there isn't anything preventing Amazon from offering voice services in addition to the mobile data it already uses. It would make even more sense if these were VoIP over the data network.

Rumours have been around for years that Apple would like to sell a more complete iPhone package - one that it controls without interference from mobile operators. This could even be the force behind Apple's insistence on creating a new nano-SIM card standard.

We don't know if this is the model Amazon will bring to its smartphone. But it would make sense. And it looks like Amazon could beat Apple to the market.



Alan Turing

By Mauricio Freitas, posted: 24-Jun-2012 18:38

Today would be Alan Turing's 100th birthday.

Turing was one of the leaders at Bletchley Park, working as a codebreaker during the war on the cryptanalysis of Germany's Enigma machine. Probably the man who helped England the most when all seemed lost during WWII, he ended up persecuted for his sexual orientation after the war.

His conviction for indecency cost him his security clearance and his job with the Government Communications Headquarters. He was forced to undergo chemical castration. And he committed suicide (although some believe it was an accident).

And to think the top German scientists got free passes to America thanks to Operation Paperclip, while England did nothing but hound Turing.

Turing gave us the modern computer, thanks to his design of a stored-program computer.

When I was first introduced to computers, back in the early 80s (and I was late to this!), our teacher made a point of showing the class the work of both Turing and von Neumann.

But most of all, I still believe he is responsible for saving Great Britain's hide during the war. Even with an official apology from PM Gordon Brown (10 September 2009), it's still a shame that a conviction for indecency remains against his name.



Microsoft Surface: coming soon, any time now?

By Mauricio Freitas, posted: 19-Jun-2012 12:14

Microsoft has announced its own tablet, running Windows 8 on both ARM and Intel Core processors. The specs are not bad either, with a 16:9 aspect ratio, 32GB to 128GB of storage and USB support.

It looks good, and the built-in keyboard and kickstand together will no doubt make it a worthy contender - and certainly a good replacement for many laptops.

Except when you read the press release and it says "coming soon" and "[Surface] will be sold in the Microsoft Store locations in the U.S. and available through select online Microsoft Stores."

Microsoft wasted the opportunity to take the world by surprise with something like "available in the US at the end of June, in another 20 countries by August and the rest of the world by September".

Apple still knows how to do it.



Scale your database without effort? ScaleArc iDB

By Mauricio Freitas, posted: 21-May-2012 16:51

Last weekend a press release landed in my inbox, and I thought it interesting enough to contact the agency and get more information about the product. In summary, ScaleArc iDB promises to scale your database without changes to your code or to the database itself:

ScaleArc, the pioneer in a new category of database infrastructure that accelerates application development by simplifying the way database environments are deployed and managed, today announced general availability of iDB v2.0 for Microsoft SQL Server that brings significant new capabilities to SQL Server environments such as instant horizontal scaling, higher availability, faster database performance, increased SQL protection and real-time query analytics. iDB takes a fundamentally different approach at the SQL protocol layer by providing customers with a wide spectrum of capabilities for their database environment in a single solution, without requiring any modifications to existing applications or SQL Server databases.

Until now, moving to advanced architectures like multi-master, or achieving instant scale and better performance within SQL server environments, has been costly and extremely difficult to implement. iDB v2.0 for MS SQL supports a wide range of functions including Read/Write splitting, dynamic load balancing and horizontal scaling, query caching for up to 24x faster query responses, wire-speed SQL filtering and real-time instrumentation and analytics to enhance all deployment modes of SQL server, including SQL Server Clustering, SQL mirroring, Peer-to-Peer (P2P) Replication and log shipping.

iDB for MS SQL Feature Highlights

  • Dynamic Query Load Balancing for High Availability: ScaleArc iDB implements a specialized dynamic load-balancing algorithm that allows the most efficient utilization of available database capacity, even when servers have varying capacity. iDB monitors query responses in real-time and can load balance queries to the server that will provide the fastest response to properly distribute the load. Up to 40% better performance has been observed with iDB's dynamic load balancing relative to TCP-based load balancing.

  • Pattern-Based Query Caching for Increased Performance: ScaleArc iDB allows users to cache query responses with one-click. No changes are required at the database server or in the application code; the query is cached at the SQL protocol level, providing up to 24x acceleration without any modifications.

  • Multi-master: iDB supports multi-master and master-slave scenarios to ensure high availability and scalability. Specific queries, irrespective of their origin, are routed to the right server with the advance query routing engine that also simplifies sharding.

  • Real-time Analytics: Advanced graphical analysis tools provided by ScaleArc iDB bring comprehensive real-time awareness of all queries, helping to quickly pinpoint query patterns that are not performing optimally and allowing more precise management.

  • Wire Speed SQL filtering: iDB is able to enforce query-level policies for security or compliance reasons to protect against attacks, theft and other threats. iDB can operate outside of the application where policies have not traditionally been easily enforced.

  • SQL Query Surge Queue: Extreme loads can lead to unacceptable response times or even halting of operations until the load reduces, leading to "Database not Available" errors. ScaleArc iDB allows a more graceful response to peak loads. When faced with an extreme load, ScaleArc iDB can initiate a SQL Query Surge Queue and momentarily queue queries in a FIFO queue and process them once server resources become available.

Obviously I was a bit worried about their claims, so I asked a couple of questions. Here are the answers:

What happens to cached query results when the result changes? For example a record is updated - will the next query use previous results, or get new results?

The key to iDB lies in our Analytics.  We provide granular real-time data on all SQL queries flowing between application servers and the database servers.  As such, customers now have the intelligence they need to understand the query structure, the frequency it hits the database, the amount of server resources it takes, etc.  We then give the customer the power to cache on a per query basis, but we do not set a Time-To-Live for the customer.  They need to understand how often the query will be updated, and ensure they do not set a Time-To-Live that may serve stale data if an Update comes in from the Application.  We allow customers to set TTL anywhere from 1 second to multiple years.  When a cache rule for a query is activated with a single click of a button, we immediately measure the performance and offload impact of the cache.  And since our cache on iDB is a hash map that caches the TCP output of Read queries, subsequent Read queries served from our cache are served up to 24x faster (or more).

ScaleArc also has an API that can be invoked from the application to add, invalidate and bypass the cache for specific SQL statements.

How much more memory does it require? Or does it use the SQL DB footprint?

ScaleArc iDB is a Network appliance like deployment and does not have any agents on the Server or the Application. This would mean that iDB has its own physical/virtual machine to perform its operations.  iDB can run load balancing within 4GB of memory, however for caching and logging purposes iDB can address up to 128GB of memory.

iDB is a separate instance from the database.  Most customers run our software on a dedicated x86 server to make it a dedicated appliance.  We also sell appliances, or iDB can be installed on a hypervisor as a Virtual Machine.  iDB does not require a lot of memory to operate, but we can allocate up to 128GB of RAM for caching of READ queries.  Query logs are stored on drives on the appliance.
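
To make the caching idea more concrete, here is a hypothetical sketch of TTL-based read-query caching at a proxy layer. None of these names come from ScaleArc's product or API (iDB caches the raw TCP output of read queries); this is just the general mechanism their answer describes:

```typescript
// Hypothetical sketch of a TTL-based read-query cache keyed on the
// normalised query text. Names and structure are illustrative only,
// not ScaleArc's implementation.
interface CacheEntry {
  result: unknown;    // cached query response
  expiresAt: number;  // epoch milliseconds
}

class QueryCache {
  private entries = new Map<string, CacheEntry>();

  constructor(private ttlMs: number) {}

  // Normalise whitespace and case so the same query pattern maps to one entry.
  private key(sql: string): string {
    return sql.trim().replace(/\s+/g, ' ').toLowerCase();
  }

  get(sql: string): unknown | undefined {
    const k = this.key(sql);
    const entry = this.entries.get(k);
    if (!entry) return undefined;
    if (Date.now() > entry.expiresAt) {
      this.entries.delete(k);  // stale entry: evict and report a miss
      return undefined;
    }
    return entry.result;
  }

  set(sql: string, result: unknown): void {
    this.entries.set(this.key(sql), { result, expiresAt: Date.now() + this.ttlMs });
  }

  // The kind of call an application-facing API might expose, e.g. after an UPDATE.
  invalidate(sql: string): void {
    this.entries.delete(this.key(sql));
  }
}

// Usage: serve SELECTs from the cache when possible, fall back to the database.
async function cachedRead(
  cache: QueryCache,
  sql: string,
  runQuery: (q: string) => Promise<unknown>
): Promise<unknown> {
  const hit = cache.get(sql);
  if (hit !== undefined) return hit;
  const result = await runQuery(sql);
  cache.set(sql, result);
  return result;
}
```

The trade-off their answer highlights is exactly the TTL: set it too long and a read served from the cache can miss an update that has already hit the database.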

Very interesting - an appliance for SQL TCP output caching. OK, I have signed up for a 30-day trial to see how much difference it can actually make.

UPDATE: Someone on Facebook said this was advertising. IT IS NOT. I was not asked to post about it, and did not receive any payment to post about it. If you are so inclined please read my FULL DISCLOSURE post.



Riverbed Performance Summit in Sydney

By Mauricio Freitas, posted: 14-May-2012 12:35

I just got an email from Riverbed announcing their Riverbed Performance Summit in Sydney. This half-day event is happening on Tuesday 5 June from 12:30pm to 5:30pm. Click here for the agenda (PDF).

Learn how the Riverbed performance platform can help you up your IT game

With the growth of virtualization, consolidation, and cloud computing have come new challenges. IT is increasingly consolidated and virtualized while workers and consumers are distributed. How best to harness these approaches and deliver the efficiency and control your organization requires, while ensuring that end users get the performance they need?

Attend the Riverbed Performance Summit to find out how Riverbed empowers enterprises like yours with the tools to analyze, accelerate, and control your IT. Stay on top of the latest technology and solutions from Riverbed and join us for a deep dive into our vision for delivering performance for the globally connected enterprise.

Sign up to connect with Riverbed technology experts and your peers to learn how you can get more out of your Riverbed investment.

At this exclusive event you'll hear firsthand from our experts on how to maximize your Riverbed investment with the latest release of cutting-edge performance platform products and solutions:

  • Granite, our revolutionary new product for consolidating edge servers in the data center
  • Getting the most out of the latest release of RiOS (7.0), including optimization for video, UDP, IPv6, and VDI
  • Steelhead Cloud Accelerator, a new powerful solution for boosting the performance of SaaS applications
  • The latest product updates, technical overviews, demos, and more

Register now and discover how to make the Riverbed performance platform work for you. Find out how you can finally consolidate your entire infrastructure, including edge applications, servers and storage to the data center, all without compromising performance.

A shame I won't be attending this event, since it falls in the same week I will be in Las Vegas for HP Discover 2012.



freitasm's profile

Mauricio Freitas
Wellington
New Zealand


I live in New Zealand and my interests include mobile devices, good books, movies and food of course! 

I work for Intergen and I'm also the Geekzone admin. On Geekzone we publish news, reviews and articles on technology topics. The site also has some busy forums.

Subscribe now to my blog RSS feed or the Geekzone RSS feed.

If you want to contact me, please use this page or email me freitasm@geekzone.co.nz. Note this email is not for technical support. I don't give technical support. You can use our Geekzone Forums for community discussions on technical issues.

Here is my full disclosure post.

If you'd like to help me keep Geekzone going, please use this Geekzone Amazon affiliate link when placing any orders on Amazon.


