First I will be a bit critical: I think everyone at IBM New Zealand, from General Manager Katrina Troughton down, was the target of a viral distribution of Shift Happens (author, video, wiki), because the buzzwords and questions you see throughout the video were repeated many times, by different speakers, during the whole day.
The keynote The New Working Frontier - How Our Children Are Beating Us to It, by Cliff Dennett, was worth attending. He drew a parallel between gaming and collaboration, attention management, and planning. Food for thought and very well put, although I wonder how many of the middle managers attending would feel inclined to go back to their desks and suggest something as revolutionary as using computer-based games to train and retain staff and clients. But if you attended the session, please give it some thought. It all makes sense, I assure you!
I then spent some time visiting the exhibitions around the show floor, and noticed the main topics were virtualisation and unified communications - a theme already present at the IBM Forum 06.
I had contrasting experiences when poking around the virtualisation area. While talking to IBM partner VMware I asked if I would notice any difference in performance between VMware Server and Microsoft Virtual Server. The technical person was clear: you might not notice any difference in performance between those two tools. And this gave me an incentive to actually try their virtualisation software later to see for myself.
He then pointed out that VMware ESX Server would be a completely different thing: you see, VMware ESX Server is installed on bare metal, and requires no host OS to run - unlike VMware Server (Windows, Linux) or Microsoft Virtual Server (Windows). So there's one less layer to worry about, which can actually speed things up. But wait for Microsoft's hypervisor technology to show up soon...
Now the "interesting": I walked to the Integral stand (flash-based site) to find out more about their vision of Utility Computing. It just happens that their computing on demand solution is the use of virtual environments on blade servers, providing customers with additional power if the need arises. Well, if I understood correctly what I was told, it's like "we can run as many virtual servers as you need in our hosted solution, and if your requirements grow we can order more blade servers". Which to me doesn't mean "instantaneous elastic and resilient computing" - ordering more blade servers takes time.
Note that this approach is a bit different to what some mainframe companies are doing, where extra computing power is delivered with every system and always available on demand, with clients paying for a set level of performance, and any excess automatically metered and then charged to the client's account. Note that IBM themselves also provide some "metered computing".
But what really surprised me was when I asked the consultant to tell me how their solution compared with Amazon EC2 (Amazon Elastic Compute Cloud) and Amazon S3 (Amazon Simple Storage Service) and he didn't know about these offerings - interesting because I dare say Amazon EC2 is the cream of the crop in terms of utility computing, while Amazon S3 is the top of storage virtualisation.
But, yes, I am planning to be at the IBM Forum 08 when it comes back to town.
You use a 3G wireless wide area network (WWAN) data card on a Windows Vista-based computer. The WWAN data card uses a connection that only receives data. However, you may find that the throughput is much less than the throughput of the same 3G WWAN data card when you use it on a Microsoft Windows XP-based computer.
This problem occurs because of the way that TCP receive window auto tuning is used in Windows Vista for connections that only receive data.
Windows Vista obtains a round-trip time (RTT) estimate at the time of connection setup and every time a new segment of data is transmitted. A connection that only receives data is limited to the single RTT sample that is obtained at the time of connection setup. Because the connection only receives data, it cannot "converge" to the actual RTT of the connection. (Because RTT data may fluctuate for various reasons, a "converge" operation estimates a meaningful RTT by blending current and previous RTT samples.) If the connection cannot converge to the actual RTT, the bandwidth-delay product (BDP) estimate that Vista obtains is also incorrect. Therefore, the connection's receive window is limited, and the throughput over the network is reduced.
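To get a feel for why a stale RTT sample matters, here is a rough back-of-the-envelope sketch. The link speed and RTT figures below are made-up illustrations, not numbers from the KB article:

```python
# Illustrative bandwidth-delay product (BDP) calculation.
# All figures are made up for the example, not measured values.

def bdp_bytes(bandwidth_bps: float, rtt_seconds: float) -> float:
    """BDP = bandwidth * round-trip time, expressed in bytes."""
    return bandwidth_bps / 8 * rtt_seconds

# Suppose the single RTT sample taken at connection setup was 50 ms,
# but the real RTT on the 3G link is 300 ms.
link_bps = 1_000_000  # a nominal 1 Mbps 3G downlink

assumed = bdp_bytes(link_bps, 0.050)  # window sized from the stale sample
actual = bdp_bytes(link_bps, 0.300)   # window the link really needs

print(f"window from setup RTT: {assumed:.0f} bytes")   # 6250 bytes
print(f"window actually needed: {actual:.0f} bytes")   # 37500 bytes
# The receive window ends up ~6x too small, capping throughput at
# roughly window / RTT instead of the full link rate.
```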
I believe most users will actually purchase a turn-key solution: a box ready to go from partners such as HP (pictured).
But then you read some "expert" blogger comments such as this, and you have to really start thinking where this comes from:
On August 27, 2007, Windows Home Server is supposed to be released, and I'm in no hurry at all to buy it. I think there is no purpose for it! Who is really going to spend money on buying a new computer and operating system, just so they can share their files with other people? Home Server is said to cost around $500!
Wow! $500 is cheaper than some NAS devices. And it's not only for sharing files with other people. It's for home LAN management and maintenance, automatic backup with mirroring, easy restore tools, media distribution and more.
But if you keep reading it's clear he doesn't know it:
Before you go bashing me, just remember – I have not tried Home Server, and not many people have!
What I just said could be completely wrong! Home Server could be a huge breakthrough in technology, for all I know! I just think it's too expensive and too limited to Windows XP and Vista.
So you have not tried it and issue an expert opinion anyway?
Why am I so excited? Because Windows Home Server is really different from everything else we've seen in the Windows world before - it's easy to use, it's server-grade software, it's an important element in an easy-to-use backup and recovery setup, it can automatically mirror your data across attached drives, and lots more.
Note that I didn't say "it's easy to install"... It really is, but I see it coming ready to use in bundles with special server hardware, tested and certified. Users who can install any Windows version can install Windows Home Server, but this will be for the more "geek" types. Consumers in general will be happy with the option to get a turn-key solution - which has already been announced by HP and others.
I currently have a test setup running as a virtual machine, with SlimServer providing streaming services to my Squeezebox and a couple of plugins installed (see my previous post).
Currently Windows Home Server is only available in New Zealand and Australia, but it should be available in the U.S. from next week sometime.
Telecom's long-awaited $1.4 billion "next generation network" has a potentially life-threatening flaw - it won't work in a power cut.
The alarming implication of not being able to make emergency 111 calls has rescue services concerned and the government wondering how to ensure phone lines stay open in emergencies.
The revelation comes just months after a revamp of the 111 system and a year after a storm in Canterbury left almost 20,000 homes without power, some for almost two weeks. As a result, Canterbury Civil Defence Emergency Management Group worked closely with Telecom to ensure telephone systems would continue working in disasters.
Telecom's new all-digital system is to be phased in over the next five years and will deliver increasingly sophisticated communication services to homes, such as video on demand. But unlike current phone lines it requires a mains-powered "gateway" device in each customer's home.
The article goes on and on with this line. But you can clearly see that the "revamp of the 111 [emergency] system" has nothing to do with the telco.
The story doesn't really explain why they came to this conclusion, except that the new system uses a "mains-powered 'gateway' device".
So let's explain what the paper failed to tell you: they are talking about Voice over IP (VoIP). When you use VoIP your voice calls go over the Internet connection, not over the landline.
Wait a minute, doesn't my Internet connection go over the landline anyway? In some cases yes, if you use a technology such as DSL, which uses the same copper wires that carry circuit switched calls (voice calls) to transmit packet switched (Internet) data.
Sometime soon you will be able to get DSL service without having to rent a landline from Telecom. When this happens you could have your voice calls provided by another company using the copper - or simply use a VoIP service such as Xnet VFX.
Here's the thing: VoIP requires a small box that translates the analogue signals from your telephone into the digital formats used on the Internet. And here lies the problem: if there's a power outage, this box won't work.
The solution is to buy a small UPS, a device that sits between this box and the mains; if the power goes down it will provide energy from its internal battery for a certain amount of time. I have two of these devices here at home - they power my entire home office (two desktops, two laptops, printer, router, cable modem, VoIP gateway). If the power goes down I have about 100 minutes of energy to keep us going. Or I can shut down my computers and have even more time for the Internet router and VoIP gateway.
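For the curious, the runtime arithmetic is straightforward. A rough sketch, where the battery capacity, load and efficiency figures are my illustrative guesses, not specs from my actual units:

```python
# Rough UPS runtime estimate. All figures below are illustrative
# guesses, not specifications from any real UPS.

def runtime_minutes(battery_wh: float, load_watts: float,
                    inverter_efficiency: float = 0.85) -> float:
    """Minutes of runtime: usable battery energy divided by the load."""
    return battery_wh * inverter_efficiency / load_watts * 60

full_office = runtime_minutes(battery_wh=400, load_watts=200)
net_only = runtime_minutes(battery_wh=400, load_watts=30)

print(f"full office load: ~{full_office:.0f} minutes")          # ~102
print(f"router + VoIP gateway only: ~{net_only:.0f} minutes")   # ~680
```

Shutting the computers down drops the load dramatically, which is why the router and VoIP gateway alone can run for hours on the same battery.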
If you use a standard landline for your voice calls you won't have this kind of problem. That's because the power for the telephone comes down the copper, direct from the exchange.
The article on Stuff failed to explain all this, instead trying to focus on all the horror that would happen if we lived in a society where everyone uses VoIP and no one has a mobile phone.
Which reminds me that you already have this problem today if you use a cordless phone. That's because if the power goes down your cordless phone is useless. Go on, try it without power and you will see.
Why didn't the Stuff folks write an article about the life-threatening flaw of cordless phones?
And it gets worse. One of our Geekzone users contacted the Microsoft activation phone line, and instead of providing the correct information - that the validation servers were having a problem - the customer (dis)service representative decided to call the user a pirate, and disconnected the call.
Way to go with customer services Microsoft...
UPDATE: Microsoft admits there were problems with its WGA (validation) server and it should be working again. No word on what went wrong.
We've been receiving reports on our forum and through customer service starting last night that Windows Vista validations have been failing on genuine systems. It looks now as though the issue has been resolved and validations are being processed successfully.
Customers who received an incorrect validation response can fix their system by revalidating on our site (http://www.microsoft.com/genuine). We encourage anyone who received a validation failure since Friday evening to do this now. After successfully revalidating, any affected system should be rebooted to ensure the genuine-only features are restored.
Now, I really, really think customer services people who fail the "services" part should get the boot. Or retraining under fire.
Bad service is the worst thing in any industry.
UPDATE: I have just received an e-mail from Microsoft New Zealand, copying the user in question. Microsoft apologised for this incident, and agrees this is not the way customer service should have handled the call - even more so because it was a worldwide meltdown in the company's own WGA servers.
UPDATE: This is an update on what happened during the meltdown.
According to the current Wikipedia entry:
Open XML is an XML-based file format specification for electronic documents such as memos, reports, books, spreadsheets, charts, presentations and word processing documents. The specification has been developed by Microsoft as a successor of its binary office file formats and was published by Ecma International as the Ecma 376 standard in December 2006.
Uninitiated users may confuse "Office Open XML" with "OpenDocument" (ISO 26300:2006) and "OpenOffice". Therefore it is commonly referred to as OOXML or its earlier name Open XML.
Office Open XML uses a number of dedicated XML markup languages in file parts that are placed in a file container (Open Packaging Conventions). The format specification, which is available for free at Ecma International, includes XML schemas that can be used to validate the XML syntax.
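To see what this container looks like in practice, here is a small Python sketch that builds a toy OPC-style package in memory and lists its parts. The XML bodies here are placeholder stubs, not valid schema content - a real .docx has the same overall shape but many more parts:

```python
# Office Open XML documents are ZIP containers following the Open
# Packaging Conventions (OPC): the document is split into XML "parts".
# This builds a toy container in memory and lists its parts.
import io
import zipfile

buf = io.BytesIO()
with zipfile.ZipFile(buf, "w") as z:
    # Every OPC package declares the content type of each part.
    z.writestr("[Content_Types].xml", "<Types/>")
    # The main word-processing part in a .docx lives at word/document.xml.
    z.writestr("word/document.xml", "<document>Hello</document>")

with zipfile.ZipFile(buf) as z:
    for name in z.namelist():
        print(name)
# prints:
# [Content_Types].xml
# word/document.xml
```

Renaming a real .docx to .zip and opening it shows the same structure, which is part of why the format can be validated with ordinary XML tools.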
So what's Standards New Zealand's involvement in all this? The group is assessing stakeholder views on the suitability of a document on Office Open XML for publication as an International Standard. This is from their press release:
...New Zealand is obliged to vote on the adoption of the European Computer Manufacturers’ Association Office Open XML document (ECMA 376) as an International Standard. This document is currently designated draft international Standard (DIS) 29500 and is available at http://www.jtc1sc34.org/. There is an existing international Standard for Open XML, referred to as Open Document Format (ODF), which was published last year (ISO/IEC 26300).
Standards New Zealand, as New Zealand’s national Standards body holds the responsibility to cast this vote on behalf of New Zealand.
‘The aim of the meeting is to assess and understand New Zealand stakeholder views to allow Standards New Zealand to make an informed vote on behalf of New Zealand. This meeting will be independently chaired by Ms Alison Holt, the New Zealand delegate to the international committee JTC1 SC7 Software Engineering’ said Grant Thomas, Chief Operating Officer, Standards New Zealand.
The way I read it, when Standards New Zealand sends out a press release saying "[There is] an existing international Standard for Open XML, referred to as Open Document Format (ODF), which was published last year (ISO/IEC 26300)", it makes it all sound like "we have already decided, there is one standard already, why bother with another one, and these two days are just a formality"...
I'd suggest you check Rod Drury's reasoning on why we should have another standard. A standard doesn't mean it must be unique. Microsoft's Sean McBreen wrote:
Do other standardised document formats not exist today?
Yes, in fact there are actually many different and at times overlapping formats that exist today; for instance, PDF/A, ODF and HTML are all ISO/IEC standard document types today.
Why do we need multiple standardised formats?
Multiple formats are required as requirements change and to cater for differing scenarios; for instance, PNG and JPEG are two ISO/IEC image standards in heavy use today. Individuals and organisations will also continue to innovate and standards must evolve to keep pace with this; for instance, MPEG-1, MPEG-2 and MPEG-4 are all ISO/IEC standards for video encoding.
What is the impact to the industry if Open XML is not accepted as an ISO/IEC standard?
Literally billions of documents today are stored and saved using Microsoft Office file formats; an important aspect of Open XML is backwards compatibility for these documents. Not standardising Open XML will have an impact on the longevity of these documents and force government departments, individual organisations and consumers to migrate all of their documents over time. It will also significantly reduce the choice available to our customers in relation to document formats.
What is the impact to Microsoft if Open XML is not accepted as a standard?
While standards themselves don’t dictate customer and partner behaviour or purchasing patterns they do have a strong influence on this over time. As a result there is likely to be a direct impact on the adoption of Microsoft products if Open XML is not accepted as a standard, which will reduce our ability to compete in the marketplace.
If a standard is mandated and does not support all the functionality and formatting of a document (say a Microsoft Word 97 document), all the unsupported formatting will be lost in conversion. This raises questions about the validity of the document as a historic record as it has not been maintained in the original formatting.
Secondly it means that someone has to go through and fix the documents - and when you consider the number of potential documents affected, this would be an expensive exercise.
Who's going to tell these organizations that they have to do all this work to move their documents to ODF and fix all the formatting issues and manage the compliance issues?
Simply saying that the standard should translate the old "rendering quirks" into the new and less buggy version doesn't cut it.
Just check the previous ITC vote and comments. Several companies explicitly said that there was a place for both ODF and Open XML as a standard.
Here is an interesting take on this issue, including a reference to Kiwibank, showing how this whole thing can impact enterprises and the market.
Now that Windows Home Server is available here in New Zealand and Australia (the rest of the world needs to wait!) and seeing it doesn't look like it will be making it into MSDN subscriptions anytime soon, I decided to buy a copy now, and should have it here tomorrow morning.
The plan is to install Windows Home Server as the host OS, and keep running Windows Virtual Server on it, plus a couple of really cool plugins I found through wegotserved.co.uk.
This will reduce the number of virtual machines on this hardware by one, freeing up 1 GB RAM (out of a total 2.5 GB on this host), which will be enough to run the Windows Home Server as host. The host has an internal 160 GB SATA drive, plus two external drives for a total of 1.45 TB storage, so this should be enough.
I currently run Hamachi Premium on my Windows Home Server, which means I am always in my home LAN, regardless of where I am connected to the Internet.
As part of this master plan I am also getting a Logitech Squeezebox. I have tested the SlimServer software and it works really well under Windows Home Server, and the SoftSqueeze emulator played all my music content and radio, so adding the Squeezebox to the network is not going to be a problem.
Windows Home Server is a great home LAN solution, allowing you to automatically backup your PCs, and keep the content safe by automatically "balancing" the content between your drives. Adding or removing more disc space is easy, and all the "magic" mirroring happens behind the scenes.
Some of the plugins I have installed are Add Website and Whiist.
If you haven't seen Windows Home Server yet, here are some screenshots: