Sierra Wireless has added support for Mac OS X to their Watcher software. The software supports the AC597E (Express Card), AC595U (USB) and AC595 (PCMCIA) models.
There's also an updated Windows version (v0.6.0.64) which introduces a new driver, and allows additional VPNs to be added.
Unfortunately the venue is now at seating capacity, but you can always stand around the walls.
This Friday 1st February I will be flying up to Auckland to join other local and international guests attending the Kiwi Foo Camp (a.k.a. BaaCamp). This is the second installment of this event and I am looking forward to meeting a lot of new people there!
No, it is not really. Sometimes this error happens even when I copy a file with a single-character name from the root folder of any non-storage-pool disc to one of the Windows Home Server shares. And it happens only when copying those files inside the server itself. There is no path "too deep" in this operation. It's just an error message that doesn't explain anything.
The error won't happen if I copy files from one disc on the server to another folder on the server from a client PC. But this is not a feasible workaround, because copying a bunch of 30 GB files round trip over the LAN is an exercise in patience, as anyone who has tried network copies with Windows Vista knows.
I've submitted this as a bug during beta, applied a debug driver, installed a fresh copy of RTM, and submitted this as a bug again. So far no fix for this, not even in Windows Home Server Power Pack 1.
Of course it is not as bad as the Windows Home Server file corruption bug. But it is bad if you have large files residing on a drive in your server that you want copied to the storage pool - for backup purposes, for example.
I found out that this error only happens if you use the standard copy procedure: select file(s), Copy, select destination, Paste. Or drag and drop from source to destination.
So I decided to try alternative copy methods. And I found that TeraCopy is much faster, can be paused and restarted and, most importantly, does not show the "Path Too Deep" error.
It can be configured to completely replace the standard method - even being invoked when you copy and paste or drag and drop files or folders.
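If you would rather not install extra software, a plain chunked read/write loop in a script also bypasses the shell's copy handler, which seems to be where the error originates. This is only a sketch under that assumption - I have not verified it avoids the error in every case, and the example paths are hypothetical:

```python
import shutil

def copy_file(src, dst, chunk_size=4 * 1024 * 1024):
    """Copy src to dst in 4 MB chunks, without going through
    the shell's copy handler."""
    with open(src, "rb") as fin, open(dst, "wb") as fout:
        while True:
            chunk = fin.read(chunk_size)
            if not chunk:
                break
            fout.write(chunk)
    shutil.copystat(src, dst)  # preserve timestamps and permissions

# Hypothetical paths - adjust for your server's share layout:
# copy_file(r"D:\Videos\movie.iso", r"D:\shares\Videos\movie.iso")
```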
Now I read a report from Research and Markets saying they expect the number of mobile subscribers in China to increase from 540 million (2007) to about 738 million in 2010.
I never managed to get hold of one of these reports ($$$) to confirm their claims. But it's not hard to believe the numbers should be pretty close to reality.
Peer-to-peer (or P2P) is a technology that allows large files such as video and music to be distributed over the network in a very efficient way, with computers sharing pieces of files from different users - or peers.
This is different from a centralised model, because it doesn't need the central server to be always available, and it's faster because it does not consume the total upstream bandwidth from a single central server at once.
Software companies are increasingly using P2P technology to distribute their software and patches, usually in the ISO format, which is a single file image of a DVD or CD contents.
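The "pieces" idea can be illustrated with a short script. BitTorrent, for example, splits a file into fixed-size pieces and publishes a hash of each one, so a downloader can verify each piece independently no matter which peer it came from. This is a simplified sketch, not the actual BitTorrent metadata format:

```python
import hashlib

PIECE_SIZE = 256 * 1024  # 256 KB, a common BitTorrent piece size

def piece_hashes(data, piece_size=PIECE_SIZE):
    """Split a payload into fixed-size pieces and hash each piece.

    Peers can fetch pieces from different sources and check each one
    against this list before accepting it."""
    return [
        hashlib.sha1(data[i:i + piece_size]).hexdigest()
        for i in range(0, len(data), piece_size)
    ]
```

A 700 MB CD image at this piece size yields around 2,800 hashes, and any peer holding a verified piece can serve it to others - which is why the load spreads instead of hitting one central server.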
Peer-to-peer traffic is also successfully used for voice-over-IP (VoIP) for example, with Skype being a prime example.
But ISPs don't like P2P. For one, they claim it overloads their networks due to the exponential increase in traffic.
So those ISPs try all they can to "manage" or reduce the available bandwidth for P2P applications, using special hardware and software that analyses the packets flowing through their networks and throttles the speed.
This would all be fine if ISPs provided "unlimited" network traffic in New Zealand. But not many plans offer this option, and most charge users for a limited amount of traffic every month.
What you as a user do with that limited amount is your business. Whether you use 50 GB on email, 50 GB browsing the web, or 50 GB downloading large files doesn't matter. It is still 50 GB.
Now it appears Slingshot wants to go and limit what everyone can do with their Internet connections. The news came in an article on Stuff (ISPs try to turn torrent) and people are confused and not happy with this:
The junior telco [CallPlus] has invested in Cisco equipment that will let it "throttle" the bandwidth available for BitTorrent traffic, sources say.
It is estimated that 60 to 90 per cent of all Internet traffic is now made up of videos, TV shows, music and software that is stored on people's PCs and shared around the world using peer-to-peer services, of which BitTorrent - with 160 million registered users - is the most popular.
CallPlus chief executive Martin Wylie says the company is "always trying to manage the experience of all customers as best we can".
But Mr Moore says the growing popularity of peer-to-peer services has eroded the experience of web browsing for many people over the past two years and ISPs should not be afraid to say they are taking action.
When email was first "liberated" to the masses no one predicted the uptake it would have. Today billions of messages are exchanged every year, and most are spam. ISPs are battling this problem, but they had to expand their infrastructure to cope with the messages. It's a natural part of their business.
The same goes for P2P. It's just another tool, like email, and it is responsible for most of the traffic on the Internet now, it seems. So it's up to the ISPs to improve their infrastructure to provide the service users want - not to restrict them.
Again, if someone pays for a plan with 50 GB a month, then it is up to them how they use it. The ISP is going to receive the monthly payment for those 50 GB regardless of the kind of traffic - email, web, FTP, newsgroups, P2P.
From the same article:
Mark Rushworth, chief executive of Vodafone-owned ISP ihug, says peer-to-peer services were once the preserve of tech-savvy Internet users but new services such as LimeWire that are easier to use have encouraged take-up by the mass-market.
"Peer-to-peer and people embracing video makes it harder for ISPs to have flat-rate `all you can eat' plans."
If ISPs are serious about providing a service they should charge for it accordingly. A real, unlimited, all you can eat, plan is available from Actrix. It costs $596/month - and I hear it's worth it.
If someone wants something cheaper they should find a plan that allows a base usage and then the purchase of additional data blocks. Two examples of this kind of plan are the Xnet Fusion plan, which allows users to buy additional allowance at $1.02 per GB, and the Slingshot Extreme, which provides 6 GB to start with and then lets users purchase data blocks of up to 50 GB for $40.
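The per-gigabyte arithmetic behind those add-on options is easy to check (plan names and prices as quoted above):

```python
# Per-GB cost of the add-on data options mentioned above
xnet_per_gb = 1.02          # Xnet Fusion: $1.02 per extra GB
slingshot_per_gb = 40 / 50  # Slingshot Extreme: $40 per 50 GB block

print(f"Xnet Fusion:       ${xnet_per_gb:.2f}/GB")
print(f"Slingshot Extreme: ${slingshot_per_gb:.2f}/GB")
# The 50 GB block works out cheaper per gigabyte: $0.80 vs $1.02,
# provided you actually use the whole block before it expires.
```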
But again, these are paid for and should not be limited. Bits are bits and if they are part of an email, web page or anything else it doesn't change their nature.
ISPs shouldn't limit in any way how users put those data blocks to use, regardless of whether they are additional to a plan or part of an "unlimited" plan. ISPs shouldn't call a plan "unlimited" if they have no intention of offering unlimited utilisation.
If they were driven to offer "unlimited" plans at cheap prices and have no intention of following through on their promise, well, they didn't plan well, did they?
In the PC market report, Gartner tells us 271.2 million PCs were sold in 2007. HP was the number one company with 18% of the worldwide market, followed by Dell (14.3%), Acer (8.9%), Lenovo (7.4%) and Toshiba (4%). The research firm counted desk-based PCs, mobile PCs and x86 servers.
Then I looked at the mobile phone industry numbers. Nokia alone reports 1.14 billion units. Sony Ericsson sold about 100 million handsets. Apple sold 4 million of its just-released iPhone, compared with about 10 million computers sold by Toshiba around the world.
And that's without counting other manufacturers such as Sagem, Samsung, Sanyo, Kyocera, HTC and the rest.
It puts the whole "content distribution" problem on another level, doesn't it?
The service will be unavailable Tuesday, 29 January 2008 from 11am for 30 minutes. Check this link for other timezones.
The new HDD will allow us to create a mirror and increase the site availability in case of a drive failure.
This is the Executive Summary from this report:
2. Executive Summary
2.1. Findings and Investigations
2.1.1. In March 2007, InternetNZ formed an external group (the Group) to investigate industry solutions around issues of Internet peering and local data interconnection, and hereby presents its findings.
2.1.2. Differences in what is meant by the word “peering” and emotive responses flowing from historical events can be overcome by defining as far as possible what peering means, or by referring instead to local data interconnection where possible.
2.1.3. The cost of transit appears to be high within New Zealand relative to that in comparable countries around the world. This would appear to be a significant driver of dissatisfaction around issues of interconnection, and many such issues might evaporate if the cost of transit were to reduce substantially.
2.1.4. The availability of complete and reliable traffic statistics for the Internet both within New Zealand and in and out of New Zealand appears to be impossible to obtain. Many sources have partial information, but these partial sources are often misleading and are responsible for creating much misinformation about what is actually happening in the Internet.
2.1.5. Take-up and delivery of rich media content is hampered on two levels. From the content provider’s perspective high national transit cost leads to hosting of content offshore and from a consumer’s perspective, lack of differentiation between national and international traffic charges limits the volume of rich content that consumers can access at reasonable cost.
2.1.6. There is no recognition that local traffic is cheaper to deliver than national traffic or international traffic. This may change if a local data interconnect proposal that Telecom is proposing to the market is successful. Success for that proposal would lead to a different peering model that would enable New Zealand consumers to react positively to the emergence of rich media content, and which we would expect would be supported by telecommunications firms and ISPs.
Now read one of the items in the comments submitted from Vodafone:
5. Consumer Pricing Strategies
The report suggests consumer pricing strategies that differentiate between national and international traffic might encourage a greater take-up of New Zealand-sourced content. Do you agree?
No. ISPs have done this in the past and it made no discernable difference. The cost of international content is approximately double that of national and that is not enough difference to lead to a real change in behaviour. This challenge is firmly in the hands of producers of local content.
And this is what TelstraClear has to say:
It is not clear how the creation of separate pricing for local and international traffic would benefit end users, or how it could be effectively implemented.
Back to some comments from Vodafone:
2. Cost of Transit
International benchmarking suggests the cost of transit is relatively high in New Zealand. Do you agree? Specifically, is there a market failure or evidence of SMP (significant market power) with regard to the cost of transit? Should government conduct an investigation regarding the price of transit? And finally should Government regulate the price of transit?
Yes the cost is high, primarily due to lack of competition. Much of whatever competition that exists is regionally localised. However, we do not see a need to regulate price and do not see the need for an investigation unless it is done with a solution in mind, such as public investment into areas where competition is lacking.
3. The state of networking - lack of reliable data
The report highlights the difficulty in having any informed debate about the state of networking in relation to local, national and international data interconnection in New Zealand, without access to reliable Internet traffic data and statistics.
Do you support the need for an initiative that would collect and make available on an aggregated basis, New Zealand Internet traffic flows and volumes? If so who should collect that data? Specifically; the Telecommunication Carrier’s Forum, Commerce Commission, Ministry of Economic Development, InternetNZ, or other?
Vodafone/Ihug does not see this as a high priority. If such information was collected it should be kept confidential with only high level aggregate data made public. We don’t have a strong preference for which organisation would do this but note the Commerce Commission has responsibility under the Telecommunications Act for reporting on the performance of the telecommunications market. Any such information flow could perhaps be facilitated by the TCF Information Reporting working party.
The peering report is 84 pages long, so I suggest you have a read and draw your own conclusions.
Are you saying that this linux can run on a computer without windows underneath it, at all ? As in, without a boot disk, without any drivers, and without any services ?
That sounds preposterous to me.
If it were true (and I doubt it), then companies would be selling computers without a windows. This clearly is not happening, so there must be some error in your calculations. I hope you realise that windows is more than just Office ? Its a whole system that runs the computer from start to finish, and that is a very difficult thing to acheive. A lot of people dont realise this.
Microsoft just spent $9 billion and many years to create Vista, so it does not sound reasonable that some new alternative could just snap into existence overnight like that. It would take billions of dollars and a massive effort to achieve. IBM tried, and spent a huge amount of money developing OS/2 but could never keep up with Windows. Apple tried to create their own system for years, but finally gave up recently and moved to Intel and Microsoft.
Its just not possible that a freeware like the Linux could be extended to the point where it runs the entire computer fron start to finish, without using some of the more critical parts of windows. Not possible.
The original actually is here, where you can find other "gems" of knowledge:
Hey, I’m new here. A while ago I tried to talk to a bunch of people on another board and they were telling me Linux is not a Windows program. I’m here to prove them wrong.
Linux will have to find a way to work under Vista from here on, since it wont be able to rely on XP being readily available anymore.
Linux may seem like a good alternative to Office, but all that is happening in linux is that the windows interface is cleverly hidden away. It still needs the drivers and software services in order to run, and in most cases - that happens WITHOUT a valid windows licence.
This is just plain piracy.
Another probable reason for the failure to install is because your machine was running too fast - linux works best on machines around the 1GHz mark, and would be very unstable running on a 3GHz machine.
If this is linkbait, he succeeded brilliantly. Otherwise I would say someone forgot to take the meds...