
The tech world has been inundated recently with all sorts of quixotic talk about how cell phones and US$100 laptops will bring the power of computers and the Internet to the furthest reaches of the globe, laying an inexpensive bridge across the digital divide and enabling citizens from every country on earth to flame each other in chat rooms. Given all the hype, it can be difficult to remember that there are large groups of people even in America who don’t see the need for access to the ‘Net. And among those who do, dial-up still remains a popular choice. Those are some of the conclusions from a recent study by Parks Associates about the use of technology in America, and their report suggests that the Internet access market has matured.

“Explosive growth,” a phrase once ubiquitous in discussions of Internet access, is no longer a reality in the US. From 1997 to 2000, home access grew at 4 to 7 percent per month, but that rate had slowed to 2 to 3 percent a month by 2002. This year, home access is expected to rise only one percentage point, from 63 to 64 percent. Of the 36 percent of households without access, the vast majority (29 percent of all households) don’t even own a computer, which suggests that they won’t be subscribing to DSL anytime soon. Cheaper machines may eventually attract these users, but prices are so low that a bargain-basement computer is already within the reach of most families that want one.

Remember modems? So do the 22 percent of Americans who still have dial-up. While this may come as a shock to the geekerati, millions of households manage to do their e-mail and Web browsing over 56Kbps modems—and most of them plan to keep it that way. A full 18 percent of US households (my grandparents included) have no plans to upgrade to broadband, no matter how many people complain about the constant busy signals when they attempt to call.

The numbers might suggest that the US has large segments of the population that are Internet-illiterate, but it’s important to keep in mind what these numbers are measuring: home Internet use. Those who have a computer but no Internet connection (7 percent) and those who don’t even own a computer (29 percent) may all access the ‘Net at other places, including work or the public library. The survey bears this out; of the households with no Internet access at home, 14 million use the ‘Net in other places, so it does appear that most people have at least a passing familiarity with e-mail and Web browsing.

With so little growth predicted in the industry, and with most dial-up users willing to stay with what they’ve got, ISPs in search of more money have limited options. AOL’s strategy is simply to gouge the consumer by charging dial-up users the same price that DSL subscribers pay, while many broadband operators are laying fiber and deploying other technologies designed to bring faster speeds to the home as a means of luring customers away from rivals. But raising speeds and raising prices aren’t likely to attract anyone who doesn’t already see the need for the Internet at home (just the opposite, in fact). One country that still does have new users hungry for the ‘Net is China, which is predicted to surpass the US in total broadband users by the end of this year, giving it the most broadband users of any country on the planet.


As I was sitting up late last night, it occurred to me that Microsoft needs to add a breathalyzer to the 360. Bear with me now; this is going to make sense. On the new iteration of Xbox LIVE you buy your arcade games with MS points, not money. Of course, adding more points to your account is as easy as hitting the X button while browsing and selecting how many you’d like. The amounts are designed so that you will always have a few left over, thus forcing you to buy more than you need and leaving 200 points perpetually stranded in your account.

You will never run out of points. You will always have a few left. It’s like three bucks you’ll never be able to spend, unless you want the Argentinian picture pack from FIFA 2006.

The whole system makes it too easy to spend money. Normally I’m fine; I can tell myself that I’ll just play the free demos of the games and decide if they’re worth my money. I know Marble Blast Ultra is only so-so, and during the day I don’t want to drop my money on points to pick it up. But late at night, at around 2 a.m. or after a few mai tais, the idea of paying for a game from my couch and playing it that moment is just too delicious. There goes US$10. Then I wake up feeling taken advantage of. Curse you, M$!

The amazing thing is how good a Marble Madness clone can look on a large television in high definition. I spent the evening rolling my marble (I changed it to the globe skin, woohoo!) across the boards, making sure I was laying waste to the high scores of the people on my friends list. It’s fun; you have built-in competition and bragging rights from the jump.

Still, I can’t help but think I need to set up a security system for myself. I wish I could have a code to get to my points, and then let a friend or my better half be the only person who knows said code. That way I’d at least have someone to talk me out of Wik: Fable of Souls before the sun comes up.


It’s been a few days since the last Guitar Hero post, but this may be an issue you’re dealing with, so it’s best to get it out in the open. Where it can breathe. A few weeks ago I spent just about an entire weekend playing Guitar Hero, and it was fun. Then when I got home I couldn’t play a song; I was failing even medium-level songs, and it was driving me up a wall. How come two days ago I was acing these tracks and now I can barely play any of them?

The issue, as Penny Arcade was kind enough to point out at the bottom of one of their posts, is that DLPs have trouble displaying an interlaced image. When they upscale it, there is a bit of lag. I’ve never noticed anything on my Xbox or 360 before because those run in high definition, or at least progressive scan. The few games I still play on my PS2 aren’t very timing-based, but with a game like Guitar Hero, timing is everything.

There are a few fixes being talked about on different forums. I’m not going to pay US$100 or more for a VGA box, so that’s out. I switched my settings around and turned some of my television’s smoothing effects off, and that didn’t do it either. Oddly enough, if I output the sound to my surround sound receiver instead of through the television, the lag disappears. Fixed. The only issue is that I’m out of slots on my receiver, so it’s either move the optical cable every time I want to Bark at the Moon, or get a switchbox. Argh.

Since most of your future gaming on the 360 and PS3 will be played in HD, this isn’t a huge strike against DLPs, and I do love my 56" behemoth. This is just something to be aware of if you really are still giving your copy of Amplitude some loving.



I’ve had Solitaire on my mind lately, and it all started when we heard the reports about the fellow who was canned for playing it on his workstation. I think we would be amazed at the statistics on how much productivity businesses have lost as a result of that simple application.

At a previous job, my company had gone to great lengths to prevent users from playing Solitaire on the company’s computers. They even used system policies to prevent the installation of the nefarious Windows components that included it. The individuals who missed their beloved Sol.exe soon figured out a way to start playing it again. The IT department had put the setup cabinet (.CAB) files on each hard drive to avoid manual insertion of the Windows setup CD. Unfortunately, they had failed to remove the Sol.exe file from one of those pesky .CAB files. The end-users capitalized on this goof and were soon playing their cards again.

Have you ever wondered who is to thank (or blame) for this lovely application? The answer is as simple as selecting the Help menu item in Solitaire and then clicking About. The About form clearly identifies the guilty party, a Mr. Wes Cherry. I’m astonished this man has yet to be served court papers for single-handedly destroying a large portion of productivity in businesses. Nonetheless, we at M-Dollar wish to salute this man for giving us a trusted friend to entertain us throughout the years. Thanks to Solitaire, receptionists, IT professionals, and managers pretending to work have received hours of entertainment during some of the most boring moments of their lives.

To close this tribute to Solitaire, I’d like to leave you with a few precious nuggets. First off, you can read a halfway serious interview with Wes Cherry over at B3TA. Secondly, I offer you an Easter egg that launches you into the end-game animation. While a game is in progress, press Alt+Shift+2. You’ll see the game play 52-card pickup with you, shooting the cards at you until it has iterated through the entire deck. Also, during that animation, mouse movement will increase the speed of the flying cards. Last but not least, if you think you are hot stuff when it comes to all things Solitaire, see how you stack up at the International Solitaire rankings website. Just don’t get caught playing during work hours, since it may grant you a place in the unemployment line.


Vista testers, get your guns ready: Vista Community Technology Preview (CTP) Build 5308 is out and ready to go. Yesterday, Microsoft released the feature-complete CTP to beta testers, who were apparently a little too eager, as Microsoft’s Connect web site was swamped with downloaders and completely inaccessible for a decent period of time. Nevertheless, the new CTP is out, and patient users will be able to get their hands on it soon enough. But what does the new CTP offer?

As stated earlier, Build 5308 is feature complete, which means that this release is a near-accurate glimpse of the final product. While this build is being touted as an "Enterprise" release because of its Enterprise-related functionality, it does contain many features that appeal to everyone. Here are some of the inclusions:

- The Mobility Center, which allows users to modify their mobile settings depending on the environment
- The Sidebar
- The Welcome Center, which lets users know of any actions that need to be performed on the OS
- BitLocker, which allows users to encrypt their data, keeping it hidden from other users on the same PC
- Internet Explorer 7
- Windows Defender
- Windows Media Player 11, plus a small toolbar for easy navigation when working in other windows
- A revised calendar
- The ability to display multiple clocks (e.g., you can see the time in London, NYC, and China all at once)
- Single-button desktop locking
- One-click shutdown

This release of Vista may appear to be for everyone, but Microsoft’s product manager for Windows Vista, Brad Goldberg, says it is targeted more toward businesses. Vista will benefit companies by reducing deployment and support costs, increasing security, connecting people and information, and increasing mobile productivity. Still, Goldberg believes that there is a little something in Vista for everyone:

"While there is a lot of cool stuff in the release that consumers will be excited about, Windows Vista is as much, if not more, a business focused release that will provide significant value for business customers."

The main Enterprise-related additions include BitLocker, better Group Policy features, better network administration tools, and new system imaging tools. System administrators can create an image (WIM file) that is hardware-independent and language-neutral. The images can be edited so that patches and updates can be added when necessary. Also in the imaging category is the System Imaging Manager, which allows custom configurations to be created offline and then executed during installs. "We have also addressed the cost and complexity around images. Industry data shows that most organizations spend about $100,000 per unique image they need to manage at the operating system level," Goldberg said.

The Group Policy Management Console (GPMC) has also been improved. User-specific policies can be created on the same machine. The GPMC now includes settings for printers, diagnostics, power management, Internet, Internet protocol security, removable storage, specific read/write access, and firewalls.

Many people have been wondering how many different versions of Windows Vista there will be. Microsoft still hasn’t made any public announcements, but it has said that the versioning will be made clear very soon. The company will also be releasing future CTPs that focus more on the end user. Finally, there is a small chance that we could see a second beta.

For those of you interested in testing the February CTP, one MSDN blogger is taking requests for nominations. MSDN subscribers can now download both the 32- and 64-bit versions from the subscription site. Then again, I was actually on the MSDN subscription site a short while ago, and the 32-bit February CTP was available but wouldn’t download. It magically disappeared seconds later. Bad timing! For now, I’ll just stare at the high-res screenshots over at ActiveWin and read the Build 5308 guides over at DigitalFive.


Yesterday IGN.com posted a status update on Nintendo’s upcoming Revolution, and things seem to be moving along swimmingly. The article quotes Reggie as stating that over one thousand dev kits are in the hands of studios, and apparently the kits have progressed through several prototypes and are nearing a final model. Developers seem really happy with the console thus far – especially the gameplay innovations that the much-lauded controller provides. Check out this quote from a studio rep that IGN interviewed:

At first, we were discouraged that it would be less powerful than Xbox 360, but once we got everything working with the controller, our concerns faded.

This is great news, but I found that one of the most intriguing new details was just briefly mentioned in the article. It seems that Revolution dev kits only cost around $2,000, which is cheaper than even the PSP dev kits and “thousands of dollars” cheaper than those of the other next-gen systems. I don’t know what the average price for these things is, but $2,000 seems like an affordable number even for small-time development houses. I wouldn’t be surprised if we see a whole slew of indie developers offer up titles for the Rev, which is definitely an exciting prospect.

Call me a fanboy if you want, but I really want to see Nintendo succeed with this system. They’re really showing their bravery by taking the Rev down the road less traveled in many respects, and we don’t see enough of that in this industry. If they can get some better third party support for this system than they had with the Gamecube, they may have a fighting chance to take back their rightful throne from Sony.


The City of London (not to be confused with the small “c” city of London) has just announced a plan to blanket the London financial district with what my (not-so-tech-savvy, but definitely improving) father suspiciously refers to as “them Internet rays.” The plan is to start with a section of London’s Square Mile financial district, and then to roll out WiFi coverage to the entire area over the course of the next six months.

The City has contracted a private company called The Cloud to mount 802.11 hardware on various lamp posts and high points around London. (Presumably, they’ll just stick the new hardware on top of the existing surveillance cameras that spy on every square inch of the city.) The network will be open in the sense that multiple ISPs can use the hotspots to provide Internet connectivity and other network services to end users. This means that one user could be making a VoIP call through one provider while another could be surfing the Web via a different ISP. I think this is a great model, and more US cities should copy it.

Speaking of US cities, San Francisco’s public WiFi plans proceed apace with today’s announcement that Google isn’t planning to go it alone, as we’d previously thought. Instead, they’re teaming up with Earthlink and entering the bidding process to handle the city’s upcoming tiered wireless network. According to the plan submitted by Google and Earthlink, Google would handle the free, 300Kbps tier of the service, which would make basic Internet connectivity available to anyone. Earthlink would provide the premium 1Mbps service tier, which could be accessed for a monthly fee of under $20.

Finally, Chicago recently announced that the city is taking proposals for its own citywide WiFi network. Not a lot is known about the proposed network yet, and the city is entertaining a number of proposals from private companies who might like to own, operate, and even charge for a piece of the network.


DirecTV has emerged as the major competition for cable companies when it comes to television. As a result, the company has won a certain amount of affection from its subscribers. For some, it’s simply because DirecTV is not Comcast, Charter, or whatever company has been granted an exclusive franchise where they live. Others love DirecTV because of its support for the TiVo, which it sadly no longer offers. Me? The wife loves TiVo and DirecTV offers Setanta Sports, which means I get to watch all the rugby I want.

But that’s not enough for DirecTV. Chasing what has become the holy trinity of video, broadband, and telephone service that has captivated both the cable and phone companies, DirecTV had dropped hints that it was planning to roll out its own broadband service in the form of WiMAX. Doing so would give DirecTV the missing pieces of the puzzle and allow it to offer its own integrated package of voice, video, and Internet.

Apparently the company is starting to have second thoughts about broadband. At an investor meeting where DirecTV announced plans for a video-on-demand service, the company hemmed and hawed about its plans for broadband.

DirecTV’s video-on-demand service will be up and running by the end of the year. Customers will need a new DirecTV-branded DVR (no TiVos, please) and a broadband connection. Programming will not be streamed, but will be downloaded to the DVR for viewing within a specified window, which, knowing how the satellite TV provider feels about controlling its content, will probably be no more than 24 hours. Plans call for 2,000 titles to be available at launch.

For broadband, DirecTV currently partners with other providers to offer cobranded and bundled DSL and VoIP services—Qwest in the Rocky Mountain region and BellSouth in the southern US, among others. In addition, the company at one point offered Direcway broadband over satellite, a solution for those without other broadband options or those who don’t care about lag. Over the past couple of months, DirecTV has talked about rolling its own broadband ISP using wireless. The most obvious option should it choose to go that route is WiMAX.

Unfortunately, WiMAX isn’t quite ready for market in the US. Promising such glorious things as 40Mbps speeds within 1.5- to 6.0-mile cells without the need for line of sight, WiMAX is inching ever closer to market. Last month, the WiMAX Forum officially certified its first hardware, which will use the 3.5GHz spectrum. That’s not a problem in Europe, but that spectrum is spoken for in the US. As a result, deployment will have to wait for the certification of 5.8GHz hardware towards the end of the year or in early 2007.

So if DirecTV wants to push ahead with its plans, it’s left with two unattractive options: go with an uncertified version of WiMAX that may not play nicely with the final specification, or create its own proprietary wireless network using unlicensed spectrum. Both are likely to be expensive, and buying up enough licensed spectrum for its own network would cost even more. While CEO Chase Carey is willing to spend up to US$1 billion to get a broadband service up and running, that may not be enough.

DirecTV appears to be intent on keeping its current broadband partnerships for the time being, but it will almost certainly keep an eye on developments with WiMAX. Once the first to-specification WiMAX installations appear in North America, DirecTV may very well decide to become a broadband ISP after all.


Many GMail-using Mac users, such as ourselves, probably already have some sort of third-party plugin that lets us see when we have new GMail (that is, if we’re not using our GMail accounts with our regular mail application via POP or IMAP—[edit] Sorry folks, I must have been on crack this morning!), such as gCount or Gmail Notifier for Mac. According to an announcement on the Official Google Blog, however, Google has now released a number of new Dashboard widgets to satisfy your OS X widget cravings, and they’re a bit more robust than those old & busted notifiers.

The GMail widget gives you a little window into your GMail world, letting you see all available messages and even search & compose. What’s cool about it, though, is that you can have several of these widgets open for different GMail accounts at the same time, something you can’t do with most existing GMail notifier plugins. So if you’re like us and have several GMail accounts—one for real friends, one for junk e-mail, and one for Craigslist casual encounters uh, family—this may be exactly what you’ve been wishing for!

The Blogger widget is, as you can imagine, associated with Google’s Blogger service and makes posting to your Blogger account quick & easy right from your Dashboard without having to deal with those nasty web browsers.

Finally, the search widget is a rather intriguing animal in itself. Whenever you do a search on Google, it records what you searched for and which links you clicked, so that you can go back to them later without having to fish through a whole sea of results all over again, just like the Google search history on the web. This, however, only works if you have a GMail account and are currently logged in. No anonymous search logging for you!

The last two are widgets that I don’t care for as much, but I love the GMail widget. Screw the nano Gmail Notifier!


Recently some folks posted questions about Visual Studio 2005 in the comments section. Many people were asking why they should upgrade. To me there are several reasons, so long as you can exhaustively test your application first, since who knows what bugs have yet to surface. The 2005 edition of Visual Studio makes one particular task easier, and that’s deployment. I’m all about ease of use; I remember watching a video of Steve Jobs demonstrating the power of developing on NeXTSTEP, and the ease of development blew me away. Visual Studio’s ClickOnce feature produced the same type of awe when I bumped into it. For those of you who are wondering, this feature is included in both the Express and Professional versions. Read on for more.

So what exactly is ClickOnce, you may ask? I’ll allow Microsoft the luxury of defining it:

ClickOnce is a new application deployment technology that makes deploying a Windows Forms based application as easy as deploying a web application. With ClickOnce, running a Windows Forms application is as simple as clicking a link in a web page. For administrators, deploying or updating an application is simply a matter of updating files on a server; no need to individually touch every client.

In the past I’ve had the privilege of distributing application updates through e-mail, fileservers, or web pages. By no means did I take pleasure in this duty; instead, it was a prickly thorn in my side. Inside a large company you tend to have all sorts of roadblocks that make application distribution a struggle. That is why Microsoft’s simple new deployment technology, ClickOnce, brought a smile to my face.

Now that I’ve sung some praise for ClickOnce, allow me to demonstrate the ease and power of this new feature in Visual Studio 2005. The following text is a brief, 30,000-foot overview of ClickOnce.

First off, you need a project; in this case I named mine "test." It’d be nice for our program to actually do something, so I’ve added a control to the form, then added a MsgBox call to the control’s Click event (see the sketch below). We now have a working application that we can build and publish. To publish, we click on the Project menu item and go to our application’s properties. Select the Publish tab on the left-hand side, and you’ll be presented with three categories of goodness to sort through. For the Publish Location I have chosen a Windows 2003 server that I administer.
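For the curious, here’s roughly what that amounts to in code. This is a minimal sketch, assuming a VB.NET Windows Forms project whose designer-generated partial class already contains a button named Button1; the names and message text are mine, not anything ClickOnce requires:

    ' Form1.vb: the entire "test" application we'll be publishing.
    Public Class Form1
        ' Fires when the user clicks our lone control on the form.
        Private Sub Button1_Click(ByVal sender As Object, ByVal e As System.EventArgs) Handles Button1.Click
            ' The sum total of our program's functionality.
            MsgBox("Hello from our ClickOnce test app!")
        End Sub
    End Class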

The next item we need to address is the Install Mode and Settings. Here we can specify whether the application can be executed from the web or locally. We want our clients to have the ability to run this lovely application locally, so of course we check "The application is available offline." The Application Files option allows control over the inclusion or exclusion of files associated with this project. We don’t have anything special to include here, so we’ll ignore this option. The Prerequisites option is important for us, since our application requires the .NET 2.0 Framework. By selecting the .NET 2.0 option, our end-users will be prompted to install the framework if it is missing from their system. The installer ClickOnce builds can install a number of other dependencies as well, such as the Visual C++ runtime libraries, SQL Server Express Edition, and older Windows Installer versions. These components will install from your server or the vendor’s website.

Moving on to one of my favorite features of ClickOnce, we have Updates. In the past it wasn’t impossible to build a web installer, but an update mechanism you could integrate with little or no effort was nonexistent. The Updates option allows us to choose whether or not our application will phone home to the installer location and prompt the user when newer revisions of our application exist. Depending on your preference, the application can look for updates before it launches, or after. Lastly, you can choose the minimum required version for this application, thus forcing clients to update if you find a critical bug.

The Publishing options allow us to specify the Publish Language, Publisher Name, and Product Name. More importantly, a support URL can be specified here, so users have a place to look for technical support. The actual deployment page name is specified in this section too, which defaults to publish.htm. Our remaining configuration option is the Publish Version, which, if specified, automatically increments with each publish. The time has come to actually publish the application to our server and have Visual Studio use ClickOnce to build an installer for it. If successful, you’ll see a "build succeeded" message in the bottom-left corner.
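Incidentally, the update check doesn’t have to be configured in the dialog at all. Here’s a hedged sketch of doing the same check manually: .NET 2.0’s System.Deployment.Application namespace is the API behind ClickOnce (you’ll need a reference to System.Deployment.dll), though the helper module below and where you’d call it from are purely illustrative:

    Imports System.Deployment.Application
    Imports System.Windows.Forms

    Module ManualUpdater
        ' Check the publish location for a newer build and apply it.
        ' Only meaningful when the app was actually launched via ClickOnce.
        Public Sub CheckAndUpdate()
            If Not ApplicationDeployment.IsNetworkDeployed Then Return

            Dim deployment As ApplicationDeployment = ApplicationDeployment.CurrentDeployment
            If deployment.CheckForUpdate() Then
                deployment.Update()
                ' The new version takes effect on the next launch, so restart now.
                Application.Restart()
            End If
        End Sub
    End Module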

To test the installer, we can simply punch in the URL of our server’s publish page. If all goes well, we’ll be greeted by an installer page. Our clients can install the application simply by clicking the Install button, and an entry will be added to the Add/Remove Programs page. Additionally, a program group and shortcut will be added to the Start Menu. Now if we were to publish a newer build, an end-user launching our application would be prompted with an update message: a simple dialog box asking them to update their application. If an update breaks the application or a show-stopping bug is found, the end-user can simply revert to a previous build of our application. This is accomplished by visiting Add/Remove Programs and clicking Change, which gives them the option to roll back one version.

One last point worth mentioning is that the installer created by ClickOnce works great in Firefox. The only catch is that you have to save the setup.exe to your desktop first, and then launch the installer. Hopefully I’ve piqued your curiosity about ClickOnce and given you at least one reason to give Visual Studio a spin. As I stated previously, Microsoft’s free Visual Studio Express includes ClickOnce, so you don’t have to pay to play with it. For more information on this nifty feature, Microsoft has an excellent FAQ that you can review.


The MPAA has filed (PDF) seven more lawsuits in their ongoing efforts to "thwart illegal file swapping on major pirate networks." The targets are various high-traffic web sites that facilitate piracy using services like BitTorrent, eDonkey, and USENET. The MPAA hopes that shutting down these web sites will make it more difficult for the "pirate networks" to accumulate and distribute copyrighted material.

Popular sites Torrentspy and Isohunt are among those listed in the press release as piracy perpetrators presently under scrutiny. According to the MPAA, these sites provide illegal access to tens of thousands of copyrighted works, and facilitate millions of illegal downloads.

The latest legal assault is unique in that it also includes the first MPAA lawsuits against web sites like NZB-Zone.com and BinNews.com that help users orchestrate USENET piracy. According to MPAA executive vice president and director of worldwide anti-piracy operations John G. Malcolm, the MPAA is now vigorously pursuing legal action against web site operators who aid and promote piracy on the Internet:

Website operators who abuse technology to facilitate infringements of copyrighted works by millions of people are not anonymous – they can and will be stopped. Disabling these powerful networks of illegal file distribution is a significant step in stemming the tide of piracy on the Internet.

In the past year, the MPAA has shut down about 75 separate Torrent and eDonkey sites. Last week, they successfully toppled the Razorback2 eDonkey server, one of the largest in the world, serving over 1 million simultaneous users. The MPAA’s approach is clear: target the front-ends if the backend network is untouchable.

Is the MPAA fighting a battle it can’t hope to win? It depends on the victory conditions. Certainly many of these web sites will be replaced with others, and life will go on; this has happened more than once before in the wake of a major torrent search-site takedown. Of course, most are in agreement that even if the MPAA’s aggressive legal tactics finally manage to put the public realm of piracy under close watch, users will simply move towards private file sharing networks that will allow them to evade detection and unwanted snooping. One must ask, however, if this is not the point. If piracy cannot be eliminated, driving it underground may seem like the next best option.

Although the MPAA’s frustration with piracy facilitators is understandable, the organization could better serve its own interests by working to establish a legal alternative to file sharing: flexible, affordable Internet content delivery that meets the needs of modern consumers.


California, the world’s sixth-largest economy, could once again use its massive market clout to impose de facto environmental regulations on the rest of the country if a new bill becomes law. Though California already prohibits many electronic components from being thrown in the trash, the proposed legislation would require consumer electronics sold in the state to be manufactured without toxins or other hazardous materials by 2008.

Whether or not the bill becomes a law, it does shine a spotlight on the issue of toxins used by the electronics industry. The EPA refuses to regulate the use of such materials, choosing instead to leave the decision up to each state. Lori Saldana, a San Diego Democrat, believes that the time is right for California to do something about the problem—and it is a problem. A 2004 study showed that dangerous chemicals and metals were present in the dust surrounding many common PCs and monitors. And on the national level, nearly 70 percent of a typical landfill’s heavy metals come from electronic components. Materials such as lead, cadmium, chromium, mercury, and brominated flame retardants are commonly used in cell phones, music players, and computers, and while they do not commonly cause direct harm to consumers, such devices can leach these materials into groundwater when placed in landfills. It’s a global issue as well, since much of the recycling of electronic components takes place in developing countries such as India and China, where workers routinely expose themselves to these unsafe substances with inadequate safeguards and suffer the consequences.

Though California is now considering a bill to ban such substances, the US is not exactly leading the world on this topic. Asian nations as well as the EU have adopted green manufacturing guidelines, and US-based manufacturers are already contemplating changes to their own products because they want to avoid being locked out of these important markets. In the US, though, legislative solutions to the problem are generally frowned upon; one idea, the National Computer Recycling Act, has been introduced in Congress multiple times without success. Its most recent incarnation has been sitting in committee for a year and doesn’t look to be going anywhere.

US businesses have responded with a patchwork of programs designed to encourage proper recycling of consumer electronics (though not necessarily changing the way they are manufactured). Dell (like other PC makers) runs a computer recycling program that provides a low-cost way to recycle components, though the company did come under fire for allegedly using prison labor. Apple has a similar program (though without the prisoners) in which you can drop off your old iPod at Apple stores for free recycling. Electronics retailers now commonly offer bins for the recycling of old cell phones and batteries.

These have been largely voluntary efforts so far, though that could change if the California bill is eventually signed into law. When paired with the EU’s strict manufacturing guidelines, the result would certainly be a change in practice by most global manufacturers, who would surely find it more cost-effective to develop new manufacturing techniques than to give up both the American and European markets.


As reported last week, AACS is a "go." Short for Advanced Access Content System, AACS is a kind of successor to CSS; the specification is being adopted by both HD DVD and Blu-ray for the purpose of securing commercial video content stored on those formats. AACS is far broader than CSS, but that need not concern us now. What I would like to draw some attention to today is the fact that AACS also has some important off-disc ramifications.

The AACS Interim Agreement is a 106-page set of rules for those who wish to license AACS, a list that would presumably include consumer electronics manufacturers and many other technology companies. Although AACS includes, among other things, the ability for content owners to determine what kinds of outputs HD video is directed towards, the license also includes a forced sunset for most of those outputs. Put simply, AACS licensees must eliminate analog outputs on consumer electronics devices by 2013 to remain in compliance with the license. Forced obsolescence it is.

The so-called "analog hole" is the bane of Hollywood’s attempt to control your entertainment life; put succinctly, if you can hear it or see it on today’s consumer electronics devices, you can copy it, with rare exception. The reason is that sooner or later, most signals are converted to analog, and most analog playback devices lack DRM, and analog signals themselves are not particularly well suited to DRM control. Hollywood hates the analog hole, and they have introduced some shocking legislation to shut it down.

While much attention has been paid to how next-gen digital formats won’t play on many HD-capable displays, including TVs, we’re only now getting a look at AACS’s other foibles. The agreement, in its current form, requires that manufacturers cease selling any devices capable of analog video output that pass decrypted AACS content after December 31, 2013. Furthermore, a "sunset" period will begin in 2010, during which time manufacturers will need to scale back their analog support. Existing player models of any (compliant) sort may be sold up to December 31, 2011, with one catch: any device manufactured between December 31, 2010 and the ultimate cutoff date in 2013 must have its analog outputs limited to SD Interlace modes only (composite, S-Video, 480i component, and possibly 576i component [the latter will be addressed in the final version of the license]). Older models can also be sold if the manufacturer can programmatically enforce this SD Interlace mode limitation.

Brave, new future

2013 may seem like a long way away, and it is. For technology planners, it’s not so far away, and this can be perhaps best illustrated when we consider the debates over the transition to digital TV in the United States. While the government is drooling in anticipation of the money it will make by selling the spectrum that is currently used by analog TV broadcasts, it’s also realistic. The switch to digital TV won’t happen until February of 2009, and even then, the government will provide millions in subsidies to homes that will need analog-to-digital converters to make the jump. Lawmakers know that February 2009 is an aggressive move date, but they believe that converters will make it possible to proceed without worrying about how old the TV technology among the populace is.

Put against this backdrop, a sunset for analog outputs beginning in 2011 begins to look questionable in terms of feasibility (leaving aside all other objections at the moment). With some estimating that as many as 40 million homes will lack the ability to receive digital TV signals, a parallel guesstimation can be made: similar (if not greater) numbers of homes will also lack televisions capable of handling protected digital inputs.

The end result will likely be a consumer rush to buy less-crippled players (you can’t call them non-crippled because AACS cripples by definition) before the sunset period begins. Otherwise, come 2011, analog viewing on new players will be limited to SD Interlaced quality, and on New Year’s Day 2014, analog outputs will be a thing of the past. For the movie industry, then, 2014 is the year that the analog hole starts to walk the plank, as protected transmission comes to dominate. Such is their hope.
