
The tech world has been inundated recently with all sorts of quixotic talk about how cell phones and US$100 laptops will bring the power of computers and the Internet to the furthest reaches of the globe, laying an inexpensive bridge across the digital divide and enabling citizens from every country on earth to flame each other in chat rooms. Given all the hype, it can be difficult to remember that there are large groups of people even in America who don’t see the need for access to the ‘Net. And among those who do, dial-up still remains a popular choice. Those are some of the conclusions from a recent study by Parks Associates about the use of technology in America, and their report suggests that the Internet access market has matured.

“Explosive growth,” a phrase once ubiquitous in discussions of Internet access, is no longer a reality in the US. From 1997 to 2000, home access grew at 4 to 7 percent per month, but slowed to 2 or 3 percent a month by 2002. This year, it is expected to rise only one point, from 63 to 64 percent. Of the 36 percent that don’t have access, 29 percent don’t even own a computer, which suggests that they won’t be subscribing to DSL anytime soon. Cheaper machines may eventually attract these users, but prices are already so low that a bargain-basement computer is within the reach of most families that want one.

Remember modems? So do the 22 percent of Americans who still have dial-up. While this may come as a shock to the geekerati, millions of households manage to do their e-mail and Web browsing over 56Kbps modems—and most of them plan to keep it that way. A full 18 percent of US households (my grandparents included) have no plans to upgrade to broadband, no matter how many people complain about the constant busy signals when they attempt to call.

The numbers might suggest that the US has large segments of the population that are Internet-illiterate, but it’s important to keep in mind what these numbers are measuring: home Internet use. Those who have a computer but no Internet connection (7 percent) and those who don’t even own a computer (29 percent) may all access the ‘Net at other places, including work or the public library. The survey bears this out; of the households with no Internet access at home, 14 million use the ‘Net in other places, so it does appear that most people have at least a passing familiarity with e-mail and Web browsing.

With so little growth predicted in the industry, and with most dial-up users willing to stay with what they’ve got, ISPs in search of more money have limited options. AOL’s strategy is simply to gouge the consumer by charging dial-up users the same price that DSL subscribers pay, while many broadband operators are laying fiber and deploying other technologies designed to bring faster speeds to the home as a means of luring customers away from rivals. But raising speeds and raising prices aren’t likely to attract anyone who doesn’t already see the need for the Internet at home (just the opposite, in fact). One country that still does have new users hungry for the ‘Net is China, which is predicted to surpass the US in total broadband users by the end of this year, giving it the most broadband users of any country on the planet.


As I was sitting up late last night it occurred to me that Microsoft needs to add a breathalyzer to the 360. Bear with me now; this is going to make sense. On the new iteration of Xbox LIVE you buy your arcade games with MS points, not money. Of course, adding more points to your account is as easy as hitting the X button while browsing and selecting how many you’d like. The amounts are designed so that you will always have a few left over, thus forcing you to buy more than you need; there always seem to be 200 points or so sitting in my account.

You will never run out of points. You will always have a few left. It’s like three bucks you’ll never be able to spend, unless you want the Argentinian picture pack from FIFA 2006.
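
To make the leftover math concrete, here is a minimal sketch of how those stranded points pile up. The bundle sizes and the 400-point game price below are assumptions chosen purely for illustration, not official figures; the effect is the same with any increments that don't divide evenly into the prices.

```csharp
// Rough sketch of why leftover points accumulate. The bundle sizes and the
// game price are hypothetical, not official Microsoft figures.
using System;
using System.Linq;

class PointsLeftover
{
    static void Main()
    {
        int[] bundles = { 500, 1000, 2000, 5000 }; // hypothetical purchase increments
        int gamePrice = 400;                       // hypothetical Arcade game price
        int balance = 0;

        // Buy the smallest bundle that covers the purchase, then spend.
        int bundle = bundles.First(b => balance + b >= gamePrice);
        balance += bundle;
        balance -= gamePrice;

        Console.WriteLine($"Bought a {bundle}-point bundle, spent {gamePrice}, " +
                          $"and {balance} points are left sitting in the account.");
    }
}
```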

The whole system makes it too easy to spend money. Normally I’m fine; I can tell myself that I can just play the free demos of the games and decide if they’re worth my money. I know Marble Blast Ultra is only so-so, and during the day I don’t want to drop my money on points to pick it up. But late at night, at around 2 a.m. or after a few mai tais, the idea of paying for a game from my couch and playing it that moment is just too delicious. There goes US$10. Then I wake up feeling taken advantage of. Curse you, M$!

The amazing thing is how good a Marble Madness clone can look on a large television in high definition. I spent the evening rolling my marble (I changed it to the globe skin, woohoo!) across the boards, making sure I was laying waste to the high scores of the people on my friends list. It’s fun; you have built-in competition and bragging rights from the jump.

Still, I can’t help but think I need to set up a security system for myself. I wish I could have a code to get to my points, and then let a friend or my better half be the only person who knows said code. That way I’d at least have someone to talk me out of Wik: Fable of Souls before the sun comes up.


It’s been a few days since the last Guitar Hero post, but this may be an issue you’re dealing with, so it’s best to get it out in the open. Where it can breathe. A few weeks ago I played an entire weekend’s worth of Guitar Hero, and it was fun. Then when I got home I couldn’t play a song; I was failing even medium-level songs, and it was driving me up a wall. How come two days ago I was acing these tracks and now I can barely play any of them?

The issue, as Penny Arcade was kind enough to point out at the bottom of one of their posts, is that DLPs have trouble displaying an interlaced image. When they upscale it, there is a bit of lag. I’ve never noticed anything on my Xbox or 360 before because those run in high definition, or at least progressive scan. The few games I still play on my PS2 aren’t very timing-based, but with a game like Guitar Hero, timing is everything.

There are a few fixes, discussed on various forums. I’m not going to pay US$100 or more for a VGA box, so that’s out. I switched my settings around and turned off some of my television’s smoothing effects, and that didn’t do it either. Oddly enough, if I output the sound to my surround sound receiver instead of through the television, the lag disappears. Fixed. The only issue is that I’m out of slots on my receiver, so it’s either move the optical cable every time I want to Bark at the Moon, or get a switchbox. Argh.

Since most of your future gaming on the 360 and PS3 will be in HD, this isn’t a huge strike against DLPs, and I do love my 56" behemoth. It’s just something to be aware of if you really are still giving your copy of Amplitude some loving.


Tectonic rebound


Glacial rebound is a fairly simple concept: put a massive ice sheet on top of some ground, and the ground will compress and deform. Remove the ice sheet, and the ground will gradually begin to return to its prior state. This slow rebound will clearly cause some stresses along the way. Because of this, the entire period since the last glacial age can be viewed as a bit of a geology experiment. And geologists are thinking that among the experimental results were the New Madrid Earthquakes, a series of three massive jolts centered in Missouri that were powerful enough to shift the course of the Mississippi River back in 1811. A Stanford researcher presented new models at last week’s AAAS meeting that reinforced and extended his earlier publications indicating that it was glacial rebound that provided the force to generate these quakes in an area that’s nowhere near the border of any tectonic plates.

The most significant part of the new model is that it suggests the rebound isn’t done yet. Similar quakes might be expected to occur for another few thousand years as the last of the glacial strain is relieved. This is especially worrisome because much of the Midwest doesn’t appear to be conscious of the danger and isn’t building as if there is any. That stands in contrast to the strict building codes of areas with widely recognized risk, such as California.

Being pretty geologically ignorant, I was struck most by the fact that the entire northern half of the continent is undergoing glacial rebound: why New Madrid? The glaciers also made it to New York Harbor, and the nearby Ramapo fault would seem to provide an ample opportunity to relieve some of that stress, but it has never spawned anything that large. In the time I’ve lived in the city, the only thing I’ve felt was a 2.0 magnitude pop centered somewhere around 90th and 3rd Ave. Although New York City’s building codes require new construction to survive a 5.5 magnitude quake, there are a lot of old buildings around, many of them with ornamentation that, having gone through the ’89 Loma Prieta quake in California, I find a bit unnerving.


Massively multiplayer online games (MMOs) have been around for a while, but their continued and massive revenue growth of late has made the industry sit up and take notice. There are many new MMO products in the works, most funded by large publishers and given huge amounts of money to make a splashy debut in the marketplace. However, a new company, Multiverse Networks, is taking a different approach.

The company recently announced the start of beta testing for its primary product, the Multiverse Platform. It consists of a package containing a client, a server, and a bundle of tools that allow third-party developers to create their own MMO games. According to their website, it allows developers to create a fully customized MMO for a fraction of the cost it would take to create one from scratch.

The idea of an “MMO Construction Set” has been rattling around for a while. In the early days of text-based Multi-User Dungeons (MUDs), one of the most compelling aspects of the game play was the ability for advanced users to achieve “wizard” status and start to modify the game world themselves. More recently, games such as Bioware’s Neverwinter Nights have shipped with a built-in world editor, and enterprising programmers released open-source modules that extended the game play to provide persistent multi-player worlds. Prairie Dog Games’ Minions of Mirth (an MMO selling for US$25 with no monthly fee) was recently released with a free server and source code to allow aspiring MMO authors to create their own content. However, Multiverse has been attracting the most attention from the gaming industry.

Instead of charging other companies an upfront cost for the platform, Multiverse Networks plans to license the software for free and charge royalties on income. Developers can host their own games on their servers, and use Multiverse’s tools to handle subscriptions and billing. Theoretically, this could allow MMOs to be developed for a much smaller up-front cost, estimated to be as low as US$10,000, rather than the tens of millions that games such as EverQuest II and World of Warcraft cost to launch.

However, these figures are somewhat misleading. The majority of the cost of launching a new, top-tier MMO is not the price of developing the client, server, and tools. Rather, it is the amount that needs to be spent on salaries for skilled artists, game designers, and even testers to create and test enough content to make a compelling MMO experience. Acquiring and utilizing this talent is more difficult than it appears. To prove to investors and potential clients that their technology is capable of competing with existing MMO games, Multiverse Networks has created their own sample game with their platform, a fantasy role-playing game called Kothuria: The World’s Edge.

A brief glance at the Kothuria screenshots shows the difficulty of making an MMO that can stand out from the crowd, let alone compete with Blizzard’s 5.5-million-subscriber juggernaut. The graphics, while taking advantage of DirectX 9 special effects, have a generic and sterile feel to them. This problem is not unique to Kothuria: Sony’s EverQuest II has also been criticized for the same thing. But when you look more closely at Kothuria, other problems emerge. The tiny icons that represent player inventory are almost impossible to tell apart. Blizzard expended considerable effort creating beautiful art for its user interface, and because the player spends a lot of time staring at these icons, poorly designed ones detract from the gaming experience. It seems like a small thing, but all these tiny things add up in an MMO, and combined they make the difference between a game that fails and one that winds up with millions of subscribers. After all, World of Warcraft didn’t do anything that hadn’t been done before. It just did it better.

So, if Multiverse isn’t likely to produce the next MMO blockbuster, does it have other possibilities? Movie director James Cameron seems to think so. He and Jon Landau recently joined Multiverse Networks’ board of directors, and Cameron says that he plans to use the technology to deliver an MMO based on one of his upcoming movies, before the film is released. While nothing can replace talent, money, and hard work, the availability of cheaper MMO creation tools does open up some new possibilities for the genre. Instead of yet another fantasy role-playing game, developers may take advantage of the low cost of these tools to experiment with new genres. The only problem for gamers may be a lack of time to play them all.


Last year, the "Gathering Storm" report was released, warning about the potential dangers set to befall science in the US in years to come. Now there’s another one to add to the pile: the Science and Engineering Indicators 2006 report from the NSF, analysing the state of US science and engineering relative to the rest of the world.

It makes for interesting reading. A chief point is the growth of R&D spending outside of the traditional big three – the US, EU and Japan. As measured by publications:

The total number of articles rose from 466,000 in 1988 to 699,000 in 2003. Over the period, the combined share of the United States, Japan, and the EU-15 declined from 75% to 70% of the total, with flat U.S. article output from 1992 to 2002, leading to a drop of the U.S. share from 38% to 30%. Meanwhile, EU-15 output rose steadily to surpass that of the United States in 1998, and Japan’s output also continued to rise. Output from China and the Asia-8 expanded rapidly over the period, by 530% and 235%, respectively, boosting their combined share of the world total from less than 4% in 1988 to 10% in 2003. By 2003, South Korea ranked 6th and China ranked 12th in world article output. Increases in other parts of the world tended to be more modest.

The explosive growth of scientific R&D in Asia would be evident to anyone who’s spent any time in a lab recently. All those Chinese and Indian researchers who have spent time in the US and EU and then returned to their home countries have been applying the skills honed in the developed world, and multinational companies are spending their R&D budgets in the region. Then there is the governmental aspect; countries like Singapore are committed to making themselves frontrunners in new fields, at the expense of the US, EU, and Japan.

Increases in spending on scientific R&D can be difficult to compare, due to question marks over applying purchasing power parity (PPP) to foreign countries:

It is difficult or impossible to assess the quality of PPPs for some countries, most notably China. Although PPP estimates for OECD countries are quite reliable, PPP estimates for developing countries are often rough approximations.

Chinese expenditure on science could, then, in some cases be counted as four times the actual dollar amount.
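
To see where a factor like that could come from, here is a toy sketch of the gap between a market-exchange-rate conversion and a PPP conversion. All the numbers below are made up for illustration and are not taken from the NSF report.

```csharp
// Toy illustration of how a PPP conversion can inflate an R&D figure relative
// to a market-exchange-rate conversion. All numbers are hypothetical.
using System;

class PppSketch
{
    static void Main()
    {
        double spendingYuan = 100e9; // hypothetical R&D spending in local currency
        double marketRate   = 8.0;   // yuan per dollar at the market exchange rate
        double pppRate      = 2.0;   // yuan per dollar using a rough PPP factor

        double marketDollars = spendingYuan / marketRate; // 12.5 billion
        double pppDollars    = spendingYuan / pppRate;    // 50 billion

        Console.WriteLine($"Market-rate figure: ${marketDollars / 1e9:F1} billion");
        Console.WriteLine($"PPP-adjusted figure: ${pppDollars / 1e9:F1} billion");
        Console.WriteLine($"Apparent inflation: {pppDollars / marketDollars:F1}x");
    }
}
```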

All the usual remedies are proposed: things I wholeheartedly endorse, like spending more money on research, more money on teachers, and encouraging kids to learn about science. A shame, then, that so often we find ourselves reporting on just the opposite happening.


A few days ago Kotaku posted a preview of sorts of what the upcoming 360’s video camera will be able to do. Though their link to the more technical details did not seem to work at the time of this post, several of the camera’s functions were revealed, including group-based audio/video chatting and “gesture gaming,” a feature that uses movement-detection technology as a gameplay element a la Eye Toy. It also sounds like Microsoft will allow gamers to add their own leaderboard pictures and other content.

With the 360’s focus on communication, a video element seems like a natural addition to the already robust voice interaction players have become accustomed to via Xbox LIVE. This is a wonderful feature to use among friends, but I’m a little uncertain as to how this will go over in a gaming setting, and I’ll tell you why.

For one, ever since the advent of internet communication, people have grown accustomed to a certain level of anonymity online. Hiding under the comfortable mask of a screen name has allowed many people to shed their inhibitions and express more of their personalities without ever having to show their faces. While audio chatting still keeps this masquerade more or less intact, a video element yanks it away by putting gamers face to pimply face with one another. I can’t say for sure how this will affect the gaming community, but at the moment I’m kind of frightened by the prospect of it.

Also, how is Microsoft going to police video content on here? What safeguards will be in place to prevent the inevitable flood of naked leaderboard pictures? I’m guessing Microsoft will have to ask everyone to sign the mother of all disclaimers when they release this thing, but if inappropriate content becomes rampant on LIVE, what will become of the community (especially family gamers)?

This camera will undoubtedly be a huge technological advance with tons of potential, but I really hope that Microsoft handles it with great care.


In a joint press release today, Google and the United States National Archives announced the immediate availability of 104 films (well, they announced 103, but forgot about one WWII clip) from the Archives on Google Video. This collection of movies constitutes a pilot program for what is anticipated to become a much wider cooperation, and future expansions could also include Google helping the archivists make portions of their massive collection of texts available online. The partnership is non-exclusive, so the archive is free to find more channels through which to distribute its materials.

The pilot is a pretty focused group of films, and includes 23 movies about the creation of the national parks system, 19 documentaries from NASA (including the famous 1969 moon landing), 61 World War II newsreels from the US Office of War Information, and one clip from Thomas Edison’s collection thrown in for good measure, showing some light-footed flamenco and a couple of serious-looking geishas.

If you’re interested in checking out the offerings, there are two good ways of doing so: most people will probably prefer the Google Video interface, but we hardcore librarians can use the Archives’ ARC search (remember to use the search keyword “siGoogle”) to get all the juicy catalog info about the documents. Either way, you’ll eventually end up at Google Video to watch the films. It’s all free (as in air), and the video quality is what you’d expect of Google Video, which is to say decent but not great.

The press release sounds optimistic about the future of online access to archival records, and it’s almost creepy how closely aligned these two partners seem to be. Compare Google’s stated goal of indexing all the information in the world to the following statement from Allen Weinstein, Archivist of the United States:

“This is an important step for the National Archives to achieve its goal of becoming an archives [sic] without walls,” said Professor Weinstein. “Our new strategic plan emphasizes the importance of providing access to records anytime, anywhere. This is one of many initiatives that we are launching to make our goal a reality. For the first time, the public will be able to view this collection of rare and unusual films on the Internet.”

So one side has a lot of content and wants to make it available, and the other makes a living out of making things available online. This could be the beginning of a beautiful friendship, as Humphrey Bogart would say. The National Archives currently holds 260,000 moving picture documents, most of them Universal Newsreel material, but tens of thousands of movies come from various arms of the US military. A few thousand titles were also culled from non-government sources, and all in all, it’s an impressive record of recent-ish American history. I can hear history students and war movie buffs worldwide drooling already.

All kidding aside, this is a promising, though tentative, first step towards universal access to the records of this massive information repository. Google’s expertise should make the digitization of all the physical records a somewhat faster and easier process, and the end result is not just ubiquitous access to all this beautiful information, but preservation of it as well. Physical media will crumble at some point, some faster than others, and rescuing the information onto digital formats is one way of ensuring future generations’ access to it. Digitizing materials today may seem like a lot of work—because it is—but the results should be well worth it. As storage media evolves and obsolescence starts to claim the formats we have available today, simply transferring the files onto holocubes or quantum dot clusters or whatever comes next will be many orders of magnitude easier than scanning texts and digitizing rotting celluloid.


The Globe and Mail ran an interview with Apple co-founder Steve Wozniak yesterday that has the Internet buzzing. In it, the irrepressible Woz speaks his mind on Apple’s latest corporate directions, and not all his comments were positive.

Woz is an icon in personal computer history, a digital wunderkind who invented the Apple ][ largely by himself and then, with co-founder Steve Jobs, rode the success of that product to fame and fortune before leaving the company in 1985. Since then, he set up a venture called CL-9 (Cloud 9) that manufactured universal remote controls, but the company quickly folded after releasing its first product in 1987. He went on to donate millions to charity and spends much of his time today teaching computer literacy to elementary school students.

Woz was asked about Apple’s switch to Intel processors, which has gone faster than expected and delivered both new iMacs and MacBook Pros with Intel Core Duo processors. His reaction, as reported by the Globe & Mail, expressed an interesting sentiment:

"It’s like consorting with the enemy. We’ve had this long history of saying the enemy is the big black-hatted guys, and they kind of represent evil. We are different, and by being different we’re better," he said. "All of a sudden we’re the same in this hardware regard, so it’s a little hard to swallow your words from the past."

"Still, the switch to Intel is a necessary one from an engineering standpoint," he said. "Intel just did a very good logic design. [However] if it wasn’t needed, I would say we shouldn’t do it. And I still have some questions as to how much it’s needed."

The idea that switching to Intel represents "consorting with the enemy" has been largely dismissed by most Macintosh fans, but the fact that Apple no longer provides differentiating hardware inside their machines is undeniable. While most welcome the change, and profess that what makes a Macintosh a Macintosh is the operating system, there are a few engineering fans who will miss the PowerPC and its Altivec SIMD hardware. Woz is clearly among them.

Or is he? The Woz is now claiming that the Globe & Mail fabricated this material, saying "I have consistently backed [the Intel] decision." He contends that the statement attributed to him "must have been stretched into being one about my own thinking."

And what about Apple’s success with the iPod? Woz is happy for the company, but feels the iPod may be distracting Apple from its true mission:

"We’re a computer company, and we really think computers. Spinning off a separate division makes a whole lot of sense."

Except now, Woz claims that he never said this, either. "I heartily deny saying this," wrote Woz. "I did NOT say that the iPod division should be spun off and I feel used in that regard."

It’s interesting that a man who has not been involved with Apple for over two decades still refers to the company as "we." In fact, he still draws a nominal paycheck from Apple to this day. While he is no longer active in the industry the way his former partner is, he still remains a respected figure and his words, though they may not please everybody, are still worth listening to. He was a man who was there at the personal computer’s infancy, and though it may have moved on without him, he never abandoned his idealistic values. Should Apple remain a computer company and split off the iPod? Wall Street doesn’t seem to think so, but how about everyone else? Will the real Woz please stand up?


People are constantly comparing Windows Media Player (WMP) to iTunes. It’s inevitable. I can’t call who’s winning the race, but Microsoft is definitely picking up speed with the coming WMP 11. CNET has briefly reviewed the current state of the product, and it looks like a vast improvement over WMP 10.

The basic concept behind WMP 11 is ease of use. The user interface isn’t nearly as busy as WMP 10’s. The main screen of WMP 11 places a focus on music by discarding several options previously found in the left-hand navigation pane and organizing music by album art. The top and bottom of the screen hold buttons for library navigation and tasks like ripping. The reviewer also mentions that WMP 11 quickly searched a 10,000-song library, and that it is smart enough to split playlists into "time-optimized" groups for CD burning purposes.
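
As a rough illustration of what a "time-optimized" split might look like, here is a minimal greedy sketch that fills each disc up to an assumed 80-minute capacity and starts a new group when the next track would not fit. This is only a plausible stand-in, not WMP 11's actual algorithm.

```csharp
// Minimal sketch of one way to split a playlist into disc-sized groups for
// CD burning. The 80-minute capacity is an assumption; this is not WMP 11's
// actual algorithm, just an illustration of the idea.
using System;
using System.Collections.Generic;
using System.Linq;

class PlaylistSplitter
{
    static List<List<double>> SplitForBurning(IEnumerable<double> trackMinutes,
                                              double discCapacityMinutes = 80.0)
    {
        var discs = new List<List<double>>();
        var current = new List<double>();
        double used = 0;

        foreach (var length in trackMinutes)
        {
            // Start a new disc when the next track would overflow the current one.
            if (used + length > discCapacityMinutes && current.Count > 0)
            {
                discs.Add(current);
                current = new List<double>();
                used = 0;
            }
            current.Add(length);
            used += length;
        }
        if (current.Count > 0) discs.Add(current);
        return discs;
    }

    static void Main()
    {
        var playlist = new[] { 4.5, 6.0, 3.2, 7.8, 5.1, 62.0, 4.4, 12.3 };
        var discs = SplitForBurning(playlist);
        for (int i = 0; i < discs.Count; i++)
            Console.WriteLine($"Disc {i + 1}: {discs[i].Count} tracks, {discs[i].Sum():F1} minutes");
    }
}
```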

Where WMP prevails over iTunes is with its device support—100+ devices can interoperate with WMP. Other device features include device space notifications (how much space is left) and shuffle capabilities.

The last piece of the review is definitely the most important. According to the reviewer, WMP 11 performs significantly better than WMP 10 with 10,000 tracks in the barrel. That’s great to hear since both iTunes and WMP are memory hogs. Wouldn’t it be great to see Apple and Microsoft duke it out over who could release the slimmest PC media player? If I had my way, the media player software would be smart enough to say, "oh, I see you’re trying to open Visual Studio and your screen is engulfed in a fog of hanging programs. I’ll go ahead and stop swallowing resources for the moment." Am I the only one that thinks that would be an awesome feature?


Just like a few weeks ago, this weekend Ars brings you more .NET development news from the past week.

One of the best add-ins for Visual Studio 2005 is code snippets. While snippets are easy to use, they can be tricky to write. Enter Snippy, a free visual code snippet editor that makes the whole process much easier (if you can get past that awful name that reminds me way too much of Clippy).

F# 1.1.10 is now available. The new release includes "F# Interactive" for Visual Studio 2005, composable events, quotation processing over raw quoted expressions, improved code generation, and several other subtle fixes.

Microsoft is looking for a Developer and a Program Manager to work on the Tuscany project, which has also been called Visual Studio Live. Knowledge of C# and ASP.NET as well as 4+ years of experience are required for both positions. Interested? Then you had better review some C# interview questions.

Microsoft is running a promotion where you can attend three ASP.NET Developer Webcasts and receive a free copy of Visual Studio 2005 Standard, a 50% discount on any MCP exam, and a voucher that allows you to buy Visual Studio 2005 Professional with an MSDN subscription for the renewal price.

ASP.NET developers should be interested in Microsoft’s new ASP.NET RSS Toolkit. It has a ton of features; so many that I can’t even begin to name them all here.

.NET developer Paul Ballard has raised an interesting point: many new VB.NET programming books do not cover object-oriented programming. VB.NET books by Wrox, Microsoft Press, and Sams all lacked OOP discussion. This means nothing except that publishers don’t want to spend time covering a topic that is based around architecture rather than language.

Are you a day programmer or a night programmer? Here’s what Mitch Denny has to say:
Now – day programmers are the most prevalent in this industry, and you find them mostly in organisations which have historically tolerated a certain amount of inefficiency…If you are a night programmer, you probably have trouble understanding why a day programmer even entered the industry, and the reason is because they are motivated by different things than you are.

This reminds me of a poll taken back in my school days where almost every single comp-sci student preferred to do his homework late at night…but that was college. When do you write your best code?



I’m sitting here on my laptop, sick as a dog with a sinus infection, treating myself to a teaspoon of Robitussin and an Asian horror movie fiesta. I was in the mood for this, so enjoy.

The movie based on the Siren game was quietly released in Japan a few weeks ago to coincide with the release of Siren 2 for the PS2. I did a little research, and as far as I could tell this movie is the first live-action video game-based movie made in Japan (please correct me if I’m wrong). Anyway, I had a hard time finding any information at all about this movie over here, I guess because of the relative obscurity of the game. But if you’re like me and you love that Asian creepiness, you can check out the trailer here and the official site here.

Since “nihongo ga deki masen” (That means “I don’t speak Japanese” in, er, Japanese, I think), nothing in the trailer made a lick of sense to me, but I plan on giving this flick a whirl as soon as I can get some subtitles all up on the bottom. I would think that with the game’s photo-realistic art style, a movie adaptation would go over well, but you know how them game movies go. Incidentally, if you get a chance, check out the Siren 2 website, if only for the really creepy chanting little girl music.

On our side of the blue, a veritable buffet of new Silent Hill movie propaganda is fresh from the oven. IGN’s got a set visit / cast interview feature that actually makes it sound like everyone involved is paying serious homage to the games, which is great news, and a solid new trailer is also available. The “dark nurse” scene sounds / looks incredible. I’m full-fledged looking forward to this movie now, and I’m not afraid to admit it.

All right chickadees, I’m going to go try to get better. Hope you’re enjoying your weekend!

NOTE: Sean Bean and/or Boromir does not actually appear in the Siren games or movie. The above picture was digitally altered and is pretty much just pretend.


In terms of a system launch, I’ve never really seen anything quite like what the PS3 is going through in the press. Most of the coverage given to the console has been about the price or the specifics of the technology inside. We’re all worried about the Blu-Ray drive, what it may mean for movies, and how much it will affect the fight between Blu-Ray and HD-DVD. We talk about the Cell and whether it’s too complicated for its own good. Because of this system I now have the word Octopiler in my vocabulary. People publish reports using other reports.

I can’t go through my list of links every day without bumping into a new story about the technology of the system, looking at the nuts and bolts of how it’ll be put together. It’s interesting, I guess. But none of it affects me. I don’t put systems together, I don’t design them. I play games.

I don’t know why all the emphasis, and I do mean all of it, has been on the tech. No one is talking about the games. Everyone is out there reading page after page of technical information, and the arguments on forums now look like they were written by engineers. This isn’t making us better gamers; in fact, it’s doing the opposite. I think this is something Sony is going to have to attack.

They simply have to start talking about games. Right now people are talking about the PS3 as a movie player and a technological widget, not as a videogame console. Playing games is supposed to be its primary task, and I don’t think consumers are being given that impression by anyone in the press, or even by Sony itself. We have to keep in mind that the vast majority of gamers will be playing the PS3 on a standard-definition CRT, without ever hooking it up to the Internet. Most people are interested in playing games on it.

I think that’s why it’s hard for me to get excited about the PS3. I want to sit in front of one, with a controller in my hand, and play a game. I want to see some gameplay footage of what the games will actually look like. I want to know what games on it are going to keep me interested. This is designed to play games, and that’s what most people will primarily use it for. Unfortunately no one is talking about it.
