The return of Game.Ars

admin | 03/16/2003 | COMMENTS:Comments Closed

What a couple of weeks. First my computer started overheating and spontaneously shutting off. That got fixed. Then, the next week, Comcast took over the lease from Millennium Digital Cable in my apartment building. So, for the past three days, as they reinstalled their equipment in all of the units, my internet connection was off, on, and off again. Mostly off. The good thing is that the service from Comcast seems to be much better than that of Millennium Digital.

So here I am. Saturday at noon and I am trying to build a Game.Ars from scratch. Gosh, don’t you love reading about my trials and tribulations at the beginning of every Game.Ars column? Yeah, me neither, so let’s get to it.

Music to my ears

Electronic Arts announced last week that the soundtrack CD to its popular NBA Live basketball game had officially achieved platinum status. That’s one million copies sold. Think about it, that’s more than most games sell, period. I’ve seen game developers sink a ton of money into a solid game, put it out on the market, sell only 70,000 copies, and then go out of business. If they had a good soundtrack, they might have recouped their investment. They should have been in talks with Madonna first.

Someone call Eminem, I’m re-releasing Pong!

A new look at Frozen Throne

When Blizzard sneezes, game editors tend to bum-rush them with handfuls of Kleenex. So, as the beta test moves along, Gamespot is keeping the public on notice. With its latest impressions of The Frozen Throne, Blizzard’s WC3 expansion, Gamespot took a close look at what’s in store for the Night Elves and Orcs.

“The Frozen Throne’s new troll shadow hunter hero… …is extremely powerful and versatile; his abilities include healing wave (a healing spell that works just like chain lightning by bouncing off of allied units, healing less and less damage with each bounce).” – staff

“The warden is the new night elf hero. …her abilities allow her to quickly strike vulnerable units in the opposing army. The blink ability allows her to instantly teleport anywhere on the screen.” – staff

There’s a lot more cool information, so go read the full preview.

Molyneux makes a plea

In a recent plea to the UK government, Peter Molyneux said that “…making a successful video game has become too expensive for the smaller, independent developers.” Molyneux, the head of Lionhead Studios and creator of Black & White, is seeking government funds to help support Britain’s game-development industry.

In the article, he is further quoted as saying, “The small independents are the creators of all the new, fresh, and different ideas, and that is definitely going to suffer.” And indeed, we here at Game.Ars share his concerns. Molyneux believes the industry could be helped by setting up a government fund along the lines of the Film Council, which supports the British film industry.

But right now, Molyneux says: “Let’s put it this way, I’m not holding my breath.”

Read the full article here.

Get yer game on

The game is afoot! Gamecaster has announced that it is holding a big-time game competition in Las Vegas. The event will be a week-long, tournament-style Olympics of gaming goodness. Television crews will be on hand for a post-event broadcast. The official press release says:

The seven-day competition, which will be shot “live-on-tape” for a national television broadcast, will culminate with three Gamecaster Champions emerging in three respective video game genres.

[Insert Rocky soundtrack here]

More details are promised in the near future. For now, check out the Gamecaster Web site or the Gamecaster registration site.

Top of the pops

NPD has released its best-sellers for the week ending March 1. The month has come in like a lion for MOO3, a game that has disappointed me greatly. Ah well, they can’t all be winners, but they can still sell big-time!

1. Master of Orion III – Infogrames
2. Command & Conquer: Generals – EA
3. Splinter Cell – Ubi Soft
4. The Sims Deluxe – EA
5. Grand Theft Auto III – Take Two
6. The Sims: Unleashed – EA
7. SimCity 4 – EA
8. Zoo Tycoon – Microsoft
9. Battlefield 1942: The Road to Rome – EA
10. Battlefield 1942 – EA

Make a match

Game.Ars usually focuses on PC gaming, but a friend of mine, and a clever game designer in his own right, sent me a link to his new game matching system. While the service only supports PlayStation and PlayStation 2 at the moment, there are plans to expand it in the future (hopefully). Here’s a bit of the official press material:

Well, with FlyingSheep’s Game Match Service, you can compare like/dislike lists with gamers around the world in seconds! You can just browse the list and see which games are the most popular, or you can build your own gaming profile to get increasingly accurate recommendations.

Check it out here.

Bullet news

- Star Wars Galaxies delayed. April release will not happen.
- Electronic Arts gets publishing rights for Black & White II.
- No One Lives Forever 2 patch brings new multiplayer modes.
- Arush to release Devastation demo late this month.

The Week in Patches

- No One Lives Forever 2 to version 1.3.
- Command & Conquer: Generals to version 1.4.
- Neverwinter Nights to version 1.29.

Gone Gold

- Indiana Jones and the Emperor’s Tomb – LucasArts
- Black Hawk Down – NovaLogic

Keep Gaming. It’s low in saturated fat.

- Carl


NASA working towards resumption of shuttle flights

admin | | COMMENTS:Comments Closed

Even though the Columbia disaster investigation is still ongoing, NASA is embarking on an internal review to help grease the wheels for an eventual resumption of space shuttle flights. Many news outlets are touting that missions could resume as early as this fall, but the next flight still may be 18 to 24 months away. NASA’s internal review will focus on the design and safety features of the shuttle in addition to management procedures and practices. It is hoped the review will help speed the implementation of changes recommended by the Columbia investigation board.

"The alternative is to sit here on our hands and wait for a report to be released, and we’re not going to do that," NASA Administrator Sean O’Keefe said during a news conference at headquarters. But he stressed, "We’re not going to do anything that would fundamentally alter or implement anything" until the board finishes.

It is believed that superheated plasma entered the internal structures of the wing near its leading edge and eventually weakened the wing to the point of failure. The origin and exact location of the initial damage have not been (and may never be) determined, but it appears NASA engineers wanted satellite photographs of Columbia while in orbit, and an agency official only half-heartedly requested them.

Both the investigation board and NASA’s internal review board will need to address serious deficiencies in the Shuttle team’s hierarchy and NASA’s apparent inferiority complex in relation to other government agencies. Concerns from engineers were never seriously considered by higher-ups, and when NASA officials did approach other agencies for help, they were afraid to press for a high-priority photo request because they didn’t want to step on anyone’s toes. Even if there is a solid determination of the cause of Columbia’s breakup, it appears "human error" will also be a contributing factor.


P2P to influence radio playlists

admin | 03/15/2003 | COMMENTS:Comments Closed

ClearChannel, the darling of the radio world (heh), is doing something novel: the company is launching a program that will take data from monitored P2P networks and report to radio stations on what songs are hot. The idea is that this information can aid in creating more successful, responsive programming by keeping a rather direct finger on the pulse of the ‘net. The tracking will be handled by BigChampagne, an LA data-mining company that is already tracking downloads and "band battles."

BigChampagne will be able to parse the information by geographic market. It also will let stations focus on peer-to-peer users whose uploadable collections overlap substantially with their playlists. Then stations will be able to develop in-depth profiles by learning what those people are downloading, requesting and offering to other users on the networks.

Their BandBattle page is pretty cool: you can see how a handful of groups have fared in the P2P world over the last few months. Currently (for example), Norah Jones is featured by default, and there’s this oh-so-mysterious spike right around the Grammys 😉 BigChampagne claims that it does not collect personal information, although the company does pay attention to what kinds of songs are grouped together in various collections. Their position on P2P networks is rather interesting:

Music downloaders say that they buy music…but never mind what they say. Every day, file sharing grows more popular, and the number of people downloading entertainment (music, movies, etc.) on P2P networks increases. It is statistically unlikely that an active online audience numbering in the tens of millions is made up entirely of non-consumers. In fact, BigChampagne’s ongoing examination of the space points to a very direct and observable relationship between popular downloaded content and success in the marketplace.

You can see this relationship by comparing radio playlists with what’s hot online. It’s nice to see someone at least thinking positively about how to harness the social symbiosis of the ‘net, even if it is only to bolster crappy radio station playlists.


Study: racial profiling no more effective than random screening

admin | 10/09/2019 | COMMENTS:Comments Closed

One of the larger problems facing the security industry in the era of mass terrorism is the task of creating a profile of a likely terrorist. Identifying those at risk of first-time offenses is a challenge in any context, but the stakes are higher when that offense may also be the last, and involve the deaths of dozens of people. We’ve discussed the challenges of generating profiles of potential terrorists in the past, but a study to be released by the Proceedings of the National Academy of Sciences offers a mathematical analysis of how we’re deploying the profiles we do have, and suggests we may not be using them wisely.


The study was performed by William Press, who does bioinformatics research at the University of Texas, Austin, with a joint appointment at Los Alamos National Labs. His background in statistics is apparent in his ability to handle various mathematical formulae with aplomb, but he’s apparently used to explaining his work to biologists, since the descriptions that surround those formulae make the general outlines of the paper fairly accessible.

Press starts by examining what could be viewed as an idealized situation, at least from the screening perspective: a single perpetrator living under an authoritarian government that has perfect records on its citizens. Applying a profile to those records should allow the government to rank those citizens in order of risk, and it can screen them one-by-one until it identifies the actual perpetrator. Those circumstances lead to a pretty rapid screening process, and they can be generalized out to a situation where there are multiple likely perpetrators.

Things go rapidly sour for this system, however, as soon as you have an imperfect profile. In that case, which is more likely to reflect reality, there’s a finite chance that the screening process misses a likely security risk. Since it works its way through the list of individuals iteratively, it never goes back to rescreen someone that’s made it through the first pass. The impact of this flaw grows rapidly as the ability to accurately match the profile to the data available on an individual gets worse. Since we’ve already said that making a profile is challenging, and we know that even authoritarian governments don’t have perfect information on their citizens, this system is probably worse than random screening in the real world.

In the real world, of course, most of us aren’t going through security checks run by authoritarian governments. In Press’ phrasing, democracies resample with replacement, in that they don’t keep records of who goes through careful security screening at places like airports, so people get placed back on the list to go through the screening process again. One consequence of this is that, since screening resources are never infinite, we can only resample a small subset of the total population at any given moment.

Press then examines the effect of what he terms a strong profiling strategy, one in which a limited set of screening resources is deployed solely based on the risk probabilities identified through profiling. It turns out that this also works poorly as the population size goes up. “The reason that this strong profiling strategy is inefficient,” Press writes, “is that, on average, it keeps retesting the same innocent individuals who happen to have large pj [risk profile match] values.”

According to Press, the solution is something that’s widely recognized by the statistics community: identify individuals for robust screening based on the square root of their risk value. That gives the profile some weight, but distributes the screening much more broadly through the population, and uses limited resources more effectively. It’s so widely used in mathematical circles that Press concludes his paper by writing, “It seems peculiar that the method is not better known.”
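The with-replacement argument can be sketched numerically. In the toy model below (our own illustration, not Press’s code: the population size, the Pareto-distributed priors, and all variable names are invented), each screening round tests one person drawn with probability q_j, so finding a perpetrator at position j takes a geometric number of rounds with mean 1/q_j, and the expected number of tests is the sum over j of p_j/q_j:

```python
import math
import random

random.seed(0)

# Toy population: p[j] is the profile-derived prior probability that
# person j is the perpetrator (heavy-tailed, so a few look very "risky").
N = 100_000
p = [random.paretovariate(1.5) for _ in range(N)]
total = sum(p)
p = [x / total for x in p]

def expected_tests(p, q):
    """Screening WITH replacement: each round tests one person drawn with
    probability q[j]; finding perpetrator j takes 1/q[j] rounds on average,
    so averaging over the prior gives E[tests] = sum_j p[j] / q[j]."""
    return sum(pj / qj for pj, qj in zip(p, q))

uniform = [1.0 / N] * N                                # random screening
strong = p                                             # "strong profiling": q proportional to p
root_sum = sum(math.sqrt(x) for x in p)
square_root = [math.sqrt(x) / root_sum for x in p]     # Press's square-root rule

print(f"uniform:     {expected_tests(p, uniform):12,.0f}")
print(f"strong:      {expected_tests(p, strong):12,.0f}")
print(f"square root: {expected_tests(p, square_root):12,.0f}")
```

The first two numbers come out identical (both exactly N): with replacement, screening in strict proportion to the profile retests the same high-p_j innocents and ends up no better than ignoring the profile entirely, while the square-root weighting is provably the minimum of Σ p_j/q_j by the Cauchy-Schwarz inequality.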

We’re not privy to the exact details of various screening systems, so it’s possible that the optimal solution is in use in a number of contexts. But, given that things like racial profiling are used in so many law enforcement contexts, from community policing to immigration, it’s a safe bet that there are a fair number in which it’s not. And, given that the use of profiles is frequently the subject of public debate, having a public that’s informed of the limits of profiling could certainly help inform those debates.

PNAS, 2009. DOI: 10.1073/pnas.0813202106


Report: Apple retail stores full of window shoppers

admin | | COMMENTS:Comments Closed

Recent numbers from Needham & Company LLC show that Apple isn’t immune to the recent economic downturn. Analyst Charlie Wolf reports that average revenue at brick-and-mortar Apple Stores was down some 17.4 percent in December of 2008 versus the same month in 2007. While the numbers shouldn’t really come as a surprise, as everyone is tightening their collective belts in anticipation of a long and arduous recession, the interesting observation has to do with visitors to the stores. The same firm reports that foot traffic in and out of those same stores was only down 1.8 percent during the busy shopping season year over year.


What does that tell us? Well, if you were going to the Apple Store just to look at the shiny toys, you apparently had a lot more window-shopping company in December. That being said, despite the fall in revenue, Apple Stores are still kicking butt and taking names in the electronics retail space, with the highest sales per square foot in the country. The average 6,000-square-foot Apple Store was apparently generating $4,700 in sales per square foot in 2008. Unfortunately, being the best of a slumping group still puts you in a slumping group.

Still, the foot traffic numbers really shouldn’t come as much of a surprise. Since the beginning, Apple Stores have had somewhat of a hip vibe as well as a layout that encourages customers to hunker down and try out the merchandise. Pair that with, in my experience, not-overly-pushy sales staff and you have the recipe for a lot of looking and even some hanging out. Whether individuals will continue to willingly submit themselves to shiny new toys regardless of their ability to pay for them as the economy continues to struggle remains to be seen.


Hands on: Google Earth 5.0 goes under the sea, back in time

admin | | COMMENTS:Comments Closed

Google announced a new version of Google Earth today with features that focus on what is under our ocean, in our past, and above our heads. Ars Technica did some vicarious adventuring to check out the new features.


Google Earth 5.0 (beta) is available for Mac OS X, Windows, and Linux users, though we should note that Google has become even more aggressive with the installation of its software update mechanism. Instead of covertly installing the software like it has in the past or offering the option to disable said updater like it should, Google now presents a dialog that forces the user to agree to the software license and installation of a phantom software update tool that cannot be uninstalled if the user wants to run Google Earth. But let’s not dwell on the negatives, because there is a lot to love about this new version.

Look under

Initial gripes aside, Google Earth 5.0 exhibits yet more UI refinements and polish that bring it more in line with Google’s other desktop software. One of the most interesting features of this release is the introduction of an interactive ocean. While Google Earth has featured large blue bodies of water and basic, 2D topographical detail for some time, version 5.0 allows users to dive below the surface and explore a 3D, bathymetric map of much of the ocean’s floor. Users can simply keep zooming into the world’s oceans and many large seas and, wherever depth details begin appearing, continue to zoom in past the ocean surface and orient their view to start swimming.

Of course, introducing an entirely new way to view two-thirds of the world’s surface would not be complete without some actual data points to make all that space interesting. Google added a new ocean-centric layer of toggleable information to Google Earth, including content from National Geographic, Cousteau Ocean World, shipwrecks, animal tracking, and more. While you cannot dive to actually visit and zoom around underwater landmarks like shipwrecks, Google does provide a wealth of embedded content from sites like Wikipedia, National Geographic, and YouTube for many significant locations.

The experience of diving below the ocean’s surface is pretty surreal, though the ocean’s floor is not always as detailed as we want it to be. Some areas, such as the Mediterranean Sea or the Mariana Trench, are barely perceptible. Still, it’s great to be able to take one large step toward exploring the rest of the Earth, and Google applied some pleasant water animation to the ocean to help users determine whether underwater data is available.

Look back

The next major new feature of Google Earth 5.0 is historical imagery. A new clock button in the toolbar toggles a timeline controller that can be adjusted to shift any view back in time and reveal changes to things like topography and city construction. This controller is pretty handy, as it contains buttons for stepping back through any significant historical changes that Google has on file for a particular location. Using either the buttons or the timeline scrubber, Google Earth will seamlessly pull down past satellite images and any other data Google has to swap into the current view.

This feature is quite interesting, though Google’s image archive seems to extend back only to around the late 90s, so many of the locations that we checked out have not changed much. Still, as Google acquires more data, this feature will become even more interesting.

Look up

The third major feature in this release calls into question Google’s choice to keep the “Google Earth” name, as users can now blast off and explore Mars. Thanks to a collaboration with NASA, users can select “Mars” from a new planetary menu in Google Earth’s toolbar to visit the red planet.

Not a lot of great terrain data is available for Mars, but Google did manage to add a number of content layers such as key places and landmarks, featured satellite imagery, daytime/nighttime infrared shots, and even “A Traveler’s Guide to Mars” that includes excerpts from the book of the same name.

Make a movie

The last notable feature of this release is a new “Touring” feature that allows users to record a trip through Google Earth. Toggle a simple recording tool on, and you can begin clicking through previous locations, typing in new ones, and manually adjusting your view to create a 3D retelling of a road trip or a prospective honeymoon. A voice recording option allows you to narrate the trip while you create it, but everything needs to be done in real time and in one single take; there is no pause button while recording, nor is there a way to string together multiple tours.

There is also no way to export a tour to some kind of video file for sharing, and tours cannot be uploaded to something like Picasa Web Albums. Tours, as far as we can tell, live entirely inside one’s local copy of Google Earth. The feature is still fun and useful, but we would love to see this sharing drawback resolved soon. To get a grasp of these features in action, check out Google’s video demo of Google Earth 5.0’s highlights embedded below.

Overall this is a very welcome upgrade to what is already a wonderful piece of software. Google Earth now offers an engaging and useful view of what is on, below, and beyond our world. We can’t wait to see more planets, more water, and more ways to share these virtual experiences.


New AG on state secrets privilege

admin | | COMMENTS:Comments Closed

The Senate has just confirmed the appointment of Eric Holder as the new attorney general. That means we’ll soon get to see the follow-through on two commitments he made during his confirmation hearings, which Steven Aftergood at Secrecy News highlighted earlier today.


First, as we noted last week, the ACLU has been trying to get their hands on the heretofore closely-guarded legal memos produced by the Office of Legal Counsel justifying the administration’s warrantless wiretapping and “enhanced interrogation” practices. Holder seems at least tentatively open to finally permitting their release:

Once the new Assistant Attorney General in charge of the Office of Legal Counsel is confirmed, I plan to instruct that official to review the OLC’s policies relating to publication of its opinions with the goal of making its opinions available to the maximum extent consistent with sound practice and competing concerns.

Second—and as we also noted last week—Justice Department lawyers have been fighting fiercely to block a lawsuit filed by an Islamic charity that (due to a DOJ error) learned it had been subject to those extrajudicial wiretaps. Some early reporting on the most recent developments in that case somewhat misleadingly suggested that “Obama’s Justice Department” was following the Bush administration’s lead by asserting (without much success) that the entire proceeding was foreclosed by the state secrets privilege. This was true only in the very technical sense that the same set of officials running Justice now nominally report to Obama. The question remains whether the old approach will continue once Obama’s appointees are actually in place at the helm of DOJ. Holder’s answer here is not massively illuminating, but it does at least suggest that the strategy will get a second look:

I will review significant pending cases in which DOJ has invoked the state secrets privilege, and will work with leaders in other agencies and professionals at the Department of Justice to ensure that the United States invokes the state secrets privilege only in legally appropriate situations.

Oh, Eric, you had me at “review significant pending cases.”


Italian red-light cameras rigged with shorter yellow lights

admin | | COMMENTS:Comments Closed

As if red-light and speed cameras weren’t already controversial enough, a recent discovery in Italy is sure to send all drivers over to the Hatorade stand. A programmer and 108 other individuals are being investigated for rigging a “smart” traffic light system to purposefully trap drivers and fine them for violations, with some speculating that up to a million Italian drivers have been unfairly slapped with fines.


A 45-year-old engineering graduate from Genoa named Stefano Arrighetti is responsible for programming the T-Redspeed system that has been implemented throughout Italy. T-Redspeed uses three cameras as part of the traffic light system, which is meant to determine the exact 3D placement of vehicles going through the intersection in addition to storing their license plate information. When drivers are caught running a red light, performing an illegal left turn, or committing any number of other violations, they are automatically fined €150 for each incident.

It turns out, however, that Arrighetti and a handful of public officials were allegedly a bit greedier than most. He’s accused of conspiring with 63 municipal police, 39 local government officials, and the managers of seven different companies in order to rig the system so that it would turn from yellow to red quicker, thereby catching more motorists. The scheme was uncovered by Lerici police chief Roberto Franzini, who noticed that the number of violations was too high for a period of months and, after some investigation, found that the lights were changing far sooner than usual. “There were 1,439 for the previous two months,” Franzini told The Independent (via TechDirt). “It seemed too much: at the most our patrols catch 15 per day.”

According to the police report seen by The Independent, some 300 municipalities across Italy and a number of companies shared the revenues made by the rigged camera system since it was implemented in 2007. Arrighetti has since been put under house arrest while the case is being investigated, though Arrighetti’s lawyer insisted to the newspaper that he was innocent and that there was no need for the T-Redspeed system to be checked. “Arrighetti is a genius whom the whole world envies,” Arrighetti’s attorney Rosario Minniti said.

Red light cameras have been under increased scrutiny by citizens and the media, and apparently for good reason. A local newspaper recently discovered in Denver that the city had not been collecting the required accuracy data from the contractor who implemented its red light cameras, but that didn’t stop Denver from sending out $75 tickets to over 14,000 drivers. And, of course, yellow-light shorting is a trend that seems to be making its way around the world quickly as municipalities discover that while it may not exactly improve safety, it can definitely improve ticket revenues.

On top of it all, red light and speed cameras have been known to be wildly inaccurate at times, which is why some teenagers have taken to pranking their enemies by masking their cars with fake license plates and speeding through lights so that they get caught on camera.


Stimulus stimulates crowdsourced oversight, activism

admin | 09/08/2019 | COMMENTS:Comments Closed

Open-government advocates have been heaping praise on Barack Obama’s early efforts to put technology in the service of transparency. Especially popular has been the planned creation of a website to track spending under the stimulus bill passed by the House of Representatives last week and currently under consideration in the Senate. The parking page currently at the site boasts that it will be “part of an unprecedented effort to root out waste, inefficiency, and unnecessary spending in our government.” But many aren’t waiting for the White House, and a number of online campaigns are already underway to keep crowdsourced tabs on the stimulus—and to mobilize supporters and opponents.


The most recent effort is Stimulus Watch, which launched Monday. While the official government site will rely on an “oversight board” to post updates, Stimulus Watch seeks to crowdsource the task of monitoring stimulus spending on “shovel ready” local projects that have been offered up as potential recipients of federal grants. Each project will get a user-edited wiki page describing its status in neutral terms, while discussion pages and a voting system will let visitors weigh in on the worthiness of the endeavor—ideally self-selecting for either geographical proximity or relevant specialized knowledge.

The site is the brainchild of libertarian researcher (and—full disclosure—friend of the author) Jerry Brito, whose theoretical work has focused on “crowdsourcing government transparency.” The new site joins older projects like Bailout Sleuth, which tracks the fate of funds disbursed under the Troubled Asset Relief Program.

Of course, the stimulus bill itself has yet to pass, and while the White House has pledged to make “nonemergency” legislation available online for at least five days before it is signed, last week’s signing of the Lilly Ledbetter Fair Pay Act makes clear that this pledge isn’t yet operative. Even if it were, the stimulus bill would likely slip through the “emergency” loophole. Hence Read the Stimulus, a site sponsored by an array of conservative groups which makes it easy to search through the bill’s 1,588 mind-numbing pages, and link to specific items of interest.

So suppose you’ve read as much of it as you can stomach: What next? If you’re opposed to the stimulus—or just curious about how support for it is faring in the Senate—there’s Congress Whip, another conservative-sponsored site, which went live this weekend. With a Senate vote on the stimulus likely this week, users are urged to phone up their senators to determine how they’ll be voting on the bill—and, presumably, to warn off any Republicans who might be tempted to break ranks.

And if you’re eager to get your stimulus on? In that case, Organizing for America—an attempt to keep Obama’s formidable online machine humming under the aegis of the Democratic National Committee—is asking supporters to host house parties at which they watch a video about the recovery package and urge their neighbors to support it. Polls have shown that the stimulus plan is highly unpopular with independents, and has been growing more so over time.


FCC asked to probe AT&T treatment of public access channels

admin | | COMMENTS:Comments Closed

PEG channels—public, educational, and government programming that generally takes the form of city council meetings and plays from the local middle school—are being treated as second-class citizens on AT&T’s new U-Verse IPTV system, according to a new complaint to the FCC. Anger over AT&T’s PEG handling has been building for some time at the local level, but late last week it went national.


The FCC is now being asked to step in where state regulators so far have not to “rule in no uncertain terms” that the U-Verse PEG situation is “in violation of the Act and Commission rules and policies.”

Everyone agrees on what’s happening here, just not on whether it’s a “feature” or a “bug.” Instead of providing each PEG channel with an actual “channel” that subscribers can simply punch into a remote or surf past by accident, AT&T has bundled all the PEG channels from a broad area and dumped them onto channel 99. Users who want to see that city council meeting need to visit channel 99, click “OK,” download a small app (which takes from eight seconds to one minute), choose their community from a list of local towns, then choose a particular PEG channel from that community.

This PEG ghetto comes with consequences beyond inconvenience; second audio channel (SAP) programming is allegedly stripped, closed captioning text is not carried correctly, and users are unable to record the material using DVRs.

The issue affects U-Verse installations around the country, but it has flared hottest in Illinois, where AT&T has a duty to carry PEG channels under a statewide video franchising law. State Attorney General Lisa Madigan agreed to investigate the issue in January after cities claimed that AT&T wasn’t living up to its responsibility to deliver PEG channels on actual “channels.”

A set of community media groups and municipalities has now asked the FCC to act. PEG channels are not enshrined in federal law; they are something that can be required by franchise agreements but are not mandatory. But the media groups contend that AT&T’s treatment of the channels still falls afoul of various FCC policies, the Communications Act, and the Cable Act.

“Relegating local, non-profit media channels to second-class status is a disservice to the public and violates both the spirit and letter of the law,” said Helen Soule, Executive Director of the Alliance for Community Media (ACM). “AT&T’s treatment of PEG channels is inferior in virtually every way that matters to a viewer, preventing the public’s ability to easily access safety alerts, health information, town hall meetings, educational and other local programming.”

Does the public watch public access?

Ars spoke with Peter Collins, IT manager for the city of Geneva, a far-west Chicago suburb that has dealt with this issue. While PEG programming might not seem like a hot commodity, Collins says, “I know for a fact that people watch our local access programming” because “I hear about it when they can’t see it.” High school events and local committee meetings prove to be the most popular content.

From his view, people often stumble on PEG channels while surfing; by sticking the channels in a separate area, few will ever find them without some dedicated seeking. Given the essentially infinite number of channels available to IPTV systems (IPTV sends only the channel being watched down the wire), Collins argues that lumping the channels into one amounts to second-class treatment. “Why don’t they put all the HBOs on one channel?” he wonders, if one channel with a menu is truly equivalent to traditional treatment.

The FCC complaint points out that the U-Verse set-top boxes are directly addressable, and it suggests that AT&T should simply serve up the local PEG channels to the correct households without needing to clutter up its local programming lineup with hundreds of channels from the entire area.
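The complaint’s suggestion boils down to a lookup: since each box is individually addressable, the head end could resolve a subscriber to a community and deliver only that community’s PEG lineup, rather than a region-wide pile on channel 99. A minimal sketch of that idea, with all names and data invented for illustration (this is not AT&T’s actual system):

```python
# Hypothetical per-household PEG lineup resolution. The data here is
# illustrative; a real deployment would key boxes to service addresses.

PEG_LINEUPS = {
    "geneva": ["Geneva City Council", "Geneva High School Sports"],
    "batavia": ["Batavia Government Access", "Batavia Schools"],
}

# Each addressable set-top box maps to exactly one community.
SUBSCRIBERS = {
    "box-001": "geneva",
    "box-002": "batavia",
}

def local_peg_channels(box_id: str) -> list[str]:
    """Return only the PEG channels for this box's own community."""
    community = SUBSCRIBERS.get(box_id)
    if community is None:
        return []
    return PEG_LINEUPS.get(community, [])
```

The point of the sketch is that the subscriber never sees the hundreds of channels from neighboring towns; the filtering happens upstream, so the local channels can sit in the regular lineup.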

PEG creators have been complaining to AT&T about the issues since the middle of last year, and some modifications have been made (such as upping the encoding bitrate). But the main problems persist.

AT&T’s U-Verse PEG page reiterates the company’s commitment to “carrying Public, Educational and Governmental (PEG) programming over its AT&T U-verse TV service,” and the company claims that putting PEG at channel 99 and offering it to an entire metropolitan area makes the channels easier to find (no matter where you are, all PEG channels are on channel 99) and available to even more people.

The spat raises broader questions about whether, in the YouTube era of cheap and easy online publishing, we even need public access and PEG at all. PBS ran down the arguments late last year.

The State of the Netbook, Part I: WEee have lived before

admin | | COMMENTS:Comments Closed

Since their introduction at the beginning of 2008, so-called netbooks have had a sudden, meteoric rise; sales have surpassed all projections, and new launches have dominated much of the gadget press. My own anecdotal evidence matches perfectly with the sales data: I’ve been seeing them all over the place this year. Today I saw an Italian guy using an MSI Wind on the quad and a businesswoman using an Eee on the train, in addition to the bearded Eee-user in the back of my math class and the Mini-Note-toting hipster in my Chemistry class.


How did this happen all of a sudden, and why? Can the netbook growth phenomenon possibly continue unabated? In this feature series, Ars explores the past, present, and future of the netbook form factor. This first article explores the surprisingly long history of netbook-style computers, from their origins in the early days of x86 to the long hiatus before the renaissance that created the modern netbook.

Power, density, and x86 inevitability

The prime mover behind the netbook revolution is the continuing exponential increase in transistor density—and hence computing power and storage capacity—made famous by the semiconductor industry’s preternatural tracking of Gordon Moore’s prediction of future densities from all the way back in 1965. Last year, Moore’s Law at work in both the processor and flash memory markets tipped the balance, leading to the dramatic saga which has unfolded over the last fourteen months. Namely, at the 45nm node, the x86 ISA has finally gone ultramobile.
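The arithmetic behind that claim is simple compounding: a density that doubles on a fixed cadence grows by a factor of two per period. A quick back-of-the-envelope sketch, using the conventional rough two-year doubling period rather than exact industry figures:

```python
# Rough Moore's Law compounding. The two-year period and the sample
# figures are the usual ballpark numbers, not precise industry data.

def doublings(start_year: int, end_year: int, period_years: float = 2.0) -> float:
    """How many doubling periods elapse between two years."""
    return (end_year - start_year) / period_years

def projected_density(base_density: float, start_year: int, end_year: int) -> float:
    """Density after compounding one doubling per period."""
    return base_density * 2 ** doublings(start_year, end_year)

# From 1971 (Intel 4004, roughly 2,300 transistors) to 2008 is about
# 18.5 doublings -- a growth factor in the hundreds of thousands.
growth = projected_density(1.0, 1971, 2008)
```

Sustained over four decades, that compounding is what eventually let a full x86 core fit into an ultramobile power budget at 45nm.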

Something of the kind was predicted near the time of its inception by Ars’ own Jon Stokes, who posited a “law of x86 inevitability” and trumpeted his conviction that ultramobile x86 devices would become popular very shortly.

But though the rise of the netbook may have taken the market by storm this year, the “netbook” form factor has been with us in some form or fashion since the very early years of x86. Let’s take a brief tour of netbook history by looking at a few historical examples.

The HP 200LX. Image (c) Andrew R. Lawn, used with permission.

As far back as 1994, the HP 200LX and its less-capable 95LX and 100LX predecessors packed 8MHz 80186-class hardware capable of natively running DOS programs into a one-pound “palmtop” form factor. With a 640×200 non-backlit grayscale screen and several megabytes of RAM, the device could run DOS and even Windows 3.0. Serial, PCMCIA, CF, and IR ports allowed additional storage, modems, and even mice to be connected to the tiny device. To top it all off, this dynamo could run for weeks of typical use on a single pair of AA batteries, with easy recharging.

Even now, more than ten years after HP phased out the 200LX in favor of WinCE-based devices, active fan communities putter with Linux, Web browsers, cameras, video, sound recording, backlit screens, overclocking, custom cases, and other novel features on the tiny devices. Given these amazing features, which make the device so wonderful my mother uses one to this day, why didn’t the 200LX or a close competitor with the same design ethos take over the entire world and several major asteroids?

The answer can be found in power; the guts of an ancient desktop weren’t enough for the applications that ultraportable PCs would need. The LX series was great for businesspeople and technologists, but the multimedia revolution completely bypassed it. The 200LX ran spreadsheets like a champ (it’s where I learned Lotus 1-2-3) and supported all the contacts and calendars users could ever need, along with hokey DOS games like the infamous “Lair Of Squid,” but that’s about it. And access to the still-primitive internet was limited to text-based DOS applications running on wired dial-up modems. The 200LX was good, but it just wasn’t good enough.

Model: 200LX
Screen: 640×200 monochrome
Processor: 8MHz 80186
RAM: up to 2MB
ROM: 2MB
Battery life: Weeks on AA
Weight: 10 oz.
Expansion: PCMCIA, CF
OS: DOS
Internet: Serial modem
Price: $550 at launch, $250 at EOL

Kiwis get strict copyright, three-strikes law at month’s end

admin | | COMMENTS:Comments Closed

“It is a strange fate we should suffer so much fear and doubt over so small a thing,” says Boromir in the recent movie adaptation of the Lord of the Rings. The scene, shot in New Zealand, might pop into the minds of many Kiwis these days, as one tiny legislative change to copyright law is poised to bring “graduated response” (or “three strikes”) rules to the country. And disconnection of users isn’t just on the table—it’s mandatory.


New Zealand’s 1994 Copyright Act was last year amended in numerous ways, but the most controversial change has certainly been the new section 92A. “An Internet service provider must adopt and reasonably implement a policy that provides for termination, in appropriate circumstances, of the account with that Internet service provider of a repeat infringer,” it says, and if that seems more than a trifle vague, you’ll understand why ISPs are practically up in arms about it.

The bill gives no real guidance on what “appropriate circumstances” are, so ISPs are trying to hash out some sort of voluntary agreement with the local film industry. But because the law has already been passed and its February 28 implementation date is approaching, rightsholders are apparently being quite tough in negotiations. Things have grown contentious enough that the ISPs are now calling for a delay in implementing the law.

An ISP representative complained last year that “identifying repeat offenders will not be easy. A complex data matching exercise will be required, and even then it will not always be clear who the real offender is, particularly when an internet account is used by a family, a business or a school.”
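The library association’s worry (quoted below) falls straight out of how that matching works: notices name an IP address and a time, the ISP maps those back to an account, and strikes accumulate against the account holder rather than the individual at the keyboard. A hypothetical sketch of that bookkeeping, with all data invented:

```python
# Illustrative three-strikes bookkeeping. Notices identify (IP, time);
# the ISP's logs resolve those to an *account*, so three different
# patrons on one library connection all count against the library.

from collections import Counter

# (ip, date) -> account holder, as an ISP's logs might record it
IP_LOG = {
    ("203.0.113.5", "2009-01-10"): "public-library-account",
    ("203.0.113.5", "2009-02-02"): "public-library-account",
    ("203.0.113.5", "2009-02-20"): "public-library-account",
}

def strikes(notices):
    """Count infringement notices per account, not per individual."""
    counts = Counter()
    for ip, date in notices:
        account = IP_LOG.get((ip, date))
        if account:
            counts[account] += 1
    return counts

def must_terminate(counts, threshold=3):
    """Accounts the ISP would have to cut off under the law."""
    return [acct for acct, n in counts.items() if n >= threshold]
```

Three notices against one shared connection, possibly from three different users, and the whole account crosses the termination threshold.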

New Zealand Computer Society boss Paul Matthews questioned the reasoning behind the law. “You could use the same flawed justification that underpins this law to force The Warehouse to ban someone from shopping there for their food and clothes just because they are accused of copying a few DVDs that they have bought,” he said. “Yes, copyright infringement is wrong, but it needs to be proven first and the penalty kept in proportion. Termination of all internet access in this day and age of online education, social networking and electronic services is a huge penalty.”

And librarians worry that the vague wording will expose them to tremendous difficulties. A New Zealand library association complained this week that “an Internet service provider must terminate the account of a repeat infringer. This draconian provision would seem to mean that, if a user is found on more than one occasion to have illegally accessed or downloaded copyright materials, or otherwise breached copyright in a work, the ISP must terminate the Internet access not of the individual accused of breaching copyright, but of the account holder—that is, of the entire library.”

New Zealand’s creative industries, which generally welcome the law, say that “there should be no termination of the accounts of responsible businesses and organisations such as hospitals and schools as they will have responded to the first warning and prevented further infringement taking place.” It’s tough to see how a 200-person company or a public library could stop all of its users from ever violating copyright using the Internet, however; better policies around this will clearly have to be worked out, as businesses that get cut off when a few interns download the latest pop single will be apoplectic.

The music and movie industries insist that graduated response is “not about ISPs policing the internet, it’s about ISPs responding to a high standard of evidence of infringement and illegal activity on their networks supplied by rights holders. More than anything it is about educating users.”

ISPs, much as they might not like the law, have a more specific business concern: the law does not appear to grant them legal protection from either rightsholders or end users. Content owners unsatisfied with ISP action could still sue the companies, alleging that they have not fulfilled their section 92A responsibilities. Customers who are disconnected, especially in cases where mistakes are made, could potentially file suits of their own against ISPs. According to some IT folks who met with government ministers last year about the change, the government is aware of the issue, but the message to ISPs was clear: “deal with it.”

While tough sanctions like disconnection have been condemned by the European Parliament and avoided by the UK, France still hopes to implement them under its own graduated response regime. Irish ISP Eircom voluntarily agreed to such a plan after being sued, and New Zealand will try the idea out on a national basis. With milder versions of the plan coming to countries like the UK and the US, there’s certainly something to the music industry argument that graduated response proposals are sweeping the globe.

Comcast defends itself against FCC’s VoIP probe

admin | | COMMENTS:Comments Closed

The Federal Communications Commission’s main sparring partner in the realm of network management has sent the agency a polite but chilly refutation of its suggestion that the company may give its own VoIP service an advantage over others running through its pipes. Comcast says that it doesn’t give its Digital Voice product (CDV) “disparate” treatment over its High-Speed Internet (HSI) lines, because it doesn’t route the application through those lines.


“CDV is a service separate from Comcast’s HSI service; it does not run over Comcast’s HSI service,” Comcast Vice President Kathryn A. Zachem wrote to the Commission on January 30.

As Ars Technica has reported, the FCC has shifted its focus from the cable company’s network management practices to how it handles the many VoIP services running through its system. In mid-January the agency sent a letter to Comcast, asking it to justify its “disparate treatment of its own VoIP service as compared to that offered by other VoIP providers on its network.”

The missive observed that Comcast’s own explanation of its new network management techniques discloses that, when the system is trying to manage congestion, a VoIP call might sound “choppy.” This potential choppiness stands in contrast to Comcast’s own VoIP product, a difference that Comcast explains on its FAQ Network Management page. CDV is a “separate facilities-based IP phone service,” Comcast notes, and “is not affected” by the new network management techniques.

We seek clarification, the FCC informed Comcast, on why the company has not disclosed “the distinct effects that Comcast’s new network management technique has on Comcast’s VoIP offering versus those of its competitors.”

What part of “separate” don’t you understand?

But Comcast responds that it is precisely the separateness of its VoIP service that exonerates the company from the charge that it gives CDV a disparate boost. Yes, like Vonage and Skype, CDV is an IP-enabled service. But unlike them, it doesn’t run “over the top” of Comcast’s high speed Internet lines. In fact, Comcast customers don’t need to subscribe to its HSI service to get CDV, Zachem explains, “and Comcast does not route those CDV customers’ traffic over the public Internet.”

The letter also implies that Comcast doesn’t see its CDV service competing with other VoIP providers so much as against the “dominant local Bell telephone companies.” In fact, Comcast notes, the company has been working with Vonage on network congestion problems.

Yet even though Comcast concedes that it is competing with telcos, its response takes exception to the second half of the FCC letter, which suggests that CDV may be a telecommunications service, and thus potentially subject “to the same intercarrier compensation obligations applicable to other facilities-based telecommunications carriers.”

No, no, no, Comcast insists, invoking the FCC’s own 2002 Cable Modem decision, which classifies cable as an “information” rather than a telecommunications service. “Although the transmission of information to and from… computers may constitute ‘telecommunications,’ that transmission is not necessarily a separate ‘telecommunications service’,” the agency’s still controversial Order declared.

“We hope this letter clarifies the ‘apparent discrepancy’ you perceived,” Comcast’s cool response concludes, “as well as the related questions in your letter.”

This answer, however, is routing not to an FCC run by Kevin Martin, who voted for the Cable Modem decision, but to his successor, interim Chair Michael Copps, who voted against it.

“Today we take a gigantic leap down the road of removing core communications services from the statutory frameworks established by Congress,” Copps warned almost six years ago in his dissenting statement, “…and playing a game of regulatory musical chairs by moving technologies and services from one statutory definition to another.” Ars suspects that Comcast’s letter is not the last word in this fight.
