Twitter, search robots get welcomes from Obama White House

January 22, 2009
President Barack Obama's stated plan to create a "Google for government" began Tuesday with a redesign of the WhiteHouse.gov Web site that was announced via a blog entry on the revamped site and a Twitter post.

The Web-site transition occurred at 12:01 p.m. yesterday, when Obama officially took over the presidency from predecessor George W. Bush. The Twitter post said "Change has come to WhiteHouse.gov" and pointed online readers to the blog entry written by Macon Phillips, the Obama administration's director of new media at the White House.

The @TheWhitehouse Twitter account was set up by the Bush administration, which posted more than 1,500 entries there. But the account never attracted much of a following until now. On Monday, it had only about 3,800 followers. By this evening, a little over 24 hours after the transition, the number of followers had topped 14,000 and was seemingly climbing with each page refresh.

Twitter is a familiar communications tool for the new administration. The Obama campaign's BarackObama account still has more than 144,000 followers, the largest number for any account on the microblogging site, according to rankings posted on the Twitterholic.com Web site. That's despite the fact that only two new entries have been posted on the BarackObama page since Nov. 5.

The Obama administration may have also used the WhiteHouse.gov redesign to let the tech community know that its plan for a Google-enabled government is under way.

As a senator from Illinois, Obama sought to make federal data searchable and usable in tools such as mashups. His ideas include putting data in standard formats and making the information accessible through RSS feeds and other methods. That is similar to what Vivek Kundra, the District of Columbia's chief technology officer and one of the top candidates for the position of federal CTO under Obama, has accomplished with the D.C. government's online data catalog, which makes a variety of municipal data available via the Web.
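
To make the RSS idea concrete, the short Python sketch below reads a feed and lists its entries. The feed URL and the assumption that the data is published as standard RSS 2.0 are illustrative only; no actual government endpoint is implied.

    # Minimal sketch: reading a (hypothetical) government RSS feed with
    # nothing but Python's standard library.
    import urllib.request
    import xml.etree.ElementTree as ET

    FEED_URL = "https://example.gov/data/contracts.rss"  # placeholder, not a real feed

    with urllib.request.urlopen(FEED_URL) as response:
        tree = ET.parse(response)

    # Standard RSS 2.0 places each entry under channel/item
    for item in tree.getroot().findall("./channel/item"):
        title = item.findtext("title", default="")
        link = item.findtext("link", default="")
        print(f"{title}: {link}")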

New York-based blogger Jason Kottke noted in a post yesterday that since the transfer of power to Obama, the robots.txt file on the WhiteHouse.gov site has been stripped down to almost nothing. Search engines rely on robot programs to index content, and robots.txt is used to set limits on what is indexed. The Bush administration's "disallow" listings, which went on for 2,400 lines, were removed by the new administration and replaced by a file that appears to have just a single entry.
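
As background on how those rules work: robots.txt is a plain-text file served from a site's root, and each "Disallow" line tells well-behaved crawlers to skip a path. The Python sketch below shows how such rules are evaluated using the standard library; the single-entry file in the comment illustrates the format and is not a verbatim copy of the new WhiteHouse.gov file.

    # A robots.txt with a single entry might look like this (illustrative):
    #
    #   User-agent: *
    #   Disallow: /includes/
    #
    # A search robot fetches that file and consults it before indexing a URL.
    from urllib.robotparser import RobotFileParser

    parser = RobotFileParser()
    parser.set_url("https://www.whitehouse.gov/robots.txt")
    parser.read()  # download and parse the live file

    # True if a generic crawler ("*") is allowed to index the given page
    print(parser.can_fetch("*", "https://www.whitehouse.gov/blog/"))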

Kottke, in an e-mail response to a question from Computerworld, said the material that was disallowed from being indexed by the Bush administration primarily consisted of "text-only duplicates of hundreds of pages. I would argue those text-only pages should be included in search indexes, but it doesn't look like there [was] a conspiratorial attempt to hide information."

Nevertheless, Kottke thinks that by eliminating the Bush disallow list on its first day in office, the Obama administration was sending out a symbolic message. "One of Obama's big talking points during the campaign and transition was a desire for a more transparent government, and the spare robots.txt file is a symbol of that desire," Kottke wrote.

The nearly blank disallow slate may not last long, though. "As the site grows and accumulates more content," Kottke added, "I'm sure they'll need to add to the directories and scripts that search engines shouldn't index."