CENDI PRINCIPALS AND ALTERNATES MEETING
Environmental Protection Agency
Washington, DC
January 8, 2008

Final Minutes

Social Networking and STI

Increasing the Reach of Government Content: A Survey of Government Participation in the Web 2.0 Universe
NOAA and Second Life
EPA Showcase – “The Puget Sound Information Challenge: Gov 2.0 in Action”

Welcome

Mr. Ryan, CENDI Chair, opened the meeting at 9:10 am.  He thanked the EPA for hosting the meeting. He gave special acknowledgement to John Sykes and the EPA staff for setting up the meeting on short notice and dealing with a variety of logistics and technology issues. Mr. Sykes welcomed everyone to the “Woodies” Building and to the Science Advisory Board Offices.

Mr. Ryan presented a certificate of appreciation to Nancy Allard to deliver to Lew Bellardo in honor of his service as the CENDI principal from NARA. Lew retired at the end of December. Nancy will be the principal and Carla Patterson will be the new alternate.

Mr. Ryan also announced that Pete Suthard, the alternate from DTIC, will be retiring at the end of February. All took part in wishing Pete well. 
  
Social Networking and STI

“Increasing the Reach of Government Content: A Survey of Government Participation in the Web 2.0 Universe”
Michelle Springer, Project Manager, Digital Initiatives Web Services Division, Library of Congress 

Ms. Springer prefaced her remarks by stating that the opinions expressed were her own and that she did not speak for the Library of Congress.

Web 2.0 is changing the expectations of Internet users, including those who access government information. New technologies such as blogging, social bookmarking, web sites that facilitate sharing and rating content, podcasts, collaborative software packages such as wikis, and virtual worlds are changing the communications landscape. These communication methods are indicative of a shift in how new generations access information. There is a new expectation that communication will be a conversation rather than a one-way broadcast.  Mixing authoritative information with user-generated content is a challenge for federal agencies; they must enhance access to the agency and its information without diluting the integrity, authority, and the public’s confidence in what is being disseminated.  Agencies may be concerned that they will lose control of the message.  

Government must communicate within a set of policies that have been developed over time to protect the government and the public. However, these policies have yet to sufficiently address these new technologies that have burst upon the scene very recently. Government agencies have responsibilities such as public health and safety, defense, and security that are not shared by the private sector; they must speak with authenticity and reliability and this needs to be safeguarded.

Many policy questions arise. How can user-generated content be integrated with government information, while maintaining the authoritative nature of the government information? What level of control can be exerted over content once it has been uploaded to a non-agency web site? How does the expectation of immediate, unfiltered interactions mesh with the agency’s policies regarding formal review? How can the agency establish identity and trust through branding when the information is made available on non-agency sites?

Several government agencies are undertaking pilot projects to investigate not only new technologies but to better understand the policy and operational ramifications of these technologies. They are using low-cost, multi-channel approaches for these prototypes and they are careful to label them as such.

“Web 2.0” technologies encompass a wide range of social software, including blogging, podcasting, wikis, tag clouds, social bookmarking, and virtual worlds. Usa.gov is keeping track of what agencies are doing in some of these areas on its Reference Shelf pages. Ms. Springer addressed each of these technologies and gave some examples of how agencies are using them.

Blogs

There are currently 16 active public government blogs listed on USA.gov. Blogs are still not very prevalent across the agencies. The benefit blogs provide is to put a personal voice and perspective on the agency information. The process for authoring is often the issue. Should there be multiple authors and multiple blogs? What level of review is needed and by whom? How does the agency ensure that the information is appropriate? How do employees find the time to blog? The challenge is to achieve a balance between the personal voice and official communications. Disclaimers and review processes are part of the answer.

Agencies are implementing blogs in different ways. The Library of Congress blog has a single author. Some agencies have blogs by the agency head only. Some blogs such as the State Department’s DIPNOTE and usa.gov’s Gov Gab have rotating authors. Some blogs allow anonymous postings; other blogs are connected to a specific event or time period.

Incorporating user-generated comments raises policy issues. Non-government blogs generally rely on self-policing. However, the government can’t relinquish control to the crowd. It is also important to remember that these blogs are still government records. Risk management comes into play when information from outside the agency is published. NARA’s recent document on web technology and records management (http://www.archives.gov/records-mgmt/initiatives/web-tech.html) provides guidance on the issues to consider.

Agencies that decide to allow comments on their blog must develop policies and procedures on what constitutes an appropriate comment and how inappropriate ones will be handled. However, it is necessary to ensure that differing opinions are heard. Some comments that aren’t spam may be caught by automated spam filters. The staff time required to moderate a blog should not be underestimated.

Bookmarking

Many federal blogs provide links to web services that allow users to save, organize, share, annotate and even rate the information they find. These may include links to social bookmarking and rating sites. There is no consistent list of links or presentation of the icons to these sites across agencies at this time.

Digg.com is a rating site. Del.icio.us is a bookmarking site. These are all free services. One person “digs” a page, and, as more people mark it, it becomes a way of organizing and prioritizing the content on the web. It answers the question, what does my community think I should be reading on the web today?

Remarkable spikes in user views of agency web pages can occur when these pages are bookmarked or rated and then discovered by the communities using these sites.  You can find similar sites that have been bookmarked and see what people find interesting on your site. You can search to see what agency content has been placed on Digg or del.icio.us and then see how this is impacting your web usage statistics.

Tag Clouds

Tag Clouds are a way of visually depicting the relative popularity of keyword tags or search terms, and another way of elevating popular content through a visual approach using color and size. The terms themselves are often clickable. There are many software applications to create tag clouds; they can be static or dynamic. Tag clouds can be used to assess the popularity of terms, which can then be incorporated into other, less-used sites, if appropriate.
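The core of the technique described above is simply scaling each term’s frequency into a font size. A minimal sketch follows; the tag counts and the point-size range are invented for illustration.

```python
# Minimal tag-cloud sizing sketch: linearly map term frequencies to font sizes.
# The counts below are invented example data.
counts = {"water": 42, "climate": 28, "wetlands": 7, "permits": 3}

MIN_PT, MAX_PT = 10, 32  # smallest and largest font sizes in points

def font_size(count, lo, hi):
    """Linearly scale a tag's count from [lo, hi] into [MIN_PT, MAX_PT]."""
    if hi == lo:
        return MAX_PT
    return round(MIN_PT + (count - lo) * (MAX_PT - MIN_PT) / (hi - lo))

lo, hi = min(counts.values()), max(counts.values())
for tag, n in sorted(counts.items()):
    print(f'<span style="font-size:{font_size(n, lo, hi)}pt">{tag}</span>')
```

Dynamic clouds would recompute the counts on each page load; static ones would emit this HTML once.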

YouTube and Social Media Sites

Agencies are expanding the reach of their multimedia assets by putting them on sites such as YouTube, Flickr and iTunes. Some of these sites focus on user-generated content but the amount of professionally produced materials is growing rapidly. However, before an agency chooses to place content on these sites, there are several questions to consider. Have all the clearances and permissions been obtained? What license options are available on a particular site and are they appropriate to the permissions obtained and to the content and audience? This is particularly important if the agency is not the originator of the content. Can the agency limit ads? Does the liability language on the terms of service agreement allow for government institutions? Is additional branding required in order to mark the material for use or reuse once it has been uploaded, and can users comment on the material once it has been uploaded?

Examples of the use of social media sites include Queen Elizabeth’s Christmas message, which has received more than 1 million views since Christmas Day, and the NOAA, NASA and State Department channels on YouTube. NOAA actually has multiple channels; its Ocean Explorer site is an example of the functionality available in social media. In addition to making pod and video casts available from their own web sites, agencies are placing them with sites like iTunes. USA.gov provides a list of government podcasts by topic. Other examples include the US Department of Agriculture’s Economic Research Service’s ERS podcast page and NASA’s list of podcast directories.

Wikis, Collaborative Software, and Wikipedia

A wiki traditionally allows content contributions by its community of reader/editors, rather than following the typical Web page model, which rests responsibility for content quality on a single Web page owner. Agencies are experimenting with both internal and external wikis as a way to tap into the expertise of the community. Examples include the Peer to Patent Pilot Project of the Patent and Trademark Office (PTO), where patent application information is loaded to the system and registered participants can rate claims and provide comments on patents and prior art. The State Department’s internal wiki, called “Diplopedia”, allows for communication among staff who are separated by time zones and geography. The General Services Administration (GSA) has an Intergovernmental Solution’s Collaborative Work Environment wiki.

The popularity of Wikipedia has resulted in a broad familiarity with wiki technology. The Pew Internet and American Life Project in April 2007 reported that Wikipedia has enormous reach. A new Wikipedia analysis tool (http://wikiscanner.virgil.gr/), which cross-referenced anonymous edits with the registered owners of the IP address from which the edit originated, found that edits were originating from government agency computers. Whether this activity reflects an official institutional strategy or disparate instances of individual staff initiative is unknown.
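The cross-referencing technique behind that tool amounts to matching the IP address of each anonymous edit against organizations’ registered address blocks. A minimal sketch of the idea follows; the address blocks, owner names, and edit IPs are invented examples, not real registry data.

```python
# Illustrative sketch of the WikiScanner idea: attribute anonymous wiki edits
# by matching their IP addresses against registered organizational IP ranges.
import ipaddress

# Hypothetical registry of address-block ownership (in practice, WHOIS data).
registry = {
    "Example Agency A": ipaddress.ip_network("192.0.2.0/24"),
    "Example Agency B": ipaddress.ip_network("198.51.100.0/24"),
}

def attribute(ip):
    """Return the registered owner of the block containing this IP, if any."""
    addr = ipaddress.ip_address(ip)
    for owner, net in registry.items():
        if addr in net:
            return owner
    return None

for ip in ["192.0.2.17", "203.0.113.5"]:  # IPs of invented anonymous edits
    print(ip, "->", attribute(ip) or "unattributed")
```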

In 2006, the University of Washington Libraries reported increased use of UW materials after external links to the Library’s collection were added from Wikipedia articles. The National Agricultural Library has embarked on a pilot project to contribute articles and links related to agriculture, but they will not edit anonymously. The use of Wikipedia by UW and others to promote their organizations is viewed by some in the Wikipedia community as a violation of the site’s conflict of interest policies. This controversy can result in agency-inserted information being edited out or deleted. One strategy is to place information on discussion pages, rather than editing the articles directly, and ask the article administrator to review and add the information to the article.

Another important use of Wikipedia by agencies is to see how the agency is being portrayed, particularly in different languages.  It is worthwhile to review the Wikipedia content periodically.

Virtual Worlds

These sites use 3D modeling tools to create massive multi-player online environments. Several agencies, including NASA, NOAA, the Centers for Disease Control (CDC) and the National Institutes of Health (NIH), are leasing virtual land in environments such as Second Life. Second Life examples include the Mars Victoria Crater from NASA and the SciLands project. Non-federal examples include the New York Law School’s Democracy Island, which hosts the PTO’s Peer to Patent exhibit and a virtual US Supreme Court building.

The best practices for government-sponsored virtual worlds are still emerging. Many of the best practices developed for the 2D web environment don’t apply. The sensational nature of some of the virtual world activities, constraints in the use of these sites over agency networks, high use by the public at night outside of normal work hours, and the lack of sustainable funding are challenges.

Despite these challenges, there continues to be great interest on the part of agencies in these virtual worlds. NASA Ames will be hosting a two-day workshop on virtual environments and how they can be used to support and advance space exploration and settlement by reducing isolation and allowing more interaction with family members. The Federal Virtual Worlds Expo will be held at the National Defense University on April 22-23, 2008. The presentations from the November meeting are available online. There is also a wiki for federal agencies to discuss virtual world topics.

Technologies continue to push the envelope and challenge the creativity of agencies and policy makers. For example, the State Department has announced it will explore micro-blogging via Twitter in 2008. This is a no-cost social networking service; up to 140 characters can be sent to a Twitter page. The Library of Congress will shortly launch a user-generated tagging pilot. There are four widgets and APIs available on the FBI web site.

Agencies are very careful to say that these are experiments and prototypes. There are no standards across agencies. One reason for getting involved in this type of media is to ensure a presence in communities that are interested in the type of content the agency has to offer, and to control and brand the presentation of that content, since users may take content from your site and place it in these spaces anyway.

Discussion

Ms. Shaffer announced that the February 28, 2008, FLICC General Counsel’s Forum will be on the legal issues of using Web 2.0. They hope to launch a closed wiki following the forum where general counsels and librarians can submit different policies on 2.0 and compare them. It was noted that accessibility and 508 compliance are issues. However, usually there is a 508-compliant version of the same information available on a regular web site. 

NOAA and Second Life
Eric Hackathorn, NOAA Earth Systems Research Laboratory

Mr. Hackathorn described the National Oceanic and Atmospheric Administration’s (NOAA) use of Second Life (SL) by sharing one of NOAA’s islands with Ms. Springer’s Avatar (her personally developed virtual character). He took her and the CENDI participants on a virtual tour of some of the areas of the island, including a virtual tsunami where the avatars can experience the tsunami from the beach or walk into the water to see the geologic conditions that cause such a phenomenon.

There were a number of other avatars on the island at the same time. Anyone can visit if they have the right hardware and software. If the avatars are in range, you can communicate via text. It is possible to share slides and videos and to do videoconferencing as well.

Another example of NOAA’s use of the Second Life virtual world is the site where the National Marine Sanctuary Laboratory tracks coral reef sanctuaries worldwide. It is possible through SL to experience the coral reefs in their natural setting via submarine. This is an immersive way of bringing the data and research to life in a way that static web sites can’t do. The avatars also have a personalized presence when they interact with visitors.

SL’s strengths include not only its immersive nature, but the ability to tell stories, communicate, collaborate and create a community. It is an intimate experience that closely approximates the real world.

SL was developed by Linden Lab and is the platform used by NOAA, after determining that it most closely matched their needs. There are other platforms that would be appropriate depending on one’s data visualization needs and the perceived audience. There are currently about 11 million avatars throughout SL. It provides a way to reach an increasingly Internet-savvy, younger population. In the future, Google may add avatars, which would make this an important platform for NOAA as well.

NOAA’s primary focus with SL is outreach and education. In the future, this may not be as much the case. There are opportunities for researchers to share data and collaborate in real time. A shared data experience can be achieved in a more realistic fashion. These opportunities may move SL away from education and outreach and more toward decision making and other activities.

NOAA is investigating SL’s use in the design and planning of its facilities world wide.  Scientists could use SL to collaborate on the design of new facilities and to closely monitor and track them as they are built. Data from Web Trends and Google Analytics might also be visualized this way.

NOAA’s island is version 0.1; they are now working on 0.2. Second Earth, which is SL but using Google Earth, is under development. There is a virtual sphere, 200 meters in diameter, in the first version. The spherical projection screen exists in the real world, and they have mimicked it here. SOS.NOAA.gov shows some projects for displaying datasets. The graphic capabilities are improving very quickly. The idea is to be able to take XML formats from NOAA and other collaborators and display them in this way. Many collaborators are working on this. Science On a Sphere is the latest IPC (inter-process communication) data model.

NOAA is working with Alaska on a spherical projection that will include glaciers.  Hawaii might be another space to be presented this way. It is a unique opportunity for politicians to highlight what they do for their constituents by using a geospatial visualization approach. It becomes almost a 3-D search engine.

Each island costs NOAA about $75,000 in resources. However, there are many ways to view this. This is a very large development community, and many organizations have recruited volunteers from the SL community to develop for them. Management resources are still required. It took about two months of an FTE to develop this island site.

Mr. Hackathorn cautioned that this is new technology and should be treated as such. He doesn’t have metrics as to whether this increases traffic (users) to the traditional web sites. There is currently a link from the SL island to the traditional web site, but no link the other way. In terms of metrics on the SL site, the number of users is lower, but the average time spent is higher. Last month, visitors spent about 51,000 minutes on NOAA’s islands.

There are a number of government science agencies on www.scilands.org. These include NASA and the National Physical Laboratory from the United Kingdom. NIH may be joining soon. Other education- and science-related organizations in scilands.org include National Public Radio and the San Francisco Exploratorium. By grouping the islands together, you can share content and create conversations that wouldn’t normally happen. The number of organizations involved is increasing by one to two per month. The site has grown from one island last July to 30.

EPA Showcase – “The Puget Sound Information Challenge: Gov 2.0 in Action”
Molly O’Neill, EPA Assistant Administrator for the Office of Information and Chief Information Officer

EPA’s Office of Environmental Information (OEI) has an annual meeting called the Environmental Information Symposium to which industry, contractors, and other agencies are invited. Typically, it has been used to share and show applications and report on EPA projects. Ms. O’Neill wanted to do something different this year. The CIO Council is trying to do Gov 2.0, but questions have arisen as to the appropriate places for 2.0 tools in the business of government and how the government can embed these tools in what it is doing.  

At about the same time that final preparations were underway for the Symposium, Ms. O’Neill reached out to the Puget Sound Partnership to see if they would be interested in being a test case for a Gov 2.0 experiment.  The Partnership was created by the Washington Legislature in 2007 in response to the growth and resulting environmental issues in this geographic area. The group is led by William Ruckelshaus, former EPA Administrator. The goal is a healthy Puget Sound by 2020. The Partnership established collaboration across sectors, involvement of the public, and an action plan based on science as part of its guiding principles.

The question became, how could the Internet help to solve a challenge such as that faced by the Puget Sound Partnership? Ms. O’Neill and her staff decided to challenge the attendees at the conference to use the Internet and 2.0 tools to address this challenge. They learned from key individuals involved with the Puget Sound Partnership that they needed information about relevant science issues, science data, indicators, performance measures, best practices, and maps that would help build a sound action plan.

Four weeks prior to the Symposium, Ms. O’Neill’s office built a web site outside of EPA and a wiki front page. When trying to decide how to organize the page, they selected the information need areas outlined above. In the three weeks prior, a video was produced of Mr. Ruckelshaus addressing the conference attendees explaining the Puget Sound situation and the Partnership, and encouraging attendees to share the challenge with others in their personal networks. The video was uploaded to YouTube before the meeting. Ms. O’Neill set up an interactive presentation during the first session of the meeting, which was attended by over 600 people.  The interactive presentation interspersed the YouTube video and her own presentation.

In addition to the presentations, the challenge was advertised and explained to attendees in several ways. It was described throughout the meeting and postcards were distributed explaining the project, what was needed, and how to contribute, and pointing them to the web site. Actual technical capabilities were focused in the Mash-up Camp, which was already planned for the exhibit area.

People were allowed to contribute knowledge in several ways. They could edit the wiki, fill out a comment form, send a plain e-mail, or contact by phone someone in St. Louis. Tags were set up so that those who had content on del.icio.us or Flickr could tag it in a special way and have it harvested. A small set of stub articles and a working application were developed to illustrate the types of content and to identify some possibilities. They also developed a coding scheme that would support the categorization of the input and the placement of the results on the web pages.
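The harvest-and-categorize step described above can be sketched simply: pull in items carrying the challenge tag and bucket them by the coding scheme for placement on the web pages. The feed snippet, tag syntax, and category codes below are invented for illustration; the actual scheme and feed formats used by EPA are not described in detail here.

```python
# Sketch of harvesting tagged contributions and bucketing them by a coding
# scheme. The RSS snippet and "psic:" category codes are invented examples.
import xml.etree.ElementTree as ET

rss = """<rss><channel>
<item><title>Puget Sound salmon counts</title><category>psic:data</category></item>
<item><title>Sound-wide habitat map</title><category>psic:maps</category></item>
<item><title>Nearshore best practices</title><category>psic:data</category></item>
</channel></rss>"""

buckets = {}
for item in ET.fromstring(rss).iter("item"):
    code = item.findtext("category")      # e.g. "psic:data"
    section = code.split(":", 1)[1]       # coding scheme maps to a page section
    buckets.setdefault(section, []).append(item.findtext("title"))

for section, titles in sorted(buckets.items()):
    print(section, "->", titles)
```

In practice the items would come from the per-tag feeds of sites like del.icio.us or Flickr rather than an inline string, but the categorization logic is the same.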

Once the announcement was made at the meeting, e-mails were automatically sent to EPA staff back home, to the CIO Council, and to others. People were encouraged to share the challenge with anyone who might contribute. Since the National EPA Library Meeting was taking place at the same time, Ms. O’Neill challenged the librarians to find documents of value to such a project.

The challenge was only open during the 36 hours following the meeting. Updates were given at regular intervals throughout the sessions. It is important to note that this was not explained in advance and that the work on the challenge was in addition to the regular program, exhibit responsibilities, etc.

The number and quality of the contributions was remarkable. Over 175 contributions were received. The site received over 17,000 page views during this period. Surprisingly, no one submitted contributions by telephone. Information was received from all sectors and many different collaborators. This is important since EPA has only 16 percent of the budget in the environmental business line of the Federal Enterprise Architecture. Moreover, only six percent of the environmental professionals in the US work for EPA, so there is a large pool of talent outside the agency that could be tapped.

Ms. O’Neill presented some of the highlights. Techniques and ideas for identifying relevant information on the web were contributed. This included searches for Puget Sound on the web and relevant tagged content on blogs and social network sites such as YouTube, Flickr and del.icio.us.  The latter site provided information about a previously unknown but related initiative that is applying the UrbanSim application to the Puget Sound region.  

Data contributions included historic data about precipitation, soil moisture, and other key water cycle measurements, as well as a NASA contribution to help use and understand air quality data as viewed from space. New layers could be added to existing data through newly identified or newly built web services. Multiple groups submitted customized search applications, including customized Google searches that focused on environmental web content categorized as “data”, “indicators”, “maps”, etc. A method for doing spatial searching for documents that are not geo-referenced was proposed. In addition to data and software, people submitted ideas. For example, the EPA science community partnered with the Department of Transportation and an Estuary Program to come up with an idea to attach water quality monitors to ferry boats that travel the Sound. The U.S. Global Earth Observation Community developed a mash-up concept to make earth observation data actionable using geospatial tools and analysis in support of specific indicators.

On January 28, 2008, EPA will deliver the results to the Puget Sound Leadership Council. At that point, it is up to the Council to determine the follow up.

Several lessons were learned. People want to contribute but they don’t want technology to get in the way. They are anxious to see their contributions in context and to comment on the contributions of others. EPA learned a lot about data access. While there are many data resources that are accessible via web services, many are not. EPA needs to put priority money into the development of such services. Documents and data should all be accessible within the context of a particular geographic location. It is clear that the Internet can be used to collaborate. Perhaps similar technologies could be used to reach out to a larger community for other purposes such as receiving regulatory comments.

Ms. O’Neill would like to roll out a complete sandbox for continuing to investigate collaboration via the Internet and 2.0 approaches. There are issues of FOIA, data quality, intellectual property, branding, and maintaining the authoritative nature of government information. Ms. O’Neill looks to CENDI and other organizations like the CIO Council to help with these issues.

Ms. O’Neill believes the OMB wants to see these kinds of activities happen. The CIO Council is in partnership with Don Tapscott of Wikinomics to gather government case studies. There is a partnership with the National Academy of Public Administration (NAPA) to talk about how to bring the agencies together on the policy side.

The nature of the CIO Offices is changing. Some CIOs are primarily interested in policy while others are interested in IT. However, the responsibilities are changing. Of importance are the new Geographic Information Officers. Throughout the challenge cited above, they realized that EPA does not have enough content managers or information management specialists. The environment is also changing for librarians and web managers, and there are opportunities to support Gov 2.0.

The morning program concluded at approximately 12:00 Noon.
