As reported by Federal Computer Week last week, the Congressional Research Service (CRS) recently published a report for Congress that evaluates the first year of the Open Government Directive: The Obama Administration’s Open Government Initiative: Issues for Congress (PDF)

From the summary:

The 112th Congress may have interest in accessing information and documents from the executive branch. This report examines and analyzes the Obama Administration’s initiative to make the executive branch more transparent, participatory, and collaborative. […]


The 112th Congress may oversee the Administration’s open government efforts and has the authority to codify any parts of the initiative. This report reviews and discusses the centerpieces of President Obama’s transparency initiatives, the Open Government Initiative and the Open Government Directive. The report analyzes agency response to the OGI and the OGD and examines whether the OGD’s requirements can meet the stated goals of the Administration. The report discusses the three central tenets of the Administration’s OGD—transparency, public participation, and collaboration—and analyzes each one individually to determine whether agencies are meeting these requirements and whether the requirements may improve the effectiveness of the federal government.

Several people have already weighed in (Joe Goldman, Nancy Scola, Andrea di Maio, and others).

The report puts a strong focus on objective-driven Open Government and the need for measurable results. While I largely agree with that emphasis, I found the report’s analysis of public participation (pages 30-31) somewhat lacking. Here’s why:

1. Fails to mention the need for good process design

In the analysis section, the report mentions a number of high-level concerns and challenges, paraphrased here:

  • The public may not be well-informed enough.
  • There may not be enough citizens who are motivated to engage in public policy deliberations and who are capable of doing so.
  • Public comments may not be useful to the federal government.
  • Responding to public comments may cause delays in government action.
  • The resource requirements (dedicated employees, work hours) needed to respond to these comments are unclear.
  • Most public participation may come from special interest groups.
  • Web-based public participation may give certain participants unfairly greater access to policy makers.

These are all valid and important points. However, the report doesn’t mention that good process design will almost always help address these challenges. For example, good process design takes into account the resources the convener has available to process participant input and only makes commitments that are feasible. Good process design creates ways for participants to learn about the issue (remember: inside every public participation program is a good public information program, or, as I would phrase it today, a public learning program). Good process design helps bring all voices to the table, not just special interests.

2. Generalizes based on one bad example

The report looks back on the 2009 Open Government Dialogue and observes that the suggestions were “of varied relevance and utility.” It also quotes an editorial concluding that the quality of public comments during the Dialogue “was not consistently encouraging.” Both statements are undoubtedly true, but they do not apply to public participation in general, online or offline. What the report does not mention is that the Open Government Dialogue was, despite all its good intentions, poorly designed and poorly managed. In all likelihood, better process design and more hands-on management would have yielded much better results.

* * *

On a final note, while it’s true that “[t]he OGD presumably was created, in part, using suggestions from the public”, there was never a proper follow-up informing participants whether and how their input had influenced the final document. Again, something a good public participation design would have taken care of in a timely manner.

Despite these minor shortcomings in a report I found generally on target, one take-away for me is that the public participation field still has a lot of work to do: better documenting its successes, developing a consistent framework for measuring return on investment, and sharing this information much more widely with organizations in the public, private, and non-profit sectors that engage their constituents in public participation.

It’s not enough to know better; this knowledge needs to become readily available to a much broader audience than it reaches today. As I mentioned on Twitter, this is something I would like to make a focus area of IAP2 USA over the next year or two.