Wednesday, September 2, 2009

Connected Nation: No Bid for KY RFP

So Connected Nation opted not to submit a proposal in response to the Kentucky Commonwealth Office of Technology’s broadband mapping RFP.

Seems the Commonwealth is asking bidders to turn around a statewide map, uh, yesterday. Well, in two months, which is effectively the same thing. From the Connected Nation blog:

Kentucky’s RFP for broadband mapping calls for a submission of a substantially complete dataset by Nov. 1, 2009, a full three months earlier than the timeline laid out in the federal guidelines....Connected Nation’s wealth of experience in creating broadband maps shows that this timeline is simply unrealistic.

I'll take a look at the responses to the non-response soon.

Saturday, August 29, 2009

Lots of Takers

So the basic stats are in. The first round of applications for broadband stimulus funding through the American Recovery and Reinvestment Act of 2009 have been received by the National Telecommunications and Information Administration and the Rural Utilities Service. A press release from NTIA provides the rundown:

they received almost 2,200 applications requesting nearly $28 billion in funding for proposed broadband projects reaching all 50 U.S. states and territories and the District of Columbia

That $28 billion is roughly seven times the amount (~$4 billion) available during the first round.

Friday, August 7, 2009

Trust But Verify

I read a piece in Business Wire with great interest on the entry of the Broadband Information Services Consortium (BISC) into the broadband mapping fray. After all the heat that Connected Nation has gotten in recent months for being the paid handmaiden of industry, I'm curious to see what a large-scale alternative might look like and whether it can build a better mousetrap.

I hasten to add that mapping is crucial. But as I've written, it is only the first (or perhaps second) step in improving broadband access. While several states have contracted with Connected Nation for mapping services alone, I'd wager the greatest gains in broadband deployment come from the broader community mobilization of the sort that has occurred in Kentucky and several other states. A map is necessary to point out gaps in service. But those gaps exist because large providers don't see near-term opportunities to recoup their investment. So mobilizing local knowledge and forging local coalitions is essential if anything is going to change in those areas.

And that's where the real power of Connected Nation's approach lies.

So what does BISC offer? It's really hard to tell. Here are a few clues from the Biz Wire piece:

"...provides states with customized solutions to broadband mapping to address the full supply-and-demand broadband continuum..."

"... ensures the most accurate, fully verified and up-to-date information available for broadband mapping..."

"...Our collective experience and platform enable us to compile the multiple layers of real-time data of location and serviceability, either as a full-service approach or as a complement to state efforts...."

"Broadband maps created with geographic information system (GIS) technology provide an advantage to states..."

These notions all sound great. But one question remains: where do the data that form the base map actually come from? The Biz Wire piece doesn't say.

Connected Nation has been roundly criticized for accepting as gospel the information that they receive from broadband providers. That's a fair line of attack. Why should we trust providers, particularly when the data they release come with strings attached?

As I've written previously, while they've not called it "crowdsourcing" in the past (what, not hip enough, dudes?), Connected Nation has encouraged public validation of its base map. The information that providers contribute is not cast in stone but is viewed as a starting point. Connected Nation has provided a mechanism for consumers to question the accuracy of the map, soliciting inputs from the public as a means of improving on what providers are willing to share. In Kentucky alone, for example, over 4000 inquiries have been made against the provider data, improving the quality and accuracy of the map as a whole.

The BISC plan seems to make heavy use of crowdsourcing (which amounts to polling or surveying the public) and extrapolating from the poll to estimate conditions overall. This is apparently the sum total of the approach taken by Broadband Census, which begins with a blank slate and maps on the basis of voluntary polling. (I'm not sure how many data points the Census has for the whole nation, but I've been told it's fewer than the 4000 Connected Nation has for Kentucky alone; as a corrective, I should add, to a base map drawn from provider data.) As any statistician will explain, the fewer data points you have, the larger the error terms (i.e., the greater the inaccuracy). So if you start from a blank slate, it takes a lot of data points to generate anything meaningful.
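To make the statistician's point concrete, here's a small simulation (my own illustration, with a made-up 70% "true" availability rate, not anyone's actual data): estimate a region's broadband availability by polling n random households and watch the typical error shrink roughly as 1/√n.

```python
import random

random.seed(0)  # reproducible illustration

TRUE_RATE = 0.70  # hypothetical true share of households with broadband


def estimate_error(n, trials=500):
    """Average absolute error of a poll of n households, over many simulated polls."""
    total_err = 0.0
    for _ in range(trials):
        # One simulated poll: each respondent has broadband with prob. TRUE_RATE
        hits = sum(1 for _ in range(n) if random.random() < TRUE_RATE)
        total_err += abs(hits / n - TRUE_RATE)
    return total_err / trials


for n in (50, 500, 5000):
    print(f"n={n:5d}  typical error ~ {estimate_error(n):.3f}")
```

With 50 data points the estimate is typically off by several percentage points; with thousands it settles down. That is why a crowdsourced map built from a blank slate needs a very large number of inputs before it says anything reliable.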

Connected Nation's maps are not perfect. Nor are they intended to be. Nor could they be. Broadband deployment changes rapidly. And providers (for all sorts of reasons, and not just the big ones) don't provide entirely accurate spatial depictions of their deployments.

The big question is: Where does the map start? From a blank slate (in which case crowdsourcing is likely to contain lots of inaccuracy)? Or from a flawed, incomplete, temporally-bounded map of provider data (in which case crowdsourcing can and, in Connected Nation's case, already does make a big difference)?

Although they claim to be operating transparently, we don't yet know where BISC's base map comes from. I'll withhold judgment until I do.

Thursday, July 23, 2009

So Everyone's Clueless?

So how do you build a national plan for broadband?

It seems no one is offering many specifics.

This piece by John Eggerton of Multichannel News is pretty alarming:

The Federal Communications Commission's broadband czar is not impressed with the agency's submissions from the public and industry on the grand broadband plan, suggesting there is too much pie in the sky and not enough pie chart on the page.

How anemic are the offerings?

He ended his talk by literally begging for better input. "We really need your best ideas. And we need them quickly and clearly."

So Google's effort at crowdsourcing the National Broadband Plan hardly inspires confidence, nor does it bespeak any national consensus. Now the FCC's broadband czar is begging for better public input, too.

Suppose this plea puts us all on the spot...

Tuesday, July 21, 2009

Crowdsourcing and the Broadband Map

Drew Clark of BroadbandCensus.com has a great idea:

One of the things that BroadbandCensus.com has been doing since our launch, in January 2008, is to provide a crowdsourced, public and transparent collection of data about local broadband Speeds, Prices, Availability, Reliability and Competition. We call this the Broadband ‘SPARC.’

The basic idea is to get as much information from as many sources as possible to create a sort of collage: a national broadband map extrapolated from an enormous set of data points, ideally from lots and lots of individuals. There's a lot to be said for such a bottom-up approach, setting aside the initial difficulty of getting folks to participate. The main challenge is that the starting point is a blank slate.

The question, really, is when and how such a crowdsourced resource should inform planning and decision making.

Clark has been outspoken in his criticism of Connected Nation's approach to broadband mapping, offering crowdsourcing as an alternative.

Yet it seems there's less daylight between Connected Nation's approach and BroadbandCensus.com's than Clark suggests. Indeed, Connected Nation has placed public verification of its mapping results at the forefront. Though they've not referred to the effort as crowdsourcing, I suppose they could.

Connected Nation brokers arrangements with multiple providers offering to protect what those providers believe is proprietary data. In other words, Connected Nation works with providers, accepting the data that they are given. The basic map begins, then, as the map that providers would have us see.

There are multiple reasons to be skeptical of this base map as an outcome. But it is only the first step.

Some providers dramatically overstate their coverage, though it's not always who you might think. Many suspect that Verizon and Comcast claim availability where there is none in order to make the broadband challenge look smaller than it is. But, according to my conversations with Connected Nation's mapping team, many small providers claim universal access in their service areas, reasoning that anyone who lacks service can get it simply by asking. Whether the data come from large or small providers, there's no way to challenge providers' assertions other than through address-by-address verification.

So without attributing motives, we can stipulate one simple fact: for numerous reasons (the reticence of providers to cough up accurate data as well as the evolutionary nature of their networks), no national broadband map can be entirely accurate.

And given these manifold flaws, the best verifier of any map is the public itself. Hence the importance of some sort of crowdsourcing effort as a corrective to any national map (whether prepared by Connected Nation, some other private sector entity, the providers themselves, or the FCC). In fact, Connected Nation realizes the importance of verification and has, as it happens, probably gotten more inputs from its own rather quiet crowdsourcing efforts than BroadbandCensus.com has.

Saturday, July 18, 2009

A Google of Opinions on Broadband

So Google is encouraging folks to "Submit your ideas for a National Broadband Plan". At the time of this writing,

1,719 people have submitted 440 ideas and cast 35,988 votes...

As one might imagine, there's a lot of chaff, but there are some solid ideas. It's worth taking a look if only to get a sense of the breadth of opinion (and misinformation).

A few examples:

"Investigate ways to increase the span of wireless networks and make it more advantageous for local governments to provide free wireless internet."
swankestZACK, Louisville, KY

So these are probably both good goals. Longer-range wireless networks. Free wifi. Awesome. But the vision isn't matched with tactics. Make it advantageous? Uh, great, but how?

"Make Broadband a Utility. Internet is like phone service, water service, or electricity. It is quickly becoming almost necessary to have it. Since Internet itself is a service, it should become a Utility with no filtering."
Navarr, Spring, TX

So the key for the National Broadband Strategy should be definitional? (I.e., redefining Internet as utility?) What difference would this make at a practical level?

I could go on. But suffice it to say there are a lot of well-intended ideas. How these would (or will) fit together as a coherent strategy remains to be seen. My hat's off to Google for providing the space to try.

And there are slightly more articulated visions:

"Encourage investment from diverse companies in broadband infrastructure and support innovative public-private collaborations to reach remote, unserved parts of our nation, so everyone is connected to an increasingly robust Internet."
NextGenWeb, Washington, DC

Okay, sure, so you'd expect NextGenWeb to have a reasonably lucid vision. Of course, like the other ideas, this is a vision, not an implementation plan.

Still, this might be a fruitful forum. It'll be interesting to see how many ideas and voices are raised at the Google site.

Friday, July 17, 2009

Henchmen?

So I don't know if you've seen it, but Techdirt has an illuminating piece by Mike Masnick, who is largely sympathetic with Art Brodsky's (valedictory?!) rant against Connected Nation.

Masnick first makes this overwrought claim about Connected Nation:

First, it's just a "mapping" organization and it's run by the telcos themselves, allowing them to continue to fudge the data to make markets look a lot more competitive than they really are. And, yet, thanks to all the political love that goes out to Connected Nation, it looks like they're about to get hundreds of millions of dollars in broadband stimulus money.

Of course, it's disappointing that Connected Nation continues to be seen by its critics as merely a front for the telcos. But the larger disappointment, without belaboring the point, is that this characterization of Connected Nation's work completely fails to acknowledge that Connected Nation believes mapping to be only the first step in improving broadband. In states like Tennessee, Ohio, and Kentucky, the map has been only stage one, run in tandem with statewide, county-by-county planning efforts that foster local empowerment and decision-making focused both on improving demand and on channeling that demand toward feasible, meaningful, local improvements in supply.

After swallowing the bromide against Connected Nation hook, line, and sinker, without critically engaging the organization's actual work or its results, Masnick then ends with this little bit of confusion:

I certainly agree that better data is important, but I have to admit I'm still somewhat confused as to what real problem we end up solving with mapping alone? Yes, it will give us more data to figure out just what the current situation is when it comes to broadband deployment, but that's got little to do with actually improving our broadband infrastructure.

Ahem, welcome to the party.

And if you'd look beyond your suspicion and prejudicial rage against Connected Nation, you'd see that the organization's actual focus lies beyond the maps, too. The map is merely the first step in improving conditions on the ground. And those conditions will be vastly improved by enlisting the support and efforts of local leaders and residents and leveraging any and all willing assets, broadband platforms, and potential solutions, not by throwing out the baby with the bathwater as Connected Nation's critics so often seem willing to do.

How Fast?

So the de facto standard for broadband seems to be 3 megabits per second. The broadband stimulus will favor proposals (at least for now) coming from areas in which more than 50% of residents have under 3 Mb/s.

Many folks argue that in order to be competitive with better-wired nations, we need to set a higher national standard. Say, 20 Mb/s everywhere. Or 50. Or more. By this logic, the broadband stimulus needs to support improvements in bandwidth universally, not just showing a preference for un- and underserved areas.

I won't quibble with that argument. I think it's largely true that for too many years our federal policy failed to push speed. Just think: the FCC definition of broadband for most of Martin's tenure was 200 KILObits per second! Pretty easy benchmark to meet.

I agree that much should be done to promote higher speeds ubiquitously. But for now, we have literally, borrowing from Mattelart, a broadband archipelago: islands of pervasive, high-speed access and large oceans of nuthin'. This differentiation threatens to create (or entrench) a technological underclass that is not healthy for the nation as a whole. So stimulating broadband improvements in underserved areas is essential.

The question for those presently in the ocean, then, is what a tolerable level of service is. For some, the stimulus may mean that the capital exists to build a fiber network. For many (if not most) underserved areas, though, the stimulus will be sufficient for fixed wireless, which tends not to allow bandwidths as high as fiber.

So do we hold all communities, rural and urban, to the same standard? I think a national goal of 100 Mb/s is laudable (hell, I support it myself). But I'm enough of a pragmatist to realize that this standard is out of reach without a much larger federal investment than $7.2 billion.

As such, I contend that the excellent should not get in the way of the good. A national goal (say, 100 Mb/s) should not interfere with local build-out at some lower level. When local actors deliberate over their present needs, their perception of future requirements, and the extant logjams to creating service, the outcome is generally a realistic depiction of what standard should obtain for that locale. This is the strength of statewide planning efforts in places like Kentucky, Tennessee, Ohio, California and elsewhere: coalitions of local residents and leaders have considered (and continue to consider) what solutions are possible for them. Right now. This pragmatic approach should supersede any national standard (which is essentially arbitrary anyway) because it is based on a finer-grained depiction of local conditions.

In short, speed standards should be pushed by federal policy. But in the near term definitions of broadband should be an emergent property of a nationally-promoted, locally-conducted grassroots planning process.

Thursday, July 16, 2009

Muni Broadband In The Cold?

According to Ed Gubbins, the rules for obtaining funding through the Stimulus Package (the name pales in rhetorical comparison with "New Deal," doesn't it?) are leaving municipalities with little hope. Specifically, the NTIA's definition of "underservice" keeps most areas of any size out of the running for broadband dollars in the near term.

One group of broadband stimulus hopefuls that has been in large part swept out of the running by the specifics of the plan is individual municipalities of any size. Though the stimulus plan stoked broad interest from municipalities earlier this year, many of them have been frustrated by the program’s preference for “underserved areas,” which the government has defined as areas where at least half of all households lack broadband, where fewer than 40% of households subscribe to broadband, or where no service provider advertises broadband transmission speeds of at least 3 Mb/s.

Those rules sent the city of Northfield, Minnesota, for example, which had hoped to secure stimulus funds, back to the drawing board in its efforts to finance its plans. Melissa Reeder, Northfield’s information technology director, told the local press, “Honestly, I don’t think there’s a single Minnesota city that would qualify.”

I've written before in this space that the urban-rural divide doesn't make a lot of sense anymore. I'll acknowledge that the "unserved/underserved/served" trichotomy may create as many perverse incentives. The bottom line is that any one-size-fits-all federal definition or standard of service or need is likely to have a similar effect. This reality brings us back to the importance of local engagement and planning for broadband. If standards of service are the product of sober thinking by local actors, then local conditions and logics of feasibility can dictate what the ideal level of service is.
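For concreteness, the three-pronged "underserved" test quoted from the program rules can be sketched as a simple predicate. This is my own illustrative encoding, not official NTIA code, and the parameter names are invented.

```python
def is_underserved(pct_lacking_broadband: float,
                   pct_subscribing: float,
                   max_advertised_mbps: float) -> bool:
    """An area qualifies if ANY one of the three quoted criteria holds:
    at least half of households lack broadband, fewer than 40% subscribe,
    or no provider advertises at least 3 Mb/s."""
    return (pct_lacking_broadband >= 50.0
            or pct_subscribing < 40.0
            or max_advertised_mbps < 3.0)


# A city where 70% of households subscribe to a 10 Mb/s service
# fails all three prongs and does not qualify.
print(is_underserved(pct_lacking_broadband=30.0,
                     pct_subscribing=70.0,
                     max_advertised_mbps=10.0))  # -> False
```

On these criteria, a city with broadly available, broadly adopted service of modest speed can't qualify no matter how inadequate that service feels locally, which is exactly the bind Northfield describes.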

Tuesday, June 30, 2009

A Step in the Right Direction: USASpending Site

Haven't had much of a chance to poke around it yet, but the newly released USA Spending site is definitely worth a long look.

The purpose of the site, mandated by the Transparency Act:

To provide the public with information about how their tax dollars are spent. Citizens have a right and need to understand where tax dollars are spent. Collecting data about the various types of contracts, grants, loans, and other types of spending in our government will provide a broader picture of and much needed transparency to the Federal spending processes. The ability to look at contracts, grants, loans, and other types of spending across many agencies, in greater detail, is a key ingredient to building public trust in government and credibility in the professionals who use these agreements.

The Map Is Not the Territory

There's been quite a bit of acrimony in the recent debate over the FCC's soon-to-be-released rules for a national broadband map and plan. The stimulus package (or ARRA of 2009) fully funds provisions of the Broadband Data Improvement Act of 2008, which calls for a national strategy for improving broadband, an integral part of which rightly should be mapping where broadband is and isn't available (and with some reliable gauge of the type and speed of service).

As Art Brodsky of Public Knowledge rightly puts it, however:

It’s unfortunate that the issue of broadband mapping is taking up any time and energy, much less about $350 million in stimulus money. Discussion of mapping takes away from discussion of the real issue – deployment, and why large companies have to be begged to provide service to some areas while they go to court and to state legislatures to prevent others from filling the gap.

Brodsky is entirely correct. Mapping, while important, even crucial, to a sound broadband deployment strategy, is only a small fraction of the challenge. Mapping is relatively cheap. Done well, it benefits all parties (consumers and providers). And with knowledge of existing networks, sound strategies to fill gaps in service can be devised.

But the map is only the beginning. No broadband map can be 100% accurate (not even briefly). Providers do not gather spatial deployment data in a consistent way. Many small providers do not gather spatial data on their networks at all. Maps must be verified. And the best means of verification is through consumers themselves, not by trusting the validity of data providers cough up through any data gathering and compiling process.

A nationwide broadband map, then, is only as effective as the level of public involvement in scrutinizing and correcting it. In other words, the map, which is an artifact of a necessarily imperfect, incomplete, and continually changing data collection, is not the territory in which the public solicits broadband deployment and improvements in performance. Rather, the territory comprises those far-flung locales where various publics either are or are not served adequately to meet their needs.

This reality is what makes Brodsky's blithe dismissal of comprehensive planning processes aimed at engaging those local publics so striking. In condemning Connected Nation, an entity for which Brodsky reserves especial venom, he claims, "Connected Nation charges up to millions of dollars for mapping and, in some occasions, to organize local teams to assess demand." In the same piece, Brodsky marvels at how small a ratio of Connected Nation's budget is dedicated to mapping. (Full disclosure: I have had a working relationship with ConnectKentucky and, more recently, Connected Nation since 2004, consulting on the demand stimulation efforts.)

In unleashing his fury on the skewed nature of Connected Nation's mapping project budgets, however, Brodsky betrays his ignorance of the intricacies of broadband deployment. As is the case with all infrastructures, broadband is a socio-technical system. This means that supply and demand necessarily co-evolve along with regulatory institutions, legal frameworks, and even family life. Drawing a map of where broadband is available is essential. But doing so does nothing to mobilize interest at localities where it has never been available. Connected Nation's approach is two-pronged: mapping supply and working with providers to extend their networks while simultaneously working extensively with leaders at a local level to increase demand and adoption. In many instances, those local efforts have led to local and even regional broadband deployments when no provider was willing to step in and extend supply.

All these efforts are influenced by mapping and accuracy. Indeed, part of the process is to solicit extensive feedback from the public on the maps that Connected Nation produces. But mapping is only a fraction of the overall effort necessary to develop, extend, and improve broadband. Connected Nation's process demonstrates that they understand the complexities of the challenge. Brodsky's off-the-cuff dismissal of this approach is evidence that he doesn't have a clue.

Monday, June 29, 2009

Treading the Limn: Human, Non-Human, Agency?

The WaPo's Shankar Vedantam has the most interesting piece I've seen on the implications of the crash last week.

Money quote:

"The problem, said several experts who have studied such accidents, is that these investigations invariably focus our attention on discrete aspects of machine or human error, whereas the real problem often lies in the relationship between humans and their automated systems."

Wednesday, May 20, 2009

My Contribution to the Local Debate

So my letter to the editor got published in the local weekly.

I argue that commuter rail is a largely irrelevant part of contemporary commuting patterns but that that sad fact doesn't have to remain the case in the future.

Tuesday, May 12, 2009

Notes on Hughes: Rescuing Prometheus

Rescuing Prometheus, Thomas P. Hughes, Pantheon, New York (1998)

So I've already written a bit on this text. But I picked it up again. And I was particularly interested in reviewing two things:

1) Given my topic, I was pretty interested in looking at the final case study in the book (which deals with four different projects, arguing that the arc of the 20th century is toward the "postmodern" paradigm). In the final case study, Hughes provides a history of ARPANET, the DoD's network of networks that eventually became the Internet.

What's telling is that, even though Hughes stresses the postmodern shift toward the flat, the open, and consensus-reaching (etc.; it's a long list, and it appears on the last page of the book), the story nonetheless focuses on the system-builder perspective.

2) The wrap-up of the book (whose case studies are well worth attention) is brief. But it's the place to look for Hughes' welcoming of open, participatory planning into contemporary projects (uh, well, maybe welcoming isn't the right word for it...).

So a couple of the big distinctions between the modern (i.e., pre-WWII) firm and the PoMo:

a) The firm. Pre-WWII, big projects were associated with big, stable manufacturing firms; latter-day projects are conducted through joint ventures. (p. 301)

b) Pre-WWII: "The maintenance of a system for mass-producing standardized products" (p. 301). But standardization is no longer a top priority. Contemporary managers are open to change and heterogeneity.

c) Before, managers tended to view judgment and "problem-solving techniques" as the province of experienced masters. Today there is a constant need for revising, refreshing, and relearning the state of the art.

d) Typical pre-War firm a big, integrated, multiunit firm. Big management hierarchy. i.e., Fordism and Taylorism. Today: "The numerous contractors participating in projects like SAGE and Atlas are loosely coupled by information networks and by a coordinating and scheduling systems engineering organization" (p. 302)

e) The modern firm could not achieve its ends without first establishing a "managerial hierarchy." The hierarchy was itself "a source of power, continued growth, and permanence" (p. 302). The new model focuses on R&D, and thus the hierarchy gives way to the flexibility necessary to encourage innovation. "The resulting compromise, called 'black-boxing,' allows local research and development teams to choose the technology that will fulfill system specifications" (p. 303).

f) Past: value was placed on "highly trained" specialists from disciplines suited to the problem at hand; problems were viewed through those disciplinary lenses. Today interdisciplinarity reigns.

g) Old projects depended on "technical and economic factors" (p. 304): "such matters as the environment, political-interest group commitments, and public participatory design concerns lay beyond the horizons of 1950s systems engineers". Now projects "take into account the concerns of environmental and interest groups. Through public hearings, the project fosters participatory design. CA/T [viz, the Big Dig] is not an elegantly reductionist endeavor; it is messily complex embracing of contradictions" (p. 304)

Only after the organizers of CA/T made clear that they would take into account public concerns about neighborhood integrity and the environment was the project funded. CA/T has been socially constructed, not technologically and economically determined (p. 304).

h) ARPANET also suggests role of counterculture: preference for a flat management structure.

And now the list from the final page of the book:

Modern → Postmodern
production system → project
hierarchical/vertical → flat/layered/horizontal
specialization → interdisciplinarity
integration → coordination
rational order → messy complexity
standardization/homogeneity → heterogeneity
centralized control → distributed control
manufacturing firm → joint venture
experts → meritocracy
tightly coupled systems → networked systems
unchanging → continuous change
micromanagement → black-boxing
hierarchical decision-making → consensus-reaching
seamless web → network with nodes
tightly coupled → loosely coupled
programmed control → feedback control
bureaucratic structure → collegial community
Taylorism → systems engineering
mass production → batch production
maintenance → construction
incremental → discontinuous
closed → open

Friday, May 8, 2009

On the Utility of the Urban-Rural Distinction

We note without being prompted that we are in or out of an urban setting. Leveling the distinction between city and country is not normally a controversial thing. That said, as the distinction plays out as a matter of policy, the point at which one gives way to the other is crucial. At the margin, what defines a place as one or the other? I would argue that among the sine qua nons of urbanity or rurality is connection to infrastructure.

Elsewhere in this blog I have argued that retaining the distinction for the purposes of allocating federal stimulus dollars makes a certain sense. At the same time, channeling funding for broadband through the Rural Utilities Service both prejudices the definition of the broadband challenge and biases our approach to resolving it.

Yes, the present approaches to providing access where it doesn't exist and improving service and adoption where it does suggest a new understanding and definition of urban and rural. Namely, the service areas for high-speed wired broadband service can be taken as the outer fringe of what is urban and what will be urban in the near future. Areas where wired services don't exist (i.e., those areas where wireless broadband is the only solution on the horizon) can properly be understood as rural. Of course, there are shades of gray. Places where wired services exist along transportation corridors, but where service is spotty away from highways and pockets of population density, are likely already considered sub- and exurban.

So for those of us who are wasting time with labels, perhaps existing infrastructure can be our defining characteristic (since it likely correlates strongly with other attributes).

Tuesday, April 28, 2009

One To Watch: Google Antitrust Case

Keep your eyes on this one:

Justice Dept. Opens Antitrust Inquiry Into Google Books Deal (By Miguel Helft, NYTimes)

The skinny:

The settlement agreement stems from a class action filed in 2005 by the Author’s Guild and the Association of American Publishers against Google. The suit claimed that Google’s practice of scanning copyrighted books from libraries for use in its Book Search service was a violation of copyrights.

The settlement, which was announced in October, gives Google the rights to display the books online and to profit from them by selling access to individual text and selling subscriptions to its entire digital collections to libraries and other institutions. Revenues would be shared between Google, authors and publishers.

But critics say that Google alone will have a license over millions of so-called “orphan books,” whose authors and right holders are unknown or cannot be found. Some experts believe the orphan works account for the bulk of the collections of some of the major university libraries, that have allowed Google to scan books.

Some librarians fear that with no competition, Google will be free to raise prices. Some scholars have also said that the system for pricing books could raise antitrust concerns.

Thursday, April 16, 2009

On Flyvbjerg and Phronēsis

Some notes on Bent Flyvbjerg's Making Social Science Matter: Why Social Inquiry Fails and What Can Make It Succeed Again (Cambridge University Press, 2001)

Flyvbjerg begins with the fundamental problem of the social sciences vs. the natural sciences. I.e., is a true science of society possible given the variability and slipperiness of the human condition? Of course, this fundamental question goes way, way back. Flyvbjerg himself reminds us that this basic question plagues the social sciences from their origin. For example, Weber recognized the distinction between instrumental rationality (Zweckrationalität) and value rationality (Wertrationalität), a distinction that Foucault and Habermas in different ways recognized as central to their own programs.

Consider, for example, the high-modernist apotheosis of urban renewal projects. The engineering model at the core of this effort recognized a host of pathologies endemic to cities (crime, poverty, disease, etc.). Since these problems seemed to reside at a large scale, the solution during the 50s and 60s was demolition of large swaths of inner cities without regard to the preferences of the individual dwellers of those places. In other words, instrumental rationality viewed urban decay as a problem and presented demolition and reconstruction as the proper solution. But this perspective failed to consider the particular circumstances of the denizens whose homes and lives were turned upside down by the process. Planners point to this period as a moment of reckoning, an occasion to reflect on, if not reject wholesale, their approach. Rather than proffering solutions from the outside, planners shifted to accommodate the perspectives of those whose lives might be touched by such projects (i.e., to accommodate value rationality).

But, as Flyvbjerg describes, this recognition of the limits of human knowledge is nothing new. Indeed, Aristotle understood that scientific knowledge (episteme) is bracketed by specific contexts; there are certain domains where this sort of understanding (and action based on it) do not obtain.

"Episteme thus concerns universals and the production of knowledge" (p. 56)

"Whereas episteme resembles our modern scientific project, techne and phronesis denote two contrasting roles of intellectual work." (p. 57)

"Episteme    Scientific knowledge. Universal, invariable, context-independent. Based on general analytical rationality. The original concept is known today from the terms "epistemology" and "epistemic"

Techne        Craft/art. Pragmatic, variable, context-dependent. Oriented toward production. Based on practical instrumental rationality governed by a conscious goal. The original concept appears today in terms such as "technique," "technical," and "technology."

Phronesis    Ethics. Deliberation about values with reference to praxis. Pragmatic, variable, context-dependent. Oriented toward action. Based on practical value-rationality. The original concept has no analogous contemporary term." (p. 57)

Flyvbjerg explains episteme as "know why" and techne as "know how," but doesn't proffer a similar schtick for phronesis. I would suggest that it is "know when". It's also important to note that phronesis is embodied (i.e., it doesn't exist without the phronimos, or the person of practical knowledge). Phronesis is thus best thought of as the judgment about what can be done given specific circumstances rather than what is physically possible. Perhaps it is best understood as the skill of knowing what is feasible rather than possible. As an example, when is the best time to call for a vote on a bill?

Phronesis "focuses on what is variable, on that which cannot be encapsulated by universal rules, on specific cases….requires an interaction between the general and the concrete; it requires consideration, judgment, and choice." (p. 57).

Flyvbjerg's basic argument, similar to that made by other previous advocates of phronesis, is that the social sciences encounter questions such as "What should be done?" and "What is desirable?" and perhaps "Who gains and who loses?". There is no universal, scientific answer to such questions, because they are rooted in particularity both temporally and spatially.

Most importantly, Flyvbjerg offers a set of "Methodological guidelines for a reformed social science" in Chapter 9:

  1. Focusing on values (p. 130-1): three questions animate the effort at steering toward value- rather than instrumental-rationality. (Where are we going? Is it desirable? What should be done?).
  2. Placing power at the core of analysis (p. 131-2): Not just "who governs?" but what "governmental rationalities" are at work among those who govern? Power is productive and positive (even though it can be restrictive and negative).
  3. Getting close to reality (p. 132-3): "Phronetic researchers seek to transcend this problem of relevance by anchoring their research in the context studied and thereby ensuring a hermeneutic "fusion of horizons". This means being close to the ground (the group or phenomenon) at all stages of research.
  4. Emphasizing little things (p. 133-134): focus on minutiae, work phenomenologically. "thick description".
  5. Looking at practice before discourse (p. 134-5): focuses on practical activity and practical knowledge in everyday situations.
  6. Studying cases and contexts (p. 135-6): cases exist in context. And the essence of those particularities is only possible in attending to those cases.
  7. Asking "How?" Doing narrative (p. 136-7): quotes MacIntyre, "I can only answer the question, "What am I to do?" if I can answer the prior question "Of what story or stories do I find myself a part?"
  8. Joining agency and structure (p. 137-8):
  9. Dialoguing with a polyphony of voices (p. 139-140):

Wednesday, April 15, 2009

And Now For a Sample of What is To Come...

So I've written before about the rather clunky (if not anachronistic) set of agencies that will be driving the Broadband Stimulus $$. My claim has been that this initiative's relative slowness off the blocks is attributable at least in part to the unwieldy set of agency interactions that the program calls for. Take, as a mere example, the ambiguity at the core of the Broadband Stimulus: is this an effort to address a rural problem or an "underserved" problem? The two are not one and the same, especially as our definition of what meets a basic level of service evolves.

Now we see that this fundamental problem of definition exists within a single agency. A report released on Monday (4/13/2009) by the Department of Agriculture's inspector general finds that the Rural Utilities Service has been making too many loans in non-rural areas (tsk! tsk!):

In 2007, Congress requested that we determine if RUS had taken sufficient corrective actions in response to the issues disclosed in our report. In particular, members of the Appropriations Committee expressed concerns that RUS, “instead of focusing on rural areas that have no broadband service,” continues, “to grant loans to areas where broadband service is already being offered by private providers. Such practices penalize private providers that have already built broadband systems in the area. Such practices also do nothing to further the goal of bringing broadband to unserved areas.”

Take note of the distinction between rural/urban and served/unserved. And ask yourself whether a 200 kbps standard is sufficient to claim that service is being provided. And while you’re at it, ask yourself whether having a single location in a given zip code operate at this anemic standard is sufficient to claim that service is being provided. In both cases, the FCC claims it is. Hopefully, in the coming months (or years), we’ll have better data on present and planned deployments and federal policy pushing a higher standard. But not yet.

Back to the report: since Congress raised this concern, the inspector general’s office issued a report to RUS, outlining several steps the agency should take. Alas, the IG report

found that RUS has not fully implemented corrective action in response to 8 of the 14 recommendations from our September 2005 audit report.

And the real kicker:

We remain concerned with RUS’ current direction of the Broadband program, particularly as they receive greater funding under the American Recovery and Reinvestment Act of 2009 (Recovery Act), including its provisions for transparency and accountability.

In my estimation, the essence of the broadband challenge isn't in some arcane internecine pissing match over how we're applying the definition of urban and rural. The problem is one of standards of service (where service exists) and providing service where it doesn't.

Friday, April 3, 2009

Are the Bigs Taking the Cash?

So Ryan Singel has an interesting perspective on recent speculation as to whether or not AT&T and Verizon will apply for broadband stimulus cash.

What's really at stake here are definitions: what kind of service will the government define as 'broadband,' what counts as an 'open' network, and what areas are 'underserved' or 'unserved.'

I couldn't agree more.

Yes, defining and operationalizing these terms is precisely what is at stake in coming to the table (or not) for the Bigs. And as I've argued, they are precisely the matters that are unresolved in the present NTIA/RUS/FCC plan for investing the $7.2 billion of stimulus cash. What will happen? I'm still waiting to see.

Wednesday, March 25, 2009

The Grid

One more thing about Scott (Seeing Like a State).

He picks up on something I've often mused about: the American gridded landscape.

I've often mused that somehow dwelling within the objective correlative of modernist rage (i.e., living in a Cartesian coordinate system of streets and other infrastructures) is bound to have some sort of cognitive if not metaphysical impact (if theories of behavior and environment are to be believed at all).

Scott is more interested in its origin as an example of high-modernist excess: The grid creates a "God's-eye view", stressing the Enlightenment bias toward formalized order. In the Cartesian ideal, no local knowledge is necessary to navigate a grid.

Scott, Seeing Like a State (Notes, Reactions)

FULL CITATION:

James Scott (1998). Seeing Like a State: How Certain Schemes to Improve the Human Condition Have Failed. New Haven: Yale University Press. ISBN 0300070160.

FWIW: Brad DeLong has a far better review than I could write here (even if his critique focuses on Hayek instead of Heidegger):

http://econ161.berkeley.edu/Econ_Articles/Reviews/Seeing_Like_a_State.html

This book has been around for a while, and everybody still seems to have something to say about it. I was a bit surprised by how much I liked it. Perhaps it’s because, as one deeply influenced by Arendt, which is to say as one deeply affected by Heidegger, I was quite interested and surprised to see someone build a theory around thinking an alternative to technē, which may mean an alternative to technological thinking of a certain sort altogether. So Scott attempts to create an alternative, too.

A bit of history: recall that those in the room in Marburg in the winter of 1924-25, when Heidegger delivered his seminal lectures on Plato’s Sophist, were a Who’s Who of 20th Century intellectuals (to name a few: Arendt, Gadamer, Lowith, and Strauss). The lectures are noteworthy in that these students tended to become rather committed Platonists or Aristotelians (and ne’er the twain shall meet…). They are also noteworthy in that, though ostensibly on Plato, they contained a vital excursus on Aristotle’s Nicomachean Ethics, focusing particularly on Book VI. It is in this section of that book, which deals in great detail with human virtues, that Aristotle explains that the chief intellectual virtue (which is to say, the most virtuous of virtues) is phronēsis, translated into English by way of Latin as prudence but perhaps better understood as practical wisdom. For those familiar with the Serenity Prayer, phronēsis might be thought of as “the wisdom to know the difference”.

In a nutshell, phronēsis is the sort of wisdom that comes from experience, the capacity to make good judgments in a pinch (i.e., precisely the capacity one would seek in leaders). Phronēsis, then, is taken by many to be the essential trait of democratic citizens (one hopes that wisdom guides the decisions of voters, jury peers, and elected officials). And phronēsis should always be understood in contradistinction to its practical counterpart, technē, which is technological knowledge (or, in rough terms, a kind of knowing and acting that bridges the gap between theoretical understanding and production, i.e., instrumental rationality). In short, the chief problem confronting modernity, from Heidegger’s perspective, was overcoming the predominance of technē. Why? Because human freedom is impinged when discretionary judgment (i.e., phronēsis) is eclipsed by the instrumental rationality of technē.

Why this excursus in a review of Scott’s book, you might fairly inquire? Well, when you get to the meat of Scott’s alternative, it will make some sense. So I offer some pithy quotations interspersed with comment. I’ll return to the foregoing treatment of technē in a moment.

SEEING LIKE JAMES SCOTT:

The modern gambit, as Scott describes, is the impressive march of modern science and technology. Yet those techniques, when applied to the domain of human affairs, create problems. Scott opens with the quintessential example, 19th Century German forestry. As the Germans attempted to maximize yield using techniques that leveraged the power of emergent science, they created large, mono- and duo-crop swaths of conifer forests with tremendous yield for a few generations. But focusing on only a couple crops while “cleaning” out the complications of underbrush to reduce fire damage and ease access for lumberjacks led to rapid exhaustion of the forests (from lack of biodiversity). The myopic focus on productivity, steered by bureaucratic scientists far from the actual field, lasered in on one aspect of the forest and optimized it with impressive near-term results, only to discover that forests are complex systems, each element of which sustains itself on the basis of symbioses and intricate interrelationships. No science can reveal and comprehend these intricacies in their entirety. Yet scientific forestry management attempted to do just that.

Thus Scott reveals his main point: several features of modernity combine to create massive failures of hubris akin to German forestry (Scott focuses on the economic, social, and natural devastation that occurred in the wake of statist interventions such as Stalin’s collectivization). Modern tragedies are the result of four things: a) administrative ordering of society; b) high modernist ideology; c) authoritarian state apparatuses; d) a “prostrate civil society”.

At first blush, Scott’s etiology of these tragedies is not unlike that provided by the Viennese School, especially Hayek. That is to say, the modern state’s myopia is the problem for Hayek, too. The solution for Hayek, of course, is to do away with central governments and allow the unfettered market to operate according to its own logic. As we will see below, Scott’s solution is to call for an alternative mode of framing the modern gambit (not unlike Heidegger’s effort above). Of course, this appeal comes across as a rather empty formalism (“oh wow, look what I read in Plato!...”). But I’m getting ahead of myself.

SCOTT’S ASSESSMENT: HIGH MODERNISM TENDS TOWARD MYOPIA

Scott opposes the “imperialism of high-modernist, planned social order” (p. 6). In framing the problem in this way, Scott taps a rich legacy of thinkers who were suspicious of the “double-bind” of the Enlightenment: on one hand, modernity appears to be liberatory and progressive, yet it contains certain tendencies toward authoritarianism. Thinkers diverge on how exactly the modern condition (or postmodern one, whatever) constrains human freedom. For example, Strauss (again, Heidegger’s student) believed that a fundamental trap of modernity was the tension between liberty and equality (which de Tocqueville made note of). Liberal democratic institutions are, as Plato and Aristotle both noted, subject to the sway of demagogues and tyrants. Horkheimer and Adorno, exiled in Hollywood from Frankfurt, symptomatized the culture industry among other ideological organs of capitalism that limit human freedom. Arendt, following Heidegger, argued that the predominance of instrumental rationality limits the need for human judgment (and thereby human agency). What all these perspectives share is a depiction of a rear-guard action of a thinking public against the encroachment (Habermas calls it “colonization”) of those organs of instrumental rationality. Scott, in this sense, continues a rhetorical tradition that stresses the dire circumstances into which we moderns find ourselves thrown.

p. 26: states seek to render particularity legible, which requires disregarding local, historically situated knowledge in favor of universals.

The basic thrust of Scott’s particular adaptation of this skeptical trope is against the imperial inclinations of centralized authority. In their efforts to account for the vastness of their domains, states are forced constantly to simplify and generalize. This is quite similar in thrust and tenor to the critique raised by the Vienna School (i.e., Hayek) and carried into the neoclassical tradition (i.e., Friedman): that the modern state is the primary cause of these ills. To be sure, Scott has a rogue’s gallery of villains, of which the state is one. He himself claims,

“Put bluntly, my bill of particulars against [the high-modernist centrally-planning social-engineering] state is by no means a case for politically unfettered market coordination as urged by Friedrich Hayek or Milton Friedman.” (p. 8)

But since his book casts particular blame on the state, the question remains of how Scott differentiates himself from the Vienna School. As DeLong puts it:

“Yet even as he makes his central points, Scott appears unable to make contact with his intellectual roots--thus he is unable to draw on pieces of the Austrian argument as it has been developed over the past seventy years. Just as seeing like a state means that you cannot see the local details of what is going on, so seeing like James Scott seems to me that you cannot see your intellectual predecessors.”

p. 30: Many things mark the increasing relevance of universality (CK’s observation: certainly reading from Kant forward, self-conscious reflection on universalism is a clarion of modernity). But several trends suggest a step toward the universal and abstract as opposed to the idiosyncratic and particular: a) expanding markets lead to increasing needs for standards, commonalities, consistencies….next stop is codes; b) popular sentiment, argues Scott, is a catalyst (in the sense that the growing centrality of rights to modern states leads to an increase in what Hegel called “abstract right” (again, my reference, not Scott’s)); c) the French Revolution made this general march of universalism ubiquitous.

p. 36: Modern states “aspire to measure, codify, and simplify”; systems of measurement and standards overcome the “Babel of measurement”

p. 39: cadastral map is the crowning achievement of modern states; provides “synoptic view of the state and supralocal market in land”

p. 44: “value of the cadastral map to the state lies in its abstraction and universality”

p. 45: “designed to make the local situation legible to an outsider”

p. 46: perhaps most importantly to seeing like a state: the cadastral map “freezes social phenomena—more static, more schematic than reality”

Okay, so I think we get it. The modern state is a perpetrator of metaphysical as well as physical violence. It attempts (poorly) to shove the circular peg of local knowledge into the square hole of abstract reason. And this constrains freedom and results in large-scale tragedy and atrocity.

DeLong criticizes Scott for not being sufficiently attentive to his Viennese roots. His point is that Hayek and others have argued from the liberal economic position that state intervention inevitably causes more ills than it resolves. Hayek argued famously against planning (reacting specifically to planned economies in particular), claiming that central planners lack the perspective to see individuals, local networks, and specific contexts sufficiently to take decision-making power away from local actors. As DeLong quotes Hayek:

“...the fact that knowledge of the circumstances of which we must make use never exists in concentrated or integrated form, but solely as the dispersed bits of incomplete and frequently contradictory knowledge which all the separate individuals possess....” from “The Use of Knowledge in Society”

Although DeLong rightly criticizes Scott for being inattentive to his Viennese roots, DeLong himself pays little attention to the intellectual pedigree of mētis, which is Scott’s effort at redeeming the present.

SCOTT’S WAY OUT: REHABILITATING LOCAL WISDOM

Against this evil, Scott proposes mētis: “denotes the knowledge that can only come from personal experience” (pp. 6-7)

p. 7: “I am making a case for the resilience of both social and natural diversity and a strong case for the limits, in principle, of what we are likely to know about complex, functioning order”

p. 346: the problem with high-modernist schemes: “little confidence they repose in skills, intelligence and experience of ordinary people”

p. 351: “…all socially engineered systems are formed and are in fact subsystems of a larger system on which they are dependent not to say parasitic” (CK: might we call this the lifeworld?)

p. 352: big problems occur: “fairly simple interventions into enormously complex natural and social systems”

p. 357: “Common law, as an institution, owes its longevity to the fact that it is not a codification of legal rules, but rather a set of procedures for continually adapting some broad principles to novel circumstances” (CK: sounds a lot like phronēsis)

p. 311: “formal order, to be explicit, is always and to some considerable degree, parasitic on informal processes, which the formal scheme does not recognize”

p. 313: Scott explains mētis: “indigenous technical knowledge,” “folk wisdom,” “practical skill”

p. 316: mētis is knowing when and how to apply rules of thumb in concrete, specific situations. Example from Oakeshott: a ship pilot knows general rules of sailing (and has often actually authored many of those rules by dint of experience), but always applies those rules in particular circumstances. Twain’s Life on the Mississippi embodies this principle.

This is not technē, which has to do with rules of thumb and how they’re created; mētis consists necessarily in their application.

p. 323: Scott recounts a story of Squanto in his encounter with the first English settlers. When asked when it was safe to plant corn, the advice from a local expert was to plant when the silver maple leaves began to emerge. In other words, the date when it was safe from frost was not determined by an almanac (an abstraction from the way days are lived). Rather, the advice given allows for all sorts of complexity. In different microclimates, the safe date might vary. But generally the maple tree will tell you when it thinks winter is over.

Monday, March 16, 2009

Actor Networks and Broadband Standards: An Annotated Bibliography

Okay, so here are a few pieces that will end up in the final draft of the literature review:

Tilson, D., & Lyytinen, K. (2005). "Making Broadband Wireless Services: An Actor-Network Study of the US Wireless Industry Standard Adoption," Case Western Reserve University, USA. Sprouts: Working Papers on Information Systems, 5(21). http://sprouts.aisnet.org/5-21

NOTE: This is a classic ANT paper: "We adopt actor-network theory to examine how technical and human actors interact to reach agreement on the creation and adoption of wireless services and standards. We present a model in which actors formulate standardization strategies based on their perceptions of existing and future actor-network configurations in light of their interests..." The authors use the evolution of the 3G standard as the case to launch their theoretical claims. They find that the 3G standard was an occasion in which operators (users, I gather) were highly influential.

And here's another:

Yoo, Youngjin, Lyytinen, Kalle, and Yang, Heedong (2005). "The role of standards in innovation and diffusion of broadband mobile services: The case of South Korea," The Journal of Strategic Information Systems, 14(3), 323-353.

Similar to the above:

"We explore the evolution of the mobile infrastructure in South Korea through the lens of actor network theory. In particular, we analyze the roles of standards in promoting, enabling and constraining innovation in broadband mobile services over a 10-year period....Our study suggests that successful innovation and diffusion of broadband mobile services are collective achievements and firms need to deploy strategies that enable them to mobilize broad socio-technical networks that include technological, institutional, political and financial resources. At the heart of such strategies, standards play critical roles as they mediate different interests and motivations among participating actors."

This one might be helpful; it's an early effort at understanding rural actor-networks:

Murdoch, Jonathan (2000). "Networks — a new paradigm of rural development?," Journal of Rural Studies, 16(4), October 2000, Pages 407-419.

Abstract: The network concept has become widely utilised in socioeconomic studies of economic life. Following the debates around exogenous and endogenous development, networks may also have particular utility in understanding diverse forms of rural development. This paper assesses whether networks provide a new paradigm of rural development. It seeks to capture a series of differing perspectives on economic networks — including political economy, actor-network theory and theories of innovation and learning — and attempts to show how these perspectives might be applied to different types of rural areas. The paper demarcates two main “bundles” of networks: “vertical” networks — that is, networks that link rural spaces into the agro-food sector — and “horizontal” networks — that is, distributed network forms that link rural spaces into more general and non-agricultural processes of economic change. It is argued that rural development strategies must take heed of network forms in both domains and that rural policy should be recast in network terms.

Friday, March 13, 2009

Stimulus

So here's some grumbling from Joseph Upton:

The government just seems to be paralyzed and they are paralyzing the service providers, who need to drive on with business one way or another...Nothing has been decided of any consequence, just that more talk needs to happen and the public needs to comment on how to divvy up the money.

Quite true, quite true, indeed. Upton says give it up to the bigs:

So, kudos to the big players for keeping America moving. The RLECs need to take note and follow suit, and let the free money chips fall where they fall, so that the networks can begin to move again

So the present problem is indecision. Government is in the way. Move over and let the incumbents take care of the problem. Hmmm. Is that line of thinking particularly new?

And let's consider, shall we, what the FCC, RUS, and NTIA have to make decisions about. Well, for starters, there's, uh, EVERYTHING!!! And these decisions have incredibly important implications. So, hey, if y'all want to make sure we're all on the same page and get it right, take a couple weeks.

As an example, a common question is how, precisely, we will define "underserved". To date, we've been working with FCC data that provided precious little direction. The FCC gathered broadband availability data using a standard of 200 kilobits per second: if you have a connection at that speed (okay, or greater), then you have broadband. And if one person in a zip code can connect to broadband, then, by the FCC definition, that entire zip code, however big, was said to have broadband access.
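To see how much that zip-code rule can overstate coverage, here is a minimal sketch with invented numbers (the zip codes and household counts below are hypothetical, not FCC data):

```python
# Hypothetical zip codes: total households, and households that can
# actually get a connection at 200 kbps or better.
zips = {
    "41311": {"households": 4000, "served_households": 1},
    "41701": {"households": 12000, "served_households": 9000},
    "40502": {"households": 15000, "served_households": 14500},
}

# FCC-style measure: a zip counts as "served" if ANY household is served.
fcc_served_zips = sum(1 for z in zips.values() if z["served_households"] > 0)
fcc_coverage = fcc_served_zips / len(zips)

# Household-level measure: the fraction of households actually served.
total_households = sum(z["households"] for z in zips.values())
served_households = sum(z["served_households"] for z in zips.values())
household_coverage = served_households / total_households

print(f"Zip-level coverage:       {fcc_coverage:.0%}")        # 100%
print(f"Household-level coverage: {household_coverage:.0%}")  # 76%
```

The single served household in the first zip is enough to flip the whole zip to "served," so the zip-level figure reads 100% while three quarters of households are actually covered.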

Now the FCC has made the point repeatedly that to set concrete bandwidth benchmarks for a moving target is to constantly reify and then hypostatize numbers that are essentially arbitrary (in so many words...).

So, sure, what's in a number? But how about a goal that at least inspires a little imagination?

But perhaps, if we made a policy of collecting better availability data (which in fact the ARRA does), we could develop more nuanced metrics of "underserved" that vary by local condition and circumstance. In the near term, for example, we will not blanket the US with fiber. That means that denser areas will, for the foreseeable future, have higher bandwidths than rural areas. So "underserved" in Brooklyn may not mean exactly the same thing as it does in Nome.
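One way such a density-varying metric might work is a tiered benchmark. This is purely my own illustration; the density cutoffs and speed floors below are invented, not anything the FCC, NTIA, or RUS has proposed:

```python
def underserved(density_per_sq_mile: float, typical_kbps: float) -> bool:
    """Hypothetical test: an area is 'underserved' if its typical speed
    falls below a benchmark that scales with population density."""
    if density_per_sq_mile >= 10000:   # dense urban (think Brooklyn)
        benchmark_kbps = 10000         # expect ~10 Mbps wired service
    elif density_per_sq_mile >= 1000:  # suburban
        benchmark_kbps = 3000
    else:                              # rural (think outside Nome)
        benchmark_kbps = 768           # a modest wireless/DSL floor
    return typical_kbps < benchmark_kbps

print(underserved(25000, 5000))  # True: 5 Mbps falls short of the urban bar
print(underserved(5, 1500))      # False: 1.5 Mbps clears the rural floor
```

The point isn't these particular numbers; it's that a single nationwide threshold treats Brooklyn and Nome identically, while a tiered one at least acknowledges that the near-term infrastructure frontier differs by place.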

Of course, defining these terms is highly political. There is a lot at stake in whether we redefine "broadband" so that we have a tangible policy goal for all this cash. And this instance is merely one incredibly thorny issue that is being, by Upton's reckoning, talked about ad nauseum.

I say let's move. But let's move prudently.

Thursday, March 12, 2009

Girding for the Last War

One Last Comment on the 3/10 Hearing:

We're using the army that fought the last war. In many respects, the last time the federal government was used to stimulate demand for a networked infrastructure was 70 years ago, during the Great Depression, with the creation of the Rural Electrification Administration and the Tennessee Valley Authority. In other words, the federal agencies that are taking the lead on broadband deployment policy are themselves artifacts of political and policy dynamics with long pedigrees.

At a public hearing on 10 March 2009 regarding the broadband infrastructure allocations occasioned by the enactment of the American Recovery and Reinvestment Act (ARRA), officials played their cards quite close to their chests. The three agencies involved in distributing over $7 billion toward increasing broadband access and adoption, themselves haunted by the ghosts of funding cycles past, were reticent about specifics. Rather, they called eagerly upon "traditional and non-traditional stakeholders" in an effort "to ignore no sector of our national life".

I argue that this focus on soliciting public input is a helpful thing. After all, who are the agencies involved? First there is the National Telecommunications and Information Administration (NTIA) within the US Department of Commerce. NTIA didn't come into being until 1978, when Commerce's Office of Telecommunications absorbed the White House Office of Telecommunications Policy. The streamlining probably made sense, but it does give one a sense of the relative newness of the agency. Now NTIA is taking on a big task: dispensing the largest chunk of the broadband dollars through the Broadband Technology Opportunities Program (BTOP), a program that has yet to be created, which will dispense grant dollars based on criteria that are yet to be determined in pursuit of goals that are, in many respects, still under development (but we're hoping for jobs, right?).

The Rural Utilities Service (RUS), within the US Department of Agriculture, which will dispense over $2 billion in grants and loans under similarly unspecified criteria, is itself a legacy of the Depression. RUS got its start as the Rural Electrification Administration.

So we are being led out of the present malaise by an antique and in many ways obsolete federal structure. RUS is in a strong position to administer funds, of course: it has been funding rural projects for decades. There are many reasons why it makes sense to funnel money through a proven conduit.

But then again, what we see at play in the broadband portion of the recovery package is the effect of fighting with the last war's army. By dint of attempting to address the broadband challenge as one with a rural and an urban component, RUS is enacting what my good buddy Cory Knobel calls "ontic occlusion". The existing bureaucratic structures, since they are overdetermined by the past, are overdetermining what will happen next. It is ever thus, of course, but here we may be watching it in action.

That said, one legacy of REA that is being reawakened is its focus on grassroots planning and implementation. More on that later...

Wednesday, March 11, 2009

Broadband Hearing a Disappointment?

So Business Week was less than thrilled with yesterday's hearing, I hear:

At the first public discussion of the Obama Administration's much heralded broadband plan, government officials offered virtually no hard answers to the hundreds of people who gathered in person and the 2,500 more who participated via live Web video. For almost every substantive question about how the billions will be allocated, officials said they're looking for guidance from the public. Bernadette McGuire-Rivera, NTIA associate administrator, said the government is seeking input on "nearly every facet of the program."

I'd agree that the officials assembled provided very little specific guidance (or even guidelines), but I was hardly surprised. The event went like this:

Gov't Official: We've got money. Here's our timeline for disbursing it. Tell us how we should do that.

Earnest-looking Would-be Applicant: Do you have a preference for multi-jurisdictional applications?

GO: You tell us. We're looking for input on that. And however many applicants you band together, consider asking for $$ from more than one agency; that's what we're looking for: collaborative grant awarding.

Another EWA: How about platforms? Is there a preference for wifi? Fiber? Fibre? DSL?

GO: Yes, tell us about your preferences.

Another nuther EWA: What about this urban-rural thing? What do suburban providers like me do?

NTIA GO: Let's let the USDA handle that one.

USDA GO: Yes, we handle rural stuff. But urban agriculture is all the rage these days, so game on! Tell us where to draw the line.

And so on.

I'm not surprised in the least that Business Week saw it as a disappointment. But the first step is figuring out how to change the federal approach to this problem. The funding and oversight mechanisms that exist are probably not suited to the task, so those agencies must first learn how to work across their stovepipes. And they communicated an interest in doing just that. I found it encouraging...

Tuesday, March 10, 2009

My summary

I am often asked by urban planners why I'm looking at a rural phenomenon (rural infrastructuration). I'm not, really, of course. I'm interested in how communities (irrespective of space and/or place) plan for broadband expansion. For the most part, the funding mechanisms for these sorts of public planning projects have addressed the rural aspects of the broadband challenge, because RUS has gotten most of the $$.

Without question, what I'm hearing from the interlocutors in today's meeting is a genuine interest in redefining how the federal government takes on the broadband challenge. A big funding mechanism is pointed at rural America, but the need is by no means exclusively a rural one.

More Broadband Stimulus Live Blogging

Great questions from participants, focusing on redefining and allocating spectrum, allowing for in-kind contributions from local and state governments, and platform neutrality.

Most of the questions are geared toward criteria. The answers typically skirt specifics, stressing instead that comments and suggestions about criteria are welcome. That is, it appears no one has nailed anything down yet at all.

How will we determine effectiveness? Everyone is soliciting input on developing metrics. For example, a stated goal is "innovativeness". But how do we measure that? As Seifert quipped, "Are you three times more innovative than me?"

Question focusing squarely on the urban-rural divide. A fellow from right here in MontCo asked what a suburban wireless provider should do, given that a disproportionate amount of the broadband $$ is going to rural areas. Deutchman used this as a plug for the FCC's mapping project. In the future, funding allocations will hinge on definitions of underserved vs. unserved (aha, more contested terrain).

More on the Broadband Stimulus

All the interlocutors in the discussion portion are stressing the ways in which the governmental structures that will implement the broadband portions of the stimulus are themselves still taking shape.

That is to say, Seifert keeps stressing that the nature of the collaboration among the FCC, the USDA's Rural Utilities Service (RUS), and the Commerce Department's National Telecommunications and Information Administration (NTIA) is up for grabs. Part of the infrastructuring that is occurring centers on the presently inadequate federal structures. RUS, the FCC, and NTIA are funding mechanisms whose missions were suited to the times in which they originated. Broadband is throwing certain hypostasized and often obsolete structures of these previous epochs into stark relief.

It'll be interesting, in other words, to watch and see what, if anything, changes about the governance of broadband. Right now, I'd say follow the money. NTIA got the biggest piece of the pie and is steering most of the stimulus dollars (which makes it a hopeful signal that Seifert is stressing the need for collaborative grants).

But is USDA really where broadband should be driven in the 21st Century?

What I Think of the Broadband Stimulus So Far

Here's what I think makes a lot of sense:

a) NTIA and RUS are obvious funding mechanisms for the investments being made. While the majority of the funds being released are heading to rural American infrastructure development via RUS (more on that in a moment), there seems to be good reason for this: RUS has been developing actual, physical networks (power, phones, now broadband) for quite some time. It has institutional capacity.

b) Seifert makes an interesting point: as the ARRA stipulates, NTIA's mandate is to address questions of "underserved and unserved", while RUS's is to deal with "urban and rural, focusing on the rural". How these distinctions are negotiated (if not politicked) will, in my estimation, be the crux of the success (or otherwise) of this broadband program.

Live Blogging the Public Mtg on BBand Initiatives

You can watch, too!

Meeting will be (or was) webcast.

The presentation began with a welcome from Anna Gomez, acting Administrator of NTIA.

Next, Tom Vilsack, Sec of Ag: the private sector and all levels of government should work together to find new models for implementation. "It's fair to say that we are not as far along as we need to be."

Key stakes: competitiveness. Creating a platform to make the US competitive. "Very important technology that every American needs to have access to".

Next, Michael Copps, acting head of the FCC: "at long last a proactive broadband buildup for our country"

Obama feels extending broadband to the four corners of this country is key to its future. For the past seven years, the FCC has received reassurances regarding the pace of telecommunications development. But as recently as last week, the US received news that it continues to fall behind.

"Years of broadband drift and growing digital divides are coming to an end"
"Broadband is the central infrastructure challenge of our time" Then an excursus on previous epochs "eras of private enterprise supported by progressive public policy".
"We lost precious time."

FCC has important role to play. "On April 8, FCC will kick off an open, participatory, public process" to deliver a national broadband strategy within the next year. "Will seek out a range of traditional and non-traditional stakeholders to be heard."

And then, Rick Wade, Senior Advisor to NTIA:

Goals a) Extend broadband across US: spread "pipes" closer to need, allow private sector to serve public via these.
b) Jobs
c) Connect community anchor institutions (libraries, schools, health care centers, etc)
d) Stimulate demand

Develop proposals for funding across sectors, regions, and communities. "Are working to ensure that broadband capacities and needs of local communities are known"

Broadband Internet technology will create jobs both in the near and long term.

Next, programmatic stuff.

a) Dr. Bernadette McGuire-Rivera, Associate Administrator, NTIA
Up to $350 million for broadband mapping and planning.
Up to $200 million for demand stimulation ("sustainable broadband planning").
Just about anyone who meets the criteria can apply (i.e., all levels of gov't, the private sector, non-profits, etc.).

b) David Villano, Assistant Administrator for Telecommunications Programs, RUS
RUS has over $2 billion in budget authority, meaning it can be deployed as grants or loans; it will thus attempt to use a large portion to leverage additional funds.
The purpose of RUS throughout its history has been to spur economic activity and development.
It focuses on rural populations.
RUS, part of USDA Rural Development, is well-equipped and experienced for this sort of budget allocation.

c) Scott M. Deutchman, Acting Senior Legal Advisor to Acting Chairman Copps, FCC

d) Mark Seifert, Senior Advisor to NTIA, led the roundtable. He appealed for rapid input on what should determine the "best" proposals, what the program should focus on, etc. He also asked for recommendations on making the collaboration among the FCC, NTIA, and RUS work.

QUESTIONS:

a) Are multi-jurisdictional applications or groups of organizations going to receive priority? Answer: no preference, but multiple applicants are encouraged. Then Villano suggested that grants across agencies are specifically encouraged.

b) Will information about the number of towers and/or wired buildings be included in the mapping? Answer from Deutchman: not ready to explain specifics, but the goal will be granularity. Seifert then suggested that ideas for how best to leverage mapping technologies would be welcome (hmmm, grant idea).